Unnamed: 0 (int64, 3-832k) | id (float64, 2.49B-32.1B) | type (string, 1 class) | created_at (string, 19 chars) | repo (string, 5-112 chars) | repo_url (string, 34-141 chars) | action (string, 3 classes) | title (string, 2-430 chars) | labels (string, 4-347 chars) | body (string, 5-237k chars) | index (string, 7 classes) | text_combine (string, 96-237k chars) | label (string, 2 classes) | text (string, 96-219k chars) | binary_label (int64, 0 or 1)
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
129,301 | 17,770,995,992 | IssuesEvent | 2021-08-30 13:39:15 | kawalcovid19/wargabantuwarga.com | https://api.github.com/repos/kawalcovid19/wargabantuwarga.com | closed | Website 2.0 | enhancement epic ui ux design | ## Overview
We are revamping our website's look and feel based on the new UI design reflected in this Figma file:
https://www.figma.com/file/XNNtIoFEdFqaXOee83n0oN/WBW?node-id=485:2485
## Current Tasks
### Home page
- [x] #332
- [x] #352
- [x] #333
- [x] #322
- [x] #323
- [x] #324
- [x] #504
- [x] #598
- [x] #599
- [x] #642
- [x] #672
- [x] #673
- [x] #668
- [x] #710
### Database-related pages
- [x] #600
### Stuff that's on a separate page
- [x] #325
- [x] #326
- [x] #327
- [x] #328
- [x] #329
- [x] #330
- [x] #712
### General structure
- [x] #346
- [x] #391
- [x] #624
### Minor tasks
- [x] #385
- [x] #422
- [x] #453
### Bugs
- [x] #448
- [x] #498
- [x] #573
- [x] #723 | 1.0 | Website 2.0 - ## Overview
We are revamping our website's look and feel based on the new UI design reflected in this Figma file:
https://www.figma.com/file/XNNtIoFEdFqaXOee83n0oN/WBW?node-id=485:2485
## Current Tasks
### Home page
- [x] #332
- [x] #352
- [x] #333
- [x] #322
- [x] #323
- [x] #324
- [x] #504
- [x] #598
- [x] #599
- [x] #642
- [x] #672
- [x] #673
- [x] #668
- [x] #710
### Database-related pages
- [x] #600
### Stuff that's on a separate page
- [x] #325
- [x] #326
- [x] #327
- [x] #328
- [x] #329
- [x] #330
- [x] #712
### General structure
- [x] #346
- [x] #391
- [x] #624
### Minor tasks
- [x] #385
- [x] #422
- [x] #453
### Bugs
- [x] #448
- [x] #498
- [x] #573
- [x] #723 | non_comp | website overview we are revamping our website look and feel based on the new ui design reflected in this figma file current tasks home page database related pages stuff that s on a separate page general structure minor tasks bugs | 0 |
776,491 | 27,262,421,197 | IssuesEvent | 2023-02-22 15:43:44 | Qiskit/qiskit-ibm-runtime | https://api.github.com/repos/Qiskit/qiskit-ibm-runtime | closed | QiskitRuntimeService() does not work without qiskit-ibm.json | bug priority: medium | **Describe the bug**
It seems to be related to #301.
While the credentials to the provider work:
```
>> from qiskit import IBMQ
>> IBMQ.load_account()
<AccountProvider for IBMQ(hub='XXX', group='XXX', project='XXX')>
```
when `~/.qiskit/qiskit-ibm.json` is not present:
```
>> from qiskit_ibm_runtime import QiskitRuntimeService
>> service = QiskitRuntimeService(channel='ibm_quantum')
AccountNotFoundError: 'No default ibm_quantum account saved.'
```
Notice that `service = QiskitRuntimeService()` does work and creates `~/.qiskit/qiskit-ibm.json` correctly, with `"channel": "ibm_quantum"`, at least in my case.
**Expected behavior**
I think `QiskitRuntimeService(channel='ibm_quantum')` should work and create `qiskit-ibm.json`.
**Additional Information**
- **qiskit-ibm-runtime version**: 0.6.2
- **Python version**: 3.9.13
- **Operating system**: macOS 12.6
| 1.0 | QiskitRuntimeService() does not work without qiskit-ibm.json - **Describe the bug**
It seems to be related to #301.
While the credentials to the provider work:
```
>> from qiskit import IBMQ
>> IBMQ.load_account()
<AccountProvider for IBMQ(hub='XXX', group='XXX', project='XXX')>
```
when `~/.qiskit/qiskit-ibm.json` is not present:
```
>> from qiskit_ibm_runtime import QiskitRuntimeService
>> service = QiskitRuntimeService(channel='ibm_quantum')
AccountNotFoundError: 'No default ibm_quantum account saved.'
```
Notice that `service = QiskitRuntimeService()` does work and creates `~/.qiskit/qiskit-ibm.json` correctly, with `"channel": "ibm_quantum"`, at least in my case.
**Expected behavior**
I think `QiskitRuntimeService(channel='ibm_quantum')` should work and create `qiskit-ibm.json`.
**Additional Information**
- **qiskit-ibm-runtime version**: 0.6.2
- **Python version**: 3.9.13
- **Operating system**: macOS 12.6
| non_comp | qiskitruntimeservice does not work without qiskit ibm json describe the bug it seems to be related to while the credential to the provider works from qiskit import ibmq ibmq load account when qiskit qiskit ibm json is not present from qiskit ibm runtime import qiskitruntimeservice service qiskitruntimeservice channel ibm quantum accountnotfounderror no default ibm quantum account saved notice that service qiskitruntimeservice does work and creates qiskit qiskit ibm json correctly with channel ibm quantum at least in my case expected behavior i think qiskitruntimeservice channel ibm quantum should work and create qiskit ibm json additional information qiskit ibm runtime version python version operating system macos | 0 |
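To make the failure mode above concrete, here is a rough, hypothetical sketch of how a channel-specific account lookup against `~/.qiskit/qiskit-ibm.json` could behave; the key layout and error type are invented for illustration and are not qiskit-ibm-runtime's actual implementation:

```python
import json
from pathlib import Path


def resolve_account(config_path, channel=None):
    """Hypothetical sketch of the lookup described above: if the config
    file is missing, no account can be resolved, whatever the channel."""
    path = Path(config_path)
    if not path.exists():
        raise LookupError(f"No default {channel or ''} account saved.")
    accounts = json.loads(path.read_text())
    if channel is not None:
        # Assumed key layout, for illustration only.
        key = f"default-{channel}"
        if key not in accounts:
            raise LookupError(f"No default {channel} account saved.")
        return accounts[key]
    # With no channel given, fall back to any saved account.
    return next(iter(accounts.values()))
```

Under this sketch, the channel-specific call fails whenever the JSON file is absent, which matches the reported behavior.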
16,250 | 21,844,092,664 | IssuesEvent | 2022-05-18 01:41:51 | Yesssssman/epicfightmod | https://api.github.com/repos/Yesssssman/epicfightmod | closed | Make Dodge be based on camera direction instead of model (suggestion) | mod incompatibility | So basically, I noticed that the dodge/step buttons make you dodge based on the current model's direction. Usually that doesn't matter since the MC character is always looking forward, BUT if you pair it with other mods like Better Third Person the results might be unsatisfying.
https://user-images.githubusercontent.com/94973750/143250377-979375d8-7f39-43d5-859c-3955fd349613.mp4
Like in this video: when I am running left with my camera pointed forwards, I'd like to roll left too. Well, it's not the end of the world, but it would be very cool for this amazing mod!
| True | Make Dodge be based on camera direction instead of model (suggestion) - So basically, I noticed that the dodge/step buttons make you dodge based on the current model's direction. Usually that doesn't matter since the MC character is always looking forward, BUT if you pair it with other mods like Better Third Person the results might be unsatisfying.
https://user-images.githubusercontent.com/94973750/143250377-979375d8-7f39-43d5-859c-3955fd349613.mp4
Like in this video: when I am running left with my camera pointed forwards, I'd like to roll left too. Well, it's not the end of the world, but it would be very cool for this amazing mod!
| comp | make dodge be based on camera direction instead of model suggestion so basically i noticed that the dodge step buttons make u dodge based on the current model s direction usually that doesn t matter since the mc character is always looking forward but if u pair it with other mods like better third person the results might be unsatisfying like on this vid when i am running left but with my camera pointed forwards id like to roll left too well tis not the end of the world but would be very cool for this amazing mod | 1 |
1,242 | 3,758,474,430 | IssuesEvent | 2016-03-14 09:07:11 | t9md/atom-vim-mode-plus | https://api.github.com/repos/t9md/atom-vim-mode-plus | closed | Better placement of search result on `n`/`N`. | compatibility enhancement | - Suppose my editor shows 100 lines of text in a very large file.
- Suppose I hit `n` to find the next occurrence of a search word.
- If that word is on the screen, it will simply be highlighted and the cursor moves to it - the text does not scroll. (vmp does this, and Vim does this).
- In Vim, if that word is *below* the visible region of text, but only 1-30 lines below the last visible line, hitting `n` will scroll just enough so that the highlighted word is near the bottom of the screen. But if that next occurrence is more than about 50 lines below the last visible line, then the word highlights and the text scrolls such that the highlighted match is exactly in the middle of the screen.
- This seems very intuitive and natural because if something is very close to the visible portion of your screen, you don't want to scroll a huge amount - you want to scroll as little as possible to try to keep as much of what you were *already* looking at on the screen. But if the text is so far away that scrolling to it is in a different part of the code, it makes sense to optimize for getting as much mental context in that new part of the code as possible.
I'm not asking for Vim compatibility, I'm just noticing that Vim did this really well and it probably makes sense for vmp to have this good feature too. | True | Better placement of search result on `n`/`N`. - - Suppose my editor shows 100 lines of text in a very large file.
- Suppose I hit `n` to find the next occurrence of a search word.
- If that word is on the screen, it will simply be highlighted and the cursor moves to it - the text does not scroll. (vmp does this, and Vim does this).
- In Vim, if that word is *below* the visible region of text, but only 1-30 lines below the last visible line, hitting `n` will scroll just enough so that the highlighted word is near the bottom of the screen. But if that next occurrence is more than about 50 lines below the last visible line, then the word highlights and the text scrolls such that the highlighted match is exactly in the middle of the screen.
- This seems very intuitive and natural because if something is very close to the visible portion of your screen, you don't want to scroll a huge amount - you want to scroll as little as possible to try to keep as much of what you were *already* looking at on the screen. But if the text is so far away that scrolling to it is in a different part of the code, it makes sense to optimize for getting as much mental context in that new part of the code as possible.
I'm not asking for Vim compatibility, I'm just noticing that Vim did this really well and it probably makes sense for vmp to have this good feature too. | comp | better placement of search result on n n suppose my editor shows lines of text in a very large file suppose i hit n to find the next occurrence of a search word if that word is on the screen it will simply be highlighted and the cursor moves to it the text does not scroll vmp does this and vim does this in vim if that word is below the visible region of text but only lines below the last visible line hitting n will scroll just enough so that the highlighted word is near the bottom of the screen but if that next occurrence is more than about lines below the last visible line then the word highlights and the text scrolls such that the highlighted match is exactly in the middle of the screen this seems very intuitive and natural because if something is very close to the visible portion of your screen you don t want to scroll a huge amount you want to scroll as little as possible to try to keep as much of what you were already looking at on the screen but if the text is so far away that scrolling to it is in a different part of the code it makes sense to optimize for getting as much mental context in that new part of the code as possible i m not asking for vim compatibility i m just noticing that vim did this really well and it probably makes sense for vmp to have this good feature too | 1 |
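The placement heuristic described in that issue can be sketched as a small function. The 30-line "near" threshold and the centering rule come from the description; the function shape and names are ours, not vim-mode-plus's actual code:

```python
def scroll_for_match(first_visible, visible_lines, match_line, near_threshold=30):
    """Return the new first visible line after jumping to match_line,
    following the placement rules described above (thresholds illustrative)."""
    last_visible = first_visible + visible_lines - 1
    if first_visible <= match_line <= last_visible:
        # Match already on screen: highlight only, no scrolling.
        return first_visible
    if last_visible < match_line <= last_visible + near_threshold:
        # Match just below the viewport: scroll minimally, match near bottom.
        return match_line - visible_lines + 1
    # Match far away: center it to give maximum context.
    return max(match_line - visible_lines // 2, 1)
```

This only models downward `n` jumps; an upward `N` would mirror the same logic above the viewport.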
8,320 | 10,344,091,338 | IssuesEvent | 2019-09-04 10:22:25 | jOOQ/jOOQ | https://api.github.com/repos/jOOQ/jOOQ | closed | Remove unnecessary AlterSequenceRestartStep | C: Functionality E: All Editions P: Medium R: Fixed T: Enhancement T: Incompatible change | It appears that the `AlterSequenceRestartStep` type is not needed. It does not offer any API that could not be reached through the `AlterSequenceStep` subtype. | True | Remove unnecessary AlterSequenceRestartStep - It appears that the `AlterSequenceRestartStep` type is not needed. It does not offer any API that could not be reached through the `AlterSequenceStep` subtype. | comp | remove unnecessary altersequencerestartstep it appears that the altersequencerestartstep type is not needed it does not offer any api that could not be reached through the altersequencestep subtype | 1 |
417,368 | 12,158,980,090 | IssuesEvent | 2020-04-26 06:58:58 | GeyserMC/Geyser | https://api.github.com/repos/GeyserMC/Geyser | closed | Banners seem to be Inverted | Confirmed Bug Priority: Medium Work in Progress | Banners on Bedrock seem to be inverted from how they look in Java
**To Reproduce**
Steps to reproduce the behavior:
1. Place a banner with a design.
2. View in Java and Bedrock
3. See error
**Expected behavior**
Banners between versions should look the same.
**Screenshots**
Java Edition

Bedrock Edition

Inverted Bedrock Edition

**Server version**
PaperMC 1.15.2 # 143
**Geyser version**
Inventory branch # 9 standalone
**Bedrock version**
1.14.30
| 1.0 | Banners seem to be Inverted - Banners on Bedrock seem to be inverted from how they look in Java
**To Reproduce**
Steps to reproduce the behavior:
1. Place a banner with a design.
2. View in Java and Bedrock
3. See error
**Expected behavior**
Banners between versions should look the same.
**Screenshots**
Java Edition

Bedrock Edition

Inverted Bedrock Edition

**Server version**
PaperMC 1.15.2 # 143
**Geyser version**
Inventory branch # 9 standalone
**Bedrock version**
1.14.30
| non_comp | banners seem to be inverted banners on bedrock seem to be inverted from how they look in java to reproduce steps to reproduce the behavior place a banner with a design view in java and bedrock see error expected behavior banners between version should look the same screenshots java edition bedrock edition inverted bedrock edition server version papermc geyser version inventory branch standalone bedrock version | 0 |
19,055 | 26,485,815,232 | IssuesEvent | 2023-01-17 17:55:29 | storybookjs/storybook | https://api.github.com/repos/storybookjs/storybook | closed | Running Storybook with Webpack 5 and swc-loader results with an error | bug compatibility with other tools needs reproduction | **Describe the bug**
I use Storybook for a React application, and it works with babel-loader, but not with swc-loader
**To Reproduce**
In a working Storybook project, change `babel-loader` to `swc-loader` in the `webpackFinal` function's config in `main.js` and run Storybook
**System**
@storybook/builder-webpack5: "6.3.0-rc.4",
@storybook/manager-webpack5: "6.3.0-rc.4",
@storybook/react: "6.3.0-beta.4",
**Additional context**
I changed: loader: require.resolve('babel-loader') to loader: require.resolve('swc-loader')
ran: npm run storybook, and got this error:
modulesthread '<unnamed>' panicked at 'index out of bounds: the len is 0 but the index is 0'
| True | Running Storybook with Webpack 5 and swc-loader results with an error - **Describe the bug**
I use Storybook for a React application, and it works with babel-loader, but not with swc-loader
**To Reproduce**
In a working Storybook project, change `babel-loader` to `swc-loader` in the `webpackFinal` function's config in `main.js` and run Storybook
**System**
@storybook/builder-webpack5: "6.3.0-rc.4",
@storybook/manager-webpack5: "6.3.0-rc.4",
@storybook/react: "6.3.0-beta.4",
**Additional context**
I changed: loader: require.resolve('babel-loader') to loader: require.resolve('swc-loader')
ran: npm run storybook, and got this error:
modulesthread '<unnamed>' panicked at 'index out of bounds: the len is 0 but the index is 0'
| comp | running storybook with webpack and swc loader results with an error describe the bug i use storybook for a react application and it works with babel loader but not with swc loader to reproduce in a working storybook project change babel loader to swc loader in main js webpackfinal function s config and run storybook system storybook builder rc storybook manager rc storybook react beta additional context i changed loader require resolve babel loader to loader require resolve swc loader ran npm run storybook and got this error modulesthread panicked at index out of bounds the len is but the index is | 1 |
17,110 | 23,619,990,894 | IssuesEvent | 2022-08-24 19:32:36 | itchio/itch | https://api.github.com/repos/itchio/itch | closed | AirConsole games don't seem to work | compatibility v25.x | ### Details
- Version: itch @ 25.3.0
- OS: Windows 10
- Steps: Launch html5 game (https://8bitape.itch.io/dungeoncrawl) from itch app
### Description
When the game is launched, websocket connections are closed:
`WebSocket connection to 'wss://server-url' failed: WebSocket is closed before the connection is established.`
The game uses websockets to send messages from the screen to your smart phone.
If I load the game with my browser of choice, then everything is fine and dandy:


Does the itch app browser support websockets?
| True | AirConsole games don't seem to work - ### Details
- Version: itch @ 25.3.0
- OS: Windows 10
- Steps: Launch html5 game (https://8bitape.itch.io/dungeoncrawl) from itch app
### Description
When the game is launched, websocket connections are closed:
`WebSocket connection to 'wss://server-url' failed: WebSocket is closed before the connection is established.`
The game uses websockets to send messages from the screen to your smart phone.
If I load the game with my browser of choice, then everything is fine and dandy:


Does the itch app browser support websockets?
| comp | airconsole games don t seem to work details version itch os windows steps launch game from itch app description when the game is launched websocket connections are closed websocket connection to wss server url failed websocket is closed before the connection is established the game uses websockets to send messages from the screen to your smart phone if i load the game with my browser of choice then everything is fine and dandy does the itch app browser support websockets | 1 |
658,803 | 21,903,017,174 | IssuesEvent | 2022-05-20 15:06:36 | CLOSER-Cohorts/archivist | https://api.github.com/repos/CLOSER-Cohorts/archivist | closed | REACT: /cc_questions.txt to include question grid items from Y axis | High priority react | The current /cc_questions.txt includes a list of question items (their ID, label and question text), e.g. https://closer-archivist-staging.herokuapp.com/instruments/heaf_13_base/cc_questions.txt but it doesn't list out the question grids.
Can we add the question grids to this list? They would also need to include the XY axis (i.e. $1;1) and the categories for these as well; otherwise it will only be the same question literal repeated.
e.g.
qc_9_a-b$1;1 In an average week, and outside any paid jobs that you do, roughly how many hours would you spend doing the following activities? (Please answer each question) Physical activities sufficient to make you hot or sweaty (e.g. heavy gardening, dancing, cycling, jogging)
qc_9_a-b$1;2 In an average week, and outside any paid jobs that you do, roughly how many hours would you spend doing the following activities? (Please answer each question) Meeting or doing things with friends or relatives who do not live in your home
I have attached an example.
This is what it is like now -
[heaf_13_base_questions_now.txt](https://github.com/CLOSER-Cohorts/archivist/files/8676297/heaf_13_base_questions_now.txt)
This would be the ideal format -
[heaf_13_base_questions.txt](https://github.com/CLOSER-Cohorts/archivist/files/8676291/heaf_13_base_questions.txt)
| 1.0 | REACT: /cc_questions.txt to include question grid items from Y axis - The current /cc_questions.txt includes a list of question items (their ID, label and question text), e.g. https://closer-archivist-staging.herokuapp.com/instruments/heaf_13_base/cc_questions.txt but it doesn't list out the question grids.
Can we add the question grids to this list? They would also need to include the XY axis (i.e. $1;1) and the categories for these as well; otherwise it will only be the same question literal repeated.
e.g.
qc_9_a-b$1;1 In an average week, and outside any paid jobs that you do, roughly how many hours would you spend doing the following activities? (Please answer each question) Physical activities sufficient to make you hot or sweaty (e.g. heavy gardening, dancing, cycling, jogging)
qc_9_a-b$1;2 In an average week, and outside any paid jobs that you do, roughly how many hours would you spend doing the following activities? (Please answer each question) Meeting or doing things with friends or relatives who do not live in your home
I have attached an example.
This is what it is like now -
[heaf_13_base_questions_now.txt](https://github.com/CLOSER-Cohorts/archivist/files/8676297/heaf_13_base_questions_now.txt)
This would be the ideal format -
[heaf_13_base_questions.txt](https://github.com/CLOSER-Cohorts/archivist/files/8676291/heaf_13_base_questions.txt)
| non_comp | react cc questions txt to include question grid items from y axis the current cc questions txt includes a list of question items their id label and question text e g but it doesn t list out the question grids can we add the question grids to this list but it would also need to include the xy axis i e and the categories for these as well otherwise it will only be the same question literal repeated e g qc a b in an average week and outside any paid jobs that you do roughly how many hours would you spend doing the following activities please answer each question physical activities sufficient to make you hot or sweaty e g heavy gardening dancing cycling jogging qc a b in an average week and outside any paid jobs that you do roughly how many hours would you spend doing the following activities please answer each question meeting or doing things with friends or relatives who do not live in your home i have attached an example this what it is like now this would be the ideal format | 0 |
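The flattened output requested in that issue (one line per grid cell, combining the question literal with each Y-axis category) can be sketched as follows; the function name and exact spacing are illustrative assumptions, not part of Archivist:

```python
def expand_question_grid(grid_id, literal, categories):
    """Cross one grid question with its Y-axis categories, yielding one
    '<id>$1;<n> <literal> <category>' line per cell, as in the examples."""
    lines = []
    for n, category in enumerate(categories, start=1):
        lines.append(f"{grid_id}$1;{n} {literal} {category}")
    return lines
```

For instance, a two-category grid would yield lines suffixed `$1;1` and `$1;2`, each repeating the question literal followed by the distinct category text.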
39,382 | 6,736,969,179 | IssuesEvent | 2017-10-19 07:33:58 | hapijs/joi | https://api.github.com/repos/hapijs/joi | closed | `when` example throws | documentation | #### Context
Node: 8.5.0
Joi: 13.0.0
standalone
OSX 10.11.6 (also fails on https://repl.it/MsDB/1 )
#### What are you trying to achieve or the steps to reproduce?
While trying to understand the `when` syntax, I noticed the last `when` example throws: https://github.com/hapijs/joi/blob/master/API.md#anywhencondition-options
```
const Joi = require('joi')
const schema = Joi.object().keys({
min: Joi.number().when('max', {
is: Joi.number().required(),
then: Joi.number().less(Joi.ref('max')),
}),
max: Joi.number().when('min', {
is: Joi.number().required(),
then: Joi.number().greater(Joi.ref('min')),
}),
});
Joi.validate({min: 123}, schema)
```
throws
```
~/dev/exp/joi-iss/node_modules/joi/lib/types/object/index.js:362
throw castErr;
^
Error: item added into group max created a dependencies error
at Object.exports.assert (~/dev/exp/joi-iss/node_modules/hoek/lib/index.js:730:11)
at module.exports.internals.Topo.internals.Topo.add (~/dev/exp/joi-iss/node_modules/topo/lib/index.js:53:10)
at internals.Object.keys (~/dev/exp/joi-iss/node_modules/joi/lib/types/object/index.js:353:22)
at Object.<anonymous> (~/dev/exp/joi-iss/index.js:4:29)
at Module._compile (module.js:624:30)
at Object.Module._extensions..js (module.js:635:10)
at Module.load (module.js:545:32)
at tryModuleLoad (module.js:508:12)
at Function.Module._load (module.js:500:3)
at Function.Module.runMain (module.js:665:10)
```
Also throws for any `{min: 123, max: 222}`, FWIW
#### Which result you had ?
^^^
#### What did you expect ?
! ^^^ | 1.0 | `when` example throws - #### Context
Node: 8.5.0
Joi: 13.0.0
standalone
OSX 10.11.6 (also fails on https://repl.it/MsDB/1 )
#### What are you trying to achieve or the steps to reproduce?
While trying to understand the `when` syntax, I noticed the last `when` example throws: https://github.com/hapijs/joi/blob/master/API.md#anywhencondition-options
```
const Joi = require('joi')
const schema = Joi.object().keys({
min: Joi.number().when('max', {
is: Joi.number().required(),
then: Joi.number().less(Joi.ref('max')),
}),
max: Joi.number().when('min', {
is: Joi.number().required(),
then: Joi.number().greater(Joi.ref('min')),
}),
});
Joi.validate({min: 123}, schema)
```
throws
```
~/dev/exp/joi-iss/node_modules/joi/lib/types/object/index.js:362
throw castErr;
^
Error: item added into group max created a dependencies error
at Object.exports.assert (~/dev/exp/joi-iss/node_modules/hoek/lib/index.js:730:11)
at module.exports.internals.Topo.internals.Topo.add (~/dev/exp/joi-iss/node_modules/topo/lib/index.js:53:10)
at internals.Object.keys (~/dev/exp/joi-iss/node_modules/joi/lib/types/object/index.js:353:22)
at Object.<anonymous> (~/dev/exp/joi-iss/index.js:4:29)
at Module._compile (module.js:624:30)
at Object.Module._extensions..js (module.js:635:10)
at Module.load (module.js:545:32)
at tryModuleLoad (module.js:508:12)
at Function.Module._load (module.js:500:3)
at Function.Module.runMain (module.js:665:10)
```
Also throws for any `{min: 123, max: 222}`, FWIW
#### Which result you had ?
^^^
#### What did you expect ?
! ^^^ | non_comp | when example throws context node joi standalone osx also fails on what are you trying to achieve or the steps to reproduce while trying to understand the when syntax i noticed the last when example throws const joi require joi const schema joi object keys min joi number when max is joi number required then joi number less joi ref max max joi number when min is joi number required then joi number greater joi ref min joi validate min schema throws dev exp joi iss node modules joi lib types object index js throw casterr error item added into group max created a dependencies error at object exports assert dev exp joi iss node modules hoek lib index js at module exports internals topo internals topo add dev exp joi iss node modules topo lib index js at internals object keys dev exp joi iss node modules joi lib types object index js at object dev exp joi iss index js at module compile module js at object module extensions js module js at module load module js at trymoduleload module js at function module load module js at function module runmain module js also throws for any min max fwiw which result you had what did you expect | 0 |
1,691 | 4,259,024,113 | IssuesEvent | 2016-07-11 09:28:12 | medic/medic-webapp | https://api.github.com/repos/medic/medic-webapp | opened | Can we do all config through config UI pages, for SMS features? | v0.4 Compatibility v0.4 features in 2.x | There's always the app_settings dashboard page to fall back on, but would be good to not depend on it.
File issues for any missing pages. | True | Can we do all config through config UI pages, for SMS features? - There's always the app_settings dashboard page to fall back on, but would be good to not depend on it.
File issues for any missing pages. | comp | can we do all config through config ui pages for sms features there s always the app settings dashboard page to fall back on but would be good to not depend on it file issues for any missing pages | 1 |
722,702 | 24,872,199,136 | IssuesEvent | 2022-10-27 16:00:57 | PrefectHQ/prefect | https://api.github.com/repos/PrefectHQ/prefect | closed | Block auto-registration is skipped when switching databases | bug priority:medium component:blocks status:in-progress | ### First check
- [X] I added a descriptive title to this issue.
- [X] I used the GitHub search to find a similar issue and didn't find it.
- [X] I searched the Prefect documentation for this issue.
- [X] I checked that this issue is related to Prefect and not one of its dependencies.
### Bug summary
A community user reported that changing the database that backed their local Orion instance resulted in blocks disappearing from their UI. The user resolved the issue by deleting their `.prefect` folder.
Thread: https://prefect-community.slack.com/archives/CL09KU1K7/p1666812855264589
### Reproduction
```python3
- Start up an Orion server with `prefect orion start`
- View blocks in UI
- Change database with `prefect config set PREFECT_ORION_DATABASE_CONNECTION_URL="new_url"`
- View lack of blocks in the UI
```
### Error
_No response_
### Versions
```Text
Version: 2.6.4
API version: 0.8.2
Python version: 3.10.6
Git commit: 51e92dda
Built: Thu, Oct 20, 2022 3:11 PM
OS/Arch: darwin/x86_64
Profile: local
Server type: ephemeral
Server:
Database: sqlite
SQLite version: 3.39.3
```
### Additional context
This is likely due to block auto-registration memoization. Memoization should be updated so that the block auto-registration memostore key is no longer valid when the orion database URL is changed. | 1.0 | Block auto-registration is skipped when switching databases - ### First check
- [X] I added a descriptive title to this issue.
- [X] I used the GitHub search to find a similar issue and didn't find it.
- [X] I searched the Prefect documentation for this issue.
- [X] I checked that this issue is related to Prefect and not one of its dependencies.
### Bug summary
A community user reported that changing the database that backed their local Orion instance resulted in blocks disappearing from their UI. The user resolved the issue by deleting their `.prefect` folder.
Thread: https://prefect-community.slack.com/archives/CL09KU1K7/p1666812855264589
### Reproduction
```python3
- Start up an Orion server with `prefect orion start`
- View blocks in UI
- Change database with `prefect config set PREFECT_ORION_DATABASE_CONNECTION_URL="new_url"`
- View lack of blocks in the UI
```
### Error
_No response_
### Versions
```Text
Version: 2.6.4
API version: 0.8.2
Python version: 3.10.6
Git commit: 51e92dda
Built: Thu, Oct 20, 2022 3:11 PM
OS/Arch: darwin/x86_64
Profile: local
Server type: ephemeral
Server:
Database: sqlite
SQLite version: 3.39.3
```
### Additional context
This is likely due to block auto-registration memoization. Memoization should be updated so that the block auto-registration memostore key is no longer valid when the orion database URL is changed. | non_comp | block auto registration is skipped when switching databases first check i added a descriptive title to this issue i used the github search to find a similar issue and didn t find it i searched the prefect documentation for this issue i checked that this issue is related to prefect and not one of its dependencies bug summary a community user reported that changing the database that backed their local orion instance resulted in block disappearing from their ui the user resolved the issue by deleting their prefect folder thread reproduction start up an orion server with prefect orion start view blocks in ui change database with prefect config set prefect orion database connection url new url view lack of blocks in the ui error no response versions text version api version python version git commit built thu oct pm os arch darwin profile local server type ephemeral server database sqlite sqlite version additional context this is likely due to block auto registration memoization memoization should be updated so that the block auto registration memostore key is no longer valid when the orion database url is changed | 0 |
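The fix suggested in that issue (making the memo key depend on the database URL) can be illustrated with a small sketch. This is not Prefect's actual memoization code, just a minimal model of the idea:

```python
import hashlib


def registration_key(block_schemas, database_url):
    """Memo key that changes whenever the registered schemas or the
    database URL change (illustrative, not Prefect's real code)."""
    digest = hashlib.sha256()
    for schema in sorted(block_schemas):
        digest.update(schema.encode())
    digest.update(database_url.encode())
    return digest.hexdigest()


def should_register(memo_store, block_schemas, database_url):
    """Skip registration only when nothing relevant has changed."""
    key = registration_key(block_schemas, database_url)
    if memo_store.get("last_key") == key:
        return False
    memo_store["last_key"] = key
    return True
```

Because the URL is hashed into the key, switching databases invalidates the memo and forces blocks to be re-registered against the new database.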
19,984 | 27,811,087,603 | IssuesEvent | 2023-03-18 05:31:13 | Wastelander121/TFCToolsIssueTracker | https://api.github.com/repos/Wastelander121/TFCToolsIssueTracker | opened | Enslaved: Odyssey to the West | Incompatible | Compressed:
Unable to read beyond the end of the stream.
Uncompressed:
Operation is not valid due to the current state of the object. | True | Enslaved: Odyssey to the West - Compressed:
Unable to read beyond the end of the stream.
Uncompressed:
Operation is not valid due to the current state of the object. | comp | enslaved odyssey to the west compressed unable to read beyond the end of the stream uncompressed operation is not valid due to the current state of the object | 1 |
2,360 | 5,104,736,588 | IssuesEvent | 2017-01-05 02:52:11 | BVLC/caffe | https://api.github.com/repos/BVLC/caffe | closed | Provide a Caffe package in Debian | compatibility | ## Status
Caffe packages are available for `Debian/unstable`.
Caffe packages are failing to build for Ubuntu-devel and need to be patched.
Last update: Dec.20 2016
## Draft guide
**Deploy Caffe with merely one command.**
### Brief Guide for Debian/unstable users
Only experienced linux users are recommended to try Debian/unstable (Sid).
To install caffe, first make sure you have something like the follows in file `/etc/apt/sources.list`:
(Uncomment the second line if you want to re-compile caffe locally.)
```
deb http://ftp.cn.debian.org/debian sid main contrib non-free
#deb-src http://ftp.cn.debian.org/debian sid main contrib non-free
```
Then update apt cache and install it. Note, you cannot install both the cpu version and the cuda version.
```
# apt update
# apt install [ caffe-cpu | caffe-cuda ]
# caffe
```
It should work out of the box. I hope this work is helpful, since many people struggle with the Caffe compiling process.
Here are some notes:
* Please re-compile OpenBLAS locally with optimization flags for the sake of performance. This is highly recommended if you are writing a paper. The way to re-compile OpenBLAS from the Debian source is very similar to the next subsection.
* If you are going to install `caffe-cuda`, it will automatically pull the CUDA package and the nvidia driver packages. The installation process may fail if any part of the caffe dependency chain gets into trouble. That is to say, please take care if you have manually installed or significantly modified the nvidia driver, CUDA toolkit, protobuf, or any other related stuff.
* If you encountered any problem when installing `caffe-cuda` on a clean Debian system, please report a bug to me via Debian's bug tracking system.
* If you encountered any problem when installing `caffe-cpu`, please report bug to me via Debian's bug tracking system.
* Both caffe-cpu and caffe-cuda contain a manpage (`man caffe`) and a bash completion script (`caffe <TAB><TAB>`, `caffe train <TAB><TAB>`). Both of them are still not merged into caffe master.
* The python interface is Python3 version: `python3-caffe-{cpu,cuda}`. No plan to support python2.
### Compiling your custom caffe package on Debian/unstable
There is no promise for the content in this subsection. If you just want to compile again from the source without any change, the following should work as expected. If you want to compile it with e.g. CUDNN support, you should at least be able to read and hack the file `debian/rules` under the source tree (It's a Makefile).
First make sure you have a correct `deb-src` line in your apt source list file. Then we compile caffe with several simple commands.
```
# apt update
# apt install build-essential debhelper devscripts # These are standard package building tools
# apt build-dep [ caffe-cpu | caffe-cuda ] # the most elegant way to pull caffe build dependencies
# apt source [ caffe-cpu | caffe-cuda ] # download the source tarball
# cd caffe-XXXX # now we enter into the source tree
[ ... optional, make your custom changes at your own risk ... ]
# debuild -B -j4 # build caffe with 4 parallel jobs (similar to make -j4)
[ ... building ...]
# debc # optional, if you want to check the package contents
# debi # install the generated packages
```
### FAQ
1. where is caffe-cudnn?
Due to legal reasons the cudnn library cannot be redistributed. I'll be happy to make this package when CUDNN becomes re-distributable. The workaround is to install cudnn by yourself, and hack at least the `debian/rules` file if you really want the caffe *.deb packages with CUDNN support.
2. how to report bug via Debian bug tracking system?
See https://www.debian.org/Bugs/ .
3. I installed the CPU version, what should I do if I want to switch to the CUDA version?
`sudo apt install caffe-cuda`, apt's dependency resolver is smart enough for this.
4. Where are the examples, the models and other documentation stuff?
`sudo apt install caffe-doc; dpkg -L caffe-doc` | True | comp | 1 |
381,139 | 11,274,023,450 | IssuesEvent | 2020-01-14 17:40:44 | googleapis/google-cloud-python | https://api.github.com/repos/googleapis/google-cloud-python | closed | Synthesis failed for iam | api: iam autosynth failure priority: p1 type: bug | Hello! Autosynth couldn't regenerate iam. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth-iam'
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 256, in <module>
main()
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 196, in main
last_synth_commit_hash = get_last_metadata_commit(args.metadata_path)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 149, in get_last_metadata_commit
text=True,
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 403, in run
with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'text'
```
Google internal developers can see the full log [here](https://sponge/264076a6-41d9-4ca2-9fc2-c9d0a8a92def).
| 1.0 | non_comp | 0 |
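The `TypeError` in the traceback above is a Python-version problem rather than a logic bug: `subprocess.run()` only accepts the `text=` keyword from Python 3.7 onward, and this build ran on Python 3.6.1. A minimal sketch of the backward-compatible spelling follows; the helper name is illustrative, not autosynth's actual code.

```python
import subprocess
import sys

def run_and_capture(args):
    # universal_newlines=True is the pre-3.7 spelling of text=True:
    # it decodes the child's stdout to str instead of bytes, so this
    # call works on Python 3.6 as well as newer versions.
    result = subprocess.run(args, stdout=subprocess.PIPE,
                            universal_newlines=True)
    return result.stdout

if __name__ == "__main__":
    print(run_and_capture([sys.executable, "-c", "print('ok')"]).strip())
```

On 3.7+, `text=True` and `universal_newlines=True` are aliases for the same option, so the substitution loses nothing.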
6,289 | 8,658,810,202 | IssuesEvent | 2018-11-28 02:46:03 | Snownee/Cuisine | https://api.github.com/repos/Snownee/Cuisine | closed | Immersive Engineering Cloche compatibility | Compatibility | The Garden Cloche from Immersive Engineering is quite a handy tool to automate farming. It would be great if cuisine crops were compatible with it.
I also think food automation in general could be a great addition. Like supporting the IE Squeezer for juices maybe? | True | comp | 1 |
3,809 | 6,664,099,940 | IssuesEvent | 2017-10-02 18:51:28 | storybooks/storybook | https://api.github.com/repos/storybooks/storybook | closed | Empty main pane on Firefox Nightly | bug compatibility with other tools ui | Greetings,
the main content pane is not showing in Firefox Nightly `55.0a1 (2017-06-05)`, while being properly rendered in the current stable version.
No error, nor any kind of useful log so far.
Will try to debug later.
The Navigation pane is fine.
Probably just a CSS issue since the DOM content is present.
| True | comp | 1 |
32,933 | 8,971,527,532 | IssuesEvent | 2019-01-29 16:06:25 | avast-tl/retdec | https://api.github.com/repos/avast-tl/retdec | closed | CMake rules for pelib contain an uninitialized variable | C-build-system C-pelib bug | File `deps/pelib/CMakeLists.txt` contains the following piece of code:
```
44 # Force rebuild if switch happened.
45 # Seems like this is not needed on Linux, and not working on Windows :-(
46 BUILD_ALWAYS ${CHANGED}
```
However, the `CHANGED` variable is defined later on line
```
57 check_if_variable_changed(PELIB_LOCAL_DIR CHANGED)
```
Questions:
* Can you please verify that we actually want to use an uninitialized variable there?
* Is that `BUILD_ALWAYS` part necessary? According to the comment above, it is not needed on Linux and does not work on Windows. Is it for macOS then?
| 1.0 | non_comp | 0 |
16,505 | 22,360,173,046 | IssuesEvent | 2022-06-15 19:38:46 | Keksuccino/FancyMenu | https://api.github.com/repos/Keksuccino/FancyMenu | closed | UI elements overlap with Mine together on GUI scale < 2 | mod incompatibility | **Describe the bug**
Not sure where the issue lies here, but when playing a modpack with MineTogether, UI elements overlap when GUI scaling is set to any value greater than 2.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to video settings
2. Set GUI scaling to a value greater than 2
3. Return to the main pause menu
4. See error
**Expected behavior**
I'd expect to see the UI elements here to still scale but keep the format shown below when the GUI scaling is set to 2.
This behaviour only shows itself when MineTogether is installed, and everything works as expected when that mod is disabled.
**Screenshots**
GUI scaling Auto (4)

GUI Scaling 4

GUI Scaling 3

GUI Scaling 2

GUI Scaling 1

**Basic Informations (please complete the following information):**
- OS: Windows
- FancyMenu Version: 2.3.5
- Forge/Fabric Version: 36.2.21
- Minecraft Version 1.16.5
- Active Mods: Modpack https://www.curseforge.com/minecraft/modpacks/not-too-complicated-2 lastest release
- Resolution: 1080p (Affects are the same at 3440 x 1440)
| True | comp | 1 |
216,365 | 16,758,368,241 | IssuesEvent | 2021-06-13 09:12:32 | RoboManipal/Libraries | https://api.github.com/repos/RoboManipal/Libraries | closed | Library for ServoChain | TODO testing wontfix | **Task priority**: Low
**Task description**
Servo Chain library consisting of Dynamixels and Pololus in any order. Maximise user control over servo speeds (playtimes for pololus), through additional functions.
**Person assigned**: @akshatha-k
### **Code ready. Testing pending.**
# Must Haves
- [x] Series of Dynamixel servos and XYZPololu servos
- [ ] Initialise the servo IDs and their Serials
- [x] Move the smart Servos to a given position at a programmer defined speed
- [x] Move the smart Servos to a given position at a user-defined speed and playtime.
- [ ] Add thorough documentation of using the Roboplus Dynamixel Wizard.
# Extras
- [ ] Inverse kinematic functions for N links/joints.
## References
Dynamixel library : https://sourceforge.net/projects/dynamixelforarduino/files/?source=navbar
Pololu LIbrary: https://github.com/pololu/xyzrobot-servo-arduino
Robotis Dynamixel Wizard : http://support.robotis.com/en/software/roboplus/dynamixel_monitor.htm
| 1.0 | non_comp | 0 |
471,979 | 13,613,960,534 | IssuesEvent | 2020-09-23 12:37:57 | inspireui/support | https://api.github.com/repos/inspireui/support | closed | Login does not work and the application is blank | Beonews ⭐️ priority-ticket | **_Step 1 (require): describe detail issues & screenshots_**
+ Detail Issues:
- [ ] first issue
I cannot enter the application as a user; the application responds: invalid email or password.
2 - Written comments do not reach my WordPress panel. The console returns the following: WARN postNewsContent LOG {"code": "rest_post_invalid_page_number", "data": {"status": 400}, "message": "The requested page number is greater than the number of available pages."}
- [ ] second issue
Home screen remains empty
- [ ] third issue
The application does not connect with Firebase
+ Product & version:
+ Flutter (or React Native) version:
+ Testing Device/Simulator:
+ Screenshot issues (drag the file to attach here):
<img width="1298" alt="debug" src="https://user-images.githubusercontent.com/28591454/92264584-c6949c80-eede-11ea-9ba9-8a8272b003d5.png">
<img width="351" alt="startingScreen" src="https://user-images.githubusercontent.com/28591454/92264654-e1671100-eede-11ea-923e-92c1b34582a7.png">
<img width="1969" alt="menuItems" src="https://user-images.githubusercontent.com/28591454/92264725-fba0ef00-eede-11ea-9b5a-4081063cd219.png">
<img width="1298" alt="debug" src="https://user-images.githubusercontent.com/28591454/92264840-268b4300-eedf-11ea-9e75-161e619d5c97.png">
**_Step 2 (require): submit proof of purchasing the license on https://verify.inspireui.com_**
**_Important Note:_**
- Please help to ready the support policies before creating a new ticket: https://inspireui.com/support-policies/
- Kindly create only *One Ticket* & includes all the issues, that would help us focus to resolve it better.
- If your all ticket was resolved & closed, but want to create a new ticket, just do simple step by linking to the previous verified ID (you don't need to submit the form again), for example, #Ticket-ID
Thank you so much for your time 😊
I am not sure which file I need to modify to get what is shown [in MStore .8. Banner High and Vertical Card List](https://docs.inspireui.com/mstore/home-theme/)
What files should I modify to achieve this? I just need to know what the files are
How to discover WordPress IDs to get recent news to Banner High. ?
I'm not asking for customization, but the documentation is confusing, and I can't understand.
<img width="1073" alt="BannerHighVerticalCardList" src="https://user-images.githubusercontent.com/28591454/92313022-f7e89780-efc6-11ea-80ad-6b7ca4d341d3.png">
| 1.0 | non_comp | 0 |
20,084 | 28,041,651,078 | IssuesEvent | 2023-03-28 19:00:49 | VazkiiMods/Quark | https://api.github.com/repos/VazkiiMods/Quark | closed | Ecologics Compat Overlap | compatibility | Add Ecologics to the anti-overlap for Azalea wood, as it adds its own Azalea and Flowering Azalea logs.
It also adds (coconut) crabs, though those might be distinct enough from Quark's crabs to not include anti-overlap. | True | Ecologics Compat Overlap - Add Ecologics to the anti-overlap for Azalea wood as it adds its own Azalea and Flowering Azalea logs
It also adds (coconut) crabs, though those might be distinct enough from Quark's crabs to not include anti-overlap. | comp | ecologics compat overlap add ecologics to the anti overlap for azalea wood as it adds it s own azalea and flowering azalea logs it also adds coconut crabs though those might be distinct enough from quark s crabs to not include anti overlap | 1 |
2,744 | 5,495,667,908 | IssuesEvent | 2017-03-15 05:34:32 | PrinceOfAmber/Cyclic | https://api.github.com/repos/PrinceOfAmber/Cyclic | closed | [Minor Issue]Can't disable charm when it is being repaired. | bug: gameplay mod compatibility wontfix | if your charm is losing durability and regaining it at the same time, you can't disable it
to reproduce it, just:
1. have player stats 2 ---> https://minecraft.curseforge.com/projects/player-stats-2. or any mod that repairs items in your inventory.
2. put the item repairer ability on level max.
3. have a charm on your hand.
4. try to disable it.
you can't. | True | [Minor Issue]Can't disable charm when it is being repaired. - if your charm is losing durability and regaining it at the same time, you can't disable it
to reproduce it, just:
1. have player stats 2 ---> https://minecraft.curseforge.com/projects/player-stats-2. or any mod that repairs items in your inventory.
2. put the item repairer ability on level max.
3. have a charm on your hand.
4. try to disable it.
you can't. | comp | can t disable charm when it is being repaired if you charm is losing durability and winning it at the same time you can t disable it to do it just have player stats or any mod that repair items on your inventory put the item repairer ability on level max have a charm on your hand try to disable it you can t | 1 |
7,030 | 9,306,210,219 | IssuesEvent | 2019-03-25 09:07:32 | acemod/ACE3 | https://api.github.com/repos/acemod/ACE3 | closed | RHS helmet incorrect hearing protection values | area/compatibility | **Arma 3 Version:** `1.90` (**stable** )
**CBA Version:** `3.10.1.190316` (stable)
**ACE3 Version:** ` 3.12.6` (stable)
**ACEX Version** `3.4.2` (stable)
**Mods:**
```
- CBA_A3
- ACE
- ACEX
- ACE Compat - RHSUSAF
- RHSUSAF
```
**Description:**
- The MICH 2000 (Semi-Arid/Norotos/ARC), (rhsusf_mich_bare_norotos_arc_semi), has unwarranted hearing protection, even though the model does not include any sort of Ear-pro.
Similarly, the FAST Ballistic (Multicam Cover/Headset/NSW) (rhssusf_opscore_mc_cover_pelt_nsw),FAST Ballistic (Urban Tan/Headset/NSW) (rhssusf_opscore_ut_pelt_nsw), and IHADSS (rhsusf_ihadss) have no hearing protection.
**Steps to reproduce:**
1. Start with selected mods: ACE, ACEX, RHSUSAF, ACE Compat - RHSUSAF
2. equip one of the above mentioned helmets
**Where did the issue occur?**
- All known places
**RPT log file:**
converted to txt file for upload
[Arma3_x64_2019-03-22_18-54-10.txt](https://github.com/acemod/ACE3/files/2995650/Arma3_x64_2019-03-22_18-54-10.txt)
| True | RHS helmet incorrect hearing protection values - **Arma 3 Version:** `1.90` (**stable** )
**CBA Version:** `3.10.1.190316` (stable)
**ACE3 Version:** ` 3.12.6` (stable)
**ACEX Version** `3.4.2` (stable)
**Mods:**
```
- CBA_A3
- ACE
- ACEX
- ACE Compat - RHSUSAF
- RHSUSAF
```
**Description:**
- The MICH 2000 (Semi-Arid/Norotos/ARC), (rhsusf_mich_bare_norotos_arc_semi), has unwarranted hearing protection, even though the model does not include any sort of Ear-pro.
Similarly, the FAST Ballistic (Multicam Cover/Headset/NSW) (rhssusf_opscore_mc_cover_pelt_nsw),FAST Ballistic (Urban Tan/Headset/NSW) (rhssusf_opscore_ut_pelt_nsw), and IHADSS (rhsusf_ihadss) have no hearing protection.
**Steps to reproduce:**
1. Start with selected mods: ACE, ACEX, RHSUSAF, ACE Compat - RHSUSAF
2. equip one of the above mentioned helmets
**Where did the issue occur?**
- All known places
**RPT log file:**
converted to txt file for upload
[Arma3_x64_2019-03-22_18-54-10.txt](https://github.com/acemod/ACE3/files/2995650/Arma3_x64_2019-03-22_18-54-10.txt)
| comp | rhs helmet incorrect hearing protection values arma version stable cba version stable version stable acex version stable mods cba ace acex ace compat rhsusaf rhsusaf description the mich semi arid norotos arc rhsusf mich bare norotos arc semi has unwarranted hearing protection even though the model does not include any sort of ear pro similarly the fast ballistic multicam cover headset nsw rhssusf opscore mc cover pelt nsw fast ballistic urban tan headset nsw rhssusf opscore ut pelt nsw and ihadss rhsusf ihadss have no hearing protection steps to reproduce start with selected mods ace acex rhsusaf ace compat rhsusaf equip one of the above mentioned helmets where did the issue occur all known places rpt log file converted to txt file for upload | 1 |
9,430 | 11,489,136,065 | IssuesEvent | 2020-02-11 15:02:34 | jupyter-widgets/ipywidgets | https://api.github.com/repos/jupyter-widgets/ipywidgets | closed | display_view and display_model used inconsistently | backwards-incompatible | We discovered in https://github.com/jupyter-widgets/ipywidgets/pull/2657#issuecomment-571852389 that display_view is used inconsistently throughout the codebase. In discussion at the widgets sprint in Paris, we decided to explore getting rid of display_view (and display_model?), and instead just having create_view. | True | display_view and display_model used inconsistently - We discovered in https://github.com/jupyter-widgets/ipywidgets/pull/2657#issuecomment-571852389 that display_view is used inconsistently throughout the codebase. In discussion at the widgets sprint in Paris, we decided to explore getting rid of display_view (and display_model?), and instead just having create_view. | comp | display view and display model used inconsistently we discovered in that display view is used inconsistently throughout the codebase in discussion at the widgets sprint in paris we decided to explore getting rid of display view and display model and instead just having create view | 1 |
78,034 | 22,093,209,074 | IssuesEvent | 2022-06-01 07:55:22 | PaddlePaddle/Paddle | https://api.github.com/repos/PaddlePaddle/Paddle | opened | Running nn.Conv2D(3, 16, 3) on a 3090 fails | status/new-issue type/build | ### Issue Description
Code:
import paddle.nn as nn
nn.Conv2D(3, 16, 3)
log:
W0601 15:46:55.629635 21509 device_context.cc:447] Please NOTE: device: 0, GPU Compute Capability: 8.6, Driver API Version: 11.2, Runtime API Version: 11.2
W0601 15:46:55.633584 21509 device_context.cc:465] device: 0, cuDNN Version: 7.6.
Traceback (most recent call last):
File "111.py", line 3, in <module>
nn.Conv2D(3, 16, 3)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/nn/layer/conv.py", line 656, in __init__
data_format=data_format)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/nn/layer/conv.py", line 135, in __init__
default_initializer=_get_default_param_initializer())
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/dygraph/layers.py", line 422, in create_parameter
default_initializer)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/layer_helper_base.py", line 378, in create_parameter
**attr._to_kwargs(with_initializer=True))
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/framework.py", line 3137, in create_parameter
initializer(param, self)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/initializer.py", line 362, in __call__
stop_gradient=True)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/framework.py", line 3167, in append_op
kwargs.get("stop_gradient", False))
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/dygraph/tracer.py", line 45, in trace_op
not stop_gradient)
OSError: (External) CUBLAS error(7).
[Hint: 'CUBLAS_STATUS_INVALID_VALUE'. An unsupported value or parameter was passed to the function (a negative vector size, for example). To correct: ensure that all the parameters being passed have valid values. ] (at /paddle/paddle/fluid/platform/cuda_helper.h:107)
### Version & Environment Information
Paddle version: 2.2.2
Paddle With CUDA: True
OS: Ubuntu 16.04
Python version: 3.7.1
CUDA version: 11.2.67
Build cuda_11.2.r11.2/compiler.29373293_0
cuDNN version: 7.6.5
Nvidia driver version: 460.84
The GPU is a 3090; I also tried CUDA 11.1 and Python 3.8, and the same error occurs in every case. | 1.0 | Running nn.Conv2D(3, 16, 3) on a 3090 fails - ### Issue Description
Code:
import paddle.nn as nn
nn.Conv2D(3, 16, 3)
log:
W0601 15:46:55.629635 21509 device_context.cc:447] Please NOTE: device: 0, GPU Compute Capability: 8.6, Driver API Version: 11.2, Runtime API Version: 11.2
W0601 15:46:55.633584 21509 device_context.cc:465] device: 0, cuDNN Version: 7.6.
Traceback (most recent call last):
File "111.py", line 3, in <module>
nn.Conv2D(3, 16, 3)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/nn/layer/conv.py", line 656, in __init__
data_format=data_format)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/nn/layer/conv.py", line 135, in __init__
default_initializer=_get_default_param_initializer())
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/dygraph/layers.py", line 422, in create_parameter
default_initializer)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/layer_helper_base.py", line 378, in create_parameter
**attr._to_kwargs(with_initializer=True))
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/framework.py", line 3137, in create_parameter
initializer(param, self)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/initializer.py", line 362, in __call__
stop_gradient=True)
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/framework.py", line 3167, in append_op
kwargs.get("stop_gradient", False))
File "/opt/python3.7/lib/python3.7/site-packages/paddle/fluid/dygraph/tracer.py", line 45, in trace_op
not stop_gradient)
OSError: (External) CUBLAS error(7).
[Hint: 'CUBLAS_STATUS_INVALID_VALUE'. An unsupported value or parameter was passed to the function (a negative vector size, for example). To correct: ensure that all the parameters being passed have valid values. ] (at /paddle/paddle/fluid/platform/cuda_helper.h:107)
### Version & Environment Information
Paddle version: 2.2.2
Paddle With CUDA: True
OS: Ubuntu 16.04
Python version: 3.7.1
CUDA version: 11.2.67
Build cuda_11.2.r11.2/compiler.29373293_0
cuDNN version: 7.6.5
Nvidia driver version: 460.84
显卡使用的3090,CUDA也曾实验过CUDA11.1,python也用过python3.8,都会报错 | non_comp | 报错 问题描述 issue description 代码 import paddle nn as nn nn log device context cc please note device gpu compute capability driver api version runtime api version device context cc device cudnn version traceback most recent call last file py line in nn file opt lib site packages paddle nn layer conv py line in init data format data format file opt lib site packages paddle nn layer conv py line in init default initializer get default param initializer file opt lib site packages paddle fluid dygraph layers py line in create parameter default initializer file opt lib site packages paddle fluid layer helper base py line in create parameter attr to kwargs with initializer true file opt lib site packages paddle fluid framework py line in create parameter initializer param self file opt lib site packages paddle fluid initializer py line in call stop gradient true file opt lib site packages paddle fluid framework py line in append op kwargs get stop gradient false file opt lib site packages paddle fluid dygraph tracer py line in trace op not stop gradient oserror external cublas error at paddle paddle fluid platform cuda helper h 版本 环境信息 version environment information paddle version paddle with cuda true os ubuntu python version cuda version build cuda compiler cudnn version nvidia driver version , , ,都会报错 | 0 |
18,322 | 25,339,241,650 | IssuesEvent | 2022-11-18 19:48:08 | Leaflet/Leaflet | https://api.github.com/repos/Leaflet/Leaflet | opened | IE failing on `v1` branch | bug compatibility | IE tests on CI are failing on `v1` branch, including the v1.9.3 tag. [The logs](https://github.com/Leaflet/Leaflet/actions/runs/3499076639/jobs/5860259993) say there's a syntax error encountered, which means that some ES6 code likely slipped into `v1` through cherry picks. Let's address this with the following steps:
- Determine whether the failures are only in the test code, or in v1.9.3 distribution build as well.
- Fix any errors.
- If it's in v1.9.3, we'll have to do a v1.9.4 to make sure `v1` doesn't break compatibility with IE.
- Stricter ESLint rules on `v1` to make sure similar issues don't slip in in the future.
@Falke-Design can you take a look? | True | IE failing on `v1` branch - IE tests on CI are failing on `v1` branch, including the v1.9.3 tag. [The logs](https://github.com/Leaflet/Leaflet/actions/runs/3499076639/jobs/5860259993) say there's a syntax error encountered, which means that some ES6 code likely slipped into `v1` through cherry picks. Let's address this with the following steps:
- Determine whether the failures are only in the test code, or in v1.9.3 distribution build as well.
- Fix any errors.
- If it's in v1.9.3, we'll have to do a v1.9.4 to make sure `v1` doesn't break compatibility with IE.
- Stricter ESLint rules on `v1` to make sure similar issues don't slip in in the future.
@Falke-Design can you take a look? | comp | ie failing on branch ie tests on ci are failing on branch including the tag say there s a syntax error encountered which means that some code likely slipped into through cherry picks let s address this with the following steps determine whether the failures are only in the test code or in distribution build as well fix any errors if it s in we ll have to do a to make sure doesn t break compatibility with ie harder eslint rules on to make sure similar issues don t slip in in the future falke design can you take a look | 1 |
5,048 | 7,642,615,633 | IssuesEvent | 2018-05-08 09:49:37 | datafolklabs/cement | https://api.github.com/repos/datafolklabs/cement | closed | Deprecate App.Meta.arguments_override_config | incompatible portland | This meta option is from an older design where it made more sense. Personally I've not used it in years and don't imagine many do. In Cement's current design, this type of option has too many unknown consequences where an option in a nested controller could override a completely unrelated config setting just because it has the same key name.
Dropping in Cement 3. | True | Deprecate App.Meta.arguments_override_config - This meta option is from an older design where it made more sense. Personally I've not used it in years and don't imagine many do. In Cement's current design, this type of option has too many unknown consequences where an option in a nested controller could override a completely unrelated config setting just because it has the same key name.
Dropping in Cement 3. | comp | deprecate app meta arguments override config this meta option is from an older design where it made more sense personally i ve not used it in years and don t imagine many do in cement s current design this type of option has too many unknown consequences where an option in a nested controller could override a completely unrelated config setting just because it has the same key name dropping in cement | 1 |
542,022 | 15,837,521,452 | IssuesEvent | 2021-04-06 20:54:45 | radical-cybertools/radical.pilot | https://api.github.com/repos/radical-cybertools/radical.pilot | closed | Pilot simple run failing with new update | priority:medium topic:resource type:question | ```
python 00_getting_started.py
================================================================================
Getting Started (RP version 1.4.1)
================================================================================
new session: [rp.session.diaxosvidstaff2.rad.rutgers.edu.abdullahghani.018428.0002]
\
database : [mongodb://abdullahg:*********@129.*******:27017/rp_db]ok
create pilot manager ok
create unit manager ok
--------------------------------------------------------------------------------
submit pilots
submit 1 pilot(s)
pilot.0000 rutgers.amarel 2 cores 1 gpus ok
--------------------------------------------------------------------------------
submit 1024 units
create: ########################################################################
submit: ########################################################################
wait : ########################################################################
FAILED : 1024
ok
--------------------------------------------------------------------------------
finalize
closing session rp.session.diaxosvidstaff2.rad.rutgers.edu.abdullahghani.018428.0002
\
close unit manager ok
close pilot manager \
wait for 1 pilot(s)
0 ok
ok
+ rp.session.diaxosvidstaff2.rad.rutgers.edu.abdullahghani.018428.0002 (json)
- pilot.0000 (profiles)
- pilot.0000 (logfiles)
session lifetime: 104.9s ok
--------------------------------------------------------------------------------
```
resource configuration
```
pd_init = {'resource' : 'rutgers.amarel',
'runtime' : 30, # pilot runtime (min)
'exit_on_error' : True,
'access_schema' : 'ssh',
'cores' : 2,
'gpus' : 1,
}
pdesc = rp.ComputePilotDescription(pd_init)
```
```
$ radical-stack
python : 3.7.4
pythonpath :
virtualenv : /Users/abdullahghani/myenv
radical.entk : 1.4.0
radical.pilot : 1.4.1
radical.saga : 1.4.0
radical.utils : 1.4.0
```
[radical.log](https://github.com/radical-cybertools/radical.pilot/files/4780810/radical.log)
| 1.0 | Pilot simple run failing with new update - ```
python 00_getting_started.py
================================================================================
Getting Started (RP version 1.4.1)
================================================================================
new session: [rp.session.diaxosvidstaff2.rad.rutgers.edu.abdullahghani.018428.0002]
\
database : [mongodb://abdullahg:*********@129.*******:27017/rp_db]ok
create pilot manager ok
create unit manager ok
--------------------------------------------------------------------------------
submit pilots
submit 1 pilot(s)
pilot.0000 rutgers.amarel 2 cores 1 gpus ok
--------------------------------------------------------------------------------
submit 1024 units
create: ########################################################################
submit: ########################################################################
wait : ########################################################################
FAILED : 1024
ok
--------------------------------------------------------------------------------
finalize
closing session rp.session.diaxosvidstaff2.rad.rutgers.edu.abdullahghani.018428.0002
\
close unit manager ok
close pilot manager \
wait for 1 pilot(s)
0 ok
ok
+ rp.session.diaxosvidstaff2.rad.rutgers.edu.abdullahghani.018428.0002 (json)
- pilot.0000 (profiles)
- pilot.0000 (logfiles)
session lifetime: 104.9s ok
--------------------------------------------------------------------------------
```
resource configuration
```
pd_init = {'resource' : 'rutgers.amarel',
'runtime' : 30, # pilot runtime (min)
'exit_on_error' : True,
'access_schema' : 'ssh',
'cores' : 2,
'gpus' : 1,
}
pdesc = rp.ComputePilotDescription(pd_init)
```
```
$ radical-stack
python : 3.7.4
pythonpath :
virtualenv : /Users/abdullahghani/myenv
radical.entk : 1.4.0
radical.pilot : 1.4.1
radical.saga : 1.4.0
radical.utils : 1.4.0
```
[radical.log](https://github.com/radical-cybertools/radical.pilot/files/4780810/radical.log)
| non_comp | pilot simple run failing with new update python getting started py getting started rp version new session database ok create pilot manager ok create unit manager ok submit pilots submit pilot s pilot rutgers amarel cores gpus ok submit units create submit wait failed ok finalize closing session rp session rad rutgers edu abdullahghani close unit manager ok close pilot manager wait for pilot s ok ok rp session rad rutgers edu abdullahghani json pilot profiles pilot logfiles session lifetime ok resource configuration pd init resource rutgers amarel runtime pilot runtime min exit on error true access schema ssh cores gpus pdesc rp computepilotdescription pd init radical stack python pythonpath virtualenv users abdullahghani myenv radical entk radical pilot radical saga radical utils | 0 |
11,040 | 13,067,073,721 | IssuesEvent | 2020-07-30 23:19:27 | google/model-viewer | https://api.github.com/repos/google/model-viewer | closed | Fidelity golden screenshot update automation is not running | arc: compatibility arc: rendering & effects type: bug | Our GCP-based fidelity golden screenshot update automation is currently out of commission. Unfortunately, we haven't automated spinning up this infrastructure yet, so others won't be able to diagnose the issues for themselves without a lot of manual work.
Until this problem is addressed, we will have to manually update the fidelity test goldens using the appropriate `npm` scripts. | True | Fidelity golden screenshot update automation is not running - Our GCP-based fidelity golden screenshot update automation is currently out of commission. Unfortunately, we haven't automated spinning up this infrastructure yet, so others won't be able to diagnose the issues for themselves without a lot of manual work.
Until this problem is addressed, we will have to manually update the fidelity test goldens using the appropriate `npm` scripts. | comp | fidelity golden screenshot update automation is not running our gcp based fidelity golden screenshot update automation is currently out of commission unfortunately we haven t automated spinning up this infrastructure yet so others won t be able to diagnose the issues for themselves without a lot of manual work until this problem is addressed we will have to manually update the fidelity test goldens using the appropriate npm scripts | 1 |
33,700 | 16,081,709,227 | IssuesEvent | 2021-04-26 06:03:49 | jgm/pandoc | https://api.github.com/repos/jgm/pandoc | closed | Reduce memory usage | performance | Pandoc is a memory hog.
https://groups.google.com/d/msg/pandoc-discuss/l6Xo0xk8NAQ/1KCKPyc2BgAJ
Do some profiling to figure out why and fix this.
| True | Reduce memory usage - Pandoc is a memory hog.
https://groups.google.com/d/msg/pandoc-discuss/l6Xo0xk8NAQ/1KCKPyc2BgAJ
Do some profiling to figure out why and fix this.
| non_comp | reduce memory usage pandoc is a memory hog do some profiling to figure out why and fix this | 0 |
16,577 | 22,622,634,908 | IssuesEvent | 2022-06-30 07:56:15 | ibm-s390-cloud/ocp-kvm-ipi-automation | https://api.github.com/repos/ibm-s390-cloud/ocp-kvm-ipi-automation | closed | Bug: unable to install OCP pre-release (aka nightly) builds | bug compatibility | I am unable to install pre-release (aka nightly) builds of OCP. The OpenShift installer fails with this error message:
```
fatal: [myhost]: FAILED! => changed=true
cmd:
- /usr/local/bin/openshift-install
- create
- manifests
- --dir=/root/ocp4-workdir
delta: ‘0:00:00.054399’
end: ‘2022-06-30 09:12:53.846745’
msg: non-zero return code
rc: 3
start: ‘2022-06-30 09:12:53.792346’
stderr: ‘level=error msg=failed to fetch Master Machines: failed to load asset “Install Config”: failed to create install config: invalid “install-config.yaml” file: platform: Invalid value: “libvirt”: must specify one of the platforms (alibabacloud, aws, azure, baremetal, gcp, ibmcloud, none, nutanix, openstack, ovirt, powervs, vsphere)’
stderr_lines: <omitted>
stdout: ‘’
stdout_lines: <omitted>
```
Ideally the playbooks support the installation of official OCP releases as well as pre-release builds. | True | Bug: unable to install OCP pre-release (aka nightly) builds - I am unable to install pre-release (aka nightly) builds of OCP. The OpenShift installer fails with this error message:
```
fatal: [myhost]: FAILED! => changed=true
cmd:
- /usr/local/bin/openshift-install
- create
- manifests
- --dir=/root/ocp4-workdir
delta: ‘0:00:00.054399’
end: ‘2022-06-30 09:12:53.846745’
msg: non-zero return code
rc: 3
start: ‘2022-06-30 09:12:53.792346’
stderr: ‘level=error msg=failed to fetch Master Machines: failed to load asset “Install Config”: failed to create install config: invalid “install-config.yaml” file: platform: Invalid value: “libvirt”: must specify one of the platforms (alibabacloud, aws, azure, baremetal, gcp, ibmcloud, none, nutanix, openstack, ovirt, powervs, vsphere)’
stderr_lines: <omitted>
stdout: ‘’
stdout_lines: <omitted>
```
Ideally the playbooks support the installation of official OCP releases as well as pre-release builds. | comp | bug unable to install ocp pre release aka nightly builds i am unable to install pre release aka nightly builds of ocp the openshift installer fails with this error message fatal failed changed true cmd usr local bin openshift install create manifests dir root workdir delta ‘ ’ end ‘ ’ msg non zero return code rc start ‘ ’ stderr ‘level error msg failed to fetch master machines failed to load asset “install config” failed to create install config invalid “install config yaml” file platform invalid value “libvirt” must specify one of the platforms alibabacloud aws azure baremetal gcp ibmcloud none nutanix openstack ovirt powervs vsphere ’ stderr lines stdout ‘’ stdout lines ideally the playbooks support the installation of official ocp releases as well as pre release builds | 1 |
557,167 | 16,502,861,899 | IssuesEvent | 2021-05-25 15:55:38 | CLOSER-Cohorts/archivist | https://api.github.com/repos/CLOSER-Cohorts/archivist | closed | Can there be a close x to leave the add construct side panel | High priority feature react | Can there be a close x to leave the add construct window (the one after you select the construct type). If you change your mind about entering a construct, you have to click back until you reach the build again, or you have to click build.
| 1.0 | Can there be a close x to leave the add construct side panel - Can there be a close x to leave the add construct window (the one after you select the construct type). If you change your mind about entering a construct, you have to click back until you reach the build again, or you have to click build.
| non_comp | can there be an close x to leave the add construct side panel can there be an close x to leave the add construct window the one after you select the construct type if you change your mind about entering a construct you have to click back until you reach the build again or you have to click build | 0 |
15,681 | 20,242,507,451 | IssuesEvent | 2022-02-14 10:35:35 | arcticicestudio/nord-visual-studio-code | https://api.github.com/repos/arcticicestudio/nord-visual-studio-code | closed | publish to open-vsx.org | context-docs context-extension type-task scope-compatibility scope-dx | ## Description
I use [VSCodium](https://github.com/VSCodium/vscodium) instead of VSCode. They [recently ditched](https://github.com/VSCodium/vscodium/pull/404) the [Visual Studio Marketplace](https://marketplace.visualstudio.com/vscode) in favour of [Open VSX](https://open-vsx.org). I'd love if you published the theme on this marketplace as well.
### Benefits
Open VSX is vendor neutral and used by editors like [VSCodium](https://github.com/VSCodium/vscodium) or [Eclipse Theia](https://theia-ide.org/). By publishing to this marketplace you would reach more users that use different editors.
### Alternative Solutions
In the meantime you can of course manually install the theme via the vsix file. But this way the theme is not as discoverable and easy to install/update on Non-Microsoft editors.
## References
* [VSCodium](https://github.com/VSCodium/vscodium)
* [Open VSX](https://open-vsx.org)
* [Open VSX Repo](https://github.com/eclipse/openvsx)
* [Open VSX Wiki](https://github.com/eclipse/openvsx/wiki)
* [Publishing Extensions on Open VSX](https://github.com/eclipse/openvsx/wiki/Publishing-Extensions)
| True | publish to open-vsx.org - ## Description
I use [VSCodium](https://github.com/VSCodium/vscodium) instead of VSCode. They [recently ditched](https://github.com/VSCodium/vscodium/pull/404) the [Visual Studio Marketplace](https://marketplace.visualstudio.com/vscode) in favour of [Open VSX](https://open-vsx.org). I'd love if you published the theme on this marketplace as well.
### Benefits
Open VSX is vendor neutral and used by editors like [VSCodium](https://github.com/VSCodium/vscodium) or [Eclipse Theia](https://theia-ide.org/). By publishing to this marketplace you would reach more users that use different editors.
### Alternative Solutions
In the meantime you can of course manually install the theme via the vsix file. But this way the theme is not as discoverable and easy to install/update on Non-Microsoft editors.
## References
* [VSCodium](https://github.com/VSCodium/vscodium)
* [Open VSX](https://open-vsx.org)
* [Open VSX Repo](https://github.com/eclipse/openvsx)
* [Open VSX Wiki](https://github.com/eclipse/openvsx/wiki)
* [Publishing Extensions on Open VSX](https://github.com/eclipse/openvsx/wiki/Publishing-Extensions)
| comp | publish to open vsx org description i use instead of vscode they the in favour of i d love if you published the theme on this marketplace as well benefits open vsx is vendor neutral and used by editors like or by publishing to this marketplace you would reach more users that use different editors alternative solutions in the meantime you can of course manually install the theme via the vsix file but this way the theme is not as discoverable and easy to install update on non microsoft editors references | 1 |
665,100 | 22,299,516,509 | IssuesEvent | 2022-06-13 07:21:19 | COS301-SE-2022/Twitter-Summariser | https://api.github.com/repos/COS301-SE-2022/Twitter-Summariser | closed | CICD: Cloudfront Enhancement | priority:high status:ready role:dev-op type:enhance scope:cicd | Invalidate the cache in the S3 bucket on cloudfront to have the latest frontend code whenever we deploy | 1.0 | CICD: Cloudfront Enhancement - Invalidate the cache in the S3 bucket on cloudfront to have the latest frontend code whenever we deploy | non_comp | cicd cloudfront enhancement invalidate the cache in the bucket on cloudfront to have the latest frontend code whenever we deploy | 0 |
19,222 | 26,716,918,214 | IssuesEvent | 2023-01-28 16:41:13 | toeverything/blocksuite | https://api.github.com/repos/toeverything/blocksuite | opened | Passing E2E test cases on Firefox and WebKit | dev environment compatibility | We have exposed a `BROWSER` shortcut for testing cross-browser compatibility issues locally in #877 (see https://github.com/toeverything/blocksuite/blob/master/BUILDING.md#testing for its usage).
But sadly, up till now, our Tier1 browser support (including the latest Firefox and WebKit) is still unfinished. See the [GitHub action log](https://github.com/toeverything/blocksuite/actions/workflows/browser-compatibility.yml) for the test cases that fail in these two environments.
So, for now, it's convenient enough to fix these cases. And we can use the `/tier1` PR comment (supported in #796) to test it.
| True | Passing E2E test cases on Firefox and WebKit - We have exposed a `BROWSER` shortcut for testing cross-browser compatibility issues locally in #877 (see https://github.com/toeverything/blocksuite/blob/master/BUILDING.md#testing for its usage).
But sadly, up till now, our Tier1 browser support (including the latest Firefox and WebKit) is still unfinished. See the [GitHub action log](https://github.com/toeverything/blocksuite/actions/workflows/browser-compatibility.yml) for the test cases that fail in these two environments.
So, for now, it's convenient enough to fix these cases. And we can use the `/tier1` PR comment (supported in #796) to test it.
| comp | passing test cases on firefox and webkit we have exposed a browser shortcut for testing cross browser compatibility issues locally in see for its usage but sadly up till now our browser support including the latest firefox and webkit still needs to be finished see the for the test cases that fail on these two environments so for now it s convenient enough to fix these cases and we can use the pr comment supported in to test it | 1 |
3,899 | 6,743,875,463 | IssuesEvent | 2017-10-20 13:48:44 | go-graphite/carbonapi | https://api.github.com/repos/go-graphite/carbonapi | closed | group() doesn't work properly | bug graphite-web compatibility | group(asdf, asdg) gives 2 series instead of 1.
also
```Jun 06 11:40:22 grafana carbonapi[18490]: 2017-06-06T11:40:22.183+0300 ERROR render panic during eval: {"carbonapi_uuid": "7102534b-166b-4002-9ba7-81c01ea0ed84", "username": "", "cache_key": "format=json&from=-12h&maxDataPoints=100&target=averageSeries%28group%28summarize%28netdata.asdf.system.load.load1%2C+%2715s%27%2C+%27last%27%2C+false%29%2C+summarize%28munin.asdf.load%2C+%271min%27%2C+%27last%27%2C+false%29%29%29&until=now", "stack": "github.com/go-graphite/carbonapi/vendor/go.uber.org/zap.Stack\n\t/root/go/src/github.com/go-graphite/carbonapi/vendor/go.uber.org/zap/field.go:209\nmain.renderHandler.func2.1\n\t/root/go/src/github.com/go-graphite/carbonapi/main.go:544\nruntime.call32\n\t/usr/local/go/src/runtime/asm_amd64.s:514\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:489\nruntime.panicindex\n\t/usr/local/go/src/runtime/panic.go:28\ngithub.com/go-graphite/carbonapi/expr.aggregateSeries\n\t/root/go/src/github.com/go-graphite/carbonapi/expr/expr.go:3724\ngithub.com/go-graphite/carbonapi/expr.EvalExpr\n\t/root/go/src/github.com/go-graphite/carbonapi/expr/expr.go:839\nmain.renderHandler.func2\n\t/root/go/src/github.com/go-graphite/carbonapi/main.go:548\nmain.renderHandler\n\t/root/go/src/github.com/go-graphite/carbonapi/main.go:555\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:1942\nnet/http.(*ServeMux).ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2238\ngithub.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers.CompressHandlerLevel.func1\n\t/root/go/src/github.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers/compress.go:143\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:1942\ngithub.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers.(*cors).ServeHTTP\n\t/root/go/src/github.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers/cors.go:51\ngithub.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers.ProxyHeaders.func1\n\t/root/go/src/github.com/go-graph
ite/carbonapi/vendor/github.com/gorilla/handlers/proxy_headers.go:59\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:1942\nnet/http.serverHandler.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2568\nnet/http.(*conn).serve\n\t/usr/local/go/src/net/http/server.go:1825"}``` | True | group() doesn't work properly - group(asdf, asdg) gives 2 series instead of 1.
also
```Jun 06 11:40:22 grafana carbonapi[18490]: 2017-06-06T11:40:22.183+0300 ERROR render panic during eval: {"carbonapi_uuid": "7102534b-166b-4002-9ba7-81c01ea0ed84", "username": "", "cache_key": "format=json&from=-12h&maxDataPoints=100&target=averageSeries%28group%28summarize%28netdata.asdf.system.load.load1%2C+%2715s%27%2C+%27last%27%2C+false%29%2C+summarize%28munin.asdf.load%2C+%271min%27%2C+%27last%27%2C+false%29%29%29&until=now", "stack": "github.com/go-graphite/carbonapi/vendor/go.uber.org/zap.Stack\n\t/root/go/src/github.com/go-graphite/carbonapi/vendor/go.uber.org/zap/field.go:209\nmain.renderHandler.func2.1\n\t/root/go/src/github.com/go-graphite/carbonapi/main.go:544\nruntime.call32\n\t/usr/local/go/src/runtime/asm_amd64.s:514\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:489\nruntime.panicindex\n\t/usr/local/go/src/runtime/panic.go:28\ngithub.com/go-graphite/carbonapi/expr.aggregateSeries\n\t/root/go/src/github.com/go-graphite/carbonapi/expr/expr.go:3724\ngithub.com/go-graphite/carbonapi/expr.EvalExpr\n\t/root/go/src/github.com/go-graphite/carbonapi/expr/expr.go:839\nmain.renderHandler.func2\n\t/root/go/src/github.com/go-graphite/carbonapi/main.go:548\nmain.renderHandler\n\t/root/go/src/github.com/go-graphite/carbonapi/main.go:555\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:1942\nnet/http.(*ServeMux).ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2238\ngithub.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers.CompressHandlerLevel.func1\n\t/root/go/src/github.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers/compress.go:143\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:1942\ngithub.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers.(*cors).ServeHTTP\n\t/root/go/src/github.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers/cors.go:51\ngithub.com/go-graphite/carbonapi/vendor/github.com/gorilla/handlers.ProxyHeaders.func1\n\t/root/go/src/github.com/go-graph
ite/carbonapi/vendor/github.com/gorilla/handlers/proxy_headers.go:59\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:1942\nnet/http.serverHandler.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2568\nnet/http.(*conn).serve\n\t/usr/local/go/src/net/http/server.go:1825"}``` | comp | group doesn t work properly group asdf asdg gives series instead of also jun grafana carbonapi error render panic during eval carbonapi uuid username cache key format json from maxdatapoints target averageseries asdf system load false summarize asdf load false until now stack github com go graphite carbonapi vendor go uber org zap stack n t root go src github com go graphite carbonapi vendor go uber org zap field go nmain renderhandler n t root go src github com go graphite carbonapi main go nruntime n t usr local go src runtime asm s nruntime gopanic n t usr local go src runtime panic go nruntime panicindex n t usr local go src runtime panic go ngithub com go graphite carbonapi expr aggregateseries n t root go src github com go graphite carbonapi expr expr go ngithub com go graphite carbonapi expr evalexpr n t root go src github com go graphite carbonapi expr expr go nmain renderhandler n t root go src github com go graphite carbonapi main go nmain renderhandler n t root go src github com go graphite carbonapi main go nnet http handlerfunc servehttp n t usr local go src net http server go nnet http servemux servehttp n t usr local go src net http server go ngithub com go graphite carbonapi vendor github com gorilla handlers compresshandlerlevel n t root go src github com go graphite carbonapi vendor github com gorilla handlers compress go nnet http handlerfunc servehttp n t usr local go src net http server go ngithub com go graphite carbonapi vendor github com gorilla handlers cors servehttp n t root go src github com go graphite carbonapi vendor github com gorilla handlers cors go ngithub com go graphite carbonapi vendor github com gorilla handlers proxyheaders 
n t root go src github com go graphite carbonapi vendor github com gorilla handlers proxy headers go nnet http handlerfunc servehttp n t usr local go src net http server go nnet http serverhandler servehttp n t usr local go src net http server go nnet http conn serve n t usr local go src net http server go | 1 |
8,823 | 10,775,014,979 | IssuesEvent | 2019-11-03 11:18:24 | widelands/widelands | https://api.github.com/repos/widelands/widelands | closed | Can't load Build 20 save: unknown ware type "thatch_reed" | bug saveloading & compatibility tribes | - Affected savegame: [215.wgf](https://www.widelands.org/forum/attachment/570041fdadfb373fde9ee7402e3230d679be8330/)
Trying to load this savegame (saved in Build 20) results in the following error:
```
Game data error
buildings: unknown ware type "thatch_reed"
```
Likely related to the following commits:
777771e9ed Renamed thatch reed to reed.
c7e8e09267 Fix savegame compatibility for reed, buildings, players view and economy requests.
- OS: Manjaro Linux
- Widelands Version: Current master (9f6391ed7e)
| True | Can't load Build 20 save: unknown ware type "thatch_reed" - - Affected savegame: [215.wgf](https://www.widelands.org/forum/attachment/570041fdadfb373fde9ee7402e3230d679be8330/)
Trying to load this savegame (saved in Build 20) results in the following error:
```
Game data error
buildings: unknown ware type "thatch_reed"
```
Likely related to the following commits:
777771e9ed Renamed thatch reed to reed.
c7e8e09267 Fix savegame compatibility for reed, buildings, players view and economy requests.
- OS: Manjaro Linux
- Widelands Version: Current master (9f6391ed7e)
| comp | can t load build save unknown ware type thatch reed affected savegame trying to load this savegame saved in build results in the following error game data error buildings unknown ware type thatch reed likely related to the following commits renamed thatch reed to reed fix savegame compatibility for reed buildings players view and economy requests os manjaro linux widelands version current master | 1 |
1,881 | 4,534,959,594 | IssuesEvent | 2016-09-08 15:55:40 | Yoast/wordpress-seo | https://api.github.com/repos/Yoast/wordpress-seo | closed | Content Metabox not showing on Grifus with WP 4.5.3, Yoast SEO 3.4.2 | compatibility javascript metabox wait for feedback | Hi, the content metabox isn't showing on Grifus with WP 4.5.3 and Yoast SEO 3.4.2.


* WordPress version: 4.5.3
* Yoast SEO version: 3.4.2
| True | Content Metabox not showing on Grifus with WP 4.5.3, Yoast SEO 3.4.2 - Hi, the content metabox isn't showing on Grifus with WP 4.5.3 and Yoast SEO 3.4.2.


* WordPress version: 4.5.3
* Yoast SEO version: 3.4.2
| comp | content metabox not showing on grifus with wp yoast seo hi the content metabox isn t showing on grifus with wp and yoast seo wordpress version yoast seo version | 1 |
73,758 | 14,116,748,565 | IssuesEvent | 2020-11-08 05:12:58 | EngTW/English-for-Programmers | https://api.github.com/repos/EngTW/English-for-Programmers | closed | 1588. Sum of All Odd Length Subarrays | LeetCode | ERROR: type should be string, got "https://leetcode.com/problems/sum-of-all-odd-length-subarrays/\r\n\r\n```C#\r\nusing System.Linq;\r\n\r\npublic class Solution\r\n{\r\n public int SumOddLengthSubarrays(int[] arr)\r\n {\r\n // 「輸入的數字」(複數)\r\n var inputNumbers = arr;\r\n\r\n // This is how we will solve this in O(n) time.\r\n //\r\n // Say we have inputNumbers like this:\r\n //\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n //\r\n // ---\r\n //\r\n // These are all of the 1-length subarrays:\r\n //\r\n // {n₁}\r\n // {n₂}\r\n // {n₃}\r\n // {n₄}\r\n // {n₅}\r\n // {n₆}\r\n // ...\r\n // {n₋₆}\r\n // {n₋₅}\r\n // {n₋₄}\r\n // {n₋₃}\r\n // {n₋₂}\r\n // {n₋₁}\r\n //\r\n // They add up like this:\r\n //\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n //\r\n // ---\r\n //\r\n // These are all of the 3-length subarrays:\r\n //\r\n // {n₁, n₂, n₃}\r\n // {n₂, n₃, n₄}\r\n // {n₃, n₄, n₅}\r\n // {n₄, n₅, n₆}\r\n // {n₅, n₆, ...\r\n // {n₆, ...\r\n // ...\r\n // ... , n₋₆}\r\n // ... , n₋₆, n₋₅}\r\n // {n₋₆, n₋₅, n₋₄}\r\n // {n₋₅, n₋₄, n₋₃}\r\n // {n₋₄, n₋₃, n₋₂}\r\n // {n₋₃, n₋₂, n₋₁}\r\n //\r\n // They add up like this:\r\n //\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // { n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, }\r\n // { n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, }\r\n //\r\n // It is equivlent to:\r\n //\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // -n₁ -n₋₁\r\n // -n₁ -n₂ -n₋₂ -n₋₁\r\n //\r\n // ---\r\n //\r\n // We can list all 5-length subarrays and add them up. 
It will\r\n // turn out like this:\r\n //\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // -n₁ -n₋₁\r\n // -n₁ -n₂ -n₋₂ -n₋₁\r\n // -n₁ -n₂ -n₃ -n₋₃ -n₋₂ -n₋₁\r\n // -n₁ -n₂ -n₃ -n₄ -n₋₄ -n₋₃ -n₋₂ -n₋₁\r\n //\r\n // ---\r\n //\r\n // This is the case for 7-length subarrays.\r\n //\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}\r\n // -n₁ -n₋₁\r\n // -n₁ -n₂ -n₋₂ -n₋₁\r\n // -n₁ -n₂ -n₃ -n₋₃ -n₋₂ -n₋₁\r\n // -n₁ -n₂ -n₃ -n₄ -n₋₄ -n₋₃ -n₋₂ -n₋₁\r\n // -n₁ -n₂ -n₃ -n₄ -n₅ -n₋₅ -n₋₄ -n₋₃ -n₋₂ -n₋₁\r\n // -n₁ -n₂ -n₃ -n₄ -n₅ -n₆ -n₋₆ -n₋₅ -n₋₄ -n₋₃ -n₋₂ -n₋₁\r\n //\r\n // ---\r\n //\r\n // This pattern will continue to hold for all (2m+1)-length\r\n // subarrays, and we will take advantage of it.\r\n\r\n // 「最大的 Subarray 長度」\r\n var maxSubarrayLength = inputNumbers.Length % 2 == 0 ? 
inputNumbers.Length - 1 : inputNumbers.Length;\r\n\r\n // 「輸出值」\r\n var output = inputNumbers.Sum() * (1 + maxSubarrayLength) * ((maxSubarrayLength - 1) / 2 + 1) / 2;\r\n\r\n // 「『多餘的項數』的數量變數1」\r\n var excessiveTermCount1 = 1;\r\n\r\n // 「『多餘的項數』的數量變數2」\r\n var excessiveTermCount2 = 2;\r\n\r\n // 「『多餘的項數』的數量的差1」\r\n var excessiveTermCountDifference1 = 1;\r\n\r\n // 「『多餘的項數』的數量的差2」\r\n var excessiveTermCountDifference2 = 2;\r\n\r\n for (int i = maxSubarrayLength - 2; i > 0; i -= 2)\r\n {\r\n output -= inputNumbers[i - 1] * excessiveTermCount2;\r\n output -= inputNumbers[i] * excessiveTermCount1;\r\n output -= inputNumbers[inputNumbers.Length - i - 1] * excessiveTermCount1;\r\n output -= inputNumbers[inputNumbers.Length - i] * excessiveTermCount2;\r\n\r\n excessiveTermCountDifference1 += 2;\r\n excessiveTermCountDifference2 += 2;\r\n excessiveTermCount1 += excessiveTermCountDifference1;\r\n excessiveTermCount2 += excessiveTermCountDifference2;\r\n }\r\n\r\n return output;\r\n }\r\n}\r\n```\r\n\r\n# 參考資料\r\n\r\n## 等差數列\r\n\r\nhttps://en.wikipedia.org/wiki/Arithmetic_progression\r\n\r\n* 項數: term\r\n* 差: difference\r\n\r\n---\r\n請參考「刷 LeetCode 練習命名」 https://github.com/EngTW/English-for-Programmers/issues/69 😊" | 1.0 | 1588. Sum of All Odd Length Subarrays - https://leetcode.com/problems/sum-of-all-odd-length-subarrays/
```C#
using System.Linq;
public class Solution
{
public int SumOddLengthSubarrays(int[] arr)
{
// 「輸入的數字」(複數)
var inputNumbers = arr;
// This is how we will solve this in O(n) time.
//
// Say we have inputNumbers like this:
//
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
//
// ---
//
// These are all of the 1-length subarrays:
//
// {n₁}
// {n₂}
// {n₃}
// {n₄}
// {n₅}
// {n₆}
// ...
// {n₋₆}
// {n₋₅}
// {n₋₄}
// {n₋₃}
// {n₋₂}
// {n₋₁}
//
// They add up like this:
//
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
//
// ---
//
// These are all of the 3-length subarrays:
//
// {n₁, n₂, n₃}
// {n₂, n₃, n₄}
// {n₃, n₄, n₅}
// {n₄, n₅, n₆}
// {n₅, n₆, ...
// {n₆, ...
// ...
// ... , n₋₆}
// ... , n₋₆, n₋₅}
// {n₋₆, n₋₅, n₋₄}
// {n₋₅, n₋₄, n₋₃}
// {n₋₄, n₋₃, n₋₂}
// {n₋₃, n₋₂, n₋₁}
//
// They add up like this:
//
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// { n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, }
// { n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, }
//
    // It is equivalent to:
//
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// -n₁ -n₋₁
// -n₁ -n₂ -n₋₂ -n₋₁
//
// ---
//
// We can list all 5-length subarrays and add them up. It will
// turn out like this:
//
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// -n₁ -n₋₁
// -n₁ -n₂ -n₋₂ -n₋₁
// -n₁ -n₂ -n₃ -n₋₃ -n₋₂ -n₋₁
// -n₁ -n₂ -n₃ -n₄ -n₋₄ -n₋₃ -n₋₂ -n₋₁
//
// ---
//
// This is the case for 7-length subarrays.
//
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// {n₁, n₂, n₃, n₄, n₅, n₆, ... , n₋₆, n₋₅, n₋₄, n₋₃, n₋₂, n₋₁}
// -n₁ -n₋₁
// -n₁ -n₂ -n₋₂ -n₋₁
// -n₁ -n₂ -n₃ -n₋₃ -n₋₂ -n₋₁
// -n₁ -n₂ -n₃ -n₄ -n₋₄ -n₋₃ -n₋₂ -n₋₁
// -n₁ -n₂ -n₃ -n₄ -n₅ -n₋₅ -n₋₄ -n₋₃ -n₋₂ -n₋₁
// -n₁ -n₂ -n₃ -n₄ -n₅ -n₆ -n₋₆ -n₋₅ -n₋₄ -n₋₃ -n₋₂ -n₋₁
//
// ---
//
// This pattern will continue to hold for all (2m+1)-length
// subarrays, and we will take advantage of it.
// 「最大的 Subarray 長度」
var maxSubarrayLength = inputNumbers.Length % 2 == 0 ? inputNumbers.Length - 1 : inputNumbers.Length;
// 「輸出值」
var output = inputNumbers.Sum() * (1 + maxSubarrayLength) * ((maxSubarrayLength - 1) / 2 + 1) / 2;
// 「『多餘的項數』的數量變數1」
var excessiveTermCount1 = 1;
// 「『多餘的項數』的數量變數2」
var excessiveTermCount2 = 2;
// 「『多餘的項數』的數量的差1」
var excessiveTermCountDifference1 = 1;
// 「『多餘的項數』的數量的差2」
var excessiveTermCountDifference2 = 2;
for (int i = maxSubarrayLength - 2; i > 0; i -= 2)
{
output -= inputNumbers[i - 1] * excessiveTermCount2;
output -= inputNumbers[i] * excessiveTermCount1;
output -= inputNumbers[inputNumbers.Length - i - 1] * excessiveTermCount1;
output -= inputNumbers[inputNumbers.Length - i] * excessiveTermCount2;
excessiveTermCountDifference1 += 2;
excessiveTermCountDifference2 += 2;
excessiveTermCount1 += excessiveTermCountDifference1;
excessiveTermCount2 += excessiveTermCountDifference2;
}
return output;
}
}
```
# 參考資料
## 等差數列
https://en.wikipedia.org/wiki/Arithmetic_progression
* 項數: term
* 差: difference
---
請參考「刷 LeetCode 練習命名」 https://github.com/EngTW/English-for-Programmers/issues/69 😊 | non_comp | sum of all odd length subarrays c using system linq public class solution public int sumoddlengthsubarrays int arr 「輸入的數字」(複數) var inputnumbers arr this is how we will solve this in o n time say we have inputnumbers like this n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ these are all of the length subarrays n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ they add up like this n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ these are all of the length subarrays n₁ n₂ n₃ n₂ n₃ n₄ n₃ n₄ n₅ n₄ n₅ n₆ n₅ n₆ n₆ n₋₆ n₋₆ n₋₅ n₋₆ n₋₅ n₋₄ n₋₅ n₋₄ n₋₃ n₋₄ n₋₃ n₋₂ n₋₃ n₋₂ n₋₁ they add up like this n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ it is equivlent to n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₋₁ n₁ n₂ n₋₂ n₋₁ we can list all length subarrays and add them up it will turn out like this n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₋₁ n₁ n₂ n₋₂ n₋₁ n₁ n₂ n₃ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₋₄ n₋₃ n₋₂ n₋₁ this is the case for length subarrays n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₋₁ n₁ n₂ n₋₂ n₋₁ n₁ n₂ n₃ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ n₁ n₂ n₃ n₄ n₅ n₆ n₋₆ n₋₅ n₋₄ n₋₃ n₋₂ n₋₁ this pattern will continue to hold for all length subarrays and we will take advantage of it 「最大的 subarray 長度」 var maxsubarraylength inputnumbers length inputnumbers length inputnumbers length 「輸出值」 var output 
inputnumbers sum maxsubarraylength maxsubarraylength 「『多餘的項數』 」 var 「『多餘的項數』 」 var 「『多餘的項數』 」 var 「『多餘的項數』 」 var for int i maxsubarraylength i i output inputnumbers output inputnumbers output inputnumbers output inputnumbers return output 參考資料 等差數列 項數: term 差: difference 請參考「刷 leetcode 練習命名」 😊 | 0 |
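As an illustrative aside on the record above (LeetCode 1588, sum of all odd-length subarrays): the same O(n) total can be reached with a standard counting argument rather than the prefix-style scheme in the C# body — element `arr[i]` of an n-element array occurs in `(i + 1) * (n - i)` contiguous subarrays, and rounding that count up after halving gives the number of odd-length ones. The function name below is our own; this is a sketch of that alternative approach, not the exact method in the record.

```python
def sum_odd_length_subarrays(arr):
    """Sum of all odd-length contiguous subarrays in O(n).

    arr[i] appears in (i + 1) * (n - i) subarrays in total;
    ((i + 1) * (n - i) + 1) // 2 of those have odd length.
    """
    n = len(arr)
    return sum(arr[i] * (((i + 1) * (n - i) + 1) // 2) for i in range(n))


# Example: [1, 4, 2, 5, 3] -> 58 (matches brute-force enumeration)
```

The counting formula produces the same totals as enumerating every odd-length window directly, while touching each element only once.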
8,914 | 10,915,994,969 | IssuesEvent | 2019-11-21 12:24:59 | jiangdashao/Matrix-Issues | https://api.github.com/repos/jiangdashao/Matrix-Issues | closed | [FP] Jesus false positive | False Positive Incompatibility | ## Troubleshooting Information
`Change - [ ] to - [X] to check the checkboxes below.`
- [X] Matrix and ProtocolLib are up-to-date
- [X] Matrix is running on a 1.8, 1.12, 1.13, or 1.14 server
- [X] The issue happens on default config.yml and checks.yml
- [x] I've tested if the issue happens on default config (I think we're using the default config. I'm pretty sure.)
## Issue Information
**Server version**: 1.14.2, paper 105
**Verbose messages**: unsure what this means, because English is not my first language. Please explain, and I will get it
**How/when does this happen**: this happens when players stand on griefprevention claim indicators in water
**Video of false positive**: https://youtu.be/8PobC36Our4
**Other information**:
## Configuration Files
**Link to checks.yml file**: https://pastebin.com/gwdmm1MA
**Link to config.yml file**: https://pastebin.com/ReLX6zB8
| True | [FP] Jesus false positive - ## Troubleshooting Information
`Change - [ ] to - [X] to check the checkboxes below.`
- [X] Matrix and ProtocolLib are up-to-date
- [X] Matrix is running on a 1.8, 1.12, 1.13, or 1.14 server
- [X] The issue happens on default config.yml and checks.yml
- [x] I've tested if the issue happens on default config (I think we're using the default config. I'm pretty sure.)
## Issue Information
**Server version**: 1.14.2, paper 105
**Verbose messages**: unsure what this means, because English is not my first language. Please explain, and I will get it
**How/when does this happen**: this happens when players stand on griefprevention claim indicators in water
**Video of false positive**: https://youtu.be/8PobC36Our4
**Other information**:
## Configuration Files
**Link to checks.yml file**: https://pastebin.com/gwdmm1MA
**Link to config.yml file**: https://pastebin.com/ReLX6zB8
| comp | jesus false positive troubleshooting information change to to check the checkboxes below matrix and protocollib are up to date matrix is running on a or server the issue happens on default config yml and checks yml i ve tested if the issue happens on default config i think we re using the default config i m pretty sure issue information server version paper verbose messages unsure what this means because english is not my first language please explain and i will get it how when does this happen this happens when players stand on griefprevention claim indicators in water video of false positive other information configuration files link to checks yml file link to config yml file | 1 |
5,857 | 8,305,783,402 | IssuesEvent | 2018-09-22 11:17:09 | bleakgrey/tootle | https://api.github.com/repos/bleakgrey/tootle | opened | Releasing for Juno | compatibility help wanted | I tested the app for Juno and infrastructure-wise and design-wise it looks pretty good.

So can we release it now? | True | Releasing for Juno - I tested the app for Juno and infrastructure-wise and design-wise it looks pretty good.

So can we release it now? | comp | releasing for juno i tested the app for juno and infrastructure wise and design wise it looks pretty good so can we release it now | 1 |
15,696 | 20,255,574,013 | IssuesEvent | 2022-02-14 22:44:58 | ElfFriend-DnD/foundryvtt-gmScreen | https://api.github.com/repos/ElfFriend-DnD/foundryvtt-gmScreen | closed | Popout! compatibility - unable to modify contents after popping out the screen | bug compatibility help wanted | **Environment Details**
- Foundry Core Version: 0.78
- System & Version: Noticed on 5e 1.2.0, but I assume it's not relevant in this case
- OS & Browser: Win 10, Opera, but ditto
- Hosting: Self hosted, but ditto
**Description**
After you pop out the GM screen, choosing options from the drop down menu doesn't change/update the data on the screen. Refreshing after picking an option restores the GM Screen to pre-popout state.
**Expected behavior**
The tile you chose the dropdown option for displays the data for the chosen option. | True | Popout! compatibility - unable to modify contents after popping out the screen - **Environment Details**
- Foundry Core Version: 0.78
- System & Version: Noticed on 5e 1.2.0, but I assume it's not relevant in this case
- OS & Browser: Win 10, Opera, but ditto
- Hosting: Self hosted, but ditto
**Description**
After you pop out the GM screen, choosing options from the drop down menu doesn't change/update the data on the screen. Refreshing after picking an option restores the GM Screen to pre-popout state.
**Expected behavior**
The tile you chose the dropdown option for displays the data for the chosen option. | comp | popout compatibility unable to modify contents after popping out the screen environment details foundry core version system version noticed on but i assume it s not relevant in this case os browser win opera but ditto hosting self hosted but ditto description after you pop out the gm screen choosing options from the drop down menu doesn t change update the data on the screen refreshing after picking an option restores the gm screen to pre popout state expected behavior the tile you chose the dropdown option for displays the data for the chosen option | 1 |
605,499 | 18,736,138,228 | IssuesEvent | 2021-11-04 07:52:28 | AY2122S1-CS2113T-W12-4/tp | https://api.github.com/repos/AY2122S1-CS2113T-W12-4/tp | closed | 17. As a healthy house-husband, I want to find out what recipe to make based on what I had eaten and the recommended calories intake per day | type.Story priority.Low | so that I can keep my calories intake within the recommended range | 1.0 | 17. As a healthy house-husband, I want to find out what recipe to make based on what I had eaten and the recommended calories intake per day - so that I can keep my calories intake within the recommended range | non_comp | as a healthy house husband i want to find out what recipe to make based on what i had eaten and the recommended calories intake per day so that i can keep my calories intake within the recommended range | 0 |
3,810 | 6,664,288,399 | IssuesEvent | 2017-10-02 19:34:31 | kbenoit/quanteda | https://api.github.com/repos/kbenoit/quanteda | opened | Need to fix UBSAN issues | compatibility | Got this email today from the CRAN team:
> Also, please take a look at the clang-UBSAN issues reported at
> https://www.stats.ox.ac.uk/pub/bdr/memtests/clang-UBSAN/quanteda/
> and try to fix.
>
> Best,
> Uwe Ligges
> (CRAN team)
This seems to be triggered by the vignette only. We need either to explain this clearly as being not our fault, but rather that of **Rcpp**, or diagnose and fix it.
For the latter, putting it to the rOpenSci Slack team might be a good start. I seem to remember some chatter about related issues there. | True | Need to fix UBSAN issues - Got this email today from the CRAN team:
> Also, please take a look at the clang-UBSAN issues reported at
> https://www.stats.ox.ac.uk/pub/bdr/memtests/clang-UBSAN/quanteda/
> and try to fix.
>
> Best,
> Uwe Ligges
> (CRAN team)
This seems to be triggered by the vignette only. We need either to explain this clearly as being not our fault, but rather that of **Rcpp**, or diagnose and fix it.
For the latter, putting it to the rOpenSci Slack team might be a good start. I seem to remember some chatter about related issues there. | comp | need to fix ubsan issues got this email today from the cran team also please take a look at the clang ubsan issues reported at and try to fix best uwe ligges cran team this seems to be triggered by the vignette only we need either to explain this clearly as being not our fault but rather that of rcpp or diagnose and fix it for the latter putting it to the ropensci slack team might be a good start i seem to remember some chatter about related issues there | 1 |
19,445 | 26,995,579,787 | IssuesEvent | 2023-02-10 00:32:37 | Yoast/wordpress-seo | https://api.github.com/repos/Yoast/wordpress-seo | opened | Slug not shown when using classic editor for Jetpack connected users | compatibility | <!-- Please use this template when creating an issue.
- Please check the boxes after you've created your issue.
- Please use the latest version of Yoast SEO.-->
* [x] I've read and understood the [contribution guidelines](https://github.com/Yoast/wordpress-seo/blob/trunk/.github/CONTRIBUTING.md).
* [x] I've searched for any related issues and avoided creating a duplicate issue.
### Please give us a description of what happened
The Yoast SEO meta box no longer shows the slug when using the classic editor (either with a plugin or disabled by another plugin like WooCommerce). This only happens if the site user is connected to a Jetpack account.
Important to note that the issue is a backend display issue. The markup on the front end (thus the SEO of the site) is not impacted.
Reproduced on 2 production sites and 1 non-production (new vanilla setup). A second user on the same site that is not connected to Jetpack shows the slug.
### Possible Workarounds
- Ignore the issue as it is a display issue that doesn't impact the front end or SEO of the site
- Downgrade to Jetpack 11.7.1
- Disconnect the site user from Jetpack
- Switch to the block editor
### To Reproduce
#### Step-by-step reproduction instructions
Starting with a vanilla installation:
1. Install and activate Yoast SEO (free)
2. Install and activate [Classic Editor](https://wordpress.org/plugins/classic-editor/)
3. Install, activate, and set up [Jetpack](https://wordpress.org/plugins/jetpack/) (options below)
4. Go to Posts > All Posts
5. Edit the 'Hello World' post
6. See the Yoast SEO meta box **does not contain** the slug
7. Uninstall Jetpack and install the previous version, 11.7.1
8. Go to Posts > All Posts
9. Edit the 'Hello World' post
10. See the Yoast SEO meta box **does contain** the slug
**Jetpack options**
Connect Jetpack to your WP.com account
Scroll to the bottom of the connecting account and choose `Start with Jetpack Free`
Ignore the recommendation questions (answers don't impact the behavior)
#### Expected results
Slug to appear in Yoast SEO meta box
#### Actual results
Slug does not appear in the Yoast SEO meta box
### Screenshots, screen recording, code snippet
#### Slug missing with Jetpack 11.8

#### Slug visible with Jetpack 11.7.1

#### Comparision between connected and not connected Jetpack user

### Technical info
<!-- You can check these boxes once you've created the issue.
- If you are using Gutenberg, Elementor or the Classic Editor plugin, please make sure you have updated to the latest version.
-->
* If relevant, which editor is affected (or editors):
- [ ] Block Editor
- [ ] Gutenberg Editor
- [ ] Elementor Editor
- [x] Classic Editor
- [ ] Other: <!-- please specify -->
<!-- You can check these boxes once you've created the issue. -->
* Which browser is affected (or browsers):
- [X] Chrome
- [X] Firefox
- [ ] Safari
- [ ] Other: <!-- please specify -->
#### Used versions
* Device you are using: Desktop
* Operating system: Win11
* PHP version: 7.4.30 (Supports 64bit values) / 8.1.15 (Supports 64bit values) / 7.4.33 (Supports 64bit values)
* WordPress version: 6.1.1
* WordPress Theme: Twenty Twenty-Three / Essence Pro (Genesis child theme) / Kloe
* Yoast SEO version: 20.1 / 19.14 / 19.6 (not a regression in Yoast)
* <!-- If relevant -->Classic Editor plugin version: 1.6.2
* Relevant plugins in case of a bug: Jetpack 11.8 (regression as it doesn't happen with 11.7.1) | True | Slug not shown when using classic editor for Jetpack connected users - <!-- Please use this template when creating an issue.
- Please check the boxes after you've created your issue.
- Please use the latest version of Yoast SEO.-->
* [x] I've read and understood the [contribution guidelines](https://github.com/Yoast/wordpress-seo/blob/trunk/.github/CONTRIBUTING.md).
* [x] I've searched for any related issues and avoided creating a duplicate issue.
### Please give us a description of what happened
The Yoast SEO meta box no longer shows the slug when using the classic editor (either with a plugin or disabled by another plugin like WooCommerce). This only happens if the site user is connected to a Jetpack account.
It is important to note that this is a back-end display issue only: the markup on the front end (and thus the SEO of the site) is not impacted.
Reproduced on 2 production sites and 1 non-production (new vanilla setup). A second user on the same site that is not connected to Jetpack shows the slug.
### Possible Workarounds
- Ignore the issue as it is a display issue that doesn't impact the front end or SEO of the site
- Downgrade to Jetpack 11.7.1
- Disconnect the site user from Jetpack
- Switch to the block editor
### To Reproduce
#### Step-by-step reproduction instructions
Starting with a vanilla installation:
1. Install and activate Yoast SEO (free)
2. Install and activate [Classic Editor](https://wordpress.org/plugins/classic-editor/)
3. Install, activate, and set up [Jetpack](https://wordpress.org/plugins/jetpack/) (options below)
4. Go to Posts > All Posts
5. Edit the 'Hello World' post
6. See the Yoast SEO meta box **does not contain** the slug
7. Uninstall Jetpack and install the previous version, 11.7.1
8. Go to Posts > All Posts
9. Edit the 'Hello World' post
10. See the Yoast SEO meta box **does contain** the slug
**Jetpack options**
Connect Jetpack to your WP.com account
Scroll to the bottom of the account-connection screen and choose `Start with Jetpack Free`
Ignore the recommendation questions (answers don't impact the behavior)
#### Expected results
Slug to appear in Yoast SEO meta box
#### Actual results
Slug does not appear in the Yoast SEO meta box
### Screenshots, screen recording, code snippet
#### Slug missing with Jetpack 11.8

#### Slug visible with Jetpack 11.7.1

#### Comparison between connected and non-connected Jetpack users

### Technical info
<!-- You can check these boxes once you've created the issue.
- If you are using Gutenberg, Elementor or the Classic Editor plugin, please make sure you have updated to the latest version.
-->
* If relevant, which editor is affected (or editors):
- [ ] Block Editor
- [ ] Gutenberg Editor
- [ ] Elementor Editor
- [x] Classic Editor
- [ ] Other: <!-- please specify -->
<!-- You can check these boxes once you've created the issue. -->
* Which browser is affected (or browsers):
- [X] Chrome
- [X] Firefox
- [ ] Safari
- [ ] Other: <!-- please specify -->
#### Used versions
* Device you are using: Desktop
* Operating system: Win11
* PHP version: 7.4.30 (Supports 64bit values) / 8.1.15 (Supports 64bit values) / 7.4.33 (Supports 64bit values)
* WordPress version: 6.1.1
* WordPress Theme: Twenty Twenty-Three / Essence Pro (Genesis child theme) / Kloe
* Yoast SEO version: 20.1 / 19.14 / 19.6 (not a regression in Yoast)
* <!-- If relevant -->Classic Editor plugin version: 1.6.2
* Relevant plugins in case of a bug: Jetpack 11.8 (regression as it doesn't happen with 11.7.1) | comp | slug not shown when using classic editor for jetpack connected users please use this template when creating an issue please check the boxes after you ve created your issue please use the latest version of yoast seo i ve read and understood the i ve searched for any related issues and avoided creating a duplicate issue please give us a description of what happened the yoast seo meta box no longer shows the slug when using the classic editor either with a plugin or disabled by another plugin like woocommerce this only happens if the site user is connected to a jetpack account important to note that the issue is a backend display issue the markup on the front end thus the seo of the site is not impacted reproduced on production sites and non production new vanilla setup a second user on the same site that is not connected to jetpack shows the slug possible workarounds ignore the issue as it is a display issue that doesn t impact the front end or seo of the site downgrade to jetpack disconnect the site user from jetpack switch to the block editor to reproduce step by step reproduction instructions starting with a vanilla installation install and activate yoast seo free install and activate install activate and set up options below go to posts all posts edit the hello world post see the yoast seo meta box does not contain the slug uninstall jetpack and install the previous version go to posts all posts edit the hello world post see the yoast seo meta box does contain the slug jetpack options connect jetpack to your wp com account scroll to the bottom of the connecting account and choose start with jetpack free ignore the recommendation questions answers don t impact the behavior expected results slug to appear in yoast seo meta box actual results slug does not appear in the yoast seo meta box screenshots screen recording code snippet slug missing with jetpack slug 
visible with jetpack comparision between connected and not connected jetpack user technical info you can check these boxes once you ve created the issue if you are using gutenberg elementor or the classic editor plugin please make sure you have updated to the latest version if relevant which editor is affected or editors block editor gutenberg editor elementor editor classic editor other which browser is affected or browsers chrome firefox safari other used versions device you are using desktop operating system php version supports values supports values supports values wordpress version wordpress theme twenty twenty three essence pro genesis child theme kloe yoast seo version not a regression in yoast classic editor plugin version relevant plugins in case of a bug jetpack regression as it doesn t happen with | 1 |
600,287 | 18,292,815,596 | IssuesEvent | 2021-10-05 17:03:06 | grpc/grpc | https://api.github.com/repos/grpc/grpc | opened | Boringssl build error | kind/bug priority/P2 | Boringssl build error when building gRPC with CMake on the gcc:11 Docker image.
```
/grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc: In function 'bool GenerateECH(const std::vector<std::__cxx11::basic_string<char> >&)':
/grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc:77:24: error: 'numeric_limits' is not a member of 'std'
77 | config_id > std::numeric_limits<uint8_t>::max()) {
| ^~~~~~~~~~~~~~
/grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc:77:46: error: expected primary-expression before '>' token
77 | config_id > std::numeric_limits<uint8_t>::max()) {
| ^
/grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc:77:49: error: '::max' has not been declared; did you mean 'std::max'?
77 | config_id > std::numeric_limits<uint8_t>::max()) {
| ^~~
| std::max
In file included from /usr/local/include/c++/11.2.0/algorithm:62,
from /grpc/third_party/boringssl-with-bazel/src/include/openssl/span.h:26,
from /grpc/third_party/boringssl-with-bazel/src/include/openssl/bytestring.h:20,
from /grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc:19:
/usr/local/include/c++/11.2.0/bits/stl_algo.h:3467:5: note: 'std::max' declared here
3467 | max(initializer_list<_Tp> __l, _Compare __comp)
| ^~~
make[2]: *** [third_party/boringssl-with-bazel/CMakeFiles/bssl.dir/bu
``` | 1.0 | Boringssl build error - Boringssl build error when building gRPC with CMake on the gcc:11 Docker image.
```
/grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc: In function 'bool GenerateECH(const std::vector<std::__cxx11::basic_string<char> >&)':
/grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc:77:24: error: 'numeric_limits' is not a member of 'std'
77 | config_id > std::numeric_limits<uint8_t>::max()) {
| ^~~~~~~~~~~~~~
/grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc:77:46: error: expected primary-expression before '>' token
77 | config_id > std::numeric_limits<uint8_t>::max()) {
| ^
/grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc:77:49: error: '::max' has not been declared; did you mean 'std::max'?
77 | config_id > std::numeric_limits<uint8_t>::max()) {
| ^~~
| std::max
In file included from /usr/local/include/c++/11.2.0/algorithm:62,
from /grpc/third_party/boringssl-with-bazel/src/include/openssl/span.h:26,
from /grpc/third_party/boringssl-with-bazel/src/include/openssl/bytestring.h:20,
from /grpc/third_party/boringssl-with-bazel/src/tool/generate_ech.cc:19:
/usr/local/include/c++/11.2.0/bits/stl_algo.h:3467:5: note: 'std::max' declared here
3467 | max(initializer_list<_Tp> __l, _Compare __comp)
| ^~~
make[2]: *** [third_party/boringssl-with-bazel/CMakeFiles/bssl.dir/bu
``` | non_comp | boringssl build error boringssl build error when building grpc by cmake on gcc docker image grpc third party boringssl with bazel src tool generate ech cc in function bool generateech const std vector grpc third party boringssl with bazel src tool generate ech cc error numeric limits is not a member of std config id std numeric limits max grpc third party boringssl with bazel src tool generate ech cc error expected primary expression before token config id std numeric limits max grpc third party boringssl with bazel src tool generate ech cc error max has not been declared did you mean std max config id std numeric limits max std max in file included from usr local include c algorithm from grpc third party boringssl with bazel src include openssl span h from grpc third party boringssl with bazel src include openssl bytestring h from grpc third party boringssl with bazel src tool generate ech cc usr local include c bits stl algo h note std max declared here max initializer list l compare comp make third party boringssl with bazel cmakefiles bssl dir bu | 0 |
14,652 | 17,876,724,606 | IssuesEvent | 2021-09-07 05:33:59 | Makewebbetter/subscriptions-for-woocommerce | https://api.github.com/repos/Makewebbetter/subscriptions-for-woocommerce | closed | Wallet payment is not available. | Compatibility issue | WordPress version: 5.8
WooCommerce version: 5.6
Area Path: Store-end-> Checkout
Version: 1.0.3
Description: Wallet payment is not available when a subscription product is added to the cart.


| True | Wallet payment is not available. - WordPress version: 5.8
WooCommerce version: 5.6
Area Path: Store-end-> Checkout
Version: 1.0.3
Description: Wallet payment is not available when a subscription product is added to the cart.


| comp | wallet payment is not available wordpress version woocommerce version area path store end checkout version description wallet payment is not available when a subscription product is added in cart | 1 |
12,094 | 14,262,859,549 | IssuesEvent | 2020-11-20 13:37:44 | widelands/widelands | https://api.github.com/repos/widelands/widelands | closed | Assert fail in testsuite | bug crashes or hangs military saveloading & compatibility tribes | The testsuite occasionally report this assert:
```
widelands: /home/runner/work/widelands/widelands/src/logic/map_objects/tribes/production_program.cc:1818: virtual void Widelands::ProductionProgram::ActTrain::execute(Widelands::Game &, Widelands::ProductionSite &) const: Assertion `current_level < training_.level' failed.
``` | True | Assert fail in testsuite - The testsuite occasionally report this assert:
```
widelands: /home/runner/work/widelands/widelands/src/logic/map_objects/tribes/production_program.cc:1818: virtual void Widelands::ProductionProgram::ActTrain::execute(Widelands::Game &, Widelands::ProductionSite &) const: Assertion `current_level < training_.level' failed.
``` | comp | assert fail in testsuite the testsuite occasionally report this assert widelands home runner work widelands widelands src logic map objects tribes production program cc virtual void widelands productionprogram acttrain execute widelands game widelands productionsite const assertion current level training level failed | 1 |
2,689 | 5,431,353,815 | IssuesEvent | 2017-03-04 00:26:17 | MJRLegends/Space-Astronomy-Feedback- | https://api.github.com/repos/MJRLegends/Space-Astronomy-Feedback- | closed | Iridium Ingot from Advanced Solar Panels can't be used in Galacticraft Compressors | compatibility issue fixed in next update mod bug MUST SEE reported to developer | ### ---Issue Report---
**Have you checked the Known Issues page (if applicable & PLEASE CHECK BEFORE POSTING):**
Yes
**Have you checked Closed Issues (if applicable & PLEASE CHECK BEFORE POSTING):**
Yes
### Description of Issue
Iridium Ingot from Advanced Solar Panels can't be used in Galacticraft Compressors for crafting Compressed Iridium unlike same ingots (based on ore dictionary tag ingotIridium) from More Planets.
### Steps to Reproduce Issue
1. Smelt Iridium Ore (More Planets) in Induction Smelter (Thermal Expansion) and get Iridium Ingot (Advanced Solar Panels).
2. Put Iridium Ingot (Advanced Solar Panels) into Electric Compressor (Galactcraft) according to the recipe. Nothing happens.
**Version of Mod Pack using:**
1.5.8
### Additional Information
It's minor issue. Workaround: of course you can convert ingots with ore dictionary. | True | Iridium Ingot from Advanced Solar Panels can't be used in Galacticraft Compressors - ### ---Issue Report---
**Have you checked the Known Issues page (if applicable & PLEASE CHECK BEFORE POSTING):**
Yes
**Have you checked Closed Issues (if applicable & PLEASE CHECK BEFORE POSTING):**
Yes
### Description of Issue
Iridium Ingot from Advanced Solar Panels can't be used in Galacticraft Compressors for crafting Compressed Iridium unlike same ingots (based on ore dictionary tag ingotIridium) from More Planets.
### Steps to Reproduce Issue
1. Smelt Iridium Ore (More Planets) in Induction Smelter (Thermal Expansion) and get Iridium Ingot (Advanced Solar Panels).
2. Put Iridium Ingot (Advanced Solar Panels) into Electric Compressor (Galactcraft) according to the recipe. Nothing happens.
**Version of Mod Pack using:**
1.5.8
### Additional Information
It's minor issue. Workaround: of course you can convert ingots with ore dictionary. | comp | iridium ingot from advanced solar panels can t be used in galacticraft compressors issue report have you checked the known issues page if applicable please check before posting yes have you checked closed issues if applicable please check before posting yes description of issue iridium ingot from advanced solar panels can t be used in galacticraft compressors for crafting compressed iridium unlike same ingots based on ore dictionary tag ingotiridium from more planets steps to reproduce issue smelt iridium ore more planets in induction smelter thermal expansion and get iridium ingot advanced solar panels put iridium ingot advanced solar panels into electric compressor galactcraft according to the recipe nothing happens version of mod pack using additional information it s minor issue workaround of course you can convert ingots with ore dictionary | 1 |
9,205 | 11,203,913,158 | IssuesEvent | 2020-01-05 00:06:32 | rubinius/rubinius | https://api.github.com/repos/rubinius/rubinius | closed | invalid parsing errors when opening csv file | Ruby Language Compatibility | I'm having problems parsing certain CSV files via rubinius. The files appear to be perfectly normal, and can be parsed using MRI-2.0.0-p353, MRI 1.9.3-p484, MRI 1.8.7-p371. This file complains that "Unquoted fields do not allow \r or \n (line 6932). (CSV::MalformedCSVError)"
error and stack trace:
```
An exception occurred running cbsa.rb:
Unquoted fields do not allow \r or \n (line 6932). (CSV::MalformedCSVError)
Backtrace:
{ } in CSV#shift at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1850
Array#each at kernel/bootstrap/array.rb:66
{ } in CSV#shift at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1815
Kernel(CSV)#loop at kernel/common/kernel.rb:460
CSV#shift at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1775
CSV#each at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1717
{ } in CSV.foreach at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1121
CSV.open at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1267
CSV.foreach at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1120
Object#__script__ at cbsa.rb:2
Rubinius::CodeLoader#load_script at kernel/delta
/code_loader.rb:66
Rubinius::CodeLoader.load_script at kernel/delta
/code_loader.rb:152
Rubinius::Loader#script at kernel/loader.rb:649
Rubinius::Loader#main at kernel/loader.rb:831
```
## Steps to Reproduce
create file cbsa.rb
```
require 'csv'
data = CSV.foreach('./cbsa.csv') do |datum|
end
```
then:
```
rvm use rbx-2.2.6
wget "https://drive.google.com/uc?id=0B02zQBsLI7H9aVNBOTMyckxmMlk" -O cbsa.csv
ruby cbsa.rb
```
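The `MalformedCSVError` above typically means the parser hit a bare `\r` inside what it treats as an unquoted field — which is what happens when a CRLF-terminated file is read with an auto-detected row separator of `\n` alone. Assuming that is the cause here, passing `:row_sep` explicitly sidesteps the auto-detection; a minimal sketch, with hypothetical data standing in for `cbsa.csv`:

```ruby
require 'csv'

# Hypothetical two-column sample standing in for cbsa.csv: plain unquoted
# fields, but with CRLF ("\r\n") line endings. If the row separator were
# auto-detected as "\n" alone, each field would end with a stray "\r".
data = "name,code\r\nAbilene,10180\r\nAkron,10420\r\n"

# Forcing :row_sep avoids the auto-detection entirely.
names = CSV.parse(data, headers: true, row_sep: "\r\n").map { |row| row['name'] }
# names == ["Abilene", "Akron"]
```

Against the real file the same option can be passed as `CSV.foreach('./cbsa.csv', row_sep: "\r\n")`; if the error persists there, the file more likely has a literal `\r` or `\n` embedded mid-field at line 6932 rather than a line-ending mismatch.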
## Meta
```
jw@logopolis:~$ rbx -v
rubinius 2.2.6 (2.1.0 68d916a5 2014-03-10 JI) [x86_64-linux-gnu]
jw@logopolis:~$ uname -a
Linux logopolis 3.2.0-40-generic #64-Ubuntu SMP Mon Mar 25 21:22:10 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux
jw@logopolis:~$ rvm -v
rvm 1.25.0 (master) by Wayne E. Seguin <wayneeseguin@gmail.com>, Michal Papis <mpapis@gmail.com> [https://rvm.io/]
```
| True | invalid parsing errors when opening csv file - I'm having problems parsing certain CSV files via rubinius. The files appear to be perfectly normal, and can be parsed using MRI-2.0.0-p353, MRI 1.9.3-p484, MRI 1.8.7-p371. This file complains that "Unquoted fields do not allow \r or \n (line 6932). (CSV::MalformedCSVError)"
error and stack trace:
```
An exception occurred running cbsa.rb:
Unquoted fields do not allow \r or \n (line 6932). (CSV::MalformedCSVError)
Backtrace:
{ } in CSV#shift at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1850
Array#each at kernel/bootstrap/array.rb:66
{ } in CSV#shift at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1815
Kernel(CSV)#loop at kernel/common/kernel.rb:460
CSV#shift at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1775
CSV#each at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1717
{ } in CSV.foreach at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1121
CSV.open at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1267
CSV.foreach at .rvm/rubies/rbx-2.2.6/gems/gems
/rubysl-csv-2.0.2/lib/rubysl
/csv/csv.rb:1120
Object#__script__ at cbsa.rb:2
Rubinius::CodeLoader#load_script at kernel/delta
/code_loader.rb:66
Rubinius::CodeLoader.load_script at kernel/delta
/code_loader.rb:152
Rubinius::Loader#script at kernel/loader.rb:649
Rubinius::Loader#main at kernel/loader.rb:831
```
## Steps to Reproduce
create file cbsa.rb
```
require 'csv'
data = CSV.foreach('./cbsa.csv') do |datum|
end
```
then:
```
rvm use rbx-2.2.6
wget "https://drive.google.com/uc?id=0B02zQBsLI7H9aVNBOTMyckxmMlk" -O cbsa.csv
ruby cbsa.rb
```
## Meta
```
jw@logopolis:~$ rbx -v
rubinius 2.2.6 (2.1.0 68d916a5 2014-03-10 JI) [x86_64-linux-gnu]
jw@logopolis:~$ uname -a
Linux logopolis 3.2.0-40-generic #64-Ubuntu SMP Mon Mar 25 21:22:10 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux
jw@logopolis:~$ rvm -v
rvm 1.25.0 (master) by Wayne E. Seguin <wayneeseguin@gmail.com>, Michal Papis <mpapis@gmail.com> [https://rvm.io/]
```
| comp | invalid parsing errors when opening csv file i m having problems parsing certain csv files via rubinius the files appear to be perfectly normal and can be parsed using mri mri mri this file complains that unquoted fields do not allow r or n line csv malformedcsverror error and stack trace an exception occurred running cbsa rb unquoted fields do not allow r or n line csv malformedcsverror backtrace in csv shift at rvm rubies rbx gems gems rubysl csv lib rubysl csv csv rb array each at kernel bootstrap array rb in csv shift at rvm rubies rbx gems gems rubysl csv lib rubysl csv csv rb kernel csv loop at kernel common kernel rb csv shift at rvm rubies rbx gems gems rubysl csv lib rubysl csv csv rb csv each at rvm rubies rbx gems gems rubysl csv lib rubysl csv csv rb in csv foreach at rvm rubies rbx gems gems rubysl csv lib rubysl csv csv rb csv open at rvm rubies rbx gems gems rubysl csv lib rubysl csv csv rb csv foreach at rvm rubies rbx gems gems rubysl csv lib rubysl csv csv rb object script at cbsa rb rubinius codeloader load script at kernel delta code loader rb rubinius codeloader load script at kernel delta code loader rb rubinius loader script at kernel loader rb rubinius loader main at kernel loader rb steps to reproduce create file cbsa rb require csv data csv foreach cbsa csv do datum end then rvm use rbx wget o cbsa csv ruby cbsa rb meta jw logopolis rbx v rubinius ji jw logopolis uname a linux logopolis generic ubuntu smp mon mar utc gnu linux jw logopolis rvm v rvm master by wayne e seguin michal papis | 1 |
778,805 | 27,330,017,496 | IssuesEvent | 2023-02-25 14:02:59 | LewisTrundle/L4-Individual-Project | https://api.github.com/repos/LewisTrundle/L4-Individual-Project | opened | Allow user to input aruco marker ID to use | Priority 2 New Feature Functionality | ## Summary
The user should be able to choose which aruco marker to use for the robot.
## Priority
2
## Time Estimate
1 hour
| 1.0 | Allow user to input aruco marker ID to use - ## Summary
The user should be able to choose which aruco marker to use for the robot.
## Priority
2
## Time Estimate
1 hour
| non_comp | allow user to input aruco marker id to use summary the user should be able to choose which aruco marker to use for the robot priority time estimate hour | 0 |
254,637 | 27,399,465,545 | IssuesEvent | 2023-02-28 22:47:38 | ManageIQ/manageiq-ui-service | https://api.github.com/repos/ManageIQ/manageiq-ui-service | closed | CVE-2021-3807 (High) detected in ansi-regex-4.1.0.tgz, ansi-regex-3.0.0.tgz - autoclosed | stale security vulnerability | ## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-4.1.0.tgz</b>, <b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>
Dependency Hierarchy:
- angular-patternfly-5.0.3.tgz (Root Library)
- node-sass-4.14.1.tgz
- sass-graph-2.2.5.tgz
- yargs-13.3.2.tgz
- cliui-5.0.0.tgz
- strip-ansi-5.2.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>
Dependency Hierarchy:
- angular-patternfly-5.0.3.tgz (Root Library)
- node-sass-4.14.1.tgz
- npmlog-4.1.2.tgz
- gauge-2.7.4.tgz
- wide-align-1.1.3.tgz
- string-width-2.1.1.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/ManageIQ/manageiq-ui-service/commit/5def65f0b4206a5ad7d8195b61a34437fb09ec9d">5def65f0b4206a5ad7d8195b61a34437fb09ec9d</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution: ansi-regex - 5.0.1,6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-3807 (High) detected in ansi-regex-4.1.0.tgz, ansi-regex-3.0.0.tgz - autoclosed - ## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-4.1.0.tgz</b>, <b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>
Dependency Hierarchy:
- angular-patternfly-5.0.3.tgz (Root Library)
- node-sass-4.14.1.tgz
- sass-graph-2.2.5.tgz
- yargs-13.3.2.tgz
- cliui-5.0.0.tgz
- strip-ansi-5.2.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>
Dependency Hierarchy:
- angular-patternfly-5.0.3.tgz (Root Library)
- node-sass-4.14.1.tgz
- npmlog-4.1.2.tgz
- gauge-2.7.4.tgz
- wide-align-1.1.3.tgz
- string-width-2.1.1.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/ManageIQ/manageiq-ui-service/commit/5def65f0b4206a5ad7d8195b61a34437fb09ec9d">5def65f0b4206a5ad7d8195b61a34437fb09ec9d</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution: ansi-regex - 5.0.1,6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_comp | cve high detected in ansi regex tgz ansi regex tgz autoclosed cve high severity vulnerability vulnerable libraries ansi regex tgz ansi regex tgz ansi regex tgz regular expression for matching ansi escape codes library home page a href dependency hierarchy angular patternfly tgz root library node sass tgz sass graph tgz yargs tgz cliui tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href dependency hierarchy angular patternfly tgz root library node sass tgz npmlog tgz gauge tgz wide align tgz string width tgz strip ansi tgz x ansi regex tgz vulnerable library found in head commit a href found in base branch master vulnerability details ansi regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ansi regex step up your open source security game with mend | 0 |
822,337 | 30,865,825,455 | IssuesEvent | 2023-08-03 07:59:03 | space-wizards/space-station-14 | https://api.github.com/repos/space-wizards/space-station-14 | opened | Muzzle flashes should be offset | Priority: 2-Before Release Issue: Feature Request Difficulty: 1-Easy | After seeing https://github.com/space-wizards/space-station-14/pull/18507 + a bay clip I think it's a good change though offsetting the bullet / hitscan itself probably can't be done as the chances of tunneling through thindows increases.
You probably also need to clamp it so if there's a wall in front of you it offsets from yourself. | 1.0 | Muzzle flashes should be offset - After seeing https://github.com/space-wizards/space-station-14/pull/18507 + a bay clip I think it's a good change though offsetting the bullet / hitscan itself probably can't be done as the chances of tunneling through thindows increases.
You probably also need to clamp it so if there's a wall in front of you it offsets from yourself. | non_comp | muzzle flashes should be offset after seeing a bay clip i think it s a good change though offsetting the bullet hitscan itself probably can t be done as the chances of tunneling through thindows increases you probably also need to clamp it so if there s a wall in front of you it offsets from yourself | 0 |
318,249 | 9,684,077,704 | IssuesEvent | 2019-05-23 13:01:54 | HGustavs/LenaSYS | https://api.github.com/repos/HGustavs/LenaSYS | closed | Moving "Developer mode --> ER --> UML" with one shortcut | Diagram gruppA2019 highPriority | When in developer mode and pressing "Shift + M" it switches to ER and then puts up a dialog that it wants to go from ER to UML. It should go back to default mode when leaving developer mode or last used mode before going into developer mode. | 1.0 | Moving "Developer mode --> ER --> UML" with one shortcut - When in developer mode and pressing "Shift + M" it switches to ER and then puts up a dialog that it wants to go from ER to UML. It should go back to default mode when leaving developer mode or last used mode before going into developer mode. | non_comp | moving developer mode er uml with one shortcut when in developer mode and pressing shift m it switches to er and then puts up a dialog that it wants to go from er to uml it should go back to default mode when leaving developer mode or last used mode before going into developer mode | 0 |
16,868 | 23,220,224,350 | IssuesEvent | 2022-08-02 17:27:34 | dotnet/sdk | https://api.github.com/repos/dotnet/sdk | closed | [API Compat] Provide a strict option for package-validation baseline checks | untriaged Area-Compatibility | ### Is your feature request related to a problem? Please describe.
We are hitting issues using Microsoft.CodeAnalysis.PublicApiAnalyzers to verify **no** public API changes in our servicing branches. This feature would enable us to use `package-validation` instead.
### Describe the solution you'd like
Quoting me:
> 2. It checks for breaking changes, not meeting our “no public API changes” requirement at all.
Quoting @ericstj:
> 2. We do have a “strict” mode that means no changes – this is how we make sure all API is exposed in our reference assemblies. Probably we don’t expose this option for baseline comparisons but we could. That’s an easy feature to add.
### Additional context
If it matters, we already enable package validation in dotnet/aspnetcore and we leave `$(DisablePackageBaselineValidation)` alone in servicing branches release/6.0. We'd use package validation in release/5.0 as well with this feature in place. | True | [API Compat] Provide a strict option for package-validation baseline checks - ### Is your feature request related to a problem? Please describe.
We are hitting issues using Microsoft.CodeAnalysis.PublicApiAnalyzers to verify **no** public API changes in our servicing branches. This feature would enable us to use `package-validation` instead.
### Describe the solution you'd like
Quoting me:
> 2. It checks for breaking changes, not meeting our “no public API changes” requirement at all.
Quoting @ericstj:
> 2. We do have a “strict” mode that means no changes – this is how we make sure all API is exposed in our reference assemblies. Probably we don’t expose this option for baseline comparisons but we could. That’s an easy feature to add.
### Additional context
If it matters, we already enable package validation in dotnet/aspnetcore and we leave `$(DisablePackageBaselineValidation)` alone in servicing branches release/6.0. We'd use package validation in release/5.0 as well with this feature in place. | comp | provide a strict option for package validation baseline checks is your feature request related to a problem please describe we are hitting issues using microsoft codeanalysis publicapianalyzers to verify no public api changes in our servicing branches this feature would enable us to use package validation instead describe the solution you d like quoting me it checks for breaking changes not meeting our “no public api changes” requirement at all quoting ericstj we do have a “strict” mode that means no changes – this is how we make sure all api is exposed in our reference assemblies probably we don’t expose this option for baseline comparisons but we could that’s an easy feature to add additional context if it matters we already enable package validation in dotnet aspnetcore and we leave disablepackagebaselinevalidation alone in servicing branches release we d use package validation in release as well with this feature in place | 1 |
120,754 | 17,644,272,658 | IssuesEvent | 2021-08-20 02:06:11 | DavidSpek/kale | https://api.github.com/repos/DavidSpek/kale | opened | CVE-2021-29560 (High) detected in tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl, tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl | security vulnerability | ## CVE-2021-29560 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</b>, <b>tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>
<details><summary><b>tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/7b/c5/a97ed48fcc878e36bb05a3ea700c077360853c0994473a8f6b0ab4c2ddd2/tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7b/c5/a97ed48fcc878e36bb05a3ea700c077360853c0994473a8f6b0ab4c2ddd2/tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>Path to dependency file: kale/examples/dog-breed-classification/requirements/requirements.txt</p>
<p>Path to vulnerable library: kale/examples/dog-breed-classification/requirements/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</details>
<details><summary><b>tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/ef/73/205b5e7f8fe086ffe4165d984acb2c49fa3086f330f03099378753982d2e/tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ef/73/205b5e7f8fe086ffe4165d984acb2c49fa3086f330f03099378753982d2e/tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: kale/examples/taxi-cab-classification/requirements.txt</p>
<p>Path to vulnerable library: kale/examples/taxi-cab-classification/requirements.txt</p>
<p>
Dependency Hierarchy:
- tfx_bsl-0.21.4-cp27-cp27mu-manylinux2010_x86_64.whl (Root Library)
- :x: **tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
TensorFlow is an end-to-end open source platform for machine learning. An attacker can cause a heap buffer overflow in `tf.raw_ops.RaggedTensorToTensor`. This is because the implementation(https://github.com/tensorflow/tensorflow/blob/d94227d43aa125ad8b54115c03cece54f6a1977b/tensorflow/core/kernels/ragged_tensor_to_tensor_op.cc#L219-L222) uses the same index to access two arrays in parallel. Since the user controls the shape of the input arguments, an attacker could trigger a heap OOB access when `parent_output_index` is shorter than `row_split`. The fix will be included in TensorFlow 2.5.0. We will also cherrypick this commit on TensorFlow 2.4.2, TensorFlow 2.3.3, TensorFlow 2.2.3 and TensorFlow 2.1.4, as these are also affected and still in supported range.
<p>Publish Date: 2021-05-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-29560>CVE-2021-29560</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-8gv3-57p6-g35r">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-8gv3-57p6-g35r</a></p>
<p>Release Date: 2021-05-14</p>
<p>Fix Resolution: tensorflow - 2.5.0, tensorflow-cpu - 2.5.0, tensorflow-gpu - 2.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-29560 (High) detected in tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl, tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl - ## CVE-2021-29560 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</b>, <b>tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>
<details><summary><b>tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/7b/c5/a97ed48fcc878e36bb05a3ea700c077360853c0994473a8f6b0ab4c2ddd2/tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7b/c5/a97ed48fcc878e36bb05a3ea700c077360853c0994473a8f6b0ab4c2ddd2/tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>Path to dependency file: kale/examples/dog-breed-classification/requirements/requirements.txt</p>
<p>Path to vulnerable library: kale/examples/dog-breed-classification/requirements/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-1.0.0-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</details>
<details><summary><b>tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/ef/73/205b5e7f8fe086ffe4165d984acb2c49fa3086f330f03099378753982d2e/tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ef/73/205b5e7f8fe086ffe4165d984acb2c49fa3086f330f03099378753982d2e/tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: kale/examples/taxi-cab-classification/requirements.txt</p>
<p>Path to vulnerable library: kale/examples/taxi-cab-classification/requirements.txt</p>
<p>
Dependency Hierarchy:
- tfx_bsl-0.21.4-cp27-cp27mu-manylinux2010_x86_64.whl (Root Library)
- :x: **tensorflow-2.1.0-cp27-cp27mu-manylinux2010_x86_64.whl** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
TensorFlow is an end-to-end open source platform for machine learning. An attacker can cause a heap buffer overflow in `tf.raw_ops.RaggedTensorToTensor`. This is because the implementation(https://github.com/tensorflow/tensorflow/blob/d94227d43aa125ad8b54115c03cece54f6a1977b/tensorflow/core/kernels/ragged_tensor_to_tensor_op.cc#L219-L222) uses the same index to access two arrays in parallel. Since the user controls the shape of the input arguments, an attacker could trigger a heap OOB access when `parent_output_index` is shorter than `row_split`. The fix will be included in TensorFlow 2.5.0. We will also cherrypick this commit on TensorFlow 2.4.2, TensorFlow 2.3.3, TensorFlow 2.2.3 and TensorFlow 2.1.4, as these are also affected and still in supported range.
<p>Publish Date: 2021-05-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-29560>CVE-2021-29560</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-8gv3-57p6-g35r">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-8gv3-57p6-g35r</a></p>
<p>Release Date: 2021-05-14</p>
<p>Fix Resolution: tensorflow - 2.5.0, tensorflow-cpu - 2.5.0, tensorflow-gpu - 2.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_comp | cve high detected in tensorflow whl tensorflow whl cve high severity vulnerability vulnerable libraries tensorflow whl tensorflow whl tensorflow whl tensorflow is an open source machine learning framework for everyone library home page a href path to dependency file kale examples dog breed classification requirements requirements txt path to vulnerable library kale examples dog breed classification requirements requirements txt dependency hierarchy x tensorflow whl vulnerable library tensorflow whl tensorflow is an open source machine learning framework for everyone library home page a href path to dependency file kale examples taxi cab classification requirements txt path to vulnerable library kale examples taxi cab classification requirements txt dependency hierarchy tfx bsl whl root library x tensorflow whl vulnerable library found in base branch master vulnerability details tensorflow is an end to end open source platform for machine learning an attacker can cause a heap buffer overflow in tf raw ops raggedtensortotensor this is because the implementation uses the same index to access two arrays in parallel since the user controls the shape of the input arguments an attacker could trigger a heap oob access when parent output index is shorter than row split the fix will be included in tensorflow we will also cherrypick this commit on tensorflow tensorflow tensorflow and tensorflow as these are also affected and still in supported range publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix 
resolution tensorflow tensorflow cpu tensorflow gpu step up your open source security game with whitesource | 0 |
18,594 | 25,903,821,139 | IssuesEvent | 2022-12-15 08:38:21 | andstatus/andstatus | https://api.github.com/repos/andstatus/andstatus | closed | Misskey Support | Compatibility | Hello, I'm Dignified Silence. Misskey Contributor.
Do you have a plan to make andstatus work as a client app for Misskey?
Misskey is one of fediverse software which has rich features and a flexible UI.
Misskey Official Website: https://misskey-hub.net/
Misskey API Documentation on its Official Website: https://misskey-hub.net/en/docs/api
Misskey API Documentation in Misskey instance: https://misskey.io/api-doc | True | Misskey Support - Hello, I'm Dignified Silence. Misskey Contributor.
Do you have a plan to make andstatus work as a client app for Misskey?
Misskey is one of fediverse software which has rich features and a flexible UI.
Misskey Official Website: https://misskey-hub.net/
Misskey API Documentation on its Official Website: https://misskey-hub.net/en/docs/api
Misskey API Documentation in Misskey instance: https://misskey.io/api-doc | comp | misskey support hello i m dignified silence misskey contributor do you have a plan to make andstatus work as a client app for misskey misskey is one of fediverse software which has rich features and a flexible ui misskey official website misskey api documentation on its official website misskey api documentation in misskey instance | 1 |
41,346 | 16,701,127,246 | IssuesEvent | 2021-06-09 02:39:59 | microsoft/BotFramework-DirectLineJS | https://api.github.com/repos/microsoft/BotFramework-DirectLineJS | closed | Client cannot catch Post error Callback | Bot Services Support customer-reported | This is our client code, we found we do not catch error Callback.
sendPostBack(text, value, from, successCallback, errorCallback) {
this.botConnection
.postActivity({
type: 'event',
text,
value,
from
})
.subscribe(
id => {
if (successCallback) {
Logger.trackEvent('Chat_PostActivityCompleted', {
message: `Send ${text} to server succeed.`,
supportCaseId: this.props.supportCase.id
});
successCallback(id);
}
},
error => {
if (errorCallback) {
errorCallback(error);
Logger.trackEvent('Chat_PostActivityError', {
message: `Send ${text} to server failed.`,
supportCaseId: this.props.supportCase.id
});
}
}
);
our team member has already consulted about error handling mechanism when calling Post Activity,
and turns out Direct line handling HTTP 4xx or 5xx code, and sending the client a 502.
(https://github.com/Microsoft/BotBuilder/issues/2201)
What we want is handling 4xx or 5xx code by ourselves, and we had already have some change on source code,
private catchPostError(error: any) {
if (error.status === 403)
this.expiredToken();
else if (error.status >= 400 && error.status <= 502)
{
return Observable.throw(error);
}
return Observable.of("retry");
}
can you see if this is workable? Or do you have other alternatives and solutions?
| 1.0 | Client cannot catch Post error Callback - This is our client code, we found we do not catch error Callback.
sendPostBack(text, value, from, successCallback, errorCallback) {
this.botConnection
.postActivity({
type: 'event',
text,
value,
from
})
.subscribe(
id => {
if (successCallback) {
Logger.trackEvent('Chat_PostActivityCompleted', {
message: `Send ${text} to server succeed.`,
supportCaseId: this.props.supportCase.id
});
successCallback(id);
}
},
error => {
if (errorCallback) {
errorCallback(error);
Logger.trackEvent('Chat_PostActivityError', {
message: `Send ${text} to server failed.`,
supportCaseId: this.props.supportCase.id
});
}
}
);
our team member has already consulted about error handling mechanism when calling Post Activity,
and turns out Direct line handling HTTP 4xx or 5xx code, and sending the client a 502.
(https://github.com/Microsoft/BotBuilder/issues/2201)
What we want is handling 4xx or 5xx code by ourselves, and we had already have some change on source code,
private catchPostError(error: any) {
if (error.status === 403)
this.expiredToken();
else if (error.status >= 400 && error.status <= 502)
{
return Observable.throw(error);
}
return Observable.of("retry");
}
can you see if this is workable? Or do you have other alternatives and solutions?
| non_comp | client cannot catch post error callback this is our client code we found we do not catch error callback sendpostback text value from successcallback errorcallback this botconnection postactivity type event text value from subscribe id if successcallback logger trackevent chat postactivitycompleted message send text to server succeed supportcaseid this props supportcase id successcallback id error if errorcallback errorcallback error logger trackevent chat postactivityerror message send text to server failed supportcaseid this props supportcase id our team member has already consulted about error handling mechanism when calling post activity and turns out direct line handling http or code and sending the client a what we want is handling or code by ourselves and we had already have some change on source code private catchposterror error any if error status this expiredtoken else if error status error status return observable throw error return observable of retry can you see if this is workable or do you have other alternatives and solutions | 0 |
4,770 | 7,375,187,905 | IssuesEvent | 2018-03-13 23:07:53 | semperfiwebdesign/all-in-one-seo-pack | https://api.github.com/repos/semperfiwebdesign/all-in-one-seo-pack | opened | Metabox priority breaks Xtreme Builder theme metabox on Post Edit screen | Compatibility | Reported by Jeroen Tabbernee on the Premium Forum here - https://semperplugins.com/support/troubleshooting-all-in-one-seo-pack-pro/after-upgrading-last-version-of-aio-seo-pro-the-xtreme-theme-cant-be-modified/#p7565
I troubleshooted this issue and found out that this broke with the release of 2.3.16 (2.4.16 for Pro), specifically by this commit - https://github.com/semperfiwebdesign/all-in-one-seo-pack/commit/571ea2b11cd7277eaa10ce4982610d2e31cb84a0. | True | Metabox priority breaks Xtreme Builder theme metabox on Post Edit screen - Reported by Jeroen Tabbernee on the Premium Forum here - https://semperplugins.com/support/troubleshooting-all-in-one-seo-pack-pro/after-upgrading-last-version-of-aio-seo-pro-the-xtreme-theme-cant-be-modified/#p7565
I troubleshooted this issue and found out that this broke with the release of 2.3.16 (2.4.16 for Pro), specifically by this commit - https://github.com/semperfiwebdesign/all-in-one-seo-pack/commit/571ea2b11cd7277eaa10ce4982610d2e31cb84a0. | comp | metabox priority breaks xtreme builder theme metabox on post edit screen reported by jeroen tabbernee on the premium forum here i troubleshooted this issue and found out that this broke with the release of for pro specifically by this commit | 1 |
120,131 | 12,059,853,172 | IssuesEvent | 2020-04-15 20:03:22 | aPureBase/KGraphQL | https://api.github.com/repos/aPureBase/KGraphQL | closed | Provide documentation on Context usage | documentation | Currently there is no documentation on how to use context and the `NotIntrospected` annotation.
I quickly wrote an example of how it could be used:
```kotlin
val query = """
query fetchHelloLabel($country: String!) {
hello(country: $country) {
label
}
}
"""
val variables = """
{"country": "English"}
"""
val user = User(id = 1, name = "Username")
val ctx = context {
+user
}
schema.execute(query, variables, ctx)
...
// In your schema definition
query("hello") {
resolver { country: String, ctx: Context ->
val user = ctx.get<User>()
Hello(label = "Hello ${user?.name ?: "unknown"}")
}
}
``` | 1.0 | Provide documentation on Context usage - Currently there is no documentation on how to use context and the `NotIntrospected` annotation.
I quickly wrote an example of how it could be used:
```kotlin
val query = """
query fetchHelloLabel($country: String!) {
hello(country: $country) {
label
}
}
"""
val variables = """
{"country": "English"}
"""
val user = User(id = 1, name = "Username")
val ctx = context {
+user
}
schema.execute(query, variables, ctx)
...
// In your schema definition
query("hello") {
resolver { country: String, ctx: Context ->
val user = ctx.get<User>()
Hello(label = "Hello ${user?.name ?: "unknown"}")
}
}
``` | non_comp | provide documentation on context usage currently there is no documentation on how to use context and the notintrospected annotation i quickly wrote an example of how it could be used kotlin val query query fetchhellolabel country string hello country country label val variables country english val user user id name username val ctx context user schema execute query variables ctx in your schema definition query hello resolver country string ctx context val user ctx get hello label hello user name unknown | 0 |
10,417 | 12,390,390,907 | IssuesEvent | 2020-05-20 10:35:22 | sparna-git/Sparnatural | https://api.github.com/repos/sparna-git/Sparnatural | opened | Ability to select the source graph from within the query editor | Blue Sky SPARQL compatibility enhancement | Should generate a `FROM` clause.
This is useful to select the source of the data being queried.
For the moment there is no need to edit a `GRAPH` clause inside a subpart of the query (which would be clause to a `SERVICE` clause) | True | Ability to select the source graph from within the query editor - Should generate a `FROM` clause.
This is useful to select the source of the data being queried.
For the moment there is no need to edit a `GRAPH` clause inside a subpart of the query (which would be clause to a `SERVICE` clause) | comp | ability to select the source graph from within the query editor should generate a from clause this is useful to select the source of the data being queried for the moment there is no need to edit a graph clause inside a subpart of the query which would be clause to a service clause | 1 |
19,713 | 27,349,098,125 | IssuesEvent | 2023-02-27 08:12:36 | aesara-devs/aesara | https://api.github.com/repos/aesara-devs/aesara | reopened | The naming of the division `Op`s is inconsistent with NumPy's | enhancement good first issue help wanted refactor NumPy compatibility | Numpy [provides the following API](https://numpy.org/doc/stable/reference/generated/numpy.remainder.html) to perform division element-wise:
```python
import numpy as np
np.true_divide
np.divide # alias to `numpy.true_divide`
np.floor_divide
```
While Aesara uses the following names:
```python
import aesara.tensor as at
at.true_div
at.floor_div
at.int_div # an alias for `floor_div`
```
My suggestion is thus
- [ ] `s/at.true_div/at.true_divide`
- [ ] `s/at.floor_div/at.floor_divide`
- [ ] Deprecate the alias `at.int_div`
- [ ] Add an alias `at.divide` to `at.true_divide`
And while we're at it:
- [ ] Rename `at.mod` to `at.remainder` and set `at.mod` as an alias to `at.remainder` (like NumPy)
- [ ] Add `at.fmod`
As an aside the documentation does not reference the API that is currently implemented, it stills references `truediv`, `intdiv` and `floordiv` instead of respectively `true_div`, `int_div` and `floor_div`. | True | The naming of the division `Op`s is inconsistent with NumPy's - Numpy [provides the following API](https://numpy.org/doc/stable/reference/generated/numpy.remainder.html) to perform division element-wise:
```python
import numpy as np
np.true_divide
np.divide # alias to `numpy.true_divide`
np.floor_divide
```
While Aesara uses the following names:
```python
import aesara.tensor as at
at.true_div
at.floor_div
at.int_div # an alias for `floor_div`
```
My suggestion is thus
- [ ] `s/at.true_div/at.true_divide`
- [ ] `s/at.floor_div/at.floor_divide`
- [ ] Deprecate the alias `at.int_div`
- [ ] Add an alias `at.divide` to `at.true_divide`
And while we're at it:
- [ ] Rename `at.mod` to `at.remainder` and set `at.mod` as an alias to `at.remainder` (like NumPy)
- [ ] Add `at.fmod`
As an aside the documentation does not reference the API that is currently implemented, it stills references `truediv`, `intdiv` and `floordiv` instead of respectively `true_div`, `int_div` and `floor_div`. | comp | the naming of the division op s is inconsistent with numpy s numpy to perform division element wise python import numpy as np np true divide np divide alias to numpy true divide np floor divide while aesara uses the following names python import aesara tensor as at at true div at floor div at int div an alias for floor div my suggestion is thus s at true div at true divide s at floor div at floor divide deprecate the alias at int div add an alias at divide to at true divide and while we re at it rename at mod to at remainder and set at mod as an alias to at remainder like numpy add at fmod as an aside the documentation does not reference the api that is currently implemented it stills references truediv intdiv and floordiv instead of respectively true div int div and floor div | 1 |
32,723 | 12,142,522,395 | IssuesEvent | 2020-04-24 01:57:00 | jerry-zhang-DYG/MyHealthClinicInternal | https://api.github.com/repos/jerry-zhang-DYG/MyHealthClinicInternal | opened | CVE-2017-11556 (High) detected in libsass-3.3.6 | security vulnerability | ## CVE-2017-11556 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>libsass3.3.6</b></p></summary>
<p>
<p>A C/C++ implementation of a Sass compiler</p>
<p>Library home page: <a href=https://github.com/sass/libsass.git>https://github.com/sass/libsass.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/jerry-zhang-DYG/MyHealthClinicInternal/commit/fbc85176537c4ab2b83d1dba87b5b56a86fafe58">fbc85176537c4ab2b83d1dba87b5b56a86fafe58</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (119)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/to_value.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/source_map.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/constants.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/to_c.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/memory_manager.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/node.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/expand.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/listize.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/output.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/parser.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/backtrace.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/utf8.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/units.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/test/test_node.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/number.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass_util.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/custom_importer_bridge.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/null.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/string.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/contrib/plugin.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/listize.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/lexer.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass_util.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/test/test_superselector.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/units.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/cencode.c
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/ast_factory.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass_functions.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/ast.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/utf8_string.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/memory_manager.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/constants.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/to_c.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/util.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/color.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/eval.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass_context.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/output.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/operation.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/custom_function_bridge.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/inspect.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/file.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/include/sass/values.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/paths.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass_values.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/boolean.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/test/test_unification.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/list.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/file.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/position.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/expand.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/context.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/inspect.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/parser.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/extend.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/list.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/bind.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/map.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/values.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/b64/cencode.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/emitter.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/debugger.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/utf8/checked.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/util.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/cssize.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_context_wrapper.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/json.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/position.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/emitter.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/eval.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/sass2scss.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/functions.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/functions.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/utf8/core.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/custom_importer_bridge.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/ast.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/utf8_string.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/include/sass/functions.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/lexer.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_context_wrapper.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/b64/encode.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/base64vlq.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/to_value.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/cssize.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/environment.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/c99func.c
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/test/test_subset_map.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/subset_map.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/include/sass/base.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/binding.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/source_map.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/include/sass2scss.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/plugins.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/extend.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/node.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/plugins.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/environment.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/include/sass/context.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/color.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/color_maps.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/debug.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/json.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/context.hpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/sass_types/factory.cpp
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/utf8/unchecked.h
- /MyHealthClinicInternal/src/MyHealth.Web/node_modules/node-sass/src/libsass/src/color_maps.hpp
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
There is a stack consumption vulnerability in the Parser::advanceToNextToken function in parser.cpp in LibSass 3.4.5. A crafted input may lead to remote denial of service.
<p>Publish Date: 2017-07-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-11556>CVE-2017-11556</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
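For reference, the base-score metrics listed above correspond to the following CVSS 3.0 vector string (an equivalence derived from the metric values, not quoted from the report itself):

```
CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
```

This is the standard 7.5 profile for an unauthenticated, network-reachable denial of service: trivially exploitable, no confidentiality or integrity impact, high availability impact.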
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/sass/libsass/commit/7664114543757e932f5b1a2ff5295aa9b34f8623#diff-bd8f167953cc43fb7cba7d420fbd1d48">https://github.com/sass/libsass/commit/7664114543757e932f5b1a2ff5295aa9b34f8623#diff-bd8f167953cc43fb7cba7d420fbd1d48</a></p>
<p>Release Date: 2017-07-23</p>
<p>Fix Resolution: LibSass - 3.5.0</p>
</p>
</details>
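Since the fix resolution above is LibSass 3.5.0, a quick triage helper can decide whether a given bundled LibSass version is affected. This is a minimal sketch, not part of the WhiteSource tooling:

```python
def libsass_vulnerable(version: str) -> bool:
    """CVE-2017-11556 is fixed in LibSass 3.5.0; earlier releases are affected."""
    # Compare the numeric major.minor.patch tuple against the fix release.
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts < (3, 5, 0)

# The bundled copy flagged in this report (3.3.6) and the version named in
# the CVE text (3.4.5) both sort below the 3.5.0 fix release.
print(libsass_vulnerable("3.3.6"))  # True
print(libsass_vulnerable("3.5.0"))  # False
```

In practice the bundled LibSass version is tied to the node-sass release, so remediation typically means upgrading the `node-sass` dependency to a release that ships LibSass 3.5.0 or later, rather than patching the vendored sources under `node_modules`.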
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
9,150 | 11,177,270,003 | IssuesEvent | 2019-12-30 10:08:33 | quanteda/quanteda | https://api.github.com/repos/quanteda/quanteda | closed | Update corpus_sample() for more robustness w/v2 | compatibility dev-corpus2 | To be more compatible with the issue described here:
https://github.com/sborms/sentometrics/issues/8 | True | Update corpus_sample() for more robustness w/v2 - To be more compatible with the issue described here:
https://github.com/sborms/sentometrics/issues/8 | comp | update corpus sample for more robustness w to be more compatible with the issue described here | 1 |
4,251 | 2,610,090,235 | IssuesEvent | 2015-02-26 18:27:19 | chrsmith/dsdsdaadf | https://api.github.com/repos/chrsmith/dsdsdaadf | opened | How to remove acne in Shenzhen (深圳痘痘怎么样祛) | auto-migrated Priority-Medium Type-Defect | ```
How to remove acne in Shenzhen [Shenzhen Hanfang Keyan national hotline 400-869-1818,
24-hour QQ 4008691818]. Shenzhen Hanfang Keyan is a professional acne-removal chain built
around a Korean secret formula, Hanfang Keyan, a state-licensed authoritative treatment
brand and premium acne remedy. The chain combines the Korean formula with a professional
"no-rebound" healthy acne-removal technique and an advanced "deluxe color-light" device,
pioneering contracted, guaranteed treatment of pimples and acne in China, and has
successfully cleared acne from many customers' faces.
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 7:40 | 1.0 | 深圳痘痘怎么样祛 - ```
How to remove acne in Shenzhen [Shenzhen Hanfang Keyan national hotline 400-869-1818,
24-hour QQ 4008691818]. Shenzhen Hanfang Keyan is a professional acne-removal chain built
around a Korean secret formula, Hanfang Keyan, a state-licensed authoritative treatment
brand and premium acne remedy. The chain combines the Korean formula with a professional
"no-rebound" healthy acne-removal technique and an advanced "deluxe color-light" device,
pioneering contracted, guaranteed treatment of pimples and acne in China, and has
successfully cleared acne from many customers' faces.
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 7:40 | non_comp | 深圳痘痘怎么样祛 深圳痘痘怎么样祛【 , 】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘�� �——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方� ��颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健 康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专业�� �疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘� ��。 original issue reported on code google com by szft com on may at | 0 |
15,692 | 20,253,090,654 | IssuesEvent | 2022-02-14 19:59:27 | oshi/oshi | https://api.github.com/repos/oshi/oshi | closed | oshi 6.1.1 release: malformed pom files | confirmed bug compatibility | gradle complains about
```
[Fatal Error] oshi-core-6.1.1.pom:4:10: Already seen doctype.
```
oshi-core-6.1.1.pom:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE project>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
```
pom files of previous releases don't contain ```<!DOCTYPE project>```. | True | oshi 6.1.1 release: malformed pom files - gradle complains about
```
[Fatal Error] oshi-core-6.1.1.pom:4:10: Already seen doctype.
```
oshi-core-6.1.1.pom:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE project>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
```
pom files of previous releases don't contain ```<!DOCTYPE project>```. | comp | oshi release malformed pom files gradle complains about oshi core pom already seen doctype oshi core pom project xmlns xmlns xsi xsi schemalocation pom files of previous releases don t contain | 1 |
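For context on the Gradle error in this record: Maven POMs are plain XML and normally carry no DOCTYPE declaration, and Gradle's strict metadata parser rejects POMs that include one. A small hypothetical check (not part of oshi or Gradle) that flags such files:

```python
import re

def has_doctype(pom_text: str) -> bool:
    """Return True if the POM text contains a <!DOCTYPE ...> declaration.

    Maven POMs are plain XML with no DOCTYPE; a POM that carries one can be
    rejected by Gradle's metadata parser ("Already seen doctype").
    """
    return re.search(r"<!DOCTYPE\b", pom_text) is not None

# Illustrative inputs, modeled on the POMs discussed in this issue.
good_pom = '<?xml version="1.0" encoding="UTF-8"?>\n<project/>'
bad_pom = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<!DOCTYPE project>\n<project/>')
```

Such a check could be run against a local artifact cache to find affected releases; it is a sketch, not a fix for the published POM itself.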
15,166 | 19,161,246,565 | IssuesEvent | 2021-12-03 00:35:30 | AlphaNodes/additionals | https://api.github.com/repos/AlphaNodes/additionals | closed | base deface error | help wanted compatibility | Hello,
I am trying to install stable branch in to our instance.
giving below error during migration:
```
Bundler could not find compatible versions for gem "deface":

  In snapshot (Gemfile.lock):
    deface (= 1.5.3)

  In Gemfile:
    deface (= 1.5.3)

    additionals was resolved to 3.0.3, which depends on
      deface (= 1.8.1)

Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
```
redmine_base_deface is installed and updated.
What can I do?
Environment:
Redmine version 4.2.1.stable
Ruby version 2.5.7-p206 (2019-10-01) [x86_64-linux]
Rails version 5.2.5
Environment production
Database adapter Mysql2
Best Regards,
| True | base deface error - Hello,
I am trying to install stable branch in to our instance.
giving below error during migration:
```
Bundler could not find compatible versions for gem "deface":

  In snapshot (Gemfile.lock):
    deface (= 1.5.3)

  In Gemfile:
    deface (= 1.5.3)

    additionals was resolved to 3.0.3, which depends on
      deface (= 1.8.1)

Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
```
redmine_base_deface is installed and updated.
What can I do?
Environment:
Redmine version 4.2.1.stable
Ruby version 2.5.7-p206 (2019-10-01) [x86_64-linux]
Rails version 5.2.5
Environment production
Database adapter Mysql2
Best Regards,
| comp | base deface error hello i am trying to install stable branch in to our instance giving below error during migration bundler could not find compatible versions for gem deface in snapshot gemfile lock deface in gemfile deface additionals was resolved to which depends on deface running bundle update will rebuild your snapshot from scratch using only the gems in your gemfile which may resolve the conflict redmine base deface is installed and updated what can i do environment redmine version stable ruby version rails version environment production database adapter best regards | 1 |
9,249 | 11,234,111,286 | IssuesEvent | 2020-01-09 03:51:24 | krzychu124/Cities-Skylines-Traffic-Manager-President-Edition | https://api.github.com/repos/krzychu124/Cities-Skylines-Traffic-Manager-President-Edition | opened | Updates to mod incompatibility checker | COMPATIBILITY adjustments required ⏸Paused | I'm pondering making some additional changes to mod incompatibility checker _after_ LABS gets updated to v11, so probably part of v11.1 release:
* Definitions should specify criticality of the incompatibility:
* Critical - guaranteed to break TM:PE in a bad way
 * Major - likely to break TM:PE but in a recoverable way (disable the conflicting mod and things return to normal)
* Minor - causes some problems but user can choose to ignore
* Might be better to have a class that defines the incompatible mods, rather than txt file
* Critical conflicts will always be checked, regardless of user settings
* Distinction between major/minor is possibly irrelevant but just putting the idea out there for feedback
Thoughts? | True | Updates to mod incompatibility checker - I'm pondering making some additional changes to mod incompatibility checker _after_ LABS gets updated to v11, so probably part of v11.1 release:
* Definitions should specify criticality of the incompatibility:
* Critical - guaranteed to break TM:PE in a bad way
 * Major - likely to break TM:PE but in a recoverable way (disable the conflicting mod and things return to normal)
* Minor - causes some problems but user can choose to ignore
* Might be better to have a class that defines the incompatible mods, rather than txt file
* Critical conflicts will always be checked, regardless of user settings
* Distinction between major/minor is possibly irrelevant but just putting the idea out there for feedback
Thoughts? | comp | updates to mod incompatibility checker i m pondering making some additional changes to mod incompatibility checker after labs gets updated to so probably part of release definitions should specify criticality of the incompatibility critical guaranteed to break tm pe in a bad way major likely to break tm pe but in a recoverable way diable conflicting mod an things return to normal minor causes some problems but user can choose to ignore might be better to have a class that defines the incompatible mods rather than txt file critical conflicts will always be checked regardless of user settings distinction between major minor is possibly irrelevant but just putting the idea out there for feedback thoughts | 1 |
13,456 | 15,876,248,555 | IssuesEvent | 2021-04-09 08:07:50 | docker/compose-cli | https://api.github.com/repos/docker/compose-cli | closed | docker compose up --build forces build but does not restart container | bug 🐞 compatibility compose | <!--
If you are reporting a new issue, make sure that we do not have any duplicates
already open. You can ensure this by searching the issue list for this
repository. If there is a duplicate, please close your issue and add a comment
to the existing issue instead.
If you suspect your issue is a bug, please edit your issue description to
include the BUG REPORT INFORMATION shown below. If you fail to provide this
information within 7 days, we cannot debug your issue and will close it. We
will, however, reopen it if you later provide the information.
For more information about reporting issues, see
https://github.com/docker/compose-cli/blob/master/CONTRIBUTING.md#reporting-other-issues
---------------------------------------------------
GENERAL SUPPORT INFORMATION
---------------------------------------------------
The GitHub issue tracker is for bug reports and feature requests.
General support can be found at the following locations:
- Docker Support Forums - https://forums.docker.com
- Docker Community Slack - https://dockr.ly/community
- Post a question on StackOverflow, using the Docker tag
---------------------------------------------------
BUG REPORT INFORMATION
---------------------------------------------------
Use the commands below to provide key information from your environment:
You do NOT have to include this information if this is a FEATURE REQUEST
-->
**Description**
<!--
Briefly describe the problem you are having in a few paragraphs.
-->
`docker compose up --build -d` forces rebuild of images if composefile includes build section for images.
However, if the containers are already started, this command does not restart them. They are only restarted if I add the `--force-recreate` option, whereas legacy `docker-compose` restarts containers after rebuilding their images.
**Steps to reproduce the issue:**
1. `docker compose up -d` a project
2. Update content
3. `docker compose up --build -d` to update and deploy changes
4. See that updates are not deployed
5. `docker compose up --build -d --force-recreate` to see updates deployed
**Describe the results you expected:**
No need to add the force-recreate option to restart container
| True | docker compose up --build forces build but does not restart container - <!--
If you are reporting a new issue, make sure that we do not have any duplicates
already open. You can ensure this by searching the issue list for this
repository. If there is a duplicate, please close your issue and add a comment
to the existing issue instead.
If you suspect your issue is a bug, please edit your issue description to
include the BUG REPORT INFORMATION shown below. If you fail to provide this
information within 7 days, we cannot debug your issue and will close it. We
will, however, reopen it if you later provide the information.
For more information about reporting issues, see
https://github.com/docker/compose-cli/blob/master/CONTRIBUTING.md#reporting-other-issues
---------------------------------------------------
GENERAL SUPPORT INFORMATION
---------------------------------------------------
The GitHub issue tracker is for bug reports and feature requests.
General support can be found at the following locations:
- Docker Support Forums - https://forums.docker.com
- Docker Community Slack - https://dockr.ly/community
- Post a question on StackOverflow, using the Docker tag
---------------------------------------------------
BUG REPORT INFORMATION
---------------------------------------------------
Use the commands below to provide key information from your environment:
You do NOT have to include this information if this is a FEATURE REQUEST
-->
**Description**
<!--
Briefly describe the problem you are having in a few paragraphs.
-->
`docker compose up --build -d` forces rebuild of images if composefile includes build section for images.
However, if the containers are already started, this command does not restart them. They are only restarted if I add the `--force-recreate` option, whereas legacy `docker-compose` restarts containers after rebuilding their images.
**Steps to reproduce the issue:**
1. `docker compose up -d` a project
2. Update content
3. `docker compose up --build -d` to update and deploy changes
4. See that updates are not deployed
5. `docker compose up --build -d --force-recreate` to see updates deployed
**Describe the results you expected:**
No need to add the force-recreate option to restart container
| comp | docker compose up build forces build but does not restart container if you are reporting a new issue make sure that we do not have any duplicates already open you can ensure this by searching the issue list for this repository if there is a duplicate please close your issue and add a comment to the existing issue instead if you suspect your issue is a bug please edit your issue description to include the bug report information shown below if you fail to provide this information within days we cannot debug your issue and will close it we will however reopen it if you later provide the information for more information about reporting issues see general support information the github issue tracker is for bug reports and feature requests general support can be found at the following locations docker support forums docker community slack post a question on stackoverflow using the docker tag bug report information use the commands below to provide key information from your environment you do not have to include this information if this is a feature request description briefly describe the problem you are having in a few paragraphs docker compose up build d forces rebuild of images if composefile includes build section for images however if the containers are already started this command does not restart the container it will only restart the container if i add the force recreate option where docker compose will restart containers after rebuilding images steps to reproduce the issue docker compose up d a project update content docker compose up build d to update and deploy changes see that updates are not deployed docker compose up build d force recreate to see updates deployed describe the results you expected no need to add the force recreate option to restart container | 1 |
673,150 | 22,949,567,848 | IssuesEvent | 2022-07-19 05:51:56 | adanvdo/YT-RED-UI | https://api.github.com/repos/adanvdo/YT-RED-UI | closed | Reddit Gif Download Support | enhancement implemented Low Priority | Add Support for downloading GIF posts from reddit.
The underlying format of the GIF is actually a video file. Access to the gif url without the format=mp4 param results in 403
This may require conversion | 1.0 | Reddit Gif Download Support - Add Support for downloading GIF posts from reddit.
The underlying format of the GIF is actually a video file. Access to the gif url without the format=mp4 param results in 403
This may require conversion | non_comp | reddit gif download support add support for downloading gif posts from reddit the underlying format of the gif is actually a video file access to the gif url without the format param results in this may require conversion | 0 |
106,260 | 9,125,895,117 | IssuesEvent | 2019-02-24 17:23:04 | spring-projects/spring-framework | https://api.github.com/repos/spring-projects/spring-framework | closed | Introduce strategy for determining if a profile value is enabled for a particular test environment [SPR-4862] | in: test type: enhancement | **[Sam Brannen](https://jira.spring.io/secure/ViewProfile.jspa?name=sbrannen)** opened **[SPR-4862](https://jira.spring.io/browse/SPR-4862?redirect=false)** and commented
ProfileValueSourceConfiguration allows one to configure an implementation for the ProfileValueSource strategy; however, AbstractJUnit38SpringContextTests and SpringJUnit4ClassRunner are currently hard coded to use ProfileValueUtils.isTestEnabledInThisEnvironment() which only matches on an exact _value_.
It would be beneficial to be able to configure different strategies for determining if a profile value is enabled for the current environment. The above classes could then delegate to the configured strategy. For backwards compatibility, the default strategy should simply delegate to ProfileValueUtils.isTestEnabledInThisEnvironment(); whereas, additional, pluggable strategies (e.g., one using regular expressions) could be provided for greater flexibility.
Further input from #10572:
It is a typical situation when one might have to run a combination of tests belonging to different test-groups. For example, a test method A is part of the 'smoke' test group, and test method B belongs to 'integration' test group. If you want to execute test methods A and B, you have to run two separate JUnit runs: one for 'smoke' and a separate one for 'integration'.
It would be great to have a way to combine tests from more than one test group in a single test run, for example:
```
% ant -Dtest-groups=smoke,integration run-tests
```
A strategy which considers the _value_ to be a comma-separated list could provide such support with OR semantics.
---
**Affects:** 2.5.4, 2.5.5, 2.5.6
**Reference URL:** http://forum.springframework.org/showthread.php?p=182516
**Issue Links:**
- #12615 TestContext framework should support declarative configuration of bean definition profiles
- #13622 Allow overriding `@ActiveProfiles` in test classes with system property
- #16300 Introduce annotation to skip test based on active Spring profile
- #8334 Create annotation to group tests
- #10572 Allow multiple values be specified in the runtime for tests filtering by `@IfProfileValue`
- #10572 Allow multiple values be specified in the runtime for tests filtering by `@IfProfileValue` (_**"supersedes"**_)
5 votes, 3 watchers
| 1.0 | Introduce strategy for determining if a profile value is enabled for a particular test environment [SPR-4862] - **[Sam Brannen](https://jira.spring.io/secure/ViewProfile.jspa?name=sbrannen)** opened **[SPR-4862](https://jira.spring.io/browse/SPR-4862?redirect=false)** and commented
ProfileValueSourceConfiguration allows one to configure an implementation for the ProfileValueSource strategy; however, AbstractJUnit38SpringContextTests and SpringJUnit4ClassRunner are currently hard coded to use ProfileValueUtils.isTestEnabledInThisEnvironment() which only matches on an exact _value_.
It would be beneficial to be able to configure different strategies for determining if a profile value is enabled for the current environment. The above classes could then delegate to the configured strategy. For backwards compatibility, the default strategy should simply delegate to ProfileValueUtils.isTestEnabledInThisEnvironment(); whereas, additional, pluggable strategies (e.g., one using regular expressions) could be provided for greater flexibility.
Further input from #10572:
It is a typical situation when one might have to run a combination of tests belonging to different test-groups. For example, a test method A is part of the 'smoke' test group, and test method B belongs to 'integration' test group. If you want to execute test methods A and B, you have to run two separate JUnit runs: one for 'smoke' and a separate one for 'integration'.
It would be great to have a way to combine tests from more than one test group in a single test run, for example:
```
% ant -Dtest-groups=smoke,integration run-tests
```
A strategy which considers the _value_ to be a comma-separated list could provide such support with OR semantics.
---
**Affects:** 2.5.4, 2.5.5, 2.5.6
**Reference URL:** http://forum.springframework.org/showthread.php?p=182516
**Issue Links:**
- #12615 TestContext framework should support declarative configuration of bean definition profiles
- #13622 Allow overriding `@ActiveProfiles` in test classes with system property
- #16300 Introduce annotation to skip test based on active Spring profile
- #8334 Create annotation to group tests
- #10572 Allow multiple values be specified in the runtime for tests filtering by `@IfProfileValue`
- #10572 Allow multiple values be specified in the runtime for tests filtering by `@IfProfileValue` (_**"supersedes"**_)
5 votes, 3 watchers
| non_comp | introduce strategy for determining if a profile value is enabled for a particular test environment opened and commented profilevaluesourceconfiguration allows one to configure an implementation for the profilevaluesource strategy however and are currently hard coded to use profilevalueutils istestenabledinthisenvironment which only matches on an exact value it would be beneficial to be able to configure different strategies for determining if a profile value is enabled for the current environment the above classes could then delegate to the configured strategy for backwards compatibility the default strategy should simply delegate to profilevalueutils istestenabledinthisenvironment whereas additional pluggable strategies e g one using regular expressions could be provided for greater flexibility further input from it is a typical situation when one might have to run a combination of tests belonging to different test groups for example a test method a is part of the smoke test group and test method b belongs to integration test group if you want to execute test methods a and b you have to run two separate junit runs one for smoke and a separate one for integration it would be great to have a way to combine tests from more than one test group in a single test run for example ant dtest groups smoke integration run tests a strategy which considers the value to be a comma separated list could provide such support with or semantics affects reference url issue links testcontext framework should support declarative configuration of bean definition profiles allow overriding activeprofiles in test classes with system property introduce annotation to skip test based on active spring profile create annotation to group tests allow multiple values be specified in the runtime for tests filtering by ifprofilevalue allow multiple values be specified in the runtime for tests filtering by ifprofilevalue supersedes votes watchers | 0 |
5,049 | 7,642,934,043 | IssuesEvent | 2018-05-08 10:55:15 | AdguardTeam/AdguardForAndroid | https://api.github.com/repos/AdguardTeam/AdguardForAndroid | closed | com.google.android.apps.photos | compatibility | @adguard-bot commented on [Mon Apr 30 2018](https://github.com/AdguardTeam/AdguardFilters/issues/15515)
### Issue URL (Incorrect Blocking)
[https://play.google.com/store/apps/details?id=com.google.android.apps.photos&hl=en](https://adguardteam.github.io/AnonymousRedirect/redirect.html?url=https%3A%2F%2Fplay.google.com%2Fstore%2Fapps%2Fdetails%3Fid%3Dcom.google.android.apps.photos%26hl%3Den)
### Comment
> When Adguard is enabled for the Google Photo app, photos do not sync over Wi-Fi. Only force-disabling Adguard for Google Photo resolves the problem. There are no restrictions set, and HTTPS filtering for the app is absent. The problem persists.
>
### Screenshots
<details>
<summary>Screenshot 1</summary>

</details>
### System configuration
Information | value
--- | ---
Platform: | And 8.1
AdGuard version: | 2.11.81
AdGuard mode: | VPN
Filtering quality: | High-speed
HTTPS filtering: | enabled
Filters: | Russian,<br>English,<br>Spyware,<br>Social media,<br>Mobile Ads,<br>Annoyances
---
@Eugene-Savenko commented on [Mon Apr 30 2018](https://github.com/AdguardTeam/AdguardFilters/issues/15515#issuecomment-385351945)
debug is here - 1631686
---
@Alex-302 commented on [Thu May 03 2018](https://github.com/AdguardTeam/AdguardFilters/issues/15515#issuecomment-386385757)
It syncs for me.
@vozersky check on your end.
---
@vozersky commented on [Thu May 03 2018](https://github.com/AdguardTeam/AdguardFilters/issues/15515#issuecomment-386394494)
Reproduced (specifically on 8.1)
| True | com.google.android.apps.photos - @adguard-bot commented on [Mon Apr 30 2018](https://github.com/AdguardTeam/AdguardFilters/issues/15515)
### Issue URL (Incorrect Blocking)
[https://play.google.com/store/apps/details?id=com.google.android.apps.photos&hl=en](https://adguardteam.github.io/AnonymousRedirect/redirect.html?url=https%3A%2F%2Fplay.google.com%2Fstore%2Fapps%2Fdetails%3Fid%3Dcom.google.android.apps.photos%26hl%3Den)
### Comment
> When Adguard is enabled for the Google Photo app, photos do not sync over Wi-Fi. Only force-disabling Adguard for Google Photo resolves the problem. There are no restrictions set, and HTTPS filtering for the app is absent. The problem persists.
>
### Screenshots
<details>
<summary>Screenshot 1</summary>

</details>
### System configuration
Information | value
--- | ---
Platform: | And 8.1
AdGuard version: | 2.11.81
AdGuard mode: | VPN
Filtering quality: | High-speed
HTTPS filtering: | enabled
Filters: | Russian,<br>English,<br>Spyware,<br>Social media,<br>Mobile Ads,<br>Annoyances
---
@Eugene-Savenko commented on [Mon Apr 30 2018](https://github.com/AdguardTeam/AdguardFilters/issues/15515#issuecomment-385351945)
debug is here - 1631686
---
@Alex-302 commented on [Thu May 03 2018](https://github.com/AdguardTeam/AdguardFilters/issues/15515#issuecomment-386385757)
It syncs for me.
@vozersky check on your end.
---
@vozersky commented on [Thu May 03 2018](https://github.com/AdguardTeam/AdguardFilters/issues/15515#issuecomment-386394494)
Reproduced (specifically on 8.1)
| comp | com google android apps photos adguard bot commented on issue url incorrect blocking comment при включении adguard для приложения google photo фотографии не синхронизируются через wi fi только принудительное выключение adguard для google photo решает проблему ограничений нет фильтрация по https отсутствует проблема остаётся screenshots screenshot system configuration information value platform and adguard version adguard mode vpn filtering quality high speed https filtering enabled filters russian english spyware social media mobile ads annoyances eugene savenko commented on debug is here alex commented on у меня синхронизиреутся vozersky глянь у себя vozersky commented on повторилось именно на | 1 |
4,388 | 7,084,469,599 | IssuesEvent | 2018-01-11 07:03:15 | Yoast/wordpress-seo | https://api.github.com/repos/Yoast/wordpress-seo | closed | Yoast SEO and Divi Theme Conflict | compatibility divi wait for feedback | <!-- Please use this template when creating an issue.
- Please check the boxes after you've created your issue.
- Please use the latest version of Yoast SEO.-->
* [X] I've read and understood the [contribution guidelines](https://github.com/Yoast/wordpress-seo/blob/trunk/.github/CONTRIBUTING.md).
* [X] I've searched for any related issues and avoided creating a duplicate issue.
### Please give us a description of what happened.
I use the Divi page development theme from Elegant Themes and I have a website (https://stanleytotallivingcenter.org) that I have the Yoast SEO plug-in installed on. This week, my ability to edit the site stopped: the Divi page builder backend would not load when I tried to edit a page. I contacted Elegant Themes for help and they informed me that a plug-in was causing the page builder not to load. They provided me with a screenshot of the errors when they tried to access the site to edit it. I have attached a PNG of their screenshot.

Based on their suggestion to disable all plug-ins and then re-enable them one at a time, I have isolated the issue to the Yoast SEO plug-in. When it is off, I can edit pages on the site. When it is activated, the ability to load the Divi Page Builder goes away.
I have currently disabled the plug-in so I can edit the site, but I would love to bring it back as I find it to be a vital part of my site maintenance and SEO efforts.
Can you please take a look and see what might be going on?
### Please describe what you expected to happen and why.
### How can we reproduce this behavior?
1. Use the Divi Theme
2. Install Yoast SEO
3.
### Technical info
* WordPress version: 4.9.1
* Yoast SEO version: 6.0
* Relevant plugins in case of a bug: | True | Yoast SEO and Divi Theme Conflict - <!-- Please use this template when creating an issue.
- Please check the boxes after you've created your issue.
- Please use the latest version of Yoast SEO.-->
* [X] I've read and understood the [contribution guidelines](https://github.com/Yoast/wordpress-seo/blob/trunk/.github/CONTRIBUTING.md).
* [X] I've searched for any related issues and avoided creating a duplicate issue.
### Please give us a description of what happened.
I use the Divi page development theme from Elegant Themes and I have a website (https://stanleytotallivingcenter.org) that I have the Yoast SEO plug-in installed on. This week, my ability to edit the site stopped: the Divi page builder backend would not load when I tried to edit a page. I contacted Elegant Themes for help and they informed me that a plug-in was causing the page builder not to load. They provided me with a screenshot of the errors when they tried to access the site to edit it. I have attached a PNG of their screenshot.

Based on their suggestion to disable all plug-ins and then re-enable them one at a time, I have isolated the issue to the Yoast SEO plug-in. When it is off, I can edit pages on the site. When it is activated, the ability to load the Divi Page Builder goes away.
I have currently disabled the plug-in so I can edit the site, but I would love to bring it back as I find it to be a vital part of my site maintenance and SEO efforts.
Can you please take a look and see what might be going on?
### Please describe what you expected to happen and why.
### How can we reproduce this behavior?
1. Use the Divi Theme
2. Install Yoast SEO
3.
### Technical info
* WordPress version: 4.9.1
* Yoast SEO version: 6.0
* Relevant plugins in case of a bug: | comp | yoast seo and divi theme conflict please use this template when creating an issue please check the boxes after you ve created your issue please use the latest version of yoast seo i ve read and understood the i ve searched for any related issues and avoided creating a duplicate issue please give us a description of what happened i use the divi page development theme from elegant themes and i have a website that i have the yoast seo plug in installed on this week my ability to edit the site stopped as the divi page builder backend would not load when i would try to edit the page i contacted elegant themes for help and they informed me that a plug in was causing the page builder not to load the provided me with a screenshot with the errors when they tried to access the site to edit it i have attached a png of their screenshot based on their suggestion to disable all plug ins and then re enable them one at a time i have isolated the issue to the yoast seo plug in when it is off i can edit pages on the site when it is activated the ability to load the divi page builder goes away i have currently disabled the plug in so i can edit the site but i would love to bring it back as i find it to be a vital part of my site maintenance and seo efforts can you please take a look and see what might be going on please describe what you expected to happen and why how can we reproduce this behavior use the divi theme install yoast seo technical info wordpress version yoast seo version relevant plugins in case of a bug | 1 |
19,865 | 27,568,630,472 | IssuesEvent | 2023-03-08 07:15:26 | KeinNiemand/Factorio-GhostOnWater | https://api.github.com/repos/KeinNiemand/Factorio-GhostOnWater | opened | Add support for Platform | enhancement mod compatibility | Add Support for landfill types from the Platform mod (https://mods.factorio.com/mod/platforms) suggested via Discord. | True | Add support for Platform - Add Support for landfill types from the Platform mod (https://mods.factorio.com/mod/platforms) suggested via Discord. | comp | add support for platform add support for landfill types from the platform mod suggested via discord | 1 |
162,922 | 6,180,297,269 | IssuesEvent | 2017-07-03 04:53:53 | systers/automated-testing | https://api.github.com/repos/systers/automated-testing | closed | Tests for the header and footer links | Priority: HIGH Program: GSoC17 | Write tests for the header and footer links present in all pages (positive, negative and invalid data). For MACC | 1.0 | Tests for the header and footer links - Write tests for the header and footer links present in all pages (positive, negative and invalid data). For MACC | non_comp | tests for the header and footer links write tests for the header and footer links present in all pages positive negative and invalid data for macc | 0 |
8,254 | 10,322,677,710 | IssuesEvent | 2019-08-31 14:32:52 | Direwolf20-MC/BuildingGadgets | https://api.github.com/repos/Direwolf20-MC/BuildingGadgets | closed | Forge .73 crashes due to updates (breaking changes) to IConditionSerializer | 1.14 incompatibility | Awesome mod, loving it so much.
We are getting this crash on client and on server, full crash logs client [here](https://gist.github.com/Wissi/caa86f7525e502ecb6c6da797deee0cb) and server [here](https://gist.github.com/Wissi/29b9d331b308dfacc9513749af8ca64a).
Using:
* forge-1.14.4-28.0.73
* buildinggadgets-3.0.2.jar
* buildinggadgets-3.0.4.jar
| True | Forge .73 crashes due to updates (breaking changes) to IConditionSerializer - Awesome mod, loving it so much.
We are getting this crash on client and on server, full crash logs client [here](https://gist.github.com/Wissi/caa86f7525e502ecb6c6da797deee0cb) and server [here](https://gist.github.com/Wissi/29b9d331b308dfacc9513749af8ca64a).
Using:
* forge-1.14.4-28.0.73
* buildinggadgets-3.0.2.jar
* buildinggadgets-3.0.4.jar
| comp | forge crashes due to updates breaking changes to iconditionserializer awesome mod loving it so much we are getting this crash on client and on server full crash logs client and server using forge buildinggadgets jar buildinggadgets jar | 1 |
15,590 | 20,086,652,286 | IssuesEvent | 2022-02-05 03:52:28 | MuradAkh/LittleLogistics | https://api.github.com/repos/MuradAkh/LittleLogistics | closed | [Feature Request] Support for modded fish from fishing barge | enhancement compatibility | Suggest changing from `FISHING_FISH` on the line below to just `FISHING` or `GAMEPLAY_FISHING`, as it seems most mods (for example Aquaculture 2) inject straight into `minecraft:gameplay/fishing` rather than `gameplay/fishing/fish`. Of course, this would mean the player still has a chance at treasure if they don't win with the treasure-chance modifier, but the chance of rolling the treasure table with an unenchanted rod is super low, so I don't think it's a huge deal there. It also gives the chance of rolling the junk table, haha, so it balances out, I think.
https://github.com/MuradAkh/LittleLogistics/blob/14cd4cce1bd3a4316798756b1024c99d820fe245/src/main/java/dev/murad/shipping/entity/custom/barge/FishingBargeEntity.java#L145
on another note, do you guys have a discord server at all somewhere for the public? | True | [Feature Request] Support for modded fish from fishing barge - Suggest changing from `FISHING_FISH` on the line below to just `FISHING` or `GAMEPLAY_FISHING` as it seems most mods (for example aquaculture 2) inject straight in to `minecraft:gameplay/fishing` vs `gameplay/fishing/fish`. Of course this would mean the player still has a chance at treasure if they dont win with the treasure chance modifier, but the chance of rolling the treasure table with an unenchanted rod is super low so i dont think it's a huge deal there. Also gives the chance at rolling the junk table haha so it balances out i think.
https://github.com/MuradAkh/LittleLogistics/blob/14cd4cce1bd3a4316798756b1024c99d820fe245/src/main/java/dev/murad/shipping/entity/custom/barge/FishingBargeEntity.java#L145
on another note, do you guys have a discord server at all somewhere for the public? | comp | support for modded fish from fishing barge suggest changing from fishing fish on the line below to just fishing or gameplay fishing as it seems most mods for example aquaculture inject straight in to minecraft gameplay fishing vs gameplay fishing fish of course this would mean the player still has a chance at treasure if they dont win with the treasure chance modifier but the chance of rolling the treasure table with an unenchanted rod is super low so i dont think it s a huge deal there also gives the chance at rolling the junk table haha so it balances out i think on another note do you guys have a discord server at all somewhere for the public | 1 |
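The probability argument above (treasure stays rare with an unenchanted rod) can be sanity-checked numerically. The weights below approximate the vanilla `gameplay/fishing` sub-table split for an unenchanted rod; they are stated here as an assumption for illustration, not read from Little Logistics or Minecraft's data files:

```python
# Assumed vanilla-like weights for the gameplay/fishing sub-tables
# (unenchanted rod, no Luck of the Sea) -- illustration only.
WEIGHTS = {"fish": 85, "junk": 10, "treasure": 5}

def roll_probability(category: str) -> float:
    """Chance that a single cast rolls the given sub-table."""
    return WEIGHTS[category] / sum(WEIGHTS.values())
```

Under these assumed weights, rolling the full table instead of the fish-only table would still produce fish the large majority of the time, which matches the "it balances out" reasoning.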
82,109 | 3,603,186,905 | IssuesEvent | 2016-02-03 18:07:43 | concrete5/concrete5 | https://api.github.com/repos/concrete5/concrete5 | closed | Reordering blocks in global area doesn't work | accepted:ready to start priority:love to have type:bug | Steps to reproduce:
- Add content block `A` to a global area
- Add content block `B` to same global area
- Publish changes
- Edit the page again
- Reorder blocks, move `B` above `A`
- Publish changes (or just reload the page)
Blocks' actual order remains the same; the move had no effect.
It seems that this bug was introduced in #3065, reverting changes from it seems to fix this problem (but of course we can't just do that).
The problem seems to be that [`processArrangement`](https://github.com/concrete5/concrete5/blob/develop/web/concrete/controllers/backend/page/arrange_blocks.php#L116) is called for the current page instead of the stack and because of that the [`ids`](https://github.com/concrete5/concrete5/blob/develop/web/concrete/src/Page/Page.php#L387) in executed query are incorrect.
I'm not sure how the versioning should work with this scenario. If we just call `processArrangement` for the stack (and provide correct area id for it) then ordering works, but discarding changes for the page doesn't have any affect on the stacks version..
@aembler or @Remo, maybe you could take a look at this? | 1.0 | Reordering blocks in global area doesn't work - Steps to reproduce:
- Add content block `A` to a global area
- Add content block `B` to same global area
- Publish changes
- Edit the page again
- Reorder blocks, move `B` above `A`
- Publish changes (or just reload the page)
Blocks' actual order remains the same, moving had no affect.
It seems that this bug was introduced in #3065, reverting changes from it seems to fix this problem (but of course we can't just do that).
The problem seems to be that [`processArrangement`](https://github.com/concrete5/concrete5/blob/develop/web/concrete/controllers/backend/page/arrange_blocks.php#L116) is called for the current page instead of the stack and because of that the [`ids`](https://github.com/concrete5/concrete5/blob/develop/web/concrete/src/Page/Page.php#L387) in executed query are incorrect.
I'm not sure how the versioning should work with this scenario. If we just call `processArrangement` for the stack (and provide correct area id for it) then ordering works, but discarding changes for the page doesn't have any affect on the stacks version..
@aembler or @Remo, maybe you could take a look at this? | non_comp | reordering blocks in global area doesn t work steps to reproduce add content block a to a global area add content block b to same global area publish changes edit the page again reorder blocks move b above a publish changes or just reload the page blocks actual order remains the same moving had no affect it seems that this bug was introduced in reverting changes from it seems to fix this problem but of course we can t just do that the problem seems to be that is called for the current page instead of the stack and because of that the in executed query are incorrect i m not sure how the versioning should work with this scenario if we just call processarrangement for the stack and provide correct area id for it then ordering works but discarding changes for the page doesn t have any affect on the stacks version aembler or remo maybe you could take a look at this | 0 |
2,349 | 11,796,147,149 | IssuesEvent | 2020-03-18 10:15:18 | submariner-io/submariner | https://api.github.com/repos/submariner-io/submariner | closed | Get Cluster/Service CIDRs for Helm deploys dynamically from subctl or Armada, don't hardcode | automation shipyard | [Part of the review](https://github.com/submariner-io/submariner/pull/317#discussion_r383649597) of #317 showed the need to remove a hardcoded link between the Cluster/Service CIDR vars passed to Helm deploys and the underlying deployment implementation (ie Armada internal defaults). That connection wasn't introduced by the proposed change, and has a fairly different scope, so it was agreed it should be handled separately.
Ideas so far:
* Use `subctl info` to scrape network details - via @mangelajo
* Well-developed, used by Operator path
* Scrape network details from Armada output - via @tpantelis
* Doesn't make a dependency on subctl | 1.0 | Get Cluster/Service CIDRs for Helm deploys dynamically from subctl or Armada, don't hardcode - [Part of the review](https://github.com/submariner-io/submariner/pull/317#discussion_r383649597) of #317 showed the need to remove a hardcoded link between the Cluster/Service CIDR vars passed to Helm deploys and the underlying deployment implementation (ie Armada internal defaults). That connection wasn't introduced by the proposed change, and has a fairly different scope, so it was agreed it should be handled separately.
Ideas so far:
* Use `subctl info` to scrape network details - via @mangelajo
* Well-developed, used by Operator path
* Scrape network details from Armada output - via @tpantelis
* Doesn't make a dependency on subctl | non_comp | get cluster service cidrs for helm deploys dynamically from subctl or armada don t hardcode of showed the need to remove a hardcoded link between the cluster service cidr vars passed to helm deploys and the underlying deployment implementation ie armada internal defaults that connection wasn t introduced by the proposed change and has a fairly different scope so it was agreed it should be handled separately ideas so far use subctl info to scrape network details via mangelajo well developed used by operator path scrape network details from armada output via tpantelis doesn t make a dependency on subctl | 0 |
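Both ideas reduce to the same pattern: run a command and scrape the CIDR values out of its output instead of hardcoding them. A minimal sketch follows — the line-oriented output format it assumes is for illustration only, not the real `subctl info` or Armada output:

```python
import re

def parse_cidrs(output: str) -> dict:
    """Scrape Cluster/Service CIDR lists from command output.

    Assumes lines like 'Cluster CIDRs: 10.244.0.0/16'; the actual
    subctl/Armada output format may differ.
    """
    cidrs = {}
    for key in ("Cluster CIDRs", "Service CIDRs"):
        match = re.search(rf"{key}:\s*([0-9./, ]+)", output)
        if match:
            cidrs[key] = [c.strip() for c in match.group(1).split(",")]
    return cidrs

sample = (
    "Discovered network details:\n"
    "    Cluster CIDRs: 10.244.0.0/16\n"
    "    Service CIDRs: 100.94.0.0/16, 100.95.0.0/16\n"
)
```

Either source of truth could feed a parser like this, with the Helm deploy then receiving the parsed values as `--set` arguments rather than relying on Armada's internal defaults.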
20,475 | 30,288,913,052 | IssuesEvent | 2023-07-09 02:48:38 | Yesssssman/epicfightmod | https://api.github.com/repos/Yesssssman/epicfightmod | closed | [Bug]: infused crystal sword's on-hit explosion doesn't work | mod incompatibility work-in-progress | ### Have you checked if a similar issue is already reported by someone else?
- [X] I checked there are no similar issues have been reported.
### Have you read the support policy?
- [X] I read it and I accept the policy.
### Are you using the latest Epic Fight and recommended Forge version?
- [X] I checked I'm using latest Epic Fight and recommended Forge version.
### Is this issue related to mod incompatibility?
- [X] This is a mod compatibility issue and I'm aware of the problem.
### The mod
https://www.curseforge.com/minecraft/mc-mods/astral-sorcery
### Minecraft Version
1.16.5
### What happened?
When in battle mode, the chaining lightning bolts from the Lightning Arc perk added by Astral Sorcery work properly,
but the on-hit explosion from the infused crystal sword (also from Astral Sorcery) just doesn't work.
- [X] I checked there are no similar issues have been reported.
### Have you read the support policy?
- [X] I read it and I accept the policy.
### Are you using the latest Epic Fight and recommended Forge version?
- [X] I checked I'm using latest Epic Fight and recommended Forge version.
### Is this issue related to mod incompatibility?
- [X] This is a mod compatibility issue and I'm aware of the problem.
### The mod
https://www.curseforge.com/minecraft/mc-mods/astral-sorcery
### Minecraft Version
1.16.5
### What happened?
when on battle mode, the chaining lightning bolts from the lightning arc perk added by astral sorcery works properly
but the on-hit explosion from the infused crystal sword (also from astral sorcery) just don't work | comp | infused crystal sword s on hit explosion doesn t work have you checked if a similar issue is already reported by someone else i checked there are no similar issues have been reported have you read the support policy i read it and i accept the policy are you using the latest epic fight and recommended forge version i checked i m using latest epic fight and recommended forge version is this issue related to mod incompatibility this is a mod compatibility issue and i m aware of the problem the mod minecraft version what happened when on battle mode the chaining lightning bolts from the lightning arc perk added by astral sorcery works properly but the on hit explosion from the infused crystal sword also from astral sorcery just don t work | 1 |
17,632 | 24,315,318,365 | IssuesEvent | 2022-09-30 05:28:03 | actiontech/sqle | https://api.github.com/repos/actiontech/sqle | closed | Support multiple data sources per workflow order | feature not_compatible | ## Implementation Plan
### Create-Workflow Page
**New APIs**
| API | Method | Purpose | Status |
| ---- | ---- | ---- | ---- |
|/v1/instances/connections|post|Batch-test database connections @taolx0 |Done|
|/v1/task_groups|post|Create an audit task group @taolx0 |Done|
|/v1/task_groups/audit|post|Audit a task group @taolx0 |Done|
**Upgraded APIs**
| API | Method | Purpose | Status |
| ---- | ---- | ---- | ---- |
|/v2/workflows|post|Create a workflow order @ColdWaterLW |Done|
**APIs with backend implementation changes**
| API | Method | Purpose | Change | Status |
| ---- | ---- | ---- | ---- | ---- |
|/v1/instance_tips|get|Get instance dropdown-list contents|Add a filter on workflow template @taolx0 |Done|
|/v1/tasks/audits|post|Create an audit task and audit it|Also set the group id when creating the task @taolx0 |Done|
### Workflow-List Page
**Upgraded APIs**
| API | Method | Purpose | Status |
| ---- | ---- | ---- | ---- |
|/v2/workflows|get|Get the workflow list @taolx0|Done|
### Workflow-Detail Page
**New APIs**
| API | Method | Purpose | Status |
| ---- | ---- | ---- | ---- |
|/v1/workflows/{workflow_id}/tasks/{task_id}/execute|post|Execute the workflow on a single data source @ColdWaterLW|Done|
|/v1/workflows/{workflow_id}/tasks|get|Get an overview of the workflow's data-source tasks @ColdWaterLW |Done|
**Upgraded APIs**
| API | Method | Purpose | Status |
| ---- | ---- | ---- | ---- |
|/v2/workflows/{workflow_id}/|get|Get workflow details @ColdWaterLW |Done|
|/v2/workflows/{workflow_id}/|patch|Update a workflow @ColdWaterLW |Done|
|/v2/workflows/{workflow_id}/tasks/execute|post|Batch-submit SQL for execution @ColdWaterLW|Done|
|/v2/workflows/{workflow_id}/tasks/{task_id}/schedule|put|Schedule execution @sjjian |Done|
**APIs with backend implementation changes**
| API | Method | Purpose | Change | Status |
| ---- | ---- | ---- | ---- | ---- |
|/v1/workflows/{workflow_id}/steps/{workflow_step_id}/approve|post|Approve a step|Model change @ColdWaterLW |Done|
### Follow-up Fixes
- [x] Scheduled-execution task @sjjian [Done]
- [x] Workflow message notification NotificationBody() @ColdWaterLW
- [x] Correct the SQL that cleans up expired tasks @taolx0
**High priority**
- [x] (Fix first) Clean up the legacy workflow status "on_process". Affected APIs: /v1/workflows/{workflow_id}/cancel [post], /v1/workflows/cancel [post], /v1/users/{user_name}/ [delete], /v1/instances/{instance_name}/ [delete] @ColdWaterLW
- [x] (Fix first) Fix the delete-data-source API; correct its SQL @ColdWaterLW
- [x] (Fix first) The /v1/tasks/audits/{task_id}/ [get] API errors for non-admin users; the SQL in (s *Storage) GetWorkflowByTaskId needs correcting @ColdWaterLW
- [x] (Fix first) Fix the data shown on the dashboard page @LZS911 @taolx0
- [x] (Fix first) Workflow-detail page -> workflow progress: once some data sources have been executed or are executing, the workflow must no longer be rejectable @LZS911 @ColdWaterLW
- [x] Workflow-detail page: after a scheduled data source finishes executing, its status should change to "execution succeeded" or "execution failed" @ColdWaterLW
- [x] Workflow-detail page: add an "executed by" column to the data-source overview @LZS911 @ColdWaterLW
- [x] Workflow-detail page: the data-source overview API should also return the instance maintenance windows instance_maintenance_times @ColdWaterLW
- [x] [Backend done] Fix filtering by execution start/end time @taolx0 @LZS911
- [x] (Fix first) [Backend done] Add a new workflow status "executing": every task has been submitted for execution, but some or all tasks are still running @taolx0 @LZS911
- [x] Fix the workflow-status filter on the workflow list @taolx0
**Medium priority**
- [x] (Medium) Fix the report/statistics page: 1. workflow status @taolx0
- [x] (Medium) Workflow-detail page: while a workflow is in the rejected state, execution must not be clickable @LZS911
- [x] Workflow-detail page: while the workflow is awaiting execution and the current user is not the assigned operator, execution must not be clickable @LZS911
**Low priority**
- [x] #902
- [x] (Low) Reorder the overview columns: data source, status, audit pass rate, audit score, pending assignee, executed by, execution start time, execution end time, scheduled execution time, actions @LZS911
- [x] (Low) In the task-group audit API, task groups that have already been audited should not be audited again @taolx0
- [x] (Low) When creating a workflow with differing SQL, verify that every data source has already been audited @LZS911
- [x] When creating a workflow, if the user edited already-audited SQL without re-auditing it, warn them that the existing audit result will be used to create the workflow @LZS911
## Upgrade Plan | True | 工单支持多数据源 - ## 实现方案
### 创建工单页面
**新增接口**
| 接口名 | Method|作用|进度|
| ---- | ---- | ---- | ---- |
|/v1/instances/connections|post|批量测试数据库连接 @taolx0 |完成|
|/v1/task_groups|post|创建审核任务组 @taolx0 |完成|
|/v1/task_groups/audit|post|审核任务组 @taolx0 |完成|
**升级接口**
| 接口名 | Method|作用|进度|
| ---- | ---- | ---- | ---- |
|/v2/workflows|post|创建工单 @ColdWaterLW |完成|
**后端实现变更的接口**
| 接口名 | Method|作用|变更|进度|
| ---- | ---- | ---- | ---- | ---- |
|/v1/instance_tips|get|获取实例下拉列表内容|增加流程模板的筛选条件 @taolx0 |完成|
|/v1/tasks/audits|post|创建审核任务并审核|创建task时要同时添加group id @taolx0 |完成|
### 工单列表页面
**升级接口**
| 接口名 | Method|作用|进度|
| ---- | ---- | ---- | ---- |
|/v2/workflows|get|获取工单列表 @taolx0|完成|
### 工单详情页面
**新增接口**
| 接口名 | Method|作用|进度|
| ---- | ---- | ---- | ---- |
|/v1/workflows/{workflow_id}/tasks/{task_id}/execute|post|工单提交单个数据源上线 @ColdWaterLW|完成|
|/v1/workflows/{workflow_id}/tasks|get|获取工单数据源任务概览 @ColdWaterLW |完成|
**升级接口**
| 接口名 | Method|作用|进度|
| ---- | ---- | ---- | ---- |
|/v2/workflows/{workflow_id}/|get|获取工单详情 @ColdWaterLW |完成|
|/v2/workflows/{workflow_id}/|patch|更新工单 @ColdWaterLW |完成|
|/v2/workflows/{workflow_id}/tasks/execute|post|批量提交SQL上线 @ColdWaterLW|完成|
|/v2/workflows/{workflow_id}/tasks/{task_id}/schedule|put|定时上线 @sjjian |完成|
**后端实现变更的接口**
| 接口名 | Method|作用|变更|进度|
| ---- | ---- | ---- | ---- | ---- |
|/v1/workflows/{workflow_id}/steps/{workflow_step_id}/approve|post|审核通过|model变更 @ColdWaterLW |完成|
### 后续修复
- [x] 定时上线任务 @sjjian 【完成】
- [x] 工单消息通知NotificationBody() @ColdWaterLW
- [x] 更正清理过期task的SQL @taolx0
**优先级高**
- [x] (优先修复)清理旧的工单状态:"on_process",涉及接口:/v1/workflows/{workflow_id}/cancel [post]、/v1/workflows/cancel [post]、/v1/users/{user_name}/ [delete]、/v1/instances/{instance_name}/ [delete] @ColdWaterLW
- [x] (优先修复)修复删除数据源接口,更正SQL @ColdWaterLW
- [x] (优先修复)/v1/tasks/audits/{task_id}/ [get]接口,使用非admin用户时报错,需要更正(s *Storage) GetWorkflowByTaskId的SQL @ColdWaterLW
- [x] (优先修复)修复dashboard页面展示数据 @LZS911 @taolx0
- [x] (优先修复)工单详情页面->工单进度,只要有部分数据源已上线或上线中,就应该不能再驳回 @LZS911 @ColdWaterLW
- [x] 工单详情页面,定时上线的数据源上线完成后状态应变为“上线完成”或“上线失败” @ColdWaterLW
- [x] 工单详情页面,数据源概览新增“上线人”列 @LZS911 @ColdWaterLW
- [x] 工单详情页面,数据源概览接口返回字段增加数据源运维时间instance_maintenance_times @ColdWaterLW
- [x] 【后端完成】修复上线开始、结束时间筛选 @taolx0 @LZS911
- [x] (优先修复)【后端完成】工单状态新增“上线中”,即所有task都点了上线,但有部分或全部task还在上线中的状态 @taolx0 @LZS911
- [x] 修复工单列表工单状态筛选 @taolx0
**优先级中**
- [x] (优先级中)修复报表统计页面: 1. 工单状态 @taolx0
- [x] (优先级中)工单详情页面,工单处于被驳回状态时,应该不能点上线 @LZS911
- [x] 工单详情页面,工单处于待上线阶段,当前用户不是操作用户时,应该不能点上线 @LZS911
**优先级低**
- [x] #902
- [x] (优先级低)概览列表顺序调整:数据源、状态、审核通过率、审核结果评分、待操作人、上线人、上线开始时间、上线结束时间、定时上线时间、操作 @LZS911
- [x] (优先级低)审核任务组接口,已经审核过的任务组不需要再审核 @taolx0
- [x] (优先级低)不同sql情况下创建工单,需要验证所有数据源是否已经审核@LZS911
- [x] 创建工单时,如果用户修改了已审核的SQL没有再点审核,提示用户将会使用已有的审核结果创建工单 @LZS911
## 升级方案 | comp | 工单支持多数据源 实现方案 创建工单页面 新增接口 接口名 method 作用 进度 instances connections post 批量测试数据库连接 完成 task groups post 创建审核任务组 完成 task groups audit post 审核任务组 完成 升级接口 接口名 method 作用 进度 workflows post 创建工单 coldwaterlw 完成 后端实现变更的接口 接口名 method 作用 变更 进度 instance tips get 获取实例下拉列表内容 增加流程模板的筛选条件 完成 tasks audits post 创建审核任务并审核 创建task时要同时添加group id 完成 工单列表页面 升级接口 接口名 method 作用 进度 workflows get 获取工单列表 完成 工单详情页面 新增接口 接口名 method 作用 进度 workflows workflow id tasks task id execute post 工单提交单个数据源上线 coldwaterlw 完成 workflows workflow id tasks get 获取工单数据源任务概览 coldwaterlw 完成 升级接口 接口名 method 作用 进度 workflows workflow id get 获取工单详情 coldwaterlw 完成 workflows workflow id patch 更新工单 coldwaterlw 完成 workflows workflow id tasks execute post 批量提交sql上线 coldwaterlw 完成 workflows workflow id tasks task id schedule put 定时上线 sjjian 完成 后端实现变更的接口 接口名 method 作用 变更 进度 workflows workflow id steps workflow step id approve post 审核通过 model变更 coldwaterlw 完成 后续修复 定时上线任务 sjjian 【完成】 工单消息通知notificationbody coldwaterlw 更正清理过期task的sql 优先级高 优先修复 清理旧的工单状态: on process ,涉及接口: workflows workflow id cancel 、 workflows cancel 、 users user name 、 instances instance name coldwaterlw 优先修复 修复删除数据源接口,更正sql coldwaterlw 优先修复 tasks audits task id 接口,使用非admin用户时报错,需要更正 s storage getworkflowbytaskid的sql coldwaterlw 优先修复 修复dashboard页面展示数据 优先修复 工单详情页面 工单进度,只要有部分数据源已上线或上线中,就应该不能再驳回 coldwaterlw 工单详情页面,定时上线的数据源上线完成后状态应变为“上线完成”或“上线失败” coldwaterlw 工单详情页面,数据源概览新增“上线人”列 coldwaterlw 工单详情页面,数据源概览接口返回字段增加数据源运维时间instance maintenance times coldwaterlw 【后端完成】修复上线开始、结束时间筛选 优先修复 【后端完成】工单状态新增“上线中”,即所有task都点了上线,但有部分或全部task还在上线中的状态 修复工单列表工单状态筛选 优先级中 优先级中 修复报表统计页面: 工单状态 优先级中 工单详情页面,工单处于被驳回状态时,应该不能点上线 工单详情页面,工单处于待上线阶段,当前用户不是操作用户时,应该不能点上线 优先级低 优先级低 概览列表顺序调整:数据源、状态、审核通过率、审核结果评分、待操作人、上线人、上线开始时间、上线结束时间、定时上线时间、操作 优先级低 审核任务组接口,已经审核过的任务组不需要再审核 优先级低 不同sql情况下创建工单,需要验证所有数据源是否已经审核 创建工单时,如果用户修改了已审核的sql没有再点审核,提示用户将会使用已有的审核结果创建工单 升级方案 | 1 |
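As a concrete illustration of calling one of the new endpoints listed in the tables above — the request-body shape (`task_group_id`) is an assumption for this sketch, not taken from SQLE's actual swagger definition:

```python
import json
from urllib import request

def audit_task_group(base_url: str, task_group_id: int, opener=request.urlopen):
    """POST to the (assumed) /v1/task_groups/audit endpoint.

    The payload field name is hypothetical; `opener` is injectable so
    the call can be exercised without a live SQLE server.
    """
    body = json.dumps({"task_group_id": task_group_id}).encode("utf-8")
    req = request.Request(
        base_url.rstrip("/") + "/v1/task_groups/audit",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return opener(req)
```

Swapping `opener` for the real `urllib.request.urlopen` (the default) performs the actual HTTP call against a running SQLE instance.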
100,915 | 30,818,375,998 | IssuesEvent | 2023-08-01 14:47:00 | yt-project/yt | https://api.github.com/repos/yt-project/yt | closed | BLD: compatibility with Cython 3.0 | refactor build | Cython 3.0 is now in beta (after several years of alpha).
#4043 contains an initial effort to discover and patch incompatibilities, but at the moment it seems to me that this migration is better done incrementally, as only some of the patches will be seamless and uncontroversial.
I'll be using this issue to track progress on this effort.
The most urgent issues are listed in the following. A useful resource for this task is the [migration guide](http://docs.cython.org/en/latest/src/userguide/migrating_to_cy30.html).
### blockers
- [x] #4146
- [x] #4357
- [x] #4365
- [x] #4373
- [x] #4374
### restoring performance
- [x] #4386
- [x] #4390
- [x] #4392
performance issues are discussed in the present thread.
### deprecation warnings
- [x] #4375
- [ ] `The 'DEF' statement is deprecated and will be removed in a future Cython version`.
- [x] see #4043 for a first attempt at removing these
- [x] #4376
- [ ] https://github.com/cython/cython/pull/5242
Since the migration path for some items is a moving target (see https://github.com/cython/cython/pull/5242), I think the minimal effort approach would be to set our requirement to `Cython>=3.0,<3.1` at first. According to the current plan for Cython 3.1 and 3.2, we should be able to drop the upper limit completely once we reach compatibility with version 3.2
### low priority (expected gains from dropping Cython 0.29.x)
- [ ] forbid deprecation warnings from numpy C-API (see http://docs.cython.org/en/latest/src/userguide/migrating_to_cy30.html#numpy-c-api)
This one is easy, as it just requires adding some compiler flags to all `*.pyx` files, but it requires dropping Cython 0.29.x first.
For future reference, here's a simple way to apply the change systematically
<details><summary> Details </summary>
with `header.txt`
```shell
# distutils: define_macros=NPY_NO_DEPRECATED_API=NPY_1_7_API_VERSION
```
and `t.sh`
```shell
for file in "$@"
do
cat header.txt "$file" > /tmp/xx.$$
mv /tmp/xx.$$ "$file"
done
```
apply the change as
```
git ls-files | grep .pyx | xargs bash t.sh
```
</details>
- [ ] (?) Start releasing more portable wheels with `PY_LIMITED_API` https://cython.readthedocs.io/en/latest/src/changes.html?highlight=CYTHON_LIMITED_API#id27 | 1.0 | BLD: compatibility with Cython 3.0 - Cython 3.0 is now in beta (after several years of alpha).
#4043 contains an initial effort to discover and patch incompatibilities, but at the moment it seems to me that this migration is better done incrementally, as only some of the patches will be seamless and uncontroversial.
I'll be using this issue to track progress on this effort.
The most urgent issues are listed in the following. A useful resource for this task is the [migration guide](http://docs.cython.org/en/latest/src/userguide/migrating_to_cy30.html).
### blockers
- [x] #4146
- [x] #4357
- [x] #4365
- [x] #4373
- [x] #4374
### restoring performance
- [x] #4386
- [x] #4390
- [x] #4392
performance issues are discussed in the present thread.
### deprecation warnings
- [x] #4375
- [ ] `The 'DEF' statement is deprecated and will be removed in a future Cython version`.
- [x] see #4043 for a first attempt at removing these
- [x] #4376
- [ ] https://github.com/cython/cython/pull/5242
Since the migration path for some items is a moving target (see https://github.com/cython/cython/pull/5242), I think the minimal effort approach would be to set our requirement to `Cython>=3.0,<3.1` at first. According to the current plan for Cython 3.1 and 3.2, we should be able to drop the upper limit completely once we reach compatibility with version 3.2
### low priority (expected gains from dropping Cython 0.29.x)
- [ ] forbid deprecation warnings from numpy C-API (see http://docs.cython.org/en/latest/src/userguide/migrating_to_cy30.html#numpy-c-api)
This one is easy, as it just requires adding some compiler flags to all `*.pyx` files, but it requires dropping Cython 0.29.x first.
For future reference, here's a simple way to apply the change systematically
<details><summary> Details </summary>
with `header.txt`
```shell
# distutils: define_macros=NPY_NO_DEPRECATED_API=NPY_1_7_API_VERSION
```
and `t.sh`
```shell
for file in "$@"
do
cat header.txt "$file" > /tmp/xx.$$
mv /tmp/xx.$$ "$file"
done
```
apply the change as
```
git ls-files | grep .pyx | xargs bash t.sh
```
</details>
- [ ] (?) Start releasing more portable wheels with `PY_LIMITED_API` https://cython.readthedocs.io/en/latest/src/changes.html?highlight=CYTHON_LIMITED_API#id27 | non_comp | bld compatibility with cython cython is now in beta after several years of alpha contains an initial effort to discover and patch incompatibilities but at the moment it seems to me that this migration is better done incrementally as only some of the patches will be seamless and uncontroversial i ll be using this issue to track progress on this effort the most urgent issues are listed in the following a useful resource for this task is the blockers restoring performance performance issues are discussed in the present thread deprecation warnings the def statement is deprecated and will be removed in a future cython version see for a first attempt at removing these since the migration path for some items is a moving target see i think the minimal effort approach would be to set our requirement to cython at first according to the current plan for cython and we should be able to drop the upper limit completely once we reach compatibility with version low priority expected gains from dropping cython x forbid deprecation warnings from numpy c api see this one is easy as it just requires adding some compiler flags to all pyx files but it requires dropping cython x first for future reference here s a simple way to apply the change systematically details with header txt shell distutils define macros npy no deprecated api npy api version and t sh shell for file in do cat header txt file tmp xx mv tmp xx file done apply the change as git ls files grep pyx xargs bash t sh start releasing more portable wheels with py limited api | 0 |
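The `t.sh` loop above prepends the header unconditionally, so running it twice inserts the macro line twice. A Python sketch of the same idea that is safe to re-run — a hypothetical helper for illustration, not part of yt's actual build tooling:

```python
from pathlib import Path

# The distutils directive from header.txt in the issue above.
HEADER = "# distutils: define_macros=NPY_NO_DEPRECATED_API=NPY_1_7_API_VERSION\n"

def prepend_macro_header(root: str) -> list:
    """Prepend HEADER to every .pyx file under `root`, skipping files
    that already start with it, so the operation is idempotent."""
    changed = []
    for pyx in sorted(Path(root).rglob("*.pyx")):
        text = pyx.read_text()
        if not text.startswith(HEADER):
            pyx.write_text(HEADER + text)
            changed.append(pyx.name)
    return changed
```

Unlike the shell loop, a second invocation is a no-op, which matters when the change is applied across many branches.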
177,081 | 28,320,967,815 | IssuesEvent | 2023-04-11 01:05:06 | flutter/flutter | https://api.github.com/repos/flutter/flutter | closed | [a11y] VoiceOver does not announce form validation errors | a: text input framework f: material design a: accessibility platform-web has reproducible steps P4 found in release: 2.10 found in release: 2.11 customer: troy | If a form field fails validation, the error message is visually displayed but is not announced by VoiceOver.
## Steps to Reproduce
1. Turn on VoiceOver.
2. Execute `flutter run -d chrome` on the code sample.
3. Tap "Submit" without entering a username.
**Expected results:**
The validation error message ("Please enter your username") is announced by VoiceOver.
**Actual results:**
The validation error message is displayed but is not read.
**Workaround:**
If the `helperText` is set to a space, the validation error message will be announced. 🤷🏾
```dart
TextFormField(
decoration: const InputDecoration(
helperText: ' ', // Must be set to a space to announce validation errors.
labelText: 'Username',
),
validator: (String? value) {
if (value == null || value.isEmpty) {
return 'Please enter your username';
}
return null;
},
),
```
<details>
<summary>Code sample</summary>
```dart
import 'package:flutter/material.dart';
import 'package:flutter/rendering.dart';
void main() {
WidgetsFlutterBinding.ensureInitialized();
// Auto-enable accessibility for our Blind and Low Vision customers (see
// https://docs.flutter.dev/development/accessibility-and-localization/accessibility#screen-readers).
RendererBinding.instance!.setSemanticsEnabled(true);
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({Key? key}) : super(key: key);
@override
Widget build(BuildContext context) {
const appTitle = 'Form Validation Demo';
return MaterialApp(
title: appTitle,
home: Scaffold(
appBar: AppBar(
title: const Text(appTitle),
),
body: const MyCustomForm(),
),
);
}
}
class MyCustomForm extends StatefulWidget {
const MyCustomForm({Key? key}) : super(key: key);
@override
State<MyCustomForm> createState() => _MyCustomFormState();
}
class _MyCustomFormState extends State<MyCustomForm> {
final _formKey = GlobalKey<FormState>();
@override
Widget build(BuildContext context) {
return Form(
key: _formKey,
child: Column(
children: <Widget>[
TextFormField(
decoration: const InputDecoration(
labelText: 'Username',
),
validator: (String? value) {
if (value == null || value.isEmpty) {
return 'Please enter your username';
}
return null;
},
),
TextButton(
onPressed: () {
if (_formKey.currentState!.validate()) {
ScaffoldMessenger.of(context).showSnackBar(
const SnackBar(content: Text('Submitting...')),
);
}
},
child: const Text('Submit'),
),
],
),
);
}
}
```
</details>
<details>
<summary>Logs</summary>
```
[✓] Flutter (Channel stable, 2.10.3, on macOS 11.6.2 20G314 darwin-x64, locale en-US)
• Flutter version 2.10.3 at /Users/vsomayaji/Dev/flutter
• Upstream repository https://github.com/flutter/flutter.git
• Framework revision 7e9793dee1 (5 days ago), 2022-03-02 11:23:12 -0600
• Engine revision bd539267b4
• Dart version 2.16.1
• DevTools version 2.9.2
[✓] Android toolchain - develop for Android devices (Android SDK version 30.0.3)
• Android SDK at /Users/vsomayaji/Library/Android/sdk
• Platform android-31, build-tools 30.0.3
• Java binary at: /Applications/Android Studio.app/Contents/jre/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 11.0.10+0-b96-7281165)
• All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 13.2.1)
• Xcode at /Applications/Xcode.app/Contents/Developer
• CocoaPods version 1.11.2
[✓] Chrome - develop for the web
• Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome
[✓] Android Studio (version 2020.3)
• Android Studio at /Applications/Android Studio.app/Contents
• Flutter plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/9212-flutter
• Dart plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/6351-dart
• Java version OpenJDK Runtime Environment (build 11.0.10+0-b96-7281165)
[✓] IntelliJ IDEA Ultimate Edition (version 2021.1.2)
• IntelliJ at /Applications/IntelliJ IDEA.app
• Flutter plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/9212-flutter
• Dart plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/6351-dart
[✓] Connected device (1 available)
• Chrome (web) • chrome • web-javascript • Google Chrome 98.0.4758.109
[✓] HTTP Host Availability
• All required HTTP hosts are available
• No issues found!
```
</details>
https://user-images.githubusercontent.com/1007109/157127019-1067198b-28ad-4f27-8a83-9a6b4fe957bb.mov
| 1.0 | [a11y] VoiceOver does not announce form validation errors - If a form field fails validation, the error message is visually displayed but is not announced by VoiceOver.
## Steps to Reproduce
1. Turn on VoiceOver.
2. Execute `flutter run -d chrome` on the code sample.
3. Tap "Submit" without entering a username.
**Expected results:**
The validation error message ("Please enter your username") is announced by VoiceOver.
**Actual results:**
The validation error message is displayed but is not read.
**Workaround:**
If the `helperText` is set to a space, the validation error message will be announced. 🤷🏾
```dart
TextFormField(
decoration: const InputDecoration(
helperText: ' ', // Must be set to a space to announce validation errors.
labelText: 'Username',
),
validator: (String? value) {
if (value == null || value.isEmpty) {
return 'Please enter your username';
}
return null;
},
),
```
<details>
<summary>Code sample</summary>
```dart
import 'package:flutter/material.dart';
import 'package:flutter/rendering.dart';
void main() {
WidgetsFlutterBinding.ensureInitialized();
// Auto-enable accessibility for our Blind and Low Vision customers (see
// https://docs.flutter.dev/development/accessibility-and-localization/accessibility#screen-readers).
RendererBinding.instance!.setSemanticsEnabled(true);
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({Key? key}) : super(key: key);
@override
Widget build(BuildContext context) {
const appTitle = 'Form Validation Demo';
return MaterialApp(
title: appTitle,
home: Scaffold(
appBar: AppBar(
title: const Text(appTitle),
),
body: const MyCustomForm(),
),
);
}
}
class MyCustomForm extends StatefulWidget {
const MyCustomForm({Key? key}) : super(key: key);
@override
State<MyCustomForm> createState() => _MyCustomFormState();
}
class _MyCustomFormState extends State<MyCustomForm> {
final _formKey = GlobalKey<FormState>();
@override
Widget build(BuildContext context) {
return Form(
key: _formKey,
child: Column(
children: <Widget>[
TextFormField(
decoration: const InputDecoration(
labelText: 'Username',
),
validator: (String? value) {
if (value == null || value.isEmpty) {
return 'Please enter your username';
}
return null;
},
),
TextButton(
onPressed: () {
if (_formKey.currentState!.validate()) {
ScaffoldMessenger.of(context).showSnackBar(
const SnackBar(content: Text('Submitting...')),
);
}
},
child: const Text('Submit'),
),
],
),
);
}
}
```
</details>
| non_comp | 0 |
1,590 | 4,146,790,188 | IssuesEvent | 2016-06-15 02:19:45 | CommBank/ci2 | https://api.github.com/repos/CommBank/ci2 | closed | Simplify and deprecate function which_sbt in libci | backwards incompatible enhancement in progress | A simplification of design parameters renders which_sbt obsolete and replaceable by ```which sbt```. Users will now be expected to add their project root to the PATH envvar if they wish for their local ```./sbt``` script to be called preferentially, instead of the previous behaviour of doing so by default. | True | comp | 1
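The PATH-precedence behaviour this change relies on can be sketched in a few lines (the directories and the `sbt` stub below are hypothetical, not taken from the repo): lookup scans PATH entries left to right, so listing the project root first makes a local `sbt` script win over a system-wide one.

```python
import os
import shutil
import stat
import tempfile

# Create two fake "sbt" executables in separate directories and show that
# lookup returns the one from the directory listed first in PATH.
def make_fake_sbt(directory):
    path = os.path.join(directory, "sbt")
    with open(path, "w") as f:
        f.write("#!/bin/sh\n")
    # Mark the stub executable so shutil.which() will consider it.
    os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
    return path

project_root = tempfile.mkdtemp()
system_bin = tempfile.mkdtemp()
local = make_fake_sbt(project_root)
system = make_fake_sbt(system_bin)

# Project root listed first: the local script wins the lookup.
found = shutil.which("sbt", path=os.pathsep.join([project_root, system_bin]))
print(found == local)  # True
```

Prepending the project root to PATH therefore restores the old "local `./sbt` preferred" behaviour on an opt-in basis.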
11,020 | 13,050,422,924 | IssuesEvent | 2020-07-29 15:28:21 | ankidroid/Anki-Android | https://api.github.com/repos/ankidroid/Anki-Android | closed | Deck Import -Single Deck Export in Anki Desktop SchedV2 - Attempt to get length of null array | Anki Ecosystem Compatibility Bug Import Reproduced V2 Scheduler | https://drive.google.com/file/d/1xwwcF8qr8WPLpfHRblQtXZJBJMlrkEVq/view?usp=drivesdk
Reported in https://github.com/ankidroid/Anki-Android/issues/6383#issuecomment-665173746
```
2020-07-28 18:47:42.275 2127-2283/com.ichi2.anki E/Anki2Importer: _import() exception
java.lang.NullPointerException: Attempt to get length of null array
at com.ichi2.libanki.importer.Anki2Importer._importStaticMedia(Anki2Importer.java:698)
at com.ichi2.libanki.importer.Anki2Importer._import(Anki2Importer.java:151)
at com.ichi2.libanki.importer.Anki2Importer.run(Anki2Importer.java:103)
at com.ichi2.libanki.importer.AnkiPackageImporter.run(AnkiPackageImporter.java:143)
at com.ichi2.async.CollectionTask.doInBackgroundImportAdd(CollectionTask.java:1188)
at com.ichi2.async.CollectionTask.actualDoInBackground(CollectionTask.java:393)
at com.ichi2.async.CollectionTask.doInBackground(CollectionTask.java:304)
at com.ichi2.async.CollectionTask.doInBackground(CollectionTask.java:83)
at android.os.AsyncTask$2.call(AsyncTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:246)
at java.util.concurrent.ThreadPoolExecutor.processTask(ThreadPoolExecutor.java:1187)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1152)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
at java.lang.Thread.run(Thread.java:784)
``` | True | comp | 1
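The `Attempt to get length of null array` in `_importStaticMedia` matches the classic `File.listFiles()` pitfall: for a nonexistent or non-directory path it returns `null` rather than an empty array, which is plausible here if the exported single-deck package carries no media directory. This is an inference from the stack trace, not confirmed against the AnkiDroid source; the defensive pattern, sketched in Python with a hypothetical helper:

```python
import os

def list_media(media_dir):
    # Java's File.listFiles() returns null (not an empty array) for a
    # missing or non-directory path; this guard is the equivalent check.
    if not os.path.isdir(media_dir):
        return []
    return sorted(os.listdir(media_dir))

print(list_media("/no/such/media/dir"))  # []
```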
17,494 | 10,709,591,072 | IssuesEvent | 2019-10-24 22:39:26 | badges/shields | https://api.github.com/repos/badges/shields | closed | Badge request: Homebrew Cask | good first issue hacktoberfest service-badge | [Homebrew Cask](https://caskroom.github.io/) is now officially part of Homebrew. Now that we already have a badge for Homebrew, it makes sense to also include Homebrew Cask. | 1.0 | non_comp | 0
615,630 | 19,270,839,195 | IssuesEvent | 2021-12-10 05:03:54 | pytorch/pytorch | https://api.github.com/repos/pytorch/pytorch | closed | unable to backward propagate convolution with padded input | high priority module: dependency bug module: nn module: convolution triaged module: mkldnn | ## 🐛 Bug
In a graph where the input is first padded with the required zeros and then passed through a convolution, calling backward fails.
I see the test failing with this error:
```
>>> res.backward(grad_in)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/jthakur/vnv_syn_new/lib/python3.6/site-packages/torch/tensor.py", line 245, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
  File "/home/jthakur/vnv_syn_new/lib/python3.6/site-packages/torch/autograd/__init__.py", line 147, in backward
    allow_unreachable=True, accumulate_grad=True)  # allow_unreachable flag
RuntimeError: could not create a primitive
```
## To Reproduce
If `in_channels` < 105, backpropagation works fine.
Steps to reproduce the behavior:
```
import torch
in_channels = 105
out_channels = 1
ifm_shape = [1, in_channels, 1, 1]
padding = [8, 5, 9, 4]
groups = 1
kernel_size = [1, 1]
dilation = [1, 1]
stride = [14, 14]
ifm = torch.rand(ifm_shape)
ifm.requires_grad_(True)
padded_ifm = torch.nn.functional.pad(ifm, padding)
op = torch.nn.Conv2d(
in_channels=in_channels,
out_channels=out_channels,
kernel_size=kernel_size,
stride=stride,
dilation=dilation,
groups=groups,
)
res = op(padded_ifm)
grad_in = torch.rand(res.shape)
res.backward(grad_in)
```
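As a sanity check on the sample above: `F.pad` with `[8, 5, 9, 4]` pads width by 8+5 and height by 9+4, turning the 1×1 input into a 14×14 one, so with a 1×1 kernel and stride 14 the convolution produces a single output element per channel. A small sketch of the standard convolution output-size formula (as documented for `Conv2d`) confirms the shapes involved; it runs without PyTorch:

```python
def conv_out_size(in_size, kernel, stride, padding=0, dilation=1):
    # Standard Conv2d output-size formula:
    # floor((L + 2*P - D*(K - 1) - 1) / S) + 1
    return (in_size + 2 * padding - dilation * (kernel - 1) - 1) // stride + 1

# F.pad([8, 5, 9, 4]) pads width by 8 + 5 and height by 9 + 4,
# so the 1x1 input becomes 14x14 before the convolution.
padded_h = 1 + 9 + 4
padded_w = 1 + 8 + 5
print(conv_out_size(padded_h, kernel=1, stride=14),
      conv_out_size(padded_w, kernel=1, stride=14))  # 1 1
```

The failing case is therefore a well-formed 14×14 input producing a 1×1 output; only the backward pass errors out.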
## Expected behavior

`res.backward(grad_in)` completes without raising an error.
## Environment
- PyTorch Version (e.g., 1.0): 1.8.1
- OS (e.g., Linux): Linux
- How you installed PyTorch (`conda`, `pip`, source): pip
- Build command you used (if compiling from source): NA
- Python version: 3.6.9
- CUDA/cuDNN version: CPU
- GPU models and configuration: NA
- Any other relevant information: NA
cc @ezyang @gchanan @zou3519 @bdhirsh @jbschlosser @anjali411 @gujinghui @PenghuiCheng @XiaobingSuper @jianyuh @VitalyFedyunin @albanD @mruberry @gqchen @pearu @nikitaved @soulitzer @Lezcano @Varal7 | 1.0 | non_comp | 0 |
329,225 | 28,208,744,636 | IssuesEvent | 2023-04-05 00:59:52 | googleapis/google-cloud-python | https://api.github.com/repos/googleapis/google-cloud-python | opened | Adopt split repo: python-gke-hub | migration:samples:generated migration:workaround:owlbot migration:library:gapic_auto migration:testing:unit migration:issues:open | Migrate the split-repo https://github.com/googleapis/python-gke-hub to https://github.com/googleapis/google-cloud-python. To do the actual migration, we need to ensure we can clear any of the following tags in this issue that describe the state of the source repo: `testing:system`, `samples:manual`, `workaround:owlbot`. | 1.0 | non_comp | 0
38 | 2,495,719,192 | IssuesEvent | 2015-01-06 14:10:05 | PowerDNS/pdns | https://api.github.com/repos/PowerDNS/pdns | closed | make distclean broken | auth defect | Steps to reproduce:
1. take a fresh copy
2. run bootstrap and configure
3. run make distclean

This results in:
```
$ make distclean
Making distclean in pdns/ext/rapidjson
Making distclean in pdns
make[1]: Entering directory `/home/cmouse/src/pdns/pdns'
Making distclean in backends
make[2]: Entering directory `/home/cmouse/src/pdns/pdns/backends'
Making distclean in bind
make[3]: Entering directory `/home/cmouse/src/pdns/pdns/backends/bind'
rm -f zone2sql zone2ldap zone2json
rm -rf .libs _libs
rm -rf ../../.libs ../../_libs
test -z "libbind2backend.la" || rm -f libbind2backend.la
rm -f "./so_locations"
rm -f *.o
rm -f ../../aes/aes_modes.o
rm -f ../../aes/aescrypt.o
rm -f ../../aes/aeskey.o
rm -f ../../aes/aestab.o
rm -f ../../aes/dns_random.o
rm -f ../../arguments.o
rm -f ../../base32.o
rm -f ../../base64.o
rm -f ../../dns.o
rm -f ../../dnsparser.o
rm -f ../../dnsrecords.o
rm -f ../../dnssecinfra.o
rm -f ../../dnswriter.o
rm -f ../../libbind2backend_la-misc.o
rm -f ../../libbind2backend_la-misc.lo
rm -f ../../libbind2backend_la-unix_utility.o
rm -f ../../libbind2backend_la-unix_utility.lo
rm -f ../../libbind2backend_la-zoneparser-tng.o
rm -f ../../libbind2backend_la-zoneparser-tng.lo
rm -f ../../logger.o
rm -f ../../misc.o
rm -f ../../nsecrecords.o
rm -f ../../qtype.o
rm -f ../../rcpgenerator.o
rm -f ../../sillyrecords.o
rm -f ../../statbag.o
rm -f ../../unix_utility.o
rm -f ../../zoneparser-tng.o
rm -f *.lo
rm -f *.tab.c
test -z "" || rm -f
test . = "." || test -z "" || rm -f
rm -f ../../.deps/.dirstamp
rm -f ../../.dirstamp
rm -f ../../aes/.deps/.dirstamp
rm -f ../../aes/.dirstamp
rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags
rm -rf ../../.deps ../../aes/.deps ./.deps
rm -f Makefile
make[3]: Leaving directory `/home/cmouse/src/pdns/pdns/backends/bind'
Making distclean in .
make[3]: Entering directory `/home/cmouse/src/pdns/pdns/backends'
rm -rf .libs _libs
rm -f *.lo
test -z "" || rm -f
test . = "." || test -z "" || rm -f
rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags
make[3]: Leaving directory `/home/cmouse/src/pdns/pdns/backends'
rm -f Makefile
make[2]: Leaving directory `/home/cmouse/src/pdns/pdns/backends'
Making distclean in ext/polarssl-1.1.2
make[2]: Entering directory `/home/cmouse/src/pdns/pdns/ext/polarssl-1.1.2'
make[3]: Entering directory `/home/cmouse/src/pdns/pdns/ext/polarssl-1.1.2/library'
make[3]: Leaving directory `/home/cmouse/src/pdns/pdns/ext/polarssl-1.1.2/library'
make[2]: Leaving directory `/home/cmouse/src/pdns/pdns/ext/polarssl-1.1.2'
Making distclean in .
make[2]: Entering directory `/home/cmouse/src/pdns/pdns'
Makefile:1219: .deps/arguments.Po: No such file or directory
Makefile:1220: .deps/base32.Po: No such file or directory
Makefile:1221: .deps/base64.Po: No such file or directory
Makefile:1222: .deps/botan110signers.Po: No such file or directory
Makefile:1223: .deps/botan18signers.Po: No such file or directory
Makefile:1224: .deps/botansigners.Po: No such file or directory
Makefile:1225: .deps/common_startup.Po: No such file or directory
Makefile:1226: .deps/communicator.Po: No such file or directory
Makefile:1227: .deps/cryptoppsigners.Po: No such file or directory
Makefile:1228: .deps/dbdnsseckeeper.Po: No such file or directory
Makefile:1229: .deps/dns.Po: No such file or directory
Makefile:1230: .deps/dnsbackend.Po: No such file or directory
Makefile:1231: .deps/dnsbulktest.Po: No such file or directory
Makefile:1232: .deps/dnsdemog.Po: No such file or directory
Makefile:1233: .deps/dnsdist.Po: No such file or directory
Makefile:1234: .deps/dnsgram.Po: No such file or directory
Makefile:1235: .deps/dnslabeltext.Po: No such file or directory
Makefile:1236: .deps/dnspacket.Po: No such file or directory
Makefile:1237: .deps/dnsparser.Po: No such file or directory
Makefile:1238: .deps/dnspcap.Po: No such file or directory
Makefile:1239: .deps/dnsproxy.Po: No such file or directory
Makefile:1240: .deps/dnsrecords.Po: No such file or directory
Makefile:1241: .deps/dnsreplay.Po: No such file or directory
Makefile:1242: .deps/dnsscan.Po: No such file or directory
Makefile:1243: .deps/dnsscope.Po: No such file or directory
Makefile:1244: .deps/dnssecinfra.Po: No such file or directory
Makefile:1245: .deps/dnssecsigner.Po: No such file or directory
Makefile:1246: .deps/dnstcpbench.Po: No such file or directory
Makefile:1247: .deps/dnswasher.Po: No such file or directory
Makefile:1248: .deps/dnswriter.Po: No such file or directory
Makefile:1249: .deps/dynhandler.Po: No such file or directory
Makefile:1250: .deps/dynlistener.Po: No such file or directory
Makefile:1251: .deps/dynloader.Po: No such file or directory
Makefile:1252: .deps/dynmessenger.Po: No such file or directory
Makefile:1253: .deps/ednssubnet.Po: No such file or directory
Makefile:1254: .deps/epollmplexer.Po: No such file or directory
Makefile:1255: .deps/htimer.Po: No such file or directory
Makefile:1256: .deps/iputils.Po: No such file or directory
Makefile:1257: .deps/json.Po: No such file or directory
Makefile:1258: .deps/json_ws.Po: No such file or directory
Makefile:1259: .deps/logger.Po: No such file or directory
Makefile:1260: .deps/lua-auth.Po: No such file or directory
Makefile:1261: .deps/lua-pdns.Po: No such file or directory
Makefile:1262: .deps/lua-recursor.Po: No such file or directory
Makefile:1263: .deps/lwres.Po: No such file or directory
Makefile:1264: .deps/mastercommunicator.Po: No such file or directory
Makefile:1265: .deps/misc.Po: No such file or directory
Makefile:1266: .deps/nameserver.Po: No such file or directory
Makefile:1267: .deps/notify.Po: No such file or directory
Makefile:1268: .deps/nproxy.Po: No such file or directory
Makefile:1269: .deps/nsec3dig.Po: No such file or directory
Makefile:1270: .deps/nsecrecords.Po: No such file or directory
Makefile:1271: .deps/packetcache.Po: No such file or directory
Makefile:1272: .deps/packethandler.Po: No such file or directory
Makefile:1273: .deps/pdns_recursor.Po: No such file or directory
Makefile:1274: .deps/pdnssec.Po: No such file or directory
Makefile:1275: .deps/polarrsakeyinfra.Po: No such file or directory
Makefile:1276: .deps/qtype.Po: No such file or directory
Makefile:1277: .deps/randomhelper.Po: No such file or directory
Makefile:1278: .deps/rcpgenerator.Po: No such file or directory
Makefile:1279: .deps/rec_channel.Po: No such file or directory
Makefile:1280: .deps/rec_channel_rec.Po: No such file or directory
Makefile:1281: .deps/rec_control.Po: No such file or directory
Makefile:1282: .deps/receiver.Po: No such file or directory
Makefile:1283: .deps/recpacketcache.Po: No such file or directory
Makefile:1284: .deps/recursor_cache.Po: No such file or directory
Makefile:1285: .deps/reczones.Po: No such file or directory
Makefile:1286: .deps/resolver.Po: No such file or directory
Makefile:1287: .deps/responsestats.Po: No such file or directory
Makefile:1288: .deps/rfc2136handler.Po: No such file or directory
Makefile:1289: .deps/sdig.Po: No such file or directory
Makefile:1290: .deps/selectmplexer.Po: No such file or directory
Makefile:1291: .deps/serialtweaker.Po: No such file or directory
Makefile:1292: .deps/session.Po: No such file or directory
Makefile:1293: .deps/signingpipe.Po: No such file or directory
Makefile:1294: .deps/sillyrecords.Po: No such file or directory
Makefile:1295: .deps/slavecommunicator.Po: No such file or directory
Makefile:1296: .deps/speedtest.Po: No such file or directory
Makefile:1297: .deps/ssqlite3.Po: No such file or directory
Makefile:1298: .deps/statbag.Po: No such file or directory
Makefile:1299: .deps/syncres.Po: No such file or directory
Makefile:1300: .deps/tcpreceiver.Po: No such file or directory
Makefile:1301: .deps/test-base32_cc.Po: No such file or directory
Makefile:1302: .deps/test-base64_cc.Po: No such file or directory
Makefile:1303: .deps/test-dns_random_hh.Po: No such file or directory
Makefile:1304: .deps/test-dnsrecords_cc.Po: No such file or directory
Makefile:1305: .deps/test-iputils_hh.Po: No such file or directory
Makefile:1306: .deps/test-md5_hh.Po: No such file or directory
Makefile:1307: .deps/test-misc_hh.Po: No such file or directory
Makefile:1308: .deps/test-nameserver_cc.Po: No such file or directory
Makefile:1309: .deps/test-rcpgenerator_cc.Po: No such file or directory
Makefile:1310: .deps/test-sha_hh.Po: No such file or directory
Makefile:1311: .deps/testrunner.Po: No such file or directory
Makefile:1312: .deps/toysdig.Po: No such file or directory
Makefile:1313: .deps/tsig-tests.Po: No such file or directory
Makefile:1314: .deps/ueberbackend.Po: No such file or directory
Makefile:1315: .deps/unix_semaphore.Po: No such file or directory
Makefile:1316: .deps/unix_utility.Po: No such file or directory
Makefile:1317: .deps/version.Po: No such file or directory
Makefile:1318: .deps/webserver.Po: No such file or directory
Makefile:1319: .deps/ws.Po: No such file or directory
Makefile:1320: .deps/zoneparser-tng.Po: No such file or directory
Makefile:1321: aes/.deps/aes_modes.Po: No such file or directory
Makefile:1322: aes/.deps/aescrypt.Po: No such file or directory
Makefile:1323: aes/.deps/aeskey.Po: No such file or directory
Makefile:1324: aes/.deps/aestab.Po: No such file or directory
Makefile:1325: aes/.deps/dns_random.Po: No such file or directory
Makefile:1326: backends/bind/.deps/bindbackend2.Po: No such file or directory
Makefile:1327: backends/bind/.deps/binddnssec.Po: No such file or directory
Makefile:1328: backends/bind/.deps/bindlexer.Po: No such file or directory
Makefile:1329: backends/bind/.deps/bindparser.Po: No such file or directory
make[2]: *** No rule to make target `backends/bind/.deps/bindparser.Po'. Stop.
make[2]: Leaving directory `/home/cmouse/src/pdns/pdns'
make[1]: *** [distclean-recursive] Error 1
make[1]: Leaving directory `/home/cmouse/src/pdns/pdns'
make: *** [distclean-recursive] Error 1
``` | 1.0 |
Makefile:1232: .deps/dnsdemog.Po: No such file or directory
Makefile:1233: .deps/dnsdist.Po: No such file or directory
Makefile:1234: .deps/dnsgram.Po: No such file or directory
Makefile:1235: .deps/dnslabeltext.Po: No such file or directory
Makefile:1236: .deps/dnspacket.Po: No such file or directory
Makefile:1237: .deps/dnsparser.Po: No such file or directory
Makefile:1238: .deps/dnspcap.Po: No such file or directory
Makefile:1239: .deps/dnsproxy.Po: No such file or directory
Makefile:1240: .deps/dnsrecords.Po: No such file or directory
Makefile:1241: .deps/dnsreplay.Po: No such file or directory
Makefile:1242: .deps/dnsscan.Po: No such file or directory
Makefile:1243: .deps/dnsscope.Po: No such file or directory
Makefile:1244: .deps/dnssecinfra.Po: No such file or directory
Makefile:1245: .deps/dnssecsigner.Po: No such file or directory
Makefile:1246: .deps/dnstcpbench.Po: No such file or directory
Makefile:1247: .deps/dnswasher.Po: No such file or directory
Makefile:1248: .deps/dnswriter.Po: No such file or directory
Makefile:1249: .deps/dynhandler.Po: No such file or directory
Makefile:1250: .deps/dynlistener.Po: No such file or directory
Makefile:1251: .deps/dynloader.Po: No such file or directory
Makefile:1252: .deps/dynmessenger.Po: No such file or directory
Makefile:1253: .deps/ednssubnet.Po: No such file or directory
Makefile:1254: .deps/epollmplexer.Po: No such file or directory
Makefile:1255: .deps/htimer.Po: No such file or directory
Makefile:1256: .deps/iputils.Po: No such file or directory
Makefile:1257: .deps/json.Po: No such file or directory
Makefile:1258: .deps/json_ws.Po: No such file or directory
Makefile:1259: .deps/logger.Po: No such file or directory
Makefile:1260: .deps/lua-auth.Po: No such file or directory
Makefile:1261: .deps/lua-pdns.Po: No such file or directory
Makefile:1262: .deps/lua-recursor.Po: No such file or directory
Makefile:1263: .deps/lwres.Po: No such file or directory
Makefile:1264: .deps/mastercommunicator.Po: No such file or directory
Makefile:1265: .deps/misc.Po: No such file or directory
Makefile:1266: .deps/nameserver.Po: No such file or directory
Makefile:1267: .deps/notify.Po: No such file or directory
Makefile:1268: .deps/nproxy.Po: No such file or directory
Makefile:1269: .deps/nsec3dig.Po: No such file or directory
Makefile:1270: .deps/nsecrecords.Po: No such file or directory
Makefile:1271: .deps/packetcache.Po: No such file or directory
Makefile:1272: .deps/packethandler.Po: No such file or directory
Makefile:1273: .deps/pdns_recursor.Po: No such file or directory
Makefile:1274: .deps/pdnssec.Po: No such file or directory
Makefile:1275: .deps/polarrsakeyinfra.Po: No such file or directory
Makefile:1276: .deps/qtype.Po: No such file or directory
Makefile:1277: .deps/randomhelper.Po: No such file or directory
Makefile:1278: .deps/rcpgenerator.Po: No such file or directory
Makefile:1279: .deps/rec_channel.Po: No such file or directory
Makefile:1280: .deps/rec_channel_rec.Po: No such file or directory
Makefile:1281: .deps/rec_control.Po: No such file or directory
Makefile:1282: .deps/receiver.Po: No such file or directory
Makefile:1283: .deps/recpacketcache.Po: No such file or directory
Makefile:1284: .deps/recursor_cache.Po: No such file or directory
Makefile:1285: .deps/reczones.Po: No such file or directory
Makefile:1286: .deps/resolver.Po: No such file or directory
Makefile:1287: .deps/responsestats.Po: No such file or directory
Makefile:1288: .deps/rfc2136handler.Po: No such file or directory
Makefile:1289: .deps/sdig.Po: No such file or directory
Makefile:1290: .deps/selectmplexer.Po: No such file or directory
Makefile:1291: .deps/serialtweaker.Po: No such file or directory
Makefile:1292: .deps/session.Po: No such file or directory
Makefile:1293: .deps/signingpipe.Po: No such file or directory
Makefile:1294: .deps/sillyrecords.Po: No such file or directory
Makefile:1295: .deps/slavecommunicator.Po: No such file or directory
Makefile:1296: .deps/speedtest.Po: No such file or directory
Makefile:1297: .deps/ssqlite3.Po: No such file or directory
Makefile:1298: .deps/statbag.Po: No such file or directory
Makefile:1299: .deps/syncres.Po: No such file or directory
Makefile:1300: .deps/tcpreceiver.Po: No such file or directory
Makefile:1301: .deps/test-base32_cc.Po: No such file or directory
Makefile:1302: .deps/test-base64_cc.Po: No such file or directory
Makefile:1303: .deps/test-dns_random_hh.Po: No such file or directory
Makefile:1304: .deps/test-dnsrecords_cc.Po: No such file or directory
Makefile:1305: .deps/test-iputils_hh.Po: No such file or directory
Makefile:1306: .deps/test-md5_hh.Po: No such file or directory
Makefile:1307: .deps/test-misc_hh.Po: No such file or directory
Makefile:1308: .deps/test-nameserver_cc.Po: No such file or directory
Makefile:1309: .deps/test-rcpgenerator_cc.Po: No such file or directory
Makefile:1310: .deps/test-sha_hh.Po: No such file or directory
Makefile:1311: .deps/testrunner.Po: No such file or directory
Makefile:1312: .deps/toysdig.Po: No such file or directory
Makefile:1313: .deps/tsig-tests.Po: No such file or directory
Makefile:1314: .deps/ueberbackend.Po: No such file or directory
Makefile:1315: .deps/unix_semaphore.Po: No such file or directory
Makefile:1316: .deps/unix_utility.Po: No such file or directory
Makefile:1317: .deps/version.Po: No such file or directory
Makefile:1318: .deps/webserver.Po: No such file or directory
Makefile:1319: .deps/ws.Po: No such file or directory
Makefile:1320: .deps/zoneparser-tng.Po: No such file or directory
Makefile:1321: aes/.deps/aes_modes.Po: No such file or directory
Makefile:1322: aes/.deps/aescrypt.Po: No such file or directory
Makefile:1323: aes/.deps/aeskey.Po: No such file or directory
Makefile:1324: aes/.deps/aestab.Po: No such file or directory
Makefile:1325: aes/.deps/dns_random.Po: No such file or directory
Makefile:1326: backends/bind/.deps/bindbackend2.Po: No such file or directory
Makefile:1327: backends/bind/.deps/binddnssec.Po: No such file or directory
Makefile:1328: backends/bind/.deps/bindlexer.Po: No such file or directory
Makefile:1329: backends/bind/.deps/bindparser.Po: No such file or directory
make[2]: *** No rule to make target `backends/bind/.deps/bindparser.Po'. Stop.
make[2]: Leaving directory `/home/cmouse/src/pdns/pdns'
make[1]: *** [distclean-recursive] Error 1
make[1]: Leaving directory `/home/cmouse/src/pdns/pdns'
make: *** [distclean-recursive] Error 1
``` | non_comp | make distclean broken steps to reproduce take fresh copy run bootstrap and configure run make distclean this results in make distclean making distclean in pdns ext rapidjson making distclean in pdns make entering directory home cmouse src pdns pdns making distclean in backends make entering directory home cmouse src pdns pdns backends making distclean in bind make entering directory home cmouse src pdns pdns backends bind rm f rm rf libs libs rm rf libs libs test z la rm f la rm f so locations rm f o rm f aes aes modes o rm f aes aescrypt o rm f aes aeskey o rm f aes aestab o rm f aes dns random o rm f arguments o rm f o rm f o rm f dns o rm f dnsparser o rm f dnsrecords o rm f dnssecinfra o rm f dnswriter o rm f la misc o rm f la misc lo rm f la unix utility o rm f la unix utility lo rm f la zoneparser tng o rm f la zoneparser tng lo rm f logger o rm f misc o rm f nsecrecords o rm f qtype o rm f rcpgenerator o rm f sillyrecords o rm f statbag o rm f unix utility o rm f zoneparser tng o rm f lo rm f tab c test z rm f test test z rm f rm f deps dirstamp rm f dirstamp rm f aes deps dirstamp rm f aes dirstamp rm f tags id gtags grtags gsyms gpath tags rm rf deps aes deps deps rm f makefile make leaving directory home cmouse src pdns pdns backends bind making distclean in make entering directory home cmouse src pdns pdns backends rm rf libs libs rm f lo test z rm f test test z rm f rm f tags id gtags grtags gsyms gpath tags make leaving directory home cmouse src pdns pdns backends rm f makefile make leaving directory home cmouse src pdns pdns backends making distclean in ext polarssl make entering directory home cmouse src pdns pdns ext polarssl make entering directory home cmouse src pdns pdns ext polarssl library make leaving directory home cmouse src pdns pdns ext polarssl library make leaving directory home cmouse src pdns pdns ext polarssl making distclean in make entering directory home cmouse src pdns pdns makefile deps arguments po no such file 
or directory makefile deps po no such file or directory makefile deps po no such file or directory makefile deps po no such file or directory makefile deps po no such file or directory makefile deps botansigners po no such file or directory makefile deps common startup po no such file or directory makefile deps communicator po no such file or directory makefile deps cryptoppsigners po no such file or directory makefile deps dbdnsseckeeper po no such file or directory makefile deps dns po no such file or directory makefile deps dnsbackend po no such file or directory makefile deps dnsbulktest po no such file or directory makefile deps dnsdemog po no such file or directory makefile deps dnsdist po no such file or directory makefile deps dnsgram po no such file or directory makefile deps dnslabeltext po no such file or directory makefile deps dnspacket po no such file or directory makefile deps dnsparser po no such file or directory makefile deps dnspcap po no such file or directory makefile deps dnsproxy po no such file or directory makefile deps dnsrecords po no such file or directory makefile deps dnsreplay po no such file or directory makefile deps dnsscan po no such file or directory makefile deps dnsscope po no such file or directory makefile deps dnssecinfra po no such file or directory makefile deps dnssecsigner po no such file or directory makefile deps dnstcpbench po no such file or directory makefile deps dnswasher po no such file or directory makefile deps dnswriter po no such file or directory makefile deps dynhandler po no such file or directory makefile deps dynlistener po no such file or directory makefile deps dynloader po no such file or directory makefile deps dynmessenger po no such file or directory makefile deps ednssubnet po no such file or directory makefile deps epollmplexer po no such file or directory makefile deps htimer po no such file or directory makefile deps iputils po no such file or directory makefile deps json po no such file or 
directory makefile deps json ws po no such file or directory makefile deps logger po no such file or directory makefile deps lua auth po no such file or directory makefile deps lua pdns po no such file or directory makefile deps lua recursor po no such file or directory makefile deps lwres po no such file or directory makefile deps mastercommunicator po no such file or directory makefile deps misc po no such file or directory makefile deps nameserver po no such file or directory makefile deps notify po no such file or directory makefile deps nproxy po no such file or directory makefile deps po no such file or directory makefile deps nsecrecords po no such file or directory makefile deps packetcache po no such file or directory makefile deps packethandler po no such file or directory makefile deps pdns recursor po no such file or directory makefile deps pdnssec po no such file or directory makefile deps polarrsakeyinfra po no such file or directory makefile deps qtype po no such file or directory makefile deps randomhelper po no such file or directory makefile deps rcpgenerator po no such file or directory makefile deps rec channel po no such file or directory makefile deps rec channel rec po no such file or directory makefile deps rec control po no such file or directory makefile deps receiver po no such file or directory makefile deps recpacketcache po no such file or directory makefile deps recursor cache po no such file or directory makefile deps reczones po no such file or directory makefile deps resolver po no such file or directory makefile deps responsestats po no such file or directory makefile deps po no such file or directory makefile deps sdig po no such file or directory makefile deps selectmplexer po no such file or directory makefile deps serialtweaker po no such file or directory makefile deps session po no such file or directory makefile deps signingpipe po no such file or directory makefile deps sillyrecords po no such file or directory makefile 
deps slavecommunicator po no such file or directory makefile deps speedtest po no such file or directory makefile deps po no such file or directory makefile deps statbag po no such file or directory makefile deps syncres po no such file or directory makefile deps tcpreceiver po no such file or directory makefile deps test cc po no such file or directory makefile deps test cc po no such file or directory makefile deps test dns random hh po no such file or directory makefile deps test dnsrecords cc po no such file or directory makefile deps test iputils hh po no such file or directory makefile deps test hh po no such file or directory makefile deps test misc hh po no such file or directory makefile deps test nameserver cc po no such file or directory makefile deps test rcpgenerator cc po no such file or directory makefile deps test sha hh po no such file or directory makefile deps testrunner po no such file or directory makefile deps toysdig po no such file or directory makefile deps tsig tests po no such file or directory makefile deps ueberbackend po no such file or directory makefile deps unix semaphore po no such file or directory makefile deps unix utility po no such file or directory makefile deps version po no such file or directory makefile deps webserver po no such file or directory makefile deps ws po no such file or directory makefile deps zoneparser tng po no such file or directory makefile aes deps aes modes po no such file or directory makefile aes deps aescrypt po no such file or directory makefile aes deps aeskey po no such file or directory makefile aes deps aestab po no such file or directory makefile aes deps dns random po no such file or directory makefile backends bind deps po no such file or directory makefile backends bind deps binddnssec po no such file or directory makefile backends bind deps bindlexer po no such file or directory makefile backends bind deps bindparser po no such file or directory make no rule to make target backends bind 
deps bindparser po stop make leaving directory home cmouse src pdns pdns make error make leaving directory home cmouse src pdns pdns make error | 0 |
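The failure mode in the "make distclean broken" report above — generated Makefiles that `include` per-object `.Po` dependency files which an earlier recursive clean step has already deleted (note the `rm -rf ../../.deps ...` line in the bind subdirectory's clean pass) — can be reproduced in miniature. This is a hypothetical, self-contained sketch, not PowerDNS build code; it assumes GNU `make` is on the PATH and the file names are illustrative:

```python
import os
import subprocess
import tempfile

# Build a tiny tree: a Makefile that includes a generated .Po dep file.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, ".deps"))
open(os.path.join(root, ".deps", "foo.Po"), "w").close()
with open(os.path.join(root, "Makefile"), "w") as fh:
    fh.write("include .deps/foo.Po\nall:\n\t@echo ok\n")

# While the .Po file exists, the target builds normally.
ok = subprocess.run(["make", "all"], cwd=root,
                    capture_output=True, text=True)
print(ok.stdout.strip())

# Simulate the subdirectory's distclean deleting the shared .deps dir.
os.remove(os.path.join(root, ".deps", "foo.Po"))
os.rmdir(os.path.join(root, ".deps"))

# The next make invocation now aborts with "No such file or directory" /
# "No rule to make target", just like the distclean-recursive pass above.
bad = subprocess.run(["make", "all"], cwd=root,
                     capture_output=True, text=True)
print(bad.returncode != 0 and "foo.Po" in (bad.stdout + bad.stderr))
```

Because make tries (and fails) to remake the missing included file before running any target, the second invocation stops before the `distclean` recipe itself ever executes — matching the log above.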
11,510 | 13,504,950,407 | IssuesEvent | 2020-09-13 20:24:35 | PowerNukkit/PowerNukkit | https://api.github.com/repos/PowerNukkit/PowerNukkit | closed | Compatibility issue with wode's Chemistry plugin | Resolution: resolved Type: compatibility | # 🔌Plugin compatibility issue
<!--
👉 This template is helpful, but you may erase everything if you can express the issue clearly
Feel free to ask questions or start related discussion
-->
A compatibility issue with the Chemistry plugin from wode has been reported by @good777LUCKY in https://github.com/GameModsBR/PowerNukkit/issues/312#issuecomment-652182695
### 📸 Screenshots / Videos
<!-- ✍ If applicable, add screenshots or video recordings to help explain your problem -->
None yet
### ▶ Steps to Reproduce
<!--- ✍ Reliable steps which someone can use to reproduce the issue. -->
1. Open a chest that has chemistry things
2. Client will crash even if I disable the chemistry plugin
### ✔ Expected Behavior
<!-- ✍ What would you expect to happen -->
Don't crash and possibly show the items or nothing.
### ❌ Actual Behavior
<!-- ✍ What actually happened -->
Client crashes.
### 📋 Debug information
<!-- Use the 'debugpaste' and 'timings paste' command in PowerNukkit -->
<!-- You can get the version from the file name, the 'about' or 'debugpaste' command outputs -->
* PowerNukkit version: ✍ Not informed
* Debug link: ✍ Not informed
### 💢 Crash Dump, Stack Trace and Other Files
<!-- ✍ Use https://hastebin.com for big logs or dumps -->
Not informed
### 💬 Anything else we should know?
<!-- ✍ This is the perfect place to add any additional details -->
The plugin: https://github.com/wode490390/ChemistryGameplay
| True | Compatibility issue with wode's Chemistry plugin - # 🔌Plugin compatibility issue
<!--
👉 This template is helpful, but you may erase everything if you can express the issue clearly
Feel free to ask questions or start related discussion
-->
A compatibility issue with the Chemistry plugin from wode has been reported by @good777LUCKY in https://github.com/GameModsBR/PowerNukkit/issues/312#issuecomment-652182695
### 📸 Screenshots / Videos
<!-- ✍ If applicable, add screenshots or video recordings to help explain your problem -->
None yet
### ▶ Steps to Reproduce
<!--- ✍ Reliable steps which someone can use to reproduce the issue. -->
1. Open a chest that has chemistry things
2. Client will crash even if I disable the chemistry plugin
### ✔ Expected Behavior
<!-- ✍ What would you expect to happen -->
Don't crash and possibly show the items or nothing.
### ❌ Actual Behavior
<!-- ✍ What actually happened -->
Client crashes.
### 📋 Debug information
<!-- Use the 'debugpaste' and 'timings paste' command in PowerNukkit -->
<!-- You can get the version from the file name, the 'about' or 'debugpaste' command outputs -->
* PowerNukkit version: ✍ Not informed
* Debug link: ✍ Not informed
### 💢 Crash Dump, Stack Trace and Other Files
<!-- ✍ Use https://hastebin.com for big logs or dumps -->
Not informed
### 💬 Anything else we should know?
<!-- ✍ This is the perfect place to add any additional details -->
The plugin: https://github.com/wode490390/ChemistryGameplay
| comp | compatibility issue with wode s chemistry plugin 🔌plugin compatibility issue 👉 this template is helpful but you may erase everything if you can express the issue clearly feel free to ask questions or start related discussion a compatibility issue with the chemistry plugin from wode has been reported by in 📸 screenshots videos none yet ▶ steps to reproduce open a chest that have chemistry things client will crash even i disable chemistry plugin ✔ expected behavior don t crash and possibly show the items or nothing ❌ actual behavior client crashes 📋 debug information powernukkit version ✍ not informed debug link ✍ not informed 💢 crash dump stack trace and other files not informed 💬 anything else we should know the plugin | 1 |
116,683 | 11,939,669,273 | IssuesEvent | 2020-04-02 15:31:55 | schramm-famm/ether | https://api.github.com/repos/schramm-famm/ether | opened | Update README to include missing API | documentation | The README is currently missing a description for `GET /ether/v1/conversations`. | 1.0 | Update README to include missing API - The README is currently missing a description for `GET /ether/v1/conversations`. | non_comp | update readme to include missing api the readme is currently missing a description for get ether conversations | 0 |
1,374 | 3,906,379,025 | IssuesEvent | 2016-04-19 08:36:34 | AdguardTeam/AdguardForAndroid | https://api.github.com/repos/AdguardTeam/AdguardForAndroid | closed | Musically app broken by https filtering. | Compatibility SSL | Made a test account:
Username: adguardtest
Password: adguard | True | Musically app broken by https filtering. - Made a test account:
Username: adguardtest
Password: adguard | comp | musically app broken by https filtering made a test account username adguardtest password adguard | 1 |
208,139 | 7,136,125,750 | IssuesEvent | 2018-01-23 05:12:51 | wso2/product-apim | https://api.github.com/repos/wso2/product-apim | closed | Include details on configuring SAML SSO as the inbound authentication when configuring STORE/PUBLISHER as the SP | 2.2.0 Priority/Highest SSO Severity/Critical Type/Docs | **Suggested Labels:**
APIM 2.2.0
Type/Doc
Severity/Critical
Priority/Highest
**Description:**
Refer to doc [1]. Under the topic "Configure service providers to the Publisher and Store with the Facebook Identity Provider" doesn't mention any configuration details related to configuring SAML SSO Inbound Authentication for the Service Provider. Please include these details or point the users for a location where these configuration details are mentioned.
[1]. https://docs.wso2.com/display/AM2xx/Log+in+to+the+API+Store+using+Social+Media
| 1.0 | Include details on configuring SAML SSO as the inbound authentication when configuring STORE/PUBLISHER as the SP - **Suggested Labels:**
APIM 2.2.0
Type/Doc
Severity/Critical
Priority/Highest
**Description:**
Refer to doc [1]. Under the topic "Configure service providers to the Publisher and Store with the Facebook Identity Provider" doesn't mention any configuration details related to configuring SAML SSO Inbound Authentication for the Service Provider. Please include these details or point the users for a location where these configuration details are mentioned.
[1]. https://docs.wso2.com/display/AM2xx/Log+in+to+the+API+Store+using+Social+Media
| non_comp | include details on configuring saml sso as the inbound authentication when configuring store publisher as the sp suggested labels apim type doc severity crtical priority highest description refer to doc under the topic configure service providers to the publisher and store with the facebook identity provider doesn t mention any configuration details related to configuring saml sso inbound authentication for the service provider please include these details or point the users for a location where these configuration details are mentioned | 0 |
12,251 | 14,478,620,617 | IssuesEvent | 2020-12-10 08:40:36 | OneSignal/OneSignal-Android-SDK | https://api.github.com/repos/OneSignal/OneSignal-Android-SDK | closed | the sdk documentation for android studio 4.1 is deprecated | Compatibility Issue | **Description:**
The sdk documentation for android studio version 4.1 is obsolete.
## In the new version you can no longer add:

```gradle
buildscript {
    repositories {
        maven { url 'https://plugins.gradle.org/m2/' }
    }
    dependencies {
        classpath 'gradle.plugin.com.onesignal:onesignal-gradle-plugin:[0.12.8, 0.99.99]'
    }
}
apply plugin: 'com.onesignal.androidsdk.onesignal-gradle-plugin'
repositories {
    maven { url 'https://maven.google.com' }
}
```
An error appears that you cannot add dependencies before the plugins
## New build.gradle structure of Android Studio:

```gradle
plugins {
    id 'com.android.application'
}

android {
    compileSdkVersion 30
    buildToolsVersion "30.0.2"

    defaultConfig {
        applicationId "com.app.example"
        minSdkVersion 16
        targetSdkVersion 30
        versionCode 1
        versionName "1.0"

        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    implementation 'androidx.appcompat:appcompat:1.2.0'
    implementation 'com.google.android.material:material:1.2.1'
    implementation 'androidx.constraintlayout:constraintlayout:2.0.2'
    testImplementation 'junit:junit:4.+'
}
```
| True | the sdk documentation for android studio 4.1 is deprecated - **Description:**
The sdk documentation for android studio version 4.1 is obsolete.
## In the new version you can no longer add:

```gradle
buildscript {
    repositories {
        maven { url 'https://plugins.gradle.org/m2/' }
    }
    dependencies {
        classpath 'gradle.plugin.com.onesignal:onesignal-gradle-plugin:[0.12.8, 0.99.99]'
    }
}
apply plugin: 'com.onesignal.androidsdk.onesignal-gradle-plugin'
repositories {
    maven { url 'https://maven.google.com' }
}
```
An error appears that you cannot add dependencies before the plugins
## New build.gradle structure of Android Studio:

```gradle
plugins {
    id 'com.android.application'
}

android {
    compileSdkVersion 30
    buildToolsVersion "30.0.2"

    defaultConfig {
        applicationId "com.app.example"
        minSdkVersion 16
        targetSdkVersion 30
        versionCode 1
        versionName "1.0"

        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    implementation 'androidx.appcompat:appcompat:1.2.0'
    implementation 'com.google.android.material:material:1.2.1'
    implementation 'androidx.constraintlayout:constraintlayout:2.0.2'
    testImplementation 'junit:junit:4.+'
}
```
| comp | the sdk documentation for android studio is deprecated description the sdk documentation for android studio version is obsolete in the new version you can no longer add buildscript repositories maven url dependencies classpath gradle plugin com onesignal onesignal gradle plugin apply plugin com onesignal androidsdk onesignal gradle plugin repositories maven url an error appears that you cannot add dependencies before the plugins new buildgradle structure of android studio plugins id com android application android compilesdkversion buildtoolsversion defaultconfig applicationid com app example minsdkversion targetsdkversion versioncode versionname testinstrumentationrunner androidx test runner androidjunitrunner buildtypes release minifyenabled false proguardfiles getdefaultproguardfile proguard android optimize txt proguard rules pro compileoptions sourcecompatibility javaversion version targetcompatibility javaversion version dependencies implementation androidx appcompat appcompat implementation com google android material material implementation androidx constraintlayout constraintlayout testimplementation junit junit | 1 |
10,475 | 12,421,595,500 | IssuesEvent | 2020-05-23 17:37:38 | facebook/hhvm | https://api.github.com/repos/facebook/hhvm | closed | Registering autoloaders from autoloaders does not work as expected | php5 incompatibility | Imagine a situation where an autoloader does not know how to load a class, but it does know which autoloader does so it registers the new autoloader.
In HHVM, registering an autoloader while loading a class using the `spl_autoload_register()` function does not work. The newly registered autoloader is not called. This does work in PHP 5.6.11.
Example code:
``` php
<?php
class Autoloader1
{
public static function autoload ($class)
{
var_dump("Loading {$class} from " . __CLASS__);
spl_autoload_register(["Autoloader2", "autoload"]);
}
}
class Autoloader2
{
public static function autoload ($class)
{
var_dump("Loading {$class} from " . __CLASS__);exit;
}
}
spl_autoload_register(["Autoloader1", "autoload"]);
$r = new Foo();
```
Output from PHP 5.6.11:
```
string(28) "Loading Foo from Autoloader1"
string(28) "Loading Foo from Autoloader2"
Fatal error: Class undefined: Foo in /var/www/willem/tests/hhvmautoload.php on line 26
```
Output from HHVM 3.8.0
```
string(28) "Loading Foo from Autoloader1"
Fatal error: Class undefined: Foo in /var/www/willem/tests/hhvmautoload.php on line 26
```
Note how the second autoloader is not called in the HHVM example.
| True | Registering autoloaders from autoloaders does not work as expected - Imagine a situation where an autoloader does not know how to load a class, but it does know which autoloader does so it registers the new autoloader.
In HHVM, registering an autoloader while loading a class using the `spl_autoload_register()` function does not work. The newly registered autoloader is not called. This does work in PHP 5.6.11.
Example code:
``` php
<?php
class Autoloader1
{
public static function autoload ($class)
{
var_dump("Loading {$class} from " . __CLASS__);
spl_autoload_register(["Autoloader2", "autoload"]);
}
}
class Autoloader2
{
public static function autoload ($class)
{
var_dump("Loading {$class} from " . __CLASS__);exit;
}
}
spl_autoload_register(["Autoloader1", "autoload"]);
$r = new Foo();
```
Output from PHP 5.6.11:
```
string(28) "Loading Foo from Autoloader1"
string(28) "Loading Foo from Autoloader2"
Fatal error: Class undefined: Foo in /var/www/willem/tests/hhvmautoload.php on line 26
```
Output from HHVM 3.8.0
```
string(28) "Loading Foo from Autoloader1"
Fatal error: Class undefined: Foo in /var/www/willem/tests/hhvmautoload.php on line 26
```
Note how the second autoloader is not called in the HHVM example.
| comp | registering autoloaders from autoloaders does not as expected imagine a situation where an autoloader does not know how to load a class but it does know which autoloader does so it registers the new autoloader in hhvm registering an autoloader while loading a class using the spl autoload register function does not work the newly registered autoloader is not called this does work in php example code php php class public static function autoload class var dump loading class from class spl autoload register class public static function autoload class var dump loading class from class exit spl autoload register r new foo output from php string loading foo from string loading foo from fatal error class undefined foo in var www willem tests hhvmautoload php on line output from hhvm string loading foo from fatal error class undefined foo in var www willem tests hhvmautoload php on line note how the second autoloader is not called in the hhvm example | 1 |
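The behavioral difference in the HHVM report above can be modeled outside PHP. The following is a hypothetical Python sketch (function names are illustrative, not HHVM internals): PHP 5.6's `spl_autoload_call` walks the *live* list of registered loaders, so a loader registered mid-resolution is still consulted, whereas iterating over a snapshot taken up front — the behavior observed in HHVM 3.8 — never reaches it:

```python
def resolve_live(name, loaders):
    # Walk the registration list by index so that loaders appended
    # during resolution are still visited (the PHP 5.6 behavior).
    i = 0
    while i < len(loaders):
        found = loaders[i](name, loaders)
        if found is not None:
            return found
        i += 1
    return None

def resolve_snapshot(name, loaders):
    # Iterate over a copy taken before resolving (the HHVM 3.8 behavior
    # observed above): loaders registered mid-resolution are skipped.
    for loader in list(loaders):
        found = loader(name, loaders)
        if found is not None:
            return found
    return None

def autoloader1(name, loaders):
    loaders.append(autoloader2)   # register another loader while resolving
    return None

def autoloader2(name, loaders):
    return f"class {name}"

print(resolve_live("Foo", [autoloader1]))      # class Foo
print(resolve_snapshot("Foo", [autoloader1]))  # None
```

The live-list walk is what lets a loader that only knows *which* loader can handle a class hand the work off, which is exactly the pattern the report relies on.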
84,625 | 24,366,961,847 | IssuesEvent | 2022-10-03 15:53:20 | dotnet/arcade | https://api.github.com/repos/dotnet/arcade | opened | Build failed: dotnet-helix-service-weekly/main #2022100301 | First Responder Build Failed | Build [#2022100301](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_build/results?buildId=2010143) failed
## :x: : internal / dotnet-helix-service-weekly failed
### Summary
**Finished** - Mon, 03 Oct 2022 15:53:09 GMT
**Duration** - 41 minutes
**Requested for** - Microsoft.VisualStudio.Services.TFS
**Reason** - schedule
### Details
#### SynchronizeSecrets
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'AccessToken-dotnet-build-bot-public-repo' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'akams-client-id' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'akams-client-secret' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'App-PipeBuild-Client-Secret' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BenchViewUploadToken' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BenchViewUploadTokenLinux' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BotAccount-dn-ha-bt-domain-password' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BotAccount-dn-helix-agents-bot-otp' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BotAccount-dn-helix-agents-bot-password' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BotAccount-dn-roci-domain-password' consider deleting it.
- :x: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - User intervention required for creation or rotation of an Azure DevOps access token.
- :x: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - User intervention required for creation or rotation of an Azure DevOps access token.
- :x: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Script failed with exit code: 1
### Changes
- [f3cfaa83](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/f3cfaa83b06672f66c9d4c8ca90354112acb943a) - Alitzel Mendez Bustillo - Merged PR 26337: Build analysis: Include job failures in analysis
- [3e154ee2](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/3e154ee2ab840b5f30ba8ea878fa9758c487645c) - Alitzel Mendez Bustillo - Merged PR 26329: Change Queue Insights alert executionErrorState from alerting to keep state
- [376d0c35](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/376d0c35c13465d6225219e6099b1d46cb16c01b) - Chad Nedzlek - Merged PR 26312: Add MessageProcessingSeconds metrics
- [caa8f111](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/caa8f111ff242bf9c1367cc1718370bffbc0db85) - Missy Messa - Merged PR 25713: Retrofitting metrics observer to know about org ID
- [bd66b467](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/bd66b46736ddd23fa6521ee54d74405df8461455) - Missy Messa - Merged PR 26285: Remove build number from error path
- [ea90c2ff](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/ea90c2ff873233d8355d51cc5807788e57755bf2) - Missy Messa - removed build number from retry info
- [2dcee0da](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/2dcee0da6dfa5d26a3064047e931f5580deb7b2f) - Ricardo Arenas - Merged PR 26288: Revert 'Queue insights: only retrieve unique queue names from matrix of truth data'
- [179e0c60](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/179e0c60e55337c64180fdca062ce4e2cccd00f5) - Missy Messa - Merged PR 26286: Simplified section names for build and test failures
- [7bd29bcc](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/7bd29bccdf94cb35cd595cc85dd5f3ce48a655fd) - Missy Messa - Simplified section names for build and test failures
- [e364aeec](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/e364aeeceb011655d46e4563432224f7b9358ca6) - Missy Messa - Remove build number from error path
| 1.0 | Build failed: dotnet-helix-service-weekly/main #2022100301 - Build [#2022100301](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_build/results?buildId=2010143) failed
## :x: : internal / dotnet-helix-service-weekly failed
### Summary
**Finished** - Mon, 03 Oct 2022 15:53:09 GMT
**Duration** - 41 minutes
**Requested for** - Microsoft.VisualStudio.Services.TFS
**Reason** - schedule
### Details
#### SynchronizeSecrets
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'AccessToken-dotnet-build-bot-public-repo' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'akams-client-id' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'akams-client-secret' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'App-PipeBuild-Client-Secret' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BenchViewUploadToken' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BenchViewUploadTokenLinux' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BotAccount-dn-ha-bt-domain-password' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BotAccount-dn-helix-agents-bot-otp' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BotAccount-dn-helix-agents-bot-password' consider deleting it.
- :warning: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Extra secret 'BotAccount-dn-roci-domain-password' consider deleting it.
- :x: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - User intervention required for creation or rotation of an Azure DevOps access token.
- :x: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - User intervention required for creation or rotation of an Azure DevOps access token.
- :x: - [[Log]](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_apis/build/builds/2010143/logs/21) - Script failed with exit code: 1
### Changes
- [f3cfaa83](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/f3cfaa83b06672f66c9d4c8ca90354112acb943a) - Alitzel Mendez Bustillo - Merged PR 26337: Build analysis: Include job failures in analysis
- [3e154ee2](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/3e154ee2ab840b5f30ba8ea878fa9758c487645c) - Alitzel Mendez Bustillo - Merged PR 26329: Change Queue Insights alert executionErrorState from alerting to keep state
- [376d0c35](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/376d0c35c13465d6225219e6099b1d46cb16c01b) - Chad Nedzlek - Merged PR 26312: Add MessageProcessingSeconds metrics
- [caa8f111](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/caa8f111ff242bf9c1367cc1718370bffbc0db85) - Missy Messa - Merged PR 25713: Retrofitting metrics observer to know about org ID
- [bd66b467](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/bd66b46736ddd23fa6521ee54d74405df8461455) - Missy Messa - Merged PR 26285: Remove build number from error path
- [ea90c2ff](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/ea90c2ff873233d8355d51cc5807788e57755bf2) - Missy Messa - removed build number from retry info
- [2dcee0da](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/2dcee0da6dfa5d26a3064047e931f5580deb7b2f) - Ricardo Arenas - Merged PR 26288: Revert 'Queue insights: only retrieve unique queue names from matrix of truth data'
- [179e0c60](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/179e0c60e55337c64180fdca062ce4e2cccd00f5) - Missy Messa - Merged PR 26286: Simplified section names for build and test failures
- [7bd29bcc](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/7bd29bccdf94cb35cd595cc85dd5f3ce48a655fd) - Missy Messa - Simplified section names for build and test failures
- [e364aeec](https://dev.azure.com/dnceng/7ea9116e-9fac-403d-b258-b31fcf1bb293/_git/b446d7ab-be6b-4949-aa44-214eef7e35bb/commit/e364aeeceb011655d46e4563432224f7b9358ca6) - Missy Messa - Remove build number from error path
| non_comp | build failed dotnet helix service weekly main build failed x internal dotnet helix service weekly failed summary finished mon oct gmt duration minutes requested for microsoft visualstudio services tfs reason schedule details synchronizesecrets warning extra secret accesstoken dotnet build bot public repo consider deleting it warning extra secret akams client id consider deleting it warning extra secret akams client secret consider deleting it warning extra secret app pipebuild client secret consider deleting it warning extra secret benchviewuploadtoken consider deleting it warning extra secret benchviewuploadtokenlinux consider deleting it warning extra secret botaccount dn ha bt domain password consider deleting it warning extra secret botaccount dn helix agents bot otp consider deleting it warning extra secret botaccount dn helix agents bot password consider deleting it warning extra secret botaccount dn roci domain password consider deleting it x user intervention required for creation or rotation of an azure devops access token x user intervention required for creation or rotation of an azure devops access token x script failed with exit code changes alitzel mendez bustillo merged pr build analysis include job failures in analysis alitzel mendez bustillo merged pr change queue insights alert executionerrorstate from alerting to keep state chad nedzlek merged pr add messageprocessingseconds metrics missy messa merged pr retrofitting metrics observer to know about org id missy messa merged pr remove build number from error path missy messa removed build number from retry info ricardo arenas merged pr revert queue insights only retrieve unique queue names from matrix of truth data missy messa merged pr simplified section names for build and test failures missy messa simplified section names for build and test failures missy messa remove build number from error path | 0 |
3,499 | 6,485,696,846 | IssuesEvent | 2017-08-19 12:58:40 | pingcap/tidb | https://api.github.com/repos/pingcap/tidb | opened | support `GROUP BY` modifiers | compatibility enhancement todo | ## 1. What did you do?
MySQL and Oracle support [`ROLLUP`](https://dev.mysql.com/doc/refman/5.7/en/group-by-modifiers.html) modifier, sql-server also support `CUBE` and `GROUPING SETS`
## 2. What did you expect to see?
```sql
drop table if exists t;
create table t(a bigint, b bigint, c bigint);
insert into t values(1, 2, 3), (2, 2, 3), (3, 2, 3);
```
```sql
MySQL > select a, min(b) from t group by a with rollup;
+------+--------+
| a | min(b) |
+------+--------+
| 1 | 2 |
| 2 | 2 |
| 3 | 2 |
| NULL | 2 |
+------+--------+
4 rows in set (0.01 sec)
```
## 3. What did you see instead?
```sql
TiDB > select a, min(b) from t group by a with rollup;
ERROR 1105 (HY000): line 0 column 39 near " rollup" (total length 46)
```
## 4. What version of TiDB are you using (`tidb-server -V`)?
```sql
TiDB > select tidb_version();
+-----------------------------------------------------------------------------------------------------------------------------------------+
| tidb_version() |
+-----------------------------------------------------------------------------------------------------------------------------------------+
| Release Version: 0.8.0
Git Commit Hash: 3e1728b4b853c224daa969fcc3d03be5d0860ef4
Git Branch: master
UTC Build Time: 2017-08-19 12:33:06 |
+-----------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)
``` | True | support `GROUP BY` modifiers - ## 1. What did you do?
MySQL and Oracle support [`ROLLUP`](https://dev.mysql.com/doc/refman/5.7/en/group-by-modifiers.html) modifier, sql-server also support `CUBE` and `GROUPING SETS`
## 2. What did you expect to see?
```sql
drop table if exists t;
create table t(a bigint, b bigint, c bigint);
insert into t values(1, 2, 3), (2, 2, 3), (3, 2, 3);
```
```sql
MySQL > select a, min(b) from t group by a with rollup;
+------+--------+
| a | min(b) |
+------+--------+
| 1 | 2 |
| 2 | 2 |
| 3 | 2 |
| NULL | 2 |
+------+--------+
4 rows in set (0.01 sec)
```
## 3. What did you see instead?
```sql
TiDB > select a, min(b) from t group by a with rollup;
ERROR 1105 (HY000): line 0 column 39 near " rollup" (total length 46)
```
## 4. What version of TiDB are you using (`tidb-server -V`)?
```sql
TiDB > select tidb_version();
+-----------------------------------------------------------------------------------------------------------------------------------------+
| tidb_version() |
+-----------------------------------------------------------------------------------------------------------------------------------------+
| Release Version: 0.8.0
Git Commit Hash: 3e1728b4b853c224daa969fcc3d03be5d0860ef4
Git Branch: master
UTC Build Time: 2017-08-19 12:33:06 |
+-----------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)
``` | comp | support group by modifiers what did you do mysql and oracle support modifier sql server also support cube and grouping sets what did you expect to see sql drop table if exists t create table t a bigint b bigint c bigint insert into t values sql mysql select a min b from t group by a with rollup a min b null rows in set sec what did you see instead sql tidb select a min b from t group by a with rollup error line column near rollup total length what version of tidb are you using tidb server v sql tidb select tidb version tidb version release version git commit hash git branch master utc build time row in set sec | 1 |
9,152 | 11,183,585,888 | IssuesEvent | 2019-12-31 14:02:57 | AthenaSulisMinerva/CombatExtendedFastTrack | https://api.github.com/repos/AthenaSulisMinerva/CombatExtendedFastTrack | closed | [RH] Faction: Cordis Die (1.0) - CLAW AGR health and armor rating nerf | duplicate mod compatiblity patch | Oof, nother update lads.
Just nerfed the CLAW AGR really:
Race: RHRace_CLAW_AGR
```
<ArmorRating_Blunt>0.30</ArmorRating_Blunt>
<ArmorRating_Sharp>0.60</ArmorRating_Sharp>
<baseHealthScale>2.0</baseHealthScale>
```
I've nerfed the armor ratings and base health scale for this fella, not sure if it's worth reporting here but I can never be too sure. Don't want the CE FT users hangin' out trying to kill the thing when I nerfed it already. Cheers guys, take care | True | [RH] Faction: Cordis Die (1.0) - CLAW AGR health and armor rating nerf - Oof, nother update lads.
Just nerfed the CLAW AGR really:
Race: RHRace_CLAW_AGR
```
<ArmorRating_Blunt>0.30</ArmorRating_Blunt>
<ArmorRating_Sharp>0.60</ArmorRating_Sharp>
<baseHealthScale>2.0</baseHealthScale>
```
I've nerfed the armor ratings and base health scale for this fella, not sure if it's worth reporting here but I can never be too sure. Don't want the CE FT users hangin' out trying to kill the thing when I nerfed it already. Cheers guys, take care | comp | faction cordis die claw agr health and armor rating nerf oof nother update lads just nerfed the claw agr really race rhrace claw agr i ve nerfed the armor ratings and base health scale for this fella not sure if it s worth reporting here but i can never be too sure don t want the ce ft users hangin out trying to kill the thing when i nerfed it already cheers guys take care | 1 |
16,061 | 21,366,011,795 | IssuesEvent | 2022-04-20 01:46:26 | Automattic/woocommerce-subscriptions-core | https://api.github.com/repos/Automattic/woocommerce-subscriptions-core | opened | Replace code which uses `get_posts()` to get Orders post types with `wc_get_orders()` | type: task size: medium compatibility | ## Description
<!--
A clear and concise description of what the new feature or improvement is.
Include images or screenshots to clarify the context.
What are you trying to do – what's the wider flow?
-->
With Custom Order Tables being worked on by the WC Core team ([public announcement](https://developer.woocommerce.com/2022/01/17/the-plan-for-the-woocommerce-custom-order-table/)), we need to begin to think about updating the Subscriptions Core library to remove any code that is calling WP Post API functions.
In this issue we're tackling uses of `get_posts()` where `shop_order` post type is passed. Subscriptions and products are fine as they're staying in the wp posts table.
### How to update code?
We should be able to replace most cases of `get_posts` with `wc_get_orders()`. To search for custom parameters we can use the approach outlined here: https://github.com/woocommerce/woocommerce/wiki/wc_get_orders-and-WC_Order_Query#adding-custom-parameter-support
### Code that needs Updating
- [ ] [`get_users_subscription_orders`](https://github.com/Automattic/woocommerce-subscriptions-core/blob/97c279bdb6131462de9e90161a6a172ce0447293/includes/class-wc-subscriptions-order.php#L617)
- This function is not being used by any of our code including in WC Payments and WC Subscriptions so updating this is very low priority/impact
- [ ] [`wcs_get_subscription_id_from_key`](https://github.com/Automattic/woocommerce-subscriptions-core/blob/ad42d34fc632327333486d09577740a38f9f76f8/includes/wcs-deprecated-functions.php#L150-L156)
- Subscription keys were a thing back before WC Subscriptions 2.0, however it looks like this function is still be used and it's a public function so we should still update it.
- [ ] [`wcs_get_subscription_orders`](https://github.com/Automattic/woocommerce-subscriptions-core/blob/2a0654bf0385cc3e5b097943aea5589e843b05d0/includes/wcs-order-functions.php#L425)
### Code to ignore
- `class WCS_Related_Order_Store_CPT`
- [`WCS_PayPal_Standard_IPN_Handler::get_order_id_and_key()`](https://github.com/Automattic/woocommerce-subscriptions-core/blob/2a0654bf0385cc3e5b097943aea5589e843b05d0/includes/gateways/paypal/includes/class-wcs-paypal-standard-ipn-handler.php#L604)
- This can probably be ignored for now because `'shop_order'` post type is never queried (even though it's the default option when calling the function). I believe this default option was pre-WC subscriptions 2.0 before a subscription post type existed and we had to use the parent order ID.
- `class WCS_Repair_2_0`
- `class WCS_Upgrade_2_0`
## Product impact
<!-- What product(s) is this feature intended for? -->
- Does this feature affect WooCommerce Subscriptions? yes
- Does this feature affect WooCommerce Payments? yes
## Dev notes
<!-- If applicable, additional technical or implementation details that will help when developing this feature or improvement. -->
## Additional context
<!-- Any additional context or details you think might be helpful. -->
<!-- Ticket numbers/links, P2s, project threads, etc. -->
| True | Replace code which uses `get_posts()` to get Orders post types with `wc_get_orders()` - ## Description
<!--
A clear and concise description of what the new feature or improvement is.
Include images or screenshots to clarify the context.
What are you trying to do – what's the wider flow?
-->
With Custom Order Tables being worked on by the WC Core team ([public announcement](https://developer.woocommerce.com/2022/01/17/the-plan-for-the-woocommerce-custom-order-table/)), we need to begin to think about updating the Subscriptions Core library to remove any code that is calling WP Post API functions.
In this issue we're tackling uses of `get_posts()` where `shop_order` post type is passed. Subscriptions and products are fine as they're staying in the wp posts table.
### How to update code?
We should be able to replace most cases of `get_posts` with `wc_get_orders()`. To search for custom parameters we can use the approach outlined here: https://github.com/woocommerce/woocommerce/wiki/wc_get_orders-and-WC_Order_Query#adding-custom-parameter-support
### Code that needs Updating
- [ ] [`get_users_subscription_orders`](https://github.com/Automattic/woocommerce-subscriptions-core/blob/97c279bdb6131462de9e90161a6a172ce0447293/includes/class-wc-subscriptions-order.php#L617)
- This function is not being used by any of our code including in WC Payments and WC Subscriptions so updating this is very low priority/impact
- [ ] [`wcs_get_subscription_id_from_key`](https://github.com/Automattic/woocommerce-subscriptions-core/blob/ad42d34fc632327333486d09577740a38f9f76f8/includes/wcs-deprecated-functions.php#L150-L156)
- Subscription keys were a thing back before WC Subscriptions 2.0, however it looks like this function is still be used and it's a public function so we should still update it.
- [ ] [`wcs_get_subscription_orders`](https://github.com/Automattic/woocommerce-subscriptions-core/blob/2a0654bf0385cc3e5b097943aea5589e843b05d0/includes/wcs-order-functions.php#L425)
### Code to ignore
- `class WCS_Related_Order_Store_CPT`
- [`WCS_PayPal_Standard_IPN_Handler::get_order_id_and_key()`](https://github.com/Automattic/woocommerce-subscriptions-core/blob/2a0654bf0385cc3e5b097943aea5589e843b05d0/includes/gateways/paypal/includes/class-wcs-paypal-standard-ipn-handler.php#L604)
- This can probably be ignored for now because `'shop_order'` post type is never queried (even though it's the default option when calling the function). I believe this default option was pre-WC subscriptions 2.0 before a subscription post type existed and we had to use the parent order ID.
- `class WCS_Repair_2_0`
- `class WCS_Upgrade_2_0`
## Product impact
<!-- What product(s) is this feature intended for? -->
- Does this feature affect WooCommerce Subscriptions? yes
- Does this feature affect WooCommerce Payments? yes
## Dev notes
<!-- If applicable, additional technical or implementation details that will help when developing this feature or improvement. -->
## Additional context
<!-- Any additional context or details you think might be helpful. -->
<!-- Ticket numbers/links, P2s, project threads, etc. -->
| comp | replace code which uses get posts to get orders post types with wc get orders description a clear and concise description of what the new feature or improvement is include images or screenshots to clarify the context what are you trying to do – what s the wider flow with custom order tables being worked on by the wc core team we need to begin to think about updating the subscriptions core library to remove any code that is calling wp post api functions in this issue we re tackling uses of get posts where shop order post type is passed subscriptions and products are fine as they re staying in the wp posts table how to update code we should be able to replace most cases of get posts with wc get orders to search for custom parameters we can use the approach outlined here code that needs updating this function is not being used by any of our code including in wc payments and wc subscriptions so updating this is very low priority impact subscription keys were a thing back before wc subscriptions however it looks like this function is still be used and it s a public function so we should still update it code to ignore class wcs related order store cpt this can probably be ignored for now because shop order post type is never queried even though it s the default option when calling the function i believe this default option was pre wc subscriptions before a subscription post type existed and we had to use the parent order id class wcs repair class wcs upgrade product impact does this feature affect woocommerce subscriptions yes does this feature affect woocommerce payments yes dev notes additional context | 1 |