Dataset columns and dtypes:

| column | dtype | range / classes |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | lengths 4 to 112 |
| repo_url | string | lengths 33 to 141 |
| action | string | 3 classes |
| title | string | lengths 1 to 957 |
| labels | string | lengths 4 to 1.11k |
| body | string | lengths 1 to 261k |
| index | string | 11 classes |
| text_combine | string | lengths 95 to 261k |
| label | string | 2 classes |
| text | string | lengths 96 to 250k |
| binary_label | int64 | 0 to 1 |
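A quick way to sanity-check the schema above is to rebuild a couple of the rows shown below in pandas. A minimal sketch: the two abbreviated rows and the `design` → 1 / `non_design` → 0 rule come from the records in this dump; everything else (the tiny inline frame instead of a real file) is illustrative:

```python
import pandas as pd

# Two abbreviated rows mirroring the first two records in the dump (bodies omitted).
rows = [
    {"id": 3_051_045_219, "type": "IssuesEvent", "repo": "girldevelopit/gdi-new-site",
     "action": "closed", "label": "design"},
    {"id": 7_856_993_915, "type": "IssuesEvent", "repo": "telerik/kendo-ui-core",
     "action": "closed", "label": "non_design"},
]
df = pd.DataFrame(rows)

# binary_label mirrors label throughout the dump: design -> 1, non_design -> 0.
df["binary_label"] = (df["label"] == "design").astype(int)
print(df[["repo", "label", "binary_label"]])
```

The derived column matches the `binary_label` values shown in every record below.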
---
Unnamed: 0: 12,160
id: 3,051,045,219
type: IssuesEvent
created_at: 2015-08-12 04:57:27
repo: girldevelopit/gdi-new-site
repo_url: https://api.github.com/repos/girldevelopit/gdi-new-site
action: closed
title: About Page: Our Mission and Our Values Allignment
labels: Beginner Friendly Help Wanted Suggestion UX/Design Needed
body:
The our mission and our values paragraphs take a while for me to decipher because of the alignment.
Instead of having them right next to each other can we place the paragraphs on top of each other and still keep the center alignment like the rest of the page. I feel it would it would connect better and reduce the visual confusion.
[ Our Vision ]
Our vision is to create a network of empowered women who feel confident in their abilities to code and build beautiful web and mobile applications. By teaching women around the world from diverse backgrounds to learn software development, we can help women improve their careers and confidence in their everyday lives.
[ Our Values ]
We are committed to making sure women of all races, education levels, income and upbringing can build confidence in their skill set to develop web and mobile applications. Our goal is to provide powerful hands-on programs to women seeking professional help in software development and create basic to advanced web and mobile applications.
index: 1.0
label: design
binary_label: 1
---
Unnamed: 0: 243,372
id: 7,856,993,915
type: IssuesEvent
created_at: 2018-06-21 09:22:29
repo: telerik/kendo-ui-core
repo_url: https://api.github.com/repos/telerik/kendo-ui-core
action: closed
title: Chat cards are not in a card deck
labels: Bug C: Chat Kendo2 Priority 5
body:
### Bug report
In R2 2018 SP1 Chat cards are not in a card deck.
### Reproduction of the problem
1. https://demos.telerik.com/kendo-ui/chat/index
1. Click on Get a Quote button
### Current behavior

### Expected/desired behavior

### Environment
* **Kendo UI version:** 2018.2.620
* **jQuery version:** x.y
* **Browser:** [all]
index: 1.0
label: non_design
binary_label: 0
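Across the records, the `text` column reads like a lowercased copy of `title` plus `body` with punctuation, digits, URLs, and markup stripped. A rough reconstruction of that normalization (a best-effort guess at the preprocessing, not the dataset's actual code):

```python
import re

def normalize(title: str, body: str) -> str:
    """Approximate the dump's `text` column from `title` and `body`."""
    combined = f"{title} {body}".lower()
    # Drop URLs first so their letters do not leak into the token stream.
    combined = re.sub(r"https?://\S+", " ", combined)
    # Keep only letters and whitespace (this also removes digits and markup).
    combined = re.sub(r"[^a-z\s]", " ", combined)
    # Collapse runs of whitespace into single spaces.
    return re.sub(r"\s+", " ", combined).strip()

print(normalize("Alignment bug in 2.5", "I found that equations"))
# -> "alignment bug in i found that equations"
```

This reproduces the pattern visible in the records, e.g. the MathJax row's `text` beginning "alignment bug in i found that equations".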
---
Unnamed: 0: 265,155
id: 20,071,911,818
type: IssuesEvent
created_at: 2022-02-04 08:12:28
repo: fishfight/FishFight
repo_url: https://api.github.com/repos/fishfight/FishFight
action: closed
title: Create style guide (naming conventions etc)
labels: documentation project
body:
Currently I am refactoring the item system and I have come across a lot of loose naming conventions from our prototyping days.
I have corrected some of them as I go, like `facing` on `PhysicsBody`, now, renamed `is_facing_right`. I am unsure how many times I have had to double check whether this means that the body is facing left or right while working on this project...
So, I propose that we create some naming conventions to supplement the other style enforcement put in place by `Clippy`.
In general, names should be informative and it should be possible to read the purpose of a variable from its name and its immediate context. This means that members and other variables that hold a value like, say, recoil should be named recoil. They should also not be specific to a subset of the functionality they will apply to. This is not that big of an issue but `shoot_coroutine` should, for example, in stead be named `attack_coroutine`, as it will also apply in the case of melee weapons...
If getters and setters are implemented, they should be prefixed as `get_some_value` and `set_some_value`. This convention should be used, whether the methods are getting or setting an actual (private) member var `some_value` or a set of member values.
I also think we should stick to naming `bool` variables and members using verbs, like `is_doing_something` or `is_using_something`, in the case of vars and member vars, or `should_use_something` and so on, in the case of arguments (in the case of non-member functions, it might fit better to use `is_[...]` for arguments, but you get the idea...)
These are my proposals. Feel free to add to these or to give any criticism or feedback, if you have it!
index: 1.0
label: non_design
binary_label: 0
---
Unnamed: 0: 107,607
id: 13,490,110,489
type: IssuesEvent
created_at: 2020-09-11 14:42:39
repo: patternfly/patternfly-org
repo_url: https://api.github.com/repos/patternfly/patternfly-org
action: opened
title: Add additional button guidelines: Button guidelines & UX writing style guide
labels: Content PF4 design Guidelines UX writing style guide
body:
Links to docs:
- [Button design guidelines](https://www.patternfly.org/v4/design-guidelines/usage-and-behavior/buttons-and-links)
- [Punctuation page of UX writing style guide](https://www.patternfly.org/v4/design-guidelines/content/punctuation)
Issue: Some buttons in OpenStack dashboard have a "+" sign on them (created using punctuation, not an icon), and some don't. To establish better icon usage on buttons, we can add additional guidelines on when to use icons on buttons and to avoid punctuation on buttons.
Reference:
- [PatternFly's icon page](https://www.patternfly.org/v4/design-guidelines/styles/icons)
- [IBM Carbon's button guidelines](https://www.carbondesignsystem.com/components/button/usage/#modifiers)
- [PatternFly's existing icon usage guidelines](https://pf4.patternfly.org/accessibility-guide#icons)
index: 1.0
label: design
binary_label: 1
---
Unnamed: 0: 176,568
id: 28,121,714,239
type: IssuesEvent
created_at: 2023-03-31 14:39:21
repo: aidenndev/Bookstore
repo_url: https://api.github.com/repos/aidenndev/Bookstore
action: closed
title: Add models, views & controllers
labels: enhancement system design back-end
body:
- [x] Add Book Model (name, ID, customer & reserved time)
- [x] Add Customer Model (Id, name, email, password)
- [x] Add relationships
- [x] Add Controllers
- [x] Add Views
index: 1.0
label: design
binary_label: 1
---
Unnamed: 0: 214,276
id: 7,268,359,101
type: IssuesEvent
created_at: 2018-02-20 09:49:57
repo: wso2/product-is
repo_url: https://api.github.com/repos/wso2/product-is
action: opened
title: Explain why we should do cache configuration when Deploying and configuring JWT client-handler artifacts
labels: Affected/5.5.0-Alpha Priority/High Type/Docs
body:
Explain why we should do cache configuration when Deploying and configuring JWT client-handler artifacts in this doc [1]
[1] https://docs.wso2.com/display/IS550/Private+Key+JWT+Client+Authentication+for+OIDC
In step 5 it says,
Do the cache configuration in <IS_HOME>/repository/conf/identity/identity.xml as shown below:
index: 1.0
label: non_design
binary_label: 0
---
Unnamed: 0: 117,522
id: 15,110,133,104
type: IssuesEvent
created_at: 2021-02-08 18:47:33
repo: mexyn/statev_v2_issues
repo_url: https://api.github.com/repos/mexyn/statev_v2_issues
action: closed
title: Removal of blocking garbage bins
labels: gamedesign solved
body:
<!-- Please fill out the template below completely -->
**Kalle Knutsen**
<!-- Which character triggered/observed the behavior in-game -->
**Static**
<!-- When exactly (date / time) was the bug observed -->
**Since V2 there have been 2 garbage bins at the company hash pfWestVine21 which block the normal sidewalk. You now always have to go over a curb, where you can easily trip, and it is also awkward for RP every time. Could the containers possibly be removed? Picture: https://pic.statev.de/image/pe2aS**
<!--- Describe the bug -->
**A barrier-free sidewalk to the company & the hotel rooms**
<!--- Describe how it should correctly behave -->
**Steps to reproduce the bug**
<!--- Describe step by step how to reproduce the bug -->
**Monitor resolution (only if rendered incorrectly in the UI)**
<!--- Describe step by step how to reproduce the bug -->
**https://pic.statev.de/image/pe2aS**
<!--- If you made a video or picture of the bug, you can insert it here. This works simply via drag & drop -->
index: 1.0
label: design
binary_label: 1
---
Unnamed: 0: 433,459
id: 30,329,475,347
type: IssuesEvent
created_at: 2023-07-11 04:44:50
repo: Infleqtion/client-superstaq
repo_url: https://api.github.com/repos/Infleqtion/client-superstaq
action: opened
title: Update tutorial install instructions
labels: documentation
body:
Update install instructions to:
```
try:
import qiskit
import qiskit_superstaq as qss
except ImportError:
print("Installing qiskit-superstaq...")
%pip install -q qiskit-superstaq[examples]
print("Installed qiskit-superstaq")
```
index: 1.0
label: non_design
binary_label: 0
---
Unnamed: 0: 184,992
id: 14,997,729,099
type: IssuesEvent
created_at: 2021-01-29 17:19:25
repo: RoyMagnussen/Yummy-Recipe-Blog
repo_url: https://api.github.com/repos/RoyMagnussen/Yummy-Recipe-Blog
action: closed
title: [CHANGE REQUEST] Wireframes
labels: documentation enhancement
body:
Linked Project: Documentation
#### Describe The Change Wanted
- Create a `Wireframes` section within the `UX` section of the `README.md` file.
- Add the wireframes to the `docs` folder.
- Add a link to the wireframes in the `Wireframes` section.
#### Reason For Change
This will help provide the developers to gain a better understanding of how the site is proposed to look like.
index: 1.0
label: non_design
binary_label: 0
---
Unnamed: 0: 119,251
id: 15,438,997,058
type: IssuesEvent
created_at: 2021-03-07 22:29:34
repo: fga-eps-mds/2020-2-G4
repo_url: https://api.github.com/repos/fga-eps-mds/2020-2-G4
action: opened
title: Prototype
labels: Design EPS Product Owner
body:
## In this issue, the following must be done:
- Create the customer registration screen
- Create the screen to view a list of customers
- Create the screen to update customer data
- Create the screen to view a single customer
- Create the category registration screen
- Create the screen to view a list of categories
- Create the screen to update categories
## Acceptance criteria:
- [ ] Validate the screens with the customer
index: 1.0
label: design
binary_label: 1
---
Unnamed: 0: 145,990
id: 22,840,246,797
type: IssuesEvent
created_at: 2022-07-12 20:58:51
repo: department-of-veterans-affairs/vets-design-system-documentation
repo_url: https://api.github.com/repos/department-of-veterans-affairs/vets-design-system-documentation
action: closed
title: OMB Info - Audit
labels: component-update vsp-design-system-team va-omb-info
body:
## Description
Audit of OMB Info. Identify as many instances of this component or pattern in use on VA.gov and share with the Design System Team.
## Tasks
- [ ] Work with engineers and the Governance team to find examples of this type of component
- [ ] Add screenshots of component usage examples to a Mural board, including links to sources
- [ ] Gather feedback on findings from the DST (Carol can help schedule a session for this). Questions to ask include:
- Are any changes to current implementations needed?
- Are there any variations that can be consolidated?
- Are there any accessibility considerations that should be taken into account as part of the design?
- [ ] Respond to feedback from DST
- [ ] Add link to Mural board as a comment in this ticket and to the component design ticket if applicable
## Acceptance Criteria
- [ ] Component usage examples have been collected on a Mural board and a link to the board has been added to this ticket
- [ ] Audit findings have been shared with the DST and all feedback has been addressed.
index: 1.0
label: design
binary_label: 1
---
Unnamed: 0: 650
id: 2,506,889,753
type: IssuesEvent
created_at: 2015-01-12 14:44:51
repo: mathjax/MathJax
repo_url: https://api.github.com/repos/mathjax/MathJax
action: closed
title: Alignment bug in 2.5
labels: Accepted Merged QA - Unit Test Wanted
body:
I found that equations that were centered in the column in 2.4 are now setting flush left in 2.5.


index: 1.0
label: non_design
binary_label: 0
---
Unnamed: 0: 3,260
id: 3,102,743,135
type: IssuesEvent
created_at: 2015-08-31 02:57:50
repo: deis/deis
repo_url: https://api.github.com/repos/deis/deis
action: closed
title: builder: Bad protocol version identification 'dummy-value' from 10.240.190.22
labels: builder
body:
When reading logs from builder, users may see something like:
```
Bad protocol version identification 'dummy-value' from 10.240.190.22
```
This is due to the changes made in #1940 - we had to refactor the ExecStartPost loops to use `ncat` instead of `cat /dev/tcp/...` because of the bash vulnerability fixed in [CoreOS 452.0.0](https://coreos.com/releases/#452.0.0) (specifically, bash no longer has direct network access). We need to pass something to ncat to send over the wire, so we use 'dummy-value'. We don't care that it's an invalid string - the import thing is that ncat can connect.
So, this is innocuous, but ideally we shouldn't need a string at all - maybe we can just send an end of transmission character?
index: 1.0
label: non_design
binary_label: 0
---
Unnamed: 0: 151,943
id: 12,066,786,652
type: IssuesEvent
created_at: 2020-04-16 12:22:34
repo: Oldes/Rebol-issues
repo_url: https://api.github.com/repos/Oldes/Rebol-issues
action: closed
title: Can't say to-money $1.00
labels: Datatype: money! Test.written Type.bug
body:
_Submitted by:_ _henrikmk_
```rebol
>> to-money $1
** Script error: cannot MAKE/TO money! from: $1.00
** Where: to to-money
** Near: to money! :value
```
---
<sup>**Imported from:** **[CureCode](https://www.curecode.org/rebol3/ticket.rsp?id=238)** [ Version: 1.0.0 Type: Bug Platform: All Category: n/a Reproduce: Always Fixed-in:none ]</sup>
<sup>**Imported from**: https://github.com/rebol/rebol-issues/issues/238</sup>
Comments:
---
---
> **Rebolbot** added the **Type.bug** on Jan 12, 2016
---
index: 1.0
label: non_design
binary_label: 0
---
Unnamed: 0: 122,772
id: 16,326,768,540
type: IssuesEvent
created_at: 2021-05-12 02:29:26
repo: Uniswap/uniswap-interface
repo_url: https://api.github.com/repos/Uniswap/uniswap-interface
action: closed
title: Position list item breaks with long strings
labels: bug design p1
body:
If the token symbol is long, the position list item UI will break.
<img width="952" alt="Screen Shot 2021-05-05 at 3 11 50 PM" src="https://user-images.githubusercontent.com/1355319/117196411-70049880-adb4-11eb-8b0f-8269d5fe9767.png">
index: 1.0
label: design
binary_label: 1
---
Unnamed: 0: 66,559
id: 8,031,690,222
type: IssuesEvent
created_at: 2018-07-28 05:10:48
repo: ngs-lang/ngs
repo_url: https://api.github.com/repos/ngs-lang/ngs
action: opened
title: Design NGS and modules versioning support
labels: modules needs-design
body:
* Installing different NGS versions - would they use same modules or each version of NGS will have it's own modules directory?
* What should be the modules directory/directories layout?
* Should `require()` support specifying the desired version? Maybe something as simple as `require("my_module/2.5.1/blah.ngs")`
* Should multimethods or methods be versioned?
* Should NGS support running different versions of the same module at the same time?
|
1.0
|
Design NGS and modules versioning support - * Installing different NGS versions - would they use same modules or each version of NGS will have it's own modules directory?
* What should be the modules directory/directories layout?
* Should `require()` support specifying the desired version? Maybe something as simple as `require("my_module/2.5.1/blah.ngs")`
* Should multimethods or methods be versioned?
* Should NGS support running different versions of the same module at the same time?
|
design
|
design ngs and modules versioning support installing different ngs versions would they use same modules or each version of ngs will have it s own modules directory what should be the modules directory directories layout should require support specifying the desired version maybe something as simple as require my module blah ngs should multimethods or methods be versioned should ngs support running different versions of the same module at the same time
| 1
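The versioned `require("my_module/2.5.1/blah.ngs")` idea in the NGS issue above can be sketched as a path lookup. This is an illustrative sketch only: the `<modules_dir>/<module>/<version>/<file>` layout and the `resolve_module` helper are assumptions, not NGS's actual design.

```python
# Illustrative sketch of one possible versioned-module lookup for a
# require("my_module/2.5.1/blah.ngs")-style spec. The on-disk layout
# <modules_dir>/<module>/<version>/<file> is an assumption, not NGS's design.
from pathlib import PurePosixPath

def resolve_module(modules_dir, spec):
    """Split '<module>/<version>/<file...>' and build the on-disk path."""
    parts = PurePosixPath(spec).parts
    if len(parts) < 3:
        raise ValueError("expected '<module>/<version>/<file>', got: " + spec)
    name, version, rest = parts[0], parts[1], parts[2:]
    return str(PurePosixPath(modules_dir, name, version, *rest))
```

Under this assumed layout, two installed versions of the same module simply live side by side, which also answers the "running different versions at the same time" question at the filesystem level.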
|
54,547
| 6,827,512,514
|
IssuesEvent
|
2017-11-08 17:15:15
|
zooniverse/Panoptes-Front-End
|
https://api.github.com/repos/zooniverse/Panoptes-Front-End
|
closed
|
Improve UX for Add to Collection
|
design enhancement ui
|
The Add to Collection feature is not as intuitive a user experience as it could be.
- The pill drawer where you add collections could contain all the collections that this subject is part of (and show them on load). If it were done that way, the X would actually remove the subject from the collection, and you wouldn't need the add button at all. The confusion is most obvious when you save an added collection, but you've previously added it (even though it wasn't showing up at all).
- The "Create new collection" part doesn't really need to be visible all the time. A more conventional way to do it would be to have an entry at the bottom of the list "New collection.." and the text field to create the new collection (and add to it) pops up when you click that.
- At the moment, Add collection closes the popup - but without adding your subject to that new collection. This is confusing because if you added a collection, you want to add to it. Either the window should stay open, or else it should auto-add to that collection before closing. (The former is better, since the text field is designed to support adding more than one collection at once).
|
1.0
|
Improve UX for Add to Collection - The Add to Collection feature is not as intuitive a user experience as it could be.
- The pill drawer where you add collections could contain all the collections that this subject is part of (and show them on load). If it were done that way, the X would actually remove the subject from the collection, and you wouldn't need the add button at all. The confusion is most obvious when you save an added collection, but you've previously added it (even though it wasn't showing up at all).
- The "Create new collection" part doesn't really need to be visible all the time. A more conventional way to do it would be to have an entry at the bottom of the list "New collection.." and the text field to create the new collection (and add to it) pops up when you click that.
- At the moment, Add collection closes the popup - but without adding your subject to that new collection. This is confusing because if you added a collection, you want to add to it. Either the window should stay open, or else it should auto-add to that collection before closing. (The former is better, since the text field is designed to support adding more than one collection at once).
|
design
|
improve ux for add to collection the add to collection feature is not as intuitive a user experience as it could be the pill drawer where you add collections could contain all the collections that this subject is part of and show them on load if it were done that way the x would actually remove the subject from the collection and you wouldn t need the add button at all the confusion is most obvious when you save an added collection but you ve previously added it even though it wasn t showing up at all the create new collection part doesn t really need to be visible all the time a more conventional way to do it would be to have an entry at the bottom of the list new collection and the text field to create the new collection and add to it pops up when you click that at the moment add collection closes the popup but without adding your subject to that new collection this is confusing because if you added a collection you want to add to it either the window should stay open or else it should auto add to that collection before closing the former is better since the text field is designed to support adding more that one collection at once
| 1
|
18,154
| 3,377,909,001
|
IssuesEvent
|
2015-11-25 07:52:38
|
owncloud/client
|
https://api.github.com/repos/owncloud/client
|
closed
|
Protocol / Activities: Double click on row should open local file manager of that folder (if existing)
|
approved by QA Design & UX
|
... and possibly select the file
|
1.0
|
Protocol / Activities: Double click on row should open local file manager of that folder (if existing) - ... and possibly select the file
|
design
|
protocol activities double click on row should open local file manager of that folder if existing and possibly select the file
| 1
|
23,376
| 3,835,836,304
|
IssuesEvent
|
2016-04-01 15:42:38
|
iTowns/itowns2
|
https://api.github.com/repos/iTowns/itowns2
|
opened
|
Classes doing work beyond their scope
|
design
|
There's a recurring design issue in the current implementation. A lot of classes do not limit themselves to their role and undertake actions out of their scope. This is very detrimental to the reusability of the classes.
Let's take [WMTS_Provider](https://github.com/iTowns/itowns2/blob/master/src/Core/Commander/Providers/WMTS_Provider.js) as an example.
The role of the WMTS provider is to implement the WMTS protocol. You give it server information, a tile coordinate, and it gives you an image. Nothing more, nothing less. No assumptions should be made on how the developer will use the class.
Currently, the WMTS provider has:
- Specific functions to retrieve elevation textures and height textures. The provider doesn't care what you'll use the texture for. If you want to fetch textures from different layers, just instantiate two providers.
- Manipulation of tile data. There lies the biggest problem: a class meant to retrieve images from a server is changing the model. Because of this, you can't reuse the class for other purposes than terrain tile creation. The code is harder to understand because the update logic is scattered across multiple classes. The role of the class is no longer clear.
I think the architecture of iTowns as it is now has potential. But the way some classes are implemented makes it really difficult to use and to integrate foreign code into it.
Partial list of classes affected by this problem, followed by the functions that are out of place:
- [ ] [WMTS_Provider](https://github.com/iTowns/itowns2/blob/master/src/Core/Commander/Providers/WMTS_Provider.js): see above
- [ ] [WMS_Provider](https://github.com/iTowns/itowns2/blob/master/src/Core/Commander/Providers/WMS_Provider.js): urlGlobalIR
- [ ] [Scene](https://github.com/iTowns/itowns2/blob/master/src/Scene/Scene.js): addImageryLayer, addElevationLayer
- [ ] [TileMesh](https://github.com/iTowns/itowns2/blob/master/src/Globe/TileMesh.js): use of WMTS coordinates
Debatable:
- [ ] [Quadtree](https://github.com/iTowns/itowns2/blob/master/src/Scene/Quadtree.js): up, down, upSubLayer seem out of place. Quadtree's role doesn't seem precisely defined at the moment (#54 has propositions regarding this)
|
1.0
|
Classes doing work beyond their scope - There's a recurring design issue in the current implementation. A lot of classes do not limit themselves to their role and undertake actions out of their scope. This is very detrimental to the reusability of the classes.
Let's take [WMTS_Provider](https://github.com/iTowns/itowns2/blob/master/src/Core/Commander/Providers/WMTS_Provider.js) as an example.
The role of the WMTS provider is to implement the WMTS protocol. You give it server information, a tile coordinate, and it gives you an image. Nothing more, nothing less. No assumptions should be made on how the developer will use the class.
Currently, the WMTS provider has:
- Specific functions to retrieve elevation textures and height textures. The provider doesn't care what you'll use the texture for. If you want to fetch textures from different layers, just instantiate two providers.
- Manipulation of tile data. There lies the biggest problem: a class meant to retrieve images from a server is changing the model. Because of this, you can't reuse the class for other purposes than terrain tile creation. The code is harder to understand because the update logic is scattered across multiple classes. The role of the class is no longer clear.
I think the architecture of iTowns as it is now has potential. But the way some classes are implemented makes it really difficult to use and to integrate foreign code into it.
Partial list of classes affected by this problem, followed by the functions that are out of place:
- [ ] [WMTS_Provider](https://github.com/iTowns/itowns2/blob/master/src/Core/Commander/Providers/WMTS_Provider.js): see above
- [ ] [WMS_Provider](https://github.com/iTowns/itowns2/blob/master/src/Core/Commander/Providers/WMS_Provider.js): urlGlobalIR
- [ ] [Scene](https://github.com/iTowns/itowns2/blob/master/src/Scene/Scene.js): addImageryLayer, addElevationLayer
- [ ] [TileMesh](https://github.com/iTowns/itowns2/blob/master/src/Globe/TileMesh.js): use of WMTS coordinates
Debatable:
- [ ] [Quadtree](https://github.com/iTowns/itowns2/blob/master/src/Scene/Quadtree.js): up, down, upSubLayer seem out of place. Quadtree's role doesn't seem precisely defined at the moment (#54 has propositions regarding this)
|
design
|
classes doing work beyond their scope there s a recurring design issue in the current implementation a lot of classes do no limit themselves to their role and undertake actions out of their scope this is very detrimental to the reusability of the classes let s take as an example the role of the wmts provider is to implement the wmts protocol you give it server information a tile coordinate and it gives you an image nothing more nothing less no assumptions should be made on how the developer will use the class currently the wmts provider has specific functions to retrieve elevation textures and height textures the provider doesn t care what you ll use the texture for if you want to fetch textures form different layers just instantiate two providers manipulation of tile data there lies the biggest problem a class meant to retrieve images from a server is changing the model because of this you can t reuse the class for other purposes than terrain tile creation the code is harder to understand because the update logic is scattered across multiple classes the role of the class is no longer clear i think the architecture of itowns as it is now has potential but the way some classes are implemented makes it really difficult to use and to integrate foreign code into it partial list of classes affected by this problem followed by the functions that are out of place see above urlglobalir addimagerylayer addelevationlayer use of wmts coordinates debatable up down upsublayer seem out of place quadtree s role doesn t seem precisely defined at the moment has propositions regarding this
| 1
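The narrow provider role argued for in the iTowns issue above (server information plus a tile coordinate in, an image out, with no model manipulation) can be sketched as follows. The class name and parameters are illustrative; only the general WMTS `GetTile` request shape is standard.

```python
# Sketch of the narrow provider role described above: given server info and
# a tile coordinate, produce the request for one tile and nothing else.
# Fetching, decoding, and any use of the texture are the caller's concern;
# the provider never touches tile meshes. Names are illustrative.
class WMTSProvider:
    def __init__(self, base_url, layer):
        self.base_url = base_url
        self.layer = layer

    def tile_url(self, zoom, row, col):
        """Build the GetTile request URL for one tile coordinate."""
        return (f"{self.base_url}?SERVICE=WMTS&REQUEST=GetTile"
                f"&LAYER={self.layer}&TILEMATRIX={zoom}"
                f"&TILEROW={row}&TILECOL={col}")
```

Fetching textures from two layers then means instantiating two providers, exactly as the issue suggests, rather than baking layer-specific helpers into one class.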
|
51,433
| 6,522,689,438
|
IssuesEvent
|
2017-08-29 04:22:41
|
FreeAndFair/ColoradoRLA
|
https://api.github.com/repos/FreeAndFair/ColoradoRLA
|
closed
|
Enforcing User Behavior
|
Ask CDOS blocked client design question ui/ux
|
How much will we be enforcing good behavior on the part of the user? This issue is a place where you can list opportunities we have for such enforcement.
- Will we disallow changes to the Risk Limit set originally by the Secretary of State? In other words, does the RLA Tool stop the SoS from logging in halfway through the audit to change the Risk Limit?
- What if ID and Ballot Style on Ballot Verification screen match the ID and Ballot Style on the paper ballot, but the contests on the screen do not match the contests on the paper ballot. What would we expect Audit Board users to do in this instance? Of course, if the meaning of the Ballot Styles is strictly enforced both in the Tool and on paper, this case will never arise.
Clicking the browser "back" arrow from the Ballot Review screen of the Audit section does not go back to the Ballot Verification screen. Instead, that action brings the system back to the "Hello, Acme County" screen. How can we prevent users from taking this bad, bad action?
|
1.0
|
Enforcing User Behavior - How much will we be enforcing good behavior on the part of the user? This issue is a place where you can list opportunities we have for such enforcement.
- Will we disallow changes to the Risk Limit set originally by the Secretary of State? In other words, does the RLA Tool stop the SoS from logging in halfway through the audit to change the Risk Limit?
- What if ID and Ballot Style on Ballot Verification screen match the ID and Ballot Style on the paper ballot, but the contests on the screen do not match the contests on the paper ballot. What would we expect Audit Board users to do in this instance? Of course, if the meaning of the Ballot Styles is strictly enforced both in the Tool and on paper, this case will never arise.
Clicking the browser "back" arrow from the Ballot Review screen of the Audit section does not go back to the Ballot Verification screen. Instead, that action brings the system back to the "Hello, Acme County" screen. How can we prevent users from taking this bad, bad action?
|
design
|
enforcing user behavior how much will we be enforcing good behavior on the part of the user this issue is a place where you can list opportunities we have for such enforcement will we disallow changes to the risk limit set originally by the secretary of state in other words does the rla tool stop the sos from logging in halfway through the audit to change the risk limit what if id and ballot style on ballot verification screen match the id and ballot style on the paper ballot but the contests on the screen do not match the contests on the paper ballot what would we expect audit board users to do in this instance of course if the meaning of the ballot styles is strictly enforced both in the tool and on paper this case will never arise clicking the browser back arrow from the ballot review screen of the audit section does not go back to the ballot verification screen instead that action brings the system back to the hello acme county screen how can we prevent users from taking this bad bad action
| 1
|
151,483
| 23,832,420,848
|
IssuesEvent
|
2022-09-05 23:35:22
|
towhee-io/towhee
|
https://api.github.com/repos/towhee-io/towhee
|
closed
|
[DesignProposal]: Support mmap in DataFrame
|
stale needs-triage kind/design-proposal
|
### Background and Motivation
DataCollection's map supports multiple functions; DataFrame's cmap should as well.
### Design
Follow the style of DataCollection
### Pros and Cons
_No response_
### Anything else? (Additional Context)
_No response_
|
1.0
|
[DesignProposal]: Support mmap in DataFrame - ### Background and Motivation
DataCollection's map supports multiple functions; DataFrame's cmap should as well.
### Design
Follow the style of DataCollection
### Pros and Cons
_No response_
### Anything else? (Additional Context)
_No response_
|
design
|
support mmap in dataframe background and motivation datacollection map supports multiple functions dataframe should aswell for cmap design follow the style of datacollection pros and cons no response anything else additional context no response
| 1
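The "map supports multiple functions" behavior that the Towhee proposal above wants DataFrame's cmap to mirror can be sketched generically. The class and method names here are illustrative, not Towhee's actual API.

```python
# Generic sketch of a map() that accepts several functions at once and
# applies each element-wise: one function yields a flat collection,
# several yield a collection of result tuples. Names are illustrative,
# not Towhee's actual API.
class DataCollection:
    def __init__(self, items):
        self._items = list(items)

    def map(self, *fns):
        """Apply each function to every element."""
        if len(fns) == 1:
            return DataCollection(fns[0](x) for x in self._items)
        return DataCollection(tuple(f(x) for f in fns) for x in self._items)

    def to_list(self):
        return list(self._items)
```

"Follow the style of DataCollection" then amounts to giving cmap the same variadic signature and tuple-of-results convention.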
|
74,982
| 9,181,617,158
|
IssuesEvent
|
2019-03-05 10:39:22
|
ipfs-shipyard/pm-idm
|
https://api.github.com/repos/ipfs-shipyard/pm-idm
|
closed
|
Include the "Feedback message" in the style guide
|
design task w-1
|
## Description
Include the "Feedback Message" component in the styleguide.
## Acceptance Criteria
- [x] Have the "Feedback Message" component with all its scenarios in the styleguide document.
|
1.0
|
Include the "Feedback message" in the style guide - ## Description
Include the "Feedback Message" component in the styleguide.
## Acceptance Criteria
- [x] Have the "Feedback Message" component with all its scenarios in the styleguide document.
|
design
|
include the feedback message in the style guide description include the feedback message component in the styleguide acceptance criteria have the feedback message component with all its scenarios in the styleguide document
| 1
|
69,045
| 8,369,359,527
|
IssuesEvent
|
2018-10-04 17:07:03
|
8bitPit/Niagara-Issues
|
https://api.github.com/repos/8bitPit/Niagara-Issues
|
closed
|
[Request] Black settings (Dark theme)
|
design
|
Hey,
Is it possible to have black Niagara settings when Dark theme is ticked?
Thanks
|
1.0
|
[Request] Black settings (Dark theme) - Hey,
Is it possible to have black Niagara settings when Dark theme is ticked?
Thanks
|
design
|
black settings dark theme hey is it possible to have black niagara settings when dark theme is tick thanks
| 1
|
631,698
| 20,157,944,579
|
IssuesEvent
|
2022-02-09 18:15:52
|
nevermined-io/sdk-js
|
https://api.github.com/repos/nevermined-io/sdk-js
|
closed
|
Add new graph events api support in the javascript sdk
|
enhancement priority:high sdk
|
Add support for the new GraphQL events api with Graph.
- Ability to query for past events
- Ability to subscribe to future events
|
1.0
|
Add new graph events api support in the javascript sdk - Add support for the new GraphQL events api with Graph.
- Ability to query for past events
- Ability to subscribe to future events
|
non_design
|
add new graph events api support in the javascript sdk add support for the new graphql events api with graph ability to query for past events ability to subscribe to future events
| 0
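The two capabilities listed in the issue above (querying past events and subscribing to future ones) map onto GraphQL's query and subscription operation types. A minimal sketch follows; the schema is hypothetical (the `events` field and its arguments are assumptions, not the actual Graph API).

```python
# Sketch of the two access patterns as plain GraphQL documents.
# The schema (field and argument names) is hypothetical.

# Past events: an ordinary query, typically sent as an HTTP POST.
PAST_EVENTS_QUERY = """
query PastEvents($first: Int!) {
  events(first: $first, orderBy: blockNumber, orderDirection: desc) {
    id
    blockNumber
  }
}
"""

# Future events: a subscription, typically run over a WebSocket transport.
FUTURE_EVENTS_SUBSCRIPTION = """
subscription NewEvents {
  events {
    id
    blockNumber
  }
}
"""
```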
|
124,989
| 16,680,722,450
|
IssuesEvent
|
2021-06-07 23:10:00
|
chapel-lang/chapel
|
https://api.github.com/repos/chapel-lang/chapel
|
opened
|
Proposal for `.indices` on arrays
|
area: Language type: Design
|
As of PR #15039 or so, we added `.indices` queries to most indexable types as a means of writing code for them that refers to their indices abstractly rather than assuming the indices are, say, `1..size` or `0..<size`. At that time, `.indices` became a synonym for `.domain` for arrays. Issue #15103 proposed that we retire `.domain` in favor of `.indices` on arrays, but was received fairly negatively.
[For historical purposes, this issue captures a fork in the discussion of #15103 that started at [this point](https://github.com/chapel-lang/chapel/issues/15103#issuecomment-853554218) ]
This issue proposes another way to remove the redundancy: What if, rather than returning the array's actual domain, this query returned (or perhaps yielded in some cases?) the array's index set locally. For example, for an n x n Block-distributed array named `B`, `B.domain` would return a (`const`) reference to `B`'s actual Block-distributed domain as it does today, whereas `B.indices` would return `{1..n, 1..n}` (or `{0..<n, 0..<n}` or whatever indices `B.domain` had) as a local default rectangular domain.
This would have the benefits of:
* providing a data-structure neutral way of getting a local copy of the indices of the data structure in question
* removing the "two ways of doing the exact same thing" case that we have today in which `.domain` and `.indices` do the same thing
* providing a user-facing way to ask for `B`'s index set without getting a block-distributed thing back or querying on a per-dimension basis—the only two ways we can do so today I believe.
An obvious disadvantage would be:
* `.indices` vs. `.domain` are similar, so learning to distinguish them is subtle and could have a significant performance impact when making the wrong choice (e.g., `var A: [B.indices] real;` vs. `var A: [B.domain] real;` or `forall i in B.indices` vs. `forall i in B.domain`).
One open question is whether `.indices` would have to return the indices in a closed form (say as a sparse domain for a sparse array or an associative domain for an associative array) or whether it would be reasonable for it to be implemented in a closed form in some cases (say, when the storage required is O(1), as for dense rectangular arrays) but as an iterator in others (say, when it's not, as for sparse / associative arrays).
|
1.0
|
Proposal for `.indices` on arrays - As of PR #15039 or so, we added `.indices` queries to most indexable types as a means of writing code for them that refers to their indices abstractly rather than assuming the indices are, say, `1..size` or `0..<size`. At that time, `.indices` became a synonym for `.domain` for arrays. Issue #15103 proposed that we retire `.domain` in favor of `.indices` on arrays, but was received fairly negatively.
[For historical purposes, this issue captures a fork in the discussion of #15103 that started at [this point](https://github.com/chapel-lang/chapel/issues/15103#issuecomment-853554218) ]
This issue proposes another way to remove the redundancy: What if, rather than returning the array's actual domain, this query returned (or perhaps yielded in some cases?) the array's index set locally. For example, for an n x n Block-distributed array named `B`, `B.domain` would return a (`const`) reference to `B`'s actual Block-distributed domain as it does today, whereas `B.indices` would return `{1..n, 1..n}` (or `{0..<n, 0..<n}` or whatever indices `B.domain` had) as a local default rectangular domain.
This would have the benefits of:
* providing a data-structure neutral way of getting a local copy of the indices of the data structure in question
* removing the "two ways of doing the exact same thing" case that we have today in which `.domain` and `.indices` do the same thing
* providing a user-facing way to ask for `B`'s index set without getting a block-distributed thing back or querying on a per-dimension basis—the only two ways we can do so today I believe.
An obvious disadvantage would be:
* `.indices` vs. `.domain` are similar, so learning to distinguish them is subtle and could have a significant performance impact when making the wrong choice (e.g., `var A: [B.indices] real;` vs. `var A: [B.domain] real;` or `forall i in B.indices` vs. `forall i in B.domain`).
One open question is whether `.indices` would have to return the indices in a closed form (say as a sparse domain for a sparse array or an associative domain for an associative array) or whether it would be reasonable for it to be implemented in a closed form in some cases (say, when the storage required is O(1), as for dense rectangular arrays) but as an iterator in others (say, when it's not, as for sparse / associative arrays).
|
design
|
proposal for indices on arrays as of pr or so we added indices queries to most indexable types as a means of writing code for them that refers to their indices abstractly rather than assuming the indices are say size or size at that time indices became a synonym for domain for arrays issue proposed that we retire domain in favor of indices on arrays but was received fairly negatively this issue proposes another way to remove the redundancy what if rather than returning the array s actual domain this query returned or perhaps yielded in some cases the array s index set locally for example for an n x n block distributed array named b b domain would return a const reference to b s actual block distributed domain as it does today whereas b indices would return n n or n n or whatever indices b domain had as a local default rectangular domain this would have the benefits of providing a data structure neutral way of getting a local copy of the indices of the data structure in question removing the two ways of doing the exact same thing case that we have today in which domain and indices do the same thing providing a user facing way to ask for b s index set without getting a block distributed thing back or querying on a per dimension basis—the only two ways we can do so today i believe an obvious disadvantage would be indices vs domain are similar so learning to distinguish them is subtle and could have a significant performance impact when making the wrong choice e g var a real vs var a real or forall i in b indices vs forall i in b domain one open question is whether indices would have to return the indices in a closed form say as a sparse domain for a sparse array or an associative domain for an associative array or whether it would be reasonable for it to be implemented in a closed form in some cases say when the storage required is o as for dense rectangular arrays but as an iterator in others say when it s not as for sparse associative arrays
| 1
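The `.domain` vs `.indices` distinction proposed in the Chapel issue above can be mimicked in a small Python analogy: `.domain` hands back the (possibly distributed) domain object itself, while `.indices` returns a plain local copy of the index set. All names here are illustrative; this is not Chapel code or semantics, just the shape of the proposal.

```python
# Python analogy for the proposed distinction: .domain returns a reference
# to the array's (possibly distributed) domain, .indices returns a local,
# non-distributed copy of the same index set. Names are illustrative.
class BlockDomain:
    """Stands in for a Block-distributed domain; one range per dimension."""
    def __init__(self, *ranges):
        self.ranges = ranges

class BlockArray:
    def __init__(self, domain):
        self._domain = domain

    @property
    def domain(self):
        return self._domain  # reference to the distributed domain itself

    @property
    def indices(self):
        # fresh local copy of the index set, cheap (O(1)-sized) for the
        # dense rectangular case discussed in the issue
        return tuple(range(r.start, r.stop) for r in self._domain.ranges)

# e.g. a 4 x 4 "Block-distributed" array over {1..4, 1..4}
B = BlockArray(BlockDomain(range(1, 5), range(1, 5)))
```

The performance caveat in the issue shows up directly here: building a new array over `B.indices` would lose the distribution that `B.domain` carries.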
|
179,302
| 30,215,790,541
|
IssuesEvent
|
2023-07-05 15:29:19
|
quicwg/multipath
|
https://api.github.com/repos/quicwg/multipath
|
closed
|
MUST for checking if peer has spare Connection IDs
|
has PR design
|
Section 4.1 of the Multipath draft states: _Following [[QUIC-TRANSPORT](https://quicwg.org/multipath/draft-ietf-quic-multipath.html#QUIC-TRANSPORT)], each endpoint uses NEW_CONNECTION_ID frames to issue usable connections IDs to reach it. Before an endpoint adds a new path by initiating path validation, it MUST check whether at least one unused Connection ID is available for each side._
I do not think it is possible to perform such checks reliably, as RFC 9000 allows the peer to start using spare Connection IDs at any moment.
In [RFC 9000 section 5.1.1](https://quicwg.org/base-drafts/rfc9000.html#name-issuing-connection-ids), we state that _an endpoint that initiates migration and requires non-zero-length connection IDs SHOULD ensure that the pool of connection IDs available to its peer allows the peer to use a new connection ID on migration, as the peer will be unable to respond if the pool is exhausted._
Considering that Multipath is an extension of v1, I think it might just be sufficient to use what we already have (probably adding a pointer to the above text in RFC 9000) and calling it a day.
|
1.0
|
MUST for checking if peer has spare Connection IDs - Section 4.1 of the Multipath draft states: _Following [[QUIC-TRANSPORT](https://quicwg.org/multipath/draft-ietf-quic-multipath.html#QUIC-TRANSPORT)], each endpoint uses NEW_CONNECTION_ID frames to issue usable connections IDs to reach it. Before an endpoint adds a new path by initiating path validation, it MUST check whether at least one unused Connection ID is available for each side._
I do not think it is possible to perform such checks reliably, as RFC 9000 allows the peer to start using spare Connection IDs at any moment.
In [RFC 9000 section 5.1.1](https://quicwg.org/base-drafts/rfc9000.html#name-issuing-connection-ids), we state that _an endpoint that initiates migration and requires non-zero-length connection IDs SHOULD ensure that the pool of connection IDs available to its peer allows the peer to use a new connection ID on migration, as the peer will be unable to respond if the pool is exhausted._
Considering that Multipath is an extension of v1, I think it might just be sufficient to use what we already have (probably adding a pointer to the above text in RFC 9000) and calling it a day.
|
design
|
must for checking if peer has spare connection ids section of the multipath draft states following each endpoint uses new connection id frames to issue usable connections ids to reach it before an endpoint adds a new path by initiating path validation it must check whether at least one unused connection id is available for each side i do not think it is possible to perform such checks reliably as rfc allows the peer to start using spare connection ids at any moment in we state that an endpoint that initiates migration and requires non zero length connection ids should ensure that the pool of connection ids available to its peer allows the peer to use a new connection id on migration as the peer will be unable to respond if the pool is exhausted considering that multipath is as extension of i think it might just be sufficient to use what we already have probably adding a pointer to the above text in rfc and calling it a day
| 1
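The check discussed in the multipath issue above ("at least one unused Connection ID available for each side") reduces to simple bookkeeping, with the caveat the issue raises: the result is only a snapshot, since the peer may start using its spare Connection IDs at any moment. A minimal sketch with assumed data structures, not taken from any QUIC implementation:

```python
# Minimal sketch of the bookkeeping behind the MUST in the draft text.
# The data structures are illustrative, not from any QUIC implementation.
def can_initiate_path(local_unused_cids, peer_unused_cids):
    """A new path needs one spare Connection ID for each side.

    As the issue points out, this is only a snapshot: RFC 9000 lets the
    peer consume its spare CIDs at any time, so a True result here does
    not guarantee the CIDs are still unused when path validation starts.
    """
    return len(local_unused_cids) >= 1 and len(peer_unused_cids) >= 1
```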
|
381,067
| 11,272,987,877
|
IssuesEvent
|
2020-01-14 15:47:40
|
flowkey/UIKit-cross-platform
|
https://api.github.com/repos/flowkey/UIKit-cross-platform
|
closed
|
Build Swift for x86 Android (also to support Android Emulator)
|
medium-priority
|
## Motivation
At the moment we can only build for `armv7a`. While this covers the great proportion of Swift devices on the market today, it'd be great to build for x86 devices as well for two main reasons:
- x86 Devices make up about 10% of the total Android market, which is about 200 million devices.
- The Android Emulator only really works with x86-built libraries. The ARM CPU emulation is so slow that it's basically unusable. This isn't a huge issue, but would be interesting particularly for things like automated tests.
## Proposed solution
"Somebody" needs to make a PR on Apple's Swift repo adding support for Android x86. I suspect this won't be anywhere near as involved as the original port, but it would be a non-trivial piece of work that as far as I know has not been attempted yet.
## Potential alternatives
- Wait for x86 devices to lose market share (if this happens to be the trend over time?)
|
1.0
|
Build Swift for x86 Android (also to support Android Emulator) - ## Motivation
At the moment we can only build for `armv7a`. While this covers the great proportion of Swift devices on the market today, it'd be great to build for x86 devices as well for two main reasons:
- x86 Devices make up about 10% of the total Android market, which is about 200 million devices.
- The Android Emulator only really works with x86-built libraries. The ARM CPU emulation is so slow that it's basically unusable. This isn't a huge issue, but would be interesting particularly for things like automated tests.
## Proposed solution
"Somebody" needs to make a PR on Apple's Swift repo adding support for Android x86. I suspect this won't be anywhere near as involved as the original port, but it would be a non-trivial piece of work that as far as I know has not been attempted yet.
## Potential alternatives
- Wait for x86 devices to lose market share (if this happens to be the trend over time?)
|
non_design
|
build swift for android also to support android emulator motivation at the moment we can only build for while this covers the great proportion of swift devices on the market today it d be great to build for devices as well for two main reasons devices make up about of the total android market which is about million devices the android emulator only really works with built libraries the arm cpu emulation is so slow that it s basically unusable this isn t a huge issue but would be interesting particularly for things like automated tests proposed solution somebody needs to make a pr on apple s swift repo adding support for android i suspect this won t be anywhere near as involved as the original port but it would be a non trivial piece of work that as far as i know has not been attempted yet potential alternatives wait for devices to lose market share if this happens to be the trend over time
| 0
|
109,770
| 11,648,627,119
|
IssuesEvent
|
2020-03-01 21:49:01
|
gatsbyjs/gatsby
|
https://api.github.com/repos/gatsbyjs/gatsby
|
reopened
|
gatsby-remark-images docs are inconsistent
|
stale? type: documentation
|
<!--
To make it easier for us to help you, please include as much useful information as possible.
Useful Links:
- Documentation: https://www.gatsbyjs.org/docs/
- Contributing: https://www.gatsbyjs.org/contributing/
Gatsby has several community support channels, try asking your question on:
- Discord: https://gatsby.dev/discord
- Spectrum: https://spectrum.chat/gatsby-js
- Twitter: https://twitter.com/gatsbyjs
Before opening a new issue, please search existing issues https://github.com/gatsbyjs/gatsby/issues
-->
## Summary
There's a bunch of scattered docs on using images in MDX and they're all doing things slightly differently.
Both incorrect:
https://www.gatsbyjs.org/docs/working-with-images-in-markdown/#using-the-mdx-plugin
https://www.gatsbyjs.org/packages/gatsby-plugin-mdx/#gatsby-remark-plugins
Correct:
https://www.gatsbyjs.org/docs/mdx/plugins/#gatsby-remark-plugins
The other docs are missing this key point:
> `gatsby-source-filesystem` needs to be pointed at wherever you have your images on disk, `gatsby-remark-images` needs to be both a sub-plugin of `gatsby-plugin-mdx` and a string entry in the plugins array, and `gatsby-plugin-sharp` can be included on its own.
### Motivation
Using gatsby-remark-images is super confusing.
|
1.0
|
gatsby-remark-images docs are inconsistent - <!--
To make it easier for us to help you, please include as much useful information as possible.
Useful Links:
- Documentation: https://www.gatsbyjs.org/docs/
- Contributing: https://www.gatsbyjs.org/contributing/
Gatsby has several community support channels, try asking your question on:
- Discord: https://gatsby.dev/discord
- Spectrum: https://spectrum.chat/gatsby-js
- Twitter: https://twitter.com/gatsbyjs
Before opening a new issue, please search existing issues https://github.com/gatsbyjs/gatsby/issues
-->
## Summary
There's a bunch of scattered docs on using images in MDX and they're all doing things slightly differently.
Both incorrect:
https://www.gatsbyjs.org/docs/working-with-images-in-markdown/#using-the-mdx-plugin
https://www.gatsbyjs.org/packages/gatsby-plugin-mdx/#gatsby-remark-plugins
Correct:
https://www.gatsbyjs.org/docs/mdx/plugins/#gatsby-remark-plugins
The other docs are missing this key point:
> `gatsby-source-filesystem` needs to be pointed at wherever you have your images on disk, `gatsby-remark-images` needs to be both a sub-plugin of `gatsby-plugin-mdx` and a string entry in the plugins array, and `gatsby-plugin-sharp` can be included on its own.
### Motivation
Using gatsby-remark-images is super confusing.
|
non_design
|
gatsby remark images docs are inconsistent to make it easier for us to help you please include as much useful information as possible useful links documentation contributing gatsby has several community support channels try asking your question on discord spectrum twitter before opening a new issue please search existing issues summary there s a bunch of scattered docs on using images in mdx and they re all doing things slightly differently both incorrect correct the other docs are missing this key point gatsby source filesystem needs to be pointed at wherever you have your images on disk gatsby remark images needs to be both a sub plugin of gatsby plugin mdx and a string entry in the plugins array and gatsby plugin sharp can be included on its own motivation using gatsby remark images is super confusing
| 0
|
5,986
| 2,798,669,323
|
IssuesEvent
|
2015-05-12 19:50:04
|
mozilla/webmaker-android
|
https://api.github.com/repos/mozilla/webmaker-android
|
closed
|
Better selected page state UI
|
design enhancement
|
Can we get something more prominent than this? The shadow doesn't stand out enough when the grid is filled.

|
1.0
|
Better selected page state UI - Can we get something more prominent than this? The shadow doesn't stand out enough when the grid is filled.

|
design
|
better selected page state ui can we get something more prominent than this the shadow doesn t stand out enough when the grid is filled
| 1
|
147,775
| 19,524,101,495
|
IssuesEvent
|
2021-12-30 02:30:14
|
mireillehulent/maven-project
|
https://api.github.com/repos/mireillehulent/maven-project
|
opened
|
CVE-2021-44832 (Medium) detected in log4j-core-2.6.1.jar
|
security vulnerability
|
## CVE-2021-44832 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-core-2.6.1.jar</b></p></summary>
<p>The Apache Log4j Implementation</p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /epository/org/apache/logging/log4j/log4j-core/2.6.1/log4j-core-2.6.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **log4j-core-2.6.1.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Log4j2 versions 2.0-beta7 through 2.17.0 (excluding security fix releases 2.3.2 and 2.12.4) are vulnerable to a remote code execution (RCE) attack where an attacker with permission to modify the logging configuration file can construct a malicious configuration using a JDBC Appender with a data source referencing a JNDI URI which can execute remote code. This issue is fixed by limiting JNDI data source names to the java protocol in Log4j2 versions 2.17.1, 2.12.4, and 2.3.2.
<p>Publish Date: 2021-12-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-44832>CVE-2021-44832</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://logging.apache.org/log4j/2.x/security.html">https://logging.apache.org/log4j/2.x/security.html</a></p>
<p>Release Date: 2021-12-28</p>
<p>Fix Resolution: org.apache.logging.log4j:log4j-core:2.3.2,2.12.4,2.17.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-44832 (Medium) detected in log4j-core-2.6.1.jar - ## CVE-2021-44832 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-core-2.6.1.jar</b></p></summary>
<p>The Apache Log4j Implementation</p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /epository/org/apache/logging/log4j/log4j-core/2.6.1/log4j-core-2.6.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **log4j-core-2.6.1.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Log4j2 versions 2.0-beta7 through 2.17.0 (excluding security fix releases 2.3.2 and 2.12.4) are vulnerable to a remote code execution (RCE) attack where an attacker with permission to modify the logging configuration file can construct a malicious configuration using a JDBC Appender with a data source referencing a JNDI URI which can execute remote code. This issue is fixed by limiting JNDI data source names to the java protocol in Log4j2 versions 2.17.1, 2.12.4, and 2.3.2.
<p>Publish Date: 2021-12-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-44832>CVE-2021-44832</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://logging.apache.org/log4j/2.x/security.html">https://logging.apache.org/log4j/2.x/security.html</a></p>
<p>Release Date: 2021-12-28</p>
<p>Fix Resolution: org.apache.logging.log4j:log4j-core:2.3.2,2.12.4,2.17.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_design
|
cve medium detected in core jar cve medium severity vulnerability vulnerable library core jar the apache implementation path to dependency file pom xml path to vulnerable library epository org apache logging core core jar dependency hierarchy x core jar vulnerable library vulnerability details apache versions through excluding security fix releases and are vulnerable to a remote code execution rce attack where an attacker with permission to modify the logging configuration file can construct a malicious configuration using a jdbc appender with a data source referencing a jndi uri which can execute remote code this issue is fixed by limiting jndi data source names to the java protocol in versions and publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache logging core step up your open source security game with whitesource
| 0
|
156,130
| 24,575,474,767
|
IssuesEvent
|
2022-10-13 12:00:57
|
Lundalogik/lime-elements
|
https://api.github.com/repos/Lundalogik/lime-elements
|
closed
|
`isolate` components that use `z-index`
|
bug good first issue usability visual design released on @next
|
## Current behavior
Using `z-index` has always created problems in the UI as components overlap each other. For example 👇

## Expected behavior
However, it is possible to prevent this by isolating the components.
Read more: https://developer.mozilla.org/en-US/docs/Web/CSS/isolation
Then we won't need to import the `z-index.scss` file neither, and do all that bookkeeping.
## Environment
- **lime-elements** version: <!-- Version set in package.json -->
- Framework used: <!-- The framework or similar used to consume lime-elements (Angular, StencilJS, Polymer) -->
- Logs: <!-- any relevant logs or error messages -->
|
1.0
|
`isolate` components that use `z-index` - ## Current behavior
Using `z-index` has always created problems in the UI as components overlap each other. For example 👇

## Expected behavior
However, it is possible to prevent this by isolating the components.
Read more: https://developer.mozilla.org/en-US/docs/Web/CSS/isolation
Then we won't need to import the `z-index.scss` file neither, and do all that bookkeeping.
## Environment
- **lime-elements** version: <!-- Version set in package.json -->
- Framework used: <!-- The framework or similar used to consume lime-elements (Angular, StencilJS, Polymer) -->
- Logs: <!-- any relevant logs or error messages -->
|
design
|
isolate components that use z index current behavior using z index has always created problems in the ui as components overlap each other for example 👇 expected behavior however it is possible to prevent this by isolating the components read more then we won t need to import the z index scss file neither and do all that bookkeeping environment lime elements version framework used logs
| 1
|
248,947
| 18,858,139,210
|
IssuesEvent
|
2021-11-12 09:25:42
|
arcturusz/pe
|
https://api.github.com/repos/arcturusz/pe
|
opened
|
Very long command makes it harder to read sequence diagram
|
type.DocumentationBug severity.VeryLow
|
Sequence diagram of the command component can be simplified further? Instead of writing out all the parameters when calling the function, can be simplified to just execute().

<!--session: 1636702723886-a08a79fe-47e2-4e4d-bbbf-871302f6ba4c-->
<!--Version: Web v3.4.1-->
|
1.0
|
Very long command makes it harder to read sequence diagram - Sequence diagram of the command component can be simplified further? Instead of writing out all the parameters when calling the function, can be simplified to just execute().

<!--session: 1636702723886-a08a79fe-47e2-4e4d-bbbf-871302f6ba4c-->
<!--Version: Web v3.4.1-->
|
non_design
|
very long command makes it harder to read sequence diagram sequence diagram of the command component can be simplified further instead of writing out all the parameters when calling the function can be simplified to just execute
| 0
|
126,113
| 16,977,580,221
|
IssuesEvent
|
2021-06-30 02:53:56
|
wpilibsuite/frc-docs
|
https://api.github.com/repos/wpilibsuite/frc-docs
|
closed
|
Document GradleRIO capabilities
|
Needs Design Decision Type: New Article
|
I can work on this, I just need some guidance as to structure.
@Daltz333 - where should this be? I assume a section under Software Tools?
Also, what structure do we want this in? A doc outlining every task might be appropriate, but it might be a bit redundant because of `./gradlew tasks`. Maybe denote only the main useful tasks?
There are a few tasks that help configure CLion - is there a problem documenting it on frc-docs, given that CLion support is unofficial?
|
1.0
|
Document GradleRIO capabilities - I can work on this, I just need some guidance as to structure.
@Daltz333 - where should this be? I assume a section under Software Tools?
Also, what structure do we want this in? A doc outlining every task might be appropriate, but it might be a bit redundant because of `./gradlew tasks`. Maybe denote only the main useful tasks?
There are a few tasks that help configure CLion - is there a problem documenting it on frc-docs, given that CLion support is unofficial?
|
design
|
document gradlerio capabilities i can work on this i just need some guidance as to structure where should this be i assume a section under software tools also what structure do we want this in a doc outlining every task might be appropriate but it might be a bit redundant because of gradlew tasks maybe denote only the main useful tasks there are a few tasks that help configure clion is there a problem documenting it on frc docs given that clion support is unofficial
| 1
|
63,667
| 14,656,757,459
|
IssuesEvent
|
2020-12-28 14:07:53
|
fu1771695yongxie/Boostnote
|
https://api.github.com/repos/fu1771695yongxie/Boostnote
|
opened
|
WS-2019-0307 (Medium) detected in mem-1.1.0.tgz
|
security vulnerability
|
## WS-2019-0307 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mem-1.1.0.tgz</b></p></summary>
<p>Memoize functions - An optimization used to speed up consecutive function calls by caching the result of calls with identical input</p>
<p>Library home page: <a href="https://registry.npmjs.org/mem/-/mem-1.1.0.tgz">https://registry.npmjs.org/mem/-/mem-1.1.0.tgz</a></p>
<p>Path to dependency file: Boostnote/package.json</p>
<p>Path to vulnerable library: Boostnote/node_modules/mem/package.json</p>
<p>
Dependency Hierarchy:
- jest-22.4.4.tgz (Root Library)
- jest-cli-22.4.4.tgz
- yargs-10.1.2.tgz
- os-locale-2.1.0.tgz
- :x: **mem-1.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/Boostnote/commit/11e9dea15d65583ed8c71129e7f8b2449bca784a">11e9dea15d65583ed8c71129e7f8b2449bca784a</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In 'mem' before v4.0.0 there is a Denial of Service (DoS) vulnerability as a result of a failure in removal old values from the cache.
<p>Publish Date: 2018-08-27
<p>URL: <a href=https://github.com/sindresorhus/mem/commit/da4e4398cb27b602de3bd55f746efa9b4a31702b>WS-2019-0307</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1084">https://www.npmjs.com/advisories/1084</a></p>
<p>Release Date: 2019-12-01</p>
<p>Fix Resolution: mem - 4.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2019-0307 (Medium) detected in mem-1.1.0.tgz - ## WS-2019-0307 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mem-1.1.0.tgz</b></p></summary>
<p>Memoize functions - An optimization used to speed up consecutive function calls by caching the result of calls with identical input</p>
<p>Library home page: <a href="https://registry.npmjs.org/mem/-/mem-1.1.0.tgz">https://registry.npmjs.org/mem/-/mem-1.1.0.tgz</a></p>
<p>Path to dependency file: Boostnote/package.json</p>
<p>Path to vulnerable library: Boostnote/node_modules/mem/package.json</p>
<p>
Dependency Hierarchy:
- jest-22.4.4.tgz (Root Library)
- jest-cli-22.4.4.tgz
- yargs-10.1.2.tgz
- os-locale-2.1.0.tgz
- :x: **mem-1.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/Boostnote/commit/11e9dea15d65583ed8c71129e7f8b2449bca784a">11e9dea15d65583ed8c71129e7f8b2449bca784a</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In 'mem' before v4.0.0 there is a Denial of Service (DoS) vulnerability as a result of a failure in removal old values from the cache.
<p>Publish Date: 2018-08-27
<p>URL: <a href=https://github.com/sindresorhus/mem/commit/da4e4398cb27b602de3bd55f746efa9b4a31702b>WS-2019-0307</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1084">https://www.npmjs.com/advisories/1084</a></p>
<p>Release Date: 2019-12-01</p>
<p>Fix Resolution: mem - 4.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_design
|
ws medium detected in mem tgz ws medium severity vulnerability vulnerable library mem tgz memoize functions an optimization used to speed up consecutive function calls by caching the result of calls with identical input library home page a href path to dependency file boostnote package json path to vulnerable library boostnote node modules mem package json dependency hierarchy jest tgz root library jest cli tgz yargs tgz os locale tgz x mem tgz vulnerable library found in head commit a href found in base branch master vulnerability details in mem before there is a denial of service dos vulnerability as a result of a failure in removal old values from the cache publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution mem step up your open source security game with whitesource
| 0
|
151,481
| 23,832,326,132
|
IssuesEvent
|
2022-09-05 23:22:45
|
WordPress/gutenberg
|
https://api.github.com/repos/WordPress/gutenberg
|
closed
|
Make it easier to quickly turn duotone on/off
|
[Type] Enhancement [Feature] Design Tools
|
## What problem does this address?
Currently, due to a the following reasons, it's a bit tricky to easily turn on/off the duotone filter while continuing to try them out:
- The duotone popover isn't persistent and often closes once more causing you to have to open it again.
- It's not quite clear that clicking again on the same color option will turn duotone off.
- The pop-over covers the image often times making it sometimes tricky to see at a glance what on/off might look like quickly.
- If you add a custom color to a current duotone filter, this flow gets even more difficult as the customization doesn't get reflected in the presets.
Improving this would be advantageous as it'll help folks compare/contrast quickly between the original.
General 5.9 experience:
https://user-images.githubusercontent.com/26996883/152717724-ca554cd3-5fc2-41dc-b8c2-bc5d79369839.mov
5.9 experience with custom colors added where you can see it's hard to tell which filter was customized:
https://user-images.githubusercontent.com/26996883/152719041-fd9f655f-f251-4568-ae1b-b65cd56beaf0.mov
This initial feedback came up as part of the [latest FSE Outreach Program's All Things Media exploration](https://make.wordpress.org/test/2022/02/02/fse-program-exploration-all-things-media/#comment-2303) from @paaljoachim:
>Turn duotone filter on and off. To easily see the before and after effects. (In addition to reset.)
## What is your proposed solution?
I think there are two technical considerations: the persistence of the pop over & ensuring the duotone filter being altered is properly reflected to make it easier to toggle on/off cc @ajlende. From there, I think there are some design questions around how obtrusive or not the duotone popover is and how the experience could make it easier to turn on/off. Perhaps it means add a "clear" option to the list of colors! I'm not quite sure. Here's a very quick visual of what I just described (see lots of problems with this too):
<img width="365" alt="Screen Shot 2022-02-06 at 7 21 54 PM" src="https://user-images.githubusercontent.com/26996883/152719464-b40d77e6-842c-4181-9ef8-ce88560dd628.png">
Could be neat to take some inspiration from tools like Lightroom where, on the mobile app, if you hold down your finger on the image you're editing, it'll toggle back to the original version. Obviously can't duplicate that here but I think seeing what other tools allow here could be interesting since work is planned to evolve these image editing options.
cc @jasmussen who I know has thought lots about these tools :)
|
1.0
|
Make it easier to quickly turn duotone on/off - ## What problem does this address?
Currently, due to a the following reasons, it's a bit tricky to easily turn on/off the duotone filter while continuing to try them out:
- The duotone popover isn't persistent and often closes once more causing you to have to open it again.
- It's not quite clear that clicking again on the same color option will turn duotone off.
- The pop-over covers the image often times making it sometimes tricky to see at a glance what on/off might look like quickly.
- If you add a custom color to a current duotone filter, this flow gets even more difficult as the customization doesn't get reflected in the presets.
Improving this would be advantageous as it'll help folks compare/contrast quickly between the original.
General 5.9 experience:
https://user-images.githubusercontent.com/26996883/152717724-ca554cd3-5fc2-41dc-b8c2-bc5d79369839.mov
5.9 experience with custom colors added where you can see it's hard to tell which filter was customized:
https://user-images.githubusercontent.com/26996883/152719041-fd9f655f-f251-4568-ae1b-b65cd56beaf0.mov
This initial feedback came up as part of the [latest FSE Outreach Program's All Things Media exploration](https://make.wordpress.org/test/2022/02/02/fse-program-exploration-all-things-media/#comment-2303) from @paaljoachim:
>Turn duotone filter on and off. To easily see the before and after effects. (In addition to reset.)
## What is your proposed solution?
I think there are two technical considerations: the persistence of the pop over & ensuring the duotone filter being altered is properly reflected to make it easier to toggle on/off cc @ajlende. From there, I think there are some design questions around how obtrusive or not the duotone popover is and how the experience could make it easier to turn on/off. Perhaps it means add a "clear" option to the list of colors! I'm not quite sure. Here's a very quick visual of what I just described (see lots of problems with this too):
<img width="365" alt="Screen Shot 2022-02-06 at 7 21 54 PM" src="https://user-images.githubusercontent.com/26996883/152719464-b40d77e6-842c-4181-9ef8-ce88560dd628.png">
Could be neat to take some inspiration from tools like Lightroom where, on the mobile app, if you hold down your finger on the image you're editing, it'll toggle back to the original version. Obviously can't duplicate that here but I think seeing what other tools allow here could be interesting since work is planned to evolve these image editing options.
cc @jasmussen who I know has thought lots about these tools :)
|
design
|
make it easier to quickly turn duotone on off what problem does this address currently due to a the following reasons it s a bit tricky to easily turn on off the duotone filter while continuing to try them out the duotone popover isn t persistent and often closes once more causing you to have to open it again it s not quite clear that clicking again on the same color option will turn duotone off the pop over covers the image often times making it sometimes tricky to see at a glance what on off might look like quickly if you add a custom color to a current duotone filter this flow gets even more difficult as the customization doesn t get reflected in the presets improving this would be advantageous as it ll help folks compare contrast quickly between the original general experience experience with custom colors added where you can see it s hard to tell which filter was customized this initial feedback came up as part of the from paaljoachim turn duotone filter on and off to easily see the before and after effects in addition to reset what is your proposed solution i think there are two technical considerations the persistence of the pop over ensuring the duotone filter being altered is properly reflected to make it easier to toggle on off cc ajlende from there i think there are some design questions around how obtrusive or not the duotone popover is and how the experience could make it easier to turn on off perhaps it means add a clear option to the list of colors i m not quite sure here s a very quick visual of what i just described see lots of problems with this too img width alt screen shot at pm src could be neat to take some inspiration from tools like lightroom where on the mobile app if you hold down your finger on the image you re editing it ll toggle back to the original version obviously can t duplicate that here but i think seeing what other tools allow here could be interesting since work is planned to evolve these image editing options cc jasmussen who i know has thought lots about these tools
| 1
|
106,768
| 13,383,950,882
|
IssuesEvent
|
2020-09-02 11:11:39
|
Altinn/altinn-studio
|
https://api.github.com/repos/Altinn/altinn-studio
|
closed
|
502 responses when deploying a Designer pod after AKS upgrade
|
kind/bug ops/ci-cd solution/studio/designer
|
## Describe the bug
After we upgraded our AKS cluster for dev.studio we are experiencing issues when deploying the designer pod.
## To Reproduce
Steps to reproduce the behavior:
1. Deploy a designer pod
2. See that we are unable to reach the pod - 502.
## Expected behavior
Should take over for the older pod as expected.
## Cause / resolution
We are not sure about the cause. The process got stuck on config.Build(),
we suspect this is due to the process scanning all the files on the base path..
Up until now we have manipulated the basepath and set it to "/"
Resolution was to use original basepath /app, and move the appsettings file to this path as well.
Seems to have solved the problem for now.
|
1.0
|
502 responses when deploying a Designer pod after AKS upgrade - ## Describe the bug
After we upgraded our AKS cluster for dev.studio we are experiencing issues when deploying the designer pod.
## To Reproduce
Steps to reproduce the behavior:
1. Deploy a designer pod
2. See that we are unable to reach the pod - 502.
## Expected behavior
Should take over for the older pod as expected.
## Cause / resolution
We are not sure about the cause. The process got stuck on config.Build(),
we suspect this is due to the process scanning all the files on the base path..
Up until now we have manipulated the basepath and set it to "/"
Resolution was to use original basepath /app, and move the appsettings file to this path as well.
Seems to have solved the problem for now.
|
design
|
responses when deploying a designer pod after aks upgrade describe the bug after we upgraded our aks cluster for dev studio we are experiencing issues when deploying the designer pod to reproduce steps to reproduce the behavior deploy a designer pod see that we are unable to reach the pod expected behavior should take over for the older pod as expected cause resolution we are not sure about the cause the process got stuck on config build we suspect this is due to the process scanning all the files on the base path up until now we have manipulated the basepath and set it to resolution was to use original basepath app and move the appsettings file to this path as well seems to have solved the problem for now
| 1
|
183,121
| 31,159,989,158
|
IssuesEvent
|
2023-08-16 15:21:41
|
MetPX/sarracenia
|
https://api.github.com/repos/MetPX/sarracenia
|
closed
|
Dependency Management Strategies...
|
bug enhancement Design Developer Discussion_Needed crasher
|
# The Problem
Sarracenia uses a lot of other packages to provide functionality. These are called *dependencies*. In it's native environment (Ubuntu Linux) most of these dependencies are easily resolved using the built-in debian packaging tools (apt-get.) but in many other environments, It is more complex. like: https://xkcd.com/1987/ Even in environments where dependencies are installed *somewhere* it is not always clear which ones are available to a given program.
On redhat-8, for example, there does not seem to be a wide variety of python packages available in operating system repositories. Rather the specific minimal packages needed for the OS's own needs of python are all that seem to be available. This makes it challenging to install on redhat, as one now has to package many dependencies as well as the main package. The typical approach is to hunt for individual dependencies in different third party repositories, or rebuild them from source... This is a bit haphazard, and in some cases, like watchdog or dateparser, the package itself has dependencies and one ends up having to create dozens of python packages.
On redhat, as in many other environments, it seems more practical to use python native packaging, rather than the incomplete OS ones, as they do dependency resolution, and all the dependencies can be brought in using pip. The result of this, if done system-wide, is a mix of Distro packages, and pip provided packages, which complicates auditing and patching. System Administrators may also object to the use of pip packages in the base operating system.
Windows is another example of an environment where pre-existing package availability is unclear. On windows, the natural distribution format would be a self-extracting EXE, but use of plugins with such a method is unclear, and all the dependencies need to be packaged within it. People also install python *distributions* (ActiveState, Anaconda, or the more traditional CPython), and those will each have their own installation methods.
The complications mostly arise from dependencies such as xattr, python3-magic, watchdog, etc., that is, packages that are wrappers around C libraries or use C libraries as part of their implementation. In these cases, pure python packaging often fails, as more environmental support is needed. For example, the python-magic python package requires the C library libmagic1 to be installed. If using OS packages, this is just an additional dependency, no problem, but with pip, it will just fail, and the user needs to find the OS package, install that, and then try installing the python package again.
Another complication is that all these different platforms have their own installation methods, so it is not obvious what advice to provide to users when a dependency is missing ("pip install? conda install? apt install? yum install?") ... the package naming conventions vary by distribution, and are different from the module names used to test their presence.
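As a rough illustration of turning that missing-dependency guessing game into actionable advice, a small import-time probe could look like this. This is a hypothetical helper, not something sarracenia ships, and the package names passed in are made up:

```python
import importlib.util
import sys

# Hypothetical helper (not part of sarracenia): check whether an optional
# module is importable and, if not, print install advice appropriate to
# the current platform. The package names given are illustrative only.
def check_dependency(module, pip_pkg, apt_pkg=None):
    if importlib.util.find_spec(module) is not None:
        return True
    hints = [f"pip install {pip_pkg}"]
    if apt_pkg and sys.platform.startswith("linux"):
        hints.append(f"apt install {apt_pkg}")
    print(f"optional module '{module}' missing; try: " + " or ".join(hints))
    return False

check_dependency("json", "json")                    # stdlib module: present
check_dependency("nonexistent_mod_xyz", "xyz-pkg")  # prints install advice
```

Something in this shape could at least centralize the per-platform advice, rather than leaving the user to guess which package manager applies.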
## Approaches to Dependency Management
### Manual Tailoring
For HPC (which runs redhat 8.x) there are a few dependencies brought in by EPEL packages, some built from source, but some had to be left out. The setup.py file, when building packages on redhat, is typically hand edited to work around packages that are not available. So manual editing of packages is done. After the RPM is generated, it is then tested on another system, under a different user, to see whether it runs (as the local user doing the build may have pip packages which provide deps not available to others).
implementation: manual editing of setup.py to remove dependencies.
### (Mostly) Silent Disable
Looking at xattr, the *import* is in a try/except, and if it fails, the storing of metadata in extended file attributes is disabled. There is a loss of functionality or a different behaviour on these systems as a result. There is no way to query the system for which *degrades* are active, and nothing prompts the user about what to do to address it, if they want to.
implementation in filemetadata.py:
```
try:
    import xattr
    supports_extended_attributes = True
except:
    supports_extended_attributes = False
```
There are also tests in sarracenia/__init__.py for the code to degrade/understand when dependencies are missing:
```
extras = {
    'amqp': {'modules_needed': ['amqp'], 'present': False,
             'lament': 'will not be able to connect to rabbitmq brokers'},
    'appdirs': {'modules_needed': ['appdirs'], 'present': False,
                'lament': 'will assume linux file placement under home dir'},
    'ftppoll': {'modules_needed': ['dateparser', 'pytz'], 'present': False,
                'lament': 'will not be able to poll with ftp'},
    'humanize': {'modules_needed': ['humanize'], 'present': False,
                 'lament': 'humans will have to read larger, uglier numbers'},
    'mqtt': {'modules_needed': ['paho.mqtt.client'], 'present': False,
             'lament': 'will not be able to connect to mqtt brokers'},
    'filetypes': {'modules_needed': ['magic'], 'present': False,
                  'lament': 'will not be able to set content headers'},
    'vip': {'modules_needed': ['netifaces'], 'present': False,
            'lament': 'will not be able to use the vip option for high availability clustering'},
    'watch': {'modules_needed': ['watchdog'], 'present': False,
              'lament': 'cannot watch directories'}
}

for x in extras:
    extras[x]['present'] = True
    for y in extras[x]['modules_needed']:
        try:
            if importlib.util.find_spec(y):
                #logger.debug( f'found feature {y}, enabled')
                pass
            else:
                logger.debug(f"extra feature {x} needs missing module {y}. Disabled")
                extras[x]['present'] = False
        except:
            logger.debug(f"extra feature {x} needs missing module {y}. Disabled")
            extras[x]['present'] = False
```
### Demotion to Extras
The Python packaging tools have a concept of extras, sort of the inverse of *batteries included*... in setup.py one can declare extras that become available when additional dependencies are installed:
```
extras = {
    'amqp': ["amqp"],
    'filetypes': ["python-magic"],
    'ftppoll': ['dateparser'],
    'mqtt': ['paho.mqtt>=1.5.1'],
    'vip': ['netifaces'],
    'redis': ['redis']
}
extras['all'] = list(itertools.chain.from_iterable(extras.values()))
```
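For illustration, the `all` extra above just flattens the per-feature dependency lists into one; a minimal runnable sketch of that computation:

```python
import itertools

# Minimal illustration of the 'all' extra computed above: flatten every
# per-feature dependency list into one combined list (insertion order).
extras = {
    'amqp': ['amqp'],
    'mqtt': ['paho.mqtt>=1.5.1'],
    'vip': ['netifaces'],
}
extras['all'] = list(itertools.chain.from_iterable(extras.values()))
print(extras['all'])  # ['amqp', 'paho.mqtt>=1.5.1', 'netifaces']
```

A user would then opt in with something like `pip install metpx-sr3[amqp]`, or `[all]` to pull everything (the exact PyPI project name here is an assumption).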
### Platform Dependent Deps
One can add dependencies that vary depending on the platform being installed on:
```
install_requires=[
"appdirs", "humanfriendly", "humanize", "jsonpickle", "paramiko",
"psutil>=5.3.0", "watchdog",
'xattr ; sys_platform!="win32"', 'python-magic; sys_platform!="win32"',
'python-magic-bin; sys_platform=="win32"'
],
```
(This is in the [v03_issue721_platdep](https://github.com/MetPX/sarracenia/tree/v03_issue721_platdep) branch.)
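The environment markers above amount to a platform switch; the sketch below mirrors them in plain Python for illustration (the helper name is made up, and the dependency lists are abbreviated):

```python
import sys

# Illustration only: the same platform split expressed in plain Python,
# mirroring the environment markers in the install_requires list above.
def platform_deps(platform=None):
    platform = platform or sys.platform
    deps = ["appdirs", "psutil>=5.3.0", "watchdog"]  # common to all platforms
    if platform == "win32":
        deps.append("python-magic-bin")   # bundles the libmagic binary
    else:
        deps += ["xattr", "python-magic"] # rely on OS-provided libraries
    return deps

print(platform_deps("win32"))
print(platform_deps("linux"))
```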
## What do we do?
So all of the approaches above (and perhaps others?) are used in the code, and someone using an installation will have a subset of functionality available, yet sr3 has no way of reporting what is available or not. There is a branch https://github.com/MetPX/sarracenia/pull/738 that provides an example report of available modules using an *sr3 extras* command.
Should we at least report what is working, and what isn't? An additional problem is that configured plugins may have additional dependencies. The mechanism in the pull request also provides a way for plugins to *register* those, so they show up in the inventory command.
Is this a reasonable/advisable approach?
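A hedged sketch of what such a report could look like, assuming a feature table in the shape shown earlier. This is not the code from the pull request, and the demo table and module names below are made up:

```python
import importlib.util

# Sketch only: given a feature table in the shape sarracenia already
# uses, build a readable report of which optional features are enabled
# and why the disabled ones are disabled.
def features_report(extras):
    lines = []
    for name, info in sorted(extras.items()):
        missing = [m for m in info['modules_needed']
                   if importlib.util.find_spec(m) is None]
        if missing:
            lines.append(f"{name}: disabled (missing {', '.join(missing)}) - {info['lament']}")
        else:
            lines.append(f"{name}: enabled")
    return "\n".join(lines)

demo = {
    'vip': {'modules_needed': ['no_such_module_xyz'],
            'lament': 'no vip option for high availability'},
    'watch': {'modules_needed': ['os'],
              'lament': 'cannot watch directories'},
}
print(features_report(demo))
```

Plugins could append their own entries to the same table at registration time, so one inventory command covers both built-in extras and plugin dependencies.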
|
1.0
|
design
| 1
|
60,703
| 7,374,562,793
|
IssuesEvent
|
2018-03-13 20:45:18
|
dbrgn/mcp3425-rs
|
https://api.github.com/repos/dbrgn/mcp3425-rs
|
closed
|
Design: Measurement Freshness
|
design
|
In continuous conversion mode, the ADC conversions are triggered periodically by a timer inside the device. If a user reads a measurement value twice in a row, the second value will be "stale" because no new measurement by the ADC has taken place since the previous value returned.
I could think of three ways to handle this:
- If the value is not "fresh", I return an error
- I return a tuple with the value returned and with the flag that indicates whether the data was fresh or not
- The more rusty way: A [`Measurement`](https://docs.rs/mcp3425/0.1.0/mcp3425/enum.Measurement.html) enum that can be either `Fresh(i16)` or `NotFresh(i16)`.
I implemented the third approach. It requires you to *always* match on the measurement result, though. Is this a good idea? Maybe an `.unwrap()` method that ignores the freshness should be provided?
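A minimal sketch of that third approach, written in Python for a runnable illustration (the crate itself is Rust; the names mirror the `Measurement` enum and the proposed `.unwrap()` escape hatch, and the sample value is made up):

```python
from dataclasses import dataclass

# Python analogue of the Rust design above: a measurement that carries a
# freshness flag, plus an accessor that deliberately discards freshness
# for callers who do not care about it.
@dataclass(frozen=True)
class Measurement:
    value: int
    fresh: bool

    def unwrap(self) -> int:
        """Return the raw reading, ignoring whether it is fresh."""
        return self.value

m = Measurement(value=512, fresh=False)  # a stale reading
print(m.unwrap())  # 512
```

Callers who care about staleness inspect `fresh` (the match in Rust); callers who don't just call `unwrap()`, which is the convenience the issue asks about.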
|
1.0
|
design
| 1
|
162,993
| 25,731,608,872
|
IssuesEvent
|
2022-12-07 20:46:24
|
bounswe/bounswe2022group8
|
https://api.github.com/repos/bounswe/bounswe2022group8
|
closed
|
FE-15: Recommendation Page
|
Effort: High Status: Completed Coding Design Team: Frontend
|
### What's up?
As discussed in [Week #7 Frontend Meeting #3](https://github.com/bounswe/bounswe2022group8/wiki/Week-7-Frontend-Meeting-Notes-3) we need to design and implement a recommendation page.
### To Do
- [x] Carefully study the recommendation pages of art-related websites.
- [x] Decide the main functionalities that should be on the page.
- [x] Create a basic and useful recommendation page design and layout.
- [x] Implement the basic design.
- [x] Improve the design and implement the improvements.
### Deadline
22.11.2022 _@23.59_
### Additional Information
_No response_
### Reviewers
_No response_
|
1.0
|
design
| 1
|
161,164
| 25,297,300,242
|
IssuesEvent
|
2022-11-17 07:59:22
|
gohugoio/hugoDocs
|
https://api.github.com/repos/gohugoio/hugoDocs
|
closed
|
Navigation previous / next arrows.
|
Design
|
Hello,
I am very confused about the "previous" and "next" navigation arrows under the "What's on this page?" list.
For example I am on the [Organization](https://gohugo.io/content-management/organization/) page and press on the right arrow.
I'd expect to go to the next page in the structure on the left, which is [Page Bundles](https://gohugo.io/content-management/page-bundles/), but instead it links to [Syntax Highlighting](https://gohugo.io/content-management/syntax-highlighting/) - which is the last subpage in the list on the left.
Following the "next" arrows, they jump back and forth in what feels a random order through the list.
This makes it difficult to follow the documentation and doesn't give me much confidence in Hugo's capabilities to structure content.
|
1.0
|
design
| 1
|
436,387
| 12,550,378,819
|
IssuesEvent
|
2020-06-06 10:54:17
|
googleapis/google-api-java-client-services
|
https://api.github.com/repos/googleapis/google-api-java-client-services
|
opened
|
Synthesis failed for recommender
|
api: recommender autosynth failure priority: p1 type: bug
|
Hello! Autosynth couldn't regenerate recommender. :broken_heart:
Here's the output from running `synth.py`:
```
2020-06-06 03:54:10,423 autosynth [INFO] > logs will be written to: /tmpfs/src/github/synthtool/logs/googleapis/google-api-java-client-services
2020-06-06 03:54:11,763 autosynth [DEBUG] > Running: git config --global core.excludesfile /home/kbuilder/.autosynth-gitignore
2020-06-06 03:54:11,767 autosynth [DEBUG] > Running: git config user.name yoshi-automation
2020-06-06 03:54:11,770 autosynth [DEBUG] > Running: git config user.email yoshi-automation@google.com
2020-06-06 03:54:11,789 autosynth [DEBUG] > Running: git config push.default simple
2020-06-06 03:54:11,793 autosynth [DEBUG] > Running: git branch -f autosynth-recommender
2020-06-06 03:54:11,796 autosynth [DEBUG] > Running: git checkout autosynth-recommender
Switched to branch 'autosynth-recommender'
2020-06-06 03:54:12,319 autosynth [INFO] > Running synthtool
2020-06-06 03:54:12,319 autosynth [INFO] > ['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/google-api-services-recommender/synth.metadata', 'synth.py', '--']
2020-06-06 03:54:12,322 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata clients/google-api-services-recommender/synth.metadata synth.py -- recommender
tee: /tmpfs/src/github/synthtool/logs/googleapis/google-api-java-client-services: Is a directory
2020-06-06 03:54:12,536 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/google-api-java-client-services/synth.py.
On branch autosynth-recommender
nothing to commit, working tree clean
2020-06-06 03:54:12,913 synthtool [DEBUG] > Cloning https://github.com/googleapis/discovery-artifact-manager.git.
DEBUG:synthtool:Cloning https://github.com/googleapis/discovery-artifact-manager.git.
2020-06-06 03:54:13,300 synthtool [DEBUG] > Cleaning output directory.
DEBUG:synthtool:Cleaning output directory.
2020-06-06 03:54:13,302 synthtool [DEBUG] > Installing dependencies.
DEBUG:synthtool:Installing dependencies.
2020-06-06 03:54:15,081 synthtool [INFO] > Generating recommender v1.
INFO:synthtool:Generating recommender v1.
2020-06-06 03:54:15,081 synthtool [INFO] > 1.30.1
INFO:synthtool: 1.30.1
Writing json metadata to clients/google-api-services-recommender/v1.metadata.json
2020-06-06 03:54:15,657 synthtool [INFO] > Generating recommender v1beta1.
INFO:synthtool:Generating recommender v1beta1.
2020-06-06 03:54:15,657 synthtool [INFO] > 1.30.1
INFO:synthtool: 1.30.1
Writing json metadata to clients/google-api-services-recommender/v1beta1.metadata.json
2020-06-06 03:54:16,293 synthtool [DEBUG] > Wrote metadata to clients/google-api-services-recommender/synth.metadata.
DEBUG:synthtool:Wrote metadata to clients/google-api-services-recommender/synth.metadata.
2020-06-06 03:54:16,328 autosynth [DEBUG] > Running: git clean -fdx
Removing .cache/
Removing __pycache__/
Removing clients/google-api-services-recommender/synth.metadata
Removing generator/.cache/
Removing generator/ez_setup.pyc
Removing generator/src/google_apis_client_generator.egg-info/
Removing generator/src/googleapis/__init__.pyc
Removing generator/src/googleapis/codegen/__init__.pyc
Removing generator/src/googleapis/codegen/api.pyc
Removing generator/src/googleapis/codegen/api_exception.pyc
Removing generator/src/googleapis/codegen/api_library_generator.pyc
Removing generator/src/googleapis/codegen/data_types.pyc
Removing generator/src/googleapis/codegen/django_helpers.pyc
Removing generator/src/googleapis/codegen/filesys/__init__.pyc
Removing generator/src/googleapis/codegen/filesys/files.pyc
Removing generator/src/googleapis/codegen/filesys/filesystem_library_package.pyc
Removing generator/src/googleapis/codegen/filesys/library_package.pyc
Removing generator/src/googleapis/codegen/filesys/package_writer_foundry.pyc
Removing generator/src/googleapis/codegen/filesys/tar_library_package.pyc
Removing generator/src/googleapis/codegen/filesys/zip_library_package.pyc
Removing generator/src/googleapis/codegen/generate_library.pyc
Removing generator/src/googleapis/codegen/generator.pyc
Removing generator/src/googleapis/codegen/generator_lookup.pyc
Removing generator/src/googleapis/codegen/import_definition.pyc
Removing generator/src/googleapis/codegen/java_generator.pyc
Removing generator/src/googleapis/codegen/java_import_manager.pyc
Removing generator/src/googleapis/codegen/language_model.pyc
Removing generator/src/googleapis/codegen/schema.pyc
Removing generator/src/googleapis/codegen/targets.pyc
Removing generator/src/googleapis/codegen/template_helpers.pyc
Removing generator/src/googleapis/codegen/template_objects.pyc
Removing generator/src/googleapis/codegen/utilities/__init__.pyc
Removing generator/src/googleapis/codegen/utilities/convert_size.pyc
Removing generator/src/googleapis/codegen/utilities/html_stripper.pyc
Removing generator/src/googleapis/codegen/utilities/json_expander.pyc
Removing generator/src/googleapis/codegen/utilities/json_with_comments.pyc
Removing generator/src/googleapis/codegen/utilities/maven_utils.pyc
Removing generator/src/googleapis/codegen/utilities/name_validator.pyc
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 615, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 476, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 555, in _inner_main
).synthesize(base_synth_log_path)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 121, in synthesize
with open(log_file_path, "rt") as fp:
IsADirectoryError: [Errno 21] Is a directory: '/tmpfs/src/github/synthtool/logs/googleapis/google-api-java-client-services'
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/a12fce2f-fa64-48b1-abd7-5a9cfc4df713/targets/github%2Fsynthtool;config=default/tests;query=google-api-java-client-services;failed=false).
|
1.0
|
Synthesis failed for recommender - Hello! Autosynth couldn't regenerate recommender. :broken_heart:
Here's the output from running `synth.py`:
```
2020-06-06 03:54:10,423 autosynth [INFO] > logs will be written to: /tmpfs/src/github/synthtool/logs/googleapis/google-api-java-client-services
2020-06-06 03:54:11,763 autosynth [DEBUG] > Running: git config --global core.excludesfile /home/kbuilder/.autosynth-gitignore
2020-06-06 03:54:11,767 autosynth [DEBUG] > Running: git config user.name yoshi-automation
2020-06-06 03:54:11,770 autosynth [DEBUG] > Running: git config user.email yoshi-automation@google.com
2020-06-06 03:54:11,789 autosynth [DEBUG] > Running: git config push.default simple
2020-06-06 03:54:11,793 autosynth [DEBUG] > Running: git branch -f autosynth-recommender
2020-06-06 03:54:11,796 autosynth [DEBUG] > Running: git checkout autosynth-recommender
Switched to branch 'autosynth-recommender'
2020-06-06 03:54:12,319 autosynth [INFO] > Running synthtool
2020-06-06 03:54:12,319 autosynth [INFO] > ['/tmpfs/src/github/synthtool/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/google-api-services-recommender/synth.metadata', 'synth.py', '--']
2020-06-06 03:54:12,322 autosynth [DEBUG] > Running: /tmpfs/src/github/synthtool/env/bin/python3 -m synthtool --metadata clients/google-api-services-recommender/synth.metadata synth.py -- recommender
tee: /tmpfs/src/github/synthtool/logs/googleapis/google-api-java-client-services: Is a directory
2020-06-06 03:54:12,536 synthtool [DEBUG] > Executing /home/kbuilder/.cache/synthtool/google-api-java-client-services/synth.py.
On branch autosynth-recommender
nothing to commit, working tree clean
2020-06-06 03:54:12,913 synthtool [DEBUG] > Cloning https://github.com/googleapis/discovery-artifact-manager.git.
DEBUG:synthtool:Cloning https://github.com/googleapis/discovery-artifact-manager.git.
2020-06-06 03:54:13,300 synthtool [DEBUG] > Cleaning output directory.
DEBUG:synthtool:Cleaning output directory.
2020-06-06 03:54:13,302 synthtool [DEBUG] > Installing dependencies.
DEBUG:synthtool:Installing dependencies.
2020-06-06 03:54:15,081 synthtool [INFO] > Generating recommender v1.
INFO:synthtool:Generating recommender v1.
2020-06-06 03:54:15,081 synthtool [INFO] > 1.30.1
INFO:synthtool: 1.30.1
Writing json metadata to clients/google-api-services-recommender/v1.metadata.json
2020-06-06 03:54:15,657 synthtool [INFO] > Generating recommender v1beta1.
INFO:synthtool:Generating recommender v1beta1.
2020-06-06 03:54:15,657 synthtool [INFO] > 1.30.1
INFO:synthtool: 1.30.1
Writing json metadata to clients/google-api-services-recommender/v1beta1.metadata.json
2020-06-06 03:54:16,293 synthtool [DEBUG] > Wrote metadata to clients/google-api-services-recommender/synth.metadata.
DEBUG:synthtool:Wrote metadata to clients/google-api-services-recommender/synth.metadata.
2020-06-06 03:54:16,328 autosynth [DEBUG] > Running: git clean -fdx
Removing .cache/
Removing __pycache__/
Removing clients/google-api-services-recommender/synth.metadata
Removing generator/.cache/
Removing generator/ez_setup.pyc
Removing generator/src/google_apis_client_generator.egg-info/
Removing generator/src/googleapis/__init__.pyc
Removing generator/src/googleapis/codegen/__init__.pyc
Removing generator/src/googleapis/codegen/api.pyc
Removing generator/src/googleapis/codegen/api_exception.pyc
Removing generator/src/googleapis/codegen/api_library_generator.pyc
Removing generator/src/googleapis/codegen/data_types.pyc
Removing generator/src/googleapis/codegen/django_helpers.pyc
Removing generator/src/googleapis/codegen/filesys/__init__.pyc
Removing generator/src/googleapis/codegen/filesys/files.pyc
Removing generator/src/googleapis/codegen/filesys/filesystem_library_package.pyc
Removing generator/src/googleapis/codegen/filesys/library_package.pyc
Removing generator/src/googleapis/codegen/filesys/package_writer_foundry.pyc
Removing generator/src/googleapis/codegen/filesys/tar_library_package.pyc
Removing generator/src/googleapis/codegen/filesys/zip_library_package.pyc
Removing generator/src/googleapis/codegen/generate_library.pyc
Removing generator/src/googleapis/codegen/generator.pyc
Removing generator/src/googleapis/codegen/generator_lookup.pyc
Removing generator/src/googleapis/codegen/import_definition.pyc
Removing generator/src/googleapis/codegen/java_generator.pyc
Removing generator/src/googleapis/codegen/java_import_manager.pyc
Removing generator/src/googleapis/codegen/language_model.pyc
Removing generator/src/googleapis/codegen/schema.pyc
Removing generator/src/googleapis/codegen/targets.pyc
Removing generator/src/googleapis/codegen/template_helpers.pyc
Removing generator/src/googleapis/codegen/template_objects.pyc
Removing generator/src/googleapis/codegen/utilities/__init__.pyc
Removing generator/src/googleapis/codegen/utilities/convert_size.pyc
Removing generator/src/googleapis/codegen/utilities/html_stripper.pyc
Removing generator/src/googleapis/codegen/utilities/json_expander.pyc
Removing generator/src/googleapis/codegen/utilities/json_with_comments.pyc
Removing generator/src/googleapis/codegen/utilities/maven_utils.pyc
Removing generator/src/googleapis/codegen/utilities/name_validator.pyc
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 615, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 476, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 555, in _inner_main
).synthesize(base_synth_log_path)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 121, in synthesize
with open(log_file_path, "rt") as fp:
IsADirectoryError: [Errno 21] Is a directory: '/tmpfs/src/github/synthtool/logs/googleapis/google-api-java-client-services'
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/a12fce2f-fa64-48b1-abd7-5a9cfc4df713/targets/github%2Fsynthtool;config=default/tests;query=google-api-java-client-services;failed=false).
|
non_design
|
synthesis failed for recommender hello autosynth couldn t regenerate recommender broken heart here s the output from running synth py autosynth logs will be written to tmpfs src github synthtool logs googleapis google api java client services autosynth running git config global core excludesfile home kbuilder autosynth gitignore autosynth running git config user name yoshi automation autosynth running git config user email yoshi automation google com autosynth running git config push default simple autosynth running git branch f autosynth recommender autosynth running git checkout autosynth recommender switched to branch autosynth recommender autosynth running synthtool autosynth autosynth running tmpfs src github synthtool env bin m synthtool metadata clients google api services recommender synth metadata synth py recommender tee tmpfs src github synthtool logs googleapis google api java client services is a directory synthtool executing home kbuilder cache synthtool google api java client services synth py on branch autosynth recommender nothing to commit working tree clean synthtool cloning debug synthtool cloning synthtool cleaning output directory debug synthtool cleaning output directory synthtool installing dependencies debug synthtool installing dependencies synthtool generating recommender info synthtool generating recommender synthtool info synthtool writing json metadata to clients google api services recommender metadata json synthtool generating recommender info synthtool generating recommender synthtool info synthtool writing json metadata to clients google api services recommender metadata json synthtool wrote metadata to clients google api services recommender synth metadata debug synthtool wrote metadata to clients google api services recommender synth metadata autosynth running git clean fdx removing cache removing pycache removing clients google api services recommender synth metadata removing generator cache removing generator ez setup pyc 
removing generator src google apis client generator egg info removing generator src googleapis init pyc removing generator src googleapis codegen init pyc removing generator src googleapis codegen api pyc removing generator src googleapis codegen api exception pyc removing generator src googleapis codegen api library generator pyc removing generator src googleapis codegen data types pyc removing generator src googleapis codegen django helpers pyc removing generator src googleapis codegen filesys init pyc removing generator src googleapis codegen filesys files pyc removing generator src googleapis codegen filesys filesystem library package pyc removing generator src googleapis codegen filesys library package pyc removing generator src googleapis codegen filesys package writer foundry pyc removing generator src googleapis codegen filesys tar library package pyc removing generator src googleapis codegen filesys zip library package pyc removing generator src googleapis codegen generate library pyc removing generator src googleapis codegen generator pyc removing generator src googleapis codegen generator lookup pyc removing generator src googleapis codegen import definition pyc removing generator src googleapis codegen java generator pyc removing generator src googleapis codegen java import manager pyc removing generator src googleapis codegen language model pyc removing generator src googleapis codegen schema pyc removing generator src googleapis codegen targets pyc removing generator src googleapis codegen template helpers pyc removing generator src googleapis codegen template objects pyc removing generator src googleapis codegen utilities init pyc removing generator src googleapis codegen utilities convert size pyc removing generator src googleapis codegen utilities html stripper pyc removing generator src googleapis codegen utilities json expander pyc removing generator src googleapis codegen utilities json with comments pyc removing generator src googleapis codegen 
utilities maven utils pyc removing generator src googleapis codegen utilities name validator pyc traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src github synthtool autosynth synth py line in main file tmpfs src github synthtool autosynth synth py line in main return inner main temp dir file tmpfs src github synthtool autosynth synth py line in inner main synthesize base synth log path file tmpfs src github synthtool autosynth synthesizer py line in synthesize with open log file path rt as fp isadirectoryerror is a directory tmpfs src github synthtool logs googleapis google api java client services google internal developers can see the full log
| 0
|
270,904
| 20,614,239,498
|
IssuesEvent
|
2022-03-07 11:37:16
|
uit-cosmo/user-guide
|
https://api.github.com/repos/uit-cosmo/user-guide
|
opened
|
Provide description of uit_sandpiles
|
documentation
|
@Sosnowsky Could you provide a short description of the uit_sandpiles repo in the user guide?
|
1.0
|
Provide description of uit_sandpiles - @Sosnowsky Could you provide a short description of the uit_sandpiles repo in the user guide?
|
non_design
|
provide description of uit sandpiles sosnowsky could you provide a short description of the uit sandpiles repo in the user guide
| 0
|
113,100
| 17,115,744,552
|
IssuesEvent
|
2021-07-11 10:05:59
|
NixOS/nixpkgs
|
https://api.github.com/repos/NixOS/nixpkgs
|
closed
|
Vulnerability roundup 103: bluez-5.55: 2 advisories [5.7]
|
1.severity: security
|
[search](https://search.nix.gsc.io/?q=bluez&i=fosho&repos=NixOS-nixpkgs), [files](https://github.com/NixOS/nixpkgs/search?utf8=%E2%9C%93&q=bluez+in%3Apath&type=Code)
* [ ] [CVE-2021-0129](https://nvd.nist.gov/vuln/detail/CVE-2021-0129) CVSSv3=5.7 (nixos-20.09)
* [ ] [CVE-2021-3588](https://nvd.nist.gov/vuln/detail/CVE-2021-3588) CVSSv3=3.3 (nixos-20.09)
Scanned versions: nixos-20.09: 86d3781c390.
|
True
|
Vulnerability roundup 103: bluez-5.55: 2 advisories [5.7] - [search](https://search.nix.gsc.io/?q=bluez&i=fosho&repos=NixOS-nixpkgs), [files](https://github.com/NixOS/nixpkgs/search?utf8=%E2%9C%93&q=bluez+in%3Apath&type=Code)
* [ ] [CVE-2021-0129](https://nvd.nist.gov/vuln/detail/CVE-2021-0129) CVSSv3=5.7 (nixos-20.09)
* [ ] [CVE-2021-3588](https://nvd.nist.gov/vuln/detail/CVE-2021-3588) CVSSv3=3.3 (nixos-20.09)
Scanned versions: nixos-20.09: 86d3781c390.
|
non_design
|
vulnerability roundup bluez advisories nixos nixos scanned versions nixos
| 0
|
12,490
| 3,079,743,584
|
IssuesEvent
|
2015-08-21 18:01:34
|
ThibaultLatrille/ControverSciences
|
https://api.github.com/repos/ThibaultLatrille/ControverSciences
|
closed
|
Choix des thèmes des articles
|
*** important design
|
Les cases ne sont pas toutes de la meme taille a cause de la taille du contenu. Le plus simle est sans doute de mettre sur deux lignes tous les contenus, picto puis a la ligne le mot.
Sur la page https://www.controversciences.org/references/7
Par : F. Giry
Navigateur : chrome modern windows webkit
|
1.0
|
Choix des thèmes des articles - Les cases ne sont pas toutes de la meme taille a cause de la taille du contenu. Le plus simle est sans doute de mettre sur deux lignes tous les contenus, picto puis a la ligne le mot.
Sur la page https://www.controversciences.org/references/7
Par : F. Giry
Navigateur : chrome modern windows webkit
|
design
|
choix des thèmes des articles les cases ne sont pas toutes de la meme taille a cause de la taille du contenu le plus simle est sans doute de mettre sur deux lignes tous les contenus picto puis a la ligne le mot sur la page par f giry navigateur chrome modern windows webkit
| 1
|
38,869
| 10,258,119,089
|
IssuesEvent
|
2019-08-21 21:54:13
|
Squalr/Squally
|
https://api.github.com/repos/Squalr/Squally
|
opened
|
PDB Generation Failing with LNK1201 Windows Debug/RelWithDebug
|
area:build-system good first issue (intermediate)
|
This issue popped up some time ago, and I've been unable to diagnose it so far. Reverting to older points in time in the repo didn't seem to solve it either for me.
This wasn't caught for awhile because I do most development on OSX. In the mean time, I'm continuing to do all testing on OSX, and just compiling in Release on windows.
This isn't a very common issue, and Stackoverflow answers do not seem to be helping :^)
|
1.0
|
PDB Generation Failing with LNK1201 Windows Debug/RelWithDebug - This issue popped up some time ago, and I've been unable to diagnose it so far. Reverting to older points in time in the repo didn't seem to solve it either for me.
This wasn't caught for awhile because I do most development on OSX. In the mean time, I'm continuing to do all testing on OSX, and just compiling in Release on windows.
This isn't a very common issue, and Stackoverflow answers do not seem to be helping :^)
|
non_design
|
pdb generation failing with windows debug relwithdebug this issue popped up some time ago and i ve been unable to diagnose it so far reverting to older points in time in the repo didn t seem to solve it either for me this wasn t caught for awhile because i do most development on osx in the mean time i m continuing to do all testing on osx and just compiling in release on windows this isn t a very common issue and stackoverflow answers do not seem to be helping
| 0
|
49,832
| 6,264,894,940
|
IssuesEvent
|
2017-07-16 12:47:07
|
OpenBudget/budgetkey-app-search
|
https://api.github.com/repos/OpenBudget/budgetkey-app-search
|
closed
|
Redesign Faceted Search
|
design
|
_From @mushon on October 12, 2015 17:34_
<!---
@huboard:{"order":1.2197274440461925e-19,"milestone_order":0.00048828125}
-->
_Copied from original issue: OpenBudget/open-budget-frontend#279_
|
1.0
|
Redesign Faceted Search - _From @mushon on October 12, 2015 17:34_
<!---
@huboard:{"order":1.2197274440461925e-19,"milestone_order":0.00048828125}
-->
_Copied from original issue: OpenBudget/open-budget-frontend#279_
|
design
|
redesign faceted search from mushon on october huboard order milestone order copied from original issue openbudget open budget frontend
| 1
|
55,207
| 6,892,579,272
|
IssuesEvent
|
2017-11-22 21:42:59
|
webcompat/webcompat.com
|
https://api.github.com/repos/webcompat/webcompat.com
|
closed
|
Contributors guideline section remains highlighted when tapped/clicked on Chrome browser
|
lang: CSS scope: design scope: refactor
|
Browser / Version: Chrome 56.0.2924, Chrome 57.0.2987
Operating System: Windows 10 Pro, Android 6.0.1
**Steps to Reproduce**
1. Navigate to: https://webcompat.com/contributors.
2. Click/Tap a guideline section.
3. Observe behavior.
**Expected Behavior:**
Section is not highlighted.
**Actual Behavior:**
Section remains highlighted.
**Notes:**
1. Not reproducible on Firefox Nightly 55.0a1 and Release 52.0.2.
2. Screenshot attached.
sv;

|
1.0
|
Contributors guideline section remains highlighted when tapped/clicked on Chrome browser - Browser / Version: Chrome 56.0.2924, Chrome 57.0.2987
Operating System: Windows 10 Pro, Android 6.0.1
**Steps to Reproduce**
1. Navigate to: https://webcompat.com/contributors.
2. Click/Tap a guideline section.
3. Observe behavior.
**Expected Behavior:**
Section is not highlighted.
**Actual Behavior:**
Section remains highlighted.
**Notes:**
1. Not reproducible on Firefox Nightly 55.0a1 and Release 52.0.2.
2. Screenshot attached.
sv;

|
design
|
contributors guideline section remains highlighted when tapped clicked on chrome browser browser version chrome chrome operating system windows pro android steps to reproduce navigate to click tap a guideline section observe behavior expected behavior section is not highlighted actual behavior section remains highlighted notes not reproducible on firefox nightly and release screenshot attached sv
| 1
|
106,206
| 13,254,014,627
|
IssuesEvent
|
2020-08-20 08:35:37
|
pupilfirst/pupilfirst
|
https://api.github.com/repos/pupilfirst/pupilfirst
|
opened
|
Revamp the /coaches page
|
design enhancement
|
The existing `/coaches` page is the only remaining Bootstrap-based page on the site. We should remove it to get rid of the dependency and improve it in the process.
1. The existing design contains a lot of older elements that are no-longer required. We should preserve only the `name` and `connect_link` on the index page, and show the `about` as an additional bit on the `show` page.
2. We should consider basing the design on our `/team` page, where the _list partial_ is reused on the `show` page to show the list of _other_ members below the _detailed_ view of the selected member.
3. We should add a multi-select filter that lets the viewer filter the coaches by course.
4. The list of available courses on the filter should be different, depending on the viewer. A public user should see only public courses, a student should see public courses + their enrolled courses, and an admin or coach should see all unarchived courses.
5. A student opening the page should have their courses pre-selected in the filter.
|
1.0
|
Revamp the /coaches page - The existing `/coaches` page is the only remaining Bootstrap-based page on the site. We should remove it to get rid of the dependency and improve it in the process.
1. The existing design contains a lot of older elements that are no-longer required. We should preserve only the `name` and `connect_link` on the index page, and show the `about` as an additional bit on the `show` page.
2. We should consider basing the design on our `/team` page, where the _list partial_ is reused on the `show` page to show the list of _other_ members below the _detailed_ view of the selected member.
3. We should add a multi-select filter that lets the viewer filter the coaches by course.
4. The list of available courses on the filter should be different, depending on the viewer. A public user should see only public courses, a student should see public courses + their enrolled courses, and an admin or coach should see all unarchived courses.
5. A student opening the page should have their courses pre-selected in the filter.
|
design
|
revamp the coaches page the existing coaches page is the only remaining bootstrap based page on the site we should remove it to get rid of the dependency and improve it in the process the existing design contains a lot of older elements that are no longer required we should preserve only the name and connect link on the index page and show the about as an additional bit on the show page we should consider basing the design on our team page where the list partial is reused on the show page to show the list of other members below the detailed view of the selected member we should add a multi select filter that lets the viewer filter the coaches by course the list of available courses on the filter should be different depending on the viewer a public user should see only public courses a student should see public courses their enrolled courses and an admin or coach should see all unarchived courses a student opening the page should have their courses pre selected in the filter
| 1
|
93,879
| 11,825,199,596
|
IssuesEvent
|
2020-03-21 11:27:40
|
nikodemus/foolang
|
https://api.github.com/repos/nikodemus/foolang
|
opened
|
indexed classes
|
design feature
|
as in Smalltalk. The basis for classes like Array.
- Syntax?
- Implementation?
Worth a design note.
|
1.0
|
indexed classes - as in Smalltalk. The basis for classes like Array.
- Syntax?
- Implementation?
Worth a design note.
|
design
|
indexed classes as in smalltalk the basis for classes like array syntax implementation worth a design note
| 1
|
390,315
| 26,857,966,634
|
IssuesEvent
|
2023-02-03 16:00:27
|
NixOS/nix
|
https://api.github.com/repos/NixOS/nix
|
opened
|
Missing or incorrect S3 documentation in the Manual
|
documentation
|
## NB: I didn't test this, only moving here because the issue was opened on the wrong repo (https://github.com/NixOS/nixos-homepage/issues/731)
## Problem
for a S3 binary cache READ operations is incomplete - it is missing `"s3:ListBucket"` action. Whilst the current config is enough for basic nix-shell substitutions, without `"s3:ListBucket"` action a user would see this kind of error when running `nix-build`:
```
error: AWS error fetching '7fk9b9w8bpkkm36glg378l4arczk85sw.narinfo': Access Denied
```
## Checklist
<!-- make sure this issue is not redundant or obsolete -->
- [x] checked [latest Nix manual] \([source])
- [x] checked [open documentation issues and pull requests] for possible duplicates
[latest Nix manual]: https://nixos.org/manual/nix/unstable/
[source]: https://github.com/NixOS/nix/tree/master/doc/manual/src
[open documentation issues and pull requests]: https://github.com/NixOS/nix/labels/documentation
## Proposal
So the complete list of actions for reads should be this:
```
"s3:GetObject",
"s3:GetBucketLocation",
"s3:ListBucket"
```
## Priorities
Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
|
1.0
|
Missing or incorrect S3 documentation in the Manual - ## NB: I didn't test this, only moving here because the issue was opened on the wrong repo (https://github.com/NixOS/nixos-homepage/issues/731)
## Problem
for a S3 binary cache READ operations is incomplete - it is missing `"s3:ListBucket"` action. Whilst the current config is enough for basic nix-shell substitutions, without `"s3:ListBucket"` action a user would see this kind of error when running `nix-build`:
```
error: AWS error fetching '7fk9b9w8bpkkm36glg378l4arczk85sw.narinfo': Access Denied
```
## Checklist
<!-- make sure this issue is not redundant or obsolete -->
- [x] checked [latest Nix manual] \([source])
- [x] checked [open documentation issues and pull requests] for possible duplicates
[latest Nix manual]: https://nixos.org/manual/nix/unstable/
[source]: https://github.com/NixOS/nix/tree/master/doc/manual/src
[open documentation issues and pull requests]: https://github.com/NixOS/nix/labels/documentation
## Proposal
So the complete list of actions for reads should be this:
```
"s3:GetObject",
"s3:GetBucketLocation",
"s3:ListBucket"
```
## Priorities
Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
|
non_design
|
missing or incorrect documentation in the manual nb i didn t test this only moving here because the issue was opened on the wrong repo problem for a binary cache read operations is incomplete it is missing listbucket action whilst the current config is enough for basic nix shell substitutions without listbucket action a user would see this kind of error when running nix build error aws error fetching narinfo access denied checklist checked checked for possible duplicates proposal so the complete list of actions for reads should be this getobject getbucketlocation listbucket priorities add to
| 0
|
174,533
| 27,660,449,889
|
IssuesEvent
|
2023-03-12 13:09:33
|
TeamAntPowerLifting/TeamAnt
|
https://api.github.com/repos/TeamAntPowerLifting/TeamAnt
|
closed
|
design: footer 컴포넌트 구현
|
design
|
## 🍦 기능 구현
- Footer 컴포넌트 제작
## 🍭 기능 구현 방식
- 컴포넌트로 각 페이지 하단에 재사용할 예정
## 🍪 기능 구현 결과
<img src="https://user-images.githubusercontent.com/83527046/218497296-f0ce5172-3dc4-418d-baae-a16ffc5e54ef.png">
## 📚 기능 구현을 위한 참고 레퍼런스
(참고 레퍼런스 링크)
|
1.0
|
design: footer 컴포넌트 구현 - ## 🍦 기능 구현
- Footer 컴포넌트 제작
## 🍭 기능 구현 방식
- 컴포넌트로 각 페이지 하단에 재사용할 예정
## 🍪 기능 구현 결과
<img src="https://user-images.githubusercontent.com/83527046/218497296-f0ce5172-3dc4-418d-baae-a16ffc5e54ef.png">
## 📚 기능 구현을 위한 참고 레퍼런스
(참고 레퍼런스 링크)
|
design
|
design footer 컴포넌트 구현 🍦 기능 구현 footer 컴포넌트 제작 🍭 기능 구현 방식 컴포넌트로 각 페이지 하단에 재사용할 예정 🍪 기능 구현 결과 img src 📚 기능 구현을 위한 참고 레퍼런스 참고 레퍼런스 링크
| 1
|
392,503
| 26,942,838,098
|
IssuesEvent
|
2023-02-08 04:42:04
|
yosileyid/yosileyid
|
https://api.github.com/repos/yosileyid/yosileyid
|
closed
|
Change docs/CONTRIBUTING.md to be more similar to atom/CONTRIBUTING
|
documentation enhancement
|
The atom project has a much nicer contributing page. I want to update the current one to be more similar to theirs.
[Atom CONTRIBUTING page](https://github.com/atom/atom/blob/master/CONTRIBUTING.md)
|
1.0
|
Change docs/CONTRIBUTING.md to be more similar to atom/CONTRIBUTING - The atom project has a much nicer contributing page. I want to update the current one to be more similar to theirs.
[Atom CONTRIBUTING page](https://github.com/atom/atom/blob/master/CONTRIBUTING.md)
|
non_design
|
change docs contributing md to be more similar to atom contributing the atom project has a much nicer contributing page i want to update the current one to be more similar to theirs
| 0
|
173,308
| 21,155,265,001
|
IssuesEvent
|
2022-04-07 02:02:48
|
Aivolt1/u-i-u-x-volt-ai
|
https://api.github.com/repos/Aivolt1/u-i-u-x-volt-ai
|
reopened
|
CVE-2022-24771 (High) detected in node-forge-0.10.0.tgz
|
security vulnerability
|
## CVE-2022-24771 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.10.0.tgz</b></p></summary>
<p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/node-forge</p>
<p>
Dependency Hierarchy:
- gatsby-3.9.0.tgz (Root Library)
- webpack-dev-server-3.11.2.tgz
- selfsigned-1.10.11.tgz
- :x: **node-forge-0.10.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Forge (also called `node-forge`) is a native implementation of Transport Layer Security in JavaScript. Prior to version 1.3.0, RSA PKCS#1 v1.5 signature verification code is lenient in checking the digest algorithm structure. This can allow a crafted structure that steals padding bytes and uses unchecked portion of the PKCS#1 encoded message to forge a signature when a low public exponent is being used. The issue has been addressed in `node-forge` version 1.3.0. There are currently no known workarounds.
<p>Publish Date: 2022-03-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24771>CVE-2022-24771</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24771">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24771</a></p>
<p>Release Date: 2022-03-18</p>
<p>Fix Resolution: node-forge - 1.3.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-24771 (High) detected in node-forge-0.10.0.tgz - ## CVE-2022-24771 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.10.0.tgz</b></p></summary>
<p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/node-forge</p>
<p>
Dependency Hierarchy:
- gatsby-3.9.0.tgz (Root Library)
- webpack-dev-server-3.11.2.tgz
- selfsigned-1.10.11.tgz
- :x: **node-forge-0.10.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Forge (also called `node-forge`) is a native implementation of Transport Layer Security in JavaScript. Prior to version 1.3.0, RSA PKCS#1 v1.5 signature verification code is lenient in checking the digest algorithm structure. This can allow a crafted structure that steals padding bytes and uses unchecked portion of the PKCS#1 encoded message to forge a signature when a low public exponent is being used. The issue has been addressed in `node-forge` version 1.3.0. There are currently no known workarounds.
<p>Publish Date: 2022-03-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-24771>CVE-2022-24771</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24771">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24771</a></p>
<p>Release Date: 2022-03-18</p>
<p>Fix Resolution: node-forge - 1.3.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_design
|
cve high detected in node forge tgz cve high severity vulnerability vulnerable library node forge tgz javascript implementations of network transports cryptography ciphers pki message digests and various utilities library home page a href path to dependency file package json path to vulnerable library node modules node forge dependency hierarchy gatsby tgz root library webpack dev server tgz selfsigned tgz x node forge tgz vulnerable library found in base branch master vulnerability details forge also called node forge is a native implementation of transport layer security in javascript prior to version rsa pkcs signature verification code is lenient in checking the digest algorithm structure this can allow a crafted structure that steals padding bytes and uses unchecked portion of the pkcs encoded message to forge a signature when a low public exponent is being used the issue has been addressed in node forge version there are currently no known workarounds publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution node forge step up your open source security game with whitesource
| 0
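The advisory in the record above reduces to a simple version predicate: node-forge releases before 1.3.0 are affected by CVE-2022-24771, and 1.3.0 or later carries the fix. A minimal sketch of that check, assuming plain dotted-numeric version strings as published on the npm registry (the function name `is_vulnerable` is illustrative, not part of any tool mentioned in the record):

```python
# Sketch: decide whether a node-forge version falls in the range affected
# by CVE-2022-24771. The fix landed in 1.3.0, per the advisory above.

FIXED = (1, 3, 0)

def parse(version: str) -> tuple:
    """Turn a dotted version like '0.10.0' into (0, 10, 0) for comparison."""
    return tuple(int(part) for part in version.split("."))

def is_vulnerable(version: str) -> bool:
    """True when the version predates the 1.3.0 fix."""
    return parse(version) < FIXED

if __name__ == "__main__":
    for v in ("0.10.0", "1.2.1", "1.3.0"):
        print(v, "vulnerable" if is_vulnerable(v) else "fixed")
```

Tuple comparison is used instead of string comparison so that, for example, 1.10.0 correctly sorts after 1.3.0.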
|
196,603
| 15,603,347,722
|
IssuesEvent
|
2021-03-19 01:41:30
|
stegnerw/penguin_swarm
|
https://api.github.com/repos/stegnerw/penguin_swarm
|
closed
|
Change license to MIT
|
documentation
|
Licenses are scary and I do not understand all of the terms in LGPLv3. I would feel much more comfortable under MIT, as it is written in plain English.
|
1.0
|
Change license to MIT - Licenses are scary and I do not understand all of the terms in LGPLv3. I would feel much more comfortable under MIT, as it is written in plain English.
|
non_design
|
change license to mit licenses are scary and i do not understand all of the terms in i would feel much more comfortable under mit as it is written in plain english
| 0
|
514,053
| 14,932,200,666
|
IssuesEvent
|
2021-01-25 07:19:31
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.amazon.com - see bug description
|
browser-focus-geckoview engine-gecko ml-needsdiagnosis-false ml-probability-high priority-critical
|
<!-- @browser: Firefox Mobile 84.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:84.0) Gecko/84.0 Firefox/84.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/66210 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://www.amazon.com/gp/aw/s?k=famous footwear
**Browser / Version**: Firefox Mobile 84.0
**Operating System**: Android 9
**Tested Another Browser**: Yes Opera
**Problem type**: Something else
**Description**: .didnt load
**Steps to Reproduce**:
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.amazon.com - see bug description - <!-- @browser: Firefox Mobile 84.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:84.0) Gecko/84.0 Firefox/84.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/66210 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://www.amazon.com/gp/aw/s?k=famous footwear
**Browser / Version**: Firefox Mobile 84.0
**Operating System**: Android 9
**Tested Another Browser**: Yes Opera
**Problem type**: Something else
**Description**: .didnt load
**Steps to Reproduce**:
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_design
|
see bug description url footwear browser version firefox mobile operating system android tested another browser yes opera problem type something else description didnt load steps to reproduce browser configuration none from with ❤️
| 0
|
64,702
| 16,014,306,009
|
IssuesEvent
|
2021-04-20 14:21:04
|
hashicorp/packer
|
https://api.github.com/repos/hashicorp/packer
|
closed
|
Proxmox 6, vm Debian 9, problem with the preseed.cfg file, when we get to the storage configuration it does not continue with the installation, the screen remains blue and does not continue
|
bug builder/proxmox remote-plugin/proxmox waiting-reply
|
When filing a bug, please include the following headings if possible. Any
example text in this template can be deleted.
#### Overview of the Issue
Before we start, I apologize for my English.
When the packer reaches the boot_command and uploads the preseed.cfg file this file works the problem is when we get to the step to configure the storage, at this moment the process stops and does not continue, we only see the blue screen.
#### Reproduction Steps
I have a 6.2-4 version of the proxmox server where I run virtual machines and I would like to have my custom ISO.
To create ISO I have two files, a configuration file and the packer file.
I launch the command from my desktop:
$ packer build -var-file = config.json debian-9.13.json
proxmox: the output will be in this color.
==> proxmox: Creating VM
==> proxmox: Starting VM
==> proxmox: starting the HTTP server on port 8902
==> proxmox: Waiting 10 seconds for boot
==> proxmox: typing boot command
==> proxmox: Waiting for SSH to be available ...
It loads the proseed.cfg file without problems and runs the whole process until it reaches the storage configuration. It stops here and does not continue, it reaches the timeout.
### Packer version
From packer v1.6.5
### Simplified Packer Buildfile
debian-9.13.json
````
{
"builders": [
{
.....
"disks": [
{
"type": "virtio",
"disk_size": "{{ user `disk_size`}}",
"storage_pool": "{{user `datastore`}}",
"storage_pool_type": "{{user `datastore_type`}}"
}
],
......
"iso_file": "{{user `iso`}}",
"http_directory": "http",
"template_description": "{{ user `template_description` }}",
"boot_wait": "10s",
"boot_command": [
"{{ user `boot_command_prefix` }}",
"install <wait>",
"preseed/url=http://{{ .HTTPIP }}:{{ .HTTPPort }}/preseed.cfg<wait>",
"debian-installer=en_US.UTF-8 <wait>",
"auto <wait>",
"locale=en_US.UTF-8 <wait>",
"kbd-chooser/method=es <wait>",
"keyboard-configuration/xkb-keymap=es <wait>",
"netcfg/get_hostname=node0 <wait>",
"netcfg/get_domain=test.lan <wait>",
"fb=false <wait>",
"debconf/frontend=noninteractive <wait>",
"console-setup/ask_detect=false <wait>",
"console-keymaps-at/keymap=es <wait>",
"grub-installer/bootdev=/dev/sda <wait>",
"<enter><wait>"
]
}
],
...
}
````
Vars file: config.json
-------------------------
````
{
"template_description": "debian 9.13, generated by packer on {{ isotime \"2020-01-02T15:04:05Z\" }}",
"hostname": "node0",
"local_domain": "internal.test",
"vmid": "400",
"locale": "es_ES",
"cores": "1",
"sockets": "1",
"memory": "2048",
"disk_size": "50G",
"datastore": "local-lvm",
"datastore_type": "lvm",
"iso": "local:iso/debian-9.13.0-amd64-netinst.iso",
"boot_command_prefix": "<esc><wait>",
"preseed_file": "preseed.cfg"
}
````
proseed.cfg
---------------
````
#Early
d-i partman/early_command string \
echo "Starting install" \
sleep 60
# Localization ----------------------------------------------------------
# d-i debian-installer/language string en
# d-i debian-installer/country string ES
# d-i debian-installer/locale string en_GB.UTF-8
# Keymap & Console ------------------------------------------------------
# d-i keyboard-configuration/xkb-keymap select es
# Network ---------------------------------------------------------------
d-i netcfg/enable boolean true
d-i netcfg/choose_interface select auto
d-i netcfg/dhcp_failed note
d-i netcfg/dhcp_options select Configure network manually
# Mirror settings ------------------------------------------------------
d-i mirror/country string manual
d-i mirror/http/hostname string ftp.es.debian.org
d-i mirror/http/directory string /debian/
d-i mirror/http/proxy string
# Root password ---------------------------------------------------------
d-i passwd/root-password password user
d-i passwd/root-password-again password pass
# user account ----------------------------------------------------------
d-i passwd/user-fullname string user1
d-i passwd/username string user1
d-i passwd/user-password password pass
d-i passwd/user-password-again password pass
d-i passwd/user-uid string 1010
# Clock and time zone setup --------------------------------------------
d-i clock-setup/utc boolean true
d-i time/zone string Europe/Madrid
d-i clock-setup/ntp boolean true
# Partitioning ----------------------------------------------------------
d-i partman-auto/disk string /dev/sda
d-i partman-auto/method string lvm
d-i partman-lvm/device_remove_lvm boolean true
d-i partman-auto/expert_recipe string \
boot-root :: \
40 300 300 ext4 \
$primary{ } \
$bootable{ } \
method{ format } format{ } \
use_filesystem{ } filesystem{ ext4 } \
mountpoint{ /boot } \
. \
2000 10000 100000000 ext4 \ \
$primary{ } \
method{ lvm } \
device{ /dev/sda} \
vg_name{ vg-root } \
. \
2000 10000 100000000 ext4 \
$lvmok{ } \
in_vg{ vg-root } \
lv_name{ lv-root } \
method{ format } format{ } \
use_filesystem{ } filesystem{ ext4 } \
mountpoint{ / } \
. \
d-i partman-partitioning/confirm_write_new_label boolean true
d-i partman/choose_partition select finish
d-i partman/confirm boolean true
d-i partman/confirm_nooverwrite boolean true
d-i partman-lvm/confirm boolean true
d-i partman-lvm/confirm_nooverwrite boolean true
# Package selection -------------------------------------------------------------------
tasksel tasksel/first multiselect standard
# Additional packages ------------------------------------------------------------------
d-i pkgsel/include string console-setup console-data openssh-server
# Custom config ------------------------------------------------------------------------
d-i preseed/late_command string \
cp install.sh /target/root/install.sh; \
in-target apt update -y; \
in-target apt install -y sudo; \
in-target usermod -aG sudo kub; \
in-target chmod +x /root/install.sh; \
in-target sh -c /root/install.sh;
# Boot loader installation ----------------------------------------------------
# Install grub in the first device (assuming it is not a USB stick)
d-i grub-installer/only_debian boolean true
d-i grub-installer/with_other_os boolean true
d-i grub-installer/bootdev string default
# Finishing up the installation -----------------------------------------------
d-i finish-install/reboot_in_progress note
````
### Operating system and Environment details
proxmox:
pve-manager/6.2-4/9824574a
kernel version: Linux 5.4.34-1-pve #1 SMP PVE 5.4.34-2
ISO:
debian-9.13.0-amd64-netinst.iso
### Log Fragments and crash.log files
$ packer build -debug -var-file=config.json debian-9.13.json
==> proxmox: Pausing after run of step 'StepDownload'. Press enter to continue.
==> proxmox: Pausing after run of step 'stepUploadISO'. Press enter to continue.
==> proxmox: Pausing after run of step 'stepUploadAdditionalISOs'. Press enter to continue.
==> proxmox: Creating VM
==> proxmox: Starting VM
==> proxmox: Pausing after run of step 'stepStartVM'. Press enter to continue.
==> proxmox: Starting HTTP server on port 8605
==> proxmox: Pausing after run of step 'StepHTTPServer'. Press enter to continue.
==> proxmox: Waiting 10s for boot
==> proxmox: Typing the boot command
==> proxmox: Pausing after run of step 'stepTypeBootCommand'. Press enter to continue.
==> proxmox: Waiting for SSH to become available...
Cancelling build after receiving interrupt
==> proxmox: Pausing before cleanup of step 'stepTypeBootCommand'. Press enter to continue.
==> proxmox: Pausing before cleanup of step 'StepHTTPServer'. Press enter to continue.
==> proxmox: Pausing before cleanup of step 'stepStartVM'. Press enter to continue.
==> proxmox: Stopping VM
==> proxmox: Deleting VM
==> proxmox: Pausing before cleanup of step 'stepUploadAdditionalISOs'. Press enter to continue.
==> proxmox: Pausing before cleanup of step 'stepUploadISO'. Press enter to continue.
==> proxmox: Pausing before cleanup of step 'StepDownload'. Press enter to continue.
Build 'proxmox' errored after 2 minutes 52 seconds: build was cancelled
|
1.0
|
Proxmox 6, vm Debian 9, problem with the preseed.cfg file, when we get to the storage configuration it does not continue with the installation, the screen remains blue and does not continue - When filing a bug, please include the following headings if possible. Any
example text in this template can be deleted.
#### Overview of the Issue
Before we start, I apologize for my English.
When the packer reaches the boot_command and uploads the preseed.cfg file this file works the problem is when we get to the step to configure the storage, at this moment the process stops and does not continue, we only see the blue screen.
#### Reproduction Steps
I have a 6.2-4 version of the proxmox server where I run virtual machines and I would like to have my custom ISO.
To create ISO I have two files, a configuration file and the packer file.
I launch the command from my desktop:
$ packer build -var-file = config.json debian-9.13.json
proxmox: the output will be in this color.
==> proxmox: Creating VM
==> proxmox: Starting VM
==> proxmox: starting the HTTP server on port 8902
==> proxmox: Waiting 10 seconds for boot
==> proxmox: typing boot command
==> proxmox: Waiting for SSH to be available ...
It loads the proseed.cfg file without problems and runs the whole process until it reaches the storage configuration. It stops here and does not continue, it reaches the timeout.
### Packer version
From packer v1.6.5
### Simplified Packer Buildfile
debian-9.13.json
````
{
"builders": [
{
.....
"disks": [
{
"type": "virtio",
"disk_size": "{{ user `disk_size`}}",
"storage_pool": "{{user `datastore`}}",
"storage_pool_type": "{{user `datastore_type`}}"
}
],
......
"iso_file": "{{user `iso`}}",
"http_directory": "http",
"template_description": "{{ user `template_description` }}",
"boot_wait": "10s",
"boot_command": [
"{{ user `boot_command_prefix` }}",
"install <wait>",
"preseed/url=http://{{ .HTTPIP }}:{{ .HTTPPort }}/preseed.cfg<wait>",
"debian-installer=en_US.UTF-8 <wait>",
"auto <wait>",
"locale=en_US.UTF-8 <wait>",
"kbd-chooser/method=es <wait>",
"keyboard-configuration/xkb-keymap=es <wait>",
"netcfg/get_hostname=node0 <wait>",
"netcfg/get_domain=test.lan <wait>",
"fb=false <wait>",
"debconf/frontend=noninteractive <wait>",
"console-setup/ask_detect=false <wait>",
"console-keymaps-at/keymap=es <wait>",
"grub-installer/bootdev=/dev/sda <wait>",
"<enter><wait>"
]
}
],
...
}
````
Vars file: config.json
-------------------------
````
{
"template_description": "debian 9.13, generated by packer on {{ isotime \"2020-01-02T15:04:05Z\" }}",
"hostname": "node0",
"local_domain": "internal.test",
"vmid": "400",
"locale": "es_ES",
"cores": "1",
"sockets": "1",
"memory": "2048",
"disk_size": "50G",
"datastore": "local-lvm",
"datastore_type": "lvm",
"iso": "local:iso/debian-9.13.0-amd64-netinst.iso",
"boot_command_prefix": "<esc><wait>",
"preseed_file": "preseed.cfg"
}
````
proseed.cfg
---------------
````
#Early
d-i partman/early_command string \
echo "Starting install" \
sleep 60
# Localization ----------------------------------------------------------
# d-i debian-installer/language string en
# d-i debian-installer/country string ES
# d-i debian-installer/locale string en_GB.UTF-8
# Keymap & Console ------------------------------------------------------
# d-i keyboard-configuration/xkb-keymap select es
# Network ---------------------------------------------------------------
d-i netcfg/enable boolean true
d-i netcfg/choose_interface select auto
d-i netcfg/dhcp_failed note
d-i netcfg/dhcp_options select Configure network manually
# Mirror settings ------------------------------------------------------
d-i mirror/country string manual
d-i mirror/http/hostname string ftp.es.debian.org
d-i mirror/http/directory string /debian/
d-i mirror/http/proxy string
# Root password ---------------------------------------------------------
d-i passwd/root-password password user
d-i passwd/root-password-again password pass
# user account ----------------------------------------------------------
d-i passwd/user-fullname string user1
d-i passwd/username string user1
d-i passwd/user-password password pass
d-i passwd/user-password-again password pass
d-i passwd/user-uid string 1010
# Clock and time zone setup --------------------------------------------
d-i clock-setup/utc boolean true
d-i time/zone string Europe/Madrid
d-i clock-setup/ntp boolean true
# Partitioning ----------------------------------------------------------
d-i partman-auto/disk string /dev/sda
d-i partman-auto/method string lvm
d-i partman-lvm/device_remove_lvm boolean true
d-i partman-auto/expert_recipe string \
boot-root :: \
40 300 300 ext4 \
$primary{ } \
$bootable{ } \
method{ format } format{ } \
use_filesystem{ } filesystem{ ext4 } \
mountpoint{ /boot } \
. \
2000 10000 100000000 ext4 \ \
$primary{ } \
method{ lvm } \
device{ /dev/sda} \
vg_name{ vg-root } \
. \
2000 10000 100000000 ext4 \
$lvmok{ } \
in_vg{ vg-root } \
lv_name{ lv-root } \
method{ format } format{ } \
use_filesystem{ } filesystem{ ext4 } \
mountpoint{ / } \
. \
d-i partman-partitioning/confirm_write_new_label boolean true
d-i partman/choose_partition select finish
d-i partman/confirm boolean true
d-i partman/confirm_nooverwrite boolean true
d-i partman-lvm/confirm boolean true
d-i partman-lvm/confirm_nooverwrite boolean true
# Package selection -------------------------------------------------------------------
tasksel tasksel/first multiselect standard
# Additional packages ------------------------------------------------------------------
d-i pkgsel/include string console-setup console-data openssh-server
# Custom config ------------------------------------------------------------------------
d-i preseed/late_command string \
cp install.sh /target/root/install.sh; \
in-target apt update -y; \
in-target apt install -y sudo; \
in-target usermod -aG sudo kub; \
in-target chmod +x /root/install.sh; \
in-target sh -c /root/install.sh;
# Boot loader installation ----------------------------------------------------
# Install grub in the first device (assuming it is not a USB stick)
d-i grub-installer/only_debian boolean true
d-i grub-installer/with_other_os boolean true
d-i grub-installer/bootdev string default
# Finishing up the installation -----------------------------------------------
d-i finish-install/reboot_in_progress note
````
### Operating system and Environment details
proxmox:
pve-manager/6.2-4/9824574a
kernel version: Linux 5.4.34-1-pve #1 SMP PVE 5.4.34-2
ISO:
debian-9.13.0-amd64-netinst.iso
### Log Fragments and crash.log files
$ packer build -debug -var-file=config.json debian-9.13.json
==> proxmox: Pausing after run of step 'StepDownload'. Press enter to continue.
==> proxmox: Pausing after run of step 'stepUploadISO'. Press enter to continue.
==> proxmox: Pausing after run of step 'stepUploadAdditionalISOs'. Press enter to continue.
==> proxmox: Creating VM
==> proxmox: Starting VM
==> proxmox: Pausing after run of step 'stepStartVM'. Press enter to continue.
==> proxmox: Starting HTTP server on port 8605
==> proxmox: Pausing after run of step 'StepHTTPServer'. Press enter to continue.
==> proxmox: Waiting 10s for boot
==> proxmox: Typing the boot command
==> proxmox: Pausing after run of step 'stepTypeBootCommand'. Press enter to continue.
==> proxmox: Waiting for SSH to become available...
Cancelling build after receiving interrupt
==> proxmox: Pausing before cleanup of step 'stepTypeBootCommand'. Press enter to continue.
==> proxmox: Pausing before cleanup of step 'StepHTTPServer'. Press enter to continue.
==> proxmox: Pausing before cleanup of step 'stepStartVM'. Press enter to continue.
==> proxmox: Stopping VM
==> proxmox: Deleting VM
==> proxmox: Pausing before cleanup of step 'stepUploadAdditionalISOs'. Press enter to continue.
==> proxmox: Pausing before cleanup of step 'stepUploadISO'. Press enter to continue.
==> proxmox: Pausing before cleanup of step 'StepDownload'. Press enter to continue.
Build 'proxmox' errored after 2 minutes 52 seconds: build was cancelled
|
non_design
|
proxmox vm debian problem with the preseed cfg file when we get to the storage configuration it does not continue with the installation the screen remains blue and does not continue when filing a bug please include the following headings if possible any example text in this template can be deleted overview of the issue before we start i apologize for my english when the packer reaches the boot command and uploads the preseed cfg file this file works the problem is when we get to the step to configure the storage at this moment the process stops and does not continue we only see the blue screen reproduction steps i have a version of the proxmox server where i run virtual machines and i would like to have my custom iso to create iso i have two files a configuration file and the packer file i launch the command from my desktop packer build var file config json debian json proxmox the output will be in this color proxmox creating vm proxmox starting vm proxmox starting the http server on port proxmox waiting seconds for boot proxmox typing boot command proxmox waiting for ssh to be available it loads the proseed cfg file without problems and runs the whole process until it reaches the storage configuration it stops here and does not continue it reaches the timeout packer version from packer simplified packer buildfile debian json builders disks type virtio disk size user disk size storage pool user datastore storage pool type user datastore type iso file user iso http directory http template description user template description boot wait boot command user boot command prefix install preseed url httpip httpport preseed cfg debian installer en us utf auto locale en us utf kbd chooser method es keyboard configuration xkb keymap es netcfg get hostname netcfg get domain test lan fb false debconf frontend noninteractive console setup ask detect false console keymaps at keymap es grub installer bootdev dev sda vars file config json template description debian generated by 
packer on isotime hostname local domain internal test vmid locale es es cores sockets memory disk size datastore local lvm datastore type lvm iso local iso debian netinst iso boot command prefix preseed file preseed cfg proseed cfg early d i partman early command string echo starting install sleep localization d i debian installer language string en d i debian installer country string es d i debian installer locale string en gb utf keymap console d i keyboard configuration xkb keymap select es network d i netcfg enable boolean true d i netcfg choose interface select auto d i netcfg dhcp failed note d i netcfg dhcp options select configure network manually mirror settings d i mirror country string manual d i mirror http hostname string ftp es debian org d i mirror http directory string debian d i mirror http proxy string root password d i passwd root password password user d i passwd root password again password pass user account d i passwd user fullname string d i passwd username string d i passwd user password password pass d i passwd user password again password pass d i passwd user uid string clock and time zone setup d i clock setup utc boolean true d i time zone string europe madrid d i clock setup ntp boolean true partitioning d i partman auto disk string dev sda d i partman auto method string lvm d i partman lvm device remove lvm boolean true d i partman auto expert recipe string boot root primary bootable method format format use filesystem filesystem mountpoint boot primary method lvm device dev sda vg name vg root lvmok in vg vg root lv name lv root method format format use filesystem filesystem mountpoint d i partman partitioning confirm write new label boolean true d i partman choose partition select finish d i partman confirm boolean true d i partman confirm nooverwrite boolean true d i partman lvm confirm boolean true d i partman lvm confirm nooverwrite boolean true package selection tasksel tasksel first multiselect standard additional packages d i 
pkgsel include string console setup console data openssh server custom config d i preseed late command string cp install sh target root install sh in target apt update y in target apt install y sudo in target usermod ag sudo kub in target chmod x root install sh in target sh c root install sh boot loader installation install grub in the first device assuming it is not a usb stick d i grub installer only debian boolean true d i grub installer with other os boolean true d i grub installer bootdev string default finishing up the installation d i finish install reboot in progress note operating system and environment details proxmox pve manager kernel version linux pve smp pve iso debian netinst iso log fragments and crash log files packer build debug var file config json debian json proxmox pausing after run of step stepdownload press enter to continue proxmox pausing after run of step stepuploadiso press enter to continue proxmox pausing after run of step stepuploadadditionalisos press enter to continue proxmox creating vm proxmox starting vm proxmox pausing after run of step stepstartvm press enter to continue proxmox starting http server on port proxmox pausing after run of step stephttpserver press enter to continue proxmox waiting for boot proxmox typing the boot command proxmox pausing after run of step steptypebootcommand press enter to continue proxmox waiting for ssh to become available cancelling build after receiving interrupt proxmox pausing before cleanup of step steptypebootcommand press enter to continue proxmox pausing before cleanup of step stephttpserver press enter to continue proxmox pausing before cleanup of step stepstartvm press enter to continue proxmox stopping vm proxmox deleting vm proxmox pausing before cleanup of step stepuploadadditionalisos press enter to continue proxmox pausing before cleanup of step stepuploadiso press enter to continue proxmox pausing before cleanup of step stepdownload press enter to continue build proxmox errored 
after minutes seconds build was cancelled
| 0
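One hedged observation on the Packer record above: the `expert_recipe` line `2000 10000 100000000 ext4 \ \` contains a doubled line-continuation backslash, and slips like that (or stray whitespace after a trailing backslash) are a plausible reason for partman to stall exactly at the storage step. A small lint sketch for spotting such continuation slips in a preseed file (this is a diagnostic aid written for this issue, not a debian-installer tool):

```python
# Sketch: flag suspicious line continuations in a Debian preseed file,
# such as a doubled "\ \" or whitespace after the trailing backslash,
# both of which can break d-i's multi-line value parsing.
import re
import sys

def lint(path: str) -> list:
    problems = []
    with open(path) as fh:
        for lineno, line in enumerate(fh, start=1):
            stripped = line.rstrip("\n")
            # backslash, whitespace, backslash at end of line
            if re.search(r"\\\s+\\\s*$", stripped):
                problems.append((lineno, "doubled line continuation"))
            # trailing whitespace after the continuation backslash
            elif stripped.endswith(" ") and stripped.rstrip().endswith("\\"):
                problems.append((lineno, "whitespace after continuation backslash"))
    return problems

if __name__ == "__main__":
    for lineno, msg in lint(sys.argv[1]):
        print(f"{sys.argv[1]}:{lineno}: {msg}")
```

Running it over the quoted preseed would flag the `expert_recipe` line; whether that is the actual root cause of the hang is an assumption, but it is a cheap thing to rule out before digging into the Packer/Proxmox side.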
|
390,617
| 11,551,085,441
|
IssuesEvent
|
2020-02-19 00:19:17
|
googleapis/java-trace
|
https://api.github.com/repos/googleapis/java-trace
|
closed
|
Synthesis failed for java-trace
|
api: cloudtrace autosynth failure priority: p1 type: bug
|
Hello! Autosynth couldn't regenerate java-trace. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to branch 'autosynth'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--']
synthtool > Executing /tmpfs/src/git/autosynth/working_repo/synth.py.
On branch autosynth
nothing to commit, working tree clean
HEAD detached at FETCH_HEAD
nothing to commit, working tree clean
synthtool > Ensuring dependencies.
synthtool > Pulling artman image.
latest: Pulling from googleapis/artman
Digest: sha256:6aec9c34db0e4be221cdaf6faba27bdc07cfea846808b3d3b964dfce3a9a0f9b
Status: Image is up to date for googleapis/artman:latest
synthtool > Cloning googleapis.
synthtool > Running generator for google/devtools/cloudtrace/artman_cloudtrace_v1.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/GetTraceRequest.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/ListTracesResponseOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/PatchTracesRequestOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/ListTracesResponse.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/Trace.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/TraceOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/ListTracesRequest.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/Traces.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/TraceProto.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/TraceSpan.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/TracesOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/PatchTracesRequest.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/TraceSpanOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/GetTraceRequestOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/ListTracesRequestOrBuilder.java.
synthtool > No replacements made in [PosixPath('/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/**/*Name.java'), PosixPath('/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v1/src/**/*Names.java')] for pattern /\*
\* Copyright \d{4} Google LLC
\*
\* Licensed under the Apache License, Version 2.0 \(the "License"\); you may not use this file except
\* in compliance with the License. You may obtain a copy of the License at
\*
\* http://www.apache.org/licenses/LICENSE-2.0
\*
\* Unless required by applicable law or agreed to in writing, software distributed under the License
\* is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
\* or implied. See the License for the specific language governing permissions and limitations under
\* the License.
\*/
, maybe replacement is not longer needed?
synthtool > Replaced 'package com.google.devtools.cloudtrace.v1;' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/grpc-google-cloud-trace-v1/src/main/java/com/google/devtools/cloudtrace/v1/TraceServiceGrpc.java.
synthtool > No files in sources [PosixPath('/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/gapic-google-cloud-trace-v1/samples/src')] were copied. Does the source contain files?
synthtool > No files in sources [PosixPath('/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/gapic-google-cloud-trace-v1/samples/resources')] were copied. Does the source contain files?
synthtool > No files in sources [PosixPath('/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/gapic-google-cloud-trace-v1/samples/src/**/*.manifest.yaml')] were copied. Does the source contain files?
synthtool > Running java formatter on 26 files
synthtool > Running java formatter on 1 files
synthtool > Running java formatter on 15 files
synthtool > Running java formatter on 0 files
synthtool > Running generator for google/devtools/cloudtrace/artman_cloudtrace_v2.yaml.
synthtool > Generated code into /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/ModuleOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/StackTraceOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/Span.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/BatchWriteSpansRequest.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/AttributeValueOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/TracingProto.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/BatchWriteSpansRequestOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/TruncatableStringOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/TraceProto.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/Module.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/StackTrace.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/AttributeValue.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/SpanOrBuilder.java.
synthtool > Replaced '// Generated by the protocol buffer compiler. DO NOT EDIT!' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/TruncatableString.java.
synthtool > Replaced '/\\*\n \\* Copyright \\d{4} Google LLC\n \\*\n \\* Licensed under the Apache License, Version 2.0 \\(the "License"\\); you may not use this file except\n \\* in compliance with the License. You may obtain a copy of the License at\n \\*\n \\* http://www.apache.org/licenses/LICENSE-2.0\n \\*\n \\* Unless required by applicable law or agreed to in writing, software distributed under the License\n \\* is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express\n \\* or implied. See the License for the specific language governing permissions and limitations under\n \\* the License.\n \\*/\n' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/SpanName.java.
synthtool > Replaced '/\\*\n \\* Copyright \\d{4} Google LLC\n \\*\n \\* Licensed under the Apache License, Version 2.0 \\(the "License"\\); you may not use this file except\n \\* in compliance with the License. You may obtain a copy of the License at\n \\*\n \\* http://www.apache.org/licenses/LICENSE-2.0\n \\*\n \\* Unless required by applicable law or agreed to in writing, software distributed under the License\n \\* is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express\n \\* or implied. See the License for the specific language governing permissions and limitations under\n \\* the License.\n \\*/\n' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/proto-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/ProjectName.java.
synthtool > Replaced 'package com.google.devtools.cloudtrace.v2;' in /home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/grpc-google-cloud-trace-v2/src/main/java/com/google/devtools/cloudtrace/v2/TraceServiceGrpc.java.
synthtool > No files in sources [PosixPath('/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/gapic-google-cloud-trace-v2/samples/src')] were copied. Does the source contain files?
synthtool > No files in sources [PosixPath('/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/gapic-google-cloud-trace-v2/samples/resources')] were copied. Does the source contain files?
synthtool > No files in sources [PosixPath('/home/kbuilder/.cache/synthtool/googleapis/artman-genfiles/java/gapic-google-cloud-trace-v2/samples/src/**/*.manifest.yaml')] were copied. Does the source contain files?
synthtool > Running java formatter on 26 files
synthtool > Running java formatter on 1 files
synthtool > Running java formatter on 16 files
synthtool > Running java formatter on 0 files
synthtool > Wrote metadata to synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/synth.py", line 35, in <module>
java.common_templates()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/languages/java.py", line 349, in common_templates
templates = gcp.CommonTemplates().java_library(**kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/common.py", line 75, in java_library
return self._generic_library("java_library", **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/common.py", line 43, in _generic_library
if not kwargs["metadata"]["samples"]:
KeyError: 'samples'
Synthesis failed
```
Google internal developers can see the full log [here](https://sponge/7769bc45-4de7-4c4e-8e43-b07e6b6eba12).
|
1.0
|
|
non_design
|
cache synthtool googleapis artman genfiles java proto google cloud trace src main java com google devtools cloudtrace projectname java synthtool replaced package com google devtools cloudtrace in home kbuilder cache synthtool googleapis artman genfiles java grpc google cloud trace src main java com google devtools cloudtrace traceservicegrpc java synthtool no files in sources were copied does the source contain files synthtool no files in sources were copied does the source contain files synthtool no files in sources were copied does the source contain files synthtool running java formatter on files synthtool running java formatter on files synthtool running java formatter on files synthtool running java formatter on files synthtool wrote metadata to synth metadata traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo synth py line in java common templates file tmpfs src git autosynth env lib site packages synthtool languages java py line in common templates templates gcp commontemplates java library kwargs file tmpfs src git autosynth env lib site packages synthtool gcp common 
py line in java library return self generic library java library kwargs file tmpfs src git autosynth env lib site packages synthtool gcp common py line in generic library if not kwargs keyerror samples synthesis failed google internal developers can see the full log
| 0
|
57,989
| 7,111,057,592
|
IssuesEvent
|
2018-01-17 12:59:57
|
pints-team/pints
|
https://api.github.com/repos/pints-team/pints
|
opened
|
Add documentation for contributors
|
design-and-infrastructure documentation
|
Things it should describe:
- [ ] Installation with `pip install -e .[dev,docs,extras]`
- [ ] Python stuff? Maybe [pep8](https://www.python.org/dev/peps/pep-0008/) and flake8.
- [ ] Running tests before submitting
- [ ] Writing tests
- [ ] Writing docs (rst) and building the docs (see #107)
- [ ] Creating branches and using PRs
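A minimal sketch of the workflow the checklist describes; the package name, extras names, and test layout are assumptions, so treat the commands as illustrative rather than the project's actual setup:

```shell
# Illustrative contributor workflow (names assumed, run from a repo checkout)
git checkout -b my-feature          # work on a branch, open a PR when done
pip install -e .[dev,docs]          # editable install with dev extras
flake8 pints                        # PEP 8 style check via flake8
python -m unittest discover -v      # run the tests before pushing
```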
|
1.0
|
Add documentation for contributors - Things it should describe:
- [ ] Installation with `pip install -e .[dev,docs,extras]`
- [ ] Python stuff? Maybe [pep8](https://www.python.org/dev/peps/pep-0008/) and flake8.
- [ ] Running tests before submitting
- [ ] Writing tests
- [ ] Writing docs (rst) and building the docs (see #107)
- [ ] Creating branches and using PRs
|
design
|
add documentation for contributors things it should describe installation with pip e python stuff maybe and running tests before writing tests writing docs rst and building the docs see creating branches and using prs
| 1
|
33,359
| 6,198,561,969
|
IssuesEvent
|
2017-07-05 19:27:41
|
stylelint/stylelint
|
https://api.github.com/repos/stylelint/stylelint
|
closed
|
Fix missing closing brace in example inside max-nesting-depth documentation
|
status: help wanted type: documentation
|
A closing brace is missing in the first example inside [max-nesting-depth documentation](https://github.com/stylelint/stylelint/blob/master/lib/rules/max-nesting-depth/README.md):
```css
a { & > b { top: 0; }
```
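With the missing brace added, the corrected example would presumably read:

```css
a { & > b { top: 0; } }
```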
|
1.0
|
Fix missing closing brace in example inside max-nesting-depth documentation - A closing brace is missing in the first example inside [max-nesting-depth documentation](https://github.com/stylelint/stylelint/blob/master/lib/rules/max-nesting-depth/README.md):
```css
a { & > b { top: 0; }
```
|
non_design
|
fix missing closing brace in example inside max nesting depth documentation a closing brace is missing in the first example inside css a b top
| 0
|
575,414
| 17,030,455,799
|
IssuesEvent
|
2021-07-04 13:01:59
|
samvdkris/dotobot
|
https://api.github.com/repos/samvdkris/dotobot
|
closed
|
Quote all formatting
|
enhancement low priority
|
Just like with the help message, !quote all could use a new lick of paint. An easy way to avoid clogging up the entire screen is to dump a .txt file, or a file in any other format whose syntax highlighting we can manipulate to make it more interesting.
Alternatively a lot of embeds can be used. Can we make an embed scrollable?
|
1.0
|
Quote all formatting - Just like with the help message, !quote all could use a new lick of paint. An easy way to avoid clogging up the entire screen is to dump a .txt file, or a file in any other format whose syntax highlighting we can manipulate to make it more interesting.
Alternatively a lot of embeds can be used. Can we make an embed scrollable?
|
non_design
|
quote all formatting just like with the help message quote all could use a new lick of paint an easy way to not clog up the entire screen is by dumping a txt file or in any other language with which we can manipulate syntax highlighting to maybe make the file more interesting alternatively a lot of embeds can be used can we make an embed scrollable
| 0
|
73,886
| 8,951,655,322
|
IssuesEvent
|
2019-01-25 14:33:58
|
webdev6/obscure-book-genres
|
https://api.github.com/repos/webdev6/obscure-book-genres
|
opened
|
Visual Design - Colour scheme
|
Design
|
- [ ] Find out colour scheme and look for web app
- [ ] Search colours
- [ ] Select colours
- [ ] Add to mood board
- [ ] Show team members and get approval
|
1.0
|
Visual Design - Colour scheme - - [ ] Find out colour scheme and look for web app
- [ ] Search colours
- [ ] Select colours
- [ ] Add to mood board
- [ ] Show team members and get approval
|
design
|
visual design colour scheme find out colour scheme and look for web app search colours select colours add to mood board show team members and get approval
| 1
|
142,444
| 21,766,658,949
|
IssuesEvent
|
2022-05-13 03:15:00
|
COS301-SE-2022/MathU-Similarity-Index
|
https://api.github.com/repos/COS301-SE-2022/MathU-Similarity-Index
|
closed
|
💄 🩹 (Prototype): Added Styling for AppBar
|
scope:design priority:medium scope:ui scope:client status:ready type:bug type:change type:chore
|
Fixed confusion between saved page and view all page
|
1.0
|
💄 🩹 (Prototype): Added Styling for AppBar - Fixed confusion between saved page and view all page
|
design
|
💄 🩹 prototype added styling for appbar fixed confusion between saved page and view all page
| 1
|
552,807
| 16,327,927,378
|
IssuesEvent
|
2021-05-12 05:10:09
|
ppy/osu
|
https://api.github.com/repos/ppy/osu
|
closed
|
Hitsample-related hard crash after switching skins during gameplay loading
|
area:skinning priority:1 ruleset:osu!mania type:reliability
|
Reproduction steps (or [15s video](https://streamable.com/jg7jbh)):
1. Play any mania beatmap.
2. Change skins during the gameplay loading screen (for the best chance of reproducing, switch from this skin https://joppe27.s-ul.eu/FUjlsZfd to this one https://joppe27.s-ul.eu/xlubncdY).
3. Quickly press some keys when gameplay starts (you can also use this replay which always reproduces this issue: [Joppe27 playing Montee - Omni (Elekton) [Everything].zip](https://github.com/ppy/osu/files/6401009/Joppe27.playing.Montee.-.Omni.Elekton.Everything.zip)).
4. An unhandled error will occur: `System.ObjectDisposedException: Can not get a channel from a disposed sample.` The game crashes.
Notice how after step 2, the hit samples do not get changed to those used by the new skin. This is especially obvious when osu! doesn't crash after doing these reproduction steps, because then the old skin hitsamples will just keep being used for the remainder (or a portion) of the beatmap. This is probably very much related to the crash I reported here.
Possibly related: #5798.
Logs: [logs.zip](https://github.com/ppy/osu/files/6401192/logs.zip)
Version: 2021.424.0.0
|
1.0
|
Hitsample-related hard crash after switching skins during gameplay loading - Reproduction steps (or [15s video](https://streamable.com/jg7jbh)):
1. Play any mania beatmap.
2. Change skins during the gameplay loading screen (for the best chance of reproducing, switch from this skin https://joppe27.s-ul.eu/FUjlsZfd to this one https://joppe27.s-ul.eu/xlubncdY).
3. Quickly press some keys when gameplay starts (you can also use this replay which always reproduces this issue: [Joppe27 playing Montee - Omni (Elekton) [Everything].zip](https://github.com/ppy/osu/files/6401009/Joppe27.playing.Montee.-.Omni.Elekton.Everything.zip)).
4. An unhandled error will occur: `System.ObjectDisposedException: Can not get a channel from a disposed sample.` The game crashes.
Notice how after step 2, the hit samples do not get changed to those used by the new skin. This is especially obvious when osu! doesn't crash after doing these reproduction steps, because then the old skin hitsamples will just keep being used for the remainder (or a portion) of the beatmap. This is probably very much related to the crash I reported here.
Possibly related: #5798.
Logs: [logs.zip](https://github.com/ppy/osu/files/6401192/logs.zip)
Version: 2021.424.0.0
|
non_design
|
hitsample related hard crash after switching skins during gameplay loading reproduction steps or play any mania beatmap change skins during the gameplay loading screen for the best chance of reproducing switch from this skin to this one quickly press some keys when gameplay starts you can also use this replay which always reproduces this issue zip an unhandled error will occur system objectdisposedexception can not get a channel from a disposed sample the game crashes notice how after step the hit samples do not get changed to those used by the new skin this is especially obvious when osu doesn t crash after doing these reproduction steps because then the old skin hitsamples will just keep being used for the remainder or a portion of the beatmap this is probably very much related to the crash i reported here possibly related logs version
| 0
|
150,569
| 23,681,582,873
|
IssuesEvent
|
2022-08-28 21:47:12
|
pulumi/pulumi
|
https://api.github.com/repos/pulumi/pulumi
|
closed
|
Providers Broken via Missing Type Name
|
kind/bug resolution/by-design impact/regression area/plugins
|
### What happened?
**TL;DR:** [This commit](https://github.com/pulumi/pulumi/pull/10435/files) has introduced a change which has broken provider codegen for at least `@pulumi/command`.
I recently checked out `@pulumi/command` to see work on codegen changes. I noticed that codegen was not working, and that the error seemed legitimate.
I checked out Pulumi v3.38.0 and confirmed that codegen works fine. I then used `git bisect` and determined that the commit above was the offending commit, which jibes with the error message from `@pulumi/command`:
```
make build
(cd provider && go build -o /Users/robbiemckinstry/workspace/pulumi/pulumi-command/bin/pulumi-gen-command -ldflags "-X github.com/pulumi/pulumi-command/provider/pkg/version.Version=0.4.2-alpha.1661516193+2d1657e1.dirty" github.com/pulumi/pulumi-command/provider/cmd/pulumi-gen-command)
# github.com/pulumi/pulumi/pkg/v3/resource/deploy/deploytest
../../pulumi/pkg/resource/deploy/deploytest/pluginhost.go:423:57: undefined: workspace.ProjectPlugin
# github.com/pulumi/pulumi/pkg/v3/engine
../../pulumi/pkg/engine/plugins.go:180:99: undefined: workspace.ProjectPlugin
make: *** [codegen] Error 2
```
It's not clear to me _why_ `deploytest` can't resolve `workspace.ProjectPlugin` in `pulumi/command`, since `workspace` is already imported and [ProjectPlugin](https://github.com/pulumi/pulumi/pull/10435/files#diff-196f5b7811e033d2523373f8138cca8c54eba850b419591283cd88a35828ac12R510-R515) seems to be defined.
### Steps to reproduce
1. Check out the offending commit, or any later commit.
2. Check out `@pulumi/command`.
3. Update `provider/go.mod` in pulumi/command to point to your local pulumi/pulumi repo.
4. Run `make ensure && make build` in `@pulumi/command`
5. Observe the error message.
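For step 3, the usual way to point a provider at a local checkout is a `replace` directive in `provider/go.mod`; the module paths and relative locations below are assumptions based on a side-by-side checkout layout, not the repo's documented setup:

```
// provider/go.mod (illustrative; adjust relative paths to your layout)
replace (
    github.com/pulumi/pulumi/pkg/v3 => ../../pulumi/pkg
    github.com/pulumi/pulumi/sdk/v3 => ../../pulumi/sdk
)
```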
### Expected Behavior
I expected `pulumi/command` to successfully codegen.
### Actual Behavior
See above error message.
### Output of `pulumi about`
```
CLI
Version 3.38.0
Go Version go1.19
Go Compiler gc
Host
OS darwin
Version 12.3
Arch arm64
Backend
Name pulumi.com
URL https://app.pulumi.com/thesnowmancometh
User thesnowmancometh
Organizations thesnowmancometh, whiterabbit, pulumi
Pulumi locates its logs in /var/folders/8n/y4s73s2d4rnbn7d85clz9pz40000gn/T/ by default
```
### Additional context
_No response_
### Contributing
Vote on this issue by adding a 👍 reaction.
To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).
|
1.0
|
Providers Broken via Missing Type Name - ### What happened?
**TL;DR:** [This commit](https://github.com/pulumi/pulumi/pull/10435/files) has introduced a change which has broken provider codegen for at least `@pulumi/command`.
I recently checked out `@pulumi/command` to see work on codegen changes. I noticed that codegen was not working, and that the error seemed legitimate.
I checked out Pulumi v3.38.0 and confirmed that codegen works fine. I then used `git bisect` and determined that the commit above was the offending commit, which jibes with the error message from `@pulumi/command`:
```
make build
(cd provider && go build -o /Users/robbiemckinstry/workspace/pulumi/pulumi-command/bin/pulumi-gen-command -ldflags "-X github.com/pulumi/pulumi-command/provider/pkg/version.Version=0.4.2-alpha.1661516193+2d1657e1.dirty" github.com/pulumi/pulumi-command/provider/cmd/pulumi-gen-command)
# github.com/pulumi/pulumi/pkg/v3/resource/deploy/deploytest
../../pulumi/pkg/resource/deploy/deploytest/pluginhost.go:423:57: undefined: workspace.ProjectPlugin
# github.com/pulumi/pulumi/pkg/v3/engine
../../pulumi/pkg/engine/plugins.go:180:99: undefined: workspace.ProjectPlugin
make: *** [codegen] Error 2
```
It's not clear to me _why_ `deploytest` can't resolve `workspace.ProjectPlugin` in `pulumi/command`, since `workspace` is already imported and [ProjectPlugin](https://github.com/pulumi/pulumi/pull/10435/files#diff-196f5b7811e033d2523373f8138cca8c54eba850b419591283cd88a35828ac12R510-R515) seems to be defined.
### Steps to reproduce
1. Check out the offending commit, or any later commit.
2. Check out `@pulumi/command`.
3. Update `provider/go.mod` in pulumi/command to point to your local pulumi/pulumi repo.
4. Run `make ensure && make build` in `@pulumi/command`
5. Observe the error message.
### Expected Behavior
I expected `pulumi/command` to successfully codegen.
### Actual Behavior
See above error message.
### Output of `pulumi about`
```
CLI
Version 3.38.0
Go Version go1.19
Go Compiler gc
Host
OS darwin
Version 12.3
Arch arm64
Backend
Name pulumi.com
URL https://app.pulumi.com/thesnowmancometh
User thesnowmancometh
Organizations thesnowmancometh, whiterabbit, pulumi
Pulumi locates its logs in /var/folders/8n/y4s73s2d4rnbn7d85clz9pz40000gn/T/ by default
```
### Additional context
_No response_
### Contributing
Vote on this issue by adding a 👍 reaction.
To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).
|
design
|
providers broken via missing type name what happened tl dr has introduced a change which has broken provider codegen for at least pulumi command i recently checked out pulumi command to see work on codegen changes i noticed that codegen was not working and that the error seemed legitimate i checked out pulumi and i confirmed that codegen works fine i then used git bisect and determined this above commit was the offending comment which jives with the error message from pulumi command make build cd provider go build o users robbiemckinstry workspace pulumi pulumi command bin pulumi gen command ldflags x github com pulumi pulumi command provider pkg version version alpha dirty github com pulumi pulumi command provider cmd pulumi gen command github com pulumi pulumi pkg resource deploy deploytest pulumi pkg resource deploy deploytest pluginhost go undefined workspace projectplugin github com pulumi pulumi pkg engine pulumi pkg engine plugins go undefined workspace projectplugin make error it s not clear to me why deploytest can t resolve workspace projectplugin in pulumi command since workspace is already imported and seems to be defined steps to reproduce check out the offending commit or any later commit check out pulumi command update provider go mod in pulumi command to point to your local pulumi pulumi repo run make ensure make build in pulumi command observe the error message expected behavior i expected pulumi command to successfully codegen actual behavior see above error message output of pulumi about cli version go version go compiler gc host os darwin version arch backend name pulumi com url user thesnowmancometh organizations thesnowmancometh whiterabbit pulumi pulumi locates its logs in var folders t by default additional context no response contributing vote on this issue by adding a 👍 reaction to contribute a fix for this issue leave a comment and link to your pull request if you ve opened one already
| 1
|
49,974
| 6,288,950,632
|
IssuesEvent
|
2017-07-19 18:07:28
|
roschaefer/story.board
|
https://api.github.com/repos/roschaefer/story.board
|
closed
|
tc Channel select: Sensorstory as a default
|
design Priority: medium User Story
|
As a reporter
I want sensorstory as the default channel
in order to save one click.
|
1.0
|
tc Channel select: Sensorstory as a default - As a reporter
I want sensorstory as the default channel
in order to save one click.
|
design
|
tc channel select sensorstory as a default as a reporter i want sensorstory as the default channel in order to save one click
| 1
|
425,422
| 12,339,994,568
|
IssuesEvent
|
2020-05-14 19:07:01
|
google/knative-gcp
|
https://api.github.com/repos/google/knative-gcp
|
closed
|
Add installation documentation for gcp broker
|
area/broker kind/doc priority/1 release/1
|
**Problem**
Add installation instructions in knative-gcp and a link from knative documentation as a pointer to alternate broker implementation.
* [x] Introduction to gcp broker and how is it different from channel based one
* [x] Installation, auth configuration: https://github.com/google/knative-gcp/pull/880
* [x] Provide a simple demo: #936
* [x] How to debug
* [x] link to main README
* [x] add a section about alternate broker and introduce gcp broker in knative docs
**[Persona:](https://github.com/knative/eventing/blob/master/docs/personas.md)**
Which persona is this feature for?
**Exit Criteria**
A measurable (binary) test that would indicate that the problem has been resolved.
**Time Estimate (optional):**
How many developer-days do you think this may take to resolve?
**Additional context (optional)**
Add any other context about the feature request here.
|
1.0
|
Add installation documentation for gcp broker - **Problem**
Add installation instructions in knative-gcp and a link from knative documentation as a pointer to alternate broker implementation.
* [x] Introduction to gcp broker and how is it different from channel based one
* [x] Installation, auth configuration: https://github.com/google/knative-gcp/pull/880
* [x] Provide a simple demo: #936
* [x] How to debug
* [x] link to main README
* [x] add a section about alternate broker and introduce gcp broker in knative docs
**[Persona:](https://github.com/knative/eventing/blob/master/docs/personas.md)**
Which persona is this feature for?
**Exit Criteria**
A measurable (binary) test that would indicate that the problem has been resolved.
**Time Estimate (optional):**
How many developer-days do you think this may take to resolve?
**Additional context (optional)**
Add any other context about the feature request here.
|
non_design
|
add installation documentation for gcp broker problem add installation instructions in knative gcp and a link from knative documentation as a pointer to alternate broker implementation introduction to gcp broker and how is it different from channel based one installation auth configuration provide a simple demo how to debug link to main readme add a section about alternate broker and introduce gcp broker in knative docs which persona is this feature for exit criteria a measurable binary test that would indicate that the problem has been resolved time estimate optional how many developer days do you think this may take to resolve additional context optional add any other context about the feature request here
| 0
|
98,680
| 12,344,814,928
|
IssuesEvent
|
2020-05-15 07:47:23
|
nishidayoshikatsu/covid19-yamaguchi
|
https://api.github.com/repos/nishidayoshikatsu/covid19-yamaguchi
|
closed
|
The font in the PNG version of the logo data is a Chinese font
|
design
|
## 改善詳細 / Details of Improvement
- The font of the PNG logo data used on the site is a Chinese font
## スクリーンショット / Screenshot
A comment pointing this out on Slack
<!-- バグであればdeveloper toolからコンソールも合わせて添付 -->
<!-- If it's a bug, attach a screenshot of the developer tool console -->

## 期待する見せ方・挙動 / Expected behavior
- Use a version of the logo corrected to an appropriate font
## 動作環境・ブラウザ / Environment
- macOS / Windows / Linux / iOS / Android
- Chrome / Safari / Firefox / Edge / Internet Explorer
|
1.0
|
The font in the PNG version of the logo data is a Chinese font - ## 改善詳細 / Details of Improvement
- The font of the PNG logo data used on the site is a Chinese font
## スクリーンショット / Screenshot
A comment pointing this out on Slack
<!-- バグであればdeveloper toolからコンソールも合わせて添付 -->
<!-- If it's a bug, attach a screenshot of the developer tool console -->

## 期待する見せ方・挙動 / Expected behavior
- Use a version of the logo corrected to an appropriate font
## 動作環境・ブラウザ / Environment
- macOS / Windows / Linux / iOS / Android
- Chrome / Safari / Firefox / Edge / Internet Explorer
|
design
|
png版のロゴデータのフォントが中文フォントになっている 改善詳細 details of improvement サイトで使用されているpng版のロゴデータのフォントが中文フォントになっている スクリーンショット screenshot slackでご指摘いただいたコメント 期待する見せ方・挙動 expected behavior 適切なフォントに修正したものを利用する 動作環境・ブラウザ environment macos windows linux ios android chrome safari firefox edge internet explorer
| 1
|
300,830
| 22,698,876,311
|
IssuesEvent
|
2022-07-05 08:51:55
|
nicolargo/glances
|
https://api.github.com/repos/nicolargo/glances
|
closed
|
Must escape argument to install modules on OS X Catalina
|
documentation
|
On a newly installed glances I tried to add the docker module, with this result:
```
~❯ pip --version && pip install glances[docker] --user
pip 22.1.2 from /usr/local/lib/python3.9/site-packages/pip (python 3.9)
zsh: no matches found: glances[docker]
```
Adding quotes fixed it for me
`
pip install 'glances[docker]' --user
`
As noted in the issue below
Try escaping the argument with quotes:
`pip install pip install 'glances[action,browser,cloud,cpuinfo,chart,docker,export,folders,gpu,ip,raid,snmp,web,wifi]' --user`
_Originally posted by @kevintraver in https://github.com/nicolargo/glances/issues/1088#issuecomment-304237552_
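The underlying cause is zsh treating the square brackets as a glob pattern; a quick way to see why quoting helps is that a quoted extras spec reaches the command literally, with no filename matching attempted (the package name here is just the one from the report):

```shell
# Quoted, the extras spec is passed through untouched; unquoted,
# zsh would try to expand [docker] as a filename pattern and fail
# with "no matches found" when nothing matches.
printf '%s\n' 'glances[docker]'
```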
|
1.0
|
Must escape argument to install modules on OS X Catalina - On a newly installed glances I tried to add the docker module, with this result:
```
~❯ pip --version && pip install glances[docker] --user
pip 22.1.2 from /usr/local/lib/python3.9/site-packages/pip (python 3.9)
zsh: no matches found: glances[docker]
```
Adding quotes fixed it for me
`
pip install 'glances[docker]' --user
`
As noted in the issue below
Try escaping the argument with quotes:
`pip install pip install 'glances[action,browser,cloud,cpuinfo,chart,docker,export,folders,gpu,ip,raid,snmp,web,wifi]' --user`
_Originally posted by @kevintraver in https://github.com/nicolargo/glances/issues/1088#issuecomment-304237552_
|
non_design
|
must escape argument to install modules on os x catalina on a newly installed glances i tried to add the docker module with this result ❯ pip version pip install glances user pip from usr local lib site packages pip python zsh no matches found glances adding quotes fixed it for me pip install glances user as noted in the issue below try escaping the argument with quotes pip install pip install glances user originally posted by kevintraver in
| 0
|
21,620
| 3,736,401,400
|
IssuesEvent
|
2016-03-08 15:50:35
|
mysociety/fixmystreet
|
https://api.github.com/repos/mysociety/fixmystreet
|
closed
|
Individual report page needs to handle display of multiple photos
|
Design Reviewing
|
Reports can have multiple photos now. This needs to be displayed properly on the individual report page.
And updates can have multiple photos too. So whatever we come up with should work in the main report body and any updates below it.
|
1.0
|
Individual report page needs to handle display of multiple photos - Reports can have multiple photos now. This needs to be displayed properly on the individual report page.
And updates can have multiple photos too. So whatever we come up with should work in the main report body and any updates below it.
|
design
|
individual report page needs to handle display of multiple photos reports can have multiple photos now this needs to be displayed properly on the individual report page and updates can have multiple photos too so whatever we come up with should work in the main report body and any updates below it
| 1
|
173,147
| 27,391,918,747
|
IssuesEvent
|
2023-02-28 16:48:53
|
tijlleenders/ZinZen
|
https://api.github.com/repos/tijlleenders/ZinZen
|
closed
|
Make design for the calendar overview
|
design
|
The calender overview can have the different feelings that have been logged over a month and additionally can provide a summary to the user
|
1.0
|
Make design for the calendar overview - The calender overview can have the different feelings that have been logged over a month and additionally can provide a summary to the user
|
design
|
make design for the calendar overview the calender overview can have the different feelings that have been logged over a month and additionally can provide a summary to the user
| 1
|
82,395
| 3,606,294,256
|
IssuesEvent
|
2016-02-04 10:36:34
|
EyeSeeTea/malariapp
|
https://api.github.com/repos/EyeSeeTea/malariapp
|
closed
|
Validate before creating a never-assessment planning survey.
|
high priority
|
When we remove a deleted survey, it will be created as a never-assessment survey without checking whether one already exists.
|
1.0
|
Validate before creating a never-assessment planning survey. - When we remove a deleted survey, it will be created as a never-assessment survey without checking whether one already exists.
|
non_design
|
validate before create never assessment planning survey when we remove a deleted survey it will be created as never assessment without check if exist
| 0
|
76,421
| 7,527,901,937
|
IssuesEvent
|
2018-04-13 18:43:59
|
istio/istio
|
https://api.github.com/repos/istio/istio
|
closed
|
pilot: glide Failures and private IPs
|
area/networking area/test and release
|
I seem to be running into errors trying to run glide update. Also for the main repo we should never have anything pulling from private IPs like 192.30.255.112. This is not only a bug but a security issue. Please fix ASAP. It's holding up development on multi-zone hybrid for the Dec release. We have a tight deadline.
Summarily assigning this issue to
- ldemailly who is shepherding Istio releases atm
- costinm because the error seems to be hinting at some nested dependency.
Please redirect as necessary.
============================================================
[ERROR] Update failed for github.com/googleapis/googleapis: Unable to get repository: Cloning into '/home/XXXXXX/.glide/cache/src/git-github.com-costinm-googleapis.git'...
Warning: Permanently added 'github.com,192.30.255.112' (RSA) to the list of known hosts.
Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
: exit status 128
[ERROR] Failed to do initial checkout of config: Unable to get repository: Cloning into '/home/XXXXXX/.glide/cache/src/git-github.com-costinm-googleapis.git'...
Warning: Permanently added 'github.com,192.30.255.112' (RSA) to the list of known hosts.
Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
: exit status 128
|
1.0
|
pilot: glide Failures and private IPs - I seem to be running into errors trying to run glide update. Also for the main repo we should never have anything pulling from private IPs like 192.30.255.112. This is not only a bug but a security issue. Please fix ASAP. It's holding up development on multi-zone hybrid for the Dec release. We have a tight deadline.
Summarily assigning this issue to
- ldemailly who is shepherding Istio releases atm
- costinm because the error seems to be hinting at some nested dependency.
Please redirect as necessary.
============================================================
[ERROR] Update failed for github.com/googleapis/googleapis: Unable to get repository: Cloning into '/home/XXXXXX/.glide/cache/src/git-github.com-costinm-googleapis.git'...
Warning: Permanently added 'github.com,192.30.255.112' (RSA) to the list of known hosts.
Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
: exit status 128
[ERROR] Failed to do initial checkout of config: Unable to get repository: Cloning into '/home/XXXXXX/.glide/cache/src/git-github.com-costinm-googleapis.git'...
Warning: Permanently added 'github.com,192.30.255.112' (RSA) to the list of known hosts.
Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
: exit status 128
|
non_design
|
pilot glide failures and private ips i seem to be running into errors trying to run glide update also for the main repo we should never have anything pulling from private ips like this is not only a bug but a security issue please fix asap it s holding up development on multi zone hybrid for the dec release we have a tight deadline summarly assigning this issue to ldemailly who is shepherding istio releases atm costinm because we the error seems to be hinting at some nested dependency please redirect as necessary update failed for github com googleapis googleapis unable to get repository cloning into home xxxxxx glide cache src git github com costinm googleapis git warning permanently added github com rsa to the list of known hosts permission denied publickey fatal could not read from remote repository please make sure you have the correct access rights and the repository exists exit status failed to do initial checkout of config unable to get repository cloning into home xxxxxx glide cache src git github com costinm googleapis git warning permanently added github com rsa to the list of known hosts permission denied publickey fatal could not read from remote repository please make sure you have the correct access rights and the repository exists exit status
| 0
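A common workaround for SSH publickey failures like the glide error in the record above is to rewrite SSH remotes to HTTPS (git supports this natively via `git config --global url."https://github.com/".insteadOf "git@github.com:"`). A minimal sketch of the URL rewrite itself, in Python — the function name `ssh_to_https` is illustrative and not part of glide or git:

```python
def ssh_to_https(url: str) -> str:
    """Rewrite an scp-style SSH git URL (git@host:org/repo.git) to HTTPS.

    URLs that are not in SSH scp-like form are returned unchanged.
    """
    prefix = "git@"
    if url.startswith(prefix) and ":" in url[len(prefix):]:
        # host comes before the first colon, repo path after it
        host, path = url[len(prefix):].split(":", 1)
        return f"https://{host}/{path}"
    return url
```

HTTPS clones avoid the publickey check entirely, which sidesteps the `Permission denied (publickey)` failure shown in the log.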
|
734,367
| 25,346,367,509
|
IssuesEvent
|
2022-11-19 08:34:19
|
googleapis/nodejs-ai-platform
|
https://api.github.com/repos/googleapis/nodejs-ai-platform
|
opened
|
AI platform predict tabular regression: should make predictions using the tabular regression model failed
|
type: bug priority: p1 flakybot: issue
|
Note: #443 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 04f7c858217f1a3ce7b1072c7bf8946d39947532
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/2c1b5aae-a46c-4a7a-a474-79da01e686eb), [Sponge](http://sponge2/2c1b5aae-a46c-4a7a-a474-79da01e686eb)
status: failed
<details><summary>Test output</summary><br><pre>Command failed: node ./predict-tabular-regression.js 1014154341088493568 undefined us-central1
7 PERMISSION_DENIED: Permission denied: Consumer 'project:undefined' has been suspended.
Error: Command failed: node ./predict-tabular-regression.js 1014154341088493568 undefined us-central1
7 PERMISSION_DENIED: Permission denied: Consumer 'project:undefined' has been suspended.
at checkExecSyncError (child_process.js:635:11)
at Object.execSync (child_process.js:671:15)
at execSync (test/predict-tabular-regression.test.js:24:28)
at Context.<anonymous> (test/predict-tabular-regression.test.js:33:20)
at processImmediate (internal/timers.js:461:21)</pre></details>
|
1.0
|
AI platform predict tabular regression: should make predictions using the tabular regression model failed - Note: #443 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
----
commit: 04f7c858217f1a3ce7b1072c7bf8946d39947532
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/2c1b5aae-a46c-4a7a-a474-79da01e686eb), [Sponge](http://sponge2/2c1b5aae-a46c-4a7a-a474-79da01e686eb)
status: failed
<details><summary>Test output</summary><br><pre>Command failed: node ./predict-tabular-regression.js 1014154341088493568 undefined us-central1
7 PERMISSION_DENIED: Permission denied: Consumer 'project:undefined' has been suspended.
Error: Command failed: node ./predict-tabular-regression.js 1014154341088493568 undefined us-central1
7 PERMISSION_DENIED: Permission denied: Consumer 'project:undefined' has been suspended.
at checkExecSyncError (child_process.js:635:11)
at Object.execSync (child_process.js:671:15)
at execSync (test/predict-tabular-regression.test.js:24:28)
at Context.<anonymous> (test/predict-tabular-regression.test.js:33:20)
at processImmediate (internal/timers.js:461:21)</pre></details>
|
non_design
|
ai platform predict tabular regression should make predictions using the tabular regression model failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output command failed node predict tabular regression js undefined us permission denied permission denied consumer project undefined has been suspended error command failed node predict tabular regression js undefined us permission denied permission denied consumer project undefined has been suspended at checkexecsyncerror child process js at object execsync child process js at execsync test predict tabular regression test js at context test predict tabular regression test js at processimmediate internal timers js
| 0
|
158,563
| 24,855,306,672
|
IssuesEvent
|
2022-10-27 01:18:17
|
department-of-veterans-affairs/va.gov-team
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
|
opened
|
[DRAFT][Bug] My VA - Benefit payments - Recent direct-deposit - Data anomalies
|
bug backend design authenticated-experience dashboard front end my-va-payment-info
|
**[DRAFT]** Will finalize ticket details once all test-runs are completed.
## What happened?
On Staging **My VA** page, **Benefit payments** section, for User with recent direct-deposit payment, display of payment info appeared anomalous, as did related info on **Your VA payments** page:
- My VA page:
- Zero amount - not sure why a zero-amount payment was displayed and why such payment would be sent from API in the first place.
- "Deposited" - mismatch with Your VA payments page & API-data [paymentMethod was null]. Should front-end have made an apparent "judgment call" and displayed this instead of "Check mailed" or just displayed dashes?
- Your VA payments page:
- "Direct Deposit" not displayed - mismatch with My VA page
- Missing Bank & Account info - something to do with zero amount?
**NOTES**:
- This _might_ be just a Staging API mock-data error, but it's definitely worth investigating.
- I've added design, frontend, and backend practice-area labels. It appears some team-wide decision needs to be made on how this potential bug should be fixed.
## Specs:
- Device: [device-agnostic]
- Browser: [browser-agnostic]
- Test User: 36 (Wesley) — has recent payment via direct deposit
### Screenshots
**My VA** page:

**Your VA payments** page:

## Steps to Reproduce
- URL: https://staging.va.gov/my-va/
- Test User _(if applicable)_: 36 (Wesley)
1. On **My VA** page, in **Benefit payments** section, in **payment card**, note **+$0.00** amount and **Deposited on**. [API data said null for paymentMethod.]
2. In **payment card**, click **View your payment history** link.
3. On **Your VA payments** page, in **Payments you received** section, in **1st table row**, note **$0.00** amount and **---** in **Method**, **Bank**, and **Account** columns.
## Desired behavior
Sorry, this is a tricky one. I only have questions:
- Do zero-amount payments really occur?
- If they do occur:
- Would they really be missing Method, Bank, or Account data?
- Would Veterans find zero-amount payments useful? Should front-end display them at all?
- If they should be displayed:
- If null paymentMethod comes in, should
- Would the dashes in payment details cause Veterans confusion/concern?
[Feel free to schedule a quick Zoom chat with me as desired/needed.]
## Acceptance Criteria
[need discussion if this is really a bug]
## How to configure this issue
- [ ] **Attached to a Milestone** (when will this be completed?)
- [ ] **Attached to Epic** (what body of work is this a part of? possibly `Ongoing Maintenance`)
- [ ] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `triage`, `tools-improvements`)
- [ ] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`)
- [ ] **Labeled with `Bug`**
|
1.0
|
[DRAFT][Bug] My VA - Benefit payments - Recent direct-deposit - Data anomalies - **[DRAFT]** Will finalize ticket details once all test-runs are completed.
## What happened?
On Staging **My VA** page, **Benefit payments** section, for User with recent direct-deposit payment, display of payment info appeared anomalous, as did related info on **Your VA payments** page:
- My VA page:
- Zero amount - not sure why a zero-amount payment was displayed and why such payment would be sent from API in the first place.
- "Deposited" - mismatch with Your VA payments page & API-data [paymentMethod was null]. Should front-end have made an apparent "judgment call" and displayed this instead of "Check mailed" or just displayed dashes?
- Your VA payments page:
- "Direct Deposit" not displayed - mismatch with My VA page
- Missing Bank & Account info - something to do with zero amount?
**NOTES**:
- This _might_ be just a Staging API mock-data error, but it's definitely worth investigating.
- I've added design, frontend, and backend practice-area labels. It appears some team-wide decision needs to be made on how this potential bug should be fixed.
## Specs:
- Device: [device-agnostic]
- Browser: [browser-agnostic]
- Test User: 36 (Wesley) — has recent payment via direct deposit
### Screenshots
**My VA** page:

**Your VA payments** page:

## Steps to Reproduce
- URL: https://staging.va.gov/my-va/
- Test User _(if applicable)_: 36 (Wesley)
1. On **My VA** page, in **Benefit payments** section, in **payment card**, note **+$0.00** amount and **Deposited on**. [API data said null for paymentMethod.]
2. In **payment card**, click **View your payment history** link.
3. On **Your VA payments** page, in **Payments you received** section, in **1st table row**, note **$0.00** amount and **---** in **Method**, **Bank**, and **Account** columns.
## Desired behavior
Sorry, this is a tricky one. I only have questions:
- Do zero-amount payments really occur?
- If they do occur:
- Would they really be missing Method, Bank, or Account data?
- Would Veterans find zero-amount payments useful? Should front-end display them at all?
- If they should be displayed:
- If null paymentMethod comes in, should
- Would the dashes in payment details cause Veterans confusion/concern?
[Feel free to schedule a quick Zoom chat with me as desired/needed.]
## Acceptance Criteria
[need discussion if this is really a bug]
## How to configure this issue
- [ ] **Attached to a Milestone** (when will this be completed?)
- [ ] **Attached to Epic** (what body of work is this a part of? possibly `Ongoing Maintenance`)
- [ ] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `triage`, `tools-improvements`)
- [ ] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`)
- [ ] **Labeled with `Bug`**
|
design
|
my va benefit payments recent direct deposit data anomalies will finalize ticket details once all test runs are completed what happened on staging my va page benefit payments section for user with recent direct deposit payment display of payment info appeared anomalous as did related info on your va payments page my va page zero amount not sure why a zero amount payment was displayed and why such payment would be sent from api in the first place deposited mismatch with your va payments page api data should front end have made an apparent judgment call and displayed this instead of check mailed or just displayed dashes your va payments page direct deposit not displayed mismatch with my va page missing bank account info something to do with zero amount notes this might be just a staging api mock data error but it s definitely worth investigating i ve added design frontend and backend practice area labels it appears some team wide decision needs to be made on how this potential bug should be fixed specs device browser test user wesley mdash has recent payment via direct deposit screenshots my va page your va payments page steps to reproduce url test user if applicable wesley on my va page in benefit payments section in payment card note amount and deposited on in payment card click view your payment history link on your va payments page in payments you received section in table row note amount and in method bank and account columns desired behavior sorry this is a tricky one i only have questions do zero amount payments really occur if they do occur would they really be missing method bank or account data would veterans find zero amount payments useful should front end display them at all if they should be displayed if null paymentmethod comes in should would the dashes in payment details cause veterans confusion concern acceptance criteria how to configure this issue attached to a milestone when will this be completed attached to epic what body of work is this a part of possibly ongoing maintenance labeled with team product support analytics insights operations triage tools improvements labeled with practice area backend frontend devops design research product ia qa analytics contact center research accessibility content labeled with bug
| 1
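The null-`paymentMethod` question in the record above (show "Deposited", "Check mailed", or just dashes?) reduces to a small rendering rule. A sketch of the dash-fallback option raised in the issue, in Python — the function name is illustrative, not taken from the va.gov frontend:

```python
def render_payment_method(payment_method):
    """Fall back to dashes when the API returns null for paymentMethod,
    rather than guessing "Deposited" or "Check mailed"."""
    return payment_method if payment_method else "---"
```

Falling back to dashes keeps the My VA card and the Your VA payments table consistent instead of letting each surface make its own judgment call.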
|
35,188
| 4,635,777,866
|
IssuesEvent
|
2016-09-29 08:31:13
|
Microsoft/vscode
|
https://api.github.com/repos/Microsoft/vscode
|
closed
|
Feature: CompletionItems can have promises as additionalEdits
|
as-designed feature-request suggest
|
It would be a really nice feature, if the `CompletionItem` can actually return promises as their `additionalTextEdits`.
I'm imagining the following feature:
You are autocompleting something (in my example it's an import that is added to the programmers file) and there are other imports that have the same class name. Now it would be cool if I can add a Promise to the additionalTextEdits to ask the user with a `showInputBox` for an alias name.
Example:
```typescript
import {Foobar} from 'whatever';
let foo = new Foobar();
```
Now if I autocomplete another `Foobar` class from the library `foobaz`, I want to ask the developer for an aliased name to generate the following code:
```typescript
import {Foobar} from 'whatever';
import {Foobar as MyFoobar} from 'foobaz';
let foo = new Foobar();
/*...*/
```
To do so, the additionalTextEdit should have the possibility to be a promise like so:
```typescript
completionItem.additionalTextEdits = [
vscode.window.showInputBox(/*...*/).then(name => {
//prepare the text edit with the given name (if any)
return TextEdit;
})
];
```
What do you guys think?
Cheers
|
1.0
|
Feature: CompletionItems can have promises as additionalEdits - It would be a really nice feature, if the `CompletionItem` can actually return promises as their `additionalTextEdits`.
I'm imagining the following feature:
You are autocompleting something (in my example it's an import that is added to the programmers file) and there are other imports that have the same class name. Now it would be cool if I can add a Promise to the additionalTextEdits to ask the user with a `showInputBox` for an alias name.
Example:
```typescript
import {Foobar} from 'whatever';
let foo = new Foobar();
```
Now if I autocomplete another `Foobar` class from the library `foobaz`, I want to ask the developer for an aliased name to generate the following code:
```typescript
import {Foobar} from 'whatever';
import {Foobar as MyFoobar} from 'foobaz';
let foo = new Foobar();
/*...*/
```
To do so, the additionalTextEdit should have the possibility to be a promise like so:
```typescript
completionItem.additionalTextEdits = [
vscode.window.showInputBox(/*...*/).then(name => {
//prepare the text edit with the given name (if any)
return TextEdit;
})
];
```
What do you guys think?
Cheers
|
design
|
feature completionitems can have promises as additionaledits it would be a really nice feature if the completionitem can actually return promises as their additionaltextedits i m imaging the following feature you are autocompleting something in my example it s an import that is added to the programmers file and there are other imports that have the same class name now it would be cool if i can add a promise to the additionaltextedits to ask the user with a showinputbox for an alias name example typescript import foobar from whatever let foo new foobar now if i autocomplete another foobar class from the library foobaz i want to ask the developer for an aliased name to generate the following code typescript import foobar from whatever import foobar as myfoobar from foobaz let foo new foobar to do so the additionaltextedit should have the possibility to be a promise like so typescript completionitem additionaltextedits vscode window showinputbox then name prepare the text edit with the given name if any return textedit what do you guys think cheers
| 1
|
343,385
| 24,769,440,256
|
IssuesEvent
|
2022-10-23 00:17:44
|
gamerpotion/DarkRPG
|
https://api.github.com/repos/gamerpotion/DarkRPG
|
closed
|
Server IP has changed
|
documentation
|
Anyone having issues connecting to the mmo server, please get version 2.4f onwards.
or add **darkrpg.mcserver.us** to the server list.
|
1.0
|
Server IP has changed - Anyone having issues connecting to the mmo server, please get version 2.4f onwards.
or add **darkrpg.mcserver.us** to the server list.
|
non_design
|
server ip has changed anyone having issues connecting to the mmo server please get version onwards or add darkrpg mcserver us to the server list
| 0
|
76,702
| 9,481,637,443
|
IssuesEvent
|
2019-04-21 07:21:01
|
modelarious/FantasiaSource
|
https://api.github.com/repos/modelarious/FantasiaSource
|
opened
|
Students can bypass the token system
|
Critical Design bug
|
Steps to fool the system:
- Submit fake assignment that will have low clone count
- Get token from fake submission
- Submit fake token and real assignment to teacher
- Teacher asks Fantasia to validate the fake token
- Fantasia checks and sees the token is the same as the one on the server and returns OK
There needs to be something else in place to catch this.
possible solutions:
- 1
- Having the teacher run a hash on each of the submissions they receive and submit ```StudentID SubmissionHash``` as a token for each.
- Cons: requires lots of effort from the user
- Pros: much less overhead as the files would be small
- 2
- Ditch the user visible tokens:
- student submits an assignment to the system
- token is created internally and not sent to the student
- teacher submits a zip of all assignments received
- system reapplies same token process and then compares those tokens to existing ones
- Pros: requires little effort from the user and hides complexity
- Cons: SO much overhead, especially with a lot of large projects
|
1.0
|
Students can bypass the token system - Steps to fool the system:
- Submit fake assignment that will have low clone count
- Get token from fake submission
- Submit fake token and real assignment to teacher
- Teacher asks Fantasia to validate the fake token
- Fantasia checks and sees the token is the same as the one on the server and returns OK
There needs to be something else in place to catch this.
possible solutions:
- 1
- Having the teacher run a hash on each of the submissions they receive and submit ```StudentID SubmissionHash``` as a token for each.
- Cons: requires lots of effort from the user
- Pros: much less overhead as the files would be small
- 2
- Ditch the user visible tokens:
- student submits an assignment to the system
- token is created internally and not sent to the student
- teacher submits a zip of all assignments received
- system reapplies same token process and then compares those tokens to existing ones
- Pros: requires little effort from the user and hides complexity
- Cons: SO much overhead, especially with a lot of large projects
|
design
|
students can bypass the token system steps to fool the system submit fake assignment that will have low clone count get token from fake submission submit fake token and real assignment to teacher teacher asks fantasia to validate the fake token fantasia checks and sees the token is the same as the one on the server and returns ok there needs to be something else in place to catch this possible solutions having the teacher run a hash on each of the submissions they receive and submit studentid submissionhash as a token for each cons requires lots of effort from the user pros much less overhead as the files would be small ditch the user visible tokens student submits an assignment to the system token is created internally and not sent to the student teacher submits a zip of all assignments received system reapplies same token process and then compares those tokens to existing ones pros requires little effort from the user and hides complexity pros so much overhead especially with a lot of large projects
| 1
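Option 1 in the record above ("StudentID SubmissionHash" tokens) can be sketched in a few lines. A minimal illustration in Python, assuming SHA-256 as the hash (the issue does not name a specific algorithm):

```python
import hashlib


def submission_token(student_id: str, submission: bytes) -> str:
    """Build the "StudentID SubmissionHash" token proposed in option 1.

    Because the teacher hashes each submission as actually received, a
    student cannot reuse a token minted from a different (fake) submission.
    """
    digest = hashlib.sha256(submission).hexdigest()
    return f"{student_id} {digest}"
```

Hashing on the teacher's side binds the token to the exact file bytes, which is what closes the swap described in the reproduction steps, at the cost of the extra user effort the issue lists as a con.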
|
231,001
| 7,622,355,025
|
IssuesEvent
|
2018-05-03 11:54:18
|
heptastique/onlygo
|
https://api.github.com/repos/heptastique/onlygo
|
closed
|
Estimation of Possible Distance in CentreInteret
|
priority: low
|
Estimate of the distance that can be covered within the Centre d'Interet
|
1.0
|
Estimation of Possible Distance in CentreInteret - Estimate of the distance that can be covered within the Centre d'Interet
|
non_design
|
estimation of possible distance in centreinteret estimate of the distance that can be covered within the centre d interet
| 0
|
109,665
| 13,797,123,246
|
IssuesEvent
|
2020-10-09 21:08:06
|
ampproject/amphtml
|
https://api.github.com/repos/ampproject/amphtml
|
opened
|
Design Review 2020-10-14 20:00 UTC (Americas)
|
Type: Design Review
|
Time: [2020-10-14 20:00 UTC](https://www.timeanddate.com/worldclock/meeting.html?year=2020&month=10&day=14&iv=0) ([add to Google Calendar](http://www.google.com/calendar/event?action=TEMPLATE&text=AMP%20Project%20Design%20Review&dates=20201014T200000Z/20201014T210000Z&details=https%3A%2F%2Fbit.ly%2Famp-dr))
Location: [Video conference via Google Meet](https://bit.ly/amp-dr)
The AMP community holds weekly engineering [design reviews](https://github.com/ampproject/amphtml/blob/master/contributing/design-reviews.md). **We encourage everyone in the community to participate in these design reviews.**
If you are interested in bringing your design to design review, read the [design review documentation](https://github.com/ampproject/amphtml/blob/master/contributing/design-reviews.md) and add a link to your design doc or issue by the Monday before your design review.
When attending a design review please read through the designs *before* the design review starts. This allows us to spend more time on discussion of the design.
We rotate our design review between times that work better for different parts of the world as described in our [design review documentation](https://github.com/ampproject/amphtml/blob/master/contributing/design-reviews.md), but you are welcome to attend any design review. If you cannot make any of the design reviews but have a design to discuss please let mrjoro@ know on [Slack](https://github.com/ampproject/amphtml/blob/master/CONTRIBUTING.md#discussion-channels) and we will find a time that works for you.
|
1.0
|
Design Review 2020-10-14 20:00 UTC (Americas) - Time: [2020-10-14 20:00 UTC](https://www.timeanddate.com/worldclock/meeting.html?year=2020&month=10&day=14&iv=0) ([add to Google Calendar](http://www.google.com/calendar/event?action=TEMPLATE&text=AMP%20Project%20Design%20Review&dates=20201014T200000Z/20201014T210000Z&details=https%3A%2F%2Fbit.ly%2Famp-dr))
Location: [Video conference via Google Meet](https://bit.ly/amp-dr)
The AMP community holds weekly engineering [design reviews](https://github.com/ampproject/amphtml/blob/master/contributing/design-reviews.md). **We encourage everyone in the community to participate in these design reviews.**
If you are interested in bringing your design to design review, read the [design review documentation](https://github.com/ampproject/amphtml/blob/master/contributing/design-reviews.md) and add a link to your design doc or issue by the Monday before your design review.
When attending a design review please read through the designs *before* the design review starts. This allows us to spend more time on discussion of the design.
We rotate our design review between times that work better for different parts of the world as described in our [design review documentation](https://github.com/ampproject/amphtml/blob/master/contributing/design-reviews.md), but you are welcome to attend any design review. If you cannot make any of the design reviews but have a design to discuss please let mrjoro@ know on [Slack](https://github.com/ampproject/amphtml/blob/master/CONTRIBUTING.md#discussion-channels) and we will find a time that works for you.
|
design
|
design review utc americas time location the amp community holds weekly engineering we encourage everyone in the community to participate in these design reviews if you are interested in bringing your design to design review read the and add a link to your design doc or issue by the monday before your design review when attending a design review please read through the designs before the design review starts this allows us to spend more time on discussion of the design we rotate our design review between times that work better for different parts of the world as described in our but you are welcome to attend any design review if you cannot make any of the design reviews but have a design to discuss please let mrjoro know on and we will find a time that works for you
| 1
|
78,882
| 9,808,439,978
|
IssuesEvent
|
2019-06-12 15:37:32
|
fecgov/FEC
|
https://api.github.com/repos/fecgov/FEC
|
closed
|
Request: Decimal usage
|
Upcoming feature theme: general design
|
## Feedback
I'd recommend getting rid of the cents places in displays of aggregate financial data (i.e., for pretty much everything except individual transactions). Just round the displayed totals to the nearest dollar. Including cents makes the totals harder to read, and no one cares about them.
## Details
- URL: https://beta.fec.gov/data/
- User Agent: Mozilla/5.0 (Windows NT 6.1; rv:41.0) Gecko/20100101 Firefox/41.0
|
1.0
|
Request: Decimal usage - ## Feedback
I'd recommend getting rid of the cents places in displays of aggregate financial data (i.e., for pretty much everything except individual transactions). Just round the displayed totals to the nearest dollar. Including cents makes the totals harder to read, and no one cares about them.
## Details
- URL: https://beta.fec.gov/data/
- User Agent: Mozilla/5.0 (Windows NT 6.1; rv:41.0) Gecko/20100101 Firefox/41.0
|
design
|
request decimal usage feedback i d recommend getting rid of the cents places in displays of aggregate financial data i e for pretty much everything except individual transactions just round the displayed totals to the nearest dollar including cents makes the totals harder to read and no one cares about them details url user agent mozilla windows nt rv gecko firefox
| 1
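The rounding suggested in the record above is a one-liner. A sketch in Python — the display convention (dollar sign, comma grouping) is assumed for illustration, not taken from the FEC site:

```python
def display_total(amount: float) -> str:
    """Round an aggregate financial total to the nearest dollar for display,
    dropping the cents places the feedback says no one cares about."""
    return f"${round(amount):,}"
```

Individual transactions would keep their cents; only aggregates get this treatment, per the feedback.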
|
52,173
| 6,577,355,995
|
IssuesEvent
|
2017-09-12 00:20:25
|
stephenhenderson808/musicianstoolkit
|
https://api.github.com/repos/stephenhenderson808/musicianstoolkit
|
closed
|
Redesign Education Homepage
|
Design Product
|
We want everything else to basically be the same, just need some design tweaks:
Tagline:
**Music education reimagined**
Subtag line
**Music education for all students in all stages of life**
In the explore section highlight:
Instrument Maintenance
Music theory
Instrument fundamentals (can you change picture)
Masterclasses and interviews
Ensembles
Jazz
Music Education
Is there a price point? I think people are going to want to know. Perhaps you list the individual price point of $5/Student and then have a button that says, submit for details on packaging pricing.
**FOR STEPHEN ONLY:**
Take out the words “Introduction” from everything. It makes it sound like it is a single lesson when we say, “Introduction to Jazz Bass.” The first unit is an introduction, but units 2-14 are not. Again, it reinforced the thought that the course only has one video.
|
1.0
|
Redesign Education Homepage - We want everything else to basically be the same, just need some design tweaks:
Tagline:
**Music education reimagined**
Subtag line
**Music education for all students in all stages of life**
In the explore section highlight:
Instrument Maintenance
Music theory
Instrument fundamentals (can you change picture)
Masterclasses and interviews
Ensembles
Jazz
Music Education
Is there a price point? I think people are going to want to know. Perhaps you list the individual price point of $5/Student and then have a button that says, submit for details on packaging pricing.
**FOR STEPHEN ONLY:**
Take out the words “Introduction” from everything. It makes it sound like it is a single lesson when we say, “Introduction to Jazz Bass.” The first unit is an introduction, but units 2-14 are not. Again, it reinforced the thought that the course only has one video.
|
design
|
redesign education homepage we want everything else to basically be the same just need some design tweaks tagline music education reimagined subtag line music education for all students in all stages of life in the explore section highlight instrument maintenance music theory instrument fundamentals can you change picture masterclasses and interviews ensembles jazz music education is there a price point i think people are going to want to know perhaps you list the individual price point of student and then have a button that says submit for details on packaging pricing for stephen only take out the words “introduction” from everything it makes it sound like it is a single lesson when we say “introduction to jazz bass ” the first unit is an introduction but units are not again it reinforced the thought that the course only has one video
| 1
|
151,614
| 23,848,610,399
|
IssuesEvent
|
2022-09-06 15:50:35
|
button-inc/service-development-toolkit
|
https://api.github.com/repos/button-inc/service-development-toolkit
|
opened
|
Sprint B: Value Proposition Canvas
|
Design
|
### Description
Specify which part of the system we're working in.
### User Story
As a member of the Button Innovation Team,
I want a document that summarizes the user value of the ELT Demo,
So that I can reference and update it as the project evolves.
### Acceptance Criteria
| Given | When | Then | Pass/Fail (TBD by reviewer) |
| I'm on the Innovation Team Miro Board |When I navigate to the ELT demo section, | I find a Value Proposition Canvas for the Button Sales Team |---: |
| I'm on the Innovation Team Miro Board |When I navigate to the ELT demo section, | I find a Value Proposition Canvas for a prospective client |---: |
| N/A |N/A | Jeff has signed off on the VPCs |---: |
| N/A |N/A | Alec has signed off on the VPCs |---: |
|
1.0
|
Sprint B: Value Proposition Canvas - ### Description
Specify which part of the system we're working in.
### User Story
As a member of the Button Innovation Team,
I want a document that summarizes the user value of the ELT Demo,
So that I can reference and update it as the project evolves.
### Acceptance Criteria
| Given | When | Then | Pass/Fail (TBD by reviewer) |
| I'm on the Innovation Team Miro Board |When I navigate to the ELT demo section, | I find a Value Proposition Canvas for the Button Sales Team |---: |
| I'm on the Innovation Team Miro Board |When I navigate to the ELT demo section, | I find a Value Proposition Canvas for a prospective client |---: |
| N/A |N/A | Jeff has signed off on the VPCs |---: |
| N/A |N/A | Alec has signed off on the VPCs |---: |
|
design
|
sprint b value proposition canvas description specify which part of the system we re working in user story as a member of the button innovation team i want a document that summarizes the user value of the elt demo so that i can reference and update it as the project evolves acceptance criteria given when then pass fail tbd by reviewer i m on the innovation team miro board when i navigate to the elt demo section i find a value proposition canvas for the button sales team i m on the innovation team miro board when i navigate to the elt demo section i find a value proposition canvas for a prospective client n a n a jeff has signed off on the vpcs n a n a alec has signed off on the vpcs
| 1
|
168,557
| 26,663,385,200
|
IssuesEvent
|
2023-01-25 23:44:35
|
elastic/design-system-team
|
https://api.github.com/repos/elastic/design-system-team
|
closed
|
Integrate Data Visualizer with Discover
|
7.14-design
|
#### Notes
Here are Alona's original notes on the topic (2018): https://docs.google.com/document/d/1OuKXp59fAh2s4Ak5JdExHvnO8GTxEheRTcgOQ5kWVTI/edit?usp=sharing
#### Kibana Theme
#### Kibana Priority
#### Eng Lead
#### Funding
#### PM Lead
#### Project Meta Issue
|
1.0
|
Integrate Data Visualizer with Discover - #### Notes
Here are Alona's original notes on the topic (2018): https://docs.google.com/document/d/1OuKXp59fAh2s4Ak5JdExHvnO8GTxEheRTcgOQ5kWVTI/edit?usp=sharing
#### Kibana Theme
#### Kibana Priority
#### Eng Lead
#### Funding
#### PM Lead
#### Project Meta Issue
|
design
|
integrate data visualizer with discover notes here are alona s original notes on the topic kibana theme kibana priority eng lead funding pm lead project meta issue
| 1
|
32,472
| 4,364,780,837
|
IssuesEvent
|
2016-08-03 08:22:21
|
Kozea/www-pharminfo
|
https://api.github.com/repos/Kozea/www-pharminfo
|
opened
|
Features to develop
|
design feature next
|
- [ ] Hide the "click-to-call" button
When at the top of the page or when scrolling up, make it appear. Hide it when scrolling (down) so it doesn't get in the way of reading the content
-------
- [ ] Create a parallax effect on the image at the bottom of the "Clients" section
-------
- [ ] Make the offers interactive, accordion-style
When clicking "see more" > expand the offer
When clicking another offer, close the previously opened offer
|
1.0
|
Features to develop - - [ ] Hide the "click-to-call" button
When at the top of the page or when scrolling up, make it appear. Hide it when scrolling (down) so it doesn't get in the way of reading the content
-------
- [ ] Create a parallax effect on the image at the bottom of the "Clients" section
-------
- [ ] Make the offers interactive, accordion-style
When clicking "see more" > expand the offer
When clicking another offer, close the previously opened offer
|
design
|
features to develop hide the click to call button when at the top of the page or when scrolling up make it appear hide it when scrolling down so it doesn t get in the way of reading the content create a parallax effect on the image at the bottom of the clients section make the offers interactive accordion style when clicking see more expand the offer when clicking another offer close the previously opened offer
| 1
|
174,176
| 6,537,409,195
|
IssuesEvent
|
2017-08-31 22:20:27
|
TheValarProject/AwakenDreamsClient
|
https://api.github.com/repos/TheValarProject/AwakenDreamsClient
|
opened
|
Credits menu
|
new-feature priority-medium
|
There should be a button on the main screen, and maybe in the pause menu to go to a credits menu. This would make it easy for us to give proper credit to the people involved in the creation of the mod, as well as allowing us to properly use CC-BY assets which could help with texturing, sounds, etc. In addition to making the menu, credits for the current work done so far needs to be figured out as best as possible.
|
1.0
|
Credits menu - There should be a button on the main screen, and maybe in the pause menu to go to a credits menu. This would make it easy for us to give proper credit to the people involved in the creation of the mod, as well as allowing us to properly use CC-BY assets which could help with texturing, sounds, etc. In addition to making the menu, credits for the current work done so far needs to be figured out as best as possible.
|
non_design
|
credits menu there should be a button on the main screen and maybe in the pause menu to go to a credits menu this would make it easy for us to give proper credit to the people involved in the creation of the mod as well as allowing us to properly use cc by assets which could help with texturing sounds etc in addition to making the menu credits for the current work done so far needs to be figured out as best as possible
| 0
|
657,119
| 21,785,949,153
|
IssuesEvent
|
2022-05-14 05:58:05
|
wso2/api-manager
|
https://api.github.com/repos/wso2/api-manager
|
opened
|
[4.1.0] Database connections are getting busy when using Monetization feature
|
Type/Bug Priority/Normal
|
### Description:
<!-- Describe the issue -->
$subject
Caused by: org.apache.tomcat.jdbc.pool.PoolExhaustedException: [https-jsse-nio-9443-exec-45] Timeout: Pool empty. Unable to fetch a connection in 60 seconds, none available[size:50; busy:50; idle:0; lastwait:60000].
at org.apache.tomcat.jdbc.pool.ConnectionPool.borrowConnection(ConnectionPool.java:738)
at org.apache.tomcat.jdbc.pool.ConnectionPool.getConnection(ConnectionPool.java:198)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.getConnection(DataSourceProxy.java:136)
at org.wso2.carbon.identity.core.persistence.JDBCPersistenceManager.getDBConnection(JDBCPersistenceManager.java:147)
... 57 more
### Steps to reproduce:
<!-- List the steps you followed when you encountered the issue -->
1. Configure monetization with Stripe
https://apim.docs.wso2.com/en/latest/design/api-monetization/monetizing-an-api/
2. When working with feature, you will observe the error.
### Affected product version:
<!-- Members can use Affected/*** labels -->
4.1.0
|
1.0
|
[4.1.0] Database connections are getting busy when using Monetization feature - ### Description:
<!-- Describe the issue -->
$subject
Caused by: org.apache.tomcat.jdbc.pool.PoolExhaustedException: [https-jsse-nio-9443-exec-45] Timeout: Pool empty. Unable to fetch a connection in 60 seconds, none available[size:50; busy:50; idle:0; lastwait:60000].
at org.apache.tomcat.jdbc.pool.ConnectionPool.borrowConnection(ConnectionPool.java:738)
at org.apache.tomcat.jdbc.pool.ConnectionPool.getConnection(ConnectionPool.java:198)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.getConnection(DataSourceProxy.java:136)
at org.wso2.carbon.identity.core.persistence.JDBCPersistenceManager.getDBConnection(JDBCPersistenceManager.java:147)
... 57 more
### Steps to reproduce:
<!-- List the steps you followed when you encountered the issue -->
1. Configure monetization with Stripe
https://apim.docs.wso2.com/en/latest/design/api-monetization/monetizing-an-api/
2. When working with feature, you will observe the error.
### Affected product version:
<!-- Members can use Affected/*** labels -->
4.1.0
|
non_design
|
database connections are getting busy when using monetization feature description subject caused by org apache tomcat jdbc pool poolexhaustedexception timeout pool empty unable to fetch a connection in seconds none available at org apache tomcat jdbc pool connectionpool borrowconnection connectionpool java at org apache tomcat jdbc pool connectionpool getconnection connectionpool java at org apache tomcat jdbc pool datasourceproxy getconnection datasourceproxy java at org carbon identity core persistence jdbcpersistencemanager getdbconnection jdbcpersistencemanager java more steps to reproduce configure monetization with stripe when working with feature you will observe the error affected product version
| 0
|
65,067
| 7,856,055,129
|
IssuesEvent
|
2018-06-21 05:52:48
|
lbryio/lbry-app
|
https://api.github.com/repos/lbryio/lbry-app
|
closed
|
When the sidenav closes a sub-menu, it should animate the same as when it opens
|
redesign
|
Currently, when the sub-menu collapses, it doesn't animate.
|
1.0
|
When the sidenav closes a sub-menu, it should animate the same as when it opens - Currently, when the sub-menu collapses, it doesn't animate.
|
design
|
when the sidenav closes a sub menu it should animate the same as when it opens currently when the sub menu collapses it doesn t animate
| 1
|
24,399
| 11,034,898,565
|
IssuesEvent
|
2019-12-07 09:31:13
|
mgh3326/calculator
|
https://api.github.com/repos/mgh3326/calculator
|
opened
|
WS-2016-0090 (Medium) detected in jquery-2.1.4.min.js, jquery-1.7.1.min.js
|
security vulnerability
|
## WS-2016-0090 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-2.1.4.min.js</b>, <b>jquery-1.7.1.min.js</b></p></summary>
<p>
<details><summary><b>jquery-2.1.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/calculator/node_modules/js-base64/test/index.html</p>
<p>Path to vulnerable library: /calculator/node_modules/js-base64/test/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.4.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.7.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/calculator/node_modules/sockjs/examples/express-3.x/index.html</p>
<p>Path to vulnerable library: /calculator/node_modules/sockjs/examples/express-3.x/index.html,/calculator/node_modules/sockjs/examples/multiplex/index.html,/calculator/node_modules/sockjs/examples/hapi/html/index.html,/calculator/node_modules/sockjs/examples/echo/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.7.1.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/mgh3326/calculator/commit/6ddd840d636d2c21618c6fc3fa90a4790d1c00bc">6ddd840d636d2c21618c6fc3fa90a4790d1c00bc</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
JQuery, before 2.2.0, is vulnerable to Cross-site Scripting (XSS) attacks via text/javascript response with arbitrary code execution.
<p>Publish Date: 2016-11-27
<p>URL: <a href=https://github.com/jquery/jquery/commit/b078a62013782c7424a4a61a240c23c4c0b42614>WS-2016-0090</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/jquery/jquery/commit/b078a62013782c7424a4a61a240c23c4c0b42614">https://github.com/jquery/jquery/commit/b078a62013782c7424a4a61a240c23c4c0b42614</a></p>
<p>Release Date: 2019-04-08</p>
<p>Fix Resolution: 2.2.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2016-0090 (Medium) detected in jquery-2.1.4.min.js, jquery-1.7.1.min.js - ## WS-2016-0090 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-2.1.4.min.js</b>, <b>jquery-1.7.1.min.js</b></p></summary>
<p>
<details><summary><b>jquery-2.1.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/calculator/node_modules/js-base64/test/index.html</p>
<p>Path to vulnerable library: /calculator/node_modules/js-base64/test/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.1.4.min.js** (Vulnerable Library)
</details>
<details><summary><b>jquery-1.7.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/calculator/node_modules/sockjs/examples/express-3.x/index.html</p>
<p>Path to vulnerable library: /calculator/node_modules/sockjs/examples/express-3.x/index.html,/calculator/node_modules/sockjs/examples/multiplex/index.html,/calculator/node_modules/sockjs/examples/hapi/html/index.html,/calculator/node_modules/sockjs/examples/echo/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.7.1.min.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/mgh3326/calculator/commit/6ddd840d636d2c21618c6fc3fa90a4790d1c00bc">6ddd840d636d2c21618c6fc3fa90a4790d1c00bc</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
JQuery, before 2.2.0, is vulnerable to Cross-site Scripting (XSS) attacks via text/javascript response with arbitrary code execution.
<p>Publish Date: 2016-11-27
<p>URL: <a href=https://github.com/jquery/jquery/commit/b078a62013782c7424a4a61a240c23c4c0b42614>WS-2016-0090</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/jquery/jquery/commit/b078a62013782c7424a4a61a240c23c4c0b42614">https://github.com/jquery/jquery/commit/b078a62013782c7424a4a61a240c23c4c0b42614</a></p>
<p>Release Date: 2019-04-08</p>
<p>Fix Resolution: 2.2.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_design
|
ws medium detected in jquery min js jquery min js ws medium severity vulnerability vulnerable libraries jquery min js jquery min js jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm calculator node modules js test index html path to vulnerable library calculator node modules js test index html dependency hierarchy x jquery min js vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm calculator node modules sockjs examples express x index html path to vulnerable library calculator node modules sockjs examples express x index html calculator node modules sockjs examples multiplex index html calculator node modules sockjs examples hapi html index html calculator node modules sockjs examples echo index html dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details jquery before is vulnerable to cross site scripting xss attacks via text javascript response with arbitrary code execution publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
290,832
| 25,099,198,195
|
IssuesEvent
|
2022-11-08 12:26:55
|
ARM-software/psa-api
|
https://api.github.com/repos/ARM-software/psa-api
|
opened
|
Use the PSA attestation token format in the Attestation API
|
enhancement Attestation API
|
The attestation token format is currently being standardized as the PSA Attestation token in the draft [datatracker.ietf.org/doc/draft-tschofenig-rats-psa-token](https://datatracker.ietf.org/doc/draft-tschofenig-rats-psa-token) [PSATOKEN] specification.
This has evolved slightly since the v1.0 specification, and now uses allocated claim ids, instead of claim ids from the private use range. The token format described in the v1.0 API should be deprecated in favor of the emerging standard, and the Attestation API needs to be updated with a new version to reference the new format.
There are some open issues and options with how this progresses:
* An implementation that produces a token compliant with [PSATOKEN] is not compliant with the v1.0 Attestation API, so this would probably require an updated specification to be v2.0. But the API is unchanged, so perhaps v1.1 is appropriate?
* Should the legacy v1.0 format be retained in the new version of the specification, perhaps as an appendix, or just removed entirely? It will still be documented in the v1.0 specification.
* Should the new version of the specification provide the same level of detail about the token format as the current specification, provide no details and just refer to the IETF document, or some in-between level of description?
* How should the Attestation API update and IETF document timeline work:
- Should the new version wait until the IETF document is finalised?
- Should a beta version of a new Attestation API be published, until the IETF document is finalised?
- Should a final version of a new Attestation API be published, noting that the IETF document is not yet final?
|
1.0
|
Use the PSA attestation token format in the Attestation API - The attestation token format is currently being standardized as the PSA Attestation token in the draft [datatracker.ietf.org/doc/draft-tschofenig-rats-psa-token](https://datatracker.ietf.org/doc/draft-tschofenig-rats-psa-token) [PSATOKEN] specification.
This has evolved slightly since the v1.0 specification, and now uses allocated claim ids, instead of claim ids from the private use range. The token format described in the v1.0 API should be deprecated in favor of the emerging standard, and the Attestation API needs to be updated with a new version to reference the new format.
There are some open issues and options with how this progresses:
* An implementation that produces a token compliant with [PSATOKEN] is not compliant with the v1.0 Attestation API, so this would probably require an updated specification to be v2.0. But the API is unchanged, so perhaps v1.1 is appropriate?
* Should the legacy v1.0 format be retained in the new version of the specification, perhaps as an appendix, or just removed entirely? It will still be documented in the v1.0 specification.
* Should the new version of the specification provide the same level of detail about the token format as the current specification, provide no details and just refer to the IETF document, or some in-between level of description?
* How should the Attestation API update and IETF document timeline work:
- Should the new version wait until the IETF document is finalised?
- Should a beta version of a new Attestation API be published, until the IETF document is finalised?
- Should a final version of a new Attestation API be published, noting that the IETF document is not yet final?
|
non_design
|
use the psa attestation token format in the attestation api the attestation token format is currently being standardized as the psa attestation token in the draft specification this has evolved slightly since the specification and now uses allocated claim ids instead of claim ids from the private use range the token format described in the api should be deprecated in favor of the emerging standard and the attestation api needs to be updated with a new version to reference the new format there are some open issues and options with how this progresses an implementation that produces a token compliant with is not compliant with the attestation api so this would probably require an updated specification to be but the api is unchanged so perhaps is appropriate should the legacy format be retained in the new version of the specification perhaps as an appendix or just removed entirely it will still be documented in the specification should the new version of the specification provide the same level of detail about the token format as the current specification provide no details and just refer to the ietf document or some in between level of description how should the attestation api update and ietf document timeline work should the new version wait until the ietf document is finalised should a beta version of a new attestation api be published until the ietf document is finalised should a final version of a new attestation api be published noting that the ietf document is not yet final
| 0
|
119,112
| 15,398,631,057
|
IssuesEvent
|
2021-03-04 00:26:07
|
cli/cli
|
https://api.github.com/repos/cli/cli
|
opened
|
show diffstat in pr view
|
enhancement needs-design
|
When I'm looking at pull requests with `pr view` I frequently want to know how big the change set is so I can determine if it's something I can review quickly or I need to put off. I currently run `pr diff` to eyeball it but it would be nice to just include it in the PR header. Something like:
```
Add a template `--format` flag to api command
Open • +238 −1 • mislav wants to merge 4 commits into trunk from api-template
Reviewers: vilmibm (Commented), code reviewers (Requested)
Projects: The GitHub CLI (Needs review 🤔)
```
@ampinsk what do you think?
|
1.0
|
show diffstat in pr view - When I'm looking at pull requests with `pr view` I frequently want to know how big the change set is so I can determine if it's something I can review quickly or I need to put off. I currently run `pr diff` to eyeball it but it would be nice to just include it in the PR header. Something like:
```
Add a template `--format` flag to api command
Open • +238 −1 • mislav wants to merge 4 commits into trunk from api-template
Reviewers: vilmibm (Commented), code reviewers (Requested)
Projects: The GitHub CLI (Needs review 🤔)
```
@ampinsk what do you think?
|
design
|
show diffstat in pr view when i m looking at pull requests with pr view i frequently want to know how big the change set is so i can determine if it s something i can review quickly or i need to put off i currently run pr diff to eyeball it but it would be nice to just include it in the pr header something like add a template format flag to api command open • − • mislav wants to merge commits into trunk from api template reviewers vilmibm commented code reviewers requested projects the github cli needs review 🤔 ampinsk what do you think
| 1
|
42,600
| 17,198,746,426
|
IssuesEvent
|
2021-07-16 22:18:52
|
bcgov/ols-devkit
|
https://api.github.com/repos/bcgov/ols-devkit
|
closed
|
In Demo app, MapBox Streets has disappeared
|
bug high priority location services demo app
|
When you start up demo app, there is no map and Mapbox Streets is the default map layer. Mapbox Satellite still works so I don't think it's a subscription issue.
|
1.0
|
In Demo app, MapBox Streets has disappeared - When you start up demo app, there is no map and Mapbox Streets is the default map layer. Mapbox Satellite still works so I don't think it's a subscription issue.
|
non_design
|
in demo app mapbox streets has disappeared when you start up demo app there is no map and mapbox streets is the default map layer mapbox satellite still works so i don t think its a subscription issue
| 0
|
184,124
| 31,822,120,592
|
IssuesEvent
|
2023-09-14 03:48:45
|
ILLIXR/ILLIXR
|
https://api.github.com/repos/ILLIXR/ILLIXR
|
opened
|
Dependency handling for plugin loader
|
feature design decision infrastructure
|
ILLIXR loads the plugins by the order they are passed in right now. Dependency conflicts easily arise for plugins requesting other services and better handling is required. For now, the plugin list has been manually reordered. However, we should consider using a dependency graph in the future.
|
1.0
|
Dependency handling for plugin loader - ILLIXR loads the plugins by the order they are passed in right now. Dependency conflicts easily arise for plugins requesting other services and better handling is required. For now, the plugin list has been manually reordered. However, we should consider using a dependency graph in the future.
|
design
|
dependency handling for plugin loader illixr loads the plugins by the order they are passed in right now dependency conflicts easily arise for plugins requesting other services and better handling is required for now the plugin list has been manually reordered however we should consider using a dependency graph in the future
| 1
|
167,077
| 26,451,279,542
|
IssuesEvent
|
2023-01-16 11:23:45
|
kodadot/nft-gallery
|
https://api.github.com/repos/kodadot/nft-gallery
|
closed
|
Navbar at big touch devices is broken
|
UX first $ p2 mobile navbar redesign
|
> So is that oki? Bit confused that you wrote oki and removed label @exezbcz
>
> 
nope, the bug is still there.

- the navbar on big touch devices is broken.
Probably it was fixed in
- #4622
Close if done so
_Originally posted by @exezbcz in https://github.com/kodadot/nft-gallery/issues/4580#issuecomment-1381939698_
|
1.0
|
Navbar at big touch devices is broken - > So is that oki? Bit confused that you wrote oki and removed label @exezbcz
>
> 
nope, the bug is still there.

- the navbar on big touch devices is broken.
Probably it was fixed in
- #4622
Close if done so
_Originally posted by @exezbcz in https://github.com/kodadot/nft-gallery/issues/4580#issuecomment-1381939698_
|
design
|
navbar at big touch devices is broken so is that oki bit confused that you wrote oki and removed label exezbcz nope the bug is still there the navbar on big touch devices is broken probably it was fixed in close if done so originally posted by exezbcz in
| 1
|
231,925
| 25,552,578,571
|
IssuesEvent
|
2022-11-30 01:53:52
|
AlexRogalskiy/openimagecv
|
https://api.github.com/repos/AlexRogalskiy/openimagecv
|
closed
|
CVE-2018-19362 (High) detected in jackson-databind-2.1.3.jar - autoclosed
|
security vulnerability
|
## CVE-2018-19362 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.1.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>
Dependency Hierarchy:
- logback-jackson-0.1.5.jar (Root Library)
- :x: **jackson-databind-2.1.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/openimagecv/commit/26c840d74d967fb2fc08c33b2bcfa9a86a9e01e9">26c840d74d967fb2fc08c33b2bcfa9a86a9e01e9</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.8 might allow attackers to have unspecified impact by leveraging failure to block the jboss-common-core class from polymorphic deserialization.
<p>Publish Date: 2019-01-02
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-19362>CVE-2018-19362</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19362">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19362</a></p>
<p>Release Date: 2019-01-02</p>
<p>Fix Resolution: 2.9.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2018-19362 (High) detected in jackson-databind-2.1.3.jar - autoclosed - ## CVE-2018-19362 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.1.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>
Dependency Hierarchy:
- logback-jackson-0.1.5.jar (Root Library)
- :x: **jackson-databind-2.1.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/openimagecv/commit/26c840d74d967fb2fc08c33b2bcfa9a86a9e01e9">26c840d74d967fb2fc08c33b2bcfa9a86a9e01e9</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.8 might allow attackers to have unspecified impact by leveraging failure to block the jboss-common-core class from polymorphic deserialization.
<p>Publish Date: 2019-01-02
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-19362>CVE-2018-19362</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19362">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19362</a></p>
<p>Release Date: 2019-01-02</p>
<p>Fix Resolution: 2.9.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_design
|
cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api dependency hierarchy logback jackson jar root library x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before might allow attackers to have unspecified impact by leveraging failure to block the jboss common core class from polymorphic deserialization publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
14,777
| 18,165,346,448
|
IssuesEvent
|
2021-09-27 14:06:49
|
ValveSoftware/Proton
|
https://api.github.com/repos/ValveSoftware/Proton
|
closed
|
World War Z: Aftermath (699130)
|
Game compatibility - Unofficial XAudio2
|
# Compatibility Report
- Name of the game with compatibility issues: World War Z: Aftermath
- Steam AppID of the game: 699130
## System Information
- GPU: Nvidia GTX 1080Ti
- Driver/LLVM version: Nvidia 460.91.03
- Kernel version: 5.10.0-8-amd64
- Link to full system information report as [Gist](https://gist.github.com/alexcrow89/2150e323377e15401b5d82c103c82250):
- Proton version: Experimental
## I confirm:
- [X] that I haven't found an existing compatibility report for this game.
- [X] that I have checked whether there are updates for my system available.
<!-- Please add `PROTON_LOG=1 %command%` to the game's launch options and
attach the generated $HOME/steam-$APPID.log to this issue report as a file.
(Proton logs compress well if needed.)-->
[steam-699130.log](https://github.com/ValveSoftware/Proton/files/7212129/steam-699130.log)
## Symptoms
After running the game (seemingly successfully) on my initial try, rather than immediately playing the game I went into the options menu, changing the game's renderer from DX11 to Vulkan. I also enabled AMD's FSR technology, just to see if it would work. I was prompted to restart the game. Upon doing so, the game no-longer reaches the main menu, rather crashing on the initial loading screen with a zombie's face.
## Reproduction
This crash happens at the same spot, every time I've tried re-running the game, without fail. I've uninstalled, re-installed, deleted "WWZ"-related directories, etc. The game still crashes at the initial loading screen with a zombie's face.
<!--
1. You can find the Steam AppID in the URL of the shop page of the game.
e.g. for `The Witcher 3: Wild Hunt` the AppID is `292030`.
2. You can find your driver and Linux version, as well as your graphics
processor's name in the system information report of Steam.
3. You can retrieve a full system information report by clicking
`Help` > `System Information` in the Steam client on your machine.
4. Please copy it to your clipboard by pressing `Ctrl+A` and then `Ctrl+C`.
Then paste it in a [Gist](https://gist.github.com/) and post the link in
this issue.
5. Please search for open issues and pull requests by the name of the game and
find out whether they are relevant and should be referenced above.
-->
|
True
|
World War Z: Aftermath (699130) - # Compatibility Report
- Name of the game with compatibility issues: World War Z: Aftermath
- Steam AppID of the game: 699130
## System Information
- GPU: Nvidia GTX 1080Ti
- Driver/LLVM version: Nvidia 460.91.03
- Kernel version: 5.10.0-8-amd64
- Link to full system information report as [Gist](https://gist.github.com/alexcrow89/2150e323377e15401b5d82c103c82250):
- Proton version: Experimental
## I confirm:
- [X] that I haven't found an existing compatibility report for this game.
- [X] that I have checked whether there are updates for my system available.
<!-- Please add `PROTON_LOG=1 %command%` to the game's launch options and
attach the generated $HOME/steam-$APPID.log to this issue report as a file.
(Proton logs compress well if needed.)-->
[steam-699130.log](https://github.com/ValveSoftware/Proton/files/7212129/steam-699130.log)
## Symptoms
After running the game (seemingly successfully) on my initial try, rather than immediately playing the game I went into the options menu, changing the game's renderer from DX11 to Vulkan. I also enabled AMD's FSR technology, just to see if it would work. I was prompted to restart the game. Upon doing so, the game no-longer reaches the main menu, rather crashing on the initial loading screen with a zombie's face.
## Reproduction
This crash happens at the same spot, every time I've tried re-running the game, without fail. I've uninstalled, re-installed, deleted "WWZ"-related directories, etc. The game still crashes at the initial loading screen with a zombie's face.
<!--
1. You can find the Steam AppID in the URL of the shop page of the game.
e.g. for `The Witcher 3: Wild Hunt` the AppID is `292030`.
2. You can find your driver and Linux version, as well as your graphics
processor's name in the system information report of Steam.
3. You can retrieve a full system information report by clicking
`Help` > `System Information` in the Steam client on your machine.
4. Please copy it to your clipboard by pressing `Ctrl+A` and then `Ctrl+C`.
Then paste it in a [Gist](https://gist.github.com/) and post the link in
this issue.
5. Please search for open issues and pull requests by the name of the game and
find out whether they are relevant and should be referenced above.
-->
|
non_design
|
world war z aftermath compatibility report name of the game with compatibility issues world war z aftermath steam appid of the game system information gpu nvidia gtx driver llvm version nvidia kernel version link to full system information report as proton version experimental i confirm that i haven t found an existing compatibility report for this game that i have checked whether there are updates for my system available please add proton log command to the game s launch options and attach the generated home steam appid log to this issue report as a file proton logs compress well if needed symptoms after running the game seemingly successfully on my initial try rather than immediately playing the game i went into the options menu changing the game s renderer from to vulkan i also enabled amd s fsr technology just to see if it would work i was prompted to restart the game upon doing so the game no longer reaches the main menu rather crashing on the initial loading screen with a zombie s face reproduction this crash happens at the same spot every time i ve tried re running the game without fail i ve uninstalled re installed deleted wwz related directories etc the game still crashes at the initial loading screen with a zombie s face you can find the steam appid in the url of the shop page of the game e g for the witcher wild hunt the appid is you can find your driver and linux version as well as your graphics processor s name in the system information report of steam you can retrieve a full system information report by clicking help system information in the steam client on your machine please copy it to your clipboard by pressing ctrl a and then ctrl c then paste it in a and post the link in this issue please search for open issues and pull requests by the name of the game and find out whether they are relevant and should be referenced above
| 0
|
64,768
| 7,840,443,623
|
IssuesEvent
|
2018-06-18 16:22:08
|
emfoundation/ce100-app
|
https://api.github.com/repos/emfoundation/ce100-app
|
closed
|
Design: improve tag selection
|
design
|
As a CE100 contact, I want to:
- see an overview of the tags I’ve selected before I save them so that I can double check they are relevant
- see how many tags are selected, and how many can still be selected, deselect or clear so that I can easily do my selection
|
1.0
|
Design: improve tag selection - As a CE100 contact, I want to:
- see an overview of the tags I’ve selected before I save them so that I can double check they are relevant
- see how many tags are selected, and how many can still be selected, deselect or clear so that I can easily do my selection
|
design
|
design improve tag selection as a contact i want to see an overview of the tags i’ve selected before i save them so that i can double check they are relevant see how many tags are selected and how many can still be selected deselect or clear so that i can easily do my selection
| 1
|
526,399
| 15,287,795,116
|
IssuesEvent
|
2021-02-23 16:09:05
|
SAP/ownid-server-sdk-net
|
https://api.github.com/repos/SAP/ownid-server-sdk-net
|
opened
|
console: short lifetime token
|
Priority: Medium Type: Enhancement
|
To make console more secure, we have to implement a mechanism when token lives not more than 5 minutes.
At the same time, if a user is inactive more than 10 hours, let's make him to enter email/password again.
|
1.0
|
console: short lifetime token - To make console more secure, we have to implement a mechanism when token lives not more than 5 minutes.
At the same time, if a user is inactive more than 10 hours, let's make him to enter email/password again.
|
non_design
|
console short lifetime token to make console more secure we have to implement a mechanism when token lives not more than minutes at the same time if a user is inactive more than hours let s make him to enter email password again
| 0
|
196,721
| 14,886,913,697
|
IssuesEvent
|
2021-01-20 17:34:59
|
eventespresso/event-espresso-core
|
https://api.github.com/repos/eventespresso/event-espresso-core
|
closed
|
Relax Entity Edit Form Validation
|
category:admin-page-ui-&-ux status:needs-testing type:bug 🐞
|
Currently the name field for tickets & datetimes is marked as required, but should be optional
|
1.0
|
Relax Entity Edit Form Validation - Currently the name field for tickets & datetimes is marked as required, but should be optional
|
non_design
|
relax entity edit form validation currently the name field for tickets datetimes is marked as required but should be optional
| 0
|
224,297
| 17,171,669,630
|
IssuesEvent
|
2021-07-15 05:55:51
|
openebs/lvm-localpv
|
https://api.github.com/repos/openebs/lvm-localpv
|
closed
|
How to configure snapshot size using SnapshotClass
|
documentation
|
**Describe the problem/challenge you have**
LVM Localpv now support the `SnapSize` SnapshotClass parameter to configure the snapshot size , but the documentation is missing which should be added.
|
1.0
|
How to configure snapshot size using SnapshotClass - **Describe the problem/challenge you have**
LVM Localpv now support the `SnapSize` SnapshotClass parameter to configure the snapshot size , but the documentation is missing which should be added.
|
non_design
|
how to configure snapshot size using snapshotclass describe the problem challenge you have lvm localpv now support the snapsize snapshotclass parameter to configure the snapshot size but the documentation is missing which should be added
| 0
|
270,907
| 20,614,339,035
|
IssuesEvent
|
2022-03-07 11:43:18
|
borgbackup/borg
|
https://api.github.com/repos/borgbackup/borg
|
reopened
|
better document borg check --max-duration
|
documentation
|
Copied from the alpha testing ticket:
Regarding --max-duration:
It is mentioned in "borg check" with one sentence: "do only a partial repo check for max. SECONDS seconds (Default: unlimited)".
There is no further explaination why and when this option could be used. I assume that instead of one full long-running check Borg can run multiple partial checks that last --max-duration until the whole repo is checked. But this fact doesn't seem to be expressed with this clarity in the docs. It should be mentioned in borg check.
|
1.0
|
better document borg check --max-duration - Copied from the alpha testing ticket:
Regarding --max-duration:
It is mentioned in "borg check" with one sentence: "do only a partial repo check for max. SECONDS seconds (Default: unlimited)".
There is no further explaination why and when this option could be used. I assume that instead of one full long-running check Borg can run multiple partial checks that last --max-duration until the whole repo is checked. But this fact doesn't seem to be expressed with this clarity in the docs. It should be mentioned in borg check.
|
non_design
|
better document borg check max duration copied from the alpha testing ticket regarding max duration it is mentioned in borg check with one sentence do only a partial repo check for max seconds seconds default unlimited there is no further explaination why and when this option could be used i assume that instead of one full long running check borg can run multiple partial checks that last max duration until the whole repo is checked but this fact doesn t seem to be expressed with this clarity in the docs it should be mentioned in borg check
| 0
|