Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
12,522
| 14,966,752,886
|
IssuesEvent
|
2021-01-27 14:56:30
|
panther-labs/panther
|
https://api.github.com/repos/panther-labs/panther
|
closed
|
Panther should add support built-in schemas in YAML format
|
epic p1 team:data processing
|
### Description
Panther should be able to load built-in schemas defined in YAML
### RFC
TBD
### Designs
Not required - this is a purely backend feature
### Acceptance Criteria
- We have a new repo e.g. panther-logs that contains schemas defined in YAML format
- Upon a new Panther deployment or a redeployment, Panther will fetch those schemas from the repo and add them in the built-in supported log types
- The solution should be designed in a way that allows eventually updating built-in log types at runtime using the same mechanism but upon some user interaction
|
1.0
|
Panther should add support built-in schemas in YAML format - ### Description
Panther should be able to load built-in schemas defined in YAML
### RFC
TBD
### Designs
Not required - this is a purely backend feature
### Acceptance Criteria
- We have a new repo e.g. panther-logs that contains schemas defined in YAML format
- Upon a new Panther deployment or a redeployment, Panther will fetch those schemas from the repo and add them in the built-in supported log types
- The solution should be designed in a way that allows eventually updating built-in log types at runtime using the same mechanism but upon some user interaction
|
process
|
panther should add support built in schemas in yaml format description panther should be able to load built in schemas defined in yaml rfc tbd designs not required this is a purely backend feature acceptance criteria we have a new repo e g panther logs that contains schemas defined in yaml format upon a new panther deployment or a redeployment panther will fetch those schemas from the repo and add them in the built in supported log types the solution should be designed in a way that allows eventually updating built in log types at runtime using the same mechanism but upon some user interaction
| 1
|
8,775
| 11,899,354,555
|
IssuesEvent
|
2020-03-30 08:52:26
|
MHRA/products
|
https://api.github.com/repos/MHRA/products
|
opened
|
Job status endpoint isn't returning XML response
|
BUG :bug: EPIC - Auto Batch Process :oncoming_automobile:
|
**Describe the bug**
Job status endpoint isn't returning XML response - only JSON. Reported by Accenture team whilst performing SIT.
**To Reproduce**
Make XML request to job status endpoint.
**Expected behavior**
Response should be in xml format, not JSON.
**Screenshots**
N/A
**Additional context**
Assuming that XML filter that was added to other endpoints wasn't added to the job status endpoint.
|
1.0
|
Job status endpoint isn't returning XML response - **Describe the bug**
Job status endpoint isn't returning XML response - only JSON. Reported by Accenture team whilst performing SIT.
**To Reproduce**
Make XML request to job status endpoint.
**Expected behavior**
Response should be in xml format, not JSON.
**Screenshots**
N/A
**Additional context**
Assuming that XML filter that was added to other endpoints wasn't added to the job status endpoint.
|
process
|
job status endpoint isn t returning xml response describe the bug job status endpoint isn t returning xml response only json reported by accenture team whilst performing sit to reproduce make xml request to job status endpoint expected behavior response should be in xml format not json screenshots n a additional context assuming that xml filter that was added to other endpoints wasn t added to the job status endpoint
| 1
|
236,200
| 25,971,543,256
|
IssuesEvent
|
2022-12-19 11:39:15
|
nk7598/linux-4.19.72
|
https://api.github.com/repos/nk7598/linux-4.19.72
|
closed
|
CVE-2022-3635 (High) detected in linuxlinux-4.19.269 - autoclosed
|
security vulnerability
|
## CVE-2022-3635 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.269</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nk7598/linux-4.19.72/commit/8d6de636016872da224f31e7d9d0fe96d373b46c">8d6de636016872da224f31e7d9d0fe96d373b46c</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/atm/idt77252.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/atm/idt77252.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability, which was classified as critical, has been found in Linux Kernel. Affected by this issue is the function tst_timer of the file drivers/atm/idt77252.c of the component IPsec. The manipulation leads to use after free. It is recommended to apply a patch to fix this issue. VDB-211934 is the identifier assigned to this vulnerability.
<p>Publish Date: 2022-10-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3635>CVE-2022-3635</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2022-3635">https://www.linuxkernelcves.com/cves/CVE-2022-3635</a></p>
<p>Release Date: 2022-10-21</p>
<p>Fix Resolution: v4.9.326,v4.14.291,v4.19.256,v5.4.211,v5.10.138,v5.15.63</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-3635 (High) detected in linuxlinux-4.19.269 - autoclosed - ## CVE-2022-3635 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.269</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nk7598/linux-4.19.72/commit/8d6de636016872da224f31e7d9d0fe96d373b46c">8d6de636016872da224f31e7d9d0fe96d373b46c</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/atm/idt77252.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/atm/idt77252.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability, which was classified as critical, has been found in Linux Kernel. Affected by this issue is the function tst_timer of the file drivers/atm/idt77252.c of the component IPsec. The manipulation leads to use after free. It is recommended to apply a patch to fix this issue. VDB-211934 is the identifier assigned to this vulnerability.
<p>Publish Date: 2022-10-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3635>CVE-2022-3635</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2022-3635">https://www.linuxkernelcves.com/cves/CVE-2022-3635</a></p>
<p>Release Date: 2022-10-21</p>
<p>Fix Resolution: v4.9.326,v4.14.291,v4.19.256,v5.4.211,v5.10.138,v5.15.63</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in linuxlinux autoclosed cve high severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href vulnerable source files drivers atm c drivers atm c vulnerability details a vulnerability which was classified as critical has been found in linux kernel affected by this issue is the function tst timer of the file drivers atm c of the component ipsec the manipulation leads to use after free it is recommended to apply a patch to fix this issue vdb is the identifier assigned to this vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
17,178
| 22,756,508,455
|
IssuesEvent
|
2022-07-07 17:01:05
|
alchemistry/alchemlyb
|
https://api.github.com/repos/alchemistry/alchemlyb
|
closed
|
statistical_inefficiency does not slice data frame
|
bug preprocessors
|
### Description
When using `alchemlyb.preprocessing.statistical_inefficiency` with a data frame, a series to calculate statistical inefficiency on, and slicing parameters (`lower`, `upper`, `step`), the slicing is only applied to the series, but not to the data frame. Since `statistical_inefficiency` uses a raw index to subsample the data frame, the series and the data frame are out of sync by the end of the function. This leads to results that seem, at the very least, unintuitive.
### Examples
I think that a few examples make this much easier to explain. Assume I have loaded some data, e.g.
```python
from alchemlyb.parsing.gmx import extract_u_nk
data = extract_u_nk('lambda_0.xvg', T=298.15)
```
and want to use only part of this data, e.g.
```python
lower = 1000
upper = 5000
```
then I would expect a few things:
1. Slicing removes times before and after my set limits → this works ✅
```python
from alchemlyb.preprocessing import slicing
sliced_data = slicing(data, lower=lower, upper=upper)
assert all(sliced_data.reset_index()['time'] >= lower)
assert all(sliced_data.reset_index()['time'] <= upper)
```
2. Using the slicing arguments with `statistical_inefficiency` removes times before and after my set limits → this does not work ❌
```python
from alchemlyb.preprocessing import statistical_inefficiency
subsampled_data = statistical_inefficiency(data, data.columns[0], lower=lower, upper=upper)
assert all(subsampled_data.reset_index()['time'] >= lower)
assert all(subsampled_data.reset_index()['time'] <= upper)
```
Specifically, I find that the times are shifted, i.e. that the lowest time of `subsampled_data` is 0, and the highest time is `upper - lower`.
3. Using slicing on the data frame before running `statistical_inefficiency`, or running `statistical_inefficiency` with slicing parameters leads to the same result → this does not work ❌
```python
sliced_data = slicing(data, lower=lower, upper=upper)
subsampled_sliced_data = statistical_inefficiency(sliced_data, data.columns[0])
subsampled_data = statistical_inefficiency(data, data.columns[0], lower=lower, upper=upper)
assert (subsampled_data == subsampled_sliced_data).all(axis=None)
```
The problem here is, unsurprisingly, similar to the one above: `subsampled_data` starts at time 0, ends at time `upper - lower`, whereas `subsampled_sliced_data` starts at time `lower` and ends at time `upper`.
4. The issues demonstrated in cases 2 and 3 are especially problematic if we imagine a use case in which we would repeatedly sample from the same data, e.g.
```python
window_size=1000
for window_idx in range(5):
lower = window_idx * window_size
upper = (window_idx + 1) * window_size
subsampled_data = statistical_inefficiency(data, data.columns[0], lower=lower, upper=upper)
# ...
# further analysis of subsampled_data for window here
# ...
```
My expectation here would be to analyze windows of different data at every iteration. The current implementation would, however, always subsample from the window 0 to 1000. (The subsampled data might differ between windows, since the statistical inefficiency is actually calculated on a different window of the series each time, but the resulting indices are applied to the same window of the data frame every time.)
5. These issues could lead to correlated samples in case that the `step` keyword is used. Imagine that we have data which is correlated such that the `conservative` subsampling algorithm picks every 10th frame. If we were to use `statistical_inefficiency` with `step=10` (by chance, or because we have a feeling that we might have sampled data too frequently), then `statistical_inefficiency` would return every frame of the first 10% of the data! So we would have lost a lot of information, and would continue working with heavily sampled data.
### Proposed solution
I will propose a change shortly which adds tests (in line with examples 1 to 3) and a fix for this behavior. I think the most stable fix is to call `slicing` on both the series and the dataframe. It might be possible to calculate an offset index plus a stride to avoid slicing the dataframe, but that seems like pointless overengineering since there is a `slicing` function readily available that makes sure that the series and dataframe slicing will always be done in the same way.
PS: Maybe I've misunderstood everything and this is intended behavior. In this case, please close the PR, but I would then suggest to add some documentation (ideally also a note when calling the function) warning about this behavior 🙂
|
1.0
|
statistical_inefficiency does not slice data frame - ### Description
When using `alchemlyb.preprocessing.statistical_inefficiency` with a data frame, a series to calculate statistical inefficiency on, and slicing parameters (`lower`, `upper`, `step`), the slicing is only applied to the series, but not to the data frame. Since `statistical_inefficiency` uses a raw index to subsample the data frame, the series and the data frame are out of sync by the end of the function. This leads to results that seem, at the very least, unintuitive.
### Examples
I think that a few examples make this much easier to explain. Assume I have loaded some data, e.g.
```python
from alchemlyb.parsing.gmx import extract_u_nk
data = extract_u_nk('lambda_0.xvg', T=298.15)
```
and want to use only part of this data, e.g.
```python
lower = 1000
upper = 5000
```
then I would expect a few things:
1. Slicing removes times before and after my set limits → this works ✅
```python
from alchemlyb.preprocessing import slicing
sliced_data = slicing(data, lower=lower, upper=upper)
assert all(sliced_data.reset_index()['time'] >= lower)
assert all(sliced_data.reset_index()['time'] <= upper)
```
2. Using the slicing arguments with `statistical_inefficiency` removes times before and after my set limits → this does not work ❌
```python
from alchemlyb.preprocessing import statistical_inefficiency
subsampled_data = statistical_inefficiency(data, data.columns[0], lower=lower, upper=upper)
assert all(subsampled_data.reset_index()['time'] >= lower)
assert all(subsampled_data.reset_index()['time'] <= upper)
```
Specifically, I find that the times are shifted, i.e. that the lowest time of `subsampled_data` is 0, and the highest time is `upper - lower`.
3. Using slicing on the data frame before running `statistical_inefficiency`, or running `statistical_inefficiency` with slicing parameters leads to the same result → this does not work ❌
```python
sliced_data = slicing(data, lower=lower, upper=upper)
subsampled_sliced_data = statistical_inefficiency(sliced_data, data.columns[0])
subsampled_data = statistical_inefficiency(data, data.columns[0], lower=lower, upper=upper)
assert (subsampled_data == subsampled_sliced_data).all(axis=None)
```
The problem here is, unsurprisingly, similar to the one above: `subsampled_data` starts at time 0, ends at time `upper - lower`, whereas `subsampled_sliced_data` starts at time `lower` and ends at time `upper`.
4. The issues demonstrated in cases 2 and 3 are especially problematic if we imagine a use case in which we would repeatedly sample from the same data, e.g.
```python
window_size=1000
for window_idx in range(5):
lower = window_idx * window_size
upper = (window_idx + 1) * window_size
subsampled_data = statistical_inefficiency(data, data.columns[0], lower=lower, upper=upper)
# ...
# further analysis of subsampled_data for window here
# ...
```
My expectation here would be to analyze windows of different data at every iteration. The current implementation would, however, always subsample from the window 0 to 1000. (The subsampled data might differ between windows, since the statistical inefficiency is actually calculated on a different window of the series each time, but the resulting indices are applied to the same window of the data frame every time.)
5. These issues could lead to correlated samples in case that the `step` keyword is used. Imagine that we have data which is correlated such that the `conservative` subsampling algorithm picks every 10th frame. If we were to use `statistical_inefficiency` with `step=10` (by chance, or because we have a feeling that we might have sampled data too frequently), then `statistical_inefficiency` would return every frame of the first 10% of the data! So we would have lost a lot of information, and would continue working with heavily sampled data.
### Proposed solution
I will propose a change shortly which adds tests (in line with examples 1 to 3) and a fix for this behavior. I think the most stable fix is to call `slicing` on both the series and the dataframe. It might be possible to calculate an offset index plus a stride to avoid slicing the dataframe, but that seems like pointless overengineering since there is a `slicing` function readily available that makes sure that the series and dataframe slicing will always be done in the same way.
PS: Maybe I've misunderstood everything and this is intended behavior. In this case, please close the PR, but I would then suggest to add some documentation (ideally also a note when calling the function) warning about this behavior 🙂
|
process
|
statistical inefficiency does not slice data frame description when using alchemlyb preprocessing statistical inefficiency with a data frame a series to calculate statistical inefficiency on and slicing parameters lower upper step the slicing is only applied to the series but not to the data frame since statistical inefficiency uses a raw index to subsample the data frame the series and the data frame are out of sync by the end of the function this leads to results that seem at the very least unintuitive examples i think that a few examples make this much easier to explain assume i have loaded some data e g python from alchemlyb parsing gmx import extract u nk data extract u nk lambda xvg t and want to use only part of this data e g python lower upper then i would expect a few things slicing removes times before and after my set limits → this works ✅ python from alchemlyb preprocessing import slicing sliced data slicing data lower lower upper upper assert all sliced data reset index lower assert all sliced data reset index upper using the slicing arguments with statistical inefficiency removes times before and after my set limits → this does not work ❌ python from alchemlyb preprocessing import statistical inefficiency subsampled data statistical inefficiency data data columns lower lower upper upper assert all subsampled data reset index lower assert all subsampled data reset index upper specifically i find that the times are shifted i e that the lowest time of subsampled data is and the highest time is upper lower using slicing on the data frame before running statistical inefficiency or running statistical inefficiency with slicing parameters leads to the same result → this does not work ❌ python sliced data slicing data lower lower upper upper subsampled sliced data statistical inefficiency sliced data data columns subsampled data statistical inefficiency data data columns lower lower upper upper assert subsampled data subsampled sliced data all axis none the problem here is unsurprisingly similar to the one above subsampled data starts at time ends at time upper lower whereas subsampled sliced data starts at time lower and ends at time upper the issues demonstrated in cases and are especially problematic if we imagine a use case in which we would repeatedly sample from the same data e g python window size for window idx in range lower window idx window size upper window idx window size subsampled data statistical inefficiency data data columns lower lower upper upper further analysis of subsampled data for window here my expectation here would be to analyze windows of different data at every iteration the current implementation would however always subsample from the window to the subsampled data might differ between windows since the statistical inefficiency is actually calculated on a different window of the series each time but the resulting indices are applied to the same window of the data frame every time these issues could lead to correlated samples in case that the step keyword is used imagine that we have data which is correlated such that the conservative subsampling algorithm picks every frame if we were to use statistical inefficiency with step by chance or because we have a feeling that we might have sampled data too frequently then statistical inefficiency would return every frame of the first of the data so we would have lost a lot of information and would continue working with heavily sampled data proposed solution i will propose a change shortly which adds tests in line with examples to and a fix for this behavior i think the most stable fix is to call slicing on both the series and the dataframe it might be possible to calculate an offset index plus a stride to avoid slicing the dataframe but that seems like pointless overengineering since there is a slicing function readily available that makes sure that the series and dataframe slicing will always be done in the same way ps maybe i ve misunderstood everything and this is intended behavior in this case please close the pr but i would then suggest to add some documentation ideally also a note when calling the function warning about this behavior 🙂
| 1
|
71,391
| 23,606,729,747
|
IssuesEvent
|
2022-08-24 08:53:49
|
vector-im/element-call
|
https://api.github.com/repos/vector-im/element-call
|
closed
|
Debug log sending appears to be broken in 0.2.7
|
T-Defect
|
### Steps to reproduce
All the rageshakes are arriving without debug logs
### Outcome
#### What did you expect?
#### What happened instead?
### Operating system
_No response_
### Browser information
_No response_
### URL for webapp
_No response_
### Will you send logs?
No
|
1.0
|
Debug log sending appears to be broken in 0.2.7 - ### Steps to reproduce
All the rageshakes are arriving without debug logs
### Outcome
#### What did you expect?
#### What happened instead?
### Operating system
_No response_
### Browser information
_No response_
### URL for webapp
_No response_
### Will you send logs?
No
|
non_process
|
debug log sending appears to be broken in steps to reproduce all the rageshakes are arriving without debug logs outcome what did you expect what happened instead operating system no response browser information no response url for webapp no response will you send logs no
| 0
|
6,035
| 8,848,778,060
|
IssuesEvent
|
2019-01-08 08:22:30
|
AnotherCodeArtist/CEPWare
|
https://api.github.com/repos/AnotherCodeArtist/CEPWare
|
closed
|
Implement Telegram Notifications
|
WP6 - Complex Event Processing in progress
|
Apache Flink not only should write into Log files but send the correct status to your smartphone via telegram.
|
1.0
|
Implement Telegram Notifications - Apache Flink not only should write into Log files but send the correct status to your smartphone via telegram.
|
process
|
implement telegram notifications apache flink not only should write into log files but send the correct status to your smartphone via telegram
| 1
|
66,395
| 27,438,933,602
|
IssuesEvent
|
2023-03-02 09:38:21
|
hashicorp/terraform-provider-azurerm
|
https://api.github.com/repos/hashicorp/terraform-provider-azurerm
|
reopened
|
In azurerm_container_app the traffic_weight block in the ingress block should not be required
|
bug service/container-apps
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Community Note
<!--- Please keep this note for the community --->
* Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform Version
v1.3.7
### AzureRM Provider Version
v3.43.0
### Affected Resource(s)/Data Source(s)
azurerm_container_app
### Terraform Configuration Files
```hcl
variable "container_apps" {
description = "Specifies the container apps in the managed environment."
type = list(object({
name = string
revision_mode = optional(string)
ingress = optional(object({
allow_insecure_connections = optional(bool)
external_enabled = optional(bool)
target_port = optional(number)
transport = optional(string)
traffic_weight = optional(list(object({
label = optional(string)
latest_revision = optional(bool)
revision_suffix = optional(string)
percentage = optional(number)
})))
}))
dapr = optional(object({
app_id = optional(string)
app_port = optional(number)
app_protocol = optional(string)
}))
secrets = optional(list(object({
name = string
value = string
})))
template = object({
containers = list(object({
name = string
image = string
args = optional(list(string))
command = optional(list(string))
cpu = optional(number)
memory = optional(string)
env = optional(list(object({
name = string
secret_name = optional(string)
value = optional(string)
})))
}))
min_replicas = optional(number)
max_replicas = optional(number)
revision_suffix = optional(string)
volume = optional(list(object({
name = string
storage_name = optional(string)
storage_type = optional(string)
})))
})
}))
default = [{
name = "nodeapp"
revision_mode = "Single"
ingress = {
external_enabled = false
target_port = 3000
transport = "http"
traffic_weight = [{
label = "blue"
latest_revision = true
revision_suffix = "blue"
percentage = 100
}]
}
dapr = {
app_id = "nodeapp"
app_port = 3000
app_protocol = "http"
}
template = {
containers = [{
name = "hello-k8s-node"
image = "dapriosamples/hello-k8s-node:latest"
cpu = 0.5
memory = "1Gi"
env = [{
name = "APP_PORT"
value = 3000
}]
}]
min_replicas = 1
max_replicas = 1
}
},
{
name = "pythonapp"
revision_mode = "Single"
dapr = {
app_id = "pythonapp"
app_port = 80
}
template = {
containers = [{
name = "hello-k8s-python"
image = "dapriosamples/hello-k8s-python:latest"
cpu = 0.5
memory = "1Gi"
}]
min_replicas = 1
max_replicas = 1
}
}]
}
resource "azurerm_container_app" "container_app" {
for_each = {for app in var.container_apps: app.name => app}
name = each.key
resource_group_name = var.resource_group_name
container_app_environment_id = azurerm_container_app_environment.managed_environment.id
tags = var.tags
revision_mode = each.value.revision_mode
template {
dynamic "container" {
for_each = coalesce(each.value.template.containers, [])
content {
name = container.value.name
image = container.value.image
args = try(container.value.args, null)
command = try(container.value.command, null)
cpu = container.value.cpu
memory = container.value.memory
dynamic "env" {
for_each = coalesce(container.value.env, [])
content {
name = env.value.name
secret_name = try(env.value.secret_name, null)
value = try(env.value.value, null)
}
}
}
}
min_replicas = try(each.value.template.min_replicas, null)
max_replicas = try(each.value.template.max_replicas, null)
revision_suffix = try(each.value.template.revision_suffix, null)
dynamic "volume" {
for_each = each.value.template.volume != null ? [each.value.template.volume] : []
content {
name = volume.value.name
storage_name = try(volume.value.storage_name, null)
storage_type = try(volume.value.storage_type, null)
}
}
}
dynamic "ingress" {
for_each = each.value.ingress != null ? [each.value.ingress] : []
content {
allow_insecure_connections = try(ingress.value.allow_insecure_connections, null)
external_enabled = try(ingress.value.external_enabled, null)
target_port = ingress.value.target_port
transport = ingress.value.transport
dynamic "traffic_weight" {
for_each = coalesce(ingress.value.traffic_weight, [])
content {
label = traffic_weight.value.label
latest_revision = traffic_weight.value.latest_revision
revision_suffix = traffic_weight.value.revision_suffix
percentage = traffic_weight.value.percentage
}
}
}
}
dynamic "dapr" {
for_each = each.value.dapr != null ? [each.value.dapr] : []
content {
app_id = dapr.value.app_id
app_port = dapr.value.app_port
app_protocol = dapr.value.app_protocol
}
}
dynamic "secret" {
for_each = each.value.secrets != null ? [each.value.secrets] : []
content {
name = secret.value.name
value = secret.value.value
}
}
lifecycle {
ignore_changes = [
tags
]
}
}
```
### Debug Output/Panic Output
```shell
The container app successfully deploys:
module.container_apps.azurerm_container_app.container_app["nodeapp"]: Creation complete after 39s [id=/subscriptions/1a45a694-ae23-4650-9774-89a571c462f6/resourceGroups/AnubisRG/providers/Microsoft.App/containerApps/nodeapp]
If the container app exposes a public or private ingress, the engineer deploying the container app must specify a `traffic_weight` block in the `ingress` block in the `azurerm_container_app` resource even if this not necessary when creating the same resource using Bicep, ARM, Azure CLI, PowerShell, or Azure REST API.
```
### Expected Behaviour
In `azurerm_container_app` the `traffic_weight` block in the `ingress block` should be optional, while now you have to create at least one `traffic_weight` if the `ingress block` is not null.
### Actual Behaviour
If the container app exposes a public or private ingress, the engineer deploying the container app must specify a `traffic_weight` block in the `ingress` block in the `azurerm_container_app` resource even if this not necessary when creating the same resource using Bicep, ARM, Azure CLI, PowerShell, or Azure REST API.
### Steps to Reproduce
Deploy a container app that exposes a private and public ingress, specify a `traffic_weight` block in the `ingress` block in the `azurerm_container_app` resource even if this is not necessary when creating the same resource using Bicep, ARM, Azure CLI, PowerShell, or Azure REST API.
### Important Factoids
_No response_
### References
_No response_
|
1.0
|
In azurerm_container_app the traffic_weight block in the ingress block should not be required - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Community Note
<!--- Please keep this note for the community --->
* Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform Version
v1.3.7
### AzureRM Provider Version
v3.43.0
### Affected Resource(s)/Data Source(s)
azurerm_container_app
### Terraform Configuration Files
```hcl
variable "container_apps" {
description = "Specifies the container apps in the managed environment."
type = list(object({
name = string
revision_mode = optional(string)
ingress = optional(object({
allow_insecure_connections = optional(bool)
external_enabled = optional(bool)
target_port = optional(number)
transport = optional(string)
traffic_weight = optional(list(object({
label = optional(string)
latest_revision = optional(bool)
revision_suffix = optional(string)
percentage = optional(number)
})))
}))
dapr = optional(object({
app_id = optional(string)
app_port = optional(number)
app_protocol = optional(string)
}))
secrets = optional(list(object({
name = string
value = string
})))
template = object({
containers = list(object({
name = string
image = string
args = optional(list(string))
command = optional(list(string))
cpu = optional(number)
memory = optional(string)
env = optional(list(object({
name = string
secret_name = optional(string)
value = optional(string)
})))
}))
min_replicas = optional(number)
max_replicas = optional(number)
revision_suffix = optional(string)
volume = optional(list(object({
name = string
storage_name = optional(string)
storage_type = optional(string)
})))
})
}))
default = [{
name = "nodeapp"
revision_mode = "Single"
ingress = {
external_enabled = false
target_port = 3000
transport = "http"
traffic_weight = [{
label = "blue"
latest_revision = true
revision_suffix = "blue"
percentage = 100
}]
}
dapr = {
app_id = "nodeapp"
app_port = 3000
app_protocol = "http"
}
template = {
containers = [{
name = "hello-k8s-node"
image = "dapriosamples/hello-k8s-node:latest"
cpu = 0.5
memory = "1Gi"
env = [{
name = "APP_PORT"
value = 3000
}]
}]
min_replicas = 1
max_replicas = 1
}
},
{
name = "pythonapp"
revision_mode = "Single"
dapr = {
app_id = "pythonapp"
app_port = 80
}
template = {
containers = [{
name = "hello-k8s-python"
image = "dapriosamples/hello-k8s-python:latest"
cpu = 0.5
memory = "1Gi"
}]
min_replicas = 1
max_replicas = 1
}
}]
}
resource "azurerm_container_app" "container_app" {
for_each = {for app in var.container_apps: app.name => app}
name = each.key
resource_group_name = var.resource_group_name
container_app_environment_id = azurerm_container_app_environment.managed_environment.id
tags = var.tags
revision_mode = each.value.revision_mode
template {
dynamic "container" {
for_each = coalesce(each.value.template.containers, [])
content {
name = container.value.name
image = container.value.image
args = try(container.value.args, null)
command = try(container.value.command, null)
cpu = container.value.cpu
memory = container.value.memory
dynamic "env" {
for_each = coalesce(container.value.env, [])
content {
name = env.value.name
secret_name = try(env.value.secret_name, null)
value = try(env.value.value, null)
}
}
}
}
min_replicas = try(each.value.template.min_replicas, null)
max_replicas = try(each.value.template.max_replicas, null)
revision_suffix = try(each.value.template.revision_suffix, null)
dynamic "volume" {
for_each = each.value.template.volume != null ? [each.value.template.volume] : []
content {
name = volume.value.name
storage_name = try(volume.value.storage_name, null)
storage_type = try(volume.value.storage_type, null)
}
}
}
dynamic "ingress" {
for_each = each.value.ingress != null ? [each.value.ingress] : []
content {
allow_insecure_connections = try(ingress.value.allow_insecure_connections, null)
external_enabled = try(ingress.value.external_enabled, null)
target_port = ingress.value.target_port
transport = ingress.value.transport
dynamic "traffic_weight" {
for_each = coalesce(ingress.value.traffic_weight, [])
content {
label = traffic_weight.value.label
latest_revision = traffic_weight.value.latest_revision
revision_suffix = traffic_weight.value.revision_suffix
percentage = traffic_weight.value.percentage
}
}
}
}
dynamic "dapr" {
for_each = each.value.dapr != null ? [each.value.dapr] : []
content {
app_id = dapr.value.app_id
app_port = dapr.value.app_port
app_protocol = dapr.value.app_protocol
}
}
dynamic "secret" {
for_each = each.value.secrets != null ? [each.value.secrets] : []
content {
name = secret.value.name
value = secret.value.value
}
}
lifecycle {
ignore_changes = [
tags
]
}
}
```
### Debug Output/Panic Output
```shell
The container app successfully deploys:
module.container_apps.azurerm_container_app.container_app["nodeapp"]: Creation complete after 39s [id=/subscriptions/1a45a694-ae23-4650-9774-89a571c462f6/resourceGroups/AnubisRG/providers/Microsoft.App/containerApps/nodeapp]
If the container app exposes a public or private ingress, the engineer deploying the container app must specify a `traffic_weight` block in the `ingress` block in the `azurerm_container_app` resource even if this not necessary when creating the same resource using Bicep, ARM, Azure CLI, PowerShell, or Azure REST API.
```
### Expected Behaviour
In `azurerm_container_app` the `traffic_weight` block in the `ingress block` should be optional, while now you have to create at least one `traffic_weight` if the `ingress block` is not null.
### Actual Behaviour
If the container app exposes a public or private ingress, the engineer deploying the container app must specify a `traffic_weight` block in the `ingress` block in the `azurerm_container_app` resource even if this not necessary when creating the same resource using Bicep, ARM, Azure CLI, PowerShell, or Azure REST API.
### Steps to Reproduce
Deploy a container app that exposes a private and public ingress, specify a `traffic_weight` block in the `ingress` block in the `azurerm_container_app` resource even if this is not necessary when creating the same resource using Bicep, ARM, Azure CLI, PowerShell, or Azure REST API.
### Important Factoids
_No response_
### References
_No response_
|
non_process
|
in azurerm container app the traffic weight block in the ingress block should not be required is there an existing issue for this i have searched the existing issues community note please vote on this issue by adding a thumbsup to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform version azurerm provider version affected resource s data source s azurerm container app terraform configuration files hcl variable container apps description specifies the container apps in the managed environment type list object name string revision mode optional string ingress optional object allow insecure connections optional bool external enabled optional bool target port optional number transport optional string traffic weight optional list object label optional string latest revision optional bool revision suffix optional string percentage optional number dapr optional object app id optional string app port optional number app protocol optional string secrets optional list object name string value string template object containers list object name string image string args optional list string command optional list string cpu optional number memory optional string env optional list object name string secret name optional string value optional string min replicas optional number max replicas optional number revision suffix optional string volume optional list object name string storage name optional string storage type optional string default name nodeapp revision mode single ingress external enabled false target port transport http traffic weight label blue latest revision true revision suffix blue percentage dapr app id nodeapp app port app protocol http template containers name hello node image dapriosamples hello node latest cpu memory env name app port value min replicas max replicas name pythonapp revision mode single dapr app id pythonapp app port template containers name hello python image dapriosamples hello python latest cpu memory min replicas max replicas resource azurerm container app container app for each for app in var container apps app name app name each key resource group name var resource group name container app environment id azurerm container app environment managed environment id tags var tags revision mode each value revision mode template dynamic container for each coalesce each value template containers content name container value name image container value image args try container value args null command try container value command null cpu container value cpu memory container value memory dynamic env for each coalesce container value env content name env value name secret name try env value secret name null value try env value value null min replicas try each value template min replicas null max replicas try each value template max replicas null revision suffix try each value template revision suffix null dynamic volume for each each value template volume null content name volume value name storage name try volume value storage name null storage type try volume value storage type null dynamic ingress for each each value ingress null content allow insecure connections try ingress value allow insecure connections null external enabled try ingress value external enabled null target port ingress value target port transport ingress value transport dynamic traffic weight for each coalesce ingress value traffic weight content label traffic weight value label latest revision traffic weight value latest revision revision suffix traffic weight value revision suffix percentage traffic weight value percentage dynamic dapr for each each value dapr null content app id dapr value app id app port dapr value app port app protocol dapr value app protocol dynamic secret for each each value secrets null content name secret value name value secret value value lifecycle ignore changes tags debug output panic output shell the container app successfully deploys module container apps azurerm container app container app creation complete after if the container app exposes a public or private ingress the engineer deploying the container app must specify a traffic weight block in the ingress block in the azurerm container app resource even if this not necessary when creating the same resource using bicep arm azure cli powershell or azure rest api expected behaviour in azurerm container app the traffic weight block in the ingress block should be optional while now you have to create at least one traffic weight if the ingress block is not null actual behaviour if the container app exposes a public or private ingress the engineer deploying the container app must specify a traffic weight block in the ingress block in the azurerm container app resource even if this not necessary when creating the same resource using bicep arm azure cli powershell or azure rest api steps to reproduce deploy a container app that exposes a private and public ingress specify a traffic weight block in the ingress block in the azurerm container app resource even if this is not necessary when creating the same resource using bicep arm azure cli powershell or azure rest api important factoids no response references no response
| 0
|
286,575
| 21,579,737,237
|
IssuesEvent
|
2022-05-02 17:22:29
|
magma/domain-proxy
|
https://api.github.com/repos/magma/domain-proxy
|
opened
|
Fix Domain Proxy DocuSaurus documentation
|
bug documentation
|
Markdown headers of Domain Proxy doc files are in one line, causing the generated documentation to not properly detect md ids listed in the sidebar.
Correct current readmes and apply changes to 1.7.0 versioned docs.
Additionally, fix asset links to work properly with docusaurus
|
1.0
|
Fix Domain Proxy DocuSaurus documentation - Markdown headers of Domain Proxy doc files are in one line, causing the generated documentation to not properly detect md ids listed in the sidebar.
Correct current readmes and apply changes to 1.7.0 versioned docs.
Additionally, fix asset links to work properly with docusaurus
|
non_process
|
fix domain proxy docusaurus documentation markdown headers of domain proxy doc files are in one line causing the generated documentation to not properly detect md ids listed in the sidebar correct current readmes and apply changes to versioned docs additionally fix asset links to work properly with docusaurus
| 0
|
101,486
| 31,165,720,561
|
IssuesEvent
|
2023-08-16 19:31:27
|
MicrosoftDocs/visualstudio-docs
|
https://api.github.com/repos/MicrosoftDocs/visualstudio-docs
|
closed
|
WriteCodeFragment: Document _TypeName metadata
|
doc-bug visual-studio-windows/prod msbuild/tech Pri2
|
https://github.com/dotnet/msbuild/pull/6285 introduced new metadata respected by the `WriteCodeFragment` task to specify the type of parameters. It should be documented here.
We probably need an example--the ones provided in that PR by @reduckted are excellent.
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 9b571575-db30-7502-9787-a65d97200d10
* Version Independent ID: e6d55b9a-17f2-cb5e-1949-c800b394228e
* Content: [WriteCodeFragment Task - MSBuild](https://learn.microsoft.com/en-us/visualstudio/msbuild/writecodefragment-task?view=vs-2022)
* Content Source: [docs/msbuild/writecodefragment-task.md](https://github.com/MicrosoftDocs/visualstudio-docs/blob/main/docs/msbuild/writecodefragment-task.md)
* Product: **visual-studio-windows**
* Technology: **msbuild**
* GitHub Login: @ghogen
* Microsoft Alias: **ghogen**
|
1.0
|
WriteCodeFragment: Document _TypeName metadata - https://github.com/dotnet/msbuild/pull/6285 introduced new metadata respected by the `WriteCodeFragment` task to specify the type of parameters. It should be documented here.
We probably need an example--the ones provided in that PR by @reduckted are excellent.
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 9b571575-db30-7502-9787-a65d97200d10
* Version Independent ID: e6d55b9a-17f2-cb5e-1949-c800b394228e
* Content: [WriteCodeFragment Task - MSBuild](https://learn.microsoft.com/en-us/visualstudio/msbuild/writecodefragment-task?view=vs-2022)
* Content Source: [docs/msbuild/writecodefragment-task.md](https://github.com/MicrosoftDocs/visualstudio-docs/blob/main/docs/msbuild/writecodefragment-task.md)
* Product: **visual-studio-windows**
* Technology: **msbuild**
* GitHub Login: @ghogen
* Microsoft Alias: **ghogen**
|
non_process
|
writecodefragment document typename metadata introduced new metadata respected by the writecodefragment task to specify the type of parameters it should be documented here we probably need an example the ones provided in that pr by reduckted are excellent document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source product visual studio windows technology msbuild github login ghogen microsoft alias ghogen
| 0
|
509,621
| 14,740,565,037
|
IssuesEvent
|
2021-01-07 09:17:30
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
opened
|
Extend display of bans
|
Category: Accounts Priority: Medium
|
Bans of users should additionally display the reason and a note of
"If you think this is in error, contact support@strangeloopgames.com"
|
1.0
|
Extend display of bans - Bans of users should additionally display the reason and a note of
"If you think this is in error, contact support@strangeloopgames.com"
|
non_process
|
extend display of bans bans of users should additionally display the reason and a note of if you think this is in error contact support strangeloopgames com
| 0
|
10,871
| 13,640,426,298
|
IssuesEvent
|
2020-09-25 12:44:12
|
timberio/vector
|
https://api.github.com/repos/timberio/vector
|
closed
|
New `merge` remap function
|
domain: mapping domain: processing type: feature
|
The `merge` remap function would merge 2 objects together.
## Examples
For all examples we'll use this event:
```js
{
// ...
"parent1": {
"key1": "val1",
"child": {
"grandchild1": "val1"
}
},
"parent2": {
"key2", "val2",
"child": {
"grandchild2": "val2"
}
}
// ...
}
```
### Shallow merge
```
merge(.parent1, .parent2)
del(.parent2)
```
Results in:
```js
{
// ...
"parent1": {
"key1": "val1",
"key2", "val2",
"child": {
"grandchild2": "val2"
}
}
// ...
}
```
### Deep merge
```
merge(.parent1, .parent2, deep=true)
del(.parent2)
```
Results in:
```js
{
// ...
"parent1": {
"key1": "val1",
"key2", "val2",
"child": {
"grandchild1": "val1"
"grandchild2": "val2"
}
}
// ...
}
```
### Root
For clarity, users can merge with the root object with the `.` path:
```
merge(., .parent2)
del(.parent2)
```
|
1.0
|
New `merge` remap function - The `merge` remap function would merge 2 objects together.
## Examples
For all examples we'll use this event:
```js
{
// ...
"parent1": {
"key1": "val1",
"child": {
"grandchild1": "val1"
}
},
"parent2": {
"key2", "val2",
"child": {
"grandchild2": "val2"
}
}
// ...
}
```
### Shallow merge
```
merge(.parent1, .parent2)
del(.parent2)
```
Results in:
```js
{
// ...
"parent1": {
"key1": "val1",
"key2", "val2",
"child": {
"grandchild2": "val2"
}
}
// ...
}
```
### Deep merge
```
merge(.parent1, .parent2, deep=true)
del(.parent2)
```
Results in:
```js
{
// ...
"parent1": {
"key1": "val1",
"key2", "val2",
"child": {
"grandchild1": "val1"
"grandchild2": "val2"
}
}
// ...
}
```
### Root
For clarity, users can merge with the root object with the `.` path:
```
merge(., .parent2)
del(.parent2)
```
|
process
|
new merge remap function the merge remap function would merge objects together examples for all examples we ll use this event js child child shallow merge merge del results in js child deep merge merge deep true del results in js child root for clarity users can merge with the root object with the path merge del
| 1
|
10,692
| 13,485,887,839
|
IssuesEvent
|
2020-09-11 08:41:24
|
MHRA/products
|
https://api.github.com/repos/MHRA/products
|
opened
|
Convert PARs site to typescript
|
EPIC - PARs process
|
## Convert PARs site to typescript
As a user
I want to be able to use older browsers like IE11
So that I can use the same browser that I use for other things
## Acceptance Criteria
Site builds to target ES5 for older browsers. The easiest way to do this is to use typescript with the related configuration.
### Customer acceptance criteria
- [ ] _What the customer can check_
### Technical acceptance criteria
- [ ] _Technical things that have to happen_
### Data acceptance criteria
- [ ] _Data that needs to be collected_
### Testing acceptance criteria
- [ ] _Tests or testing enabling changes that need to be made_
## Data - Potential impact
**Size**
**Value**
**Effort**
### Exit Criteria met
- [ ] Backlog
- [ ] Discovery
- [ ] DUXD
- [ ] Development
- [ ] Quality Assurance
- [ ] Release and Validate
|
1.0
|
Convert PARs site to typescript - ## Convert PARs site to typescript
As a user
I want to be able to use older browsers like IE11
So that I can use the same browser that I use for other things
## Acceptance Criteria
Site builds to target ES5 for older browsers. The easiest way to do this is to use typescript with the related configuration.
### Customer acceptance criteria
- [ ] _What the customer can check_
### Technical acceptance criteria
- [ ] _Technical things that have to happen_
### Data acceptance criteria
- [ ] _Data that needs to be collected_
### Testing acceptance criteria
- [ ] _Tests or testing enabling changes that need to be made_
## Data - Potential impact
**Size**
**Value**
**Effort**
### Exit Criteria met
- [ ] Backlog
- [ ] Discovery
- [ ] DUXD
- [ ] Development
- [ ] Quality Assurance
- [ ] Release and Validate
|
process
|
convert pars site to typescript convert pars site to typescript as a user i want to be able to use older browsers like so that i can use the same browser that i use for other things acceptance criteria site builds to target for older browsers the easiest way to do this is to use typescript with the related configuration customer acceptance criteria what the customer can check technical acceptance criteria technical things that have to happen data acceptance criteria data that needs to be collected testing acceptance criteria tests or testing enabling changes that need to be made data potential impact size value effort exit criteria met backlog discovery duxd development quality assurance release and validate
| 1
|
131,799
| 18,396,897,117
|
IssuesEvent
|
2021-10-12 12:25:49
|
bounswe/2021SpringGroup3
|
https://api.github.com/repos/bounswe/2021SpringGroup3
|
closed
|
Implementation: Frontend for Creating/Updating Posts
|
Type: Design Status: Available Priority: High Component: Frontend
|
Please add UpdatePostViewController and CreatePostViewController classes to our views package.
This classes should consume our rest api's, which are implemented in CreatePostController and UpdatePostController.
After obtaining the relevant data, prepare the forms for these procedures in HTML/CSS/Bootstrap.
At the end, you may open a pull request and assign reviewers.
|
1.0
|
Implementation: Frontend for Creating/Updating Posts - Please add UpdatePostViewController and CreatePostViewController classes to our views package.
This classes should consume our rest api's, which are implemented in CreatePostController and UpdatePostController.
After obtaining the relevant data, prepare the forms for these procedures in HTML/CSS/Bootstrap.
At the end, you may open a pull request and assign reviewers.
|
non_process
|
implementation frontend for creating updating posts please add updatepostviewcontroller and createpostviewcontroller classes to our views package this classes should consume our rest api s which are implemented in createpostcontroller and updatepostcontroller after obtaining the relevant data prepare the forms for these procedures in html css bootstrap at the end you may open a pull request and assign reviewers
| 0
|
17,563
| 23,377,405,036
|
IssuesEvent
|
2022-08-11 05:42:20
|
Battle-s/battle-school-backend
|
https://api.github.com/repos/Battle-s/battle-school-backend
|
closed
|
[FEAT] 사용자 토큰 만료 검사
|
feature :computer: processing :hourglass_flowing_sand:
|
## 설명
> 이슈에 대한 설명을 작성합니다. 담당자도 함께 작성하면 좋습니다.
## 체크사항
> 이슈를 close하기 위해 필요한 조건들을 체크박스로 나열합니다.
- [ ] todo1
- [ ] todo2
- [ ] todo3
## 참고자료
> 이슈를 해결하기 위해 필요한 참고자료가 있다면 추가합니다.
## 관련 논의
> 이슈에 대한 논의가 있었다면 논의 내용을 간략하게 추가합니다.
|
1.0
|
[FEAT] 사용자 토큰 만료 검사 - ## 설명
> 이슈에 대한 설명을 작성합니다. 담당자도 함께 작성하면 좋습니다.
## 체크사항
> 이슈를 close하기 위해 필요한 조건들을 체크박스로 나열합니다.
- [ ] todo1
- [ ] todo2
- [ ] todo3
## 참고자료
> 이슈를 해결하기 위해 필요한 참고자료가 있다면 추가합니다.
## 관련 논의
> 이슈에 대한 논의가 있었다면 논의 내용을 간략하게 추가합니다.
|
process
|
사용자 토큰 만료 검사 설명 이슈에 대한 설명을 작성합니다 담당자도 함께 작성하면 좋습니다 체크사항 이슈를 close하기 위해 필요한 조건들을 체크박스로 나열합니다 참고자료 이슈를 해결하기 위해 필요한 참고자료가 있다면 추가합니다 관련 논의 이슈에 대한 논의가 있었다면 논의 내용을 간략하게 추가합니다
| 1
|
15,499
| 19,703,257,692
|
IssuesEvent
|
2022-01-12 18:51:42
|
googleapis/google-api-ruby-client
|
https://api.github.com/repos/googleapis/google-api-ruby-client
|
opened
|
Your .repo-metadata.json files have a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json files:
Result of scan 📈:
* must have required property 'release_level' in generated/google-apis-abusiveexperiencereport_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-abusiveexperiencereport_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-acceleratedmobilepageurl_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-acceleratedmobilepageurl_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-accessapproval_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-accessapproval_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-accesscontextmanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-accesscontextmanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-accesscontextmanager_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-accesscontextmanager_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexchangebuyer2_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexchangebuyer2_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexchangebuyer_v1_2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexchangebuyer_v1_2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexchangebuyer_v1_3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexchangebuyer_v1_3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexchangebuyer_v1_4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexchangebuyer_v1_4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexperiencereport_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexperiencereport_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admin_datatransfer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admin_datatransfer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admin_directory_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admin_directory_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admin_reports_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admin_reports_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admob_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admob_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admob_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admob_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adsense_v1_4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adsense_v1_4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adsense_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adsense_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adsensehost_v4_1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adsensehost_v4_1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-alertcenter_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-alertcenter_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-analytics_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-analytics_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-analyticsadmin_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-analyticsadmin_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-analyticsdata_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-analyticsdata_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-analyticsreporting_v4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-analyticsreporting_v4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-androiddeviceprovisioning_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-androiddeviceprovisioning_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-androidenterprise_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-androidenterprise_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-androidmanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-androidmanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-androidpublisher_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-androidpublisher_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-apigateway_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-apigateway_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-apigateway_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-apigateway_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-apigee_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-apigee_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-apikeys_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-apikeys_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-appengine_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-appengine_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-appengine_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-appengine_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-appengine_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-appengine_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-area120tables_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-area120tables_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-artifactregistry_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-artifactregistry_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-artifactregistry_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-artifactregistry_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-artifactregistry_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-artifactregistry_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-assuredworkloads_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-assuredworkloads_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-authorizedbuyersmarketplace_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-authorizedbuyersmarketplace_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-baremetalsolution_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-baremetalsolution_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-baremetalsolution_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-baremetalsolution_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-baremetalsolution_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-baremetalsolution_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigquery_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigquery_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigqueryconnection_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigqueryconnection_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigquerydatatransfer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigquerydatatransfer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigqueryreservation_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigqueryreservation_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigqueryreservation_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigqueryreservation_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigtableadmin_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigtableadmin_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigtableadmin_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigtableadmin_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-billingbudgets_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-billingbudgets_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-billingbudgets_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-billingbudgets_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-binaryauthorization_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-binaryauthorization_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-binaryauthorization_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-binaryauthorization_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-blogger_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-blogger_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-blogger_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-blogger_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-books_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-books_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-calendar_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-calendar_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-chat_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-chat_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-chromemanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-chromemanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-chromepolicy_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-chromepolicy_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-chromeuxreport_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-chromeuxreport_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-civicinfo_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-civicinfo_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-classroom_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-classroom_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1p4beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1p4beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1p5beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1p5beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1p7beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1p7beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbilling_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbilling_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbuild_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbuild_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbuild_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbuild_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbuild_v1alpha2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbuild_v1alpha2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbuild_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbuild_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudchannel_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudchannel_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-clouddebugger_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-clouddebugger_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-clouddeploy_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-clouddeploy_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-clouderrorreporting_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-clouderrorreporting_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudfunctions_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudfunctions_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudidentity_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudidentity_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudidentity_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudidentity_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudiot_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudiot_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudkms_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudkms_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudprofiler_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudprofiler_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudscheduler_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudscheduler_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudscheduler_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudscheduler_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudsearch_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudsearch_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudshell_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudshell_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudsupport_v2beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudsupport_v2beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtasks_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtasks_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtasks_v2beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtasks_v2beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtasks_v2beta3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtasks_v2beta3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtrace_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtrace_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtrace_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtrace_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtrace_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtrace_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-composer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-composer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-composer_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-composer_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-compute_alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-compute_alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-compute_beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-compute_beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-compute_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-compute_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-connectors_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-connectors_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-contactcenterinsights_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-contactcenterinsights_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-container_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-container_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-container_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-container_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-containeranalysis_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-containeranalysis_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-containeranalysis_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-containeranalysis_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-containeranalysis_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-containeranalysis_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-content_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-content_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-content_v2_1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-content_v2_1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-customsearch_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-customsearch_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datacatalog_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datacatalog_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datacatalog_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datacatalog_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dataflow_v1b3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dataflow_v1b3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datafusion_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datafusion_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datafusion_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datafusion_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datalabeling_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datalabeling_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datamigration_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datamigration_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datamigration_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datamigration_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datapipelines_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datapipelines_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dataproc_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dataproc_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dataproc_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dataproc_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastore_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastore_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastore_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastore_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastore_v1beta3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastore_v1beta3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastream_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastream_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastream_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastream_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-deploymentmanager_alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-deploymentmanager_alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-deploymentmanager_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-deploymentmanager_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-deploymentmanager_v2beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-deploymentmanager_v2beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dfareporting_v3_3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dfareporting_v3_3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dfareporting_v3_4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dfareporting_v3_4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dfareporting_v3_5/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dfareporting_v3_5/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dialogflow_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dialogflow_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dialogflow_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dialogflow_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dialogflow_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dialogflow_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dialogflow_v3beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dialogflow_v3beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-digitalassetlinks_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-digitalassetlinks_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-discovery_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-discovery_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-displayvideo_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-displayvideo_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dlp_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dlp_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dns_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dns_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dns_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dns_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-docs_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-docs_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-documentai_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-documentai_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-documentai_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-documentai_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-documentai_v1beta3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-documentai_v1beta3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-domains_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-domains_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-domains_v1alpha2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-domains_v1alpha2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-domains_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-domains_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-domainsrdap_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-domainsrdap_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-doubleclickbidmanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-doubleclickbidmanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-doubleclickbidmanager_v1_1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-doubleclickbidmanager_v1_1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-doubleclicksearch_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-doubleclicksearch_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-drive_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-drive_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-drive_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-drive_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-driveactivity_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-driveactivity_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-essentialcontacts_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-essentialcontacts_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-eventarc_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-eventarc_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-eventarc_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-eventarc_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-factchecktools_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-factchecktools_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-fcm_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-fcm_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-fcmdata_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-fcmdata_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-file_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-file_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-file_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-file_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebase_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebase_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebaseappcheck_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebaseappcheck_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasedatabase_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasedatabase_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasedynamiclinks_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasedynamiclinks_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasehosting_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasehosting_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasehosting_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasehosting_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebaseml_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebaseml_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebaseml_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebaseml_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebaserules_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebaserules_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasestorage_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasestorage_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firestore_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firestore_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firestore_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firestore_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firestore_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firestore_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-fitness_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-fitness_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-games_configuration_v1configuration/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-games_configuration_v1configuration/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-games_management_v1management/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-games_management_v1management/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-games_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-games_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gameservices_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gameservices_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gameservices_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gameservices_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-genomics_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-genomics_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-genomics_v1alpha2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-genomics_v1alpha2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-genomics_v2alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-genomics_v2alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1alpha2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1alpha2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gmail_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gmail_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gmailpostmastertools_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gmailpostmastertools_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gmailpostmastertools_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gmailpostmastertools_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-groupsmigration_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-groupsmigration_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-groupssettings_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-groupssettings_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-healthcare_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-healthcare_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-healthcare_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-healthcare_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-homegraph_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-homegraph_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-iam_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-iam_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-iamcredentials_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-iamcredentials_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-iap_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-iap_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-iap_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-iap_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ideahub_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ideahub_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ideahub_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ideahub_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-identitytoolkit_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-identitytoolkit_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-indexing_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-indexing_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-jobs_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-jobs_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-jobs_v3p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-jobs_v3p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-jobs_v4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-jobs_v4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-keep_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-keep_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-kgsearch_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-kgsearch_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-language_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-language_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-language_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-language_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-language_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-language_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-libraryagent_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-libraryagent_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-licensing_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-licensing_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-lifesciences_v2beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-lifesciences_v2beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-localservices_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-localservices_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-logging_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-logging_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-managedidentities_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-managedidentities_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-managedidentities_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-managedidentities_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-managedidentities_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-managedidentities_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-manufacturers_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-manufacturers_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-memcache_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-memcache_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-memcache_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-memcache_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-metastore_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-metastore_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-metastore_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-metastore_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ml_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ml_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-monitoring_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-monitoring_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-monitoring_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-monitoring_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessaccountmanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessaccountmanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessbusinessinformation_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessbusinessinformation_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinesslodging_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinesslodging_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessnotifications_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessnotifications_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessplaceactions_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessplaceactions_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessqanda_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessqanda_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessverifications_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessverifications_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkconnectivity_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkconnectivity_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkconnectivity_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkconnectivity_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkmanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkmanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkmanagement_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkmanagement_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networksecurity_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networksecurity_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networksecurity_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networksecurity_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkservices_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkservices_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkservices_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkservices_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-notebooks_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-notebooks_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-oauth2_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-oauth2_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ondemandscanning_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ondemandscanning_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ondemandscanning_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ondemandscanning_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-orgpolicy_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-orgpolicy_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-osconfig_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-osconfig_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-osconfig_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-osconfig_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-osconfig_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-osconfig_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-oslogin_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-oslogin_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-oslogin_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-oslogin_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-oslogin_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-oslogin_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pagespeedonline_v5/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pagespeedonline_v5/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-paymentsresellersubscription_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-paymentsresellersubscription_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-people_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-people_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-playablelocations_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-playablelocations_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-playcustomapp_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-playcustomapp_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policyanalyzer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policyanalyzer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policyanalyzer_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policyanalyzer_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policysimulator_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policysimulator_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policysimulator_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policysimulator_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policytroubleshooter_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policytroubleshooter_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policytroubleshooter_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policytroubleshooter_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-poly_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-poly_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-privateca_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-privateca_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-privateca_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-privateca_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-prod_tt_sasportal_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-prod_tt_sasportal_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pubsub_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pubsub_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pubsub_v1beta1a/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pubsub_v1beta1a/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pubsub_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pubsub_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pubsublite_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pubsublite_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-realtimebidding_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-realtimebidding_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-realtimebidding_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-realtimebidding_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-recaptchaenterprise_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-recaptchaenterprise_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-recommendationengine_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-recommendationengine_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-recommender_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-recommender_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-recommender_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-recommender_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-redis_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-redis_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-redis_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-redis_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-remotebuildexecution_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-remotebuildexecution_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-remotebuildexecution_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-remotebuildexecution_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-remotebuildexecution_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-remotebuildexecution_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-reseller_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-reseller_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-resourcesettings_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-resourcesettings_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-retail_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-retail_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-retail_v2alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-retail_v2alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-retail_v2beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-retail_v2beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-run_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-run_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-run_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-run_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-run_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-run_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-run_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-run_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-runtimeconfig_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-runtimeconfig_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-runtimeconfig_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-runtimeconfig_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-safebrowsing_v4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-safebrowsing_v4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sasportal_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sasportal_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-script_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-script_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-searchconsole_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-searchconsole_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-secretmanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-secretmanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-secretmanager_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-secretmanager_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-securitycenter_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-securitycenter_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-securitycenter_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-securitycenter_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-securitycenter_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-securitycenter_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-serviceconsumermanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-serviceconsumermanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-serviceconsumermanagement_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-serviceconsumermanagement_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicecontrol_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicecontrol_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicecontrol_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicecontrol_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicedirectory_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicedirectory_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicedirectory_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicedirectory_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicemanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicemanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicenetworking_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicenetworking_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicenetworking_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicenetworking_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-serviceusage_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-serviceusage_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-serviceusage_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-serviceusage_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sheets_v4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sheets_v4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-site_verification_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-site_verification_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-slides_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-slides_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-smartdevicemanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-smartdevicemanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sourcerepo_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sourcerepo_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-spanner_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-spanner_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-speech_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-speech_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-speech_v1p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-speech_v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-speech_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-speech_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sqladmin_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sqladmin_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sqladmin_v1beta4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sqladmin_v1beta4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-storage_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-storage_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-storagetransfer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-storagetransfer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-streetviewpublish_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-streetviewpublish_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sts_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sts_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sts_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sts_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tagmanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tagmanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tagmanager_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tagmanager_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tasks_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tasks_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-testing_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-testing_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-texttospeech_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-texttospeech_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-texttospeech_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-texttospeech_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-toolresults_v1beta3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-toolresults_v1beta3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tpu_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tpu_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tpu_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tpu_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tpu_v2alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tpu_v2alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-trafficdirector_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-trafficdirector_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-transcoder_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-transcoder_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-transcoder_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-transcoder_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-translate_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-translate_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-translate_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-translate_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-translate_v3beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-translate_v3beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vault_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vault_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vectortile_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vectortile_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-verifiedaccess_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-verifiedaccess_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-versionhistory_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-versionhistory_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1p2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1p2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1p3beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1p3beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vision_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vision_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vision_v1p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vision_v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vision_v1p2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vision_v1p2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vmmigration_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vmmigration_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vmmigration_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vmmigration_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-webfonts_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-webfonts_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-webmasters_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-webmasters_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-webrisk_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-webrisk_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-websecurityscanner_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-websecurityscanner_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-websecurityscanner_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-websecurityscanner_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-websecurityscanner_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-websecurityscanner_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-workflowexecutions_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-workflowexecutions_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-workflowexecutions_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-workflowexecutions_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-workflows_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-workflows_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-workflows_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-workflows_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-youtube_analytics_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-youtube_analytics_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-youtube_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-youtube_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-youtubereporting_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-youtubereporting_v1/.repo-metadata.json
* must have required property 'release_level' in google-api-client/.repo-metadata.json
* must have required property 'client_documentation' in google-api-client/.repo-metadata.json
* must have required property 'release_level' in google-apis-core/.repo-metadata.json
* must have required property 'client_documentation' in google-apis-core/.repo-metadata.json
* must have required property 'release_level' in google-apis-generator/.repo-metadata.json
* must have required property 'client_documentation' in google-apis-generator/.repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json files have a problem 🤒 - You have a problem with your .repo-metadata.json files:
Result of scan 📈:
* must have required property 'release_level' in generated/google-apis-abusiveexperiencereport_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-abusiveexperiencereport_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-acceleratedmobilepageurl_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-acceleratedmobilepageurl_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-accessapproval_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-accessapproval_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-accesscontextmanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-accesscontextmanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-accesscontextmanager_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-accesscontextmanager_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexchangebuyer2_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexchangebuyer2_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexchangebuyer_v1_2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexchangebuyer_v1_2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexchangebuyer_v1_3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexchangebuyer_v1_3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexchangebuyer_v1_4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexchangebuyer_v1_4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adexperiencereport_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adexperiencereport_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admin_datatransfer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admin_datatransfer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admin_directory_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admin_directory_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admin_reports_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admin_reports_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admob_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admob_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-admob_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-admob_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adsense_v1_4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adsense_v1_4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adsense_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adsense_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-adsensehost_v4_1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-adsensehost_v4_1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-alertcenter_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-alertcenter_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-analytics_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-analytics_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-analyticsadmin_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-analyticsadmin_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-analyticsdata_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-analyticsdata_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-analyticsreporting_v4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-analyticsreporting_v4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-androiddeviceprovisioning_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-androiddeviceprovisioning_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-androidenterprise_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-androidenterprise_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-androidmanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-androidmanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-androidpublisher_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-androidpublisher_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-apigateway_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-apigateway_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-apigateway_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-apigateway_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-apigee_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-apigee_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-apikeys_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-apikeys_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-appengine_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-appengine_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-appengine_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-appengine_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-appengine_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-appengine_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-area120tables_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-area120tables_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-artifactregistry_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-artifactregistry_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-artifactregistry_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-artifactregistry_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-artifactregistry_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-artifactregistry_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-assuredworkloads_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-assuredworkloads_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-authorizedbuyersmarketplace_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-authorizedbuyersmarketplace_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-baremetalsolution_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-baremetalsolution_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-baremetalsolution_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-baremetalsolution_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-baremetalsolution_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-baremetalsolution_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigquery_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigquery_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigqueryconnection_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigqueryconnection_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigquerydatatransfer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigquerydatatransfer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigqueryreservation_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigqueryreservation_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigqueryreservation_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigqueryreservation_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigtableadmin_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigtableadmin_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-bigtableadmin_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-bigtableadmin_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-billingbudgets_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-billingbudgets_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-billingbudgets_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-billingbudgets_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-binaryauthorization_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-binaryauthorization_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-binaryauthorization_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-binaryauthorization_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-blogger_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-blogger_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-blogger_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-blogger_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-books_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-books_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-calendar_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-calendar_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-chat_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-chat_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-chromemanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-chromemanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-chromepolicy_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-chromepolicy_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-chromeuxreport_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-chromeuxreport_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-civicinfo_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-civicinfo_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-classroom_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-classroom_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1p4beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1p4beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1p5beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1p5beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudasset_v1p7beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudasset_v1p7beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbilling_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbilling_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbuild_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbuild_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbuild_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbuild_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbuild_v1alpha2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbuild_v1alpha2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudbuild_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudbuild_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudchannel_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudchannel_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-clouddebugger_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-clouddebugger_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-clouddeploy_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-clouddeploy_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-clouderrorreporting_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-clouderrorreporting_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudfunctions_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudfunctions_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudidentity_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudidentity_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudidentity_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudidentity_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudiot_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudiot_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudkms_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudkms_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudprofiler_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudprofiler_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudresourcemanager_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudresourcemanager_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudscheduler_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudscheduler_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudscheduler_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudscheduler_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudsearch_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudsearch_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudshell_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudshell_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudsupport_v2beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudsupport_v2beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtasks_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtasks_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtasks_v2beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtasks_v2beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtasks_v2beta3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtasks_v2beta3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtrace_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtrace_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtrace_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtrace_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-cloudtrace_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-cloudtrace_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-composer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-composer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-composer_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-composer_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-compute_alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-compute_alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-compute_beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-compute_beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-compute_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-compute_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-connectors_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-connectors_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-contactcenterinsights_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-contactcenterinsights_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-container_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-container_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-container_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-container_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-containeranalysis_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-containeranalysis_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-containeranalysis_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-containeranalysis_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-containeranalysis_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-containeranalysis_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-content_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-content_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-content_v2_1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-content_v2_1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-customsearch_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-customsearch_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datacatalog_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datacatalog_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datacatalog_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datacatalog_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dataflow_v1b3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dataflow_v1b3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datafusion_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datafusion_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datafusion_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datafusion_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datalabeling_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datalabeling_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datamigration_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datamigration_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datamigration_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datamigration_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datapipelines_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datapipelines_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dataproc_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dataproc_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dataproc_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dataproc_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastore_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastore_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastore_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastore_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastore_v1beta3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastore_v1beta3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastream_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastream_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-datastream_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-datastream_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-deploymentmanager_alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-deploymentmanager_alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-deploymentmanager_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-deploymentmanager_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-deploymentmanager_v2beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-deploymentmanager_v2beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dfareporting_v3_3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dfareporting_v3_3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dfareporting_v3_4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dfareporting_v3_4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dfareporting_v3_5/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dfareporting_v3_5/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dialogflow_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dialogflow_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dialogflow_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dialogflow_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dialogflow_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dialogflow_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dialogflow_v3beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dialogflow_v3beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-digitalassetlinks_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-digitalassetlinks_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-discovery_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-discovery_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-displayvideo_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-displayvideo_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dlp_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dlp_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dns_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dns_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-dns_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-dns_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-docs_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-docs_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-documentai_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-documentai_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-documentai_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-documentai_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-documentai_v1beta3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-documentai_v1beta3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-domains_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-domains_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-domains_v1alpha2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-domains_v1alpha2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-domains_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-domains_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-domainsrdap_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-domainsrdap_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-doubleclickbidmanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-doubleclickbidmanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-doubleclickbidmanager_v1_1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-doubleclickbidmanager_v1_1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-doubleclicksearch_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-doubleclicksearch_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-drive_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-drive_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-drive_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-drive_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-driveactivity_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-driveactivity_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-essentialcontacts_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-essentialcontacts_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-eventarc_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-eventarc_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-eventarc_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-eventarc_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-factchecktools_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-factchecktools_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-fcm_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-fcm_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-fcmdata_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-fcmdata_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-file_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-file_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-file_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-file_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebase_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebase_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebaseappcheck_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebaseappcheck_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasedatabase_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasedatabase_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasedynamiclinks_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasedynamiclinks_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasehosting_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasehosting_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasehosting_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasehosting_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebaseml_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebaseml_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebaseml_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebaseml_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebaserules_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebaserules_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firebasestorage_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firebasestorage_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firestore_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firestore_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firestore_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firestore_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-firestore_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-firestore_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-fitness_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-fitness_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-games_configuration_v1configuration/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-games_configuration_v1configuration/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-games_management_v1management/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-games_management_v1management/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-games_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-games_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gameservices_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gameservices_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gameservices_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gameservices_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-genomics_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-genomics_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-genomics_v1alpha2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-genomics_v1alpha2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-genomics_v2alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-genomics_v2alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1alpha2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1alpha2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gkehub_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gkehub_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gmail_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gmail_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gmailpostmastertools_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gmailpostmastertools_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-gmailpostmastertools_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-gmailpostmastertools_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-groupsmigration_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-groupsmigration_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-groupssettings_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-groupssettings_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-healthcare_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-healthcare_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-healthcare_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-healthcare_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-homegraph_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-homegraph_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-iam_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-iam_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-iamcredentials_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-iamcredentials_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-iap_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-iap_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-iap_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-iap_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ideahub_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ideahub_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ideahub_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ideahub_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-identitytoolkit_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-identitytoolkit_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-indexing_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-indexing_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-jobs_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-jobs_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-jobs_v3p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-jobs_v3p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-jobs_v4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-jobs_v4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-keep_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-keep_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-kgsearch_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-kgsearch_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-language_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-language_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-language_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-language_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-language_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-language_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-libraryagent_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-libraryagent_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-licensing_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-licensing_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-lifesciences_v2beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-lifesciences_v2beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-localservices_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-localservices_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-logging_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-logging_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-managedidentities_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-managedidentities_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-managedidentities_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-managedidentities_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-managedidentities_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-managedidentities_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-manufacturers_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-manufacturers_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-memcache_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-memcache_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-memcache_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-memcache_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-metastore_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-metastore_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-metastore_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-metastore_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ml_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ml_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-monitoring_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-monitoring_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-monitoring_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-monitoring_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessaccountmanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessaccountmanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessbusinessinformation_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessbusinessinformation_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinesslodging_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinesslodging_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessnotifications_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessnotifications_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessplaceactions_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessplaceactions_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessqanda_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessqanda_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-mybusinessverifications_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-mybusinessverifications_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkconnectivity_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkconnectivity_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkconnectivity_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkconnectivity_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkmanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkmanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkmanagement_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkmanagement_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networksecurity_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networksecurity_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networksecurity_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networksecurity_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkservices_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkservices_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-networkservices_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-networkservices_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-notebooks_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-notebooks_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-oauth2_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-oauth2_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ondemandscanning_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ondemandscanning_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-ondemandscanning_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-ondemandscanning_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-orgpolicy_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-orgpolicy_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-osconfig_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-osconfig_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-osconfig_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-osconfig_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-osconfig_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-osconfig_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-oslogin_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-oslogin_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-oslogin_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-oslogin_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-oslogin_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-oslogin_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pagespeedonline_v5/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pagespeedonline_v5/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-paymentsresellersubscription_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-paymentsresellersubscription_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-people_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-people_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-playablelocations_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-playablelocations_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-playcustomapp_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-playcustomapp_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policyanalyzer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policyanalyzer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policyanalyzer_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policyanalyzer_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policysimulator_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policysimulator_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policysimulator_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policysimulator_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policytroubleshooter_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policytroubleshooter_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-policytroubleshooter_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-policytroubleshooter_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-poly_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-poly_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-privateca_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-privateca_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-privateca_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-privateca_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-prod_tt_sasportal_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-prod_tt_sasportal_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pubsub_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pubsub_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pubsub_v1beta1a/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pubsub_v1beta1a/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pubsub_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pubsub_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-pubsublite_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-pubsublite_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-realtimebidding_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-realtimebidding_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-realtimebidding_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-realtimebidding_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-recaptchaenterprise_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-recaptchaenterprise_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-recommendationengine_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-recommendationengine_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-recommender_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-recommender_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-recommender_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-recommender_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-redis_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-redis_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-redis_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-redis_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-remotebuildexecution_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-remotebuildexecution_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-remotebuildexecution_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-remotebuildexecution_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-remotebuildexecution_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-remotebuildexecution_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-reseller_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-reseller_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-resourcesettings_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-resourcesettings_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-retail_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-retail_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-retail_v2alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-retail_v2alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-retail_v2beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-retail_v2beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-run_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-run_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-run_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-run_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-run_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-run_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-run_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-run_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-runtimeconfig_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-runtimeconfig_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-runtimeconfig_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-runtimeconfig_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-safebrowsing_v4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-safebrowsing_v4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sasportal_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sasportal_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-script_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-script_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-searchconsole_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-searchconsole_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-secretmanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-secretmanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-secretmanager_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-secretmanager_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-securitycenter_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-securitycenter_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-securitycenter_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-securitycenter_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-securitycenter_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-securitycenter_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-serviceconsumermanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-serviceconsumermanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-serviceconsumermanagement_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-serviceconsumermanagement_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicecontrol_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicecontrol_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicecontrol_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicecontrol_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicedirectory_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicedirectory_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicedirectory_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicedirectory_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicemanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicemanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicenetworking_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicenetworking_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-servicenetworking_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-servicenetworking_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-serviceusage_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-serviceusage_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-serviceusage_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-serviceusage_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sheets_v4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sheets_v4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-site_verification_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-site_verification_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-slides_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-slides_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-smartdevicemanagement_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-smartdevicemanagement_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sourcerepo_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sourcerepo_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-spanner_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-spanner_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-speech_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-speech_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-speech_v1p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-speech_v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-speech_v2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-speech_v2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sqladmin_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sqladmin_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sqladmin_v1beta4/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sqladmin_v1beta4/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-storage_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-storage_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-storagetransfer_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-storagetransfer_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-streetviewpublish_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-streetviewpublish_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sts_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sts_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-sts_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-sts_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tagmanager_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tagmanager_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tagmanager_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tagmanager_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tasks_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tasks_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-testing_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-testing_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-texttospeech_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-texttospeech_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-texttospeech_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-texttospeech_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-toolresults_v1beta3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-toolresults_v1beta3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tpu_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tpu_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tpu_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tpu_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-tpu_v2alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-tpu_v2alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-trafficdirector_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-trafficdirector_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-transcoder_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-transcoder_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-transcoder_v1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-transcoder_v1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-translate_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-translate_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-translate_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-translate_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-translate_v3beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-translate_v3beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vault_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vault_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vectortile_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vectortile_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-verifiedaccess_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-verifiedaccess_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-versionhistory_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-versionhistory_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1beta2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1beta2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1p2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1p2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-videointelligence_v1p3beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-videointelligence_v1p3beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vision_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vision_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vision_v1p1beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vision_v1p1beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vision_v1p2beta1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vision_v1p2beta1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vmmigration_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vmmigration_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-vmmigration_v1alpha1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-vmmigration_v1alpha1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-webfonts_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-webfonts_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-webmasters_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-webmasters_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-webrisk_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-webrisk_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-websecurityscanner_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-websecurityscanner_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-websecurityscanner_v1alpha/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-websecurityscanner_v1alpha/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-websecurityscanner_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-websecurityscanner_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-workflowexecutions_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-workflowexecutions_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-workflowexecutions_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-workflowexecutions_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-workflows_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-workflows_v1/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-workflows_v1beta/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-workflows_v1beta/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-youtube_analytics_v2/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-youtube_analytics_v2/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-youtube_v3/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-youtube_v3/.repo-metadata.json
* must have required property 'release_level' in generated/google-apis-youtubereporting_v1/.repo-metadata.json
* must have required property 'client_documentation' in generated/google-apis-youtubereporting_v1/.repo-metadata.json
* must have required property 'release_level' in google-api-client/.repo-metadata.json
* must have required property 'client_documentation' in google-api-client/.repo-metadata.json
* must have required property 'release_level' in google-apis-core/.repo-metadata.json
* must have required property 'client_documentation' in google-apis-core/.repo-metadata.json
* must have required property 'release_level' in google-apis-generator/.repo-metadata.json
* must have required property 'client_documentation' in google-apis-generator/.repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json files have a problem 🤒 you have a problem with your repo metadata json files result of scan 📈 must have required property release level in generated google apis abusiveexperiencereport repo metadata json must have required property client documentation in generated google apis abusiveexperiencereport repo metadata json must have required property release level in generated google apis acceleratedmobilepageurl repo metadata json must have required property client documentation in generated google apis acceleratedmobilepageurl repo metadata json must have required property release level in generated google apis accessapproval repo metadata json must have required property client documentation in generated google apis accessapproval repo metadata json must have required property release level in generated google apis accesscontextmanager repo metadata json must have required property client documentation in generated google apis accesscontextmanager repo metadata json must have required property release level in generated google apis accesscontextmanager repo metadata json must have required property client documentation in generated google apis accesscontextmanager repo metadata json must have required property release level in generated google apis repo metadata json must have required property client documentation in generated google apis repo metadata json must have required property release level in generated google apis adexchangebuyer repo metadata json must have required property client documentation in generated google apis adexchangebuyer repo metadata json must have required property release level in generated google apis adexchangebuyer repo metadata json must have required property client documentation in generated google apis adexchangebuyer repo metadata json must have required property release level in generated google apis adexchangebuyer repo metadata json must have required property client documentation in generated google apis adexchangebuyer repo metadata json must have required property release level in generated google apis adexperiencereport repo metadata json must have required property client documentation in generated google apis adexperiencereport repo metadata json must have required property release level in generated google apis admin datatransfer repo metadata json must have required property client documentation in generated google apis admin datatransfer repo metadata json must have required property release level in generated google apis admin directory repo metadata json must have required property client documentation in generated google apis admin directory repo metadata json must have required property release level in generated google apis admin reports repo metadata json must have required property client documentation in generated google apis admin reports repo metadata json must have required property release level in generated google apis admob repo metadata json must have required property client documentation in generated google apis admob repo metadata json must have required property release level in generated google apis admob repo metadata json must have required property client documentation in generated google apis admob repo metadata json must have required property release level in generated google apis adsense repo metadata json must have required property client documentation in generated google apis adsense repo metadata json must have required property release level in generated google apis adsense repo metadata json must have required property client documentation in generated google apis adsense repo metadata json must have required property release level in generated google apis adsensehost repo metadata json must have required property client documentation in generated google apis adsensehost repo metadata json must have required property release level in generated google apis alertcenter repo metadata json must have required property client documentation in generated google apis alertcenter repo metadata json must have required property release level in generated google apis analytics repo metadata json must have required property client documentation in generated google apis analytics repo metadata json must have required property release level in generated google apis analyticsadmin repo metadata json must have required property client documentation in generated google apis analyticsadmin repo metadata json must have required property release level in generated google apis analyticsdata repo metadata json must have required property client documentation in generated google apis analyticsdata repo metadata json must have required property release level in generated google apis analyticsreporting repo metadata json must have required property client documentation in generated google apis analyticsreporting repo metadata json must have required property release level in generated google apis androiddeviceprovisioning repo metadata json must have required property client documentation in generated google apis androiddeviceprovisioning repo metadata json must have required property release level in generated google apis androidenterprise repo metadata json must have required property client documentation in generated google apis androidenterprise repo metadata json must have required property release level in generated google apis androidmanagement repo metadata json must have required property client documentation in generated google apis androidmanagement repo metadata json must have required property release level in generated google apis androidpublisher repo metadata json must have required property client documentation in generated google apis androidpublisher repo metadata json must have required property release level in generated google apis apigateway repo metadata json must have required property client documentation in generated google apis apigateway repo metadata json must have required property release level in generated google apis apigateway repo metadata json must have required property client documentation in generated google apis apigateway repo metadata json must have required property release level in generated google apis apigee repo metadata json must have required property client documentation in generated google apis apigee repo metadata json must have required property release level in generated google apis apikeys repo metadata json must have required property client documentation in generated google apis apikeys repo metadata json must have required property release level in generated google apis appengine repo metadata json must have required property client documentation in generated google apis appengine repo metadata json must have required property release level in generated google apis appengine repo metadata json must have required property client documentation in generated google apis appengine repo metadata json must have required property release level in generated google apis appengine repo metadata json must have required property client documentation in generated google apis appengine repo metadata json must have required property release level in generated google apis repo metadata json must have required property client documentation in generated google apis repo metadata json must have required property release level in generated google apis artifactregistry repo metadata json must have required property client documentation in generated google apis artifactregistry repo metadata json must have required property release level in generated google apis artifactregistry repo metadata json must have required property client documentation in generated google apis artifactregistry repo metadata json must have required property release level in generated google apis artifactregistry repo metadata json must have required property client documentation in generated google apis artifactregistry repo metadata json must have required property release level in generated google apis assuredworkloads repo metadata json must have required property client documentation in generated google apis assuredworkloads repo metadata json must have required property release level in generated google apis authorizedbuyersmarketplace repo metadata json must have required property client documentation in generated google apis authorizedbuyersmarketplace repo metadata json must have required property release level in generated google apis baremetalsolution repo metadata json must have required property client documentation in generated google apis baremetalsolution repo metadata json must have required property release level in generated google apis baremetalsolution repo metadata json must have required property client documentation in generated google apis baremetalsolution repo metadata json must have required property release level in generated google apis baremetalsolution repo metadata json must have required property client documentation in generated google apis baremetalsolution repo metadata json must have required property release level in generated google apis bigquery repo metadata json must have required property client documentation in generated google apis bigquery repo metadata json must have required property release level in generated google apis bigqueryconnection repo metadata json must have required property client documentation in generated google apis bigqueryconnection repo metadata json must have required property release level in generated google apis bigquerydatatransfer repo metadata json must have required property client documentation in generated google apis bigquerydatatransfer repo metadata json must have required property release level in generated google apis bigqueryreservation repo metadata json must have required property client documentation in generated google apis bigqueryreservation repo metadata json must have required property release level in generated google apis bigqueryreservation repo metadata json must have required property client documentation in generated google apis bigqueryreservation repo metadata json must have required property release level in generated google apis bigtableadmin repo metadata json must have required property client documentation in generated google apis bigtableadmin repo metadata json must have required property release level in generated google apis bigtableadmin repo metadata json must have required property client documentation in generated google apis bigtableadmin repo metadata json must have required property release level in generated google apis billingbudgets repo metadata json must have required property client documentation in generated google apis billingbudgets repo metadata json must have required property release level in generated google apis billingbudgets repo metadata json must have required property client documentation in generated google apis billingbudgets repo metadata json must have required property release level in generated google apis binaryauthorization repo metadata json must have required property client documentation in generated google apis binaryauthorization repo metadata json must have required property release level in generated google apis binaryauthorization repo metadata json must have required property client documentation in generated google apis binaryauthorization repo metadata json must have required property release level in generated google apis blogger repo metadata json must have required property client documentation in generated google apis blogger repo metadata json must have required property release level in generated google apis blogger repo metadata json must have required property client documentation in generated google apis blogger repo metadata json must have required property release level in generated google apis books repo metadata json must have required property client documentation in generated google apis books repo metadata json must have required property release level in generated google apis calendar repo metadata json must have required property client documentation in generated google apis calendar repo metadata json must have required property release level in generated google apis chat repo metadata json must have required property client documentation in generated google apis chat repo metadata json must have required property release level in generated google apis chromemanagement repo metadata json must have required property client documentation in generated google apis chromemanagement repo metadata json must have required property release level in generated google apis chromepolicy repo metadata json must have required property client documentation in generated google apis chromepolicy repo metadata json must have required property release level in generated google apis chromeuxreport repo metadata json must have required property client documentation in generated google apis chromeuxreport repo metadata json must have required property release level in generated google apis civicinfo repo metadata json must have required property client documentation in generated google apis civicinfo repo metadata json must have required property release level in generated google apis classroom repo metadata json must have required property client documentation in generated google apis classroom repo metadata json must have required property release level in generated google apis cloudasset repo metadata json must have required property client documentation in generated google apis cloudasset repo metadata json must have required property release level in generated google apis cloudasset repo metadata json must have required property client documentation in generated google apis cloudasset repo metadata json must have required property release level in generated google apis cloudasset repo metadata json must have required property client documentation in generated google apis cloudasset repo metadata json must have required property release level in generated google apis cloudasset repo metadata json must have required property client documentation in generated google apis cloudasset repo metadata json must have required property release level in generated google apis cloudasset repo metadata json must have required property client documentation in generated google apis cloudasset repo metadata json must have required property release level in generated google apis cloudasset repo metadata json must have required property client documentation in generated google apis cloudasset repo metadata json must have required property release level in generated google apis cloudbilling repo metadata json must have required property client documentation in generated google apis cloudbilling repo metadata json must have required property release level in generated google apis cloudbuild repo metadata json must have required property client documentation in generated google apis cloudbuild repo metadata json must have required property release level in generated google apis cloudbuild repo metadata json must have required property client documentation in generated google apis cloudbuild repo metadata json must have required property release level in generated google apis cloudbuild repo metadata json must have required property client documentation in generated google apis cloudbuild repo metadata json must have required property release level in generated google apis cloudbuild repo metadata json must have required property client documentation in generated google apis cloudbuild repo metadata json must have required property release level in generated google apis cloudchannel repo metadata json must have required property client documentation in generated google apis cloudchannel repo metadata json must have required property release level in generated google apis clouddebugger repo metadata json must have required property client documentation in generated google apis clouddebugger repo metadata json must have required property release level in generated google apis clouddeploy repo metadata json must have required property client documentation in generated google apis clouddeploy repo metadata json must have required property release level in generated google apis clouderrorreporting repo metadata json must have required property client documentation in generated google apis clouderrorreporting repo metadata json must have required property release level in generated google apis cloudfunctions repo metadata json must have required property client documentation in generated google apis cloudfunctions repo metadata json must have required property release level in generated google apis cloudidentity repo metadata json must have required property client documentation in generated google apis cloudidentity repo metadata json must have required property release level in generated google apis cloudidentity repo metadata json must have required property client documentation in generated google apis cloudidentity repo metadata json must have required property release level in generated google apis cloudiot repo metadata json must have required property client documentation in generated google apis cloudiot repo metadata json must have required property release level in generated google apis cloudkms repo metadata json must have required property client documentation in generated google apis cloudkms repo metadata json must have required property release level in generated google apis cloudprofiler repo metadata json must have required property client documentation in generated google apis cloudprofiler repo metadata json must have required property release level in generated google apis cloudresourcemanager repo metadata json must have required property client documentation in generated google apis cloudresourcemanager repo metadata json must have required property release level in generated google apis cloudresourcemanager repo metadata json must have required property client documentation in generated google apis cloudresourcemanager repo metadata json must have required property release level in generated google apis cloudresourcemanager repo metadata json must have required property client documentation in generated google apis cloudresourcemanager repo metadata json must have required property release level in generated google apis cloudresourcemanager repo metadata json must have required property client documentation in generated google apis cloudresourcemanager repo metadata json must have required property release level in generated google apis cloudresourcemanager repo metadata json must have required property client documentation in generated google apis cloudresourcemanager repo metadata json must have required property release level in generated google apis cloudscheduler repo metadata json must have required property client documentation in generated google apis cloudscheduler repo metadata json must have required property release level in generated google apis cloudscheduler repo metadata json must have required property client documentation in generated google apis cloudscheduler repo metadata json must have required property release level in generated google apis cloudsearch repo metadata json must have required property client documentation in generated google apis cloudsearch repo metadata json must have required property release level in generated google apis cloudshell repo metadata json must have required property client documentation in generated google apis cloudshell repo metadata json must have required property release level in generated google apis cloudsupport repo metadata json must have required property client documentation in generated google apis cloudsupport repo metadata json must have required property release level in generated google apis cloudtasks repo metadata json must have required property client documentation in generated google apis cloudtasks repo metadata json must have required property release level in generated google apis cloudtasks repo metadata json must have required property client documentation in generated google apis cloudtasks repo metadata json must have required property release level in generated google apis cloudtasks repo metadata json must have required property client documentation in generated google apis cloudtasks repo metadata json must have required property release level in generated google apis cloudtrace repo metadata json must have required property client documentation in generated google apis cloudtrace repo metadata json must have required property release level in generated google apis cloudtrace repo metadata json must have required property client documentation in generated google apis cloudtrace repo metadata json must have required property release level in generated google apis cloudtrace repo metadata json must have required property client documentation in generated google apis cloudtrace repo metadata json must have required property release level in generated google apis composer repo metadata json must have required property client documentation in generated google apis composer repo metadata json must have required property release level in generated google apis composer repo metadata json must have required property client documentation in generated google apis composer repo metadata json must have required property release level in generated google apis compute alpha repo metadata json must have required property client documentation in generated google apis compute alpha repo metadata json must have required property release level in generated google apis compute beta repo metadata json must have required property client documentation in generated google apis compute beta repo metadata json must have required property release level in generated google apis compute repo metadata json must have required property client documentation in generated google apis compute repo metadata json must have required property release level in generated google apis connectors repo metadata json must have required property client documentation in generated google apis connectors repo metadata json must have required property release level in generated google apis contactcenterinsights repo metadata json must have required property client documentation in generated google apis contactcenterinsights repo metadata json must have required property release level in generated google apis container repo metadata json must have required property client documentation in generated google apis container repo metadata json must have required property release level in generated google apis container repo metadata json must have required property client documentation in generated google apis container repo metadata json must have required property release level in generated google apis containeranalysis repo metadata json must have required property client documentation in generated google apis containeranalysis repo metadata json must have required property release level in generated google apis containeranalysis repo metadata json must have required property client documentation in generated google apis containeranalysis repo metadata json must have required property release level in generated google apis containeranalysis repo metadata json must have required property client documentation in generated google apis containeranalysis repo metadata json must have required property release level in generated google apis content repo metadata json must have required property client documentation in generated google apis content repo metadata json must have required property release level in generated google apis content repo metadata json must have required property client documentation in generated google apis content repo metadata json must have required property release level in generated google apis customsearch repo metadata json must have required property client documentation in generated google apis customsearch repo metadata json must have required property release level in generated google apis datacatalog repo metadata json must have required property client documentation in generated google apis datacatalog repo metadata json must have required property release level in generated google apis datacatalog repo metadata json must have required property client documentation in generated google apis datacatalog repo metadata json must have required property release level in generated google apis dataflow repo metadata json must have required property client documentation in generated google apis dataflow repo metadata json must have required property release level in generated google apis datafusion repo metadata json must have required property client documentation in generated google apis datafusion repo metadata json must have required property release level in generated google apis datafusion repo metadata json must have required property client documentation in generated google apis datafusion repo metadata json must have required property release level in generated google apis datalabeling repo metadata json must have required property client documentation in generated google apis datalabeling repo metadata json must have required property release level in generated google apis datamigration repo metadata json must have required property client documentation in generated google apis datamigration repo metadata json must have required property release level in generated google apis datamigration repo metadata json must have required property client documentation in generated google apis datamigration repo metadata json must have required property release level in generated google apis datapipelines repo metadata json must have required property client documentation in generated google apis datapipelines repo metadata json must have required property release level in generated google apis dataproc repo metadata json must have required property client documentation in generated google apis dataproc repo metadata json must have required property release level in generated google apis dataproc repo metadata json must have required property client documentation in generated google apis dataproc repo metadata json must have required property release level in generated google apis datastore repo metadata json must have required property client documentation in generated google apis datastore repo metadata json must have required property release level in generated google apis datastore repo metadata json must have required property client documentation in generated google apis datastore repo metadata json must have required property release level in generated google apis datastore repo metadata json must have required property client documentation in generated google apis datastore repo metadata json must have required property release level in generated google apis datastream repo metadata json must have required property client documentation in generated google apis datastream repo metadata json must have required property release level in generated google apis datastream repo metadata json must have required property client documentation in generated google apis datastream repo metadata json must have required property release level in generated google apis deploymentmanager alpha repo metadata json must have required property client documentation in generated google apis deploymentmanager alpha repo metadata json must have required property release level in generated google apis deploymentmanager repo metadata json must have required property client documentation in generated google apis deploymentmanager repo metadata json must have required property release level in generated google apis deploymentmanager repo metadata json must have required property client documentation in generated google apis deploymentmanager repo metadata json must have required property release level in generated google apis dfareporting repo metadata json must have required property client documentation in generated google apis dfareporting repo metadata json must have required property release level in generated google apis dfareporting repo metadata json must have required property client documentation in generated google apis dfareporting repo metadata json must have required property release level in generated google apis dfareporting repo metadata json must have required property client documentation in generated google apis dfareporting repo metadata json must have required property release level in generated google apis dialogflow repo metadata json must have required property client documentation in generated google apis dialogflow repo metadata json must have required property release level in generated google apis dialogflow repo metadata json must have required property client documentation in generated google apis dialogflow repo metadata json must have required property release level in generated google apis dialogflow repo metadata json must have required property client documentation in generated google apis dialogflow repo metadata json must have required property release level in generated google apis dialogflow repo metadata json must have required property client documentation in generated google apis dialogflow repo metadata json must have required property release level in generated google apis digitalassetlinks repo metadata json must have required property client documentation in generated google apis digitalassetlinks repo metadata json must have required property release level in generated google apis discovery repo metadata json must have required property client documentation in generated google apis discovery repo metadata json must have required property release level in generated google apis displayvideo repo metadata json must have required property client documentation in generated google apis displayvideo repo metadata json must have required property release level in generated google apis dlp repo metadata json must have required property client documentation in generated google apis dlp repo metadata json must have required property release level in generated google apis dns repo metadata json must have required property client documentation in generated google apis dns repo metadata json must have required property release level in generated google apis dns repo metadata json must have required property client documentation in generated google apis dns repo metadata json must have required property release level in generated google apis docs repo metadata json must have required property client documentation in generated google apis docs repo metadata json must have required property release level in generated google apis documentai repo metadata json must have required property client documentation in generated google apis documentai repo metadata json must have required property release level in generated google apis documentai repo metadata json must have required property client documentation in generated google apis documentai repo metadata json must have required property release level in generated google apis documentai repo metadata json must have required property client documentation in generated google apis documentai repo metadata json must have required property release level in generated google apis domains repo metadata json must have required property client documentation in generated google apis domains repo metadata json must have required property release level in generated google apis domains repo metadata json must have required property client documentation in generated google apis domains repo metadata json must have required property release level in generated google apis domains repo metadata json must have required property client documentation in generated google apis domains repo metadata json must have required property release level in generated google apis domainsrdap repo metadata json must have required property client documentation in generated google apis domainsrdap repo metadata json must have required property release level in generated google apis doubleclickbidmanager repo metadata json must have required property client documentation in generated google apis doubleclickbidmanager repo metadata json must have required property release level in generated google apis doubleclickbidmanager repo metadata json must have required property client documentation in generated google apis doubleclickbidmanager repo metadata json must have required property release level in generated google apis doubleclicksearch repo metadata json must have required property client documentation in generated google apis doubleclicksearch repo metadata json must have required property release level in generated google apis drive repo metadata json must have required property client documentation in generated google apis drive repo metadata json must have required property release level in generated google apis drive repo metadata json must have required property client documentation in generated google apis drive repo metadata json must have required property release level in generated google apis driveactivity repo metadata json must have required property client documentation in generated google apis driveactivity repo metadata json must have required property release level in generated google apis essentialcontacts repo metadata json must have required property client documentation in generated google apis essentialcontacts repo metadata json must have required property release level in generated google apis eventarc repo metadata json must have required property client documentation in generated google apis eventarc repo metadata json must have required property release level in generated google apis eventarc repo metadata json must have required property client documentation in generated google apis eventarc repo metadata json must have required property release level in generated google apis factchecktools repo metadata json must have required property client documentation in generated google apis factchecktools repo metadata json must have required property release level in generated google apis fcm repo metadata json must have required property client documentation in generated google apis fcm repo metadata json must have required property release level in generated google apis fcmdata repo metadata json must have required property client documentation in generated google apis fcmdata repo metadata json must have required property release level in generated google apis file repo metadata json must have required property client documentation in generated google apis file repo metadata json must have required property release level in generated google apis file repo metadata json must have required property client documentation in generated google apis file repo metadata json must have required property release level in generated google apis firebase repo metadata json must have required property client documentation in generated google apis firebase repo metadata json must have required property release level in generated google apis firebaseappcheck repo metadata json must have required property client documentation in generated google apis firebaseappcheck repo metadata json must have required property release level in generated google apis firebasedatabase repo metadata json must have required property client documentation in generated google apis firebasedatabase repo metadata json must have required property release level in generated google apis firebasedynamiclinks repo metadata json must have required property client documentation in generated google apis firebasedynamiclinks repo metadata json must have required property release level in generated google apis firebasehosting repo metadata json must have required property client documentation in generated google apis firebasehosting repo metadata json must have required property release level in generated google apis firebasehosting repo metadata json must have required property client documentation in generated google apis firebasehosting repo metadata json must have required property release level in generated google apis firebaseml repo metadata json must have required property client documentation in generated google apis firebaseml repo metadata json must have required property release level in generated google apis firebaseml repo metadata json must have required property client documentation in generated google apis firebaseml repo metadata json must have required property release level in generated google apis firebaserules repo metadata json must have required property client documentation in generated google apis firebaserules repo metadata json must have required property release level in generated google apis firebasestorage repo metadata json must have required property client documentation in generated google apis firebasestorage repo metadata json must have required property release level in generated google apis firestore repo metadata json must have required property client documentation in generated google apis firestore repo metadata json must have required property release level in generated google apis firestore repo metadata json must have required property client documentation in generated google apis firestore repo metadata json must have required property release level in generated google apis firestore repo metadata json must have required property client documentation in generated google apis firestore repo metadata json must have required property release level in generated google apis fitness repo metadata json must have required property client documentation in generated google apis fitness repo metadata json must have required property release level in generated google apis games configuration repo metadata json must have required property client documentation in generated google apis games configuration repo metadata json must have required property release level in generated google apis games management repo metadata json must have required property client documentation in generated google apis games management repo metadata json must have required property release level in generated google apis games repo metadata json must have required property client documentation in generated google apis games repo metadata json must have required property release level in generated google apis gameservices repo metadata json must have required property client documentation in generated google apis gameservices repo metadata json must have required property release level in generated google apis gameservices repo metadata json must have required property client documentation in generated google apis gameservices repo metadata json must have required property release level in generated google apis genomics repo metadata json must have required property client documentation in generated google apis genomics repo metadata json must have required property release level in generated google apis genomics repo metadata json must have required property client documentation in generated google apis genomics repo metadata json must have required property release level in generated google apis genomics repo metadata json must have required property client documentation in generated google apis genomics repo metadata json must have required property release level in generated google apis gkehub repo metadata json must have required property client documentation in generated google apis gkehub repo metadata json must have required property release level in generated google apis gkehub repo metadata json must have required property client documentation in generated google apis gkehub repo metadata json must have required property release level in generated google apis gkehub repo metadata json must have required property client documentation in generated google apis gkehub repo metadata json must have required property release level in generated google apis gkehub repo metadata json must have required property client documentation in generated google apis gkehub repo metadata json must have required property release level in generated google apis gkehub repo metadata json must have required property client documentation in generated google apis gkehub repo metadata json must have required property release level in generated google apis gmail repo metadata json must have required property client documentation in generated google apis gmail repo metadata json must have required property release level in generated google apis gmailpostmastertools repo metadata json must have required property client documentation in generated google apis gmailpostmastertools repo metadata json must have required property release level in generated google apis gmailpostmastertools repo metadata json must have required property client documentation in generated google apis gmailpostmastertools repo metadata json must have required property release level in generated google apis groupsmigration repo metadata json must have required property client documentation in generated google apis groupsmigration repo metadata json must have required property release level in generated google apis groupssettings repo metadata json must have required property client documentation in generated google apis groupssettings repo metadata json must have required property release level in generated google apis healthcare repo metadata json must have required property client documentation in generated google apis healthcare repo metadata json must have required property release level in generated google apis healthcare repo metadata json must have required property client documentation in generated google apis healthcare repo metadata json must have required property release level in generated google apis homegraph repo metadata json must have required property client documentation in generated google apis homegraph repo metadata json must have required property release level in generated google apis iam repo metadata json must have required property client documentation in generated google apis iam repo metadata json must have required property release level in generated google apis iamcredentials repo metadata json must have required property client documentation in generated google apis iamcredentials repo metadata json must have required property release level in generated google apis iap repo metadata json must have required property client documentation in generated google apis iap repo metadata json must have required property release level in generated google apis iap repo metadata json must have required property client documentation in generated google apis iap repo metadata json must have required property release level in generated google apis ideahub repo metadata json must have required property client documentation in generated google apis ideahub repo metadata json must have required property release level in generated google apis ideahub repo metadata json must have required property client documentation in generated google apis ideahub repo metadata json must have required property release level in generated google apis identitytoolkit repo metadata json must have required property client documentation in generated google apis identitytoolkit repo metadata json must have required property release level in generated google apis indexing repo metadata json must have required property client documentation in generated google apis indexing repo metadata json must have required property release level in generated google apis jobs repo metadata json must have required property client documentation in generated google apis jobs repo metadata json must have required property release level in generated google apis jobs repo metadata json must have required property client documentation in generated google apis jobs repo metadata json must have required property release level in generated google apis jobs repo metadata json must have required property client documentation in generated google apis jobs repo metadata json must have required property release level in generated google apis keep repo metadata json must have required property client documentation in generated google apis keep repo metadata json must have required property release level in generated google apis kgsearch repo metadata json must have required property client documentation in generated google apis kgsearch repo metadata json must have required property release level in generated google apis language repo metadata json must have required property client documentation in generated google apis language repo metadata json must have required property release level in generated google apis language repo metadata json must have required property client documentation in generated google apis language repo metadata json must have required property release level in generated google apis language repo metadata json must have required property client documentation in generated google apis language repo metadata json must have required property release level in generated google apis libraryagent repo metadata json must have required property client documentation in generated google apis libraryagent repo metadata json must have required property release level in generated google apis licensing repo metadata json must have required property client documentation in generated google apis licensing repo metadata json must have required property release level in generated google apis lifesciences repo metadata json must have required property client documentation in generated google apis lifesciences repo metadata json must have required property release level in generated google apis localservices repo metadata json must have required property client documentation in generated google apis localservices repo metadata json must have required property release level in generated google apis logging repo metadata json must have required property client documentation in generated google apis logging repo metadata json must have required property release level in generated google apis managedidentities repo metadata json must have required property client documentation in generated google apis managedidentities repo metadata json must have required property release level in generated google apis managedidentities repo metadata json must have required property client documentation in generated google apis managedidentities repo metadata json must have required property release level in generated google apis managedidentities repo metadata json must have required property client documentation in generated google apis managedidentities repo metadata json must have required property release level in generated google apis manufacturers repo metadata json must have required property client documentation in generated google apis manufacturers repo metadata json must have required property release level in generated google apis memcache repo metadata json must have required property client documentation in generated google apis memcache repo metadata json must have required property release level in generated google apis memcache repo metadata json must have required property client documentation in generated google apis memcache repo metadata json must have required property release level in generated google apis metastore repo metadata json must have required property client documentation in generated google apis metastore repo metadata json must have required property release level in generated google apis metastore repo metadata json must have required property client documentation in generated google apis metastore repo metadata json must have required property release level in generated google apis ml repo metadata json must have required property client documentation in generated google apis ml repo metadata json must have required property release level in generated google apis monitoring repo metadata json must have required property client documentation in generated google apis monitoring repo metadata json must have required property release level in generated google apis monitoring repo metadata json must have required property client documentation in generated google apis monitoring repo metadata json must have required property release level in generated google apis mybusinessaccountmanagement repo metadata json must have required property client documentation in generated google apis mybusinessaccountmanagement repo metadata json must have required property release level in generated google apis mybusinessbusinessinformation repo metadata json must have required property client documentation in generated google apis mybusinessbusinessinformation repo metadata json must have required property release level in generated google apis mybusinesslodging repo metadata json must have required property client documentation in generated google apis mybusinesslodging repo metadata json must have required property release level in generated google apis mybusinessnotifications repo metadata json must have required property client documentation in generated google apis mybusinessnotifications repo metadata json must have required property release level in generated google apis mybusinessplaceactions repo metadata json must have required property client documentation in generated google apis mybusinessplaceactions repo metadata json must have required property release level in generated google apis mybusinessqanda repo metadata json must have required property client documentation in generated google apis mybusinessqanda repo metadata json must have required property release level in generated google apis mybusinessverifications repo metadata json must have required property client documentation in generated google apis mybusinessverifications repo metadata json must have required property release level in generated google apis networkconnectivity repo metadata json must have required property client documentation in generated google apis networkconnectivity repo metadata json must have required property release level in generated google apis networkconnectivity repo metadata json must have required property client documentation in generated google apis networkconnectivity repo metadata json must have required property release level in generated google apis networkmanagement repo metadata json must have required property client documentation in generated google apis networkmanagement repo metadata json must have required property release level in generated google apis networkmanagement repo metadata json must have required property client documentation in generated google apis networkmanagement repo metadata json must have required property release level in generated google apis networksecurity repo metadata json must have required property client documentation in generated google apis networksecurity repo metadata json must have required property release level in generated google apis networksecurity repo metadata json must have required property client documentation in generated google apis networksecurity repo metadata json must have required property release level in generated google apis networkservices repo metadata json must have required property client documentation in generated google apis networkservices repo metadata json must have required property release level in generated google apis networkservices repo metadata json must have required property client documentation in generated google apis networkservices repo metadata json must have required property release level in generated google apis notebooks repo metadata json must have required property client documentation in generated google apis notebooks repo metadata json must have required property release level in generated google apis repo metadata json must have required property client documentation in generated google apis repo metadata json must have required property release level in generated google apis ondemandscanning repo metadata json must have required property client documentation in generated google apis ondemandscanning repo metadata json must have required property release level in generated google apis ondemandscanning repo metadata json must have required property client documentation in generated google apis ondemandscanning repo metadata json must have required property release level in generated google apis orgpolicy repo metadata json must have required property client documentation in generated google apis orgpolicy repo metadata json must have required property release level in generated google apis osconfig repo metadata json must have required property client documentation in generated google apis osconfig repo metadata json must have required property release level in generated google apis osconfig repo metadata json must have required property client documentation in generated google apis osconfig repo metadata json must have required property release level in generated google apis osconfig repo metadata json must have required property client documentation in generated google apis osconfig repo metadata json must have required property release level in generated google apis oslogin repo metadata json must have required property client documentation in generated google apis oslogin repo metadata json must have required property release level in generated google apis oslogin repo metadata json must have required property client documentation in generated google apis oslogin repo metadata json must have required property release level in generated google apis oslogin repo metadata json must have required property client documentation in generated google apis oslogin repo metadata json must have required property release level in generated google apis pagespeedonline repo metadata json must have required property client documentation in generated google apis pagespeedonline repo metadata json must have required property release level in generated google apis paymentsresellersubscription repo metadata json must have required property client documentation in generated google apis paymentsresellersubscription repo metadata json must have required property release level in generated google apis people repo metadata json must have required property client documentation in generated google apis people repo metadata json must have required property release level in generated google apis playablelocations repo metadata json must have required property client documentation in generated google apis playablelocations repo metadata json must have required property release level in generated google apis playcustomapp repo metadata json must have required property client documentation in generated google apis playcustomapp repo metadata json must have required property release level in generated google apis policyanalyzer repo metadata json must have required property client documentation in generated google apis policyanalyzer repo metadata json must have required property release level in generated google apis policyanalyzer repo metadata json must have required property client documentation in generated google apis policyanalyzer repo metadata json must have required property release level in generated google apis policysimulator repo metadata json must have required property client documentation in generated google apis policysimulator repo metadata json must have required property release level in generated google apis policysimulator repo metadata json must have required property client documentation in generated google apis policysimulator repo metadata json must have required property release level in generated google apis policytroubleshooter repo metadata json must have required property client documentation in generated google apis policytroubleshooter repo metadata json must have required property release level in generated google apis policytroubleshooter repo metadata json must have required property client documentation in generated google apis policytroubleshooter repo metadata json must have required property release level in generated google apis poly repo metadata json must have required property client documentation in generated google apis poly repo metadata json must have required property release level in generated google apis privateca repo metadata json must have required property client documentation in generated google apis privateca repo metadata json must have required property release level in generated google apis privateca repo metadata json must have required property client documentation in generated google apis privateca repo metadata json must have required property release level in generated google apis prod tt sasportal repo metadata json must have required property client documentation in generated google apis prod tt sasportal repo metadata json must have required property release level in generated google apis pubsub repo metadata json must have required property client documentation in generated google apis pubsub repo metadata json must have required property release level in generated google apis pubsub repo metadata json must have required property client documentation in generated google apis pubsub repo metadata json must have required property release level in generated google apis pubsub repo metadata json must have required property client documentation in generated google apis pubsub repo metadata json must have required property release level in generated google apis pubsublite repo metadata json must have required property client documentation in generated google apis pubsublite repo metadata json must have required property release level in generated google apis realtimebidding repo metadata json must have required property client documentation in generated google apis realtimebidding repo metadata json must have required property release level in generated google apis realtimebidding repo metadata json must have required property client documentation in generated google apis realtimebidding repo metadata json must have required property release level in generated google apis recaptchaenterprise repo metadata json must have required property client documentation in generated google apis recaptchaenterprise repo metadata json must have required property release level in generated google apis recommendationengine repo metadata json must have required property client documentation in generated google apis recommendationengine repo metadata json must have required property release level in generated google apis recommender repo metadata json must have required property client documentation in generated google apis recommender repo metadata json must have required property release level in generated google apis recommender repo metadata json must have required property client documentation in generated google apis recommender repo metadata json must have required property release level in generated google apis redis repo metadata json must have required property client documentation in generated google apis redis repo metadata json must have required property release level in generated google apis redis repo metadata json must have required property client documentation in generated google apis redis repo metadata json must have required property release level in generated google apis remotebuildexecution repo metadata json must have required property client documentation in generated google apis remotebuildexecution repo metadata json must have required property release level in generated google apis remotebuildexecution repo metadata json must have required property client documentation in generated google apis remotebuildexecution repo metadata json must have required property release level in generated google apis remotebuildexecution repo metadata json must have required property client documentation in generated google apis remotebuildexecution repo metadata json must have required property release level in generated google apis reseller repo metadata json must have required property client documentation in generated google apis reseller repo metadata json must have required property release level in generated google apis resourcesettings repo metadata json must have required property client documentation in generated google apis resourcesettings repo metadata json must have required property release level in generated google apis retail repo metadata json must have required property client documentation in generated google apis retail repo metadata json must have required property release level in generated google apis retail repo metadata json must have required property client documentation in generated google apis retail repo metadata json must have required property release level in generated google apis retail repo metadata json must have required property client documentation in generated google apis retail repo metadata json must have required property release level in generated google apis run repo metadata json must have required property client documentation in generated google apis run repo metadata json must have required property release level in generated google apis run repo metadata json must have required property client documentation in generated google apis run repo metadata json must have required property release level in generated google apis run repo metadata json must have required property client documentation in generated google apis run repo metadata json must have required property release level in generated google apis run repo metadata json must have required property client documentation in generated google apis run repo metadata json must have required property release level in generated google apis runtimeconfig repo metadata json must have required property client documentation in generated google apis runtimeconfig repo metadata json must have required property release level in generated google apis runtimeconfig repo metadata json must have required property client documentation in generated google apis runtimeconfig repo metadata json must have required property release level in generated google apis safebrowsing repo metadata json must have required property client documentation in generated google apis safebrowsing repo metadata json must have required property release level in generated google apis sasportal repo metadata json must have required property client documentation in generated google apis sasportal repo metadata json must have required property release level in generated google apis script repo metadata json must have required property client documentation in generated google apis script repo metadata json must have required property release level in generated google apis searchconsole repo metadata json must have required property client documentation in generated google apis searchconsole repo metadata json must have required property release level in generated google apis secretmanager repo metadata json must have required property client documentation in generated google apis secretmanager repo metadata json must have required property release level in generated google apis secretmanager repo metadata json must have required property client documentation in generated google apis secretmanager repo metadata json must have required property release level in generated google apis securitycenter repo metadata json must have required property client documentation in generated google apis securitycenter repo metadata json must have required property release level in generated google apis securitycenter repo metadata json must have required property client documentation in generated google apis securitycenter repo metadata json must have required property release level in generated google apis securitycenter repo metadata json must have required property client documentation in generated google apis securitycenter repo metadata json must have required property release level in generated google apis serviceconsumermanagement repo metadata json must have required property client documentation in generated google apis serviceconsumermanagement repo metadata json must have required property release level in generated google apis serviceconsumermanagement repo metadata json must have required property client documentation in generated google apis serviceconsumermanagement repo metadata json must have required property release level in generated google apis servicecontrol repo metadata json must have required property client documentation in generated google apis servicecontrol repo metadata json must have required property release level in generated google apis servicecontrol repo metadata json must have required property client documentation in generated google apis servicecontrol repo metadata json must have required property release level in generated google apis servicedirectory repo metadata json must have required property client documentation in generated google apis servicedirectory repo metadata json must have required property release level in generated google apis servicedirectory repo metadata json must have required property client documentation in generated google apis servicedirectory repo metadata json must have required property release level in generated google apis servicemanagement repo metadata json must have required property client documentation in generated google apis servicemanagement repo metadata json must have required property release level in generated google apis servicenetworking repo metadata json must have required property client documentation in generated google apis servicenetworking repo metadata json must have required property release level in generated google apis servicenetworking repo metadata json must have required property client documentation in generated google apis servicenetworking repo metadata json must have required property release level in generated google apis serviceusage repo metadata json must have required property client documentation in generated google apis serviceusage repo metadata json must have required property release level in generated google apis serviceusage repo metadata json must have required property client documentation in generated google apis serviceusage repo metadata json must have required property release level in generated google apis sheets repo metadata json must have required property client documentation in generated google apis sheets repo metadata json must have required property release level in generated google apis site verification repo metadata json must have required property client documentation in generated google apis site verification repo metadata json must have required property release level in generated google apis slides repo metadata json must have required property client documentation in generated google apis slides repo metadata json must have required property release level in generated google apis smartdevicemanagement repo metadata json must have required property client documentation in generated google apis smartdevicemanagement repo metadata json must have required property release level in generated google apis sourcerepo repo metadata json must have required property client documentation in generated google apis sourcerepo repo metadata json must have required property release level in generated google apis spanner repo metadata json must have required property client documentation in generated google apis spanner repo metadata json must have required property release level in generated google apis speech repo metadata json must have required property client documentation in generated google apis speech repo metadata json must have required property release level in generated google apis speech repo metadata json must have required property client documentation in generated google apis speech repo metadata json must have required property release level in generated google apis speech repo metadata json must have required property client documentation in generated google apis speech repo metadata json must have required property release level in generated google apis sqladmin repo metadata json must have required property client documentation in generated google apis sqladmin repo metadata json must have required property release level in generated google apis sqladmin repo metadata json must have required property client documentation in generated google apis sqladmin repo metadata json must have required property release level in generated google apis storage repo metadata json must have required property client documentation in generated google apis storage repo metadata json must have required property release level in generated google apis storagetransfer repo metadata json must have required property client documentation in generated google apis storagetransfer repo metadata json must have required property release level in generated google apis streetviewpublish repo metadata json must have required property client documentation in generated google apis streetviewpublish repo metadata json must have required property release level in generated google apis sts repo metadata json must have required property client documentation in generated google apis sts repo metadata json must have required property release level in generated google apis sts repo metadata json must have required property client documentation in generated google apis sts repo metadata json must have required property release level in generated google apis tagmanager repo metadata json must have required property client documentation in generated google apis tagmanager repo metadata json must have required property release level in generated google apis tagmanager repo metadata json must have required property client documentation in generated google apis tagmanager repo metadata json must have required property release level in generated google apis tasks repo metadata json must have required property client documentation in generated google apis tasks repo metadata json must have required property release level in generated google apis testing repo metadata json must have required property client documentation in generated google apis testing repo metadata json must have required property release level in generated google apis texttospeech repo metadata json must have required property client documentation in generated google apis texttospeech repo metadata json must have required property release level in generated google apis texttospeech repo metadata json must have required property client documentation in generated google apis texttospeech repo metadata json must have required property release level in generated google apis toolresults repo metadata json must have required property client documentation in generated google apis toolresults repo metadata json must have required property release level in generated google apis tpu repo metadata json must have required property client documentation in generated google apis tpu repo metadata json must have required property release level in generated google apis tpu repo metadata json must have required property client documentation in generated google apis tpu repo metadata json must have required property release level in generated google apis tpu repo metadata json must have required property client documentation in generated google apis tpu repo metadata json must have required property release level in generated google apis trafficdirector repo metadata json must have required property client documentation in generated google apis trafficdirector repo metadata json must have required property release level in generated google apis transcoder repo metadata json must have required property client documentation in generated google apis transcoder repo metadata json must have required property release level in generated google apis transcoder repo metadata json must have required property client documentation in generated google apis transcoder repo metadata json must have required property release level in generated google apis translate repo metadata json must have required property client documentation in generated google apis translate repo metadata json must have required property release level in generated google apis translate repo metadata json must have required property client documentation in generated google apis translate repo metadata json must have required property release level in generated google apis translate repo metadata json must have required property client documentation in generated google apis translate repo metadata json must have required property release level in generated google apis vault repo metadata json must have required property client documentation in generated google apis vault repo metadata json must have required property release level in generated google apis vectortile repo metadata json must have required property client documentation in generated google apis vectortile repo metadata json must have required property release level in generated google apis verifiedaccess repo metadata json must have required property client documentation in generated google apis verifiedaccess repo metadata json must have required property release level in generated google apis versionhistory repo metadata json must have required property client documentation in generated google apis versionhistory repo metadata json must have required property release level in generated google apis videointelligence repo metadata json must have required property client documentation in generated google apis videointelligence repo metadata json must have required property release level in generated google apis videointelligence repo metadata json must have required property client documentation in generated google apis videointelligence repo metadata json must have required property release level in generated google apis videointelligence repo metadata json must have required property client documentation in generated google apis videointelligence repo metadata json must have required property release level in generated google apis videointelligence repo metadata json must have required property client documentation in generated google apis videointelligence repo metadata json must have required property release level in generated google apis videointelligence repo metadata json must have required property client documentation in generated google apis videointelligence repo metadata json must have required property release level in generated google apis vision repo metadata json must have required property client documentation in generated google apis vision repo metadata json must have required property release level in generated google apis vision repo metadata json must have required property client documentation in generated google apis vision repo metadata json must have required property release level in generated google apis vision repo metadata json must have required property client documentation in generated google apis vision repo metadata json must have required property release level in generated google apis vmmigration repo metadata json must have required property client documentation in generated google apis vmmigration repo metadata json must have required property release level in generated google apis vmmigration repo metadata json must have required property client documentation in generated google apis vmmigration repo metadata json must have required property release level in generated google apis webfonts repo metadata json must have required property client documentation in generated google apis webfonts repo metadata json must have required property release level in generated google apis webmasters repo metadata json must have required property client documentation in generated google apis webmasters repo metadata json must have required property release level in generated google apis webrisk repo metadata json must have required property client documentation in generated google apis webrisk repo metadata json must have required property release level in generated google apis websecurityscanner repo metadata json must have required property client documentation in generated google apis websecurityscanner repo metadata json must have required property release level in generated google apis websecurityscanner repo metadata json must have required property client documentation in generated google apis websecurityscanner repo metadata json must have required property release level in generated google apis websecurityscanner repo metadata json must have required property client documentation in generated google apis websecurityscanner repo metadata json must have required property release level in generated google apis workflowexecutions repo metadata json must have required property client documentation in generated google apis workflowexecutions repo metadata json must have required property release level in generated google apis workflowexecutions repo metadata json must have required property client documentation in generated google apis workflowexecutions repo metadata json must have required property release level in generated google apis workflows repo metadata json must have required property client documentation in generated google apis workflows repo metadata json must have required property release level in generated google apis workflows repo metadata json must have required property client documentation in generated google apis workflows repo metadata json must have required property release level in generated google apis youtube analytics repo metadata json must have required property client documentation in generated google apis youtube analytics repo metadata json must have required property release level in generated google apis youtube repo metadata json must have required property client documentation in generated google apis youtube repo metadata json must have required property release level in generated google apis youtubereporting repo metadata json must have required property client documentation in generated google apis youtubereporting repo metadata json must have required property release level in google api client repo metadata json must have required property client documentation in google api client repo metadata json must have required property release level in google apis core repo metadata json must have required property client documentation in google apis core repo metadata json must have required property release level in google apis generator repo metadata json must have required property client documentation in google apis generator repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
| 1
|
374,744
| 11,094,908,134
|
IssuesEvent
|
2019-12-16 07:46:45
|
arcticicestudio/nord-jetbrains
|
https://api.github.com/repos/arcticicestudio/nord-jetbrains
|
closed
|
Go syntax highlighting support for IntelliJ/Goland 2019.3
|
context-syntax priority-high scope-compatibility scope-ux target-goland target-intellij-idea type-bug
|
Related to #69, #70
---
Like already documented and fixed in #70, IntelliJ/Goland version 2019.3 also [changes in Go's syntax highlight for the default bundled color schemes][goland].
This again requires to explicitly define the values for some attributes in order to achieve the same highlight like in previous versions that are matching Nord's style guidelines.
[goland]: https://www.jetbrains.com/go/whatsnew/#v2019-3-code-editing
|
1.0
|
Go syntax highlighting support for IntelliJ/Goland 2019.3 - Related to #69, #70
---
Like already documented and fixed in #70, IntelliJ/Goland version 2019.3 also [changes in Go's syntax highlight for the default bundled color schemes][goland].
This again requires to explicitly define the values for some attributes in order to achieve the same highlight like in previous versions that are matching Nord's style guidelines.
[goland]: https://www.jetbrains.com/go/whatsnew/#v2019-3-code-editing
|
non_process
|
go syntax highlighting support for intellij goland related to like already documented and fixed in intellij goland version also this again requires to explicitly define the values for some attributes in order to achieve the same highlight like in previous versions that are matching nord s style guidelines
| 0
|
10,026
| 13,044,161,476
|
IssuesEvent
|
2020-07-29 03:47:23
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `AddDateIntDecimal` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `AddDateIntDecimal` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @mapleFU
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
2.0
|
UCP: Migrate scalar function `AddDateIntDecimal` from TiDB -
## Description
Port the scalar function `AddDateIntDecimal` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @mapleFU
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
process
|
ucp migrate scalar function adddateintdecimal from tidb description port the scalar function adddateintdecimal from tidb to coprocessor score mentor s maplefu recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
|
307,840
| 9,423,023,910
|
IssuesEvent
|
2019-04-11 10:46:26
|
AugurProject/augur
|
https://api.github.com/repos/AugurProject/augur
|
closed
|
Presets to All-time
|
Priority: Low
|
On acct summary pg for Your Overview and Augur Status - presets should be All-Time to start
|
1.0
|
Presets to All-time - On acct summary pg for Your Overview and Augur Status - presets should be All-Time to start
|
non_process
|
presets to all time on acct summary pg for your overview and augur status presets should be all time to start
| 0
|
122,248
| 4,828,841,512
|
IssuesEvent
|
2016-11-07 17:16:09
|
sgloutnikov/sxm-collect
|
https://api.github.com/repos/sgloutnikov/sxm-collect
|
closed
|
Add clean/standardize functions for incoming data with filters
|
enhancement High Priority
|
Pass all incoming data through clean methods, with customizable filters.
- Strip last word of long track name. (>35 characters)
- 'J. Cole' sometimes 'J.Cole'
- Remove '(BED) XMRCHR' in song.
- Remove Various Artists
- If found X replace with Y.
- If found X delete X.
|
1.0
|
Add clean/standardize functions for incoming data with filters - Pass all incoming data through clean methods, with customizable filters.
- Strip last word of long track name. (>35 characters)
- 'J. Cole' sometimes 'J.Cole'
- Remove '(BED) XMRCHR' in song.
- Remove Various Artists
- If found X replace with Y.
- If found X delete X.
|
non_process
|
add clean standardize functions for incoming data with filters pass all incoming data through clean methods with customizable filters strip last word of long track name characters j cole sometimes j cole remove bed xmrchr in song remove various artists if found x replace with y if found x delete x
| 0
|
15,428
| 19,618,690,193
|
IssuesEvent
|
2022-01-07 01:34:56
|
beyondhb1079/s4us
|
https://api.github.com/repos/beyondhb1079/s4us
|
closed
|
Upgrade Material UI to 5
|
process
|
This is the latest stable version per https://mui.com/getting-started/usage/
Per [migration guide](https://mui.com/guides/migration-v4), we need to:
- [x] Upgrade React to v17
- [x] Upgrade (if needed) react-scripts, @types/react, @types/react-dom
- [x] Setup ThemeProvider properly
- [x] Update MUI version
- [x] Run codemods
- [x] Handle individual breaking changes
- [x] Theme structure
- [x] [Pickers](https://mui.com/guides/pickers-migration/)
- [x] #847
|
1.0
|
Upgrade Material UI to 5 - This is the latest stable version per https://mui.com/getting-started/usage/
Per [migration guide](https://mui.com/guides/migration-v4), we need to:
- [x] Upgrade React to v17
- [x] Upgrade (if needed) react-scripts, @types/react, @types/react-dom
- [x] Setup ThemeProvider properly
- [x] Update MUI version
- [x] Run codemods
- [x] Handle individual breaking changes
- [x] Theme structure
- [x] [Pickers](https://mui.com/guides/pickers-migration/)
- [x] #847
|
process
|
upgrade material ui to this is the latest stable version per per we need to upgrade react to upgrade if needed react scripts types react types react dom setup themeprovider properly update mui version run codemods handle individual breaking changes theme structure
| 1
|
13,201
| 15,646,185,801
|
IssuesEvent
|
2021-03-23 00:20:40
|
parcel-bundler/parcel
|
https://api.github.com/repos/parcel-bundler/parcel
|
closed
|
Parcel Not outputting prefixed css
|
:grey_question: Question CSS Preprocessing Stale
|
I have been trying to set up Parcel to output prefixed css but have not had any luck.
So far I've tried using a .postcssrc config with:
{
"plugins": {
"autoprefixer": true
}
}
but the outputted css had no vendor prefixes for flexbox or transforms.
Then I tried using a .browserlistrc file just using what are supposed to be defaults:
> 0.5%,
last 2 versions,
Firefox ESR,
not dead
but still nothing.
So am I missing something obvious?
Isn't this something that should be part of the "zero-configuration" concept?
If getting Parcel to use autoprefixer IS possible, how should I be setting this up?
|
1.0
|
Parcel Not outputting prefixed css - I have been trying to set up Parcel to output prefixed css but have not had any luck.
So far I've tried using a .postcssrc config with:
{
"plugins": {
"autoprefixer": true
}
}
but the outputted css had no vendor prefixes for flexbox or transforms.
Then I tried using a .browserlistrc file just using what are supposed to be defaults:
> 0.5%,
last 2 versions,
Firefox ESR,
not dead
but still nothing.
So am I missing something obvious?
Isn't this something that should be part of the "zero-configuration" concept?
If getting Parcel to use autoprefixer IS possible, how should I be setting this up?
|
process
|
parcel not outputting prefixed css i have been trying to set up parcel to output prefixed css but have not had any luck so far i ve tried using a postcssrc config with plugins autoprefixer true but the outputted css had no vendor prefixes for flexbox or transforms then i tried using a browserlistrc file just using what are supposed to be defaults last versions firefox esr not dead but still nothing so am i missing something obvious isn t this something that should be part of the zero configuration concept if getting parcel to use autoprefixer is possible how should i be setting this up
| 1
|
8,715
| 11,853,248,904
|
IssuesEvent
|
2020-03-24 21:34:02
|
Altinn/altinn-studio
|
https://api.github.com/repos/Altinn/altinn-studio
|
opened
|
Update application to support that user can go back from confirmation page
|
area/process kind/user-story
|
## Description
> Document the need briefly for the story. Start with the "who, what, why" eg "As a (type of user), I want (some goal), so that (reason)." The title of the user story should reflect this need in a shorter way.
## Screenshots
> Screenshots or links to Figma (make sure your sketch is public)
## Considerations
> Describe input (beyond tasks) on how the user story should be solved can be put here.
## Acceptance criteria
> Describe criteria here (i.e. What is allowed/not allowed (negative tesing), validations, error messages and warnings etc.)
## Specification tasks
- [ ] Development tasks are defined
- [ ] Test design / decide test need
## Development tasks
> Add tasks here
## Definition of done
Verify that this issue meets [DoD](https://confluence.brreg.no/display/T3KP/Definition+of+Done#DefinitionofDone-DoD%E2%80%93utvikling) (Only for project members) before closing.
- [ ] Documentation is updated (if relevant)
- [ ] Technical documentation (docs.altinn.studio)
- [ ] User documentation (altinn.github.io/docs)
- [ ] QA
- [ ] Manual test is complete (if relevant)
- [ ] Automated test is implemented (if relevant)
- [ ] All tasks in this userstory are closed (i.e. remaining tasks are moved to other user stories or marked obsolete)
|
1.0
|
Update application to support that user can go back from confirmation page - ## Description
> Document the need briefly for the story. Start with the "who, what, why" eg "As a (type of user), I want (some goal), so that (reason)." The title of the user story should reflect this need in a shorter way.
## Screenshots
> Screenshots or links to Figma (make sure your sketch is public)
## Considerations
> Describe input (beyond tasks) on how the user story should be solved can be put here.
## Acceptance criteria
> Describe criteria here (i.e. What is allowed/not allowed (negative tesing), validations, error messages and warnings etc.)
## Specification tasks
- [ ] Development tasks are defined
- [ ] Test design / decide test need
## Development tasks
> Add tasks here
## Definition of done
Verify that this issue meets [DoD](https://confluence.brreg.no/display/T3KP/Definition+of+Done#DefinitionofDone-DoD%E2%80%93utvikling) (Only for project members) before closing.
- [ ] Documentation is updated (if relevant)
- [ ] Technical documentation (docs.altinn.studio)
- [ ] User documentation (altinn.github.io/docs)
- [ ] QA
- [ ] Manual test is complete (if relevant)
- [ ] Automated test is implemented (if relevant)
- [ ] All tasks in this userstory are closed (i.e. remaining tasks are moved to other user stories or marked obsolete)
|
process
|
update application to support that user can go back from confirmation page description document the need briefly for the story start with the who what why eg as a type of user i want some goal so that reason the title of the user story should reflect this need in a shorter way screenshots screenshots or links to figma make sure your sketch is public considerations describe input beyond tasks on how the user story should be solved can be put here acceptance criteria describe criteria here i e what is allowed not allowed negative tesing validations error messages and warnings etc specification tasks development tasks are defined test design decide test need development tasks add tasks here definition of done verify that this issue meets only for project members before closing documentation is updated if relevant technical documentation docs altinn studio user documentation altinn github io docs qa manual test is complete if relevant automated test is implemented if relevant all tasks in this userstory are closed i e remaining tasks are moved to other user stories or marked obsolete
| 1
|
58,930
| 16,943,774,540
|
IssuesEvent
|
2021-06-28 01:47:03
|
vector-im/element-web
|
https://api.github.com/repos/vector-im/element-web
|
opened
|
New invites are incredibly easy to miss.
|
T-Defect
|
If your invite section is collapsed or scrolled out of sight, the only way you'll spot a new invite is if you remember the previous badge count and spot that it's incremented. Instead a toast or something would be way better to make sure users don't miss new invites. (It typically takes me about 6 hours to spot a new invite atm, which is far from ideal).
|
1.0
|
New invites are incredibly easy to miss. - If your invite section is collapsed or scrolled out of sight, the only way you'll spot a new invite is if you remember the previous badge count and spot that it's incremented. Instead a toast or something would be way better to make sure users don't miss new invites. (It typically takes me about 6 hours to spot a new invite atm, which is far from ideal).
|
non_process
|
new invites are incredibly easy to miss if your invite section is collapsed or scrolled out of sight the only way you ll spot a new invite is if you remember the previous badge count and spot that it s incremented instead a toast or something would be way better to make sure users don t miss new invites it typically takes me about hours to spot a new invite atm which is far from ideal
| 0
|
12,534
| 14,972,446,903
|
IssuesEvent
|
2021-01-27 22:52:46
|
parcel-bundler/parcel
|
https://api.github.com/repos/parcel-bundler/parcel
|
closed
|
HTML entities are replaced with the actual symbols in the compiled css
|
CSS Preprocessing
|
# 🐛 bug report
HTML entities are replaced with the actual symbols in the compiled css.
And thus, the symbols show up garbled.
For example: If I comple scss with a pseudo element with html entity like this:
Source SCSS:
```
#test:before { content: "\2713"; }
```
Compiled css:
```
#test:before { content: "✓"; }
```
The html entity '\2713' is replaced with the actual symbol '✓' in the complied css,
and then, it's shown like this "e✓" on browsers.
|
1.0
|
HTML entities are replaced with the actual symbols in the compiled css - # 🐛 bug report
HTML entities are replaced with the actual symbols in the compiled css.
And thus, the symbols show up garbled.
For example: If I comple scss with a pseudo element with html entity like this:
Source SCSS:
```
#test:before { content: "\2713"; }
```
Compiled css:
```
#test:before { content: "✓"; }
```
The html entity '\2713' is replaced with the actual symbol '✓' in the complied css,
and then, it's shown like this "e✓" on browsers.
|
process
|
html entities are replaced with the actual symbols in the compiled css 🐛 bug report html entities are replaced with the actual symbols in the compiled css and thus the symbols show up garbled for example if i comple scss with a pseudo element with html entity like this source scss test before content compiled css test before content ✓ the html entity is replaced with the actual symbol ✓ in the complied css and then it s shown like this e✓ on browsers
| 1
|
20,823
| 27,579,376,864
|
IssuesEvent
|
2023-03-08 15:13:04
|
camunda/issues
|
https://api.github.com/repos/camunda/issues
|
opened
|
Delete Selected Version of Decision Definition
|
component:operate component:zeebe component:zeebe-process-automation public feature-parity potential:8.3 riskAssessment:completed riskClass:medium
|
### Value Proposition Statement
Delete Decision Definition to free up storage, declutter user interface and prevent errors
### User Problem
- Currently, there is no way to delete decision definition from Zeebe cluster.
- While developing new decisions, developers frequently deploy different variants of definitions. This can bloat their development environment.
- Currently, the only way to deal with this challenge is recreation of cluster.
- I have multiple versions of Decision Definitions that are not being used anymore. I cannot delete them, what can cause accidentally trigger evaluation of out to date decisions.
### User Stories
- As a Developer, I can delete a selected version of deployed decision definition and all evaluated decisions of this version (1 operation) via Operate UI
- As a Developer, while using this feature, I can read basic information, what will be deleted
- As a Developer, I can see the progress of deletion in Operations Panel in Decisions tab
- As a Developer, I can read the documentation, explaining what is going to happen when I delete a version of decision definition
### Implementation Notes
- By deleting a version of a decision definition, I also delete all the evaluated decisions of this version
- Operations panel is available in Decisions tab - the same as in processes tab
This is the first, out of 4 iterations, to implement the whole feature of **Delete Process and Decision Definition**. More details in [Miro](https://miro.com/app/board/uXjVPNlXkFg=/)
**Why this scope?**
- Decision Definition doesn't have dependencies with other objects so it should be the simplest first iteration
- This iteration already brings great value for the customers and solves customers problem
**Iterations:**
1. https://github.com/camunda/product-hub/issues/94
2. https://github.com/camunda/product-hub/issues/615
3. https://github.com/camunda/product-hub/issues/619
4. https://github.com/camunda/product-hub/issues/620
### Validation Criteria
**Metrics**
- Number of deleted decision definitions
**Customer Validation**
- We have validated with at least 2 Enterprise customer that this works as expected (e.g. Domo, SEB,SwissRE)
- Validated with at least 1 community member, that commented on the public issues
### Breakdown
<!-- Please link to sub-issues / -tasks contributing to respective epic phase or phase results where appropriate. -->
#### Discovery phase ##
<!-- Example: "Conduct customer interview with xyz" -->
#### Define phase ##
https://github.com/camunda/product-design/issues/16
#### Additional security testing
* Test for security login and monitoring failure
* Ensure current permission management remains in effect
Design Planning
* Reviewed by design: 04 July 2022, reviewed again in Dec 2022
* Designer assigned: Yes
* Assignee: @gastonpillet01
* [Design process](https://www.figma.com/file/cbmCPVF9OaQFBt8pierKOa/Delete-Definition---Design-Process?node-id=0%3A1)
* [Design Brief](https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit?pli=1#)
* [Research Brief](https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit?pli=1#heading=h.rlwz79ka4luf)
* [Research findings - discovery](https://drive.google.com/drive/folders/1T4UEKutoOOiDNAL8ebQC1msghwLSFFYX)
* [Research deliverables - prototype evaluation](https://drive.google.com/drive/folders/1uslHtXK88tVtcF0zqQ_EHqa-nE7RG5xW)
Design Deliverables
* [Design Brief](https://github.com/camunda/product-design/issues/16) Expected date 30 June | Delivery date 04 July
* [Competitor Analysis](https://github.com/camunda/product-design/issues/57) Expected date 13 July | Delivery date 15 July
* [Conceptual Flows](https://github.com/camunda/product-design/issues/58) Expected date 1 August | Delivery date 8 August
* [Wireframes](https://github.com/camunda/product-design/issues/59) Expected date 8 August | Delivery date 18 August (final iteration for testing)
* [Prototype](https://github.com/camunda/product-design/issues/82) Expected date 13 August | Delivery date 18 August
* [Delete Process Specifications](https://github.com/camunda/product-design/issues/60) Expected date 15 August | Delivery date 25 August
* [Delete Decision Specifications](https://github.com/camunda/product-design/issues/88) Expected date 31 August | Delivery date 31 August (delayed due to unclarity on impact of DRD)
Risk Management
* [Assessment](https://onetrust.camunda.com/itrm): https://onetrust.camunda.com/itrm/assessment/detail/b4c92f8a-6431-4cc1-87c4-f48a48836cac?type=grc
* [Risk Class](https://confluence.camunda.com/display/ISMS/P06+-+Information+Security+and+Enterprise+Risk+Management#P06InformationSecurityandEnterpriseRiskManagement-Riskandopportunityvalue,riskandopportunityclassesandacceptancelevels): medium
#### Implement phase ##
<!-- Example: link to "Implement User Story xyz" -->
Zeebe Epic
* DRI: @remcowesterhoud
* https://github.com/camunda/zeebe/issues/9576
#### Validate phase ##
<!-- Example: link to "Evaluate usage data of last quarter" -->
### Links to additional collateral
<!-- Example: link to relevant support cases -->
- [Kickoff Doc](https://docs.google.com/document/d/1wf5fQ03be6178xk33QobSv_SgChmXfM7IP-OfnN5dzw/edit#heading=h.bt4le1pnsv1)
- [Community Zeebe Issue](https://github.com/camunda/zeebe/issues/2908)
- [OPTUM Support Case](https://jira.camunda.com/browse/SUPPORT-13775)
- [SwissRe](https://jira.camunda.com/browse/SUPPORT-8143)
- [Domo](https://jira.camunda.com/browse/SUPPORT-10144)
- [Undeploy zeebe community ticket](https://github.com/camunda/zeebe/issues/2506)
|
1.0
|
Delete Selected Version of Decision Definition - ### Value Proposition Statement
Delete Decision Definition to free up storage, declutter user interface and prevent errors
### User Problem
- Currently, there is no way to delete decision definition from Zeebe cluster.
- While developing new decisions, developers frequently deploy different variants of definitions. This can bloat their development environment.
- Currently, the only way to deal with this challenge is recreation of cluster.
- I have multiple versions of Decision Definitions that are not being used anymore. I cannot delete them, what can cause accidentally trigger evaluation of out to date decisions.
### User Stories
- As a Developer, I can delete a selected version of deployed decision definition and all evaluated decisions of this version (1 operation) via Operate UI
- As a Developer, while using this feature, I can read basic information, what will be deleted
- As a Developer, I can see the progress of deletion in Operations Panel in Decisions tab
- As a Developer, I can read the documentation, explaining what is going to happen when I delete a version of decision definition
### Implementation Notes
- By deleting a version of a decision definition, I also delete all the evaluated decisions of this version
- Operations panel is available in Decisions tab - the same as in processes tab
This is the first, out of 4 iterations, to implement the whole feature of **Delete Process and Decision Definition**. More details in [Miro](https://miro.com/app/board/uXjVPNlXkFg=/)
**Why this scope?**
- Decision Definition doesn't have dependencies with other objects so it should be the simplest first iteration
- This iteration already brings great value for the customers and solves customers problem
**Iterations:**
1. https://github.com/camunda/product-hub/issues/94
2. https://github.com/camunda/product-hub/issues/615
3. https://github.com/camunda/product-hub/issues/619
4. https://github.com/camunda/product-hub/issues/620
### Validation Criteria
**Metrics**
- Number of deleted decision definitions
**Customer Validation**
- We have validated with at least 2 Enterprise customer that this works as expected (e.g. Domo, SEB,SwissRE)
- Validated with at least 1 community member, that commented on the public issues
### Breakdown
<!-- Please link to sub-issues / -tasks contributing to respective epic phase or phase results where appropriate. -->
#### Discovery phase ##
<!-- Example: "Conduct customer interview with xyz" -->
#### Define phase ##
https://github.com/camunda/product-design/issues/16
#### Additional security testing
* Test for security login and monitoring failure
* Ensure current permission management remains in effect
Design Planning
* Reviewed by design: 04 July 2022, reviewed again in Dec 2022
* Designer assigned: Yes
* Assignee: @gastonpillet01
* [Design process](https://www.figma.com/file/cbmCPVF9OaQFBt8pierKOa/Delete-Definition---Design-Process?node-id=0%3A1)
* [Design Brief](https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit?pli=1#)
* [Research Brief](https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit?pli=1#heading=h.rlwz79ka4luf)
* [Research findings - discovery](https://drive.google.com/drive/folders/1T4UEKutoOOiDNAL8ebQC1msghwLSFFYX)
* [Research deliverables - prototype evaluation](https://drive.google.com/drive/folders/1uslHtXK88tVtcF0zqQ_EHqa-nE7RG5xW)
Design Deliverables
* [Design Brief](https://github.com/camunda/product-design/issues/16) Expected date 30 June | Delivery date 04 July
* [Competitor Analysis](https://github.com/camunda/product-design/issues/57) Expected date 13 July | Delivery date 15 July
* [Conceptual Flows](https://github.com/camunda/product-design/issues/58) Expected date 1 August | Delivery date 8 August
* [Wireframes](https://github.com/camunda/product-design/issues/59) Expected date 8 August | Delivery date 18 August (final iteration for testing)
* [Prototype](https://github.com/camunda/product-design/issues/82) Expected date 13 August | Delivery date 18 August
* [Delete Process Specifications](https://github.com/camunda/product-design/issues/60) Expected date 15 August | Delivery date 25 August
* [Delete Decision Specifications](https://github.com/camunda/product-design/issues/88) Expected date 31 August | Delivery date 31 August (delayed due to unclarity on impact of DRD)
Risk Management
* [Assessment](https://onetrust.camunda.com/itrm): https://onetrust.camunda.com/itrm/assessment/detail/b4c92f8a-6431-4cc1-87c4-f48a48836cac?type=grc
* [Risk Class](https://confluence.camunda.com/display/ISMS/P06+-+Information+Security+and+Enterprise+Risk+Management#P06InformationSecurityandEnterpriseRiskManagement-Riskandopportunityvalue,riskandopportunityclassesandacceptancelevels): medium
#### Implement phase ##
<!-- Example: link to "Implement User Story xyz" -->
Zeebe Epic
* DRI: @remcowesterhoud
* https://github.com/camunda/zeebe/issues/9576
#### Validate phase ##
<!-- Example: link to "Evaluate usage data of last quarter" -->
### Links to additional collateral
<!-- Example: link to relevant support cases -->
- [Kickoff Doc](https://docs.google.com/document/d/1wf5fQ03be6178xk33QobSv_SgChmXfM7IP-OfnN5dzw/edit#heading=h.bt4le1pnsv1)
- [Community Zeebe Issue](https://github.com/camunda/zeebe/issues/2908)
- [OPTUM Support Case](https://jira.camunda.com/browse/SUPPORT-13775)
- [SwissRe](https://jira.camunda.com/browse/SUPPORT-8143)
- [Domo](https://jira.camunda.com/browse/SUPPORT-10144)
- [Undeploy zeebe community ticket](https://github.com/camunda/zeebe/issues/2506)
|
process
|
delete selected version of decision definition value proposition statement delete decision definition to free up storage declutter user interface and prevent errors user problem currently there is no way to delete decision definition from zeebe cluster while developing new decisions developers frequently deploy different variants of definitions this can bloat their development environment currently the only way to deal with this challenge is recreation of cluster i have multiple versions of decision definitions that are not being used anymore i cannot delete them what can cause accidentally trigger evaluation of out to date decisions user stories as a developer i can delete a selected version of deployed decision definition and all evaluated decisions of this version operation via operate ui as a developer while using this feature i can read basic information what will be deleted as a developer i can see the progress of deletion in operations panel in decisions tab as a developer i can read the documentation explaining what is going to happen when i delete a version of decision definition implementation notes by deleting a version of a decision definition i also delete all the evaluated decisions of this version operations panel is available in decisions tab the same as in processes tab this is the first out of iterations to implement the whole feature of delete process and decision definition more details in why this scope decision definition doesn t have dependencies with other objects so it should be the simplest first iteration this iteration already brings great value for the customers and solves customers problem iterations validation criteria metrics number of deleted decision definitions customer validation we have validated with at least enterprise customer that this works as expected e g domo seb swissre validated with at least community member that commented on the public issues breakdown discovery phase define phase additional security testing test for security login and monitoring failure ensure current permission management remains in effect design planning reviewed by design july reviewed again in dec designer assigned yes assignee design deliverables expected date june delivery date july expected date july delivery date july expected date august delivery date august expected date august delivery date august final iteration for testing expected date august delivery date august expected date august delivery date august expected date august delivery date august delayed due to unclarity on impact of drd risk management medium implement phase zeebe epic dri remcowesterhoud validate phase links to additional collateral
| 1
|
8,208
| 2,611,471,093
|
IssuesEvent
|
2015-02-27 05:15:53
|
chrsmith/hedgewars
|
https://api.github.com/repos/chrsmith/hedgewars
|
closed
|
Sometimes can't see ammo menu when not my turn
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1. Run game
2. Try to look at your ammo menu when it's opponent turn
What is the expected output? What do you see instead?
Can't see it when shotgun is used
Please use labels and text to provide additional information.
```
Original issue reported on code.google.com by `unC0Rr` on 15 Apr 2011 at 5:13
|
1.0
|
Sometimes can't see ammo menu when not my turn - ```
What steps will reproduce the problem?
1. Run game
2. Try to look at your ammo menu when it's opponent turn
What is the expected output? What do you see instead?
Can't see it when shotgun is used
Please use labels and text to provide additional information.
```
Original issue reported on code.google.com by `unC0Rr` on 15 Apr 2011 at 5:13
|
non_process
|
sometimes can t see ammo menu when not my turn what steps will reproduce the problem run game try to look at your ammo menu when it s opponent turn what is the expected output what do you see instead can t see it when shotgun is used please use labels and text to provide additional information original issue reported on code google com by on apr at
| 0
|
542,686
| 15,864,908,132
|
IssuesEvent
|
2021-04-08 14:13:22
|
AxonFramework/extension-mongo
|
https://api.github.com/repos/AxonFramework/extension-mongo
|
opened
|
Use Date type for date strings
|
Priority 2: Should Type: Feature
|
### Feature Description
Replace the date String fields for actual Date types.
Important in this process is to ensure the created sorting index (based on the event timestamp and sequence number) keeps a similar performance as it currently has.
Furthermore, we should make sure this adjustment keeps the current ordering based on timestamp and seqNo in working order.
### Current Behaviour
Currently, the Mongo Extension uses Strings to contain date times.
### Wanted Behaviour
The Mongo Extension should use the Date type to refer to date fields.
|
1.0
|
Use Date type for date strings - ### Feature Description
Replace the date String fields for actual Date types.
Important in this process is to ensure the created sorting index (based on the event timestamp and sequence number) keeps a similar performance as it currently has.
Furthermore, we should make sure this adjustment keeps the current ordering based on timestamp and seqNo in working order.
### Current Behaviour
Currently, the Mongo Extension uses Strings to contain date times.
### Wanted Behaviour
The Mongo Extension should use the Date type to refer to date fields.
|
non_process
|
use date type for date strings feature description replace the date string fields for actual date types important in this process is to ensure the created sorting index based on the event timestamp and sequence number keeps a similar performance as it currently has furthermore we should make sure this adjustment keeps the current ordering based on timestamp and seqno in working order current behaviour currently the mongo extension uses strings to contain date times wanted behaviour the mongo extension should use the date type to refer to date fields
| 0
|
10,224
| 13,093,982,652
|
IssuesEvent
|
2020-08-03 11:29:28
|
zotero/zotero
|
https://api.github.com/repos/zotero/zotero
|
opened
|
Ignore "et al" and its localized versions in citation dialog
|
Word Processor Integration
|
"Smith et al" now returns no results for Smith papers because the author name does not match et al.
Request: https://forums.zotero.org/discussion/84448/feature-request-for-word-libreoffice-plugin-ignore-et-al-look-for-doi-during-search
|
1.0
|
Ignore "et al" and its localized versions in citation dialog - "Smith et al" now returns no results for Smith papers because the author name does not match et al.
Request: https://forums.zotero.org/discussion/84448/feature-request-for-word-libreoffice-plugin-ignore-et-al-look-for-doi-during-search
|
process
|
ignore et al and its localized versions in citation dialog smith et al now returns no results for smith papers because the author name does not match et al request
| 1
|
183,940
| 14,263,978,780
|
IssuesEvent
|
2020-11-20 15:08:48
|
yadevee/yals
|
https://api.github.com/repos/yadevee/yals
|
opened
|
Testing: Support for Firefox and Opera browsers
|
tests
|
- [ ] Support Firefox
- [ ] Optional support for Opera
|
1.0
|
Testing: Support for Firefox and Opera browsers - - [ ] Support Firefox
- [ ] Optional support for Opera
|
non_process
|
testing support for firefox and opera browsers support firefox optional support for opera
| 0
|
15,167
| 9,812,241,284
|
IssuesEvent
|
2019-06-13 03:29:18
|
PuzzleServer/mainpuzzleserver
|
https://api.github.com/repos/PuzzleServer/mainpuzzleserver
|
closed
|
[P0] Adjust lockout times for DC puzzles
|
puzzlehunt feedback usability
|
DC puzzles felt like they had overly-restrictive lockout times. We could adjust this with changes to the overall algorithm or we could look at adding a DC property to puzzles and having a DC-specific algorithm. (note: having the DC property for a puzzle would also let us display a reminder that the puzzle has DC on the submissions page and/or the play page)
|
True
|
[P0] Adjust lockout times for DC puzzles - DC puzzles felt like they had overly-restrictive lockout times. We could adjust this with changes to the overall algorithm or we could look at adding a DC property to puzzles and having a DC-specific algorithm. (note: having the DC property for a puzzle would also let us display a reminder that the puzzle has DC on the submissions page and/or the play page)
|
non_process
|
adjust lockout times for dc puzzles dc puzzles felt like they had overly restrictive lockout times we could adjust this with changes to the overall algorithm or we could look at adding a dc property to puzzles and having a dc specific algorithm note having the dc property for a puzzle would also let us display a reminder that the puzzle has dc on the submissions page and or the play page
| 0
|
204,047
| 7,079,876,530
|
IssuesEvent
|
2018-01-10 11:17:41
|
sbwtech/elgg-projects
|
https://api.github.com/repos/sbwtech/elgg-projects
|
closed
|
Possibility to filter tasks
|
low priority
|
For example:
- Status (open, closed, reopened, etc)
- Assignee (own tab for "Assigned to me"?)
|
1.0
|
Possibility to filter tasks - For example:
- Status (open, closed, reopened, etc)
- Assignee (own tab for "Assigned to me"?)
|
non_process
|
possibility to filter tasks for example status open closed reopened etc assignee own tab for assigned to me
| 0
|
117,777
| 15,173,180,068
|
IssuesEvent
|
2021-02-13 12:57:12
|
urbit/landscape
|
https://api.github.com/repos/urbit/landscape
|
opened
|
User names are black on dark background in firefox
|
design
|
User names are black on dark background in Firefox.

**To Reproduce**
Use FireFox
**Desktop (please complete the following information):**
- MacOS
- Firefox 86.0b9
- 6q1bi
|
1.0
|
User names are black on dark background in firefox - User names are black on dark background in Firefox.

**To Reproduce**
Use FireFox
**Desktop (please complete the following information):**
- MacOS
- Firefox 86.0b9
- 6q1bi
|
non_process
|
user names are black on dark background in firefox user names are black on dark background in firefox to reproduce use firefox desktop please complete the following information macos firefox
| 0
|
5,802
| 8,643,540,176
|
IssuesEvent
|
2018-11-25 18:54:58
|
gfrebello/qs-trip-planning-procedure
|
https://api.github.com/repos/gfrebello/qs-trip-planning-procedure
|
closed
|
Implement "flights back" functionality
|
Priority:Very High Process:Implement Requirement
|
Currently, the user can only select flights going from the origin to the destination, but cannot yet choose flights coming back.
|
1.0
|
Implement "flights back" functionality - Currently, the user can only select flights going from the origin to the destination, but cannot yet choose flights coming back.
|
process
|
implement flights back functionality currently the user can only select flights going from the origin to the destination but cannot yet choose flights coming back
| 1
|
15,075
| 18,772,928,381
|
IssuesEvent
|
2021-11-07 06:17:48
|
bridgetownrb/bridgetown
|
https://api.github.com/repos/bridgetownrb/bridgetown
|
closed
|
Upgrade requirements to minimum Ruby 2.7
|
process ruby3
|
I [tweeted this out](https://twitter.com/jaredcwhite/status/1435985254339121165?s=21) a while back and forgot to file an issue so then it slipped my mind!
I desperately want to get on the Ruby 3 train ASAP, but that's not widely supported across deployment infra yet. However, Ruby 2.7 seems to be pretty solid at this point, so we're going to go for that as a minimum for Bridgetown 1.0 and hopefully by 2.0 we can rally around Ruby 3.x.
(Don't forget to remove 2.5.8 & 2.6.6 from the test matrix in our GH action.)
|
1.0
|
Upgrade requirements to minimum Ruby 2.7 - I [tweeted this out](https://twitter.com/jaredcwhite/status/1435985254339121165?s=21) a while back and forgot to file an issue so then it slipped my mind!
I desperately want to get on the Ruby 3 train ASAP, but that's not widely supported across deployment infra yet. However, Ruby 2.7 seems to be pretty solid at this point, so we're going to go for that as a minimum for Bridgetown 1.0 and hopefully by 2.0 we can rally around Ruby 3.x.
(Don't forget to remove 2.5.8 & 2.6.6 from the test matrix in our GH action.)
|
process
|
upgrade requirements to minimum ruby i a while back and forgot to file an issue so then it slipped my mind i desperately want to get on the ruby train asap but that s not widely supported across deployment infra yet however ruby seems to be pretty solid at this point so we re going to go for that as a minimum for bridgetown and hopefully by we can rally around ruby x don t forget to remove from the test matrix in our gh action
| 1
|
8,762
| 11,882,440,890
|
IssuesEvent
|
2020-03-27 14:24:38
|
kubeflow/testing
|
https://api.github.com/repos/kubeflow/testing
|
closed
|
Move prow jobs onto the Kubeflow testing cluster
|
area/engprod kind/process priority/p1
|
This is issue is tracking moving Kubeflow's prow jobs onto its own build clusters.
https://github.com/kubernetes/test-infra/blob/master/prow/scaling.md#separate-build-clusters
This should allow better isolation and debuggability because we will have access to the prow pod logs.
/assign @clarketm
|
1.0
|
Move prow jobs onto the Kubeflow testing cluster - This is issue is tracking moving Kubeflow's prow jobs onto its own build clusters.
https://github.com/kubernetes/test-infra/blob/master/prow/scaling.md#separate-build-clusters
This should allow better isolation and debuggability because we will have access to the prow pod logs.
/assign @clarketm
|
process
|
move prow jobs onto the kubeflow testing cluster this is issue is tracking moving kubeflow s prow jobs onto its own build clusters this should allow better isolation and debuggability because we will have access to the prow pod logs assign clarketm
| 1
|
8,897
| 11,992,359,395
|
IssuesEvent
|
2020-04-08 09:58:34
|
prisma/prisma-client-js
|
https://api.github.com/repos/prisma/prisma-client-js
|
closed
|
Adding orderBy to included field breaks typings
|
bug/2-confirmed kind/bug process/candidate
|
## Bug description
OrderBy in deeper include breaks types.
## How to reproduce
1. Use `https://github.com/prisma/prisma-examples/tree/prisma2/typescript/graphql`
2. Add new queryField
```
const test = queryField('test', {
type: 'Boolean',
resolve: async (self, args, { prisma }) => {
const users = await prisma.user.findMany({
include: {
posts: {
include: {
author: true
},
orderBy: {
title: "asc"
},
},
},
})
// Here author is not available in typescript suggestion
users.map(d => d.posts[0].author)
},
})
```
## Expected behaviour
Author type should be available in the post
## Prisma
Using Preview-24 but also fails on 2.0.0-alpha.972
|
1.0
|
Adding orderBy to included field breaks typings -
## Bug description
OrderBy in deeper include breaks types.
## How to reproduce
1. Use `https://github.com/prisma/prisma-examples/tree/prisma2/typescript/graphql`
2. Add new queryField
```
const test = queryField('test', {
type: 'Boolean',
resolve: async (self, args, { prisma }) => {
const users = await prisma.user.findMany({
include: {
posts: {
include: {
author: true
},
orderBy: {
title: "asc"
},
},
},
})
// Here author is not available in typescript suggestion
users.map(d => d.posts[0].author)
},
})
```
## Expected behaviour
Author type should be available in the post
## Prisma
Using Preview-24 but also fails on 2.0.0-alpha.972
|
process
|
adding orderby to included field breaks typings bug description orderby in deeper include breaks types how to reproduce use add new queryfield const test queryfield test type boolean resolve async self args prisma const users await prisma user findmany include posts include author true orderby title asc here author is not available in typescript suggestion users map d d posts author expected behaviour author type should be available in the post prisma using preview but also fails on alpha
| 1
|
226,242
| 7,511,176,236
|
IssuesEvent
|
2018-04-11 05:10:06
|
CS2103JAN2018-W14-B1/main
|
https://api.github.com/repos/CS2103JAN2018-W14-B1/main
|
closed
|
As a teacher, I want to create a class
|
Priority.high type.story
|
... so that I can group and manage students who are taking the same class
|
1.0
|
As a teacher, I want to create a class - ... so that I can group and manage students who are taking the same class
|
non_process
|
as a teacher i want to create a class so that i can group and manage students who are taking the same class
| 0
|
11,762
| 14,594,060,157
|
IssuesEvent
|
2020-12-20 03:01:35
|
googleapis/java-workflows
|
https://api.github.com/repos/googleapis/java-workflows
|
closed
|
Warning: a recent release failed
|
type: process type: process
|
The release PR #25 is still in a pending state after several hours
|
2.0
|
Warning: a recent release failed - The release PR #25 is still in a pending state after several hours
|
process
|
warning a recent release failed the release pr is still in a pending state after several hours
| 1
|
17,831
| 5,520,874,046
|
IssuesEvent
|
2017-03-19 10:28:55
|
RSS-Bridge/rss-bridge
|
https://api.github.com/repos/RSS-Bridge/rss-bridge
|
closed
|
[request-pull] add HTTPS status to bridges and visual notification in the bridge list
|
code inclusion request improvement / refactoring question
|
User should be aware of the HTTPS status of the connection they will be using to get their bridge data.
Sometimes, the URI used to collect data supports HTTPS, but only to redirect to HTTP version of the URI.
Sometimes, the downloaded content need to be corrected to replace the links by their HTTPS equivalent.
Sometimes, the URI can be accessed with HTTPS, but it uses an invalid certificate: in this case, an option offers to ignore this invalid certificate and force the HTTPS connection.
All this information is now stored in the bridge in order to be useful for unknown future development. For now, it just notifies the RSS-Bridge users about the HTTPS status of the bridges in the bridges list.
Reviews and comments before merging are most welcome !
---
The following changes since commit ec3824e2841b8549b5b171adbdc80539c829c384:
[bridges] Remove compatible WordPress bridges (2016-09-17 20:57:33 +0200)
are available in the git repository at:
https://framagit.org/peetah/rss-bridge httpsStatus
for you to fetch changes up to 167f6445087ee888d3932bd398dfd7ee38421fd6:
[bridges] add HTTPS status (2016-09-20 22:45:30 +0200)
---
Pierre Mazière (5):
Revert "[bridges] Remove compatible WordPress bridges"
[HTMLUtils] remove useless variable $bridgeClass
[BridgeAbstract] fix context settings
[core] add explicit HTTPS status
[bridges] add HTTPS status
bridges/ABCTabsBridge.php | 1 +
bridges/AcrimedBridge.php | 1 +
bridges/AllocineFRBridge.php | 1 +
bridges/AnimeUltimeBridge.php | 2 ++
bridges/ArstechnicaBridge.php | 11 ++++++++++
bridges/Arte7Bridge.php | 172 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++---------------------------------------------------------------------
bridges/AskfmBridge.php | 3 ++-
bridges/BandcampBridge.php | 4 +++-
bridges/BastaBridge.php | 19 +++++++++-------
bridges/BlaguesDeMerdeBridge.php | 1 +
bridges/BooruprojectBridge.php | 1 +
bridges/CADBridge.php | 6 +++--
bridges/CNETBridge.php | 3 ++-
bridges/CastorusBridge.php | 3 ++-
bridges/CollegeDeFranceBridge.php | 3 ++-
bridges/CommonDreamsBridge.php | 9 ++++++--
bridges/CopieDoubleBridge.php | 1 +
bridges/CourrierInternationalBridge.php | 1 +
bridges/CpasbienBridge.php | 3 ++-
bridges/CryptomeBridge.php | 1 +
bridges/DailymotionBridge.php | 1 +
bridges/DanbooruBridge.php | 3 ++-
bridges/DansTonChatBridge.php | 1 +
bridges/DauphineLibereBridge.php | 3 ++-
bridges/DemoBridge.php | 3 ++-
bridges/DeveloppezDotComBridge.php | 1 +
bridges/DilbertBridge.php | 1 +
bridges/DollbooruBridge.php | 3 ++-
bridges/DuckDuckGoBridge.php | 1 +
bridges/EZTVBridge.php | 1 +
bridges/EliteDangerousGalnetBridge.php | 1 +
bridges/ElsevierBridge.php | 1 +
bridges/EstCeQuonMetEnProdBridge.php | 1 +
bridges/FacebookBridge.php | 1 +
bridges/FeedExpanderExampleBridge.php | 1 +
bridges/FierPandaBridge.php | 3 ++-
bridges/FlickrExploreBridge.php | 2 ++
bridges/FlickrTagBridge.php | 3 ++-
bridges/FootitoBridge.php | 1 +
bridges/FourchanBridge.php | 8 +++++--
bridges/FreenewsBridge.php | 26 ++++++++++++++++++++++
bridges/FuturaSciencesBridge.php | 1 +
bridges/GBAtempBridge.php | 3 ++-
bridges/GelbooruBridge.php | 1 +
bridges/GiphyBridge.php | 106 ++++++++++++++++++++++++++++++++++++++++++++-------------------------------------------
bridges/GithubIssueBridge.php | 1 +
bridges/GizmodoBridge.php | 1 +
bridges/GooglePlusPostBridge.php | 1 +
bridges/GoogleSearchBridge.php | 1 +
bridges/HDWallpapersBridge.php | 1 +
bridges/HentaiHavenBridge.php | 5 ++++-
bridges/IdenticaBridge.php | 1 +
bridges/InstagramBridge.php | 3 ++-
bridges/IsoHuntBridge.php | 1 +
bridges/JapanExpoBridge.php | 1 +
bridges/KonachanBridge.php | 10 ++++++++-
bridges/KoreusBridge.php | 1 +
bridges/KununuBridge.php | 1 +
bridges/LWNprevBridge.php | 1 +
bridges/LeBonCoinBridge.php | 4 +++-
bridges/LeJournalDuGeekBridge.php | 16 ++++++++++++++
bridges/LeMondeInformatiqueBridge.php | 1 +
bridges/LesJoiesDuCodeBridge.php | 1 +
bridges/LichessBridge.php | 3 ++-
bridges/LinkedInCompanyBridge.php | 1 +
bridges/LolibooruBridge.php | 1 +
bridges/MangareaderBridge.php | 13 ++++++-----
bridges/MilbooruBridge.php | 1 +
bridges/MoebooruBridge.php | 1 +
bridges/MondeDiploBridge.php | 3 ++-
bridges/MsnMondeBridge.php | 3 ++-
bridges/MspabooruBridge.php | 1 +
bridges/NakedSecurityBridge.php | 12 ++++++++++
bridges/NasaApodBridge.php | 1 +
bridges/NeuviemeArtBridge.php | 1 +
bridges/NextInpactBridge.php | 8 ++++++-
bridges/NextgovBridge.php | 1 +
bridges/NiceMatinBridge.php | 1 +
bridges/NovelUpdatesBridge.php | 5 ++++-
bridges/NumeramaBridge.php | 21 ++++++++++++++++++
bridges/OpenClassroomsBridge.php | 1 +
bridges/ParuVenduImmoBridge.php | 1 +
bridges/PickyWallpapersBridge.php | 1 +
bridges/PinterestBridge.php | 3 ++-
bridges/PlanetLibreBridge.php | 1 +
bridges/RTBFBridge.php | 3 ++-
bridges/Releases3DSBridge.php | 1 +
bridges/ReporterreBridge.php | 3 ++-
bridges/Rue89Bridge.php | 1 +
bridges/Rule34Bridge.php | 3 ++-
bridges/Rule34pahealBridge.php | 3 +++
bridges/SafebooruBridge.php | 1 +
bridges/SakugabooruBridge.php | 3 ++-
bridges/ScmbBridge.php | 1 +
bridges/ScoopItBridge.php | 1 +
bridges/SensCritiqueBridge.php | 1 +
bridges/ShanaprojectBridge.php | 3 ++-
bridges/Shimmie2Bridge.php | 3 ++-
bridges/SiliconBridge.php | 16 ++++++++++++++
bridges/SoundcloudBridge.php | 1 +
bridges/StripeAPIChangeLogBridge.php | 1 +
bridges/SuperbWallpapersBridge.php | 3 ++-
bridges/T411Bridge.php | 1 +
bridges/TagBoardBridge.php | 3 ++-
bridges/TbibBridge.php | 3 ++-
bridges/TheCodingLoveBridge.php | 1 +
bridges/TheHackerNewsBridge.php | 2 ++
bridges/TheOatMealBridge.php | 5 +++--
bridges/ThePirateBayBridge.php | 1 +
bridges/TwitchApiBridge.php | 3 ++-
bridges/TwitterBridge.php | 1 +
bridges/UnsplashBridge.php | 3 ++-
bridges/ViadeoCompanyBridge.php | 1 +
bridges/VineBridge.php | 5 +++--
bridges/VkBridge.php | 3 ++-
bridges/WallpaperStopBridge.php | 1 +
bridges/WeLiveSecurityBridge.php | 3 ++-
bridges/WhydBridge.php | 3 ++-
bridges/WikipediaBridge.php | 1 +
bridges/WorldOfTanksBridge.php | 1 +
bridges/XbooruBridge.php | 1 +
bridges/YandereBridge.php | 1 +
bridges/YoutubeBridge.php | 1 +
bridges/ZDNetBridge.php | 1 +
bridges/ZatazBridge.php | 17 ++++++++++++++
css/style.css | 27 +++++++++++++++++++++++
index.php | 4 ++++
lib/BridgeAbstract.php | 35 ++++++++++++++++++++++-------
lib/FeedExpander.php | 24 +++++++++++++++++++-
lib/HTMLUtils.php | 54 +++++++++++++++++++++++++++++++++++++--------
130 files changed, 597 insertions(+), 215 deletions(-)
create mode 100644 bridges/ArstechnicaBridge.php
create mode 100644 bridges/FreenewsBridge.php
create mode 100644 bridges/LeJournalDuGeekBridge.php
create mode 100644 bridges/NakedSecurityBridge.php
create mode 100644 bridges/NumeramaBridge.php
create mode 100644 bridges/SiliconBridge.php
create mode 100644 bridges/ZatazBridge.php
|
1.0
|
[request-pull] add HTTPS status to bridges and visual notification in the bridge list - User should be aware of the HTTPS status of the connection they will be using to get their bridge data.
Sometimes, the URI used to collect data supports HTTPS, but only to redirect to HTTP version of the URI.
Sometimes, the downloaded content need to be corrected to replace the links by their HTTPS equivalent.
Sometimes, the URI can be accessed with HTTPS, but it uses an invalid certificate: in this case, an option offers to ignore this invalid certificate and force the HTTPS connection.
All this information is now stored in the bridge in order to be useful for unknown future development. For now, it just notifies the RSS-Bridge users about the HTTPS status of the bridges in the bridges list.
Reviews and comments before merging are most welcome !
---
The following changes since commit ec3824e2841b8549b5b171adbdc80539c829c384:
[bridges] Remove compatible WordPress bridges (2016-09-17 20:57:33 +0200)
are available in the git repository at:
https://framagit.org/peetah/rss-bridge httpsStatus
for you to fetch changes up to 167f6445087ee888d3932bd398dfd7ee38421fd6:
[bridges] add HTTPS status (2016-09-20 22:45:30 +0200)
---
Pierre Mazière (5):
Revert "[bridges] Remove compatible WordPress bridges"
[HTMLUtils] remove useless variable $bridgeClass
[BridgeAbstract] fix context settings
[core] add explicit HTTPS status
[bridges] add HTTPS status
bridges/ABCTabsBridge.php | 1 +
bridges/AcrimedBridge.php | 1 +
bridges/AllocineFRBridge.php | 1 +
bridges/AnimeUltimeBridge.php | 2 ++
bridges/ArstechnicaBridge.php | 11 ++++++++++
bridges/Arte7Bridge.php | 172 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++---------------------------------------------------------------------
bridges/AskfmBridge.php | 3 ++-
bridges/BandcampBridge.php | 4 +++-
bridges/BastaBridge.php | 19 +++++++++-------
bridges/BlaguesDeMerdeBridge.php | 1 +
bridges/BooruprojectBridge.php | 1 +
bridges/CADBridge.php | 6 +++--
bridges/CNETBridge.php | 3 ++-
bridges/CastorusBridge.php | 3 ++-
bridges/CollegeDeFranceBridge.php | 3 ++-
bridges/CommonDreamsBridge.php | 9 ++++++--
bridges/CopieDoubleBridge.php | 1 +
bridges/CourrierInternationalBridge.php | 1 +
bridges/CpasbienBridge.php | 3 ++-
bridges/CryptomeBridge.php | 1 +
bridges/DailymotionBridge.php | 1 +
bridges/DanbooruBridge.php | 3 ++-
bridges/DansTonChatBridge.php | 1 +
bridges/DauphineLibereBridge.php | 3 ++-
bridges/DemoBridge.php | 3 ++-
bridges/DeveloppezDotComBridge.php | 1 +
bridges/DilbertBridge.php | 1 +
bridges/DollbooruBridge.php | 3 ++-
bridges/DuckDuckGoBridge.php | 1 +
bridges/EZTVBridge.php | 1 +
bridges/EliteDangerousGalnetBridge.php | 1 +
bridges/ElsevierBridge.php | 1 +
bridges/EstCeQuonMetEnProdBridge.php | 1 +
bridges/FacebookBridge.php | 1 +
bridges/FeedExpanderExampleBridge.php | 1 +
bridges/FierPandaBridge.php | 3 ++-
bridges/FlickrExploreBridge.php | 2 ++
bridges/FlickrTagBridge.php | 3 ++-
bridges/FootitoBridge.php | 1 +
bridges/FourchanBridge.php | 8 +++++--
bridges/FreenewsBridge.php | 26 ++++++++++++++++++++++
bridges/FuturaSciencesBridge.php | 1 +
bridges/GBAtempBridge.php | 3 ++-
bridges/GelbooruBridge.php | 1 +
bridges/GiphyBridge.php | 106 ++++++++++++++++++++++++++++++++++++++++++++-------------------------------------------
bridges/GithubIssueBridge.php | 1 +
bridges/GizmodoBridge.php | 1 +
bridges/GooglePlusPostBridge.php | 1 +
bridges/GoogleSearchBridge.php | 1 +
bridges/HDWallpapersBridge.php | 1 +
bridges/HentaiHavenBridge.php | 5 ++++-
bridges/IdenticaBridge.php | 1 +
bridges/InstagramBridge.php | 3 ++-
bridges/IsoHuntBridge.php | 1 +
bridges/JapanExpoBridge.php | 1 +
bridges/KonachanBridge.php | 10 ++++++++-
bridges/KoreusBridge.php | 1 +
bridges/KununuBridge.php | 1 +
bridges/LWNprevBridge.php | 1 +
bridges/LeBonCoinBridge.php | 4 +++-
bridges/LeJournalDuGeekBridge.php | 16 ++++++++++++++
bridges/LeMondeInformatiqueBridge.php | 1 +
bridges/LesJoiesDuCodeBridge.php | 1 +
bridges/LichessBridge.php | 3 ++-
bridges/LinkedInCompanyBridge.php | 1 +
bridges/LolibooruBridge.php | 1 +
bridges/MangareaderBridge.php | 13 ++++++-----
bridges/MilbooruBridge.php | 1 +
bridges/MoebooruBridge.php | 1 +
bridges/MondeDiploBridge.php | 3 ++-
bridges/MsnMondeBridge.php | 3 ++-
bridges/MspabooruBridge.php | 1 +
bridges/NakedSecurityBridge.php | 12 ++++++++++
bridges/NasaApodBridge.php | 1 +
bridges/NeuviemeArtBridge.php | 1 +
bridges/NextInpactBridge.php | 8 ++++++-
bridges/NextgovBridge.php | 1 +
bridges/NiceMatinBridge.php | 1 +
bridges/NovelUpdatesBridge.php | 5 ++++-
bridges/NumeramaBridge.php | 21 ++++++++++++++++++
bridges/OpenClassroomsBridge.php | 1 +
bridges/ParuVenduImmoBridge.php | 1 +
bridges/PickyWallpapersBridge.php | 1 +
bridges/PinterestBridge.php | 3 ++-
bridges/PlanetLibreBridge.php | 1 +
bridges/RTBFBridge.php | 3 ++-
bridges/Releases3DSBridge.php | 1 +
bridges/ReporterreBridge.php | 3 ++-
bridges/Rue89Bridge.php | 1 +
bridges/Rule34Bridge.php | 3 ++-
bridges/Rule34pahealBridge.php | 3 +++
bridges/SafebooruBridge.php | 1 +
bridges/SakugabooruBridge.php | 3 ++-
bridges/ScmbBridge.php | 1 +
bridges/ScoopItBridge.php | 1 +
bridges/SensCritiqueBridge.php | 1 +
bridges/ShanaprojectBridge.php | 3 ++-
bridges/Shimmie2Bridge.php | 3 ++-
bridges/SiliconBridge.php | 16 ++++++++++++++
bridges/SoundcloudBridge.php | 1 +
bridges/StripeAPIChangeLogBridge.php | 1 +
bridges/SuperbWallpapersBridge.php | 3 ++-
bridges/T411Bridge.php | 1 +
bridges/TagBoardBridge.php | 3 ++-
bridges/TbibBridge.php | 3 ++-
bridges/TheCodingLoveBridge.php | 1 +
bridges/TheHackerNewsBridge.php | 2 ++
bridges/TheOatMealBridge.php | 5 +++--
bridges/ThePirateBayBridge.php | 1 +
bridges/TwitchApiBridge.php | 3 ++-
bridges/TwitterBridge.php | 1 +
bridges/UnsplashBridge.php | 3 ++-
bridges/ViadeoCompanyBridge.php | 1 +
bridges/VineBridge.php | 5 +++--
bridges/VkBridge.php | 3 ++-
bridges/WallpaperStopBridge.php | 1 +
bridges/WeLiveSecurityBridge.php | 3 ++-
bridges/WhydBridge.php | 3 ++-
bridges/WikipediaBridge.php | 1 +
bridges/WorldOfTanksBridge.php | 1 +
bridges/XbooruBridge.php | 1 +
bridges/YandereBridge.php | 1 +
bridges/YoutubeBridge.php | 1 +
bridges/ZDNetBridge.php | 1 +
bridges/ZatazBridge.php | 17 ++++++++++++++
css/style.css | 27 +++++++++++++++++++++++
index.php | 4 ++++
lib/BridgeAbstract.php | 35 ++++++++++++++++++++++-------
lib/FeedExpander.php | 24 +++++++++++++++++++-
lib/HTMLUtils.php | 54 +++++++++++++++++++++++++++++++++++++--------
130 files changed, 597 insertions(+), 215 deletions(-)
create mode 100644 bridges/ArstechnicaBridge.php
create mode 100644 bridges/FreenewsBridge.php
create mode 100644 bridges/LeJournalDuGeekBridge.php
create mode 100644 bridges/NakedSecurityBridge.php
create mode 100644 bridges/NumeramaBridge.php
create mode 100644 bridges/SiliconBridge.php
create mode 100644 bridges/ZatazBridge.php
|
non_process
|
add https status to bridges and visual notification in the bridge list user should be aware of the https status of the connection they will be using to get their bridge data sometimes the uri used to collect data supports https but only to redirect to http version of the uri sometimes the downloaded content need to be corrected to replace the links by their https equivalent sometimes the uri can be accessed with https but it uses an invalid certificate in this case an option offers to ignore this invalid certificate and force the https connection all this information is now stored in the bridge in order to be useful for unknown future development for now it just notifies the rss bridge users about the https status of the bridges in the bridges list reviews and comments before merging are most welcome the following changes since commit remove compatible wordpress bridges are available in the git repository at httpsstatus for you to fetch changes up to add https status pierre mazière revert remove compatible wordpress bridges remove useless variable bridgeclass fix context settings add explicit https status add https status bridges abctabsbridge php bridges acrimedbridge php bridges allocinefrbridge php bridges animeultimebridge php bridges arstechnicabridge php bridges php bridges askfmbridge php bridges bandcampbridge php bridges bastabridge php bridges blaguesdemerdebridge php bridges booruprojectbridge php bridges cadbridge php bridges cnetbridge php bridges castorusbridge php bridges collegedefrancebridge php bridges commondreamsbridge php bridges copiedoublebridge php bridges courrierinternationalbridge php bridges cpasbienbridge php bridges cryptomebridge php bridges dailymotionbridge php bridges danboorubridge php bridges danstonchatbridge php bridges dauphineliberebridge php bridges demobridge php bridges developpezdotcombridge php bridges dilbertbridge php bridges dollboorubridge php bridges duckduckgobridge php bridges eztvbridge php bridges elitedangerousgalnetbridge php bridges elsevierbridge php bridges estcequonmetenprodbridge php bridges facebookbridge php bridges feedexpanderexamplebridge php bridges fierpandabridge php bridges flickrexplorebridge php bridges flickrtagbridge php bridges footitobridge php bridges fourchanbridge php bridges freenewsbridge php bridges futurasciencesbridge php bridges gbatempbridge php bridges gelboorubridge php bridges giphybridge php bridges githubissuebridge php bridges gizmodobridge php bridges googlepluspostbridge php bridges googlesearchbridge php bridges hdwallpapersbridge php bridges hentaihavenbridge php bridges identicabridge php bridges instagrambridge php bridges isohuntbridge php bridges japanexpobridge php bridges konachanbridge php bridges koreusbridge php bridges kununubridge php bridges lwnprevbridge php bridges leboncoinbridge php bridges lejournaldugeekbridge php bridges lemondeinformatiquebridge php bridges lesjoiesducodebridge php bridges lichessbridge php bridges linkedincompanybridge php bridges loliboorubridge php bridges mangareaderbridge php bridges milboorubridge php bridges moeboorubridge php bridges mondediplobridge php bridges msnmondebridge php bridges mspaboorubridge php bridges nakedsecuritybridge php bridges nasaapodbridge php bridges neuviemeartbridge php bridges nextinpactbridge php bridges nextgovbridge php bridges nicematinbridge php bridges novelupdatesbridge php bridges numeramabridge php bridges openclassroomsbridge php bridges paruvenduimmobridge php bridges pickywallpapersbridge php bridges pinterestbridge php bridges planetlibrebridge php bridges rtbfbridge php bridges php bridges reporterrebridge php bridges php bridges php bridges php bridges safeboorubridge php bridges sakugaboorubridge php bridges scmbbridge php bridges scoopitbridge php bridges senscritiquebridge php bridges shanaprojectbridge php bridges php bridges siliconbridge php bridges soundcloudbridge php bridges stripeapichangelogbridge php bridges superbwallpapersbridge php bridges php bridges tagboardbridge php bridges tbibbridge php bridges thecodinglovebridge php bridges thehackernewsbridge php bridges theoatmealbridge php bridges thepiratebaybridge php bridges twitchapibridge php bridges twitterbridge php bridges unsplashbridge php bridges viadeocompanybridge php bridges vinebridge php bridges vkbridge php bridges wallpaperstopbridge php bridges welivesecuritybridge php bridges whydbridge php bridges wikipediabridge php bridges worldoftanksbridge php bridges xboorubridge php bridges yanderebridge php bridges youtubebridge php bridges zdnetbridge php bridges zatazbridge php css style css index php lib bridgeabstract php lib feedexpander php lib htmlutils php files changed insertions deletions create mode bridges arstechnicabridge php create mode bridges freenewsbridge php create mode bridges lejournaldugeekbridge php create mode bridges nakedsecuritybridge php create mode bridges numeramabridge php create mode bridges siliconbridge php create mode bridges zatazbridge php
| 0
|
19,425
| 25,579,220,507
|
IssuesEvent
|
2022-12-01 02:00:06
|
lizhihao6/get-daily-arxiv-noti
|
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
|
opened
|
New submissions for Thu, 1 Dec 22
|
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
|
## Keyword: events
There is no result
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
### From Actions to Events: A Transfer Learning Approach Using Improved Deep Belief Networks
- **Authors:** Mateus Roder, Jurandy Almeida, Gustavo H. de Rosa, Leandro A. Passos, André L. D. Rossi, João P. Papa
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2211.17045
- **Pdf link:** https://arxiv.org/pdf/2211.17045
- **Abstract**
In the last decade, exponential data growth supplied machine learning-based algorithms' capacity and enabled their usage in daily-life activities. Additionally, such an improvement is partially explained due to the advent of deep learning techniques, i.e., stacks of simple architectures that end up in more complex models. Although both factors produce outstanding results, they also pose drawbacks regarding the learning process as training complex models over large datasets are expensive and time-consuming. Such a problem is even more evident when dealing with video analysis. Some works have considered transfer learning or domain adaptation, i.e., approaches that map the knowledge from one domain to another, to ease the training burden, yet most of them operate over individual or small blocks of frames. This paper proposes a novel approach to map the knowledge from action recognition to event recognition using an energy-based model, denoted as Spectral Deep Belief Network. Such a model can process all frames simultaneously, carrying spatial and temporal information through the learning process. The experimental results conducted over two public video dataset, the HMDB-51 and the UCF-101, depict the effectiveness of the proposed model and its reduced computational burden when compared to traditional energy-based models, such as Restricted Boltzmann Machines and Deep Belief Networks.
## Keyword: ISP
### DINER: Depth-aware Image-based NEural Radiance fields
- **Authors:** Malte Prinzler, Otmar Hilliges, Justus Thies
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2211.16630
- **Pdf link:** https://arxiv.org/pdf/2211.16630
- **Abstract**
We present Depth-aware Image-based NEural Radiance fields (DINER). Given a sparse set of RGB input views, we predict depth and feature maps to guide the reconstruction of a volumetric scene representation that allows us to render 3D objects under novel views. Specifically, we propose novel techniques to incorporate depth information into feature fusion and efficient scene sampling. In comparison to the previous state of the art, DINER achieves higher synthesis quality and can process input views with greater disparity. This allows us to capture scenes more completely without changing capturing hardware requirements and ultimately enables larger viewpoint changes during novel view synthesis. We evaluate our method by synthesizing novel views, both for human heads and for general objects, and observe significantly improved qualitative results and increased perceptual metrics compared to the previous state of the art. The code will be made publicly available for research purposes.
### Rethinking Disparity: A Depth Range Free Multi-View Stereo Based on Disparity
- **Authors:** Qingsong Yan, Qiang Wang, Kaiyong Zhao, Bo Li, Xiaowen Chu, Fei Deng
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2211.16905
- **Pdf link:** https://arxiv.org/pdf/2211.16905
- **Abstract**
Existing learning-based multi-view stereo (MVS) methods rely on the depth range to build the 3D cost volume and may fail when the range is too large or unreliable. To address this problem, we propose a disparity-based MVS method based on the epipolar disparity flow (E-flow), called DispMVS, which infers the depth information from the pixel movement between two views. The core of DispMVS is to construct a 2D cost volume on the image plane along the epipolar line between each pair (between the reference image and several source images) for pixel matching and fuse uncountable depths triangulated from each pair by multi-view geometry to ensure multi-view consistency. To be robust, DispMVS starts from a randomly initialized depth map and iteratively refines the depth map with the help of the coarse-to-fine strategy. Experiments on DTUMVS and Tanks\&Temple datasets show that DispMVS is not sensitive to the depth range and achieves state-of-the-art results with lower GPU memory.
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### ObjCAViT: Improving Monocular Depth Estimation Using Natural Language Models And Image-Object Cross-Attention
- **Authors:** Dylan Auty, Krystian Mikolajczyk
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2211.17232
- **Pdf link:** https://arxiv.org/pdf/2211.17232
- **Abstract**
While monocular depth estimation (MDE) is an important problem in computer vision, it is difficult due to the ambiguity that results from the compression of a 3D scene into only 2 dimensions. It is common practice in the field to treat it as simple image-to-image translation, without consideration for the semantics of the scene and the objects within it. In contrast, humans and animals have been shown to use higher-level information to solve MDE: prior knowledge of the nature of the objects in the scene, their positions and likely configurations relative to one another, and their apparent sizes have all been shown to help resolve this ambiguity. In this paper, we present a novel method to enhance MDE performance by encouraging use of known-useful information about the semantics of objects and inter-object relationships within a scene. Our novel ObjCAViT module sources world-knowledge from language models and learns inter-object relationships in the context of the MDE problem using transformer attention, incorporating apparent size information. Our method produces highly accurate depth maps, and we obtain competitive results on the NYUv2 and KITTI datasets. Our ablation experiments show that the use of language and cross-attention within the ObjCAViT module increases performance. Code is released at https://github.com/DylanAuty/ObjCAViT.
## Keyword: RAW
### SGDraw: Scene Graph Drawing Interface Using Object-Oriented Representation
- **Authors:** Tianyu Zhang, Xusheng Du, Chia-Ming Chang, Xi Yang, Haoran Xie
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Human-Computer Interaction (cs.HC)
- **Arxiv link:** https://arxiv.org/abs/2211.16697
- **Pdf link:** https://arxiv.org/pdf/2211.16697
- **Abstract**
Scene understanding is an essential and challenging task in computer vision. To provide the visually fundamental graphical structure of an image, the scene graph has received increased attention due to its powerful semantic representation. However, it is difficult to draw a proper scene graph for image retrieval, image generation, and multi-modal applications. The conventional scene graph annotation interface is not easy to use in image annotations, and the automatic scene graph generation approaches using deep neural networks are prone to generate redundant content while disregarding details. In this work, we propose SGDraw, a scene graph drawing interface using object-oriented scene graph representation to help users draw and edit scene graphs interactively. For the proposed object-oriented representation, we consider the objects, attributes, and relationships of objects as a structural unit. SGDraw provides a web-based scene graph annotation and generation tool for scene understanding applications. To verify the effectiveness of the proposed interface, we conducted a comparison study with the conventional tool and the user experience study. The results show that SGDraw can help generate scene graphs with richer details and describe the images more accurately than traditional bounding box annotations. We believe the proposed SGDraw can be useful in various vision tasks, such as image retrieval and generation.
### Extracting Semantic Knowledge from GANs with Unsupervised Learning
- **Authors:** Jianjin Xu, Zhaoxiang Zhang, Xiaolin Hu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2211.16710
- **Pdf link:** https://arxiv.org/pdf/2211.16710
- **Abstract**
Recently, unsupervised learning has made impressive progress on various tasks. Despite the dominance of discriminative models, increasing attention is drawn to representations learned by generative models and in particular, Generative Adversarial Networks (GANs). Previous works on the interpretation of GANs reveal that GANs encode semantics in feature maps in a linearly separable form. In this work, we further find that GAN's features can be well clustered with the linear separability assumption. We propose a novel clustering algorithm, named KLiSH, which leverages the linear separability to cluster GAN's features. KLiSH succeeds in extracting fine-grained semantics of GANs trained on datasets of various objects, e.g., car, portrait, animals, and so on. With KLiSH, we can sample images from GANs along with their segmentation masks and synthesize paired image-segmentation datasets. Using the synthesized datasets, we enable two downstream applications. First, we train semantic segmentation networks on these datasets and test them on real images, realizing unsupervised semantic segmentation. Second, we train image-to-image translation networks on the synthesized datasets, enabling semantic-conditional image synthesis without human annotations.
### Dr.3D: Adapting 3D GANs to Artistic Drawings
- **Authors:** Wonjoon Jin, Nuri Ryu, Geonung Kim, Seung-Hwan Baek, Sunghyun Cho
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2211.16798
- **Pdf link:** https://arxiv.org/pdf/2211.16798
- **Abstract**
While 3D GANs have recently demonstrated the high-quality synthesis of multi-view consistent images and 3D shapes, they are mainly restricted to photo-realistic human portraits. This paper aims to extend 3D GANs to a different, but meaningful visual form: artistic portrait drawings. However, extending existing 3D GANs to drawings is challenging due to the inevitable geometric ambiguity present in drawings. To tackle this, we present Dr.3D, a novel adaptation approach that adapts an existing 3D GAN to artistic drawings. Dr.3D is equipped with three novel components to handle the geometric ambiguity: a deformation-aware 3D synthesis network, an alternating adaptation of pose estimation and image synthesis, and geometric priors. Experiments show that our approach can successfully adapt 3D GANs to drawings and enable multi-view consistent semantic editing of drawings.
### Linking Sketch Patches by Learning Synonymous Proximity for Graphic Sketch Representation
- **Authors:** Sicong Zang, Shikui Tu, Lei Xu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2211.16841
- **Pdf link:** https://arxiv.org/pdf/2211.16841
- **Abstract**
Graphic sketch representations are effective for representing sketches. Existing methods take the patches cropped from sketches as the graph nodes, and construct the edges based on sketch's drawing order or Euclidean distances on the canvas. However, the drawing order of a sketch may not be unique, while the patches from semantically related parts of a sketch may be far away from each other on the canvas. In this paper, we propose an order-invariant, semantics-aware method for graphic sketch representations. The cropped sketch patches are linked according to their global semantics or local geometric shapes, namely the synonymous proximity, by computing the cosine similarity between the captured patch embeddings. Such constructed edges are learnable to adapt to the variation of sketch drawings, which enable the message passing among synonymous patches. Aggregating the messages from synonymous patches by graph convolutional networks plays a role of denoising, which is beneficial to produce robust patch embeddings and accurate sketch representations. Furthermore, we enforce a clustering constraint over the embeddings jointly with the network learning. The synonymous patches are self-organized as compact clusters, and their embeddings are guided to move towards their assigned cluster centroids. It raises the accuracy of the computed synonymous proximity. Experimental results show that our method significantly improves the performance on both controllable sketch synthesis and sketch healing.
### A Geometric Model for Polarization Imaging on Projective Cameras
- **Authors:** Mara Pistellato, Filippo Bergamasco
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2211.16986
- **Pdf link:** https://arxiv.org/pdf/2211.16986
- **Abstract**
The vast majority of Shape-from-Polarization (SfP) methods work under the oversimplified assumption of using orthographic cameras. Indeed, it is still not well understood how to project the Stokes vectors when the incoming rays are not orthogonal to the image plane. We try to answer this question presenting a geometric model describing how a general projective camera captures the light polarization state. Based on the optical properties of a tilted polarizer, our model is implemented as a pre-processing operation acting on raw images, followed by a per-pixel rotation of the reconstructed normal field. In this way, all the existing SfP methods assuming orthographic cameras can behave like they were designed for projective ones. Moreover, our model is consistent with state-of-the-art forward and inverse renderers (like Mitsuba3 and ART), intrinsically enforces physical constraints among the captured channels, and handles demosaicing of DoFP sensors. Experiments on existing and new datasets demonstrate the accuracy of the model when applied to commercially available polarimetric cameras.
### From Actions to Events: A Transfer Learning Approach Using Improved Deep Belief Networks
- **Authors:** Mateus Roder, Jurandy Almeida, Gustavo H. de Rosa, Leandro A. Passos, André L. D. Rossi, João P. Papa
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2211.17045
- **Pdf link:** https://arxiv.org/pdf/2211.17045
- **Abstract**
In the last decade, exponential data growth supplied machine learning-based algorithms' capacity and enabled their usage in daily-life activities. Additionally, such an improvement is partially explained due to the advent of deep learning techniques, i.e., stacks of simple architectures that end up in more complex models. Although both factors produce outstanding results, they also pose drawbacks regarding the learning process as training complex models over large datasets are expensive and time-consuming. Such a problem is even more evident when dealing with video analysis. Some works have considered transfer learning or domain adaptation, i.e., approaches that map the knowledge from one domain to another, to ease the training burden, yet most of them operate over individual or small blocks of frames. This paper proposes a novel approach to map the knowledge from action recognition to event recognition using an energy-based model, denoted as Spectral Deep Belief Network. Such a model can process all frames simultaneously, carrying spatial and temporal information through the learning process. The experimental results conducted over two public video dataset, the HMDB-51 and the UCF-101, depict the effectiveness of the proposed model and its reduced computational burden when compared to traditional energy-based models, such as Restricted Boltzmann Machines and Deep Belief Networks.
## Keyword: raw image
### A Geometric Model for Polarization Imaging on Projective Cameras
- **Authors:** Mara Pistellato, Filippo Bergamasco
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2211.16986
- **Pdf link:** https://arxiv.org/pdf/2211.16986
- **Abstract**
The vast majority of Shape-from-Polarization (SfP) methods work under the oversimplified assumption of using orthographic cameras. Indeed, it is still not well understood how to project the Stokes vectors when the incoming rays are not orthogonal to the image plane. We try to answer this question presenting a geometric model describing how a general projective camera captures the light polarization state. Based on the optical properties of a tilted polarizer, our model is implemented as a pre-processing operation acting on raw images, followed by a per-pixel rotation of the reconstructed normal field. In this way, all the existing SfP methods assuming orthographic cameras can behave like they were designed for projective ones. Moreover, our model is consistent with state-of-the-art forward and inverse renderers (like Mitsuba3 and ART), intrinsically enforces physical constraints among the captured channels, and handles demosaicing of DoFP sensors. Experiments on existing and new datasets demonstrate the accuracy of the model when applied to commercially available polarimetric cameras.
|
2.0
|
New submissions for Thu, 1 Dec 22 - ## Keyword: events
There is no result
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
### From Actions to Events: A Transfer Learning Approach Using Improved Deep Belief Networks
- **Authors:** Mateus Roder, Jurandy Almeida, Gustavo H. de Rosa, Leandro A. Passos, André L. D. Rossi, João P. Papa
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2211.17045
- **Pdf link:** https://arxiv.org/pdf/2211.17045
- **Abstract**
In the last decade, exponential data growth supplied machine learning-based algorithms' capacity and enabled their usage in daily-life activities. Additionally, such an improvement is partially explained due to the advent of deep learning techniques, i.e., stacks of simple architectures that end up in more complex models. Although both factors produce outstanding results, they also pose drawbacks regarding the learning process as training complex models over large datasets are expensive and time-consuming. Such a problem is even more evident when dealing with video analysis. Some works have considered transfer learning or domain adaptation, i.e., approaches that map the knowledge from one domain to another, to ease the training burden, yet most of them operate over individual or small blocks of frames. This paper proposes a novel approach to map the knowledge from action recognition to event recognition using an energy-based model, denoted as Spectral Deep Belief Network. Such a model can process all frames simultaneously, carrying spatial and temporal information through the learning process. The experimental results conducted over two public video dataset, the HMDB-51 and the UCF-101, depict the effectiveness of the proposed model and its reduced computational burden when compared to traditional energy-based models, such as Restricted Boltzmann Machines and Deep Belief Networks.
## Keyword: ISP
### DINER: Depth-aware Image-based NEural Radiance fields
- **Authors:** Malte Prinzler, Otmar Hilliges, Justus Thies
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2211.16630
- **Pdf link:** https://arxiv.org/pdf/2211.16630
- **Abstract**
We present Depth-aware Image-based NEural Radiance fields (DINER). Given a sparse set of RGB input views, we predict depth and feature maps to guide the reconstruction of a volumetric scene representation that allows us to render 3D objects under novel views. Specifically, we propose novel techniques to incorporate depth information into feature fusion and efficient scene sampling. In comparison to the previous state of the art, DINER achieves higher synthesis quality and can process input views with greater disparity. This allows us to capture scenes more completely without changing capturing hardware requirements and ultimately enables larger viewpoint changes during novel view synthesis. We evaluate our method by synthesizing novel views, both for human heads and for general objects, and observe significantly improved qualitative results and increased perceptual metrics compared to the previous state of the art. The code will be made publicly available for research purposes.
### Rethinking Disparity: A Depth Range Free Multi-View Stereo Based on Disparity
- **Authors:** Qingsong Yan, Qiang Wang, Kaiyong Zhao, Bo Li, Xiaowen Chu, Fei Deng
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2211.16905
- **Pdf link:** https://arxiv.org/pdf/2211.16905
- **Abstract**
Existing learning-based multi-view stereo (MVS) methods rely on the depth range to build the 3D cost volume and may fail when the range is too large or unreliable. To address this problem, we propose a disparity-based MVS method based on the epipolar disparity flow (E-flow), called DispMVS, which infers the depth information from the pixel movement between two views. The core of DispMVS is to construct a 2D cost volume on the image plane along the epipolar line between each pair (between the reference image and several source images) for pixel matching and fuse uncountable depths triangulated from each pair by multi-view geometry to ensure multi-view consistency. To be robust, DispMVS starts from a randomly initialized depth map and iteratively refines the depth map with the help of the coarse-to-fine strategy. Experiments on DTUMVS and Tanks\&Temple datasets show that DispMVS is not sensitive to the depth range and achieves state-of-the-art results with lower GPU memory.
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### ObjCAViT: Improving Monocular Depth Estimation Using Natural Language Models And Image-Object Cross-Attention
- **Authors:** Dylan Auty, Krystian Mikolajczyk
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2211.17232
- **Pdf link:** https://arxiv.org/pdf/2211.17232
- **Abstract**
While monocular depth estimation (MDE) is an important problem in computer vision, it is difficult due to the ambiguity that results from the compression of a 3D scene into only 2 dimensions. It is common practice in the field to treat it as simple image-to-image translation, without consideration for the semantics of the scene and the objects within it. In contrast, humans and animals have been shown to use higher-level information to solve MDE: prior knowledge of the nature of the objects in the scene, their positions and likely configurations relative to one another, and their apparent sizes have all been shown to help resolve this ambiguity. In this paper, we present a novel method to enhance MDE performance by encouraging use of known-useful information about the semantics of objects and inter-object relationships within a scene. Our novel ObjCAViT module sources world-knowledge from language models and learns inter-object relationships in the context of the MDE problem using transformer attention, incorporating apparent size information. Our method produces highly accurate depth maps, and we obtain competitive results on the NYUv2 and KITTI datasets. Our ablation experiments show that the use of language and cross-attention within the ObjCAViT module increases performance. Code is released at https://github.com/DylanAuty/ObjCAViT.
## Keyword: RAW
### SGDraw: Scene Graph Drawing Interface Using Object-Oriented Representation
- **Authors:** Tianyu Zhang, Xusheng Du, Chia-Ming Chang, Xi Yang, Haoran Xie
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Human-Computer Interaction (cs.HC)
- **Arxiv link:** https://arxiv.org/abs/2211.16697
- **Pdf link:** https://arxiv.org/pdf/2211.16697
- **Abstract**
Scene understanding is an essential and challenging task in computer vision. To provide the visually fundamental graphical structure of an image, the scene graph has received increased attention due to its powerful semantic representation. However, it is difficult to draw a proper scene graph for image retrieval, image generation, and multi-modal applications. The conventional scene graph annotation interface is not easy to use in image annotations, and the automatic scene graph generation approaches using deep neural networks are prone to generate redundant content while disregarding details. In this work, we propose SGDraw, a scene graph drawing interface using object-oriented scene graph representation to help users draw and edit scene graphs interactively. For the proposed object-oriented representation, we consider the objects, attributes, and relationships of objects as a structural unit. SGDraw provides a web-based scene graph annotation and generation tool for scene understanding applications. To verify the effectiveness of the proposed interface, we conducted a comparison study with the conventional tool and the user experience study. The results show that SGDraw can help generate scene graphs with richer details and describe the images more accurately than traditional bounding box annotations. We believe the proposed SGDraw can be useful in various vision tasks, such as image retrieval and generation.
### Extracting Semantic Knowledge from GANs with Unsupervised Learning
- **Authors:** Jianjin Xu, Zhaoxiang Zhang, Xiaolin Hu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2211.16710
- **Pdf link:** https://arxiv.org/pdf/2211.16710
- **Abstract**
Recently, unsupervised learning has made impressive progress on various tasks. Despite the dominance of discriminative models, increasing attention is drawn to representations learned by generative models and in particular, Generative Adversarial Networks (GANs). Previous works on the interpretation of GANs reveal that GANs encode semantics in feature maps in a linearly separable form. In this work, we further find that GAN's features can be well clustered with the linear separability assumption. We propose a novel clustering algorithm, named KLiSH, which leverages the linear separability to cluster GAN's features. KLiSH succeeds in extracting fine-grained semantics of GANs trained on datasets of various objects, e.g., car, portrait, animals, and so on. With KLiSH, we can sample images from GANs along with their segmentation masks and synthesize paired image-segmentation datasets. Using the synthesized datasets, we enable two downstream applications. First, we train semantic segmentation networks on these datasets and test them on real images, realizing unsupervised semantic segmentation. Second, we train image-to-image translation networks on the synthesized datasets, enabling semantic-conditional image synthesis without human annotations.
### Dr.3D: Adapting 3D GANs to Artistic Drawings
- **Authors:** Wonjoon Jin, Nuri Ryu, Geonung Kim, Seung-Hwan Baek, Sunghyun Cho
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Machine Learning (cs.LG)
- **Arxiv link:** https://arxiv.org/abs/2211.16798
- **Pdf link:** https://arxiv.org/pdf/2211.16798
- **Abstract**
While 3D GANs have recently demonstrated the high-quality synthesis of multi-view consistent images and 3D shapes, they are mainly restricted to photo-realistic human portraits. This paper aims to extend 3D GANs to a different, but meaningful visual form: artistic portrait drawings. However, extending existing 3D GANs to drawings is challenging due to the inevitable geometric ambiguity present in drawings. To tackle this, we present Dr.3D, a novel adaptation approach that adapts an existing 3D GAN to artistic drawings. Dr.3D is equipped with three novel components to handle the geometric ambiguity: a deformation-aware 3D synthesis network, an alternating adaptation of pose estimation and image synthesis, and geometric priors. Experiments show that our approach can successfully adapt 3D GANs to drawings and enable multi-view consistent semantic editing of drawings.
### Linking Sketch Patches by Learning Synonymous Proximity for Graphic Sketch Representation
- **Authors:** Sicong Zang, Shikui Tu, Lei Xu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2211.16841
- **Pdf link:** https://arxiv.org/pdf/2211.16841
- **Abstract**
Graphic sketch representations are effective for representing sketches. Existing methods take the patches cropped from sketches as the graph nodes, and construct the edges based on sketch's drawing order or Euclidean distances on the canvas. However, the drawing order of a sketch may not be unique, while the patches from semantically related parts of a sketch may be far away from each other on the canvas. In this paper, we propose an order-invariant, semantics-aware method for graphic sketch representations. The cropped sketch patches are linked according to their global semantics or local geometric shapes, namely the synonymous proximity, by computing the cosine similarity between the captured patch embeddings. Such constructed edges are learnable to adapt to the variation of sketch drawings, which enable the message passing among synonymous patches. Aggregating the messages from synonymous patches by graph convolutional networks plays a role of denoising, which is beneficial to produce robust patch embeddings and accurate sketch representations. Furthermore, we enforce a clustering constraint over the embeddings jointly with the network learning. The synonymous patches are self-organized as compact clusters, and their embeddings are guided to move towards their assigned cluster centroids. It raises the accuracy of the computed synonymous proximity. Experimental results show that our method significantly improves the performance on both controllable sketch synthesis and sketch healing.
### A Geometric Model for Polarization Imaging on Projective Cameras
- **Authors:** Mara Pistellato, Filippo Bergamasco
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2211.16986
- **Pdf link:** https://arxiv.org/pdf/2211.16986
- **Abstract**
The vast majority of Shape-from-Polarization (SfP) methods work under the oversimplified assumption of using orthographic cameras. Indeed, it is still not well understood how to project the Stokes vectors when the incoming rays are not orthogonal to the image plane. We try to answer this question presenting a geometric model describing how a general projective camera captures the light polarization state. Based on the optical properties of a tilted polarizer, our model is implemented as a pre-processing operation acting on raw images, followed by a per-pixel rotation of the reconstructed normal field. In this way, all the existing SfP methods assuming orthographic cameras can behave like they were designed for projective ones. Moreover, our model is consistent with state-of-the-art forward and inverse renderers (like Mitsuba3 and ART), intrinsically enforces physical constraints among the captured channels, and handles demosaicing of DoFP sensors. Experiments on existing and new datasets demonstrate the accuracy of the model when applied to commercially available polarimetric cameras.
### From Actions to Events: A Transfer Learning Approach Using Improved Deep Belief Networks
- **Authors:** Mateus Roder, Jurandy Almeida, Gustavo H. de Rosa, Leandro A. Passos, André L. D. Rossi, João P. Papa
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2211.17045
- **Pdf link:** https://arxiv.org/pdf/2211.17045
- **Abstract**
In the last decade, exponential data growth supplied machine learning-based algorithms' capacity and enabled their usage in daily-life activities. Additionally, such an improvement is partially explained due to the advent of deep learning techniques, i.e., stacks of simple architectures that end up in more complex models. Although both factors produce outstanding results, they also pose drawbacks regarding the learning process as training complex models over large datasets are expensive and time-consuming. Such a problem is even more evident when dealing with video analysis. Some works have considered transfer learning or domain adaptation, i.e., approaches that map the knowledge from one domain to another, to ease the training burden, yet most of them operate over individual or small blocks of frames. This paper proposes a novel approach to map the knowledge from action recognition to event recognition using an energy-based model, denoted as Spectral Deep Belief Network. Such a model can process all frames simultaneously, carrying spatial and temporal information through the learning process. The experimental results conducted over two public video dataset, the HMDB-51 and the UCF-101, depict the effectiveness of the proposed model and its reduced computational burden when compared to traditional energy-based models, such as Restricted Boltzmann Machines and Deep Belief Networks.
## Keyword: raw image
### A Geometric Model for Polarization Imaging on Projective Cameras
- **Authors:** Mara Pistellato, Filippo Bergamasco
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2211.16986
- **Pdf link:** https://arxiv.org/pdf/2211.16986
- **Abstract**
The vast majority of Shape-from-Polarization (SfP) methods work under the oversimplified assumption of using orthographic cameras. Indeed, it is still not well understood how to project the Stokes vectors when the incoming rays are not orthogonal to the image plane. We try to answer this question presenting a geometric model describing how a general projective camera captures the light polarization state. Based on the optical properties of a tilted polarizer, our model is implemented as a pre-processing operation acting on raw images, followed by a per-pixel rotation of the reconstructed normal field. In this way, all the existing SfP methods assuming orthographic cameras can behave like they were designed for projective ones. Moreover, our model is consistent with state-of-the-art forward and inverse renderers (like Mitsuba3 and ART), intrinsically enforces physical constraints among the captured channels, and handles demosaicing of DoFP sensors. Experiments on existing and new datasets demonstrate the accuracy of the model when applied to commercially available polarimetric cameras.
|
process
|
new submissions for thu dec keyword events there is no result keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awb from actions to events a transfer learning approach using improved deep belief networks authors mateus roder jurandy almeida gustavo h de rosa leandro a passos andré l d rossi joão p papa subjects computer vision and pattern recognition cs cv artificial intelligence cs ai arxiv link pdf link abstract in the last decade exponential data growth supplied machine learning based algorithms capacity and enabled their usage in daily life activities additionally such an improvement is partially explained due to the advent of deep learning techniques i e stacks of simple architectures that end up in more complex models although both factors produce outstanding results they also pose drawbacks regarding the learning process as training complex models over large datasets are expensive and time consuming such a problem is even more evident when dealing with video analysis some works have considered transfer learning or domain adaptation i e approaches that map the knowledge from one domain to another to ease the training burden yet most of them operate over individual or small blocks of frames this paper proposes a novel approach to map the knowledge from action recognition to event recognition using an energy based model denoted as spectral deep belief network such a model can process all frames simultaneously carrying spatial and temporal information through the learning process the experimental results conducted over two public video dataset the hmdb and the ucf depict the effectiveness of the proposed model and its reduced computational burden when compared to traditional energy based models such as restricted boltzmann machines and deep belief networks keyword isp diner depth aware image based neural radiance fields authors malte prinzler otmar hilliges justus thies subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract we present depth aware image based neural radiance fields diner given a sparse set of rgb input views we predict depth and feature maps to guide the reconstruction of a volumetric scene representation that allows us to render objects under novel views specifically we propose novel techniques to incorporate depth information into feature fusion and efficient scene sampling in comparison to the previous state of the art diner achieves higher synthesis quality and can process input views with greater disparity this allows us to capture scenes more completely without changing capturing hardware requirements and ultimately enables larger viewpoint changes during novel view synthesis we evaluate our method by synthesizing novel views both for human heads and for general objects and observe significantly improved qualitative results and increased perceptual metrics compared to the previous state of the art the code will be made publicly available for research purposes rethinking disparity a depth range free multi view stereo based on disparity authors qingsong yan qiang wang kaiyong zhao bo li xiaowen chu fei deng subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract existing learning based multi view stereo mvs methods rely on the depth range to build the cost volume and may fail when the range is too large or unreliable to address this problem we propose a disparity based mvs method based on the epipolar disparity flow e flow called dispmvs which infers the depth information from the pixel movement between two views the core of dispmvs is to construct a cost volume on the image plane along the epipolar line between each pair between the reference image and several source images for pixel matching and fuse uncountable depths triangulated from each pair by multi view geometry to ensure multi view consistency to be robust dispmvs starts from a randomly initialized depth map and iteratively refines the depth map with the help of the coarse to fine strategy experiments on dtumvs and tanks temple datasets show that dispmvs is not sensitive to the depth range and achieves state of the art results with lower gpu memory keyword image signal processing there is no result keyword image signal process there is no result keyword compression objcavit improving monocular depth estimation using natural language models and image object cross attention authors dylan auty krystian mikolajczyk subjects computer vision and pattern recognition cs cv machine learning cs lg arxiv link pdf link abstract while monocular depth estimation mde is an important problem in computer vision it is difficult due to the ambiguity that results from the compression of a scene into only dimensions it is common practice in the field to treat it as simple image to image translation without consideration for the semantics of the scene and the objects within it in contrast humans and animals have been shown to use higher level information to solve mde prior knowledge of the nature of the objects in the scene their positions and likely configurations relative to one another and their apparent sizes have all been shown to help resolve this ambiguity in this paper we present a novel method to enhance mde performance by encouraging use of known useful information about the semantics of objects and inter object relationships within a scene our novel objcavit module sources world knowledge from language models and learns inter object relationships in the context of the mde problem using transformer attention incorporating apparent size information our method produces highly accurate depth maps and we obtain competitive results on the and kitti datasets our ablation experiments show that the use of language and cross attention within the objcavit module increases performance code is released at keyword raw sgdraw scene graph drawing interface using object oriented representation authors tianyu zhang xusheng du chia ming chang xi yang haoran xie subjects computer vision and pattern recognition cs cv human computer interaction cs hc arxiv link pdf link abstract scene understanding is an essential and challenging task in computer vision to provide the visually fundamental graphical structure of an image the scene graph has received increased attention due to its powerful semantic representation however it is difficult to draw a proper scene graph for image retrieval image generation and multi modal applications the conventional scene graph annotation interface is not easy to use in image annotations and the automatic scene graph generation approaches using deep neural networks are prone to generate redundant content while disregarding details in this work we propose sgdraw a scene graph drawing interface using object oriented scene graph representation to help users draw and edit scene graphs interactively for the proposed object oriented representation we consider the objects attributes and relationships of objects as a structural unit sgdraw provides a web based scene graph annotation and generation tool for scene understanding applications to verify the effectiveness of the proposed interface we conducted a comparison study with the conventional tool and the user experience study the results show that sgdraw can help generate scene graphs with richer details and describe the images more accurately than traditional bounding box annotations we believe the proposed sgdraw can be useful in various vision tasks such as image retrieval and generation extracting semantic knowledge from gans with unsupervised learning authors jianjin xu zhaoxiang zhang xiaolin hu subjects computer vision and pattern recognition cs cv artificial intelligence cs ai machine learning cs lg arxiv link pdf link abstract recently unsupervised learning has made impressive progress on various tasks despite the dominance of discriminative models increasing attention is drawn to representations learned by generative models and in particular generative adversarial networks gans previous works on the interpretation of gans reveal that gans encode semantics in feature maps in a linearly separable form in this work we further find that gan s features can be well clustered with the linear separability assumption we propose a novel clustering algorithm named klish which leverages the linear separability to cluster gan s features klish succeeds in extracting fine grained semantics of gans trained on datasets of various objects e g car portrait animals and so on with klish we can sample images from gans along with their segmentation masks and synthesize paired image segmentation datasets using the synthesized datasets we enable two downstream applications first we train semantic segmentation networks on these datasets and test them on real images realizing unsupervised semantic segmentation second we train image to image translation networks on the synthesized datasets enabling semantic conditional image synthesis without human annotations dr adapting gans to artistic drawings authors wonjoon jin nuri ryu geonung kim seung hwan baek sunghyun cho subjects computer vision and pattern recognition cs cv machine learning cs lg arxiv link pdf link abstract while gans have recently demonstrated the high quality synthesis of multi view consistent images and shapes they are mainly restricted to photo realistic human portraits this paper aims to extend gans to a different but meaningful visual form artistic portrait drawings however extending existing gans to drawings is challenging due to the inevitable geometric ambiguity present in drawings to tackle this we present dr a novel adaptation approach that adapts an existing gan to artistic drawings dr is equipped with three novel components to handle the geometric ambiguity a deformation aware synthesis network an alternating adaptation of pose estimation and image synthesis and geometric priors experiments show that our approach can successfully adapt gans to drawings and enable multi view consistent semantic editing of drawings linking sketch patches by learning synonymous proximity for graphic sketch representation authors sicong zang shikui tu lei xu subjects computer vision and pattern recognition cs cv artificial intelligence cs ai arxiv link pdf link abstract graphic sketch representations are effective for representing sketches existing methods take the patches cropped from sketches as the graph nodes and construct the edges based on sketch s drawing order or euclidean distances on the canvas however the drawing order of a sketch may not be unique while the patches from semantically related parts of a sketch may be far away from each other on the canvas in this paper we propose an order invariant semantics aware method for graphic sketch representations the cropped sketch patches are linked according to their global semantics or local geometric shapes namely the synonymous proximity by computing the cosine similarity between the captured patch embeddings such constructed edges are learnable to adapt to the variation of sketch drawings which enable the message passing among synonymous patches aggregating the messages from synonymous patches by graph convolutional networks plays a role of denoising which is beneficial to produce robust patch embeddings and accurate sketch representations furthermore we enforce a clustering constraint over the embeddings jointly with the network learning the synonymous patches are self organized as compact clusters and their embeddings are guided to move towards their assigned cluster centroids it raises the accuracy of the computed synonymous proximity experimental results show that our method significantly improves the performance on both controllable sketch synthesis and sketch healing a geometric model for polarization imaging on projective cameras authors mara pistellato filippo bergamasco subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract the vast majority of shape from polarization sfp methods work under the oversimplified assumption of using orthographic cameras indeed it is still not well understood how to project the stokes vectors when the incoming rays are not orthogonal to the image plane we try to answer this question presenting a geometric model describing how a general projective camera captures the light polarization state based on the optical properties of a tilted polarizer our model is implemented as a pre processing operation acting on raw images followed by a per pixel rotation of the reconstructed normal field in this way all the existing sfp methods assuming orthographic cameras can behave like they were designed for projective ones moreover our model is consistent with state of the art forward and inverse renderers like and art intrinsically enforces physical constraints among the captured channels and handles demosaicing of dofp sensors experiments on existing and new datasets demonstrate the accuracy of the model when applied to commercially available polarimetric cameras from actions to events a transfer learning approach using improved deep belief networks authors mateus roder jurandy almeida gustavo h de rosa leandro a passos andré l d rossi joão p papa subjects computer vision and pattern recognition cs cv artificial intelligence cs ai arxiv link pdf link abstract in the last decade exponential data growth supplied machine learning based algorithms capacity and enabled their usage in daily life activities additionally such an improvement is partially explained due to the advent of deep learning techniques i e stacks of simple architectures that end up in more complex models although both factors produce outstanding results they also pose drawbacks regarding the learning process as training complex models over large datasets are expensive and time consuming such a problem is even more evident when dealing with video analysis some works have considered transfer learning or domain adaptation i e approaches that map the knowledge from one domain to another to ease the training burden yet most of them operate over individual or small blocks of frames this paper proposes a novel approach to map the knowledge from action recognition to event recognition using an energy based model denoted as spectral deep belief network such a model can process all frames simultaneously carrying spatial and temporal information through the learning process the experimental results conducted over two public video dataset the hmdb and the ucf depict the effectiveness of the proposed model and its reduced computational burden when compared to traditional energy based models such as restricted boltzmann machines and deep belief networks keyword raw image a geometric model for polarization imaging on projective cameras authors mara pistellato filippo bergamasco subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract the vast majority of shape from polarization sfp methods work under the oversimplified assumption of using orthographic cameras indeed it is still not well understood how to project the stokes vectors when the incoming rays are not orthogonal to the image plane we try to answer this question presenting a geometric model describing how a general projective camera captures the light polarization state based on the optical properties of a tilted polarizer our model is implemented as a pre processing operation acting on raw images followed by a per pixel rotation of the reconstructed normal field in this way all the existing sfp methods assuming orthographic cameras can behave like they were designed for projective ones moreover our model is consistent with state of the art forward and inverse renderers like and art intrinsically enforces physical constraints among the captured channels and handles demosaicing of dofp sensors experiments on existing and new datasets demonstrate the accuracy of the model when applied to commercially available polarimetric cameras
| 1
|
19,149
| 25,222,662,566
|
IssuesEvent
|
2022-11-14 13:56:29
|
ESMValGroup/ESMValCore
|
https://api.github.com/repos/ESMValGroup/ESMValCore
|
opened
|
Preprocessor to rotate vector fields to lat/lon directions
|
enhancement preprocessor
|
**Is your feature request related to a problem? Please describe.**
It would be useful to be able to rotate vector variables (such as uo and vo) to the regular grid directions. There is this [iris](https://scitools-iris.readthedocs.io/en/latest/generated/api/iris/analysis/cartography.html?highlight=iris.analysis.cartography#iris.analysis.cartography.rotate_grid_vectors) routine that could do the job. However some tests show strange results for MPI models and furthermore the function is not lazy.
Another issue is that the routine fails if the dimensions of the mask do not match the dimensions of the cube, so the grid cell angles need to be computed putside of the routine:
```python
from iris.analysis.cartography import rotate_grid_vectors, gridcell_angles
grid_angles_cube = gridcell_angles(uo)
grid_angles_cube = iris.util.new_axis(grid_angles_cube)
grid_angles_cube = iris.util.new_axis(grid_angles_cube)
grid_angles_data = da.broadcast_to(grid_angles_cube.lazy_data(), uo.shape)
grid_angles_cube = uo.copy(grid_angles_data)
grid_angles_cube.units = Unit('degrees')
grid_angles_cube.long_name = 'gridcell_angle_from_true_east'
urot_slice, vrot_slice = rotate_grid_vectors(uo, vo, grid_angles_cube)
```
Any opinions on how this could be implemented?
**Would you be able to help out?**
Yes
|
1.0
|
Preprocessor to rotate vector fields to lat/lon directions - **Is your feature request related to a problem? Please describe.**
It would be useful to be able to rotate vector variables (such as uo and vo) to the regular grid directions. There is this [iris](https://scitools-iris.readthedocs.io/en/latest/generated/api/iris/analysis/cartography.html?highlight=iris.analysis.cartography#iris.analysis.cartography.rotate_grid_vectors) routine that could do the job. However some tests show strange results for MPI models and furthermore the function is not lazy.
Another issue is that the routine fails if the dimensions of the mask do not match the dimensions of the cube, so the grid cell angles need to be computed putside of the routine:
```python
from iris.analysis.cartography import rotate_grid_vectors, gridcell_angles
grid_angles_cube = gridcell_angles(uo)
grid_angles_cube = iris.util.new_axis(grid_angles_cube)
grid_angles_cube = iris.util.new_axis(grid_angles_cube)
grid_angles_data = da.broadcast_to(grid_angles_cube.lazy_data(), uo.shape)
grid_angles_cube = uo.copy(grid_angles_data)
grid_angles_cube.units = Unit('degrees')
grid_angles_cube.long_name = 'gridcell_angle_from_true_east'
urot_slice, vrot_slice = rotate_grid_vectors(uo, vo, grid_angles_cube)
```
Any opinions on how this could be implemented?
**Would you be able to help out?**
Yes
|
process
|
preprocessor to rotate vector fields to lat lon directions is your feature request related to a problem please describe it would be useful to be able to rotate vector variables such as uo and vo to the regular grid directions there is this routine that could do the job however some tests show strange results for mpi models and furthermore the function is not lazy another issue is that the routine fails if the dimensions of the mask do not match the dimensions of the cube so the grid cell angles need to be computed putside of the routine python from iris analysis cartography import rotate grid vectors gridcell angles grid angles cube gridcell angles uo grid angles cube iris util new axis grid angles cube grid angles cube iris util new axis grid angles cube grid angles data da broadcast to grid angles cube lazy data uo shape grid angles cube uo copy grid angles data grid angles cube units unit degrees grid angles cube long name gridcell angle from true east urot slice vrot slice rotate grid vectors uo vo grid angles cube any opinions on how this could be implemented would you be able to help out yes
| 1
|
9,237
| 12,266,635,627
|
IssuesEvent
|
2020-05-07 09:17:28
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
reopened
|
Custom Column expression `substring()` returns nothing for MySQL
|
Good Backend Starer Issue Priority:P2 Querying/Processor Type:Bug
|
**Describe the bug**
Using the expression function `substring()` when querying MySQL returns empty (`""`) column.
**To Reproduce**
1. Custom question > select MySQL/MariaDB database and a table with string column
2. Click on Custom Column > formula: `substring([string_column], 0, 5)` name: "test"
3. Visualize > shows empty values in the "test" column.
**Expected behavior**
Expect the snippet returned ;-)
**Information about your Metabase Installation:**
Metabase 0.35.3 with multiple backends and internal instance.
**Additional context**
Postgres and H2 seems to work (haven't tested other databases)
|
1.0
|
Custom Column expression `substring()` returns nothing for MySQL - **Describe the bug**
Using the expression function `substring()` when querying MySQL returns empty (`""`) column.
**To Reproduce**
1. Custom question > select MySQL/MariaDB database and a table with string column
2. Click on Custom Column > formula: `substring([string_column], 0, 5)` name: "test"
3. Visualize > shows empty values in the "test" column.
**Expected behavior**
Expect the snippet returned ;-)
**Information about your Metabase Installation:**
Metabase 0.35.3 with multiple backends and internal instance.
**Additional context**
Postgres and H2 seems to work (haven't tested other databases)
|
process
|
custom column expression substring returns nothing for mysql describe the bug using the expression function substring when querying mysql returns empty column to reproduce custom question select mysql mariadb database and a table with string column click on custom column formula substring name test visualize shows empty values in the test column expected behavior expect the snippet returned information about your metabase installation metabase with multiple backends and internal instance additional context postgres and seems to work haven t tested other databases
| 1
|
71,442
| 13,652,832,958
|
IssuesEvent
|
2020-09-27 09:35:52
|
GTNewHorizons/GT-New-Horizons-Modpack
|
https://api.github.com/repos/GTNewHorizons/GT-New-Horizons-Modpack
|
closed
|
Forestry Yield trait doesn't seem to work properly.
|
Type Need Code changes
|
After breeding plums (high yield) and mixing them with my butternuts (average yield) I expected a 50% increase. Average yield is supposed to be ~20% chance of fruit, with high yield at 30%. However, after testing, I see essentially no yield difference between the average and high yield trees.


|
1.0
|
Forestry Yield trait doesn't seem to work properly. - After breeding plums (high yield) and mixing them with my butternuts (average yield) I expected a 50% increase. Average yield is supposed to be ~20% chance of fruit, with high yield at 30%. However, after testing, I see essentially no yield difference between the average and high yield trees.


|
non_process
|
forestry yield trait doesn t seem to work properly after breeding plums high yield and mixing them with my butternuts average yield i expected a increase average yield is supposed to be chance of fruit with high yield at however after testing i see essentially no yield difference between the average and high yield trees
| 0
|
3,851
| 6,808,551,173
|
IssuesEvent
|
2017-11-04 04:29:03
|
Great-Hill-Corporation/quickBlocks
|
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
|
reopened
|
InMemoryCache is allocated but not deleted
|
libs-utillib status-inprocess type-enhancement
|
If I statically allocate it, it opens CToml config file, which does not allow for changing location of binary cache, so I have to dynamically allocate it. But, then, I have no place to de-allocate it unless the end user calls close library which is not required. Could do it as I do in getCurl where it's all in one function and takes a flag to deallocate, but I'm not a fan of that.
|
1.0
|
InMemoryCache is allocated but not deleted - If I statically allocate it, it opens CToml config file, which does not allow for changing location of binary cache, so I have to dynamically allocate it. But, then, I have no place to de-allocate it unless the end user calls close library which is not required. Could do it as I do in getCurl where it's all in one function and takes a flag to deallocate, but I'm not a fan of that.
|
process
|
inmemorycache is allocated but not deleted if i statically allocate it it opens ctoml config file which does not allow for changing location of binary cache so i have to dynamically allocate it but then i have no place to de allocate it unless the end user calls close library which is not required could do it as i do in getcurl where it s all in one function and takes a flag to deallocate but i m not a fan of that
| 1
|
282,627
| 24,490,490,679
|
IssuesEvent
|
2022-10-10 00:46:28
|
wazuh/wazuh
|
https://api.github.com/repos/wazuh/wazuh
|
opened
|
Release 4.3.9 - Revision 1 - Release Candidate RC1 - Footprint Metrics - DOCKER-LISTENER,CIS-CAT,OSQUERY,AZURE-LOGS,OPEN-SCAP (2.5d)
|
release test/4.3.9
|
## Footprint metrics information
| | |
|---------------------------------|--------------------------------------------|
| **Main release candidate issue #** | #15090 |
| **Main footprint metrics issue #** | #15113 |
| **Version** | 4.3.9 |
| **Release candidate #** | RC1 |
| **Tag** | https://github.com/wazuh/wazuh/tree/4.3.9-rc1 |
## Stress test documentation
### Packages used
- Repository: `packages-dev.wazuh.com`
- Package path: `pre-release`
- Package revision: `1`
- **Jenkins build**: https://ci.wazuh.info/job/Test_stress/3622/
---
<details><summary>Manager</summary>
+ <details><summary>Plots</summary>
















</details>
+ <details><summary>Logs and configuration</summary>
[ossec_Test_stress_B3622_manager_2022-10-10.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_manager_centos/logs/ossec_Test_stress_B3622_manager_2022-10-10.zip)
</details>
+ <details><summary>CSV</summary>
[monitor-manager-Test_stress_B3622_manager-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_manager_centos/data/monitor-manager-Test_stress_B3622_manager-pre-release.csv)
[Test_stress_B3622_manager_analysisd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_manager_centos/data/Test_stress_B3622_manager_analysisd_state.csv)
[Test_stress_B3622_manager_remoted_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_manager_centos/data/Test_stress_B3622_manager_remoted_state.csv)
</details>
</details>
<details><summary>Centos agent</summary>
+ <details><summary>Plots</summary>

















</details>
+ <details><summary>Logs and configuration</summary>
[ossec_Test_stress_B3622_centos_2022-10-10.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_centos/logs/ossec_Test_stress_B3622_centos_2022-10-10.zip)
</details>
+ <details><summary>CSV</summary>
[monitor-agent-Test_stress_B3622_centos-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_centos/data/monitor-agent-Test_stress_B3622_centos-pre-release.csv)
[Test_stress_B3622_centos_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_centos/data/Test_stress_B3622_centos_agentd_state.csv)
</details>
</details>
<details><summary>Ubuntu agent</summary>
+ <details><summary>Plots</summary>

















</details>
+ <details><summary>Logs and configuration</summary>
[ossec_Test_stress_B3622_ubuntu_2022-10-10.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_ubuntu/logs/ossec_Test_stress_B3622_ubuntu_2022-10-10.zip)
</details>
+ <details><summary>CSV</summary>
[monitor-agent-Test_stress_B3622_ubuntu-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_ubuntu/data/monitor-agent-Test_stress_B3622_ubuntu-pre-release.csv)
[Test_stress_B3622_ubuntu_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_ubuntu/data/Test_stress_B3622_ubuntu_agentd_state.csv)
</details>
</details>
<details><summary>Windows agent</summary>
+ <details><summary>Plots</summary>















</details>
+ <details><summary>Logs and configuration</summary>
[ossec_Test_stress_B3622_windows_2022-10-10.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_windows/logs/ossec_Test_stress_B3622_windows_2022-10-10.zip)
</details>
+ <details><summary>CSV</summary>
[monitor-winagent-Test_stress_B3622_windows-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_windows/data/monitor-winagent-Test_stress_B3622_windows-pre-release.csv)
[Test_stress_B3622_windows_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_windows/data/Test_stress_B3622_windows_agentd_state.csv)
</details>
</details>
<details><summary>macOS agent</summary>
+ <details><summary>Plots</summary>
</details>
+ <details><summary>Logs and configuration</summary>
</details>
+ <details><summary>CSV</summary>
</details>
</details>
<details><summary>Solaris agent</summary>
+ <details><summary>Plots</summary>
</details>
+ <details><summary>Logs and configuration</summary>
</details>
+ <details><summary>CSV</summary>
</details>
</details>
|
1.0
|
Release 4.3.9 - Revision 1 - Release Candidate RC1 - Footprint Metrics - DOCKER-LISTENER,CIS-CAT,OSQUERY,AZURE-LOGS,OPEN-SCAP (2.5d) - ## Footprint metrics information
| | |
|---------------------------------|--------------------------------------------|
| **Main release candidate issue #** | #15090 |
| **Main footprint metrics issue #** | #15113 |
| **Version** | 4.3.9 |
| **Release candidate #** | RC1 |
| **Tag** | https://github.com/wazuh/wazuh/tree/4.3.9-rc1 |
## Stress test documentation
### Packages used
- Repository: `packages-dev.wazuh.com`
- Package path: `pre-release`
- Package revision: `1`
- **Jenkins build**: https://ci.wazuh.info/job/Test_stress/3622/
---
<details><summary>Manager</summary>
+ <details><summary>Plots</summary>
















</details>
+ <details><summary>Logs and configuration</summary>
[ossec_Test_stress_B3622_manager_2022-10-10.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_manager_centos/logs/ossec_Test_stress_B3622_manager_2022-10-10.zip)
</details>
+ <details><summary>CSV</summary>
[monitor-manager-Test_stress_B3622_manager-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_manager_centos/data/monitor-manager-Test_stress_B3622_manager-pre-release.csv)
[Test_stress_B3622_manager_analysisd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_manager_centos/data/Test_stress_B3622_manager_analysisd_state.csv)
[Test_stress_B3622_manager_remoted_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_manager_centos/data/Test_stress_B3622_manager_remoted_state.csv)
</details>
</details>
<details><summary>Centos agent</summary>
+ <details><summary>Plots</summary>

















</details>
+ <details><summary>Logs and configuration</summary>
[ossec_Test_stress_B3622_centos_2022-10-10.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_centos/logs/ossec_Test_stress_B3622_centos_2022-10-10.zip)
</details>
+ <details><summary>CSV</summary>
[monitor-agent-Test_stress_B3622_centos-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_centos/data/monitor-agent-Test_stress_B3622_centos-pre-release.csv)
[Test_stress_B3622_centos_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_centos/data/Test_stress_B3622_centos_agentd_state.csv)
</details>
</details>
<details><summary>Ubuntu agent</summary>
+ <details><summary>Plots</summary>

















</details>
+ <details><summary>Logs and configuration</summary>
[ossec_Test_stress_B3622_ubuntu_2022-10-10.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_ubuntu/logs/ossec_Test_stress_B3622_ubuntu_2022-10-10.zip)
</details>
+ <details><summary>CSV</summary>
[monitor-agent-Test_stress_B3622_ubuntu-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_ubuntu/data/monitor-agent-Test_stress_B3622_ubuntu-pre-release.csv)
[Test_stress_B3622_ubuntu_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_ubuntu/data/Test_stress_B3622_ubuntu_agentd_state.csv)
</details>
</details>
<details><summary>Windows agent</summary>
+ <details><summary>Plots</summary>















</details>
+ <details><summary>Logs and configuration</summary>
[ossec_Test_stress_B3622_windows_2022-10-10.zip](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_windows/logs/ossec_Test_stress_B3622_windows_2022-10-10.zip)
</details>
+ <details><summary>CSV</summary>
[monitor-winagent-Test_stress_B3622_windows-pre-release.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_windows/data/monitor-winagent-Test_stress_B3622_windows-pre-release.csv)
[Test_stress_B3622_windows_agentd_state.csv](https://ci.wazuh.com/data/Test_stress/pre-release/4.3.9/B3622-3600m/B3622_agent_windows/data/Test_stress_B3622_windows_agentd_state.csv)
</details>
</details>
<details><summary>macOS agent</summary>
+ <details><summary>Plots</summary>
</details>
+ <details><summary>Logs and configuration</summary>
</details>
+ <details><summary>CSV</summary>
</details>
</details>
<details><summary>Solaris agent</summary>
+ <details><summary>Plots</summary>
</details>
+ <details><summary>Logs and configuration</summary>
</details>
+ <details><summary>CSV</summary>
</details>
</details>
|
non_process
|
release revision release candidate footprint metrics docker listener cis cat osquery azure logs open scap footprint metrics information main release candidate issue main footprint metrics issue version release candidate tag stress test documentation packages used repository packages dev wazuh com package path pre release package revision jenkins build manager plots logs and configuration csv centos agent plots logs and configuration csv ubuntu agent plots logs and configuration csv windows agent plots logs and configuration csv macos agent plots logs and configuration csv solaris agent plots logs and configuration csv
| 0
|
12,150
| 14,741,401,633
|
IssuesEvent
|
2021-01-07 10:33:54
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
SA Billing - Allentown - Invalid Late Fees
|
anc-process anp-important ant-bug
|
In GitLab by @kdjstudios on Jan 16, 2019, 12:01
**Submitted by:** "Alina King" <alina.king@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-01-15-36737
**Server:** Internal
**Client/Site:** Allentown
**Account:** NA
**Issue:**
This the valid late fees are approved to be corrected.
|
1.0
|
SA Billing - Allentown - Invalid Late Fees - In GitLab by @kdjstudios on Jan 16, 2019, 12:01
**Submitted by:** "Alina King" <alina.king@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-01-15-36737
**Server:** Internal
**Client/Site:** Allentown
**Account:** NA
**Issue:**
This the valid late fees are approved to be corrected.
|
process
|
sa billing allentown invalid late fees in gitlab by kdjstudios on jan submitted by alina king helpdesk server internal client site allentown account na issue this the valid late fees are approved to be corrected
| 1
|
11,751
| 14,589,669,603
|
IssuesEvent
|
2020-12-19 03:11:38
|
kwinborne/asm6809
|
https://api.github.com/repos/kwinborne/asm6809
|
opened
|
Syntax Checking
|
Preprocessor
|
Create an algorithm that will search an input source file and check the mnemonics against a table of instructions.
|
1.0
|
Syntax Checking - Create an algorithm that will search an input source file and check the mnemonics against a table of instructions.
|
process
|
syntax checking create an algorithm that will search an input source file and check the mnemonics against a table of instructions
| 1
|
10,886
| 13,654,837,032
|
IssuesEvent
|
2020-09-27 19:18:55
|
eddieantonio/predictive-text-studio
|
https://api.github.com/repos/eddieantonio/predictive-text-studio
|
closed
|
Update to actual kmp.json format 😫
|
bug data-backing data-processing enhancement worker
|
I was wrong about the format of a `kmp.json` file. It actually follows this spec:
Documentation: https://help.keyman.com/developer/11.0/reference/file-types/metadata
(Outdated) Documentation: https://github.com/keymanapp/keyman/wiki/KMP-Metadata-File-(kmp.inf-and-kmp.json)#fields
JSON Schema: https://api.keyman.com/schemas/package.json
Here are some example `kmp.json` files to get you started:
This one is from https://downloads.keyman.com/models/nrc.en.mtnt/0.1.4/nrc.en.mtnt.model.kmp:
```json
{
"system": {
"keymanDeveloperVersion": "12.0.1500.0",
"fileVersion": "12.0"
},
"options": {
"followKeyboardVersion": false
},
"info": {
"author": {
"description": "Eddie Antonio Santos",
"url": "mailto:Eddie.Santos@nrc-cnrc.gc.ca"
},
"copyright": {
"description": "© 2019 National Research Council Canada"
},
"name": {
"description": "English language model mined from MTNT"
},
"version": {
"description": "0.1.4"
}
},
"files": [
{
"name": "nrc.en.mtnt.model.js",
"description": "Lexical model nrc.en.mtnt.model.js",
"copyLocation": "0",
"fileType": ".model.js"
}
],
"lexicalModels": [
{
"name": "English dictionary (MTNT)",
"id": "nrc.en.mtnt",
"languages": [
{
"name": "English",
"id": "en"
},
{
"name": "English (US)",
"id": "en-us"
},
{
"name": "English (Canada)",
"id": "en-ca"
}
]
}
]
}
```
This one is from https://downloads.keyman.com/models/nrc.str.sencoten/1.0.5/nrc.str.sencoten.model.kmp
```json
{
"system": {
"keymanDeveloperVersion": "12.0.1500.0",
"fileVersion": "12.0"
},
"options": {
"followKeyboardVersion": false
},
"info": {
"author": {
"description": "Eddie Antonio Santos",
"url": "mailto:Eddie.Santos@nrc-cnrc.gc.ca"
},
"copyright": {
"description": "© 2019 Timothy Montler and the W̱SÁNEĆ School Board"
},
"name": {
"description": "SENĆOŦEN (Saanich Dialect) Lexical Model"
},
"version": {
"description": "1.0.5"
}
},
"files": [
{
"name": "nrc.str.sencoten.model.js",
"description": "Lexical model nrc.str.sencoten.model.js",
"copyLocation": "0",
"fileType": ".model.js"
}
],
"lexicalModels": [
{
"name": "SENĆOŦEN dictionary",
"id": "nrc.str.sencoten",
"languages": [
{
"name": "North Straits Salish",
"id": "str"
},
{
"name": "SENĆOŦEN",
"id": "str-Latn"
}
]
}
]
}
```
---
**NOTE**: this does _not_ invalidate #72 or #51! Those are still useful. The `.model_info` files are still used for **public** distribution of Keyman lexical models. I mistakenly assumed that the `kmp.json` and `.model_info` files were the same 😩
|
1.0
|
Update to actual kmp.json format 😫 - I was wrong about the format of a `kmp.json` file. It actually follows this spec:
Documentation: https://help.keyman.com/developer/11.0/reference/file-types/metadata
(Outdated) Documentation: https://github.com/keymanapp/keyman/wiki/KMP-Metadata-File-(kmp.inf-and-kmp.json)#fields
JSON Schema: https://api.keyman.com/schemas/package.json
Here are some example `kmp.json` files to get you started:
This one is from https://downloads.keyman.com/models/nrc.en.mtnt/0.1.4/nrc.en.mtnt.model.kmp:
```json
{
"system": {
"keymanDeveloperVersion": "12.0.1500.0",
"fileVersion": "12.0"
},
"options": {
"followKeyboardVersion": false
},
"info": {
"author": {
"description": "Eddie Antonio Santos",
"url": "mailto:Eddie.Santos@nrc-cnrc.gc.ca"
},
"copyright": {
"description": "© 2019 National Research Council Canada"
},
"name": {
"description": "English language model mined from MTNT"
},
"version": {
"description": "0.1.4"
}
},
"files": [
{
"name": "nrc.en.mtnt.model.js",
"description": "Lexical model nrc.en.mtnt.model.js",
"copyLocation": "0",
"fileType": ".model.js"
}
],
"lexicalModels": [
{
"name": "English dictionary (MTNT)",
"id": "nrc.en.mtnt",
"languages": [
{
"name": "English",
"id": "en"
},
{
"name": "English (US)",
"id": "en-us"
},
{
"name": "English (Canada)",
"id": "en-ca"
}
]
}
]
}
```
This one is from https://downloads.keyman.com/models/nrc.str.sencoten/1.0.5/nrc.str.sencoten.model.kmp
```json
{
"system": {
"keymanDeveloperVersion": "12.0.1500.0",
"fileVersion": "12.0"
},
"options": {
"followKeyboardVersion": false
},
"info": {
"author": {
"description": "Eddie Antonio Santos",
"url": "mailto:Eddie.Santos@nrc-cnrc.gc.ca"
},
"copyright": {
"description": "© 2019 Timothy Montler and the W̱SÁNEĆ School Board"
},
"name": {
"description": "SENĆOŦEN (Saanich Dialect) Lexical Model"
},
"version": {
"description": "1.0.5"
}
},
"files": [
{
"name": "nrc.str.sencoten.model.js",
"description": "Lexical model nrc.str.sencoten.model.js",
"copyLocation": "0",
"fileType": ".model.js"
}
],
"lexicalModels": [
{
"name": "SENĆOŦEN dictionary",
"id": "nrc.str.sencoten",
"languages": [
{
"name": "North Straits Salish",
"id": "str"
},
{
"name": "SENĆOŦEN",
"id": "str-Latn"
}
]
}
]
}
```
---
**NOTE**: this does _not_ invalidate #72 or #51! Those are still useful. The `.model_info` files are still used for **public** distribution of Keyman lexical models. I mistakenly assumed that the `kmp.json` and `.model_info` files were the same 😩
|
process
|
update to actual kmp json format 😫 i was wrong about the format of a kmp json file it actually follows this spec documentation outdated documentation json schema here are some example kmp json files to get you started this one is from json system keymandeveloperversion fileversion options followkeyboardversion false info author description eddie antonio santos url mailto eddie santos nrc cnrc gc ca copyright description © national research council canada name description english language model mined from mtnt version description files name nrc en mtnt model js description lexical model nrc en mtnt model js copylocation filetype model js lexicalmodels name english dictionary mtnt id nrc en mtnt languages name english id en name english us id en us name english canada id en ca this one is from json system keymandeveloperversion fileversion options followkeyboardversion false info author description eddie antonio santos url mailto eddie santos nrc cnrc gc ca copyright description © timothy montler and the w̱sáneć school board name description senćoŧen saanich dialect lexical model version description files name nrc str sencoten model js description lexical model nrc str sencoten model js copylocation filetype model js lexicalmodels name senćoŧen dictionary id nrc str sencoten languages name north straits salish id str name senćoŧen id str latn note this does not invalidate or those are still useful the model info files are still used for public distribution of keyman lexical models i mistakenly assumed that the kmp json and model info files were the same 😩
| 1
|
6,491
| 9,559,660,260
|
IssuesEvent
|
2019-05-03 17:20:57
|
google/go-cloud
|
https://api.github.com/repos/google/go-cloud
|
opened
|
website: hugo seems to keep generating "changing" sitemap.xml files
|
process
|
See https://github.com/google/go-cloud/commit/332c1bde650b09ca2589871809e0ae6 -- we have many commits every day to gh-pages, which essentially reshuffle the `sitemap.xml` file order.
Robert suggests trying a newer Hugo version
Or we could open an issue for Hugo
|
1.0
|
website: hugo seems to keep generating "changing" sitemap.xml files - See https://github.com/google/go-cloud/commit/332c1bde650b09ca2589871809e0ae6 -- we have many commits every day to gh-pages, which essentially reshuffle the `sitemap.xml` file order.
Robert suggests trying a newer Hugo version
Or we could open an issue for Hugo
|
process
|
website hugo seems to keep generating changing sitemap xml files see we have many commits every day to gh pages which essentially reshuffle the sitemap xml file order robert suggests trying a newer hugo version or we could open an issue for hugo
| 1
|
507,765
| 14,680,170,608
|
IssuesEvent
|
2020-12-31 09:15:51
|
k8smeetup/website-tasks
|
https://api.github.com/repos/k8smeetup/website-tasks
|
opened
|
/docs/tutorials/kubernetes-basics/update/update-intro.html
|
lang/zh priority/P0 sync/update version/master welcome
|
Source File: [/docs/tutorials/kubernetes-basics/update/update-intro.html](https://github.com/kubernetes/website/blob/master/content/en/docs/tutorials/kubernetes-basics/update/update-intro.html)
Diff 命令参考:
```bash
# 查看原始文档与翻译文档更新差异
git diff --no-index -- content/en/docs/tutorials/kubernetes-basics/update/update-intro.html content/zh/docs/tutorials/kubernetes-basics/update/update-intro.html
# 跨分支持查看原始文档更新差异
git diff release-1.19 master -- content/en/docs/tutorials/kubernetes-basics/update/update-intro.html
```
|
1.0
|
/docs/tutorials/kubernetes-basics/update/update-intro.html - Source File: [/docs/tutorials/kubernetes-basics/update/update-intro.html](https://github.com/kubernetes/website/blob/master/content/en/docs/tutorials/kubernetes-basics/update/update-intro.html)
Diff 命令参考:
```bash
# 查看原始文档与翻译文档更新差异
git diff --no-index -- content/en/docs/tutorials/kubernetes-basics/update/update-intro.html content/zh/docs/tutorials/kubernetes-basics/update/update-intro.html
# 跨分支持查看原始文档更新差异
git diff release-1.19 master -- content/en/docs/tutorials/kubernetes-basics/update/update-intro.html
```
|
non_process
|
docs tutorials kubernetes basics update update intro html source file diff 命令参考 bash 查看原始文档与翻译文档更新差异 git diff no index content en docs tutorials kubernetes basics update update intro html content zh docs tutorials kubernetes basics update update intro html 跨分支持查看原始文档更新差异 git diff release master content en docs tutorials kubernetes basics update update intro html
| 0
|
326,461
| 9,956,575,885
|
IssuesEvent
|
2019-07-05 14:16:50
|
ReliefApplications/bms_front
|
https://api.github.com/repos/ReliefApplications/bms_front
|
closed
|
Field Officer User Needs Permission to View Distribution - High
|
High Priority Waiting for Review
|
Field Officer cannot open distributions to view and 'mark' as distributed under paper vouchers.
|
1.0
|
Field Officer User Needs Permission to View Distribution - High - Field Officer cannot open distributions to view and 'mark' as distributed under paper vouchers.
|
non_process
|
field officer user needs permission to view distribution high field officer cannot open distributions to view and mark as distributed under paper vouchers
| 0
|
20,592
| 27,257,005,789
|
IssuesEvent
|
2023-02-22 12:18:06
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
[processor/spanmetrics] Please group datapoints into metrics by dimensions
|
enhancement processor/spanmetrics needs triage
|
### Component(s)
processor/spanmetrics
### Is your feature request related to a problem? Please describe.
Currently there is no way to filter out specific spans from the metrics. Previously it was possible to do this with the filter processor, but after #16102 this is no longer possible (see: #16387)
### Describe the solution you'd like
Group datapoints by attributes into separate metrics, so datapoints in the same metric share the same attributes.
Eg:
before:
```
ResourceMetrics #0
Resource SchemaURL:
ScopeMetrics #0
ScopeMetrics SchemaURL:
InstrumentationScope spanmetricsprocessor
Metric #0
Descriptor:
-> Name: calls_total
-> Description:
-> Unit:
-> DataType: Sum
-> IsMonotonic: true
-> AggregationTemporality: Cumulative
NumberDataPoints #0
Data point attributes:
-> service.name: Str(dummy-service)
-> operation: Str(middleware - query)
-> span.kind: Str(SPAN_KIND_INTERNAL)
-> status.code: Str(STATUS_CODE_UNSET)
-> http.method: Str(UNKOWN)
StartTimestamp: 2022-11-21 09:37:33.646666564 +0000 UTC
Timestamp: 2022-11-21 09:38:23.476303012 +0000 UTC
Value: 6
NumberDataPoints #1
Data point attributes:
-> service.name: Str(dummy-service)
-> operation: Str(GET /healthz)
-> span.kind: Str(SPAN_KIND_SERVER)
-> status.code: Str(STATUS_CODE_UNSET)
-> net.host.port: Int(80)
-> http.method: Str(GET)
-> http.status_code: Int(200)
StartTimestamp: 2022-11-21 09:37:33.646666564 +0000 UTC
Timestamp: 2022-11-21 09:38:23.476303012 +0000 UTC
Value: 6
```
after:
```
ResourceMetrics #0
Resource SchemaURL:
ScopeMetrics #0
ScopeMetrics SchemaURL:
InstrumentationScope spanmetricsprocessor
Metric #0
Descriptor:
-> Name: calls_total
-> Description:
-> Unit:
-> DataType: Sum
-> IsMonotonic: true
-> AggregationTemporality: Cumulative
NumberDataPoints #0
Data point attributes:
-> service.name: Str(dummy-service)
-> operation: Str(middleware - query)
-> span.kind: Str(SPAN_KIND_INTERNAL)
-> status.code: Str(STATUS_CODE_UNSET)
-> http.method: Str(UNKOWN)
StartTimestamp: 2022-11-21 09:37:33.646666564 +0000 UTC
Timestamp: 2022-11-21 09:38:23.476303012 +0000 UTC
Value: 6
Metric #1
Descriptor:
-> Name: calls_total
-> Description:
-> Unit:
-> DataType: Sum
-> IsMonotonic: true
-> AggregationTemporality: Cumulative
NumberDataPoints #0
Data point attributes:
-> service.name: Str(dummy-service)
-> operation: Str(GET /healthz)
-> span.kind: Str(SPAN_KIND_SERVER)
-> status.code: Str(STATUS_CODE_UNSET)
-> net.host.port: Int(80)
-> http.method: Str(GET)
-> http.status_code: Int(200)
StartTimestamp: 2022-11-21 09:37:33.646666564 +0000 UTC
Timestamp: 2022-11-21 09:38:23.476303012 +0000 UTC
Value: 6
```
### Describe alternatives you've considered
see: #16387
### Additional context
_No response_
|
1.0
|
[processor/spanmetrics] Please group datapoints into metrics by dimensions - ### Component(s)
processor/spanmetrics
### Is your feature request related to a problem? Please describe.
Currently there is no way to filter out specific spans from the metrics. Previously it was possible to do this with the filter processor, but after #16102 this is no longer possible (see: #16387)
### Describe the solution you'd like
Group datapoints by attributes into separate metrics, so datapoints in the same metric share the same attributes.
Eg:
before:
```
ResourceMetrics #0
Resource SchemaURL:
ScopeMetrics #0
ScopeMetrics SchemaURL:
InstrumentationScope spanmetricsprocessor
Metric #0
Descriptor:
-> Name: calls_total
-> Description:
-> Unit:
-> DataType: Sum
-> IsMonotonic: true
-> AggregationTemporality: Cumulative
NumberDataPoints #0
Data point attributes:
-> service.name: Str(dummy-service)
-> operation: Str(middleware - query)
-> span.kind: Str(SPAN_KIND_INTERNAL)
-> status.code: Str(STATUS_CODE_UNSET)
-> http.method: Str(UNKOWN)
StartTimestamp: 2022-11-21 09:37:33.646666564 +0000 UTC
Timestamp: 2022-11-21 09:38:23.476303012 +0000 UTC
Value: 6
NumberDataPoints #1
Data point attributes:
-> service.name: Str(dummy-service)
-> operation: Str(GET /healthz)
-> span.kind: Str(SPAN_KIND_SERVER)
-> status.code: Str(STATUS_CODE_UNSET)
-> net.host.port: Int(80)
-> http.method: Str(GET)
-> http.status_code: Int(200)
StartTimestamp: 2022-11-21 09:37:33.646666564 +0000 UTC
Timestamp: 2022-11-21 09:38:23.476303012 +0000 UTC
Value: 6
```
after:
```
ResourceMetrics #0
Resource SchemaURL:
ScopeMetrics #0
ScopeMetrics SchemaURL:
InstrumentationScope spanmetricsprocessor
Metric #0
Descriptor:
-> Name: calls_total
-> Description:
-> Unit:
-> DataType: Sum
-> IsMonotonic: true
-> AggregationTemporality: Cumulative
NumberDataPoints #0
Data point attributes:
-> service.name: Str(dummy-service)
-> operation: Str(middleware - query)
-> span.kind: Str(SPAN_KIND_INTERNAL)
-> status.code: Str(STATUS_CODE_UNSET)
-> http.method: Str(UNKOWN)
StartTimestamp: 2022-11-21 09:37:33.646666564 +0000 UTC
Timestamp: 2022-11-21 09:38:23.476303012 +0000 UTC
Value: 6
Metric #1
Descriptor:
-> Name: calls_total
-> Description:
-> Unit:
-> DataType: Sum
-> IsMonotonic: true
-> AggregationTemporality: Cumulative
NumberDataPoints #0
Data point attributes:
-> service.name: Str(dummy-service)
-> operation: Str(GET /healthz)
-> span.kind: Str(SPAN_KIND_SERVER)
-> status.code: Str(STATUS_CODE_UNSET)
-> net.host.port: Int(80)
-> http.method: Str(GET)
-> http.status_code: Int(200)
StartTimestamp: 2022-11-21 09:37:33.646666564 +0000 UTC
Timestamp: 2022-11-21 09:38:23.476303012 +0000 UTC
Value: 6
```
### Describe alternatives you've considered
see: #16387
### Additional context
_No response_
|
process
|
please group datapoints into metrics by dimensions component s processor spanmetrics is your feature request related to a problem please describe currently there is no way to filter out specific spans from the metrics previously it was possible to do this with the filter processor but after this is no longer possible see describe the solution you d like group datapoints by attributes into separate metrics so datapoints in the same metric share the same attributes eg before resourcemetrics resource schemaurl scopemetrics scopemetrics schemaurl instrumentationscope spanmetricsprocessor metric descriptor name calls total description unit datatype sum ismonotonic true aggregationtemporality cumulative numberdatapoints data point attributes service name str dummy service operation str middleware query span kind str span kind internal status code str status code unset http method str unkown starttimestamp utc timestamp utc value numberdatapoints data point attributes service name str dummy service operation str get healthz span kind str span kind server status code str status code unset net host port int http method str get http status code int starttimestamp utc timestamp utc value after resourcemetrics resource schemaurl scopemetrics scopemetrics schemaurl instrumentationscope spanmetricsprocessor metric descriptor name calls total description unit datatype sum ismonotonic true aggregationtemporality cumulative numberdatapoints data point attributes service name str dummy service operation str middleware query span kind str span kind internal status code str status code unset http method str unkown starttimestamp utc timestamp utc value metric descriptor name calls total description unit datatype sum ismonotonic true aggregationtemporality cumulative numberdatapoints data point attributes service name str dummy service operation str get healthz span kind str span kind server status code str status code unset net host port int http method str get http status code int starttimestamp utc timestamp utc value describe alternatives you ve considered see additional context no response
| 1
|
421,093
| 28,308,687,630
|
IssuesEvent
|
2023-04-10 13:32:53
|
AY2223S2-CS2113-W15-2/tp
|
https://api.github.com/repos/AY2223S2-CS2113-W15-2/tp
|
closed
|
[PE-D][Tester C] Supported category list is hard to find
|
severity.Medium type.DocumentationBug consolidated
|
The supported category list can only be found near the bottom of the user guide. It makes it quite hard to use, as when using the app I need to navigate between this list and the documentation for the command I'm using frequently. For testing I have decided to just use `Food` for everything because I'm lazy, but a typical user would definitely want to label their expenses correctly.
Suggestion:
* Paste the list of supported categories to the output of `/help` and to the output whenever an invalid category is supplied.
* Remove the supported category section in the UG and paste the list to wherever a category flag is used in the UG.
<!--session: 1680272517401-1bfb3b3c-3e07-408e-9f74-bc076a147d38-->
<!--Version: Web v3.4.7-->
-------------
Labels: `severity.Medium` `type.DocumentationBug`
original: joulev/ped#3
|
1.0
|
[PE-D][Tester C] Supported category list is hard to find - The supported category list can only be found near the bottom of the user guide. It makes it quite hard to use, as when using the app I need to navigate between this list and the documentation for the command I'm using frequently. For testing I have decided to just use `Food` for everything because I'm lazy, but a typical user would definitely want to label their expenses correctly.
Suggestion:
* Paste the list of supported categories to the output of `/help` and to the output whenever an invalid category is supplied.
* Remove the supported category section in the UG and paste the list to wherever a category flag is used in the UG.
<!--session: 1680272517401-1bfb3b3c-3e07-408e-9f74-bc076a147d38-->
<!--Version: Web v3.4.7-->
-------------
Labels: `severity.Medium` `type.DocumentationBug`
original: joulev/ped#3
|
non_process
|
supported category list is hard to find the supported category list can only be found near the bottom of the user guide it makes it quite hard to use as when using the app i need to navigate between this list and the documentation for the command i m using frequently for testing i have decided to just use food for everything because i m lazy but a typical user would definitely want to label their expenses correctly suggestion paste the list of supported categories to the output of help and to the output whenever an invalid category is supplied remove the supported category section in the ug and paste the list to wherever a category flag is used in the ug labels severity medium type documentationbug original joulev ped
| 0
|
2,612
| 5,383,472,592
|
IssuesEvent
|
2017-02-24 06:54:20
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
node blocks when piping to standard output. Git for Bash on Windows 10, Cygwin too.
|
process windows
|
* **Version**:v7.6.0, v6.10.0
* **Platform**: MINGW64_NT-10.0 DESKTOP-DOPK2VD 2.6.1(0.306/5/3) 2017-01-20 15:23 x86_64 Msys
CYGWIN_NT-10.0 DESKTOP-DOPK2VD 2.6.1(0.305/5/3) 2016-12-16 11:55 x86_64 Cygwin
* **Subsystem**:standard input/standard output
$ cat fillbuffer.js
```js
console.log(0);
```
```bash
$ yes | xargs node fillbuffer.js|tee xxx
0
0
```
[ I expect many more 0's going down the page and filling up the file system in the xxx file, but unfortunately, it hangs ]
Note that I may have a virus. One is quarantined. Please let me know if you can duplicate. If not, I will look elsewhere. Note that:
```bash
$ yes| xargs -L 1 echo | tee xxx
y
y
y
y
.
.
.
[ keeps going ]
```
works so I think the issue is with node.js
Note that
```bash
$ yes | xargs -L 1 node fillbuffer.js|tee xxx
0
0
```
Hangs after 2 calls to node as well. That's what I've noticed, 2 calls and then it hangs.
|
1.0
|
node blocks when piping to standard output. Git for Bash on Windows 10, Cygwin too. - * **Version**:v7.6.0, v6.10.0
* **Platform**: MINGW64_NT-10.0 DESKTOP-DOPK2VD 2.6.1(0.306/5/3) 2017-01-20 15:23 x86_64 Msys
CYGWIN_NT-10.0 DESKTOP-DOPK2VD 2.6.1(0.305/5/3) 2016-12-16 11:55 x86_64 Cygwin
* **Subsystem**:standard input/standard output
$ cat fillbuffer.js
```js
console.log(0);
```
```bash
$ yes | xargs node fillbuffer.js|tee xxx
0
0
```
[ I expect many more 0's going down the page and filling up the file system in the xxx file, but unfortunately, it hangs ]
Note that I may have a virus. One is quarantined. Please let me know if you can duplicate. If not, I will look elsewhere. Note that:
```bash
$ yes| xargs -L 1 echo | tee xxx
y
y
y
y
.
.
.
[ keeps going ]
```
works so I think the issue is with node.js
Note that
```bash
$ yes | xargs -L 1 node fillbuffer.js|tee xxx
0
0
```
Hangs after 2 calls to node as well. That's what I've noticed, 2 calls and then it hangs.
|
process
|
node blocks when piping to standard output git for bash on windows cygwin too version platform nt desktop msys cygwin nt desktop cygwin subsystem standard input standard output cat fillbuffer js js console log bash yes xargs node fillbuffer js tee xxx note that i may have a virus one is quarantined please let me know if you can duplicate if not i will look elsewhere note that bash yes xargs l echo tee xxx y y y y works so i think the issue is with node js note that bash yes xargs l node fillbuffer js tee xxx hangs after calls to node as well that s what i ve noticed calls and then it hangs
| 1
|
517,068
| 14,994,217,042
|
IssuesEvent
|
2021-01-29 12:34:00
|
gazprom-neft/consta-uikit
|
https://api.github.com/repos/gazprom-neft/consta-uikit
|
closed
|
ContextMenu: исправить имя переменной на отступ
|
bug 🔥 priority
|
<img width="485" alt="изображение" src="https://user-images.githubusercontent.com/24884175/105043878-2303ed00-5a77-11eb-8318-64da905816f1.png">
Привет, нужно исправить название переменной `--bottom-spase` -> `--bottom-space`
|
1.0
|
ContextMenu: исправить имя переменной на отступ - <img width="485" alt="изображение" src="https://user-images.githubusercontent.com/24884175/105043878-2303ed00-5a77-11eb-8318-64da905816f1.png">
Привет, нужно исправить название переменной `--bottom-spase` -> `--bottom-space`
|
non_process
|
contextmenu исправить имя переменной на отступ img width alt изображение src привет нужно исправить название переменной bottom spase bottom space
| 0
|
28,073
| 22,837,721,594
|
IssuesEvent
|
2022-07-12 18:21:57
|
mitodl/ol-infrastructure
|
https://api.github.com/repos/mitodl/ol-infrastructure
|
closed
|
Dagster AMI build and envrionment deployment
|
Data Infrastructure
|
# User Story
- As a data platform owner I would like to be able to run dagster on our own infrastructure so <some kind of good reason here>
We are incorporating Dagster as a component of our data platform infrastructure to orchestrate our data movement activities.
# Description/Context
<!-- What needs to be done? What additional details are needed by the person who will do th work? -->
?
# Acceptance Criteria
- [ ] ?<!-- What are the concrete outcomes that need to happen for this to be "done"? -->
|
1.0
|
Dagster AMI build and envrionment deployment - # User Story
- As a data platform owner I would like to be able to run dagster on our own infrastructure so <some kind of good reason here>
We are incorporating Dagster as a component of our data platform infrastructure to orchestrate our data movement activities.
# Description/Context
<!-- What needs to be done? What additional details are needed by the person who will do th work? -->
?
# Acceptance Criteria
- [ ] ?<!-- What are the concrete outcomes that need to happen for this to be "done"? -->
|
non_process
|
dagster ami build and envrionment deployment user story as a data platform owner i would like to be able to run dagster on our own infrastructure so we are incorporating dagster as a component of our data platform infrastructure to orchestrate our data movement activities description context acceptance criteria
| 0
|
278,391
| 30,702,301,904
|
IssuesEvent
|
2023-07-27 01:18:48
|
nidhi7598/linux-4.1.15_CVE-2022-42896
|
https://api.github.com/repos/nidhi7598/linux-4.1.15_CVE-2022-42896
|
closed
|
CVE-2020-35499 (Medium) detected in linuxlinux-4.6 - autoclosed
|
Mend: dependency security vulnerability
|
## CVE-2020-35499 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-4.1.15_CVE-2022-42896/commit/faba4e6dccb5c94b584906681b473fe12da6a9dc">faba4e6dccb5c94b584906681b473fe12da6a9dc</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/bluetooth/sco.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/bluetooth/sco.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A NULL pointer dereference flaw in Linux kernel versions prior to 5.11 may be seen if sco_sock_getsockopt function in net/bluetooth/sco.c do not have a sanity check for a socket connection, when using BT_SNDMTU/BT_RCVMTU for SCO sockets. This could allow a local attacker with a special user privilege to crash the system (DOS) or leak kernel internal information.
Mend Note: After conducting further research, Mend has determined that all versions of Linux Kernel up to version v5.10.4 are vulnerable to CVE-2020-35499.
<p>Publish Date: 2021-02-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-35499>CVE-2020-35499</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2020-35499">https://www.linuxkernelcves.com/cves/CVE-2020-35499</a></p>
<p>Release Date: 2021-02-19</p>
<p>Fix Resolution: v5.10.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-35499 (Medium) detected in linuxlinux-4.6 - autoclosed - ## CVE-2020-35499 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-4.1.15_CVE-2022-42896/commit/faba4e6dccb5c94b584906681b473fe12da6a9dc">faba4e6dccb5c94b584906681b473fe12da6a9dc</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/bluetooth/sco.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/bluetooth/sco.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A NULL pointer dereference flaw in Linux kernel versions prior to 5.11 may be seen if sco_sock_getsockopt function in net/bluetooth/sco.c do not have a sanity check for a socket connection, when using BT_SNDMTU/BT_RCVMTU for SCO sockets. This could allow a local attacker with a special user privilege to crash the system (DOS) or leak kernel internal information.
Mend Note: After conducting further research, Mend has determined that all versions of Linux Kernel up to version v5.10.4 are vulnerable to CVE-2020-35499.
<p>Publish Date: 2021-02-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-35499>CVE-2020-35499</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2020-35499">https://www.linuxkernelcves.com/cves/CVE-2020-35499</a></p>
<p>Release Date: 2021-02-19</p>
<p>Fix Resolution: v5.10.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linuxlinux autoclosed cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files net bluetooth sco c net bluetooth sco c vulnerability details a null pointer dereference flaw in linux kernel versions prior to may be seen if sco sock getsockopt function in net bluetooth sco c do not have a sanity check for a socket connection when using bt sndmtu bt rcvmtu for sco sockets this could allow a local attacker with a special user privilege to crash the system dos or leak kernel internal information mend note after conducting further research mend has determined that all versions of linux kernel up to version are vulnerable to cve publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
680,915
| 23,289,982,418
|
IssuesEvent
|
2022-08-05 21:18:06
|
FAForever/fa
|
https://api.github.com/repos/FAForever/fa
|
closed
|
Nuke and SMD pgen adjacency don't work if you build the pgens after
|
area: sim priority: medium
|
Building pgens adjacent to an SMD/nuke should reduce its power consumption but doesn't. Moon and me noticed it in this game: 16042407 and I was able to easily reproduce it in this sandbox: 16044635.
Strangely, if you build an SMD/nuke next to already existing pgens, the adjacency bonus does work correctly, but if you build the SMD/nuke first and then the adjacent pgens after, it doesn't. See the sandbox here: 16044694
My very quick tests of some other power adjacencies weren't able to find a problem anywhere else. Most notably (air-)factories and omnis aren't affected by whatever weirdness is going on here.
|
1.0
|
Nuke and SMD pgen adjacency don't work if you build the pgens after - Building pgens adjacent to an SMD/nuke should reduce its power consumption but doesn't. Moon and me noticed it in this game: 16042407 and I was able to easily reproduce it in this sandbox: 16044635.
Strangely, if you build an SMD/nuke next to already existing pgens, the adjacency bonus does work correctly, but if you build the SMD/nuke first and then the adjacent pgens after, it doesn't. See the sandbox here: 16044694
My very quick tests of some other power adjacencies weren't able to find a problem anywhere else. Most notably (air-)factories and omnis aren't affected by whatever weirdness is going on here.
|
non_process
|
nuke and smd pgen adjacency don t work if you build the pgens after building pgens adjacent to an smd nuke should reduce its power consumption but doesn t moon and me noticed it in this game and i was able to easily reproduce it in this sandbox strangely if you build an smd nuke next to already existing pgens the adjacency bonus does work correctly but if you build the smd nuke first and then the adjacent pgens after it doesn t see the sandbox here my very quick tests of some other power adjacencies weren t able to find a problem anywhere else most notably air factories and omnis aren t affected by whatever weirdness is going on here
| 0
|
7,348
| 10,482,866,327
|
IssuesEvent
|
2019-09-24 13:02:27
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
JPEG image resolution
|
enhancement preprocess stale
|
## Expected Behavior
I'm developing DITA to WordprocessingML plug-in. To convert `<image>` to the `<w:drawing>` the image resolution information is indispensable for JPEG image.
## Actual Behavior
[https://github.com/dita-ot/dita-ot/blob/develop/src/main/java/org/dita/dost/writer/ImageMetadataFilter.java](
) currently support only PNG image file resolution. It does not return the JPEG resolution.
As a result the plug-in cannot get correct image size, moreover I cannot implement image/@scale attribute.
<image href="image/tys125f.jpg" id="image_tdk_5tg_dbb" scale="20" placement="inline" class="- topic/image " xtrf="file:/D:/SVN/pdf5/testdata-ooxml/20170907-image/cImageTest.dita" xtrc="image:4;20:74" dita-ot:image-width="1200" dita-ot:image-height="1600"/>
## Possible Solution
Add the code to get the resolution of JPEG image.
## Steps to Reproduce
1. Make a topic that contains image element that contains the href attribute to the JPEG image file.
2. Execute Jarno's plugin com.elovirta.ooxml
3. Confirm that the topic file in temp directory does not contain dita-ot:horizontal-dpi and dita-ot:vertical-dpi attribute. for JPEG image
## Environment
* DITA-OT version: 2.5.1
* Operating system and version: Windows 10 64bit
* How did you run DITA-OT? From oXygen or command-line
* Transformation type: com.elovirta.ooxml
|
1.0
|
JPEG image resolution - ## Expected Behavior
I'm developing DITA to WordprocessingML plug-in. To convert `<image>` to the `<w:drawing>` the image resolution information is indispensable for JPEG image.
## Actual Behavior
[https://github.com/dita-ot/dita-ot/blob/develop/src/main/java/org/dita/dost/writer/ImageMetadataFilter.java](
) currently support only PNG image file resolution. It does not return the JPEG resolution.
As a result the plug-in cannot get correct image size, moreover I cannot implement image/@scale attribute.
<image href="image/tys125f.jpg" id="image_tdk_5tg_dbb" scale="20" placement="inline" class="- topic/image " xtrf="file:/D:/SVN/pdf5/testdata-ooxml/20170907-image/cImageTest.dita" xtrc="image:4;20:74" dita-ot:image-width="1200" dita-ot:image-height="1600"/>
## Possible Solution
Add the code to get the resolution of JPEG image.
## Steps to Reproduce
1. Make a topic that contains image element that contains the href attribute to the JPEG image file.
2. Execute Jarno's plugin com.elovirta.ooxml
3. Confirm that the topic file in temp directory does not contain dita-ot:horizontal-dpi and dita-ot:vertical-dpi attribute. for JPEG image
## Environment
* DITA-OT version: 2.5.1
* Operating system and version: Windows 10 64bit
* How did you run DITA-OT? From oXygen or command-line
* Transformation type: com.elovirta.ooxml
|
process
|
jpeg image resolution expected behavior i m developing dita to wordprocessingml plug in to convert to the the image resolution information is indispensable for jpeg image actual behavior currently support only png image file resolution it does not return the jpeg resolution as a result the plug in cannot get correct image size moreover i cannot implement image scale attribute possible solution add the code to get the resolution of jpeg image steps to reproduce make a topic that contains image element that contains the href attribute to the jpeg image file execute jarno s plugin com elovirta ooxml confirm that the topic file in temp directory does not contain dita ot horizontal dpi and dita ot vertical dpi attribute for jpeg image environment dita ot version operating system and version windows how did you run dita ot from oxygen or command line transformation type com elovirta ooxml
| 1
|
1,237
| 3,777,008,095
|
IssuesEvent
|
2016-03-17 18:29:45
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
opened
|
Unbundle Android tools
|
Mobile P1 Process Release / binary
|
We're about to add another tool for Android builds to do sharded dex compilation, and that depends on having an Android SDK available. If we put it into @bazel_tools, then we'd implicitly require every user of Bazel to also have an Android SDK, whether they work with Android or not.
After talking to @ahumesky and others, it seems to me that the right solution is to unbundle the Android tools - all of them. This requires that all devs who work with Android now also need to depend on the android tools repository (@android_tools?), but that's better than the alternative.
I think we have to do this before we can declare a 1.0, and if we want to ship further improvements to Android builds, sooner rather than later.
|
1.0
|
Unbundle Android tools - We're about to add another tool for Android builds to do sharded dex compilation, and that depends on having an Android SDK available. If we put it into @bazel_tools, then we'd implicitly require every user of Bazel to also have an Android SDK, whether they work with Android or not.
After talking to @ahumesky and others, it seems to me that the right solution is to unbundle the Android tools - all of them. This requires that all devs who work with Android now also need to depend on the android tools repository (@android_tools?), but that's better than the alternative.
I think we have to do this before we can declare a 1.0, and if we want to ship further improvements to Android builds, sooner rather than later.
|
process
|
unbundle android tools we re about to add another tool for android builds to do sharded dex compilation and that depends on having an android sdk available if we put it into bazel tools then we d implicitly require every user of bazel to also have an android sdk whether they work with android or not after talking to ahumesky and others it seems to me that the right solution is to unbundle the android tools all of them this requires that all devs who work with android now also need to depend on the android tools repository android tools but that s better than the alternative i think we have to do this before we can declare a and if we want to ship further improvements to android builds sooner rather than later
| 1
|
261,148
| 27,785,436,290
|
IssuesEvent
|
2023-03-17 02:29:46
|
detain/watchable
|
https://api.github.com/repos/detain/watchable
|
closed
|
vuetify-2.6.4.tgz: 1 vulnerabilities (highest severity is: 5.4) - autoclosed
|
Mend: dependency security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>vuetify-2.6.4.tgz</b></p></summary>
<p>Vue Material Component Framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/vuetify/-/vuetify-2.6.4.tgz">https://registry.npmjs.org/vuetify/-/vuetify-2.6.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/vuetify/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/detain/watchable/commit/aa67e5c8feb26ac6176ddfd4899d3ecd6eb82bb3">aa67e5c8feb26ac6176ddfd4899d3ecd6eb82bb3</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (vuetify version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-25873](https://www.mend.io/vulnerability-database/CVE-2022-25873) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.4 | vuetify-2.6.4.tgz | Direct | 2.6.10 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-25873</summary>
### Vulnerable Library - <b>vuetify-2.6.4.tgz</b></p>
<p>Vue Material Component Framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/vuetify/-/vuetify-2.6.4.tgz">https://registry.npmjs.org/vuetify/-/vuetify-2.6.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/vuetify/package.json</p>
<p>
Dependency Hierarchy:
- :x: **vuetify-2.6.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/detain/watchable/commit/aa67e5c8feb26ac6176ddfd4899d3ecd6eb82bb3">aa67e5c8feb26ac6176ddfd4899d3ecd6eb82bb3</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package vuetify from 2.0.0-beta.4 and before 2.6.10 are vulnerable to Cross-site Scripting (XSS) due to improper input sanitization in the 'eventName' function within the VCalendar component.
<p>Publish Date: 2022-09-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25873>CVE-2022-25873</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2022-25873">https://nvd.nist.gov/vuln/detail/CVE-2022-25873</a></p>
<p>Release Date: 2022-09-18</p>
<p>Fix Resolution: 2.6.10</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
True
|
vuetify-2.6.4.tgz: 1 vulnerabilities (highest severity is: 5.4) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>vuetify-2.6.4.tgz</b></p></summary>
<p>Vue Material Component Framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/vuetify/-/vuetify-2.6.4.tgz">https://registry.npmjs.org/vuetify/-/vuetify-2.6.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/vuetify/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/detain/watchable/commit/aa67e5c8feb26ac6176ddfd4899d3ecd6eb82bb3">aa67e5c8feb26ac6176ddfd4899d3ecd6eb82bb3</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (vuetify version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-25873](https://www.mend.io/vulnerability-database/CVE-2022-25873) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.4 | vuetify-2.6.4.tgz | Direct | 2.6.10 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-25873</summary>
### Vulnerable Library - <b>vuetify-2.6.4.tgz</b></p>
<p>Vue Material Component Framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/vuetify/-/vuetify-2.6.4.tgz">https://registry.npmjs.org/vuetify/-/vuetify-2.6.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/vuetify/package.json</p>
<p>
Dependency Hierarchy:
- :x: **vuetify-2.6.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/detain/watchable/commit/aa67e5c8feb26ac6176ddfd4899d3ecd6eb82bb3">aa67e5c8feb26ac6176ddfd4899d3ecd6eb82bb3</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The package vuetify from 2.0.0-beta.4 and before 2.6.10 are vulnerable to Cross-site Scripting (XSS) due to improper input sanitization in the 'eventName' function within the VCalendar component.
<p>Publish Date: 2022-09-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25873>CVE-2022-25873</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2022-25873">https://nvd.nist.gov/vuln/detail/CVE-2022-25873</a></p>
<p>Release Date: 2022-09-18</p>
<p>Fix Resolution: 2.6.10</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details>
|
non_process
|
vuetify tgz vulnerabilities highest severity is autoclosed vulnerable library vuetify tgz vue material component framework library home page a href path to dependency file package json path to vulnerable library node modules vuetify package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in vuetify version remediation available medium vuetify tgz direct details cve vulnerable library vuetify tgz vue material component framework library home page a href path to dependency file package json path to vulnerable library node modules vuetify package json dependency hierarchy x vuetify tgz vulnerable library found in head commit a href found in base branch main vulnerability details the package vuetify from beta and before are vulnerable to cross site scripting xss due to improper input sanitization in the eventname function within the vcalendar component publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
2,781
| 5,714,037,026
|
IssuesEvent
|
2017-04-19 09:22:14
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
process: flaky behavior of 'exit' event handler
|
confirmed-bug process
|
* **Version**: '8.0.0-rc.0'
* **Platform**: Windows 7 x64
* **Subsystem**: process
`process.md` [states](https://github.com/nodejs/node/blob/master/doc/api/process.md#event-exit):
> Listener functions **must** only perform **synchronous** operations. The Node.js
> process will exit immediately after calling the `'exit'` event listeners
> causing any additional work still queued in the event loop to be abandoned.
> In the following example, for instance, the timeout will never occur:
>
> ```js
> process.on('exit', (code) => {
> setTimeout(() => {
> console.log('This will not run');
> }, 0);
> });
> ```
However, the behavior of the code is flaky:
```
> test.js
> test.js
This will not run
> test.js
This will not run
> test.js
> test.js
> test.js
> test.js
This will not run
...
```
If this flakiness is not a bug, what would be better?
1. Increase the timeout up to `1000` or something and save the categorical 'the timeout will never occur'.
2. State flakiness. If so, what would be preferable wordings for the description and the logged string?
|
1.0
|
process: flaky behavior of 'exit' event handler - * **Version**: '8.0.0-rc.0'
* **Platform**: Windows 7 x64
* **Subsystem**: process
`process.md` [states](https://github.com/nodejs/node/blob/master/doc/api/process.md#event-exit):
> Listener functions **must** only perform **synchronous** operations. The Node.js
> process will exit immediately after calling the `'exit'` event listeners
> causing any additional work still queued in the event loop to be abandoned.
> In the following example, for instance, the timeout will never occur:
>
> ```js
> process.on('exit', (code) => {
> setTimeout(() => {
> console.log('This will not run');
> }, 0);
> });
> ```
However, the behavior of the code is flaky:
```
> test.js
> test.js
This will not run
> test.js
This will not run
> test.js
> test.js
> test.js
> test.js
This will not run
...
```
If this flakiness is not a bug, what would be better?
1. Increase the timeout up to `1000` or something and save the categorical 'the timeout will never occur'.
2. State flakiness. If so, what would be preferable wordings for the description and the logged string?
|
process
|
process flaky behavior of exit event handler version rc platform windows subsystem process process md listener functions must only perform synchronous operations the node js process will exit immediately after calling the exit event listeners causing any additional work still queued in the event loop to be abandoned in the following example for instance the timeout will never occur js process on exit code settimeout console log this will not run however the behavior of the code is flaky test js test js this will not run test js this will not run test js test js test js test js this will not run if this flakiness is not a bug what would be better increase the timeout up to or something and save the categorical the timeout will never occur state flakiness if so what would be preferable wordings for the description and the logged string
| 1
|
10,951
| 13,756,693,705
|
IssuesEvent
|
2020-10-06 20:21:14
|
kubernetes/minikube
|
https://api.github.com/repos/kubernetes/minikube
|
closed
|
Create a Make target which outputs HTML Test reports on Windows
|
kind/process os/windows priority/important-longterm
|
We need to have a make target which converts the test reports to HTML and works on Windows.
/cc: @medyagh
|
1.0
|
Create a Make target which outputs HTML Test reports on Windows - We need to have a make target which converts the test reports to HTML and works on Windows.
/cc: @medyagh
|
process
|
create a make target which outputs html test reports on windows we need to have a make target which converts the test reports to html and works on windows cc medyagh
| 1
|
107,179
| 4,290,713,729
|
IssuesEvent
|
2016-07-18 10:56:53
|
ankidroid/Anki-Android
|
https://api.github.com/repos/ankidroid/Anki-Android
|
closed
|
Pull to sync/refresh
|
enhancement Priority-Low
|
Request to add pull to sync option.
Will really help with one hand use, right now have to reach top right corner, sync is not even part of setting menu.
Which is also available by long press on show open app button.
|
1.0
|
Pull to sync/refresh - Request to add pull to sync option.
Will really help with one hand use, right now have to reach top right corner, sync is not even part of setting menu.
Which is also available by long press on show open app button.
|
non_process
|
pull to sync refresh request to add pull to sync option will really help with one hand use right now have to reach top right corner sync is not even part of setting menu which is also available by long press on show open app button
| 0
|
9,063
| 12,136,898,785
|
IssuesEvent
|
2020-04-23 15:00:03
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Error in extends example
|
devops-cicd-process/tech devops/prod doc-bug
|
There is an error in the sample at #extend-from-a-template.
As written the pipeline will not fail as intended. The example below will cause it to fail as intended. Credit to Merlin Lang MSFT for his stackoverflow answer.
https://stackoverflow.com/questions/60729411/extend-yaml-pipeline-example-validate-step
- ${{ each step in parameters.buildSteps }}:
- ${{ each pair in step }}:
${{ if ne(pair.value, 'CmdLine@2') }}:
${{ pair.key }}: ${{ pair.value }}
${{ if eq(pair.value, 'CmdLine@2') }}: # checks for buildStep with script
'Rejecting Script: ${{ pair.value }}': error # rejects buildStep when script is found
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 6724abea-bbdc-bf66-ed5e-3214fa6c3e66
* Version Independent ID: 4f8dab21-3f0e-da32-cc0e-1d85c13c0065
* Content: [Templates - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#extend-from-a-template)
* Content Source: [docs/pipelines/process/templates.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/templates.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Error in extends example - There is an error in the sample at #extend-from-a-template.
As written the pipeline will not fail as intended. The example below will cause it to fail as intended. Credit to Merlin Lang MSFT for his stackoverflow answer.
https://stackoverflow.com/questions/60729411/extend-yaml-pipeline-example-validate-step
- ${{ each step in parameters.buildSteps }}:
- ${{ each pair in step }}:
${{ if ne(pair.value, 'CmdLine@2') }}:
${{ pair.key }}: ${{ pair.value }}
${{ if eq(pair.value, 'CmdLine@2') }}: # checks for buildStep with script
'Rejecting Script: ${{ pair.value }}': error # rejects buildStep when script is found
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 6724abea-bbdc-bf66-ed5e-3214fa6c3e66
* Version Independent ID: 4f8dab21-3f0e-da32-cc0e-1d85c13c0065
* Content: [Templates - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#extend-from-a-template)
* Content Source: [docs/pipelines/process/templates.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/templates.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
error in extends example there is an error in the sample at extend from a template as written the pipeline will not fail as intended the example below will cause it to fail as intended credit to merlin lang msft for his stackoverflow answer each step in parameters buildsteps each pair in step if ne pair value cmdline pair key pair value if eq pair value cmdline checks for buildstep with script rejecting script pair value error rejects buildstep when script is found document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id bbdc version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
217,849
| 16,740,789,903
|
IssuesEvent
|
2021-06-11 09:31:07
|
DavidT3/XGA
|
https://api.github.com/repos/DavidT3/XGA
|
closed
|
Let methods that use tabulate take an argument to specify the table format
|
bug documentation enhancement
|
For the 'allowed_models()' method of BaseProfile1D, and the 'allowed_prior_types'/'info' methods of BaseModel1D.
This is triggered by the fact that the one I chose ('fancy_grid') contains a unicode character that doesn't work with the LaTeX package used to build the PDF version of the documentation.
I like the fancy grid table, but for the tutorial notebook I should be able to pass a simpler one.
|
1.0
|
Let methods that use tabulate take an argument to specify the table format - For the 'allowed_models()' method of BaseProfile1D, and the 'allowed_prior_types'/'info' methods of BaseModel1D.
This is triggered by the fact that the one I chose ('fancy_grid') contains a unicode character that doesn't work with the LaTeX package used to build the PDF version of the documentation.
I like the fancy grid table, but for the tutorial notebook I should be able to pass a simpler one.
|
non_process
|
let methods that use tabulate take an argument to specify the table format for the allowed models method of and the allowed prior types info methods of this is triggered by the fact that the one i chose fancy grid contains a unicode character that doesn t work with the latex package used to build the pdf version of the documentation i like the fancy grid table but for the tutorial notebook i should be able to pass a simpler one
| 0
|
159,246
| 12,467,842,754
|
IssuesEvent
|
2020-05-28 17:46:58
|
LIBCAS/INDIHU-Mind
|
https://api.github.com/repos/LIBCAS/INDIHU-Mind
|
reopened
|
Štítky by měly být na kartách vidět už v náhledech
|
To test
|
Jinak nemají štítky moc smysl, natož pak barvy jim přiřazované.
|
1.0
|
Štítky by měly být na kartách vidět už v náhledech - Jinak nemají štítky moc smysl, natož pak barvy jim přiřazované.
|
non_process
|
štítky by měly být na kartách vidět už v náhledech jinak nemají štítky moc smysl natož pak barvy jim přiřazované
| 0
|
15,729
| 19,902,829,392
|
IssuesEvent
|
2022-01-25 09:45:35
|
plazi/community
|
https://api.github.com/repos/plazi/community
|
opened
|
to be processed 10.1016/j.isci.2021.103579 and 10.1126/sciadv.aav3875
|
process request
|
MIght it be possible to process this article that has not treatments, but should go to BLR https://www.cell.com/action/showPdf?pii=S2589-0042%2821%2901549-2 (just remove the first page?!), and the original description in https://www.science.org/doi/reader/10.1126/sciadv.aav3875 low level
tx
|
1.0
|
to be processed 10.1016/j.isci.2021.103579 and 10.1126/sciadv.aav3875 - MIght it be possible to process this article that has not treatments, but should go to BLR https://www.cell.com/action/showPdf?pii=S2589-0042%2821%2901549-2 (just remove the first page?!), and the original description in https://www.science.org/doi/reader/10.1126/sciadv.aav3875 low level
tx
|
process
|
to be processed j isci and sciadv might it be possible to process this article that has not treatments but should go to blr just remove the first page and the original description in low level tx
| 1
|
12,019
| 14,738,453,865
|
IssuesEvent
|
2021-01-07 04:48:43
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Keener - clients not receiving fax of their invoices
|
anc-external anc-ops anc-process anp-0.5 ant-bug ant-support
|
In GitLab by @kdjstudios on May 31, 2018, 15:16
**Submitted by:** Gaylan Garrett <gaylan@keenercom.net>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-05-31-68327
**Server:** External
**Client/Site:** Keener
**Account:** 6635 & 6628
**Issue:**
This is the email that I received that said the invoices were faxed.
Clients 6635 and 6628 did not receive the fax and apparently have not received them in some time.
> From: billing@keenercom.net <billing@keenercom.net>
> Sent: Thursday, May 24, 2018 2:38 AM
> To: Billing Department
> Subject: Billing Cycle: 05/22/2018 Master Faxed 4 Invoices.
> KNRVA-15049 for (KNRVA-6635) Young Seal Coating faxed to 434-239-3355 KNRVA-15042 for (KNRVA-6607) West Jefferson Family Practice faxed to 614-879-7067 KNRVA-15015 for (KNRVA-6628) Sturges Clinic, Inc faxed to 419-526-1107 KNRVA-14973 for (KNRVA-7430) Paseo Camarillo Prof Center faxed to 805-388-7666
|
1.0
|
Keener - clients not receiving fax of their invoices - In GitLab by @kdjstudios on May 31, 2018, 15:16
**Submitted by:** Gaylan Garrett <gaylan@keenercom.net>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-05-31-68327
**Server:** External
**Client/Site:** Keener
**Account:** 6635 & 6628
**Issue:**
This is the email that I received that said the invoices were faxed.
Clients 6635 and 6628 did not receive the fax and apparently have not received them in some time.
> From: billing@keenercom.net <billing@keenercom.net>
> Sent: Thursday, May 24, 2018 2:38 AM
> To: Billing Department
> Subject: Billing Cycle: 05/22/2018 Master Faxed 4 Invoices.
> KNRVA-15049 for (KNRVA-6635) Young Seal Coating faxed to 434-239-3355 KNRVA-15042 for (KNRVA-6607) West Jefferson Family Practice faxed to 614-879-7067 KNRVA-15015 for (KNRVA-6628) Sturges Clinic, Inc faxed to 419-526-1107 KNRVA-14973 for (KNRVA-7430) Paseo Camarillo Prof Center faxed to 805-388-7666
|
process
|
keener clients not receiving fax of their invoices in gitlab by kdjstudios on may submitted by gaylan garrett helpdesk server external client site keener account issue this is the email that i received that said the invoices were faxed clients and did not receive the fax and apparently have not received them in some time from billing keenercom net sent thursday may am to billing department subject billing cycle master faxed invoices knrva for knrva young seal coating faxed to knrva for knrva west jefferson family practice faxed to knrva for knrva sturges clinic inc faxed to knrva for knrva paseo camarillo prof center faxed to
| 1
|
174,646
| 6,542,033,998
|
IssuesEvent
|
2017-09-01 23:51:12
|
bcgov/api-specs
|
https://api.github.com/repos/bcgov/api-specs
|
reopened
|
Add support for advanced batch route optimization problems
|
enhancement high priority ROUTE PLANNER
|
Add a vehicle-count parameter to allow dividing up M stops between vehicle-count vehicles while minimizing total distance or time.
|
1.0
|
Add support for advanced batch route optimization problems - Add a vehicle-count parameter to allow dividing up M stops between vehicle-count vehicles while minimizing total distance or time.
|
non_process
|
add support for advanced batch route optimization problems add a vehicle count parameter to allow dividing up m stops between vehicle count vehicles while minimizing total distance or time
| 0
|
13,588
| 16,162,948,923
|
IssuesEvent
|
2021-05-01 01:26:33
|
tdwg/chrono
|
https://api.github.com/repos/tdwg/chrono
|
closed
|
Nature of the object of a ChronometricAge
|
Process - prepare for Executive review Question - answered
|
In the definition of chrono:ChronometricAge "The age of a specimen or related materials that is generated from a dating assay.", the word "related" worries me a lot, particularly when combined with some of the example values "stratigraphically pre-1104" "Double Tuff". This implies that related means stratigraphically correlated, which extends to pretty much any fossil placed within any chronostratigraphic unit. This definition allows the term to be used with any dwc:geologicalContext to provide a date range for that geological context, rather than reporting on dates directly derived from samples, which I understand to be the intent of this proposal. I'd be much happier with either (1) the term being restricted to application to a material sample, with the removal of "related material", or (2) the explicit expansion of the vocabulary to include correlation, with a term that allows the assertion of whether a chronometricAge is derived directly from the material sample, is derived from a material sample in the same continuous section/exposure/core, or whether the age applies by stratigraphic correlation (probably with discussion of how to handle specimens identified to taxa that define biostratigraphic zones), or (3) an explicit statement that related applies only within a single continuous section/exposure/core, and a term that distinguishes between dates obtained from the material sample itself, or from another material sample.
_Originally posted by @chicoreus in https://github.com/tdwg/chrono/issues/15#issuecomment-732234400_
|
1.0
|
Nature of the object of a ChronometricAge - In the definition of chrono:ChronometricAge "The age of a specimen or related materials that is generated from a dating assay.", the word "related" worries me a lot, particularly when combined with some of the example values "stratigraphically pre-1104" "Double Tuff". This implies that related means stratigraphically correlated, which extends to pretty much any fossil placed within any chronostratigraphic unit. This definition allows the term to be used with any dwc:geologicalContext to provide a date range for that geological context, rather than reporting on dates directly derived from samples, which I understand to be the intent of this proposal. I'd be much happier with either (1) the term being restricted to application to a material sample, with the removal of "related material", or (2) the explicit expansion of the vocabulary to include correlation, with a term that allows the assertion of whether a chronometricAge is derived directly from the material sample, is derived from a material sample in the same continuous section/exposure/core, or whether the age applies by stratigraphic correlation (probably with discussion of how to handle specimens identified to taxa that define biostratigraphic zones), or (3) an explicit statement that related applies only within a single continuous section/exposure/core, and a term that distinguishes between dates obtained from the material sample itself, or from another material sample.
_Originally posted by @chicoreus in https://github.com/tdwg/chrono/issues/15#issuecomment-732234400_
|
process
|
nature of the object of a chronometricage in the definition of chrono chronometricage the age of a specimen or related materials that is generated from a dating assay the word related worries me a lot particularly when combined with some of the example values stratigraphically pre double tuff this implies that related means stratigraphically correlated which extends to pretty much any fossil placed within any chronostratigraphic unit this definition allows the term to be used with any dwc geologicalcontext to provide a date range for that geological context rather than reporting on dates directly derived from samples which i understand to be the intent of this proposal i d be much happier with either the term being restricted to application to a material sample with the removal of related material or the explicit expansion of the vocabulary to include correlation with a term that allows the assertion of whether a chronometricage is derived directly from the material sample is derived from a material sample in the same continuous section exposure core or whether the age applies by stratigraphic correlation probably with discussion of how to handle specimens identified to taxa that define biostratigraphic zones or an explicit statement that related applies only within a single continuous section exposure core and a term that distinguishes between dates obtained from the material sample itself or from another material sample originally posted by chicoreus in
| 1
|
41,590
| 10,739,747,708
|
IssuesEvent
|
2019-10-29 16:54:01
|
trailofbits/osquery-extensions
|
https://api.github.com/repos/trailofbits/osquery-extensions
|
opened
|
build fails on MacOS
|
build osquery-extensions
|
Building on MacOS fails (even using the 4.0.1 porting branch) with the error `ld: library not found for -lboost_iostreams-mt` at the linking stage of building the extension binary.
|
1.0
|
build fails on MacOS - Building on MacOS fails (even using the 4.0.1 porting branch) with the error `ld: library not found for -lboost_iostreams-mt` at the linking stage of building the extension binary.
|
non_process
|
build fails on macos building on macos fails even using the porting branch with the error ld library not found for lboost iostreams mt at the linking stage of building the extension binary
| 0
|
8,736
| 11,866,324,207
|
IssuesEvent
|
2020-03-26 03:19:26
|
trynmaps/metrics-mvp
|
https://api.github.com/repos/trynmaps/metrics-mvp
|
closed
|
Create issues based on Tal notes
|
Process
|
Had a good call with Tal about how we can address SFMTA planners' needs and improve the Muni system. [Notes here](https://docs.google.com/document/d/11_lso4AdbNlNdeav4sy5QrDkz_7RGBKGHi9AtVm-YI4/edit). Tal also promised to send us a list of questions/feedback from people at SFMTA who he showed our tool to.
We should go through and add issues to Github to include features that will help address these needs.
(This isn't really a coding todo, but since it's important to building out the Github, I'll put it here. Let's see if that works.)
|
1.0
|
Create issues based on Tal notes - Had a good call with Tal about how we can address SFMTA planners' needs and improve the Muni system. [Notes here](https://docs.google.com/document/d/11_lso4AdbNlNdeav4sy5QrDkz_7RGBKGHi9AtVm-YI4/edit). Tal also promised to send us a list of questions/feedback from people at SFMTA who he showed our tool to.
We should go through and add issues to Github to include features that will help address these needs.
(This isn't really a coding todo, but since it's important to building out the Github, I'll put it here. Let's see if that works.)
|
process
|
create issues based on tal notes had a good call with tal about how we can address sfmta planners needs and improve the muni system tal also promised to send us a list of questions feedback from people at sfmta who he showed our tool to we should go through and add issues to github to include features that will help address these needs this isn t really a coding todo but since it s important to building out the github i ll put it here let s see if that works
| 1
|
123,716
| 4,868,602,550
|
IssuesEvent
|
2016-11-15 09:56:42
|
os-data/eu-structural-funds
|
https://api.github.com/repos/os-data/eu-structural-funds
|
closed
|
Modify the guidelines
|
specs top priority
|
Modify the guidelines to take into effect the changes in the specs and pipeline approach. Split up the current document in the wiki.
- [x] Sourcing
- [ ] Extracting
- [ ] Transforming
Also:
- [ ] Modify the diagram
|
1.0
|
Modify the guidelines - Modify the guidelines to take into effect the changes in the specs and pipeline approach. Split up the current document in the wiki.
- [x] Sourcing
- [ ] Extracting
- [ ] Transforming
Also:
- [ ] Modify the diagram
|
non_process
|
modify the guidelines modify the guidelines to take into effect the changes in the specs and pipeline approach split up the current document in the wiki sourcing extracting transforming also modify the diagram
| 0
|
20,947
| 27,807,467,938
|
IssuesEvent
|
2023-03-17 21:33:45
|
cse442-at-ub/project_s23-iweatherify
|
https://api.github.com/repos/cse442-at-ub/project_s23-iweatherify
|
closed
|
Add state and interactivity to the units page
|
Processing Task Sprint 2
|
**Task Tests**
*Test 1*
1. Go to the following URL: https://github.com/cse442-at-ub/project_s23-iweatherify/tree/dev
2. Click on the green `<> Code` button and download the ZIP file.

3. Unzip the downloaded file to a folder on your computer.
4. Open a terminal and navigate to the git repository folder using the `cd` command.
5. Run the `npm install` command in the terminal to install the necessary dependencies.
6. Run the `npm start` command in the terminal to start the application.
7. Check the output from the npm start command for the URL to access the application. The URL should be a localhost address (e.g., http://localhost:8080).
8. Navigate to http://localhost:8080
9. Ensure you have logged in to our app to see the page use UserID: `Replace` and Password:`Replace` to login
10. Go to URL: http://localhost:8080/#/website-units-page
11. Verify that the units page is displayed.
12. Verify that you can change the units to any avaliable options from the dropdown list


*Test 2*
1. Repeat steps 1 through 10 `Test 1`
2. Open the browser inspector tool and change the viewport to a mobile device
3. Repeat steps 11 and 12 from `Test 1`
|
1.0
|
Add state and interactivity to the units page - **Task Tests**
*Test 1*
1. Go to the following URL: https://github.com/cse442-at-ub/project_s23-iweatherify/tree/dev
2. Click on the green `<> Code` button and download the ZIP file.

3. Unzip the downloaded file to a folder on your computer.
4. Open a terminal and navigate to the git repository folder using the `cd` command.
5. Run the `npm install` command in the terminal to install the necessary dependencies.
6. Run the `npm start` command in the terminal to start the application.
7. Check the output from the npm start command for the URL to access the application. The URL should be a localhost address (e.g., http://localhost:8080).
8. Navigate to http://localhost:8080
9. Ensure you have logged in to our app to see the page use UserID: `Replace` and Password:`Replace` to login
10. Go to URL: http://localhost:8080/#/website-units-page
11. Verify that the units page is displayed.
12. Verify that you can change the units to any avaliable options from the dropdown list


*Test 2*
1. Repeat steps 1 through 10 `Test 1`
2. Open the browser inspector tool and change the viewport to a mobile device
3. Repeat steps 11 and 12 from `Test 1`
|
process
|
add state and interactivity to the units page task tests test go to the following url click on the green code button and download the zip file unzip the downloaded file to a folder on your computer open a terminal and navigate to the git repository folder using the cd command run the npm install command in the terminal to install the necessary dependencies run the npm start command in the terminal to start the application check the output from the npm start command for the url to access the application the url should be a localhost address e g navigate to ensure you have logged in to our app to see the page use userid replace and password replace to login go to url verify that the units page is displayed verify that you can change the units to any avaliable options from the dropdown list test repeat steps through test open the browser inspector tool and change the viewport to a mobile device repeat steps and from test
| 1
|
63,423
| 3,195,003,015
|
IssuesEvent
|
2015-09-30 14:49:55
|
SpaceAppsXploration/rdfendpoints
|
https://api.github.com/repos/SpaceAppsXploration/rdfendpoints
|
closed
|
Dump WebResource and Indexer into the Graph database
|
enhancement high priority
|
WebResource entities and Indexer entities has to be represented in a graph with triples like:
<webresource1> <chronos:relatedTo> <http://taxonomy.projectchronos.eu/concepts/c/keyword1>
<webresource2> <chronos:relatedTo> <http://taxonomy.projectchronos.eu/concepts/c/keyword1>
<webresource3> <chronos:relatedTo> <http://taxonomy.projectchronos.eu/concepts/c/keyword1>
Objects are JSON that can be easily flattened to JSON-LD with this mapping:
```
{
"label": "infrared telescopes",
"url": "http://taxonomy.projectchronos.eu/concepts/c/infrared+telescopes#concept",
"group": "keywords",
"ancestor": "http://taxonomy.projectchronos.eu/concepts/c/astronomy#concept"
}
```
label > rdf:label
url > @id
|
1.0
|
Dump WebResource and Indexer into the Graph database - WebResource entities and Indexer entities has to be represented in a graph with triples like:
<webresource1> <chronos:relatedTo> <http://taxonomy.projectchronos.eu/concepts/c/keyword1>
<webresource2> <chronos:relatedTo> <http://taxonomy.projectchronos.eu/concepts/c/keyword1>
<webresource3> <chronos:relatedTo> <http://taxonomy.projectchronos.eu/concepts/c/keyword1>
Objects are JSON that can be easily flattened to JSON-LD with this mapping:
```
{
"label": "infrared telescopes",
"url": "http://taxonomy.projectchronos.eu/concepts/c/infrared+telescopes#concept",
"group": "keywords",
"ancestor": "http://taxonomy.projectchronos.eu/concepts/c/astronomy#concept"
}
```
label > rdf:label
url > @id
|
non_process
|
dump webresource and indexer into the graph database webresource entities and indexer entities has to be represented in a graph with triples like objects are json that can be easily flattened to json ld with this mapping label infrared telescopes url group keywords ancestor label rdf label url id
| 0
|
8,029
| 11,209,669,818
|
IssuesEvent
|
2020-01-06 11:05:19
|
prisma/lift
|
https://api.github.com/repos/prisma/lift
|
closed
|
--auto-approve is unknown when lifting down
|
bug/2-confirmed kind/bug process/candidate
|
I cannot use `--auto-approve` on `prisma2 lift down` while the help says I should be able to do it.
```
$ prisma2 lift down --auto-approve
! Unknown or unexpected option: --auto-approve
Migrate your database down to a specific state.
Usage
prisma lift down [<dec|name|timestamp>]
Arguments
[<dec>] go down by an amount [default: 1]
Options
--auto-approve Skip interactive approval before migrating
-h, --help Displays this help message
-p, --preview Preview the migration changes
[... rest of help]
```
|
1.0
|
--auto-approve is unknown when lifting down - I cannot use `--auto-approve` on `prisma2 lift down` while the help says I should be able to do it.
```
$ prisma2 lift down --auto-approve
! Unknown or unexpected option: --auto-approve
Migrate your database down to a specific state.
Usage
prisma lift down [<dec|name|timestamp>]
Arguments
[<dec>] go down by an amount [default: 1]
Options
--auto-approve Skip interactive approval before migrating
-h, --help Displays this help message
-p, --preview Preview the migration changes
[... rest of help]
```
|
process
|
auto approve is unknown when lifting down i cannot use auto approve on lift down while the help says i should be able to do it lift down auto approve unknown or unexpected option auto approve migrate your database down to a specific state usage prisma lift down arguments go down by an amount options auto approve skip interactive approval before migrating h help displays this help message p preview preview the migration changes
| 1
|
20,395
| 27,052,082,675
|
IssuesEvent
|
2023-02-13 13:56:50
|
zammad/zammad
|
https://api.github.com/repos/zammad/zammad
|
closed
|
Failing HTML-Processing denies user to access the mail
|
bug verified prioritised by payment mail processing performance specification required
|
<!--
Hi there - thanks for filing an issue. Please ensure the following things before creating an issue - thank you! 🤓
Since november 15th we handle all requests, except real bugs, at our community board.
Full explanation: https://community.zammad.org/t/major-change-regarding-github-issues-community-board/21
Please post:
- Feature requests
- Development questions
- Technical questions
on the board -> https://community.zammad.org !
If you think you hit a bug, please continue:
- Search existing issues and the CHANGELOG.md for your issue - there might be a solution already
- Make sure to use the latest version of Zammad if possible
- Add the `log/production.log` file from your system. Attention: Make sure no confidential data is in it!
- Please write the issue in english
- Don't remove the template - otherwise we will close the issue without further comments
- Ask questions about Zammad configuration and usage at our mailinglist. See: https://zammad.org/participate
Note: We always do our best. Unfortunately, sometimes there are too many requests and we can't handle everything at once. If you want to prioritize/escalate your issue, you can do so by means of a support contract (see https://zammad.com/pricing#selfhosted).
* The upper textblock will be removed automatically when you submit your issue *
-->
### Infos:
* Used Zammad version: 3.1.x
* Installation method (source, package, ..): any
* Operating system: any
* Database + version: any
* Elasticsearch version: any
* Browser + version: any
* Ticket-ID: #1051620, #1056801, #1072848, #1086963, #1086950, # 10104953, #10108800, #10113609, #10114725, #10109416, #10117639, #10118442, #10122133
### Expected behavior:
If Zammad can't reliably process HTML content (with sanitizing and stuff), it will create a note at the ticket with the raw mail attached (what you originally received or typed [depending on the direction the message goes to]).
### Actual behavior:
If Zammad can't process HTML content (e.g. because the system is too busy at that moment or processing takes too long for other reasons), it will create a note that the message could not be processed and you shall check the RAW message.
For incoming, this is no problem, you can download the raw eml and view the content.
For outgoing mails (what your agent typed and sent), Zammad will include the same error message inside the eml as well. This will cause Zammad to loose the articles content.
Note: For both directions, the limit currently is too low (or not robust enough if you have a pretty busy system) and thus needs fiddling. Especially for outgoing mail, this needs fiddling so that the content doesn't get lost.
Message of article is:
```
This message cannot be displayed due to HTML processing issues. Download the raw message below and open it via an Email client if you still wish to view it.
```
### Steps to reproduce the behavior:
* make your system busy as hell
* pump mails into Zammad (this one is a bit tricky to enforce ;), you basically can simply lower the processing limit dramatically to enforce it)
Yes I'm sure this is a bug and no feature request or a general question.
|
1.0
|
Failing HTML-Processing denies user to access the mail - <!--
Hi there - thanks for filing an issue. Please ensure the following things before creating an issue - thank you! 🤓
Since november 15th we handle all requests, except real bugs, at our community board.
Full explanation: https://community.zammad.org/t/major-change-regarding-github-issues-community-board/21
Please post:
- Feature requests
- Development questions
- Technical questions
on the board -> https://community.zammad.org !
If you think you hit a bug, please continue:
- Search existing issues and the CHANGELOG.md for your issue - there might be a solution already
- Make sure to use the latest version of Zammad if possible
- Add the `log/production.log` file from your system. Attention: Make sure no confidential data is in it!
- Please write the issue in english
- Don't remove the template - otherwise we will close the issue without further comments
- Ask questions about Zammad configuration and usage at our mailinglist. See: https://zammad.org/participate
Note: We always do our best. Unfortunately, sometimes there are too many requests and we can't handle everything at once. If you want to prioritize/escalate your issue, you can do so by means of a support contract (see https://zammad.com/pricing#selfhosted).
* The upper textblock will be removed automatically when you submit your issue *
-->
### Infos:
* Used Zammad version: 3.1.x
* Installation method (source, package, ..): any
* Operating system: any
* Database + version: any
* Elasticsearch version: any
* Browser + version: any
* Ticket-ID: #1051620, #1056801, #1072848, #1086963, #1086950, # 10104953, #10108800, #10113609, #10114725, #10109416, #10117639, #10118442, #10122133
### Expected behavior:
If Zammad can't reliably process HTML content (with sanitizing and stuff), it will create a note at the ticket with the raw mail attached (what you originally received or typed [depending on the direction the message goes to]).
### Actual behavior:
If Zammad can't process HTML content (e.g. because the system is too busy at that moment or processing takes too long for other reasons), it will create a note that the message could not be processed and you shall check the RAW message.
For incoming, this is no problem, you can download the raw eml and view the content.
For outgoing mails (what your agent typed and sent), Zammad will include the same error message inside the eml as well. This will cause Zammad to loose the articles content.
Note: For both directions, the limit currently is too low (or not robust enough if you have a pretty busy system) and thus needs fiddling. Especially for outgoing mail, this needs fiddling so that the content doesn't get lost.
Message of article is:
```
This message cannot be displayed due to HTML processing issues. Download the raw message below and open it via an Email client if you still wish to view it.
```
### Steps to reproduce the behavior:
* make your system busy as hell
* pump mails into Zammad (this one is a bit tricky to enforce ;), you basically can simply lower the processing limit dramatically to enforce it)
Yes I'm sure this is a bug and no feature request or a general question.
|
process
|
failing html processing denies user to access the mail hi there thanks for filing an issue please ensure the following things before creating an issue thank you 🤓 since november we handle all requests except real bugs at our community board full explanation please post feature requests development questions technical questions on the board if you think you hit a bug please continue search existing issues and the changelog md for your issue there might be a solution already make sure to use the latest version of zammad if possible add the log production log file from your system attention make sure no confidential data is in it please write the issue in english don t remove the template otherwise we will close the issue without further comments ask questions about zammad configuration and usage at our mailinglist see note we always do our best unfortunately sometimes there are too many requests and we can t handle everything at once if you want to prioritize escalate your issue you can do so by means of a support contract see the upper textblock will be removed automatically when you submit your issue infos used zammad version x installation method source package any operating system any database version any elasticsearch version any browser version any ticket id expected behavior if zammad can t reliably process html content with sanitizing and stuff it will create a note at the ticket with the raw mail attached what you originally received or typed actual behavior if zammad can t process html content e g because the system is too busy at that moment or processing takes too long for other reasons it will create a note that the message could not be processed and you shall check the raw message for incoming this is no problem you can download the raw eml and view the content for outgoing mails what your agent typed and sent zammad will include the same error message inside the eml as well this will cause zammad to loose the articles content note for both directions the limit currently is too low or not robust enough if you have a pretty busy system and thus needs fiddling especially for outgoing mail this needs fiddling so that the content doesn t get lost message of article is this message cannot be displayed due to html processing issues download the raw message below and open it via an email client if you still wish to view it steps to reproduce the behavior make your system busy as hell pump mails into zammad this one is a bit tricky to enforce you basically can simply lower the processing limit dramatically to enforce it yes i m sure this is a bug and no feature request or a general question
| 1
|
2,683
| 5,531,483,053
|
IssuesEvent
|
2017-03-21 07:44:17
|
openvstorage/alba
|
https://api.github.com/repos/openvstorage/alba
|
reopened
|
get-disk-safety failed with 'Namespace manager exception: Nsm_model.Err.Namespace_id_not_found'.
|
process_wontfix
|
### Problem description
Monitoring revealed the following exception:
```
CRIT - EXCEPTION HC000 - Could not fetch alba information for backend nvmebackend Message: Command 'get-disk-safety' failed with 'Namespace manager exception: Nsm_model.Err.Namespace_id_not_found'.
```
What could have happened:
- Another healtcheck was busy with a test that involves creating and removing namespaces
- At the time the other namespace was getting deleted, get-disk-safety was called
### Proposed solution
The whole command should not fail when one namespace cannot be fetched. Perhaps return the current output you have collected and add an exception section or something?
|
1.0
|
get-disk-safety failed with 'Namespace manager exception: Nsm_model.Err.Namespace_id_not_found'. - ### Problem description
Monitoring revealed the following exception:
```
CRIT - EXCEPTION HC000 - Could not fetch alba information for backend nvmebackend Message: Command 'get-disk-safety' failed with 'Namespace manager exception: Nsm_model.Err.Namespace_id_not_found'.
```
What could have happened:
- Another healtcheck was busy with a test that involves creating and removing namespaces
- At the time the other namespace was getting deleted, get-disk-safety was called
### Proposed solution
The whole command should not fail when one namespace cannot be fetched. Perhaps return the current output you have collected and add an exception section or something?
|
process
|
get disk safety failed with namespace manager exception nsm model err namespace id not found problem description monitoring revealed the following exception crit exception could not fetch alba information for backend nvmebackend message command get disk safety failed with namespace manager exception nsm model err namespace id not found what could have happened another healtcheck was busy with a test that involves creating and removing namespaces at the time the other namespace was getting deleted get disk safety was called proposed solution the whole command should not fail when one namespace cannot be fetched perhaps return the current output you have collected and add an exception section or something
| 1
|
12,051
| 14,739,078,418
|
IssuesEvent
|
2021-01-07 06:26:04
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Tool Tips - Every Field
|
anc-ui anp-1.5 ant-enhancement grt-ui processes
|
In GitLab by @kdjstudios on Aug 29, 2018, 12:25
Hello Team,
I know right now we have the tool tip functionality implements on a majority of the fields in SAB that are commonly seen and used by users. I would like to bring up for discussion to add this to all fields. This way end users will have info on everything, as well our admin or internal tech users will also have information on those type of prompts too.
|
1.0
|
Tool Tips - Every Field - In GitLab by @kdjstudios on Aug 29, 2018, 12:25
Hello Team,
I know right now we have the tool tip functionality implements on a majority of the fields in SAB that are commonly seen and used by users. I would like to bring up for discussion to add this to all fields. This way end users will have info on everything, as well our admin or internal tech users will also have information on those type of prompts too.
|
process
|
tool tips every field in gitlab by kdjstudios on aug hello team i know right now we have the tool tip functionality implements on a majority of the fields in sab that are commonly seen and used by users i would like to bring up for discussion to add this to all fields this way end users will have info on everything as well our admin or internal tech users will also have information on those type of prompts too
| 1
|
223,936
| 17,145,825,099
|
IssuesEvent
|
2021-07-13 14:30:18
|
qbittorrent/qBittorrent
|
https://api.github.com/repos/qbittorrent/qBittorrent
|
closed
|
Missing info on translating windows installer
|
Documentation OS: Windows
|
I suggest mentioning the translation of the windows installer on the wiki page:
https://github.com/qbittorrent/qBittorrent/wiki/How-to-translate-qBittorrent
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/46792476-missing-info-on-translating-windows-installer?utm_campaign=plugin&utm_content=tracker%2F298524&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F298524&utm_medium=issues&utm_source=github).
</bountysource-plugin>
|
1.0
|
Missing info on translating windows installer - I suggest mentioning the translation of the windows installer on the wiki page:
https://github.com/qbittorrent/qBittorrent/wiki/How-to-translate-qBittorrent
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/46792476-missing-info-on-translating-windows-installer?utm_campaign=plugin&utm_content=tracker%2F298524&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F298524&utm_medium=issues&utm_source=github).
</bountysource-plugin>
|
non_process
|
missing info on translating windows installer i suggest mentioning the translation of the windows installer on the wiki page want to back this issue we accept bounties via
| 0
|
5,141
| 7,923,541,442
|
IssuesEvent
|
2018-07-05 14:21:19
|
Open-EO/openeo-api
|
https://api.github.com/repos/Open-EO/openeo-api
|
closed
|
Merge GET /processes and GET /processes/{process_id}?
|
data discovery process discovery work in progress
|
During development of the clients I found it very inconvenient that you need to call GET /processes/{process_id} for all processes over and over again. I'd suggest to merge both endpoints, which basically adds the args only:
Example for /processes:
```
[
{
"process_id":"NDVI",
"description":"Computes the normalized difference vegetation index (NDVI) for all pixels of the input dataset.",
"args":{
"red":{
"description":"red band ..."
},
"nir":{
"description":"near infrared band ..."
}
}
},
{
"process_id":"median_time",
"description":"Applies median aggregation to pixel time series for all bands of the input dataset.",
"args":{
"A":{
"description":"input product (time series)"
}
}
}
]
```
GET /processes/{process_id} would be removed completely.
Not sure about GET /data and GET /data/{product_id}. The latter delivers more additional information to /data and I think datasets are not requested that often as only one or two datasets are usually used per request, but more than one or two processes per request.
|
1.0
|
Merge GET /processes and GET /processes/{process_id}? - During development of the clients I found it very inconvenient that you need to call GET /processes/{process_id} for all processes over and over again. I'd suggest to merge both endpoints, which basically adds the args only:
Example for /processes:
```
[
{
"process_id":"NDVI",
"description":"Computes the normalized difference vegetation index (NDVI) for all pixels of the input dataset.",
"args":{
"red":{
"description":"red band ..."
},
"nir":{
"description":"near infrared band ..."
}
}
},
{
"process_id":"median_time",
"description":"Applies median aggregation to pixel time series for all bands of the input dataset.",
"args":{
"A":{
"description":"input product (time series)"
}
}
}
]
```
GET /processes/{process_id} would be removed completely.
Not sure about GET /data and GET /data/{product_id}. The latter delivers more additional information to /data and I think datasets are not requested that often as only one or two datasets are usually used per request, but more than one or two processes per request.
|
process
|
merge get processes and get processes process id during development of the clients i found it very inconvenient that you need to call get processes process id for all processes over and over again i d suggest to merge both endpoints which basically adds the args only example for processes process id ndvi description computes the normalized difference vegetation index ndvi for all pixels of the input dataset args red description red band nir description near infrared band process id median time description applies median aggregation to pixel time series for all bands of the input dataset args a description input product time series get processes process id would be removed completely not sure about get data and get data product id the latter delivers more additional information to data and i think datasets are not requested that often as only one or two datasets are usually used per request but more than one or two processes per request
| 1
|
9,746
| 12,734,307,823
|
IssuesEvent
|
2020-06-25 13:43:12
|
googleapis/python-bigquery
|
https://api.github.com/repos/googleapis/python-bigquery
|
opened
|
Transition the library to the new microgenerator
|
type: process
|
With the new code generator ready to be rolled out, we can make the transition here in PubSub. This implies dropping support for Pythom 2.7 and 3.5!
|
1.0
|
Transition the library to the new microgenerator - With the new code generator ready to be rolled out, we can make the transition here in PubSub. This implies dropping support for Pythom 2.7 and 3.5!
|
process
|
transition the library to the new microgenerator with the new code generator ready to be rolled out we can make the transition here in pubsub this implies dropping support for pythom and
| 1
|
1,141
| 3,631,282,142
|
IssuesEvent
|
2016-02-11 00:31:43
|
spootTheLousy/saguaro
|
https://api.github.com/repos/spootTheLousy/saguaro
|
closed
|
Quotes, text processing
|
Bug: Major Discussion Post/text processing
|
- [ ] Interboard post linking works
- [x] Meme arrows quote properly
- [x] In-thread post quoting properly working
Review **/_core/regist/autolink.php** usage.
Discuss @RePod s proposal to shuffle all this into text processing class
|
1.0
|
Quotes, text processing - - [ ] Interboard post linking works
- [x] Meme arrows quote properly
- [x] In-thread post quoting properly working
Review **/_core/regist/autolink.php** usage.
Discuss @RePod s proposal to shuffle all this into text processing class
|
process
|
quotes text processing interboard post linking works meme arrows quote properly in thread post quoting properly working review core regist autolink php usage discuss repod s proposal to shuffle all this into text processing class
| 1
|
70,532
| 3,331,824,280
|
IssuesEvent
|
2015-11-11 17:24:13
|
coollog/sublite
|
https://api.github.com/repos/coollog/sublite
|
opened
|
Finish changing all the old Controller/Model classes to use the new static format
|
1 Difficulty 2 Priority 3 Length Type: Enhancement
|
Lots of refactoring =)
|
1.0
|
Finish changing all the old Controller/Model classes to use the new static format - Lots of refactoring =)
|
non_process
|
finish changing all the old controller model classes to use the new static format lots of refactoring
| 0
|
22,073
| 30,594,953,072
|
IssuesEvent
|
2023-07-21 20:51:57
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
[MLv2] It should be possible to call `join-lhs-display-name` without a join or a joinable
|
.Backend .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
|
`join-lhs-display-name` added in #32420 expects either a join or a joinable (table or card metadata) argument. But when adding a new join with a notebook editor, there's a moment before a table is selected when we have neither of them. It should be possible to call the method without them to retrieve the LHS display name

|
1.0
|
[MLv2] It should be possible to call `join-lhs-display-name` without a join or a joinable - `join-lhs-display-name` added in #32420 expects either a join or a joinable (table or card metadata) argument. But when adding a new join with a notebook editor, there's a moment before a table is selected when we have neither of them. It should be possible to call the method without them to retrieve the LHS display name

|
process
|
it should be possible to call join lhs display name without a join or a joinable join lhs display name added in expects either a join or a joinable table or card metadata argument but when adding a new join with a notebook editor there s a moment before a table is selected when we have neither of them it should be possible to call the method without them to retrieve the lhs display name
| 1
|
14,074
| 16,944,988,347
|
IssuesEvent
|
2021-06-28 04:57:25
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
3D line length not shown in identify tool results
|
Bug Feedback Processing
|
**Describe the bug**
A line layer that received the Z-values from the built-in "Drape" algorithm does not show the 3D length in the identify tool results. Using the "v.drape" from the GRASS toolbox to assign Z-values will result in a visible 3D length in the identifying section.
**How to Reproduce**
<!-- Steps, sample datasets and qgis project file to reproduce the behavior. Screencasts or screenshots welcome -->
1. Create a new 2D line shape layer (I used CRS: 25832 ETRS89/UTM32N) where you have a DEM available.
2. The line won't show a 3D length which is the expected behaviour, because it does not have Z-values.

3. Use the "Drape" algorithm the assign Z-values to that specific line.
4. Z-Values are there, but a 3D length is not displayed yet.

5. If you use the "v.drape" from GRASS instead to assign the Z-Values to the line, the 3D length is shown in the identifying section, although the vertex section does not show the Z-values.

(Even when you edit the Z values of a line manually, you will not get the 3D length)
**QGIS and OS versions**
<!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css">
p, li { white-space: pre-wrap; }
</style></head><body>
QGIS version | 3.16.8-Hannover | QGIS code revision | 8c50902e
-- | -- | -- | --
Compiled against Qt | 5.15.2 | Running against Qt | 5.15.2
Compiled against GDAL/OGR | 3.3.0 | Running against GDAL/OGR | 3.3.0
Compiled against GEOS | 3.9.1-CAPI-1.14.2 | Running against GEOS | 3.9.1-CAPI-1.14.2
Compiled against SQLite | 3.35.2 | Running against SQLite | 3.35.2
PostgreSQL Client Version | 13.0 | SpatiaLite Version | 5.0.1
QWT Version | 6.1.3 | QScintilla2 Version | 2.11.5
Compiled against PROJ | 8.0.1 | Running against PROJ | Rel. 8.0.1, March 5th, 2021
OS Version | Windows 10 Version 2009
Active python plugins | db_manager; MetaSearch; processing
</body></html><!--EndFragment-->
**Additional context**
I am using two independent windows machines (same behaviour) and a new profile.
This behaviour occurred in the version 3.10.14, 3.16.8, 3.18.3 and 3.20.0.
The archive contains a dem and the origin line which was used to go through the steps described above.
[demand2dline.zip](https://github.com/qgis/QGIS/files/6719650/demand2dline.zip)
|
1.0
|
3D line length not shown in identify tool results - **Describe the bug**
A line layer that received the Z-values from the built-in "Drape" algorithm does not show the 3D length in the identify tool results. Using the "v.drape" from the GRASS toolbox to assign Z-values will result in a visible 3D length in the identifying section.
**How to Reproduce**
<!-- Steps, sample datasets and qgis project file to reproduce the behavior. Screencasts or screenshots welcome -->
1. Create a new 2D line shape layer (I used CRS: 25832 ETRS89/UTM32N) where you have a DEM available.
2. The line won't show a 3D length which is the expected behaviour, because it does not have Z-values.

3. Use the "Drape" algorithm the assign Z-values to that specific line.
4. Z-Values are there, but a 3D length is not displayed yet.

5. If you use the "v.drape" from GRASS instead to assign the Z-Values to the line, the 3D length is shown in the identifying section, although the vertex section does not show the Z-values.

(Even when you edit the Z values of a line manually, you will not get the 3D length)
**QGIS and OS versions**
<!--StartFragment--><!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /><style type="text/css">
p, li { white-space: pre-wrap; }
</style></head><body>
QGIS version | 3.16.8-Hannover | QGIS code revision | 8c50902e
-- | -- | -- | --
Compiled against Qt | 5.15.2 | Running against Qt | 5.15.2
Compiled against GDAL/OGR | 3.3.0 | Running against GDAL/OGR | 3.3.0
Compiled against GEOS | 3.9.1-CAPI-1.14.2 | Running against GEOS | 3.9.1-CAPI-1.14.2
Compiled against SQLite | 3.35.2 | Running against SQLite | 3.35.2
PostgreSQL Client Version | 13.0 | SpatiaLite Version | 5.0.1
QWT Version | 6.1.3 | QScintilla2 Version | 2.11.5
Compiled against PROJ | 8.0.1 | Running against PROJ | Rel. 8.0.1, March 5th, 2021
OS Version | Windows 10 Version 2009
Active python plugins | db_manager; MetaSearch; processing
</body></html><!--EndFragment-->
**Additional context**
I am using two independent windows machines (same behaviour) and a new profile.
This behaviour occurred in the version 3.10.14, 3.16.8, 3.18.3 and 3.20.0.
The archive contains a dem and the origin line which was used to go through the steps described above.
[demand2dline.zip](https://github.com/qgis/QGIS/files/6719650/demand2dline.zip)
|
process
|
line length not shown in identify tool results describe the bug a line layer that received the z values from the built in drape algorithm does not show the length in the identify tool results using the v drape from the grass toolbox to assign z values will result in a visible length in the identifying section how to reproduce create a new line shape layer i used crs where you have a dem available the line won t show a length which is the expected behaviour because it does not have z values use the drape algorithm the assign z values to that specific line z values are there but a length is not displayed yet if you use the v drape from grass instead to assign the z values to the line the length is shown in the identifying section although the vertex section does not show the z values even when you edit the z values of a line manually you will not get the length qgis and os versions doctype html public dtd html en p li white space pre wrap qgis version hannover qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel march os version windows version active python plugins db manager metasearch processing additional context i am using two independent windows machines same behaviour and a new profile this behaviour occurred in the version and the archive contains a dem and the origin line which was used to go through the steps described above
| 1
|
17,939
| 23,937,186,485
|
IssuesEvent
|
2022-09-11 12:00:49
|
encode/uvicorn
|
https://api.github.com/repos/encode/uvicorn
|
closed
|
Uvicorn with reload hangs when using a ProcessPoolExecutor
|
bug user experience multiprocessing
|
### Checklist
<!-- Please make sure you check all these items before submitting your bug report. -->
- [x] The bug is reproducible against the latest release and/or `master`.
- [x] There are no similar issues or pull requests to fix it yet.
### Describe the bug
When at last 1 task is submitted to a `ProcessPoolExecutor` uvicorn fails to reload when a file change has been detected. It detects the file change and the server is shutdown but it doesn't start again. As long as no tasks are submitted uvicorn is able to reload properly.
### To reproduce
```
"""ProcessPoolExecutor Example.
Run
---
uvicorn main:app --reload
Versions
--------
fastapi~=0.63.0
uvicorn[standard]~=0.13.3
"""
from concurrent.futures import ProcessPoolExecutor
from typing import Any, Dict
from fastapi import FastAPI
app = FastAPI(title="Example API")
POOL = ProcessPoolExecutor(max_workers=1)
def task() -> None:
"""."""
print("Executed in process pool")
@app.get("/")
def index() -> Dict[str, Any]:
"""Index."""
POOL.submit(task)
return {"message": "Hello World"}
```
### Expected behavior
Uvicorn should reload when file changes are detected.
```
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started reloader process [9042] using watchgod
INFO: Started server process [9044]
INFO: Waiting for application startup.
INFO: Application startup complete.
WARNING: WatchGodReload detected file change in '['/Users/maartenhuijsmans/main.py']'. Reloading...
INFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [9044]
INFO: Started server process [9047]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: 127.0.0.1:61607 - "GET / HTTP/1.1" 200 OK
```
### Actual behavior
Uvicorn doesn't start
```
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started reloader process [9054] using watchgod
INFO: Started server process [9056]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: 127.0.0.1:61615 - "GET / HTTP/1.1" 200 OK
Executed in process pool
WARNING: WatchGodReload detected file change in '['/Users/maartenhuijsmans/main.py']'. Reloading...
INFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [9056]
```
### Debugging material
<!-- Any tracebacks, screenshots, etc. that can help understanding the problem.
NOTE:
- Please list tracebacks in full (don't truncate them).
- If relevant, consider turning on DEBUG or TRACE logs for additional details (see the Logging section on https://www.uvicorn.org/settings/ specifically the `log-level` flag).
- Consider using `<details>` to make tracebacks/logs collapsible if they're very large (see https://gist.github.com/ericclemmons/b146fe5da72ca1f706b2ef72a20ac39d).
-->
### Environment
- macOS 10.13.6 / python 3.8.6 / uvicorn 0.13.3
- `uvicorn main:app --reload`
### Additional context
<!-- Any additional information that can help understanding the problem.
Eg. linked issues, or a description of what you were trying to achieve. -->
|
1.0
|
Uvicorn with reload hangs when using a ProcessPoolExecutor - ### Checklist
<!-- Please make sure you check all these items before submitting your bug report. -->
- [x] The bug is reproducible against the latest release and/or `master`.
- [x] There are no similar issues or pull requests to fix it yet.
### Describe the bug
When at last 1 task is submitted to a `ProcessPoolExecutor` uvicorn fails to reload when a file change has been detected. It detects the file change and the server is shutdown but it doesn't start again. As long as no tasks are submitted uvicorn is able to reload properly.
### To reproduce
```
"""ProcessPoolExecutor Example.
Run
---
uvicorn main:app --reload
Versions
--------
fastapi~=0.63.0
uvicorn[standard]~=0.13.3
"""
from concurrent.futures import ProcessPoolExecutor
from typing import Any, Dict
from fastapi import FastAPI
app = FastAPI(title="Example API")
POOL = ProcessPoolExecutor(max_workers=1)
def task() -> None:
"""."""
print("Executed in process pool")
@app.get("/")
def index() -> Dict[str, Any]:
"""Index."""
POOL.submit(task)
return {"message": "Hello World"}
```
### Expected behavior
Uvicorn should reload when file changes are detected.
```
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started reloader process [9042] using watchgod
INFO: Started server process [9044]
INFO: Waiting for application startup.
INFO: Application startup complete.
WARNING: WatchGodReload detected file change in '['/Users/maartenhuijsmans/main.py']'. Reloading...
INFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [9044]
INFO: Started server process [9047]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: 127.0.0.1:61607 - "GET / HTTP/1.1" 200 OK
```
### Actual behavior
Uvicorn doesn't start
```
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Started reloader process [9054] using watchgod
INFO: Started server process [9056]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: 127.0.0.1:61615 - "GET / HTTP/1.1" 200 OK
Executed in process pool
WARNING: WatchGodReload detected file change in '['/Users/maartenhuijsmans/main.py']'. Reloading...
INFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [9056]
```
### Debugging material
<!-- Any tracebacks, screenshots, etc. that can help understanding the problem.
NOTE:
- Please list tracebacks in full (don't truncate them).
- If relevant, consider turning on DEBUG or TRACE logs for additional details (see the Logging section on https://www.uvicorn.org/settings/ specifically the `log-level` flag).
- Consider using `<details>` to make tracebacks/logs collapsible if they're very large (see https://gist.github.com/ericclemmons/b146fe5da72ca1f706b2ef72a20ac39d).
-->
### Environment
- macOS 10.13.6 / python 3.8.6 / uvicorn 0.13.3
- `uvicorn main:app --reload`
### Additional context
<!-- Any additional information that can help understanding the problem.
Eg. linked issues, or a description of what you were trying to achieve. -->
|
process
|
uvicorn with reload hangs when using a processpoolexecutor checklist the bug is reproducible against the latest release and or master there are no similar issues or pull requests to fix it yet describe the bug when at last task is submitted to a processpoolexecutor uvicorn fails to reload when a file change has been detected it detects the file change and the server is shutdown but it doesn t start again as long as no tasks are submitted uvicorn is able to reload properly to reproduce processpoolexecutor example run uvicorn main app reload versions fastapi uvicorn from concurrent futures import processpoolexecutor from typing import any dict from fastapi import fastapi app fastapi title example api pool processpoolexecutor max workers def task none print executed in process pool app get def index dict index pool submit task return message hello world expected behavior uvicorn should reload when file changes are detected info uvicorn running on press ctrl c to quit info started reloader process using watchgod info started server process info waiting for application startup info application startup complete warning watchgodreload detected file change in reloading info shutting down info waiting for application shutdown info application shutdown complete info finished server process info started server process info waiting for application startup info application startup complete info get http ok actual behavior uvicorn doesn t start info uvicorn running on press ctrl c to quit info started reloader process using watchgod info started server process info waiting for application startup info application startup complete info get http ok executed in process pool warning watchgodreload detected file change in reloading info shutting down info waiting for application shutdown info application shutdown complete info finished server process debugging material any tracebacks screenshots etc that can help understanding the problem note please list tracebacks in full don t truncate them if relevant consider turning on debug or trace logs for additional details see the logging section on specifically the log level flag consider using to make tracebacks logs collapsible if they re very large see environment macos python uvicorn uvicorn main app reload additional context any additional information that can help understanding the problem eg linked issues or a description of what you were trying to achieve
| 1
|
18,806
| 24,705,813,145
|
IssuesEvent
|
2022-10-19 18:58:02
|
apache/arrow-datafusion
|
https://api.github.com/repos/apache/arrow-datafusion
|
closed
|
Release process: Add instructions on Apache Reporter
|
enhancement development-process
|
**Is your feature request related to a problem or challenge? Please describe what you are trying to do.**
```
Could you add "adding a release to
https://reporter.apache.org/addrelease.html?arrow " to
release process of DataFusion and Ballista?
The release information is used to generate a template for
a board report.
FYI:
* Board Report Wizard: https://reporter.apache.org/wizard/?arrow
* A board report draft based on template generated by the
Board Report Wizard: https://github.com/apache/arrow/pull/14357
```
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
|
1.0
|
Release process: Add instructions on Apache Reporter - **Is your feature request related to a problem or challenge? Please describe what you are trying to do.**
```
Could you add "adding a release to
https://reporter.apache.org/addrelease.html?arrow " to
release process of DataFusion and Ballista?
The release information is used to generate a template for
a board report.
FYI:
* Board Report Wizard: https://reporter.apache.org/wizard/?arrow
* A board report draft based on template generated by the
Board Report Wizard: https://github.com/apache/arrow/pull/14357
```
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
|
process
|
release process add instructions on apache reporter is your feature request related to a problem or challenge please describe what you are trying to do could you add adding a release to to release process of datafusion and ballista the release information is used to generate a template for a board report fyi board report wizard a board report draft based on template generated by the board report wizard describe the solution you d like a clear and concise description of what you want to happen describe alternatives you ve considered a clear and concise description of any alternative solutions or features you ve considered additional context add any other context or screenshots about the feature request here
| 1
|
9,290
| 12,306,066,068
|
IssuesEvent
|
2020-05-12 00:17:52
|
googleapis/nodejs-os-config
|
https://api.github.com/repos/googleapis/nodejs-os-config
|
opened
|
promote library to GA
|
type: process
|
Package name: **@google-cloud/os-config**
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] 28 days elapsed since last beta release with new API surface
- [ ] Server API is GA
- [ ] Package API is stable, and we can commit to backward compatibility
- [ ] All dependencies are GA
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
1.0
|
promote library to GA - Package name: **@google-cloud/os-config**
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] 28 days elapsed since last beta release with new API surface
- [ ] Server API is GA
- [ ] Package API is stable, and we can commit to backward compatibility
- [ ] All dependencies are GA
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
process
|
promote library to ga package name google cloud os config current release beta proposed release ga instructions check the lists below adding tests documentation as required once all the required boxes are ticked please create a release and close this issue required days elapsed since last beta release with new api surface server api is ga package api is stable and we can commit to backward compatibility all dependencies are ga optional most common important scenarios have descriptive samples public manual methods have at least one usage sample each excluding overloads per api readme includes a full description of the api per api readme contains at least one “getting started” sample using the most common api scenario manual code has been reviewed by api producer manual code has been reviewed by a dpe responsible for samples client libraries page is added to the product documentation in apis reference section of the product s documentation on cloud site
| 1
|
12,645
| 15,019,332,791
|
IssuesEvent
|
2021-02-01 13:25:16
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Remove support for the Java singlejar implementation
|
P2 area-java-toolchains team-Rules-Java type: process
|
Previously: #2241
The [Java implementation](https://github.com/bazelbuild/bazel/tree/master/src/java_tools/singlejar/java/com/google/devtools/build/singlejar) of singlejar has been replaced by a [faster c++ implementation](https://github.com/bazelbuild/bazel/blob/df036962c94eeac0ccb6d1359d6f9681195b45ef/src/tools/singlejar/singlejar_main.cc) everywhere except:
* RBE, because of issues with a dependency on c++ protos
* some places in the bootstrap build, where it might be tricky to use the native implementation but where other zip tools might be fine (it's not very performance-critical)
* I'm not sure the c++ implementation works on arm yet
We should use the c++ implementation everywhere, and stop supporting the legacy Java singlejar.
|
1.0
|
Remove support for the Java singlejar implementation - Previously: #2241
The [Java implementation](https://github.com/bazelbuild/bazel/tree/master/src/java_tools/singlejar/java/com/google/devtools/build/singlejar) of singlejar has been replaced by a [faster c++ implementation](https://github.com/bazelbuild/bazel/blob/df036962c94eeac0ccb6d1359d6f9681195b45ef/src/tools/singlejar/singlejar_main.cc) everywhere except:
* RBE, because of issues with a dependency on c++ protos
* some places in the bootstrap build, where it might be tricky to use the native implementation but where other zip tools might be fine (it's not very performance-critical)
* I'm not sure the c++ implementation works on arm yet
We should use the c++ implementation everywhere, and stop supporting the legacy Java singlejar.
|
process
|
remove support for the java singlejar implementation previously the of singlejar has been replaced by a everywhere except rbe because of issues with a dependency on c protos some places in the bootstrap build where it might be tricky to use the native implementation but where other zip tools might be fine it s not very performance critical i m not sure the c implementation works on arm yet we should use the c implementation everywhere and stop supporting the legacy java singlejar
| 1
|
182,481
| 30,855,480,605
|
IssuesEvent
|
2023-08-02 20:13:24
|
NIAEFEUP/sigarra-extension
|
https://api.github.com/repos/NIAEFEUP/sigarra-extension
|
closed
|
Redesign warnings cards
|
ui-design
|
Class name "info"
<img width="954" alt="info" src="https://user-images.githubusercontent.com/53405284/221821729-54e15a15-0af5-428f-9fae-60a718f9e106.png">
Class name "alerta"
<img width="956" alt="alertta" src="https://user-images.githubusercontent.com/53405284/221821744-2be7b564-8a5d-43e5-8ac7-328af3cb72a5.png">
|
1.0
|
Redesign warnings cards - Class name "info"
<img width="954" alt="info" src="https://user-images.githubusercontent.com/53405284/221821729-54e15a15-0af5-428f-9fae-60a718f9e106.png">
Class name "alerta"
<img width="956" alt="alertta" src="https://user-images.githubusercontent.com/53405284/221821744-2be7b564-8a5d-43e5-8ac7-328af3cb72a5.png">
|
non_process
|
redesign warnings cards class name info img width alt info src class name alerta img width alt alertta src
| 0
|
5,066
| 3,899,840,903
|
IssuesEvent
|
2016-04-18 00:02:48
|
lionheart/openradar-mirror
|
https://api.github.com/repos/lionheart/openradar-mirror
|
opened
|
12978935: Xcode: Kill Two-Finger-Swipe-To-Go-Back
|
classification:ui/usability reproducible:always status:open
|
#### Description
Summary:
When working in Xcode, swiping horizontally with two fingers while the cursor is above the main text-editing area will cause Xcode to switch to the previous file that was loaded into that editor.
This sudden file-switching is INFURIATING.
99% of the time, when I swipe my fingers horizontally, it's because I'm working on a small screen and I need to SCROLL left or right within my file to see the rest of a line of code. Exactly 0% of the time do I want to suddenly flip to another file by swiping. To do that, I can use the breadcrumbs bar above the editor or the outline view to the left.
Xcode is not a consumer app. It needs swipe-gestures about as much as it needs Cover Flow. Delete the swipes or at least provide an option to disable them, please.
-
Product Version: 4.5.2
Created: 2013-01-09T04:29:59.360106
Originated: 2013-01-08T00:00:00
Open Radar Link: http://www.openradar.me/12978935
|
True
|
12978935: Xcode: Kill Two-Finger-Swipe-To-Go-Back - #### Description
Summary:
When working in Xcode, swiping horizontally with two fingers while the cursor is above the main text-editing area will cause Xcode to switch to the previous file that was loaded into that editor.
This sudden file-switching is INFURIATING.
99% of the time, when I swipe my fingers horizontally, it's because I'm working on a small screen and I need to SCROLL left or right within my file to see the rest of a line of code. Exactly 0% of the time do I want to suddenly flip to another file by swiping. To do that, I can use the breadcrumbs bar above the editor or the outline view to the left.
Xcode is not a consumer app. It needs swipe-gestures about as much as it needs Cover Flow. Delete the swipes or at least provide an option to disable them, please.
-
Product Version: 4.5.2
Created: 2013-01-09T04:29:59.360106
Originated: 2013-01-08T00:00:00
Open Radar Link: http://www.openradar.me/12978935
|
non_process
|
xcode kill two finger swipe to go back description summary when working in xcode swiping horizontally with two fingers while the cursor is above the main text editing area will cause xcode to switch to the previous file that was loaded into that editor this sudden file switching is infuriating of the time when i swipe my fingers horizontally it s because i m working on a small screen and i need to scroll left or right within my file to see the rest of a line of code exactly of the time do i want to suddenly flip to another file by swiping to do that i can use the breadcrumbs bar above the editor or the outline view to the left xcode is not a consumer app it needs swipe gestures about as much as it needs cover flow delete the swipes or at least provide an option to disable them please product version created originated open radar link
| 0
|
5,128
| 7,895,227,600
|
IssuesEvent
|
2018-06-29 01:54:11
|
googlegenomics/gcp-variant-transforms
|
https://api.github.com/repos/googlegenomics/gcp-variant-transforms
|
closed
|
Re-import gnomAD with the new annotation options in VT
|
P1 process
|
We currently host gnomAD as part of the annotation offering, but we can present a much better BQ table using the new VT features (`--annotation_fields`). We should re-annotation and re-import gnomAD using VT.
|
1.0
|
Re-import gnomAD with the new annotation options in VT - We currently host gnomAD as part of the annotation offering, but we can present a much better BQ table using the new VT features (`--annotation_fields`). We should re-annotation and re-import gnomAD using VT.
|
process
|
re import gnomad with the new annotation options in vt we currently host gnomad as part of the annotation offering but we can present a much better bq table using the new vt features annotation fields we should re annotation and re import gnomad using vt
| 1
|
972
| 3,423,126,376
|
IssuesEvent
|
2015-12-09 03:41:30
|
MaretEngineering/MROV
|
https://api.github.com/repos/MaretEngineering/MROV
|
closed
|
Be able to turn off Serial by changing one constant
|
Necessary Addition Processing
|
All the lines that refer to serial should be inside if statements that refer to that constant.
That way we can turn it off to test just the processing part.
|
1.0
|
Be able to turn off Serial by changing one constant - All the lines that refer to serial should be inside if statements that refer to that constant.
That way we can turn it off to test just the processing part.
|
process
|
be able to turn off serial by changing one constant all the lines that refer to serial should be inside if statements that refer to that constant that way we can turn it off to test just the processing part
| 1
|
21,550
| 29,865,424,664
|
IssuesEvent
|
2023-06-20 03:05:29
|
cncf/tag-security
|
https://api.github.com/repos/cncf/tag-security
|
closed
|
[Sec Assess WG] Getting more reviewers for Security Assessments
|
help wanted good first issue assessment-process suggestion inactive
|
This issue was created from results of the Security Assessment Improvement Working Group (https://github.com/cncf/sig-security/issues/167#issuecomment-714514142).
# Getting more reviewers for Security Assessments
## Premise
- Challenge of assembling a team for each review
## Ideas
- what are the reasons that people want to participate? can we incentivize more?
- Provide swag/recognition
- For issues found they would get discount for courses and conferences
- actively reach out to past reviewers (This is currently done by co-chairs and TLs informally)
- Create a more concrete list of the expectations/requirements of a reviewer
- Find new ways to engage new reviewers including in-experienced ones
- Reach out to researchers to review the projects
- Recommend the CNCF provide training/skills to community members to be able to perform assessments and audits
## Logistics
- [x] Contributors (For multiple contributors, 1 lead to coordinate)
- @magnologan
- Placeholder_2
- [x] SIG-Representative @lumjjb
|
1.0
|
[Sec Assess WG] Getting more reviewers for Security Assessments - This issue was created from results of the Security Assessment Improvement Working Group (https://github.com/cncf/sig-security/issues/167#issuecomment-714514142).
# Getting more reviewers for Security Assessments
## Premise
- Challenge of assembling a team for each review
## Ideas
- what are the reasons that people want to participate? can we incentivize more?
- Provide swag/recognition
- For issues found they would get discount for courses and conferences
- actively reach out to past reviewers (This is currently done by co-chairs and TLs informally)
- Create a more concrete list of the expectations/requirements of a reviewer
- Find new ways to engage new reviewers including in-experienced ones
- Reach out to researchers to review the projects
- Recommend the CNCF provide training/skills to community members to be able to perform assessments and audits
## Logistics
- [x] Contributors (For multiple contributors, 1 lead to coordinate)
- @magnologan
- Placeholder_2
- [x] SIG-Representative @lumjjb
|
process
|
getting more reviewers for security assessments this issue was created from results of the security assessment improvement working group getting more reviewers for security assessments premise challenge of assembling a team for each review ideas what are the reasons that people want to participate can we incentivize more provide swag recognition for issues found they would get discount for courses and conferences actively reach out to past reviewers this is currently done by co chairs and tls informally create a more concrete list of the expectations requirements of a reviewer find new ways to engage new reviewers including in experienced ones reach out to researchers to review the projects recommend the cncf provide training skills to community members to be able to perform assessments and audits logistics contributors for multiple contributors lead to coordinate magnologan placeholder sig representative lumjjb
| 1
|
18,576
| 24,558,354,133
|
IssuesEvent
|
2022-10-12 17:51:24
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Mobile apps] Images and PDFs are not getting displayed in the mobile apps
|
Bug P0 iOS Android Process: Fixed
|
Question with Images and PDF is not getting displayed in the mobile apps
|
1.0
|
[Mobile apps] Images and PDFs are not getting displayed in the mobile apps - Question with Images and PDF is not getting displayed in the mobile apps
|
process
|
images and pdfs are not getting displayed in the mobile apps question with images and pdf is not getting displayed in the mobile apps
| 1
|
25,666
| 18,957,259,882
|
IssuesEvent
|
2021-11-18 21:56:18
|
google/site-kit-wp
|
https://api.github.com/repos/google/site-kit-wp
|
closed
|
Unstable VRT for buttons
|
P1 Type: Infrastructure
|
## Bug Description
The visual regression test for buttons has become somewhat unstable recently and fails intermittently. Specifically, the failure is related to the button in a hovered state.
## Screenshots

_Backstop reference image diff with test_
---------------
_Do not alter or remove anything below. The following sections will be managed by moderators only._
## Acceptance criteria
* Visual regression tests for buttons should consistently pass
## Implementation Brief
### Test Coverage
* <!-- One or more bullet points for how to implement automated tests to verify the issue is resolved. -->
### Visual Regression Changes
* Run visual regression tests and check that the buttons now pass reliably
## QA Brief
* <!-- One or more bullet points for how to test that the issue has been resolved. -->
## Changelog entry
* <!-- One sentence summarizing the PR, to be used in the changelog. -->
|
1.0
|
Unstable VRT for buttons - ## Bug Description
The visual regression test for buttons has become somewhat unstable recently and fails intermittently. Specifically, the failure is related to the button in a hovered state.
## Screenshots

_Backstop reference image diff with test_
---------------
_Do not alter or remove anything below. The following sections will be managed by moderators only._
## Acceptance criteria
* Visual regression tests for buttons should consistently pass
## Implementation Brief
### Test Coverage
* <!-- One or more bullet points for how to implement automated tests to verify the issue is resolved. -->
### Visual Regression Changes
* Run visual regression tests and check that the buttons now pass reliably
## QA Brief
* <!-- One or more bullet points for how to test that the issue has been resolved. -->
## Changelog entry
* <!-- One sentence summarizing the PR, to be used in the changelog. -->
|
non_process
|
unstable vrt for buttons bug description the visual regression test for buttons has become somewhat unstable recently and fails intermittently specifically the failure is related to the button in a hovered state screenshots backstop reference image diff with test do not alter or remove anything below the following sections will be managed by moderators only acceptance criteria visual regression tests for buttons should consistently pass implementation brief test coverage visual regression changes run visual regression tests and check that the buttons now pass reliably qa brief changelog entry
| 0
|
10,327
| 13,162,144,410
|
IssuesEvent
|
2020-08-10 20:58:32
|
knative/serving
|
https://api.github.com/repos/knative/serving
|
closed
|
Add a new toggle to config observability to trigger log collection.
|
area/monitoring kind/feature kind/good-first-issue kind/process
|
Currently the log collection is driven by whether the template is _empty_.
Which kind of works, but it's not good since
- if you want to turn off the log collection for a while you actually need to delete the whole template, that seems too invasive
- we cannot match example to default, since example shows how to write a template
- even though it's commented such behavior is a bit obscure.
So I suggest we switch to an additional toggle and match example in the CM to the default value.
The drawback is that it might not work directly for the downstream forks, since places who have log collection on by default _will_ need to update the CM to toggle the switch on by default.
/area monitoring
/kind good-first-issue
/kind process
/cc @mdemirhan @yanweiguo @mattmoor @julz
|
1.0
|
Add a new toggle to config observability to trigger log collection. - Currently the log collection is driven by whether the template is _empty_.
Which kind of works, but it's not good since
- if you want to turn off the log collection for a while you actually need to delete the whole template, that seems too invasive
- we cannot match example to default, since example shows how to write a template
- even though it's commented such behavior is a bit obscure.
So I suggest we switch to an additional toggle and match example in the CM to the default value.
The drawback is that it might not work directly for the downstream forks, since places who have log collection on by default _will_ need to update the CM to toggle the switch on by default.
/area monitoring
/kind good-first-issue
/kind process
/cc @mdemirhan @yanweiguo @mattmoor @julz
|
process
|
add a new toggle to config observability to trigger log collection currently the log collection is driven by whether the template is empty which kind of works but it s not good since if you want to turn off the log collection for a while you actually need to delete the whole template that seems too invasive we cannot match example to default since example shows how to write a template even though it s commented such behavior is a bit obscure so i suggest we switch to an additional toggle and match example in the cm to the default value the drawback is that it might not work directly for the downstream forks since places who have log collection on by default will need to update the cm to toggle the switch on by default area monitoring kind good first issue kind process cc mdemirhan yanweiguo mattmoor julz
| 1
|
47,639
| 5,906,571,126
|
IssuesEvent
|
2017-05-19 15:28:19
|
alexrj/Slic3r
|
https://api.github.com/repos/alexrj/Slic3r
|
closed
|
1.2.9 command line can't find paths with non ASCII characters
|
Needs testing with current dev version or next release OS: Windows Verified bug
|
Hi, I've came across this Issue:
When running the following command to slice a RepetierHost composition:
`slic3r.exe --load "C:\Users\Andrés\AppData\Local\RepetierHost\slic3r.ini" --print-center 93,100 -o "C:\Users\Andrés\AppData\Local\RepetierHost\composition.gcode" "C:\Users\Andrés\AppData\Local\RepetierHost\composition.amf"`
Output is:
`Cannot find specified configuration file (C:\Users\AndrÚs\AppData\Local\RepetierHost\slic3r.ini).`
It looks like the accented "é" character is not supported by Slic3r. This did not happen in the previous stable version.
|
1.0
|
1.2.9 command line can't find paths with non ASCII characters - Hi, I've came across this Issue:
When running the following command to slice a RepetierHost composition:
`slic3r.exe --load "C:\Users\Andrés\AppData\Local\RepetierHost\slic3r.ini" --print-center 93,100 -o "C:\Users\Andrés\AppData\Local\RepetierHost\composition.gcode" "C:\Users\Andrés\AppData\Local\RepetierHost\composition.amf"`
Output is:
`Cannot find specified configuration file (C:\Users\AndrÚs\AppData\Local\RepetierHost\slic3r.ini).`
It looks like the accented "é" character is not supported by Slic3r. This did not happen in the previous stable version.
|
non_process
|
command line can t find paths with non ascii characters hi i ve came across this issue when running the following command to slice a repetierhost composition exe load c users andrés appdata local repetierhost ini print center o c users andrés appdata local repetierhost composition gcode c users andrés appdata local repetierhost composition amf output is cannot find specified configuration file c users andrús appdata local repetierhost ini it looks like the accented é character is not supported by this did not happen in the previous stable version
| 0
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.