Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
22,088
| 30,611,160,890
|
IssuesEvent
|
2023-07-23 16:20:56
|
danrleypereira/verzel-pleno-prova
|
https://api.github.com/repos/danrleypereira/verzel-pleno-prova
|
closed
|
Implementar o Redux Thunk para lidar com as requisições de veículos e autenticação
|
enhancement Processo Seletivo
|
Objetivos:
Gerenciar o currentPage com um número pageSize de veículos, além de manter o veículo selecionado por último.
Utilizar o Redux Thunk para separar as responsabilidades e isolar os componentes das requisições, permitindo que se concentrem no uso de estados para gerenciar a interface.
Fluxo de trabalho:
No componente "Cars", a thunk será disparada no useEffect para fazer a requisição à API de veículos.
Em caso de sucesso, a thunk irá disparar as actions correspondentes para salvar os veículos e os dados de paginação no Redux store.
Estado atual da aplicação:
Atualmente, a API é requisitada através de uma função de serviço que salva os veículos e os dados de paginação em um state local do componente.
Benefícios:
Melhorar a organização e separação de responsabilidades no código, tornando-o mais fácil de manter e entender.
Permitir a reutilização dos estados de veículos e paginação em diferentes componentes.
Facilitar a integração de futuras funcionalidades, como autenticação, através do uso do Redux Thunk.
No Futuro:
A thunk será responsável por manter os dados de requisição autenticada com token e controlar o acesso a recursos específicos com base na autenticação do usuário.
|
1.0
|
Implementar o Redux Thunk para lidar com as requisições de veículos e autenticação - Objetivos:
Gerenciar o currentPage com um número pageSize de veículos, além de manter o veículo selecionado por último.
Utilizar o Redux Thunk para separar as responsabilidades e isolar os componentes das requisições, permitindo que se concentrem no uso de estados para gerenciar a interface.
Fluxo de trabalho:
No componente "Cars", a thunk será disparada no useEffect para fazer a requisição à API de veículos.
Em caso de sucesso, a thunk irá disparar as actions correspondentes para salvar os veículos e os dados de paginação no Redux store.
Estado atual da aplicação:
Atualmente, a API é requisitada através de uma função de serviço que salva os veículos e os dados de paginação em um state local do componente.
Benefícios:
Melhorar a organização e separação de responsabilidades no código, tornando-o mais fácil de manter e entender.
Permitir a reutilização dos estados de veículos e paginação em diferentes componentes.
Facilitar a integração de futuras funcionalidades, como autenticação, através do uso do Redux Thunk.
No Futuro:
A thunk será responsável por manter os dados de requisição autenticada com token e controlar o acesso a recursos específicos com base na autenticação do usuário.
|
process
|
implementar o redux thunk para lidar com as requisições de veículos e autenticação objetivos gerenciar o currentpage com um número pagesize de veículos além de manter o veículo selecionado por último utilizar o redux thunk para separar as responsabilidades e isolar os componentes das requisições permitindo que se concentrem no uso de estados para gerenciar a interface fluxo de trabalho no componente cars a thunk será disparada no useeffect para fazer a requisição à api de veículos em caso de sucesso a thunk irá disparar as actions correspondentes para salvar os veículos e os dados de paginação no redux store estado atual da aplicação atualmente a api é requisitada através de uma função de serviço que salva os veículos e os dados de paginação em um state local do componente benefícios melhorar a organização e separação de responsabilidades no código tornando o mais fácil de manter e entender permitir a reutilização dos estados de veículos e paginação em diferentes componentes facilitar a integração de futuras funcionalidades como autenticação através do uso do redux thunk no futuro a thunk será responsável por manter os dados de requisição autenticada com token e controlar o acesso a recursos específicos com base na autenticação do usuário
| 1
|
10,424
| 13,216,687,215
|
IssuesEvent
|
2020-08-17 04:40:39
|
symfony/symfony
|
https://api.github.com/repos/symfony/symfony
|
closed
|
Extremely slow performance of Process component with iterator as a stdin data source
|
Performance Process
|
| Q | A
| ---------------- | -----
| Bug report? | yes
| Feature request? | no
| BC Break report? | no
| RFC? | no
| Symfony version | 3.3.10
<!--
- Please fill in this template according to your issue.
- For support request or how-tos, visit https://symfony.com/support
- Otherwise, replace this comment by the description of your issue.
-->
When `Symfony\Component\Process\Process` gets iterator as a data source for the process stdin - the transfer performance is unacceptably low: it's ~1KB/s.
The trivial reproduction code would look like
```php
$data = new \SplFileObject('/var/log/syslog');
$process = new Process(
['dd', 'of=out.file', 'status=none'],
__DIR__,
[],
$data
);
$process->run();
```
I also created a repository with a complete example that also reads from the syslog and writes to the same directory `out.file` file.
***Update***:
And I also found that even when passed a string it still is terribly slow (~1MB/s):
```php
$data = file_get_contents(__DIR__ . '/in.bin');
$process = new Process(
['dd', 'of=out.file', 'status=none'],
__DIR__,
[],
$data
);
$process->run();
```
meanwhile, the same implemented with `proc_open` is instant
```php
$descriptors = [
0 => ['file', __DIR__ . '/in.bin', 'r'],
1 => ['pipe', 'w'],
2 => ['pipe', 'w'],
];
$cwd = __DIR__;
$process = proc_open('dd of=out.file', $descriptors, $pipes, $cwd);
if (is_resource($process)) {
$return_value = proc_close($process);
echo 'done';
}
```
|
1.0
|
Extremely slow performance of Process component with iterator as a stdin data source - | Q | A
| ---------------- | -----
| Bug report? | yes
| Feature request? | no
| BC Break report? | no
| RFC? | no
| Symfony version | 3.3.10
<!--
- Please fill in this template according to your issue.
- For support request or how-tos, visit https://symfony.com/support
- Otherwise, replace this comment by the description of your issue.
-->
When `Symfony\Component\Process\Process` gets iterator as a data source for the process stdin - the transfer performance is unacceptably low: it's ~1KB/s.
The trivial reproduction code would look like
```php
$data = new \SplFileObject('/var/log/syslog');
$process = new Process(
['dd', 'of=out.file', 'status=none'],
__DIR__,
[],
$data
);
$process->run();
```
I also created a repository with a complete example that also reads from the syslog and writes to the same directory `out.file` file.
***Update***:
And I also found that even when passed a string it still is terribly slow (~1MB/s):
```php
$data = file_get_contents(__DIR__ . '/in.bin');
$process = new Process(
['dd', 'of=out.file', 'status=none'],
__DIR__,
[],
$data
);
$process->run();
```
meanwhile, the same implemented with `proc_open` is instant
```php
$descriptors = [
0 => ['file', __DIR__ . '/in.bin', 'r'],
1 => ['pipe', 'w'],
2 => ['pipe', 'w'],
];
$cwd = __DIR__;
$process = proc_open('dd of=out.file', $descriptors, $pipes, $cwd);
if (is_resource($process)) {
$return_value = proc_close($process);
echo 'done';
}
```
|
process
|
extremely slow performance of process component with iterator as a stdin data source q a bug report yes feature request no bc break report no rfc no symfony version please fill in this template according to your issue for support request or how tos visit otherwise replace this comment by the description of your issue when symfony component process process gets iterator as a data source for the process stdin the transfer performance is unacceptably low it s s the trivial reproduction code would look like php data new splfileobject var log syslog process new process dir data process run i also created a repository with a complete example that also reads from the syslog and writes to the same directory out file file update and i also found that even when passed a string it still is terribly slow s php data file get contents dir in bin process new process dir data process run meanwhile the same implemented with proc open is instant php descriptors cwd dir process proc open dd of out file descriptors pipes cwd if is resource process return value proc close process echo done
| 1
|
239,139
| 7,787,048,320
|
IssuesEvent
|
2018-06-06 20:59:33
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
reopened
|
www.golightstream.com - Unable to select streaming category
|
Re-triaged browser-firefox browser-firefox-nightly-61.0a1 priority-normal
|
<!-- @browser: Firefox Nightly 61.0a1 (2018-04-25) -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:59.0) Gecko/20100101 Firefox/59.0 -->
<!-- @reported_with: web -->
**URL**: https://www.golightstream.com/
**Browser / Version**: Firefox Nightly 61.0a1 (2018-04-25)
**Operating System**: Windows 10 Pro
**Tested Another Browser**: Yes
**Problem type**: Design is broken
**Description**: Unable to select streaming category
**Prerequisites**
1. Google account available.
**Steps to Reproduce**
1. Navigate to https://www.golightstream.com/
2. Click "Start Streaming" button.
3. Select an option (e.g."Lightstream Studio").
4. Select an option where you would like to stream (e.g. ""You Tube).
5. Sign in with valid credentials and allow access.
6. When the category selection page is displayed, click a category (e.g "Gaming").
7. Observe behavior.
**Expected Behavior:**
Category is selected and redirect to user page is performed.
**Actual Behavior:**
Category can't be selected.
**Note:**
1. Reproducible on Firefox 59.0.2 Release.
2. Not reproducible on Chrome 66.0.3359.117.
3. Screenshot attached.
4. Affected area:
Console:
```js
ReferenceError: event is not defined[Learn More] app.2c5f4644034df87ace50.js:1:81784
```
[](https://webcompat.com/uploads/2018/4/8e789e84-8b4d-42ad-8c30-217be9f10fad.jpg)

_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.golightstream.com - Unable to select streaming category - <!-- @browser: Firefox Nightly 61.0a1 (2018-04-25) -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:59.0) Gecko/20100101 Firefox/59.0 -->
<!-- @reported_with: web -->
**URL**: https://www.golightstream.com/
**Browser / Version**: Firefox Nightly 61.0a1 (2018-04-25)
**Operating System**: Windows 10 Pro
**Tested Another Browser**: Yes
**Problem type**: Design is broken
**Description**: Unable to select streaming category
**Prerequisites**
1. Google account available.
**Steps to Reproduce**
1. Navigate to https://www.golightstream.com/
2. Click "Start Streaming" button.
3. Select an option (e.g."Lightstream Studio").
4. Select an option where you would like to stream (e.g. ""You Tube).
5. Sign in with valid credentials and allow access.
6. When the category selection page is displayed, click a category (e.g "Gaming").
7. Observe behavior.
**Expected Behavior:**
Category is selected and redirect to user page is performed.
**Actual Behavior:**
Category can't be selected.
**Note:**
1. Reproducible on Firefox 59.0.2 Release.
2. Not reproducible on Chrome 66.0.3359.117.
3. Screenshot attached.
4. Affected area:
Console:
```js
ReferenceError: event is not defined[Learn More] app.2c5f4644034df87ace50.js:1:81784
```
[](https://webcompat.com/uploads/2018/4/8e789e84-8b4d-42ad-8c30-217be9f10fad.jpg)

_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
unable to select streaming category url browser version firefox nightly operating system windows pro tested another browser yes problem type design is broken description unable to select streaming category prerequisites google account available steps to reproduce navigate to click start streaming button select an option e g lightstream studio select an option where you would like to stream e g you tube sign in with valid credentials and allow access when the category selection page is displayed click a category e g gaming observe behavior expected behavior category is selected and redirect to user page is performed actual behavior category can t be selected note reproducible on firefox release not reproducible on chrome screenshot attached affected area console js referenceerror event is not defined app js from with ❤️
| 0
|
552,199
| 16,202,726,426
|
IssuesEvent
|
2021-05-05 00:18:10
|
iterative/dvc.org
|
https://api.github.com/repos/iterative/dvc.org
|
closed
|
blog: cleanup tags?
|
blog-engine discussion priority-p2 website
|
There's lots of single-use labels. Many could be removed or changed to a better one (with more usage).
Full list (crude command that assumes no blog has more than 10 labels):
```
$ grep "tags:" content/blog/* -A 10 | awk '{print $2 " " $3}' | grep '^- ' | sort -f | uniq -c
2 - Ambassador
1 - Autocomplete
1 - Azure
1 - Benchmark
3 - Best
1 - Bitbucket
1 - Blogging
1 - Book
2 - Cache
9 - CI/CD
1 - CLI
1 - Cloud
20 - CML
1 - cml-send-comment
1 - Completion
1 - Conda
2 - Conference
3 - Continuous
2 - DAGsHub
2 - Data
5 - DataOps
3 - DevOps
20 - Discord
2 - DivOps
2 - Docker
2 - Documentation
10 - DVC
1 - echo
1 - Engineering
1 - External
1 - GCP
14 - Gems
1 - Git
2 - GitHub
1 - GitLab
8 - Google
1 - gpu
1 - GPUs
3 - Hacktoberfest
23 - Heartbeat
1 - Homebrew
2 - Hyperparameters
2 - Import
1 - Machine
9 - Meetup
2 - Mentoring
1 - Metrics
1 - MinIO
1 - ML
17 - MLOps
1 - Model
1 - Monorepo
1 - New
2 - Open
1 - Optimization
1 - Performance
2 - Pipeline
5 - Pipelines
4 - Plots
1 - Podcast
1 - Productivity
1 - Project
1 - PTDC-18
2 - PyCon
1 - PyData
5 - Python
1 - PyTorch
6 - R
1 - Rclone
1 - Reddit
6 - Release
2 - Reproducibility
1 - RStats
2 - SciPy
1 - Self-hosted
1 - shtab
1 - spaCy
1 - Spell
1 - SSH
1 - Students
1 - Tab
1 - Tags
2 - Terraform
5 - Tutorial
2 - Udemy
2 - Vega
1 - Videos
1 - Volunteer
1 - YouTube
```
(That's the current output on `master`).
Obvious issues (examples):
- `DVC` should be used more or not used at all?
- `Pipeline` vs `Pipelines`
- `cml-send-comment` ? `PTDC-18`?
- `Project`, `Tab`, etc. - too broad
- `Video` and `YouTube` only used once (each)?
|
1.0
|
blog: cleanup tags? - There's lots of single-use labels. Many could be removed or changed to a better one (with more usage).
Full list (crude command that assumes no blog has more than 10 labels):
```
$ grep "tags:" content/blog/* -A 10 | awk '{print $2 " " $3}' | grep '^- ' | sort -f | uniq -c
2 - Ambassador
1 - Autocomplete
1 - Azure
1 - Benchmark
3 - Best
1 - Bitbucket
1 - Blogging
1 - Book
2 - Cache
9 - CI/CD
1 - CLI
1 - Cloud
20 - CML
1 - cml-send-comment
1 - Completion
1 - Conda
2 - Conference
3 - Continuous
2 - DAGsHub
2 - Data
5 - DataOps
3 - DevOps
20 - Discord
2 - DivOps
2 - Docker
2 - Documentation
10 - DVC
1 - echo
1 - Engineering
1 - External
1 - GCP
14 - Gems
1 - Git
2 - GitHub
1 - GitLab
8 - Google
1 - gpu
1 - GPUs
3 - Hacktoberfest
23 - Heartbeat
1 - Homebrew
2 - Hyperparameters
2 - Import
1 - Machine
9 - Meetup
2 - Mentoring
1 - Metrics
1 - MinIO
1 - ML
17 - MLOps
1 - Model
1 - Monorepo
1 - New
2 - Open
1 - Optimization
1 - Performance
2 - Pipeline
5 - Pipelines
4 - Plots
1 - Podcast
1 - Productivity
1 - Project
1 - PTDC-18
2 - PyCon
1 - PyData
5 - Python
1 - PyTorch
6 - R
1 - Rclone
1 - Reddit
6 - Release
2 - Reproducibility
1 - RStats
2 - SciPy
1 - Self-hosted
1 - shtab
1 - spaCy
1 - Spell
1 - SSH
1 - Students
1 - Tab
1 - Tags
2 - Terraform
5 - Tutorial
2 - Udemy
2 - Vega
1 - Videos
1 - Volunteer
1 - YouTube
```
(That's the current output on `master`).
Obvious issues (examples):
- `DVC` should be used more or not used at all?
- `Pipeline` vs `Pipelines`
- `cml-send-comment` ? `PTDC-18`?
- `Project`, `Tab`, etc. - too broad
- `Video` and `YouTube` only used once (each)?
|
non_process
|
blog cleanup tags there s lots of single use labels many could be removed or changed to a better one with more usage full list crude command that assumes no blog has more than labels grep tags content blog a awk print grep sort f uniq c ambassador autocomplete azure benchmark best bitbucket blogging book cache ci cd cli cloud cml cml send comment completion conda conference continuous dagshub data dataops devops discord divops docker documentation dvc echo engineering external gcp gems git github gitlab google gpu gpus hacktoberfest heartbeat homebrew hyperparameters import machine meetup mentoring metrics minio ml mlops model monorepo new open optimization performance pipeline pipelines plots podcast productivity project ptdc pycon pydata python pytorch r rclone reddit release reproducibility rstats scipy self hosted shtab spacy spell ssh students tab tags terraform tutorial udemy vega videos volunteer youtube that s the current output on master obvious issues examples dvc should be used more or not used at all pipeline vs pipelines cml send comment ptdc project tab etc too broad video and youtube only used once each
| 0
|
21,438
| 29,478,612,129
|
IssuesEvent
|
2023-06-02 02:05:38
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Improve server e2e failure messages
|
process: tests type: chore stale
|
### What would you like?
Often, the server e2e tests fail because it's expected that a certain number of tests will fail, but either too many or too few tests fail. The error message printed is something like:
```
expected 5 to equal 4
```
This makes it difficult to tell if a test is flaky because it's unclear which tests failed or didn't fail. Also, it would be easier to debug if it was clearer which tests didn't behave as expected.
I propose changing how e2e tests are setup, so instead of just specifying how many tests should fail, we specify an array of test titles that we expect to fail.
```javascript
e2e.it('expects 2 failures', {
spec: 'has_two_failures.spec.js',
expectedFailures: [
'throws error when 1st argument is invalid',
'throws error when 2nd argument is invalid',
],
})
```
Then when one or more of the tests doesn't fail as expected, the stdout would print something like:
```
Expected the following test(s) to fail, but it/they passed:
- throws error when 1st argument is invalid
```
And when test fails that should pass, it would say something like:
```
Expected the following test(s) to pass, but it/they failed:
- finds the right element
```
### Why is this needed?
This will improve recognition of flaky tests and help debugging ones that fail.
### Other
_No response_
|
1.0
|
Improve server e2e failure messages - ### What would you like?
Often, the server e2e tests fail because it's expected that a certain number of tests will fail, but either too many or too few tests fail. The error message printed is something like:
```
expected 5 to equal 4
```
This makes it difficult to tell if a test is flaky because it's unclear which tests failed or didn't fail. Also, it would be easier to debug if it was clearer which tests didn't behave as expected.
I propose changing how e2e tests are setup, so instead of just specifying how many tests should fail, we specify an array of test titles that we expect to fail.
```javascript
e2e.it('expects 2 failures', {
spec: 'has_two_failures.spec.js',
expectedFailures: [
'throws error when 1st argument is invalid',
'throws error when 2nd argument is invalid',
],
})
```
Then when one or more of the tests doesn't fail as expected, the stdout would print something like:
```
Expected the following test(s) to fail, but it/they passed:
- throws error when 1st argument is invalid
```
And when test fails that should pass, it would say something like:
```
Expected the following test(s) to pass, but it/they failed:
- finds the right element
```
### Why is this needed?
This will improve recognition of flaky tests and help debugging ones that fail.
### Other
_No response_
|
process
|
improve server failure messages what would you like often the server tests fail because it s expected that a certain number of tests will fail but either too many or too few tests fail the error message printed is something like expected to equal this makes it difficult to tell if a test is flaky because it s unclear which tests failed or didn t fail also it would be easier to debug if it was clearer which tests didn t behave as expected i propose changing how tests are setup so instead of just specifying how many tests should fail we specify an array of test titles that we expect to fail javascript it expects failures spec has two failures spec js expectedfailures throws error when argument is invalid throws error when argument is invalid then when one or more of the tests doesn t fail as expected the stdout would print something like expected the following test s to fail but it they passed throws error when argument is invalid and when test fails that should pass it would say something like expected the following test s to pass but it they failed finds the right element why is this needed this will improve recognition of flaky tests and help debugging ones that fail other no response
| 1
|
1,390
| 3,956,071,903
|
IssuesEvent
|
2016-04-30 00:30:38
|
PlagueHO/LabBuilder
|
https://api.github.com/repos/PlagueHO/LabBuilder
|
closed
|
Error running Install-Lab
|
In Process
|
Get-VMNetworkAdapter : Hyper-V encountered an error trying to access an object on computer 'ROBERTSP4' because the object was not found. The object might have been deleted, or you might not have permission to perform the task. Verify that
the Virtual Machine Management service on the computer is running. If the service is running, try to perform the task again by using Run as Administrator.
At C:\Program Files\WindowsPowerShell\Modules\LabBuilder\0.7.4.852\LabBuilder.psm1:5128 char:38
+ $ExistingManagementAdapter = Get-VMNetworkAdapter `
+ ~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (:) [Get-VMNetworkAdapter], VirtualizationException
+ FullyQualifiedErrorId : ObjectNotFound,Microsoft.HyperV.PowerShell.Commands.GetVMNetworkAdapter
**Running as Admin and VMMS service is running**
**Running Windows 10 latest released version all windows updates done.**
Major Minor Build Revision
----- ----- ----- --------
10 0 10586 0
**PowerShell version**
Name Value
---- -----
PSVersion 5.0.10586.122
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0...}
BuildVersion 10.0.10586.122
CLRVersion 4.0.30319.42000
WSManStackVersion 3.0
PSRemotingProtocolVersion 2.3
SerializationVersion 1.1.0.1
|
1.0
|
Error running Install-Lab - Get-VMNetworkAdapter : Hyper-V encountered an error trying to access an object on computer 'ROBERTSP4' because the object was not found. The object might have been deleted, or you might not have permission to perform the task. Verify that
the Virtual Machine Management service on the computer is running. If the service is running, try to perform the task again by using Run as Administrator.
At C:\Program Files\WindowsPowerShell\Modules\LabBuilder\0.7.4.852\LabBuilder.psm1:5128 char:38
+ $ExistingManagementAdapter = Get-VMNetworkAdapter `
+ ~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (:) [Get-VMNetworkAdapter], VirtualizationException
+ FullyQualifiedErrorId : ObjectNotFound,Microsoft.HyperV.PowerShell.Commands.GetVMNetworkAdapter
**Running as Admin and VMMS service is running**
**Running Windows 10 latest released version all windows updates done.**
Major Minor Build Revision
----- ----- ----- --------
10 0 10586 0
**PowerShell version**
Name Value
---- -----
PSVersion 5.0.10586.122
PSCompatibleVersions {1.0, 2.0, 3.0, 4.0...}
BuildVersion 10.0.10586.122
CLRVersion 4.0.30319.42000
WSManStackVersion 3.0
PSRemotingProtocolVersion 2.3
SerializationVersion 1.1.0.1
|
process
|
error running install lab get vmnetworkadapter hyper v encountered an error trying to access an object on computer because the object was not found the object might have been deleted or you might not have permission to perform the task verify that the virtual machine management service on the computer is running if the service is running try to perform the task again by using run as administrator at c program files windowspowershell modules labbuilder labbuilder char existingmanagementadapter get vmnetworkadapter categoryinfo objectnotfound virtualizationexception fullyqualifiederrorid objectnotfound microsoft hyperv powershell commands getvmnetworkadapter running as admin and vmms service is running running windows latest released version all windows updates done major minor build revision powershell version name value psversion pscompatibleversions buildversion clrversion wsmanstackversion psremotingprotocolversion serializationversion
| 1
|
17,457
| 23,277,724,000
|
IssuesEvent
|
2022-08-05 08:54:29
|
ydb-platform/ydb
|
https://api.github.com/repos/ydb-platform/ydb
|
closed
|
Sometimes executeQuery throws query not found error
|
area/sdk area/queryprocessor
|
Hello. I'm using serverless YDB.
I prepared this query:
```
DECLARE $items AS List<Struct<id: String, order_id: String, product_id: String, quantity: Uint32>>;
DECLARE $order_id AS String;
DECLARE $user_id AS String;
UPSERT INTO order_items(id, order_id, product_id, quantity)
SELECT id, order_id, product_id, quantity FROM AS_TABLE($items);
$table = (
SELECT $order_id, false, false, $user_id, SUM(order_item.quantity * product.price)
FROM AS_TABLE($items) AS order_item
INNER JOIN products AS product
ON (order_item.product_id==product.id)
);
UPSERT INTO orders(id, hasPaid, isCompleted, user_id, price)
SELECT column0, column1, column2, column3, column4 FROM $table;
SELECT column4 FROM $table;
```
and saved it in memory.
When I call it using session.executeQuery, the method sometimes throws:
NotFound [Error]: NotFound: [
{
"message": "Query not found: 23e79871-3b42b18f-8a7dde08-5099c5b8",
"severity": 1
}
]
<details>
<summary>
Node.JS code:
</summary>
const result = await this.client.withSessionRetry(async (session) => {
return await session.executeQuery(
await this.queries.insertOrder(session),
this.queries.createInsertOrderParams(params.products, context.userID, id)
)
})
insertOrder() {
if (this._insertOrder) {
return this._insertOrder
}
// language=SQL
this._insertOrder = await session.prepareQuery(`
DECLARE $items AS List<Struct<id: String, order_id: String, product_id: String, quantity: Uint32>>;
DECLARE $order_id AS String;
DECLARE $user_id AS String;
UPSERT INTO order_items(id, order_id, product_id, quantity)
SELECT id, order_id, product_id, quantity FROM AS_TABLE($items);
$table = (
SELECT $order_id, false, false, $user_id, SUM(order_item.quantity * product.price)
FROM AS_TABLE($items) AS order_item
INNER JOIN products AS product
ON (order_item.product_id==product.id)
);
UPSERT INTO orders(id, hasPaid, isCompleted, user_id, price)
SELECT column0, column1, column2, column3, column4 FROM $table;
SELECT column4 FROM $table;
`)
return this._insertOrder
}
</details>
How to fix this?
|
1.0
|
Sometimes executeQuery throws query not found error - Hello. I'm using serverless YDB.
I prepared this query:
```
DECLARE $items AS List<Struct<id: String, order_id: String, product_id: String, quantity: Uint32>>;
DECLARE $order_id AS String;
DECLARE $user_id AS String;
UPSERT INTO order_items(id, order_id, product_id, quantity)
SELECT id, order_id, product_id, quantity FROM AS_TABLE($items);
$table = (
SELECT $order_id, false, false, $user_id, SUM(order_item.quantity * product.price)
FROM AS_TABLE($items) AS order_item
INNER JOIN products AS product
ON (order_item.product_id==product.id)
);
UPSERT INTO orders(id, hasPaid, isCompleted, user_id, price)
SELECT column0, column1, column2, column3, column4 FROM $table;
SELECT column4 FROM $table;
```
and saved it in memory.
When I call it using session.executeQuery, the method sometimes throws:
NotFound [Error]: NotFound: [
{
"message": "Query not found: 23e79871-3b42b18f-8a7dde08-5099c5b8",
"severity": 1
}
]
<details>
<summary>
Node.JS code:
</summary>
const result = await this.client.withSessionRetry(async (session) => {
return await session.executeQuery(
await this.queries.insertOrder(session),
this.queries.createInsertOrderParams(params.products, context.userID, id)
)
})
insertOrder() {
if (this._insertOrder) {
return this._insertOrder
}
// language=SQL
this._insertOrder = await session.prepareQuery(`
DECLARE $items AS List<Struct<id: String, order_id: String, product_id: String, quantity: Uint32>>;
DECLARE $order_id AS String;
DECLARE $user_id AS String;
UPSERT INTO order_items(id, order_id, product_id, quantity)
SELECT id, order_id, product_id, quantity FROM AS_TABLE($items);
$table = (
SELECT $order_id, false, false, $user_id, SUM(order_item.quantity * product.price)
FROM AS_TABLE($items) AS order_item
INNER JOIN products AS product
ON (order_item.product_id==product.id)
);
UPSERT INTO orders(id, hasPaid, isCompleted, user_id, price)
SELECT column0, column1, column2, column3, column4 FROM $table;
SELECT column4 FROM $table;
`)
return this._insertOrder
}
</details>
How to fix this?
|
process
|
sometimes executequery throws query not found error hello i m using serverless ydb i prepared this query declare items as list declare order id as string declare user id as string upsert into order items id order id product id quantity select id order id product id quantity from as table items table select order id false false user id sum order item quantity product price from as table items as order item inner join products as product on order item product id product id upsert into orders id haspaid iscompleted user id price select from table select from table and saved it in memory when i call it using session executequery the method sometimes throws notfound notfound message query not found severity node js code const result await this client withsessionretry async session return await session executequery await this queries insertorder session this queries createinsertorderparams params products context userid id insertorder if this insertorder return this insertorder language sql this insertorder await session preparequery declare items as list declare order id as string declare user id as string upsert into order items id order id product id quantity select id order id product id quantity from as table items table select order id false false user id sum order item quantity product price from as table items as order item inner join products as product on order item product id product id upsert into orders id haspaid iscompleted user id price select from table select from table return this insertorder how to fix this
| 1
|
337,755
| 24,555,990,965
|
IssuesEvent
|
2022-10-12 15:54:35
|
ssec/polar2grid
|
https://api.github.com/repos/ssec/polar2grid
|
closed
|
Binary writer argument list incorrectly includes `--fill-value` option
|
documentation
|
The CSPP Polar2Grid Documentation includes `--fill-value` which was probably taken from the geotiff writer argument list, since the description is "Instead of an alpha channel fill invalid values with this value. Turns LA or RGBA images in to L or RGB images respectively."
|
1.0
|
Binary writer argument list incorrectly includes `--fill-value` option - The CSPP Polar2Grid Documentation includes `--fill-value` which was probably taken from the geotiff writer argument list, since the description is "Instead of an alpha channel fill invalid values with this value. Turns LA or RGBA images in to L or RGB images respectively."
|
non_process
|
binary writer argument list incorrectly includes fill value option the cspp documentation includes fill value which was probably taken from the geotiff writer argument list since the description is instead of an alpha channel fill invalid values with this value turns la or rgba images in to l or rgb images respectively
| 0
|
16,890
| 22,192,912,350
|
IssuesEvent
|
2022-06-07 02:16:56
|
kserve/kserve
|
https://api.github.com/repos/kserve/kserve
|
closed
|
Add packaged helm chart to release assets
|
kind/feature kserve/release-process
|
/kind feature
**Describe the solution you'd like**
It would be great to be able to deploy the helm chart (in my case using terraform) without having to clone this repo.
**Anything else you would like to add:**
Example Github Actions workflow to add the packaged helm chart to the release assets (which url can then be passed to helm similar to `kubectl -f`): https://github.com/ray-project/kuberay/pull/199/files
With the current folder structure going `cd manifests && helm package charts` would result in `charts-v0.8.0.tgz` and then this action would upload `helm-chart-charts-v0.8.0.tgz`. Might be worth adding one more level of nesting, i.e. `manifests/charts/kserve` and then going `cd manifests/charts && helm package kserve` resulting in `helm-chart-kserve-v0.8.0.tgz` under the release assets
|
1.0
|
Add packaged helm chart to release assets - /kind feature
**Describe the solution you'd like**
It would be great to be able to deploy the helm chart (in my case using terraform) without having to clone this repo.
**Anything else you would like to add:**
Example Github Actions workflow to add the packaged helm chart to the release assets (which url can then be passed to helm similar to `kubectl -f`): https://github.com/ray-project/kuberay/pull/199/files
With the current folder structure going `cd manifests && helm package charts` would result in `charts-v0.8.0.tgz` and then this action would upload `helm-chart-charts-v0.8.0.tgz`. Might be worth adding one more level of nesting, i.e. `manifests/charts/kserve` and then going `cd manifests/charts && helm package kserve` resulting in `helm-chart-kserve-v0.8.0.tgz` under the release assets
|
process
|
add packaged helm chart to release assets kind feature describe the solution you d like it would be great to be able to deploy the helm chart in my case using terraform without having to clone this repo anything else you would like to add example github actions workflow to add the packaged helm chart to the release assets which url can then be passed to helm similar to kubectl f with the current folder structure going cd manifests helm package charts would result in charts tgz and then this action would upload helm chart charts tgz might be worth adding one more level of nesting i e manifests charts kserve and then going cd manifests charts helm package kserve resulting in helm chart kserve tgz under the release assets
| 1
|
22,303
| 30,857,545,291
|
IssuesEvent
|
2023-08-02 22:09:11
|
NCAR/ucomp-pipeline
|
https://api.github.com/repos/NCAR/ucomp-pipeline
|
opened
|
Find out why no certain products for 2021-09-22
|
process
|
@bberkeyU:
> Why is there no L2 like products doppler, line width, or polarizations in 2021-09-22.
|
1.0
|
Find out why no certain products for 2021-09-22 - @bberkeyU:
> Why is there no L2 like products doppler, line width, or polarizations in 2021-09-22.
|
process
|
find out why no certain products for bberkeyu why is there no like products doppler line width or polarizations in
| 1
|
12,024
| 14,738,524,453
|
IssuesEvent
|
2021-01-07 05:00:49
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Chatt - Stage Fees - Wrong dates
|
anc-ops anc-process anc-ui anp-0.5 ant-bug ant-support
|
In GitLab by @kdjstudios on Jun 8, 2018, 09:22
**Submitted by:** "Trawana Ervin" <trawana.ervin@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-06-08-15834/conversation
**Server:** Internal
**Client/Site:** Chattanoga
**Account:** Multiple
**Issue:**
Stage fees were applied to the following accounts in February and have not posted to the balance. When the fees are reentered the following message displays at the top.
Fri Jun 08 2018 10:38:09 * Staged fees and payments will be applied in 03/18/2018 Master billing cycle.
Clinical Enterprises (4523)
L.S. Riggens MD PC (B4667)
Farmer, Price, Hornsby, Weather (T2242)
Please let me know if additional information is needed.
|
1.0
|
Chatt - Stage Fees - Wrong dates - In GitLab by @kdjstudios on Jun 8, 2018, 09:22
**Submitted by:** "Trawana Ervin" <trawana.ervin@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-06-08-15834/conversation
**Server:** Internal
**Client/Site:** Chattanoga
**Account:** Multiple
**Issue:**
Stage fees were applied to the following accounts in February and have not posted to the balance. When the fees are reentered the following message displays at the top.
Fri Jun 08 2018 10:38:09 * Staged fees and payments will be applied in 03/18/2018 Master billing cycle.
Clinical Enterprises (4523)
L.S. Riggens MD PC (B4667)
Farmer, Price, Hornsby, Weather (T2242)
Please let me know if additional information is needed.
|
process
|
chatt stage fees wrong dates in gitlab by kdjstudios on jun submitted by trawana ervin helpdesk server internal client site chattanoga account multiple issue stage fees were applied to the following accounts in february and have not posted to the balance when the fees are reentered the following message displays at the top fri jun staged fees and payments will be applied in master billing cycle clinical enterprises l s riggens md pc farmer price hornsby weather please let me know if additional information is needed
| 1
|
39,390
| 9,422,952,236
|
IssuesEvent
|
2019-04-11 10:35:02
|
zotonic/zotonic
|
https://api.github.com/repos/zotonic/zotonic
|
closed
|
Page block new dialog
|
admin-ui defect
|
When you add a page block "page" the new modal does not show the link button in the 'new-find-results'. I've added a screenshot:

|
1.0
|
Page block new dialog - When you add a page block "page" the new modal does not show the link button in the 'new-find-results'. I've added a screenshot:

|
non_process
|
page block new dialog when you add a page block page the new modal does not show the link button in the new find results i ve added a screenshot
| 0
|
9,523
| 12,499,710,251
|
IssuesEvent
|
2020-06-01 20:41:22
|
googleapis/google-cloud-cpp
|
https://api.github.com/repos/googleapis/google-cloud-cpp
|
closed
|
Archive the old repo
|
api: spanner type: process
|
Archive the repository, with a prominent note in the README explaining where the development will continue.
|
1.0
|
Archive the old repo - Archive the repository, with a prominent note in the README explaining where the development will continue.
|
process
|
archive the old repo archive the repository with a prominent note in the readme explaining where the development will continue
| 1
|
722,584
| 24,868,037,979
|
IssuesEvent
|
2022-10-27 13:24:35
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.ulta.com - Pick up in store does not trigger selections
|
browser-firefox priority-normal severity-critical action-needssitepatch engine-gecko diagnosis-priority-p1
|
<!-- @browser: Firefox 105.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:105.0) Gecko/20100101 Firefox/105.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/111711 -->
**URL**: https://www.ulta.com/
**Browser / Version**: Firefox 105.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Edge
**Problem type**: Site is not usable
**Description**: Buttons or links not working
**Steps to Reproduce**:
Certain items just won't work. For example I am in the shopping cart page. I can click the button for in store pick up but the page just flashes but won't change. I click the button to load more items on a product page, it turns to the hand symbol but nothing happens. I can not add or remove items from my cart on the check out page. All issues appear to be on the check out page. It works fine in other browsers so it's not the website and I've disabled any add ons that might be interfering with it. Nothing works.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
2.0
|
www.ulta.com - Pick up in store does not trigger selections - <!-- @browser: Firefox 105.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:105.0) Gecko/20100101 Firefox/105.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/111711 -->
**URL**: https://www.ulta.com/
**Browser / Version**: Firefox 105.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Edge
**Problem type**: Site is not usable
**Description**: Buttons or links not working
**Steps to Reproduce**:
Certain items just won't work. For example I am in the shopping cart page. I can click the button for in store pick up but the page just flashes but won't change. I click the button to load more items on a product page, it turns to the hand symbol but nothing happens. I can not add or remove items from my cart on the check out page. All issues appear to be on the check out page. It works fine in other browsers so it's not the website and I've disabled any add ons that might be interfering with it. Nothing works.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
pick up in store does not trigger selections url browser version firefox operating system windows tested another browser yes edge problem type site is not usable description buttons or links not working steps to reproduce certain items just won t work for example i am in the shopping cart page i can click the button for in store pick up but the page just flashes but won t change i click the button to load more items on a product page it turns to the hand symbol but nothing happens i can not add or remove items from my cart on the check out page all issues appear to be on the check out page it works fine in other browsers so it s not the website and i ve disabled any add ons that might be interfering with it nothing works browser configuration none from with ❤️
| 0
|
7,491
| 10,579,854,128
|
IssuesEvent
|
2019-10-08 04:26:45
|
MShooshtari/Kaggle_House_Prices
|
https://api.github.com/repos/MShooshtari/Kaggle_House_Prices
|
opened
|
Test set pre-process steps.
|
pre_process
|
I was always curious, what kind of pre-processing steps should we apply on the test set?
- Null detection?
- Data in range?
- Outlier detection? (probably not)
How to normalize the data to be consistent with the train set?
Anything else?
|
1.0
|
Test set pre-process steps. - I was always curious, what kind of pre-processing steps should we apply on the test set?
- Null detection?
- Data in range?
- Outlier detection? (probably not)
How to normalize the data to be consistent with the train set?
Anything else?
|
process
|
test set pre process steps i was always curious what kind of pre processing steps should we apply on the test set null detection data in range outlier detection probably not how to normalize the data to be consistent with the train set anything else
| 1
|
15,502
| 19,703,263,528
|
IssuesEvent
|
2022-01-12 18:52:06
|
googleapis/java-api-gateway
|
https://api.github.com/repos/googleapis/java-api-gateway
|
opened
|
Your .repo-metadata.json file has a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'api-gateway' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'api-gateway' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 release level must be equal to one of the allowed values in repo metadata json api shortname api gateway invalid in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
| 1
|
4,036
| 6,971,790,533
|
IssuesEvent
|
2017-12-11 15:07:12
|
ontop/ontop
|
https://api.github.com/repos/ontop/ontop
|
closed
|
Ontop native mapping parser: generate sources with Maven
|
status: accepted topic: mapping processing type: enhancement
|
Ontop native mapping parser: source code generation (with antlr) needs to be performed manually.
Suggestion: use the antlr Maven plugin (http://www.antlr.org/api/maven-plugin/latest/) to do it during the Maven "generate-sources" phase instead.
|
1.0
|
Ontop native mapping parser: generate sources with Maven - Ontop native mapping parser: source code generation (with antlr) needs to be performed manually.
Suggestion: use the antlr Maven plugin (http://www.antlr.org/api/maven-plugin/latest/) to do it during the Maven "generate-sources" phase instead.
|
process
|
ontop native mapping parser generate sources with maven ontop native mapping parser source code generation with antlr needs to be performed manually suggestion use the antlr maven plugin to do it during the maven generate sources phase instead
| 1
|
205,431
| 15,977,160,713
|
IssuesEvent
|
2021-04-17 03:30:00
|
serverless-stack/serverless-stack
|
https://api.github.com/repos/serverless-stack/serverless-stack
|
opened
|
Export `api.url` in all docs and examples
|
documentation
|
As the subject suggests, update docs and examples for `Api`, `ApiGatewayV1Api`, `ApolloApi` and `AppSyncApi`:
```
this.addOutputs({
Url: api.url
});
```
|
1.0
|
Export `api.url` in all docs and examples - As the subject suggests, update docs and examples for `Api`, `ApiGatewayV1Api`, `ApolloApi` and `AppSyncApi`:
```
this.addOutputs({
Url: api.url
});
```
|
non_process
|
export api url in all docs and examples as the subject suggests update docs and examples for api apolloapi and appsyncapi this addoutputs url api url
| 0
|
14,606
| 17,703,634,240
|
IssuesEvent
|
2021-08-25 03:26:46
|
tdwg/dwc
|
https://api.github.com/repos/tdwg/dwc
|
closed
|
Change term - acceptedNameUsage
|
Term - change Class - Taxon non-normative Process - complete
|
## Term change
* Submitter: Quentin Groom
* Efficacy Justification (why is this change necessary?): To improve clarity of the term usage, particularly to distinguish the different terms that can hold a scientific Latin name
* Demand Justification (if the change is semantic in nature, name at least two organizations that independently need this term): This is largely for people and organizations publishing Darwin Core files to avoid repeated questions that keep cropping up. The issue #28 highlighted that the definitions of `scientificName`, `acceptedNameUsage `and `originalNameUsage` are all similar to one another, however, their intended usage is quite distinct, even though they are not clearly documented. The intension of this suggested change is to add to the comments of the term to help users understand the use of the terms more easily. The suggested explanations were given by @deepreef in #28, but are only preliminary.
* Stability Justification (what concerns are there that this might affect existing implementations?): The intension is that the comments would reinforce the existing definition and thus improve stability.
* Implications for dwciri: namespace (does this change affect a dwciri term version)?: No implication
Current Term definition: https://dwc.tdwg.org/list/#dwc_acceptedNameUsage
Proposed attributes of the new term:
* Usage comments (recommendations regarding content, etc., not normative): **The full scientific name, with authorship and date information if known, of the accepted (botanical) or valid (zoological) name in cases where the provided scientificName is considered by the reference indicated in the accordingTo property, or of the content provider, to be a synonym or misapplied name. When applied to an Organism or Occurrence, this term should be used in cases where a content provider regards the provided scientificName to be inconsistent with the taxonomic perspective of the content provider. For example, there are many discrepancies within specimen collections and observation datasets between the recorded name (e.g., the most recent identification from an expert who examined a specimen, or a field identification for an observed organism), and the name asserted by the content provider to be taxonomically accepted.**
|
1.0
|
Change term - acceptedNameUsage - ## Term change
* Submitter: Quentin Groom
* Efficacy Justification (why is this change necessary?): To improve clarity of the term usage, particularly to distinguish the different terms that can hold a scientific Latin name
* Demand Justification (if the change is semantic in nature, name at least two organizations that independently need this term): This is largely for people and organizations publishing Darwin Core files to avoid repeated questions that keep cropping up. The issue #28 highlighted that the definitions of `scientificName`, `acceptedNameUsage `and `originalNameUsage` are all similar to one another, however, their intended usage is quite distinct, even though they are not clearly documented. The intension of this suggested change is to add to the comments of the term to help users understand the use of the terms more easily. The suggested explanations were given by @deepreef in #28, but are only preliminary.
* Stability Justification (what concerns are there that this might affect existing implementations?): The intension is that the comments would reinforce the existing definition and thus improve stability.
* Implications for dwciri: namespace (does this change affect a dwciri term version)?: No implication
Current Term definition: https://dwc.tdwg.org/list/#dwc_acceptedNameUsage
Proposed attributes of the new term:
* Usage comments (recommendations regarding content, etc., not normative): **The full scientific name, with authorship and date information if known, of the accepted (botanical) or valid (zoological) name in cases where the provided scientificName is considered by the reference indicated in the accordingTo property, or of the content provider, to be a synonym or misapplied name. When applied to an Organism or Occurrence, this term should be used in cases where a content provider regards the provided scientificName to be inconsistent with the taxonomic perspective of the content provider. For example, there are many discrepancies within specimen collections and observation datasets between the recorded name (e.g., the most recent identification from an expert who examined a specimen, or a field identification for an observed organism), and the name asserted by the content provider to be taxonomically accepted.**
|
process
|
change term acceptednameusage term change submitter quentin groom efficacy justification why is this change necessary to improve clarity of the term usage particularly to distinguish the different terms that can hold a scientific latin name demand justification if the change is semantic in nature name at least two organizations that independently need this term this is largely for people and organizations publishing darwin core files to avoid repeated questions that keep cropping up the issue highlighted that the definitions of scientificname acceptednameusage and originalnameusage are all similar to one another however their intended usage is quite distinct even though they are not clearly documented the intension of this suggested change is to add to the comments of the term to help users understand the use of the terms more easily the suggested explanations were given by deepreef in but are only preliminary stability justification what concerns are there that this might affect existing implementations the intension is that the comments would reinforce the existing definition and thus improve stability implications for dwciri namespace does this change affect a dwciri term version no implication current term definition proposed attributes of the new term usage comments recommendations regarding content etc not normative the full scientific name with authorship and date information if known of the accepted botanical or valid zoological name in cases where the provided scientificname is considered by the reference indicated in the accordingto property or of the content provider to be a synonym or misapplied name when applied to an organism or occurrence this term should be used in cases where a content provider regards the provided scientificname to be inconsistent with the taxonomic perspective of the content provider for example there are many discrepancies within specimen collections and observation datasets between the recorded name e g the most recent identification from an expert who examined a specimen or a field identification for an observed organism and the name asserted by the content provider to be taxonomically accepted
| 1
|
14,786
| 18,062,513,133
|
IssuesEvent
|
2021-09-20 15:19:55
|
alphagov/govuk-design-system
|
https://api.github.com/repos/alphagov/govuk-design-system
|
opened
|
Hold a team retro following the v4.0.0 release
|
process refining team processes
|
<!--
This is a template for any issues that aren’t bug reports or new feature requests. The headings in this section provide examples of the information you might want to include, but feel free to add/delete sections where appropriate.
-->
## What
After we've released v4.0.0, run a retro with the team to reflect on how it went.
## Why
To celebrate the successes and identify improvements for future breaking releases.
## Who needs to know about this
Whole team
## Done when
- [ ] Retro organised
- [ ] Retro run
- [ ] Actions noted and cards created where applicable
|
2.0
|
Hold a team retro following the v4.0.0 release - <!--
This is a template for any issues that aren’t bug reports or new feature requests. The headings in this section provide examples of the information you might want to include, but feel free to add/delete sections where appropriate.
-->
## What
After we've released v4.0.0, run a retro with the team to reflect on how it went.
## Why
To celebrate the successes and identify improvements for future breaking releases.
## Who needs to know about this
Whole team
## Done when
- [ ] Retro organised
- [ ] Retro run
- [ ] Actions noted and cards created where applicable
|
process
|
hold a team retro following the release this is a template for any issues that aren’t bug reports or new feature requests the headings in this section provide examples of the information you might want to include but feel free to add delete sections where appropriate what after we ve released run a retro with the team to reflect on how it went why to celebrate the successes and identify improvements for future breaking releases who needs to know about this whole team done when retro organised retro run actions noted and cards created where applicable
| 1
|
557,655
| 16,514,675,924
|
IssuesEvent
|
2021-05-26 08:45:18
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
console.cloud.google.com - design is broken
|
browser-firefox-ios ml-needsdiagnosis-false os-ios priority-critical
|
<!-- @browser: Firefox iOS 33.1 -->
<!-- @ua_header: Mozilla/5.0 (iPhone; CPU OS 14_5_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) FxiOS/33.1 Mobile/15E148 Safari/605.1.15 -->
<!-- @reported_with: mobile-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/75049 -->
**URL**: https://console.cloud.google.com/iam-admin/iam/log?authuser=1
**Browser / Version**: Firefox iOS 33.1
**Operating System**: iOS 14.5.1
**Tested Another Browser**: Yes Other
**Problem type**: Design is broken
**Description**: Images not loaded
**Steps to Reproduce**:
Wouldn’t load screen seems fishy
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
console.cloud.google.com - design is broken - <!-- @browser: Firefox iOS 33.1 -->
<!-- @ua_header: Mozilla/5.0 (iPhone; CPU OS 14_5_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) FxiOS/33.1 Mobile/15E148 Safari/605.1.15 -->
<!-- @reported_with: mobile-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/75049 -->
**URL**: https://console.cloud.google.com/iam-admin/iam/log?authuser=1
**Browser / Version**: Firefox iOS 33.1
**Operating System**: iOS 14.5.1
**Tested Another Browser**: Yes Other
**Problem type**: Design is broken
**Description**: Images not loaded
**Steps to Reproduce**:
Wouldn’t load screen seems fishy
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
console cloud google com design is broken url browser version firefox ios operating system ios tested another browser yes other problem type design is broken description images not loaded steps to reproduce wouldn’t load screen seems fishy browser configuration none from with ❤️
| 0
|
12,399
| 14,911,187,054
|
IssuesEvent
|
2021-01-22 10:40:34
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
Response Datastore- Change the default log level to WARN
|
Bug Process: Fixed
|
Log level changed to DEBUG in PR #2833, please revert to WARN after analyzing the issue
|
1.0
|
Response Datastore- Change the default log level to WARN - Log level changed to DEBUG in PR #2833, please revert to WARN after analyzing the issue
|
process
|
response datastore change the default log level to warn log level changed to debug in pr please revert to warn after analyzing the issue
| 1
|
277,286
| 24,057,681,493
|
IssuesEvent
|
2022-09-16 18:32:54
|
Mamr96insatbug/test
|
https://api.github.com/repos/Mamr96insatbug/test
|
opened
|
Error playing
|
Forwarded-to-Test
|
# :clipboard: Bug Details
>Error playing
key | value
--|--
Reported At | 2022-09-16 18:32:44 UTC
Email | ahmed_elbashary@hotmail.com
Categories | Report a bug, Network issue
Tags | Forwarded-to-Test
App Version | 2.0.8 (6)
Session Duration | 3876
Device | arm64, iOS 15.5
Display | 390x844 (@3x)
Location | Cairo, Egypt (en)
## :point_right: [View Full Bug Report on Instabug](https://dashboard.instabug.com/applications/test123/beta/bugs/471?utm_source=github&utm_medium=integrations) :point_left:
___
# :chart_with_downwards_trend: Session Profiler
Here is what the app was doing right before the bug was reported:
Key | Value
--|--
CPU Load | 2%
Used Memory | 100.0% - 0.05/0.05 GB
Used Storage | 42.1% - 96.03/228.27 GB
Connectivity | WiFi - Simulator WiFi
Battery | 100% - unplugged
Orientation | portrait
Find all the changes that happened in the parameters mentioned above during the last 60 seconds before the bug was reported here: :point_right: **[View Full Session Profiler](https://dashboard.instabug.com/applications/test123/beta/bugs/471?show-session-profiler=true&utm_source=github&utm_medium=integrations)** :point_left:
___
# :mag_right: Logs
### User Steps
Here are the last 10 steps done by the user right before the bug was reported:
```
18:32:08 Tap in Floating Button of type IBGInvocationFloatingView in SwiftRadio.NowPlayingViewController
18:31:35 Shake in: SwiftRadio.NowPlayingViewController
18:31:35 Shake in: SwiftRadio.NowPlayingViewController
18:31:31 Tap in Floating Button of type IBGInvocationFloatingView in SwiftRadio.NowPlayingViewController
18:31:23 Top View: SwiftRadio.NowPlayingViewController
18:31:23 Tap in UIStackView in SwiftRadio.StationsViewController
17:27:32 Application: DidBecomeActive
17:27:32 Application: SceneDidActivate
17:27:32 Top View: SwiftRadio.StationsViewController
```
Find all the user steps done by the user throughout the session here: :point_right: **[View All User Steps](https://dashboard.instabug.com/applications/test123/beta/bugs/471?show-logs=user_steps&utm_source=github&utm_medium=integrations)** :point_left:
___
# :camera: Images
[](https://d38gnqwzxziyyy.cloudfront.net/attachments/bugs/18883613/e51be77b86a9fba47a26c2eb2695347e_original/27239454/2022091608323767928569.jpg?Expires=4819026773&Signature=b~SCeyrvDm~zgdOEy~G~TNF-ri~rNbsTOKA4JggfrEh4w2LZGfI21tUHAQIO4uC3askO-nk-bLD8uGIjdFHjVFGskUVn913hnLkEECT6VOYbIEwaKXMMC5Z21XyVq5BJIFcQQ2UiXwCebBLFqfICejqEtYx1t3dklXcjQL-RPWiJCQVeLAKpwqpBVDNdbqq18E5Fey5uZnPZuAHYMA6h2GfqG1EDWinhncwqbSWrEOpwyXvZi-DrYJi7z0iFRTZIhRJMfOeyA3AJq5tx5A4Kh6UTVS~2YP-YO4dB2bZLFjgIM5KLrLpNfcWtMxpXjXDmGiwU~1mS7d5DALYLNkGNIg__&Key-Pair-Id=APKAIXAG65U6UUX7JAQQ)
___
# :warning: Looking for More Details?
1. **Network Log**: we are unable to capture your network requests automatically. If you are using AFNetworking or Alamofire, [**check the details mentioned here**](https://docs.instabug.com/docs/ios-logging?utm_source=github&utm_medium=integrations#section-requests-not-appearing-in-logs).
2. **User Events**: start capturing custom User Events to send them along with each report. [**Find all the details in the docs**](https://docs.instabug.com/docs/ios-logging?utm_source=github&utm_medium=integrations).
3. **Instabug Log**: start adding Instabug logs to see them right inside each report you receive. [**Find all the details in the docs**](https://docs.instabug.com/docs/ios-logging?utm_source=github&utm_medium=integrations).
4. **Console Log**: when enabled you will see them right inside each report you receive. [**Find all the details in the docs**](https://docs.instabug.com/docs/ios-logging?utm_source=github&utm_medium=integrations).
|
1.0
|
Error playing - # :clipboard: Bug Details
>Error playing
key | value
--|--
Reported At | 2022-09-16 18:32:44 UTC
Email | ahmed_elbashary@hotmail.com
Categories | Report a bug, Network issue
Tags | Forwarded-to-Test
App Version | 2.0.8 (6)
Session Duration | 3876
Device | arm64, iOS 15.5
Display | 390x844 (@3x)
Location | Cairo, Egypt (en)
## :point_right: [View Full Bug Report on Instabug](https://dashboard.instabug.com/applications/test123/beta/bugs/471?utm_source=github&utm_medium=integrations) :point_left:
___
# :chart_with_downwards_trend: Session Profiler
Here is what the app was doing right before the bug was reported:
Key | Value
--|--
CPU Load | 2%
Used Memory | 100.0% - 0.05/0.05 GB
Used Storage | 42.1% - 96.03/228.27 GB
Connectivity | WiFi - Simulator WiFi
Battery | 100% - unplugged
Orientation | portrait
Find all the changes that happened in the parameters mentioned above during the last 60 seconds before the bug was reported here: :point_right: **[View Full Session Profiler](https://dashboard.instabug.com/applications/test123/beta/bugs/471?show-session-profiler=true&utm_source=github&utm_medium=integrations)** :point_left:
___
# :mag_right: Logs
### User Steps
Here are the last 10 steps done by the user right before the bug was reported:
```
18:32:08 Tap in Floating Button of type IBGInvocationFloatingView in SwiftRadio.NowPlayingViewController
18:31:35 Shake in: SwiftRadio.NowPlayingViewController
18:31:35 Shake in: SwiftRadio.NowPlayingViewController
18:31:31 Tap in Floating Button of type IBGInvocationFloatingView in SwiftRadio.NowPlayingViewController
18:31:23 Top View: SwiftRadio.NowPlayingViewController
18:31:23 Tap in UIStackView in SwiftRadio.StationsViewController
17:27:32 Application: DidBecomeActive
17:27:32 Application: SceneDidActivate
17:27:32 Top View: SwiftRadio.StationsViewController
```
Find all the user steps done by the user throughout the session here: :point_right: **[View All User Steps](https://dashboard.instabug.com/applications/test123/beta/bugs/471?show-logs=user_steps&utm_source=github&utm_medium=integrations)** :point_left:
___
# :camera: Images
[](https://d38gnqwzxziyyy.cloudfront.net/attachments/bugs/18883613/e51be77b86a9fba47a26c2eb2695347e_original/27239454/2022091608323767928569.jpg?Expires=4819026773&Signature=b~SCeyrvDm~zgdOEy~G~TNF-ri~rNbsTOKA4JggfrEh4w2LZGfI21tUHAQIO4uC3askO-nk-bLD8uGIjdFHjVFGskUVn913hnLkEECT6VOYbIEwaKXMMC5Z21XyVq5BJIFcQQ2UiXwCebBLFqfICejqEtYx1t3dklXcjQL-RPWiJCQVeLAKpwqpBVDNdbqq18E5Fey5uZnPZuAHYMA6h2GfqG1EDWinhncwqbSWrEOpwyXvZi-DrYJi7z0iFRTZIhRJMfOeyA3AJq5tx5A4Kh6UTVS~2YP-YO4dB2bZLFjgIM5KLrLpNfcWtMxpXjXDmGiwU~1mS7d5DALYLNkGNIg__&Key-Pair-Id=APKAIXAG65U6UUX7JAQQ)
___
# :warning: Looking for More Details?
1. **Network Log**: we are unable to capture your network requests automatically. If you are using AFNetworking or Alamofire, [**check the details mentioned here**](https://docs.instabug.com/docs/ios-logging?utm_source=github&utm_medium=integrations#section-requests-not-appearing-in-logs).
2. **User Events**: start capturing custom User Events to send them along with each report. [**Find all the details in the docs**](https://docs.instabug.com/docs/ios-logging?utm_source=github&utm_medium=integrations).
3. **Instabug Log**: start adding Instabug logs to see them right inside each report you receive. [**Find all the details in the docs**](https://docs.instabug.com/docs/ios-logging?utm_source=github&utm_medium=integrations).
4. **Console Log**: when enabled you will see them right inside each report you receive. [**Find all the details in the docs**](https://docs.instabug.com/docs/ios-logging?utm_source=github&utm_medium=integrations).
|
non_process
|
error playing clipboard bug details error playing key value reported at utc email ahmed elbashary hotmail com categories report a bug network issue tags forwarded to test app version session duration device ios display location cairo egypt en point right point left chart with downwards trend session profiler here is what the app was doing right before the bug was reported key value cpu load used memory gb used storage gb connectivity wifi simulator wifi battery unplugged orientation portrait find all the changes that happened in the parameters mentioned above during the last seconds before the bug was reported here point right point left mag right logs user steps here are the last steps done by the user right before the bug was reported tap in floating button of type ibginvocationfloatingview in swiftradio nowplayingviewcontroller shake in swiftradio nowplayingviewcontroller shake in swiftradio nowplayingviewcontroller tap in floating button of type ibginvocationfloatingview in swiftradio nowplayingviewcontroller top view swiftradio nowplayingviewcontroller tap in uistackview in swiftradio stationsviewcontroller application didbecomeactive application scenedidactivate top view swiftradio stationsviewcontroller find all the user steps done by the user throughout the session here point right point left camera images warning looking for more details network log we are unable to capture your network requests automatically if you are using afnetworking or alamofire user events start capturing custom user events to send them along with each report instabug log start adding instabug logs to see them right inside each report you receive console log when enabled you will see them right inside each report you receive
| 0
|
15,519
| 19,703,268,201
|
IssuesEvent
|
2022-01-12 18:52:24
|
googleapis/java-dialogflow-cx
|
https://api.github.com/repos/googleapis/java-dialogflow-cx
|
opened
|
Your .repo-metadata.json file has a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'dialogflow-cx' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'dialogflow-cx' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 release level must be equal to one of the allowed values in repo metadata json api shortname dialogflow cx invalid in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
| 1
|
22,640
| 31,886,708,856
|
IssuesEvent
|
2023-09-17 02:16:20
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
detachedproc 0.12 has 4 GuardDog issues
|
guarddog silent-process-execution
|
https://pypi.org/project/detachedproc
https://inspector.pypi.io/project/detachedproc
```{
"dependency": "detachedproc",
"version": "0.12",
"result": {
"issues": 4,
"errors": {},
"results": {
"silent-process-execution": [
{
"location": "detachedproc-0.12/detachedproc/__init__.py:311",
"code": " subprocess.Popen(\n self.cmd_to_execute,\n cwd=self.working_directory,\n env=os.environ.copy(),\n shell=True,\n stderr=subprocess.DEVN... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "detachedproc-0.12/detachedproc/__init__.py:325",
"code": " subprocess.Popen(\n [\"nohup\", self.tmpfile],\n shell=False,\n start_new_session=True,\n stderr=subprocess.DEVNULL,\n stdout=subprocess... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "detachedproc-0.12/detachedproc/__init__.py:647",
"code": " subprocess.run(\n self.cmd_to_execute,\n cwd=self.working_directory,\n env=os.environ.copy(),\n shell=True,\n stderr=subprocess.DEVNUL... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "detachedproc-0.12/detachedproc/__init__.py:661",
"code": " subprocess.run(\n [\"nohup\", self.tmpfile],\n shell=False,\n start_new_session=True,\n stderr=subprocess.DEVNULL,\n stdout=subprocess.D... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp00g2hspp/detachedproc"
}
}```
|
1.0
|
detachedproc 0.12 has 4 GuardDog issues - https://pypi.org/project/detachedproc
https://inspector.pypi.io/project/detachedproc
```{
"dependency": "detachedproc",
"version": "0.12",
"result": {
"issues": 4,
"errors": {},
"results": {
"silent-process-execution": [
{
"location": "detachedproc-0.12/detachedproc/__init__.py:311",
"code": " subprocess.Popen(\n self.cmd_to_execute,\n cwd=self.working_directory,\n env=os.environ.copy(),\n shell=True,\n stderr=subprocess.DEVN... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "detachedproc-0.12/detachedproc/__init__.py:325",
"code": " subprocess.Popen(\n [\"nohup\", self.tmpfile],\n shell=False,\n start_new_session=True,\n stderr=subprocess.DEVNULL,\n stdout=subprocess... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "detachedproc-0.12/detachedproc/__init__.py:647",
"code": " subprocess.run(\n self.cmd_to_execute,\n cwd=self.working_directory,\n env=os.environ.copy(),\n shell=True,\n stderr=subprocess.DEVNUL... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "detachedproc-0.12/detachedproc/__init__.py:661",
"code": " subprocess.run(\n [\"nohup\", self.tmpfile],\n shell=False,\n start_new_session=True,\n stderr=subprocess.DEVNULL,\n stdout=subprocess.D... )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp00g2hspp/detachedproc"
}
}```
|
process
|
detachedproc has guarddog issues dependency detachedproc version result issues errors results silent process execution location detachedproc detachedproc init py code subprocess popen n self cmd to execute n cwd self working directory n env os environ copy n shell true n stderr subprocess devn message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null location detachedproc detachedproc init py code subprocess popen n n shell false n start new session true n stderr subprocess devnull n stdout subprocess message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null location detachedproc detachedproc init py code subprocess run n self cmd to execute n cwd self working directory n env os environ copy n shell true n stderr subprocess devnul message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null location detachedproc detachedproc init py code subprocess run n n shell false n start new session true n stderr subprocess devnull n stdout subprocess d message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp detachedproc
| 1
|
17,185
| 22,766,896,254
|
IssuesEvent
|
2022-07-08 05:53:58
|
Tencent/tdesign-miniprogram
|
https://api.github.com/repos/Tencent/tdesign-miniprogram
|
closed
|
[tdesign-miniprogram]希望支持全局定制组件颜色
|
enhancement Stale processing
|
<!-- generated by issue-helper DO NOT REMOVE __FEATURE_REQUEST__ -->
### 这个功能解决了什么问题
各应用有自己的主色调,希望支持组件库的全局颜色覆写,给出颜色覆写的示例文档
### 你建议的方案是什么?
可参考其他组件库实现方法,https://vant-contrib.gitee.io/vant-weapp/#/theme#ding-zhi-quan-ju-zhu-ti-yang-shi
<!-- generated by issue-helper DO NOT REMOVE __FEATURE_REQUEST__ -->
|
1.0
|
[tdesign-miniprogram]希望支持全局定制组件颜色 - <!-- generated by issue-helper DO NOT REMOVE __FEATURE_REQUEST__ -->
### 这个功能解决了什么问题
各应用有自己的主色调,希望支持组件库的全局颜色覆写,给出颜色覆写的示例文档
### 你建议的方案是什么?
可参考其他组件库实现方法,https://vant-contrib.gitee.io/vant-weapp/#/theme#ding-zhi-quan-ju-zhu-ti-yang-shi
<!-- generated by issue-helper DO NOT REMOVE __FEATURE_REQUEST__ -->
|
process
|
希望支持全局定制组件颜色 这个功能解决了什么问题 各应用有自己的主色调,希望支持组件库的全局颜色覆写,给出颜色覆写的示例文档 你建议的方案是什么? 可参考其他组件库实现方法,
| 1
|
39,386
| 9,422,433,865
|
IssuesEvent
|
2019-04-11 09:21:36
|
contao/contao
|
https://api.github.com/repos/contao/contao
|
closed
|
Error when trying to copy events (copyAll-mode) from one calendar into another (empty!) calendar
|
defect
|
Contao throws an error when I'm trying to copy (copyAll-mode) **as a non admin** multiple events from one calendar into another **empty**! calendar.
https://github.com/contao/contao/blob/c0fc6315af73e8bef9d67e2f8ae952ac1ddda80b/calendar-bundle/src/Resources/contao/dca/tl_calendar_events.php#L662
When I'm escaping Line 662 everything works just fine.
Tested in contao 4.7.2 and the contao online demo.
https://we.tl/t-dEYGcNCz6Y
|
1.0
|
Error when trying to copy events (copyAll-mode) from one calendar into another (empty!) calendar - Contao throws an error when I'm trying to copy (copyAll-mode) **as a non admin** multiple events from one calendar into another **empty**! calendar.
https://github.com/contao/contao/blob/c0fc6315af73e8bef9d67e2f8ae952ac1ddda80b/calendar-bundle/src/Resources/contao/dca/tl_calendar_events.php#L662
When I'm escaping Line 662 everything works just fine.
Tested in contao 4.7.2 and the contao online demo.
https://we.tl/t-dEYGcNCz6Y
|
non_process
|
error when trying to copy events copyall mode from one calendar into another empty calendar contao throws an error when i m trying to copy copyall mode as a non admin multiple events from one calendar into another empty calendar when i m escaping line everything works just fine tested in contao and the contao online demo
| 0
|
295,591
| 22,261,132,736
|
IssuesEvent
|
2022-06-10 00:51:44
|
GoogleContainerTools/kpt
|
https://api.github.com/repos/GoogleContainerTools/kpt
|
closed
|
Update section on kpt book to author function using new Go library
|
documentation enhancement area/fn-sdk triaged
|
We should have a draft doc PR ready when kubernetes-sigs/kustomize#4319 is in flight.
We can merge kubernetes-sigs/kustomize#4319 and the doc PR at the same time.
We can port the examples from kubernetes-sigs/kustomize#4319
|
1.0
|
Update section on kpt book to author function using new Go library - We should have a draft doc PR ready when kubernetes-sigs/kustomize#4319 is in flight.
We can merge kubernetes-sigs/kustomize#4319 and the doc PR at the same time.
We can port the examples from kubernetes-sigs/kustomize#4319
|
non_process
|
update section on kpt book to author function using new go library we should have a draft doc pr ready when kubernetes sigs kustomize is in flight we can merge kubernetes sigs kustomize and the doc pr at the same time we can port the examples from kubernetes sigs kustomize
| 0
|
11,774
| 14,610,586,382
|
IssuesEvent
|
2020-12-22 00:48:51
|
googleapis/nodejs-automl
|
https://api.github.com/repos/googleapis/nodejs-automl
|
closed
|
Tests need refactoring
|
api: automl type: process
|
AutoML Create model tests have no teardown. this caused AUTOML_PROJECT_ID project to become the biggest project in the automl (83k LROs, 1.4k models, 540 datasets).
|
1.0
|
Tests need refactoring - AutoML Create model tests have no teardown. this caused AUTOML_PROJECT_ID project to become the biggest project in the automl (83k LROs, 1.4k models, 540 datasets).
|
process
|
tests need refactoring automl create model tests have no teardown this caused automl project id project to become the biggest project in the automl lros models datasets
| 1
|
1,052
| 12,530,143,652
|
IssuesEvent
|
2020-06-04 12:36:06
|
sohaibaslam/learning_site
|
https://api.github.com/repos/sohaibaslam/learning_site
|
opened
|
Broken Crawlers 04, Jun 2020
|
crawler broken/unreliable
|
1. **abcmart kr(100%)**
1. **adidas kr(62%)/tr(100%)**
1. **adler de(100%)**
1. **aldo eu(100%)**
1. **alexandermcqueen cn(100%)**
1. **americaneagle ca(100%)**
1. **ami dk(100%)/jp(100%)/uk(100%)**
1. **anthropologie (100%)/fr(100%)/uk(100%)**
1. **antonioli au(100%)/ca(100%)/cn(100%)/hk(100%)/jp(100%)/kr(100%)/mo(100%)/ru(100%)/tw(100%)/uk(100%)/us(100%)**
1. **argos uk(100%)**
1. **armandthiery fr(100%)**
1. **armaniexchange (100%)**
1. **armedangels de(100%)**
1. **asos (100%)/ae(100%)/au(100%)/ch(100%)/cn(100%)/es(99%)/hk(100%)/id(100%)/my(100%)/nl(100%)/ph(100%)/pl(100%)/ru(100%)/sa(100%)/sg(100%)/th(100%)/vn(100%)**
1. **avenue us(100%)**
1. **babyshop ae(100%)/sa(100%)**
1. **balr es(100%)/fr(100%)/nl(100%)**
1. **brands hu(100%)**
1. **burlington us(100%)**
1. **calvinklein us(100%)**
1. **central th(100%)**
1. **centrepoint ae(100%)**
1. **champion eu(100%)/fr(100%)**
1. **chloe kr(100%)**
1. **coldwatercreek us(100%)**
1. **conforama fr(100%)**
1. **converse at(100%)/au(100%)/de(100%)**
1. **cotton au(100%)**
1. **countryroad (100%)**
1. **davidjones (100%)**
1. **drmartens de(100%)/es(100%)/eu(100%)/fr(100%)/it(100%)/nl(100%)/uk(100%)/us(100%)**
1. **elcorteingles es(100%)/pt(56%)**
1. **ellos fi(100%)/no(100%)/se(100%)**
1. **exact za(100%)**
1. **footaction us(100%)**
1. **footlocker (52%)/be(100%)/de(100%)/dk(100%)/es(100%)/fr(100%)/it(100%)/lu(100%)/nl(100%)/no(100%)/se(100%)/uk(100%)**
1. **footpatrol uk(100%)**
1. **forloveandlemons de(100%)**
1. **fredperry (100%)/us(100%)**
1. **gapfactory us(100%)**
1. **goodhood uk(100%)**
1. **gymshark fi(35%)/ie(100%)**
1. **harrods (100%)**
1. **hermes at(100%)/ca(100%)/fr(75%)/it(50%)/nl(67%)/pt(94%)/us(88%)**
1. **hm hk(53%)/kw(100%)/ro(100%)/sa(100%)**
1. **hollister cn(100%)**
1. **hush uk(100%)**
1. **isetan jp(100%)**
1. **kupivip ru(100%)**
1. **laredouteapi ch(100%)/es(100%)/it(100%)/ru(100%)**
1. **lefties es(100%)/pt(100%)**
1. **lifestylestores in(100%)**
1. **liverpool mx(100%)**
1. **luigibertolli br(100%)**
1. **maccosmetics uk(100%)**
1. **maxfashion sa(100%)**
1. **michaelkors ca(100%)**
1. **miumiu de(100%)**
1. **moosejaw us(100%)**
1. **mothercare sa(100%)**
1. **mrporter (100%)**
1. **muji de(100%)**
1. **next hk(100%)/jp(100%)/kr(100%)/nl(100%)/nz(100%)/tr(100%)**
1. **oasis (100%)**
1. **parfois es(58%)/ie(53%)/ma(100%)**
1. **peterhahn de(100%)**
1. **popup br(100%)**
1. **prettysecrets in(100%)**
1. **pullandbear gt(100%)**
1. **ralphlauren gr(99%)/lu(38%)/sk(100%)**
1. **reebok ch(100%)/de(100%)**
1. **reserved ro(100%)**
1. **runnerspoint de(100%)**
1. **saksfifthavenue mo(100%)/ru(100%)/tw(100%)**
1. **sandroatjd cn(100%)**
1. **selfridges cn(100%)/de(100%)/es(100%)/ie(100%)/mo(100%)/sg(100%)**
1. **sephora us(100%)**
1. **sfera es(100%)**
1. **simons ca(100%)**
1. **snkrs eu(100%)/fr(100%)**
1. **soccer us(100%)**
1. **solebox de(100%)/uk(100%)**
1. **stefaniamode dk(100%)**
1. **stories nl(100%)**
1. **stylebop (100%)/au(100%)/ca(100%)/es(100%)/fr(100%)/hk(100%)/jp(100%)/mo(100%)**
1. **suistudio eu(100%)/uk(100%)**
1. **suitsupply at(100%)/de(100%)/es(100%)/fi(100%)/fr(100%)/it(100%)/no(100%)**
1. **superdry th(100%)**
1. **tods cn(100%)/gr(100%)/pt(100%)**
1. **tommybahama ae(91%)/au(36%)/ch(100%)/de(100%)/in(100%)/kr(100%)/mx(67%)/ph(100%)/pl(34%)/sa(100%)/se(58%)/sg(100%)/us(100%)/za(100%)**
1. **topbrands ru(100%)**
1. **topman my(100%)**
1. **valentino cn(100%)**
1. **walmart ca(100%)**
1. **warehouse (100%)/au(100%)/ca(100%)/ie(100%)/nl(100%)/se(100%)**
1. **wayfair de(100%)/uk(100%)**
1. **weekday dk(100%)/se(100%)/uk(100%)**
1. **zadigetvoltaire fr(100%)**
1. **zalando it(100%)**
1. **zalandolounge de(100%)**
1. **zalora ph(100%)/sg(89%)**
1. tommyjohn us(98%)
1. smallable ca(54%)/fr(60%)/it(97%)/nl(67%)/qa(83%)/uk(75%)
1. lastcall us(90%)
1. vip cn(89%)
1. 24sevres eu(84%)/fr(86%)/uk(84%)/us(81%)
1. diesel cn(85%)
1. leroymerlin fr(83%)
1. noon sa(81%)
1. watchshop ru(80%)
1. burberry (69%)/ae(66%)/at(58%)/au(56%)/be(73%)/bg(61%)/ca(57%)/ch(61%)/cz(50%)/de(48%)/dk(65%)/es(47%)/fi(50%)/fr(62%)/hk(47%)/hu(38%)/ie(54%)/it(65%)/jp(49%)/kr(61%)/my(51%)/nl(44%)/pl(55%)/pt(56%)/ro(59%)/ru(55%)/se(61%)/sg(56%)/si(65%)/sk(54%)/tr(50%)/tw(57%)/us(61%)
1. saksoff5th us(69%)
1. ssense ca(69%)/jp(59%)
1. universal at(64%)
1. okaidi fr(63%)
1. lncc kr(61%)
1. acne au(37%)/ch(55%)/cn(42%)/it(53%)/nl(60%)/no(56%)
1. jimmyjazz us(60%)
1. cos hu(59%)/kr(34%)
1. underarmour us(59%)
1. neimanmarcus jp(57%)
1. lululemon cn(55%)
1. nayomi ae(53%)/sa(36%)
1. sportchek ca(49%)
1. sezane fr(48%)/uk(42%)/us(42%)
1. scotchandsoda ca(47%)/us(36%)
1. camper ca(41%)/de(31%)/es(33%)/gr(34%)/it(33%)/nl(31%)/pl(33%)/uk(34%)/us(46%)
1. jelmoli ch(46%)
1. theory (44%)
1. hibbett us(43%)
1. levi es(39%)/ie(42%)
1. aloyoga us(41%)
1. navabi de(39%)/uk(39%)
1. paulsmith au(37%)
1. moncler ca(30%)/es(30%)/fr(30%)/kr(36%)/uk(30%)/us(33%)
1. undiz fr(35%)
1. paris cl(34%)
1. joseph de(33%)/eu(33%)/uk(33%)/us(33%)
1. openingceremony us(33%)
1. riachuelo br(32%)
1. gap hk(31%)
1. rivafashion qa(31%)
1. crockid ru(30%)
|
True
|
Broken Crawlers 04, Jun 2020 - 1. **abcmart kr(100%)**
1. **adidas kr(62%)/tr(100%)**
1. **adler de(100%)**
1. **aldo eu(100%)**
1. **alexandermcqueen cn(100%)**
1. **americaneagle ca(100%)**
1. **ami dk(100%)/jp(100%)/uk(100%)**
1. **anthropologie (100%)/fr(100%)/uk(100%)**
1. **antonioli au(100%)/ca(100%)/cn(100%)/hk(100%)/jp(100%)/kr(100%)/mo(100%)/ru(100%)/tw(100%)/uk(100%)/us(100%)**
1. **argos uk(100%)**
1. **armandthiery fr(100%)**
1. **armaniexchange (100%)**
1. **armedangels de(100%)**
1. **asos (100%)/ae(100%)/au(100%)/ch(100%)/cn(100%)/es(99%)/hk(100%)/id(100%)/my(100%)/nl(100%)/ph(100%)/pl(100%)/ru(100%)/sa(100%)/sg(100%)/th(100%)/vn(100%)**
1. **avenue us(100%)**
1. **babyshop ae(100%)/sa(100%)**
1. **balr es(100%)/fr(100%)/nl(100%)**
1. **brands hu(100%)**
1. **burlington us(100%)**
1. **calvinklein us(100%)**
1. **central th(100%)**
1. **centrepoint ae(100%)**
1. **champion eu(100%)/fr(100%)**
1. **chloe kr(100%)**
1. **coldwatercreek us(100%)**
1. **conforama fr(100%)**
1. **converse at(100%)/au(100%)/de(100%)**
1. **cotton au(100%)**
1. **countryroad (100%)**
1. **davidjones (100%)**
1. **drmartens de(100%)/es(100%)/eu(100%)/fr(100%)/it(100%)/nl(100%)/uk(100%)/us(100%)**
1. **elcorteingles es(100%)/pt(56%)**
1. **ellos fi(100%)/no(100%)/se(100%)**
1. **exact za(100%)**
1. **footaction us(100%)**
1. **footlocker (52%)/be(100%)/de(100%)/dk(100%)/es(100%)/fr(100%)/it(100%)/lu(100%)/nl(100%)/no(100%)/se(100%)/uk(100%)**
1. **footpatrol uk(100%)**
1. **forloveandlemons de(100%)**
1. **fredperry (100%)/us(100%)**
1. **gapfactory us(100%)**
1. **goodhood uk(100%)**
1. **gymshark fi(35%)/ie(100%)**
1. **harrods (100%)**
1. **hermes at(100%)/ca(100%)/fr(75%)/it(50%)/nl(67%)/pt(94%)/us(88%)**
1. **hm hk(53%)/kw(100%)/ro(100%)/sa(100%)**
1. **hollister cn(100%)**
1. **hush uk(100%)**
1. **isetan jp(100%)**
1. **kupivip ru(100%)**
1. **laredouteapi ch(100%)/es(100%)/it(100%)/ru(100%)**
1. **lefties es(100%)/pt(100%)**
1. **lifestylestores in(100%)**
1. **liverpool mx(100%)**
1. **luigibertolli br(100%)**
1. **maccosmetics uk(100%)**
1. **maxfashion sa(100%)**
1. **michaelkors ca(100%)**
1. **miumiu de(100%)**
1. **moosejaw us(100%)**
1. **mothercare sa(100%)**
1. **mrporter (100%)**
1. **muji de(100%)**
1. **next hk(100%)/jp(100%)/kr(100%)/nl(100%)/nz(100%)/tr(100%)**
1. **oasis (100%)**
1. **parfois es(58%)/ie(53%)/ma(100%)**
1. **peterhahn de(100%)**
1. **popup br(100%)**
1. **prettysecrets in(100%)**
1. **pullandbear gt(100%)**
1. **ralphlauren gr(99%)/lu(38%)/sk(100%)**
1. **reebok ch(100%)/de(100%)**
1. **reserved ro(100%)**
1. **runnerspoint de(100%)**
1. **saksfifthavenue mo(100%)/ru(100%)/tw(100%)**
1. **sandroatjd cn(100%)**
1. **selfridges cn(100%)/de(100%)/es(100%)/ie(100%)/mo(100%)/sg(100%)**
1. **sephora us(100%)**
1. **sfera es(100%)**
1. **simons ca(100%)**
1. **snkrs eu(100%)/fr(100%)**
1. **soccer us(100%)**
1. **solebox de(100%)/uk(100%)**
1. **stefaniamode dk(100%)**
1. **stories nl(100%)**
1. **stylebop (100%)/au(100%)/ca(100%)/es(100%)/fr(100%)/hk(100%)/jp(100%)/mo(100%)**
1. **suistudio eu(100%)/uk(100%)**
1. **suitsupply at(100%)/de(100%)/es(100%)/fi(100%)/fr(100%)/it(100%)/no(100%)**
1. **superdry th(100%)**
1. **tods cn(100%)/gr(100%)/pt(100%)**
1. **tommybahama ae(91%)/au(36%)/ch(100%)/de(100%)/in(100%)/kr(100%)/mx(67%)/ph(100%)/pl(34%)/sa(100%)/se(58%)/sg(100%)/us(100%)/za(100%)**
1. **topbrands ru(100%)**
1. **topman my(100%)**
1. **valentino cn(100%)**
1. **walmart ca(100%)**
1. **warehouse (100%)/au(100%)/ca(100%)/ie(100%)/nl(100%)/se(100%)**
1. **wayfair de(100%)/uk(100%)**
1. **weekday dk(100%)/se(100%)/uk(100%)**
1. **zadigetvoltaire fr(100%)**
1. **zalando it(100%)**
1. **zalandolounge de(100%)**
1. **zalora ph(100%)/sg(89%)**
1. tommyjohn us(98%)
1. smallable ca(54%)/fr(60%)/it(97%)/nl(67%)/qa(83%)/uk(75%)
1. lastcall us(90%)
1. vip cn(89%)
1. 24sevres eu(84%)/fr(86%)/uk(84%)/us(81%)
1. diesel cn(85%)
1. leroymerlin fr(83%)
1. noon sa(81%)
1. watchshop ru(80%)
1. burberry (69%)/ae(66%)/at(58%)/au(56%)/be(73%)/bg(61%)/ca(57%)/ch(61%)/cz(50%)/de(48%)/dk(65%)/es(47%)/fi(50%)/fr(62%)/hk(47%)/hu(38%)/ie(54%)/it(65%)/jp(49%)/kr(61%)/my(51%)/nl(44%)/pl(55%)/pt(56%)/ro(59%)/ru(55%)/se(61%)/sg(56%)/si(65%)/sk(54%)/tr(50%)/tw(57%)/us(61%)
1. saksoff5th us(69%)
1. ssense ca(69%)/jp(59%)
1. universal at(64%)
1. okaidi fr(63%)
1. lncc kr(61%)
1. acne au(37%)/ch(55%)/cn(42%)/it(53%)/nl(60%)/no(56%)
1. jimmyjazz us(60%)
1. cos hu(59%)/kr(34%)
1. underarmour us(59%)
1. neimanmarcus jp(57%)
1. lululemon cn(55%)
1. nayomi ae(53%)/sa(36%)
1. sportchek ca(49%)
1. sezane fr(48%)/uk(42%)/us(42%)
1. scotchandsoda ca(47%)/us(36%)
1. camper ca(41%)/de(31%)/es(33%)/gr(34%)/it(33%)/nl(31%)/pl(33%)/uk(34%)/us(46%)
1. jelmoli ch(46%)
1. theory (44%)
1. hibbett us(43%)
1. levi es(39%)/ie(42%)
1. aloyoga us(41%)
1. navabi de(39%)/uk(39%)
1. paulsmith au(37%)
1. moncler ca(30%)/es(30%)/fr(30%)/kr(36%)/uk(30%)/us(33%)
1. undiz fr(35%)
1. paris cl(34%)
1. joseph de(33%)/eu(33%)/uk(33%)/us(33%)
1. openingceremony us(33%)
1. riachuelo br(32%)
1. gap hk(31%)
1. rivafashion qa(31%)
1. crockid ru(30%)
|
non_process
|
broken crawlers jun abcmart kr adidas kr tr adler de aldo eu alexandermcqueen cn americaneagle ca ami dk jp uk anthropologie fr uk antonioli au ca cn hk jp kr mo ru tw uk us argos uk armandthiery fr armaniexchange armedangels de asos ae au ch cn es hk id my nl ph pl ru sa sg th vn avenue us babyshop ae sa balr es fr nl brands hu burlington us calvinklein us central th centrepoint ae champion eu fr chloe kr coldwatercreek us conforama fr converse at au de cotton au countryroad davidjones drmartens de es eu fr it nl uk us elcorteingles es pt ellos fi no se exact za footaction us footlocker be de dk es fr it lu nl no se uk footpatrol uk forloveandlemons de fredperry us gapfactory us goodhood uk gymshark fi ie harrods hermes at ca fr it nl pt us hm hk kw ro sa hollister cn hush uk isetan jp kupivip ru laredouteapi ch es it ru lefties es pt lifestylestores in liverpool mx luigibertolli br maccosmetics uk maxfashion sa michaelkors ca miumiu de moosejaw us mothercare sa mrporter muji de next hk jp kr nl nz tr oasis parfois es ie ma peterhahn de popup br prettysecrets in pullandbear gt ralphlauren gr lu sk reebok ch de reserved ro runnerspoint de saksfifthavenue mo ru tw sandroatjd cn selfridges cn de es ie mo sg sephora us sfera es simons ca snkrs eu fr soccer us solebox de uk stefaniamode dk stories nl stylebop au ca es fr hk jp mo suistudio eu uk suitsupply at de es fi fr it no superdry th tods cn gr pt tommybahama ae au ch de in kr mx ph pl sa se sg us za topbrands ru topman my valentino cn walmart ca warehouse au ca ie nl se wayfair de uk weekday dk se uk zadigetvoltaire fr zalando it zalandolounge de zalora ph sg tommyjohn us smallable ca fr it nl qa uk lastcall us vip cn eu fr uk us diesel cn leroymerlin fr noon sa watchshop ru burberry ae at au be bg ca ch cz de dk es fi fr hk hu ie it jp kr my nl pl pt ro ru se sg si sk tr tw us us ssense ca jp universal at okaidi fr lncc kr acne au ch cn it nl no jimmyjazz us cos hu kr underarmour us neimanmarcus jp lululemon cn nayomi ae sa sportchek ca sezane fr uk us scotchandsoda ca us camper ca de es gr it nl pl uk us jelmoli ch theory hibbett us levi es ie aloyoga us navabi de uk paulsmith au moncler ca es fr kr uk us undiz fr paris cl joseph de eu uk us openingceremony us riachuelo br gap hk rivafashion qa crockid ru
| 0
|
11,964
| 14,729,132,351
|
IssuesEvent
|
2021-01-06 10:56:58
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Mobile] [Android] Unable to delete account
|
Android Bug P0 Process: Dev Process: Tested dev
|
AR : User is unable to delete his mobile app account : Getting an error message
ER : User should be able to delete his account

|
2.0
|
[Mobile] [Android] Unable to delete account - AR : User is unable to delete his mobile app account : Getting an error message
ER : User should be able to delete his account

|
process
|
unable to delete account ar user is unable to delete his mobile app account getting an error message er user should be able to delete his account
| 1
|
9,100
| 8,516,594,695
|
IssuesEvent
|
2018-11-01 03:36:30
|
onury/geolocator
|
https://api.github.com/repos/onury/geolocator
|
closed
|
GeoError happening
|
service-fault
|
Hello everyone, I'm having this error which started a couple of days ago.
Have anyone gone through this error too?
fetch.js:229 `GET https://geoip.nekudo.com/shutdown net::ERR_ABORTED 404`
geolocator.js:957
```
{
"name": "GeoError",
"code": "UNKNOWN_ERROR",
"message": "Could not load source at https://geoip.nekudo.com/api↵[object Event]",
"stack": "GeoError: Could not load source at https://geoip.n…/libs/geolocator/2.1.3/geolocator.min.js:1:11824)"
}
```
|
1.0
|
GeoError happening - Hello everyone, I'm having this error which started a couple of days ago.
Have anyone gone through this error too?
fetch.js:229 `GET https://geoip.nekudo.com/shutdown net::ERR_ABORTED 404`
geolocator.js:957
```
{
"name": "GeoError",
"code": "UNKNOWN_ERROR",
"message": "Could not load source at https://geoip.nekudo.com/api↵[object Event]",
"stack": "GeoError: Could not load source at https://geoip.n…/libs/geolocator/2.1.3/geolocator.min.js:1:11824)"
}
```
|
non_process
|
geoerror happening hello everyone i m having this error which started a couple of days ago have anyone gone through this error too fetch js get net err aborted geolocator js name geoerror code unknown error message could not load source at stack geoerror could not load source at
| 0
|
7,285
| 10,435,234,479
|
IssuesEvent
|
2019-09-17 16:49:33
|
neuropoly/spinalcordtoolbox
|
https://api.github.com/repos/neuropoly/spinalcordtoolbox
|
closed
|
Compute cord length
|
feature priority:HIGH sct_process_segmentation
|
With the recent removal of `-p` in `sct_process_segmentation`, it is no more possible to measure cord length. This feature was removed because we (wrongly) thought it was not used, but [some users expressed interest](http://forum.spinalcordmri.org/t/computing-cord-length-in-v4-0-0/105) in having this feature back.
One possible (and easy) implementation would be to add the cumulative number of slices (corrected for cord angle and pixel size) in a new column of the output .csv file.
|
1.0
|
Compute cord length - With the recent removal of `-p` in `sct_process_segmentation`, it is no more possible to measure cord length. This feature was removed because we (wrongly) thought it was not used, but [some users expressed interest](http://forum.spinalcordmri.org/t/computing-cord-length-in-v4-0-0/105) in having this feature back.
One possible (and easy) implementation would be to add the cumulative number of slices (corrected for cord angle and pixel size) in a new column of the output .csv file.
|
process
|
compute cord length with the recent removal of p in sct process segmentation it is no more possible to measure cord length this feature was removed because we wrongly thought it was not used but in having this feature back one possible and easy implementation would be to add the cumulative number of slices corrected for cord angle and pixel size in a new column of the output csv file
| 1
|
329,198
| 28,208,155,377
|
IssuesEvent
|
2023-04-05 00:07:47
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
roachprod: apply (VM) labels to persistent disks
|
C-enhancement A-testing T-testeng
|
Currently, `roachprod` applies a set of default (see `vm.GetDefaultLabelMap`) and custom labels (see `opts.CustomLabels`) to every allocated instance. E.g., the `cluster` label allows to easily track usage/cost by cluster name.
The labels are typically propagated to all directly attached resources, e.g., cpu, ram, network, local storage. However, persistent disks can outlive an instance, hence they need to be labelled explicitly. For GCP, `--update-labels` [1] needs to be explicitly invoked when creating/attaching a persistent disk. For AWS, `--tag-specifications` needs to be augmented to include `ResourceType=volume`. For Azure, no change seems to be needed; it already propagates the tags by default.
[1] https://cloud.google.com/compute/docs/labeling-resources#add_or_update_labels_to_existing_resources
Epic: CRDB-10428
Jira issue: CRDB-20852
|
2.0
|
roachprod: apply (VM) labels to persistent disks - Currently, `roachprod` applies a set of default (see `vm.GetDefaultLabelMap`) and custom labels (see `opts.CustomLabels`) to every allocated instance. E.g., the `cluster` label allows to easily track usage/cost by cluster name.
The labels are typically propagated to all directly attached resources, e.g., cpu, ram, network, local storage. However, persistent disks can outlive an instance, hence they need to be labelled explicitly. For GCP, `--update-labels` [1] needs to be explicitly invoked when creating/attaching a persistent disk. For AWS, `--tag-specifications` needs to be augmented to include `ResourceType=volume`. For Azure, no change seems to be needed; it already propagates the tags by default.
[1] https://cloud.google.com/compute/docs/labeling-resources#add_or_update_labels_to_existing_resources
Epic: CRDB-10428
Jira issue: CRDB-20852
|
non_process
|
roachprod apply vm labels to persistent disks currently roachprod applies a set of default see vm getdefaultlabelmap and custom labels see opts customlabels to every allocated instance e g the cluster label allows to easily track usage cost by cluster name the labels are typically propagated to all directly attached resources e g cpu ram network local storage however persistent disks can outlive an instance hence they need to be labelled explicitly for gcp update labels needs to be explicitly invoked when creating attaching a persistent disk for aws tag specifications needs to be augmented to include resourcetype volume for azure no change seems to be needed it already propagates the tags by default epic crdb jira issue crdb
| 0
|
161,657
| 12,556,911,306
|
IssuesEvent
|
2020-06-07 11:18:20
|
openmsupply/mobile
|
https://api.github.com/repos/openmsupply/mobile
|
opened
|
TESTING: Vaccine Module
|
Test Plan Template
|
This is an issue for testing Vaccines Module.
Please copy and paste this list of tests and post your results in a comment in this issue.
[Provsional public docs: https://docs.google.com/document/d/1UPa2HNmUhithGoK0anmrQAKxkxcHpnJqG6W2H6sQZD4/edit#]
## Supplier Invoice
### New Columns
- Location
- [ ] Can click on the cell to open a modal
- [ ] The modal lists all locations - where the locationType must be on the `item.restrictedLocationType`
- [ ] If `item.restrictedLocationType` is null, all locations are listed.
- [ ] If there are no locations with a location type equal to the items.restrictedLocationType, the cell is disabled
- [ ] Displays "N/A" when no location is selected
- [ ] Displays the location.description when a location has been selected
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Column header is not squashed [might be squased in the cell]
- [ ] header is LOCATION
- [ ] Disabled when finalised.
- VVM Status
- [ ] header is VVM Status
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Disabled when not a vaccine item
- [ ] Can click on the cell to open a modal
- [ ] By default for vaccine items, the vvm status which has the lowest VaccineVialStatus.level is applied.
- [ ] Disabled when finalised.
## Customer Invoice
### New Columns
- Doses
- [ ] sortable
- [ ] editable
- [ ] Header is DOSES
- [ ] Disabled when finalised.
- [ ] When editing the quantity, doses updates to be quantity * item.doses
- [ ] Doses cannot be above quantity * item.doses
- Breach
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Header is BREACH
- [ ] Disabled when finalised.
- [ ] When the underlying TransactionBatch records have "been in a breach", a hazard icon displays. [Been in a breach: The batch was in a location during a temperature breach]
-vaccine items
- [ ] When clicking the icon, a breach modal opens
- [ ] All breaches the batch has been in are displayed in the temperature breach modal.
- [ ] the temperature breaches which show are the correct temperature breaches for the batches
## Supplier Requisition
### Row
- [ ] When clicking a row in a supplier requisition for a vaccine item, a 'item details' bottom modal opens
- The modal correctly shows:
- [ ] Open vial wastage
- [ ] Closed vial wastage
- [ ] the last requisition date
## Stocktake
#### New columns on batch edit
- [ ] New columns only show when editing a vaccine item
- Location
- [ ] Can click on the cell to open a modal - where the locationType must be on the `item.restrictedLocationType`
- [ ] If `item.restrictedLocationType` is null, all locations are listed.
- [ ] If there are no locations with a location type equal to the items.restrictedLocationType, the cell is disabled
- [ ] Displays "N/A" when no location is selected
- [ ] Displays the location.description when a location has been selected
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Column header is not squashed [might be squased in the cell]
- [ ] header is LOCATION
- [ ] Disabled when finalised.
- [ ] The Location applied for existing batches is the correct batch the udnerlying item batch has applied
- VVM Status
- [ ] header is VVM Status
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Disabled when not a vaccine item
- [ ] Can click on the cell to open a modal
- [ ] By default for vaccine items, the vvm status for existing batches is the same as the underlying item batch.
- [ ] By default for vaccine items, the vvm status for non-existing batches is the lowest level vaccine vial monitor status
- [ ] Disabled when finalised.
- Doses
- [ ] sortable
- [ ] editable
- [ ] Header is DOSES
- [ ] Disabled when finalised.
- [ ] When editing the quantity, doses updates to be quantity * item.doses
- [ ] Doses cannot be above quantity * item.doses
- Breach
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Header is BREACH
- [ ] Disabled when finalised.
- [ ] When the underlying TransactionBatch records have "been in a breach", a hazard icon displays. [Been in a breach: The batch was in a location during a temperature breach]
- [ ] Doesn't show for non-vaccine items
- [ ] When clicking the icon, a breach modal opens
- [ ] All breaches the batch has been in are displayed in the temperature breach modal.
- [ ] the temperature breaches which show are the correct temperature breaches for the batches
## Main page
- [ ] Shows a Vaccines button when vaccine module is enabled [ note: MUST manually sync at least once after enabling vaccines or initializing ]
## Vaccines page
- [ ] Displays all locations which has a Sensor attached
- [ ] Displays a Admin button when the currently logged in user has "Can view/edit Preferences" enabled.
### Fridge
- [ ] Correctly shows all temperature logs in the from and to dates.
- dates
- [ ] Clicking either the from or to calendar button opens a modal to choose a date
- [ ] Can not choose a to date before the from date.
- [ ] Choosing a date correctly updates the chart
- [ ] Choosing a date correctly updates the displayed date.
- breaches
- [ ] Correctly displays a hazard icon for each breach within the date range
- [ ] correctly displays a hazard icon for a breach whose start time is before the from date, but the end time before the to date, or null
- [ ] Clicking a hazard icon correctly opens the breach modal
- stats
- [ ] Correctly displays the name of the location
- [ ] Correctly displays the current temperature of the location
- [ ] Correctly displays the number of temperatyre breaches for the location
- [ ] correctly displays the temperature exposure of the location
- [ ] correctly displauys the total amount of stock currently in the location
## Vaccines Admin
- [ ] Can toggle between sensors and locations
### Location
- [ ] Displays all locations
- [ ] Can click the edit button to open a location edit modal
- [ ] Correctly displays details
- [ ] Can click add location button to open a location edit modal
### LocationEdit
- Description
- [ ] Required field
- [ ] Must have (0,50) characters
- code
- [ ] Required field
- [ ] Must have (0,20) characters
- LocationType
- [ ] Lists all location types
- [ ] Correctly updates the underlying list after editing
### Sensor
- [ ] lists all Sensor records
- [ ] Can edit the sensor name through the table
|
1.0
|
TESTING: Vaccine Module - This is an issue for testing Vaccines Module.
Please copy and paste this list of tests and post your results in a comment in this issue.
[Provsional public docs: https://docs.google.com/document/d/1UPa2HNmUhithGoK0anmrQAKxkxcHpnJqG6W2H6sQZD4/edit#]
## Supplier Invoice
### New Columns
- Location
- [ ] Can click on the cell to open a modal
- [ ] The modal lists all locations - where the locationType must be on the `item.restrictedLocationType`
- [ ] If `item.restrictedLocationType` is null, all locations are listed.
- [ ] If there are no locations with a location type equal to the items.restrictedLocationType, the cell is disabled
- [ ] Displays "N/A" when no location is selected
- [ ] Displays the location.description when a location has been selected
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Column header is not squashed [might be squased in the cell]
- [ ] header is LOCATION
- [ ] Disabled when finalised.
- VVM Status
- [ ] header is VVM Status
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Disabled when not a vaccine item
- [ ] Can click on the cell to open a modal
- [ ] By default for vaccine items, the vvm status which has the lowest VaccineVialStatus.level is applied.
- [ ] Disabled when finalised.
## Customer Invoice
### New Columns
- Doses
- [ ] sortable
- [ ] editable
- [ ] Header is DOSES
- [ ] Disabled when finalised.
- [ ] When editing the quantity, doses updates to be quantity * item.doses
- [ ] Doses cannot be above quantity * item.doses
- Breach
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Header is BREACH
- [ ] Disabled when finalised.
- [ ] When the underlying TransactionBatch records have "been in a breach", a hazard icon displays. [Been in a breach: The batch was in a location during a temperature breach]
-vaccine items
- [ ] When clicking the icon, a breach modal opens
- [ ] All breaches the batch has been in are displayed in the temperature breach modal.
- [ ] the temperature breaches which show are the correct temperature breaches for the batches
## Supplier Requisition
### Row
- [ ] When clicking a row in a supplier requisition for a vaccine item, a 'item details' bottom modal opens
- The modal correctly shows:
- [ ] Open vial wastage
- [ ] Closed vial wastage
- [ ] the last requisition date
## Stocktake
#### New columns on batch edit
- [ ] New columns only show when editing a vaccine item
- Location
- [ ] Can click on the cell to open a modal - where the locationType must be on the `item.restrictedLocationType`
- [ ] If `item.restrictedLocationType` is null, all locations are listed.
- [ ] If there are no locations with a location type equal to the items.restrictedLocationType, the cell is disabled
- [ ] Displays "N/A" when no location is selected
- [ ] Displays the location.description when a location has been selected
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Column header is not squashed [might be squased in the cell]
- [ ] header is LOCATION
- [ ] Disabled when finalised.
- [ ] The Location applied for existing batches is the correct batch the udnerlying item batch has applied
- VVM Status
- [ ] header is VVM Status
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Disabled when not a vaccine item
- [ ] Can click on the cell to open a modal
- [ ] By default for vaccine items, the vvm status for existing batches is the same as the underlying item batch.
- [ ] By default for vaccine items, the vvm status for non-existing batches is the lowest level vaccine vial monitor status
- [ ] Disabled when finalised.
- Doses
- [ ] sortable
- [ ] editable
- [ ] Header is DOSES
- [ ] Disabled when finalised.
- [ ] When editing the quantity, doses updates to be quantity * item.doses
- [ ] Doses cannot be above quantity * item.doses
- Breach
- [ ] NOT sortable
- [ ] NOT editable
- [ ] Header is BREACH
- [ ] Disabled when finalised.
- [ ] When the underlying TransactionBatch records have "been in a breach", a hazard icon displays. [Been in a breach: The batch was in a location during a temperature breach]
- [ ] Doesn't show for non-vaccine items
- [ ] When clicking the icon, a breach modal opens
- [ ] All breaches the batch has been in are displayed in the temperature breach modal.
- [ ] the temperature breaches which show are the correct temperature breaches for the batches
## Main page
- [ ] Shows a Vaccines button when vaccine module is enabled [ note: MUST manually sync at least once after enabling vaccines or initializing ]
## Vaccines page
- [ ] Displays all locations which has a Sensor attached
- [ ] Displays a Admin button when the currently logged in user has "Can view/edit Preferences" enabled.
### Fridge
- [ ] Correctly shows all temperature logs in the from and to dates.
- dates
- [ ] Clicking either the from or to calendar button opens a modal to choose a date
- [ ] Can not choose a to date before the from date.
- [ ] Choosing a date correctly updates the chart
- [ ] Choosing a date correctly updates the displayed date.
- breaches
- [ ] Correctly displays a hazard icon for each breach within the date range
- [ ] correctly displays a hazard icon for a breach whose start time is before the from date, but the end time before the to date, or null
- [ ] Clicking a hazard icon correctly opens the breach modal
- stats
- [ ] Correctly displays the name of the location
- [ ] Correctly displays the current temperature of the location
- [ ] Correctly displays the number of temperatyre breaches for the location
- [ ] correctly displays the temperature exposure of the location
- [ ] correctly displauys the total amount of stock currently in the location
## Vaccines Admin
- [ ] Can toggle between sensors and locations
### Location
- [ ] Displays all locations
- [ ] Can click the edit button to open a location edit modal
- [ ] Correctly displays details
- [ ] Can click add location button to open a location edit modal
### LocationEdit
- Description
- [ ] Required field
- [ ] Must have (0,50) characters
- code
- [ ] Required field
- [ ] Must have (0,20) characters
- LocationType
- [ ] Lists all location types
- [ ] Correctly updates the underlying list after editing
### Sensor
- [ ] lists all Sensor records
- [ ] Can edit the sensor name through the table
|
non_process
|
testing vaccine module this is an issue for testing vaccines module please copy and paste this list of tests and post your results in a comment in this issue supplier invoice new columns location can click on the cell to open a modal the modal lists all locations where the locationtype must be on the item restrictedlocationtype if item restrictedlocationtype is null all locations are listed if there are no locations with a location type equal to the items restrictedlocationtype the cell is disabled displays n a when no location is selected displays the location description when a location has been selected not sortable not editable column header is not squashed header is location disabled when finalised vvm status header is vvm status not sortable not editable disabled when not a vaccine item can click on the cell to open a modal by default for vaccine items the vvm status which has the lowest vaccinevialstatus level is applied disabled when finalised customer invoice new columns doses sortable editable header is doses disabled when finalised when editing the quantity doses updates to be quantity item doses doses cannot be above quantity item doses breach not sortable not editable header is breach disabled when finalised when the underlying transactionbatch records have been in a breach a hazard icon displays vaccine items when clicking the icon a breach modal opens all breaches the batch has been in are displayed in the temperature breach modal the temperature breaches which show are the correct temperature breaches for the batches supplier requisition row when clicking a row in a supplier requisition for a vaccine item a item details bottom modal opens the modal correctly shows open vial wastage closed vial wastage the last requisition date stocktake new columns on batch edit new columns only show when editing a vaccine item location can click on the cell to open a modal where the locationtype must be on the item restrictedlocationtype if item restrictedlocationtype is null all locations are listed if there are no locations with a location type equal to the items restrictedlocationtype the cell is disabled displays n a when no location is selected displays the location description when a location has been selected not sortable not editable column header is not squashed header is location disabled when finalised the location applied for existing batches is the correct batch the udnerlying item batch has applied vvm status header is vvm status not sortable not editable disabled when not a vaccine item can click on the cell to open a modal by default for vaccine items the vvm status for existing batches is the same as the underlying item batch by default for vaccine items the vvm status for non existing batches is the lowest level vaccine vial monitor status disabled when finalised doses sortable editable header is doses disabled when finalised when editing the quantity doses updates to be quantity item doses doses cannot be above quantity item doses breach not sortable not editable header is breach disabled when finalised when the underlying transactionbatch records have been in a breach a hazard icon displays doesn t show for non vaccine items when clicking the icon a breach modal opens all breaches the batch has been in are displayed in the temperature breach modal the temperature breaches which show are the correct temperature breaches for the batches main page shows a vaccines button when vaccine module is enabled vaccines page displays all locations which has a sensor attached displays a admin button when the currently logged in user has can view edit preferences enabled fridge correctly shows all temperature logs in the from and to dates dates clicking either the from or to calendar button opens a modal to choose a date can not choose a to date before the from date choosing a date correctly updates the chart choosing a date correctly updates the displayed date breaches correctly displays a hazard icon for each breach within the date range correctly displays a hazard icon for a breach whose start time is before the from date but the end time before the to date or null clicking a hazard icon correctly opens the breach modal stats correctly displays the name of the location correctly displays the current temperature of the location correctly displays the number of temperatyre breaches for the location correctly displays the temperature exposure of the location correctly displauys the total amount of stock currently in the location vaccines admin can toggle between sensors and locations location displays all locations can click the edit button to open a location edit modal correctly displays details can click add location button to open a location edit modal locationedit description required field must have characters code required field must have characters locationtype lists all location types correctly updates the underlying list after editing sensor lists all sensor records can edit the sensor name through the table
| 0
|
65,433
| 16,331,752,735
|
IssuesEvent
|
2021-05-12 10:04:50
|
Haivision/srt
|
https://api.github.com/repos/Haivision/srt
|
closed
|
Proposed binary installer for libsrt on Windows
|
Type: Enhancement [build]
|
Hi,
#### The problem
While libsrt is now available in the repos of the main Linux distros and macOS, building an application using SRT on Windows is, at best, challenging.
There is no precise build instructions for libsrt on Windows, only guidelines. It is currently impossible to automate (from scratch) the build of a Windows application using SRT. This is however required when you do continuous integration and build on bare Windows machines (see "GitHub Actions" for instance).
#### Proposed solution
Since I need it, I wrote a bunch of scripts to automate the build of a binary installer for libsrt. When you install it, you get pre-compiled static libraries for libsrt, header files and a Visual Studio property file to reference libsrt.
The scripts are here: https://github.com/tsduck/srt-win-installer
The project contains only scripts, no copy of libsrt source code or any other dependent code. When building an installer, the scripts start with cloning the required repositories from GitHub.
The first release, for libsrt 1.4.1, is here: https://github.com/tsduck/srt-win-installer/releases
I did this because I needed it for the [TSDuck](https://github.com/tsduck/tsduck) project. However, if the libsrt team is interested in taking over those scripts so that a proper binary installer can be built as part of the libsrt project, I am ready the release the ownership.
Let me know.
-Thierry Lelégard, author of [TSDuck](https://tsduck.io)
|
1.0
|
Proposed binary installer for libsrt on Windows - Hi,
#### The problem
While libsrt is now available in the repos of the main Linux distros and macOS, building an application using SRT on Windows is, at best, challenging.
There is no precise build instructions for libsrt on Windows, only guidelines. It is currently impossible to automate (from scratch) the build of a Windows application using SRT. This is however required when you do continuous integration and build on bare Windows machines (see "GitHub Actions" for instance).
#### Proposed solution
Since I need it, I wrote a bunch of scripts to automate the build of a binary installer for libsrt. When you install it, you get pre-compiled static libraries for libsrt, header files and a Visual Studio property file to reference libsrt.
The scripts are here: https://github.com/tsduck/srt-win-installer
The project contains only scripts, no copy of libsrt source code or any other dependent code. When building an installer, the scripts start with cloning the required repositories from GitHub.
The first release, for libsrt 1.4.1, is here: https://github.com/tsduck/srt-win-installer/releases
I did this because I needed it for the [TSDuck](https://github.com/tsduck/tsduck) project. However, if the libsrt team is interested in taking over those scripts so that a proper binary installer can be built as part of the libsrt project, I am ready the release the ownership.
Let me know.
-Thierry Lelégard, author of [TSDuck](https://tsduck.io)
|
non_process
|
proposed binary installer for libsrt on windows hi the problem while libsrt is now available in the repos of the main linux distros and macos building an application using srt on windows is at best challenging there is no precise build instructions for libsrt on windows only guidelines it is currently impossible to automate from scratch the build of a windows application using srt this is however required when you do continuous integration and build on bare windows machines see github actions for instance proposed solution since i need it i wrote a bunch of scripts to automate the build of a binary installer for libsrt when you install it you get pre compiled static libraries for libsrt header files and a visual studio property file to reference libsrt the scripts are here the project contains only scripts no copy of libsrt source code or any other dependent code when building an installer the scripts start with cloning the required repositories from github the first release for libsrt is here i did this because i needed it for the project however if the libsrt team is interested in taking over those scripts so that a proper binary installer can be built as part of the libsrt project i am ready the release the ownership let me know thierry lelégard author of
| 0
|
86,071
| 10,709,785,959
|
IssuesEvent
|
2019-10-24 23:23:06
|
dart-lang/site-www
|
https://api.github.com/repos/dart-lang/site-www
|
closed
|
static analysis text formatting issue
|
bug design needs-info p1-high
|
Affected URLs:
* https://dart.dev/guides/language/analysis-options
* https://dart.dev/guides/language/language-tour
* https://dart.dev/guides/language/sound-dart
* https://dart.dev/guides/language/sound-problems
* https://dart.dev/guides/libraries/library-tour
These pages have a weird "static analysis: error/warning" at the top right. (Or more, for sound-dart and sound-problems.)
I don't know when this first happened, but I think this is due to the `passes-sa` and `fails-sa` classes, which are defined in [main.scss](https://github.com/dart-lang/site-www/blob/master/src/_assets/css/main.scss).

/cc @chalin @johnpryan
|
1.0
|
static analysis text formatting issue - Affected URLs:
* https://dart.dev/guides/language/analysis-options
* https://dart.dev/guides/language/language-tour
* https://dart.dev/guides/language/sound-dart
* https://dart.dev/guides/language/sound-problems
* https://dart.dev/guides/libraries/library-tour
These pages have a weird "static analysis: error/warning" at the top right. (Or more, for sound-dart and sound-problems.)
I don't know when this first happened, but I think this is due to the `passes-sa` and `fails-sa` classes, which are defined in [main.scss](https://github.com/dart-lang/site-www/blob/master/src/_assets/css/main.scss).

/cc @chalin @johnpryan
|
non_process
|
static analysis text formatting issue affected urls these pages have a weird static analysis error warning at the top right or more for sound dart and sound problems i don t know when this first happened but i think this is due to the passes sa and fails sa classes which are defined in cc chalin johnpryan
| 0
|
278,692
| 24,168,862,272
|
IssuesEvent
|
2022-09-22 17:18:38
|
denoland/deno
|
https://api.github.com/repos/denoland/deno
|
reopened
|
segfault deserializing snapshot in tests
|
bug tests
|
Seems like a segfault in v8 related to deserializing the snapshot.

```
error: test failed, to rerun pass '-p deno_runtime --lib'
Caused by:
process didn't exit successfully: `C:\actions-runner\work\deno\deno\target\release\deps\deno_runtime-413c758822cbc9f4.exe` (exit code: 0xc0000005, STATUS_ACCESS_VIOLATION)
```
https://github.com/denoland/deno/runs/7941098188?check_suite_focus=true
|
1.0
|
segfault deserializing snapshot in tests - Seems like a segfault in v8 related to deserializing the snapshot.

```
error: test failed, to rerun pass '-p deno_runtime --lib'
Caused by:
process didn't exit successfully: `C:\actions-runner\work\deno\deno\target\release\deps\deno_runtime-413c758822cbc9f4.exe` (exit code: 0xc0000005, STATUS_ACCESS_VIOLATION)
```
https://github.com/denoland/deno/runs/7941098188?check_suite_focus=true
|
non_process
|
segfault deserializing snapshot in tests seems like a segfault in related to deserializing the snapshot error test failed to rerun pass p deno runtime lib caused by process didn t exit successfully c actions runner work deno deno target release deps deno runtime exe exit code status access violation
| 0
|
422,069
| 12,265,786,436
|
IssuesEvent
|
2020-05-07 07:49:38
|
ntop/ntopng
|
https://api.github.com/repos/ntop/ntopng
|
closed
|
Better handle license switch
|
in progress low-priority bug to be confirmed
|

When the demo mode ends, errors like the one above are reported when switching back to community edition. They need to be handled more nicely.
See also #3595
|
1.0
|
Better handle license switch - 
When the demo mode ends, errors like the one above are reported when switching back to community edition. They need to be handled more nicely.
See also #3595
|
non_process
|
better handle license switch when the demo mode ends errors like the one above are reported when switching back to community edition they need to be handled more nicely see also
| 0
|
63,696
| 8,691,141,713
|
IssuesEvent
|
2018-12-04 00:01:29
|
CuBoulder/express
|
https://api.github.com/repos/CuBoulder/express
|
closed
|
AS a SO/CE I would like to display an events page with a filter
|
Epic Still Open at 3.0 evaluate-1:Director Approval evaluate-1:More Information evaluate-2:Accessibility evaluate-2:Documentation Needs evaluate-2:Support Concerns evaluate-2:Usability
|
## Context
Here is the link to the Department Ceremonies page on the University of Oregon Commencement website that we would like to model our department listing to if feasible.
https://commencement.uoregon.edu/ceremonies
Here is our current listing page for Spring 2017 ceremonies for reference (73 schools, colleges, departments or programs listed):
http://www.colorado.edu/commencement/spring-ceremony/spring-collegedepartment-recognition-ceremonies


|
1.0
|
AS a SO/CE I would like to display an events page with a filter - ## Context
Here is the link to the Department Ceremonies page on the University of Oregon Commencement website that we would like to model our department listing to if feasible.
https://commencement.uoregon.edu/ceremonies
Here is our current listing page for Spring 2017 ceremonies for reference (73 schools, colleges, departments or programs listed):
http://www.colorado.edu/commencement/spring-ceremony/spring-collegedepartment-recognition-ceremonies


|
non_process
|
as a so ce i would like to display an events page with a filter context here is the link to the department ceremonies page on the university of oregon commencement website that we would like to model our department listing to if feasible here is our current listing page for spring ceremonies for reference schools colleges departments or programs listed
| 0
|
145,451
| 13,151,261,438
|
IssuesEvent
|
2020-08-09 15:52:25
|
hailstorm75/MarkDoc.Core
|
https://api.github.com/repos/hailstorm75/MarkDoc.Core
|
opened
|
Document the Documentation XML library
|
documentation
|
# Library name(s)
`MarkDoc.Documentation.Xml`
# Description
Document the Documentation algorithms
|
1.0
|
Document the Documentation XML library - # Library name(s)
`MarkDoc.Documentation.Xml`
# Description
Document the Documentation algorithms
|
non_process
|
document the documentation xml library library name s markdoc documentation xml description document the documentation algorithms
| 0
|
4,500
| 7,348,672,526
|
IssuesEvent
|
2018-03-08 07:43:15
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Output - C# script example
|
assigned-to-author doc-bug functions in-process triaged
|
Hi,
The sample in the function.json is an ordinary HTTP Trigger and not like described a BLOB Trigger. Please correct either the description or the sample.
.....
> Output - C# script example
> The following example shows a blob trigger binding in a function.json file and C# script (.csx) code that uses the binding. The function creates a queue item with a POCO payload for each HTTP request received.
>
> Here's the function.json file:
>
> JSON
>
```
{
"bindings": [
{
"type": "httpTrigger",
"direction": "in",
"authLevel": "function",
"name": "input"
},
{
"type": "http",
"direction": "out",
"name": "return"
},
{
"type": "queue",
"direction": "out",
"name": "$return",
"queueName": "outqueue",
"connection": "MyStorageConnectionAppSetting",
}
]
}
```
best
Eric
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5063120f-d48e-6a76-8e6d-208ead92d187
* Version Independent ID: 72bb8ecc-6dec-079a-a9d4-52205466a8c6
* Content: [Azure Queue storage bindings for Azure Functions | Microsoft Docs](https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue#output---example)
* Content Source: [articles/azure-functions/functions-bindings-storage-queue.md](https://github.com/Microsoft/azure-docs/blob/master/articles/azure-functions/functions-bindings-storage-queue.md)
* Service: **functions**
* GitHub Login: @ggailey777
* Microsoft Alias: **glenga**
|
1.0
|
Output - C# script example - Hi,
The sample in the function.json is an ordinary HTTP Trigger and not like described a BLOB Trigger. Please correct either the description or the sample.
.....
> Output - C# script example
> The following example shows a blob trigger binding in a function.json file and C# script (.csx) code that uses the binding. The function creates a queue item with a POCO payload for each HTTP request received.
>
> Here's the function.json file:
>
> JSON
>
```
{
"bindings": [
{
"type": "httpTrigger",
"direction": "in",
"authLevel": "function",
"name": "input"
},
{
"type": "http",
"direction": "out",
"name": "return"
},
{
"type": "queue",
"direction": "out",
"name": "$return",
"queueName": "outqueue",
"connection": "MyStorageConnectionAppSetting",
}
]
}
```
best
Eric
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5063120f-d48e-6a76-8e6d-208ead92d187
* Version Independent ID: 72bb8ecc-6dec-079a-a9d4-52205466a8c6
* Content: [Azure Queue storage bindings for Azure Functions | Microsoft Docs](https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue#output---example)
* Content Source: [articles/azure-functions/functions-bindings-storage-queue.md](https://github.com/Microsoft/azure-docs/blob/master/articles/azure-functions/functions-bindings-storage-queue.md)
* Service: **functions**
* GitHub Login: @ggailey777
* Microsoft Alias: **glenga**
|
process
|
output c script example hi the sample in the function json is an ordinary http trigger and not like described a blob trigger please correct either the description or the sample output c script example the following example shows a blob trigger binding in a function json file and c script csx code that uses the binding the function creates a queue item with a poco payload for each http request received here s the function json file json bindings type httptrigger direction in authlevel function name input type http direction out name return type queue direction out name return queuename outqueue connection mystorageconnectionappsetting best eric document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service functions github login microsoft alias glenga
| 1
|
173,592
| 6,528,215,232
|
IssuesEvent
|
2017-08-30 06:24:24
|
container-storage-interface/spec
|
https://api.github.com/repos/container-storage-interface/spec
|
closed
|
NodePublishVolume target_path uniqueness requirement
|
priority/p0
|
There is an assumed uniqueness of target_path across Volumes that today must be enforced by the CO. If two volumes had the same target_path and they ended up on the same node there would be problems, even worse this bad config would be masked if they were first published on different nodes.
Even if we implemented the changes described in [https://github.com/container-storage-interface/spec/issues/17](url) and [the](https://github.com/container-storage-interface/spec/issues/37) target_path would still need to be unique relative to the container-scoped name space.
Note for those of a Windows-bent: be aware that not every file system is case sensitive so one can have two different strings that are effectively the same target_path.
|
1.0
|
NodePublishVolume target_path uniqueness requirement - There is an assumed uniqueness of target_path across Volumes that today must be enforced by the CO. If two volumes had the same target_path and they ended up on the same node there would be problems, even worse this bad config would be masked if they were first published on different nodes.
Even if we implemented the changes described in [https://github.com/container-storage-interface/spec/issues/17](url) and [the](https://github.com/container-storage-interface/spec/issues/37) target_path would still need to be unique relative to the container-scoped name space.
Note for those of a Windows-bent: be aware that not every file system is case sensitive so one can have two different strings that are effectively the same target_path.
|
non_process
|
nodepublishvolume target path uniqueness requirement there is an assumed uniqueness of target path across volumes that today must be enforced by the co if two volumes had the same target path and they ended up on the same node there would be problems even worse this bad config would be masked if they were first published on different nodes even if we implemented the changes described in url and target path would still need to be unique relative to the container scoped name space note for those of a windows bent be aware that not every file system is case sensitive so one can have two different strings that are effectively the same target path
| 0
|
68,892
| 13,194,343,061
|
IssuesEvent
|
2020-08-13 16:39:57
|
dotnet/roslyn
|
https://api.github.com/repos/dotnet/roslyn
|
closed
|
IDE0044 "Make field readonly" false positive when used in ref return
|
Area-IDE Bug IDE-CodeStyle Resolution-Fixed
|
**Version Used**:
16.0.0 Preview 2.0
**Steps to Reproduce**:
```csharp
public class C
{
private int a, b; // IDE0044
private ref int select(bool c) => ref (c ? ref a : ref b);
public void Test() => select(true) = 3;
}
```
**Expected Behavior**:
The members are mutated. Marking them readonly breaks the code, so IDE0044 shouldn't trigger.
**Actual Behavior**:
IDE0044 is falsely triggered.
|
1.0
|
IDE0044 "Make field readonly" false positive when used in ref return - **Version Used**:
16.0.0 Preview 2.0
**Steps to Reproduce**:
```csharp
public class C
{
private int a, b; // IDE0044
private ref int select(bool c) => ref (c ? ref a : ref b);
public void Test() => select(true) = 3;
}
```
**Expected Behavior**:
The members are mutated. Marking them readonly breaks the code, so IDE0044 shouldn't trigger.
**Actual Behavior**:
IDE0044 is falsely triggered.
|
non_process
|
make field readonly false positive when used in ref return version used preview steps to reproduce csharp public class c private int a b private ref int select bool c ref c ref a ref b public void test select true expected behavior the members are mutated marking them readonly breaks the code so shouldn t trigger actual behavior is falsely triggered
| 0
|
13,359
| 15,823,542,744
|
IssuesEvent
|
2021-04-06 01:01:22
|
googleapis/google-cloud-ruby
|
https://api.github.com/repos/googleapis/google-cloud-ruby
|
opened
|
Warning: a recent release failed
|
type: process
|
The following release PRs may have failed:
* #11090
* #11092
* #11093
* #11094
* #11095
* #11096
* #11107
* #11108
* #11113
* #10917
* #10918
* #10928
|
1.0
|
Warning: a recent release failed - The following release PRs may have failed:
* #11090
* #11092
* #11093
* #11094
* #11095
* #11096
* #11107
* #11108
* #11113
* #10917
* #10918
* #10928
|
process
|
warning a recent release failed the following release prs may have failed
| 1
|
5,394
| 8,223,795,183
|
IssuesEvent
|
2018-09-06 11:49:51
|
linnovate/root
|
https://api.github.com/repos/linnovate/root
|
closed
|
After deleting a document, the document isnt deleted until changing to another tab (updates/tasks/etc)
|
Process bug bug
|
@abrahamos, after deleting the document, it still stays until changing the tab or refreshing the page
the document that i had just deleted is still there, and is still interactable

|
1.0
|
After deleting a document, the document isnt deleted until changing to another tab (updates/tasks/etc) - @abrahamos, after deleting the document, it still stays until changing the tab or refreshing the page
the document that i had just deleted is still there, and is still interactable

|
process
|
after deleting a document the document isnt deleted until changing to another tab updates tasks etc abrahamos after deleting the document it still stays until changing the tab or refreshing the page the document that i had just deleted is still there and is still interactable
| 1
|
7,779
| 10,919,900,301
|
IssuesEvent
|
2019-11-21 20:02:24
|
microsoft/ptvsd
|
https://api.github.com/repos/microsoft/ptvsd
|
closed
|
Monkey-patch os.posix_spawn
|
Bug Upstream-pydevd area:Multiprocessing
|
But succeeds on Python 3.6 and 3.7. Looking at pydevd logs for the parent process, the difference is that there's no "Patching args" in case of 3.8. However, there *is* a pydevd logfile for the child process, which only contains the "Using Cython speedups" line.
|
1.0
|
Monkey-patch os.posix_spawn - But succeeds on Python 3.6 and 3.7. Looking at pydevd logs for the parent process, the difference is that there's no "Patching args" in case of 3.8. However, there *is* a pydevd logfile for the child process, which only contains the "Using Cython speedups" line.
|
process
|
monkey patch os posix spawn but succeeds on python and looking at pydevd logs for the parent process the difference is that there s no patching args in case of however there is a pydevd logfile for the child process which only contains the using cython speedups line
| 1
|
20,996
| 27,862,718,161
|
IssuesEvent
|
2023-03-21 08:02:02
|
googleapis/google-cloud-node
|
https://api.github.com/repos/googleapis/google-cloud-node
|
opened
|
Your .repo-metadata.json files have a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json files:
Result of scan 📈:
* api_shortname '{{name}}' invalid in packages/gapic-node-templating/templates/bootstrap-templates/.repo-metadata.json
* api_shortname 'stitcher' invalid in packages/google-cloud-video-stitcher/.repo-metadata.json
☝️ Once you address these problems, you can close this issue.
### Need help?
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field.
* [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**.
* Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json files have a problem 🤒 - You have a problem with your .repo-metadata.json files:
Result of scan 📈:
* api_shortname '{{name}}' invalid in packages/gapic-node-templating/templates/bootstrap-templates/.repo-metadata.json
* api_shortname 'stitcher' invalid in packages/google-cloud-video-stitcher/.repo-metadata.json
☝️ Once you address these problems, you can close this issue.
### Need help?
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field.
* [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**.
* Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json files have a problem 🤒 you have a problem with your repo metadata json files result of scan 📈 api shortname name invalid in packages gapic node templating templates bootstrap templates repo metadata json api shortname stitcher invalid in packages google cloud video stitcher repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
| 1
|
278,721
| 24,170,512,019
|
IssuesEvent
|
2022-09-22 18:45:56
|
dotnet/source-build
|
https://api.github.com/repos/dotnet/source-build
|
closed
|
.NET 8: Enable full linux-portable CI build
|
area-ci-testing untriaged
|
### Describe the Problem
Unified build seeks to expand the number of platforms that .NET 8 can source-build on, and replace the existing Microsoft official build with source-build. Part of this is enabling a build for the Linux portable assets we produce.
### Describe the Solution
Enable a full linux-portable build of source-build in CI.
|
1.0
|
.NET 8: Enable full linux-portable CI build - ### Describe the Problem
Unified build seeks to expand the number of platforms that .NET 8 can source-build on, and replace the existing Microsoft official build with source-build. Part of this is enabling a build for the Linux portable assets we produce.
### Describe the Solution
Enable a full linux-portable build of source-build in CI.
|
non_process
|
net enable full linux portable ci build describe the problem unified build seeks to expand the number of platforms that net can source build on and replace the existing microsoft official build with source build part of this is enabling a build for the linux portable assets we produce describe the solution enable a full linux portable build of source build in ci
| 0
|
144,304
| 5,538,166,358
|
IssuesEvent
|
2017-03-22 00:30:34
|
smartchicago/chicago-early-learning
|
https://api.github.com/repos/smartchicago/chicago-early-learning
|
opened
|
Create new email newsletter sign-up page
|
High Priority
|
Before the application launches this year, our partners want to encourage parents to sign up for the Chicago Early Learning Keep in Touch email. Currently, when parents click on the "Early Learning Email Newsletter" link they are led to this URL: http://chicagoearlylearning.us5.list-manage.com/subscribe?u=085247dea37361c266002462c&id=5161f6a47c
We want to create an easier URL to share/remember like: chicagoearlylearning.org/email
Therefore, let's create a new page and embed the Mailchimp sign-up form.
Here is the embed code:
`<!-- Begin MailChimp Signup Form -->
<link href="//cdn-images.mailchimp.com/embedcode/classic-10_7.css" rel="stylesheet" type="text/css">
<style type="text/css">
#mc_embed_signup{background:#fff; clear:left; font:14px Helvetica,Arial,sans-serif; }
/* Add your own MailChimp form style overrides in your site stylesheet or in this style block.
We recommend moving this block and the preceding CSS link to the HEAD of your HTML file. */
</style>
<div id="mc_embed_signup">
<form action="//chicagoearlylearning.us5.list-manage.com/subscribe/post?u=085247dea37361c266002462c&id=5161f6a47c" method="post" id="mc-embedded-subscribe-form" name="mc-embedded-subscribe-form" class="validate" target="_blank" novalidate>
<div id="mc_embed_signup_scroll">
<h2>Subscribe to our mailing list</h2>
<div class="indicates-required"><span class="asterisk">*</span> indicates required</div>
<div class="mc-field-group">
<label for="mce-EMAIL">Email Address <span class="asterisk">*</span>
</label>
<input type="email" value="" name="EMAIL" class="required email" id="mce-EMAIL">
</div>
<div class="mc-field-group">
<label for="mce-FNAME">First Name <span class="asterisk">*</span>
</label>
<input type="text" value="" name="FNAME" class="required" id="mce-FNAME">
</div>
<div class="mc-field-group">
<label for="mce-LNAME">Last Name <span class="asterisk">*</span>
</label>
<input type="text" value="" name="LNAME" class="required" id="mce-LNAME">
</div>
<div class="mc-field-group">
<label for="mce-MMERGE4">Zip Code <span class="asterisk">*</span>
</label>
<input type="text" value="" name="MMERGE4" class="required" id="mce-MMERGE4">
</div>
<div class="mc-field-group size1of2">
<label for="mce-MMERGE5">Text Me! Here's my cell phone number </label>
<div class="phonefield phonefield-us">
(<span class="phonearea"><input class="phonepart " pattern="[0-9]*" id="mce-MMERGE5-area" name="MMERGE5[area]" maxlength="3" size="3" value="" type="text"></span>)
<span class="phonedetail1"><input class="phonepart " pattern="[0-9]*" id="mce-MMERGE5-detail1" name="MMERGE5[detail1]" maxlength="3" size="3" value="" type="text"></span> -
<span class="phonedetail2"><input class="phonepart " pattern="[0-9]*" id="mce-MMERGE5-detail2" name="MMERGE5[detail2]" maxlength="4" size="4" value="" type="text"></span>
<span class="small-meta nowrap">(###) ###-####</span>
</div>
</div><div class="mc-field-group">
<label for="mce-MMERGE3">Source </label>
<input type="text" value="" name="MMERGE3" class="" id="mce-MMERGE3">
</div>
<div id="mce-responses" class="clear">
<div class="response" id="mce-error-response" style="display:none"></div>
<div class="response" id="mce-success-response" style="display:none"></div>
</div> <!-- real people should not fill this in and expect good things - do not remove this or risk form bot signups-->
<div style="position: absolute; left: -5000px;" aria-hidden="true"><input type="text" name="b_085247dea37361c266002462c_5161f6a47c" tabindex="-1" value=""></div>
<div class="clear"><input type="submit" value="Subscribe" name="subscribe" id="mc-embedded-subscribe" class="button"></div>
</div>
</form>
</div>
<!--End mc_embed_signup-->`
|
1.0
|
Create new email newsletter sign-up page - Before the application launches this year, our partners want to encourage parents to sign up for the Chicago Early Learning Keep in Touch email. Currently, when parents click on the "Early Learning Email Newsletter" link they are led to this URL: http://chicagoearlylearning.us5.list-manage.com/subscribe?u=085247dea37361c266002462c&id=5161f6a47c
We want to create an easier URL to share/remember like: chicagoearlylearning.org/email
Therefore, let's create a new page and embed the Mailchimp sign-up form.
Here is the embed code:
`<!-- Begin MailChimp Signup Form -->
<link href="//cdn-images.mailchimp.com/embedcode/classic-10_7.css" rel="stylesheet" type="text/css">
<style type="text/css">
#mc_embed_signup{background:#fff; clear:left; font:14px Helvetica,Arial,sans-serif; }
/* Add your own MailChimp form style overrides in your site stylesheet or in this style block.
We recommend moving this block and the preceding CSS link to the HEAD of your HTML file. */
</style>
<div id="mc_embed_signup">
<form action="//chicagoearlylearning.us5.list-manage.com/subscribe/post?u=085247dea37361c266002462c&id=5161f6a47c" method="post" id="mc-embedded-subscribe-form" name="mc-embedded-subscribe-form" class="validate" target="_blank" novalidate>
<div id="mc_embed_signup_scroll">
<h2>Subscribe to our mailing list</h2>
<div class="indicates-required"><span class="asterisk">*</span> indicates required</div>
<div class="mc-field-group">
<label for="mce-EMAIL">Email Address <span class="asterisk">*</span>
</label>
<input type="email" value="" name="EMAIL" class="required email" id="mce-EMAIL">
</div>
<div class="mc-field-group">
<label for="mce-FNAME">First Name <span class="asterisk">*</span>
</label>
<input type="text" value="" name="FNAME" class="required" id="mce-FNAME">
</div>
<div class="mc-field-group">
<label for="mce-LNAME">Last Name <span class="asterisk">*</span>
</label>
<input type="text" value="" name="LNAME" class="required" id="mce-LNAME">
</div>
<div class="mc-field-group">
<label for="mce-MMERGE4">Zip Code <span class="asterisk">*</span>
</label>
<input type="text" value="" name="MMERGE4" class="required" id="mce-MMERGE4">
</div>
<div class="mc-field-group size1of2">
<label for="mce-MMERGE5">Text Me! Here's my cell phone number </label>
<div class="phonefield phonefield-us">
(<span class="phonearea"><input class="phonepart " pattern="[0-9]*" id="mce-MMERGE5-area" name="MMERGE5[area]" maxlength="3" size="3" value="" type="text"></span>)
<span class="phonedetail1"><input class="phonepart " pattern="[0-9]*" id="mce-MMERGE5-detail1" name="MMERGE5[detail1]" maxlength="3" size="3" value="" type="text"></span> -
<span class="phonedetail2"><input class="phonepart " pattern="[0-9]*" id="mce-MMERGE5-detail2" name="MMERGE5[detail2]" maxlength="4" size="4" value="" type="text"></span>
<span class="small-meta nowrap">(###) ###-####</span>
</div>
</div><div class="mc-field-group">
<label for="mce-MMERGE3">Source </label>
<input type="text" value="" name="MMERGE3" class="" id="mce-MMERGE3">
</div>
<div id="mce-responses" class="clear">
<div class="response" id="mce-error-response" style="display:none"></div>
<div class="response" id="mce-success-response" style="display:none"></div>
</div> <!-- real people should not fill this in and expect good things - do not remove this or risk form bot signups-->
<div style="position: absolute; left: -5000px;" aria-hidden="true"><input type="text" name="b_085247dea37361c266002462c_5161f6a47c" tabindex="-1" value=""></div>
<div class="clear"><input type="submit" value="Subscribe" name="subscribe" id="mc-embedded-subscribe" class="button"></div>
</div>
</form>
</div>
<!--End mc_embed_signup-->`
|
non_process
|
create new email newsletter sign up page before the application launches this year our partners want to encourage parents to sign up for the chicago early learning keep in touch email currently when parents click on the early learning email newsletter link they are led to this url we want to create an easier url to share remember like chicagoearlylearning org email therefore let s create a new page and embed the mailchimp sign up form here is the embed code mc embed signup background fff clear left font helvetica arial sans serif add your own mailchimp form style overrides in your site stylesheet or in this style block we recommend moving this block and the preceding css link to the head of your html file subscribe to our mailing list indicates required email address first name last name zip code text me here s my cell phone number source
| 0
|
11,280
| 14,078,164,432
|
IssuesEvent
|
2020-11-04 13:10:50
|
cetic/tsorage
|
https://api.github.com/repos/cetic/tsorage
|
closed
|
The processor eventually fails when unmatching values and types are provided.
|
bug processing
|
Excerpt from the Processor log:
```
2020-11-02T17:02:21.642150719Z [WARN] [11/02/2020 17:02:21.641] [default-akka.actor.default-dispatcher-8] [RestartWithBackoffFlow(akka://default)] Restarting graph due to failure. stack_trace:
2020-11-02T17:02:21.642161372Z java.lang.IllegalArgumentException: Expected boolean; got 0
2020-11-02T17:02:21.642164818Z at be.cetic.tsorage.data.datatype.BooleanSupport$.fromJson(BooleanSupport.scala:17)
2020-11-02T17:02:21.642168318Z at be.cetic.tsorage.data.datatype.BooleanSupport$.fromJson(BooleanSupport.scala:9)
2020-11-02T17:02:21.642171458Z at be.cetic.tsorage.data.datatype.DataTypeSupport.asRawUdtValue(DataTypeSupport.scala:113)
2020-11-02T17:02:21.642174529Z at be.cetic.tsorage.processor.flow.CassandraWriter$.$anonfun$writeMessageAsync$1(CassandraWriter.scala:39)
2020-11-02T17:02:21.642177611Z at scala.collection.immutable.List.map(List.scala:286)
2020-11-02T17:02:21.642180652Z at be.cetic.tsorage.processor.flow.CassandraWriter$.writeMessageAsync(CassandraWriter.scala:32)
2020-11-02T17:02:21.642183700Z at be.cetic.tsorage.processor.flow.CassandraWriter$.$anonfun$createWriteMsgFlow$1(CassandraWriter.scala:60)
2020-11-02T17:02:21.642186939Z at akka.stream.impl.fusing.MapAsyncUnordered$$anon$31.onPush(Ops.scala:1403)
2020-11-02T17:02:21.642189957Z at akka.stream.impl.fusing.GraphInterpreter.processPush(GraphInterpreter.scala:541)
2020-11-02T17:02:21.642193068Z at akka.stream.impl.fusing.GraphInterpreter.processEvent(GraphInterpreter.scala:495)
2020-11-02T17:02:21.642196038Z at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:390)
2020-11-02T17:02:21.642199154Z at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:624)
2020-11-02T17:02:21.642202206Z at akka.stream.impl.fusing.ActorGraphInterpreter$SimpleBoundaryEvent.execute(ActorGraphInterpreter.scala:55)
2020-11-02T17:02:21.642205271Z at akka.stream.impl.fusing.ActorGraphInterpreter$SimpleBoundaryEvent.execute$(ActorGraphInterpreter.scala:51)
2020-11-02T17:02:21.642208334Z at akka.stream.impl.fusing.ActorGraphInterpreter$BatchingActorInputBoundary$OnNext.execute(ActorGraphInterpreter.scala:94)
2020-11-02T17:02:21.642223856Z at akka.stream.impl.fusing.GraphInterpreterShell.processEvent(ActorGraphInterpreter.scala:599)
2020-11-02T17:02:21.642226916Z at akka.stream.impl.fusing.ActorGraphInterpreter.akka$stream$impl$fusing$ActorGraphInterpreter$$processEvent(ActorGraphInterpreter.scala:768)
2020-11-02T17:02:21.642229773Z at akka.stream.impl.fusing.ActorGraphInterpreter$$anonfun$receive$1.applyOrElse(ActorGraphInterpreter.scala:783)
2020-11-02T17:02:21.642232498Z at akka.actor.Actor.aroundReceive(Actor.scala:534)
2020-11-02T17:02:21.642235582Z at akka.actor.Actor.aroundReceive$(Actor.scala:532)
2020-11-02T17:02:21.642238219Z at akka.stream.impl.fusing.ActorGraphInterpreter.aroundReceive(ActorGraphInterpreter.scala:690)
2020-11-02T17:02:21.642240923Z at akka.actor.ActorCell.receiveMessage(ActorCell.scala:573)
2020-11-02T17:02:21.642243547Z at akka.actor.ActorCell.invoke(ActorCell.scala:543)
2020-11-02T17:02:21.642246129Z at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:269)
2020-11-02T17:02:21.642248706Z at akka.dispatch.Mailbox.run(Mailbox.scala:230)
2020-11-02T17:02:21.642251343Z at akka.dispatch.Mailbox.exec(Mailbox.scala:242)
2020-11-02T17:02:21.642253926Z at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
2020-11-02T17:02:21.642256987Z at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
2020-11-02T17:02:21.642259691Z at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
2020-11-02T17:02:21.642262334Z at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
2020-11-02T17:02:21.642265076Z at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:177)
2020-11-02T17:02:21.642267865Z
```
There were a series of similar messages. While the processor still worked after this one, it eventually completely stopped.
Hypothesis: A message with a _tbool_ type and a `0` value has been submitted, which is an error. Because the Boolean support cannot process a `0` value, an exception is raised, which crashes the Processor. Consequently, the message is not committed (it is not successfully written to Cassandra). When the Processor restarts, it reads the message again, which leads to the same result.
After a while, the process is completely stalled with important lags on the partitions. There is an evidence of this on the Kafka logs:
```
root@tsorage-kafka-0:/# kafka-consumer-groups --bootstrap-server localhost:9092 --group ts_processing --describe
TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG CONSUMER-ID HOST CLIENT-ID
raw 6 910457 1104884 194427 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 18 579031 702597 123566 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 26 567074 688111 121037 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 1 562430 682444 120014 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 12 860425 1044180 183755 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 22 631232 766131 134899 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 2 726710 881795 155085 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 30 564658 685166 120508 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 10 794333 963844 169511 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 16 667162 809540 142378 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 27 676739 821160 144421 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 19 636122 771888 135766 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 8 731388 887460 156072 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 0 37946916 38088799 141883 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 3 534737 648870 114133 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 13 751416 911781 160365 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 31 685809 832172 146363 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 23 722814 877069 154255 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 7 641957 778964 137007 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 17 894145 1084970 190825 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 28 562341 682353 120012 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 20 638579 774867 136288 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 14 557585 676581 118996 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 4 646407 784373 137966 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 24 560883 680584 119701 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 11 479018 581233 102215 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 21 854987 1036763 181776 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 29 598065 725701 127636 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 15 771852 936593 164741 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 5 574206 696745 122539 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 25 850498 1032015 181517 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 9 764846 927931 163085 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
```
|
1.0
|
The processor eventually fails when unmatching values and types are provided. - Excerpt from the Processor log:
```
2020-11-02T17:02:21.642150719Z [WARN] [11/02/2020 17:02:21.641] [default-akka.actor.default-dispatcher-8] [RestartWithBackoffFlow(akka://default)] Restarting graph due to failure. stack_trace:
2020-11-02T17:02:21.642161372Z java.lang.IllegalArgumentException: Expected boolean; got 0
2020-11-02T17:02:21.642164818Z at be.cetic.tsorage.data.datatype.BooleanSupport$.fromJson(BooleanSupport.scala:17)
2020-11-02T17:02:21.642168318Z at be.cetic.tsorage.data.datatype.BooleanSupport$.fromJson(BooleanSupport.scala:9)
2020-11-02T17:02:21.642171458Z at be.cetic.tsorage.data.datatype.DataTypeSupport.asRawUdtValue(DataTypeSupport.scala:113)
2020-11-02T17:02:21.642174529Z at be.cetic.tsorage.processor.flow.CassandraWriter$.$anonfun$writeMessageAsync$1(CassandraWriter.scala:39)
2020-11-02T17:02:21.642177611Z at scala.collection.immutable.List.map(List.scala:286)
2020-11-02T17:02:21.642180652Z at be.cetic.tsorage.processor.flow.CassandraWriter$.writeMessageAsync(CassandraWriter.scala:32)
2020-11-02T17:02:21.642183700Z at be.cetic.tsorage.processor.flow.CassandraWriter$.$anonfun$createWriteMsgFlow$1(CassandraWriter.scala:60)
2020-11-02T17:02:21.642186939Z at akka.stream.impl.fusing.MapAsyncUnordered$$anon$31.onPush(Ops.scala:1403)
2020-11-02T17:02:21.642189957Z at akka.stream.impl.fusing.GraphInterpreter.processPush(GraphInterpreter.scala:541)
2020-11-02T17:02:21.642193068Z at akka.stream.impl.fusing.GraphInterpreter.processEvent(GraphInterpreter.scala:495)
2020-11-02T17:02:21.642196038Z at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:390)
2020-11-02T17:02:21.642199154Z at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:624)
2020-11-02T17:02:21.642202206Z at akka.stream.impl.fusing.ActorGraphInterpreter$SimpleBoundaryEvent.execute(ActorGraphInterpreter.scala:55)
2020-11-02T17:02:21.642205271Z at akka.stream.impl.fusing.ActorGraphInterpreter$SimpleBoundaryEvent.execute$(ActorGraphInterpreter.scala:51)
2020-11-02T17:02:21.642208334Z at akka.stream.impl.fusing.ActorGraphInterpreter$BatchingActorInputBoundary$OnNext.execute(ActorGraphInterpreter.scala:94)
2020-11-02T17:02:21.642223856Z at akka.stream.impl.fusing.GraphInterpreterShell.processEvent(ActorGraphInterpreter.scala:599)
2020-11-02T17:02:21.642226916Z at akka.stream.impl.fusing.ActorGraphInterpreter.akka$stream$impl$fusing$ActorGraphInterpreter$$processEvent(ActorGraphInterpreter.scala:768)
2020-11-02T17:02:21.642229773Z at akka.stream.impl.fusing.ActorGraphInterpreter$$anonfun$receive$1.applyOrElse(ActorGraphInterpreter.scala:783)
2020-11-02T17:02:21.642232498Z at akka.actor.Actor.aroundReceive(Actor.scala:534)
2020-11-02T17:02:21.642235582Z at akka.actor.Actor.aroundReceive$(Actor.scala:532)
2020-11-02T17:02:21.642238219Z at akka.stream.impl.fusing.ActorGraphInterpreter.aroundReceive(ActorGraphInterpreter.scala:690)
2020-11-02T17:02:21.642240923Z at akka.actor.ActorCell.receiveMessage(ActorCell.scala:573)
2020-11-02T17:02:21.642243547Z at akka.actor.ActorCell.invoke(ActorCell.scala:543)
2020-11-02T17:02:21.642246129Z at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:269)
2020-11-02T17:02:21.642248706Z at akka.dispatch.Mailbox.run(Mailbox.scala:230)
2020-11-02T17:02:21.642251343Z at akka.dispatch.Mailbox.exec(Mailbox.scala:242)
2020-11-02T17:02:21.642253926Z at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
2020-11-02T17:02:21.642256987Z at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
2020-11-02T17:02:21.642259691Z at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
2020-11-02T17:02:21.642262334Z at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
2020-11-02T17:02:21.642265076Z at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:177)
2020-11-02T17:02:21.642267865Z
```
There were a series of similar messages. While the processor still worked after this one, it eventually completely stopped.
Hypothesis: A message with a _tbool_ type and a `0` value has been submitted, which is an error. Because the Boolean support cannot process a `0` value, an exception is raised, which crashes the Processor. Consequently, the message is not committed (it is not successfully written to Cassandra). When the Processor restarts, it reads the message again, which leads to the same result.
After a while, the process is completely stalled with important lags on the partitions. There is an evidence of this on the Kafka logs:
```
root@tsorage-kafka-0:/# kafka-consumer-groups --bootstrap-server localhost:9092 --group ts_processing --describe
TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG CONSUMER-ID HOST CLIENT-ID
raw 6 910457 1104884 194427 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 18 579031 702597 123566 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 26 567074 688111 121037 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 1 562430 682444 120014 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 12 860425 1044180 183755 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 22 631232 766131 134899 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 2 726710 881795 155085 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 30 564658 685166 120508 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 10 794333 963844 169511 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 16 667162 809540 142378 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 27 676739 821160 144421 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 19 636122 771888 135766 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 8 731388 887460 156072 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 0 37946916 38088799 141883 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 3 534737 648870 114133 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 13 751416 911781 160365 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 31 685809 832172 146363 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 23 722814 877069 154255 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 7 641957 778964 137007 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 17 894145 1084970 190825 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 28 562341 682353 120012 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 20 638579 774867 136288 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 14 557585 676581 118996 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 4 646407 784373 137966 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 24 560883 680584 119701 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 11 479018 581233 102215 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 21 854987 1036763 181776 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 29 598065 725701 127636 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 15 771852 936593 164741 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 5 574206 696745 122539 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 25 850498 1032015 181517 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
raw 9 764846 927931 163085 consumer-ts_processing-1-73a66e6f-38e0-41e4-b3b3-109ad89f90f9 /10.44.0.7 consumer-ts_processing-1
```
|
process
|
the processor eventually fails when unmatching values and types are provided excerpt from the processor log restarting graph due to failure stack trace java lang illegalargumentexception expected boolean got at be cetic tsorage data datatype booleansupport fromjson booleansupport scala at be cetic tsorage data datatype booleansupport fromjson booleansupport scala at be cetic tsorage data datatype datatypesupport asrawudtvalue datatypesupport scala at be cetic tsorage processor flow cassandrawriter anonfun writemessageasync cassandrawriter scala at scala collection immutable list map list scala at be cetic tsorage processor flow cassandrawriter writemessageasync cassandrawriter scala at be cetic tsorage processor flow cassandrawriter anonfun createwritemsgflow cassandrawriter scala at akka stream impl fusing mapasyncunordered anon onpush ops scala at akka stream impl fusing graphinterpreter processpush graphinterpreter scala at akka stream impl fusing graphinterpreter processevent graphinterpreter scala at akka stream impl fusing graphinterpreter execute graphinterpreter scala at akka stream impl fusing graphinterpretershell runbatch actorgraphinterpreter scala at akka stream impl fusing actorgraphinterpreter simpleboundaryevent execute actorgraphinterpreter scala at akka stream impl fusing actorgraphinterpreter simpleboundaryevent execute actorgraphinterpreter scala at akka stream impl fusing actorgraphinterpreter batchingactorinputboundary onnext execute actorgraphinterpreter scala at akka stream impl fusing graphinterpretershell processevent actorgraphinterpreter scala at akka stream impl fusing actorgraphinterpreter akka stream impl fusing actorgraphinterpreter processevent actorgraphinterpreter scala at akka stream impl fusing actorgraphinterpreter anonfun receive applyorelse actorgraphinterpreter scala at akka actor actor aroundreceive actor scala at akka actor actor aroundreceive actor scala at akka stream impl fusing actorgraphinterpreter aroundreceive actorgraphinterpreter scala at akka actor actorcell receivemessage actorcell scala at akka actor actorcell invoke actorcell scala at akka dispatch mailbox processmailbox mailbox scala at akka dispatch mailbox run mailbox scala at akka dispatch mailbox exec mailbox scala at java base java util concurrent forkjointask doexec forkjointask java at java base java util concurrent forkjoinpool workqueue toplevelexec forkjoinpool java at java base java util concurrent forkjoinpool scan forkjoinpool java at java base java util concurrent forkjoinpool runworker forkjoinpool java at java base java util concurrent forkjoinworkerthread run forkjoinworkerthread java there were a series of similar messages while the processor still worked after this one it eventually completely stopped hypothesis a message with a tbool type and a value has been submitted which is an error because the boolean support cannot process a value an exception is raised which crashes the processor consequently the message is not committed it is not successfully written to cassandra when the processor restarts it reads the message again which leads to the same result after a while the process is completely stalled with important lags on the partitions there is an evidence of this on the kafka logs root tsorage kafka kafka consumer groups bootstrap server localhost group ts processing describe topic partition current offset log end offset lag consumer id host client id raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing raw consumer ts processing consumer ts processing
| 1
|
20,514
| 27,171,758,710
|
IssuesEvent
|
2023-02-17 20:09:31
|
NationalSecurityAgency/ghidra
|
https://api.github.com/repos/NationalSecurityAgency/ghidra
|
closed
|
Reduced Function Start Identification in Ghidra9.1 compared to Ghidra 9.0.4 for ARM binaries
|
Type: Bug Feature: Processor/ARM
|
**Describe the bug**
The number of correctly detected function starts found by Ghidra 9.1 is lesser that Ghidra 9.0.4 for ELF32 Little Endian stripped ARM binaries. We noticed this trend by comparing the function start identified by both versions of Ghidra against the function start addresses given by the DWARF utility in the GCC compiler during compilation. The following link shows a recent paper that uses this technique to find out the addresses in the binary that correspond to actual function starts.
https://www.usenix.org/conference/cset19/speaker-or-organizer/sri-shaila-g-university-california-riverside
We compared the function start identification capability in both Ghidra versions on a testset 10 benign binaries from the SPEC 2017 benchmark. We found that the number of correctly identified function starts found by Ghidra9.0.4 is higher than in Ghidra9.1 for ARM binaries.
**To Reproduce**
We take the 544.nab_r binary found in SPEC 2017 benchmark as an example. Other binaries can also be used.
1.Compile the source code into ELF32 Little Endian ARM binaries by using the GCC ARM compiler. Use the -g3 flag to attach debugging information to the dynamically linked unstripped binary.
2.The debugging information attached to the unstripped binary will give the addresses in the binary that correspond to actual function starts. We will refer to this list of function starts as the ground truth. [A reasonable but less accurate alternative to getting the ground truth is to use the disassembler on unstripped version of the binary to find the function starts, but there might be a few falsely identified functions.]
3. Compile the source code into ELF32 Little Endian stripped ARM binaries by using the GCC compiler. Use the -s flag to produce a dynamically linked stripped binary.
4. Disassemble the stripped binary from step 3 by using Ghidra 9.0.4 and record the function starts found by Ghidra 9.0.4 that are also found in the ground truth.
5. Disassemble the stripped binary from step 3 by using Ghidra 9.1 and record the function starts found by Ghidra 9.1 that are also found in the ground truth.
6. Compare the number of functions that are correctly identified in both version. In our observation, Ghidra9.0.4 could identify the functions at the following locations. However, Ghidra9.1 could not. These functions are also found in the ground truth.
0x22654
0x226a8
0x22704
0x22758
0x2279c
0x2847c
0x28904
0x27c3c
0x291ac
**Expected behavior**
We expected that Ghidra 9.1 would be able to identify the same number of correct function starts as Ghidra9.0.4 or identify more correct function starts.
**Screenshots**
If applicable, add screenshots to help explain your problem.

Ghidra9.0.4 can identify the function start at address 0x226a8

Ghidra9.1 cannot identify the function start at address 0x226a8

Ghidra9.0.4 can identify the function start at address 0x22654

Ghidra9.1 cannot identify the function start at address 0x22654
**Attachments**
If applicable, please attach any files that caused problems or log files generated by the software.
I have attached the compiled ARM binaries, both the stripped and the unstripped versions for the nab program from SPEC 2017 benchmark.
[Nab_Binaries.zip](https://github.com/NationalSecurityAgency/ghidra/files/4194137/Nab_Binaries.zip)
I have attached the original ARM_LE_Pattern.xml file from both Ghidra versions. I have also attached the patched version(according to the description below) of ARM_LE_Pattern.xml for Ghidra9.1.
[GhidraFunctionBytePatterns.zip](https://github.com/NationalSecurityAgency/ghidra/files/4194141/GhidraFunctionBytePatterns.zip)
**Environment (please complete the following information):**
OS: Ubuntu 18.04 LTS and Windows 10 Home Ed
Java Version: Open jdk version 11.0.6
Ghidra Version: Ghidra 9.0.4 and Ghidra 9.1
**Additional context**
The Fix/Patch
On further analysis, we found that the following line was added into the ARM_LE_patterns.xml file for Ghidra9.1
`<align mark="0" bits="3"/>`
This line prevents the function byte pattern rule associated with it to be applied to the ARM ELF binary under analysis. This decreases the number of correctly identified function starts found by Ghidra 9.1. Removing this line improves the number of functions that are correctly identified by Ghidra 9.1. When this line is removed, both versions are able to identify similar number of function starts correctly.
We also found that Ghidra9.1.1 was also not able to identify the function starts stated above for the 544.nab_r binary. It can recover more function starts if the patched version of the ARM_LE_Pattern.xml attached to this report is used instead of the original version.
|
1.0
|
Reduced Function Start Identification in Ghidra9.1 compared to Ghidra 9.0.4 for ARM binaries - **Describe the bug**
The number of correctly detected function starts found by Ghidra 9.1 is lesser that Ghidra 9.0.4 for ELF32 Little Endian stripped ARM binaries. We noticed this trend by comparing the function start identified by both versions of Ghidra against the function start addresses given by the DWARF utility in the GCC compiler during compilation. The following link shows a recent paper that uses this technique to find out the addresses in the binary that correspond to actual function starts.
https://www.usenix.org/conference/cset19/speaker-or-organizer/sri-shaila-g-university-california-riverside
We compared the function start identification capability in both Ghidra versions on a testset 10 benign binaries from the SPEC 2017 benchmark. We found that the number of correctly identified function starts found by Ghidra9.0.4 is higher than in Ghidra9.1 for ARM binaries.
**To Reproduce**
We take the 544.nab_r binary found in SPEC 2017 benchmark as an example. Other binaries can also be used.
1.Compile the source code into ELF32 Little Endian ARM binaries by using the GCC ARM compiler. Use the -g3 flag to attach debugging information to the dynamically linked unstripped binary.
2.The debugging information attached to the unstripped binary will give the addresses in the binary that correspond to actual function starts. We will refer to this list of function starts as the ground truth. [A reasonable but less accurate alternative to getting the ground truth is to use the disassembler on unstripped version of the binary to find the function starts, but there might be a few falsely identified functions.]
3. Compile the source code into ELF32 Little Endian stripped ARM binaries by using the GCC compiler. Use the -s flag to produce a dynamically linked stripped binary.
4. Disassemble the stripped binary from step 3 by using Ghidra 9.0.4 and record the function starts found by Ghidra 9.0.4 that are also found in the ground truth.
5. Disassemble the stripped binary from step 3 by using Ghidra 9.1 and record the function starts found by Ghidra 9.1 that are also found in the ground truth.
6. Compare the number of functions that are correctly identified in both version. In our observation, Ghidra9.0.4 could identify the functions at the following locations. However, Ghidra9.1 could not. These functions are also found in the ground truth.
0x22654
0x226a8
0x22704
0x22758
0x2279c
0x2847c
0x28904
0x27c3c
0x291ac
**Expected behavior**
We expected that Ghidra 9.1 would be able to identify the same number of correct function starts as Ghidra9.0.4 or identify more correct function starts.
**Screenshots**
If applicable, add screenshots to help explain your problem.

Ghidra9.0.4 can identify the function start at address 0x226a8

Ghidra9.1 cannot identify the function start at address 0x226a8

Ghidra9.0.4 can identify the function start at address 0x22654

Ghidra9.1 cannot identify the function start at address 0x22654
**Attachments**
If applicable, please attach any files that caused problems or log files generated by the software.
I have attached the compiled ARM binaries, both the stripped and the unstripped versions for the nab program from SPEC 2017 benchmark.
[Nab_Binaries.zip](https://github.com/NationalSecurityAgency/ghidra/files/4194137/Nab_Binaries.zip)
I have attached the original ARM_LE_Pattern.xml file from both Ghidra versions. I have also attached the patched version(according to the description below) of ARM_LE_Pattern.xml for Ghidra9.1.
[GhidraFunctionBytePatterns.zip](https://github.com/NationalSecurityAgency/ghidra/files/4194141/GhidraFunctionBytePatterns.zip)
**Environment (please complete the following information):**
OS: Ubuntu 18.04 LTS and Windows 10 Home Ed
Java Version: Open jdk version 11.0.6
Ghidra Version: Ghidra 9.0.4 and Ghidra 9.1
**Additional context**
The Fix/Patch
On further analysis, we found that the following line was added into the ARM_LE_patterns.xml file for Ghidra9.1
`<align mark="0" bits="3"/>`
This line prevents the function byte pattern rule associated with it to be applied to the ARM ELF binary under analysis. This decreases the number of correctly identified function starts found by Ghidra 9.1. Removing this line improves the number of functions that are correctly identified by Ghidra 9.1. When this line is removed, both versions are able to identify similar number of function starts correctly.
We also found that Ghidra9.1.1 was also not able to identify the function starts stated above for the 544.nab_r binary. It can recover more function starts if the patched version of the ARM_LE_Pattern.xml attached to this report is used instead of the original version.
|
process
|
reduced function start identification in compared to ghidra for arm binaries describe the bug the number of correctly detected function starts found by ghidra is lesser that ghidra for little endian stripped arm binaries we noticed this trend by comparing the function start identified by both versions of ghidra against the function start addresses given by the dwarf utility in the gcc compiler during compilation the following link shows a recent paper that uses this technique to find out the addresses in the binary that correspond to actual function starts we compared the function start identification capability in both ghidra versions on a testset benign binaries from the spec benchmark we found that the number of correctly identified function starts found by is higher than in for arm binaries to reproduce we take the nab r binary found in spec benchmark as an example other binaries can also be used compile the source code into little endian arm binaries by using the gcc arm compiler use the flag to attach debugging information to the dynamically linked unstripped binary the debugging information attached to the unstripped binary will give the addresses in the binary that correspond to actual function starts we will refer to this list of function starts as the ground truth compile the source code into little endian stripped arm binaries by using the gcc compiler use the s flag to produce a dynamically linked stripped binary disassemble the stripped binary from step by using ghidra and record the function starts found by ghidra that are also found in the ground truth disassemble the stripped binary from step by using ghidra and record the function starts found by ghidra that are also found in the ground truth compare the number of functions that are correctly identified in both version in our observation could identify the functions at the following locations however could not these functions are also found in the ground truth expected behavior we expected that ghidra would be able to identify the same number of correct function starts as or identify more correct function starts screenshots if applicable add screenshots to help explain your problem can identify the function start at address cannot identify the function start at address can identify the function start at address cannot identify the function start at address attachments if applicable please attach any files that caused problems or log files generated by the software i have attached the compiled arm binaries both the stripped and the unstripped versions for the nab program from spec benchmark i have attached the original arm le pattern xml file from both ghidra versions i have also attached the patched version according to the description below of arm le pattern xml for environment please complete the following information os ubuntu lts and windows home ed java version open jdk version ghidra version ghidra and ghidra additional context the fix patch on further analysis we found that the following line was added into the arm le patterns xml file for this line prevents the function byte pattern rule associated with it to be applied to the arm elf binary under analysis this decreases the number of correctly identified function starts found by ghidra removing this line improves the number of functions that are correctly identified by ghidra when this line is removed both versions are able to identify similar number of function starts correctly we also found that was also not able to identify the function starts stated above for the nab r binary it can recover more function starts if the patched version of the arm le pattern xml attached to this report is used instead of the original version
| 1
|
273,832
| 29,831,099,011
|
IssuesEvent
|
2023-06-18 09:31:44
|
RG4421/ampere-centos-kernel
|
https://api.github.com/repos/RG4421/ampere-centos-kernel
|
closed
|
WS-2021-0545 (Medium) detected in multiple libraries - autoclosed
|
Mend: dependency security vulnerability
|
## WS-2021-0545 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linuxv5.2</b>, <b>linuxv5.2</b>, <b>linuxv5.2</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
perf report: Fix memory leaks around perf_tip()
This is an automated ID intended to aid in discovery of potential security vulnerabilities. The actual impact and attack plausibility have not yet been proven.
This ID is fixed in Linux Kernel version v5.15.7 by commit 71e284dcebecb9fd204ff11097469cc547723ad1. For more details please see the references link.
<p>Publish Date: 2021-12-19
<p>URL: <a href=https://github.com/gregkh/linux/commit/71e284dcebecb9fd204ff11097469cc547723ad1>WS-2021-0545</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/GSD-2021-1002560">https://osv.dev/vulnerability/GSD-2021-1002560</a></p>
<p>Release Date: 2021-12-19</p>
<p>Fix Resolution: v5.15.7</p>
</p>
</details>
<p></p>
|
True
|
WS-2021-0545 (Medium) detected in multiple libraries - autoclosed - ## WS-2021-0545 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linuxv5.2</b>, <b>linuxv5.2</b>, <b>linuxv5.2</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
perf report: Fix memory leaks around perf_tip()
This is an automated ID intended to aid in discovery of potential security vulnerabilities. The actual impact and attack plausibility have not yet been proven.
This ID is fixed in Linux Kernel version v5.15.7 by commit 71e284dcebecb9fd204ff11097469cc547723ad1. For more details please see the references link.
<p>Publish Date: 2021-12-19
<p>URL: <a href=https://github.com/gregkh/linux/commit/71e284dcebecb9fd204ff11097469cc547723ad1>WS-2021-0545</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/GSD-2021-1002560">https://osv.dev/vulnerability/GSD-2021-1002560</a></p>
<p>Release Date: 2021-12-19</p>
<p>Fix Resolution: v5.15.7</p>
</p>
</details>
<p></p>
|
non_process
|
ws medium detected in multiple libraries autoclosed ws medium severity vulnerability vulnerable libraries vulnerability details perf report fix memory leaks around perf tip this is an automated id intended to aid in discovery of potential security vulnerabilities the actual impact and attack plausibility have not yet been proven this id is fixed in linux kernel version by commit for more details please see the references link publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
| 0
|
344,313
| 30,733,966,026
|
IssuesEvent
|
2023-07-28 05:49:21
|
woowacourse-teams/2023-3-ddang
|
https://api.github.com/repos/woowacourse-teams/2023-3-ddang
|
closed
|
ViewModel 테스트 작성 - AuctionDetailViewModel
|
android test
|
### 📄 설명
- Repository mocking
- ViewModel의 데이터가 의도에 맞게 잘 바뀌는지 확인
### ✅ 작업할 내용
- [ ] AuctionDetailViewModel 테스트 작성
### 🙋🏻 참고 자료
_No response_
### ⏰ 추정 시간
1시간
### ⏰ 소요 시간
1시간
|
1.0
|
ViewModel 테스트 작성 - AuctionDetailViewModel - ### 📄 설명
- Repository mocking
- ViewModel의 데이터가 의도에 맞게 잘 바뀌는지 확인
### ✅ 작업할 내용
- [ ] AuctionDetailViewModel 테스트 작성
### 🙋🏻 참고 자료
_No response_
### ⏰ 추정 시간
1시간
### ⏰ 소요 시간
1시간
|
non_process
|
viewmodel 테스트 작성 auctiondetailviewmodel 📄 설명 repository mocking viewmodel의 데이터가 의도에 맞게 잘 바뀌는지 확인 ✅ 작업할 내용 auctiondetailviewmodel 테스트 작성 🙋🏻 참고 자료 no response ⏰ 추정 시간 ⏰ 소요 시간
| 0
|
228,555
| 17,465,433,773
|
IssuesEvent
|
2021-08-06 16:06:52
|
api3dao/airnode
|
https://api.github.com/repos/api3dao/airnode
|
closed
|
Validate OIS title
|
documentation validator
|
OIS title isn't validated but it should be
https://github.com/api3dao/airnode/blob/c3ffc8daf06f86ee8cc37c3bf46cbeb967b90ae8/packages/validator/templates/ois.json#L5
Waiting on the discussion at ChainAPI to be resolved for what is allowed
|
1.0
|
Validate OIS title - OIS title isn't validated but it should be
https://github.com/api3dao/airnode/blob/c3ffc8daf06f86ee8cc37c3bf46cbeb967b90ae8/packages/validator/templates/ois.json#L5
Waiting on the discussion at ChainAPI to be resolved for what is allowed
|
non_process
|
validate ois title ois title isn t validated but it should be waiting on the discussion at chainapi to be resolved for what is allowed
| 0
|
47,546
| 25,061,363,418
|
IssuesEvent
|
2022-11-07 02:12:08
|
fedarko/strainFlye
|
https://api.github.com/repos/fedarko/strainFlye
|
opened
|
Use pyfaidx to access contigs in FASTA files
|
performance
|
A few of the commands need to access specific contig sequences while iterating over all contigs—for example, we go through all contigs → for those that match some condition, retrieve the contig sequence → do some more stuff using the sequence.
When there are thousands of contigs, I think this process will become slow (or at least slower than needed) because we retrieve sequences using [`fasta_utils.get_single_seq()`](https://github.com/fedarko/strainFlye/blob/789bb736a8cd4545fec25963725b109d84d90388/strainflye/fasta_utils.py#L116-L147)—which in turn goes through, in the worst case, every sequence in the FASTA file. Since this is done during an iteration over all contigs, it's one of those `O(|contigs|^2)` situations.
It is possible to speed this up by indexing the FASTA file to allow for "random access," which should cut the runtime of accessing a given contig down to ~`O(1)` (and thus cut the total runtime down to `O(|contigs|)`. It would probably make sense to just use an existing library that supports indexing / random access in FASTA files (rather than re-inventing the wheel); [pyfaidx](https://github.com/mdshw5/pyfaidx) seems good. (We currently read FASTA files using scikit-bio, and we can probably keep that around for most things, but I don't think it supports indexing / random access.)
|
True
|
Use pyfaidx to access contigs in FASTA files - A few of the commands need to access specific contig sequences while iterating over all contigs—for example, we go through all contigs → for those that match some condition, retrieve the contig sequence → do some more stuff using the sequence.
When there are thousands of contigs, I think this process will become slow (or at least slower than needed) because we retrieve sequences using [`fasta_utils.get_single_seq()`](https://github.com/fedarko/strainFlye/blob/789bb736a8cd4545fec25963725b109d84d90388/strainflye/fasta_utils.py#L116-L147)—which in turn goes through, in the worst case, every sequence in the FASTA file. Since this is done during an iteration over all contigs, it's one of those `O(|contigs|^2)` situations.
It is possible to speed this up by indexing the FASTA file to allow for "random access," which should cut the runtime of accessing a given contig down to ~`O(1)` (and thus cut the total runtime down to `O(|contigs|)`. It would probably make sense to just use an existing library that supports indexing / random access in FASTA files (rather than re-inventing the wheel); [pyfaidx](https://github.com/mdshw5/pyfaidx) seems good. (We currently read FASTA files using scikit-bio, and we can probably keep that around for most things, but I don't think it supports indexing / random access.)
|
non_process
|
use pyfaidx to access contigs in fasta files a few of the commands need to access specific contig sequences while iterating over all contigs—for example we go through all contigs → for those that match some condition retrieve the contig sequence → do some more stuff using the sequence when there are thousands of contigs i think this process will become slow or at least slower than needed because we retrieve sequences using in turn goes through in the worst case every sequence in the fasta file since this is done during an iteration over all contigs it s one of those o contigs situations it is possible to speed this up by indexing the fasta file to allow for random access which should cut the runtime of accessing a given contig down to o and thus cut the total runtime down to o contigs it would probably make sense to just use an existing library that supports indexing random access in fasta files rather than re inventing the wheel seems good we currently read fasta files using scikit bio and we can probably keep that around for most things but i don t think it supports indexing random access
| 0
|
296,942
| 25,585,273,178
|
IssuesEvent
|
2022-12-01 08:51:50
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
VPN config failed to load with an error `Failed to write the VPN config`
|
bug QA/Yes release-notes/exclude QA/Test-Plan-Specified OS/Desktop feature/vpn
|
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
VPN config error is shown when tried to connect to VPN in following scenarios:
Also noticed that `rasphone` entry disappears in `%APPDATA%\Microsoft\Network\Connections\Pbk` folder when VPN config error is shown
1. After removing any previous configs, create a new VPN profile and connect to VPN for the first time
2. Switch to different region once VPN is successfully connected
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
#### Scenario 1: VPN connection fails first time login in a new profile
Pre-requisite: Remove any pre-existing VPN configs from Windows settings
1. Install `1.46.130`
2. launch Brave
3. Enable `brave://flags/#brave-vpn `
4. click `Relaunch`
5. load `account.bravesoftware.com`
6. enter basic-auth creds for site
7. enter an email address and click `Get login link`
8. check your email inbox
9. click `Log in to Brave` or the link in the email
10. click on `Browse plans`
11. scroll down and click on `Buy now` for `Brave VPN Subscription`
12. enter payment details on `https://checkout.stripe.com/pay/cs_test…`
13. click `Subscribe`
14. Success screen is shown with `YOU HAVE ACTIVE CREDENTIALS LOADED!` message
15. Click VPN button in the toolbar and connect to VPN
#### Scenario 2: Switch to different region once VPN is successfully connected
- from scenario 1, switch from USA(Northwest) to Australia
- %APPDATA%\Microsoft\Network\Connections\Pbk - `rasphone` entry disappears
## Actual result:
<!--Please add screenshots if needed-->
Got this below error:
- `Failed to write the VPN config. This may happen if you don't have permission or if you haven't run Windows Update recently.`
- `%APPDATA%\Microsoft\Network\Connections\Pbk%` - do not show `rasphone` entry
#### Work around for both scenarios is
- close the browser
- reboot the machine
- relaunch Brave
- Toggled ON to connect VPN in the VPN panel
Logs show this error: `ERROR: failed to write "NumCustomPolicy" field to rasphone.pbk`
Ex1|Ex2|scenario 2
----|----|----
||
## Expected result:
VPN should be connected successfully without errors
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easily
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
```
Brave | 1.46.130 Chromium: 108.0.5359.62 (Official Build) (64-bit)
-- | --
Revision | 041930a89a990cfab0315a2d9f20d6429a4a67cf-refs/branch-heads/5359@{#938}
OS | Windows 11 Version 21H2 (Build 22000.1219)
```
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? Yes
- Can you reproduce this issue with the beta channel?
- Can you reproduce this issue with the nightly channel?
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields?
- Does the issue resolve itself when disabling Brave Rewards?
- Is the issue reproducible on the latest version of Chrome?
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
@mattmcalister @rebron @bsclifton @simonhong
cc: @brave/qa-team
|
1.0
|
VPN config failed to load with an error `Failed to write the VPN config` - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
VPN config error is shown when tried to connect to VPN in following scenarios:
Also noticed that `rasphone` entry disappears in `%APPDATA%\Microsoft\Network\Connections\Pbk` folder when VPN config error is shown
1. After removing any previous configs, create a new VPN profile and connect to VPN for the first time
2. Switch to different region once VPN is successfully connected
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
#### Scenario 1: VPN connection fails first time login in a new profile
Pre-requisite: Remove any pre-existing VPN configs from Windows settings
1. Install `1.46.130`
2. launch Brave
3. Enable `brave://flags/#brave-vpn `
4. click `Relaunch`
5. load `account.bravesoftware.com`
6. enter basic-auth creds for site
7. enter an email address and click `Get login link`
8. check your email inbox
9. click `Log in to Brave` or the link in the email
10. click on `Browse plans`
11. scroll down and click on `Buy now` for `Brave VPN Subscription`
12. enter payment details on `https://checkout.stripe.com/pay/cs_test…`
13. click `Subscribe`
14. Success screen is shown with `YOU HAVE ACTIVE CREDENTIALS LOADED!` message
15. Click VPN button in the toolbar and connect to VPN
#### Scenario 2: Switch to different region once VPN is successfully connected
- from scenario 1, switch from USA(Northwest) to Australia
- %APPDATA%\Microsoft\Network\Connections\Pbk - `rasphone` entry disappears
## Actual result:
<!--Please add screenshots if needed-->
Got this below error:
- `Failed to write the VPN config. This may happen if you don't have permission or if you haven't run Windows Update recently.`
- `%APPDATA%\Microsoft\Network\Connections\Pbk%` - do not show `rasphone` entry
#### Work around for both scenarios is
- close the browser
- reboot the machine
- relaunch Brave
- Toggled ON to connect VPN in the VPN panel
Logs show this error: `ERROR: failed to write "NumCustomPolicy" field to rasphone.pbk`
Ex1|Ex2|scenario 2
----|----|----
||
## Expected result:
VPN should be connected successfully without errors
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easily
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
```
Brave | 1.46.130 Chromium: 108.0.5359.62 (Official Build) (64-bit)
-- | --
Revision | 041930a89a990cfab0315a2d9f20d6429a4a67cf-refs/branch-heads/5359@{#938}
OS | Windows 11 Version 21H2 (Build 22000.1219)
```
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? Yes
- Can you reproduce this issue with the beta channel?
- Can you reproduce this issue with the nightly channel?
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields?
- Does the issue resolve itself when disabling Brave Rewards?
- Is the issue reproducible on the latest version of Chrome?
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
@mattmcalister @rebron @bsclifton @simonhong
cc: @brave/qa-team
|
non_process
|
vpn config failed to load with an error failed to write the vpn config have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description vpn config error is shown when tried to connect to vpn in following scenarios also noticed that rasphone entry disappears in appdata microsoft network connections pbk folder when vpn config error is shown after removing any previous configs create a new vpn profile and connect to vpn for the first time switch to different region once vpn is successfully connected steps to reproduce scenario vpn connection fails first time login in a new profile pre requisite remove any pre existing vpn configs from windows settings install launch brave enable brave flags brave vpn click relaunch load account bravesoftware com enter basic auth creds for site enter an email address and click get login link check your email inbox click log in to brave or the link in the email click on browse plans scroll down and click on buy now for brave vpn subscription enter payment details on click subscribe success screen is shown with you have active credentials loaded message click vpn button in the toolbar and connect to vpn scenario switch to different region once vpn is successfully connected from scenario switch from usa northwest to australia appdata microsoft network connections pbk rasphone entry disappears actual result got this below error failed to write the vpn config this may happen if you don t have permission or if you haven t run windows update recently appdata microsoft network connections pbk do not show rasphone entry work around for both scenarios is close the browser reboot the machine relaunch brave toggled on to connect vpn in the vpn panel logs show this error error failed to write numcustompolicy field to rasphone pbk scenario expected result vpn should be connected successfully without errors reproduces how often easily brave version brave version info brave chromium official build bit revision refs branch heads os windows version build version channel information can you reproduce this issue with the current release yes can you reproduce this issue with the beta channel can you reproduce this issue with the nightly channel other additional information does the issue resolve itself when disabling brave shields does the issue resolve itself when disabling brave rewards is the issue reproducible on the latest version of chrome miscellaneous information mattmcalister rebron bsclifton simonhong cc brave qa team
| 0
|
226,761
| 18,044,159,252
|
IssuesEvent
|
2021-09-18 15:39:18
|
logicmoo/logicmoo_workspace
|
https://api.github.com/repos/logicmoo/logicmoo_workspace
|
opened
|
logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01 JUnit
|
logicmoo.base.fol.fiveof Test_9999 unit_test NONMONOTONIC_TYPE_01
|
(cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl)
GH_MASTER_ISSUE_FINFO=
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANONMONOTONIC_TYPE_01
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/66/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://github.com/logicmoo/logicmoo_workspace/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl'),
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
%~ this_test_might_need( :-( expects_dialect(pfc)))
% =============================================
% File 'mpred_builtin.pfc'
% Purpose: Agent Reactivity for SWI-Prolog
% Maintainer: Douglas Miles
% Contact: $Author: dmiles $@users.sourceforge.net %
% Version: 'interface' 1.0.0
% Revision: $Revision: 1.9 $
% Revised At: $Date: 2002/06/27 14:13:20 $
% =============================================
%
:- module(baseKB).
:- process_script_file.
%~ kifm = leftof(h1,h2).
%~ kifm = leftof(h1,h2).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h1,h2)))))
%~ kifm = leftof(h1,h2).
%~ kifm = leftof(h2,h3).
%~ kifm = leftof(h2,h3).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h2,h3)))))
%~ kifm = leftof(h2,h3).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:25
%~ kifm = leftof(h3,h4).
%~ kifm = leftof(h3,h4).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h3,h4)))))
%~ kifm = leftof(h3,h4).
%~ kifm = leftof(h4,h5).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:26
%~ kifm = leftof(h4,h5).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h4,h5)))))
%~ kifm = leftof(h4,h5).
%~ kifm = ( leftof(House_Leftof,House_Leftof3) =>
%~ house(House_Leftof)&house(House_Leftof3)).
%~ kifm = ( leftof(House_Leftof8,House_Leftof9) =>
%~ house(House_Leftof8)&house(House_Leftof9)).
%~ debugm( common_logic_loader,
%~ show_success( common_logic_loader,
%~ common_logic_loader : ain( clif( leftof(H1,H2)=>(house(H1)&house(H2))))))
%~ kifm = ( leftof(House_Leftof,House_Leftof3) =>
%~ house(House_Leftof)&house(House_Leftof3)).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:31
%~ kifm = leftof(H1,H2)=>(house(H1)&house(H2)).
%~ mpred_test("Test_0001_Line_0038__H1",baseKB:house(h1))
%~ mpred_test("Test_0002_Line_0039__H2",baseKB:house(h2))
%~ mpred_test("Test_0003_Line_0040__H3",baseKB:house(h3))
%~ mpred_test("Test_0004_Line_0041__H4",baseKB:house(h4))
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:42
%~ mpred_test("Test_0005_Line_0042__H5",baseKB:house(h5))
%~ mpred_test("Test_0006_Line_0045__False_positive",baseKB:poss(house(false_positive)))
%~ mpred_test("Test_0007_Line_0047__naf_False_positive",baseKB:(\+nesc(house(false_positive))))
%~ skipped( listing( [ house/1,
%~ nesc/1]))
/*~
%~ kifm=leftof(h1,h2)
%~ kifm=leftof(h1,h2)
%~ kif_to_boxlog_attvars2 = leftof(h1,h2)
% /var/lib/jenkins/.local/share/swi-prolog/pack/pfc/prolog/pfc_test compiled into pfc_test 0.06 sec, -1 clauses
=======================================================
leftof(h1,h2)
============================================
?- kif_to_boxlog( leftof(h1,h2) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h1 leftof h2
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h1,h2)
%~ kif_to_boxlog_attvars2 = leftof(h1,h2)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h1,h2).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h1 leftof h2
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h1,h2).
============================================%~ kifm=leftof(h2,h3)
%~ kifm=leftof(h2,h3)
%~ kif_to_boxlog_attvars2 = leftof(h2,h3)
=======================================================
leftof(h2,h3)
============================================
?- kif_to_boxlog( leftof(h2,h3) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h2 leftof h3
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h2,h3)
%~ kif_to_boxlog_attvars2 = leftof(h2,h3)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h2,h3).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h2 leftof h3
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h2,h3).
============================================%~ kifm=leftof(h3,h4)
%~ kifm=leftof(h3,h4)
%~ kif_to_boxlog_attvars2 = leftof(h3,h4)
=======================================================
leftof(h3,h4)
============================================
?- kif_to_boxlog( leftof(h3,h4) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h3 leftof h4
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h3,h4)
%~ kif_to_boxlog_attvars2 = leftof(h3,h4)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h3,h4).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h3 leftof h4
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h3,h4).
============================================%~ kifm=leftof(h4,h5)
%~ kifm=leftof(h4,h5)
%~ kif_to_boxlog_attvars2 = leftof(h4,h5)
=======================================================
leftof(h4,h5)
============================================
?- kif_to_boxlog( leftof(h4,h5) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h4 leftof h5
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h4,h5)
%~ kif_to_boxlog_attvars2 = leftof(h4,h5)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h4,h5).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h4 leftof h5
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h4,h5).
============================================%~ kifm=leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3))
%~ kifm=leftof(House_Leftof8,House_Leftof9)=>(house(House_Leftof8)&house(House_Leftof9))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('House_Leftof8'),'$VAR'('House_Leftof9')),and(house('$VAR'('House_Leftof8')),house('$VAR'('House_Leftof9'))))
=======================================================
=>(leftof('$VAR'('House_Leftof'),'$VAR'('House_Leftof3')),&(house('$VAR'('House_Leftof')),house('$VAR'('House_Leftof3'))))
============================================
?- kif_to_boxlog( leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3)) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ If:
%~ ?House_Leftof leftof ?House_Leftof3 then it is
%~ Implied that:
%~ " ?House_Leftof isa house " and
%~ " ?House_Leftof3 isa house "
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('House_Leftof'),'$VAR'('House_Leftof3')),and(house('$VAR'('House_Leftof')),house('$VAR'('House_Leftof3'))))
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 6 entailment(s):
nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof))==>nesc(~house(House_Leftof3)).
nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof3))==>nesc(~house(House_Leftof)).
nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof))==>nesc(house(House_Leftof3)).
nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof3))==>nesc(house(House_Leftof)).
poss(house(House_Leftof))&nesc(~house(House_Leftof3))==>nesc(~leftof(House_Leftof,House_Leftof3)).
poss(house(House_Leftof3))&nesc(~house(House_Leftof))==>nesc(~leftof(House_Leftof,House_Leftof3)).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof3 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof)) ==>
nesc( ~( house(House_Leftof3)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof3 isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof3)) ==>
nesc( ~( house(House_Leftof)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof isa house " is possible
%~ It's Proof that:
%~ " ?House_Leftof3 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof)) ==>
nesc( house(House_Leftof3))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof3 isa house " is possible
%~ It's Proof that:
%~ " ?House_Leftof isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof3)) ==>
nesc( house(House_Leftof))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof isa house " is possible and
%~ " ?House_Leftof3 isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( poss(house(House_Leftof))&nesc(~house(House_Leftof3)) ==>
nesc( ~( leftof(House_Leftof,House_Leftof3)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof3 isa house " is possible and
%~ " ?House_Leftof isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( poss(house(House_Leftof3))&nesc(~house(House_Leftof)) ==>
nesc( ~( leftof(House_Leftof,House_Leftof3)))).
============================================%~ kifm=leftof(H1,H2)=>(house(H1)&house(H2))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('H1'),'$VAR'('H2')),and(house('$VAR'('H1')),house('$VAR'('H2'))))
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H2 isa house " is possible and
%~ " ?H1 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 leftof ?H2 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
poss(house(H2))&nesc(~house(H1))==>nesc(~leftof(H1,H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H1 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H2 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&nesc(~house(H1))==>nesc(~house(H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H2 isa house " is possible
%~ It's Proof that:
%~ " ?H1 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&poss(house(H2))==>nesc(house(H1)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 isa house " is possible and
%~ " ?H2 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 leftof ?H2 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
poss(house(H1))&nesc(~house(H2))==>nesc(~leftof(H1,H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H2 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&nesc(~house(H2))==>nesc(~house(H1)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H1 isa house " is possible
%~ It's Proof that:
%~ " ?H2 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&poss(house(H1))==>nesc(house(H2)).
[0m[1m[97m[40m?- listing(kif_show).[49m[0m[21m[0m%~ mpred_test("Test_0001_Line_0038__H1",baseKB:house(h1))
Call: (90) [baseKB] baseKB:house(h1)
Fail: (90) [baseKB] baseKB:house(h1)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h1))),rtrace(baseKB:house(h1))))
no_proof_for(\+house(h1)).
no_proof_for(\+house(h1)).
no_proof_for(\+house(h1)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0001_Line_0038__H1'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0001_Line_0038__H1-junit.xml
%~ mpred_test("Test_0002_Line_0039__H2",baseKB:house(h2))
Call: (90) [baseKB] baseKB:house(h2)
Fail: (90) [baseKB] baseKB:house(h2)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h2))),rtrace(baseKB:house(h2))))
no_proof_for(\+house(h2)).
no_proof_for(\+house(h2)).
no_proof_for(\+house(h2)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0002_Line_0039__H2'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0002_Line_0039__H2-junit.xml
%~ mpred_test("Test_0003_Line_0040__H3",baseKB:house(h3))
Call: (90) [baseKB] baseKB:house(h3)
Fail: (90) [baseKB] baseKB:house(h3)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h3))),rtrace(baseKB:house(h3))))
no_proof_for(\+house(h3)).
no_proof_for(\+house(h3)).
no_proof_for(\+house(h3)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0003_Line_0040__H3'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0003_Line_0040__H3-junit.xml
%~ mpred_test("Test_0004_Line_0041__H4",baseKB:house(h4))
Call: (90) [baseKB] baseKB:house(h4)
Fail: (90) [baseKB] baseKB:house(h4)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h4))),rtrace(baseKB:house(h4))))
no_proof_for(\+house(h4)).
no_proof_for(\+house(h4)).
no_proof_for(\+house(h4)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0004_Line_0041__H4'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0004_Line_0041__H4-junit.xml
%~ mpred_test("Test_0005_Line_0042__H5",baseKB:house(h5))
Call: (90) [baseKB] baseKB:house(h5)
Fail: (90) [baseKB] baseKB:house(h5)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h5))),rtrace(baseKB:house(h5))))
no_proof_for(\+house(h5)).
no_proof_for(\+house(h5)).
no_proof_for(\+house(h5)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0005_Line_0042__H5'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0005_Line_0042__H5-junit.xml
%~ mpred_test("Test_0006_Line_0045__False_positive",baseKB:poss(house(false_positive)))
passed=info(why_was_true(baseKB:poss(house(false_positive))))
no_proof_for(poss(house(false_positive))).
no_proof_for(poss(house(false_positive))).
no_proof_for(poss(house(false_positive))).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0006_Line_0045__False_positive'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0006_Line_0045__False_positive-junit.xml
%~ mpred_test("Test_0007_Line_0047__naf_False_positive",baseKB:(\+nesc(house(false_positive))))
passed=info(why_was_true(baseKB:(\+nesc(house(false_positive)))))
no_proof_for(\+nesc(house(false_positive))).
no_proof_for(\+nesc(house(false_positive))).
no_proof_for(\+nesc(house(false_positive))).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0007_Line_0047__naf_False_positive'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0007_Line_0047__naf_False_positive-junit.xml
~*/
%~ unused(save_junit_results)
%~ test_completed_exit(6)
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
```
totalTime=2
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANONMONOTONIC_TYPE_01
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/66/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://github.com/logicmoo/logicmoo_workspace/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k nonmonotonic_type_01.pl (returned 6)
|
2.0
|
logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01 JUnit - (cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl)
GH_MASTER_ISSUE_FINFO=
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANONMONOTONIC_TYPE_01
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/66/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://github.com/logicmoo/logicmoo_workspace/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl'),
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
%~ this_test_might_need( :-( expects_dialect(pfc)))
% =============================================
% File 'mpred_builtin.pfc'
% Purpose: Agent Reactivity for SWI-Prolog
% Maintainer: Douglas Miles
% Contact: $Author: dmiles $@users.sourceforge.net %
% Version: 'interface' 1.0.0
% Revision: $Revision: 1.9 $
% Revised At: $Date: 2002/06/27 14:13:20 $
% =============================================
%
:- module(baseKB).
:- process_script_file.
%~ kifm = leftof(h1,h2).
%~ kifm = leftof(h1,h2).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h1,h2)))))
%~ kifm = leftof(h1,h2).
%~ kifm = leftof(h2,h3).
%~ kifm = leftof(h2,h3).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h2,h3)))))
%~ kifm = leftof(h2,h3).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:25
%~ kifm = leftof(h3,h4).
%~ kifm = leftof(h3,h4).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h3,h4)))))
%~ kifm = leftof(h3,h4).
%~ kifm = leftof(h4,h5).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:26
%~ kifm = leftof(h4,h5).
%~ debugm(common_logic_loader,show_success(common_logic_loader,common_logic_loader:ain(clif(leftof(h4,h5)))))
%~ kifm = leftof(h4,h5).
%~ kifm = ( leftof(House_Leftof,House_Leftof3) =>
%~ house(House_Leftof)&house(House_Leftof3)).
%~ kifm = ( leftof(House_Leftof8,House_Leftof9) =>
%~ house(House_Leftof8)&house(House_Leftof9)).
%~ debugm( common_logic_loader,
%~ show_success( common_logic_loader,
%~ common_logic_loader : ain( clif( leftof(H1,H2)=>(house(H1)&house(H2))))))
%~ kifm = ( leftof(House_Leftof,House_Leftof3) =>
%~ house(House_Leftof)&house(House_Leftof3)).
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:31
%~ kifm = leftof(H1,H2)=>(house(H1)&house(H2)).
%~ mpred_test("Test_0001_Line_0038__H1",baseKB:house(h1))
%~ mpred_test("Test_0002_Line_0039__H2",baseKB:house(h2))
%~ mpred_test("Test_0003_Line_0040__H3",baseKB:house(h3))
%~ mpred_test("Test_0004_Line_0041__H4",baseKB:house(h4))
%~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl:42
%~ mpred_test("Test_0005_Line_0042__H5",baseKB:house(h5))
%~ mpred_test("Test_0006_Line_0045__False_positive",baseKB:poss(house(false_positive)))
%~ mpred_test("Test_0007_Line_0047__naf_False_positive",baseKB:(\+nesc(house(false_positive))))
%~ skipped( listing( [ house/1,
%~ nesc/1]))
/*~
%~ kifm=leftof(h1,h2)
%~ kifm=leftof(h1,h2)
%~ kif_to_boxlog_attvars2 = leftof(h1,h2)
% /var/lib/jenkins/.local/share/swi-prolog/pack/pfc/prolog/pfc_test compiled into pfc_test 0.06 sec, -1 clauses
=======================================================
leftof(h1,h2)
============================================
?- kif_to_boxlog( leftof(h1,h2) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h1 leftof h2
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h1,h2)
%~ kif_to_boxlog_attvars2 = leftof(h1,h2)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h1,h2).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h1 leftof h2
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h1,h2).
============================================%~ kifm=leftof(h2,h3)
%~ kifm=leftof(h2,h3)
%~ kif_to_boxlog_attvars2 = leftof(h2,h3)
=======================================================
leftof(h2,h3)
============================================
?- kif_to_boxlog( leftof(h2,h3) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h2 leftof h3
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h2,h3)
%~ kif_to_boxlog_attvars2 = leftof(h2,h3)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h2,h3).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h2 leftof h3
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h2,h3).
============================================%~ kifm=leftof(h3,h4)
%~ kifm=leftof(h3,h4)
%~ kif_to_boxlog_attvars2 = leftof(h3,h4)
=======================================================
leftof(h3,h4)
============================================
?- kif_to_boxlog( leftof(h3,h4) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h3 leftof h4
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h3,h4)
%~ kif_to_boxlog_attvars2 = leftof(h3,h4)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h3,h4).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h3 leftof h4
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h3,h4).
============================================%~ kifm=leftof(h4,h5)
%~ kifm=leftof(h4,h5)
%~ kif_to_boxlog_attvars2 = leftof(h4,h5)
=======================================================
leftof(h4,h5)
============================================
?- kif_to_boxlog( leftof(h4,h5) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h4 leftof h5
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(h4,h5)
%~ kif_to_boxlog_attvars2 = leftof(h4,h5)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 1 entailment(s):
leftof(h4,h5).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ h4 leftof h5
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
leftof(h4,h5).
============================================%~ kifm=leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3))
%~ kifm=leftof(House_Leftof8,House_Leftof9)=>(house(House_Leftof8)&house(House_Leftof9))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('House_Leftof8'),'$VAR'('House_Leftof9')),and(house('$VAR'('House_Leftof8')),house('$VAR'('House_Leftof9'))))
=======================================================
=>(leftof('$VAR'('House_Leftof'),'$VAR'('House_Leftof3')),&(house('$VAR'('House_Leftof')),house('$VAR'('House_Leftof3'))))
============================================
?- kif_to_boxlog( leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3)) ).
% In English:
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ If:
%~ ?House_Leftof leftof ?House_Leftof3 then it is
%~ Implied that:
%~ " ?House_Leftof isa house " and
%~ " ?House_Leftof3 isa house "
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ kifm=leftof(House_Leftof,House_Leftof3)=>(house(House_Leftof)&house(House_Leftof3))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('House_Leftof'),'$VAR'('House_Leftof3')),and(house('$VAR'('House_Leftof')),house('$VAR'('House_Leftof3'))))
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Results in the following 6 entailment(s):
nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof))==>nesc(~house(House_Leftof3)).
nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof3))==>nesc(~house(House_Leftof)).
nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof))==>nesc(house(House_Leftof3)).
nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof3))==>nesc(house(House_Leftof)).
poss(house(House_Leftof))&nesc(~house(House_Leftof3))==>nesc(~leftof(House_Leftof,House_Leftof3)).
poss(house(House_Leftof3))&nesc(~house(House_Leftof))==>nesc(~leftof(House_Leftof,House_Leftof3)).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof3 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof)) ==>
nesc( ~( house(House_Leftof3)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof3 isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&nesc(~house(House_Leftof3)) ==>
nesc( ~( house(House_Leftof)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof isa house " is possible
%~ It's Proof that:
%~ " ?House_Leftof3 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof)) ==>
nesc( house(House_Leftof3))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily true and
%~ " ?House_Leftof3 isa house " is possible
%~ It's Proof that:
%~ " ?House_Leftof isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( nesc(leftof(House_Leftof,House_Leftof3))&poss(house(House_Leftof3)) ==>
nesc( house(House_Leftof))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof isa house " is possible and
%~ " ?House_Leftof3 isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( poss(house(House_Leftof))&nesc(~house(House_Leftof3)) ==>
nesc( ~( leftof(House_Leftof,House_Leftof3)))).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?House_Leftof3 isa house " is possible and
%~ " ?House_Leftof isa house " is necessarily false
%~ It's Proof that:
%~ " ?House_Leftof leftof ?House_Leftof3 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
( poss(house(House_Leftof3))&nesc(~house(House_Leftof)) ==>
nesc( ~( leftof(House_Leftof,House_Leftof3)))).
============================================%~ kifm=leftof(H1,H2)=>(house(H1)&house(H2))
%~ kif_to_boxlog_attvars2 = =>(leftof('$VAR'('H1'),'$VAR'('H2')),and(house('$VAR'('H1')),house('$VAR'('H2'))))
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H2 isa house " is possible and
%~ " ?H1 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 leftof ?H2 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
poss(house(H2))&nesc(~house(H1))==>nesc(~leftof(H1,H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H1 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H2 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&nesc(~house(H1))==>nesc(~house(H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H2 isa house " is possible
%~ It's Proof that:
%~ " ?H1 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&poss(house(H2))==>nesc(house(H1)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 isa house " is possible and
%~ " ?H2 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 leftof ?H2 " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
poss(house(H1))&nesc(~house(H2))==>nesc(~leftof(H1,H2)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H2 isa house " is necessarily false
%~ It's Proof that:
%~ " ?H1 isa house " is necessarily false
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&nesc(~house(H2))==>nesc(~house(H1)).
% AND
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%~ Whenever:
%~ " ?H1 leftof ?H2 " is necessarily true and
%~ " ?H1 isa house " is possible
%~ It's Proof that:
%~ " ?H2 isa house " is necessarily true
%~
%~ %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
nesc(leftof(H1,H2))&poss(house(H1))==>nesc(house(H2)).
[0m[1m[97m[40m?- listing(kif_show).[49m[0m[21m[0m%~ mpred_test("Test_0001_Line_0038__H1",baseKB:house(h1))
Call: (90) [baseKB] baseKB:house(h1)
Fail: (90) [baseKB] baseKB:house(h1)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h1))),rtrace(baseKB:house(h1))))
no_proof_for(\+house(h1)).
no_proof_for(\+house(h1)).
no_proof_for(\+house(h1)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0001_Line_0038__H1'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0001_Line_0038__H1-junit.xml
%~ mpred_test("Test_0002_Line_0039__H2",baseKB:house(h2))
Call: (90) [baseKB] baseKB:house(h2)
Fail: (90) [baseKB] baseKB:house(h2)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h2))),rtrace(baseKB:house(h2))))
no_proof_for(\+house(h2)).
no_proof_for(\+house(h2)).
no_proof_for(\+house(h2)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0002_Line_0039__H2'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0002_Line_0039__H2-junit.xml
%~ mpred_test("Test_0003_Line_0040__H3",baseKB:house(h3))
Call: (90) [baseKB] baseKB:house(h3)
Fail: (90) [baseKB] baseKB:house(h3)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h3))),rtrace(baseKB:house(h3))))
no_proof_for(\+house(h3)).
no_proof_for(\+house(h3)).
no_proof_for(\+house(h3)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0003_Line_0040__H3'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0003_Line_0040__H3-junit.xml
%~ mpred_test("Test_0004_Line_0041__H4",baseKB:house(h4))
Call: (90) [baseKB] baseKB:house(h4)
Fail: (90) [baseKB] baseKB:house(h4)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h4))),rtrace(baseKB:house(h4))))
no_proof_for(\+house(h4)).
no_proof_for(\+house(h4)).
no_proof_for(\+house(h4)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0004_Line_0041__H4'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0004_Line_0041__H4-junit.xml
%~ mpred_test("Test_0005_Line_0042__H5",baseKB:house(h5))
Call: (90) [baseKB] baseKB:house(h5)
Fail: (90) [baseKB] baseKB:house(h5)
^ Call: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
^ Unify: (90) [must_sanity] must_sanity:mquietly_if(true, rtrace:tAt_normal)
failure=info((why_was_true(baseKB:(\+house(h5))),rtrace(baseKB:house(h5))))
no_proof_for(\+house(h5)).
no_proof_for(\+house(h5)).
no_proof_for(\+house(h5)).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0005_Line_0042__H5'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0005_Line_0042__H5-junit.xml
%~ mpred_test("Test_0006_Line_0045__False_positive",baseKB:poss(house(false_positive)))
passed=info(why_was_true(baseKB:poss(house(false_positive))))
no_proof_for(poss(house(false_positive))).
no_proof_for(poss(house(false_positive))).
no_proof_for(poss(house(false_positive))).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0006_Line_0045__False_positive'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0006_Line_0045__False_positive-junit.xml
%~ mpred_test("Test_0007_Line_0047__naf_False_positive",baseKB:(\+nesc(house(false_positive))))
passed=info(why_was_true(baseKB:(\+nesc(house(false_positive)))))
no_proof_for(\+nesc(house(false_positive))).
no_proof_for(\+nesc(house(false_positive))).
no_proof_for(\+nesc(house(false_positive))).
name ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0007_Line_0047__naf_False_positive'.
JUNIT_CLASSNAME ='logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01'.
JUNIT_CMD ='timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif nonmonotonic_type_01.pl'.
% saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-junit-fol-fiveof-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.base.fol.fiveof.NONMONOTONIC_TYPE_01-Test_0007_Line_0047__naf_False_positive-junit.xml
~*/
%~ unused(save_junit_results)
%~ test_completed_exit(6)
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
:- dynamic junit_prop/3.
```
totalTime=2
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ANONMONOTONIC_TYPE_01
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
This Build: https://jenkins.logicmoo.org/job/logicmoo_workspace/66/testReport/logicmoo.base.fol.fiveof/NONMONOTONIC_TYPE_01/logicmoo_base_fol_fiveof_NONMONOTONIC_TYPE_01_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3
https://github.com/logicmoo/logicmoo_workspace/blob/869479bc8cf913ee2df5ebbe49363a2dad9c9fb3/packs_sys/logicmoo_base/t/examples/fol/fiveof/nonmonotonic_type_01.pl
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k nonmonotonic_type_01.pl (returned 6)
|
non_process
|
logicmoo base fol fiveof nonmonotonic type junit cd var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl gh master issue finfo issue search gitlab latest this build github running var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl this test might need use module library logicmoo plarkc this test might need expects dialect pfc file mpred builtin pfc purpose agent reactivity for swi prolog maintainer douglas miles contact author dmiles users sourceforge net version interface revision revision revised at date module basekb process script file kifm leftof kifm leftof debugm common logic loader show success common logic loader common logic loader ain clif leftof kifm leftof kifm leftof kifm leftof debugm common logic loader show success common logic loader common logic loader ain clif leftof kifm leftof var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl kifm leftof kifm leftof debugm common logic loader show success common logic loader common logic loader ain clif leftof kifm leftof kifm leftof var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl kifm leftof debugm common logic loader show success common logic loader common logic loader ain clif leftof kifm leftof kifm leftof house leftof house house house leftof house house kifm leftof house house house house house house debugm common logic loader show success common logic loader common logic loader ain clif leftof house house kifm leftof house leftof house house house leftof house house var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl kifm leftof house house mpred test test line basekb house mpred test test line basekb house mpred test test line basekb house mpred test test line basekb house var lib jenkins workspace logicmoo workspace packs sys logicmoo base t examples fol fiveof nonmonotonic type pl mpred test test line basekb house mpred test test line false positive basekb poss house false positive mpred test test line naf false positive basekb nesc house false positive skipped listing house nesc kifm leftof kifm leftof kif to boxlog leftof var lib jenkins local share swi prolog pack pfc prolog pfc test compiled into pfc test sec clauses leftof kif to boxlog leftof in english leftof kifm leftof kif to boxlog leftof results in the following entailment s leftof leftof leftof kifm leftof kifm leftof kif to boxlog leftof leftof kif to boxlog leftof in english leftof kifm leftof kif to boxlog leftof results in the following entailment s leftof leftof leftof kifm leftof kifm leftof kif to boxlog leftof leftof kif to boxlog leftof in english leftof kifm leftof kif to boxlog leftof results in the following entailment s leftof leftof leftof kifm leftof kifm leftof kif to boxlog leftof leftof kif to boxlog leftof in english leftof kifm leftof kif to boxlog leftof results in the following entailment s leftof leftof leftof kifm leftof house leftof house house house leftof house house kifm leftof house house house house house house kif to boxlog leftof var house var house and house var house house var house leftof var house leftof var house house var house leftof house var house kif to boxlog leftof house leftof house house house leftof house house in english if house leftof leftof house then it is implied that house leftof isa house and house isa house kifm leftof house leftof house house house leftof house house kif to boxlog leftof var house leftof var house and house var house leftof house var house results in the following entailment s nesc leftof house leftof house nesc house house leftof nesc house house nesc leftof house leftof house nesc house house nesc house house leftof nesc leftof house leftof house poss house house leftof nesc house house nesc leftof house leftof house poss house house nesc house house leftof poss house house leftof nesc house house nesc leftof house leftof house poss house house nesc house house leftof nesc leftof house leftof house whenever house leftof leftof house is necessarily true and house leftof isa house is necessarily false it s proof that house isa house is necessarily false nesc leftof house leftof house nesc house house leftof nesc house house and whenever house leftof leftof house is necessarily true and house isa house is necessarily false it s proof that house leftof isa house is necessarily false nesc leftof house leftof house nesc house house nesc house house leftof and whenever house leftof leftof house is necessarily true and house leftof isa house is possible it s proof that house isa house is necessarily true nesc leftof house leftof house poss house house leftof nesc house house and whenever house leftof leftof house is necessarily true and house isa house is possible it s proof that house leftof isa house is necessarily true nesc leftof house leftof house poss house house nesc house house leftof and whenever house leftof isa house is possible and house isa house is necessarily false it s proof that house leftof leftof house is necessarily false poss house house leftof nesc house house nesc leftof house leftof house and whenever house isa house is possible and house leftof isa house is necessarily false it s proof that house leftof leftof house is necessarily false poss house house nesc house house leftof nesc leftof house leftof house kifm leftof house house kif to boxlog leftof var var and house var house var whenever isa house is possible and isa house is necessarily false it s proof that leftof is necessarily false poss house nesc house nesc leftof and whenever leftof is necessarily true and isa house is necessarily false it s proof that isa house is necessarily false nesc leftof nesc house nesc house and whenever leftof is necessarily true and isa house is possible it s proof that isa house is necessarily true nesc leftof poss house nesc house and whenever isa house is possible and isa house is necessarily false it s proof that leftof is necessarily false poss house nesc house nesc leftof and whenever leftof is necessarily true and isa house is necessarily false it s proof that isa house is necessarily false nesc leftof nesc house nesc house and whenever leftof is necessarily true and isa house is possible it s proof that isa house is necessarily true nesc leftof poss house nesc house listing kif show mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line basekb house call basekb house fail basekb house call must sanity mquietly if true rtrace tat normal unify must sanity mquietly if true rtrace tat normal failure info why was true basekb house rtrace basekb house no proof for house no proof for house no proof for house name logicmoo base fol fiveof nonmonotonic type test line junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line junit xml mpred test test line false positive basekb poss house false positive passed info why was true basekb poss house false positive no proof for poss house false positive no proof for poss house false positive no proof for poss house false positive name logicmoo base fol fiveof nonmonotonic type test line false positive junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line false positive junit xml mpred test test line naf false positive basekb nesc house false positive passed info why was true basekb nesc house false positive no proof for nesc house false positive no proof for nesc house false positive no proof for nesc house false positive name logicmoo base fol fiveof nonmonotonic type test line naf false positive junit classname logicmoo base fol fiveof nonmonotonic type junit cmd timeout foreground preserve status s sigkill k lmoo clif nonmonotonic type pl saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo junit fol fiveof units logicmoo base fol fiveof nonmonotonic type test line naf false positive junit xml unused save junit results test completed exit dynamic junit prop dynamic junit prop dynamic junit prop totaltime issue search gitlab latest this build github failed var lib jenkins workspace logicmoo workspace bin lmoo junit minor k nonmonotonic type pl returned
| 0
|
13,842
| 2,787,995,498
|
IssuesEvent
|
2015-05-08 10:28:44
|
CocoaPods/CocoaPods
|
https://api.github.com/repos/CocoaPods/CocoaPods
|
closed
|
pod setup breaks in 0.37.0
|
s1:awaiting input t2:defect
|
```
$ uname -a
Darwin CX22 14.3.0 Darwin Kernel Version 14.3.0: Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 x86_64
$ pod --version
0.37.0
$ gem --version
2.4.5
$ ruby --version
ruby 2.2.0p0 (2014-12-25 revision 49005) [x86_64-darwin14]
```
The command I used is `cd ~ && rm -rf .cocoapods && pod setup --verbose`
```
$ cd ~ && rm -rf .cocoapods && pod setup --verbose
Setting up CocoaPods master repo
Creating shallow clone of spec repo `master` from `https://github.com/CocoaPods/Specs.git` (branch `master`)
$ /usr/local/bin/git clone https://github.com/CocoaPods/Specs.git master --depth\=1
Cloning into 'master'...
Checking out files: 100% (36965/36965), done.
$ /usr/local/bin/git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
[!] There was an error reading '/Users/chakrit/.cocoapods/repos/master/CocoaPods-version.yml'.
Please consult http://blog.cocoapods.org/Repairing-Our-Broken-Specs-Repository/ for more information.
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/sources_manager.rb:333:in `rescue in version_information'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/sources_manager.rb:330:in `version_information'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/sources_manager.rb:251:in `check_version_information'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/repo/add.rb:47:in `block in run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/user_interface.rb:49:in `section'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/repo/add.rb:39:in `run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/setup.rb:84:in `add_master_repo'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/setup.rb:40:in `block in run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/user_interface.rb:49:in `section'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/setup.rb:32:in `run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/claide-0.8.1/lib/claide/command.rb:312:in `run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command.rb:46:in `run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/bin/pod:44:in `<top (required)>'
/Users/chakrit/.rbenv/versions/2.2.0/bin/pod:23:in `load'
/Users/chakrit/.rbenv/versions/2.2.0/bin/pod:23:in `<main>'
```
|
1.0
|
pod setup breaks in 0.37.0 - ```
$ uname -a
Darwin CX22 14.3.0 Darwin Kernel Version 14.3.0: Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64 x86_64
$ pod --version
0.37.0
$ gem --version
2.4.5
$ ruby --version
ruby 2.2.0p0 (2014-12-25 revision 49005) [x86_64-darwin14]
```
The command I used is `cd ~ && rm -rf .cocoapods && pod setup --verbose`
```
$ cd ~ && rm -rf .cocoapods && pod setup --verbose
Setting up CocoaPods master repo
Creating shallow clone of spec repo `master` from `https://github.com/CocoaPods/Specs.git` (branch `master`)
$ /usr/local/bin/git clone https://github.com/CocoaPods/Specs.git master --depth\=1
Cloning into 'master'...
Checking out files: 100% (36965/36965), done.
$ /usr/local/bin/git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
[!] There was an error reading '/Users/chakrit/.cocoapods/repos/master/CocoaPods-version.yml'.
Please consult http://blog.cocoapods.org/Repairing-Our-Broken-Specs-Repository/ for more information.
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/sources_manager.rb:333:in `rescue in version_information'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/sources_manager.rb:330:in `version_information'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/sources_manager.rb:251:in `check_version_information'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/repo/add.rb:47:in `block in run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/user_interface.rb:49:in `section'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/repo/add.rb:39:in `run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/setup.rb:84:in `add_master_repo'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/setup.rb:40:in `block in run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/user_interface.rb:49:in `section'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command/setup.rb:32:in `run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/claide-0.8.1/lib/claide/command.rb:312:in `run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/lib/cocoapods/command.rb:46:in `run'
/Users/chakrit/.rbenv/versions/2.2.0/lib/ruby/gems/2.2.0/gems/cocoapods-0.37.0/bin/pod:44:in `<top (required)>'
/Users/chakrit/.rbenv/versions/2.2.0/bin/pod:23:in `load'
/Users/chakrit/.rbenv/versions/2.2.0/bin/pod:23:in `<main>'
```
|
non_process
|
pod setup breaks in uname a darwin darwin kernel version mon mar pdt root xnu release pod version gem version ruby version ruby revision the command i used is cd rm rf cocoapods pod setup verbose cd rm rf cocoapods pod setup verbose setting up cocoapods master repo creating shallow clone of spec repo master from branch master usr local bin git clone master depth cloning into master checking out files done usr local bin git checkout master already on master your branch is up to date with origin master there was an error reading users chakrit cocoapods repos master cocoapods version yml please consult for more information users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods sources manager rb in rescue in version information users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods sources manager rb in version information users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods sources manager rb in check version information users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods command repo add rb in block in run users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods user interface rb in section users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods command repo add rb in run users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods command setup rb in add master repo users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods command setup rb in block in run users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods user interface rb in section users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods command setup rb in run users chakrit rbenv versions lib ruby gems gems claide lib claide command rb in run users chakrit rbenv versions lib ruby gems gems cocoapods lib cocoapods command rb in run users chakrit rbenv versions lib ruby gems gems cocoapods bin pod in users chakrit rbenv versions bin pod in load users chakrit rbenv versions bin pod in
| 0
|
389,284
| 11,500,405,141
|
IssuesEvent
|
2020-02-12 15:30:54
|
oysteijo/simd_neuralnet
|
https://api.github.com/repos/oysteijo/simd_neuralnet
|
closed
|
Refactor neuralnet_new and how nets are stored
|
High priority Refactor/Cleanup enhancement
|
The current `neuralnet_new` function tales in a filename to a npz file containing the weigths and biases of a neural network to be created, in addition to the names of the activations functions.
It would be an improvement if the activation functions could be stored in one "string" array together with the weights.
The function should then be renamed `neuralnet_read` or `neuralnet_open` or `neuralnet_from_file`, or something like that. We also have to change the `neuralnet_save` function accordingly.
|
1.0
|
Refactor neuralnet_new and how nets are stored - The current `neuralnet_new` function tales in a filename to a npz file containing the weigths and biases of a neural network to be created, in addition to the names of the activations functions.
It would be an improvement if the activation functions could be stored in one "string" array together with the weights.
The function should then be renamed `neuralnet_read` or `neuralnet_open` or `neuralnet_from_file`, or something like that. We also have to change the `neuralnet_save` function accordingly.
|
non_process
|
refactor neuralnet new and how nets are stored the current neuralnet new function tales in a filename to a npz file containing the weigths and biases of a neural network to be created in addition to the names of the activations functions it would be an improvement if the activation functions could be stored in one string array together with the weights the function should then be renamed neuralnet read or neuralnet open or neuralnet from file or something like that we also have to change the neuralnet save function accordingly
| 0
|
9,037
| 12,130,107,928
|
IssuesEvent
|
2020-04-23 00:30:40
|
GoogleCloudPlatform/python-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
|
closed
|
remove gcp-devrel-py-tools from appengine/standard/endpoints-frameworks-v2/iata/requirements-test.txt
|
priority: p2 remove-gcp-devrel-py-tools type: process
|
remove gcp-devrel-py-tools from appengine/standard/endpoints-frameworks-v2/iata/requirements-test.txt
|
1.0
|
remove gcp-devrel-py-tools from appengine/standard/endpoints-frameworks-v2/iata/requirements-test.txt - remove gcp-devrel-py-tools from appengine/standard/endpoints-frameworks-v2/iata/requirements-test.txt
|
process
|
remove gcp devrel py tools from appengine standard endpoints frameworks iata requirements test txt remove gcp devrel py tools from appengine standard endpoints frameworks iata requirements test txt
| 1
|
22,500
| 31,478,093,124
|
IssuesEvent
|
2023-08-30 12:11:47
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
hpcflow-new2 0.2.0a92 has 2 GuardDog issues
|
guarddog exec-base64 silent-process-execution
|
https://pypi.org/project/hpcflow-new2
https://inspector.pypi.io/project/hpcflow-new2
```{
"dependency": "hpcflow-new2",
"version": "0.2.0a92",
"result": {
"issues": 2,
"errors": {},
"results": {
"silent-process-execution": [
{
"location": "hpcflow_new2-0.2.0a92/hpcflow/sdk/helper/helper.py:111",
"code": " proc = subprocess.Popen(\n args=args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n **kwargs,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
],
"exec-base64": [
{
"location": "hpcflow_new2-0.2.0a92/hpcflow/sdk/submission/jobscript.py:981",
"code": " init_proc = subprocess.Popen(\n args=args,\n cwd=str(self.workflow.path),\n creationflags=subprocess.CREATE_NO_WINDOW,\n )",
"message": "This package contains a call to the `eval` function with a `base64` encoded string as argument.\nThis is a common method used to hide a malicious payload in a module as static analysis will not decode the\nstring.\n"
}
]
},
"path": "/tmp/tmp58q_ropn/hpcflow-new2"
}
}```
|
1.0
|
hpcflow-new2 0.2.0a92 has 2 GuardDog issues - https://pypi.org/project/hpcflow-new2
https://inspector.pypi.io/project/hpcflow-new2
```{
"dependency": "hpcflow-new2",
"version": "0.2.0a92",
"result": {
"issues": 2,
"errors": {},
"results": {
"silent-process-execution": [
{
"location": "hpcflow_new2-0.2.0a92/hpcflow/sdk/helper/helper.py:111",
"code": " proc = subprocess.Popen(\n args=args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n **kwargs,\n )",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
],
"exec-base64": [
{
"location": "hpcflow_new2-0.2.0a92/hpcflow/sdk/submission/jobscript.py:981",
"code": " init_proc = subprocess.Popen(\n args=args,\n cwd=str(self.workflow.path),\n creationflags=subprocess.CREATE_NO_WINDOW,\n )",
"message": "This package contains a call to the `eval` function with a `base64` encoded string as argument.\nThis is a common method used to hide a malicious payload in a module as static analysis will not decode the\nstring.\n"
}
]
},
"path": "/tmp/tmp58q_ropn/hpcflow-new2"
}
}```
|
process
|
hpcflow has guarddog issues dependency hpcflow version result issues errors results silent process execution location hpcflow hpcflow sdk helper helper py code proc subprocess popen n args args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n kwargs n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null exec location hpcflow hpcflow sdk submission jobscript py code init proc subprocess popen n args args n cwd str self workflow path n creationflags subprocess create no window n message this package contains a call to the eval function with a encoded string as argument nthis is a common method used to hide a malicious payload in a module as static analysis will not decode the nstring n path tmp ropn hpcflow
| 1
|
9,207
| 12,239,323,650
|
IssuesEvent
|
2020-05-04 21:27:14
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Need Azure DevOps Pipeline expression function to access a numbered item in an array
|
Pri1 devops-cicd-process/tech devops/prod
|
If you have an array-typed object in a template expression (e.g. passed by parameters), you can only iterate over that array with today's syntax. There is no way to access a specific (i.e. indexed) item from that array.
Consider the following example, where a template is invoked with a defaults object that contains a list of default object specifications coming from different sources. By using a filtered-array expression it is easy to access the `item` property of each array-element. But there is no way to access, the first or last item (which would be very useful in this scenario).
I propose to add three new function expressions:
* `item(index, array)`
Returns the element in `array` that is located at position `index` (0-based), or returns `null` if the index was out of bounds. `array` can be a filtered-array expression, or a YAML sequence object obtained from a parameter or variable in the current context.
If `array` is a string, the `item` returns the character located at `index`.
If `array` is neither an array, nor a string, then coerce the value into a string following normal type-casting rules.
* `first` works like `item` with the `index` parameter set to `0` to obatin the first item.
* `last` works like `item` with the `index` parameter set to the length of `array` minus 1, to get the last item in the sequence.
```yml
parameters:
- name: defaults
type: object
default:
- item: default1
- item: default2
- item: default3
steps:
- pwsh: 'Write-Host ${{ item(1, parameters.defaults.*.item) }}'
# Writes: default2
- pwsh: 'Write-Host ${{ first(parameters.defaults.*.item) }}'
# Writes default1
- pwsh: 'Write-Host ${{ last(parameters.defaults.*.item) }}'
# Writes default3
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6
* Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18
* Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#feedback)
* Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/expressions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Need Azure DevOps Pipeline expression function to access a numbered item in an array - If you have an array-typed object in a template expression (e.g. passed by parameters), you can only iterate over that array with today's syntax. There is no way to access a specific (i.e. indexed) item from that array.
Consider the following example, where a template is invoked with a defaults object that contains a list of default object specifications coming from different sources. By using a filtered-array expression it is easy to access the `item` property of each array-element. But there is no way to access, the first or last item (which would be very useful in this scenario).
I propose to add three new function expressions:
* `item(index, array)`
Returns the element in `array` that is located at position `index` (0-based), or returns `null` if the index was out of bounds. `array` can be a filtered-array expression, or a YAML sequence object obtained from a parameter or variable in the current context.
If `array` is a string, the `item` returns the character located at `index`.
If `array` is neither an array, nor a string, then coerce the value into a string following normal type-casting rules.
* `first` works like `item` with the `index` parameter set to `0` to obatin the first item.
* `last` works like `item` with the `index` parameter set to the length of `array` minus 1, to get the last item in the sequence.
```yml
parameters:
- name: defaults
type: object
default:
- item: default1
- item: default2
- item: default3
steps:
- pwsh: 'Write-Host ${{ item(1, parameters.defaults.*.item) }}'
# Writes: default2
- pwsh: 'Write-Host ${{ first(parameters.defaults.*.item) }}'
# Writes default1
- pwsh: 'Write-Host ${{ last(parameters.defaults.*.item) }}'
# Writes default3
```
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6
* Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18
* Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#feedback)
* Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/expressions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
need azure devops pipeline expression function to access a numbered item in an array if you have an array typed object in a template expression e g passed by parameters you can only iterate over that array with today s syntax there is no way to access a specific i e indexed item from that array consider the following example where a template is invoked with a defaults object that contains a list of default object specifications coming from different sources by using a filtered array expression it is easy to access the item property of each array element but there is no way to access the first or last item which would be very useful in this scenario i propose to add three new function expressions item index array returns the element in array that is located at position index based or returns null if the index was out of bounds array can be a filtered array expression or a yaml sequence object obtained from a parameter or variable in the current context if array is a string the item returns the character located at index if array is neither an array nor a string then coerce the value into a string following normal type casting rules first works like item with the index parameter set to to obatin the first item last works like item with the index parameter set to the length of array minus to get the last item in the sequence yml parameters name defaults type object default item item item steps pwsh write host item parameters defaults item writes pwsh write host first parameters defaults item writes pwsh write host last parameters defaults item writes document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
638,490
| 20,729,202,858
|
IssuesEvent
|
2022-03-14 07:38:34
|
pa1nki113r/Project_Brutality
|
https://api.github.com/repos/pa1nki113r/Project_Brutality
|
closed
|
Brightmap files
|
minor priority
|
The "Arachnophyte.txt" and "Draugr.txt" files in the BMAP folder are exactly the same.
|
1.0
|
Brightmap files - The "Arachnophyte.txt" and "Draugr.txt" files in the BMAP folder are exactly the same.
|
non_process
|
brightmap files the arachnophyte txt and draugr txt files in the bmap folder are exactly the same
| 0
|
1,050
| 3,518,609,769
|
IssuesEvent
|
2016-01-12 13:45:34
|
kerubistan/kerub
|
https://api.github.com/repos/kerubistan/kerub
|
opened
|
integrate ethtool utility
|
component:data processing enhancement priority: normal
|
could make use of ethtool to detect if wake-on-lan is configured on the host
|
1.0
|
integrate ethtool utility - could make use of ethtool to detect if wake-on-lan is configured on the host
|
process
|
integrate ethtool utility could make use of ethtool to detect if wake on lan is configured on the host
| 1
|
47,289
| 24,938,438,595
|
IssuesEvent
|
2022-10-31 16:52:46
|
Automattic/jetpack
|
https://api.github.com/repos/Automattic/jetpack
|
closed
|
Performance: minify all .js files
|
[Type] Enhancement General [Focus] Performance [Status] Stale
|
[We already use Grunt](https://github.com/Automattic/jetpack/blob/master/Gruntfile.js) for several aspects of the Jetpack codebase: sass, concat, jshint, ...
It would be nice if we could use it to generate minified versions of each one of the .js files included in Jetpack, thus improving performance a bit.
Suggested here:
https://wordpress.org/support/topic/please-minify-js?replies=1&view=all
|
True
|
Performance: minify all .js files - [We already use Grunt](https://github.com/Automattic/jetpack/blob/master/Gruntfile.js) for several aspects of the Jetpack codebase: sass, concat, jshint, ...
It would be nice if we could use it to generate minified versions of each one of the .js files included in Jetpack, thus improving performance a bit.
Suggested here:
https://wordpress.org/support/topic/please-minify-js?replies=1&view=all
|
non_process
|
performance minify all js files for several aspects of the jetpack codebase sass concat jshint it would be nice if we could use it to generate minified versions of each one of the js files included in jetpack thus improving performance a bit suggested here
| 0
|
12,999
| 15,360,041,231
|
IssuesEvent
|
2021-03-01 16:29:56
|
edwardsmarc/CASFRI
|
https://api.github.com/repos/edwardsmarc/CASFRI
|
closed
|
Implement temporalization
|
blocker enhancement post-translation process
|
Temporalization will be implemented according to the roadmap document. Each polygon will be splitted into many parts, each part being valid for a specific time interval.
Computed time interval for each polygon parts will be based on the STAND_PHOTO_YEAR attribute. It is still to determine what to do when stand_photo_year is not set (leave them or fix them. See issue #249 and #248).
Once issues with STAND_PHOTO_YEAR are solved, implementing temporalization becomes only a matter of writing a function taking the name of a table having a geometry column and a year and to compute all the historical parts of this polygons and their associated VALID_YEAR_START and VALID_YEAR_END values.
SELECT * FROM TT_GeoHistory(schemaName, tableName, uniqueIDColumn, geoColumnName, photoYearColName, hasPrecedenceFct, isValidFct)
will return a SETOF (geom, validYearStart, validYearEnd) for each polygon.
hasPrecedenceFct, if provided, is the name of a function taking two RECORDs. When two overlapping polygons of the same year are compared, TT_GeoHistory() will call this function to determine if the RECORD corresponding to the first polygon takes precedence over RECORD of the second one. If hasPrecedenceFct returns TRUE, the first polygon takes precedence over the second one. If hasPrecedenceFct returns FALSE, the second polygon takes precedence over the first one. If hasPrecedenceFct is not provided, TT_GeoHistory() always gives precedence to the polygon having the highest uniqueIDColumn.
isValidFct, if provided, is the name of a function taking one RECORD. It is used to determine if a polygon must be taken into account when computing old and future history of the area covered by a polygon. It will typically return FALSE if all significant attributes are NULL, leading TT_GeoHistory() to ignore this polygon. If only one significant attribute is not NULL, isValidFct returns TRUE. If isValidFct is not provided, all poygons are considered valid.
In an alternative version of TT_GeoHistory() hasPrecedenceFct and isValidFct have hardcoded names. There is only need to pass a list of attribute names (e.g. "cas_id, dist_type_1, species_1") to TT_GeoHistory() to activate the precedence and the validation mechanism. The advantage is that since TT_GeoHistory() knows the list of values to pass to both functions, it does not have to blindly and inefficiently pass all of them.
Some questions regarding the values assigned to VALID_YEAR_START and VALID_YEAR_END:
- If the earliest year for the area covered by a polygon part is 2000, should this become its VALID_YEAR_START or should it be assigned an earlier, arbitrary VALID_YEAR_START (e.g. 1940 or 1900) since this is all the info we have for the past of this area?
- If the latest year for the area covered by a polygon part is 2020, should this become its VALID_YEAR_END or should it be assigned a later, arbitrary VALID_YEAR_END (e.g. 3000 or NULL) since this is all the info we have for the future of this area?
|
1.0
|
Implement temporalization - Temporalization will be implemented according to the roadmap document. Each polygon will be splitted into many parts, each part being valid for a specific time interval.
Computed time interval for each polygon parts will be based on the STAND_PHOTO_YEAR attribute. It is still to determine what to do when stand_photo_year is not set (leave them or fix them. See issue #249 and #248).
Once issues with STAND_PHOTO_YEAR are solved, implementing temporalization becomes only a matter of writing a function taking the name of a table having a geometry column and a year and to compute all the historical parts of this polygons and their associated VALID_YEAR_START and VALID_YEAR_END values.
SELECT * FROM TT_GeoHistory(schemaName, tableName, uniqueIDColumn, geoColumnName, photoYearColName, hasPrecedenceFct, isValidFct)
will return a SETOF (geom, validYearStart, validYearEnd) for each polygon.
hasPrecedenceFct, if provided, is the name of a function taking two RECORDs. When two overlapping polygons of the same year are compared, TT_GeoHistory() will call this function to determine if the RECORD corresponding to the first polygon takes precedence over RECORD of the second one. If hasPrecedenceFct returns TRUE, the first polygon takes precedence over the second one. If hasPrecedenceFct returns FALSE, the second polygon takes precedence over the first one. If hasPrecedenceFct is not provided, TT_GeoHistory() always gives precedence to the polygon having the highest uniqueIDColumn.
isValidFct, if provided, is the name of a function taking one RECORD. It is used to determine if a polygon must be taken into account when computing old and future history of the area covered by a polygon. It will typically return FALSE if all significant attributes are NULL, leading TT_GeoHistory() to ignore this polygon. If only one significant attribute is not NULL, isValidFct returns TRUE. If isValidFct is not provided, all poygons are considered valid.
In an alternative version of TT_GeoHistory() hasPrecedenceFct and isValidFct have hardcoded names. There is only need to pass a list of attribute names (e.g. "cas_id, dist_type_1, species_1") to TT_GeoHistory() to activate the precedence and the validation mechanism. The advantage is that since TT_GeoHistory() knows the list of values to pass to both functions, it does not have to blindly and inefficiently pass all of them.
Some questions regarding the values assigned to VALID_YEAR_START and VALID_YEAR_END:
- If the earliest year for the area covered by a polygon part is 2000, should this become its VALID_YEAR_START or should it be assigned an earlier, arbitrary VALID_YEAR_START (e.g. 1940 or 1900) since this is all the info we have for the past of this area?
- If the latest year for the area covered by a polygon part is 2020, should this become its VALID_YEAR_END or should it be assigned a later, arbitrary VALID_YEAR_END (e.g. 3000 or NULL) since this is all the info we have for the future of this area?
|
process
|
implement temporalization temporalization will be implemented according to the roadmap document each polygon will be splitted into many parts each part being valid for a specific time interval computed time interval for each polygon parts will be based on the stand photo year attribute it is still to determine what to do when stand photo year is not set leave them or fix them see issue and once issues with stand photo year are solved implementing temporalization becomes only a matter of writing a function taking the name of a table having a geometry column and a year and to compute all the historical parts of this polygons and their associated valid year start and valid year end values select from tt geohistory schemaname tablename uniqueidcolumn geocolumnname photoyearcolname hasprecedencefct isvalidfct will return a setof geom validyearstart validyearend for each polygon hasprecedencefct if provided is the name of a function taking two records when two overlapping polygons of the same year are compared tt geohistory will call this function to determine if the record corresponding to the first polygon takes precedence over record of the second one if hasprecedencefct returns true the first polygon takes precedence over the second one if hasprecedencefct returns false the second polygon takes precedence over the first one if hasprecedencefct is not provided tt geohistory always gives precedence to the polygon having the highest uniqueidcolumn isvalidfct if provided is the name of a function taking one record it is used to determine if a polygon must be taken into account when computing old and future history of the area covered by a polygon it will typically return false if all significant attributes are null leading tt geohistory to ignore this polygon if only one significant attribute is not null isvalidfct returns true if isvalidfct is not provided all poygons are considered valid in an alternative version of tt geohistory hasprecedencefct and isvalidfct have hardcoded names there is only need to pass a list of attribute names e g cas id dist type species to tt geohistory to activate the precedence and the validation mechanism the advantage is that since tt geohistory knows the list of values to pass to both functions it does not have to blindly and inefficiently pass all of them some questions regarding the values assigned to valid year start and valid year end if the earliest year for the area covered by a polygon part is should this become its valid year start or should it be assigned an earlier arbitrary valid year start e g or since this is all the info we have for the past of this area if the latest year for the area covered by a polygon part is should this become its valid year end or should it be assigned a later arbitrary valid year end e g or null since this is all the info we have for the future of this area
| 1
|
592,024
| 17,868,373,241
|
IssuesEvent
|
2021-09-06 12:24:08
|
valora-inc/wallet
|
https://api.github.com/repos/valora-inc/wallet
|
opened
|
Size of In-app CTA increases automatically as user receives any new amount request from another user.
|
bug Priority: P3 ios android wallet qa-report
|
**Frequency:**100%
**Repro on build version:** Appstore Build V1.18.0, Playstore Build V1.18.3, Testflight Build V1.17.0
**Repro on devices:** iPhone 12 (14.7.1), iPhone 6+ (12.4.5), Redmi Note 8 (10.0), Oneplus 7T (11.0)
**Pre-condition:**
1. User must have installed application & login into the app.
**Repro Steps:**
1. Launch the application & user A redirects to home screen.
2. Send Request of valid amount from another user to User A.
3. Observe the In-app CTA on the home screen of User A.
**Current Behavior:** Size of In-app CTA increases automatically as user receives any new amount request from another user.
**Expected Behavior:** Size of In-app CTA should remain same when user receives any new amount request from another user.
**Investigation:** On tapping “Decline” button from request In-app CTA, the CTA get removed and the size of CTA reduces.
**Impact:** Bad user experience as CTA size increases when new request is received
**Attachment:**
[In-app CTA size issue.mp4](https://drive.google.com/file/d/1rgYGNqiuHkOUiGKBSpgJWj0TIX0SihZU/view?usp=sharing)
|
1.0
|
Size of In-app CTA increases automatically as user receives any new amount request from another user. - **Frequency:**100%
**Repro on build version:** Appstore Build V1.18.0, Playstore Build V1.18.3, Testflight Build V1.17.0
**Repro on devices:** iPhone 12 (14.7.1), iPhone 6+ (12.4.5), Redmi Note 8 (10.0), Oneplus 7T (11.0)
**Pre-condition:**
1. User must have installed application & login into the app.
**Repro Steps:**
1. Launch the application & user A redirects to home screen.
2. Send Request of valid amount from another user to User A.
3. Observe the In-app CTA on the home screen of User A.
**Current Behavior:** Size of In-app CTA increases automatically as user receives any new amount request from another user.
**Expected Behavior:** Size of In-app CTA should remain same when user receives any new amount request from another user.
**Investigation:** On tapping “Decline” button from request In-app CTA, the CTA get removed and the size of CTA reduces.
**Impact:** Bad user experience as CTA size increases when new request is received
**Attachment:**
[In-app CTA size issue.mp4](https://drive.google.com/file/d/1rgYGNqiuHkOUiGKBSpgJWj0TIX0SihZU/view?usp=sharing)
|
non_process
|
size of in app cta increases automatically as user receives any new amount request from another user frequency repro on build version appstore build playstore build testflight build repro on devices iphone iphone redmi note oneplus pre condition user must have installed application login into the app repro steps launch the application user a redirects to home screen send request of valid amount from another user to user a observe the in app cta on the home screen of user a current behavior size of in app cta increases automatically as user receives any new amount request from another user expected behavior size of in app cta should remain same when user receives any new amount request from another user investigation on tapping “decline” button from request in app cta the cta get removed and the size of cta reduces impact bad user experience as cta size increases when new request is received attachment
| 0
|
12,888
| 15,280,056,250
|
IssuesEvent
|
2021-02-23 05:30:55
|
topcoder-platform/community-app
|
https://api.github.com/repos/topcoder-platform/community-app
|
reopened
|
Recommended Challenges: Sorting is not working
|
P2 ShapeupProcess challenge- recommender-tool
|
When the recommended toggle is on, the default sort option is best match. But when the user selects other sort options like most recent and title(A-Z) the sort does not work.
<img width="1440" alt="Screenshot 2021-02-19 at 4 46 13 PM" src="https://user-images.githubusercontent.com/58783823/108497808-07883e00-72d2-11eb-9726-966ab6be26ee.png">
|
1.0
|
Recommended Challenges: Sorting is not working - When the recommended toggle is on, the default sort option is best match. But when the user selects other sort options like most recent and title(A-Z) the sort does not work.
<img width="1440" alt="Screenshot 2021-02-19 at 4 46 13 PM" src="https://user-images.githubusercontent.com/58783823/108497808-07883e00-72d2-11eb-9726-966ab6be26ee.png">
|
process
|
recommended challenges sorting is not working when the recommended toggle is on the default sort option is best match but when the user selects other sort options like most recent and title a z the sort does not work img width alt screenshot at pm src
| 1
|
24,880
| 7,575,594,546
|
IssuesEvent
|
2018-04-24 02:35:25
|
samurailens/teedknow-issue-tracker
|
https://api.github.com/repos/samurailens/teedknow-issue-tracker
|
closed
|
Students - Spelling mistakes in notifications.
|
Build Generated Priority: Low Status: Fixed
|
1. Login as Trainer
2. Click on Students
3. Click on Invite
4. Text invalid text and click on Send Email button
Issue 1 - Invite text is wrong. Please see the below screenshot:

Expected Result: Text should be "Sending Invite to Naresh098@hotmail.com...."
Issue 2 - Successful spelling is wrong check the below screenshot:

|
1.0
|
Students - Spelling mistakes in notifications. - 1. Login as Trainer
2. Click on Students
3. Click on Invite
4. Text invalid text and click on Send Email button
Issue 1 - Invite text is wrong. Please see the below screenshot:

Expected Result: Text should be "Sending Invite to Naresh098@hotmail.com...."
Issue 2 - Successful spelling is wrong check the below screenshot:

|
non_process
|
students spelling mistakes in notifications login as trainer click on students click on invite text invalid text and click on send email button issue invite text is wrong please see the below screenshot expected result text should be sending invite to hotmail com issue successful spelling is wrong check the below screenshot
| 0
|
6,787
| 9,921,251,422
|
IssuesEvent
|
2019-06-30 16:34:24
|
NottingHack/hms2
|
https://api.github.com/repos/NottingHack/hms2
|
closed
|
Waiting Approval list pages for membership team
|
Process
|
Ian asked for page that list all members with member.approval state and review buttons for each
Then update the `NewMemberApprovalNeeded` notification so that its links point to this page not a specific user approval
Another on that would help to tie in with #296 as then it could show which approvals have been reviewed but rejected and are awaiting update
p.s. also update notification to have a count
p.p.s slack notification when count changes or hits zero
|
1.0
|
Waiting Approval list pages for membership team - Ian asked for page that list all members with member.approval state and review buttons for each
Then update the `NewMemberApprovalNeeded` notification so that its links point to this page not a specific user approval
Another on that would help to tie in with #296 as then it could show which approvals have been reviewed but rejected and are awaiting update
p.s. also update notification to have a count
p.p.s slack notification when count changes or hits zero
|
process
|
waiting approval list pages for membership team ian asked for page that list all members with member approval state and review buttons for each then update the newmemberapprovalneeded notification so that its links point to this page not a specific user approval another on that would help to tie in with as then it could show which approvals have been reviewed but rejected and are awaiting update p s also update notification to have a count p p s slack notification when count changes or hits zero
| 1
|
56,951
| 8,132,907,118
|
IssuesEvent
|
2018-08-18 17:46:02
|
godotengine/godot
|
https://api.github.com/repos/godotengine/godot
|
closed
|
Shading language parser issue with retrieving vectors from matrices
|
bug documentation topic:rendering
|
Godot 2.0.3-stable
Looking at documentation on http://docs.godotengine.org/en/latest/reference/shading_language.html at the bottom of the page there is a "Obtaining world-space normal and position in material fragment program". The code does not work, seems like multiple vector extractions and swizzling is not supported properly. Also `mat3(INV_CAMERA_MATRIX)` does not work (invalid constructor argument).
`INV_CAMERA_MATRIX.w.xyz` does not work, while `(INV_CAMERA_MATRIX.w).xyz` is parsed properly.
Original code:
```
vec3 world_normal = NORMAL * mat3(INV_CAMERA_MATRIX);
vec3 world_pos = (VERTEX - INV_CAMERA_MATRIX.w.xyz) * mat3(INV_CAMERA_MATRIX);
```
Working code:
```
vec3 world_normal = NORMAL * mat3((INV_CAMERA_MATRIX.x).xyz, (INV_CAMERA_MATRIX.y).xyz, (INV_CAMERA_MATRIX.z).xyz);
vec3 world_pos = (VERTEX.xyz - (INV_CAMERA_MATRIX.w).xyz) * mat3((INV_CAMERA_MATRIX.x).xyz, (INV_CAMERA_MATRIX.y).xyz, (INV_CAMERA_MATRIX.z).xyz);
```
Also... after adding a `EMISSION = world_pos;` the result is still in screen-space... So I guess there's a parser and a documentation error here.
_EDIT:_
I did this in the fragment shader for a material, which is marked as "out" in the documentation... weird.
|
1.0
|
Shading language parser issue with retrieving vectors from matrices - Godot 2.0.3-stable
Looking at documentation on http://docs.godotengine.org/en/latest/reference/shading_language.html at the bottom of the page there is a "Obtaining world-space normal and position in material fragment program". The code does not work, seems like multiple vector extractions and swizzling is not supported properly. Also `mat3(INV_CAMERA_MATRIX)` does not work (invalid constructor argument).
`INV_CAMERA_MATRIX.w.xyz` does not work, while `(INV_CAMERA_MATRIX.w).xyz` is parsed properly.
Original code:
```
vec3 world_normal = NORMAL * mat3(INV_CAMERA_MATRIX);
vec3 world_pos = (VERTEX - INV_CAMERA_MATRIX.w.xyz) * mat3(INV_CAMERA_MATRIX);
```
Working code:
```
vec3 world_normal = NORMAL * mat3((INV_CAMERA_MATRIX.x).xyz, (INV_CAMERA_MATRIX.y).xyz, (INV_CAMERA_MATRIX.z).xyz);
vec3 world_pos = (VERTEX.xyz - (INV_CAMERA_MATRIX.w).xyz) * mat3((INV_CAMERA_MATRIX.x).xyz, (INV_CAMERA_MATRIX.y).xyz, (INV_CAMERA_MATRIX.z).xyz);
```
Also... after adding a `EMISSION = world_pos;` the result is still in screen-space... So I guess there's a parser and a documentation error here.
_EDIT:_
I did this in the fragment shader for a material, which is marked as "out" in the documentation... weird.
|
non_process
|
shading language parser issue with retrieving vectors from matrices godot stable looking at documentation on at the bottom of the page there is a obtaining world space normal and position in material fragment program the code does not work seems like multiple vector extractions and swizzling is not supported properly also inv camera matrix does not work invalid constructor argument inv camera matrix w xyz does not work while inv camera matrix w xyz is parsed properly original code world normal normal inv camera matrix world pos vertex inv camera matrix w xyz inv camera matrix working code world normal normal inv camera matrix x xyz inv camera matrix y xyz inv camera matrix z xyz world pos vertex xyz inv camera matrix w xyz inv camera matrix x xyz inv camera matrix y xyz inv camera matrix z xyz also after adding a emission world pos the result is still in screen space so i guess there s a parser and a documentation error here edit i did this in the fragment shader for a material which is marked as out in the documentation weird
| 0
|
135,702
| 18,717,800,532
|
IssuesEvent
|
2021-11-03 08:13:11
|
shaimael/IdentityServer4
|
https://api.github.com/repos/shaimael/IdentityServer4
|
closed
|
CVE-2018-14040 (Medium) detected in multiple libraries - autoclosed
|
security vulnerability
|
## CVE-2018-14040 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.5.min.js</b>, <b>bootstrap-3.3.6.min.js</b>, <b>bootstrap-3.3.5.js</b>, <b>bootstrap-3.3.6.js</b></p></summary>
<p>
<details><summary><b>bootstrap-3.3.5.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/old/MvcManual/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.5.min.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-3.3.6.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.min.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-3.3.5.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.js</a></p>
<p>Path to vulnerable library: /samples/Clients/old/MvcManual/wwwroot/lib/bootstrap/dist/js/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.5.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-3.3.6.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js</a></p>
<p>Path to vulnerable library: /samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.js,/samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/shaimael/IdentityServer4/commit/1a971ceee12750a348ada2520e1769e6c763fb5f">1a971ceee12750a348ada2520e1769e6c763fb5f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.
<p>Publish Date: 2018-07-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040>CVE-2018-14040</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p>
<p>Release Date: 2018-07-13</p>
<p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.5","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.6","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.5","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.6","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2018-14040","vulnerabilityDetails":"In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-14040 (Medium) detected in multiple libraries - autoclosed - ## CVE-2018-14040 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.5.min.js</b>, <b>bootstrap-3.3.6.min.js</b>, <b>bootstrap-3.3.5.js</b>, <b>bootstrap-3.3.6.js</b></p></summary>
<p>
<details><summary><b>bootstrap-3.3.5.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/old/MvcManual/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.5.min.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-3.3.6.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.min.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-3.3.5.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.js</a></p>
<p>Path to vulnerable library: /samples/Clients/old/MvcManual/wwwroot/lib/bootstrap/dist/js/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.5.js** (Vulnerable Library)
</details>
<details><summary><b>bootstrap-3.3.6.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.js</a></p>
<p>Path to vulnerable library: /samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.js,/samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.js** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/shaimael/IdentityServer4/commit/1a971ceee12750a348ada2520e1769e6c763fb5f">1a971ceee12750a348ada2520e1769e6c763fb5f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.
<p>Publish Date: 2018-07-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040>CVE-2018-14040</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p>
<p>Release Date: 2018-07-13</p>
<p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.5","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.6","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.5","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"},{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.6","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2018-14040","vulnerabilityDetails":"In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in multiple libraries autoclosed cve medium severity vulnerability vulnerable libraries bootstrap min js bootstrap min js bootstrap js bootstrap js bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library samples clients old mvcmanual wwwroot lib bootstrap dist js bootstrap min js dependency hierarchy x bootstrap min js vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library samples clients src mvchybridbackchannel wwwroot lib bootstrap dist js bootstrap min js samples clients old mvchybrid wwwroot lib bootstrap dist js bootstrap min js samples clients old mvchybridautomaticrefresh wwwroot lib bootstrap dist js bootstrap min js dependency hierarchy x bootstrap min js vulnerable library bootstrap js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library samples clients old mvcmanual wwwroot lib bootstrap dist js bootstrap js dependency hierarchy x bootstrap js vulnerable library bootstrap js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library samples clients old mvchybridautomaticrefresh wwwroot lib bootstrap dist js bootstrap js samples clients src mvchybridbackchannel wwwroot lib bootstrap dist js bootstrap js samples clients old mvchybrid wwwroot lib bootstrap dist js bootstrap js dependency hierarchy x bootstrap js vulnerable library found in head commit a href found in base branch main vulnerability details in bootstrap before xss is possible in the collapse data parent attribute publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org webjars npm bootstrap org webjars bootstrap isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree twitter bootstrap isminimumfixversionavailable true minimumfixversion org webjars npm bootstrap org webjars bootstrap packagetype javascript packagename twitter bootstrap packageversion packagefilepaths istransitivedependency false dependencytree twitter bootstrap isminimumfixversionavailable true minimumfixversion org webjars npm bootstrap org webjars bootstrap packagetype javascript packagename twitter bootstrap packageversion packagefilepaths istransitivedependency false dependencytree twitter bootstrap isminimumfixversionavailable true minimumfixversion org webjars npm bootstrap org webjars bootstrap packagetype javascript packagename twitter bootstrap packageversion packagefilepaths istransitivedependency false dependencytree twitter bootstrap isminimumfixversionavailable true minimumfixversion org webjars npm bootstrap org webjars bootstrap basebranches vulnerabilityidentifier cve vulnerabilitydetails in bootstrap before xss is possible in the collapse data parent attribute vulnerabilityurl
| 0
|
15,445
| 19,661,550,695
|
IssuesEvent
|
2022-01-10 17:31:38
|
googleapis/google-cloud-go
|
https://api.github.com/repos/googleapis/google-cloud-go
|
closed
|
compute/apiv1/firewall_policies_client.go: invalid operation: req.ParentId != nil (mismatched types string and nil)
|
type: bug priority: p2 api: compute type: process
|
I tried updating my `go.mod` packages by running:
`go get -u ./...`
Received the error message:
```bash
go get -u ./...
# cloud.google.com/go/compute/apiv1
/go/pkg/mod/cloud.google.com/go@v0.97.0/compute/apiv1/firewall_policies_client.go:684:32: invalid operation: req.ParentId != nil (mismatched types string and nil)
/go/pkg/mod/cloud.google.com/go@v0.97.0/compute/apiv1/firewall_policies_client.go:869:32: invalid operation: req.ParentId != nil (mismatched types string and nil)
```
On `go1.17.3 linux/amd64`
For your reference, here are the source code lines mentioned in the error:
https://github.com/googleapis/google-cloud-go/blob/main/compute/apiv1/firewall_policies_client.go#L684-L689
```go
if req != nil && req.ParentId != nil {
params.Add("parentId", fmt.Sprintf("%v", req.GetParentId()))
}
if req != nil && req.RequestId != nil {
params.Add("requestId", fmt.Sprintf("%v", req.GetRequestId()))
}
```
|
1.0
|
compute/apiv1/firewall_policies_client.go: invalid operation: req.ParentId != nil (mismatched types string and nil) - I tried updating my `go.mod` packages by running:
`go get -u ./...`
Received the error message:
```bash
go get -u ./...
# cloud.google.com/go/compute/apiv1
/go/pkg/mod/cloud.google.com/go@v0.97.0/compute/apiv1/firewall_policies_client.go:684:32: invalid operation: req.ParentId != nil (mismatched types string and nil)
/go/pkg/mod/cloud.google.com/go@v0.97.0/compute/apiv1/firewall_policies_client.go:869:32: invalid operation: req.ParentId != nil (mismatched types string and nil)
```
On `go1.17.3 linux/amd64`
For your reference, here are the source code lines mentioned in the error:
https://github.com/googleapis/google-cloud-go/blob/main/compute/apiv1/firewall_policies_client.go#L684-L689
```go
if req != nil && req.ParentId != nil {
params.Add("parentId", fmt.Sprintf("%v", req.GetParentId()))
}
if req != nil && req.RequestId != nil {
params.Add("requestId", fmt.Sprintf("%v", req.GetRequestId()))
}
```
|
process
|
compute firewall policies client go invalid operation req parentid nil mismatched types string and nil i tried updating my go mod packages by running go get u received the error message bash go get u cloud google com go compute go pkg mod cloud google com go compute firewall policies client go invalid operation req parentid nil mismatched types string and nil go pkg mod cloud google com go compute firewall policies client go invalid operation req parentid nil mismatched types string and nil on linux for your reference here are the source code lines mentioned in the error go if req nil req parentid nil params add parentid fmt sprintf v req getparentid if req nil req requestid nil params add requestid fmt sprintf v req getrequestid
| 1
|
19,971
| 26,451,749,364
|
IssuesEvent
|
2023-01-16 11:45:42
|
firebase/firebase-cpp-sdk
|
https://api.github.com/repos/firebase/firebase-cpp-sdk
|
closed
|
[C++] Nightly Integration Testing Report for Firestore
|
type: process nightly-testing
|
<hidden value="integration-test-status-comment"></hidden>
### ✅ [build against repo] Integration test succeeded!
Requested by @sunmou99 on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Sun Jan 15 04:03 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3922884794)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against SDK] Integration test succeeded!
Requested by @firebase-workflow-trigger[bot] on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Sun Jan 15 05:50 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3923379154)**
<hidden value="integration-test-status-comment"></hidden>
***
### ❌ [build against tip] Integration test FAILED
Requested by @sunmou99 on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Sun Jan 15 03:37 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3923085560)**
| Failures | Configs |
|----------|---------|
| firestore | [TEST] [FAILURE] [Linux] [1/2 ssl_lib: x64] [1/2 build_type: boringssl]<details><summary>(1 failed tests)</summary> NumericTransformsTest.ServerTimestampAndIncrement</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
|
1.0
|
[C++] Nightly Integration Testing Report for Firestore -
<hidden value="integration-test-status-comment"></hidden>
### ✅ [build against repo] Integration test succeeded!
Requested by @sunmou99 on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Sun Jan 15 04:03 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3922884794)**
<hidden value="integration-test-status-comment"></hidden>
***
### ✅ [build against SDK] Integration test succeeded!
Requested by @firebase-workflow-trigger[bot] on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Sun Jan 15 05:50 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3923379154)**
<hidden value="integration-test-status-comment"></hidden>
***
### ❌ [build against tip] Integration test FAILED
Requested by @sunmou99 on commit 45f8e3268c2adbabca165ed0a937835f18930d2f
Last updated: Sun Jan 15 03:37 PST 2023
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3923085560)**
| Failures | Configs |
|----------|---------|
| firestore | [TEST] [FAILURE] [Linux] [1/2 ssl_lib: x64] [1/2 build_type: boringssl]<details><summary>(1 failed tests)</summary> NumericTransformsTest.ServerTimestampAndIncrement</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
|
process
|
nightly integration testing report for firestore ✅ nbsp integration test succeeded requested by on commit last updated sun jan pst ✅ nbsp integration test succeeded requested by firebase workflow trigger on commit last updated sun jan pst ❌ nbsp integration test failed requested by on commit last updated sun jan pst failures configs firestore failed tests nbsp nbsp numerictransformstest servertimestampandincrement add flaky tests to
| 1
|
189,797
| 14,523,174,467
|
IssuesEvent
|
2020-12-14 09:47:03
|
proarc/proarc
|
https://api.github.com/repos/proarc/proarc
|
closed
|
Nastavit organizaci objektu
|
6 k testování Release-3.5.18
|
Podle organizace objektu a organizace uživatele filtrovat záznamy, které smí uživatel vidět.
|
1.0
|
Nastavit organizaci objektu - Podle organizace objektu a organizace uživatele filtrovat záznamy, které smí uživatel vidět.
|
non_process
|
nastavit organizaci objektu podle organizace objektu a organizace uživatele filtrovat záznamy které smí uživatel vidět
| 0
|
7,724
| 10,832,455,736
|
IssuesEvent
|
2019-11-11 10:38:31
|
liskcenterutrecht/lisk.bike
|
https://api.github.com/repos/liskcenterutrecht/lisk.bike
|
closed
|
Onboarding: Create bike
|
Process Flow Virtual Lock Server
|
Onboarding means that the Virtual Lock Server creates a wallet for the lock on the lisk.bike sidechain. Now there's a connection between the lock and the pubkey, in the Lisk blockchain.
See
[] https://github.com/bartwr/lisk-bike-blockchain-app/blob/master/client/create-account.js(url)
[]https://github.com/bartwr/lisk-bike-blockchain-app/blob/master/transactions/create-bike.js(url)
- [x] Lock sends the 'login' command to server
- [x] Server creates wallet for lock using ./client/create-account.js
- [x] Server registers lock onto the blockchain and stores BikeID (=Pubkey) + Imei on VLS using ./client/create-bike.js
|
1.0
|
Onboarding: Create bike - Onboarding means that the Virtual Lock Server creates a wallet for the lock on the lisk.bike sidechain. Now there's a connection between the lock and the pubkey, in the Lisk blockchain.
See
[] https://github.com/bartwr/lisk-bike-blockchain-app/blob/master/client/create-account.js(url)
[]https://github.com/bartwr/lisk-bike-blockchain-app/blob/master/transactions/create-bike.js(url)
- [x] Lock sends the 'login' command to server
- [x] Server creates wallet for lock using ./client/create-account.js
- [x] Server registers lock onto the blockchain and stores BikeID (=Pubkey) + Imei on VLS using ./client/create-bike.js
|
process
|
onboarding create bike onboarding means that the virtual lock server creates a wallet for the lock on the lisk bike sidechain now there s a connection between the lock and the pubkey in the lisk blockchain see lock sends the login command to server server creates wallet for lock using client create account js server registers lock onto the blockchain and stores bikeid pubkey imei on vls using client create bike js
| 1
|
11,648
| 14,502,201,822
|
IssuesEvent
|
2020-12-11 20:39:43
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Node Path
|
Pri1 devops-cicd-process/tech devops/prod doc-enhancement
|
Hi Guys,
Would be great to describe where the build is looking for the node if the label '"com.azure.dev.pipelines.agent.handler.node.path"' is not defined.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 3339a2e0-be29-1363-f588-b231d4472c02
* Version Independent ID: 72dd11a3-704d-d0fd-6dfa-cf49f3352de3
* Content: [Container Jobs in Azure Pipelines and TFS - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/container-phases?view=azure-devops)
* Content Source: [docs/pipelines/process/container-phases.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/container-phases.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Node Path - Hi Guys,
Would be great to describe where the build is looking for the node if the label '"com.azure.dev.pipelines.agent.handler.node.path"' is not defined.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 3339a2e0-be29-1363-f588-b231d4472c02
* Version Independent ID: 72dd11a3-704d-d0fd-6dfa-cf49f3352de3
* Content: [Container Jobs in Azure Pipelines and TFS - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/container-phases?view=azure-devops)
* Content Source: [docs/pipelines/process/container-phases.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/container-phases.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
node path hi guys would be great to describe where the build is looking for the node if the label com azure dev pipelines agent handler node path is not defined document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
19,752
| 26,113,575,966
|
IssuesEvent
|
2022-12-28 01:02:01
|
quark-engine/quark-engine
|
https://api.github.com/repos/quark-engine/quark-engine
|
closed
|
Prepare to release version v22.12.1
|
work-in-progress issue-processing-state-06
|
Update the version number in init.py for the release with the latest version of Quark.
It includes the following changes.
* #434
* #436
|
1.0
|
Prepare to release version v22.12.1 - Update the version number in init.py for the release with the latest version of Quark.
It includes the following changes.
* #434
* #436
|
process
|
prepare to release version update the version number in init py for the release with the latest version of quark it includes the following changes
| 1
|
299,884
| 25,934,081,462
|
IssuesEvent
|
2022-12-16 12:37:23
|
opendatadiscovery/odd-platform
|
https://api.github.com/repos/opendatadiscovery/odd-platform
|
opened
|
Cover the section "Collector" in Management by autotests
|
kind: testing
|
### Is your proposal related to a problem?
Cover the section "Collector" in Management
### Describe the solution you'd like
- Add new collector without required data
- Edit required data in a Collector
- Edit non-required data in a Collector
- Regenerate token in a Collector
- Delete a collector
- Add new collector with non-required data
### Additional context
https://odd.testops.cloud/project/1/test-cases/155?treeId=2
|
1.0
|
Cover the section "Collector" in Management by autotests - ### Is your proposal related to a problem?
Cover the section "Collector" in Management
### Describe the solution you'd like
- Add new collector without required data
- Edit required data in a Collector
- Edit non-required data in a Collector
- Regenerate token in a Collector
- Delete a collector
- Add new collector with non-required data
### Additional context
https://odd.testops.cloud/project/1/test-cases/155?treeId=2
|
non_process
|
cover the section collector in management by autotests is your proposal related to a problem cover the section collector in management describe the solution you d like add new collector without required data edit required data in a collector edit non required data in a collector regenerate token in a collector delete a collector add new collector with non required data additional context
| 0
|
418,574
| 12,199,879,045
|
IssuesEvent
|
2020-04-30 02:57:13
|
PopupMaker/Popup-Maker
|
https://api.github.com/repos/PopupMaker/Popup-Maker
|
opened
|
Refactor initialization routine
|
component-front-end priority-medium scope-developer-apis scope-reliability type-improvement
|
- [ ] Should allow extensions to delay init until all promises resolve (geo lookup). Likely using wp.hooks and a filtered array of promise functions to call and wait for.
- [ ] Should allow per popup activation & deactivation for polling.
|
1.0
|
Refactor initialization routine - - [ ] Should allow extensions to delay init until all promises resolve (geo lookup). Likely using wp.hooks and a filtered array of promise functions to call and wait for.
- [ ] Should allow per popup activation & deactivation for polling.
|
non_process
|
refactor initialization routine should allow extensions to delay init until all promises resolve geo lookup likely using wp hooks and a filtered array of promise functions to call and wait for should allow per popup activation deactivation for polling
| 0
|
11,780
| 14,614,110,707
|
IssuesEvent
|
2020-12-22 09:22:30
|
FraserYu/myBlogTalk
|
https://api.github.com/repos/FraserYu/myBlogTalk
|
opened
|
Docker Container就是个进程 | 日拱一兵|Java|Spring Boot|Java并发编程|最新干货分享
|
/p/docker-container-is-a-process.html Gitalk
|
https://dayarch.top/p/docker-container-is-a-process.html
学习新内容有门槛,降低门槛的办法就是贴近我们已有知识,Docker Container 其实就是一个进程,进程是可以获取操作系统资源的,Container 亦是如此,只不过具体的方式在上层做了封装略有不同, 本站是日拱一兵的技术分享博客,内容涵盖Java后端技术、Spring Boot、Java并发编程等技术研究与分享,用有趣的方式解读技术
|
1.0
|
Docker Container就是个进程 | 日拱一兵|Java|Spring Boot|Java并发编程|最新干货分享 - https://dayarch.top/p/docker-container-is-a-process.html
学习新内容有门槛,降低门槛的办法就是贴近我们已有知识,Docker Container 其实就是一个进程,进程是可以获取操作系统资源的,Container 亦是如此,只不过具体的方式在上层做了封装略有不同, 本站是日拱一兵的技术分享博客,内容涵盖Java后端技术、Spring Boot、Java并发编程等技术研究与分享,用有趣的方式解读技术
|
process
|
docker container就是个进程 日拱一兵|java|spring boot|java并发编程|最新干货分享 学习新内容有门槛,降低门槛的办法就是贴近我们已有知识,docker container 其实就是一个进程,进程是可以获取操作系统资源的,container 亦是如此,只不过具体的方式在上层做了封装略有不同 本站是日拱一兵的技术分享博客,内容涵盖java后端技术、spring boot、java并发编程等技术研究与分享,用有趣的方式解读技术
| 1
|
677,690
| 23,170,845,753
|
IssuesEvent
|
2022-07-30 17:36:26
|
episphere/connectApp
|
https://api.github.com/repos/episphere/connectApp
|
closed
|
Message with Login link goes to "promotions" tab for Gmail accounts
|
Medium Priority MVP Sign-up
|
When logging in with the link sent to the participant's email, in Gmail, the message goes to the "promotions" tab (which is typically ads from companies). Before we used this method, Connect emails ended up in the "primary" tab.
Might be an issue in future because promotional emails do not notify the account holder of a new message (unless the account holder changes the notifications), and if participants don't check the "promotions" tab, they may think that the email was not sent. (In my personal experience I get hundreds of promotional emails every day that I don't look through).

|
1.0
|
Message with Login link goes to "promotions" tab for Gmail accounts - When logging in with the link sent to the participant's email, in Gmail, the message goes to the "promotions" tab (which is typically ads from companies). Before we used this method, Connect emails ended up in the "primary" tab.
Might be an issue in future because promotional emails do not notify the account holder of a new message (unless the account holder changes the notifications), and if participants don't check the "promotions" tab, they may think that the email was not sent. (In my personal experience I get hundreds of promotional emails every day that I don't look through).

|
non_process
|
message with login link goes to promotions tab for gmail accounts when logging in with the link sent to the participant s email in gmail the message goes to the promotions tab which is typically ads from companies before we used this method connect emails ended up in the primary tab might be an issue in future because promotional emails do not notify the account holder of a new message unless the account holder changes the notifications and if participants don t check the promotions tab they may think that the email was not sent in my personal experience i get hundreds of promotional emails every day that i don t look through
| 0
|
219,129
| 7,333,351,836
|
IssuesEvent
|
2018-03-05 19:08:10
|
NCEAS/metacat
|
https://api.github.com/repos/NCEAS/metacat
|
closed
|
Invalidly formatted dates in listObjects fromDate / toDate parameters do not result in InvalidRequest
|
Component: Bugzilla-Id Priority: Normal Status: Resolved Tracker: Bug
|
---
Author Name: **Jing Tao** (Jing Tao)
Original Redmine Issue: 7232, https://projects.ecoinformatics.org/ecoinfo/issues/7232
Original Date: 2017-12-08
Original Assignee: Jing Tao
---
Details see:
https://redmine.dataone.org/issues/8130
|
1.0
|
Invalidly formatted dates in listObjects fromDate / toDate parameters do not result in InvalidRequest - ---
Author Name: **Jing Tao** (Jing Tao)
Original Redmine Issue: 7232, https://projects.ecoinformatics.org/ecoinfo/issues/7232
Original Date: 2017-12-08
Original Assignee: Jing Tao
---
Details see:
https://redmine.dataone.org/issues/8130
|
non_process
|
invalidly formatted dates in listobjects fromdate todate parameters do not result in invalidrequest author name jing tao jing tao original redmine issue original date original assignee jing tao details see
| 0
|
20,789
| 27,531,724,700
|
IssuesEvent
|
2023-03-06 22:43:44
|
medic/cht-core
|
https://api.github.com/repos/medic/cht-core
|
opened
|
Release 4.1.1
|
Type: Internal process
|
# Planning - Product Manager
- [ ] Create an GH Milestone and add this issue to it.
- [ ] Add all the issues to be worked on to the Milestone.
- [ ] Ensure that all issues are labelled correctly, particularly ensure that "Regressions" are labelled with "Affects: <version>" labels. The "Affects" label is used in a link in the Known Issues section of the release notes of that version so it has to match exactly. To make sure the label is correct go to the [release notes](https://docs.communityhealthtoolkit.org/core/releases/#release-notes) and ensure the issue is listed.
# Development - Release Engineer
When development is ready to begin one of the engineers should be nominated as a Release Engineer. They will be responsible for making sure the following tasks are completed though not necessarily completing them.
- [ ] Set the version number in `package.json` and `package-lock.json` and submit a PR to the release branch. The easiest way to do this is to use `npm --no-git-tag-version version patch`.
- [ ] Write an update in the weekly Product Team call agenda summarising development and acceptance testing progress and identifying any blockers (the [milestone-status](https://github.com/medic/support-scripts/tree/master/milestone-status) script can be used to get a breakdown of the issues). The Release Engineer is to update this every week until the version is released.
# Releasing - Release Engineer
Once all issues have passed acceptance testing and have been merged into `master` and backported to the release branch release testing can begin.
- [ ] Build a beta named `<major>.<minor>.<patch>-beta.1` by pushing a git tag and when CI completes successfully notify the QA team that it's ready for release testing.
- [ ] Add release notes to the [Core Framework Releases](https://docs.communityhealthtoolkit.org/core/releases/) page:
- [ ] Create a new document for the release in the [releases folder](https://github.com/medic/cht-docs/tree/main/content/en/core/releases).
- [ ] Ensure all issues are in the GH Milestone, that they're correctly labelled (in particular: they have the right Type, "UI/UX" if they change the UI, and "Breaking change" if appropriate), and have human readable descriptions.
- [ ] Use [this script](https://github.com/medic/cht-core/blob/master/scripts/release-notes) to export the issues into our release note format.
- [ ] Manually document any known migration steps and known issues.
- [ ] Add a link to the new release page in the [Release Notes](https://docs.communityhealthtoolkit.org/core/releases/#release-notes) section.
- [ ] Assign the PR to:
- The Director of Technology
- An SRE to review and confirm the documentation on upgrade instructions and breaking changes is sufficient
- [ ] Until release testing passes, make sure regressions are fixed in `master`, cherry-pick them into the release branch, and release another beta.
- [ ] Create a release in GitHub from the release branch so it shows up under the [Releases tab](https://github.com/medic/cht-core/releases) with the naming convention `<major>.<minor>.<patch>`. This will create the git tag automatically. Link to the release notes in the description of the release.
- [ ] Confirm the release build completes successfully and the new release is available on the [market](https://staging.dev.medicmobile.org/builds_4/releases). Make sure that the document has new entry with `id: medic:medic:<major>.<minor>.<patch>`
- [ ] Announce the release on the [CHT forum](https://forum.communityhealthtoolkit.org/), under the "Product - Releases" category using this template:
```
*Announcing the release of {{version}}*
This release fixes {{number of bugs}}. Read the [release notes]({{url}}) for full details.
```
- [ ] Mark this issue "done" and close the Milestone.
|
1.0
|
Release 4.1.1 - # Planning - Product Manager
- [ ] Create an GH Milestone and add this issue to it.
- [ ] Add all the issues to be worked on to the Milestone.
- [ ] Ensure that all issues are labelled correctly, particularly ensure that "Regressions" are labelled with "Affects: <version>" labels. The "Affects" label is used in a link in the Known Issues section of the release notes of that version so it has to match exactly. To make sure the label is correct go to the [release notes](https://docs.communityhealthtoolkit.org/core/releases/#release-notes) and ensure the issue is listed.
# Development - Release Engineer
When development is ready to begin one of the engineers should be nominated as a Release Engineer. They will be responsible for making sure the following tasks are completed though not necessarily completing them.
- [ ] Set the version number in `package.json` and `package-lock.json` and submit a PR to the release branch. The easiest way to do this is to use `npm --no-git-tag-version version patch`.
- [ ] Write an update in the weekly Product Team call agenda summarising development and acceptance testing progress and identifying any blockers (the [milestone-status](https://github.com/medic/support-scripts/tree/master/milestone-status) script can be used to get a breakdown of the issues). The Release Engineer is to update this every week until the version is released.
# Releasing - Release Engineer
Once all issues have passed acceptance testing and have been merged into `master` and backported to the release branch release testing can begin.
- [ ] Build a beta named `<major>.<minor>.<patch>-beta.1` by pushing a git tag and when CI completes successfully notify the QA team that it's ready for release testing.
- [ ] Add release notes to the [Core Framework Releases](https://docs.communityhealthtoolkit.org/core/releases/) page:
- [ ] Create a new document for the release in the [releases folder](https://github.com/medic/cht-docs/tree/main/content/en/core/releases).
- [ ] Ensure all issues are in the GH Milestone, that they're correctly labelled (in particular: they have the right Type, "UI/UX" if they change the UI, and "Breaking change" if appropriate), and have human readable descriptions.
- [ ] Use [this script](https://github.com/medic/cht-core/blob/master/scripts/release-notes) to export the issues into our release note format.
- [ ] Manually document any known migration steps and known issues.
- [ ] Add a link to the new release page in the [Release Notes](https://docs.communityhealthtoolkit.org/core/releases/#release-notes) section.
- [ ] Assign the PR to:
- The Director of Technology
- An SRE to review and confirm the documentation on upgrade instructions and breaking changes is sufficient
- [ ] Until release testing passes, make sure regressions are fixed in `master`, cherry-pick them into the release branch, and release another beta.
- [ ] Create a release in GitHub from the release branch so it shows up under the [Releases tab](https://github.com/medic/cht-core/releases) with the naming convention `<major>.<minor>.<patch>`. This will create the git tag automatically. Link to the release notes in the description of the release.
- [ ] Confirm the release build completes successfully and the new release is available on the [market](https://staging.dev.medicmobile.org/builds_4/releases). Make sure that the document has new entry with `id: medic:medic:<major>.<minor>.<patch>`
- [ ] Announce the release on the [CHT forum](https://forum.communityhealthtoolkit.org/), under the "Product - Releases" category using this template:
```
*Announcing the release of {{version}}*
This release fixes {{number of bugs}}. Read the [release notes]({{url}}) for full details.
```
- [ ] Mark this issue "done" and close the Milestone.
|
process
|
release planning product manager create an gh milestone and add this issue to it add all the issues to be worked on to the milestone ensure that all issues are labelled correctly particularly ensure that regressions are labelled with affects labels the affects label is used in a link in the known issues section of the release notes of that version so it has to match exactly to make sure the label is correct go to the and ensure the issue is listed development release engineer when development is ready to begin one of the engineers should be nominated as a release engineer they will be responsible for making sure the following tasks are completed though not necessarily completing them set the version number in package json and package lock json and submit a pr to the release branch the easiest way to do this is to use npm no git tag version version patch write an update in the weekly product team call agenda summarising development and acceptance testing progress and identifying any blockers the script can be used to get a breakdown of the issues the release engineer is to update this every week until the version is released releasing release engineer once all issues have passed acceptance testing and have been merged into master and backported to the release branch release testing can begin build a beta named beta by pushing a git tag and when ci completes successfully notify the qa team that it s ready for release testing add release notes to the page create a new document for the release in the ensure all issues are in the gh milestone that they re correctly labelled in particular they have the right type ui ux if they change the ui and breaking change if appropriate and have human readable descriptions use to export the issues into our release note format manually document any known migration steps and known issues add a link to the new release page in the section assign the pr to the director of technology an sre to review and confirm the documentation on upgrade instructions and breaking changes is sufficient until release testing passes make sure regressions are fixed in master cherry pick them into the release branch and release another beta create a release in github from the release branch so it shows up under the with the naming convention this will create the git tag automatically link to the release notes in the description of the release confirm the release build completes successfully and the new release is available on the make sure that the document has new entry with id medic medic announce the release on the under the product releases category using this template announcing the release of version this release fixes number of bugs read the url for full details mark this issue done and close the milestone
| 1
|
17,235
| 22,955,345,947
|
IssuesEvent
|
2022-07-19 11:02:38
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Add duration to mongo logging.
|
kind/bug process/candidate tech/engines topic: logging topic: mongodb
|
### Bug description
With this code
```js
const prisma = newPrismaClient({
log: [
{
emit: 'event',
level: 'info',
},
{
emit: 'event',
level: 'query',
},
],
})
prisma.$on('query', onQuery)
await prisma.user.findMany()
```
when using mongo I get following event:
```js
{
is_query: true,
item_type: 'query',
level: 'debug',
module_path: 'mongodb_query_connector::query',
params: '[]',
query: 'db.User.aggregate([ { $project: { _id: 1, }, }, ])'
}
```
where with sqlite or any other provider I get someting along these lines:
```js
{
duration_ms: 0,
is_query: true,
item_type: 'query',
level: 'debug',
module_path: 'quaint::connector::metrics',
params: '[-1,0]',
query: 'SELECT `main`.`User`.`id` FROM `main`.`User` WHERE 1=1 LIMIT ? OFFSET ?',
result: 'success'
}
```
### How to reproduce
<!--
1. Go to '...'
2. Change '....'
3. Run '....'
4. See error
-->
### Expected behavior
MongoDB logs should include the duration in the log event.
### Prisma information
<!-- Do not include your database credentials when sharing your Prisma schema! -->
### Environment & setup
- OS: <!--[e.g. Mac OS, Windows, Debian, CentOS, ...]-->
- Database: <!--[PostgreSQL, MySQL, MariaDB or SQLite]-->
- Node.js version: <!--[Run `node -v` to see your Node.js version]-->
### Prisma Version
```
```
|
1.0
|
Add duration to mongo logging. - ### Bug description
With this code
```js
const prisma = newPrismaClient({
log: [
{
emit: 'event',
level: 'info',
},
{
emit: 'event',
level: 'query',
},
],
})
prisma.$on('query', onQuery)
await prisma.user.findMany()
```
when using mongo I get following event:
```js
{
is_query: true,
item_type: 'query',
level: 'debug',
module_path: 'mongodb_query_connector::query',
params: '[]',
query: 'db.User.aggregate([ { $project: { _id: 1, }, }, ])'
}
```
where with sqlite or any other provider I get someting along these lines:
```js
{
duration_ms: 0,
is_query: true,
item_type: 'query',
level: 'debug',
module_path: 'quaint::connector::metrics',
params: '[-1,0]',
query: 'SELECT `main`.`User`.`id` FROM `main`.`User` WHERE 1=1 LIMIT ? OFFSET ?',
result: 'success'
}
```
### How to reproduce
<!--
1. Go to '...'
2. Change '....'
3. Run '....'
4. See error
-->
### Expected behavior
MongoDB logs should include the duration in the log event.
### Prisma information
<!-- Do not include your database credentials when sharing your Prisma schema! -->
### Environment & setup
- OS: <!--[e.g. Mac OS, Windows, Debian, CentOS, ...]-->
- Database: <!--[PostgreSQL, MySQL, MariaDB or SQLite]-->
- Node.js version: <!--[Run `node -v` to see your Node.js version]-->
### Prisma Version
```
```
|
process
|
add duration to mongo logging bug description with this code js const prisma newprismaclient log emit event level info emit event level query prisma on query onquery await prisma user findmany when using mongo i get following event js is query true item type query level debug module path mongodb query connector query params query db user aggregate where with sqlite or any other provider i get someting along these lines js duration ms is query true item type query level debug module path quaint connector metrics params query select main user id from main user where limit offset result success how to reproduce go to change run see error expected behavior mongodb logs should include the duration in the log event prisma information environment setup os database node js version prisma version
| 1
|
15,507
| 19,703,265,551
|
IssuesEvent
|
2022-01-12 18:52:13
|
googleapis/java-binary-authorization
|
https://api.github.com/repos/googleapis/java-binary-authorization
|
opened
|
Your .repo-metadata.json file has a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'binary-authorization' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname 'binary-authorization' invalid in .repo-metadata.json
☝️ Once you correct these problems, you can close this issue.
Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 release level must be equal to one of the allowed values in repo metadata json api shortname binary authorization invalid in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
| 1
|
93,742
| 27,027,106,582
|
IssuesEvent
|
2023-02-11 18:44:55
|
nature-of-code/book-website-2nd-edition
|
https://api.github.com/repos/nature-of-code/book-website-2nd-edition
|
closed
|
Math Formula
|
magicbook build
|
Latex / Katex math formula rendering is broken in HTML and PDF builds.
(Not sure if this is true just an example issue).
|
1.0
|
Math Formula - Latex / Katex math formula rendering is broken in HTML and PDF builds.
(Not sure if this is true just an example issue).
|
non_process
|
math formula latex katex math formula rendering is broken in html and pdf builds not sure if this is true just an example issue
| 0
|
430,495
| 12,453,956,180
|
IssuesEvent
|
2020-05-27 14:34:33
|
Broken-Gem-Studio/Broken-Engine
|
https://api.github.com/repos/Broken-Gem-Studio/Broken-Engine
|
closed
|
Progress bar Save&Load bug
|
Bug High Priority
|
## Bug Description
does not save & load the image
## Type of Bug
Select the type of bug with and "x" ([x])
* [ ] Visual
* [ ] Physics
* [ ] Audio
* [ ] Particles
* [ ] Resource Management & Save/Load
* [ ] Materials
* [ ] Components
* [ ] Game Objects
* [X] UI/UX
* [ ] Scripting
* [ ] Other
## Severity
Select the severity of bug affection and mark with "x" ([x])
- [ ] Crash
- [X] Game stopper/slower
- [ ] Cosmetic
## Reproduction
Steps to reproduce the behavior:
1. Create progress
2. Save & Load
3.
4.
## Frequency
Select the frequency with which the bug appears and mark it "x" ([x])
* [X] Always
* [ ] Very Often
* [ ] Usually
* [ ] Few Times
* [ ] Few Times under specific conditions
## Conduct
### Expected result:
progress bar works after load
### Actual result:
does not work after load
## Screenshots and Illustrations:
## Build
- **Please specify the build:** ``Insert the build here``
0.6.4
## Observations and Additional Information
|
1.0
|
Progress bar Save&Load bug - ## Bug Description
does not save & load the image
## Type of Bug
Select the type of bug with and "x" ([x])
* [ ] Visual
* [ ] Physics
* [ ] Audio
* [ ] Particles
* [ ] Resource Management & Save/Load
* [ ] Materials
* [ ] Components
* [ ] Game Objects
* [X] UI/UX
* [ ] Scripting
* [ ] Other
## Severity
Select the severity of bug affection and mark with "x" ([x])
- [ ] Crash
- [X] Game stopper/slower
- [ ] Cosmetic
## Reproduction
Steps to reproduce the behavior:
1. Create progress
2. Save & Load
3.
4.
## Frequency
Select the frequency with which the bug appears and mark it "x" ([x])
* [X] Always
* [ ] Very Often
* [ ] Usually
* [ ] Few Times
* [ ] Few Times under specific conditions
## Conduct
### Expected result:
progress bar works after load
### Actual result:
does not work after load
## Screenshots and Illustrations:
## Build
- **Please specify the build:** ``Insert the build here``
0.6.4
## Observations and Additional Information
|
non_process
|
progress bar save load bug bug description does not save load the image type of bug select the type of bug with and x visual physics audio particles resource management save load materials components game objects ui ux scripting other severity select the severity of bug affection and mark with x crash game stopper slower cosmetic reproduction steps to reproduce the behavior create progress save load frequency select the frequency with which the bug appears and mark it x always very often usually few times few times under specific conditions conduct expected result progress bar works after load actual result does not work after load screenshots and illustrations build please specify the build insert the build here observations and additional information
| 0
|
72,390
| 8,728,674,616
|
IssuesEvent
|
2018-12-10 18:02:13
|
vector-im/riot-web
|
https://api.github.com/repos/vector-im/riot-web
|
opened
|
E2E: Room Settings: Show E2E pptions when creating a room
|
feature needs-design type:e2e
|
Part of https://github.com/vector-im/riot-meta/issues/230
Include room E2E settings choices at room-creation time
This is missing a wireframe
|
1.0
|
E2E: Room Settings: Show E2E pptions when creating a room - Part of https://github.com/vector-im/riot-meta/issues/230
Include room E2E settings choices at room-creation time
This is missing a wireframe
|
non_process
|
room settings show pptions when creating a room part of include room settings choices at room creation time this is missing a wireframe
| 0
|
19,997
| 26,470,201,789
|
IssuesEvent
|
2023-01-17 06:22:50
|
hashgraph/hedera-json-rpc-relay
|
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
|
closed
|
Release 0.13.0
|
enhancement P1 process
|
### Problem
0.13.0 features are not yet deployed
### Solution
manual release process
- [x] Create a `release/0.13` branch off of main. Ensure github test actions run
- [x] Tag as `v0.13.0-rc1`
- [x] git tag
- [x] Confirm new docker image version is deployed
- [ ] Previewnet Testing
- [ ] Deploy tagged version
- [ ] Manual testing
- [ ] Run acceptance tests
- [ ] Run performance tests (if yet applicable)
- [ ] Testnet Testing
- [ ] Deploy tagged version
- [ ] Manual testing
- [ ] Run acceptance tests
- [ ] Run performance tests (if yet applicable)
- Let bake
- [ ] Tag as `v0.13.0`
- [ ] git tag
- [ ] Confirm new docker image version is deployed
- [ ] Write up release notes and changelist
- [ ] Mainnet Testing
- [ ] Deploy tagged version
- [ ] Manual testing
Any bugs or missed features found should see a new ticket opened, addressed in main and cherry-picked to release/0.1 with a new rc version tagged and docker image deployed
### Alternatives
_No response_
|
1.0
|
Release 0.13.0 - ### Problem
0.13.0 features are not yet deployed
### Solution
manual release process
- [x] Create a `release/0.13` branch off of main. Ensure github test actions run
- [x] Tag as `v0.13.0-rc1`
- [x] git tag
- [x] Confirm new docker image version is deployed
- [ ] Previewnet Testing
- [ ] Deploy tagged version
- [ ] Manual testing
- [ ] Run acceptance tests
- [ ] Run performance tests (if yet applicable)
- [ ] Testnet Testing
- [ ] Deploy tagged version
- [ ] Manual testing
- [ ] Run acceptance tests
- [ ] Run performance tests (if yet applicable)
- Let bake
- [ ] Tag as `v0.13.0`
- [ ] git tag
- [ ] Confirm new docker image version is deployed
- [ ] Write up release notes and changelist
- [ ] Mainnet Testing
- [ ] Deploy tagged version
- [ ] Manual testing
Any bugs or missed features found should see a new ticket opened, addressed in main and cherry-picked to release/0.1 with a new rc version tagged and docker image deployed
### Alternatives
_No response_
|
process
|
release problem features are not yet deployed solution manual release process create a release branch off of main ensure github test actions run tag as git tag confirm new docker image version is deployed previewnet testing deploy tagged version manual testing run acceptance tests run performance tests if yet applicable testnet testing deploy tagged version manual testing run acceptance tests run performance tests if yet applicable let bake tag as git tag confirm new docker image version is deployed write up release notes and changelist mainnet testing deploy tagged version manual testing any bugs or missed features found should see a new ticket opened addressed in main and cherry picked to release with a new rc version tagged and docker image deployed alternatives no response
| 1
|
85,908
| 15,755,304,985
|
IssuesEvent
|
2021-03-31 01:32:32
|
ysmanohar/DashBoard
|
https://api.github.com/repos/ysmanohar/DashBoard
|
opened
|
CVE-2019-20149 (High) detected in kind-of-6.0.2.tgz
|
security vulnerability
|
## CVE-2019-20149 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kind-of-6.0.2.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p>
<p>Path to dependency file: /DashBoard/bower_components/polymer/package.json</p>
<p>Path to vulnerable library: DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- findup-sync-0.4.3.tgz (Root Library)
- micromatch-2.3.11.tgz
- braces-1.8.5.tgz
- expand-range-1.8.2.tgz
- fill-range-2.2.4.tgz
- randomatic-3.1.1.tgz
- :x: **kind-of-6.0.2.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
<p>Publish Date: 2019-12-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149</a></p>
<p>Release Date: 2019-12-30</p>
<p>Fix Resolution: 6.0.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-20149 (High) detected in kind-of-6.0.2.tgz - ## CVE-2019-20149 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kind-of-6.0.2.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p>
<p>Path to dependency file: /DashBoard/bower_components/polymer/package.json</p>
<p>Path to vulnerable library: DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json,DashBoard/bower_components/test-fixture/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- findup-sync-0.4.3.tgz (Root Library)
- micromatch-2.3.11.tgz
- braces-1.8.5.tgz
- expand-range-1.8.2.tgz
- fill-range-2.2.4.tgz
- randomatic-3.1.1.tgz
- :x: **kind-of-6.0.2.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
<p>Publish Date: 2019-12-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149</a></p>
<p>Release Date: 2019-12-30</p>
<p>Fix Resolution: 6.0.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in kind of tgz cve high severity vulnerability vulnerable library kind of tgz get the native type of a value library home page a href path to dependency file dashboard bower components polymer package json path to vulnerable library dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dashboard bower components test fixture node modules kind of package json dependency hierarchy findup sync tgz root library micromatch tgz braces tgz expand range tgz fill range tgz randomatic tgz x kind of tgz vulnerable library vulnerability details ctorname in index js in kind of allows external user input to overwrite certain internal attributes via a conflicting name as demonstrated by constructor name symbol hence a crafted payload can overwrite this builtin attribute to manipulate the type detection result publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
127,470
| 5,030,829,643
|
IssuesEvent
|
2016-12-16 02:55:22
|
gregorbj/VisionEval
|
https://api.github.com/repos/gregorbj/VisionEval
|
opened
|
allow the NAME argument to INP, GET, SET to be a vector of column names
|
high priority
|
In the short term, allow the NAME argument to INP, GET, SET to be a vector of column names that share the same data type, etc. This will make specifying the data list much easier.
In the long term, potentially automatically generate the list with code, maybe even the testing code.
|
1.0
|
allow the NAME argument to INP, GET, SET to be a vector of column names - In the short term, allow the NAME argument to INP, GET, SET to be a vector of column names that share the same data type, etc. This will make specifying the data list much easier.
In the long term, potentially automatically generate the list with code, maybe even the testing code.
|
non_process
|
allow the name argument to inp get set to be a vector of column names in the short term allow the name argument to inp get set to be a vector of column names that share the same data type etc this will make specifying the data list much easier in the long term potentially automatically generate the list with code maybe even the testing code
| 0
|
381,670
| 11,278,281,581
|
IssuesEvent
|
2020-01-15 06:13:37
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
opened
|
[0.9.0 staging-1337] Chat Messages disappear
|
Priority: High
|
I bought 1 Bannock, then I bought another Bannock. I have notification:

After time some of this notification (underlined by green line) will dissapear and I can't read it almost at all:

They are saved in the chat history, but if I turn on all chats I don't see it.
One step to see it: Turn on/off, for example, "Trades". Notification should appear at 1 sec.
This behavior is not only found in store messages.
In Main chat, I should be able to watch all the messages that come. Temporary messages are only error messages.
|
1.0
|
[0.9.0 staging-1337] Chat Messages disappear - I bought 1 Bannock, then I bought another Bannock. I have notification:

After time some of this notification (underlined by green line) will dissapear and I can't read it almost at all:

They are saved in the chat history, but if I turn on all chats I don't see it.
One step to see it: Turn on/off, for example, "Trades". Notification should appear at 1 sec.
This behavior is not only found in store messages.
In Main chat, I should be able to watch all the messages that come. Temporary messages are only error messages.
|
non_process
|
chat messages disappear i bought bannock then i bought another bannock i have notification after time some of this notification underlined by green line will dissapear and i can t read it almost at all they are saved in the chat history but if i turn on all chats i don t see it one step to see it turn on off for example trades notification should appear at sec this behavior is not only found in store messages in main chat i should be able to watch all the messages that come temporary messages are only error messages
| 0
|
22,189
| 30,737,262,620
|
IssuesEvent
|
2023-07-28 08:39:31
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
PDF - Stage "preprocess2" does not report not resolved conrefs
|
bug priority/medium preprocess2
|
I'm attaching a sample project, the main DITA Map is `m_Key-Resolution-TEST.ditamap`.
There is a topic named `c_Keyref-source-TEST.xml` which is not referenced in the DITA Map as resource-only so because of that the conrefs to that topic are no longer resolved with the "preprocess2" PDF stage.
The problem can be fixed by referring to the `c_Keyref-source-TEST.xml` in the DITA Map as resource-only but such cases when conrefs cannot be resolved should be reported as error messages by the publishing engine, right now no message is issued and people have the impression that the conref has been resolved until they look inside the PDF.
[conref-keyref-test.zip](https://github.com/dita-ot/dita-ot/files/4138282/conref-keyref-test.zip)
|
1.0
|
PDF - Stage "preprocess2" does not report not resolved conrefs - I'm attaching a sample project, the main DITA Map is `m_Key-Resolution-TEST.ditamap`.
There is a topic named `c_Keyref-source-TEST.xml` which is not referenced in the DITA Map as resource-only so because of that the conrefs to that topic are no longer resolved with the "preprocess2" PDF stage.
The problem can be fixed by referring to the `c_Keyref-source-TEST.xml` in the DITA Map as resource-only but such cases when conrefs cannot be resolved should be reported as error messages by the publishing engine, right now no message is issued and people have the impression that the conref has been resolved until they look inside the PDF.
[conref-keyref-test.zip](https://github.com/dita-ot/dita-ot/files/4138282/conref-keyref-test.zip)
|
process
|
pdf stage does not report not resolved conrefs i m attaching a sample project the main dita map is m key resolution test ditamap there is a topic named c keyref source test xml which is not referenced in the dita map as resource only so because of that the conrefs to that topic are no longer resolved with the pdf stage the problem can be fixed by referring to the c keyref source test xml in the dita map as resource only but such cases when conrefs cannot be resolved should be reported as error messages by the publishing engine right now no message is issued and people have the impression that the conref has been resolved until they look inside the pdf
| 1
|
299
| 2,521,881,728
|
IssuesEvent
|
2015-01-19 17:35:03
|
eugenkiss/chanobol
|
https://api.github.com/repos/eugenkiss/chanobol
|
closed
|
Handling of Menu in Fragments
|
code
|
I don't like the handling of the toolbar menu in the fragments with menu groups and needing to hide and make visible menu groups in onresume and so on. There must be a better way.
|
1.0
|
Handling of Menu in Fragments - I don't like the handling of the toolbar menu in the fragments with menu groups and needing to hide and make visible menu groups in onresume and so on. There must be a better way.
|
non_process
|
handling of menu in fragments i don t like the handling of the toolbar menu in the fragments with menu groups and needing to hide and make visible menu groups in onresume and so on there must be a better way
| 0
|
49,016
| 5,997,191,700
|
IssuesEvent
|
2017-06-03 21:24:21
|
participedia/frontend
|
https://api.github.com/repos/participedia/frontend
|
closed
|
SEARCH - Featured Cases
|
before user testing
|
**Comment**: Let's add some teasers to this set of links like "In progress cases" and Cases in <dropdown for country>, Methods in order of popualrity
[Open #9 in Usersnap Dashboard](https://usersnap.com/a/#/participedia-dev/p/participedia-ed971123/9)
<a href='https://usersnappublic.s3.amazonaws.com/2017-04-18/21-32/30962f81-4e16-4cbe-99ae-593cd8aec855.png'></a>
<a href='https://usersnappublic.s3.amazonaws.com/2017-04-18/21-32/30962f81-4e16-4cbe-99ae-593cd8aec855.png'>Download original image</a>
**Browser**: Chrome 57 (macOS Sierra)
**Referer**: [http://participedia.xyz/](http://participedia.xyz/)
**Screen size**: 1680 x 1050 **Browser size**: 1505 x 953
Powered by [usersnap.com](https://usersnap.com/?utm_source=github_entry&utm_medium=web&utm_campaign=product)
|
1.0
|
SEARCH - Featured Cases - **Comment**: Let's add some teasers to this set of links like "In progress cases" and Cases in <dropdown for country>, Methods in order of popualrity
[Open #9 in Usersnap Dashboard](https://usersnap.com/a/#/participedia-dev/p/participedia-ed971123/9)
<a href='https://usersnappublic.s3.amazonaws.com/2017-04-18/21-32/30962f81-4e16-4cbe-99ae-593cd8aec855.png'></a>
<a href='https://usersnappublic.s3.amazonaws.com/2017-04-18/21-32/30962f81-4e16-4cbe-99ae-593cd8aec855.png'>Download original image</a>
**Browser**: Chrome 57 (macOS Sierra)
**Referer**: [http://participedia.xyz/](http://participedia.xyz/)
**Screen size**: 1680 x 1050 **Browser size**: 1505 x 953
Powered by [usersnap.com](https://usersnap.com/?utm_source=github_entry&utm_medium=web&utm_campaign=product)
|
non_process
|
search featured cases comment let s add some teasers to this set of links like in progress cases and cases in methods in order of popualrity a href browser chrome macos sierra referer screen size x browser size x powered by
| 0
|
7,394
| 10,521,072,263
|
IssuesEvent
|
2019-09-30 04:22:51
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Folder Hierarchy using SSIS for Blob
|
Pri2 cxp machine-learning/svc product-question team-data-science-process/subsvc triaged
|
If I am copying source data into Blob Storage, how would I dynamically create the folder structure on the fly to be like this
container/datasource/yyyy/MM/dd/ yyyy-MM-dd HH:mm:ss_datasource.csv"
We are thinking of executing this package using ADF and passing datasource, yyyy, MM, dd using parameters and trigger runtime. There is not much documentation on this piece so anything would help!
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: f5411258-7d41-101c-f864-376da9140228
* Version Independent ID: 81b50ad0-f9e2-6866-3701-93ac3224c364
* Content: [Move Blob storage data with SSIS connectors - Team Data Science Process](https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/move-data-to-azure-blob-using-ssis)
* Content Source: [articles/machine-learning/team-data-science-process/move-data-to-azure-blob-using-ssis.md](https://github.com/Microsoft/azure-docs/blob/master/articles/machine-learning/team-data-science-process/move-data-to-azure-blob-using-ssis.md)
* Service: **machine-learning**
* Sub-service: **team-data-science-process**
* GitHub Login: @marktab
* Microsoft Alias: **tdsp**
|
1.0
|
Folder Hierarchy using SSIS for Blob - If I am copying source data into Blob Storage, how would I dynamically create the folder structure on the fly to be like this
container/datasource/yyyy/MM/dd/ yyyy-MM-dd HH:mm:ss_datasource.csv"
We are thinking of executing this package using ADF and passing datasource, yyyy, MM, dd using parameters and trigger runtime. There is not much documentation on this piece so anything would help!
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: f5411258-7d41-101c-f864-376da9140228
* Version Independent ID: 81b50ad0-f9e2-6866-3701-93ac3224c364
* Content: [Move Blob storage data with SSIS connectors - Team Data Science Process](https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/move-data-to-azure-blob-using-ssis)
* Content Source: [articles/machine-learning/team-data-science-process/move-data-to-azure-blob-using-ssis.md](https://github.com/Microsoft/azure-docs/blob/master/articles/machine-learning/team-data-science-process/move-data-to-azure-blob-using-ssis.md)
* Service: **machine-learning**
* Sub-service: **team-data-science-process**
* GitHub Login: @marktab
* Microsoft Alias: **tdsp**
|
process
|
folder hierarchy using ssis for blob if i am copying source data into blob storage how would i dynamically create the folder structure on the fly to be like this container datasource yyyy mm dd yyyy mm dd hh mm ss datasource csv we are thinking of executing this package using adf and passing datasource yyyy mm dd using parameters and trigger runtime there is not much documentation on this piece so anything would help document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service machine learning sub service team data science process github login marktab microsoft alias tdsp
| 1
|
223,147
| 17,105,057,115
|
IssuesEvent
|
2021-07-09 16:23:59
|
matthewjhcarr/twitter-clone-project
|
https://api.github.com/repos/matthewjhcarr/twitter-clone-project
|
closed
|
Add documentation to code
|
documentation
|
Code should be well documented and so far documentation has been lacking.
This should be fixed and in future all committed code should be documented.
|
1.0
|
Add documentation to code - Code should be well documented and so far documentation has been lacking.
This should be fixed and in future all committed code should be documented.
|
non_process
|
add documentation to code code should be well documented and so far documentation has been lacking this should be fixed and in future all committed code should be documented
| 0
|
12,845
| 15,225,664,697
|
IssuesEvent
|
2021-02-18 07:43:02
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Preserve unexpected generated files
|
category: sandboxing team-Local-Exec type: support / not a bug (process)
|
### Description of the problem / feature request:
I'm guessing this is a feature request: I would like a way to specify side effects from a rule `cc_library` for example. If I run the compile command with specific options, additional files may be generated and bazel wipes these.
### Feature requests: what underlying problem are you trying to solve with this feature?
Preserve unexpected generated files.
### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
No bug.
### What operating system are you running Bazel on?
Ubuntu 16.04
### What's the output of `bazel info release`?
release 2.0.0
### If `bazel info release` returns "development version" or "(@non-git)", tell us how you built Bazel.
No.
### What's the output of `git remote get-url origin ; git rev-parse master ; git rev-parse HEAD` ?
No need.
### Have you found anything relevant by searching the web?
Yep.
### Any other information, logs, or outputs that you want to share?
Use case: https://github.com/bazelbuild/bazel/issues/9047
|
1.0
|
Preserve unexpected generated files - ### Description of the problem / feature request:
I'm guessing this is a feature request: I would like a way to specify side effects from a rule `cc_library` for example. If I run the compile command with specific options, additional files may be generated and bazel wipes these.
### Feature requests: what underlying problem are you trying to solve with this feature?
Preserve unexpected generated files.
### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
No bug.
### What operating system are you running Bazel on?
Ubuntu 16.04
### What's the output of `bazel info release`?
release 2.0.0
### If `bazel info release` returns "development version" or "(@non-git)", tell us how you built Bazel.
No.
### What's the output of `git remote get-url origin ; git rev-parse master ; git rev-parse HEAD` ?
No need.
### Have you found anything relevant by searching the web?
Yep.
### Any other information, logs, or outputs that you want to share?
Use case: https://github.com/bazelbuild/bazel/issues/9047
|
process
|
preserve unexpected generated files description of the problem feature request i m guessing this is a feature request i would like a way to specify side effects from a rule cc library for example if i run the compile command with specific options additional files may be generated and bazel wipes these feature requests what underlying problem are you trying to solve with this feature preserve unexpected generated files bugs what s the simplest easiest way to reproduce this bug please provide a minimal example if possible no bug what operating system are you running bazel on ubuntu what s the output of bazel info release release if bazel info release returns development version or non git tell us how you built bazel no what s the output of git remote get url origin git rev parse master git rev parse head no need have you found anything relevant by searching the web yep any other information logs or outputs that you want to share use case
| 1
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.