Column summary:

column         dtype          values
Unnamed: 0     int64          0 .. 832k
id             float64        2.49B .. 32.1B
type           stringclasses  1 value
created_at     stringlengths  19 .. 19
repo           stringlengths  7 .. 112
repo_url       stringlengths  36 .. 141
action         stringclasses  3 values
title          stringlengths  1 .. 744
labels         stringlengths  4 .. 574
body           stringlengths  9 .. 211k
index          stringclasses  10 values
text_combine   stringlengths  96 .. 211k
label          stringclasses  2 values
text           stringlengths  96 .. 188k
binary_label   int64          0 .. 1
Unnamed: 0: 19,459
id: 5,887,867,273
type: IssuesEvent
created_at: 2017-05-17 08:41:48
repo: Jakobtottrup/OptekSemester2
repo_url: https://api.github.com/repos/Jakobtottrup/OptekSemester2
action: closed
title: Påkrævede felter skal have en rød *  [EN: Required fields must have a red *]
labels: Codework
body: Det er blevet fjernet for noget tid siden, men har været implementeret tidligere.  [EN: It was removed some time ago, but had been implemented previously.]
index: 1.0
text_combine: Påkrævede felter skal have en rød * - Det er blevet fjernet for noget tid siden, men har været implementeret tidligere.
label: non_process
text: påkrævede felter skal have en rød det er blevet fjernet for noget tid siden men har været implementeret tidligere
binary_label: 0
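Comparing `text_combine` with `text` in the record above suggests the `text` column was derived by dropping code blocks, URLs and markup, lowercasing, and deleting ASCII digits and punctuation while keeping non-ASCII letters. A rough reconstruction of that cleanup, offered as an assumption about the pipeline rather than its actual code:

```python
import re

def normalize(text_combine: str) -> str:
    """Guess at how `text` is derived from `text_combine` (not the dataset's real code)."""
    s = re.sub(r"```.*?```", " ", text_combine, flags=re.DOTALL)  # drop fenced code blocks
    s = re.sub(r"https?://\S+", " ", s)                           # drop URLs
    s = re.sub(r"<[^>]+>", " ", s)                                # drop HTML/XML tags
    s = s.lower()
    s = re.sub(r"[0-9]", " ", s)                                  # drop ASCII digits
    s = re.sub(r"[!-/:-@\[-`{-~]", " ", s)                        # drop ASCII punctuation
    return re.sub(r"\s+", " ", s).strip()                         # collapse whitespace
```

Applied to this record's `text_combine`, the sketch reproduces its `text` value, including the preserved Danish letters å and ø.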
Unnamed: 0: 12,389
id: 14,908,707,486
type: IssuesEvent
created_at: 2021-01-22 06:31:02
repo: GoogleCloudPlatform/fda-mystudies
repo_url: https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
action: closed
title: [PM] Site participant registry > Invited tab > if user disables the participants then user is navigating to 'New' tab
labels: Bug P1 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
body: Scenario 1: New to Disable Steps 1. Closed study > Site participant registry > go to the invited tab 2. Select the participants email in New tab 3. Click on the disable invitation button 4. Observe the navigation of the user AR : User is navigated to 'New' tab ER : User should be navigated to Disable tab Scenario 2: Invited to Disable 1. Closed study > Site participant registry > go to the invited tab 2. Select the participants email in the invited tab 3. Click on the disable invitation button 4. Observe the navigation of the user AR : User is navigated to 'New' tab ER : User should be navigated to Disable tab Scenario 3: Disable to enable invitation 1. Closed study > Site participant registry > go to the invited tab 2. Select the participants email in the disable tab 3. Click on the enable invitation button 4. Observe the navigation of the user ER : User should be navigated to New tab
index: 3.0
text_combine: [PM] Site participant registry > Invited tab > if user disables the participants then user is navigating to 'New' tab - Scenario 1: New to Disable Steps 1. Closed study > Site participant registry > go to the invited tab 2. Select the participants email in New tab 3. Click on the disable invitation button 4. Observe the navigation of the user AR : User is navigated to 'New' tab ER : User should be navigated to Disable tab Scenario 2: Invited to Disable 1. Closed study > Site participant registry > go to the invited tab 2. Select the participants email in the invited tab 3. Click on the disable invitation button 4. Observe the navigation of the user AR : User is navigated to 'New' tab ER : User should be navigated to Disable tab Scenario 3: Disable to enable invitation 1. Closed study > Site participant registry > go to the invited tab 2. Select the participants email in the disable tab 3. Click on the enable invitation button 4. Observe the navigation of the user ER : User should be navigated to New tab
label: process
text: site participant registry invited tab if user disables the participants then user is navigating to new tab scenario new to disable steps closed study site participant registry go to the invited tab select the participants email in new tab click on the disable invitation button observe the navigation of the user ar user is navigated to new tab er user should be navigated to disable tab scenario invited to disable closed study site participant registry go to the invited tab select the participants email in the invited tab click on the disable invitation button observe the navigation of the user ar user is navigated to new tab er user should be navigated to disable tab scenario disable to enable invitation closed study site participant registry go to the invited tab select the participants email in the disable tab click on the enable invitation button observe the navigation of the user er user should be navigated to new tab
binary_label: 1
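Across records, `label` and `binary_label` move together: `process` pairs with 1 and `non_process` with 0. A one-line sketch of that (assumed) encoding:

```python
import pandas as pd

# Assumed mapping inferred from the records: "process" -> 1, "non_process" -> 0.
labels = pd.Series(["non_process", "process", "process", "non_process"], name="label")
binary_label = (labels == "process").astype("int64")
```

The `.astype("int64")` keeps the dtype consistent with the `binary_label int64` entry in the column summary.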
Unnamed: 0: 13,842
id: 16,602,738,757
type: IssuesEvent
created_at: 2021-06-01 22:00:36
repo: ivanbukhtiyarov/elevators
repo_url: https://api.github.com/repos/ivanbukhtiyarov/elevators
action: closed
title: RI-04-01: Валидация команды  [EN: RI-04-01: Command validation]
labels: analysis enhancement in process
body: Реализовать методы is_source_valid, is_action_valid и is_value_valid в app.py в соответствии с RI-04-01. Для source и action проверять входимость значения в список возможных значений. Value зависит от action, поэтому нужно учесть допустимые значения value для разных action. Возможно, придётся уточнить требования.  [EN: Implement the methods is_source_valid, is_action_valid and is_value_valid in app.py per RI-04-01. For source and action, check that the value is in the list of allowed values. Value depends on action, so the allowed values of value for different actions must be taken into account. The requirements may need to be clarified.]
index: 1.0
text_combine: RI-04-01: Валидация команды - Реализовать методы is_source_valid, is_action_valid и is_value_valid в app.py в соответствии с RI-04-01. Для source и action проверять входимость значения в список возможных значений. Value зависит от action, поэтому нужно учесть допустимые значения value для разных action. Возможно, придётся уточнить требования.
label: process
text: ri валидация команды реализовать методы is source valid is action valid и is value valid в app py в соответствии с ri для source и action проверять входимость значения в список возможных значений value зависит от action поэтому нужно учесть допустимые значения value для разных action возможно придётся уточнить требования
binary_label: 1
Unnamed: 0: 267,999
id: 8,401,611,450
type: IssuesEvent
created_at: 2018-10-11 02:01:41
repo: CS2103-AY1819S1-F11-4/main
repo_url: https://api.github.com/repos/CS2103-AY1819S1-F11-4/main
action: closed
title: timetable Ui
labels: feature.Timetable priority.High severity.Low status.Ongoing
body: -need test for browser panel -~~waiting on information on whether is doing ui with java graded.~~
index: 1.0
text_combine: timetable Ui - -need test for browser panel -~~waiting on information on whether is doing ui with java graded.~~
label: non_process
text: timetable ui need test for browser panel waiting on information on whether is doing ui with java graded
binary_label: 0
Unnamed: 0: 12,229
id: 14,743,572,355
type: IssuesEvent
created_at: 2021-01-07 14:06:37
repo: kdjstudios/SABillingGitlab
repo_url: https://api.github.com/repos/kdjstudios/SABillingGitlab
action: closed
title: Dyad - Swift MD - failed payments (but actually processed?)
labels: anc-process anp-0.5 ant-bug ant-parent/primary has attachment
body: In GitLab by @kdjstudios on Aug 28, 2019, 08:49 **Submitted by:** "Shonda Medwatz" <shonda.smith@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/9164557 **Server:** Internal **Client/Site:** Dyad **Account:** Swift MD **Issue:** The Dyad has their biggest client Swift MD who tried making a payment several times on the portal on Saturday 8/23/19 and the transactions showed as failed. We received word today that they payment actually did go through, please see screenshot below. However, it did not apply in SAB. Can someone please take a look at this for me asap as the client wants this resolved asap. ![image](/uploads/5b94de4698899f5ff44218407df6eab3/image.png)
index: 1.0
text_combine: Dyad - Swift MD - failed payments (but actually processed?) - In GitLab by @kdjstudios on Aug 28, 2019, 08:49 **Submitted by:** "Shonda Medwatz" <shonda.smith@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/9164557 **Server:** Internal **Client/Site:** Dyad **Account:** Swift MD **Issue:** The Dyad has their biggest client Swift MD who tried making a payment several times on the portal on Saturday 8/23/19 and the transactions showed as failed. We received word today that they payment actually did go through, please see screenshot below. However, it did not apply in SAB. Can someone please take a look at this for me asap as the client wants this resolved asap. ![image](/uploads/5b94de4698899f5ff44218407df6eab3/image.png)
label: process
text: dyad swift md failed payments but actually processed in gitlab by kdjstudios on aug submitted by shonda medwatz helpdesk server internal client site dyad account swift md issue the dyad has their biggest client swift md who tried making a payment several times on the portal on saturday and the transactions showed as failed we received word today that they payment actually did go through please see screenshot below however it did not apply in sab can someone please take a look at this for me asap as the client wants this resolved asap uploads image png
binary_label: 1
Unnamed: 0: 3,480
id: 5,916,681,408
type: IssuesEvent
created_at: 2017-05-22 11:12:01
repo: support-project/knowledge
repo_url: https://api.github.com/repos/support-project/knowledge
action: closed
title: tag Link encode
labels: [Status] 3.merged [Type] 1.requirement
body: 投稿一覧や投稿詳細を見た際のタグをクリックすると、encodeされないLinkになっているのをEncodeにしてもらえませんか? キーワードで検索でタグ検索するとEncodeされているのでちょっと違和感が。  [EN: When clicking a tag from the post list or post detail view, the link is not encoded; could it be made encoded? Tag search via keyword search uses an encoded link, so this feels slightly inconsistent.]
index: 1.0
text_combine: tag Link encode - 投稿一覧や投稿詳細を見た際のタグをクリックすると、encodeされないLinkになっているのをEncodeにしてもらえませんか? キーワードで検索でタグ検索するとEncodeされているのでちょっと違和感が。
label: non_process
text: tag link encode 投稿一覧や投稿詳細を見た際のタグをクリックすると、encodeされないlinkになっているのをencodeにしてもらえませんか? キーワードで検索でタグ検索するとencodeされているのでちょっと違和感が。
binary_label: 0
Unnamed: 0: 121,780
id: 4,821,245,957
type: IssuesEvent
created_at: 2016-11-05 07:25:13
repo: artfcl-intlgnce/NarcoWars
repo_url: https://api.github.com/repos/artfcl-intlgnce/NarcoWars
action: closed
title: Suggested Enhancement: Add mob health bar
labels: Engineering Team enhancement Priority 3
body: Mob health bar/health number. e.g. 7.4/10 or (////// )
index: 1.0
text_combine: Suggested Enhancement: Add mob health bar - Mob health bar/health number. e.g. 7.4/10 or (////// )
label: non_process
text: suggested enhancement add mob health bar mob health bar health number e g or
binary_label: 0
Unnamed: 0: 15,860
id: 20,035,607,586
type: IssuesEvent
created_at: 2022-02-02 11:32:19
repo: plazi/community
repo_url: https://api.github.com/repos/plazi/community
action: opened
title: African Invertebrates: Jason Londt project
labels: Zenodo process request
body: Via Torsten Dikow I have set-up a Dropbox folder with 28 publications by Jason Londt in African Invertebrates. Two of these are faunistic studies so might be of less taxonomic importance but I thought I include them. I can provide bibliographic data as well. The earliest of these publications don’t have a DOI (or at least didn’t get one assigned initially) but were incorproared/accessible from the Sabinet African Journal Archive. I can obtain the unique URLs (I think they use the handle system now) for all of these earlier publications if that would help. The folder can be accessed here: https://www.dropbox.com/sh/xmzrruu00xn5cd8/AACj3w9fSCMD8jSz_4QObeXVa?dl=0 . How can I help in marking these articles up? Thanks for all of your help, Torsten
index: 1.0
text_combine: African Invertebrates: Jason Londt project - Via Torsten Dikow I have set-up a Dropbox folder with 28 publications by Jason Londt in African Invertebrates. Two of these are faunistic studies so might be of less taxonomic importance but I thought I include them. I can provide bibliographic data as well. The earliest of these publications don’t have a DOI (or at least didn’t get one assigned initially) but were incorproared/accessible from the Sabinet African Journal Archive. I can obtain the unique URLs (I think they use the handle system now) for all of these earlier publications if that would help. The folder can be accessed here: https://www.dropbox.com/sh/xmzrruu00xn5cd8/AACj3w9fSCMD8jSz_4QObeXVa?dl=0 . How can I help in marking these articles up? Thanks for all of your help, Torsten
label: process
text: african invertebrates jason londt project via torsten dikow i have set up a dropbox folder with publications by jason londt in african invertebrates two of these are faunistic studies so might be of less taxonomic importance but i thought i include them i can provide bibliographic data as well the earliest of these publications don’t have a doi or at least didn’t get one assigned initially but were incorproared accessible from the sabinet african journal archive i can obtain the unique urls i think they use the handle system now for all of these earlier publications if that would help the folder can be accessed here how can i help in marking these articles up thanks for all of your help torsten
binary_label: 1
Unnamed: 0: 5,991
id: 8,805,374,822
type: IssuesEvent
created_at: 2018-12-26 19:14:04
repo: dita-ot/dita-ot
repo_url: https://api.github.com/repos/dita-ot/dita-ot
action: closed
title: Keyref breaks with peer topic, relative path, topic in sub-directory
labels: bug preprocess/keyref stale
body: I've got a key definition with `@scope="peer"`, linking to a topic with a relative path. I've also got a topic in a sub-directory. When the key is referenced in a `<reltable>` and pushes a link into the subdirectory, the path is adjusted properly, link resolves. When the key is referenced directly from that topic, the link is not adjusted, and is broken. To reproduce, add this to the end of `hierarchy.ditamap`: ``` <keydef keys="peerhtml" href="../peerdir/file.html" navtitle="peer html file in peerdir" scope="peer" format="html"/> <keydef keys="peerdita" href="../peerdir/file.dita" navtitle="peer DITA file in peerdir" scope="peer" format="dita"/> <reltable> <relrow> <relcell> <topicref href="tasks/garagetaskoverview.xml" type="concept"/> </relcell> <relcell> <topicref keyref="peerhtml"/> <topicref keyref="peerdita"/> </relcell> </relrow> </reltable> ``` Then add this reference into `tasks\garagetaskoverview.xml`: ``` <related-links> <linklist> <title>links in the topic (broken paths)</title> <link keyref="peerdita"/> <link keyref="peerhtml"/> </linklist> </related-links> ```
index: 1.0
text_combine: Keyref breaks with peer topic, relative path, topic in sub-directory - I've got a key definition with `@scope="peer"`, linking to a topic with a relative path. I've also got a topic in a sub-directory. When the key is referenced in a `<reltable>` and pushes a link into the subdirectory, the path is adjusted properly, link resolves. When the key is referenced directly from that topic, the link is not adjusted, and is broken. To reproduce, add this to the end of `hierarchy.ditamap`: ``` <keydef keys="peerhtml" href="../peerdir/file.html" navtitle="peer html file in peerdir" scope="peer" format="html"/> <keydef keys="peerdita" href="../peerdir/file.dita" navtitle="peer DITA file in peerdir" scope="peer" format="dita"/> <reltable> <relrow> <relcell> <topicref href="tasks/garagetaskoverview.xml" type="concept"/> </relcell> <relcell> <topicref keyref="peerhtml"/> <topicref keyref="peerdita"/> </relcell> </relrow> </reltable> ``` Then add this reference into `tasks\garagetaskoverview.xml`: ``` <related-links> <linklist> <title>links in the topic (broken paths)</title> <link keyref="peerdita"/> <link keyref="peerhtml"/> </linklist> </related-links> ```
label: process
text: keyref breaks with peer topic relative path topic in sub directory i ve got a key definition with scope peer linking to a topic with a relative path i ve also got a topic in a sub directory when the key is referenced in a and pushes a link into the subdirectory the path is adjusted properly link resolves when the key is referenced directly from that topic the link is not adjusted and is broken to reproduce add this to the end of hierarchy ditamap then add this reference into tasks garagetaskoverview xml links in the topic broken paths
binary_label: 1
Unnamed: 0: 602,395
id: 18,468,266,436
type: IssuesEvent
created_at: 2021-10-17 09:22:24
repo: ohtuprojekti-Kierratysavustin/Kierratysavustin
repo_url: https://api.github.com/repos/ohtuprojekti-Kierratysavustin/Kierratysavustin
action: opened
title: Kierrätyskisa  [EN: Recycling competition]
labels: enhancement Size 9999 Priority Low
body: # Käyttäjätarina **Käyttäjänä** Haluan **luoda tai osallistua tai kutsua ihmisiä leikkimieliseen kierrätyskisaan** Jotta **saamme kierrätysastetta nostettua** ## Hyväksymiskriteerit ### Toiminnalliset - [ ] Jos *[ Käyttäjäskenaario ]* Kun *[ Ympäristö ]* Niin *[ Seuraukset ]* ### Ei-toiminnalliset - [ ] [ *Miten* ] - [ ] [ *Suorituskyky* ] - [ ] [ *Tietoturva* ] - [ ] [ *Datan käyttö* ] - [ ] [ *Dokumentaation päivittäminen* ] ## Riippuvuudet **Toteutettavissa jälkeen:** [ *Issue* ] **Toteutettava ennen:** [ *Issue* ] **Liittyy:** [ *Issue* ] **Vaihtoehtoinen ratkaisu:** [ *Issue* ] ## Valmis kun: - [ ] Ominaisuus on toteutettuna hyväksymiskriteerien mukaisesti ja toimii - [ ] Ominaisuus on testattu käyttäjätarinan hyväksymiskriteerien mukaisesti ja testit menevät läpi - [ ] Kaikki ominaisuuteen liittyvä dokumentaatio on kirjoitettu - [ ] Koodi noudattaa määriteltyä koodaustyyliä - [ ] Koodi on katselmoitua: katselmoinnin on tehnyt vähintään 2 muuta henkilöä, kuin ominaisuuden koodannut henkilö - [ ] Ominaisuus mergetty mainiin ja toiminnassa staging-ympäristössä - [ ] Ominaisuus on toiminnassa tuotanto-ympäristössä  [EN gloss: a Finnish user story, "As a user I want to create, join, or invite people to a playful recycling competition so that we raise the recycling rate", followed by an acceptance-criteria / dependencies / definition-of-done issue template.]
index: 1.0
text_combine: Kierrätyskisa - # Käyttäjätarina **Käyttäjänä** Haluan **luoda tai osallistua tai kutsua ihmisiä leikkimieliseen kierrätyskisaan** Jotta **saamme kierrätysastetta nostettua** ## Hyväksymiskriteerit ### Toiminnalliset - [ ] Jos *[ Käyttäjäskenaario ]* Kun *[ Ympäristö ]* Niin *[ Seuraukset ]* ### Ei-toiminnalliset - [ ] [ *Miten* ] - [ ] [ *Suorituskyky* ] - [ ] [ *Tietoturva* ] - [ ] [ *Datan käyttö* ] - [ ] [ *Dokumentaation päivittäminen* ] ## Riippuvuudet **Toteutettavissa jälkeen:** [ *Issue* ] **Toteutettava ennen:** [ *Issue* ] **Liittyy:** [ *Issue* ] **Vaihtoehtoinen ratkaisu:** [ *Issue* ] ## Valmis kun: - [ ] Ominaisuus on toteutettuna hyväksymiskriteerien mukaisesti ja toimii - [ ] Ominaisuus on testattu käyttäjätarinan hyväksymiskriteerien mukaisesti ja testit menevät läpi - [ ] Kaikki ominaisuuteen liittyvä dokumentaatio on kirjoitettu - [ ] Koodi noudattaa määriteltyä koodaustyyliä - [ ] Koodi on katselmoitua: katselmoinnin on tehnyt vähintään 2 muuta henkilöä, kuin ominaisuuden koodannut henkilö - [ ] Ominaisuus mergetty mainiin ja toiminnassa staging-ympäristössä - [ ] Ominaisuus on toiminnassa tuotanto-ympäristössä
label: non_process
text: kierrätyskisa käyttäjätarina käyttäjänä haluan luoda tai osallistua tai kutsua ihmisiä leikkimieliseen kierrätyskisaan jotta saamme kierrätysastetta nostettua hyväksymiskriteerit toiminnalliset jos kun niin ei toiminnalliset riippuvuudet toteutettavissa jälkeen toteutettava ennen liittyy vaihtoehtoinen ratkaisu valmis kun ominaisuus on toteutettuna hyväksymiskriteerien mukaisesti ja toimii ominaisuus on testattu käyttäjätarinan hyväksymiskriteerien mukaisesti ja testit menevät läpi kaikki ominaisuuteen liittyvä dokumentaatio on kirjoitettu koodi noudattaa määriteltyä koodaustyyliä koodi on katselmoitua katselmoinnin on tehnyt vähintään muuta henkilöä kuin ominaisuuden koodannut henkilö ominaisuus mergetty mainiin ja toiminnassa staging ympäristössä ominaisuus on toiminnassa tuotanto ympäristössä
binary_label: 0
Unnamed: 0: 11,277
id: 14,077,948,820
type: IssuesEvent
created_at: 2020-11-04 12:51:04
repo: MEDEAEditions/DEPCHA
repo_url: https://api.github.com/repos/MEDEAEditions/DEPCHA
action: opened
title: TORDF
labels: preprocessing
body: All task, bugs and to-do relating to the **TORDF.xsl** (Ingest XML/TEI into GAMS) - [ ] `<bk:unit>LITERAL</bk:unit>` to `<bk:unit rdf:resource="https://gams.uni-graz.at/o:depcha.wheaton.1#unit.1">` if `<om:Unit> `exists; otherwise `<bk:unit>LITERAL</bk:unit>` - [ ] remove `@rdf:seeAlso` in bk:EconomicGood; references to other LOD is in `<skos:Concept>`
index: 1.0
text_combine: TORDF - All task, bugs and to-do relating to the **TORDF.xsl** (Ingest XML/TEI into GAMS) - [ ] `<bk:unit>LITERAL</bk:unit>` to `<bk:unit rdf:resource="https://gams.uni-graz.at/o:depcha.wheaton.1#unit.1">` if `<om:Unit> `exists; otherwise `<bk:unit>LITERAL</bk:unit>` - [ ] remove `@rdf:seeAlso` in bk:EconomicGood; references to other LOD is in `<skos:Concept>`
label: process
text: tordf all task bugs and to do relating to the tordf xsl ingest xml tei into gams literal to exists otherwise literal remove rdf seealso in bk economicgood references to other lod is in
binary_label: 1
Unnamed: 0: 382,409
id: 11,305,826,770
type: IssuesEvent
created_at: 2020-01-18 09:11:08
repo: kubernetes/minikube
repo_url: https://api.github.com/repos/kubernetes/minikube
action: opened
title: Investigate compression methods on the minikube.iso
labels: area/guest-vm area/performance kind/feature priority/important-longterm
body: The compression method to use is a trade-off between size and speed... We could investigate changing to a slightly bigger ISO, that boots faster.
index: 1.0
text_combine: Investigate compression methods on the minikube.iso - The compression method to use is a trade-off between size and speed... We could investigate changing to a slightly bigger ISO, that boots faster.
label: non_process
text: investigate compression methods on the minikube iso the compression method to use is a trade off between size and speed we could investigate changing to a slightly bigger iso that boots faster
binary_label: 0
Unnamed: 0: 768,779
id: 26,979,861,203
type: IssuesEvent
created_at: 2023-02-09 12:19:20
repo: aquasecurity/trivy
repo_url: https://api.github.com/repos/aquasecurity/trivy
action: closed
title: Unable to open JAR files
labels: kind/bug priority/important-soon
body: ## Description Java scanning may lead to the following error. ``` failed to analyze file: failed to analyze usr/lib/jvm/java-1.8-openjdk/lib/tools.jar: unable to open usr/lib/jvm/java-1.8-openjdk/lib/tools.jar: failed to open: unable to read the file: stream error: stream ID 9; PROTOCOL_ERROR; received from peer ``` Currently, we're investigating this issue. As a temporary mitigation, you may be able to avoid this issue by downloading the Java DB in advance. `--download-java-db-only` is available in v0.37.0+. ``` $ trivy image --download-java-db-only 2023-02-01T16:57:04.322+0900 INFO Downloading the Java DB... $ trivy image [YOUR_JAVA_IMAGE] ```
index: 1.0
text_combine: Unable to open JAR files - ## Description Java scanning may lead to the following error. ``` failed to analyze file: failed to analyze usr/lib/jvm/java-1.8-openjdk/lib/tools.jar: unable to open usr/lib/jvm/java-1.8-openjdk/lib/tools.jar: failed to open: unable to read the file: stream error: stream ID 9; PROTOCOL_ERROR; received from peer ``` Currently, we're investigating this issue. As a temporary mitigation, you may be able to avoid this issue by downloading the Java DB in advance. `--download-java-db-only` is available in v0.37.0+. ``` $ trivy image --download-java-db-only 2023-02-01T16:57:04.322+0900 INFO Downloading the Java DB... $ trivy image [YOUR_JAVA_IMAGE] ```
label: non_process
text: unable to open jar files description java scanning may lead to the following error failed to analyze file failed to analyze usr lib jvm java openjdk lib tools jar unable to open usr lib jvm java openjdk lib tools jar failed to open unable to read the file stream error stream id protocol error received from peer currently we re investigating this issue as a temporary mitigation you may be able to avoid this issue by downloading the java db in advance download java db only is available in trivy image download java db only info downloading the java db trivy image
binary_label: 0
Unnamed: 0: 75,404
id: 7,470,857,601
type: IssuesEvent
created_at: 2018-04-03 07:14:18
repo: rancher/rancher
repo_url: https://api.github.com/repos/rancher/rancher
action: closed
title: Support yaml file for pipeline
labels: area/pipeline kind/enhancement status/resolved status/to-test team/cn version/2.0
body: 1. support export a yaml file from existing pipeline 2. support create a pipeline by importing a yaml file. 3. support create a pipeline by specifiy a repos that contains the yaml file.
index: 1.0
text_combine: Support yaml file for pipeline - 1. support export a yaml file from existing pipeline 2. support create a pipeline by importing a yaml file. 3. support create a pipeline by specifiy a repos that contains the yaml file.
label: non_process
text: support yaml file for pipeline support export a yaml file from existing pipeline support create a pipeline by importing a yaml file support create a pipeline by specifiy a repos that contains the yaml file
binary_label: 0
Unnamed: 0: 570,270
id: 17,023,076,848
type: IssuesEvent
created_at: 2021-07-03 00:16:53
repo: tomhughes/trac-tickets
repo_url: https://api.github.com/repos/tomhughes/trac-tickets
action: closed
title: [PATCH] slippy maps page should use static viewer for non-javascript types
labels: Component: website Priority: trivial Resolution: fixed Type: enhancement
body: **[Submitted to the original trac issue database at 12.06pm, Friday, 11th November 2005]** The old static viewer should be in the page when it loads. If the user has javascript and a compatible browser the tiles code should replace the viewer with tiles and hide the zoom/pan links.
index: 1.0
text_combine: [PATCH] slippy maps page should use static viewer for non-javascript types - **[Submitted to the original trac issue database at 12.06pm, Friday, 11th November 2005]** The old static viewer should be in the page when it loads. If the user has javascript and a compatible browser the tiles code should replace the viewer with tiles and hide the zoom/pan links.
label: non_process
text: slippy maps page should use static viewer for non javascript types the old static viewer should be in the page when it loads if the user has javascript and a compatible browser the tiles code should replace the viewer with tiles and hide the zoom pan links
binary_label: 0
Unnamed: 0: 14,634
id: 17,768,239,356
type: IssuesEvent
created_at: 2021-08-30 10:18:36
repo: pystatgen/sgkit
repo_url: https://api.github.com/repos/pystatgen/sgkit
action: opened
title: Support macOS arm64 processors
labels: process + tools
body: I have a new Mac Mini with an Apple M1 chip. It would be good to be able to run (and develop) sgkit on this architecture.
index: 1.0
text_combine: Support macOS arm64 processors - I have a new Mac Mini with an Apple M1 chip. It would be good to be able to run (and develop) sgkit on this architecture.
label: process
text: support macos processors i have a new mac mini with an apple chip it would be good to be able to run and develop sgkit on this architecture
binary_label: 1
Unnamed: 0: 17,557
id: 6,474,590,788
type: IssuesEvent
created_at: 2017-08-17 18:26:45
repo: habitat-sh/habitat
repo_url: https://api.github.com/repos/habitat-sh/habitat
action: closed
title: Configurable publish phases in `builder.toml`
labels: A-builder C-feature
body: ## User Story As a user of builder's builds, I need a way to declare configurable publish phases which will be performed by a builder-worker after the initial build phase, so I can consume my built artifact in various post-processed formats such as a Docker container, an AMI, etc ## Requirements The `builder.toml` which resides next to a `plan.sh`/`plan.ps1` is the configuration that a builder-worker reads to understand additional steps. A refactored version of this may look something like this (with thanks to @tduffield) ```toml [[publish]] type = "docker-registry" registry = "index.docker.io" # default image_name = "{{pkg_origin}}/{{pkg_name}}" # default tags = [ "{{pkg_version}}-{{pkg_release}}" ] # default secret_key = "ENCRYPTED_VALUE" # placeholder for encrypted creds (if needed) [[publish]] type = "s3" secret_key = "ENCRYPTED_VALUE" [[publish]] type = "artifactory" secret_key = "ENCRYPTED_VALUE" ``` Some things to note: * The `type` key/value represents the exporter/publisher to use for the phase * We support an array of tables so multiple exporters of a single type can be used * We have dropped the current configuration for publishing to the depot itself. This will become a requirement of the system * Mustache can be used for string substitution for variables generated by the build phase such as pkg name, version, origin, and release
index: 1.0
text_combine: Configurable publish phases in `builder.toml` - ## User Story As a user of builder's builds, I need a way to declare configurable publish phases which will be performed by a builder-worker after the initial build phase, so I can consume my built artifact in various post-processed formats such as a Docker container, an AMI, etc ## Requirements The `builder.toml` which resides next to a `plan.sh`/`plan.ps1` is the configuration that a builder-worker reads to understand additional steps. A refactored version of this may look something like this (with thanks to @tduffield) ```toml [[publish]] type = "docker-registry" registry = "index.docker.io" # default image_name = "{{pkg_origin}}/{{pkg_name}}" # default tags = [ "{{pkg_version}}-{{pkg_release}}" ] # default secret_key = "ENCRYPTED_VALUE" # placeholder for encrypted creds (if needed) [[publish]] type = "s3" secret_key = "ENCRYPTED_VALUE" [[publish]] type = "artifactory" secret_key = "ENCRYPTED_VALUE" ``` Some things to note: * The `type` key/value represents the exporter/publisher to use for the phase * We support an array of tables so multiple exporters of a single type can be used * We have dropped the current configuration for publishing to the depot itself. This will become a requirement of the system * Mustache can be used for string substitution for variables generated by the build phase such as pkg name, version, origin, and release
label: non_process
text: configurable publish phases in builder toml user story as a user of builder s builds i need a way to declare configurable publish phases which will be performed by a builder worker after the initial build phase so i can consume my built artifact in various post processed formats such as a docker container an ami etc requirements the builder toml which resides next to a plan sh plan is the configuration that a builder worker reads to understand additional steps a refactored version of this may look something like this with thanks to tduffield toml type docker registry registry index docker io default image name pkg origin pkg name default tags default secret key encrypted value placeholder for encrypted creds if needed type secret key encrypted value type artifactory secret key encrypted value some things to note the type key value represents the exporter publisher to use for the phase we support an array of tables so multiple exporters of a single type can be used we have dropped the current configuration for publishing to the depot itself this will become a requirement of the system mustache can be used for string substitution for variables generated by the build phase such as pkg name version origin and release
binary_label: 0
Unnamed: 0: 74,949
id: 20,515,138,925
type: IssuesEvent
created_at: 2022-03-01 10:56:16
repo: tensorflow/tensorflow
repo_url: https://api.github.com/repos/tensorflow/tensorflow
action: opened
title: AttributeError: module 'tensorflow' has no attribute 'reduce_sum'
labels: type:build/install
body: Hi while running python -c "import tensorflow as tf;print(tf.reduce_sum(tf.random.normal([1000, 1000])))" To verify my Tensorflow installation I get AttributeError; AttributeError: module 'tensorflow' has no attribute 'reduce_sum' tensorflow version: Version: 2.5.0
index: 1.0
text_combine: AttributeError: module 'tensorflow' has no attribute 'reduce_sum' - Hi while running python -c "import tensorflow as tf;print(tf.reduce_sum(tf.random.normal([1000, 1000])))" To verify my Tensorflow installation I get AttributeError; AttributeError: module 'tensorflow' has no attribute 'reduce_sum' tensorflow version: Version: 2.5.0
label: non_process
text: attributeerror module tensorflow has no attribute reduce sum hi while running python c import tensorflow as tf print tf reduce sum tf random normal to verify my tensorflow installation i get attributeerror attributeerror module tensorflow has no attribute reduce sum tensorflow version version
binary_label: 0
Unnamed: 0: 13,078
id: 15,420,036,992
type: IssuesEvent
created_at: 2021-03-05 10:59:04
repo: geneontology/go-ontology
repo_url: https://api.github.com/repos/geneontology/go-ontology
action: closed
title: 'multi-species biofilm formation' has double inheritance
labels: multi-species process
body: This class currently has ‘symbiotic process’ and ‘biofilm formation’ as parent classes. I would recommend removing the **subClassOf 'symbiotic process'** axiom since this process does not directly involve the interaction between the host and the symbiont. You can always add an axiom that asserts that this is part of some ‘symbiotic process’ if you want to relate it back to that class.
index: 1.0
text_combine: 'multi-species biofilm formation' has double inheritance - This class currently has ‘symbiotic process’ and ‘biofilm formation’ as parent classes. I would recommend removing the **subClassOf 'symbiotic process'** axiom since this process does not directly involve the interaction between the host and the symbiont. You can always add an axiom that asserts that this is part of some ‘symbiotic process’ if you want to relate it back to that class.
label: process
text: multi species biofilm formation has double inheritance this class currently has ‘symbiotic process’ and ‘biofilm formation’ as parent classes i would recommend removing the subclassof symbiotic process axiom since this process does not directly involve the interaction between the host and the symbiont you can always add an axiom that asserts that this is part of some ‘symbiotic process’ if you want to relate it back to that class
binary_label: 1
Unnamed: 0: 11,389
id: 14,224,466,938
type: IssuesEvent
created_at: 2020-11-17 19:43:03
repo: kubernetes/minikube
repo_url: https://api.github.com/repos/kubernetes/minikube
action: closed
title: update kubernets version script does not generate unit tests
labels: kind/process priority/important-longterm
body: in this PR https://github.com/kubernetes/minikube/pull/9693 I used the script ``` cd hack/update/kubernetes_version go run update_kubernetes_version.go ``` it changed the constants.go and the docs but it did not change these files and didn't add a folder of unit test data ``` pkg/minikube/bootstrapper/bsutil/kubeadm_test.go pkg/minikube/bootstrapper/bsutil/testdata/v1.20.0-beta.1 ```
index: 1.0
text_combine: update kubernets version script does not generate unit tests - in this PR https://github.com/kubernetes/minikube/pull/9693 I used the script ``` cd hack/update/kubernetes_version go run update_kubernetes_version.go ``` it changed the constants.go and the docs but it did not change these files and didn't add a folder of unit test data ``` pkg/minikube/bootstrapper/bsutil/kubeadm_test.go pkg/minikube/bootstrapper/bsutil/testdata/v1.20.0-beta.1 ```
label: process
text: update kubernets version script does not generate unit tests in this pr i used the script cd hack update kubernetes version go run update kubernetes version go it changed the constants go and the docs but it did not change these files and didn t add a folder of unit test data pkg minikube bootstrapper bsutil kubeadm test go pkg minikube bootstrapper bsutil testdata beta
binary_label: 1
479,234
13,793,254,062
IssuesEvent
2020-10-09 14:42:18
rokwire/safer-illinois-app
https://api.github.com/repos/rokwire/safer-illinois-app
closed
[BUG] The 2 different guidelines is displayed for the same county "Champaign Illinois"
Priority: High Type: Bug
**Describe the bug** Two different guidelines are displayed same county "Champaign, Illinois" **To Reproduce** Steps to reproduce the behavior: 1. Install Safer Illinois app 2. Complete the COVID onboarding process 3. User logged in as a University Student on Who are you screen and verified his identity 4. Complete the COVID onboarding steps with Exposure notification and test result Consent to YES 5. COVID set up is done. COVID - 19 screen is displayed 6. Users current status is "Orange, Likely infected" 7. Tap on County Guidelines. County Guidelines - "Champaign, Illinois county" is displayed for status "Orange". 8. Tap on the drop-down arrow displayed next to "Champaign, Illinois county" and select "Champaign, Illinois" **Actual Result** Now the county Guidelines is displayed for the status Yellow color **Expected behavior** The 2 different guidelines should not display for the same county **Screenshots** If applicable, add screenshots to help explain your problem. ![CG1](https://user-images.githubusercontent.com/26231179/95095310-84829480-06f0-11eb-9f0f-cd802703b0e5.png) ![CG 2](https://user-images.githubusercontent.com/26231179/95095318-877d8500-06f0-11eb-8beb-e51599adc51c.png) ![Cg 3](https://user-images.githubusercontent.com/26231179/95095321-89474880-06f0-11eb-8d26-aeb6bfbe2072.png) **Smartphone (please complete the following information):** Device: [e.g. Android] - Version [e.g. 2.6.13] **Additional context** Add any other context about the problem here.
1.0
[BUG] The 2 different guidelines is displayed for the same county "Champaign Illinois" - **Describe the bug** Two different guidelines are displayed same county "Champaign, Illinois" **To Reproduce** Steps to reproduce the behavior: 1. Install Safer Illinois app 2. Complete the COVID onboarding process 3. User logged in as a University Student on Who are you screen and verified his identity 4. Complete the COVID onboarding steps with Exposure notification and test result Consent to YES 5. COVID set up is done. COVID - 19 screen is displayed 6. Users current status is "Orange, Likely infected" 7. Tap on County Guidelines. County Guidelines - "Champaign, Illinois county" is displayed for status "Orange". 8. Tap on the drop-down arrow displayed next to "Champaign, Illinois county" and select "Champaign, Illinois" **Actual Result** Now the county Guidelines is displayed for the status Yellow color **Expected behavior** The 2 different guidelines should not display for the same county **Screenshots** If applicable, add screenshots to help explain your problem. ![CG1](https://user-images.githubusercontent.com/26231179/95095310-84829480-06f0-11eb-9f0f-cd802703b0e5.png) ![CG 2](https://user-images.githubusercontent.com/26231179/95095318-877d8500-06f0-11eb-8beb-e51599adc51c.png) ![Cg 3](https://user-images.githubusercontent.com/26231179/95095321-89474880-06f0-11eb-8d26-aeb6bfbe2072.png) **Smartphone (please complete the following information):** Device: [e.g. Android] - Version [e.g. 2.6.13] **Additional context** Add any other context about the problem here.
non_process
the different guidelines is displayed for the same county champaign illinois describe the bug two different guidelines are displayed same county champaign illinois to reproduce steps to reproduce the behavior install safer illinois app complete the covid onboarding process user logged in as a university student on who are you screen and verified his identity complete the covid onboarding steps with exposure notification and test result consent to yes covid set up is done covid screen is displayed users current status is orange likely infected tap on county guidelines county guidelines champaign illinois county is displayed for status orange tap on the drop down arrow displayed next to champaign illinois county and select champaign illinois actual result now the county guidelines is displayed for the status yellow color expected behavior the different guidelines should not display for the same county screenshots if applicable add screenshots to help explain your problem smartphone please complete the following information device version additional context add any other context about the problem here
0
164,031
12,754,374,919
IssuesEvent
2020-06-28 05:00:21
microsoft/azure-tools-for-java
https://api.github.com/repos/microsoft/azure-tools-for-java
closed
[intelliJ][Spark on Cosmos]Warning use account when config with cosmos configurartion.
HDInsight IntelliJ Internal Test fixed
Build: azure-toolkit-for-intellij-2018.3.develop.1036.03-20-2019 Repro Steps: 1. Create a **Apache Spark on Cosmos configuration** **without Spark Clusters** and save it. 2. Right click LogQuery and run Spark livy Interactive Session Console(Scala). 3. Click Continue anyway. ![image](https://user-images.githubusercontent.com/35950097/54738768-865d3380-4bf0-11e9-95e8-dd50c264871f.png) Result: Warning message use "account" rather than "cluster". ![image](https://user-images.githubusercontent.com/35950097/54738836-c7edde80-4bf0-11e9-8865-ef03297f7899.png)
1.0
[intelliJ][Spark on Cosmos]Warning use account when config with cosmos configurartion. - Build: azure-toolkit-for-intellij-2018.3.develop.1036.03-20-2019 Repro Steps: 1. Create a **Apache Spark on Cosmos configuration** **without Spark Clusters** and save it. 2. Right click LogQuery and run Spark livy Interactive Session Console(Scala). 3. Click Continue anyway. ![image](https://user-images.githubusercontent.com/35950097/54738768-865d3380-4bf0-11e9-95e8-dd50c264871f.png) Result: Warning message use "account" rather than "cluster". ![image](https://user-images.githubusercontent.com/35950097/54738836-c7edde80-4bf0-11e9-8865-ef03297f7899.png)
non_process
warning use account when config with cosmos configurartion build azure toolkit for intellij develop repro steps create a apache spark on cosmos configuration without spark clusters and save it right click logquery and run spark livy interactive session console scala click continue anyway result warning message use account rather than cluster
0
131,887
12,493,235,050
IssuesEvent
2020-06-01 08:52:44
kubernetes-sigs/cluster-api
https://api.github.com/repos/kubernetes-sigs/cluster-api
closed
Document that dollar mark needs to be escaped for variable expansion in clusterctl
help wanted kind/documentation kind/feature lifecycle/stale
Just a small todo / note to self. When using env vars, for example a password, if it contains a dollar symbol, in addition to being single quoted, it needs to be doubled, .e.g.: The password foo$bar must be represented as: PASSWORD='foo$$bar' Looks like some variable expansion happening in viper, but can get hit. /kind feature /area documentation
1.0
Document that dollar mark needs to be escaped for variable expansion in clusterctl - Just a small todo / note to self. When using env vars, for example a password, if it contains a dollar symbol, in addition to being single quoted, it needs to be doubled, .e.g.: The password foo$bar must be represented as: PASSWORD='foo$$bar' Looks like some variable expansion happening in viper, but can get hit. /kind feature /area documentation
non_process
document that dollar mark needs to be escaped for variable expansion in clusterctl just a small todo note to self when using env vars for example a password if it contains a dollar symbol in addition to being single quoted it needs to be doubled e g the password foo bar must be represented as password foo bar looks like some variable expansion happening in viper but can get hit kind feature area documentation
0
11,070
13,906,034,008
IssuesEvent
2020-10-20 10:41:24
pystatgen/sgkit
https://api.github.com/repos/pystatgen/sgkit
closed
Simulate genotypes for unit tests
core operations process + tools
I'm trying to simulate some genotype calls for unit tests using a simple model from `msprime`. At @jeromekelleher's suggestion, I tried this: ```python # Using 1.x code, not yet released !pip install git+https://github.com/tskit-dev/msprime.git#egg=msprime import msprime model = msprime.Demography.island_model(2, migration_rate=1e-3, Ne=10**4) samples = model.sample(10, 10) # 10 haploids from each pop ts = msprime.simulate( samples=samples, demography=model, # what should this be? mutation_rate=1e-8, recombination_rate=1e-8 ) list(ts.variants()) [] ``` What is `demography` supposed to be? I saw a "FIXME" for documenting it in `simulate` but I wasn't sure what to set it to if not the `Demography` instance that comes back from `island_model`. Also, is this why I'm not getting any variants back out? I am ultimately trying to simulate hardy weinberg equilibrium. I'd like to have some fraction of the variants be in or near perfect equilibrium and the remainder way out of it. If you get a chance, could I get an assist @jeromekelleher?
1.0
Simulate genotypes for unit tests - I'm trying to simulate some genotype calls for unit tests using a simple model from `msprime`. At @jeromekelleher's suggestion, I tried this: ```python # Using 1.x code, not yet released !pip install git+https://github.com/tskit-dev/msprime.git#egg=msprime import msprime model = msprime.Demography.island_model(2, migration_rate=1e-3, Ne=10**4) samples = model.sample(10, 10) # 10 haploids from each pop ts = msprime.simulate( samples=samples, demography=model, # what should this be? mutation_rate=1e-8, recombination_rate=1e-8 ) list(ts.variants()) [] ``` What is `demography` supposed to be? I saw a "FIXME" for documenting it in `simulate` but I wasn't sure what to set it to if not the `Demography` instance that comes back from `island_model`. Also, is this why I'm not getting any variants back out? I am ultimately trying to simulate hardy weinberg equilibrium. I'd like to have some fraction of the variants be in or near perfect equilibrium and the remainder way out of it. If you get a chance, could I get an assist @jeromekelleher?
process
simulate genotypes for unit tests i m trying to simulate some genotype calls for unit tests using a simple model from msprime at jeromekelleher s suggestion i tried this python using x code not yet released pip install git import msprime model msprime demography island model migration rate ne samples model sample haploids from each pop ts msprime simulate samples samples demography model what should this be mutation rate recombination rate list ts variants what is demography supposed to be i saw a fixme for documenting it in simulate but i wasn t sure what to set it to if not the demography instance that comes back from island model also is this why i m not getting any variants back out i am ultimately trying to simulate hardy weinberg equilibrium i d like to have some fraction of the variants be in or near perfect equilibrium and the remainder way out of it if you get a chance could i get an assist jeromekelleher
1
12,416
14,921,184,108
IssuesEvent
2021-01-23 08:52:08
threefoldfoundation/tft-stellar
https://api.github.com/repos/threefoldfoundation/tft-stellar
closed
Possibility for the unlock service to use an external storage like etcd or mongo
process_wontfix type_feature
Currently, if the node running the unlock service fails or the image is updated, the data inside the image is also lost. In order to achieve ha, rolling upgrades and node failure must be tolerated.
1.0
Possibility for the unlock service to use an external storage like etcd or mongo - Currently, if the node running the unlock service fails or the image is updated, the data inside the image is also lost. In order to achieve ha, rolling upgrades and node failure must be tolerated.
process
possibility for the unlock service to use an external storage like etcd or mongo currently if the node running the unlock service fails or the image is updated the data inside the image is also lost in order to achieve ha rolling upgrades and node failure must be tolerated
1
73,995
24,897,916,559
IssuesEvent
2022-10-28 17:38:41
matrix-org/synapse
https://api.github.com/repos/matrix-org/synapse
closed
Mautrix Bridges are not bridging messages since upgrading to version 1.70.0
A-Application-Service S-Major T-Defect X-Regression O-Occasional
### Description Since upgrading to version 1.70.0, mautrix-instagram, mautrix-discord, mautrix-facebook and mautrix-telegram are not bridging mesaging from Matrix to the requested service. Any communication with the bot is also impossible (`help`, `ping`, etc). Maubot is unaffected. ### Steps to reproduce - install and configure a mautrix-based bridge (instagram, facebook, discord or telegram) on 1.69 - upgrade to 1.70 - send a message from matrix in a bridged channel ### Homeserver Privately hosted homeserver ### Synapse Version 1.70.0 (Python 3.9.2) ### Installation Method Synapse is installed with `pip install --upgrade matrix-synapse[all]`, which compiles the project from sources. ### Platform Bridges are manually installed from their git repo. Host server is a Raspberry Pi 3 running Debian 11.5. ### Relevant log output ```shell No logs seems to point to an error. ``` ### Anything else that would be useful to know? Reverting to synapse 1.69 fixes the issue.
1.0
Mautrix Bridges are not bridging messages since upgrading to version 1.70.0 - ### Description Since upgrading to version 1.70.0, mautrix-instagram, mautrix-discord, mautrix-facebook and mautrix-telegram are not bridging mesaging from Matrix to the requested service. Any communication with the bot is also impossible (`help`, `ping`, etc). Maubot is unaffected. ### Steps to reproduce - install and configure a mautrix-based bridge (instagram, facebook, discord or telegram) on 1.69 - upgrade to 1.70 - send a message from matrix in a bridged channel ### Homeserver Privately hosted homeserver ### Synapse Version 1.70.0 (Python 3.9.2) ### Installation Method Synapse is installed with `pip install --upgrade matrix-synapse[all]`, which compiles the project from sources. ### Platform Bridges are manually installed from their git repo. Host server is a Raspberry Pi 3 running Debian 11.5. ### Relevant log output ```shell No logs seems to point to an error. ``` ### Anything else that would be useful to know? Reverting to synapse 1.69 fixes the issue.
non_process
mautrix bridges are not bridging messages since upgrading to version description since upgrading to version mautrix instagram mautrix discord mautrix facebook and mautrix telegram are not bridging mesaging from matrix to the requested service any communication with the bot is also impossible help ping etc maubot is unaffected steps to reproduce install and configure a mautrix based bridge instagram facebook discord or telegram on upgrade to send a message from matrix in a bridged channel homeserver privately hosted homeserver synapse version python installation method synapse is installed with pip install upgrade matrix synapse which compiles the project from sources platform bridges are manually installed from their git repo host server is a raspberry pi running debian relevant log output shell no logs seems to point to an error anything else that would be useful to know reverting to synapse fixes the issue
0
20,442
27,100,573,846
IssuesEvent
2023-02-15 08:19:03
billingran/Newsletter
https://api.github.com/repos/billingran/Newsletter
closed
Éviter d'enregistrer plusieurs fois le même utilisateur
processing... Brief 2
- [ ] Éviter d'enregistrer plusieurs fois le même utilisateur lorsqu'on rafraichit la page après avoir validé le formulaire avec succès.
1.0
Éviter d'enregistrer plusieurs fois le même utilisateur - - [ ] Éviter d'enregistrer plusieurs fois le même utilisateur lorsqu'on rafraichit la page après avoir validé le formulaire avec succès.
process
éviter d enregistrer plusieurs fois le même utilisateur éviter d enregistrer plusieurs fois le même utilisateur lorsqu on rafraichit la page après avoir validé le formulaire avec succès
1
17,813
23,741,281,329
IssuesEvent
2022-08-31 12:39:45
km4ack/patmenu2
https://api.github.com/repos/km4ack/patmenu2
closed
Add WL2K_MOBILES
enhancement in process
add WL2K_MOBILES to the position report section of Pat Menu. Send the request to INQUIRY with a subject of REQUEST. This will request a list of 100 position reports instead of the 30 delivered when WL2K_USERS is requested.
1.0
Add WL2K_MOBILES - add WL2K_MOBILES to the position report section of Pat Menu. Send the request to INQUIRY with a subject of REQUEST. This will request a list of 100 position reports instead of the 30 delivered when WL2K_USERS is requested.
process
add mobiles add mobiles to the position report section of pat menu send the request to inquiry with a subject of request this will request a list of position reports instead of the delivered when users is requested
1
17,850
23,795,761,483
IssuesEvent
2022-09-02 19:27:43
ncbo/bioportal-project
https://api.github.com/repos/ncbo/bioportal-project
closed
New submission creation for AIO fails with duplicate resource ID error
ontology processing problem
End user reported on the support list that they were unable to create a new submission for the [AIO ontology](https://bioportal.bioontology.org/ontologies/AIO). Steps to reproduce: * Navigate to the AIO [summary page](https://bioportal.bioontology.org/ontologies/AIO). * Click the Add submission button * On the [resulting form](https://bioportal.bioontology.org/ontologies/AIO/submissions/new), click the Add submission button Form fails to submit with the following error displayed: ![Screen Shot 2022-09-02 at 11 53 53 AM](https://user-images.githubusercontent.com/1696923/188219904-8bc3c97c-553a-4780-bba8-5cf812fb8087.png) Plain text error: ``` [:error, #<OpenStruct links=nil, context=nil, proc_naming=#<OpenStruct links=nil, context=nil, duplicate="There is already a persistent resource with id `http://data.bioontology.org/ontologies/AIO/submissions/2`">>] ```
1.0
New submission creation for AIO fails with duplicate resource ID error - End user reported on the support list that they were unable to create a new submission for the [AIO ontology](https://bioportal.bioontology.org/ontologies/AIO). Steps to reproduce: * Navigate to the AIO [summary page](https://bioportal.bioontology.org/ontologies/AIO). * Click the Add submission button * On the [resulting form](https://bioportal.bioontology.org/ontologies/AIO/submissions/new), click the Add submission button Form fails to submit with the following error displayed: ![Screen Shot 2022-09-02 at 11 53 53 AM](https://user-images.githubusercontent.com/1696923/188219904-8bc3c97c-553a-4780-bba8-5cf812fb8087.png) Plain text error: ``` [:error, #<OpenStruct links=nil, context=nil, proc_naming=#<OpenStruct links=nil, context=nil, duplicate="There is already a persistent resource with id `http://data.bioontology.org/ontologies/AIO/submissions/2`">>] ```
process
new submission creation for aio fails with duplicate resource id error end user reported on the support list that they were unable to create a new submission for the steps to reproduce navigate to the aio click the add submission button on the click the add submission button form fails to submit with the following error displayed plain text error
1
409,539
27,742,443,708
IssuesEvent
2023-03-15 15:04:46
scylladb/scylla-operator
https://api.github.com/repos/scylladb/scylla-operator
closed
Update docs version
kind/documentation priority/important-longterm
We should bump our sphinx theme https://sphinx-theme.scylladb.com/stable/upgrade/1-2-to-1-3.html and also bump pytest to at least 7.2.0 when at it. Also note that some of the upgrade instruction don't apply to our repo or need to be modified based on your judgment to maintain CI invariants. this should help you bootstrap ``` podman run -it --rm -v="$( pwd ):/go/$( go list -m )" --workdir="/go/$( go list -m )/docs" -p 5500:5500 ubuntu:20.04 bash -c 'apt-get update && apt-get install -y curl python3 python3-distutils make git && ./hack/install-poetry.sh && $HOME/.poetry/bin/poetry update && make multiversion && make -C docs multiversionpreview' ```
1.0
Update docs version - We should bump our sphinx theme https://sphinx-theme.scylladb.com/stable/upgrade/1-2-to-1-3.html and also bump pytest to at least 7.2.0 when at it. Also note that some of the upgrade instruction don't apply to our repo or need to be modified based on your judgment to maintain CI invariants. this should help you bootstrap ``` podman run -it --rm -v="$( pwd ):/go/$( go list -m )" --workdir="/go/$( go list -m )/docs" -p 5500:5500 ubuntu:20.04 bash -c 'apt-get update && apt-get install -y curl python3 python3-distutils make git && ./hack/install-poetry.sh && $HOME/.poetry/bin/poetry update && make multiversion && make -C docs multiversionpreview' ```
non_process
update docs version we should bump our sphinx theme and also bump pytest to at least when at it also note that some of the upgrade instruction don t apply to our repo or need to be modified based on your judgment to maintain ci invariants this should help you bootstrap podman run it rm v pwd go go list m workdir go go list m docs p ubuntu bash c apt get update apt get install y curl distutils make git hack install poetry sh home poetry bin poetry update make multiversion make c docs multiversionpreview
0
18,404
24,543,476,302
IssuesEvent
2022-10-12 06:51:42
home-climate-control/dz
https://api.github.com/repos/home-climate-control/dz
closed
Feature #2: replace bang-bang control mechanism with PI controller
enhancement usability fault tolerance process control reactive-only
### Expected Behavior Control process emits signal producing reasonable HVAC unit uptimes and downtimes. ### Actual Behavior Jitter is unbound. This will damage older or cheaper HVAC units that don't have internal protection from control short cycling. ### Mitigation Replace bang-bang with PI controller (PID is not necessary).
1.0
Feature #2: replace bang-bang control mechanism with PI controller - ### Expected Behavior Control process emits signal producing reasonable HVAC unit uptimes and downtimes. ### Actual Behavior Jitter is unbound. This will damage older or cheaper HVAC units that don't have internal protection from control short cycling. ### Mitigation Replace bang-bang with PI controller (PID is not necessary).
process
feature replace bang bang control mechanism with pi controller expected behavior control process emits signal producing reasonable hvac unit uptimes and downtimes actual behavior jitter is unbound this will damage older or cheaper hvac units that don t have internal protection from control short cycling mitigation replace bang bang with pi controller pid is not necessary
1
759,878
26,616,019,026
IssuesEvent
2023-01-24 07:10:11
robolaunch/central-orchestrator
https://api.github.com/repos/robolaunch/central-orchestrator
opened
user operations refactor
refactor medium priority
### What would you like to be added? Refactor the all user, organization, and team operations. ### Why is this needed? Ensurance
1.0
user operations refactor - ### What would you like to be added? Refactor the all user, organization, and team operations. ### Why is this needed? Ensurance
non_process
user operations refactor what would you like to be added refactor the all user organization and team operations why is this needed ensurance
0
7,453
10,560,782,972
IssuesEvent
2019-10-04 14:34:23
johang88/triton
https://api.github.com/repos/johang88/triton
opened
Add content type to content meta data
content processor
This will allow collision meshes to be created without hacks.
1.0
Add content type to content meta data - This will allow collision meshes to be created without hacks.
process
add content type to content meta data this will allow collision meshes to be created without hacks
1
20,759
27,492,774,604
IssuesEvent
2023-03-04 20:38:42
Azure/azure-sdk-tools
https://api.github.com/repos/Azure/azure-sdk-tools
closed
Define role, responsibilities, and more for Service buddy
Engagement Experience WS: Process Tools & Automation
The purpose of this Epic is to focus on defining the roles and responsibilities, tools for people to do their job, and mechanisms to build in accountability for the service buddy for Cadl
1.0
Define role, responsibilities, and more for Service buddy - The purpose of this Epic is to focus on defining the roles and responsibilities, tools for people to do their job, and mechanisms to build in accountability for the service buddy for Cadl
process
define role responsibilities and more for service buddy the purpose of this epic is to focus on defining the roles and responsibilities tools for people to do their job and mechanisms to build in accountability for the service buddy for cadl
1
35,955
2,793,980,912
IssuesEvent
2015-05-11 14:26:26
mozilla/marketplace-tests
https://api.github.com/repos/mozilla/marketplace-tests
closed
Add a setup.cfg file to configure the behaviour of flake8
Community difficulty beginner priority medium
This is a simple task: 1. Add a new file to the repo called `setup.cfg`. The file should look just like the file found at https://github.com/mozilla/mcom-tests/blob/master/setup.cfg 2. Update the `.travis.yml` file so the `script:` line reads: ```yml script: "flake8 ." ```
1.0
Add a setup.cfg file to configure the behaviour of flake8 - This is a simple task: 1. Add a new file to the repo called `setup.cfg`. The file should look just like the file found at https://github.com/mozilla/mcom-tests/blob/master/setup.cfg 2. Update the `.travis.yml` file so the `script:` line reads: ```yml script: "flake8 ." ```
non_process
add a setup cfg file to configure the behaviour of this is a simple task add a new file to the repo called setup cfg the file should look just like the file found at update the travis yml file so the script line reads yml script
0
15,768
19,913,877,896
IssuesEvent
2022-01-25 20:09:55
input-output-hk/high-assurance-legacy
https://api.github.com/repos/input-output-hk/high-assurance-legacy
closed
Make all small types possible as data types
type: enhancement reason: wontfix language: isabelle topic: process calculus
At its core, our process calculus implementation is untyped in the sense that there is a single type for channels, `chan`, and a single type for values, `val`. However, we have a typed layer, which allows us to use typed channels (using the type constructor `channel`) as well as data of various types. Typed channels and other data are encoded as untyped channels and values. As a result, the sizes of the types we’re using are restricted by the sizes of `chan` and `val`. Currently, we force types `'a channel` to be countable and require other data types to be countable as well. This has the annoying consequence that we can treat neither real numbers nor functions on data as data, meaning we cannot communicate them over channels. With the new AFP entry [`ZFC_in_HOL`][zfc-in-hol], we have a very good tool for relaxing this restriction. We plan to change the requirement of data types being countable to the requirement of them being _small_ in the sense of `ZFC_in_HOL`. The type of real numbers is certainly small and, more importantly, small types are closed under function space construction; so this change would solve the above described problem. It would not allow us to treat the type `V` of Zermelo–Fraenkel sets as a data type though, but then again our point is to support rich typing, while ZFC is untyped. [zfc-in-hol]: https://www.isa-afp.org/entries/ZFC_in_HOL.html "Zermelo Fraenkel Set Theory in Higher-Order Logic"
1.0
Make all small types possible as data types - At its core, our process calculus implementation is untyped in the sense that there is a single type for channels, `chan`, and a single type for values, `val`. However, we have a typed layer, which allows us to use typed channels (using the type constructor `channel`) as well as data of various types. Typed channels and other data are encoded as untyped channels and values. As a result, the sizes of the types we’re using are restricted by the sizes of `chan` and `val`. Currently, we force types `'a channel` to be countable and require other data types to be countable as well. This has the annoying consequence that we can treat neither real numbers nor functions on data as data, meaning we cannot communicate them over channels. With the new AFP entry [`ZFC_in_HOL`][zfc-in-hol], we have a very good tool for relaxing this restriction. We plan to change the requirement of data types being countable to the requirement of them being _small_ in the sense of `ZFC_in_HOL`. The type of real numbers is certainly small and, more importantly, small types are closed under function space construction; so this change would solve the above described problem. It would not allow us to treat the type `V` of Zermelo–Fraenkel sets as a data type though, but then again our point is to support rich typing, while ZFC is untyped. [zfc-in-hol]: https://www.isa-afp.org/entries/ZFC_in_HOL.html "Zermelo Fraenkel Set Theory in Higher-Order Logic"
process
make all small types possible as data types at its core our process calculus implementation is untyped in the sense that there is a single type for channels chan and a single type for values val however we have a typed layer which allows us to use typed channels using the type constructor channel as well as data of various types typed channels and other data are encoded as untyped channels and values as a result the sizes of the types we’re using are restricted by the sizes of chan and val currently we force types a channel to be countable and require other data types to be countable as well this has the annoying consequence that we can treat neither real numbers nor functions on data as data meaning we cannot communicate them over channels with the new afp entry we have a very good tool for relaxing this restriction we plan to change the requirement of data types being countable to the requirement of them being small in the sense of zfc in hol the type of real numbers is certainly small and more importantly small types are closed under function space construction so this change would solve the above described problem it would not allow us to treat the type v of zermelo–fraenkel sets as a data type though but then again our point is to support rich typing while zfc is untyped zermelo fraenkel set theory in higher order logic
1
30,416
11,824,944,561
IssuesEvent
2020-03-21 09:53:40
mchrapek/studia-projekt-zespolowy-backend
https://api.github.com/repos/mchrapek/studia-projekt-zespolowy-backend
closed
Możliwość zablokowania użytkownika przez administratora
core function security
[Kryteria akceptacji] Jako administrator mogę zablokować użytkownika, co powoduje blokadę logowania użytkownika.
True
Możliwość zablokowania użytkownika przez administratora - [Kryteria akceptacji] Jako administrator mogę zablokować użytkownika, co powoduje blokadę logowania użytkownika.
non_process
możliwość zablokowania użytkownika przez administratora jako administrator mogę zablokować użytkownika co powoduje blokadę logowania użytkownika
0
17,761
23,690,934,550
IssuesEvent
2022-08-29 10:43:17
apache/arrow-rs
https://api.github.com/repos/apache/arrow-rs
closed
Release Arrow `21.0.0` (next release after `20.0.0`)
development-process
Follow on from https://github.com/apache/arrow-rs/issues/2172 * Planned Release Candidate: ~2022-08-19~ 2022-08-18 * Planned Release and Publish to crates.io: ~2022-08-22~ 2022-08-22 Items: - [x] https://github.com/apache/arrow-rs/issues/2338 - [x] https://github.com/apache/arrow-rs/pull/2339 - [x] https://github.com/apache/arrow-rs/pull/2483 - [x] https://github.com/apache/arrow-rs/pull/2506 - [x] https://lists.apache.org/thread/g4nzgd57w6rbt7o4tkqpbdmrqg8tqrxz - [x] Release candidate approved - [x] Release to crates.io - [ ] Draft update to DataFusion: See full list here: https://github.com/apache/arrow-rs/compare/20.0.0...master
1.0
Release Arrow `21.0.0` (next release after `20.0.0`) - Follow on from https://github.com/apache/arrow-rs/issues/2172 * Planned Release Candidate: ~2022-08-19~ 2022-08-18 * Planned Release and Publish to crates.io: ~2022-08-22~ 2022-08-22 Items: - [x] https://github.com/apache/arrow-rs/issues/2338 - [x] https://github.com/apache/arrow-rs/pull/2339 - [x] https://github.com/apache/arrow-rs/pull/2483 - [x] https://github.com/apache/arrow-rs/pull/2506 - [x] https://lists.apache.org/thread/g4nzgd57w6rbt7o4tkqpbdmrqg8tqrxz - [x] Release candidate approved - [x] Release to crates.io - [ ] Draft update to DataFusion: See full list here: https://github.com/apache/arrow-rs/compare/20.0.0...master
process
release arrow next release after follow on from planned release candidate planned release and publish to crates io items release candidate approved release to crates io draft update to datafusion see full list here
1
269,131
8,432,286,133
IssuesEvent
2018-10-17 01:10:46
alan345/nacho
https://api.github.com/repos/alan345/nacho
closed
Minimize API documentation
High priority
Remove everything except signupDate and trialEndDate from POST. Remove all other Optional fields from definitions in introduction.
1.0
Minimize API documentation - Remove everything except signupDate and trialEndDate from POST. Remove all other Optional fields from definitions in introduction.
non_process
minimize api documentation remove everything except signupdate and trialenddate from post remove all other optional fields from definitions in introduction
0
10,252
13,105,618,185
IssuesEvent
2020-08-04 12:31:02
Explore-AI/test-repo
https://api.github.com/repos/Explore-AI/test-repo
opened
JHC-TAG-TEST-PP
bug content-type:pre-processing student-submitted unread
Content Type: Pre-Processing Content Name: JHC-TAG-TEST-PP Problem: The title is weird Additional Details: I think this is a piece of test content that we shouldn't be able to see Supporting Files: Reported by: jacob@explore-ai.net
1.0
JHC-TAG-TEST-PP - Content Type: Pre-Processing Content Name: JHC-TAG-TEST-PP Problem: The title is weird Additional Details: I think this is a piece of test content that we shouldn't be able to see Supporting Files: Reported by: jacob@explore-ai.net
process
jhc tag test pp content type pre processing content name jhc tag test pp problem the title is weird additional details i think this is a piece of test content that we shouldn t be able to see supporting files reported by jacob explore ai net
1
4,197
7,156,581,005
IssuesEvent
2018-01-26 16:45:08
HDLOfficial/Dark-Cave
https://api.github.com/repos/HDLOfficial/Dark-Cave
closed
Spelling Error in file README.md
Processing Issue
last line first word says "Pyscic" should say Psychic (sorry to be nit picky) By the way, I am interested in making a dialogue storyboard
1.0
Spelling Error in file README.md - last line first word says "Pyscic" should say Psychic (sorry to be nit picky) By the way, I am interested in making a dialogue storyboard
process
spelling error in file readme md last line first word says pyscic should say psychic sorry to be nit picky by the way i am interested in making a dialogue storyboard
1
49,620
20,824,857,111
IssuesEvent
2022-03-18 19:27:52
mitin20/bb-upptime
https://api.github.com/repos/mitin20/bb-upptime
closed
🛑 Promerica-renditionservice/qa is down
status promerica-renditionservice-qa
In [`d14bf76`](https://github.com/mitin20/bb-upptime/commit/d14bf760937660e259da40b8a648e68682f14a9a ), Promerica-renditionservice/qa (https://edbsqa.pfcti.com/api/portal/actuator/health) was **down**: - HTTP code: 500 - Response time: 16 ms
1.0
🛑 Promerica-renditionservice/qa is down - In [`d14bf76`](https://github.com/mitin20/bb-upptime/commit/d14bf760937660e259da40b8a648e68682f14a9a ), Promerica-renditionservice/qa (https://edbsqa.pfcti.com/api/portal/actuator/health) was **down**: - HTTP code: 500 - Response time: 16 ms
non_process
🛑 promerica renditionservice qa is down in promerica renditionservice qa was down http code response time ms
0
12,535
14,972,455,387
IssuesEvent
2021-01-27 22:53:46
BootBlock/FileSieve
https://api.github.com/repos/BootBlock/FileSieve
closed
Give updates during the file scanning stage
processing ui
When FileSieve is performing the initial scanning of the Source Items' files that should be processed, it should display something that assures the user that things haven't gotten "stuck". This became especially required with the changes made to the Classification method as it can perform an enormous amount of work.
1.0
Give updates during the file scanning stage - When FileSieve is performing the initial scanning of the Source Items' files that should be processed, it should display something that assures the user that things haven't gotten "stuck". This became especially required with the changes made to the Classification method as it can perform an enormous amount of work.
process
give updates during the file scanning stage when filesieve is performing the initial scanning of the source items files that should be processed it should display something that assures the user that things haven t gotten stuck this became especially required with the changes made to the classification method as it can perform an enormous amount of work
1
195,424
14,728,807,200
IssuesEvent
2021-01-06 10:28:06
input-output-hk/cardano-wallet
https://api.github.com/repos/input-output-hk/cardano-wallet
reopened
Flaky Network.Wai.Middleware.Logging, Logging Middleware, GET, 200, no query
Test failure
# Context <!-- WHEN CREATED Any information that is useful to understand the bug and the subsystem it evolves in. References to documentation and or other tickets are welcome. --> https://github.com/input-output-hk/cardano-wallet/pull/2156#issuecomment-704040406 # Test Case <!-- WHEN CREATED A link to the test scenario or property that is failing. If short enough, put the whole test for readability. --> https://github.com/input-output-hk/cardano-wallet/blob/master/lib/core/test/unit/Network/Wai/Middleware/LoggingSpec.hs#L119-L128 # Failure / Counter-example <!-- WHEN CREATED For failing unit or integration tests, put here the output of the test runner. For properties, also provide the counter example given by QuickCheck. --> ``` Failures: test/unit/Network/Wai/Middleware/LoggingSpec.hs:119:5: 1) Network.Wai.Middleware.Logging, Logging Middleware, GET, 200, no query uncaught exception: HttpException HttpExceptionRequest Request { host = "localhost" port = 56698 secure = False requestHeaders = [] path = "/get" queryString = "" method = "GET" proxy = Nothing rawBody = False redirectCount = 10 responseTimeout = ResponseTimeoutDefault requestVersion = HTTP/1.1 } ConnectionTimeout To rerun use: --match "/Network.Wai.Middleware.Logging/Logging Middleware/GET, 200, no query/" ``` --- # Resolution <!-- WHEN IN PROGRESS What is happening? How is this going to be fixed? Detail the approach and give, in the form of a TODO list steps toward the resolution of the bug. Attach a PR to each item in the list. This may be refined as the investigation progresses. --> --- # QA <!-- WHEN IN PROGRESS How do we make sure the bug has been fixed? Give here manual steps or tests to verify the fix. How/why could this bug slip through testing? -->
1.0
Flaky Network.Wai.Middleware.Logging, Logging Middleware, GET, 200, no query - # Context <!-- WHEN CREATED Any information that is useful to understand the bug and the subsystem it evolves in. References to documentation and or other tickets are welcome. --> https://github.com/input-output-hk/cardano-wallet/pull/2156#issuecomment-704040406 # Test Case <!-- WHEN CREATED A link to the test scenario or property that is failing. If short enough, put the whole test for readability. --> https://github.com/input-output-hk/cardano-wallet/blob/master/lib/core/test/unit/Network/Wai/Middleware/LoggingSpec.hs#L119-L128 # Failure / Counter-example <!-- WHEN CREATED For failing unit or integration tests, put here the output of the test runner. For properties, also provide the counter example given by QuickCheck. --> ``` Failures: test/unit/Network/Wai/Middleware/LoggingSpec.hs:119:5: 1) Network.Wai.Middleware.Logging, Logging Middleware, GET, 200, no query uncaught exception: HttpException HttpExceptionRequest Request { host = "localhost" port = 56698 secure = False requestHeaders = [] path = "/get" queryString = "" method = "GET" proxy = Nothing rawBody = False redirectCount = 10 responseTimeout = ResponseTimeoutDefault requestVersion = HTTP/1.1 } ConnectionTimeout To rerun use: --match "/Network.Wai.Middleware.Logging/Logging Middleware/GET, 200, no query/" ``` --- # Resolution <!-- WHEN IN PROGRESS What is happening? How is this going to be fixed? Detail the approach and give, in the form of a TODO list steps toward the resolution of the bug. Attach a PR to each item in the list. This may be refined as the investigation progresses. --> --- # QA <!-- WHEN IN PROGRESS How do we make sure the bug has been fixed? Give here manual steps or tests to verify the fix. How/why could this bug slip through testing? -->
non_process
flaky network wai middleware logging logging middleware get no query context when created any information that is useful to understand the bug and the subsystem it evolves in references to documentation and or other tickets are welcome test case when created a link to the test scenario or property that is failing if short enough put the whole test for readability failure counter example when created for failing unit or integration tests put here the output of the test runner for properties also provide the counter example given by quickcheck failures test unit network wai middleware loggingspec hs network wai middleware logging logging middleware get no query uncaught exception httpexception httpexceptionrequest request host localhost port secure false requestheaders path get querystring method get proxy nothing rawbody false redirectcount responsetimeout responsetimeoutdefault requestversion http connectiontimeout to rerun use match network wai middleware logging logging middleware get no query resolution when in progress what is happening how is this going to be fixed detail the approach and give in the form of a todo list steps toward the resolution of the bug attach a pr to each item in the list this may be refined as the investigation progresses qa when in progress how do we make sure the bug has been fixed give here manual steps or tests to verify the fix how why could this bug slip through testing
0
4,957
7,802,133,789
IssuesEvent
2018-06-10 08:34:12
Eliyayalon/Celiac-Israel
https://api.github.com/repos/Eliyayalon/Celiac-Israel
closed
add input filed - AdminApp
2 in process iter3
Todo: in add edit screen page - [x] add input field of type business - [x] change the feild of food type to restaurant type
1.0
add input filed - AdminApp - Todo: in add edit screen page - [x] add input field of type business - [x] change the feild of food type to restaurant type
process
add input filed adminapp todo in add edit screen page add input field of type business change the feild of food type to restaurant type
1
19,500
25,809,434,258
IssuesEvent
2022-12-11 18:10:40
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
Obsolete GO:0046752 viral capsid precursor transport to host cell nucleus?
obsoletion multi-species process
GO:0046752 viral capsid precursor transport to host cell nucleus is defined as 'Any process in which viral capsid precursors are transported to a specific location in the nucleus, thus accumulating the necessary components for assembly of a capsid. ' - but I dont think this is a specific transport process, this is just specifying the targets. No annotations, no mappings @pmasson55 @genegodbold Is this a useful term ?
1.0
Obsolete GO:0046752 viral capsid precursor transport to host cell nucleus? - GO:0046752 viral capsid precursor transport to host cell nucleus is defined as 'Any process in which viral capsid precursors are transported to a specific location in the nucleus, thus accumulating the necessary components for assembly of a capsid. ' - but I dont think this is a specific transport process, this is just specifying the targets. No annotations, no mappings @pmasson55 @genegodbold Is this a useful term ?
process
obsolete go viral capsid precursor transport to host cell nucleus go viral capsid precursor transport to host cell nucleus is defined as any process in which viral capsid precursors are transported to a specific location in the nucleus thus accumulating the necessary components for assembly of a capsid but i dont think this is a specific transport process this is just specifying the targets no annotations no mappings genegodbold is this a useful term
1
9,463
12,440,750,264
IssuesEvent
2020-05-26 12:33:59
Arch666Angel/mods
https://api.github.com/repos/Arch666Angel/mods
closed
Crystal Brainstorming
Angels Bio Processing Enhancement
Santa asked us to brainstorm ideas on crystals on discord. This is a record of thoughts: Points of data: - Modules 1 is Green science - you need to be able to make & use them with nothing but green tech. - What is the theme of "Bio Processing" that we want to follow? Our posit - BIO is about simple recipies, that take time & space Issue 1: Dealing with crystals is too hard, especially while in beta - Make an option to turn them off - Or, make them optional (they make modules and/or beacons better, but you can make them without if you want) - e.g. leave all module recipies as-is without crystals. Then make a tier 1.5/2.5/3.5/etc that use crystals to amp-up the effects, or make more efficient, or use less power, or etc. - Or, make them tiered (low-level beacons/modules don't need any crystals, but higer-tiers do) Issue 2: What do crystals do? - Consider making crystals required for beacons, but NOT modules. Issue 3: Make the on-ramp to crystals smoother - Can get splinters from fish, and shards from puffers (lower quantities) - Fish: Need a way to get rid of the polluted fish water without clarifiers; maybe turn it into saline & sulphuric? - When you slaughter fish for crystals what else do you get? Meat to warehouse for (much, much) later? Or is there a way with no meat? - Alternate idea: don't make splinters from fish. Make splinters by growing them from polluted fish water (crystal dust + polluted fish water --> splinters & clean water). Crystal dust can come from farming or geodes. 
- alternate alternate: cleaning polluted water in a hydro plant gives a % chance of a crystal splinter, and turns fish water into saline + sulphuric (or whatever) - puffers - puffer eggs + crystal dust -> crystal shard + acid gas (or similar) - two/three different types of puffers puffing at the same time -> polluted puffer atmosphere, polluted puffer atmosphere + crystal dust -> shards - feed crystal dust + nutrien paste to puffers -> some puffers die, but you get shards instead of gas out. (alternative to puffing) - have raw shards a byproduct of (one of) the breeding recipes - I am currently having the following thought: All the animalis processing revolves around three different steps: - breeding - petting/puffing/zoo - slaughtering for meat - What if the three different crystals would come out of these different processes but at different tiers. That means: splinters from fish petting (polluted fish water), shards from breeding of puffers, crystals from slaughtering biters. - Alternatively, we can try to get the crystals always out of the same one of these steps: e.g. always slaughtering - Can you cut the full crystals into the smaller ones? (so once you have big biters, you can make everything from that?) Issue 4: Alien Spores / Eggsperiments / Alien Meat - Perchloric acid for the spores seems a little too high-tech (needs gold science & tungsten due to thermal water). - Are the mushredtatoes (from the rare swamp garden seed) enough to make this doable before gold science? - Butchery - too slow; cutting up for meat is super-fast. Recipes should be very short time. - Issue 5: Tech balancing - Splinters are green science (modules 1) - Assume shards are blue, and full crystals purple. - Check the ingredients for everything and see what matches. - If higher tier ingredients are required, they either need to be removed, or an alternate/inefficient method added
1.0
Crystal Brainstorming - Santa asked us to brainstorm ideas on crystals on discord. This is a record of thoughts: Points of data: - Modules 1 is Green science - you need to be able to make & use them with nothing but green tech. - What is the theme of "Bio Processing" that we want to follow? Our posit - BIO is about simple recipies, that take time & space Issue 1: Dealing with crystals is too hard, especially while in beta - Make an option to turn them off - Or, make them optional (they make modules and/or beacons better, but you can make them without if you want) - e.g. leave all module recipies as-is without crystals. Then make a tier 1.5/2.5/3.5/etc that use crystals to amp-up the effects, or make more efficient, or use less power, or etc. - Or, make them tiered (low-level beacons/modules don't need any crystals, but higer-tiers do) Issue 2: What do crystals do? - Consider making crystals required for beacons, but NOT modules. Issue 3: Make the on-ramp to crystals smoother - Can get splinters from fish, and shards from puffers (lower quantities) - Fish: Need a way to get rid of the polluted fish water without clarifiers; maybe turn it into saline & sulphuric? - When you slaughter fish for crystals what else do you get? Meat to warehouse for (much, much) later? Or is there a way with no meat? - Alternate idea: don't make splinters from fish. Make splinters by growing them from polluted fish water (crystal dust + polluted fish water --> splinters & clean water). Crystal dust can come from farming or geodes. 
- alternate alternate: cleaning polluted water in a hydro plant gives a % chance of a crystal splinter, and turns fish water into saline + sulphuric (or whatever) - puffers - puffer eggs + crystal dust -> crystal shard + acid gas (or similar) - two/three different types of puffers puffing at the same time -> polluted puffer atmosphere, polluted puffer atmosphere + crystal dust -> shards - feed crystal dust + nutrien paste to puffers -> some puffers die, but you get shards instead of gas out. (alternative to puffing) - have raw shards a byproduct of (one of) the breeding recipes - I am currently having the following thought: All the animalis processing revolves around three different steps: - breeding - petting/puffing/zoo - slaughtering for meat - What if the three different crystals would come out of these different processes but at different tiers. That means: splinters from fish petting (polluted fish water), shards from breeding of puffers, crystals from slaughtering biters. - Alternatively, we can try to get the crystals always out of the same one of these steps: e.g. always slaughtering - Can you cut the full crystals into the smaller ones? (so once you have big biters, you can make everything from that?) Issue 4: Alien Spores / Eggsperiments / Alien Meat - Perchloric acid for the spores seems a little too high-tech (needs gold science & tungsten due to thermal water). - Are the mushredtatoes (from the rare swamp garden seed) enough to make this doable before gold science? - Butchery - too slow; cutting up for meat is super-fast. Recipes should be very short time. - Issue 5: Tech balancing - Splinters are green science (modules 1) - Assume shards are blue, and full crystals purple. - Check the ingredients for everything and see what matches. - If higher tier ingredients are required, they either need to be removed, or an alternate/inefficient method added
process
crystal brainstorming santa asked us to brainstorm ideas on crystals on discord this is a record of thoughts points of data modules is green science you need to be able to make use them with nothing but green tech what is the theme of bio processing that we want to follow our posit bio is about simple recipies that take time space issue dealing with crystals is too hard especially while in beta make an option to turn them off or make them optional they make modules and or beacons better but you can make them without if you want e g leave all module recipies as is without crystals then make a tier etc that use crystals to amp up the effects or make more efficient or use less power or etc or make them tiered low level beacons modules don t need any crystals but higer tiers do issue what do crystals do consider making crystals required for beacons but not modules issue make the on ramp to crystals smoother can get splinters from fish and shards from puffers lower quantities fish need a way to get rid of the polluted fish water without clarifiers maybe turn it into saline sulphuric when you slaughter fish for crystals what else do you get meat to warehouse for much much later or is there a way with no meat alternate idea don t make splinters from fish make splinters by growing them from polluted fish water crystal dust polluted fish water splinters clean water crystal dust can come from farming or geodes alternate alternate cleaning polluted water in a hydro plant gives a chance of a crystal splinter and turns fish water into saline sulphuric or whatever puffers puffer eggs crystal dust crystal shard acid gas or similar two three different types of puffers puffing at the same time polluted puffer atmosphere polluted puffer atmosphere crystal dust shards feed crystal dust nutrien paste to puffers some puffers die but you get shards instead of gas out alternative to puffing have raw shards a byproduct of one of the breeding recipes i am currently having the following 
thought all the animalis processing revolves around three different steps breeding petting puffing zoo slaughtering for meat what if the three different crystals would come out of these different processes but at different tiers that means splinters from fish petting polluted fish water shards from breeding of puffers crystals from slaughtering biters alternatively we can try to get the crystals always out of the same one of these steps e g always slaughtering can you cut the full crystals into the smaller ones so once you have big biters you can make everything from that issue alien spores eggsperiments alien meat perchloric acid for the spores seems a little too high tech needs gold science tungsten due to thermal water are the mushredtatoes from the rare swamp garden seed enough to make this doable before gold science butchery too slow cutting up for meat is super fast recipes should be very short time issue tech balancing splinters are green science modules assume shards are blue and full crystals purple check the ingredients for everything and see what matches if higher tier ingredients are required they either need to be removed or an alternate inefficient method added
1
23,778
4,045,700,244
IssuesEvent
2016-05-22 06:29:40
project-renard/curie
https://api.github.com/repos/project-renard/curie
closed
Create test of argument processing for PDF file
bitesize patch-exists ready testing
We need to test the `::App` by first pushing a PDF file (use the one in `test-data`) onto `@ARGV` and then calling `$app->process_arguments`.
1.0
Create test of argument processing for PDF file - We need to test the `::App` by first pushing a PDF file (use the one in `test-data`) onto `@ARGV` and then calling `$app->process_arguments`.
non_process
create test of argument processing for pdf file we need to test the app by first pushing a pdf file use the one in test data onto argv and then calling app process arguments
0
7,767
10,888,616,115
IssuesEvent
2019-11-18 16:34:41
ContaoMonitoring/monitoring-timeline
https://api.github.com/repos/ContaoMonitoring/monitoring-timeline
closed
Use color parametrization to be colour blind friendly
Improvement ⚙ - Processed
Use the parametrized colors of https://github.com/ContaoMonitoring/monitoring/issues/28 to be colour blind friendly.
1.0
Use color parametrization to be colour blind friendly - Use the parametrized colors of https://github.com/ContaoMonitoring/monitoring/issues/28 to be colour blind friendly.
process
use color parametrization to be colour blind friendly use the parametrized colors of to be colour blind friendly
1
130,101
10,596,496,840
IssuesEvent
2019-10-09 21:23:45
mozilla/iris_firefox
https://api.github.com/repos/mozilla/iris_firefox
closed
Fix for telemetry_data_collected_for_one_offs_from_new_tab.py
regression test case
Fix for telemetry_data_collected_for_one_offs_from_new_tab.py
1.0
Fix for telemetry_data_collected_for_one_offs_from_new_tab.py - Fix for telemetry_data_collected_for_one_offs_from_new_tab.py
non_process
fix for telemetry data collected for one offs from new tab py fix for telemetry data collected for one offs from new tab py
0
7,695
10,780,504,132
IssuesEvent
2019-11-04 13:08:20
threefoldtech/ztid
https://api.github.com/repos/threefoldtech/ztid
closed
ztid hangs during 0-OS boot
process_wontfix type_bug
We have come up in a situation where ztid hangs forever and prevent a 0-OS node to boot. After running an strace on ztid, it turns out it is blocked on an `mmap` call. ![strace](https://user-images.githubusercontent.com/303164/53805663-16da1980-3f4b-11e9-9fb3-a72a9949f8fa.png)
1.0
ztid hangs during 0-OS boot - We have come up in a situation where ztid hangs forever and prevent a 0-OS node to boot. After running an strace on ztid, it turns out it is blocked on an `mmap` call. ![strace](https://user-images.githubusercontent.com/303164/53805663-16da1980-3f4b-11e9-9fb3-a72a9949f8fa.png)
process
ztid hangs during os boot we have come up in a situation where ztid hangs forever and prevent a os node to boot after running an strace on ztid it turns out it is blocked on an mmap call
1
13,034
15,382,691,685
IssuesEvent
2021-03-03 01:10:05
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Cell statistic algorithm name() should not be translatable
Bug Processing
https://github.com/qgis/QGIS/blob/6af2bd36b1107e575331d24782f8101afa13c88b/src/analysis/processing/qgsalgorithmcellstatistics.cpp#L164-L167 It should be `return QStringLiteral( "cellstatistics" );` Otherwise we end up with : ``` outputs['StatistiquesDeLaCellule'] = processing.run('native:Statistiques de la cellule', <- translated name here... alg_params, context=context, feedback=feedback, is_child_algorithm=True ) ``` in processing script... @gioman tell me if you want me to fill in the issue template but I'm not sure it's needed here...
1.0
Cell statistic algorithm name() should not be translatable - https://github.com/qgis/QGIS/blob/6af2bd36b1107e575331d24782f8101afa13c88b/src/analysis/processing/qgsalgorithmcellstatistics.cpp#L164-L167 It should be `return QStringLiteral( "cellstatistics" );` Otherwise we end up with : ``` outputs['StatistiquesDeLaCellule'] = processing.run('native:Statistiques de la cellule', <- translated name here... alg_params, context=context, feedback=feedback, is_child_algorithm=True ) ``` in processing script... @gioman tell me if you want me to fill in the issue template but I'm not sure it's needed here...
process
cell statistic algorithm name should not be translatable it should be return qstringliteral cellstatistics otherwise we end up with outputs processing run native statistiques de la cellule translated name here alg params context context feedback feedback is child algorithm true in processing script gioman tell me if you want me to fill in the issue template but i m not sure it s needed here
1
797,084
28,137,458,001
IssuesEvent
2023-04-01 14:54:48
SierraBay/SierraBay12
https://api.github.com/repos/SierraBay/SierraBay12
closed
Bug: FBP (full body prosthesis) has hunger and thirst icons
:bug: Баг 🏕 Priority: Medium
**Ckey**: `lordnest` **Steps:** 1. Join a round as an FBP, having first set up a full body prosthesis in character setup **Actual behavior:** Hunger icons appear at the top right for the FBP **Expected behavior:** The FBP should not have these pop-up icons <hr> *Report generated automatically* *Author: `LordNest#5211` / `286536768726237185`*
1.0
Bug: FBP (full body prosthesis) has hunger and thirst icons - **Ckey**: `lordnest` **Steps:** 1. Join a round as an FBP, having first set up a full body prosthesis in character setup **Actual behavior:** Hunger icons appear at the top right for the FBP **Expected behavior:** The FBP should not have these pop-up icons <hr> *Report generated automatically* *Author: `LordNest#5211` / `286536768726237185`*
non_process
bug fbp full body prosthesis has hunger and thirst icons ckey lordnest steps join a round as an fbp having first set up a full body prosthesis in character setup actual behavior hunger icons appear at the top right for the fbp expected behavior the fbp should not have these pop up icons report generated automatically author lordnest
0
6,139
9,009,732,343
IssuesEvent
2019-02-05 09:56:40
Activiti/Activiti
https://api.github.com/repos/Activiti/Activiti
closed
Process Instance Model is not working with Accept application/json
api blocking priority1 process wontfix
When I call the API http://{{domain}}/myapp-rb/v1/process-instances/4159a15f-3f1b-11e8-aa83-0a586460031e/model with Accept: application/json Current result: I'm getting a 406 Expecting result: The JSON as we have for the process-definitions API http://{{domain}}/myapp-rb/v1/process-definitions/4159a15f-3f1b-11e8-aa83-0a586460031e/model
1.0
Process Instance Model is not working with Accept application/json - When I call the API http://{{domain}}/myapp-rb/v1/process-instances/4159a15f-3f1b-11e8-aa83-0a586460031e/model with Accept: application/json Current result: I'm getting a 406 Expecting result: The JSON as we have for the process-definitions API http://{{domain}}/myapp-rb/v1/process-definitions/4159a15f-3f1b-11e8-aa83-0a586460031e/model
process
process instance model is not working with accept application json when i call the api with accept application json current result i m getting a expecting result the json as we have for the process definitions api
1
6,659
9,781,415,824
IssuesEvent
2019-06-07 19:43:23
brucemiller/LaTeXML
https://api.github.com/repos/brucemiller/LaTeXML
closed
subequations environment results in invalid XHTML
bug postprocessing schema
The minimal LaTeX file `subeqs.tex`: ``` \documentclass{article} \usepackage{amsmath} \begin{document} \begin{subequations} \begin{align} 1 &= 1\\ 2 &= 2 \end{align} \end{subequations} \end{document} ``` converted to XHTML using ``` latexml --dest=subeqs.xml subeqs.tex ; latexmlpost --dest=subeqs.xhtml subeqs.xml ``` results in invalid XHTML. The errors are ``` Line 19, Column 20: document type does not allow element "tbody" here <tbody id="S0.E1.1"><tr class="ltx_equation ltx_eqn_row ltx_align_baseline"> ``` and the same for `Line 25, Column 20` when checking the syntax on `https://validator.w3.org/#validate-by-upload`. Note that this is new in `LaTeXML` version 0.8.3. Version 0.8.2 inserted `tbody` in the proper place and just once whereas 0.8.3 seems to add a (misplaced) `tbody` for each sub-equation.
1.0
subequations environment results in invalid XHTML - The minimal LaTeX file `subeqs.tex`: ``` \documentclass{article} \usepackage{amsmath} \begin{document} \begin{subequations} \begin{align} 1 &= 1\\ 2 &= 2 \end{align} \end{subequations} \end{document} ``` converted to XHTML using ``` latexml --dest=subeqs.xml subeqs.tex ; latexmlpost --dest=subeqs.xhtml subeqs.xml ``` results in invalid XHTML. The errors are ``` Line 19, Column 20: document type does not allow element "tbody" here <tbody id="S0.E1.1"><tr class="ltx_equation ltx_eqn_row ltx_align_baseline"> ``` and the same for `Line 25, Column 20` when checking the syntax on `https://validator.w3.org/#validate-by-upload`. Note that this is new in `LaTeXML` version 0.8.3. Version 0.8.2 inserted `tbody` in the proper place and just once whereas 0.8.3 seems to add a (misplaced) `tbody` for each sub-equation.
process
subequations environment results in invalid xhtml the minimal latex file subeqs tex documentclass article usepackage amsmath begin document begin subequations begin align end align end subequations end document converted to xhtml using latexml dest subeqs xml subeqs tex latexmlpost dest subeqs xhtml subeqs xml results in invalid xhtml the errors are line column document type does not allow element tbody here and the same for line column when checking the syntax on note that this is new in latexml version version inserted tbody in the proper place and just once whereas seems to add a misplaced tbody for each sub equation
1
151,019
19,648,212,012
IssuesEvent
2022-01-10 01:12:04
faizulho/gatsby-starter-docz-netlifycms-1
https://api.github.com/repos/faizulho/gatsby-starter-docz-netlifycms-1
closed
WS-2022-0007 (Medium) detected in node-forge-0.10.0.tgz - autoclosed
security vulnerability
## WS-2022-0007 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.10.0.tgz</b></p></summary> <p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/node-forge/package.json</p> <p> Dependency Hierarchy: - gatsby-2.30.3.tgz (Root Library) - webpack-dev-server-3.11.2.tgz - selfsigned-1.10.8.tgz - :x: **node-forge-0.10.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In node-forge before 1.0.0 he regex used for the forge.util.parseUrl API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior. <p>Publish Date: 2022-01-08 <p>URL: <a href=https://github.com/digitalbazaar/forge/commit/db8016c805371e72b06d8e2edfe0ace0df934a5e>WS-2022-0007</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-gf8q-jrpm-jvxq">https://github.com/advisories/GHSA-gf8q-jrpm-jvxq</a></p> <p>Release Date: 2022-01-08</p> <p>Fix Resolution: node-forge - 1.0.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2022-0007 (Medium) detected in node-forge-0.10.0.tgz - autoclosed - ## WS-2022-0007 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.10.0.tgz</b></p></summary> <p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/node-forge/package.json</p> <p> Dependency Hierarchy: - gatsby-2.30.3.tgz (Root Library) - webpack-dev-server-3.11.2.tgz - selfsigned-1.10.8.tgz - :x: **node-forge-0.10.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In node-forge before 1.0.0 he regex used for the forge.util.parseUrl API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior. 
<p>Publish Date: 2022-01-08 <p>URL: <a href=https://github.com/digitalbazaar/forge/commit/db8016c805371e72b06d8e2edfe0ace0df934a5e>WS-2022-0007</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-gf8q-jrpm-jvxq">https://github.com/advisories/GHSA-gf8q-jrpm-jvxq</a></p> <p>Release Date: 2022-01-08</p> <p>Fix Resolution: node-forge - 1.0.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
ws medium detected in node forge tgz autoclosed ws medium severity vulnerability vulnerable library node forge tgz javascript implementations of network transports cryptography ciphers pki message digests and various utilities library home page a href path to dependency file package json path to vulnerable library node modules node forge package json dependency hierarchy gatsby tgz root library webpack dev server tgz selfsigned tgz x node forge tgz vulnerable library found in base branch master vulnerability details in node forge before he regex used for the forge util parseurl api would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution node forge step up your open source security game with whitesource
0
22,569
31,790,770,400
IssuesEvent
2023-09-13 02:59:36
googleapis/google-cloud-java
https://api.github.com/repos/googleapis/google-cloud-java
closed
Your .repo-metadata.json files have a problem 🤒
type: process repo-metadata: lint
You have a problem with your .repo-metadata.json files: Result of scan 📈: * api_shortname 'apigee-registry' invalid in java-apigee-registry/.repo-metadata.json * api_shortname 'beyondcorp-appconnections' invalid in java-beyondcorp-appconnections/.repo-metadata.json * api_shortname 'beyondcorp-appconnectors' invalid in java-beyondcorp-appconnectors/.repo-metadata.json * api_shortname 'beyondcorp-appgateways' invalid in java-beyondcorp-appgateways/.repo-metadata.json * api_shortname 'beyondcorp-clientconnectorservices' invalid in java-beyondcorp-clientconnectorservices/.repo-metadata.json * api_shortname 'beyondcorp-clientgateways' invalid in java-beyondcorp-clientgateways/.repo-metadata.json * api_shortname 'dialogflow-cx' invalid in java-dialogflow-cx/.repo-metadata.json * api_shortname 'distributedcloudedge' invalid in java-distributedcloudedge/.repo-metadata.json * api_shortname 'gke-backup' invalid in java-gke-backup/.repo-metadata.json * api_shortname 'gke-multi-cloud' invalid in java-gke-multi-cloud/.repo-metadata.json * api_shortname 'iam-admin' invalid in java-iam-admin/.repo-metadata.json * api_shortname 'infra-manager' invalid in java-infra-manager/.repo-metadata.json * api_shortname 'maps-addressvalidation' invalid in java-maps-addressvalidation/.repo-metadata.json * api_shortname 'maps-mapsplatformdatasets' invalid in java-maps-mapsplatformdatasets/.repo-metadata.json * api_shortname 'maps-routing' invalid in java-maps-routing/.repo-metadata.json * api_shortname 'monitoring-dashboards' invalid in java-monitoring-dashboards/.repo-metadata.json * api_shortname 'monitoring-metricsscope' invalid in java-monitoring-metricsscope/.repo-metadata.json * api_shortname 'orchestration-airflow' invalid in java-orchestration-airflow/.repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? 
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
1.0
Your .repo-metadata.json files have a problem 🤒 - You have a problem with your .repo-metadata.json files: Result of scan 📈: * api_shortname 'apigee-registry' invalid in java-apigee-registry/.repo-metadata.json * api_shortname 'beyondcorp-appconnections' invalid in java-beyondcorp-appconnections/.repo-metadata.json * api_shortname 'beyondcorp-appconnectors' invalid in java-beyondcorp-appconnectors/.repo-metadata.json * api_shortname 'beyondcorp-appgateways' invalid in java-beyondcorp-appgateways/.repo-metadata.json * api_shortname 'beyondcorp-clientconnectorservices' invalid in java-beyondcorp-clientconnectorservices/.repo-metadata.json * api_shortname 'beyondcorp-clientgateways' invalid in java-beyondcorp-clientgateways/.repo-metadata.json * api_shortname 'dialogflow-cx' invalid in java-dialogflow-cx/.repo-metadata.json * api_shortname 'distributedcloudedge' invalid in java-distributedcloudedge/.repo-metadata.json * api_shortname 'gke-backup' invalid in java-gke-backup/.repo-metadata.json * api_shortname 'gke-multi-cloud' invalid in java-gke-multi-cloud/.repo-metadata.json * api_shortname 'iam-admin' invalid in java-iam-admin/.repo-metadata.json * api_shortname 'infra-manager' invalid in java-infra-manager/.repo-metadata.json * api_shortname 'maps-addressvalidation' invalid in java-maps-addressvalidation/.repo-metadata.json * api_shortname 'maps-mapsplatformdatasets' invalid in java-maps-mapsplatformdatasets/.repo-metadata.json * api_shortname 'maps-routing' invalid in java-maps-routing/.repo-metadata.json * api_shortname 'monitoring-dashboards' invalid in java-monitoring-dashboards/.repo-metadata.json * api_shortname 'monitoring-metricsscope' invalid in java-monitoring-metricsscope/.repo-metadata.json * api_shortname 'orchestration-airflow' invalid in java-orchestration-airflow/.repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? 
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
process
your repo metadata json files have a problem 🤒 you have a problem with your repo metadata json files result of scan 📈 api shortname apigee registry invalid in java apigee registry repo metadata json api shortname beyondcorp appconnections invalid in java beyondcorp appconnections repo metadata json api shortname beyondcorp appconnectors invalid in java beyondcorp appconnectors repo metadata json api shortname beyondcorp appgateways invalid in java beyondcorp appgateways repo metadata json api shortname beyondcorp clientconnectorservices invalid in java beyondcorp clientconnectorservices repo metadata json api shortname beyondcorp clientgateways invalid in java beyondcorp clientgateways repo metadata json api shortname dialogflow cx invalid in java dialogflow cx repo metadata json api shortname distributedcloudedge invalid in java distributedcloudedge repo metadata json api shortname gke backup invalid in java gke backup repo metadata json api shortname gke multi cloud invalid in java gke multi cloud repo metadata json api shortname iam admin invalid in java iam admin repo metadata json api shortname infra manager invalid in java infra manager repo metadata json api shortname maps addressvalidation invalid in java maps addressvalidation repo metadata json api shortname maps mapsplatformdatasets invalid in java maps mapsplatformdatasets repo metadata json api shortname maps routing invalid in java maps routing repo metadata json api shortname monitoring dashboards invalid in java monitoring dashboards repo metadata json api shortname monitoring metricsscope invalid in java monitoring metricsscope repo metadata json api shortname orchestration airflow invalid in java orchestration airflow repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
1
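The lint failures in the record above all flag hyphenated or compound `api_shortname` values. A minimal sketch of such a check in Python — the exact pattern is an assumption for illustration, not taken from the actual repo-metadata schema:

```python
import re

# Assumed rule: an api_shortname is a single lowercase alphanumeric
# token (no hyphens), mirroring the subdomain of the API's hostName.
SHORTNAME_RE = re.compile(r"^[a-z][a-z0-9]*$")

def invalid_shortnames(names):
    """Return the names that fail the (assumed) schema check."""
    return [n for n in names if not SHORTNAME_RE.match(n)]

examples = ["apigee-registry", "dialogflow-cx", "storage"]
print(invalid_shortnames(examples))  # ['apigee-registry', 'dialogflow-cx']
```

Hyphenated names like `apigee-registry` are flagged, while a plain subdomain-style token like `storage` passes.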
21,800
30,315,138,999
IssuesEvent
2023-07-10 15:06:20
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
Creating/editing database connection with SSH tunnel fails silently, when SSH port is not filled out
Type:Bug Priority:P3 .Regression Administration/Databases .Team/QueryProcessor :hammer_and_wrench:
### Describe the bug Checking https://github.com/metabase/metabase/issues/20756 I saw that the connection will fail when the SSH port is not filled out ### To Reproduce Spin up a stack with a bastion and an RDS server: https://github.com/paoliniluis/bastion-rds Connect to the DB and just create a table and some rows Connect Metabase to it without filling the SSH port ### Expected behavior If port is not filled, it should use port 22 as default ### Logs java.lang.NullPointerException at clojure.lang.RT.intCast(RT.java:1221) at metabase.util.ssh$start_ssh_tunnel_BANG_.invokeStatic(ssh.clj:53) at metabase.util.ssh$start_ssh_tunnel_BANG_.invoke(ssh.clj:53) at metabase.util.ssh$include_ssh_tunnel_BANG_.invokeStatic(ssh.clj:94) at metabase.util.ssh$include_ssh_tunnel_BANG_.invoke(ssh.clj:88) at metabase.util.ssh$do_with_ssh_tunnel.invokeStatic(ssh.clj:134) at metabase.util.ssh$do_with_ssh_tunnel.invoke(ssh.clj:130) at metabase.driver.sql_jdbc.connection$do_with_connection_spec_for_testing_connection.invokeStatic(connection.clj:293) at metabase.driver.sql_jdbc.connection$do_with_connection_spec_for_testing_connection.invoke(connection.clj:289) at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invokeStatic(connection.clj:318) at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invoke(connection.clj:314) at metabase.driver.sql_jdbc$fn__121460.invokeStatic(sql_jdbc.clj:52) at metabase.driver.sql_jdbc$fn__121460.invoke(sql_jdbc.clj:50) at clojure.lang.MultiFn.invoke(MultiFn.java:234) at metabase.driver.util$can_connect_with_details_QMARK_$fn__48184.invoke(util.clj:144) at clojure.core$binding_conveyor_fn$fn__5823.invoke(core.clj:2047) at clojure.lang.AFn.call(AFn.java:18) at java.base/java.util.concurrent.FutureTask.run(Unknown Source) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) at java.base/java.lang.Thread.run(Unknown Source) ### Information about 
your Metabase installation ```JSON v47-RC2 ``` ### Severity P3 ### Additional context _No response_
1.0
Creating/editing database connection with SSH tunnel fails silently, when SSH port is not filled out - ### Describe the bug Checking https://github.com/metabase/metabase/issues/20756 I saw that the connection will fail when the SSH port is not filled out ### To Reproduce Spin up a stack with a bastion and an RDS server: https://github.com/paoliniluis/bastion-rds Connect to the DB and just create a table and some rows Connect Metabase to it without filling the SSH port ### Expected behavior If port is not filled, it should use port 22 as default ### Logs java.lang.NullPointerException at clojure.lang.RT.intCast(RT.java:1221) at metabase.util.ssh$start_ssh_tunnel_BANG_.invokeStatic(ssh.clj:53) at metabase.util.ssh$start_ssh_tunnel_BANG_.invoke(ssh.clj:53) at metabase.util.ssh$include_ssh_tunnel_BANG_.invokeStatic(ssh.clj:94) at metabase.util.ssh$include_ssh_tunnel_BANG_.invoke(ssh.clj:88) at metabase.util.ssh$do_with_ssh_tunnel.invokeStatic(ssh.clj:134) at metabase.util.ssh$do_with_ssh_tunnel.invoke(ssh.clj:130) at metabase.driver.sql_jdbc.connection$do_with_connection_spec_for_testing_connection.invokeStatic(connection.clj:293) at metabase.driver.sql_jdbc.connection$do_with_connection_spec_for_testing_connection.invoke(connection.clj:289) at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invokeStatic(connection.clj:318) at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invoke(connection.clj:314) at metabase.driver.sql_jdbc$fn__121460.invokeStatic(sql_jdbc.clj:52) at metabase.driver.sql_jdbc$fn__121460.invoke(sql_jdbc.clj:50) at clojure.lang.MultiFn.invoke(MultiFn.java:234) at metabase.driver.util$can_connect_with_details_QMARK_$fn__48184.invoke(util.clj:144) at clojure.core$binding_conveyor_fn$fn__5823.invoke(core.clj:2047) at clojure.lang.AFn.call(AFn.java:18) at java.base/java.util.concurrent.FutureTask.run(Unknown Source) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) at 
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) at java.base/java.lang.Thread.run(Unknown Source) ### Information about your Metabase installation ```JSON v47-RC2 ``` ### Severity P3 ### Additional context _No response_
process
creating editing database connection with ssh tunnel fails silently when ssh port is not filled out describe the bug checking i saw that the connection will fail when the ssh port is not filled out to reproduce spin up a stack with a bastion and an rds server connect to the db and just create a table and some rows connect metabase to it without filling the ssh port expected behavior if port is not filled it should use port as default logs java lang nullpointerexception at clojure lang rt intcast rt java at metabase util ssh start ssh tunnel bang invokestatic ssh clj at metabase util ssh start ssh tunnel bang invoke ssh clj at metabase util ssh include ssh tunnel bang invokestatic ssh clj at metabase util ssh include ssh tunnel bang invoke ssh clj at metabase util ssh do with ssh tunnel invokestatic ssh clj at metabase util ssh do with ssh tunnel invoke ssh clj at metabase driver sql jdbc connection do with connection spec for testing connection invokestatic connection clj at metabase driver sql jdbc connection do with connection spec for testing connection invoke connection clj at metabase driver sql jdbc connection can connect qmark invokestatic connection clj at metabase driver sql jdbc connection can connect qmark invoke connection clj at metabase driver sql jdbc fn invokestatic sql jdbc clj at metabase driver sql jdbc fn invoke sql jdbc clj at clojure lang multifn invoke multifn java at metabase driver util can connect with details qmark fn invoke util clj at clojure core binding conveyor fn fn invoke core clj at clojure lang afn call afn java at java base java util concurrent futuretask run unknown source at java base java util concurrent threadpoolexecutor runworker unknown source at java base java util concurrent threadpoolexecutor worker run unknown source at java base java lang thread run unknown source information about your metabase installation json severity additional context no response
1
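The stack trace in the record above ends at `clojure.lang.RT.intCast`, which throws when the tunnel port is nil. The expected behavior the reporter describes ("use port 22 as default") can be sketched in Python; the names here are illustrative, not Metabase's actual API:

```python
DEFAULT_SSH_PORT = 22

def resolve_ssh_port(raw):
    """Coerce a user-supplied port to an int, falling back to 22
    when the form field was left blank (None or empty string)."""
    if raw is None or str(raw).strip() == "":
        return DEFAULT_SSH_PORT
    return int(raw)

print(resolve_ssh_port(""), resolve_ssh_port("2222"))  # 22 2222
```

Applying the fallback before the integer cast is what prevents the nil-port crash described in the logs.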
107,921
13,530,597,418
IssuesEvent
2020-09-15 20:10:39
WordPress/gutenberg
https://api.github.com/repos/WordPress/gutenberg
closed
block.json: set color property individually
[Feature] Design Tools [Status] In Progress
The implicit block attributes for colors we declare through block.json follow this logic:

CSS Property | block.json (check presence of) | Style attribute (block serialization) | theme.json
--- | --- | --- | ---
`--wp--style--color--link` | `__experimentalColor.linkColor` | `style.color.link` | `styles.color.link`
`background` | `__experimentalColor.gradients` | `style.color.gradient` | `styles.color.gradient`
`background-color` | `__experimentalColor` | `style.color.background` | `styles.color.background`
`color` | `__experimentalColor` | `style.color.text` | `styles.color.text`

Text and background colors are coupled, although some blocks may not require them both at once.
1.0
block.json: set color property individually - The implicit block attributes for colors we declare through block.json follow this logic:

CSS Property | block.json (check presence of) | Style attribute (block serialization) | theme.json
--- | --- | --- | ---
`--wp--style--color--link` | `__experimentalColor.linkColor` | `style.color.link` | `styles.color.link`
`background` | `__experimentalColor.gradients` | `style.color.gradient` | `styles.color.gradient`
`background-color` | `__experimentalColor` | `style.color.background` | `styles.color.background`
`color` | `__experimentalColor` | `style.color.text` | `styles.color.text`

Text and background colors are coupled, although some blocks may not require them both at once.
non_process
block json set color property individually the implicit block attributes for colors we declare through block json follow this logic css property block json check presence of style attribute block serialization theme json wp style color link experimentalcolor linkcolor style color link styles color link background experimentalcolor gradients style color gradient styles color gradient background color experimentalcolor style color background styles color background color experimentalcolor style color text styles color text text and background colors are coupled although some blocks may not require them both at once
0
22,481
31,394,059,312
IssuesEvent
2023-08-26 17:59:52
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
Duplicate columns should have nicer aliases
Type:Bug Priority:P3 Querying/Processor .Backend .Team/QueryProcessor :hammer_and_wrench: .Wanted: MLv2
From a Slack discussion between me and @senior. Right now if we have two columns with the same identifier, e.g. `name`, the Clojure JDBC library "helpfully" suffixes duplicates so we'll have `name` and `name_2`. It would be better if we just handled this ourselves where possible so we could give them more informative aliases, e.g. `venue_name` and `category_name`. This would be 100% under-the-hood so no UI changes. However, we need to consider the UX implications. We might get complaints about us changing MB’s behavior, e.g. > why are the columns in my CSV different in 25 from 24? :arrow_down: Please click the :+1: reaction instead of leaving a `+1` or `update?` comment
2.0
Duplicate columns should have nicer aliases - From a Slack discussion between me and @senior. Right now if we have two columns with the same identifier, e.g. `name`, the Clojure JDBC library "helpfully" suffixes duplicates so we'll have `name` and `name_2`. It would be better if we just handled this ourselves where possible so we could give them more informative aliases, e.g. `venue_name` and `category_name`. This would be 100% under-the-hood so no UI changes. However, we need to consider the UX implications. We might get complaints about us changing MB’s behavior, e.g. > why are the columns in my CSV different in 25 from 24? :arrow_down: Please click the :+1: reaction instead of leaving a `+1` or `update?` comment
process
duplicate columns should have nicer aliases from a slack discussion between me and senior right now if we have two columns with the same identifier e g name the clojure jdbc library helpfully suffixes duplicates so we ll have name and name it would be better if we just handled this ourselves where possible so we could give them more informative aliases e g venue name and category name this would be under the hood so no ui changes however we need to consider ux implementations we might get complaints about us changing mb’s behavior e g why are the columns in my csv different in from arrow down please click the reaction instead of leaving a or update comment
1
401,827
11,798,400,395
IssuesEvent
2020-03-18 14:20:30
codidact/qpixel
https://api.github.com/repos/codidact/qpixel
closed
Live Markdown previews
area: frontend priority: medium type: change request
We need to look into a client-side Markdown renderer and render posts as they're being typed so users have a live preview.
1.0
Live Markdown previews - We need to look into a client-side Markdown renderer and render posts as they're being typed so users have a live preview.
non_process
live markdown previews we need to look into a client side markdown renderer and render posts as they re being typed so users have a live preview
0
13,071
15,397,646,074
IssuesEvent
2021-03-03 22:30:07
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
Oracle, BigQuery filtering by column with day-of-week bucketing not working
.Backend Database/BigQuery Database/Oracle Querying/Processor Type:Bug
Couldn't find any existing matching bug reports. While working on a fix for #13604 I noticed tests were failing for Oracle and BigQuery because we were generating queries that used `%` for modulus where BigQuery and Oracle instead use a [`mod()`](https://docs.oracle.com/cd/B19306_01/server.102/b14200/functions088.htm) function (actually BigQuery can apparently use `%` *outside* of the `WHERE` clause, but has to use `mod()` in the `WHERE` clause). The BigQuery error was ``` Syntax error: Parenthesized expression cannot be parsed as an expression, struct constructor, or subquery ``` The Oracle error was ``` java.sql.SQLSyntaxErrorException: ORA-00920: invalid relational operator ```
1.0
Oracle, BigQuery filtering by column with day-of-week bucketing not working - Couldn't find any existing matching bug reports. While working on a fix for #13604 I noticed tests were failing for Oracle and BigQuery because we were generating queries that used `%` for modulus where BigQuery and Oracle instead use a [`mod()`](https://docs.oracle.com/cd/B19306_01/server.102/b14200/functions088.htm) function (actually BigQuery can apparently use `%` *outside* of the `WHERE` clause, but has to use `mod()` in the `WHERE` clause). The BigQuery error was ``` Syntax error: Parenthesized expression cannot be parsed as an expression, struct constructor, or subquery ``` The Oracle error was ``` java.sql.SQLSyntaxErrorException: ORA-00920: invalid relational operator ```
process
oracle bigquery filtering by column with day of week bucketing not working couldn t find any existing matching bug reports while working on a fix for i noticed tests were failing for oracle and bigquery because we were generating queries that used for modulus where bigquery and oracle instead use a function actually bigquery can apparently use outside of the where clause but has to use mod in the where clause the bigquery error was syntax error parenthesized expression cannot be parsed as an expression struct constructor or subquery the oracle error was java sql sqlsyntaxerrorexception ora invalid relational operator
1
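The record above notes that Oracle (and BigQuery, at least inside a `WHERE` clause) require a `mod()` function where other dialects accept the infix `%` operator. A hedged sketch of driver-dependent SQL generation — the driver names and the all-or-nothing rule are simplified for illustration:

```python
# Drivers that (in at least some clauses) lack the infix % operator
# and require the MOD() function instead. Simplified illustration.
FUNCTION_MOD_DRIVERS = {"oracle", "bigquery"}

def modulus_sql(driver, lhs, rhs):
    """Render a modulus expression in the dialect of `driver`."""
    if driver in FUNCTION_MOD_DRIVERS:
        return f"mod({lhs}, {rhs})"
    return f"({lhs} % {rhs})"

print(modulus_sql("oracle", "day_of_week", "7"))    # mod(day_of_week, 7)
print(modulus_sql("postgres", "day_of_week", "7"))  # (day_of_week % 7)
```

Emitting `mod(...)` for these drivers avoids the `ORA-00920: invalid relational operator` and BigQuery parse errors quoted in the record.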
135,933
5,266,952,111
IssuesEvent
2017-02-04 17:51:50
senderle/topic-modeling-tool
https://api.github.com/repos/senderle/topic-modeling-tool
closed
Improve error reporting for bad CSV metadata input
bug priority-medium
Right now, when metadata CSVs are ill-formed or otherwise confusing to the tool, it just soldiers on, producing results that are sometimes very weird, without any feedback. That's not great! This might be solved by incorporating a proper CSV library. See also #27.
1.0
Improve error reporting for bad CSV metadata input - Right now, when metadata CSVs are ill-formed or otherwise confusing to the tool, it just soldiers on, producing results that are sometimes very weird, without any feedback. That's not great! This might be solved by incorporating a proper CSV library. See also #27.
non_process
improve error reporting for bad csv metadata input right now when metadata csvs are ill formed or otherwise confusing to the tool it just soldiers on producing results that are sometimes very weird without any feedback that s not great this might be solved by incorporating a proper csv library see also
0
18,912
24,853,344,629
IssuesEvent
2022-10-26 22:24:18
googleapis/google-cloud-go
https://api.github.com/repos/googleapis/google-cloud-go
closed
compute/metadata: make its own module
api: compute type: process
Now that we generate a compute library we should separate out the metadata package into its own module. This will potentially have a nice benefit of lightening our dependency graph once things rely on it. Also in the future it would be nice to have a new version of this package/module that is context aware, see #4483. In order to do that though I think we first need to have a stable module to give people time to update should they choose to do so.
1.0
compute/metadata: make its own module - Now that we generate a compute library we should separate out the metadata package into its own module. This will potentially have a nice benefit of lightening our dependency graph once things rely on it. Also in the future it would be nice to have a new version of this package/module that is context aware, see #4483. In order to do that though I think we first need to have a stable module to give people time to update should they choose to do so.
process
compute metadata make its own module now that we generate a compute library we should separate out the metadata package into its own module this will potentially have a nice benefit of lightening our dependency graph once things rely on it also in the future it would be nice to have a new version of this package module that is context aware see in order to do that though i think we first need to have a stable module to give people time to update should they choose to do so
1
203,587
15,375,704,083
IssuesEvent
2021-03-02 15:12:32
unfoldingWord/tc-create-app
https://api.github.com/repos/unfoldingWord/tc-create-app
closed
Target section goes blank when Section/Blocks are toggled
QA/ElsyTested QA/KozTested QA/Passed
v1.0.5-rc.5 Target section goes blank when Section/Blocks are toggled. - Open an .md file (tA or tW). - Click on the Sections button and then Blocks. Note that the Target side has gone blank. - Toggle back. Note that the Source side has a blank space and the content is pushed down. Here is the screen recording: [https://app.zenhub.com/files/191973535/79155025-a9a3-44ea-bd4d-38c19b68db47/download](https://app.zenhub.com/files/191973535/79155025-a9a3-44ea-bd4d-38c19b68db47/download)
2.0
Target section goes blank when Section/Blocks are toggled - v1.0.5-rc.5 Target section goes blank when Section/Blocks are toggled. - Open an .md file (tA or tW). - Click on the Sections button and then Blocks. Note that the Target side has gone blank. - Toggle back. Note that the Source side has a blank space and the content is pushed down. Here is the screen recording: [https://app.zenhub.com/files/191973535/79155025-a9a3-44ea-bd4d-38c19b68db47/download](https://app.zenhub.com/files/191973535/79155025-a9a3-44ea-bd4d-38c19b68db47/download)
non_process
target section goes blank when section blocks are toggled rc target section goes blank when section blocks are toggled open md file ta or tw click on sections button and then blocks note that the target side has gone blank toggle is back note that the source side has a blank space and the content is pushed down here is the screen recording
0
4,604
2,559,636,760
IssuesEvent
2015-02-05 02:54:51
cs2103jan2015-t10-1c/main
https://api.github.com/repos/cs2103jan2015-t10-1c/main
closed
As a normal user, I can change the due date of a task easily
priority.high type.story
so that I can move the task earlier or later
1.0
As a normal user, I can change the due date of a task easily - so that I can move the task earlier or later
non_process
as a normal user i can change the due date of a task easily so that i can move the task earlier or later
0
19,808
11,297,901,597
IssuesEvent
2020-01-17 07:34:19
ITISFoundation/osparc-simcore
https://api.github.com/repos/ITISFoundation/osparc-simcore
closed
Review why services disconnect
a:pipeline-services t:enhancement
Why a service disconnect? Enumerate reasons? Define strategies to resolve each item. ![Image Pasted at 2019-11-26 22-17.png](https://images.zenhubusercontent.com/5caef818ecad11531cc41364/53b49b30-5a6a-429a-b51c-77cb9db256ff) - see log in time recorded in screenshot -
1.0
Review why services disconnect - Why a service disconnect? Enumerate reasons? Define strategies to resolve each item. ![Image Pasted at 2019-11-26 22-17.png](https://images.zenhubusercontent.com/5caef818ecad11531cc41364/53b49b30-5a6a-429a-b51c-77cb9db256ff) - see log in time recorded in screenshot -
non_process
review why services disconnect why a service disconnect enumerate reasons define strategies to resolve each item see log in time recorded in screenshot
0
169,862
26,868,842,283
IssuesEvent
2023-02-04 07:18:17
Team-B1ND/dui-android
https://api.github.com/repos/Team-B1ND/dui-android
opened
DUI Input, Button, Select 등 사이즈 조정
DESIGN 🎨 BUG 🐛
## 문제 상황 Font가 작아지면서 생각보다 Button, Input, Select의 여백이 커졌습니다. ## 원인 원래 좀 넓었던 여백이 폰트로 인해 더 넓어짐.. ## 해결안 - [ ] Button, Input, Select Space을 40으로 줄여 디자인을 수정합니다. - [ ] 공통되는 Padding 값을 개발합니다. - [ ] Padding 값을 조정하여 적용합니다. - [ ] Round 또한 10, 15 둘 중 명확하게 결정합니다. -> medium ## 기타 사항 디자이너가 필요합니다...
1.0
DUI Input, Button, Select 등 사이즈 조정 - ## 문제 상황 Font가 작아지면서 생각보다 Button, Input, Select의 여백이 커졌습니다. ## 원인 원래 좀 넓었던 여백이 폰트로 인해 더 넓어짐.. ## 해결안 - [ ] Button, Input, Select Space을 40으로 줄여 디자인을 수정합니다. - [ ] 공통되는 Padding 값을 개발합니다. - [ ] Padding 값을 조정하여 적용합니다. - [ ] Round 또한 10, 15 둘 중 명확하게 결정합니다. -> medium ## 기타 사항 디자이너가 필요합니다...
non_process
dui input button select 등 사이즈 조정 문제 상황 font가 작아지면서 생각보다 button input select의 여백이 커졌습니다 원인 원래 좀 넓었던 여백이 폰트로 인해 더 넓어짐 해결안 button input select space을 줄여 디자인을 수정합니다 공통되는 padding 값을 개발합니다 padding 값을 조정하여 적용합니다 round 또한 둘 중 명확하게 결정합니다 medium 기타 사항 디자이너가 필요합니다
0
10,130
13,044,162,387
IssuesEvent
2020-07-29 03:47:32
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `DateLiteral` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `DateLiteral` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @sticnarf ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `DateLiteral` from TiDB - ## Description Port the scalar function `DateLiteral` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @sticnarf ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function dateliteral from tidb description port the scalar function dateliteral from tidb to coprocessor score mentor s sticnarf recommended skills rust programming learning materials already implemented expressions ported from tidb
1
99,569
20,991,211,114
IssuesEvent
2022-03-29 09:27:30
pandas-dev/pandas
https://api.github.com/repos/pandas-dev/pandas
closed
STYLE disallow Series | AnyArrayLike
Code Style good first issue
PR should be made to https://github.com/pandas-dev/pandas-dev-flaker idea is to disallow `foo: Series | AnyArrayLike`, as `Series` is array-like already This came out of this review: https://github.com/pandas-dev/pandas/pull/41955#discussion_r663390463
1.0
STYLE disallow Series | AnyArrayLike - PR should be made to https://github.com/pandas-dev/pandas-dev-flaker idea is to disallow `foo: Series | AnyArrayLike`, as `Series` is array-like already This came out of this review: https://github.com/pandas-dev/pandas/pull/41955#discussion_r663390463
non_process
style disallow series anyarraylike pr should be made to idea is to disallow foo series anyarraylike as series is array like already this came out of this review
0
3,260
6,339,944,447
IssuesEvent
2017-07-27 09:36:13
coala/coala
https://api.github.com/repos/coala/coala
closed
Root.Smell clean up
area/aspects process/approved
Add ClassSize.ClassLength, ClassSize.ClassConstants, ClassSize.ClassInstanceVariables, ClassSize.ClassMethods, MethodSmell.MethodLength, MethodSmell.ParameterListLength, Complexity.CyclomaticComplexity, Complexity.MaintainabilityIndex aspects. Refactor Root.Smell.complexity, and fix all docs issues
1.0
Root.Smell clean up - Add ClassSize.ClassLength, ClassSize.ClassConstants, ClassSize.ClassInstanceVariables, ClassSize.ClassMethods, MethodSmell.MethodLength, MethodSmell.ParameterListLength, Complexity.CyclomaticComplexity, Complexity.MaintainabilityIndex aspects. Refactor Root.Smell.complexity, and fix all docs issues
process
root smell clean up add classsize classlength classsize classconstants classsize classinstancevariables classsize classmethods methodsmell methodlength methodsmell parameterlistlength complexity cyclomaticcomplexity complexity maintainabilityindex aspects refactor root smell complexity and fix all docs issues
1
99,795
16,454,051,479
IssuesEvent
2021-05-21 09:59:02
AlexRogalskiy/charts
https://api.github.com/repos/AlexRogalskiy/charts
opened
CVE-2020-11022 (Medium) detected in jquery-1.9.1.js, jquery-1.8.1.min.js
security vulnerability
## CVE-2020-11022 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-1.9.1.js</b>, <b>jquery-1.8.1.min.js</b></p></summary> <p> <details><summary><b>jquery-1.9.1.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js</a></p> <p>Path to dependency file: charts/node_modules/tinygradient/bower_components/tinycolor/index.html</p> <p>Path to vulnerable library: charts/node_modules/tinygradient/bower_components/tinycolor/demo/jquery-1.9.1.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.9.1.js** (Vulnerable Library) </details> <details><summary><b>jquery-1.8.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js</a></p> <p>Path to dependency file: charts/node_modules/redeyed/examples/browser/index.html</p> <p>Path to vulnerable library: charts/node_modules/redeyed/examples/browser/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.8.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/charts/commit/314f258d67e43b0d911c13ba8860fdd95ac194ce">314f258d67e43b0d911c13ba8860fdd95ac194ce</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. 
.html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0. <p>Publish Date: 2020-04-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p> <p>Release Date: 2020-04-29</p> <p>Fix Resolution: jQuery - 3.5.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-11022 (Medium) detected in jquery-1.9.1.js, jquery-1.8.1.min.js - ## CVE-2020-11022 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-1.9.1.js</b>, <b>jquery-1.8.1.min.js</b></p></summary> <p> <details><summary><b>jquery-1.9.1.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js</a></p> <p>Path to dependency file: charts/node_modules/tinygradient/bower_components/tinycolor/index.html</p> <p>Path to vulnerable library: charts/node_modules/tinygradient/bower_components/tinycolor/demo/jquery-1.9.1.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.9.1.js** (Vulnerable Library) </details> <details><summary><b>jquery-1.8.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js</a></p> <p>Path to dependency file: charts/node_modules/redeyed/examples/browser/index.html</p> <p>Path to vulnerable library: charts/node_modules/redeyed/examples/browser/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.8.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/charts/commit/314f258d67e43b0d911c13ba8860fdd95ac194ce">314f258d67e43b0d911c13ba8860fdd95ac194ce</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's 
DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0. <p>Publish Date: 2020-04-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p> <p>Release Date: 2020-04-29</p> <p>Fix Resolution: jQuery - 3.5.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in jquery js jquery min js cve medium severity vulnerability vulnerable libraries jquery js jquery min js jquery js javascript library for dom operations library home page a href path to dependency file charts node modules tinygradient bower components tinycolor index html path to vulnerable library charts node modules tinygradient bower components tinycolor demo jquery js dependency hierarchy x jquery js vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file charts node modules redeyed examples browser index html path to vulnerable library charts node modules redeyed examples browser index html dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details in jquery versions greater than or equal to and before passing html from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource
0
371,183
10,962,643,551
IssuesEvent
2019-11-27 17:43:03
lowRISC/opentitan
https://api.github.com/repos/lowRISC/opentitan
closed
[sw] Cannot build coremark
Component:SW Priority:P1 Type:Bug
The coremark build is currently broken: ``` cd sw make SW_DIR=benchmarks/coremark SIM=1 ITERATIONS=1 SW_BUILD_DIR=coremark_build ``` Fails with ``` riscv32-unknown-elf-gcc: error: coremark_build/_crt.o: No such file or directory ``` There are a few issues to resolve: 1. As highlighted above exts/common/_crt.c doesn't get built as STANDALONE_SW is set to 1 but this is required by the coremark build 2. A variety of paths are relative to the sw/ directory. Building coremark involves changing to the coremark directory under sw/vendor/coremark and running a makefile there, where these relative paths are not valid (and the SW_BUILD_DIR ends up getting created there) 3. The ITERATIONS=1 isn't passed to the coremark makefile I've successfully made coremark build by addressing these points but as I just wanted to get it working without worrying about anything else it isn't really a patch that should get merged as is. I can create a PR of it if interested. I believe we're moving to meson from the current makefile setup? So we may not want to fix this as it's not a big priority but we do want to ensure it can build under the new meson setup.
1.0
[sw] Cannot build coremark - The coremark build is currently broken: ``` cd sw make SW_DIR=benchmarks/coremark SIM=1 ITERATIONS=1 SW_BUILD_DIR=coremark_build ``` Fails with ``` riscv32-unknown-elf-gcc: error: coremark_build/_crt.o: No such file or directory ``` There are a few issues to resolve: 1. As highlighted above exts/common/_crt.c doesn't get built as STANDALONE_SW is set to 1 but this is required by the coremark build 2. A variety of paths are relative to the sw/ directory. Building coremark involves changing to the coremark directory under sw/vendor/coremark and running a makefile there, where these relative paths are not valid (and the SW_BUILD_DIR ends up getting created there) 3. The ITERATIONS=1 isn't passed to the coremark makefile I've successfully made coremark build by addressing these points but as I just wanted to get it working without worrying about anything else it isn't really a patch that should get merged as is. I can create a PR of it if interested. I believe we're moving to meson from the current makefile setup? So we may not want to fix this as it's not a big priority but we do want to ensure it can build under the new meson setup.
non_process
cannot build coremark the coremark build is currently broken cd sw make sw dir benchmarks coremark sim iterations sw build dir coremark build fails with unknown elf gcc error coremark build crt o no such file or directory there are a few issues to resolve as highlighted above exts common crt c doesn t get built as standalone sw is set to but this is required by the coremark build a variety of paths are relative to the sw directory building coremark involves changing to the coremark directory under sw vendor coremark and running a makefile there where these relative paths are not valid and the sw build dir ends up getting created there the iterations isn t passed to the coremark makefile i ve successfully made coremark build by addressing these points but as i just wanted to get it working without worrying about anything else it isn t really a patch that should get merged as is i can create a pr of it if interested i believe we re moving to meson from the current makefile setup so we may not want to fix this as it s not a big priority but we do want to ensure it can build under the new meson setup
0
7,135
10,278,504,531
IssuesEvent
2019-08-25 14:58:01
qgis/QGIS-Documentation
https://api.github.com/repos/qgis/QGIS-Documentation
closed
[processing][needs-docs] Add cell size parameter to native interpolation algorithms
Automatic new feature Easy Processing Alg User Manual
Original commit: https://github.com/qgis/QGIS/commit/84d155eaf0467204aa087b78c8b5d7a0f3d1c9bc by web-flow [processing][needs-docs] Add cell size parameter to native interpolation algorithms (fix #18556, #20114)
1.0
[processing][needs-docs] Add cell size parameter to native interpolation algorithms - Original commit: https://github.com/qgis/QGIS/commit/84d155eaf0467204aa087b78c8b5d7a0f3d1c9bc by web-flow [processing][needs-docs] Add cell size parameter to native interpolation algorithms (fix #18556, #20114)
process
add cell size parameter to native interpolation algorithms original commit by web flow add cell size parameter to native interpolation algorithms fix
1
12,322
14,879,556,564
IssuesEvent
2021-01-20 07:53:17
lutraconsulting/qgis-crayfish-plugin
https://api.github.com/repos/lutraconsulting/qgis-crayfish-plugin
closed
Feature request - Rasterize tool: output data type and nodata value options
enhancement processing
The tool defaults to Float64 data type and NaN for nodata. It would be useful to provide the possibility to modify these defaults.
1.0
Feature request - Rasterize tool: output data type and nodata value options - The tool defaults to Float64 data type and NaN for nodata. It would be useful to provide the possibility to modify these defaults.
process
feature request rasterize tool output data type and nodata value options the tool defaults to data type and nan for nodata it would be useful to provide the possibility to modify these defaults
1
121,009
10,146,655,762
IssuesEvent
2019-08-05 08:42:18
pypa/warehouse
https://api.github.com/repos/pypa/warehouse
closed
Improve test coverage for rate limited routes
good first issue testing
As resolved in #6065, our current testing strategy for rate limited routes does not ensure `hit` is called. Tests should be added that actually exercise the rate limiter or ensure `hit` is called on all services that use a rate limiter.
1.0
Improve test coverage for rate limited routes - As resolved in #6065, our current testing strategy for rate limited routes does not ensure `hit` is called. Tests should be added that actually exercise the rate limiter or ensure `hit` is called on all services that use a rate limiter.
non_process
improve test coverage for rate limited routes as resolved in our current testing strategy for rate limited routes does not ensure hit is called tests should be added that actually exercise the rate limiter or ensure hit is called on all services that use a rate limiter
0
223
2,651,118,773
IssuesEvent
2015-03-16 08:59:10
srusskih/SublimeJEDI
https://api.github.com/repos/srusskih/SublimeJEDI
closed
completion error with dict
processing wontfix
Everytime a type in word `dict` and continue in newline ,this completion always appear. ![7d044e5f-09d0-429e-bc75-6cf6d8eda3d2](https://cloud.githubusercontent.com/assets/300016/3423613/21c32840-ffa0-11e3-8a05-0a701e26dbb5.png) I know its a bug with sublimecodeintel plugin, but anaconda plugin(also a python completion plugin) can avoid it.
1.0
completion error with dict - Everytime a type in word `dict` and continue in newline ,this completion always appear. ![7d044e5f-09d0-429e-bc75-6cf6d8eda3d2](https://cloud.githubusercontent.com/assets/300016/3423613/21c32840-ffa0-11e3-8a05-0a701e26dbb5.png) I know its a bug with sublimecodeintel plugin, but anaconda plugin(also a python completion plugin) can avoid it.
process
completion error with dict everytime a type in word dict and continue in newline this completion always appear i know its a bug with sublimecodeintel plugin but anaconda plugin also a python completion plugin can avoid it
1
18,756
24,657,678,932
IssuesEvent
2022-10-18 02:11:33
didi/mpx
https://api.github.com/repos/didi/mpx
closed
@mpxjs/cli 创建示例项目 打包 web 报错
processing
**问题描述** 1、示例项目引入, `vant button` 组件,运行正常显示,打包 `web`,无法访问 2、我以为是 `vant` 的问题,尝试把 `van-button` 隐藏,打包依旧一样 3、`node` 版本 `16.9.0` 使用 `pnpm` 构建 **index.mpx 代码如下** ![image](https://user-images.githubusercontent.com/19426584/194835972-99bc501e-baf6-4c3b-8aa5-e3b049f83448.png) **报错信息如下**: ![image](https://user-images.githubusercontent.com/19426584/194835532-4522b4da-5eb6-4bef-b292-8f3feac3241b.png)
1.0
@mpxjs/cli 创建示例项目 打包 web 报错 - **问题描述** 1、示例项目引入, `vant button` 组件,运行正常显示,打包 `web`,无法访问 2、我以为是 `vant` 的问题,尝试把 `van-button` 隐藏,打包依旧一样 3、`node` 版本 `16.9.0` 使用 `pnpm` 构建 **index.mpx 代码如下** ![image](https://user-images.githubusercontent.com/19426584/194835972-99bc501e-baf6-4c3b-8aa5-e3b049f83448.png) **报错信息如下**: ![image](https://user-images.githubusercontent.com/19426584/194835532-4522b4da-5eb6-4bef-b292-8f3feac3241b.png)
process
mpxjs cli 创建示例项目 打包 web 报错 问题描述 、示例项目引入, vant button 组件,运行正常显示,打包 web ,无法访问 、我以为是 vant 的问题,尝试把 van button 隐藏,打包依旧一样 、 node 版本 使用 pnpm 构建 index mpx 代码如下 报错信息如下 :
1
349,135
24,934,877,325
IssuesEvent
2022-10-31 14:27:44
dagger/dagger
https://api.github.com/repos/dagger/dagger
closed
Add links to pkg.do.dev docs in Go get started guide
area/documentation
### What is the issue? We received the following feedback in this Discord thread https://discord.com/channels/707636530424053791/1034373947040477204/1034373947040477204 for our Go getting started guide at https://docs.dagger.io/sdk/go/959738/get-started "just creates a dagger client with dagger.Connect()": you could add a link to the dagger.Connect() method on GoPkg
1.0
Add links to pkg.do.dev docs in Go get started guide - ### What is the issue? We received the following feedback in this Discord thread https://discord.com/channels/707636530424053791/1034373947040477204/1034373947040477204 for our Go getting started guide at https://docs.dagger.io/sdk/go/959738/get-started "just creates a dagger client with dagger.Connect()": you could add a link to the dagger.Connect() method on GoPkg
non_process
add links to pkg do dev docs in go get started guide what is the issue we received the following feedback in this discord thread for our go getting started guide at just creates a dagger client with dagger connect you could add a link to the dagger connect method on gopkg
0
15,782
19,975,495,445
IssuesEvent
2022-01-29 02:38:45
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Unrelated outputs of "qgis_process run ..." command
Feedback stale Processing Bug
I am running the following command and its output has unnecessary information imho ``` C:\OSGeo4W>qgis_process-qgis-dev run native:buffer D:\src\osgeo4w\src\qgis-dev\qgis\src\core\providers\qgsproviderregistry.cpp(251) : (QgsProviderRegistry::init) [2691ms] Checking C:/OSGeo4W/apps/qgis-dev/plugins/grassplugin7.dll: ...invalid (lib not loadable): Cannot load library C:\OSGeo4W\apps\qgis-dev\plugins\grassplugin7.dll: Le module spécifié est introuvable. D:\src\osgeo4w\src\qgis-dev\qgis\src\core\providers\qgsproviderregistry.cpp(251) : (QgsProviderRegistry::init) [8ms] Checking C:/OSGeo4W/apps/qgis-dev/plugins/provider_grass7.dll: ...invalid (lib not loadable): Cannot load library C:\OSGeo4W\apps\qgis-dev\plugins\provider_grass7.dll: Le module spécifié est introuvable. D:\src\osgeo4w\src\qgis-dev\qgis\src\core\providers\qgsproviderregistry.cpp(251) : (QgsProviderRegistry::init) [2ms] Checking C:/OSGeo4W/apps/qgis-dev/plugins/provider_grassraster7.dll: ...invalid (lib not loadable): Cannot load library C:\OSGeo4W\apps\qgis-dev\plugins\provider_grassraster7.dll: Le module spécifié est introuvable. 
D:\src\osgeo4w\src\qgis-dev\qgis\src\core\providers\qgsproviderregistry.cpp(304) : (QgsProviderRegistry::init) [19ms] Loaded 25 providers (DB2;OAPIF;WFS;arcgisfeatureserver;arcgismapserver;delimitedtext;ept;gdal;geonode;gpx;hana;mdal;memory;mesh_memory;mssql;ogr;oracle;pdal;postgres;postgresraster;spatialite;vectortile;virtual;wcs;wms) D:\src\osgeo4w\src\qgis-dev\qgis\src\process\qgsprocess.cpp(189) : (QgsProcessingExec::loadPythonSupport) [137ms] load library qgispython (3.21.0) <string>:1: DeprecationWarning: setapi() is deprecated D:\src\osgeo4w\src\qgis-dev\qgis\src\core\qgsmessagelog.cpp(29) : (QgsMessageLog::logMessage) [976ms] 2021-07-13T12:28:38 Python error[2] Traceback (most recent call last): File "C:\OSGeo4W/apps/qgis-dev/./python\qgis\utils.py", line 335, in _startPlugin plugins[packageName] = package.classFactory(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\__init__.py", line 29, in classFactory return CadastreMenu(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\cadastre_menu.py", line 63, in __init__ self.mapCanvas = iface.mapCanvas() AttributeError: 'NoneType' object has no attribute 'mapCanvas' Traceback (most recent call last): File "C:\OSGeo4W/apps/qgis-dev/./python\qgis\utils.py", line 335, in _startPlugin plugins[packageName] = package.classFactory(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\__init__.py", line 29, in classFactory return CadastreMenu(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\cadastre_menu.py", line 63, in __init__ self.mapCanvas = iface.mapCanvas() AttributeError: 'NoneType' object has no attribute 'mapCanvas' D:\src\osgeo4w\src\qgis-dev\qgis\src\core\qgsmessagelog.cpp(29) : (QgsMessageLog::logMessage) [27ms] 2021-07-13T12:28:38 Couldn't load plugin 'cadastre' due to an error when calling its classFactory() method[1] 
Couldn't load plugin 'cadastre' due to an error when calling its classFactory() method AttributeError: 'NoneType' object has no attribute 'mapCanvas' Traceback (most recent call last): File "C:\OSGeo4W/apps/qgis-dev/./python\qgis\utils.py", line 335, in _startPlugin plugins[packageName] = package.classFactory(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\__init__.py", line 29, in classFactory return CadastreMenu(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\cadastre_menu.py", line 63, in __init__ self.mapCanvas = iface.mapCanvas() AttributeError: 'NoneType' object has no attribute 'mapCanvas' Python version: 3.9.5 (tags/v3.9.5:0a7dcbd, May 3 2021, 17:27:52) [MSC v.1928 64 bit (AMD64)] QGIS version: 3.21.0-Master Master, f7646705c Python Path: C:/OSGeo4W/apps/qgis-dev/./pythonC:/Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/pythonC:/Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/pluginsC:/OSGeo4W/apps/qgis-dev/./python/pluginsC:\OSGeo4W\apps\qgis-dev\pythonC:\OSGeo4W64\apps\Python36\LibC:\OSGeo4W\bin\python39.zipC:\OSGeo4W\apps\Python39\DLLsC:\OSGeo4W\apps\Python39\libC:\OSGeo4W\apps\qgis-dev\binC:\OSGeo4W\apps\Python39C:\OSGeo4W\apps\Python39\lib\site-packagesC:/Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/pythonC:\Users\pclocal\AppData\Roaming\QGIS\QGIS3\profiles\default\python\plugins\cadastre\forms error starting plugin: cadastre D:\src\osgeo4w\src\qgis-dev\qgis\src\core\qgsmessagelog.cpp(29) : (QgsMessageLog::logMessage) [193ms] 2021-07-13T12:28:38 Processing[2] Problem with GRASS installation: GRASS was not found or is not correctly installed Problem with GRASS installation: GRASS was not found or is not correctly installed D:\src\osgeo4w\src\qgis-dev\qgis\src\core\qgsmessagelog.cpp(29) : (QgsMessageLog::logMessage) [329ms] 2021-07-13T12:28:39 Processing[2] Problem with SAGA installation: SAGA was not found or is 
not correctly installed Problem with SAGA installation: SAGA was not found or is not correctly installed ---------------- Inputs ---------------- ERROR: The following mandatory parameters were not specified INPUT: Input layer OUTPUT: Buffered ``` Do not pay attention to the command itself. I think, since I'm running a specific command, it shouldn't report: * errors of implementation related to other plugins (cadastre) * missing providers (SAGA/GRASS) that I do not call The output should only concern the command I ran (ie, the 6 last lines). Tested with QGIS version: 3.21.0-Master Master, f7646705c
1.0
Unrelated outputs of "qgis_process run ..." command - I am running the following command and its output has unnecessary information imho ``` C:\OSGeo4W>qgis_process-qgis-dev run native:buffer D:\src\osgeo4w\src\qgis-dev\qgis\src\core\providers\qgsproviderregistry.cpp(251) : (QgsProviderRegistry::init) [2691ms] Checking C:/OSGeo4W/apps/qgis-dev/plugins/grassplugin7.dll: ...invalid (lib not loadable): Cannot load library C:\OSGeo4W\apps\qgis-dev\plugins\grassplugin7.dll: Le module spécifié est introuvable. D:\src\osgeo4w\src\qgis-dev\qgis\src\core\providers\qgsproviderregistry.cpp(251) : (QgsProviderRegistry::init) [8ms] Checking C:/OSGeo4W/apps/qgis-dev/plugins/provider_grass7.dll: ...invalid (lib not loadable): Cannot load library C:\OSGeo4W\apps\qgis-dev\plugins\provider_grass7.dll: Le module spécifié est introuvable. D:\src\osgeo4w\src\qgis-dev\qgis\src\core\providers\qgsproviderregistry.cpp(251) : (QgsProviderRegistry::init) [2ms] Checking C:/OSGeo4W/apps/qgis-dev/plugins/provider_grassraster7.dll: ...invalid (lib not loadable): Cannot load library C:\OSGeo4W\apps\qgis-dev\plugins\provider_grassraster7.dll: Le module spécifié est introuvable. 
D:\src\osgeo4w\src\qgis-dev\qgis\src\core\providers\qgsproviderregistry.cpp(304) : (QgsProviderRegistry::init) [19ms] Loaded 25 providers (DB2;OAPIF;WFS;arcgisfeatureserver;arcgismapserver;delimitedtext;ept;gdal;geonode;gpx;hana;mdal;memory;mesh_memory;mssql;ogr;oracle;pdal;postgres;postgresraster;spatialite;vectortile;virtual;wcs;wms) D:\src\osgeo4w\src\qgis-dev\qgis\src\process\qgsprocess.cpp(189) : (QgsProcessingExec::loadPythonSupport) [137ms] load library qgispython (3.21.0) <string>:1: DeprecationWarning: setapi() is deprecated D:\src\osgeo4w\src\qgis-dev\qgis\src\core\qgsmessagelog.cpp(29) : (QgsMessageLog::logMessage) [976ms] 2021-07-13T12:28:38 Python error[2] Traceback (most recent call last): File "C:\OSGeo4W/apps/qgis-dev/./python\qgis\utils.py", line 335, in _startPlugin plugins[packageName] = package.classFactory(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\__init__.py", line 29, in classFactory return CadastreMenu(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\cadastre_menu.py", line 63, in __init__ self.mapCanvas = iface.mapCanvas() AttributeError: 'NoneType' object has no attribute 'mapCanvas' Traceback (most recent call last): File "C:\OSGeo4W/apps/qgis-dev/./python\qgis\utils.py", line 335, in _startPlugin plugins[packageName] = package.classFactory(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\__init__.py", line 29, in classFactory return CadastreMenu(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\cadastre_menu.py", line 63, in __init__ self.mapCanvas = iface.mapCanvas() AttributeError: 'NoneType' object has no attribute 'mapCanvas' D:\src\osgeo4w\src\qgis-dev\qgis\src\core\qgsmessagelog.cpp(29) : (QgsMessageLog::logMessage) [27ms] 2021-07-13T12:28:38 Couldn't load plugin 'cadastre' due to an error when calling its classFactory() method[1] 
Couldn't load plugin 'cadastre' due to an error when calling its classFactory() method AttributeError: 'NoneType' object has no attribute 'mapCanvas' Traceback (most recent call last): File "C:\OSGeo4W/apps/qgis-dev/./python\qgis\utils.py", line 335, in _startPlugin plugins[packageName] = package.classFactory(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\__init__.py", line 29, in classFactory return CadastreMenu(iface) File "C:\Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/plugins\cadastre\cadastre_menu.py", line 63, in __init__ self.mapCanvas = iface.mapCanvas() AttributeError: 'NoneType' object has no attribute 'mapCanvas' Python version: 3.9.5 (tags/v3.9.5:0a7dcbd, May 3 2021, 17:27:52) [MSC v.1928 64 bit (AMD64)] QGIS version: 3.21.0-Master Master, f7646705c Python Path: C:/OSGeo4W/apps/qgis-dev/./pythonC:/Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/pythonC:/Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/python/pluginsC:/OSGeo4W/apps/qgis-dev/./python/pluginsC:\OSGeo4W\apps\qgis-dev\pythonC:\OSGeo4W64\apps\Python36\LibC:\OSGeo4W\bin\python39.zipC:\OSGeo4W\apps\Python39\DLLsC:\OSGeo4W\apps\Python39\libC:\OSGeo4W\apps\qgis-dev\binC:\OSGeo4W\apps\Python39C:\OSGeo4W\apps\Python39\lib\site-packagesC:/Users/pclocal/AppData/Roaming/QGIS/QGIS3\profiles\default/pythonC:\Users\pclocal\AppData\Roaming\QGIS\QGIS3\profiles\default\python\plugins\cadastre\forms error starting plugin: cadastre D:\src\osgeo4w\src\qgis-dev\qgis\src\core\qgsmessagelog.cpp(29) : (QgsMessageLog::logMessage) [193ms] 2021-07-13T12:28:38 Processing[2] Problem with GRASS installation: GRASS was not found or is not correctly installed Problem with GRASS installation: GRASS was not found or is not correctly installed D:\src\osgeo4w\src\qgis-dev\qgis\src\core\qgsmessagelog.cpp(29) : (QgsMessageLog::logMessage) [329ms] 2021-07-13T12:28:39 Processing[2] Problem with SAGA installation: SAGA was not found or is 
not correctly installed Problem with SAGA installation: SAGA was not found or is not correctly installed ---------------- Inputs ---------------- ERROR: The following mandatory parameters were not specified INPUT: Input layer OUTPUT: Buffered ``` Do not pay attention to the command itself. I think, since I'm running a specific command, it shouldn't report: * errors of implementation related to other plugins (cadastre) * missing providers (SAGA/GRASS) that I do not call The output should only concern the command I ran (ie, the 6 last lines). Tested with QGIS version: 3.21.0-Master Master, f7646705c
process
unrelated outputs of qgis process run command i am running the following command and its output has unnecessary information imho c qgis process qgis dev run native buffer d src src qgis dev qgis src core providers qgsproviderregistry cpp qgsproviderregistry init checking c apps qgis dev plugins dll invalid lib not loadable cannot load library c apps qgis dev plugins dll le module spúcifiú est introuvable d src src qgis dev qgis src core providers qgsproviderregistry cpp qgsproviderregistry init checking c apps qgis dev plugins provider dll invalid lib not loadable cannot load library c apps qgis dev plugins provider dll le module spúcifiú est introuvable d src src qgis dev qgis src core providers qgsproviderregistry cpp qgsproviderregistry init checking c apps qgis dev plugins provider dll invalid lib not loadable cannot load library c apps qgis dev plugins provider dll le module spúcifiú est introuvable d src src qgis dev qgis src core providers qgsproviderregistry cpp qgsproviderregistry init loaded providers oapif wfs arcgisfeatureserver arcgismapserver delimitedtext ept gdal geonode gpx hana mdal memory mesh memory mssql ogr oracle pdal postgres postgresraster spatialite vectortile virtual wcs wms d src src qgis dev qgis src process qgsprocess cpp qgsprocessingexec loadpythonsupport load library qgispython deprecationwarning setapi is deprecated d src src qgis dev qgis src core qgsmessagelog cpp qgsmessagelog logmessage python error traceback most recent call last file c apps qgis dev python qgis utils py line in startplugin plugins package classfactory iface file c users pclocal appdata roaming qgis profiles default python plugins cadastre init py line in classfactory return cadastremenu iface file c users pclocal appdata roaming qgis profiles default python plugins cadastre cadastre menu py line in init self mapcanvas iface mapcanvas attributeerror nonetype object has no attribute mapcanvas traceback most recent call last file c apps qgis dev python qgis 
utils py line in startplugin plugins package classfactory iface file c users pclocal appdata roaming qgis profiles default python plugins cadastre init py line in classfactory return cadastremenu iface file c users pclocal appdata roaming qgis profiles default python plugins cadastre cadastre menu py line in init self mapcanvas iface mapcanvas attributeerror nonetype object has no attribute mapcanvas d src src qgis dev qgis src core qgsmessagelog cpp qgsmessagelog logmessage couldn t load plugin cadastre due to an error when calling its classfactory method couldn t load plugin cadastre due to an error when calling its classfactory method attributeerror nonetype object has no attribute mapcanvas traceback most recent call last file c apps qgis dev python qgis utils py line in startplugin plugins package classfactory iface file c users pclocal appdata roaming qgis profiles default python plugins cadastre init py line in classfactory return cadastremenu iface file c users pclocal appdata roaming qgis profiles default python plugins cadastre cadastre menu py line in init self mapcanvas iface mapcanvas attributeerror nonetype object has no attribute mapcanvas python version tags may qgis version master master python path c apps qgis dev pythonc users pclocal appdata roaming qgis profiles default pythonc users pclocal appdata roaming qgis profiles default python pluginsc apps qgis dev python pluginsc apps qgis dev pythonc apps libc bin zipc apps dllsc apps libc apps qgis dev binc apps apps lib site packagesc users pclocal appdata roaming qgis profiles default pythonc users pclocal appdata roaming qgis profiles default python plugins cadastre forms error starting plugin cadastre d src src qgis dev qgis src core qgsmessagelog cpp qgsmessagelog logmessage processing problem with grass installation grass was not found or is not correctly installed problem with grass installation grass was not found or is not correctly installed d src src qgis dev qgis src core qgsmessagelog 
cpp qgsmessagelog logmessage processing problem with saga installation saga was not found or is not correctly installed problem with saga installation saga was not found or is not correctly installed inputs error the following mandatory parameters were not specified input input layer output buffered do not pay attention to the command itself i think since i m running a specific command it shouldn t report errors of implementation related to other plugins cadastre missing providers saga grass that i do not call the output should only concern the command i ran ie the last lines tested with qgis version master master
1
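The record above asks that `qgis_process run` report only the command's own result, not plugin-load errors mixed into the same stream. A minimal sketch of the idea (Python; the wrapper name is hypothetical, and it assumes diagnostic noise is emitted on stderr so the two streams can be separated):

```python
import subprocess
import sys

def run_split(cmd):
    """Run a command, returning (stdout, stderr) separately so diagnostic
    noise on stderr can be filtered out of the report on stdout."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return proc.stdout, proc.stderr

# Hypothetical usage against qgis_process (arguments are assumptions):
# out, err = run_split(["qgis_process", "run", "native:buffer", "--INPUT=...", "--OUTPUT=..."])
```

This only works around the symptom on the caller's side; the issue itself is about the tool not emitting the unrelated messages at all.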
16,791
22,037,197,230
IssuesEvent
2022-05-28 19:33:12
hashgraph/hedera-json-rpc-relay
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
opened
Deploy previewnet relay instance using Helm
enhancement P2 process
### Problem We currently lack a flow to easily deploy to integration using the new Helm Chart flow. ### Solution 2 phase approach Phase 1: Provide process to pick up from a `release/x.y.z` branch the newly tagged branch version and deploy to previewnet with manual steps Phase 2: Provide self service option to describe desired deployment and have automated process kick it off. ### Alternatives _No response_
1.0
Deploy previewnet relay instance using Helm - ### Problem We currently lack a flow to easily deploy to integration using the new Helm Chart flow. ### Solution 2 phase approach Phase 1: Provide process to pick up from a `release/x.y.z` branch the newly tagged branch version and deploy to previewnet with manual steps Phase 2: Provide self service option to describe desired deployment and have automated process kick it off. ### Alternatives _No response_
process
deploy previewnet relay instance using helm problem we currently lack a flow to easily deploy to integration using the new helm chart flow solution phase approach phase provide process to pick up from a release x y z branch the newly tagged branch version and deploy to previewnet with manual steps phase provide self service option to describe desired deployment and have automated process kick it off alternatives no response
1
1,834
4,630,880,331
IssuesEvent
2016-09-28 14:05:57
nodejs/node
https://api.github.com/repos/nodejs/node
reopened
Shouldn't the default SIGINT handler trigger process.on('exit')?
process
Maybe it's a bug, maybe it's by design. Documentation doesn't mention anything about whether or not the "exit" event should fire, but it's a bit silly I have to write: ```js process.on('SIGINT', function () { process.exit(somecode); // now the "exit" event will fire }); ``` Just because the built-in handler won't do it for me. It also means I can no longer depend on the default exit code that Node associates with SIGINT, which apparently is `128 + signal number` (according to the docs).
1.0
Shouldn't the default SIGINT handler trigger process.on('exit')? - Maybe it's a bug, maybe it's by design. Documentation doesn't mention anything about whether or not the "exit" event should fire, but it's a bit silly I have to write: ```js process.on('SIGINT', function () { process.exit(somecode); // now the "exit" event will fire }); ``` Just because the built-in handler won't do it for me. It also means I can no longer depend on the default exit code that Node associates with SIGINT, which apparently is `128 + signal number` (according to the docs).
process
shouldn t the default sigint handler trigger process on exit maybe it s a bug maybe it s by design documentation doesn t mention anything about whether or not the exit event should fire but it s a bit silly i have to write js process on sigint function process exit somecode now the exit event will fire just because the built in handler won t do it for me it also means i can no longer depend on the default exit code that node associates with sigint which apparently is signal number according to the docs
1
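The "128 + signal number" convention the record above refers to can be sketched as follows (Python used here for a runnable illustration; the function name is hypothetical):

```python
import signal

def exit_code_for(sig):
    """POSIX convention cited in the Node docs: a process terminated by
    signal N conventionally reports exit status 128 + N."""
    return 128 + int(sig)

# SIGINT is signal 2 on POSIX systems, so the conventional code is 130.
```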
12,929
15,298,164,607
IssuesEvent
2021-02-24 09:23:03
prisma/prisma
https://api.github.com/repos/prisma/prisma
closed
MySQL migration creation bug
bug/1-repro-available engines/migration engine kind/bug process/candidate team/migrations topic: migrate topic: migrate dev
## Bug description Error occurred while doing initial MySQL database migration creation: ``` prisma:tryLoadEnv Environment variables loaded from /Users/user/Projects/project-v3/.env +0ms Environment variables loaded from .env Prisma schema loaded from prisma/schema.prisma Datasource "db": MySQL database "project" at "localhost:3306" prisma:migrateEngine:rpc starting migration engine with binary: /Users/user/Projects/project-v3/node_modules/@prisma/engines/migration-engine-darwin +0ms prisma:migrateEngine:rpc SENDING RPC CALL {"id":1,"jsonrpc":"2.0","method":"devDiagnostic","params":{"migrationsDirectoryPath":"/Users/user/Projects/project-v3/prisma/migrations"}} +4ms prisma:migrateEngine:stderr Feb 23 21:49:07.108 INFO migration_engine: Starting migration engine RPC server git_hash="3c463ebd78b1d21d8fdacdd27899e280cf686223" +0ms prisma:migrateEngine:stderr Feb 23 21:49:07.121 INFO quaint::single: Starting a mysql connection. +13ms prisma:migrateEngine:stderr Feb 23 21:49:07.126 INFO DevDiagnostic:calculate_drift:sql_schema_from_migration_history: quaint::single: Starting a mysql connection. +5ms prisma:migrateEngine:stderr Feb 23 21:49:07.227 INFO DevDiagnostic:validate_migrations:sql_schema_from_migration_history: quaint::single: Starting a mysql connection. +101ms prisma:migrateEngine:rpc { prisma:migrateEngine:rpc jsonrpc: '2.0', prisma:migrateEngine:rpc error: { prisma:migrateEngine:rpc code: 4466, prisma:migrateEngine:rpc message: 'An error happened. Check the data field for details.', prisma:migrateEngine:rpc data: { prisma:migrateEngine:rpc is_panic: false, prisma:migrateEngine:rpc message: 'Migration `20210223133504_` failed to apply cleanly to a temporary database. 
\n' + prisma:migrateEngine:rpc 'Error:\n' + prisma:migrateEngine:rpc "Database error: Error querying the database: Server error: `ERROR 42000 (1067): Invalid default value for 'created_at''\n" + prisma:migrateEngine:rpc ' 0: sql_migration_connector::flavour::mysql::sql_schema_from_migration_history\n' + prisma:migrateEngine:rpc ' at migration-engine/connectors/sql-migration-connector/src/flavour/mysql.rs:275\n' + prisma:migrateEngine:rpc ' 1: sql_migration_connector::sql_database_migration_inferrer::validate_migrations\n' + prisma:migrateEngine:rpc ' at migration-engine/connectors/sql-migration-connector/src/sql_database_migration_inferrer.rs:70\n' + prisma:migrateEngine:rpc ' 2: migration_core::api::DevDiagnostic\n' + prisma:migrateEngine:rpc ' at migration-engine/core/src/api.rs:79', prisma:migrateEngine:rpc meta: [Object], prisma:migrateEngine:rpc error_code: 'P3006' prisma:migrateEngine:rpc } prisma:migrateEngine:rpc }, prisma:migrateEngine:rpc id: 1 prisma:migrateEngine:rpc } +128ms Error: Error: P3006 Migration `20210223133504_` failed to apply cleanly to a temporary database. 
Error: Database error: Error querying the database: Server error: `ERROR 42000 (1067): Invalid default value for 'created_at'' 0: sql_migration_connector::flavour::mysql::sql_schema_from_migration_history at migration-engine/connectors/sql-migration-connector/src/flavour/mysql.rs:275 1: sql_migration_connector::sql_database_migration_inferrer::validate_migrations at migration-engine/connectors/sql-migration-connector/src/sql_database_migration_inferrer.rs:70 2: migration_core::api::DevDiagnostic at migration-engine/core/src/api.rs:79 at Object.<anonymous> (/Users/user/Projects/project-v3/node_modules/prisma/build/index.js:54905:26) at MigrateEngine.handleResponse (/Users/user/Projects/project-v3/node_modules/prisma/build/index.js:54780:38) at LineStream.<anonymous> (/Users/user/Projects/project-v3/node_modules/prisma/build/index.js:54865:18) at LineStream.emit (events.js:314:20) at LineStream.EventEmitter.emit (domain.js:483:12) at addChunk (_stream_readable.js:297:12) at readableAddChunk (_stream_readable.js:272:9) at LineStream.Readable.push (_stream_readable.js:213:10) at LineStream.Transform.push (_stream_transform.js:152:32) at LineStream._pushBuffer (/Users/user/Projects/project-v3/node_modules/prisma/build/index.js:54617:19) ``` ## How to reproduce Run `DEBUG="*" ./node_modules/.bin/prisma migrate dev --preview-feature`. ## Expected behavior Initial MySQL database migration creation should be successful. ## Prisma information schema.prisma: ``` model Address { id Int @id @default(autoincrement()) created_at DateTime @default(now()) @db.Timestamp(0) created_by Int? updated_at DateTime? @default(now()) @db.Timestamp(0) updated_by Int? 
} ``` Sample of generated migration SQL: ``` -- CreateTable CREATE TABLE `address` ( `id` INTEGER NOT NULL AUTO_INCREMENT, `created_at` TIMESTAMP(0) NOT NULL DEFAULT CURRENT_TIMESTAMP(3), `created_by` INTEGER, `updated_at` TIMESTAMP(0) DEFAULT CURRENT_TIMESTAMP(3), `updated_by` INTEGER, PRIMARY KEY (`id`) ) DEFAULT CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci; ``` Generated SQL syntax for `created_at` and `updated_at` fields is invalid. More information about correct MySQL syntax can be found here: https://dev.mysql.com/doc/refman/5.7/en/timestamp-initialization.html. `DEFAULT CURRENT_TIMESTAMP(3)` should be `DEFAULT CURRENT_TIMESTAMP` or `DEFAULT CURRENT_TIMESTAMP(0)`. ## Environment & setup - OS: Mac OS 10.15.7 - Database: MySQL 5.7.32 - Node.js version: 12.20.2 - Prisma version: 2.17.0
1.0
MySQL migration creation bug - ## Bug description Error occurred while doing initial MySQL database migration creation: ``` prisma:tryLoadEnv Environment variables loaded from /Users/user/Projects/project-v3/.env +0ms Environment variables loaded from .env Prisma schema loaded from prisma/schema.prisma Datasource "db": MySQL database "project" at "localhost:3306" prisma:migrateEngine:rpc starting migration engine with binary: /Users/user/Projects/project-v3/node_modules/@prisma/engines/migration-engine-darwin +0ms prisma:migrateEngine:rpc SENDING RPC CALL {"id":1,"jsonrpc":"2.0","method":"devDiagnostic","params":{"migrationsDirectoryPath":"/Users/user/Projects/project-v3/prisma/migrations"}} +4ms prisma:migrateEngine:stderr Feb 23 21:49:07.108 INFO migration_engine: Starting migration engine RPC server git_hash="3c463ebd78b1d21d8fdacdd27899e280cf686223" +0ms prisma:migrateEngine:stderr Feb 23 21:49:07.121 INFO quaint::single: Starting a mysql connection. +13ms prisma:migrateEngine:stderr Feb 23 21:49:07.126 INFO DevDiagnostic:calculate_drift:sql_schema_from_migration_history: quaint::single: Starting a mysql connection. +5ms prisma:migrateEngine:stderr Feb 23 21:49:07.227 INFO DevDiagnostic:validate_migrations:sql_schema_from_migration_history: quaint::single: Starting a mysql connection. +101ms prisma:migrateEngine:rpc { prisma:migrateEngine:rpc jsonrpc: '2.0', prisma:migrateEngine:rpc error: { prisma:migrateEngine:rpc code: 4466, prisma:migrateEngine:rpc message: 'An error happened. Check the data field for details.', prisma:migrateEngine:rpc data: { prisma:migrateEngine:rpc is_panic: false, prisma:migrateEngine:rpc message: 'Migration `20210223133504_` failed to apply cleanly to a temporary database. 
\n' + prisma:migrateEngine:rpc 'Error:\n' + prisma:migrateEngine:rpc "Database error: Error querying the database: Server error: `ERROR 42000 (1067): Invalid default value for 'created_at''\n" + prisma:migrateEngine:rpc ' 0: sql_migration_connector::flavour::mysql::sql_schema_from_migration_history\n' + prisma:migrateEngine:rpc ' at migration-engine/connectors/sql-migration-connector/src/flavour/mysql.rs:275\n' + prisma:migrateEngine:rpc ' 1: sql_migration_connector::sql_database_migration_inferrer::validate_migrations\n' + prisma:migrateEngine:rpc ' at migration-engine/connectors/sql-migration-connector/src/sql_database_migration_inferrer.rs:70\n' + prisma:migrateEngine:rpc ' 2: migration_core::api::DevDiagnostic\n' + prisma:migrateEngine:rpc ' at migration-engine/core/src/api.rs:79', prisma:migrateEngine:rpc meta: [Object], prisma:migrateEngine:rpc error_code: 'P3006' prisma:migrateEngine:rpc } prisma:migrateEngine:rpc }, prisma:migrateEngine:rpc id: 1 prisma:migrateEngine:rpc } +128ms Error: Error: P3006 Migration `20210223133504_` failed to apply cleanly to a temporary database. 
Error: Database error: Error querying the database: Server error: `ERROR 42000 (1067): Invalid default value for 'created_at'' 0: sql_migration_connector::flavour::mysql::sql_schema_from_migration_history at migration-engine/connectors/sql-migration-connector/src/flavour/mysql.rs:275 1: sql_migration_connector::sql_database_migration_inferrer::validate_migrations at migration-engine/connectors/sql-migration-connector/src/sql_database_migration_inferrer.rs:70 2: migration_core::api::DevDiagnostic at migration-engine/core/src/api.rs:79 at Object.<anonymous> (/Users/user/Projects/project-v3/node_modules/prisma/build/index.js:54905:26) at MigrateEngine.handleResponse (/Users/user/Projects/project-v3/node_modules/prisma/build/index.js:54780:38) at LineStream.<anonymous> (/Users/user/Projects/project-v3/node_modules/prisma/build/index.js:54865:18) at LineStream.emit (events.js:314:20) at LineStream.EventEmitter.emit (domain.js:483:12) at addChunk (_stream_readable.js:297:12) at readableAddChunk (_stream_readable.js:272:9) at LineStream.Readable.push (_stream_readable.js:213:10) at LineStream.Transform.push (_stream_transform.js:152:32) at LineStream._pushBuffer (/Users/user/Projects/project-v3/node_modules/prisma/build/index.js:54617:19) ``` ## How to reproduce Run `DEBUG="*" ./node_modules/.bin/prisma migrate dev --preview-feature`. ## Expected behavior Initial MySQL database migration creation should be successful. ## Prisma information schema.prisma: ``` model Address { id Int @id @default(autoincrement()) created_at DateTime @default(now()) @db.Timestamp(0) created_by Int? updated_at DateTime? @default(now()) @db.Timestamp(0) updated_by Int? 
} ``` Sample of generated migration SQL: ``` -- CreateTable CREATE TABLE `address` ( `id` INTEGER NOT NULL AUTO_INCREMENT, `created_at` TIMESTAMP(0) NOT NULL DEFAULT CURRENT_TIMESTAMP(3), `created_by` INTEGER, `updated_at` TIMESTAMP(0) DEFAULT CURRENT_TIMESTAMP(3), `updated_by` INTEGER, PRIMARY KEY (`id`) ) DEFAULT CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci; ``` Generated SQL syntax for `created_at` and `updated_at` fields is invalid. More information about correct MySQL syntax can be found here: https://dev.mysql.com/doc/refman/5.7/en/timestamp-initialization.html. `DEFAULT CURRENT_TIMESTAMP(3)` should be `DEFAULT CURRENT_TIMESTAMP` or `DEFAULT CURRENT_TIMESTAMP(0)`. ## Environment & setup - OS: Mac OS 10.15.7 - Database: MySQL 5.7.32 - Node.js version: 12.20.2 - Prisma version: 2.17.0
process
mysql migration creation bug bug description error occurred while doing initial mysql database migration creation prisma tryloadenv environment variables loaded from users user projects project env environment variables loaded from env prisma schema loaded from prisma schema prisma datasource db mysql database project at localhost prisma migrateengine rpc starting migration engine with binary users user projects project node modules prisma engines migration engine darwin prisma migrateengine rpc sending rpc call id jsonrpc method devdiagnostic params migrationsdirectorypath users user projects project prisma migrations prisma migrateengine stderr feb info migration engine starting migration engine rpc server git hash prisma migrateengine stderr feb info quaint single starting a mysql connection prisma migrateengine stderr feb info devdiagnostic calculate drift sql schema from migration history quaint single starting a mysql connection prisma migrateengine stderr feb info devdiagnostic validate migrations sql schema from migration history quaint single starting a mysql connection prisma migrateengine rpc prisma migrateengine rpc jsonrpc prisma migrateengine rpc error prisma migrateengine rpc code prisma migrateengine rpc message an error happened check the data field for details prisma migrateengine rpc data prisma migrateengine rpc is panic false prisma migrateengine rpc message migration failed to apply cleanly to a temporary database n prisma migrateengine rpc error n prisma migrateengine rpc database error error querying the database server error error invalid default value for created at n prisma migrateengine rpc sql migration connector flavour mysql sql schema from migration history n prisma migrateengine rpc at migration engine connectors sql migration connector src flavour mysql rs n prisma migrateengine rpc sql migration connector sql database migration inferrer validate migrations n prisma migrateengine rpc at migration engine connectors sql migration 
connector src sql database migration inferrer rs n prisma migrateengine rpc migration core api devdiagnostic n prisma migrateengine rpc at migration engine core src api rs prisma migrateengine rpc meta prisma migrateengine rpc error code prisma migrateengine rpc prisma migrateengine rpc prisma migrateengine rpc id prisma migrateengine rpc error error migration failed to apply cleanly to a temporary database error database error error querying the database server error error invalid default value for created at sql migration connector flavour mysql sql schema from migration history at migration engine connectors sql migration connector src flavour mysql rs sql migration connector sql database migration inferrer validate migrations at migration engine connectors sql migration connector src sql database migration inferrer rs migration core api devdiagnostic at migration engine core src api rs at object users user projects project node modules prisma build index js at migrateengine handleresponse users user projects project node modules prisma build index js at linestream users user projects project node modules prisma build index js at linestream emit events js at linestream eventemitter emit domain js at addchunk stream readable js at readableaddchunk stream readable js at linestream readable push stream readable js at linestream transform push stream transform js at linestream pushbuffer users user projects project node modules prisma build index js how to reproduce run debug node modules bin prisma migrate dev preview feature expected behavior initial mysql database migration creation should be successful prisma information schema prisma model address id int id default autoincrement created at datetime default now db timestamp created by int updated at datetime default now db timestamp updated by int sample of generated migration sql createtable create table address id integer not null auto increment created at timestamp not null default current timestamp created 
by integer updated at timestamp default current timestamp updated by integer primary key id default character set collate unicode ci generated sql syntax for created at and updated at fields is invalid more information about correct mysql syntax can be found here default current timestamp should be default current timestamp or default current timestamp environment setup os mac os database mysql node js version prisma version
1
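A minimal sketch of the workaround implied by the record above (hypothetical post-processing of the generated migration file, not Prisma's own fix): MySQL 5.7 rejects a `TIMESTAMP(0)` column whose default is `CURRENT_TIMESTAMP(3)` because the fractional-seconds precisions must match, so the default can be rewritten to precision 0.

```python
import re

# Matches a TIMESTAMP(0) column definition (nullable or NOT NULL) whose
# default uses the mismatched precision CURRENT_TIMESTAMP(3).
_PATTERN = re.compile(
    r"(TIMESTAMP\(0\)(?:\s+NOT NULL)?\s+DEFAULT\s+)CURRENT_TIMESTAMP\(3\)")

def fix_timestamp_defaults(sql):
    """Rewrite CURRENT_TIMESTAMP(3) defaults on TIMESTAMP(0) columns so
    both sides use precision 0, which MySQL 5.7 accepts."""
    return _PATTERN.sub(r"\1CURRENT_TIMESTAMP(0)", sql)
```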
319,462
27,374,713,065
IssuesEvent
2023-02-28 04:20:45
DaHoon06/Dotto
https://api.github.com/repos/DaHoon06/Dotto
closed
Feed API
✔️ Test 🧑🏻‍💻 Develop
- [ ] List retrieval - [x] Create - postman test complete - [x] Read - postman test complete - [x] Delete - postman test complete - [x] Update - postman test complete
1.0
Feed API - - [ ] List retrieval - [x] Create - postman test complete - [x] Read - postman test complete - [x] Delete - postman test complete - [x] Update - postman test complete
non_process
feed api list retrieval create postman test complete read postman test complete delete postman test complete update postman test complete
0
362
2,797,167,447
IssuesEvent
2015-05-12 12:20:20
joyent/node
https://api.github.com/repos/joyent/node
closed
child process: .pipe() doesn't always work
child_process
```javascript var spawn = require('child_process').spawn; var net = require('net'); function start(cmd, port) { net.createServer(function(conn) { var proc = spawn(cmd, [], { stdio: 'pipe' }); conn.pipe(proc.stdin); proc.stdout.pipe(conn); }).listen(port); } start('cat', 8000); // works, simple echo server start('rev', 8001); // doesn't work, no output ``` ``` $ nc 127.0.0.1 8000 # cat, works ping ping pong pong ^D $ nc 127.0.0.1 8001 # rev, no output ping pong ^D $ rev # expected output ping gnip pong gnop ^D ``` ``` $ strace -fe read,write out/Release/node tmp/pipe.js <snipped> [pid 6235] read(0, <unfinished ...> [pid 6232] read(9, "ping\n", 65536) = 5 [pid 6232] write(11, "ping\n", 5) = 5 [pid 6235] <... read resumed> "ping\n", 4096) = 5 [pid 6235] read(0, <unfinished ...> [pid 6232] read(9, "pong\n", 65536) = 5 [pid 6232] write(11, "pong\n", 5) = 5 [pid 6235] <... read resumed> "pong\n", 4096) = 5 [pid 6235] read(0, <unfinished ...> [pid 6232] read(9, "", 65536) = 0 [pid 6235] <... read resumed> "", 4096) = 0 [pid 6235] write(1, "gnip\ngnop\n", 10) = 10 Process 6235 detached [pid 6232] read(12, "gnip\ngnop\n", 65536) = 10 [pid 6232] --- SIGCHLD (Child exited) @ 0 (0) --- ``` Ergo, the child process does write the expected output but not until the pipe is closed. Needs further investigation.
1.0
child process: .pipe() doesn't always work - ```javascript var spawn = require('child_process').spawn; var net = require('net'); function start(cmd, port) { net.createServer(function(conn) { var proc = spawn(cmd, [], { stdio: 'pipe' }); conn.pipe(proc.stdin); proc.stdout.pipe(conn); }).listen(port); } start('cat', 8000); // works, simple echo server start('rev', 8001); // doesn't work, no output ``` ``` $ nc 127.0.0.1 8000 # cat, works ping ping pong pong ^D $ nc 127.0.0.1 8001 # rev, no output ping pong ^D $ rev # expected output ping gnip pong gnop ^D ``` ``` $ strace -fe read,write out/Release/node tmp/pipe.js <snipped> [pid 6235] read(0, <unfinished ...> [pid 6232] read(9, "ping\n", 65536) = 5 [pid 6232] write(11, "ping\n", 5) = 5 [pid 6235] <... read resumed> "ping\n", 4096) = 5 [pid 6235] read(0, <unfinished ...> [pid 6232] read(9, "pong\n", 65536) = 5 [pid 6232] write(11, "pong\n", 5) = 5 [pid 6235] <... read resumed> "pong\n", 4096) = 5 [pid 6235] read(0, <unfinished ...> [pid 6232] read(9, "", 65536) = 0 [pid 6235] <... read resumed> "", 4096) = 0 [pid 6235] write(1, "gnip\ngnop\n", 10) = 10 Process 6235 detached [pid 6232] read(12, "gnip\ngnop\n", 65536) = 10 [pid 6232] --- SIGCHLD (Child exited) @ 0 (0) --- ``` Ergo, the child process does write the expected output but not until the pipe is closed. Needs further investigation.
process
child process pipe doesn t always work javascript var spawn require child process spawn var net require net function start cmd port net createserver function conn var proc spawn cmd stdio pipe conn pipe proc stdin proc stdout pipe conn listen port start cat works simple echo server start rev doesn t work no output nc cat works ping ping pong pong d nc rev no output ping pong d rev expected output ping gnip pong gnop d strace fe read write out release node tmp pipe js read read ping n write ping n ping n read read pong n write pong n pong n read read write gnip ngnop n process detached read gnip ngnop n sigchld child exited ergo the child process does write the expected output but not until the pipe is closed needs further investigation
1
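The strace in the record above shows the child writing its output only once the pipe is closed. A small stand-in for `rev` (Python, hypothetical) reproduces that read-to-EOF behaviour: the child reads all of stdin before writing anything, so closing the writer's end is what unblocks it.

```python
import subprocess
import sys

# Child that reads ALL of stdin before producing output, mirroring the
# strace where the reversed lines appear only after the pipe is closed.
child_src = (
    "import sys\n"
    "data = sys.stdin.read()\n"  # blocks until the writer closes the pipe
    "sys.stdout.write('\\n'.join(line[::-1] for line in data.splitlines()) + '\\n')\n"
)

def rev_lines(text):
    proc = subprocess.Popen([sys.executable, "-c", child_src],
                            stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                            text=True)
    out, _ = proc.communicate(text)  # communicate() closes stdin, flushing the child
    return out
```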
17,956
23,960,669,998
IssuesEvent
2022-09-12 18:47:58
JeroenMathon/NeosVR-Research-Initiative
https://api.github.com/repos/JeroenMathon/NeosVR-Research-Initiative
opened
Process Claims and Supplied information
help wanted processing
For the following file: Collected Data/Unverified/Source 1/Claims and Supplied information.md - [ ] Process all unprocessed information of the file and add media and source references where needed, - [ ] Strike through all non verifiable and unrelated information - [ ] Refactor all information and process these for profiling
1.0
Process Claims and Supplied information - For the following file: Collected Data/Unverified/Source 1/Claims and Supplied information.md - [ ] Process all unprocessed information of the file and add media and source references where needed, - [ ] Strike through all non verifiable and unrelated information - [ ] Refactor all information and process these for profiling
process
process claims and supplied information for the following file collected data unverified source claims and supplied information md process all unprocessed information of the file and add media and source references where needed strike through all non verifiable and unrelated information refactor all information and process these for profiling
1
11,283
14,078,999,670
IssuesEvent
2020-11-04 14:19:54
google/ground-platform
https://api.github.com/repos/google/ground-platform
closed
[Deployment] Split up deployment targets into dev, staging, and prod
priority: p2 type: feature request type: process web
`ng deploy` will currently deploy to the user's dev Firebase instance.
1.0
[Deployment] Split up deployment targets into dev, staging, and prod - `ng deploy` will currently deploy to the user's dev Firebase instance.
process
split up deployment targets into dev staging and prod ng deploy will currently deploy to the user s dev firebase instance
1
4,585
7,428,418,704
IssuesEvent
2018-03-24 01:19:01
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
$ipconf has a null value for subnet...
cxp doc-bug expressroute in-process triaged
When executing these commands as described, $ipconf is created with subnet as null, causing 'New-AzureRmVirtualNetworkGateway' to fail with Invalid Request 400, Subnet is required. I used the reference page for New-AzureRmVirtualNetworkGateway and that worked for me. On THAT page, the GatewaySubnet is created instead of being staged as a configuration item, as described on THIS page. https://docs.microsoft.com/en-us/powershell/module/azurerm.network/new-azurermvirtualnetworkgateway?view=azurermps-5.5.0

---

#### Document Details

⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*

* ID: 1028ea5b-a7cb-6463-3e15-0fb354ba1698
* Version Independent ID: 267aa403-49c5-f0ed-7998-2287dd4102d7
* Content: [Add a virtual network gateway to a VNet for ExpressRoute: PowerShell: Azure](https://docs.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-resource-manager)
* Content Source: [articles/expressroute/expressroute-howto-add-gateway-resource-manager.md](https://github.com/Microsoft/azure-docs/blob/master/articles/expressroute/expressroute-howto-add-gateway-resource-manager.md)
* Service: **expressroute**
* GitHub Login: @charwen
* Microsoft Alias: **charwen**
index: 1.0
text_combine:
$ipconf has a null value for subnet... - When executing these commands as described, $ipconf is created with subnet as null, causing 'New-AzureRmVirtualNetworkGateway' to fail with Invalid Request 400, Subnet is required. I used the reference page for New-AzureRmVirtualNetworkGateway and that worked for me. On THAT page, the GatewaySubnet is created instead of being staged as a configuration item, as described on THIS page. https://docs.microsoft.com/en-us/powershell/module/azurerm.network/new-azurermvirtualnetworkgateway?view=azurermps-5.5.0 --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 1028ea5b-a7cb-6463-3e15-0fb354ba1698 * Version Independent ID: 267aa403-49c5-f0ed-7998-2287dd4102d7 * Content: [Add a virtual network gateway to a VNet for ExpressRoute: PowerShell: Azure](https://docs.microsoft.com/en-us/azure/expressroute/expressroute-howto-add-gateway-resource-manager) * Content Source: [articles/expressroute/expressroute-howto-add-gateway-resource-manager.md](https://github.com/Microsoft/azure-docs/blob/master/articles/expressroute/expressroute-howto-add-gateway-resource-manager.md) * Service: **expressroute** * GitHub Login: @charwen * Microsoft Alias: **charwen**
label: process
text:
ipconf has a null value for subnet when executing these commands as described ipconf is created with subnet as null causing new azurermvirtualnetworkgateway to fail with invalid request subnet is required i used the reference page for new azurermvirtualnetworkgateway and that worked for me on that page the gatewaysubnet is created instead of being staged as a configuration item as described on this page document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service expressroute github login charwen microsoft alias charwen
binary_label: 1
Unnamed: 0: 166,555
id: 12,961,866,837
type: IssuesEvent
created_at: 2020-07-20 16:16:23
repo: STEllAR-GROUP/phylanx
repo_url: https://api.github.com/repos/STEllAR-GROUP/phylanx
action: opened
title: Occasional crash in categorical_crossentropy test, Debug build
labels: category: tests, compiler: gcc, platform: linux
body:
The nightly test of the Debug build of tests.unit.python.execution_tree.categorical_crossentropy fails consistently, but when run in isolation it does not crash. The full log is here: http://omega.nic.uoregon.edu:8020/#/builders/2/builds/201/steps/16/logs/stdio . The top of the stack is `phylanx::execution_tree::primitives::cat_cross_operation::cat_cross3d(phylanx::ir::node_data<double>&&, phylanx::ir::node_data<double>&&, bool, int) const`. Here's the stack trace: ``` 233/262 Test #340: tests.unit.python.execution_tree.categorical_crossentropy ........................***Failed 1.02 sec *** Error in `/packages/python/3.6.4-ssl/bin/python3': free(): invalid next size (fast): 0x00007f89440088b0 *** ======= Backtrace: ========= /lib64/libc.so.6(+0x81299)[0x7f8a183c8299] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/phylanx/libphylanx_keras_support.so(_ZNK7phylanx14execution_tree10primitives19cat_cross_operation11cat_cross3dEONS_2ir9node_dataIdEES6_bi+0xf4b)[0x7f8961a73b0b] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/phylanx/libphylanx_keras_support.so(+0x65eea8)[0x7f8961a78ea8] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/phylanx/libphylanx_keras_support.so(_ZNK7phylanx14execution_tree10primitives19cat_cross_operation4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EES8_NS0_12eval_contextE+0x5f4)[0x7f8961a798b4] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x61)[0x7f8a0f419ca1] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJRKSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSN_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f182294] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJRKSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSI_7id_typeENS_7threads15thread_priorityEDpOT_+0x57c)[0x7f8a0f182e1c] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail17async_remote_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionEJRKSt6vectorINS3_23primitive_argument_typeESaIS8_EENS3_12eval_contextEEEENS_4lcos6futureINS_6traits14extract_actionIT_vE4type17local_result_typeEEENS0_11sync_policyERKNS_6naming7id_typeEONSO_7addressEDpOT0_+0x10d)[0x7f8a0f18317d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail9sync_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionERKNS0_11sync_policyEJRKSt6vectorINS3_23primitive_argument_typeESaISB_EENS3_12eval_contextEEEENS_6traits14extract_actionIT_vE4type17local_result_typeEOT0_RKNS_6naming7id_typeEDpOT1_+0x8d)[0x7f8a0f18327d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalEN3hpx6detail11sync_policyERKSt6vectorINS0_23primitive_argument_typeESaIS6_EENS0_12eval_contextE+0x47)[0x7f8a0f0ddd37] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives8variable4bindERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x14a)[0x7f8a0f5c690a] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4bindERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x42)[0x7f8a0f3f81c2] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail9sync_implINS_7actions12basic_actionIKN7phylanx14execution_tree10primitives19primitive_componentEFbRKSt6vectorINS5_23primitive_argument_typeESaISA_EENS5_12eval_contextEENS7_11bind_actionEEERKNS0_11sync_policyEJSC_SF_EEENS_6traits14extract_actionIT_vE4type17local_result_typeEOT0_RKNS_6naming7id_typeEDpOT1_+0xa57)[0x7f8a0f16bd27] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4bindEOSt6vectorINS0_23primitive_argument_typeESaIS3_EENS0_12eval_contextE+0x27)[0x7f8a0f0dd0f7] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(+0xb5da2e)[0x7f8a0f1c2a2e] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(+0xb5a662)[0x7f8a0f1bf662] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(_ZN3hpx4lcos6detail16future_data_baseINS_6traits6detail16future_data_voidEE16run_on_completedEONS_4util15unique_functionIFvvELb0EEE+0x68)[0x7f8a0de1d808] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(_ZN3hpx4lcos6detail16future_data_baseINS_6traits6detail16future_data_voidEE19handle_on_completedINS_4util15unique_functionIFvvELb0EEEEEvOT_+0x538)[0x7f8a0de20b48] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(_ZN3hpx4lcos6detail16future_data_baseINS_6traits6detail16future_data_voidEE16set_on_completedENS_4util15unique_functionIFvvELb0EEE+0x78)[0x7f8a0de1e078] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives15define_variable4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x34b)[0x7f8a0f1c195b] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJRKSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSN_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f182294] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJRKSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSI_7id_typeENS_7threads15thread_priorityEDpOT_+0x57c)[0x7f8a0f182e1c] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail17async_remote_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionEJRKSt6vectorINS3_23primitive_argument_typeESaIS8_EENS3_12eval_contextEEEENS_4lcos6futureINS_6traits14extract_actionIT_vE4type17local_result_typeEEENS0_11sync_policyERKNS_6naming7id_typeEONSO_7addressEDpOT0_+0x10d)[0x7f8a0f18317d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail9sync_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionERKNS0_11sync_policyEJRKSt6vectorINS3_23primitive_argument_typeESaISB_EENS3_12eval_contextEEEENS_6traits14extract_actionIT_vE4type17local_result_typeEOT0_RKNS_6naming7id_typeEDpOT1_+0x8d)[0x7f8a0f18327d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalEN3hpx6detail11sync_policyERKSt6vectorINS0_23primitive_argument_typeESaIS6_EENS0_12eval_contextE+0x47)[0x7f8a0f0ddd37] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree6detail30value_operand_sync_helper_argsIRKNS0_23primitive_argument_typeERKSt6vectorIS3_SaIS3_EEEES3_OT_OT0_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESM_NS0_12eval_contextE+0xc5)[0x7f8a0f183525] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree18value_operand_syncERKNS0_23primitive_argument_typeERKSt6vectorIS1_SaIS1_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESG_NS0_12eval_contextE+0x38)[0x7f8a0f0dddd8] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/phylanx/libphylanx_controls.so(_ZNK7phylanx14execution_tree10primitives15block_operation4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EES8_NS0_12eval_contextE+0xc7)[0x7f8964eb03f7] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x61)[0x7f8a0f419ca1] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJRKSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSN_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f182294] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJRKSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSI_7id_typeENS_7threads15thread_priorityEDpOT_+0x57c)[0x7f8a0f182e1c] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalERKSt6vectorINS0_23primitive_argument_typeESaIS3_EENS0_12eval_contextE+0x1a9)[0x7f8a0f0de929] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree6detail25value_operand_helper_argsIRKNS0_23primitive_argument_typeERKSt6vectorIS3_SaIS3_EEEEN3hpx4lcos6futureIS3_EEOT_OT0_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESQ_NS0_12eval_contextE+0x210)[0x7f8a0f185510] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree13value_operandERKNS0_23primitive_argument_typeERKSt6vectorIS1_SaIS1_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESG_NS0_12eval_contextE+0x38)[0x7f8a0f0e1878] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives6lambda4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x560)[0x7f8a0f1d31a0] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJRKSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSN_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f182294] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJRKSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSI_7id_typeENS_7threads15thread_priorityEDpOT_+0x57c)[0x7f8a0f182e1c] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalERKSt6vectorINS0_23primitive_argument_typeESaIS3_EENS0_12eval_contextE+0x1a9)[0x7f8a0f0de929] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives8function4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x3c2)[0x7f8a0f1d1382] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSL_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f18ea44] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx10apply_p_cbIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS2_23primitive_argument_typeEEESB_EENS8_6detail20parcel_write_handlerISB_EEJSt6vectorISA_SaISA_EENS2_12eval_contextEEEEbOT0_ONS_6naming7addressERKNSM_7id_typeENS_7threads15thread_priorityEOT1_DpOT2_+0x251)[0x7f8a0f18ee61] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSG_7id_typeENS_7threads15thread_priorityEDpOT_+0x217)[0x7f8a0f18f6b7] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail17async_remote_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionEJSt6vectorINS3_23primitive_argument_typeESaIS8_EENS3_12eval_contextEEEENS_4lcos6futureINS_6traits14extract_actionIT_vE4type17local_result_typeEEENS0_11sync_policyERKNS_6naming7id_typeEONSM_7addressEDpOT0_+0x10d)[0x7f8a0f18f96d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail9sync_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionERKNS0_11sync_policyEJSt6vectorINS3_23primitive_argument_typeESaISB_EENS3_12eval_contextEEEENS_6traits14extract_actionIT_vE4type17local_result_typeEOT0_RKNS_6naming7id_typeEDpOT1_+0x8d)[0x7f8a0f18fa6d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalEN3hpx6detail11sync_policyEOSt6vectorINS0_23primitive_argument_typeESaIS6_EENS0_12eval_contextE+0x47)[0x7f8a0f0e1f17] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree6detail30value_operand_sync_helper_argsIRKNS0_23primitive_argument_typeESt6vectorIS3_SaIS3_EEEES3_OT_OT0_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESK_NS0_12eval_contextE+0xc5)[0x7f8a0f18fd15] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree18value_operand_syncERKNS0_23primitive_argument_typeEOSt6vectorIS1_SaIS1_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESF_NS0_12eval_contextE+0x38)[0x7f8a0f0e1fb8] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/python/build/lib.linux-x86_64-3.6/phylanx/_phylanx.cpython-36m-x86_64-linux-gnu.so(+0xa5011)[0x7f8a0fb99011] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/python/build/lib.linux-x86_64-3.6/phylanx/_phylanx.cpython-36m-x86_64-linux-gnu.so(+0xa03a3)[0x7f8a0fb943a3] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/python/build/lib.linux-x86_64-3.6/phylanx/_phylanx.cpython-36m-x86_64-linux-gnu.so(+0x9838d)[0x7f8a0fb8c38d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(_ZN3hpx7threads10coroutines6detail14coroutine_implclEv+0xe3)[0x7f8a0de0bf13] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(+0x6207d9)[0x7f8a0de0b7d9] ```
index: 1.0
text_combine:
Occasional crash in categorical_crossentropy test, Debug build - The nightly test of the Debug build of tests.unit.python.execution_tree.categorical_crossentropy fails consistently, but when run in isolation it does not crash. The full log is here: http://omega.nic.uoregon.edu:8020/#/builders/2/builds/201/steps/16/logs/stdio . The top of the stack is `phylanx::execution_tree::primitives::cat_cross_operation::cat_cross3d(phylanx::ir::node_data<double>&&, phylanx::ir::node_data<double>&&, bool, int) const`. Here's the stack trace: ``` 233/262 Test #340: tests.unit.python.execution_tree.categorical_crossentropy ........................***Failed 1.02 sec *** Error in `/packages/python/3.6.4-ssl/bin/python3': free(): invalid next size (fast): 0x00007f89440088b0 *** ======= Backtrace: ========= /lib64/libc.so.6(+0x81299)[0x7f8a183c8299] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/phylanx/libphylanx_keras_support.so(_ZNK7phylanx14execution_tree10primitives19cat_cross_operation11cat_cross3dEONS_2ir9node_dataIdEES6_bi+0xf4b)[0x7f8961a73b0b] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/phylanx/libphylanx_keras_support.so(+0x65eea8)[0x7f8961a78ea8] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/phylanx/libphylanx_keras_support.so(_ZNK7phylanx14execution_tree10primitives19cat_cross_operation4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EES8_NS0_12eval_contextE+0x5f4)[0x7f8961a798b4] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x61)[0x7f8a0f419ca1] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJRKSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSN_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f182294] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJRKSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSI_7id_typeENS_7threads15thread_priorityEDpOT_+0x57c)[0x7f8a0f182e1c] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail17async_remote_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionEJRKSt6vectorINS3_23primitive_argument_typeESaIS8_EENS3_12eval_contextEEEENS_4lcos6futureINS_6traits14extract_actionIT_vE4type17local_result_typeEEENS0_11sync_policyERKNS_6naming7id_typeEONSO_7addressEDpOT0_+0x10d)[0x7f8a0f18317d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail9sync_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionERKNS0_11sync_policyEJRKSt6vectorINS3_23primitive_argument_typeESaISB_EENS3_12eval_contextEEEENS_6traits14extract_actionIT_vE4type17local_result_typeEOT0_RKNS_6naming7id_typeEDpOT1_+0x8d)[0x7f8a0f18327d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalEN3hpx6detail11sync_policyERKSt6vectorINS0_23primitive_argument_typeESaIS6_EENS0_12eval_contextE+0x47)[0x7f8a0f0ddd37] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives8variable4bindERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x14a)[0x7f8a0f5c690a] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4bindERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x42)[0x7f8a0f3f81c2] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail9sync_implINS_7actions12basic_actionIKN7phylanx14execution_tree10primitives19primitive_componentEFbRKSt6vectorINS5_23primitive_argument_typeESaISA_EENS5_12eval_contextEENS7_11bind_actionEEERKNS0_11sync_policyEJSC_SF_EEENS_6traits14extract_actionIT_vE4type17local_result_typeEOT0_RKNS_6naming7id_typeEDpOT1_+0xa57)[0x7f8a0f16bd27] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4bindEOSt6vectorINS0_23primitive_argument_typeESaIS3_EENS0_12eval_contextE+0x27)[0x7f8a0f0dd0f7] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(+0xb5da2e)[0x7f8a0f1c2a2e] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(+0xb5a662)[0x7f8a0f1bf662] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(_ZN3hpx4lcos6detail16future_data_baseINS_6traits6detail16future_data_voidEE16run_on_completedEONS_4util15unique_functionIFvvELb0EEE+0x68)[0x7f8a0de1d808] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(_ZN3hpx4lcos6detail16future_data_baseINS_6traits6detail16future_data_voidEE19handle_on_completedINS_4util15unique_functionIFvvELb0EEEEEvOT_+0x538)[0x7f8a0de20b48] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(_ZN3hpx4lcos6detail16future_data_baseINS_6traits6detail16future_data_voidEE16set_on_completedENS_4util15unique_functionIFvvELb0EEE+0x78)[0x7f8a0de1e078] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives15define_variable4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x34b)[0x7f8a0f1c195b] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJRKSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSN_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f182294] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJRKSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSI_7id_typeENS_7threads15thread_priorityEDpOT_+0x57c)[0x7f8a0f182e1c] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail17async_remote_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionEJRKSt6vectorINS3_23primitive_argument_typeESaIS8_EENS3_12eval_contextEEEENS_4lcos6futureINS_6traits14extract_actionIT_vE4type17local_result_typeEEENS0_11sync_policyERKNS_6naming7id_typeEONSO_7addressEDpOT0_+0x10d)[0x7f8a0f18317d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail9sync_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionERKNS0_11sync_policyEJRKSt6vectorINS3_23primitive_argument_typeESaISB_EENS3_12eval_contextEEEENS_6traits14extract_actionIT_vE4type17local_result_typeEOT0_RKNS_6naming7id_typeEDpOT1_+0x8d)[0x7f8a0f18327d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalEN3hpx6detail11sync_policyERKSt6vectorINS0_23primitive_argument_typeESaIS6_EENS0_12eval_contextE+0x47)[0x7f8a0f0ddd37] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree6detail30value_operand_sync_helper_argsIRKNS0_23primitive_argument_typeERKSt6vectorIS3_SaIS3_EEEES3_OT_OT0_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESM_NS0_12eval_contextE+0xc5)[0x7f8a0f183525] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree18value_operand_syncERKNS0_23primitive_argument_typeERKSt6vectorIS1_SaIS1_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESG_NS0_12eval_contextE+0x38)[0x7f8a0f0dddd8] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/phylanx/libphylanx_controls.so(_ZNK7phylanx14execution_tree10primitives15block_operation4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EES8_NS0_12eval_contextE+0xc7)[0x7f8964eb03f7] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x61)[0x7f8a0f419ca1] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJRKSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSN_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f182294] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJRKSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSI_7id_typeENS_7threads15thread_priorityEDpOT_+0x57c)[0x7f8a0f182e1c] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalERKSt6vectorINS0_23primitive_argument_typeESaIS3_EENS0_12eval_contextE+0x1a9)[0x7f8a0f0de929] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree6detail25value_operand_helper_argsIRKNS0_23primitive_argument_typeERKSt6vectorIS3_SaIS3_EEEEN3hpx4lcos6futureIS3_EEOT_OT0_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESQ_NS0_12eval_contextE+0x210)[0x7f8a0f185510] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree13value_operandERKNS0_23primitive_argument_typeERKSt6vectorIS1_SaIS1_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESG_NS0_12eval_contextE+0x38)[0x7f8a0f0e1878] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives6lambda4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x560)[0x7f8a0f1d31a0] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJRKSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSN_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f182294] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJRKSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSI_7id_typeENS_7threads15thread_priorityEDpOT_+0x57c)[0x7f8a0f182e1c] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalERKSt6vectorINS0_23primitive_argument_typeESaIS3_EENS0_12eval_contextE+0x1a9)[0x7f8a0f0de929] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives8function4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x3c2)[0x7f8a0f1d1382] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives24primitive_component_base7do_evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0xb9)[0x7f8a0f419dc9] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree10primitives19primitive_component4evalERKSt6vectorINS0_23primitive_argument_typeESaIS4_EENS0_12eval_contextE+0x5d)[0x7f8a0f3f85fd] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx7applier6detail9apply_l_pIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS4_23primitive_argument_typeEEESD_EEJSt6vectorISC_SaISC_EENS4_12eval_contextEEEEbOT0_RKNS_6naming7id_typeEONSL_7addressENS_7threads15thread_priorityEDpOT1_+0x344)[0x7f8a0f18ea44] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx10apply_p_cbIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS_7actions18typed_continuationINS_4lcos6futureINS2_23primitive_argument_typeEEESB_EENS8_6detail20parcel_write_handlerISB_EEJSt6vectorISA_SaISA_EENS2_12eval_contextEEEEbOT0_ONS_6naming7addressERKNSM_7id_typeENS_7threads15thread_priorityEOT1_DpOT2_+0x251)[0x7f8a0f18ee61] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx4lcos15packaged_actionIN7phylanx14execution_tree10primitives19primitive_component11eval_actionENS0_6futureINS3_23primitive_argument_typeEEELb0EE8do_applyIJSt6vectorIS8_SaIS8_EENS3_12eval_contextEEEEvONS_6naming7addressERKNSG_7id_typeENS_7threads15thread_priorityEDpOT_+0x217)[0x7f8a0f18f6b7] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail17async_remote_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionEJSt6vectorINS3_23primitive_argument_typeESaIS8_EENS3_12eval_contextEEEENS_4lcos6futureINS_6traits14extract_actionIT_vE4type17local_result_typeEEENS0_11sync_policyERKNS_6naming7id_typeEONSM_7addressEDpOT0_+0x10d)[0x7f8a0f18f96d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN3hpx6detail9sync_implIN7phylanx14execution_tree10primitives19primitive_component11eval_actionERKNS0_11sync_policyEJSt6vectorINS3_23primitive_argument_typeESaISB_EENS3_12eval_contextEEEENS_6traits14extract_actionIT_vE4type17local_result_typeEOT0_RKNS_6naming7id_typeEDpOT1_+0x8d)[0x7f8a0f18fa6d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZNK7phylanx14execution_tree9primitive4evalEN3hpx6detail11sync_policyEOSt6vectorINS0_23primitive_argument_typeESaIS6_EENS0_12eval_contextE+0x47)[0x7f8a0f0e1f17] 
/var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree6detail30value_operand_sync_helper_argsIRKNS0_23primitive_argument_typeESt6vectorIS3_SaIS3_EEEES3_OT_OT0_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESK_NS0_12eval_contextE+0xc5)[0x7f8a0f18fd15] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/lib/libhpx_phylanx.so.0(_ZN7phylanx14execution_tree18value_operand_syncERKNS0_23primitive_argument_typeEOSt6vectorIS1_SaIS1_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESF_NS0_12eval_contextE+0x38)[0x7f8a0f0e1fb8] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/python/build/lib.linux-x86_64-3.6/phylanx/_phylanx.cpython-36m-x86_64-linux-gnu.so(+0xa5011)[0x7f8a0fb99011] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/python/build/lib.linux-x86_64-3.6/phylanx/_phylanx.cpython-36m-x86_64-linux-gnu.so(+0xa03a3)[0x7f8a0fb943a3] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/python/build/lib.linux-x86_64-3.6/phylanx/_phylanx.cpython-36m-x86_64-linux-gnu.so(+0x9838d)[0x7f8a0fb8c38d] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(_ZN3hpx7threads10coroutines6detail14coroutine_implclEv+0xe3)[0x7f8a0de0bf13] /var/lib/buildbot/workers/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/hpx-Release/lib/libhpx.so.1(+0x6207d9)[0x7f8a0de0b7d9] ```
non_process
occasional crash in categorical crossentropy test debug build the nightly test of the debug build of tests unit python execution tree categorical crossentropy fails consistently but when run in isolation it does not crash the full log is here the top of the stack is phylanx execution tree primitives cat cross operation cat phylanx ir node data phylanx ir node data bool int const here s the stack trace test tests unit python execution tree categorical crossentropy failed sec error in packages python ssl bin free invalid next size fast backtrace libc so var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib phylanx libphylanx keras support so cross bi var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib phylanx libphylanx keras support so var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib phylanx libphylanx keras support so cross argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so component argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so component argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so l actionens continuationins argument typeeeesd saisc rkns typeeonsn var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexteeeevons typeens priorityedpot var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so remote argument contexteeeens actionit result policyerkns 
typeeonso var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument typeesaisb contexteeeens actionit result rkns var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so implins argument typeesaisa policyejsc sf eeens actionit result rkns var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc hpx release lib libhpx so data baseins data on completedeons var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc hpx release lib libhpx so data baseins data on completedins var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc hpx release lib libhpx so data baseins data on completedens var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so component argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux 
gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so l actionens continuationins argument typeeeesd saisc rkns typeeonsn var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexteeeevons typeens priorityedpot var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so remote argument contexteeeens actionit result policyerkns typeeonso var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument typeesaisb contexteeeens actionit result rkns var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so operand sync helper argument ot traitsicesaiceeesm contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so operand argument traitsicesaiceeesg contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib phylanx libphylanx controls so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so component argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so component argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx 
so l actionens continuationins argument typeeeesd saisc rkns typeeonsn var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexteeeevons typeens priorityedpot var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so operand helper argument eeot traitsicesaiceeesq contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument traitsicesaiceeesg contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so component argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so l actionens continuationins argument typeeeesd saisc rkns typeeonsn var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexteeeevons typeens priorityedpot var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so component argument contexte var lib buildbot workers phylanx release build 
tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so l actionens continuationins argument typeeeesd saisc rkns typeeonsl var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so p actionens continuationins argument typeeeesb write handlerisb saisa ons typeens var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexteeeevons typeens priorityedpot var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so remote argument contexteeeens actionit result policyerkns typeeonsm var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument typeesaisb contexteeeens actionit result rkns var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so argument contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so operand sync helper argument ot traitsicesaiceeesk contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release lib libhpx phylanx so operand argument traitsicesaiceeesf contexte var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release python build lib linux phylanx phylanx cpython linux gnu so var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release python build lib linux phylanx phylanx cpython linux gnu so var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc phylanx release python build lib linux phylanx phylanx cpython linux 
gnu so var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc hpx release lib libhpx so implclev var lib buildbot workers phylanx release build tools buildbot build delphi linux gcc hpx release lib libhpx so
0
126,981
12,303,508,949
IssuesEvent
2020-05-11 18:49:25
adafruit/circuitpython
https://api.github.com/repos/adafruit/circuitpython
opened
http://adafru.it/mpy-update short URL points to 2.X docs
documentation
context: https://forums.adafruit.com/viewtopic.php?f=60&t=165259 http://adafru.it/mpy-update forwards to https://circuitpython.readthedocs.io/en/2.x/docs/troubleshooting.html#valueerror-incompatible-mpy-file It looks like the short URL needs to be updated to point to 5.X. I can do it but I don't know how/where
1.0
http://adafru.it/mpy-update short URL points to 2.X docs - context: https://forums.adafruit.com/viewtopic.php?f=60&t=165259 http://adafru.it/mpy-update forwards to https://circuitpython.readthedocs.io/en/2.x/docs/troubleshooting.html#valueerror-incompatible-mpy-file It looks like the short URL needs to be updated to point to 5.X. I can do it but I don't know how/where
non_process
short url points to x docs context forwards to it looks like the short url needs to be updated to point to x i can do it but i don t know how where
0
1,070
3,536,107,389
IssuesEvent
2016-01-17 01:08:41
SpongePowered/Mixin
https://api.github.com/repos/SpongePowered/Mixin
closed
Mixin AP resolves incorrect target class for method reference
accepted annotation processor bug
**Note:** This is closely related to #33 which seems to have broken in the latest Mixin builds. Basically in SpongeVanilla there are quite a few injections where Minecraft code is calling the methods on the current class (e.g. `DedicatedServer.loadAllWorlds` instead of `MinecraftServer.loadAllWorlds`) even though the method is only defined in one of the super classes. We fixed this back then by just looking into the hierarchy for finding the appropriate obfuscation mapping for the injections. As far as I can tell this is generally still working fine, there is no problem with _finding the actual method mapping_. However, when writing to the refmap, instead of keeping the original defined target of the method call (e.g. here `DedicatedServer`) Mixin replaces the target owner of the method call with the owner defined in the mapping (`MinecraftServer`), so the injection will **never** match in production. Here is an example: ``` Lnet/minecraft/server/dedicated/DedicatedServer;loadAllWorlds(Ljava/lang/String;Ljava/lang/String;JLnet/minecraft/world/WorldType;Ljava/lang/String;)V ``` Gets obfuscated to the refmap as: ``` Lnet/minecraft/server/MinecraftServer;func_71247_a(Ljava/lang/String;Ljava/lang/String;JLnet/minecraft/world/WorldType;Ljava/lang/String;)V ``` ... instead of keeping the target owner of the method call which will prevent the injection from ever being successful in production. For now I have fixed it [by doing the obfuscation manually and setting the correct injection target owner class](https://github.com/SpongePowered/SpongeVanilla/blob/master/src/main/java/org/spongepowered/server/mixin/server/MixinDedicatedServer.java#L80-L88), however it would be nice if we could get that somehow working again (it broke some other parts in SpongeVanilla too).
1.0
Mixin AP resolves incorrect target class for method reference - **Note:** This is closely related to #33 which seems to have broken in the latest Mixin builds. Basically in SpongeVanilla there are quite a few injections where Minecraft code is calling the methods on the current class (e.g. `DedicatedServer.loadAllWorlds` instead of `MinecraftServer.loadAllWorlds`) even though the method is only defined in one of the super classes. We fixed this back then by just looking into the hierarchy for finding the appropriate obfuscation mapping for the injections. As far as I can tell this is generally still working fine, there is no problem with _finding the actual method mapping_. However, when writing to the refmap, instead of keeping the original defined target of the method call (e.g. here `DedicatedServer`) Mixin replaces the target owner of the method call with the owner defined in the mapping (`MinecraftServer`), so the injection will **never** match in production. Here is an example: ``` Lnet/minecraft/server/dedicated/DedicatedServer;loadAllWorlds(Ljava/lang/String;Ljava/lang/String;JLnet/minecraft/world/WorldType;Ljava/lang/String;)V ``` Gets obfuscated to the refmap as: ``` Lnet/minecraft/server/MinecraftServer;func_71247_a(Ljava/lang/String;Ljava/lang/String;JLnet/minecraft/world/WorldType;Ljava/lang/String;)V ``` ... instead of keeping the target owner of the method call which will prevent the injection from ever being successful in production. For now I have fixed it [by doing the obfuscation manually and setting the correct injection target owner class](https://github.com/SpongePowered/SpongeVanilla/blob/master/src/main/java/org/spongepowered/server/mixin/server/MixinDedicatedServer.java#L80-L88), however it would be nice if we could get that somehow working again (it broke some other parts in SpongeVanilla too).
process
mixin ap resolves incorrect target class for method reference note this is closely related to which seems to have broken in the latest mixin builds basically in spongevanilla there are quite a few injections where minecraft code is calling the methods on the current class e g dedicatedserver loadallworlds instead of minecraftserver loadallworlds even though the method is only defined in one of the super classes we fixed this back then by just looking into the hierarchy for finding the appropriate obfuscation mapping for the injections as far as i can tell this is generally still working fine there is no problem with finding the actual method mapping however when writing to the refmap instead of keeping the original defined target of the method call e g here dedicatedserver mixin replaces the target owner of the method call with the owner defined in the mapping minecraftserver so the injection will never match in production here is an example lnet minecraft server dedicated dedicatedserver loadallworlds ljava lang string ljava lang string jlnet minecraft world worldtype ljava lang string v gets obfuscated to the refmap as lnet minecraft server minecraftserver func a ljava lang string ljava lang string jlnet minecraft world worldtype ljava lang string v instead of keeping the target owner of the method call which will prevent the injection from ever being successful in production for now i have fixed it however it would be nice if we could get that somehow working again it broke some other parts in spongevanilla too
1
12,171
14,741,833,395
IssuesEvent
2021-01-07 11:14:53
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
SA Billing - Memphis - Invalid Late Fees
anc-process anp-important ant-support
In GitLab by @kdjstudios on Feb 14, 2019, 11:34 **Submitted by:** "Laura Duckworth" <laura.duckworth@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/6612373 **Server:** Internal **Client/Site:** Memphis **Account:** NA **Issue:** Thanks for your patience, if you could remove those invalid fees, that would be much appreciated. Code 7980 Late Charge Adjustment And the description can just be Error Memphis, TN 55c20c85f0cf67cb150000d4 St Paul's Lutheran Church 120-AR18416 044-40874 1/1/2019 15 12/17/2018 250.94 12/23/2018 250.94 25 not_valid Memphis, TN 587d134fcf85c12cc600002f New Rock Asset Partners LP 120-1209542 044-40812 1/1/2019 15 12/17/2018 87.53 12/24/2018 87.53 25 not_valid Memphis, TN 55c20c8af0cf67cb15000180 National Industries for the Blind 120-AR3385 044-40810 1/1/2019 15 12/17/2018 68.4 12/23/2018 68.4 25 not_valid Memphis, TN 55c20c8ef0cf67cb150001f0 Johnson Roofing Company 120-AR3738 044-40755 1/1/2019 15 12/17/2018 228.46 1/2/2019 288.46 25 not_valid Memphis, TN 55c20c8bf0cf67cb1500019c East AR Family Medicine 120-AR504 044-40711 1/1/2019 30 12/2/2018 404.02 12/27/2018 404.02 25 not_valid Memphis, TN 55c0cedff0cf67295a0007d5 PRINCE OF PEACE CATHOLIC CHURCH 044-9332 044-40490 12/1/2018 15 11/16/2018 126.63 11/14/2018 126.63 15 not_valid Memphis, TN 587d134fcf85c12cc600002f New Rock Asset Partners LP 120-1209542 044-40479 12/1/2018 15 11/16/2018 70.77 11/16/2018 70.77 25 not_valid Memphis, TN 55c20c8ef0cf67cb150001f0 Johnson Roofing Company 120-AR3738 044-40422 12/1/2018 15 11/16/2018 228.46 12/1/2018 228.46 25 not_valid Memphis, TN 55c20c8bf0cf67cb1500019c East AR Family Medicine 120-AR504 044-40378 12/1/2018 30 11/1/2018 266.63 11/28/2018 266.63 25 not_valid Memphis, TN 55c0cedef0cf67295a0007bd STONEGATE APARTMENTS 044-9312 044-40177 11/1/2018 15 10/17/2018 417.56 11/5/2018 967.02 15 not_valid Memphis, TN 55c20c8af0cf67cb15000180 National Industries for the Blind 120-AR3385 044-40122 11/1/2018 15 10/17/2018 143.1 
11/2/2018 221.7 25 not_valid Memphis, TN 55c20c8ef0cf67cb150001f0 Johnson Roofing Company 120-AR3738 044-40065 11/1/2018 15 10/17/2018 17.88 11/5/2018 17.88 25 not_valid
1.0
SA Billing - Memphis - Invalid Late Fees - In GitLab by @kdjstudios on Feb 14, 2019, 11:34 **Submitted by:** "Laura Duckworth" <laura.duckworth@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/6612373 **Server:** Internal **Client/Site:** Memphis **Account:** NA **Issue:** Thanks for your patience, if you could remove those invalid fees, that would be much appreciated. Code 7980 Late Charge Adjustment And the description can just be Error Memphis, TN 55c20c85f0cf67cb150000d4 St Paul's Lutheran Church 120-AR18416 044-40874 1/1/2019 15 12/17/2018 250.94 12/23/2018 250.94 25 not_valid Memphis, TN 587d134fcf85c12cc600002f New Rock Asset Partners LP 120-1209542 044-40812 1/1/2019 15 12/17/2018 87.53 12/24/2018 87.53 25 not_valid Memphis, TN 55c20c8af0cf67cb15000180 National Industries for the Blind 120-AR3385 044-40810 1/1/2019 15 12/17/2018 68.4 12/23/2018 68.4 25 not_valid Memphis, TN 55c20c8ef0cf67cb150001f0 Johnson Roofing Company 120-AR3738 044-40755 1/1/2019 15 12/17/2018 228.46 1/2/2019 288.46 25 not_valid Memphis, TN 55c20c8bf0cf67cb1500019c East AR Family Medicine 120-AR504 044-40711 1/1/2019 30 12/2/2018 404.02 12/27/2018 404.02 25 not_valid Memphis, TN 55c0cedff0cf67295a0007d5 PRINCE OF PEACE CATHOLIC CHURCH 044-9332 044-40490 12/1/2018 15 11/16/2018 126.63 11/14/2018 126.63 15 not_valid Memphis, TN 587d134fcf85c12cc600002f New Rock Asset Partners LP 120-1209542 044-40479 12/1/2018 15 11/16/2018 70.77 11/16/2018 70.77 25 not_valid Memphis, TN 55c20c8ef0cf67cb150001f0 Johnson Roofing Company 120-AR3738 044-40422 12/1/2018 15 11/16/2018 228.46 12/1/2018 228.46 25 not_valid Memphis, TN 55c20c8bf0cf67cb1500019c East AR Family Medicine 120-AR504 044-40378 12/1/2018 30 11/1/2018 266.63 11/28/2018 266.63 25 not_valid Memphis, TN 55c0cedef0cf67295a0007bd STONEGATE APARTMENTS 044-9312 044-40177 11/1/2018 15 10/17/2018 417.56 11/5/2018 967.02 15 not_valid Memphis, TN 55c20c8af0cf67cb15000180 National Industries for the Blind 120-AR3385 
044-40122 11/1/2018 15 10/17/2018 143.1 11/2/2018 221.7 25 not_valid Memphis, TN 55c20c8ef0cf67cb150001f0 Johnson Roofing Company 120-AR3738 044-40065 11/1/2018 15 10/17/2018 17.88 11/5/2018 17.88 25 not_valid
process
sa billing memphis invalid late fees in gitlab by kdjstudios on feb submitted by laura duckworth helpdesk server internal client site memphis account na issue thanks for your patience if you could remove those invalid fees that would be much appreciated code late charge adjustment and the description can just be error memphis tn st paul s lutheran church not valid memphis tn new rock asset partners lp not valid memphis tn national industries for the blind not valid memphis tn johnson roofing company not valid memphis tn east ar family medicine not valid memphis tn prince of peace catholic church not valid memphis tn new rock asset partners lp not valid memphis tn johnson roofing company not valid memphis tn east ar family medicine not valid memphis tn stonegate apartments not valid memphis tn national industries for the blind not valid memphis tn johnson roofing company not valid
1
25,415
2,683,723,180
IssuesEvent
2015-03-28 07:47:34
kendraio/kendra_home
https://api.github.com/repos/kendraio/kendra_home
opened
Make Drupal send unwrapped plain text emails
Critical Priority High Priority
I don't want Drupal sending hard wrapped nor HTML emails. Just want unwrapped plain text emails. Format=flowed?
2.0
Make Drupal send unwrapped plain text emails - I don't want Drupal sending hard wrapped nor HTML emails. Just want unwrapped plain text emails. Format=flowed?
non_process
make drupal send unwrapped plain text emails i don t want drupal sending hard wrapped nor html emails just want unwrapped plain text emails format flowed
0
5,068
7,869,082,816
IssuesEvent
2018-06-24 09:13:52
StrikeNP/trac_test
https://api.github.com/repos/StrikeNP/trac_test
closed
Add documentation for Python Scripts (Trac #768)
Migrated from Trac enhancement post_processing weberjk@uwm.edu
Justin added some python scripts that he used for his thesis. We may want to use them in the future. Let's add to their documentation. Use this ticket when making svn commits. Attachments: [plot_explicit_ta_configs.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_explicit_ta_configs.maff) [plot_new_pdf_config_1_plot_2.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_config_1_plot_2.maff) [plot_combo_pdf_run_3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_combo_pdf_run_3.maff) [plot_input_fields_rtp3_thlp3_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_input_fields_rtp3_thlp3_1.maff) [plot_new_pdf_20180522_test_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_20180522_test_1.maff) [plot_attempts_8_10.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempts_8_10.maff) [plot_attempt_8_only.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempt_8_only.maff) [plot_beta_1p3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3.maff) [plot_beta_1p3_all.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3_all.maff) Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/768 ```json { "status": "closed", "changetime": "2015-07-15T02:14:44", "description": "Justin added some python scripts that he used for his thesis. We may want to use them in the future. Let's add to their documentation. Use this ticket when making svn commits. 
", "reporter": "weberjk@uwm.edu", "cc": "vlarson@uwm.edu", "resolution": "worksforme", "_ts": "1436926484509531", "component": "post_processing", "summary": "Add documentation for Python Scripts", "priority": "minor", "keywords": "", "time": "2015-04-23T15:30:13", "milestone": "", "owner": "weberjk@uwm.edu", "type": "enhancement" } ```
1.0
Add documentation for Python Scripts (Trac #768) - Justin added some python scripts that he used for his thesis. We may want to use them in the future. Let's add to their documentation. Use this ticket when making svn commits. Attachments: [plot_explicit_ta_configs.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_explicit_ta_configs.maff) [plot_new_pdf_config_1_plot_2.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_config_1_plot_2.maff) [plot_combo_pdf_run_3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_combo_pdf_run_3.maff) [plot_input_fields_rtp3_thlp3_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_input_fields_rtp3_thlp3_1.maff) [plot_new_pdf_20180522_test_1.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_new_pdf_20180522_test_1.maff) [plot_attempts_8_10.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempts_8_10.maff) [plot_attempt_8_only.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_attempt_8_only.maff) [plot_beta_1p3.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3.maff) [plot_beta_1p3_all.maff](https://github.com/larson-group/trac_attachment_archive/blob/master/trac_test/822/plot_beta_1p3_all.maff) Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/768 ```json { "status": "closed", "changetime": "2015-07-15T02:14:44", "description": "Justin added some python scripts that he used for his thesis. We may want to use them in the future. Let's add to their documentation. Use this ticket when making svn commits. 
", "reporter": "weberjk@uwm.edu", "cc": "vlarson@uwm.edu", "resolution": "worksforme", "_ts": "1436926484509531", "component": "post_processing", "summary": "Add documentation for Python Scripts", "priority": "minor", "keywords": "", "time": "2015-04-23T15:30:13", "milestone": "", "owner": "weberjk@uwm.edu", "type": "enhancement" } ```
process
add documentation for python scripts trac justin added some python scripts that he used for his thesis we may want to use them in the future let s add to their documentation use this ticket when making svn commits attachments migrated from json status closed changetime description justin added some python scripts that he used for his thesis we may want to use them in the future let s add to their documentation use this ticket when making svn commits reporter weberjk uwm edu cc vlarson uwm edu resolution worksforme ts component post processing summary add documentation for python scripts priority minor keywords time milestone owner weberjk uwm edu type enhancement
1
187,469
22,045,751,161
IssuesEvent
2022-05-30 01:22:00
Nivaskumark/kernel_v4.1.15
https://api.github.com/repos/Nivaskumark/kernel_v4.1.15
closed
CVE-2019-19332 (Medium) detected in linux-stable-rtv4.1.33 - autoclosed
security vulnerability
## CVE-2019-19332 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/Nivaskumark/kernel_v4.1.15/commit/00db4e8795bcbec692fb60b19160bdd763ad42e3">00db4e8795bcbec692fb60b19160bdd763ad42e3</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/arch/x86/kvm/cpuid.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/arch/x86/kvm/cpuid.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An out-of-bounds memory write issue was found in the Linux Kernel, version 3.13 through 5.4, in the way the Linux kernel's KVM hypervisor handled the 'KVM_GET_EMULATED_CPUID' ioctl(2) request to get CPUID features emulated by the KVM hypervisor. A user or process able to access the '/dev/kvm' device could use this flaw to crash the system, resulting in a denial of service. 
<p>Publish Date: 2020-01-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-19332>CVE-2019-19332</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2019-19332">https://www.linuxkernelcves.com/cves/CVE-2019-19332</a></p> <p>Release Date: 2020-03-13</p> <p>Fix Resolution: v5.5-rc1,v3.16.79,v4.14.159,v4.19.89,v4.4.207,v4.9.207,v5.3.16,v5.4.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-19332 (Medium) detected in linux-stable-rtv4.1.33 - autoclosed - ## CVE-2019-19332 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/Nivaskumark/kernel_v4.1.15/commit/00db4e8795bcbec692fb60b19160bdd763ad42e3">00db4e8795bcbec692fb60b19160bdd763ad42e3</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/arch/x86/kvm/cpuid.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/arch/x86/kvm/cpuid.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An out-of-bounds memory write issue was found in the Linux Kernel, version 3.13 through 5.4, in the way the Linux kernel's KVM hypervisor handled the 'KVM_GET_EMULATED_CPUID' ioctl(2) request to get CPUID features emulated by the KVM hypervisor. A user or process able to access the '/dev/kvm' device could use this flaw to crash the system, resulting in a denial of service. 
<p>Publish Date: 2020-01-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-19332>CVE-2019-19332</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2019-19332">https://www.linuxkernelcves.com/cves/CVE-2019-19332</a></p> <p>Release Date: 2020-03-13</p> <p>Fix Resolution: v5.5-rc1,v3.16.79,v4.14.159,v4.19.89,v4.4.207,v4.9.207,v5.3.16,v5.4.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in linux stable autoclosed cve medium severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files arch kvm cpuid c arch kvm cpuid c vulnerability details an out of bounds memory write issue was found in the linux kernel version through in the way the linux kernel s kvm hypervisor handled the kvm get emulated cpuid ioctl request to get cpuid features emulated by the kvm hypervisor a user or process able to access the dev kvm device could use this flaw to crash the system resulting in a denial of service publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
5,262
8,056,309,004
IssuesEvent
2018-08-02 12:18:09
prusa3d/Slic3r
https://api.github.com/repos/prusa3d/Slic3r
closed
Post-processing script launches another instance of Slic3r and throws invalid file type error
post processing scripts
### Version 1.39.1-prusa3d-win64 ### Operating system type + version Windows 10 Home edition ### Behavior I have a perl script defined in Post-processing scripts section of the Output options that calls GPX.exe to create the .x3g file. When I click on Export G-Code, the .gcode file is created and I see "Running post-processing scripts..." in the status bar. However, instead of GPX running, a new instance of Slic3r launches and an error appears saying the file type is not valid. Clicking OK to this displays a second duplicate error error message, but the file is different. In the first error, the file reference is the perl script. In the second error, the file name is the .gcode file generated. This was working in 1.37.1 ![image](https://user-images.githubusercontent.com/29186424/38339102-27d9083e-3832-11e8-92bc-c3efaa70f620.png) First Error ![image](https://user-images.githubusercontent.com/29186424/38339212-bed46e2c-3832-11e8-95fd-b723e6b19f60.png) ![image](https://user-images.githubusercontent.com/29186424/38339161-73f7df42-3832-11e8-8299-a908c3542610.png) Second Error ![image](https://user-images.githubusercontent.com/29186424/38339190-9af1de5e-3832-11e8-8648-b262bad69aa7.png) Two instances of Slic3r 1.39.1 running now and the first instance still shows "Running post-processing scripts..." till the second instance is closed. #### STL/Config (.ZIP) where problem occurs N/A
1.0
Post-processing script launches another instance of Slic3r and throws invalid file type error - ### Version 1.39.1-prusa3d-win64 ### Operating system type + version Windows 10 Home edition ### Behavior I have a perl script defined in Post-processing scripts section of the Output options that calls GPX.exe to create the .x3g file. When I click on Export G-Code, the .gcode file is created and I see "Running post-processing scripts..." in the status bar. However, instead of GPX running, a new instance of Slic3r launches and an error appears saying the file type is not valid. Clicking OK to this displays a second duplicate error error message, but the file is different. In the first error, the file reference is the perl script. In the second error, the file name is the .gcode file generated. This was working in 1.37.1 ![image](https://user-images.githubusercontent.com/29186424/38339102-27d9083e-3832-11e8-92bc-c3efaa70f620.png) First Error ![image](https://user-images.githubusercontent.com/29186424/38339212-bed46e2c-3832-11e8-95fd-b723e6b19f60.png) ![image](https://user-images.githubusercontent.com/29186424/38339161-73f7df42-3832-11e8-8299-a908c3542610.png) Second Error ![image](https://user-images.githubusercontent.com/29186424/38339190-9af1de5e-3832-11e8-8648-b262bad69aa7.png) Two instances of Slic3r 1.39.1 running now and the first instance still shows "Running post-processing scripts..." till the second instance is closed. #### STL/Config (.ZIP) where problem occurs N/A
process
post processing script launches another instance of and throws invalid file type error version operating system type version windows home edition behavior i have a perl script defined in post processing scripts section of the output options that calls gpx exe to create the file when i click on export g code the gcode file is created and i see running post processing scripts in the status bar however instead of gpx running a new instance of launches and an error appears saying the file type is not valid clicking ok to this displays a second duplicate error error message but the file is different in the first error the file reference is the perl script in the second error the file name is the gcode file generated this was working in first error second error two instances of running now and the first instance still shows running post processing scripts till the second instance is closed stl config zip where problem occurs n a
1
259,053
19,583,791,434
IssuesEvent
2022-01-05 02:25:34
coolbutuseless/ggpattern
https://api.github.com/repos/coolbutuseless/ggpattern
closed
Improve 'geom_*' man documentation
documentation
Finish updates started in f9cf485: * [x] Update documented pattern aesthetics to match the updated vignette https://coolbutuseless.github.io/package/ggpattern/articles/developing-patterns.html * [x] Add a couple of examples * [ ] Nice if we could figure out to get `geom_*` functions alphabetical in pgkdown site (but this is not that important)
1.0
Improve 'geom_*' man documentation - Finish updates started in f9cf485: * [x] Update documented pattern aesthetics to match the updated vignette https://coolbutuseless.github.io/package/ggpattern/articles/developing-patterns.html * [x] Add a couple of examples * [ ] Nice if we could figure out to get `geom_*` functions alphabetical in pgkdown site (but this is not that important)
non_process
improve geom man documentation finish updates started in update documented pattern aesthetics to match the updated vignette add a couple of examples nice if we could figure out to get geom functions alphabetical in pgkdown site but this is not that important
0
94,511
8,495,408,006
IssuesEvent
2018-10-29 04:52:05
CfSOtago/evAnalysis
https://api.github.com/repos/CfSOtago/evAnalysis
closed
FTF Test Data: missing GPS Date/Time on ~ 30% of observations
bug ftfTestData
Caused by loss of signal e.g. charging in garages. Daniel will fix for ftfSample_v1.0 using other variables.
1.0
FTF Test Data: missing GPS Date/Time on ~ 30% of observations - Caused by loss of signal e.g. charging in garages. Daniel will fix for ftfSample_v1.0 using other variables.
non_process
ftf test data missing gps date time on of observations caused by loss of signal e g charging in garages daniel will fix for ftfsample using other variables
0
244,299
7,873,240,894
IssuesEvent
2018-06-25 13:46:53
Repair-DeskPOS/RepairDesk-Bugs
https://api.github.com/repos/Repair-DeskPOS/RepairDesk-Bugs
closed
Customer Group Discounts Incorrect on Invoice
Discount Medium Priority Resolved bug
I have a friends and family discount group that provides a 25% discount on all accessories. The discount applies properly to the ticket, but when it creates an invoice the discount calculates wrong. Look at open ticket #0-110 in my RepairDesk right now and you will see a tempered glass screen protector for $34.99 that is discounted 25% to $26.24 If you look at invoice #93 for the same transaction you will see that the price changes to $31.49
1.0
Customer Group Discounts Incorrect on Invoice - I have a friends and family discount group that provides a 25% discount on all accessories. The discount applies properly to the ticket, but when it creates an invoice the discount calculates wrong. Look at open ticket #0-110 in my RepairDesk right now and you will see a tempered glass screen protector for $34.99 that is discounted 25% to $26.24 If you look at invoice #93 for the same transaction you will see that the price changes to $31.49
non_process
customer group discounts incorrect on invoice i have a friends and family discount group that provides a discount on all accessories the discount applies properly to the ticket but when it creates an invoice the discount calculates wrong look at open ticket in my repairdesk right now and you will see a tempered glass screen protector for that is discounted to if you look at invoice for the same transaction you will see that the price changes to
0
269,190
28,960,025,487
IssuesEvent
2023-05-10 01:09:14
ChoeMinji/mongo-r4.4.6
https://api.github.com/repos/ChoeMinji/mongo-r4.4.6
reopened
CVE-2020-25659 (Medium) detected in cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl
Mend: dependency security vulnerability
## CVE-2020-25659 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl</b></p></summary> <p>cryptography is a package which provides cryptographic recipes and primitives to Python developers.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/c2/fa/fa9a8933c285895935d1392922fe721e9cb1b2c1881d14f149213a227ee3/cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/c2/fa/fa9a8933c285895935d1392922fe721e9cb1b2c1881d14f149213a227ee3/cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /etc/pip/dev-requirements.txt</p> <p>Path to vulnerable library: /etc/pip/dev-requirements.txt,/etc/pip/toolchain-requirements.txt,/src/third_party/wiredtiger/lang/python,/etc/pip/dev-requirements.txt,/etc/pip/jira-requirements.txt,/src/third_party/wiredtiger/bench/workgen</p> <p> Dependency Hierarchy: - :x: **cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ChoeMinji/mongo-r4.4.6/commit/9c4537f1af3987a4f237e73712977c87c207c818">9c4537f1af3987a4f237e73712977c87c207c818</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> python-cryptography 3.2 is vulnerable to Bleichenbacher timing attacks in the RSA decryption API, via timed processing of valid PKCS#1 v1.5 ciphertext. 
<p>Publish Date: 2021-01-11 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-25659>CVE-2020-25659</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/pyca/cryptography/security/advisories/GHSA-hggm-jpg3-v476">https://github.com/pyca/cryptography/security/advisories/GHSA-hggm-jpg3-v476</a></p> <p>Release Date: 2021-01-11</p> <p>Fix Resolution: 3.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-25659 (Medium) detected in cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl - ## CVE-2020-25659 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl</b></p></summary> <p>cryptography is a package which provides cryptographic recipes and primitives to Python developers.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/c2/fa/fa9a8933c285895935d1392922fe721e9cb1b2c1881d14f149213a227ee3/cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/c2/fa/fa9a8933c285895935d1392922fe721e9cb1b2c1881d14f149213a227ee3/cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /etc/pip/dev-requirements.txt</p> <p>Path to vulnerable library: /etc/pip/dev-requirements.txt,/etc/pip/toolchain-requirements.txt,/src/third_party/wiredtiger/lang/python,/etc/pip/dev-requirements.txt,/etc/pip/jira-requirements.txt,/src/third_party/wiredtiger/bench/workgen</p> <p> Dependency Hierarchy: - :x: **cryptography-2.3-cp34-abi3-manylinux1_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ChoeMinji/mongo-r4.4.6/commit/9c4537f1af3987a4f237e73712977c87c207c818">9c4537f1af3987a4f237e73712977c87c207c818</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> python-cryptography 3.2 is vulnerable to Bleichenbacher timing attacks in the RSA decryption API, via timed processing of valid PKCS#1 v1.5 ciphertext. 
<p>Publish Date: 2021-01-11 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-25659>CVE-2020-25659</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/pyca/cryptography/security/advisories/GHSA-hggm-jpg3-v476">https://github.com/pyca/cryptography/security/advisories/GHSA-hggm-jpg3-v476</a></p> <p>Release Date: 2021-01-11</p> <p>Fix Resolution: 3.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in cryptography whl cve medium severity vulnerability vulnerable library cryptography whl cryptography is a package which provides cryptographic recipes and primitives to python developers library home page a href path to dependency file etc pip dev requirements txt path to vulnerable library etc pip dev requirements txt etc pip toolchain requirements txt src third party wiredtiger lang python etc pip dev requirements txt etc pip jira requirements txt src third party wiredtiger bench workgen dependency hierarchy x cryptography whl vulnerable library found in head commit a href found in base branch main vulnerability details python cryptography is vulnerable to bleichenbacher timing attacks in the rsa decryption api via timed processing of valid pkcs ciphertext publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0