| column | kind | min | max |
|---|---|---|---|
| Unnamed: 0 | int64 | 0 | 832k |
| id | float64 | 2.49B | 32.1B |
| type | stringclasses | 1 value | |
| created_at | stringlengths | 19 | 19 |
| repo | stringlengths | 5 | 112 |
| repo_url | stringlengths | 34 | 141 |
| action | stringclasses | 3 values | |
| title | stringlengths | 1 | 1k |
| labels | stringlengths | 4 | 1.38k |
| body | stringlengths | 1 | 262k |
| index | stringclasses | 16 values | |
| text_combine | stringlengths | 96 | 262k |
| label | stringclasses | 2 values | |
| text | stringlengths | 96 | 252k |
| binary_label | int64 | 0 | 1 |
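The schema above can be turned into a quick sanity check for a single record. This is a minimal sketch, not part of the dataset: the column names come from the table, but the mapping of dtypes to Python types (int64 to `int`, float64 to `float`, stringlengths/stringclasses to `str`) and the `validate` helper are my own assumptions.

```python
# Sketch: check that one record carries every column from the schema above.
# The dtype -> Python type mapping is assumed, not defined by the dataset.
SCHEMA = {
    "Unnamed: 0": int, "id": float, "type": str, "created_at": str,
    "repo": str, "repo_url": str, "action": str, "title": str,
    "labels": str, "body": str, "index": str, "text_combine": str,
    "label": str, "text": str, "binary_label": int,
}

def validate(record: dict) -> bool:
    """True when the record has all 15 columns with the expected types."""
    return all(
        key in record and isinstance(record[key], expected)
        for key, expected in SCHEMA.items()
    )
```

Note that `index` is treated as a string here because the schema lists it as stringclasses (its values render like "1.0" in the records below).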
Unnamed: 0: 291,465
id: 8,925,527,268
type: IssuesEvent
created_at: 2019-01-21 23:11:04
repo: mat3rial-dev/page_rwanda_memorial
repo_url: https://api.github.com/repos/mat3rial-dev/page_rwanda_memorial
action: opened
title: Implement admin view
labels: priority-high
body:
Page used by database owner to administer admin users. These administrative users will be in charge of curating the data entered by the general public. This page requires an authentication system.
index: 1.0
text_combine:
Implement admin view - Page used by database owner to administer admin users. These administrative users will be in charge of curating the data entered by the general public. This page requires an authentication system.
label: priority
text:
implement admin view page used by database owner to administer admin users these administrative users will be in charge of curating the data entered by the general public this page requires an authentication system
binary_label: 1
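The derived columns in these records follow a simple pattern. A hedged sketch of that pattern, inferred only from the rows shown here (the function names are made up, not dataset code): `text_combine` joins title and body with " - ", and `binary_label` maps the two `label` classes to 1 and 0.

```python
def combine_text(title: str, body: str) -> str:
    # text_combine appears to be the title and body joined with " - "
    return f"{title} - {body}"

def to_binary_label(label: str) -> int:
    # label takes two classes in the records shown, "priority" and
    # "non_priority"; binary_label is 1 for "priority", 0 otherwise
    return 1 if label == "priority" else 0
```

The `text` column looks like a further cleaning pass (lowercased, with URLs, mentions, numbers, and punctuation stripped), which is not sketched here since its exact rules are not recoverable from the samples.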
Unnamed: 0: 33,488
id: 2,765,734,500
type: IssuesEvent
created_at: 2015-04-29 22:10:00
repo: biocore/qiita
repo_url: https://api.github.com/repos/biocore/qiita
action: closed
title: re-defining the required fields for the study and prep templates to facilitate addition of new data
labels: database changes priority: high refactor
body:
In working on [documentation of the sample and prep templates](https://github.com/biocore/qiita/wiki/Preparing-Qiita-template-files) @rob-knight, @adamrp, @wasade, @josenavas and I discussed updating the requirements for sample and prep template fields. These fields should no longer be required: - [ ] ``emp_status`` - [ ] ``run_prefix`` - [ ] ``sample_type`` - [ ] ``description`` - [ ] ``has_physical_specimen`` - [ ] ``physical_location`` - [ ] ``has_extracted_data`` - [ ] ``host_subject_id`` - [ ] ``barcode`` (this will not be relevant when users can start with demultiplexed data) - [ ] ``primer`` - [ ] ``linker`` (this is not always used) - [ ] ``center_name`` - [ ] ``platform`` - [ ] ``library_construction_protocol`` - [ ] ``experiment_design_description`` These fields should be removed all together: - [ ] ``required_sample_info_status`` - [ ] ``center_project_name`` These fields should have their names changed: - [ ] ``sample_name`` will be changed to ``SampleID`` - [ ] ``has_physical_specimen`` should change to ``physical_specimen_remaining`` - [ ] ``physical_location`` should change to ``physical_specimen_location`` - [ ] ``has_extracted_data``should change to ``dna_extracted`` Format/allowed value changes: - [ ] ``platform`` the allowed values need to be expanded in or linked from the [wiki document](https://github.com/biocore/qiita/wiki/Preparing-Qiita-template-files)
index: 1.0
text_combine:
re-defining the required fields for the study and prep templates to facilitate addition of new data - In working on [documentation of the sample and prep templates](https://github.com/biocore/qiita/wiki/Preparing-Qiita-template-files) @rob-knight, @adamrp, @wasade, @josenavas and I discussed updating the requirements for sample and prep template fields. These fields should no longer be required: - [ ] ``emp_status`` - [ ] ``run_prefix`` - [ ] ``sample_type`` - [ ] ``description`` - [ ] ``has_physical_specimen`` - [ ] ``physical_location`` - [ ] ``has_extracted_data`` - [ ] ``host_subject_id`` - [ ] ``barcode`` (this will not be relevant when users can start with demultiplexed data) - [ ] ``primer`` - [ ] ``linker`` (this is not always used) - [ ] ``center_name`` - [ ] ``platform`` - [ ] ``library_construction_protocol`` - [ ] ``experiment_design_description`` These fields should be removed all together: - [ ] ``required_sample_info_status`` - [ ] ``center_project_name`` These fields should have their names changed: - [ ] ``sample_name`` will be changed to ``SampleID`` - [ ] ``has_physical_specimen`` should change to ``physical_specimen_remaining`` - [ ] ``physical_location`` should change to ``physical_specimen_location`` - [ ] ``has_extracted_data``should change to ``dna_extracted`` Format/allowed value changes: - [ ] ``platform`` the allowed values need to be expanded in or linked from the [wiki document](https://github.com/biocore/qiita/wiki/Preparing-Qiita-template-files)
label: priority
text:
re defining the required fields for the study and prep templates to facilitate addition of new data in working on rob knight adamrp wasade josenavas and i discussed updating the requirements for sample and prep template fields these fields should no longer be required emp status run prefix sample type description has physical specimen physical location has extracted data host subject id barcode this will not be relevant when users can start with demultiplexed data primer linker this is not always used center name platform library construction protocol experiment design description these fields should be removed all together required sample info status center project name these fields should have their names changed sample name will be changed to sampleid has physical specimen should change to physical specimen remaining physical location should change to physical specimen location has extracted data should change to dna extracted format allowed value changes platform the allowed values need to be expanded in or linked from the
binary_label: 1
Unnamed: 0: 348,098
id: 10,438,771,318
type: IssuesEvent
created_at: 2019-09-18 03:30:52
repo: nmiodice/terraform-azure-devops-hack
repo_url: https://api.github.com/repos/nmiodice/terraform-azure-devops-hack
action: closed
title: AzDO Client Implementation
labels: High Priority
body:
# Description As contributor to the Azure DevOps Terraform Provider, I want to use an authenticated client to interact with AzDO # Acceptance Criteria - [x] AzDO client written in Go can authenticate with AzDO using a personal access token - [x] Client can be used to provision resources within AzDO - [ ] Test the Client implementation
index: 1.0
text_combine:
AzDO Client Implementation - # Description As contributor to the Azure DevOps Terraform Provider, I want to use an authenticated client to interact with AzDO # Acceptance Criteria - [x] AzDO client written in Go can authenticate with AzDO using a personal access token - [x] Client can be used to provision resources within AzDO - [ ] Test the Client implementation
label: priority
text:
azdo client implementation description as contributor to the azure devops terraform provider i want to use an authenticated client to interact with azdo acceptance criteria azdo client written in go can authenticate with azdo using a personal access token client can be used to provision resources within azdo test the client implementation
binary_label: 1
Unnamed: 0: 184,510
id: 14,289,501,215
type: IssuesEvent
created_at: 2020-11-23 19:21:43
repo: github-vet/rangeclosure-findings
repo_url: https://api.github.com/repos/github-vet/rangeclosure-findings
action: closed
title: ipsn/go-ipfs: gxlibs/github.com/libp2p/go-libp2p-swarm/swarm_test.go; 111 LoC
labels: fresh large test
body:
Found a possible issue in [ipsn/go-ipfs](https://www.github.com/ipsn/go-ipfs) at [gxlibs/github.com/libp2p/go-libp2p-swarm/swarm_test.go](https://github.com/ipsn/go-ipfs/blob/8b9b72514244155bc38ab21eac7f4d950ea97c28/gxlibs/github.com/libp2p/go-libp2p-swarm/swarm_test.go#L111-L221) The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements which capture loop variables. [Click here to see the code in its original context.](https://github.com/ipsn/go-ipfs/blob/8b9b72514244155bc38ab21eac7f4d950ea97c28/gxlibs/github.com/libp2p/go-libp2p-swarm/swarm_test.go#L111-L221) <details> <summary>Click here to show the 111 line(s) of Go which triggered the analyzer.</summary> ```go for _, s1 := range swarms { log.Debugf("-------------------------------------------------------") log.Debugf("%s ping pong round", s1.LocalPeer()) log.Debugf("-------------------------------------------------------") _, cancel := context.WithCancel(ctx) got := map[peer.ID]int{} errChan := make(chan error, MsgNum*len(swarms)) streamChan := make(chan inet.Stream, MsgNum) // send out "ping" x MsgNum to every peer go func() { defer close(streamChan) var wg sync.WaitGroup send := func(p peer.ID) { defer wg.Done() // first, one stream per peer (nice) stream, err := s1.NewStream(ctx, p) if err != nil { errChan <- err return } // send out ping! for k := 0; k < MsgNum; k++ { // with k messages msg := "ping" log.Debugf("%s %s %s (%d)", s1.LocalPeer(), msg, p, k) if _, err := stream.Write([]byte(msg)); err != nil { errChan <- err continue } } // read it later streamChan <- stream } for _, s2 := range swarms { if s2.LocalPeer() == s1.LocalPeer() { continue // dont send to self... 
} wg.Add(1) go send(s2.LocalPeer()) } wg.Wait() }() // receive "pong" x MsgNum from every peer go func() { defer close(errChan) count := 0 countShouldBe := MsgNum * (len(swarms) - 1) for stream := range streamChan { // one per peer defer stream.Close() // get peer on the other side p := stream.Conn().RemotePeer() // receive pings msgCount := 0 msg := make([]byte, 4) for k := 0; k < MsgNum; k++ { // with k messages // read from the stream if _, err := stream.Read(msg); err != nil { errChan <- err continue } if string(msg) != "pong" { errChan <- fmt.Errorf("unexpected message: %s", msg) continue } log.Debugf("%s %s %s (%d)", s1.LocalPeer(), msg, p, k) msgCount++ } got[p] = msgCount count += msgCount } if count != countShouldBe { errChan <- fmt.Errorf("count mismatch: %d != %d", count, countShouldBe) } }() // check any errors (blocks till consumer is done) for err := range errChan { if err != nil { t.Error(err.Error()) } } log.Debugf("%s got pongs", s1.LocalPeer()) if (len(swarms) - 1) != len(got) { t.Errorf("got (%d) less messages than sent (%d).", len(got), len(swarms)) } for p, n := range got { if n != MsgNum { t.Error("peer did not get all msgs", p, n, "/", MsgNum) } } cancel() <-time.After(10 * time.Millisecond) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 8b9b72514244155bc38ab21eac7f4d950ea97c28
index: 1.0
text_combine:
ipsn/go-ipfs: gxlibs/github.com/libp2p/go-libp2p-swarm/swarm_test.go; 111 LoC - Found a possible issue in [ipsn/go-ipfs](https://www.github.com/ipsn/go-ipfs) at [gxlibs/github.com/libp2p/go-libp2p-swarm/swarm_test.go](https://github.com/ipsn/go-ipfs/blob/8b9b72514244155bc38ab21eac7f4d950ea97c28/gxlibs/github.com/libp2p/go-libp2p-swarm/swarm_test.go#L111-L221) The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements which capture loop variables. [Click here to see the code in its original context.](https://github.com/ipsn/go-ipfs/blob/8b9b72514244155bc38ab21eac7f4d950ea97c28/gxlibs/github.com/libp2p/go-libp2p-swarm/swarm_test.go#L111-L221) <details> <summary>Click here to show the 111 line(s) of Go which triggered the analyzer.</summary> ```go for _, s1 := range swarms { log.Debugf("-------------------------------------------------------") log.Debugf("%s ping pong round", s1.LocalPeer()) log.Debugf("-------------------------------------------------------") _, cancel := context.WithCancel(ctx) got := map[peer.ID]int{} errChan := make(chan error, MsgNum*len(swarms)) streamChan := make(chan inet.Stream, MsgNum) // send out "ping" x MsgNum to every peer go func() { defer close(streamChan) var wg sync.WaitGroup send := func(p peer.ID) { defer wg.Done() // first, one stream per peer (nice) stream, err := s1.NewStream(ctx, p) if err != nil { errChan <- err return } // send out ping! for k := 0; k < MsgNum; k++ { // with k messages msg := "ping" log.Debugf("%s %s %s (%d)", s1.LocalPeer(), msg, p, k) if _, err := stream.Write([]byte(msg)); err != nil { errChan <- err continue } } // read it later streamChan <- stream } for _, s2 := range swarms { if s2.LocalPeer() == s1.LocalPeer() { continue // dont send to self... 
} wg.Add(1) go send(s2.LocalPeer()) } wg.Wait() }() // receive "pong" x MsgNum from every peer go func() { defer close(errChan) count := 0 countShouldBe := MsgNum * (len(swarms) - 1) for stream := range streamChan { // one per peer defer stream.Close() // get peer on the other side p := stream.Conn().RemotePeer() // receive pings msgCount := 0 msg := make([]byte, 4) for k := 0; k < MsgNum; k++ { // with k messages // read from the stream if _, err := stream.Read(msg); err != nil { errChan <- err continue } if string(msg) != "pong" { errChan <- fmt.Errorf("unexpected message: %s", msg) continue } log.Debugf("%s %s %s (%d)", s1.LocalPeer(), msg, p, k) msgCount++ } got[p] = msgCount count += msgCount } if count != countShouldBe { errChan <- fmt.Errorf("count mismatch: %d != %d", count, countShouldBe) } }() // check any errors (blocks till consumer is done) for err := range errChan { if err != nil { t.Error(err.Error()) } } log.Debugf("%s got pongs", s1.LocalPeer()) if (len(swarms) - 1) != len(got) { t.Errorf("got (%d) less messages than sent (%d).", len(got), len(swarms)) } for p, n := range got { if n != MsgNum { t.Error("peer did not get all msgs", p, n, "/", MsgNum) } } cancel() <-time.After(10 * time.Millisecond) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 8b9b72514244155bc38ab21eac7f4d950ea97c28
label: non_priority
text:
ipsn go ipfs gxlibs github com go swarm swarm test go loc found a possible issue in at the below snippet of go code triggered static analysis which searches for goroutines and or defer statements which capture loop variables click here to show the line s of go which triggered the analyzer go for range swarms log debugf log debugf s ping pong round localpeer log debugf cancel context withcancel ctx got map int errchan make chan error msgnum len swarms streamchan make chan inet stream msgnum send out ping x msgnum to every peer go func defer close streamchan var wg sync waitgroup send func p peer id defer wg done first one stream per peer nice stream err newstream ctx p if err nil errchan err return send out ping for k k msgnum k with k messages msg ping log debugf s s s d localpeer msg p k if err stream write byte msg err nil errchan err continue read it later streamchan stream for range swarms if localpeer localpeer continue dont send to self wg add go send localpeer wg wait receive pong x msgnum from every peer go func defer close errchan count countshouldbe msgnum len swarms for stream range streamchan one per peer defer stream close get peer on the other side p stream conn remotepeer receive pings msgcount msg make byte for k k msgnum k with k messages read from the stream if err stream read msg err nil errchan err continue if string msg pong errchan fmt errorf unexpected message s msg continue log debugf s s s d localpeer msg p k msgcount got msgcount count msgcount if count countshouldbe errchan fmt errorf count mismatch d d count countshouldbe check any errors blocks till consumer is done for err range errchan if err nil t error err error log debugf s got pongs localpeer if len swarms len got t errorf got d less messages than sent d len got len swarms for p n range got if n msgnum t error peer did not get all msgs p n msgnum cancel time after time millisecond leave a reaction on this issue to contribute to the project by classifying this instance as a bug 
mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
binary_label: 0
Unnamed: 0: 813,016
id: 30,442,468,136
type: IssuesEvent
created_at: 2023-07-15 08:23:55
repo: kubernetes/minikube
repo_url: https://api.github.com/repos/kubernetes/minikube
action: closed
title: Docker sunsets Free Team Organizations affecting Kicbase
labels: priority/critical-urgent lifecycle/rotten
body:
Following the same issue as kind https://github.com/kubernetes-sigs/kind/issues/3124 Just got the notice, we have Until April 14th to resolve this. here is a list of organizations we use in build and release of minikube, even though minikube uses docker hub as a fail over, it is essential for users not having access to gcr.io - kicbase/stable (production images) - kicbase/echo-server (production image used on tutorial on website) - kicbase/build (images used in CI) it seems like we could apply for open source exception here I will apply today https://web.docker.com/rs/790-SSB-375/images/privatereposfaq.pdf
index: 1.0
text_combine:
Docker sunsets Free Team Organizations affecting Kicbase - Following the same issue as kind https://github.com/kubernetes-sigs/kind/issues/3124 Just got the notice, we have Until April 14th to resolve this. here is a list of organizations we use in build and release of minikube, even though minikube uses docker hub as a fail over, it is essential for users not having access to gcr.io - kicbase/stable (production images) - kicbase/echo-server (production image used on tutorial on website) - kicbase/build (images used in CI) it seems like we could apply for open source exception here I will apply today https://web.docker.com/rs/790-SSB-375/images/privatereposfaq.pdf
label: priority
text:
docker sunsets free team organizations affecting kicbase following the same issue as kind just got the notice we have until april to resolve this here is a list of organizations we use in build and release of minikube even though minikube uses docker hub as a fail over it is essential for users not having access to gcr io kicbase stable production images kicbase echo server production image used on tutorial on website kicbase build images used in ci it seems like we could apply for open source exception here i will apply today
binary_label: 1
Unnamed: 0: 108,964
id: 4,365,084,533
type: IssuesEvent
created_at: 2016-08-03 09:29:50
repo: google/paco
repo_url: https://api.github.com/repos/google/paco
action: opened
title: iOS - Last experiment on invitations page is not touchable
labels: Component-iOS Component-UI Priority-Medium
body:
With many experiments, more than one screen full, the last experiment is visible when scrolling done, but the screen stops and scrolls back up to hide it when you quit scrolling. This makes it hidden and untouchable.
index: 1.0
text_combine:
iOS - Last experiment on invitations page is not touchable - With many experiments, more than one screen full, the last experiment is visible when scrolling done, but the screen stops and scrolls back up to hide it when you quit scrolling. This makes it hidden and untouchable.
label: priority
text:
ios last experiment on invitations page is not touchable with many experiments more than one screen full the last experiment is visible when scrolling done but the screen stops and scrolls back up to hide it when you quit scrolling this makes it hidden and untouchable
binary_label: 1
Unnamed: 0: 380,704
id: 11,270,039,923
type: IssuesEvent
created_at: 2020-01-14 10:06:47
repo: godotengine/godot
repo_url: https://api.github.com/repos/godotengine/godot
action: closed
title: Importing ogg vorbis files causes a crash in 3.1 stable
labels: bug confirmed high priority topic:import
body:
Opening a project with ogg vorbis files, or importing them, causes the editor to immediately crash. Here's what I get from running the editor in the command line: ``` ERROR: set_data: Condition ' ogg_stream == __null ' is true. At: modules/stb_vorbis/audio_stream_ogg_vorbis.cpp:196. ERROR: instance_playback: Condition ' data == __null ' is true. returned: ovs At: modules/stb_vorbis/audio_stream_ogg_vorbis.cpp:132. handle_crash: Program crashed with signal 11 Dumping the backtrace. Please include this when reporting the bug on https://github.com/godotengine/godot/issues [1] /lib/x86_64-linux-gnu/libc.so.6(+0x41100) [0x7fc8ea0a2100] (??:0) [2] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0x12ed453] (??:?) [3] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0x115f22a] (??:?) [4] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0x10beb67] (<artificial>:?) [5] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0x10ce95a] (<artificial>:?) [6] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0xce0080] (<artificial>:?) [7] /lib/x86_64-linux-gnu/libpthread.so.0(+0x8164) [0x7fc8e94f9164] (??:0) [8] /lib/x86_64-linux-gnu/libc.so.6(clone+0x3f) [0x7fc8ea17bdef] (??:0) -- END OF BACKTRACE -- ``` Removing all ogg vorbis files from the project (using the beta 7 editor), allows projects to launch without a problem.
index: 1.0
text_combine:
Importing ogg vorbis files causes a crash in 3.1 stable - Opening a project with ogg vorbis files, or importing them, causes the editor to immediately crash. Here's what I get from running the editor in the command line: ``` ERROR: set_data: Condition ' ogg_stream == __null ' is true. At: modules/stb_vorbis/audio_stream_ogg_vorbis.cpp:196. ERROR: instance_playback: Condition ' data == __null ' is true. returned: ovs At: modules/stb_vorbis/audio_stream_ogg_vorbis.cpp:132. handle_crash: Program crashed with signal 11 Dumping the backtrace. Please include this when reporting the bug on https://github.com/godotengine/godot/issues [1] /lib/x86_64-linux-gnu/libc.so.6(+0x41100) [0x7fc8ea0a2100] (??:0) [2] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0x12ed453] (??:?) [3] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0x115f22a] (??:?) [4] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0x10beb67] (<artificial>:?) [5] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0x10ce95a] (<artificial>:?) [6] /home/user/Projects/Godot/Godot_v3.1-stable_x11.64() [0xce0080] (<artificial>:?) [7] /lib/x86_64-linux-gnu/libpthread.so.0(+0x8164) [0x7fc8e94f9164] (??:0) [8] /lib/x86_64-linux-gnu/libc.so.6(clone+0x3f) [0x7fc8ea17bdef] (??:0) -- END OF BACKTRACE -- ``` Removing all ogg vorbis files from the project (using the beta 7 editor), allows projects to launch without a problem.
label: priority
text:
importing ogg vorbis files causes a crash in stable opening a project with ogg vorbis files or importing them causes the editor to immediately crash here s what i get from running the editor in the command line error set data condition ogg stream null is true at modules stb vorbis audio stream ogg vorbis cpp error instance playback condition data null is true returned ovs at modules stb vorbis audio stream ogg vorbis cpp handle crash program crashed with signal dumping the backtrace please include this when reporting the bug on lib linux gnu libc so home user projects godot godot stable home user projects godot godot stable home user projects godot godot stable home user projects godot godot stable home user projects godot godot stable lib linux gnu libpthread so lib linux gnu libc so clone end of backtrace removing all ogg vorbis files from the project using the beta editor allows projects to launch without a problem
binary_label: 1
Unnamed: 0: 237,194
id: 7,757,200,169
type: IssuesEvent
created_at: 2018-05-31 15:39:27
repo: web-platform-tests/wpt
repo_url: https://api.github.com/repos/web-platform-tests/wpt
action: opened
title: Migrate Travis CI to GitHub Apps
labels: infra priority:roadmap
body:
Like https://github.com/whatwg/meta/issues/96, and possible we can learn from that issue, or vice versa.
index: 1.0
text_combine:
Migrate Travis CI to GitHub Apps - Like https://github.com/whatwg/meta/issues/96, and possible we can learn from that issue, or vice versa.
label: priority
text:
migrate travis ci to github apps like and possible we can learn from that issue or vice versa
binary_label: 1
Unnamed: 0: 270,825
id: 8,470,853,911
type: IssuesEvent
created_at: 2018-10-24 06:31:54
repo: kyma-project/kyma
repo_url: https://api.github.com/repos/kyma-project/kyma
action: closed
title: Enable nightly links validation on repositories
labels: area/community enhancement priority/critical
body:
<!-- Thank you for your contribution. Before you submit the issue: 1. Search open and closed issues for duplicates. 2. Read the contributing guidelines. --> **Description** Currently governance job is executed on master when PR is merged. That validation should be disabled, because it sometimes blocks PR when `external service` is not available. Instead of that nightly links validation should be enabled on all repositories. AC: - [ ] Nightly links validation enabled on all repositories - [ ] Links validation on master disabled on all repositories - [ ] Send notification about failed validation to technical writers
index: 1.0
text_combine:
Enable nightly links validation on repositories - <!-- Thank you for your contribution. Before you submit the issue: 1. Search open and closed issues for duplicates. 2. Read the contributing guidelines. --> **Description** Currently governance job is executed on master when PR is merged. That validation should be disabled, because it sometimes blocks PR when `external service` is not available. Instead of that nightly links validation should be enabled on all repositories. AC: - [ ] Nightly links validation enabled on all repositories - [ ] Links validation on master disabled on all repositories - [ ] Send notification about failed validation to technical writers
label: priority
text:
enable nightly links validation on repositories thank you for your contribution before you submit the issue search open and closed issues for duplicates read the contributing guidelines description currently governance job is executed on master when pr is merged that validation should be disabled because it sometimes blocks pr when external service is not available instead of that nightly links validation should be enabled on all repositories ac nightly links validation enabled on all repositories links validation on master disabled on all repositories send notification about failed validation to technical writers
binary_label: 1
Unnamed: 0: 254,002
id: 21,720,037,553
type: IssuesEvent
created_at: 2022-05-10 22:29:56
repo: trinodb/trino
repo_url: https://api.github.com/repos/trinodb/trino
action: closed
title: Flaky TestHiveFaultTolerantExecutionConnectorTest.close
labels: bug test
body:
``` 2022-04-13T03:08:12.254-0500 INFO pool-3-thread-1 io.trino.testng.services.LogTestDurationListener Test io.trino.plugin.hive.TestHiveFaultTolerantExecutionConnectorTest.close took 30.49s Error: Tests run: 754, Failures: 1, Errors: 0, Skipped: 25, Time elapsed: 919.723 s <<< FAILURE! - in TestSuite Error: io.trino.plugin.hive.TestHiveFaultTolerantExecutionConnectorTest.close Time elapsed: 31.504 s <<< FAILURE! java.lang.AssertionError: Expected memory reservation on server_0(worker)to be 0 but was 79736; detailed memory usage: 20220413_080329_03792_nksgw: SQL: INSERT INTO invalid_partition_value VALUES (4, 'test' || chr(13)) state: FAILED memoryReservation: 79736 taggedMemoryReservaton: {HashAggregationOperator=79736} at org.testng.Assert.fail(Assert.java:94) at io.trino.testing.AbstractTestQueryFramework.assertMemoryPoolReleased(AbstractTestQueryFramework.java:149) at io.trino.testing.AbstractTestQueryFramework.lambda$checkQueryMemoryReleased$0(AbstractTestQueryFramework.java:134) at io.trino.testing.assertions.Assert.assertEventually(Assert.java:77) at io.trino.testing.AbstractTestQueryFramework.checkQueryMemoryReleased(AbstractTestQueryFramework.java:127) at io.trino.testing.AbstractTestQueryFramework.close(AbstractTestQueryFramework.java:108) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104) at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144) at 
org.testng.internal.TestMethodWorker.invokeAfterClassMethods(TestMethodWorker.java:217) at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:115) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) ``` https://github.com/trinodb/trino/runs/6003330080
index: 1.0
text_combine:
Flaky TestHiveFaultTolerantExecutionConnectorTest.close - ``` 2022-04-13T03:08:12.254-0500 INFO pool-3-thread-1 io.trino.testng.services.LogTestDurationListener Test io.trino.plugin.hive.TestHiveFaultTolerantExecutionConnectorTest.close took 30.49s Error: Tests run: 754, Failures: 1, Errors: 0, Skipped: 25, Time elapsed: 919.723 s <<< FAILURE! - in TestSuite Error: io.trino.plugin.hive.TestHiveFaultTolerantExecutionConnectorTest.close Time elapsed: 31.504 s <<< FAILURE! java.lang.AssertionError: Expected memory reservation on server_0(worker)to be 0 but was 79736; detailed memory usage: 20220413_080329_03792_nksgw: SQL: INSERT INTO invalid_partition_value VALUES (4, 'test' || chr(13)) state: FAILED memoryReservation: 79736 taggedMemoryReservaton: {HashAggregationOperator=79736} at org.testng.Assert.fail(Assert.java:94) at io.trino.testing.AbstractTestQueryFramework.assertMemoryPoolReleased(AbstractTestQueryFramework.java:149) at io.trino.testing.AbstractTestQueryFramework.lambda$checkQueryMemoryReleased$0(AbstractTestQueryFramework.java:134) at io.trino.testing.assertions.Assert.assertEventually(Assert.java:77) at io.trino.testing.AbstractTestQueryFramework.checkQueryMemoryReleased(AbstractTestQueryFramework.java:127) at io.trino.testing.AbstractTestQueryFramework.close(AbstractTestQueryFramework.java:108) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104) at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144) at 
org.testng.internal.TestMethodWorker.invokeAfterClassMethods(TestMethodWorker.java:217) at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:115) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:829) ``` https://github.com/trinodb/trino/runs/6003330080
label: non_priority
text:
flaky testhivefaulttolerantexecutionconnectortest close info pool thread io trino testng services logtestdurationlistener test io trino plugin hive testhivefaulttolerantexecutionconnectortest close took error tests run failures errors skipped time elapsed s failure in testsuite error io trino plugin hive testhivefaulttolerantexecutionconnectortest close time elapsed s failure java lang assertionerror expected memory reservation on server worker to be but was detailed memory usage nksgw sql insert into invalid partition value values test chr state failed memoryreservation taggedmemoryreservaton hashaggregationoperator at org testng assert fail assert java at io trino testing abstracttestqueryframework assertmemorypoolreleased abstracttestqueryframework java at io trino testing abstracttestqueryframework lambda checkquerymemoryreleased abstracttestqueryframework java at io trino testing assertions assert asserteventually assert java at io trino testing abstracttestqueryframework checkquerymemoryreleased abstracttestqueryframework java at io trino testing abstracttestqueryframework close abstracttestqueryframework java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at org testng internal methodinvocationhelper invokemethod methodinvocationhelper java at org testng internal invoker invokeconfigurationmethod invoker java at org testng internal invoker invokeconfigurations invoker java at org testng internal invoker invokeconfigurations invoker java at org testng internal testmethodworker invokeafterclassmethods testmethodworker java at org testng internal testmethodworker run testmethodworker java at java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java 
base java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java base java lang thread run thread java
0
359,645
25,248,701,291
IssuesEvent
2022-11-15 13:09:24
airalab/robonomics-wiki
https://api.github.com/repos/airalab/robonomics-wiki
closed
[Issue]: Add Use Intro page
documentation
### Issue description This page is to describe the three modules of the Use block: Use Cases, Proof of Concept and Playground. It would be great to add three buttons leading to these modules. ### Doc Page https://github.com/airalab/robonomics-wiki/blob/test-code/data/sidebar_docs.yaml#L90
1.0
[Issue]: Add Use Intro page - ### Issue description This page is to describe the three modules of the Use block: Use Cases, Proof of Concept and Playground. It would be great to add three buttons leading to these modules. ### Doc Page https://github.com/airalab/robonomics-wiki/blob/test-code/data/sidebar_docs.yaml#L90
non_priority
add use intro page issue description this page is to describe the three modules of the use block use cases proof of concept and playground it would be great to add three buttons leading to these modules doc page
0
492,758
14,219,455,195
IssuesEvent
2020-11-17 13:18:46
neuropoly/axondeepseg
https://api.github.com/repos/neuropoly/axondeepseg
opened
Dash app for morphometrics analysis
feature gui priority:MEDIUM
## Pre-planning stage * [ ] Attend live Plotly session on the topic * https://go.plotly.com/dash-image-processing/ * Registration requires. * [ ] Explore examples of similar features already available on the web * https://eoss-image-processing.github.io/2020/11/10/region-properties-exploration.html * [ ] Follow tutorials to get an idea of how to properly plan app. * https://www.youtube.com/playlist?list=PLCDERj-IUIFCaELQ2i7AwgD2M6Xvc4Slf * Documentation page https://dash.plotly.com/ ## Planning stage * [ ] Draw a rough design of the app * [ ] Draw logic for expected callbacks of the apps ... ## Implementation stage ...
1.0
Dash app for morphometrics analysis - ## Pre-planning stage * [ ] Attend live Plotly session on the topic * https://go.plotly.com/dash-image-processing/ * Registration requires. * [ ] Explore examples of similar features already available on the web * https://eoss-image-processing.github.io/2020/11/10/region-properties-exploration.html * [ ] Follow tutorials to get an idea of how to properly plan app. * https://www.youtube.com/playlist?list=PLCDERj-IUIFCaELQ2i7AwgD2M6Xvc4Slf * Documentation page https://dash.plotly.com/ ## Planning stage * [ ] Draw a rough design of the app * [ ] Draw logic for expected callbacks of the apps ... ## Implementation stage ...
priority
dash app for morphometrics analysis pre planning stage attend live plotly session on the topic registration requires explore examples of similar features already available on the web follow tutorials to get an idea of how to properly plan app documentation page planning stage draw a rough design of the app draw logic for expected callbacks of the apps implementation stage
1
116,989
9,905,221,601
IssuesEvent
2019-06-27 11:02:03
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
teamcity: failed test: TestChangefeedMonitoring
C-test-failure O-robot
The following tests appear to have failed on master (testrace): TestChangefeedMonitoring, TestChangefeedMonitoring/enterprise You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestChangefeedMonitoring). [#1361910](https://teamcity.cockroachdb.com/viewLog.html?buildId=1361910): ``` TestChangefeedMonitoring --- FAIL: testrace/TestChangefeedMonitoring (51.440s) TestChangefeedMonitoring/enterprise ...l.(*InternalExecutor).QueryRow /go/src/github.com/cockroachdb/cockroach/pkg/sql/internal.go:277 github.com/cockroachdb/cockroach/pkg/jobs.(*Job).Update.func1 /go/src/github.com/cockroachdb/cockroach/pkg/jobs/update.go:106 github.com/cockroachdb/cockroach/pkg/internal/client.(*DB).Txn.func1 /go/src/github.com/cockroachdb/cockroach/pkg/internal/client/db.go:624 github.com/cockroachdb/cockroach/pkg/internal/client.(*Txn).exec /go/src/github.com/cockroachdb/cockroach/pkg/internal/client/txn.go:686 github.com/cockroachdb/cockroach/pkg/internal/client.(*DB).Txn /go/src/github.com/cockroachdb/cockroach/pkg/internal/client/db.go:623 github.com/cockroachdb/cockroach/pkg/jobs.(*Job).runInTxn /go/src/github.com/cockroachdb/cockroach/pkg/jobs/jobs.go:438 github.com/cockroachdb/cockroach/pkg/jobs.(*Job).Update /go/src/github.com/cockroachdb/cockroach/pkg/jobs/update.go:104 github.com/cockroachdb/cockroach/pkg/jobs.(*Job).HighWaterProgressed /go/src/github.com/cockroachdb/cockroach/pkg/jobs/jobs.go:245 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.checkpointResolvedTimestamp /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed.go:308 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.(*changeFrontier).noteResolvedSpan /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_processors.go:577 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.(*changeFrontier).Next /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_processors.go:533 
github.com/cockroachdb/cockroach/pkg/sql/distsqlrun.Run /go/src/github.com/cockroachdb/cockroach/pkg/sql/distsqlrun/base.go:171 github.com/cockroachdb/cockroach/pkg/sql/distsqlrun.(*ProcessorBase).Run /go/src/github.com/cockroachdb/cockroach/pkg/sql/distsqlrun/processors.go:793 github.com/cockroachdb/cockroach/pkg/sql/distsqlrun.(*Flow).Run /go/src/github.com/cockroachdb/cockroach/pkg/sql/distsqlrun/flow.go:654 github.com/cockroachdb/cockroach/pkg/sql.(*DistSQLPlanner).Run /go/src/github.com/cockroachdb/cockroach/pkg/sql/distsql_running.go:296 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.distChangefeedFlow /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_dist.go:191 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.(*changefeedResumer).Resume /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_stmt.go:484 github.com/cockroachdb/cockroach/pkg/jobs.(*Registry).resume.func1 /go/src/github.com/cockroachdb/cockroach/pkg/jobs/registry.go:549 github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunAsyncTask.func1 /go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:321 runtime.goexit /usr/local/go/src/runtime/asm_amd64.s:1337 changefeed_test.go:982: condition failed to evaluate within 45s: waiting for the feed to be < 100000000 nanos behind got 15538686179 goroutine 57441 [running]: runtime/debug.Stack(0xa7a358200, 0xc00c6516e0, 0x540a940) /usr/local/go/src/runtime/debug/stack.go:24 +0xab github.com/cockroachdb/cockroach/pkg/testutils.SucceedsSoon(0x54dd320, 0xc001d85f00, 0xc00e5377e0) /go/src/github.com/cockroachdb/cockroach/pkg/testutils/soon.go:45 +0x181 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.TestChangefeedMonitoring.func1(0xc001d85f00, 0xc00a49ec00, 0x5435180, 0xc00c982dc0) /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_test.go:982 +0x87b github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.enterpriseTest.func1(0xc001d85f00) 
/go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/helpers_test.go:265 +0x874 testing.tRunner(0xc001d85f00, 0xc00181aef0) /usr/local/go/src/testing/testing.go:865 +0x164 created by testing.(*T).Run /usr/local/go/src/testing/testing.go:916 +0x65b ``` Please assign, take a look and update the issue accordingly.
1.0
teamcity: failed test: TestChangefeedMonitoring - The following tests appear to have failed on master (testrace): TestChangefeedMonitoring, TestChangefeedMonitoring/enterprise You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestChangefeedMonitoring). [#1361910](https://teamcity.cockroachdb.com/viewLog.html?buildId=1361910): ``` TestChangefeedMonitoring --- FAIL: testrace/TestChangefeedMonitoring (51.440s) TestChangefeedMonitoring/enterprise ...l.(*InternalExecutor).QueryRow /go/src/github.com/cockroachdb/cockroach/pkg/sql/internal.go:277 github.com/cockroachdb/cockroach/pkg/jobs.(*Job).Update.func1 /go/src/github.com/cockroachdb/cockroach/pkg/jobs/update.go:106 github.com/cockroachdb/cockroach/pkg/internal/client.(*DB).Txn.func1 /go/src/github.com/cockroachdb/cockroach/pkg/internal/client/db.go:624 github.com/cockroachdb/cockroach/pkg/internal/client.(*Txn).exec /go/src/github.com/cockroachdb/cockroach/pkg/internal/client/txn.go:686 github.com/cockroachdb/cockroach/pkg/internal/client.(*DB).Txn /go/src/github.com/cockroachdb/cockroach/pkg/internal/client/db.go:623 github.com/cockroachdb/cockroach/pkg/jobs.(*Job).runInTxn /go/src/github.com/cockroachdb/cockroach/pkg/jobs/jobs.go:438 github.com/cockroachdb/cockroach/pkg/jobs.(*Job).Update /go/src/github.com/cockroachdb/cockroach/pkg/jobs/update.go:104 github.com/cockroachdb/cockroach/pkg/jobs.(*Job).HighWaterProgressed /go/src/github.com/cockroachdb/cockroach/pkg/jobs/jobs.go:245 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.checkpointResolvedTimestamp /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed.go:308 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.(*changeFrontier).noteResolvedSpan /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_processors.go:577 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.(*changeFrontier).Next 
/go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_processors.go:533 github.com/cockroachdb/cockroach/pkg/sql/distsqlrun.Run /go/src/github.com/cockroachdb/cockroach/pkg/sql/distsqlrun/base.go:171 github.com/cockroachdb/cockroach/pkg/sql/distsqlrun.(*ProcessorBase).Run /go/src/github.com/cockroachdb/cockroach/pkg/sql/distsqlrun/processors.go:793 github.com/cockroachdb/cockroach/pkg/sql/distsqlrun.(*Flow).Run /go/src/github.com/cockroachdb/cockroach/pkg/sql/distsqlrun/flow.go:654 github.com/cockroachdb/cockroach/pkg/sql.(*DistSQLPlanner).Run /go/src/github.com/cockroachdb/cockroach/pkg/sql/distsql_running.go:296 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.distChangefeedFlow /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_dist.go:191 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.(*changefeedResumer).Resume /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_stmt.go:484 github.com/cockroachdb/cockroach/pkg/jobs.(*Registry).resume.func1 /go/src/github.com/cockroachdb/cockroach/pkg/jobs/registry.go:549 github.com/cockroachdb/cockroach/pkg/util/stop.(*Stopper).RunAsyncTask.func1 /go/src/github.com/cockroachdb/cockroach/pkg/util/stop/stopper.go:321 runtime.goexit /usr/local/go/src/runtime/asm_amd64.s:1337 changefeed_test.go:982: condition failed to evaluate within 45s: waiting for the feed to be < 100000000 nanos behind got 15538686179 goroutine 57441 [running]: runtime/debug.Stack(0xa7a358200, 0xc00c6516e0, 0x540a940) /usr/local/go/src/runtime/debug/stack.go:24 +0xab github.com/cockroachdb/cockroach/pkg/testutils.SucceedsSoon(0x54dd320, 0xc001d85f00, 0xc00e5377e0) /go/src/github.com/cockroachdb/cockroach/pkg/testutils/soon.go:45 +0x181 github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.TestChangefeedMonitoring.func1(0xc001d85f00, 0xc00a49ec00, 0x5435180, 0xc00c982dc0) /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/changefeed_test.go:982 +0x87b 
github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl.enterpriseTest.func1(0xc001d85f00) /go/src/github.com/cockroachdb/cockroach/pkg/ccl/changefeedccl/helpers_test.go:265 +0x874 testing.tRunner(0xc001d85f00, 0xc00181aef0) /usr/local/go/src/testing/testing.go:865 +0x164 created by testing.(*T).Run /usr/local/go/src/testing/testing.go:916 +0x65b ``` Please assign, take a look and update the issue accordingly.
non_priority
teamcity failed test testchangefeedmonitoring the following tests appear to have failed on master testrace testchangefeedmonitoring testchangefeedmonitoring enterprise you may want to check testchangefeedmonitoring fail testrace testchangefeedmonitoring testchangefeedmonitoring enterprise l internalexecutor queryrow go src github com cockroachdb cockroach pkg sql internal go github com cockroachdb cockroach pkg jobs job update go src github com cockroachdb cockroach pkg jobs update go github com cockroachdb cockroach pkg internal client db txn go src github com cockroachdb cockroach pkg internal client db go github com cockroachdb cockroach pkg internal client txn exec go src github com cockroachdb cockroach pkg internal client txn go github com cockroachdb cockroach pkg internal client db txn go src github com cockroachdb cockroach pkg internal client db go github com cockroachdb cockroach pkg jobs job runintxn go src github com cockroachdb cockroach pkg jobs jobs go github com cockroachdb cockroach pkg jobs job update go src github com cockroachdb cockroach pkg jobs update go github com cockroachdb cockroach pkg jobs job highwaterprogressed go src github com cockroachdb cockroach pkg jobs jobs go github com cockroachdb cockroach pkg ccl changefeedccl checkpointresolvedtimestamp go src github com cockroachdb cockroach pkg ccl changefeedccl changefeed go github com cockroachdb cockroach pkg ccl changefeedccl changefrontier noteresolvedspan go src github com cockroachdb cockroach pkg ccl changefeedccl changefeed processors go github com cockroachdb cockroach pkg ccl changefeedccl changefrontier next go src github com cockroachdb cockroach pkg ccl changefeedccl changefeed processors go github com cockroachdb cockroach pkg sql distsqlrun run go src github com cockroachdb cockroach pkg sql distsqlrun base go github com cockroachdb cockroach pkg sql distsqlrun processorbase run go src github com cockroachdb cockroach pkg sql distsqlrun processors go github com 
cockroachdb cockroach pkg sql distsqlrun flow run go src github com cockroachdb cockroach pkg sql distsqlrun flow go github com cockroachdb cockroach pkg sql distsqlplanner run go src github com cockroachdb cockroach pkg sql distsql running go github com cockroachdb cockroach pkg ccl changefeedccl distchangefeedflow go src github com cockroachdb cockroach pkg ccl changefeedccl changefeed dist go github com cockroachdb cockroach pkg ccl changefeedccl changefeedresumer resume go src github com cockroachdb cockroach pkg ccl changefeedccl changefeed stmt go github com cockroachdb cockroach pkg jobs registry resume go src github com cockroachdb cockroach pkg jobs registry go github com cockroachdb cockroach pkg util stop stopper runasynctask go src github com cockroachdb cockroach pkg util stop stopper go runtime goexit usr local go src runtime asm s changefeed test go condition failed to evaluate within waiting for the feed to be nanos behind got goroutine runtime debug stack usr local go src runtime debug stack go github com cockroachdb cockroach pkg testutils succeedssoon go src github com cockroachdb cockroach pkg testutils soon go github com cockroachdb cockroach pkg ccl changefeedccl testchangefeedmonitoring go src github com cockroachdb cockroach pkg ccl changefeedccl changefeed test go github com cockroachdb cockroach pkg ccl changefeedccl enterprisetest go src github com cockroachdb cockroach pkg ccl changefeedccl helpers test go testing trunner usr local go src testing testing go created by testing t run usr local go src testing testing go please assign take a look and update the issue accordingly
0
218,149
16,959,784,747
IssuesEvent
2021-06-29 00:56:26
RPTools/maptool
https://api.github.com/repos/RPTools/maptool
closed
Macros moved/copied between Macro Groups & Panels are always player-editable
bug macro changes tested
**Describe the bug** When moving a macro between a macro group, or when copying a macro from one panel to another, the 'Allow Players to Edit Macro' checkbox is always reset to 'checked'. **To Reproduce** Steps to reproduce the behavior: 1. Create a campaign macro and edit it so 'Allow Players to Edit Macro' is **unchecked**. Rename `Locked` or similar. 2. **Copy to Token Error**: Create a token. 3. Drag the `Locked` macro to the token's Selection panel. It will copy. 4. Edit the copied `Locked` macro on the token. 5. See error: 'Allow Players to Edit Macro' is **checked**. 2. **Move to Group Error**: Create a second campaign macro in a new macro group from the first. 3. Drag the `Locked` campaign macro to the new macro group. It will move from one group to another (neat!). 4. Edit the moved `Locked` macro on the campaign panel. 5. See error: 'Allow Players to Edit Macro' is **checked**. **Expected behavior** All settings of the moved/copied macros are the same. **MapTool Info** - Version: 1.9.2 - Install: Upgrade **Desktop (please complete the following information):** - Windows 10 **Additional context** - Duplicating a macro from the macro button context menu works correctly / does not do this. - The current behavior is arguably reasonable for players connected to a server when dragging macros from the Campaign Panel to their own token panels. It is complicated here - players shouldn't be able to view the code of locked/trusted macros, but they also shouldn't be able to put a trusted macro on a token it wasn't intended for.
1.0
Macros moved/copied between Macro Groups & Panels are always player-editable - **Describe the bug** When moving a macro between a macro group, or when copying a macro from one panel to another, the 'Allow Players to Edit Macro' checkbox is always reset to 'checked'. **To Reproduce** Steps to reproduce the behavior: 1. Create a campaign macro and edit it so 'Allow Players to Edit Macro' is **unchecked**. Rename `Locked` or similar. 2. **Copy to Token Error**: Create a token. 3. Drag the `Locked` macro to the token's Selection panel. It will copy. 4. Edit the copied `Locked` macro on the token. 5. See error: 'Allow Players to Edit Macro' is **checked**. 2. **Move to Group Error**: Create a second campaign macro in a new macro group from the first. 3. Drag the `Locked` campaign macro to the new macro group. It will move from one group to another (neat!). 4. Edit the moved `Locked` macro on the campaign panel. 5. See error: 'Allow Players to Edit Macro' is **checked**. **Expected behavior** All settings of the moved/copied macros are the same. **MapTool Info** - Version: 1.9.2 - Install: Upgrade **Desktop (please complete the following information):** - Windows 10 **Additional context** - Duplicating a macro from the macro button context menu works correctly / does not do this. - The current behavior is arguably reasonable for players connected to a server when dragging macros from the Campaign Panel to their own token panels. It is complicated here - players shouldn't be able to view the code of locked/trusted macros, but they also shouldn't be able to put a trusted macro on a token it wasn't intended for.
non_priority
macros moved copied between macro groups panels are always player editable describe the bug when moving a macro between a macro group or when copying a macro from one panel to another the allow players to edit macro checkbox is always reset to checked to reproduce steps to reproduce the behavior create a campaign macro and edit it so allow players to edit macro is unchecked rename locked or similar copy to token error create a token drag the locked macro to the token s selection panel it will copy edit the copied locked macro on the token see error allow players to edit macro is checked move to group error create a second campaign macro in a new macro group from the first drag the locked campaign macro to the new macro group it will move from one group to another neat edit the moved locked macro on the campaign panel see error allow players to edit macro is checked expected behavior all settings of the moved copied macros are the same maptool info version install upgrade desktop please complete the following information windows additional context duplicating a macro from the macro button context menu works correctly does not do this the current behavior is arguably reasonable for players connected to a server when dragging macros from the campaign panel to their own token panels it is complicated here players shouldn t be able to view the code of locked trusted macros but they also shouldn t be able to put a trusted macro on a token it wasn t intended for
0
462,484
13,247,979,474
IssuesEvent
2020-08-19 18:10:00
open-telemetry/opentelemetry-specification
https://api.github.com/repos/open-telemetry/opentelemetry-specification
closed
Span name: Both low-cardinality (grouping key) and human-readable (display name)
area:api area:sampling priority:p1 release:required-for-ga spec:trace
This came up when I suggested the handler name instead of the route name for the HTTP semantic conventions (#548). I think that the handler name is a better grouping key than the route, but the route is a better display name, as @yurishkuro [pointed out](https://github.com/open-telemetry/opentelemetry-specification/pull/548#discussion_r404397569). So I think we should relieve span name from this double duty. I'll post ideas as separate comments to allow reactions. EDIT: Cf. current spec of span name https://github.com/open-telemetry/opentelemetry-specification/blob/e7fe34ad/specification/trace/api.md#span: > The span name is a human-readable string which concisely identifies the work represented by the Span, for example, an RPC method name, a function name, or the name of a subtask or stage within a larger computation. The span name should be the most general string that identifies a (statistically) interesting class of Spans, rather than individual Span instances. That is, "get_user" is a reasonable name, while "get_user/314159", where "314159" is a user ID, is not a good name due to its high cardinality. I used the term "grouping key" here because I think using the span name as a key to group / sample similar spans is the reason why a Span name should have a low cardinality.
1.0
Span name: Both low-cardinality (grouping key) and human-readable (display name) - This came up when I suggested the handler name instead of the route name for the HTTP semantic conventions (#548). I think that the handler name is a better grouping key than the route, but the route is a better display name, as @yurishkuro [pointed out](https://github.com/open-telemetry/opentelemetry-specification/pull/548#discussion_r404397569). So I think we should relieve span name from this double duty. I'll post ideas as separate comments to allow reactions. EDIT: Cf. current spec of span name https://github.com/open-telemetry/opentelemetry-specification/blob/e7fe34ad/specification/trace/api.md#span: > The span name is a human-readable string which concisely identifies the work represented by the Span, for example, an RPC method name, a function name, or the name of a subtask or stage within a larger computation. The span name should be the most general string that identifies a (statistically) interesting class of Spans, rather than individual Span instances. That is, "get_user" is a reasonable name, while "get_user/314159", where "314159" is a user ID, is not a good name due to its high cardinality. I used the term "grouping key" here because I think using the span name as a key to group / sample similar spans is the reason why a Span name should have a low cardinality.
priority
span name both low cardinality grouping key and human readable display name this came up when i suggested the handler name instead of the route name for the http semantic conventions i think that the handler name is a better grouping key than the route but the route is a better display name as yurishkuro so i think we should relieve span name from this double duty i ll post ideas as separate comments to allow reactions edit cf current spec of span name the span name is a human readable string which concisely identifies the work represented by the span for example an rpc method name a function name or the name of a subtask or stage within a larger computation the span name should be the most general string that identifies a statistically interesting class of spans rather than individual span instances that is get user is a reasonable name while get user where is a user id is not a good name due to its high cardinality i used the term grouping key here because i think using the span name as a key to group sample similar spans is the reason why a span name should have a low cardinality
1
811,125
30,275,335,085
IssuesEvent
2023-07-07 19:02:54
virtualcell/vcell
https://api.github.com/repos/virtualcell/vcell
closed
Update reactome old to new id map, access resource as stream instead of file.
bug High Priority VCell-7.5.0 VCell-7.5.1
Map was 6 years old. Open as file in Release fails, trying as stream. Improved robustness, better error messages.
1.0
Update reactome old to new id map, access resource as stream instead of file. - Map was 6 years old. Open as file in Release fails, trying as stream. Improved robustness, better error messages.
priority
update reactome old to new id map access resource as stream instead of file map was years old open as file in release fails trying as stream improved robustness better error messages
1
590,207
17,773,751,207
IssuesEvent
2021-08-30 16:28:33
renovatebot/renovate
https://api.github.com/repos/renovatebot/renovate
closed
bug: extra slash in docker registry other than index.docker.io
type:bug priority-3-normal good first issue datasource:docker status:ready reproduction:provided
### How are you running Renovate? Self-hosted ### Please select which platform you are using if self-hosting. GitHub Enterprise Server ### If you're self-hosting Renovate, tell us what version of Renovate you run. 25.73.0-986e1c1f9 ### Describe the bug Unable to retrieve tags for docker registry other than index.docker.io. This prevents us from updating images from k8s.gcr.io for example. I tried with or without the trailing slash in the registryUrl variable. Reproduction repository: https://github.com/aslafy-z/reproduce-renovate-bug-docker-extraslash ``` DEBUG: Datasource 404 (repository=zadkiel/test-renovate) "datasource": "docker", "lookupName": "defaultbackend-amd64", "url": "https://k8s.gcr.io/v2//defaultbackend-amd64/tags/list?n=10000" ``` There's an extra slash in the generated URL which prevents the request to succeed. ### Relevant debug logs <details><summary>Logs</summary> ``` DEBUG: Using RE2 as regex engine DEBUG: Parsing configs DEBUG: Checking for config file in /tmp/test-renovate/config DEBUG: No config file found on disk - skipping DEBUG: File config "config": {} DEBUG: CLI config "config": { "repositories": ["zadkiel/test-renovate"], "prHourlyLimit": 0, "prConcurrentLimit": 0 } DEBUG: Env config "config": { "hostRules": [], "platform": "github", "endpoint": "https://github-enterprise.local/api/v3", "token": "***********" } DEBUG: Combined config "config": { "hostRules": [], "platform": "github", "endpoint": "https://github-enterprise.local/api/v3", "token": "***********", "repositories": ["zadkiel/test-renovate"], "prHourlyLimit": 0, "prConcurrentLimit": 0 } DEBUG: Adding trailing slash to endpoint (node:901107) Warning: Setting the NODE_TLS_REJECT_UNAUTHORIZED environment variable to '0' makes TLS connections and HTTPS requests insecure by disabling certificate verification. 
(Use `node --trace-warnings ...` to show where the warning was created) DEBUG: GitHub 404 "url": "https://github-enterprise.local/api/v3/user/emails" DEBUG: Cannot read user/emails endpoint on GitHub to retrieve gitAuthor DEBUG: Authenticated as GitHub user: renovate-bot DEBUG: Using default gitAuthor: Renovate Bot <renovate@whitesourcesoftware.com> DEBUG: Adding token authentication for github-enterprise.local to hostRules DEBUG: Using baseDir: /tmp/renovate DEBUG: Using cacheDir: /tmp/renovate/cache DEBUG: Initializing Renovate internal cache into /tmp/renovate/cache/renovate/renovate-cache-v1 DEBUG: Commits limit = null DEBUG: Clearing hostRules DEBUG: Adding token authentication for github-enterprise.local to hostRules INFO: Repository started (repository=zadkiel/test-renovate) "renovateVersion": "25.73.0-986e1c1f9" DEBUG: Using localDir: /tmp/renovate/repos/github/zadkiel/test-renovate (repository=zadkiel/test-renovate) DEBUG: initRepo("zadkiel/test-renovate") (repository=zadkiel/test-renovate) DEBUG: Overriding default GitHub endpoint (repository=zadkiel/test-renovate) "endpoint": "https://github-enterprise.local/api/v3/" DEBUG: zadkiel/test-renovate default branch = master (repository=zadkiel/test-renovate) DEBUG: Using personal access token for git init (repository=zadkiel/test-renovate) DEBUG: resetMemCache() (repository=zadkiel/test-renovate) DEBUG: Resetting npmrc (repository=zadkiel/test-renovate) DEBUG: checkOnboarding() (repository=zadkiel/test-renovate) DEBUG: isOnboarded() (repository=zadkiel/test-renovate) DEBUG: findFile(renovate.json) (repository=zadkiel/test-renovate) DEBUG: Initializing git repository into /tmp/renovate/repos/github/zadkiel/test-renovate (repository=zadkiel/test-renovate) DEBUG: git clone completed (repository=zadkiel/test-renovate) "durationMs": 1195 DEBUG: latest repository commit (repository=zadkiel/test-renovate) "latestCommit": { "hash": "5c1671bfe9f6e275f483889dc2307a958c94c359", "date": "2021-08-13T12:44:51+02:00", 
"message": "Update components.yaml", "refs": "HEAD -> master, origin/master, origin/HEAD", "body": "", "author_name": "zadkiel", "author_email": "zadkiel@zadkiel.local" } DEBUG: Setting git author name (repository=zadkiel/test-renovate) "gitAuthorName": "Renovate Bot" DEBUG: Setting git author email (repository=zadkiel/test-renovate) "gitAuthorEmail": "renovate@whitesourcesoftware.com" DEBUG: findFile(renovate.json5) (repository=zadkiel/test-renovate) DEBUG: findFile(.github/renovate.json) (repository=zadkiel/test-renovate) DEBUG: Config file exists (repository=zadkiel/test-renovate) "fileName": ".github/renovate.json" DEBUG: Retrieving issueList (repository=zadkiel/test-renovate) DEBUG: Retrieved 1 issues (repository=zadkiel/test-renovate) DEBUG: Repo is onboarded (repository=zadkiel/test-renovate) DEBUG: Found .github/renovate.json config file (repository=zadkiel/test-renovate) DEBUG: Repository config (repository=zadkiel/test-renovate) "fileName": ".github/renovate.json", "config": { "$schema": "https://docs.renovatebot.com/renovate-schema.json", "regexManagers": [ { "fileMatch": ["^components\\.yaml$"], "matchStrings": [ "[^\\n]*_version: \"(?<currentValue>.*)\"\\s*#\\s*renovate:\\s*datasource=(?<datasource>.*?) depName=(?<depName>.*?)( registryUrl=(?<registryUrl>.*?))?( versioning=(?<versioning>.*?))?\\n" ] } ], "extends": ["config:base"], "labels": ["kind/upgrade"], "timezone": "Europe/Paris" } DEBUG: migrateAndValidate() (repository=zadkiel/test-renovate) DEBUG: No config migration necessary (repository=zadkiel/test-renovate) DEBUG: massaged config (repository=zadkiel/test-renovate) "config": { "$schema": "https://docs.renovatebot.com/renovate-schema.json", "regexManagers": [ { "fileMatch": ["^components\\.yaml$"], "matchStrings": [ "[^\\n]*_version: \"(?<currentValue>.*)\"\\s*#\\s*renovate:\\s*datasource=(?<datasource>.*?) 
depName=(?<depName>.*?)( registryUrl=(?<registryUrl>.*?))?( versioning=(?<versioning>.*?))?\\n" ] } ], "extends": ["config:base"], "labels": ["kind/upgrade"], "timezone": "Europe/Paris" } DEBUG: migrated config (repository=zadkiel/test-renovate) "config": { "$schema": "https://docs.renovatebot.com/renovate-schema.json", "regexManagers": [ { "fileMatch": ["^components\\.yaml$"], "matchStrings": [ "[^\\n]*_version: \"(?<currentValue>.*)\"\\s*#\\s*renovate:\\s*datasource=(?<datasource>.*?) depName=(?<depName>.*?)( registryUrl=(?<registryUrl>.*?))?( versioning=(?<versioning>.*?))?\\n" ] } ], "extends": ["config:base"], "labels": ["kind/upgrade"], "timezone": "Europe/Paris" } DEBUG: Setting hostRules from config (repository=zadkiel/test-renovate) DEBUG: Found repo ignorePaths (repository=zadkiel/test-renovate) "ignorePaths": [ "**/node_modules/**", "**/bower_components/**", "**/vendor/**", "**/examples/**", "**/__tests__/**", "**/test/**", "**/tests/**", "**/__fixtures__/**" ] DEBUG: detectSemanticCommits() (repository=zadkiel/test-renovate) DEBUG: getCommitMessages (repository=zadkiel/test-renovate) DEBUG: Semantic commits detection: unknown (repository=zadkiel/test-renovate) DEBUG: No semantic commits detected (repository=zadkiel/test-renovate) DEBUG: Setting branchPrefix: renovate/ (repository=zadkiel/test-renovate) DEBUG: No vulnerability alerts found (repository=zadkiel/test-renovate) DEBUG: No vulnerability alerts found (repository=zadkiel/test-renovate) DEBUG: No baseBranches (repository=zadkiel/test-renovate) DEBUG: extract() (repository=zadkiel/test-renovate) DEBUG: Setting current branch to master (repository=zadkiel/test-renovate) DEBUG: latest commit (repository=zadkiel/test-renovate) "branchName": "master", "latestCommitDate": "2021-08-13T12:44:51+02:00" DEBUG: Using file match: (^|/)tasks/[^/]+\.ya?ml$ for manager ansible (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)requirements\.ya?ml$ for manager ansible-galaxy 
(repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)galaxy\.ya?ml$ for manager ansible-galaxy (repository=zadkiel/test-renovate) DEBUG: Using file match: azure.*pipelines?.*\.ya?ml$ for manager azure-pipelines (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)batect(-bundle)?\.yml$ for manager batect (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)batect$ for manager batect-wrapper (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)WORKSPACE(|\.bazel)$ for manager bazel (repository=zadkiel/test-renovate) DEBUG: Using file match: \.bzl$ for manager bazel (repository=zadkiel/test-renovate) DEBUG: Using file match: buildkite\.ya?ml for manager buildkite (repository=zadkiel/test-renovate) DEBUG: Using file match: \.buildkite/.+\.ya?ml$ for manager buildkite (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)Gemfile$ for manager bundler (repository=zadkiel/test-renovate) DEBUG: Using file match: \.cake$ for manager cake (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)Cargo.toml$ for manager cargo (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/).circleci/config.yml$ for manager circleci (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)cloudbuild.ya?ml for manager cloudbuild (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)Podfile$ for manager cocoapods (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)([\w-]*)composer.json$ for manager composer (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)deps\.edn$ for manager deps-edn (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)docker-compose[^/]*\.ya?ml$ for manager docker-compose (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/|\.)Dockerfile$ for manager dockerfile (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)Dockerfile\.[^/]*$ for manager dockerfile (repository=zadkiel/test-renovate) DEBUG: Using file 
match: (^|/).drone.yml$ for manager droneci (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/).gitmodules$ for manager git-submodules (repository=zadkiel/test-renovate) DEBUG: Using file match: ^(workflow-templates|\.github\/workflows)\/[^/]+\.ya?ml$ for manager github-actions (repository=zadkiel/test-renovate) DEBUG: Using file match: \.gitlab-ci\.yml$ for manager gitlabci (repository=zadkiel/test-renovate) DEBUG: Using file match: ^\.gitlab-ci\.yml$ for manager gitlabci-include (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)go.mod$ for manager gomod (repository=zadkiel/test-renovate) DEBUG: Using file match: \.gradle(\.kts)?$ for manager gradle (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)gradle.properties$ for manager gradle (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)gradle.properties$ for manager gradle-lite (repository=zadkiel/test-renovate) DEBUG: Using file match: \.gradle(\.kts)?$ for manager gradle-lite (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)gradle/wrapper/gradle-wrapper.properties$ for manager gradle-wrapper (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)requirements\.yaml$ for manager helm-requirements (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)values.yaml$ for manager helm-values (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)helmfile.yaml$ for manager helmfile (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)Chart.yaml$ for manager helmv3 (repository=zadkiel/test-renovate) DEBUG: Using file match: ^Formula/[^/]+[.]rb$ for manager homebrew (repository=zadkiel/test-renovate) DEBUG: Using file match: \.html?$ for manager html (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)plugins\.(txt|ya?ml)$ for manager jenkins (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)kustomization\.yaml for manager kustomize (repository=zadkiel/test-renovate) DEBUG: Using 
file match: (^|/)project\.clj$ for manager leiningen (repository=zadkiel/test-renovate) DEBUG: Using file match: \.pom\.xml$ for manager maven (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)pom\.xml$ for manager maven (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)package.js$ for manager meteor (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)mix\.exs$ for manager mix (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/).node-version$ for manager nodenv (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)package.json$ for manager npm (repository=zadkiel/test-renovate) DEBUG: Using file match: \.(?:cs|fs|vb)proj$ for manager nuget (repository=zadkiel/test-renovate) DEBUG: Using file match: \.(?:props|targets)$ for manager nuget (repository=zadkiel/test-renovate) DEBUG: Using file match: \.config\/dotnet-tools\.json$ for manager nuget (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)\.nvmrc$ for manager nvm (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)([\w-]*)requirements\.(txt|pip)$ for manager pip_requirements (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)setup.py$ for manager pip_setup (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)Pipfile$ for manager pipenv (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)pyproject\.toml$ for manager poetry (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)\.pre-commit-config\.yaml$ for manager pre-commit (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)pubspec\.ya?ml$ for manager pub (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/).python-version$ for manager pyenv (repository=zadkiel/test-renovate) DEBUG: Using file match: ^components\.yaml$ for manager regex (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)\.ruby-version$ for manager ruby-version (repository=zadkiel/test-renovate) DEBUG: Using file 
match: \.sbt$ for manager sbt (repository=zadkiel/test-renovate) DEBUG: Using file match: project/[^/]*.scala$ for manager sbt (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)setup\.cfg$ for manager setup-cfg (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)Package\.swift for manager swift (repository=zadkiel/test-renovate) DEBUG: Using file match: \.tf$ for manager terraform (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)\.terraform-version$ for manager terraform-version (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)terragrunt\.hcl$ for manager terragrunt (repository=zadkiel/test-renovate) DEBUG: Using file match: (^|/)\.terragrunt-version$ for manager terragrunt-version (repository=zadkiel/test-renovate) DEBUG: Using file match: ^.travis.yml$ for manager travis (repository=zadkiel/test-renovate) DEBUG: Matched 1 file(s) for manager regex: components.yaml (repository=zadkiel/test-renovate) DEBUG: Found regex package files (repository=zadkiel/test-renovate) DEBUG: Found 1 package file(s) (repository=zadkiel/test-renovate) INFO: Dependency extraction complete (repository=zadkiel/test-renovate) "baseBranch": "master", "stats": { "managers": {"regex": {"fileCount": 1, "depCount": 5}}, "total": {"fileCount": 1, "depCount": 5} } DEBUG: Retrying Tags for https://k8s.gcr.io//defaultbackend-amd64 using library/ prefix (repository=zadkiel/test-renovate) DEBUG: Datasource 404 (repository=zadkiel/test-renovate) "datasource": "docker", "lookupName": "ingress-nginx/controller", "url": "https://k8s.gcr.io/v2//ingress-nginx/controller/tags/list?n=10000" DEBUG: Failed to look up dependency ingress-nginx/controller (repository=zadkiel/test-renovate, packageFile=components.yaml, dependency=ingress-nginx/controller) DEBUG: Datasource 404 (repository=zadkiel/test-renovate) "datasource": "docker", "lookupName": "defaultbackend-amd64", "url": "https://k8s.gcr.io/v2//library/defaultbackend-amd64/tags/list?n=10000" 
DEBUG: Failed to look up dependency defaultbackend-amd64 (repository=zadkiel/test-renovate, packageFile=components.yaml, dependency=defaultbackend-amd64) DEBUG: Datasource unauthorized (repository=zadkiel/test-renovate) "datasource": "docker", "lookupName": "jettech/kube-webhook-certgen", "url": "https://index.docker.io/v2/jettech/kube-webhook-certgen/tags/list?n=10000" DEBUG: Failed to look up dependency jettech/kube-webhook-certgen (repository=zadkiel/test-renovate, packageFile=components.yaml, dependency=jettech/kube-webhook-certgen) DEBUG: Package releases lookups complete (repository=zadkiel/test-renovate) "baseBranch": "master" DEBUG: branchifyUpgrades (repository=zadkiel/test-renovate) DEBUG: 2 flattened updates found: kubernetes/kubernetes, ingress-nginx (repository=zadkiel/test-renovate) DEBUG: Returning 2 branch(es) (repository=zadkiel/test-renovate) DEBUG: Fetching changelog: https://github.com/kubernetes/kubernetes (v1.21.0 -> v1.22.0) (repository=zadkiel/test-renovate) DEBUG: Fetching changelog: https://github.com/kubernetes/ingress-nginx (3.30.0 -> 3.35.0) (repository=zadkiel/test-renovate) WARN: No github.com token has been configured. Skipping release notes retrieval (repository=zadkiel/test-renovate) "manager": "regex", "depName": "kubernetes/kubernetes", "sourceUrl": "https://github.com/kubernetes/kubernetes" WARN: No github.com token has been configured. 
Skipping release notes retrieval (repository=zadkiel/test-renovate) "manager": "regex", "depName": "ingress-nginx", "sourceUrl": "https://github.com/kubernetes/ingress-nginx" DEBUG: config.repoIsOnboarded=true (repository=zadkiel/test-renovate) DEBUG: packageFiles with updates (repository=zadkiel/test-renovate) "config": { "regex": [ { "packageFile": "components.yaml", "deps": [ { "depName": "kubernetes/kubernetes", "currentValue": "v1.21.0", "datasource": "github-releases", "replaceString": "kubectl_version: \"v1.21.0\" # renovate: datasource=github-releases depName=kubernetes/kubernetes\n", "depIndex": 0, "updates": [ { "bucket": "non-major", "newVersion": "v1.22.0", "newValue": "v1.22.0", "releaseTimestamp": "2021-08-04T19:38:56.000Z", "newMajor": 1, "newMinor": 22, "updateType": "minor", "branchName": "renovate/kubernetes-kubernetes-1.x" } ], "warnings": [], "versioning": "semver", "sourceUrl": "https://github.com/kubernetes/kubernetes", "currentVersion": "v1.21.0", "isSingleVersion": true, "fixedVersion": "v1.21.0" }, { "depName": "ingress-nginx", "currentValue": "3.30.0", "datasource": "helm", "registryUrls": ["https://kubernetes.github.io/ingress-nginx"], "replaceString": "ingress_nginx_chart_version: \"3.30.0\" # renovate: datasource=helm depName=ingress-nginx registryUrl=https://kubernetes.github.io/ingress-nginx\n", "depIndex": 1, "updates": [ { "bucket": "non-major", "newVersion": "3.35.0", "newValue": "3.35.0", "releaseTimestamp": "2021-08-03T14:35:46.714Z", "newMajor": 3, "newMinor": 35, "updateType": "minor", "branchName": "renovate/ingress-nginx-3.x" } ], "warnings": [], "versioning": "semver", "sourceUrl": "https://github.com/kubernetes/ingress-nginx", "currentVersion": "3.30.0", "isSingleVersion": true, "fixedVersion": "3.30.0" }, { "depName": "ingress-nginx/controller", "currentValue": "v0.46.0", "datasource": "docker", "versioning": "docker", "registryUrls": ["https://k8s.gcr.io/"], "replaceString": "ingress_nginx_version: \"v0.46.0\" # renovate: 
datasource=docker depName=ingress-nginx/controller registryUrl=https://k8s.gcr.io versioning=docker\n", "depIndex": 2, "updates": [], "warnings": [ { "topic": "ingress-nginx/controller", "message": "Failed to look up dependency ingress-nginx/controller" } ] }, { "depName": "jettech/kube-webhook-certgen", "currentValue": "v1.5.1", "datasource": "docker", "versioning": "docker", "registryUrls": ["https://index.docker.io/"], "replaceString": "ingress_nginx_certgen_version: \"v1.5.1\" # renovate: datasource=docker depName=jettech/kube-webhook-certgen registryUrl=https://index.docker.io versioning=docker\n", "depIndex": 3, "updates": [], "warnings": [ { "topic": "jettech/kube-webhook-certgen", "message": "Failed to look up dependency jettech/kube-webhook-certgen" } ] }, { "depName": "defaultbackend-amd64", "currentValue": "1.5", "datasource": "docker", "versioning": "docker", "registryUrls": ["https://k8s.gcr.io/"], "replaceString": "ingress_nginx_backend_version: \"1.5\" # renovate: datasource=docker depName=defaultbackend-amd64 registryUrl=https://k8s.gcr.io versioning=docker\n", "depIndex": 4, "updates": [], "warnings": [ { "topic": "defaultbackend-amd64", "message": "Failed to look up dependency defaultbackend-amd64" } ] } ], "matchStrings": [ "[^\\n]*_version: \"(?<currentValue>.*)\"\\s*#\\s*renovate:\\s*datasource=(?<datasource>.*?) 
depName=(?<depName>.*?)( registryUrl=(?<registryUrl>.*?))?( versioning=(?<versioning>.*?))?\\n" ] } ] } DEBUG: processRepo() (repository=zadkiel/test-renovate) DEBUG: Processing 2 branches: renovate/ingress-nginx-3.x, renovate/kubernetes-kubernetes-1.x (repository=zadkiel/test-renovate) DEBUG: Calculated maximum PRs remaining this run (repository=zadkiel/test-renovate) "prsRemaining": 99 DEBUG: PullRequests limit = 99 (repository=zadkiel/test-renovate) DEBUG: Calculated maximum branches remaining this run (repository=zadkiel/test-renovate) "branchesRemaining": 99 DEBUG: Branches limit = 99 (repository=zadkiel/test-renovate) DEBUG: Setting current branch to master (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: latest commit (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) "branchName": "master", "latestCommitDate": "2021-08-13T12:44:51+02:00" DEBUG: getBranchPr(renovate/ingress-nginx-3.x) (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: findPr(renovate/ingress-nginx-3.x, undefined, open) (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Retrieving PR list (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Retrieved 3 Pull Requests (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: findPr(renovate/ingress-nginx-3.x, undefined, closed) (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: branchExists=false (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: dependencyDashboardCheck=undefined (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: recreateClosed is false (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: findPr(renovate/ingress-nginx-3.x, Update Helm release ingress-nginx to v3.35.0, !open) (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Found PR #2 
(repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Found closed PR with current title (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Retrieved closed PR list with graphql (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) "prNumbers": [2, 3] DEBUG: Returning from graphql closed PR list (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Closed PR already exists. Skipping branch. (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) "prTitle": "Update Helm release ingress-nginx to v3.35.0" DEBUG: Merged PR is blocking this branch (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) "pr": 2 DEBUG: Setting current branch to master (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: latest commit (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) "branchName": "master", "latestCommitDate": "2021-08-13T12:44:51+02:00" DEBUG: getBranchPr(renovate/kubernetes-kubernetes-1.x) (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: findPr(renovate/kubernetes-kubernetes-1.x, undefined, open) (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Found PR #4 (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Returning from graphql open PR list (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: branchExists=true (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: dependencyDashboardCheck=undefined (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: PR rebase requested=false (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Checking if PR has been edited (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Found existing branch PR 
(repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Checking schedule(at any time, Europe/Paris) (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: No schedule defined (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Branch already exists (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: GitHub 404 (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) "url": "repos/zadkiel/test-renovate/branches/master/protection" DEBUG: No branch protection found (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Skipping stale branch check due to rebaseWhen=auto (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Branch does not need rebasing (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Using reuseExistingBranch: true (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: manager.getUpdatedPackageFiles() reuseExistinbranch=true (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Branch dep is already updated (repository=zadkiel/test-renovate, packageFile=components.yaml, branch=renovate/kubernetes-kubernetes-1.x) "depName": "kubernetes/kubernetes" DEBUG: No content changed (repository=zadkiel/test-renovate, packageFile=components.yaml, branch=renovate/kubernetes-kubernetes-1.x) "depName": "kubernetes/kubernetes" DEBUG: No package files need updating (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: No updated lock files in branch (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: No files to commit (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Checking if we can automerge branch (repository=zadkiel/test-renovate, 
branch=renovate/kubernetes-kubernetes-1.x) DEBUG: mergeStatus=no automerge (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Ensuring PR (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: There are 0 errors and 0 warnings (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Found existing PR (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Processing existing PR (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Pull Request #4 does not need updating (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: PR is not configured for automerge (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Removing any stale branches (repository=zadkiel/test-renovate) DEBUG: config.repoIsOnboarded=true (repository=zadkiel/test-renovate) DEBUG: Branch lists (repository=zadkiel/test-renovate) "branchList": ["renovate/ingress-nginx-3.x", "renovate/kubernetes-kubernetes-1.x"], "renovateBranches": ["renovate/kubernetes-kubernetes-1.x"] DEBUG: remainingBranches= (repository=zadkiel/test-renovate) DEBUG: No branches to clean up (repository=zadkiel/test-renovate) DEBUG: Repository timing splits (milliseconds) (repository=zadkiel/test-renovate) "splits": {"init": 5208, "extract": 1414, "lookup": 2708, "update": 3181}, "total": 12525 DEBUG: http statistics (repository=zadkiel/test-renovate) "hostStats": [ "index.docker.io, 1 request, 1039ms request average, 0ms queue average", "k8s.gcr.io, 4 requests, 397ms request average, 0ms queue average", "github-enterprise.local, 6 requests, 662ms request average, 0ms queue average" ], "totalRequests": 11 INFO: Repository finished (repository=zadkiel/test-renovate) "durationMs": 12525 DEBUG: Renovate exiting
```

</details>

### Have you created a minimal reproduction repository?
I have linked to a minimal reproduction repository in the bug description
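For anyone skimming the logs above: the 404s come from a doubled slash at the join between the registry host and the image path (`https://k8s.gcr.io/v2//defaultbackend-amd64/tags/list?n=10000`). The snippet below is a minimal sketch of that failure mode and of the slash-trimming that avoids it; the helper names are hypothetical and this is not Renovate's actual source.

```typescript
// Hypothetical sketch of the bug, not Renovate's implementation: when the
// configured registryUrl keeps its trailing slash, the image path effectively
// arrives with a leading slash, and plain template concatenation doubles it.
function tagsUrl(registryHost: string, depName: string): string {
  return `${registryHost}/v2/${depName}/tags/list?n=10000`;
}

console.log(tagsUrl("https://k8s.gcr.io", "/defaultbackend-amd64"));
// → https://k8s.gcr.io/v2//defaultbackend-amd64/tags/list?n=10000  (the 404 URL)

// Normalizing slashes at both join points avoids the 404, regardless of
// whether the configured registryUrl ends in a slash or not.
function tagsUrlFixed(registryHost: string, depName: string): string {
  const host = registryHost.replace(/\/+$/, "");
  const name = depName.replace(/^\/+/, "");
  return `${host}/v2/${name}/tags/list?n=10000`;
}

console.log(tagsUrlFixed("https://k8s.gcr.io/", "/defaultbackend-amd64"));
// → https://k8s.gcr.io/v2/defaultbackend-amd64/tags/list?n=10000
```

Trimming at the join point rather than at config-parse time would also match the observation above that the bug reproduces "with or without the trailing slash in the registryUrl variable".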
DEBUG: Failed to look up dependency defaultbackend-amd64 (repository=zadkiel/test-renovate, packageFile=components.yaml, dependency=defaultbackend-amd64) DEBUG: Datasource unauthorized (repository=zadkiel/test-renovate) "datasource": "docker", "lookupName": "jettech/kube-webhook-certgen", "url": "https://index.docker.io/v2/jettech/kube-webhook-certgen/tags/list?n=10000" DEBUG: Failed to look up dependency jettech/kube-webhook-certgen (repository=zadkiel/test-renovate, packageFile=components.yaml, dependency=jettech/kube-webhook-certgen) DEBUG: Package releases lookups complete (repository=zadkiel/test-renovate) "baseBranch": "master" DEBUG: branchifyUpgrades (repository=zadkiel/test-renovate) DEBUG: 2 flattened updates found: kubernetes/kubernetes, ingress-nginx (repository=zadkiel/test-renovate) DEBUG: Returning 2 branch(es) (repository=zadkiel/test-renovate) DEBUG: Fetching changelog: https://github.com/kubernetes/kubernetes (v1.21.0 -> v1.22.0) (repository=zadkiel/test-renovate) DEBUG: Fetching changelog: https://github.com/kubernetes/ingress-nginx (3.30.0 -> 3.35.0) (repository=zadkiel/test-renovate) WARN: No github.com token has been configured. Skipping release notes retrieval (repository=zadkiel/test-renovate) "manager": "regex", "depName": "kubernetes/kubernetes", "sourceUrl": "https://github.com/kubernetes/kubernetes" WARN: No github.com token has been configured. 
Skipping release notes retrieval (repository=zadkiel/test-renovate) "manager": "regex", "depName": "ingress-nginx", "sourceUrl": "https://github.com/kubernetes/ingress-nginx" DEBUG: config.repoIsOnboarded=true (repository=zadkiel/test-renovate) DEBUG: packageFiles with updates (repository=zadkiel/test-renovate) "config": { "regex": [ { "packageFile": "components.yaml", "deps": [ { "depName": "kubernetes/kubernetes", "currentValue": "v1.21.0", "datasource": "github-releases", "replaceString": "kubectl_version: \"v1.21.0\" # renovate: datasource=github-releases depName=kubernetes/kubernetes\n", "depIndex": 0, "updates": [ { "bucket": "non-major", "newVersion": "v1.22.0", "newValue": "v1.22.0", "releaseTimestamp": "2021-08-04T19:38:56.000Z", "newMajor": 1, "newMinor": 22, "updateType": "minor", "branchName": "renovate/kubernetes-kubernetes-1.x" } ], "warnings": [], "versioning": "semver", "sourceUrl": "https://github.com/kubernetes/kubernetes", "currentVersion": "v1.21.0", "isSingleVersion": true, "fixedVersion": "v1.21.0" }, { "depName": "ingress-nginx", "currentValue": "3.30.0", "datasource": "helm", "registryUrls": ["https://kubernetes.github.io/ingress-nginx"], "replaceString": "ingress_nginx_chart_version: \"3.30.0\" # renovate: datasource=helm depName=ingress-nginx registryUrl=https://kubernetes.github.io/ingress-nginx\n", "depIndex": 1, "updates": [ { "bucket": "non-major", "newVersion": "3.35.0", "newValue": "3.35.0", "releaseTimestamp": "2021-08-03T14:35:46.714Z", "newMajor": 3, "newMinor": 35, "updateType": "minor", "branchName": "renovate/ingress-nginx-3.x" } ], "warnings": [], "versioning": "semver", "sourceUrl": "https://github.com/kubernetes/ingress-nginx", "currentVersion": "3.30.0", "isSingleVersion": true, "fixedVersion": "3.30.0" }, { "depName": "ingress-nginx/controller", "currentValue": "v0.46.0", "datasource": "docker", "versioning": "docker", "registryUrls": ["https://k8s.gcr.io/"], "replaceString": "ingress_nginx_version: \"v0.46.0\" # renovate: 
datasource=docker depName=ingress-nginx/controller registryUrl=https://k8s.gcr.io versioning=docker\n", "depIndex": 2, "updates": [], "warnings": [ { "topic": "ingress-nginx/controller", "message": "Failed to look up dependency ingress-nginx/controller" } ] }, { "depName": "jettech/kube-webhook-certgen", "currentValue": "v1.5.1", "datasource": "docker", "versioning": "docker", "registryUrls": ["https://index.docker.io/"], "replaceString": "ingress_nginx_certgen_version: \"v1.5.1\" # renovate: datasource=docker depName=jettech/kube-webhook-certgen registryUrl=https://index.docker.io versioning=docker\n", "depIndex": 3, "updates": [], "warnings": [ { "topic": "jettech/kube-webhook-certgen", "message": "Failed to look up dependency jettech/kube-webhook-certgen" } ] }, { "depName": "defaultbackend-amd64", "currentValue": "1.5", "datasource": "docker", "versioning": "docker", "registryUrls": ["https://k8s.gcr.io/"], "replaceString": "ingress_nginx_backend_version: \"1.5\" # renovate: datasource=docker depName=defaultbackend-amd64 registryUrl=https://k8s.gcr.io versioning=docker\n", "depIndex": 4, "updates": [], "warnings": [ { "topic": "defaultbackend-amd64", "message": "Failed to look up dependency defaultbackend-amd64" } ] } ], "matchStrings": [ "[^\\n]*_version: \"(?<currentValue>.*)\"\\s*#\\s*renovate:\\s*datasource=(?<datasource>.*?) 
depName=(?<depName>.*?)( registryUrl=(?<registryUrl>.*?))?( versioning=(?<versioning>.*?))?\\n" ] } ] } DEBUG: processRepo() (repository=zadkiel/test-renovate) DEBUG: Processing 2 branches: renovate/ingress-nginx-3.x, renovate/kubernetes-kubernetes-1.x (repository=zadkiel/test-renovate) DEBUG: Calculated maximum PRs remaining this run (repository=zadkiel/test-renovate) "prsRemaining": 99 DEBUG: PullRequests limit = 99 (repository=zadkiel/test-renovate) DEBUG: Calculated maximum branches remaining this run (repository=zadkiel/test-renovate) "branchesRemaining": 99 DEBUG: Branches limit = 99 (repository=zadkiel/test-renovate) DEBUG: Setting current branch to master (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: latest commit (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) "branchName": "master", "latestCommitDate": "2021-08-13T12:44:51+02:00" DEBUG: getBranchPr(renovate/ingress-nginx-3.x) (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: findPr(renovate/ingress-nginx-3.x, undefined, open) (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Retrieving PR list (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Retrieved 3 Pull Requests (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: findPr(renovate/ingress-nginx-3.x, undefined, closed) (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: branchExists=false (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: dependencyDashboardCheck=undefined (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: recreateClosed is false (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: findPr(renovate/ingress-nginx-3.x, Update Helm release ingress-nginx to v3.35.0, !open) (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Found PR #2 
(repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Found closed PR with current title (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Retrieved closed PR list with graphql (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) "prNumbers": [2, 3] DEBUG: Returning from graphql closed PR list (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) DEBUG: Closed PR already exists. Skipping branch. (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) "prTitle": "Update Helm release ingress-nginx to v3.35.0" DEBUG: Merged PR is blocking this branch (repository=zadkiel/test-renovate, branch=renovate/ingress-nginx-3.x) "pr": 2 DEBUG: Setting current branch to master (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: latest commit (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) "branchName": "master", "latestCommitDate": "2021-08-13T12:44:51+02:00" DEBUG: getBranchPr(renovate/kubernetes-kubernetes-1.x) (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: findPr(renovate/kubernetes-kubernetes-1.x, undefined, open) (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Found PR #4 (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Returning from graphql open PR list (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: branchExists=true (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: dependencyDashboardCheck=undefined (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: PR rebase requested=false (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Checking if PR has been edited (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Found existing branch PR 
(repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Checking schedule(at any time, Europe/Paris) (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: No schedule defined (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Branch already exists (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: GitHub 404 (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) "url": "repos/zadkiel/test-renovate/branches/master/protection" DEBUG: No branch protection found (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Skipping stale branch check due to rebaseWhen=auto (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Branch does not need rebasing (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Using reuseExistingBranch: true (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: manager.getUpdatedPackageFiles() reuseExistinbranch=true (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Branch dep is already updated (repository=zadkiel/test-renovate, packageFile=components.yaml, branch=renovate/kubernetes-kubernetes-1.x) "depName": "kubernetes/kubernetes" DEBUG: No content changed (repository=zadkiel/test-renovate, packageFile=components.yaml, branch=renovate/kubernetes-kubernetes-1.x) "depName": "kubernetes/kubernetes" DEBUG: No package files need updating (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: No updated lock files in branch (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: No files to commit (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Checking if we can automerge branch (repository=zadkiel/test-renovate, 
branch=renovate/kubernetes-kubernetes-1.x) DEBUG: mergeStatus=no automerge (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Ensuring PR (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: There are 0 errors and 0 warnings (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Found existing PR (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Processing existing PR (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Pull Request #4 does not need updating (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: PR is not configured for automerge (repository=zadkiel/test-renovate, branch=renovate/kubernetes-kubernetes-1.x) DEBUG: Removing any stale branches (repository=zadkiel/test-renovate) DEBUG: config.repoIsOnboarded=true (repository=zadkiel/test-renovate) DEBUG: Branch lists (repository=zadkiel/test-renovate) "branchList": ["renovate/ingress-nginx-3.x", "renovate/kubernetes-kubernetes-1.x"], "renovateBranches": ["renovate/kubernetes-kubernetes-1.x"] DEBUG: remainingBranches= (repository=zadkiel/test-renovate) DEBUG: No branches to clean up (repository=zadkiel/test-renovate) DEBUG: Repository timing splits (milliseconds) (repository=zadkiel/test-renovate) "splits": {"init": 5208, "extract": 1414, "lookup": 2708, "update": 3181}, "total": 12525 DEBUG: http statistics (repository=zadkiel/test-renovate) "hostStats": [ "index.docker.io, 1 request, 1039ms request average, 0ms queue average", "k8s.gcr.io, 4 requests, 397ms request average, 0ms queue average", "github-enterprise.local, 6 requests, 662ms request average, 0ms queue average" ], "totalRequests": 11 INFO: Repository finished (repository=zadkiel/test-renovate) "durationMs": 12525 DEBUG: Renovate exiting ``` </details> ### Have you created a minimal reproduction repository? 
I have linked to a minimal reproduction repository in the bug description
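For reference, the malformed URLs in the log above (e.g. `https://k8s.gcr.io/v2//library/defaultbackend-amd64/tags/list?n=10000`) are what you get when a `registryUrl` that keeps its trailing `/` is concatenated with a path segment that starts with `/`. This is a minimal sketch of that failure mode and of one possible normalization — the helper name `join_url` is hypothetical, not Renovate's actual implementation:

```python
def join_url(base: str, *segments: str) -> str:
    """Join URL path segments while normalizing stray slashes."""
    parts = [base.rstrip("/")]  # drop the trailing "/" from the registry URL
    for seg in segments:
        seg = seg.strip("/")
        if seg:  # skip empty segments so "//" can never be emitted
            parts.append(seg)
    return "/".join(parts)

# Naive string concatenation reproduces the malformed URL from the log:
naive = "https://k8s.gcr.io/" + "v2/" + "/library/defaultbackend-amd64" + "/tags/list"
print(naive)
# https://k8s.gcr.io/v2//library/defaultbackend-amd64/tags/list

# Normalized join produces the URL the registry actually accepts:
print(join_url("https://k8s.gcr.io/", "v2", "/library/defaultbackend-amd64", "tags/list"))
# https://k8s.gcr.io/v2/library/defaultbackend-amd64/tags/list
```

The interesting detail is that `index.docker.io` happens to tolerate (or never receives) the doubled slash, while `k8s.gcr.io` answers 404, which matches the log showing failures only for the non-default registry.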
renovate branch renovate kubernetes kubernetes x debug skipping stale branch check due to rebasewhen auto repository zadkiel test renovate branch renovate kubernetes kubernetes x debug branch does not need rebasing repository zadkiel test renovate branch renovate kubernetes kubernetes x debug using reuseexistingbranch true repository zadkiel test renovate branch renovate kubernetes kubernetes x debug manager getupdatedpackagefiles reuseexistinbranch true repository zadkiel test renovate branch renovate kubernetes kubernetes x debug branch dep is already updated repository zadkiel test renovate packagefile components yaml branch renovate kubernetes kubernetes x depname kubernetes kubernetes debug no content changed repository zadkiel test renovate packagefile components yaml branch renovate kubernetes kubernetes x depname kubernetes kubernetes debug no package files need updating repository zadkiel test renovate branch renovate kubernetes kubernetes x debug no updated lock files in branch repository zadkiel test renovate branch renovate kubernetes kubernetes x debug no files to commit repository zadkiel test renovate branch renovate kubernetes kubernetes x debug checking if we can automerge branch repository zadkiel test renovate branch renovate kubernetes kubernetes x debug mergestatus no automerge repository zadkiel test renovate branch renovate kubernetes kubernetes x debug ensuring pr repository zadkiel test renovate branch renovate kubernetes kubernetes x debug there are errors and warnings repository zadkiel test renovate branch renovate kubernetes kubernetes x debug found existing pr repository zadkiel test renovate branch renovate kubernetes kubernetes x debug processing existing pr repository zadkiel test renovate branch renovate kubernetes kubernetes x debug pull request does not need updating repository zadkiel test renovate branch renovate kubernetes kubernetes x debug pr is not configured for automerge repository zadkiel test renovate branch renovate 
kubernetes kubernetes x debug removing any stale branches repository zadkiel test renovate debug config repoisonboarded true repository zadkiel test renovate debug branch lists repository zadkiel test renovate branchlist renovatebranches debug remainingbranches repository zadkiel test renovate debug no branches to clean up repository zadkiel test renovate debug repository timing splits milliseconds repository zadkiel test renovate splits init extract lookup update total debug http statistics repository zadkiel test renovate hoststats index docker io request request average queue average gcr io requests request average queue average github enterprise local requests request average queue average totalrequests info repository finished repository zadkiel test renovate durationms debug renovate exiting have you created a minimal reproduction repository i have linked to a minimal reproduction repository in the bug description
1
815
8,208,334,570
IssuesEvent
2018-09-04 00:54:24
xcat2/xcat-core
https://api.github.com/repos/xcat2/xcat-core
closed
rpm build failed on build xCAT-probe
component:automation status:pending
August 31st: *linux_rpm* build of branch `devel` ==> *FAILED* BUILD_MACHINE=`c910f04x33@xcat.build.server` BUILD_DIR=`/xcat/build/xcat2_autobuild_daily_builds/20180831.0615/linux_rpm ``` Error: build of the following RPMs failed: xCAT-probe 2018-08-31 10:17:51,909 buildxcat WARNING:Wide character in print at /usr/share/perl5/Pod/Html.pm line 435. error: bad date in %changelog: Thu, 30 Aug 2018 - GONG Jie <gongjie@linux.vnet.ibm.com> ```
1.0
rpm build failed on build xCAT-probe - August 31st: *linux_rpm* build of branch `devel` ==> *FAILED* BUILD_MACHINE=`c910f04x33@xcat.build.server` BUILD_DIR=`/xcat/build/xcat2_autobuild_daily_builds/20180831.0615/linux_rpm ``` Error: build of the following RPMs failed: xCAT-probe 2018-08-31 10:17:51,909 buildxcat WARNING:Wide character in print at /usr/share/perl5/Pod/Html.pm line 435. error: bad date in %changelog: Thu, 30 Aug 2018 - GONG Jie <gongjie@linux.vnet.ibm.com> ```
non_priority
rpm build failed on build xcat probe august linux rpm build of branch devel failed build machine xcat build server build dir xcat build autobuild daily builds linux rpm error build of the following rpms failed xcat probe buildxcat warning wide character in print at usr share pod html pm line error bad date in changelog thu aug gong jie
0
39,851
9,691,523,621
IssuesEvent
2019-05-24 11:27:47
cakephp/bake
https://api.github.com/repos/cakephp/bake
closed
4.x - Can't bake custom view template
Defect
## What you did I baked my plugin and copied across the templates as per the documentation. ```bash ~/Sites/CakePHP4 $ bin/cake bake template -f --theme=CakeBootstrap --prefix=Admin Questions view Baking `view` view template file... File `/Users/davidyell/Sites/CakePHP4/src/../templates/Admin/Questions/view.php` exists Do you want to overwrite? (y/n/a/q) [n] > a Wrote `/Users/davidyell/Sites/CakePHP4/src/../templates/Admin/Questions/view.php` ``` ## What happened? When I baked my templates the 'view' template was baked using the default template and not the one included in my bake theme plugin. ## Things you've tried I tried making my `CakeBootstrap/templates/bake/Template/view.twig` template just contain the words 'A view template', to ensure I didn't have any Twig errors. I can generate my index, add and edit templates correctly using the same command. ## Show us the code https://github.com/davidyell/Learning-CakePHP4/tree/master/plugins/CakeBootstrap/templates/bake/Template
1.0
4.x - Can't bake custom view template - ## What you did I baked my plugin and copied across the templates as per the documentation. ```bash ~/Sites/CakePHP4 $ bin/cake bake template -f --theme=CakeBootstrap --prefix=Admin Questions view Baking `view` view template file... File `/Users/davidyell/Sites/CakePHP4/src/../templates/Admin/Questions/view.php` exists Do you want to overwrite? (y/n/a/q) [n] > a Wrote `/Users/davidyell/Sites/CakePHP4/src/../templates/Admin/Questions/view.php` ``` ## What happened? When I baked my templates the 'view' template was baked using the default template and not the one included in my bake theme plugin. ## Things you've tried I tried making my `CakeBootstrap/templates/bake/Template/view.twig` template just contain the words 'A view template', to ensure I didn't have any Twig errors. I can generate my index, add and edit templates correctly using the same command. ## Show us the code https://github.com/davidyell/Learning-CakePHP4/tree/master/plugins/CakeBootstrap/templates/bake/Template
non_priority
x can t bake custom view template what you did i baked my plugin and copied across the templates as per the documentation bash sites bin cake bake template f theme cakebootstrap prefix admin questions view baking view view template file file users davidyell sites src templates admin questions view php exists do you want to overwrite y n a q a wrote users davidyell sites src templates admin questions view php what happened when i baked my templates the view template was baked using the default template and not the one included in my bake theme plugin things you ve tried i tried making my cakebootstrap templates bake template view twig template just contain the words a view template to ensure i didn t have any twig errors i can generate my index add and edit templates correctly using the same command show us the code
0
300,375
9,210,151,316
IssuesEvent
2019-03-09 02:06:15
ac2cz/FoxTelem
https://api.github.com/repos/ac2cz/FoxTelem
closed
FCD Pro- Bad Behavior on RPi
Fox In A Box/RaspPi fcd low priority wontfix
This is from Barry, WD4ASW. He did not realize that the original FCDP (which I call FCDP-) did not work, so he plugged it into an FIAB. It did not appear as a source, so he chose RTL-SDR (?) and started. He got an error box saying the device was not found, but clicking OK just brought up the error box again, so he could not get rid of FoxTelem except by rebooting (he does not know about kill etc). Plugging in an FCDP+ apparently got him going again. So the issue I'm reporting here is not that FCDP- is not supported but rather that it can get into an infinite loop of errors such that you can't apparently stop it easily. I'd suggest that a device error would automatically set "start" to off. (BTW, I tried and did not see this behavior with an FCDP+), He also did not see that the FCDP- was not supported. I'll update the FIAB docs; I'm not sure if it is in the FoxTelem docs or not.
1.0
FCD Pro- Bad Behavior on RPi - This is from Barry, WD4ASW. He did not realize that the original FCDP (which I call FCDP-) did not work, so he plugged it into an FIAB. It did not appear as a source, so he chose RTL-SDR (?) and started. He got an error box saying the device was not found, but clicking OK just brought up the error box again, so he could not get rid of FoxTelem except by rebooting (he does not know about kill etc). Plugging in an FCDP+ apparently got him going again. So the issue I'm reporting here is not that FCDP- is not supported but rather that it can get into an infinite loop of errors such that you can't apparently stop it easily. I'd suggest that a device error would automatically set "start" to off. (BTW, I tried and did not see this behavior with an FCDP+), He also did not see that the FCDP- was not supported. I'll update the FIAB docs; I'm not sure if it is in the FoxTelem docs or not.
priority
fcd pro bad behavior on rpi this is from barry he did not realize that the original fcdp which i call fcdp did not work so he plugged it into an fiab it did not appear as a source so he chose rtl sdr and started he got an error box saying the device was not found but clicking ok just brought up the error box again so he could not get rid of foxtelem except by rebooting he does not know about kill etc plugging in an fcdp apparently got him going again so the issue i m reporting here is not that fcdp is not supported but rather that it can get into an infinite loop of errors such that you can t apparently stop it easily i d suggest that a device error would automatically set start to off btw i tried and did not see this behavior with an fcdp he also did not see that the fcdp was not supported i ll update the fiab docs i m not sure if it is in the foxtelem docs or not
1
85,009
3,683,650,228
IssuesEvent
2016-02-24 14:50:35
emoncms/MyHomeEnergyPlanner
https://api.github.com/repos/emoncms/MyHomeEnergyPlanner
closed
Top bar figures not changing after measures applied.
bug High priority
In "63 Nicholas rd V4" having improved several elements the heat losses have dropped but the space heat demand , primary energy etc have not.
1.0
Top bar figures not changing after measures applied. - In "63 Nicholas rd V4" having improved several elements the heat losses have dropped but the space heat demand , primary energy etc have not.
priority
top bar figures not changing after measures applied in nicholas rd having improved several elements the heat losses have dropped but the space heat demand primary energy etc have not
1
417,063
28,110,078,034
IssuesEvent
2023-03-31 06:18:47
jerrrren/ped
https://api.github.com/repos/jerrrren/ped
opened
Missing explanation for input in UG
severity.Medium type.DocumentationBug
The UG sheed explain the command input, specifically [ct/{ind|env}] syntax is not explained in the UG. <!--session: 1680243402490-26cf3b09-9882-4ab8-a78c-2fd363fb09d5--> <!--Version: Web v3.4.7-->
1.0
Missing explanation for input in UG - The UG sheed explain the command input, specifically [ct/{ind|env}] syntax is not explained in the UG. <!--session: 1680243402490-26cf3b09-9882-4ab8-a78c-2fd363fb09d5--> <!--Version: Web v3.4.7-->
non_priority
missing explanation for input in ug the ug sheed explain the command input specifically syntax is not explained in the ug
0
6,400
14,502,894,231
IssuesEvent
2020-12-11 21:45:52
sg-s/xolotl
https://api.github.com/repos/sg-s/xolotl
opened
why is getFullStateSize a function?
C++ weak-architecture
isn't it sufficient to make it a property? that is public?
1.0
why is getFullStateSize a function? - isn't it sufficient to make it a property? that is public?
non_priority
why is getfullstatesize a function isn t it sufficient to make it a property that is public
0
213,814
16,539,106,486
IssuesEvent
2021-05-27 14:48:52
alvgomper1/Acme-Planner
https://api.github.com/repos/alvgomper1/Acme-Planner
closed
Task-004: Test crear shout y publicarlo sin spam(Anonymous)
test
Write a shout and publish it as long as it is not considered spam (Probar solo la parte de spam).
1.0
Task-004: Test crear shout y publicarlo sin spam(Anonymous) - Write a shout and publish it as long as it is not considered spam (Probar solo la parte de spam).
non_priority
task test crear shout y publicarlo sin spam anonymous write a shout and publish it as long as it is not considered spam probar solo la parte de spam
0
42,075
9,127,062,926
IssuesEvent
2019-02-25 02:09:38
EdenServer/community
https://api.github.com/repos/EdenServer/community
closed
Trade for Puppetry Tobe not working
in-code-review
Dhima Polevhia isn't taking the items for Puppetry Tobe. 1 Ruby 1 Moblinweave 1 Scarlet Linen 1 Wamoura Cloth 1 Imperial Gold Piece The other 2 AF piece trades worked fine.
1.0
Trade for Puppetry Tobe not working - Dhima Polevhia isn't taking the items for Puppetry Tobe. 1 Ruby 1 Moblinweave 1 Scarlet Linen 1 Wamoura Cloth 1 Imperial Gold Piece The other 2 AF piece trades worked fine.
non_priority
trade for puppetry tobe not working dhima polevhia isn t taking the items for puppetry tobe ruby moblinweave scarlet linen wamoura cloth imperial gold piece the other af piece trades worked fine
0
377,480
11,171,474,088
IssuesEvent
2019-12-28 20:04:16
clinwiki-org/clinwiki
https://api.github.com/repos/clinwiki-org/clinwiki
closed
Implement Type Ahead Using Facets
Priority 1
--from email exchange-- At the last meeting we spoke about being able to limit the number of facets in the autocomplete. This simplifies the problem and allows us to use the current API. There is already an api that enables us to filter on one facet. We can reuse this endpoint calling it for each of the facets and then combining the results in the front end (or we may just go ahead and handle this on the backend). This way we should be able to complete the autocomplete in a matter of days. I measured each call taking on average 200ms which seems acceptable. Most of the work for this will be for converting the current input field into a field that can handle autocomplete.
1.0
Implement Type Ahead Using Facets - --from email exchange-- At the last meeting we spoke about being able to limit the number of facets in the autocomplete. This simplifies the problem and allows us to use the current API. There is already an api that enables us to filter on one facet. We can reuse this endpoint calling it for each of the facets and then combining the results in the front end (or we may just go ahead and handle this on the backend). This way we should be able to complete the autocomplete in a matter of days. I measured each call taking on average 200ms which seems acceptable. Most of the work for this will be for converting the current input field into a field that can handle autocomplete.
priority
implement type ahead using facets from email exchange at the last meeting we spoke about being able to limit the number of facets in the autocomplete this simplifies the problem and allows us to use the current api there is already an api that enables us to filter on one facet we can reuse this endpoint calling it for each of the facets and then combining the results in the front end or we may just go ahead and handle this on the backend this way we should be able to complete the autocomplete in a matter of days i measured each call taking on average which seems acceptable most of the work for this will be for converting the current input field into a field that can handle autocomplete
1
481,405
13,885,701,801
IssuesEvent
2020-10-18 21:09:23
conan-io/conan
https://api.github.com/repos/conan-io/conan
closed
[feature] conan config install a single file
Hacktoberfest complex: low good first issue priority: low stage: queue type: feature
When using `conan config install`, allow a single configuration file to be installed. Normally, the command requires a git repository, a local folder, or a zip file that contains the file(s) to be installed. If I have a single file I want installed, I would have to put it into a folder or zip it into a file. The current result: ``` conan config install my_settings\settings.yml Traceback (most recent call last): File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\command.py", line 1859, in run method(args[0][1:]) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\command.py", line 554, in config target_folder=args.target_folder) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conan_api.py", line 78, in wrapper return f(*args, **kwargs) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conan_api.py", line 611, in config_install source_folder=source_folder, target_folder=target_folder) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conf\config_installer.py", line 235, in configuration_install _process_config(config, cache, output, requester) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conf\config_installer.py", line 190, in _process_config _process_zip_file(config, config.uri, cache, output, tmp_folder) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conf\config_installer.py", line 63, in _process_zip_file unzip(zippath, tmp_folder, output=output) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\tools\files.py", line 104, in unzip with zipfile.ZipFile(filename, "r") as z: File "C:\Program Files\Python37\lib\zipfile.py", line 1225, in __init__ self._RealGetContents() File "C:\Program Files\Python37\lib\zipfile.py", line 1292, in _RealGetContents raise BadZipFile("File is not a zip 
file") zipfile.BadZipFile: File is not a zip file ERROR: File is not a zip file ``` This feature is useful in the case where you only want to install specific files from a single folder. For example, ``` my_settings\ settings.yml remotes.txt conan config install my_settings (Installs settings.yml and remotes.txt) conan config install my_settings\settings.yml (Installs settings.yml and ignores remotes.txt) ``` It is also useful when the current folder structure does not facilitate creating a folder specifically for configuration files (e.g. a git repository). ``` my_git_repo\ settings.yml unrelated.txt ... conan config install .\ Installing settings.yml Copying file unrelated.txt to C:\Users\Kyle.Kaja\.conan\. (I don't want this installed here!!) ... OR mkdir temp_settings copy settings.yml temp_settings conan config install temp_settings Installing settings.yml rmdir temp_settings /s (Remove the directory so Git doesn't try to track that new file.) (So much to do when all I want to do is install a single configuration file.) PREFERRED conan config install settings.yml (Ahh! Simple, short, and intuitive!) Installing settings.yml ```
1.0
[feature] conan config install a single file - When using `conan config install`, allow a single configuration file to be installed. Normally, the command requires a git repository, a local folder, or a zip file that contains the file(s) to be installed. If I have a single file I want installed, I would have to put it into a folder or zip it into a file. The current result: ``` conan config install my_settings\settings.yml Traceback (most recent call last): File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\command.py", line 1859, in run method(args[0][1:]) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\command.py", line 554, in config target_folder=args.target_folder) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conan_api.py", line 78, in wrapper return f(*args, **kwargs) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conan_api.py", line 611, in config_install source_folder=source_folder, target_folder=target_folder) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conf\config_installer.py", line 235, in configuration_install _process_config(config, cache, output, requester) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conf\config_installer.py", line 190, in _process_config _process_zip_file(config, config.uri, cache, output, tmp_folder) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\conf\config_installer.py", line 63, in _process_zip_file unzip(zippath, tmp_folder, output=output) File "C:\Users\Kyle.Kaja\AppData\Roaming\Python\Python37\site-packages\conans\client\tools\files.py", line 104, in unzip with zipfile.ZipFile(filename, "r") as z: File "C:\Program Files\Python37\lib\zipfile.py", line 1225, in __init__ self._RealGetContents() File "C:\Program Files\Python37\lib\zipfile.py", line 1292, in 
_RealGetContents raise BadZipFile("File is not a zip file") zipfile.BadZipFile: File is not a zip file ERROR: File is not a zip file ``` This feature is useful in the case where you only want to install specific files from a single folder. For example, ``` my_settings\ settings.yml remotes.txt conan config install my_settings (Installs settings.yml and remotes.txt) conan config install my_settings\settings.yml (Installs settings.yml and ignores remotes.txt) ``` It is also useful when the current folder structure does not facilitate creating a folder specifically for configuration files (e.g. a git repository). ``` my_git_repo\ settings.yml unrelated.txt ... conan config install .\ Installing settings.yml Copying file unrelated.txt to C:\Users\Kyle.Kaja\.conan\. (I don't want this installed here!!) ... OR mkdir temp_settings copy settings.yml temp_settings conan config install temp_settings Installing settings.yml rmdir temp_settings /s (Remove the directory so Git doesn't try to track that new file.) (So much to do when all I want to do is install a single configuration file.) PREFERRED conan config install settings.yml (Ahh! Simple, short, and intuitive!) Installing settings.yml ```
priority
conan config install a single file when using conan config install allow a single configuration file to be installed normally the command requires a git repository a local folder or a zip file that contains the file s to be installed if i have a single file i want installed i would have to put it into a folder or zip it into a file the current result conan config install my settings settings yml traceback most recent call last file c users kyle kaja appdata roaming python site packages conans client command py line in run method args file c users kyle kaja appdata roaming python site packages conans client command py line in config target folder args target folder file c users kyle kaja appdata roaming python site packages conans client conan api py line in wrapper return f args kwargs file c users kyle kaja appdata roaming python site packages conans client conan api py line in config install source folder source folder target folder target folder file c users kyle kaja appdata roaming python site packages conans client conf config installer py line in configuration install process config config cache output requester file c users kyle kaja appdata roaming python site packages conans client conf config installer py line in process config process zip file config config uri cache output tmp folder file c users kyle kaja appdata roaming python site packages conans client conf config installer py line in process zip file unzip zippath tmp folder output output file c users kyle kaja appdata roaming python site packages conans client tools files py line in unzip with zipfile zipfile filename r as z file c program files lib zipfile py line in init self realgetcontents file c program files lib zipfile py line in realgetcontents raise badzipfile file is not a zip file zipfile badzipfile file is not a zip file error file is not a zip file this feature is useful in the case where you only want to install specific files from a single folder for example my settings settings 
yml remotes txt conan config install my settings installs settings yml and remotes txt conan config install my settings settings yml installs settings yml and ignores remotes txt it is also useful when the current folder structure does not facilitate creating a folder specifically for configuration files e g a git repository my git repo settings yml unrelated txt conan config install installing settings yml copying file unrelated txt to c users kyle kaja conan i don t want this installed here or mkdir temp settings copy settings yml temp settings conan config install temp settings installing settings yml rmdir temp settings s remove the directory so git doesn t try to track that new file so much to do when all i want to do is install a single configuration file preferred conan config install settings yml ahh simple short and intuitive installing settings yml
1
217,433
16,855,775,160
IssuesEvent
2021-06-21 06:24:53
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
raftstore::test_replication_mode::test_dr_auto_sync failed
component/test-bench
raftstore::test_replication_mode::test_dr_auto_sync Latest failed builds: https://internal.pingcap.net/idc-jenkins/job/tikv_ghpr_test/22703/consoleFull
1.0
raftstore::test_replication_mode::test_dr_auto_sync failed - raftstore::test_replication_mode::test_dr_auto_sync Latest failed builds: https://internal.pingcap.net/idc-jenkins/job/tikv_ghpr_test/22703/consoleFull
non_priority
raftstore test replication mode test dr auto sync failed raftstore test replication mode test dr auto sync latest failed builds
0
483,148
13,919,394,004
IssuesEvent
2020-10-21 09:01:19
geosolutions-it/MapStore2
https://api.github.com/repos/geosolutions-it/MapStore2
reopened
Identify maxItems cfg option removed
Good first issue Priority: Medium user feedback
### Description Hi In one of my localConfig.json I have set maxItems for the Identify plugin { "name": "Identify", "cfg": { "showHighlightFeatureButton": true, "draggable": false, "dock": true, "viewerOptions": { "container": "{context.ReactSwipe}" }, "maxItems": 200 }, "override": { "Toolbar": { "position": 11 } } }, However this cfg setting is no longer available, is there a replacement setting that I have missed? Was removed at this commit... https://github.com/geosolutions-it/MapStore2/commit/e68ce42ca1ad311767ec229ba8afcab83ce13e2b#diff-cb0e46ea9c7290f7a328445a8d05f9d7 This is very important as we are not able to display the correct number of records, for instance in the screenshot below there should be 39 Building Consents... ![wairoa_identify_error_no_records_displayed](https://user-images.githubusercontent.com/40550145/65396876-3c534580-ddff-11e9-8d39-b1225580009e.png) Cheers Si <!-- probot = {"9274843":{"who":"tdipisa","what":"","when":"2020-09-28T09:00:00.000Z"}} -->
1.0
Identify maxItems cfg option removed - ### Description Hi In one of my localConfig.json I have set maxItems for the Identify plugin { "name": "Identify", "cfg": { "showHighlightFeatureButton": true, "draggable": false, "dock": true, "viewerOptions": { "container": "{context.ReactSwipe}" }, "maxItems": 200 }, "override": { "Toolbar": { "position": 11 } } }, However this cfg setting is no longer available, is there a replacement setting that I have missed? Was removed at this commit... https://github.com/geosolutions-it/MapStore2/commit/e68ce42ca1ad311767ec229ba8afcab83ce13e2b#diff-cb0e46ea9c7290f7a328445a8d05f9d7 This is very important as we are not able to display the correct number of records, for instance in the screenshot below there should be 39 Building Consents... ![wairoa_identify_error_no_records_displayed](https://user-images.githubusercontent.com/40550145/65396876-3c534580-ddff-11e9-8d39-b1225580009e.png) Cheers Si <!-- probot = {"9274843":{"who":"tdipisa","what":"","when":"2020-09-28T09:00:00.000Z"}} -->
priority
identify maxitems cfg option removed description hi in one of my localconfig json i have set maxitems for the identify plugin name identify cfg showhighlightfeaturebutton true draggable false dock true vieweroptions container context reactswipe maxitems override toolbar position however this cfg setting is no longer available is there a replacement setting that i have missed was removed at this commit this is very important as we are not able to display the correct number of records for instance in the screenshot below there should be building consents cheers si
1
35,540
17,113,197,471
IssuesEvent
2021-07-10 19:28:12
aperta-principium/Interclip
https://api.github.com/repos/aperta-principium/Interclip
closed
Add Redis
enhancement performance
Add a Redis server and use it for retrieving clips, as it could drastically improve load times.
True
Add Redis - Add a Redis server and use it for retrieving clips, as it could drastically improve load times.
non_priority
add redis add a redis server and use it for retrieving clips as it could drastically improve load times
0
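The Redis record above proposes caching to speed up clip retrieval; this is essentially the cache-aside pattern. Below is a minimal sketch in Python using a plain dict as a stand-in for a Redis client (so no server is assumed); `fetch_clip_from_db` and the clip shape are hypothetical, not taken from the Interclip codebase.

```python
import time

class CacheAsideStore:
    """Cache-aside lookup: try the cache first, fall back to the
    slow store on a miss, then populate the cache with a TTL."""

    def __init__(self, loader, ttl_seconds=60):
        self._loader = loader     # slow backend, e.g. a SQL query
        self._ttl = ttl_seconds
        self._cache = {}          # stand-in for a Redis client

    def get(self, key):
        entry = self._cache.get(key)
        if entry is not None:
            value, expires_at = entry
            if time.monotonic() < expires_at:
                return value      # cache hit: no DB round trip
            del self._cache[key]  # expired entry, evict it
        value = self._loader(key)                # cache miss
        self._cache[key] = (value, time.monotonic() + self._ttl)
        return value

calls = []
def fetch_clip_from_db(code):     # hypothetical slow lookup
    calls.append(code)
    return {"code": code, "url": "https://example.com/" + code}

store = CacheAsideStore(fetch_clip_from_db, ttl_seconds=60)
store.get("abc")
store.get("abc")                  # second read is served from cache
print(len(calls))                 # -> 1 (the DB was hit only once)
```

With a real Redis client the dict operations would become `GET`/`SETEX` calls, and the TTL bookkeeping would be handled server-side.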
493,940
14,241,796,646
IssuesEvent
2020-11-19 00:13:03
sButtons/sbuttons
https://api.github.com/repos/sButtons/sbuttons
closed
[BUTTON IDEA]: Funky Button
Hacktoberfest Priority: Low button-idea good first issue help wanted stale-issue
**1. Name of button**: Funky Button **2. Description**: A button whose color is originally outside the border but on hover becomes inside **3. Button Type (Animated, Special, etc...)**: Animated **4. Will you work on it?**: No; if you want to work on it, please comment to be assigned. Here's an example of what the button should act like: [https://codepen.io/halvo/full/pRdeja](https://codepen.io/halvo/full/pRdeja)
1.0
[BUTTON IDEA]: Funky Button - **1. Name of button**: Funky Button **2. Description**: A button whose color is originally outside the border but on hover becomes inside **3. Button Type (Animated, Special, etc...)**: Animated **4. Will you work on it?**: No; if you want to work on it, please comment to be assigned. Here's an example of what the button should act like: [https://codepen.io/halvo/full/pRdeja](https://codepen.io/halvo/full/pRdeja)
priority
funky button name of button funky button description a button that s color is originally outside the border but on hover becomes inside button type animated special etc animated will you work on it no if you want to work on it please comment to be assigned here s an example of what the button should act like
1
36,426
6,533,697,426
IssuesEvent
2017-08-31 07:43:54
linkedresearch/linkedresearch.org
https://api.github.com/repos/linkedresearch/linkedresearch.org
opened
Specify data shape for annotation notifications
documentation RDF
Annotations can be subdivided to Web Annotation motivations. Some relevant ones listed in ballpark priority that should be specified: * [ ] assessing * [ ] replying * [ ] commenting * [ ] describing * [ ] linking * [ ] moderating * [ ] tagging * [ ] questioning ..
1.0
Specify data shape for annotation notifications - Annotations can be subdivided to Web Annotation motivations. Some relevant ones listed in ballpark priority that should be specified: * [ ] assessing * [ ] replying * [ ] commenting * [ ] describing * [ ] linking * [ ] moderating * [ ] tagging * [ ] questioning ..
non_priority
specify data shape for annotation notifications annotations can be subdivided to web annotation motivations some relevant ones listed in ballpark priority that should be specified assessing replying commenting describing linking moderating tagging questioning
0
274,741
30,123,016,300
IssuesEvent
2023-06-30 16:46:52
bcgov/cloud-pathfinder
https://api.github.com/repos/bcgov/cloud-pathfinder
closed
Security review of AWS services (EMR Serverless, Redshift, Kinesis, EC2 ) identified through GDX-Analytics CAPS
Security Privacy
**Describe the issue** We identified some AWS services like EMR (Elastic Map Reduce) Serverless, Redshift, Kinesis, EC2 used by GDX-Analytics's Core Analytics Pipeline Service (CAPS). We are looking forward to carrying out compliance work for these services and marking them as reviewed in the list of services available for our teams. SoAR(PDF + word format) and PIA (PDF format) are shared by GDX-Analytics teams. **Additional context** Add any other context, attachments or screenshots **Definition of done** - review the use of the services - coordinate with privacy review - review with PO and like AWS Polly, make some tile updates
True
Security review of AWS services (EMR Serverless, Redshift, Kinesis, EC2 ) identified through GDX-Analytics CAPS - **Describe the issue** We identified some AWS services like EMR (Elastic Map Reduce) Serverless, Redshift, Kinesis, EC2 used by GDX-Analytics's Core Analytics Pipeline Service (CAPS). We are looking forward to carrying out compliance work for these services and marking them as reviewed in the list of services available for our teams. SoAR(PDF + word format) and PIA (PDF format) are shared by GDX-Analytics teams. **Additional context** Add any other context, attachments or screenshots **Definition of done** - review the use of the services - coordinate with privacy review - review with PO and like AWS Polly, make some tile updates
non_priority
security review of aws services emr serverless redshift kinesis identified through gdx analytics caps describe the issue we identified some aws services like emr elastic map reduce serverless redshift kinesis used by gdx analytics s core analytics pipeline service caps we are looking forward to carrying out compliance work for these services and marking them as reviewed in the list of services available for our teams soar pdf word format and pia pdf format are shared by gdx analytics teams additional context add any other context attachments or screenshots definition of done review the use of the services coordinate with privacy review review with po and like aws polly make some tile updates
0
24,337
3,900,778,595
IssuesEvent
2016-04-18 08:04:05
geetsisbac/WCVVENIXYFVIRBXH3BYTI6TE
https://api.github.com/repos/geetsisbac/WCVVENIXYFVIRBXH3BYTI6TE
closed
G+r36omqGgVrc/X5y5fJdTnXUmSIeGZe3b02fvTlAfPOt5YhvyMxbH/tnvENVEAF+AjOYxHn414qNi2ohu05AaLnyceCYQ1cTEdaXgZipiNxav4yoscVis7LFc3oxXZPZWeWgzYn36i5y7bs6+1cQ7RkGYD/MW2FkMXoG5IILUs=
design
VEFAi63WMg7q48WvQkRKMoq8yYiNqX1PLeogTSc2f9r8D4X+1XmMbqMaFD4sUTEYmGzyOJ40kUKBNOzNy1JUbYZBZJcEe0QAEWty4hMR4rLaF4J7d1hV3ApDoUbSZ7ltsAm6yVKxBfjwKhEQMCguVmmrnbd0LaJr1XiXCgGP6IAlj+KZA4bMLheMsp8oA0cANQu1WUQ7YNkn+jKZioSmkMlr2G2lP2W0zfHw2VIIel1CHMKLGBzK4f5V+9W17RaFUqADXdOLR5P5eIxRpdhAN6GQ7rWy/pxhwK3D5s6Mb0AU8G1a3OzypqVUkTJ/D6eX6CTDG/COF8DdpBWueJ4fIfBr7dbAcRv9TF0CFwWF9tPci/PnBxZO4VGyNF442DO0pnfbg1lrH6VVg2uUtbO9T8BXLBjNyIfs2/a5S1n+5cAU0Sn7ijoKvUo8bSpROgrQFMFu6uV3HHg5fOSKEzD2zv5YvR6R/AVB0AbgVkFh5k0np2AkHgAdbPiNpVjz9AfXL+fMg4nLYd1aA7jqoeJPl8OJjW3/rhff45ZS4H57SgEMLcuq6nJWdS3/qP3bF3jP4H2/EksHXDphWOUosZWDwi9CVw7Tn5/FHmI/o6n7YiQ0Q/gV7U1tIWGJJsxiHU/JcKDOYNGGLT0VShy/F6vAMUSPHqma5UAw1ix6r4h24VMwYkRFHohYqxJ1vcCOZajAk5ScdpJOpBY7Nm+BngFVmrrs4neBVv0MTGFKzgOC4AzuqZjT7W8UyoUrkeWR7/O7XfxJMYlLJPQYLfbyckKi8pOUnHaSTqQWOzZvgZ4BVZq67OJ3gVb9DExhSs4DguAM4OalJGM6ZXXDfkvaoILld88ZW/b6T2wd/VdEWXN63rVvl8b41IuY1E9qy5TY65MCKto3ndb4DHpS9HRUj3soMM+j0o2jT/73XnEnJNrt2duGZtS2rGCXhxscS0nN9ad2/4/FQGXs1y6TEJBUbVKZREPb+35tGCD4LJhk5t3rcFRYbuBphff/kJKhiUIVFuvx/jvQ3ip62D1nMIRy/obuvEUi/DS2we6kv0bP1qb4LvcmoA3Qc+1oMrF4IHZiAeRzgvvdmYFU0Kju39v9+NkwsmW9pyc54Qm3x1jt7ZvEvEBpPlkNwEiVV1EJz+KSelaa5c99K7x/lmx5AESy6yEyxw1BN4ogzTfRCBYtcPEqjXhpbLtl2bbbLSD7cDaPH4ycBF0fUO68f8hbRMWFQfPnu8dMCVSWoTe7hX6D5cXOlCGr5+7cIP1BErd1gYYITexhlnVznU2t2Bylj3ER9xYmTH2W21acC/d5eXwpsHWAgtFvefs/IttCYNM6bMtj+ocKHeFcNZyXjZ77A3Bh6Nvvdd/5vsA6N4TLfUGY8e586Pp4gIIe4NduD1vE/ED2BViqe3ErG5pgERi4Qx7iV2JC21gwoUW72TF18sqBXDvAZjGN8lv3iTSuK5McjEC1RN6X3QX1t5bN0dLtJa4/dTiMos348uWCObzK5lAYl4OCkrWHLkNV9Wv15O62cWkBsdz3yuNpyOpBPzNYJEG8V/43i6hvgLSGzDia7Ohtfq2Duakl9ONRCXHDwJWlv3p6Jo1t/mvuamKvS9YTVGcCtWdKs3y76shks935V/3/l+O2TDVZzFmcxPysmRA23K6hfm7xfojAQXwR4PAWSxiiZzo6iE0T0yIV8HlLTF5w2F56gYCZQk8YupcZvDunZwqeprO03/JEvR5gYChAh1jg8p86cqwZvLKjxU9m6JJdjhgDcWqYVk01s/l631bfXMUvNEAdPdnQH4els0lWov/RYvHGOkNSlsS7tTMCT6zASSCIgWk=
1.0
G+r36omqGgVrc/X5y5fJdTnXUmSIeGZe3b02fvTlAfPOt5YhvyMxbH/tnvENVEAF+AjOYxHn414qNi2ohu05AaLnyceCYQ1cTEdaXgZipiNxav4yoscVis7LFc3oxXZPZWeWgzYn36i5y7bs6+1cQ7RkGYD/MW2FkMXoG5IILUs= - VEFAi63WMg7q48WvQkRKMoq8yYiNqX1PLeogTSc2f9r8D4X+1XmMbqMaFD4sUTEYmGzyOJ40kUKBNOzNy1JUbYZBZJcEe0QAEWty4hMR4rLaF4J7d1hV3ApDoUbSZ7ltsAm6yVKxBfjwKhEQMCguVmmrnbd0LaJr1XiXCgGP6IAlj+KZA4bMLheMsp8oA0cANQu1WUQ7YNkn+jKZioSmkMlr2G2lP2W0zfHw2VIIel1CHMKLGBzK4f5V+9W17RaFUqADXdOLR5P5eIxRpdhAN6GQ7rWy/pxhwK3D5s6Mb0AU8G1a3OzypqVUkTJ/D6eX6CTDG/COF8DdpBWueJ4fIfBr7dbAcRv9TF0CFwWF9tPci/PnBxZO4VGyNF442DO0pnfbg1lrH6VVg2uUtbO9T8BXLBjNyIfs2/a5S1n+5cAU0Sn7ijoKvUo8bSpROgrQFMFu6uV3HHg5fOSKEzD2zv5YvR6R/AVB0AbgVkFh5k0np2AkHgAdbPiNpVjz9AfXL+fMg4nLYd1aA7jqoeJPl8OJjW3/rhff45ZS4H57SgEMLcuq6nJWdS3/qP3bF3jP4H2/EksHXDphWOUosZWDwi9CVw7Tn5/FHmI/o6n7YiQ0Q/gV7U1tIWGJJsxiHU/JcKDOYNGGLT0VShy/F6vAMUSPHqma5UAw1ix6r4h24VMwYkRFHohYqxJ1vcCOZajAk5ScdpJOpBY7Nm+BngFVmrrs4neBVv0MTGFKzgOC4AzuqZjT7W8UyoUrkeWR7/O7XfxJMYlLJPQYLfbyckKi8pOUnHaSTqQWOzZvgZ4BVZq67OJ3gVb9DExhSs4DguAM4OalJGM6ZXXDfkvaoILld88ZW/b6T2wd/VdEWXN63rVvl8b41IuY1E9qy5TY65MCKto3ndb4DHpS9HRUj3soMM+j0o2jT/73XnEnJNrt2duGZtS2rGCXhxscS0nN9ad2/4/FQGXs1y6TEJBUbVKZREPb+35tGCD4LJhk5t3rcFRYbuBphff/kJKhiUIVFuvx/jvQ3ip62D1nMIRy/obuvEUi/DS2we6kv0bP1qb4LvcmoA3Qc+1oMrF4IHZiAeRzgvvdmYFU0Kju39v9+NkwsmW9pyc54Qm3x1jt7ZvEvEBpPlkNwEiVV1EJz+KSelaa5c99K7x/lmx5AESy6yEyxw1BN4ogzTfRCBYtcPEqjXhpbLtl2bbbLSD7cDaPH4ycBF0fUO68f8hbRMWFQfPnu8dMCVSWoTe7hX6D5cXOlCGr5+7cIP1BErd1gYYITexhlnVznU2t2Bylj3ER9xYmTH2W21acC/d5eXwpsHWAgtFvefs/IttCYNM6bMtj+ocKHeFcNZyXjZ77A3Bh6Nvvdd/5vsA6N4TLfUGY8e586Pp4gIIe4NduD1vE/ED2BViqe3ErG5pgERi4Qx7iV2JC21gwoUW72TF18sqBXDvAZjGN8lv3iTSuK5McjEC1RN6X3QX1t5bN0dLtJa4/dTiMos348uWCObzK5lAYl4OCkrWHLkNV9Wv15O62cWkBsdz3yuNpyOpBPzNYJEG8V/43i6hvgLSGzDia7Ohtfq2Duakl9ONRCXHDwJWlv3p6Jo1t/mvuamKvS9YTVGcCtWdKs3y76shks935V/3/l+O2TDVZzFmcxPysmRA23K6hfm7xfojAQXwR4PAWSxiiZzo6iE0T0yIV8HlLTF5w2F56gYCZQk8YupcZvDunZwqeprO03/JEvR5gYChAh1jg8p86cqwZvLKjxU9m6JJdjhgDcWqYVk01s/l631bfXMUvNEAdPdnQH4els0lWov/RYvHGOkNSlsS7tTMCT
6zASSCIgWk=
non_priority
g tnvenveaf fhmi kjkhiuivfuvx obuveui l
0
382,556
11,307,907,860
IssuesEvent
2020-01-19 00:45:27
buddyqc69/infernalwowproject
https://api.github.com/repos/buddyqc69/infernalwowproject
closed
Quêtes : L'arène de sang : l'ultime défi
Bugfix-Quest Bugfix-creatures Priority-Medium Resolu-Completed legion-7.3.5
niveaux 65 l'arène de sang quand on tue le pnj sela ne compte pas pour la quete https://fr.wowhead.com/quest=9977/lar%C3%A8ne-de-sang-lultime-d%C3%A9fi
1.0
Quêtes : L'arène de sang : l'ultime défi - niveaux 65 l'arène de sang quand on tue le pnj sela ne compte pas pour la quete https://fr.wowhead.com/quest=9977/lar%C3%A8ne-de-sang-lultime-d%C3%A9fi
priority
quêtes l arène de sang l ultime défi niveaux l arène de sang quand on tue le pnj sela ne compte pas pour la quete
1
241,117
7,808,920,830
IssuesEvent
2018-06-11 21:52:12
jdextraze/go-gesclient
https://api.github.com/repos/jdextraze/go-gesclient
closed
Running publisher example (with -race flag) yields warning
bug low-priority
Running `go run -race ./examples/publisher/main.go` produces the following: ``` 2017/12/03 22:58:01 -> 'Default': &{eventId:8764a574-1b19-4bd3-b733-7b22c2bd40ac type:TestEvent isJson:true data:[...] metadata:[...]} 2017/12/03 22:58:01 Connected: &{remoteEndpoint:127.0.0.1:1113, connection:AllCatchUpSubscriber} 2017/12/03 22:58:02 <- &{nextExpectedVersion:5103 logPosition:&{commitPosition:1514157 preparePosition:1514157}} 2017/12/03 22:58:03 -> 'Default': &{eventId:7a5aeabe-c813-4631-b43a-9608cacfde3e type:TestEvent isJson:true data:[...] metadata:[...]} ================== WARNING: DATA RACE Read at 0x00c4200a4410 by main goroutine: github.com/jdextraze/go-gesclient/internal.(*connectionLogicHandler).TotalOperationCount() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/operations_manager.go:150 +0x5d github.com/jdextraze/go-gesclient/internal.(*connection).enqueueOperation() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection.go:415 +0x7a github.com/jdextraze/go-gesclient/internal.(*connection).AppendToStreamAsync() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection.go:94 +0x167 main.main() /home/user/go/src/github.com/jdextraze/go-gesclient/examples/publisher/main.go:54 +0x8d8 Previous write at 0x00c4200a4410 by goroutine 7: github.com/jdextraze/go-gesclient/internal.(*OperationsManager).TryScheduleWaitingOperations() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/operations_manager.go:132 +0x22d github.com/jdextraze/go-gesclient/internal.(*connectionLogicHandler).handleTcpPackage() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection_logic_handler.go:550 +0xb84 github.com/jdextraze/go-gesclient/internal.(*connectionLogicHandler).(github.com/jdextraze/go-gesclient/internal.handleTcpPackage)-fm() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection_logic_handler.go:140 +0x55 github.com/jdextraze/go-gesclient/internal.(*simpleQueuedHandler).processQueue() 
/home/user/go/src/github.com/jdextraze/go-gesclient/internal/simple_queued_handler.go:40 +0x183 Goroutine 7 (running) created at: github.com/jdextraze/go-gesclient/internal.newSimpleQueuedHandler() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/simple_queued_handler.go:19 +0x118 github.com/jdextraze/go-gesclient/internal.NewConnectionLogicHandler() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection_logic_handler.go:111 +0x69 github.com/jdextraze/go-gesclient/internal.NewConnection() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection.go:47 +0x1e5 github.com/jdextraze/go-gesclient.Create() /home/user/go/src/github.com/jdextraze/go-gesclient/connection.go:62 +0x5aa main.getConnection() /home/user/go/src/github.com/jdextraze/go-gesclient/examples/publisher/main.go:99 +0x680 main.main() /home/user/go/src/github.com/jdextraze/go-gesclient/examples/publisher/main.go:36 +0x362 ================== 2017/12/03 22:58:03 <- &{nextExpectedVersion:5104 logPosition:&{commitPosition:1514265 preparePosition:1514265}} 2017/12/03 22:58:04 -> 'Default': &{eventId:742204cc-2652-444e-86c5-29533fdc5ec0 type:TestEvent isJson:true data:[...] metadata:[...]} 2017/12/03 22:58:04 <- &{nextExpectedVersion:5105 logPosition:&{commitPosition:1514373 preparePosition:1514373}} ^C 2017/12/03 22:58:05 Connection closed: &{reason:Connection close requested by client. connection:AllCatchUpSubscriber} ```
1.0
Running publisher example (with -race flag) yields warning - Running `go run -race ./examples/publisher/main.go` produces the following: ``` 2017/12/03 22:58:01 -> 'Default': &{eventId:8764a574-1b19-4bd3-b733-7b22c2bd40ac type:TestEvent isJson:true data:[...] metadata:[...]} 2017/12/03 22:58:01 Connected: &{remoteEndpoint:127.0.0.1:1113, connection:AllCatchUpSubscriber} 2017/12/03 22:58:02 <- &{nextExpectedVersion:5103 logPosition:&{commitPosition:1514157 preparePosition:1514157}} 2017/12/03 22:58:03 -> 'Default': &{eventId:7a5aeabe-c813-4631-b43a-9608cacfde3e type:TestEvent isJson:true data:[...] metadata:[...]} ================== WARNING: DATA RACE Read at 0x00c4200a4410 by main goroutine: github.com/jdextraze/go-gesclient/internal.(*connectionLogicHandler).TotalOperationCount() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/operations_manager.go:150 +0x5d github.com/jdextraze/go-gesclient/internal.(*connection).enqueueOperation() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection.go:415 +0x7a github.com/jdextraze/go-gesclient/internal.(*connection).AppendToStreamAsync() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection.go:94 +0x167 main.main() /home/user/go/src/github.com/jdextraze/go-gesclient/examples/publisher/main.go:54 +0x8d8 Previous write at 0x00c4200a4410 by goroutine 7: github.com/jdextraze/go-gesclient/internal.(*OperationsManager).TryScheduleWaitingOperations() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/operations_manager.go:132 +0x22d github.com/jdextraze/go-gesclient/internal.(*connectionLogicHandler).handleTcpPackage() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection_logic_handler.go:550 +0xb84 github.com/jdextraze/go-gesclient/internal.(*connectionLogicHandler).(github.com/jdextraze/go-gesclient/internal.handleTcpPackage)-fm() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection_logic_handler.go:140 +0x55 
github.com/jdextraze/go-gesclient/internal.(*simpleQueuedHandler).processQueue() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/simple_queued_handler.go:40 +0x183 Goroutine 7 (running) created at: github.com/jdextraze/go-gesclient/internal.newSimpleQueuedHandler() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/simple_queued_handler.go:19 +0x118 github.com/jdextraze/go-gesclient/internal.NewConnectionLogicHandler() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection_logic_handler.go:111 +0x69 github.com/jdextraze/go-gesclient/internal.NewConnection() /home/user/go/src/github.com/jdextraze/go-gesclient/internal/connection.go:47 +0x1e5 github.com/jdextraze/go-gesclient.Create() /home/user/go/src/github.com/jdextraze/go-gesclient/connection.go:62 +0x5aa main.getConnection() /home/user/go/src/github.com/jdextraze/go-gesclient/examples/publisher/main.go:99 +0x680 main.main() /home/user/go/src/github.com/jdextraze/go-gesclient/examples/publisher/main.go:36 +0x362 ================== 2017/12/03 22:58:03 <- &{nextExpectedVersion:5104 logPosition:&{commitPosition:1514265 preparePosition:1514265}} 2017/12/03 22:58:04 -> 'Default': &{eventId:742204cc-2652-444e-86c5-29533fdc5ec0 type:TestEvent isJson:true data:[...] metadata:[...]} 2017/12/03 22:58:04 <- &{nextExpectedVersion:5105 logPosition:&{commitPosition:1514373 preparePosition:1514373}} ^C 2017/12/03 22:58:05 Connection closed: &{reason:Connection close requested by client. connection:AllCatchUpSubscriber} ```
priority
running publisher example with race flag yields warning running go run race examples publisher main go produces the following default eventid type testevent isjson true data metadata connected remoteendpoint connection allcatchupsubscriber nextexpectedversion logposition commitposition prepareposition default eventid type testevent isjson true data metadata warning data race read at by main goroutine github com jdextraze go gesclient internal connectionlogichandler totaloperationcount home user go src github com jdextraze go gesclient internal operations manager go github com jdextraze go gesclient internal connection enqueueoperation home user go src github com jdextraze go gesclient internal connection go github com jdextraze go gesclient internal connection appendtostreamasync home user go src github com jdextraze go gesclient internal connection go main main home user go src github com jdextraze go gesclient examples publisher main go previous write at by goroutine github com jdextraze go gesclient internal operationsmanager tryschedulewaitingoperations home user go src github com jdextraze go gesclient internal operations manager go github com jdextraze go gesclient internal connectionlogichandler handletcppackage home user go src github com jdextraze go gesclient internal connection logic handler go github com jdextraze go gesclient internal connectionlogichandler github com jdextraze go gesclient internal handletcppackage fm home user go src github com jdextraze go gesclient internal connection logic handler go github com jdextraze go gesclient internal simplequeuedhandler processqueue home user go src github com jdextraze go gesclient internal simple queued handler go goroutine running created at github com jdextraze go gesclient internal newsimplequeuedhandler home user go src github com jdextraze go gesclient internal simple queued handler go github com jdextraze go gesclient internal newconnectionlogichandler home user go src github com jdextraze go 
gesclient internal connection logic handler go github com jdextraze go gesclient internal newconnection home user go src github com jdextraze go gesclient internal connection go github com jdextraze go gesclient create home user go src github com jdextraze go gesclient connection go main getconnection home user go src github com jdextraze go gesclient examples publisher main go main main home user go src github com jdextraze go gesclient examples publisher main go nextexpectedversion logposition commitposition prepareposition default eventid type testevent isjson true data metadata nextexpectedversion logposition commitposition prepareposition c connection closed reason connection close requested by client connection allcatchupsubscriber
1
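The race report in the record above is the classic unsynchronized shared counter: one goroutine writes the operation count (`TryScheduleWaitingOperations`) while another reads it (`TotalOperationCount`) without synchronization. A language-neutral illustration of the standard fix, sketched here in Python with one mutex guarding both the write and the read (the Go client would use `sync.Mutex` or `sync/atomic` in the same way; the class below is illustrative, not the library's actual code):

```python
import threading

class OperationCounter:
    """Shared counter whose reads and writes are both guarded by the
    same lock, so concurrent access is well-defined (no data race)."""

    def __init__(self):
        self._lock = threading.Lock()
        self._count = 0

    def increment(self):
        with self._lock:          # writer takes the lock
            self._count += 1

    def total(self):
        with self._lock:          # reader takes the SAME lock
            return self._count

counter = OperationCounter()
threads = [
    threading.Thread(target=lambda: [counter.increment() for _ in range(1000)])
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.total())            # -> 8000, deterministically
```

The key point the race detector is making is that protecting only the writer (or only the reader) is not enough; both sides of the shared state must synchronize on the same primitive.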
108,511
9,309,584,781
IssuesEvent
2019-03-25 16:47:38
mozilla/iris
https://api.github.com/repos/mozilla/iris
closed
Fix download_paused_unpaused
test case
download_paused_unpaused - The download can be paused/unpaused. By: rrobotin -- 1 | 4:30:58 AM | Progress information is displayed. 2 | 4:31:13 AM | Download was paused. - [Actual]: False [Expected]: True
1.0
Fix download_paused_unpaused - download_paused_unpaused - The download can be paused/unpaused. By: rrobotin -- 1 | 4:30:58 AM | Progress information is displayed. 2 | 4:31:13 AM | Download was paused. - [Actual]: False [Expected]: True
non_priority
fix download paused unpaused download paused unpaused the download can be paused unpaused by rrobotin am progress information is displayed am download was paused false true
0
145,941
19,377,648,900
IssuesEvent
2021-12-17 01:02:14
TIBCOSoftware/jasperreports-server-ce
https://api.github.com/repos/TIBCOSoftware/jasperreports-server-ce
opened
CVE-2021-45046 (Low) detected in log4j-core-2.13.3.jar
security vulnerability
## CVE-2021-45046 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-core-2.13.3.jar</b></p></summary> <p>The Apache Log4j Implementation</p> <p>Library home page: <a href="https://logging.apache.org/log4j/2.x/">https://logging.apache.org/log4j/2.x/</a></p> <p> Dependency Hierarchy: - :x: **log4j-core-2.13.3.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> It was found that the fix to address CVE-2021-44228 in Apache Log4j 2.15.0 was incomplete in certain non-default configurations. This could allows attackers with control over Thread Context Map (MDC) input data when the logging configuration uses a non-default Pattern Layout with either a Context Lookup (for example, $${ctx:loginId}) or a Thread Context Map pattern (%X, %mdc, or %MDC) to craft malicious input data using a JNDI Lookup pattern resulting in a denial of service (DOS) attack. Log4j 2.15.0 restricts JNDI LDAP lookups to localhost by default. Note that previous mitigations involving configuration such as to set the system property `log4j2.noFormatMsgLookup` to `true` do NOT mitigate this specific vulnerability. Log4j 2.16.0 fixes this issue by removing support for message lookup patterns and disabling JNDI functionality by default. This issue can be mitigated in prior releases (<2.16.0) by removing the JndiLookup class from the classpath (example: zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class). 
<p>Publish Date: 2021-12-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-45046>CVE-2021-45046</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.7</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://logging.apache.org/log4j/2.x/security.html">https://logging.apache.org/log4j/2.x/security.html</a></p> <p>Release Date: 2021-12-14</p> <p>Fix Resolution: org.apache.logging.log4j:log4j-core:2.12.2,2.16.0;org.ops4j.pax.logging:pax-logging-log4j2:1.11.11,2.0.12</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.logging.log4j","packageName":"log4j-core","packageVersion":"2.13.3","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"org.apache.logging.log4j:log4j-core:2.13.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.logging.log4j:log4j-core:2.12.2,2.16.0;org.ops4j.pax.logging:pax-logging-log4j2:1.11.11,2.0.12","isBinary":true}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-45046","vulnerabilityDetails":"It was found that the fix to address CVE-2021-44228 in Apache Log4j 2.15.0 was incomplete in certain non-default configurations. 
This could allows attackers with control over Thread Context Map (MDC) input data when the logging configuration uses a non-default Pattern Layout with either a Context Lookup (for example, $${ctx:loginId}) or a Thread Context Map pattern (%X, %mdc, or %MDC) to craft malicious input data using a JNDI Lookup pattern resulting in a denial of service (DOS) attack. Log4j 2.15.0 restricts JNDI LDAP lookups to localhost by default. Note that previous mitigations involving configuration such as to set the system property `log4j2.noFormatMsgLookup` to `true` do NOT mitigate this specific vulnerability. Log4j 2.16.0 fixes this issue by removing support for message lookup patterns and disabling JNDI functionality by default. This issue can be mitigated in prior releases (\u003c2.16.0) by removing the JndiLookup class from the classpath (example: zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-45046","cvss3Severity":"low","cvss3Score":"3.7","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-45046 (Low) detected in log4j-core-2.13.3.jar - ## CVE-2021-45046 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-core-2.13.3.jar</b></p></summary> <p>The Apache Log4j Implementation</p> <p>Library home page: <a href="https://logging.apache.org/log4j/2.x/">https://logging.apache.org/log4j/2.x/</a></p> <p> Dependency Hierarchy: - :x: **log4j-core-2.13.3.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> It was found that the fix to address CVE-2021-44228 in Apache Log4j 2.15.0 was incomplete in certain non-default configurations. This could allows attackers with control over Thread Context Map (MDC) input data when the logging configuration uses a non-default Pattern Layout with either a Context Lookup (for example, $${ctx:loginId}) or a Thread Context Map pattern (%X, %mdc, or %MDC) to craft malicious input data using a JNDI Lookup pattern resulting in a denial of service (DOS) attack. Log4j 2.15.0 restricts JNDI LDAP lookups to localhost by default. Note that previous mitigations involving configuration such as to set the system property `log4j2.noFormatMsgLookup` to `true` do NOT mitigate this specific vulnerability. Log4j 2.16.0 fixes this issue by removing support for message lookup patterns and disabling JNDI functionality by default. This issue can be mitigated in prior releases (<2.16.0) by removing the JndiLookup class from the classpath (example: zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class). 
<p>Publish Date: 2021-12-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-45046>CVE-2021-45046</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.7</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://logging.apache.org/log4j/2.x/security.html">https://logging.apache.org/log4j/2.x/security.html</a></p> <p>Release Date: 2021-12-14</p> <p>Fix Resolution: org.apache.logging.log4j:log4j-core:2.12.2,2.16.0;org.ops4j.pax.logging:pax-logging-log4j2:1.11.11,2.0.12</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.logging.log4j","packageName":"log4j-core","packageVersion":"2.13.3","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"org.apache.logging.log4j:log4j-core:2.13.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.logging.log4j:log4j-core:2.12.2,2.16.0;org.ops4j.pax.logging:pax-logging-log4j2:1.11.11,2.0.12","isBinary":true}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-45046","vulnerabilityDetails":"It was found that the fix to address CVE-2021-44228 in Apache Log4j 2.15.0 was incomplete in certain non-default configurations. 
This could allows attackers with control over Thread Context Map (MDC) input data when the logging configuration uses a non-default Pattern Layout with either a Context Lookup (for example, $${ctx:loginId}) or a Thread Context Map pattern (%X, %mdc, or %MDC) to craft malicious input data using a JNDI Lookup pattern resulting in a denial of service (DOS) attack. Log4j 2.15.0 restricts JNDI LDAP lookups to localhost by default. Note that previous mitigations involving configuration such as to set the system property `log4j2.noFormatMsgLookup` to `true` do NOT mitigate this specific vulnerability. Log4j 2.16.0 fixes this issue by removing support for message lookup patterns and disabling JNDI functionality by default. This issue can be mitigated in prior releases (\u003c2.16.0) by removing the JndiLookup class from the classpath (example: zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-45046","cvss3Severity":"low","cvss3Score":"3.7","cvss3Metrics":{"A":"Low","AC":"High","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_priority
cve low detected in core jar cve low severity vulnerability vulnerable library core jar the apache implementation library home page a href dependency hierarchy x core jar vulnerable library found in base branch master vulnerability details it was found that the fix to address cve in apache was incomplete in certain non default configurations this could allows attackers with control over thread context map mdc input data when the logging configuration uses a non default pattern layout with either a context lookup for example ctx loginid or a thread context map pattern x mdc or mdc to craft malicious input data using a jndi lookup pattern resulting in a denial of service dos attack restricts jndi ldap lookups to localhost by default note that previous mitigations involving configuration such as to set the system property noformatmsglookup to true do not mitigate this specific vulnerability fixes this issue by removing support for message lookup patterns and disabling jndi functionality by default this issue can be mitigated in prior releases by removing the jndilookup class from the classpath example zip q d core jar org apache logging core lookup jndilookup class publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache logging core org pax logging pax logging isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree org apache logging core isminimumfixversionavailable true minimumfixversion org apache logging core org pax logging pax logging isbinary true basebranches vulnerabilityidentifier cve vulnerabilitydetails it was found that the fix to address cve in apache was incomplete in certain non default configurations this could allows attackers with control over thread context map mdc input data when the logging configuration uses a non default pattern layout with either a context lookup for example ctx loginid or a thread context map pattern x mdc or mdc to craft malicious input data using a jndi lookup pattern resulting in a denial of service dos attack restricts jndi ldap lookups to localhost by default note that previous mitigations involving configuration such as to set the system property noformatmsglookup to true do not mitigate this specific vulnerability fixes this issue by removing support for message lookup patterns and disabling jndi functionality by default this issue can be mitigated in prior releases by removing the jndilookup class from the classpath example zip q d core jar org apache logging core lookup jndilookup class vulnerabilityurl
0
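The log4j record above describes the classpath mitigation of deleting `JndiLookup.class` from the core jar (the garbled `zip q d core jar org apache logging core lookup jndilookup class` fragment in the record). As a minimal sketch of that check-and-remove step, using only the Python standard library; the full class path below is the one named in the advisory, and the helper names are illustrative, not part of any real tool:

```python
import io
import zipfile

# Class that the advisory's mitigation removes from the log4j-core jar.
JNDI_LOOKUP = "org/apache/logging/log4j/core/lookup/JndiLookup.class"

def contains_jndi_lookup(jar_bytes: bytes) -> bool:
    """Return True if the jar archive still ships JndiLookup.class."""
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        return JNDI_LOOKUP in jar.namelist()

def strip_jndi_lookup(jar_bytes: bytes) -> bytes:
    """Rewrite the jar without JndiLookup.class (the `zip -q -d` mitigation)."""
    out = io.BytesIO()
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as src, \
            zipfile.ZipFile(out, "w") as dst:
        for item in src.infolist():
            if item.filename != JNDI_LOOKUP:
                # Copy every other entry through unchanged.
                dst.writestr(item, src.read(item.filename))
    return out.getvalue()
```

Note that, as the record itself says, removing the class is a fallback for older releases; upgrading log4j-core is the actual fix.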
273,395
8,530,219,247
IssuesEvent
2018-11-03 20:10:16
MyMICDS/MyMICDS-v2
https://api.github.com/repos/MyMICDS/MyMICDS-v2
opened
Exposed stickynotes data
bug effort: easy priority: it can wait work length: short
We expose the `_id` and `user` properties in the get stickynotes endpoint, which might be an indicator that we're directly responding with the database document-- We don't need to be responding with these properties, and we don't want to respond directly with document in case we add private data in those documents in the future.
1.0
Exposed stickynotes data - We expose the `_id` and `user` properties in the get stickynotes endpoint, which might be an indicator that we're directly responding with the database document-- We don't need to be responding with these properties, and we don't want to respond directly with document in case we add private data in those documents in the future.
priority
exposed stickynotes data we expose the id and user properties in the get stickynotes endpoint which might be an indicator that we re directly responding with the database document we don t need to be responding with these properties and we don t want to respond directly with document in case we add private data in those documents in the future
1
270,829
20,611,236,931
IssuesEvent
2022-03-07 08:51:28
estruyf/vscode-front-matter
https://api.github.com/repos/estruyf/vscode-front-matter
opened
Enhancement: Create new `isPublishDate` and `isModifiedDate` property to define date types
bug documentation
### Discussed in https://github.com/estruyf/vscode-front-matter/discussions/278 <div type='discussions-op-text'> <sup>Originally posted by **jmatthewpryor** March 5, 2022</sup> Hi I've followed the documentation to try & get custom publish and last mod dates working - so far without success My workspace editor settings appear correct ![Screen Recording 2022-03-05 at 03 53 11 pm](https://user-images.githubusercontent.com/850570/156868524-d0476d38-7913-4509-9351-6c9f30dad969.gif) And I have edits frontmatter.json in my project also ``` "frontMatter.taxonomy.dateField": "publishedAt", "frontMatter.taxonomy.modifiedField": "lastMod", ``` However, the FrontMatter editor fails to pick up those field value and sets a field "date" when I use the Dashboard to update I am running on the Beta Any tips appreciated</div>
1.0
Enhancement: Create new `isPublishDate` and `isModifiedDate` property to define date types - ### Discussed in https://github.com/estruyf/vscode-front-matter/discussions/278 <div type='discussions-op-text'> <sup>Originally posted by **jmatthewpryor** March 5, 2022</sup> Hi I've followed the documentation to try & get custom publish and last mod dates working - so far without success My workspace editor settings appear correct ![Screen Recording 2022-03-05 at 03 53 11 pm](https://user-images.githubusercontent.com/850570/156868524-d0476d38-7913-4509-9351-6c9f30dad969.gif) And I have edits frontmatter.json in my project also ``` "frontMatter.taxonomy.dateField": "publishedAt", "frontMatter.taxonomy.modifiedField": "lastMod", ``` However, the FrontMatter editor fails to pick up those field value and sets a field "date" when I use the Dashboard to update I am running on the Beta Any tips appreciated</div>
non_priority
enhancement create new ispublishdate and ismodifieddate property to define date types discussed in originally posted by jmatthewpryor march hi i ve followed the documentation to try get custom publish and last mod dates working so far without success my workspace editor settings appear correct and i have edits frontmatter json in my project also frontmatter taxonomy datefield publishedat frontmatter taxonomy modifiedfield lastmod however the frontmatter editor fails to pick up those field value and sets a field date when i use the dashboard to update i am running on the beta any tips appreciated
0
2,710
3,968,118,685
IssuesEvent
2016-05-03 18:33:09
google/end-to-end
https://api.github.com/repos/google/end-to-end
reopened
S2K uses small c/bytecount, inconsistent suite of S2K-KDF-SHA1 (160b) and AES-256
bug crypto imported logic security
_From [coruus@gmail.com](https://code.google.com/u/coruus@gmail.com/) on July 05, 2014 23:47:38_ **Is this report about the crypto library or the extension?** Crypto library. javascript/crypto/e2e/openpgp/encryptedcipher.js What is the security bug? The library assumes that the hash algorithm used for the Iterated and Salted S2K specifier type is SHA1. (This is also a functionality bug; it makes it impossible to interoperate with implementations that require > 128-bit security for private keys.) 1\. The count field is specified as c=69. This results in a bytecount of 65536, or 3277 SHA1 compression function evaluations. At a rate of 2^30 SHA1 compressions per second (Fermi), this allows ~ 327660 password guesses per second. (The rate on ASIC is much, much higher -- SHA1 is small in silicon.) This is plainly inadequate for typical users' passphrases. Fix: Set c=255, bytecount=65011712, or 3m SHA1 compression function evaluations. Use SHA2-512 instead (it is the largest RFC 4880 hash function in hardware). 2\. AES-256 is always used to encrypt the payload. But using SHA-1 as the S2K KDF results in passwords with > 160 bits of entropy being compressed to < 160 bits of entropy. And probably rather less security-strength than that. Fix: Use a hash function with at least 256 bits of output instead. SHA2-512, again, would be preferable. How would someone exploit it? Brute force. (Note that SHA2-256 is smaller in hardware than SHA2-512; they are distinct functions.) _Original issue:_ <http://code.google.com/p/end-to-end/issues/detail?id=102>
True
S2K uses small c/bytecount, inconsistent suite of S2K-KDF-SHA1 (160b) and AES-256 - _From [coruus@gmail.com](https://code.google.com/u/coruus@gmail.com/) on July 05, 2014 23:47:38_ **Is this report about the crypto library or the extension?** Crypto library. javascript/crypto/e2e/openpgp/encryptedcipher.js What is the security bug? The library assumes that the hash algorithm used for the Iterated and Salted S2K specifier type is SHA1. (This is also a functionality bug; it makes it impossible to interoperate with implementations that require > 128-bit security for private keys.) 1\. The count field is specified as c=69. This results in a bytecount of 65536, or 3277 SHA1 compression function evaluations. At a rate of 2^30 SHA1 compressions per second (Fermi), this allows ~ 327660 password guesses per second. (The rate on ASIC is much, much higher -- SHA1 is small in silicon.) This is plainly inadequate for typical users' passphrases. Fix: Set c=255, bytecount=65011712, or 3m SHA1 compression function evaluations. Use SHA2-512 instead (it is the largest RFC 4880 hash function in hardware). 2\. AES-256 is always used to encrypt the payload. But using SHA-1 as the S2K KDF results in passwords with > 160 bits of entropy being compressed to < 160 bits of entropy. And probably rather less security-strength than that. Fix: Use a hash function with at least 256 bits of output instead. SHA2-512, again, would be preferable. How would someone exploit it? Brute force. (Note that SHA2-256 is smaller in hardware than SHA2-512; they are distinct functions.) _Original issue:_ <http://code.google.com/p/end-to-end/issues/detail?id=102>
non_priority
uses small c bytecount inconsistent suite of kdf and aes from on july is this report about the crypto library or the extension crypto library javascript crypto openpgp encryptedcipher js what is the security bug the library assumes that the hash algorithm used for the iterated and salted specifier type is this is also a functionality bug it makes it impossible to interoperate with implementations that require bit security for private keys the count field is specified as c this results in a bytecount of or compression function evaluations at a rate of compressions per second fermi this allows password guesses per second the rate on asic is much much higher is small in silicon this is plainly inadequate for typical users passphrases fix set c bytecount or compression function evaluations use instead it is the largest rfc hash function in hardware aes is always used to encrypt the payload but using sha as the kdf results in passwords with bits of entropy being compressed to bits of entropy and probably rather less security strength than that fix use a hash function with at least bits of output instead again would be preferable how would someone exploit it brute force note that is smaller in hardware than they are distinct functions original issue
0
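The end-to-end record above argues about byte counts derived from the Iterated and Salted S2K count octet `c`. For reference, the octet-to-byte-count decoding defined in RFC 4880 section 3.7.1.3 can be sketched as below; the 65011712 figure the reporter recommends for c=255 matches this formula, while the other counts in the record are quoted from the report as-is:

```python
def s2k_byte_count(c: int) -> int:
    """Decode an RFC 4880 Iterated-and-Salted S2K count octet into the
    number of bytes fed through the hash: (16 + low nibble) << (high nibble + 6)."""
    if not 0 <= c <= 255:
        raise ValueError("count octet must fit in one byte")
    return (16 + (c & 15)) << ((c >> 4) + 6)
```

For example, `s2k_byte_count(255)` gives the 65011712-byte maximum the report asks for, and the encoding can only represent counts of this form, so arbitrary byte counts must be rounded to a representable value.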
268,711
8,410,785,937
IssuesEvent
2018-10-12 11:51:26
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
newsprofin.com - site is not usable
browser-firefox priority-normal
<!-- @browser: Firefox 64.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0 --> <!-- @reported_with: desktop-reporter --> **URL**: http://newsprofin.com/t1/?&geocode=en-in&hero=12&tmplcode=igzt&instsmall=1&cep=FsCgAX6OQmTUXP_9L9pg3YxtqheNLusSI6O-2xiUhNMiaAaFG8tmyIm4-JJ8Ovuq8OEPVK20kRrIlyuhNcOUsFlUt38-1yRlpfD-CWeQDOM_WgdYFQ7iyEgvJ8f-zh9O32I7DPF6rivBW80r5VuJ75nNtORwQo-LkBl4hpByZ61EEycm8UWmhQwGsjpYtFBmLpPKAW9EH7aj7je5yipQhQ72uyuKg9rd4hEKuW9S4ciVSDb_CYQlBgL9nZn9bYykgSzkNDaup-bVH70ZF73JwA&utm_term=313944231&utm_source=revhits&utm_medium=p&ref=p_revhits_eb_NOP_desk&bid=0.0005&clickid=313944231067367856145 **Browser / Version**: Firefox 64.0 **Operating System**: Windows 10 **Tested Another Browser**: Unknown **Problem type**: Site is not usable **Description**: site opens automatically **Steps to Reproduce**: [![Screenshot Description](https://webcompat.com/uploads/2018/10/863baf2f-b08d-4669-9e3f-e7ba10351f5e-thumb.jpg)](https://webcompat.com/uploads/2018/10/863baf2f-b08d-4669-9e3f-e7ba10351f5e.jpg) <details> <summary>Browser Configuration</summary> <ul> <li>mixed active content blocked: false</li><li>buildID: 20181010235834</li><li>tracking content blocked: false</li><li>hasTouchScreen: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.all: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>channel: nightly</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
newsprofin.com - site is not usable - <!-- @browser: Firefox 64.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0 --> <!-- @reported_with: desktop-reporter --> **URL**: http://newsprofin.com/t1/?&geocode=en-in&hero=12&tmplcode=igzt&instsmall=1&cep=FsCgAX6OQmTUXP_9L9pg3YxtqheNLusSI6O-2xiUhNMiaAaFG8tmyIm4-JJ8Ovuq8OEPVK20kRrIlyuhNcOUsFlUt38-1yRlpfD-CWeQDOM_WgdYFQ7iyEgvJ8f-zh9O32I7DPF6rivBW80r5VuJ75nNtORwQo-LkBl4hpByZ61EEycm8UWmhQwGsjpYtFBmLpPKAW9EH7aj7je5yipQhQ72uyuKg9rd4hEKuW9S4ciVSDb_CYQlBgL9nZn9bYykgSzkNDaup-bVH70ZF73JwA&utm_term=313944231&utm_source=revhits&utm_medium=p&ref=p_revhits_eb_NOP_desk&bid=0.0005&clickid=313944231067367856145 **Browser / Version**: Firefox 64.0 **Operating System**: Windows 10 **Tested Another Browser**: Unknown **Problem type**: Site is not usable **Description**: site opens automatically **Steps to Reproduce**: [![Screenshot Description](https://webcompat.com/uploads/2018/10/863baf2f-b08d-4669-9e3f-e7ba10351f5e-thumb.jpg)](https://webcompat.com/uploads/2018/10/863baf2f-b08d-4669-9e3f-e7ba10351f5e.jpg) <details> <summary>Browser Configuration</summary> <ul> <li>mixed active content blocked: false</li><li>buildID: 20181010235834</li><li>tracking content blocked: false</li><li>hasTouchScreen: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.all: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>channel: nightly</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
priority
newsprofin com site is not usable url browser version firefox operating system windows tested another browser unknown problem type site is not usable description site opens automatically steps to reproduce browser configuration mixed active content blocked false buildid tracking content blocked false hastouchscreen false gfx webrender blob images true gfx webrender all false mixed passive content blocked false gfx webrender enabled false image mem shared true channel nightly from with ❤️
1
819,635
30,746,674,638
IssuesEvent
2023-07-28 15:32:55
AspectOfJerry/BapUtils
https://api.github.com/repos/AspectOfJerry/BapUtils
closed
party takeover
enhancement qol priority commands wip
**Describe your idea** Add a command to take someone's party. **Describe the solution you'd like** Implement a trust feature to only allow trusted players to take the party from you. **Additional context** Already a work in progress
1.0
party takeover - **Describe your idea** Add a command to take someone's party. **Describe the solution you'd like** Implement a trust feature to only allow trusted players to take the party from you. **Additional context** Already a work in progress
priority
party takeover describe your idea add a command to take someone s party describe the solution you d like implement a trust feature to only allow trusted players to take the party from you additional context already a work in progress
1
647,616
21,132,169,202
IssuesEvent
2022-04-06 00:18:17
apache/dolphinscheduler
https://api.github.com/repos/apache/dolphinscheduler
closed
[Bug] [k8s] start server failed because can't connect to postgresql
bug Stale priority:low
### Search before asking - [X] I had searched in the [issues](https://github.com/apache/dolphinscheduler/issues?q=is%3Aissue) and found no similar issues. ### What happened ![image](https://user-images.githubusercontent.com/11962619/149758073-6e1a570d-b77b-46a4-b7ed-58d1ed2c7fba.png) ### What you expected to happen run ds on k8s successfully ### How to reproduce reproduce by https://dolphinscheduler.apache.org/en-us/docs/latest/user_doc/guide/installation/kubernetes.html ### Anything else _No response_ ### Version 2.0.2 ### Are you willing to submit PR? - [ ] Yes I am willing to submit a PR! ### Code of Conduct - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
1.0
[Bug] [k8s] start server failed because can't connect to postgresql - ### Search before asking - [X] I had searched in the [issues](https://github.com/apache/dolphinscheduler/issues?q=is%3Aissue) and found no similar issues. ### What happened ![image](https://user-images.githubusercontent.com/11962619/149758073-6e1a570d-b77b-46a4-b7ed-58d1ed2c7fba.png) ### What you expected to happen run ds on k8s successfully ### How to reproduce reproduce by https://dolphinscheduler.apache.org/en-us/docs/latest/user_doc/guide/installation/kubernetes.html ### Anything else _No response_ ### Version 2.0.2 ### Are you willing to submit PR? - [ ] Yes I am willing to submit a PR! ### Code of Conduct - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
priority
start server failed because can t connect to postgresql search before asking i had searched in the and found no similar issues what happened what you expected to happen run ds on successfully how to reproduce reproduce by anything else no response version are you willing to submit pr yes i am willing to submit a pr code of conduct i agree to follow this project s
1
225,620
7,493,781,984
IssuesEvent
2018-04-07 00:46:00
kubernetes/kubeadm
https://api.github.com/repos/kubernetes/kubeadm
closed
Kubeadm HA ( high availability ) checklist
area/HA area/upgrades kind/enhancement priority/important-soon triaged
The following is a checklist for kubeadm support for deploying HA-clusters. This a distillation of action items from: https://docs.google.com/document/d/1lH9OKkFZMSqXCApmSXemEDuy9qlINdm5MfWWGrK3JYc/edit#heading=h.8hdxw3quu67g but there may be more. New Features: - [x] Option enable active<>passive locking via ConfigMaps https://github.com/kubernetes/kubernetes/issues/44857 PR: https://github.com/kubernetes/kubernetes/pull/45739 - [x] Enable support to ComponentsConfigs to be loaded from ConfigMaps https://docs.google.com/document/d/1arP4T9Qkp2SovlJZ_y790sBeiWXDO6SG10pZ_UUU-Lc/edit?ts=59110d75#heading=h.xgjl2srtytjt - [ ] Add support to standup etcd operator https://github.com/kubernetes/kubeadm/issues/254 PR: https://github.com/kubernetes/kubernetes/pull/45665 - [ ] Define exact initialization flow (remove explicit migration step) Contentious: - [ ] Handle the smart client vs. endpoints/proxy/apiserver discovery. (https://github.com/kubernetes/kubernetes/issues/18174 & more) Extending Support & Documentation: - [ ] Update documentation to list supported best practices on LB configuration https://github.com/kubernetes/kubernetes.github.io/issues/3607 - [ ] Allow input options to standup HA cluster using existing etcd bootstrap endpoint - [ ] https://github.com/kubernetes/kubernetes/pull/44793 - [x] Make secrets more secure so we can pass secrets (see sig-auth) https://github.com/kubernetes/kubernetes/pull/41939 Cleanup cruft: - [ ] Remove docs and contrib references to older HA. /cc @kubernetes/sig-cluster-lifecycle-feature-requests
1.0
Kubeadm HA ( high availability ) checklist - The following is a checklist for kubeadm support for deploying HA-clusters. This a distillation of action items from: https://docs.google.com/document/d/1lH9OKkFZMSqXCApmSXemEDuy9qlINdm5MfWWGrK3JYc/edit#heading=h.8hdxw3quu67g but there may be more. New Features: - [x] Option enable active<>passive locking via ConfigMaps https://github.com/kubernetes/kubernetes/issues/44857 PR: https://github.com/kubernetes/kubernetes/pull/45739 - [x] Enable support to ComponentsConfigs to be loaded from ConfigMaps https://docs.google.com/document/d/1arP4T9Qkp2SovlJZ_y790sBeiWXDO6SG10pZ_UUU-Lc/edit?ts=59110d75#heading=h.xgjl2srtytjt - [ ] Add support to standup etcd operator https://github.com/kubernetes/kubeadm/issues/254 PR: https://github.com/kubernetes/kubernetes/pull/45665 - [ ] Define exact initialization flow (remove explicit migration step) Contentious: - [ ] Handle the smart client vs. endpoints/proxy/apiserver discovery. (https://github.com/kubernetes/kubernetes/issues/18174 & more) Extending Support & Documentation: - [ ] Update documentation to list supported best practices on LB configuration https://github.com/kubernetes/kubernetes.github.io/issues/3607 - [ ] Allow input options to standup HA cluster using existing etcd bootstrap endpoint - [ ] https://github.com/kubernetes/kubernetes/pull/44793 - [x] Make secrets more secure so we can pass secrets (see sig-auth) https://github.com/kubernetes/kubernetes/pull/41939 Cleanup cruft: - [ ] Remove docs and contrib references to older HA. /cc @kubernetes/sig-cluster-lifecycle-feature-requests
priority
kubeadm ha high availability checklist the following is a checklist for kubeadm support for deploying ha clusters this a distillation of action items from but there may be more new features option enable active passive locking via configmaps pr enable support to componentsconfigs to be loaded from configmaps add support to standup etcd operator pr define exact initialization flow remove explicit migration step contentious handle the smart client vs endpoints proxy apiserver discovery more extending support documentation update documentation to list supported best practices on lb configuration allow input options to standup ha cluster using existing etcd bootstrap endpoint make secrets more secure so we can pass secrets see sig auth cleanup cruft remove docs and contrib references to older ha cc kubernetes sig cluster lifecycle feature requests
1
94,226
3,923,097,529
IssuesEvent
2016-04-22 09:45:15
HGustavs/LenaSYS
https://api.github.com/repos/HGustavs/LenaSYS
closed
Can't create new item on a course
highPriority
When you create a new item on a course and select the type code you get the following error. "Returned from setup: Debug: Error occur at line 453 in file /var/www/lenasys/4/CodeViewer/editorService.php. There are no examples or the ID of example is incorrect."
1.0
Can't create new item on a course - When you create a new item on a course and select the type code you get the following error. "Returned from setup: Debug: Error occur at line 453 in file /var/www/lenasys/4/CodeViewer/editorService.php. There are no examples or the ID of example is incorrect."
priority
can t create new item on a course when you create a new item on a course and select the type code you get the following error returned from setup debug error occur at line in file var www lenasys codeviewer editorservice php there are no examples or the id of example is incorrect
1
785,321
27,609,376,843
IssuesEvent
2023-03-09 15:02:05
frequenz-floss/frequenz-sdk-python
https://api.github.com/repos/frequenz-floss/frequenz-sdk-python
closed
Implement BatteryPool
priority:high type:enhancement part:data-pipeline
### What's needed? User should be able to subscribe for data from some pool of Batteries. BatteryPool should communicate with user using channels. This approach would be consisntent with LogicanMeter and EvChargerPool. ## What data are needed: All these values will be packed in the class and send with one channel. We use DataSourceActor for IT! * **total_capacity**: sum(battery.capacity for each working battery in battery pool) * **max_capacity**: sum(battery.capacity * battery.max_soc for each working battery in battery pool) * **min_capacity**: sum(battery.capacity * battery.min_soc for each working battery in battery pool) * **max_soc**: sum(battery.capacity*battery.max_soc for each working battery in battery pool)/total_capacity * **min_soc**: sum(battery.capacity * battery.min_soc for each working battery in battery pool)/total_capacity All the value below will have their own channel. * **average_soc**: sum(battery.capacity * battery.soc for each working battery in battery pool)/total_capacity * **active_power** sum(battery.adjacent_inverter.active_power for each battery in the battery pool) - even if it is broken we need this information. * TUPLE **power_supply_bound** : sum( min( battery.power_upper_bound, battery.adjacent_inverter.power_upper_bound ) for each working battery in battery pool ) * TUPLE **power_consumption_bound** : sum( max( battery.power_lower_bound, battery.adjacent_inverter.power_lower_bound ) for each working battery in battery pool ) ### How frequently we should send data: * Capacity, SoC bounds - when they change. These don't change frequently. * SoP - This can change frequently. However it is not needed for any calculation. It is needed only before making charge/discharge decision. * We could send new value only when it change. However this value are not needed frequently and we would compute it frequently. It seems to be lots of unnecessary work. * User can ask for this value every-time when he needs it. 
But then we have no channel communication. It won't be possible to move BatteryPool to separate process. (**No** : Implementation should be consistent with Logical Meter. So we should use channels.) * SoC - when it change * active power - with given resampling frequency. ### Proposed solution BatteryPool will use: * BatteryPoolStatus - to know what batteries are working * ComponentMetricResamplingActor - to subscribe for the metric data and receive them with some frequency. * Interpolation methods for the resampler: * Capacity, SoC bounds : Return last value - This remains constant for most of the time, so returning last value seems to be correct. * SoP, SoC : * Return last value (forward filling) * Linear interpolation. . Both SoC and SoP don't change the direction frequently. They either increases or decreases for some time. (or remain constant). If we find linear function based on first and last value in Resampler buffer, then we can predict the next value. How long time window we should store in ResamplerBuffer? After X seconds of inactivity battery/inverter will be considered as broken. (BatteryPoolStatus holds X argument). Time window could be X, too. Then we can interpolate as long as battery last message is considered as *relevant*. * active_power - average ### Open questions ### * Different method to predict SoC, SoP. * How frequently we should send SoP - it can change frequently. We could send it in second basis and user can use peekable feature to get the latest value. ### Use cases _No response_ ### Alternatives and workarounds _No response_ ### Additional context _No response_
1.0
Implement BatteryPool - ### What's needed? User should be able to subscribe for data from some pool of Batteries. BatteryPool should communicate with user using channels. This approach would be consisntent with LogicanMeter and EvChargerPool. ## What data are needed: All these values will be packed in the class and send with one channel. We use DataSourceActor for IT! * **total_capacity**: sum(battery.capacity for each working battery in battery pool) * **max_capacity**: sum(battery.capacity * battery.max_soc for each working battery in battery pool) * **min_capacity**: sum(battery.capacity * battery.min_soc for each working battery in battery pool) * **max_soc**: sum(battery.capacity*battery.max_soc for each working battery in battery pool)/total_capacity * **min_soc**: sum(battery.capacity * battery.min_soc for each working battery in battery pool)/total_capacity All the value below will have their own channel. * **average_soc**: sum(battery.capacity * battery.soc for each working battery in battery pool)/total_capacity * **active_power** sum(battery.adjacent_inverter.active_power for each battery in the battery pool) - even if it is broken we need this information. * TUPLE **power_supply_bound** : sum( min( battery.power_upper_bound, battery.adjacent_inverter.power_upper_bound ) for each working battery in battery pool ) * TUPLE **power_consumption_bound** : sum( max( battery.power_lower_bound, battery.adjacent_inverter.power_lower_bound ) for each working battery in battery pool ) ### How frequently we should send data: * Capacity, SoC bounds - when they change. These don't change frequently. * SoP - This can change frequently. However it is not needed for any calculation. It is needed only before making charge/discharge decision. * We could send new value only when it change. However this value are not needed frequently and we would compute it frequently. It seems to be lots of unnecessary work. * User can ask for this value every-time when he needs it. 
But then we have no channel communication. It won't be possible to move BatteryPool to separate process. (**No** : Implementation should be consistent with Logical Meter. So we should use channels.) * SoC - when it change * active power - with given resampling frequency. ### Proposed solution BatteryPool will use: * BatteryPoolStatus - to know what batteries are working * ComponentMetricResamplingActor - to subscribe for the metric data and receive them with some frequency. * Interpolation methods for the resampler: * Capacity, SoC bounds : Return last value - This remains constant for most of the time, so returning last value seems to be correct. * SoP, SoC : * Return last value (forward filling) * Linear interpolation. . Both SoC and SoP don't change the direction frequently. They either increases or decreases for some time. (or remain constant). If we find linear function based on first and last value in Resampler buffer, then we can predict the next value. How long time window we should store in ResamplerBuffer? After X seconds of inactivity battery/inverter will be considered as broken. (BatteryPoolStatus holds X argument). Time window could be X, too. Then we can interpolate as long as battery last message is considered as *relevant*. * active_power - average ### Open questions ### * Different method to predict SoC, SoP. * How frequently we should send SoP - it can change frequently. We could send it in second basis and user can use peekable feature to get the latest value. ### Use cases _No response_ ### Alternatives and workarounds _No response_ ### Additional context _No response_
priority
implement batterypool what s needed user should be able to subscribe for data from some pool of batteries batterypool should communicate with user using channels this approach would be consisntent with logicanmeter and evchargerpool what data are needed all these values will be packed in the class and send with one channel we use datasourceactor for it total capacity sum battery capacity for each working battery in battery pool max capacity sum battery capacity battery max soc for each working battery in battery pool min capacity sum battery capacity battery min soc for each working battery in battery pool max soc sum battery capacity battery max soc for each working battery in battery pool total capacity min soc sum battery capacity battery min soc for each working battery in battery pool total capacity all the value below will have their own channel average soc sum battery capacity battery soc for each working battery in battery pool total capacity active power sum battery adjacent inverter active power for each battery in the battery pool even if it is broken we need this information tuple power supply bound sum min battery power upper bound battery adjacent inverter power upper bound for each working battery in battery pool tuple power consumption bound sum max battery power lower bound battery adjacent inverter power lower bound for each working battery in battery pool how frequently we should send data capacity soc bounds when they change these don t change frequently sop this can change frequently however it is not needed for any calculation it is needed only before making charge discharge decision we could send new value only when it change however this value are not needed frequently and we would compute it frequently it seems to be lots of unnecessary work user can ask for this value every time when he needs it but then we have no channel communication it won t be possible to move batterypool to separate process no implementation should be consistent with logical meter so we should use channels soc when it change active power with given resampling frequency proposed solution batterypool will use batterypoolstatus to know what batteries are working componentmetricresamplingactor to subscribe for the metric data and receive them with some frequency interpolation methods for the resampler capacity soc bounds return last value this remains constant for most of the time so returning last value seems to be correct sop soc return last value forward filling linear interpolation both soc and sop don t change the direction frequently they either increases or decreases for some time or remain constant if we find linear function based on first and last value in resampler buffer then we can predict the next value how long time window we should store in resamplerbuffer after x seconds of inactivity battery inverter will be considered as broken batterypoolstatus holds x argument time window could be x too then we can interpolate as long as battery last message is considered as relevant active power average open questions different method to predict soc sop how frequently we should send sop it can change frequently we could send it in second basis and user can use peekable feature to get the latest value use cases no response alternatives and workarounds no response additional context no response
1
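The capacity and SoC formulas in the issue text above can be sketched as a small aggregation over the pool. The battery fields and sample values below are hypothetical; the capacity-weighted sums follow the formulas as stated in the issue ("for each working battery in battery pool").

```javascript
// Sketch of the pool-level aggregations; battery fields and values are
// hypothetical. Sums run over working batteries only, as the issue states.
const batteries = [
  { capacity: 10000, minSoc: 10, maxSoc: 90, soc: 55, working: true },
  { capacity: 5000, minSoc: 20, maxSoc: 80, soc: 40, working: true },
  { capacity: 8000, minSoc: 0, maxSoc: 100, soc: 70, working: false },
];

function poolMetrics(pool) {
  const working = pool.filter(b => b.working);
  const totalCapacity = working.reduce((s, b) => s + b.capacity, 0);
  // max/min capacity: capacity scaled by the SoC bound, summed over the pool
  const maxCapacity = working.reduce((s, b) => s + b.capacity * b.maxSoc / 100, 0);
  const minCapacity = working.reduce((s, b) => s + b.capacity * b.minSoc / 100, 0);
  // average SoC is capacity-weighted, per the formula in the issue
  const averageSoc = working.reduce((s, b) => s + b.capacity * b.soc, 0) / totalCapacity;
  return { totalCapacity, maxCapacity, minCapacity, averageSoc };
}
```

With the sample values, `poolMetrics(batteries)` ignores the broken third battery and yields a capacity-weighted average SoC of 50.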
759,298
26,587,929,355
IssuesEvent
2023-01-23 04:42:35
googleapis/nodejs-spanner
https://api.github.com/repos/googleapis/nodejs-spanner
closed
Error parsing STRUCT with JSON "keys" field
type: bug priority: p2 api: spanner
# Environment details - OS: `Mac OS Ventura 13.1` - Node.js version: `14.17.6` - npm version: `6.14.15` - `@google-cloud/spanner` version: `6.7.0` --- # Error ### Steps to reproduce The error occurs when the queried column either is or includes a `STRUCT` with a child field of `type: "JSON"` and a `name` value equal to one of the [JS `Array` method names](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array), e.g. `"keys"`. This condition is satisfied by a change-stream query that returns data change records. Change stream queries that don't return data change records are unaffected. In my case I'm using `database.runStream()` to run the query. #### See also The error case is reproduced by this test case in the issue-associated PR: https://github.com/googleapis/nodejs-spanner/pull/1775/files#diff-bd3702694080cbb1cc59ec4e3021374f22a2fd665617a80bd3c29630b9dbe7b7R675-R695. ### Details I'm hitting an error when running a change-stream query via the `database.runStream()` method: ```sql SELECT ChangeRecord FROM READ_${MyStream} ( start_timestamp => @start_timestamp, end_timestamp => NULL, partition_token => @partition_token, heartbeat_milliseconds => 5000 ) ``` In the case where the query returns data change records, an error is thrown: ``` function keys() { [native code] } ^ SyntaxError: Unexpected token u in JSON at position 1 at JSON.parse (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:321:28) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:332:31 at Array.map (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:331:45) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:326:24 at Array.map (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:325:31) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:332:31 at Array.map (<anonymous>) at decode 
(<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:331:45) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:326:24 at Array.map (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:325:31) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:332:31 at Array.map (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:331:45) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:326:24 at Array.map (<anonymous>) at Object.decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:325:31) at <dir>/node_modules/@google-cloud/spanner/build/src/partial-result-stream.js:202:49 at Array.map (<anonymous>) ``` _codec.js:321_ – the initial source of the error – is part of the `decode()` function; this line specifically covers the decoding of JSON values [1]: ```ts 320 case 'JSON': 321 decoded = JSON.parse(decoded); 322 break; ``` Line _332_ – the next level down in the stack – handles `STRUCT` parsing [2]: ```ts 330 case 'STRUCT': 331 fields = type.structType.fields.map(({ name, type }, index) => { 332 const value = decode(decoded[name] || decoded[index], type); 333 return { name, value }; 334 }); 335 decoded = Struct.fromArray(fields); 336 break; ``` I added some logging to the file and found that the error occurs when parsing the [data-change record `.mods` column](https://cloud.google.com/spanner/docs/change-streams/details#data-change-records). 
`.mods` is an array with the following format: ```json { "name": "mods", "type": { "code": "ARRAY", "arrayElementType": { "code": "STRUCT", "arrayElementType": null, "structType": { "fields": [ { "name": "keys", "type": { "code": "JSON", "arrayElementType": null, "structType": null, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" } }, { "name": "new_values", "type": { "code": "JSON", "arrayElementType": null, "structType": null, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" } }, { "name": "old_values", "type": { "code": "JSON", "arrayElementType": null, "structType": null, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" } } ] }, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" }, "structType": null, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" } } ``` A sample input/not-yet-decoded `.mods` value: ```json [ [ "{\"entity_id\":\"d4374939-3f52-4dfe-a074-d2ca202aa5e5\"}", "{\"updated_at\":\"2023-01-18T08:01:40.402463Z\"}", "{}" ] ] ``` - (Note that the value here for the STRUCT is an array and not an object. The existing unit tests for running `decode()` with a STRUCT type use an object value, and not an array. Unsure why both cases are possible.) When we decode the first `.mods` array item, a `STRUCT`, we land at _codec.js:331_. The first field provided by `type.structType.fields` has `{ name: "keys", type: "JSON" }`. We, at this point, have a `decoded` value of: ```json [ "{\"entity_id\":\"d4374939-3f52-4dfe-a074-d2ca202aa5e5\"}", "{\"updated_at\":\"2023-01-18T08:01:40.402463Z\"}", "{}" ] ``` At _codec.js:332_ we _should_ be passing over `decoded[name]` and using `decoded[index]`, since `decoded` is an array; however since `name == "keys"` and [`Array.prototype.keys`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/keys) is a method, `decoded[name]` registers as truthy: `decoded.keys` is a function. 
We therefore pass the `Array.prototype.keys` function back into `decode()` with `type: "JSON"`, land at _codec.js:321_, and `JSON.parse()` fails with a function as an argument. --- ## Solution The apparent solution: when decoding a `STRUCT`, only prefer `decoded[name]` when `decoded` is not an array: ```ts 330 case 'STRUCT': 331 fields = type.structType.fields.map(({ name, type }, index) => { 332 const value = decode((!Array.isArray(decoded) && decoded[name!]) || decoded[index], type); 333 return { name, value }; 334 }); 335 decoded = Struct.fromArray(fields); 336 break; ``` Or, in the _src/_ Typescript code: ```ts case 'STRUCT': fields = type.structType!.fields!.map(({name, type}, index) => { const value = decode( (!Array.isArray(decoded) && decoded[name!]) || decoded[index], type as spannerClient.spanner.v1.Type ); return {name, value}; }); decoded = Struct.fromArray(fields as Field[]); break; ``` Indeed when this change is made, the aforementioned query works correctly. --- ## Footnotes 1. https://github.com/googleapis/nodejs-spanner/blob/ba7029a9e3739afafe9a1a57bfb64accef3ae808/src/codec.ts#L390-L400 2. https://github.com/googleapis/nodejs-spanner/blob/ba7029a9e3739afafe9a1a57bfb64accef3ae808/src/codec.ts#L411-L420
1.0
Error parsing STRUCT with JSON "keys" field - # Environment details - OS: `Mac OS Ventura 13.1` - Node.js version: `14.17.6` - npm version: `6.14.15` - `@google-cloud/spanner` version: `6.7.0` --- # Error ### Steps to reproduce The error occurs when the queried column either is or includes a `STRUCT` with a child field of `type: "JSON"` and a `name` value equal to one of the [JS `Array` method names](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array), e.g. `"keys"`. This condition is satisfied by a change-stream query that returns data change records. Change stream queries that don't return data change records are unaffected. In my case I'm using `database.runStream()` to run the query. #### See also The error case is reproduced by this test case in the issue-associated PR: https://github.com/googleapis/nodejs-spanner/pull/1775/files#diff-bd3702694080cbb1cc59ec4e3021374f22a2fd665617a80bd3c29630b9dbe7b7R675-R695. ### Details I'm hitting an error when running a change-stream query via the `database.runStream()` method: ```sql SELECT ChangeRecord FROM READ_${MyStream} ( start_timestamp => @start_timestamp, end_timestamp => NULL, partition_token => @partition_token, heartbeat_milliseconds => 5000 ) ``` In the case where the query returns data change records, an error is thrown: ``` function keys() { [native code] } ^ SyntaxError: Unexpected token u in JSON at position 1 at JSON.parse (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:321:28) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:332:31 at Array.map (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:331:45) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:326:24 at Array.map (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:325:31) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:332:31 at Array.map (<anonymous>) at decode 
(<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:331:45) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:326:24 at Array.map (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:325:31) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:332:31 at Array.map (<anonymous>) at decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:331:45) at <dir>/node_modules/@google-cloud/spanner/build/src/codec.js:326:24 at Array.map (<anonymous>) at Object.decode (<dir>/node_modules/@google-cloud/spanner/build/src/codec.js:325:31) at <dir>/node_modules/@google-cloud/spanner/build/src/partial-result-stream.js:202:49 at Array.map (<anonymous>) ``` _codec.js:321_ – the initial source of the error – is part of the `decode()` function; this line specifically covers the decoding of JSON values [1]: ```ts 320 case 'JSON': 321 decoded = JSON.parse(decoded); 322 break; ``` Line _332_ – the next level down in the stack – handles `STRUCT` parsing [2]: ```ts 330 case 'STRUCT': 331 fields = type.structType.fields.map(({ name, type }, index) => { 332 const value = decode(decoded[name] || decoded[index], type); 333 return { name, value }; 334 }); 335 decoded = Struct.fromArray(fields); 336 break; ``` I added some logging to the file and found that the error occurs when parsing the [data-change record `.mods` column](https://cloud.google.com/spanner/docs/change-streams/details#data-change-records). 
`.mods` is an array with the following format: ```json { "name": "mods", "type": { "code": "ARRAY", "arrayElementType": { "code": "STRUCT", "arrayElementType": null, "structType": { "fields": [ { "name": "keys", "type": { "code": "JSON", "arrayElementType": null, "structType": null, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" } }, { "name": "new_values", "type": { "code": "JSON", "arrayElementType": null, "structType": null, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" } }, { "name": "old_values", "type": { "code": "JSON", "arrayElementType": null, "structType": null, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" } } ] }, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" }, "structType": null, "typeAnnotation": "TYPE_ANNOTATION_CODE_UNSPECIFIED" } } ``` A sample input/not-yet-decoded `.mods` value: ```json [ [ "{\"entity_id\":\"d4374939-3f52-4dfe-a074-d2ca202aa5e5\"}", "{\"updated_at\":\"2023-01-18T08:01:40.402463Z\"}", "{}" ] ] ``` - (Note that the value here for the STRUCT is an array and not an object. The existing unit tests for running `decode()` with a STRUCT type use an object value, and not an array. Unsure why both cases are possible.) When we decode the first `.mods` array item, a `STRUCT`, we land at _codec.js:331_. The first field provided by `type.structType.fields` has `{ name: "keys", type: "JSON" }`. We, at this point, have a `decoded` value of: ```json [ "{\"entity_id\":\"d4374939-3f52-4dfe-a074-d2ca202aa5e5\"}", "{\"updated_at\":\"2023-01-18T08:01:40.402463Z\"}", "{}" ] ``` At _codec.js:332_ we _should_ be passing over `decoded[name]` and using `decoded[index]`, since `decoded` is an array; however since `name == "keys"` and [`Array.prototype.keys`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/keys) is a method, `decoded[name]` registers as truthy: `decoded.keys` is a function. 
We therefore pass the `Array.prototype.keys` function back into `decode()` with `type: "JSON"`, land at _codec.js:321_, and `JSON.parse()` fails with a function as an argument. --- ## Solution The apparent solution: when decoding a `STRUCT`, only prefer `decoded[name]` when `decoded` is not an array: ```ts 330 case 'STRUCT': 331 fields = type.structType.fields.map(({ name, type }, index) => { 332 const value = decode((!Array.isArray(decoded) && decoded[name!]) || decoded[index], type); 333 return { name, value }; 334 }); 335 decoded = Struct.fromArray(fields); 336 break; ``` Or, in the _src/_ Typescript code: ```ts case 'STRUCT': fields = type.structType!.fields!.map(({name, type}, index) => { const value = decode( (!Array.isArray(decoded) && decoded[name!]) || decoded[index], type as spannerClient.spanner.v1.Type ); return {name, value}; }); decoded = Struct.fromArray(fields as Field[]); break; ``` Indeed when this change is made, the aforementioned query works correctly. --- ## Footnotes 1. https://github.com/googleapis/nodejs-spanner/blob/ba7029a9e3739afafe9a1a57bfb64accef3ae808/src/codec.ts#L390-L400 2. https://github.com/googleapis/nodejs-spanner/blob/ba7029a9e3739afafe9a1a57bfb64accef3ae808/src/codec.ts#L411-L420
priority
error parsing struct with json keys field environment details os mac os ventura node js version npm version google cloud spanner version error steps to reproduce the error occurs when the queried column either is or includes a struct with a child field of type json and a name value equal to one of the e g keys this condition is satisfied by a change stream query that returns data change records change stream queries that don t return data change records are unaffected in my case i m using database runstream to run the query see also the error case is reproduced by this test case in the issue associated pr details i m hitting an error when running a change stream query via the database runstream method sql select changerecord from read mystream start timestamp start timestamp end timestamp null partition token partition token heartbeat milliseconds in the case where the query returns data change records an error is thrown function keys syntaxerror unexpected token u in json at position at json parse at decode node modules google cloud spanner build src codec js at node modules google cloud spanner build src codec js at array map at decode node modules google cloud spanner build src codec js at node modules google cloud spanner build src codec js at array map at decode node modules google cloud spanner build src codec js at node modules google cloud spanner build src codec js at array map at decode node modules google cloud spanner build src codec js at node modules google cloud spanner build src codec js at array map at decode node modules google cloud spanner build src codec js at node modules google cloud spanner build src codec js at array map at decode node modules google cloud spanner build src codec js at node modules google cloud spanner build src codec js at array map at object decode node modules google cloud spanner build src codec js at node modules google cloud spanner build src partial result stream js at array map codec js – the initial source of the 
error – is part of the decode function this line specifically covers the decoding of json values ts case json decoded json parse decoded break line – the next level down in the stack – handles struct parsing ts case struct fields type structtype fields map name type index const value decode decoded decoded type return name value decoded struct fromarray fields break i added some logging to the file and found that the error occurs when parsing the mods is an array with the following format json name mods type code array arrayelementtype code struct arrayelementtype null structtype fields name keys type code json arrayelementtype null structtype null typeannotation type annotation code unspecified name new values type code json arrayelementtype null structtype null typeannotation type annotation code unspecified name old values type code json arrayelementtype null structtype null typeannotation type annotation code unspecified typeannotation type annotation code unspecified structtype null typeannotation type annotation code unspecified a sample input not yet decoded mods value json entity id updated at note that the value here for the struct is an array and not an object the existing unit tests for running decode with a struct type use an object value and not an array unsure why both cases are possible when we decode the first mods array item a struct we land at codec js the first field provided by type structtype fields has name keys type json we at this point have a decoded value of json entity id updated at at codec js we should be passing over decoded and using decoded since decoded is an array however since name keys and is a method decoded registers as truthy decoded keys is a function we therefore pass the array prototype keys function back into decode with type json land at codec js and json parse fails with a function as an argument solution the apparent solution when decoding a struct only prefer decoded when decoded is not an array ts case struct fields 
type structtype fields map name type index const value decode array isarray decoded decoded decoded type return name value decoded struct fromarray fields break or in the src typescript code ts case struct fields type structtype fields map name type index const value decode array isarray decoded decoded decoded type as spannerclient spanner type return name value decoded struct fromarray fields as field break indeed when this change is made the aforementioned query works correctly footnotes
1
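The `decoded[name] || decoded[index]` pitfall described in the issue above can be reproduced in isolation. The array below is a trimmed stand-in for the sample `.mods` value from the report; everything else is plain JavaScript semantics.

```javascript
// Standalone reproduction of the lookup bug; the strings are a trimmed
// version of the sample `.mods` value from the issue.
const decoded = ['{"entity_id":"e1"}', '{"updated_at":"t1"}', '{}'];
const name = 'keys';
const index = 0;

// Buggy form: Array.prototype.keys is a method, hence truthy, so the
// named lookup shadows the intended positional element.
const buggy = decoded[name] || decoded[index];
console.log(typeof buggy); // "function" — this is what JSON.parse then chokes on

// Guarded form from the proposed fix: skip the named lookup on arrays.
const fixed = (!Array.isArray(decoded) && decoded[name]) || decoded[index];
console.log(JSON.parse(fixed).entity_id); // "e1"
```

The same shadowing occurs for any STRUCT field named after an `Array.prototype` member (`keys`, `values`, `map`, ...), which is why the guard keys on `Array.isArray` rather than on the field name.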
521,830
15,117,091,077
IssuesEvent
2021-02-09 07:55:46
rchain/rchain
https://api.github.com/repos/rchain/rchain
opened
Missing response to ApprovedBlockRequest keeps the node waiting forever
Priority-Low enhancement
On 0.10.0 release, NodeA makes a request for `ApprovedBlockRequest` from NodeB. NodeB responds with `ApprovedBlock`, however, NodeA never receives the block, making NodeA wait forever for the block. It would be good to have a timeout and retry N times before exiting.
1.0
Missing response to ApprovedBlockRequest keeps the node waiting forever - On 0.10.0 release, NodeA makes a request for `ApprovedBlockRequest` from NodeB. NodeB responds with `ApprovedBlock`, however, NodeA never receives the block, making NodeA wait forever for the block. It would be good to have a timeout and retry N times before exiting.
priority
missing response to approvedblockrequest keeps the node waiting forever on release nodea makes a request for approvedblockrequest from nodeb nodeb responds with approvedblock however nodea never receives the block making nodea wait forever for the block it would be good to have a timeout and retry n times before exiting
1
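The timeout-and-retry behaviour suggested in the issue above can be sketched generically. The request function, retry count, and timeout below are all illustrative — the actual node is implemented in Scala, and no such helper exists in it — this only shows the shape of "time out, retry N times, then exit".

```javascript
// Generic timeout-and-retry sketch; the request function, retry count,
// and timeout are illustrative, not part of the actual node.
function withTimeout(makeRequest, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('timed out')), ms);
  });
  return Promise.race([makeRequest(), timeout]).finally(() => clearTimeout(timer));
}

async function requestWithRetry(makeRequest, { retries = 3, timeoutMs = 5000 } = {}) {
  for (let attempt = 1; attempt <= retries; attempt += 1) {
    try {
      return await withTimeout(makeRequest, timeoutMs);
    } catch (err) {
      if (attempt === retries) throw err; // give up and exit after N attempts
    }
  }
}
```

A caller would pass the `ApprovedBlockRequest` send as `makeRequest`; after `retries` failures the final error propagates instead of the node waiting forever.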
159,456
24,994,757,850
IssuesEvent
2022-11-02 22:27:34
Agoric/agoric-sdk
https://api.github.com/repos/Agoric/agoric-sdk
closed
QA the wallet UX
wallet needs-design Inter-protocol MUST-HAVE
## What is the Problem Being Solved? Wallet needs QA of the UX. ## Description of the Design - Go through user flows and make sure there aren't any kinks. - Go through other docs of issues - [UX Issues (from Econ Testing)](https://docs.google.com/document/d/1Kho6aLrNaD-qxG7StOQXLSMPTZMHGHQLbV7x70jEEDk/edit#) - Triage what needs to be fixed ## Security Considerations ## Test Plan
1.0
QA the wallet UX - ## What is the Problem Being Solved? Wallet needs QA of the UX. ## Description of the Design - Go through user flows and make sure there aren't any kinks. - Go through other docs of issues - [UX Issues (from Econ Testing)](https://docs.google.com/document/d/1Kho6aLrNaD-qxG7StOQXLSMPTZMHGHQLbV7x70jEEDk/edit#) - Triage what needs to be fixed ## Security Considerations ## Test Plan
non_priority
qa the wallet ux what is the problem being solved wallet needs qa of the ux description of the design go through user flows and make sure there aren t any kinks go through other docs of issues triage what needs to be fixed security considerations test plan
0
15,712
6,028,274,375
IssuesEvent
2017-06-08 15:24:59
CleverRaven/Cataclysm-DDA
https://api.github.com/repos/CleverRaven/Cataclysm-DDA
closed
Problem compiling with Visual Studio
Build
Using Visual Studio 2015 on Windows 7 can't compile latest SDL tiles version. Here is the log with /verbose (today I learn what that is) [Cataclysm.txt](https://github.com/CleverRaven/Cataclysm-DDA/files/1043248/Cataclysm.txt) After spending all this day looking what the linker error were, and after manually checking commits(I discovered later that I could have used git bisect) I found out that this commit https://github.com/CleverRaven/Cataclysm-DDA/commit/dbf94ea32432320fc874237d93965c069fb674f3 is the one causing the problem. I would have spotted this problem earlier if I were more used to sync my local repo, sorry about that.
1.0
Problem compiling with Visual Studio - Using Visual Studio 2015 on Windows 7 can't compile latest SDL tiles version. Here is the log with /verbose (today I learn what that is) [Cataclysm.txt](https://github.com/CleverRaven/Cataclysm-DDA/files/1043248/Cataclysm.txt) After spending all this day looking what the linker error were, and after manually checking commits(I discovered later that I could have used git bisect) I found out that this commit https://github.com/CleverRaven/Cataclysm-DDA/commit/dbf94ea32432320fc874237d93965c069fb674f3 is the one causing the problem. I would have spotted this problem earlier if I were more used to sync my local repo, sorry about that.
non_priority
problem compiling with visual studio using visual studio on windows can t compile latest sdl tiles version here is the log with verbose today i learn what that is after spending all this day looking what the linker error were and after manually checking commits i discovered later that i could have used git bisect i found out that this commit is the one causing the problem i would have spotted this problem earlier if i were more used to sync my local repo sorry about that
0
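The manual commit checking described in the report above is exactly what `git bisect` automates. The snippet below is a self-contained demo, not the project's actual history: a throwaway repo stands in for Cataclysm-DDA, and a `grep` for a marker stands in for the build/link step.

```shell
# Self-contained demo of the workflow git bisect automates; a throwaway
# repo stands in for the real one, and a grep stands in for the build.
tmp=$(mktemp -d) && cd "$tmp" && git init -q
git config user.email demo@example.com
git config user.name demo
for i in 1 2 3 4 5; do
  echo "rev $i" > src.txt
  if [ "$i" -ge 4 ]; then echo "linker-error" >> src.txt; fi  # bug lands at rev 4
  git add src.txt
  git commit -qm "rev $i"
done
git bisect start HEAD HEAD~4      # bad = rev 5, good = rev 1
# "Build" each candidate; exit 0 (good) iff the marker is absent.
git bisect run sh -c '! grep -q linker-error src.txt'
git bisect reset
```

In the real case the `run` command would be the actual compile/link invocation; bisect then binary-searches the range instead of checking commits one by one.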
6,726
3,041,980,609
IssuesEvent
2015-08-08 05:22:34
latex3/svn-mirror
https://api.github.com/repos/latex3/svn-mirror
closed
incorrect rounding in l3fp
documentation l3fp
The rounding function doesn't work correctly for 2 (tested with a current texlive 2015): ~~~~~~ \documentclass{article} \usepackage{expl3} \begin{document} \ExplSyntaxOn \fp_eval:n{round(1.115,2)}\par %OK = 1.12 \fp_eval:n{round(1.125,2)}\par %wrong = 1.12 (instead of 1.13 \fp_eval:n{round(1.135,2)} %OK = 1.14 \par \fp_eval:n{round(1.15,1)}\par %OK = 1.2 \fp_eval:n{round(1.25,1)}\par %wrong = 1.2 (instead of 1.3 \fp_eval:n{round(1.35,1)} %OK = 1.4 \ExplSyntaxOff \end{document} ~~~~~~ ![round](https://cloud.githubusercontent.com/assets/4047173/8594798/a9cec020-2644-11e5-9c55-870c7b68c1bc.PNG)
1.0
incorrect rounding in l3fp - The rounding function doesn't work correctly for 2 (tested with a current texlive 2015): ~~~~~~ \documentclass{article} \usepackage{expl3} \begin{document} \ExplSyntaxOn \fp_eval:n{round(1.115,2)}\par %OK = 1.12 \fp_eval:n{round(1.125,2)}\par %wrong = 1.12 (instead of 1.13 \fp_eval:n{round(1.135,2)} %OK = 1.14 \par \fp_eval:n{round(1.15,1)}\par %OK = 1.2 \fp_eval:n{round(1.25,1)}\par %wrong = 1.2 (instead of 1.3 \fp_eval:n{round(1.35,1)} %OK = 1.4 \ExplSyntaxOff \end{document} ~~~~~~ ![round](https://cloud.githubusercontent.com/assets/4047173/8594798/a9cec020-2644-11e5-9c55-870c7b68c1bc.PNG)
non_priority
incorrect rounding in the rounding function doesn t work correctly for tested with a current texlive documentclass article usepackage begin document explsyntaxon fp eval n round par ok fp eval n round par wrong instead of fp eval n round ok par fp eval n round par ok fp eval n round par wrong instead of fp eval n round ok explsyntaxoff end document
0
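The outputs reported in the issue above (1.12, 1.12, 1.14 and 1.2, 1.2, 1.4) are exactly what round-half-to-even ("ties to even") produces, which fits the issue carrying a documentation label. The sketch below reproduces that rule on decimal strings using integer arithmetic; it is an illustration of the rounding rule, not l3fp's actual algorithm, and assumes non-negative inputs with `places >= 1`.

```javascript
// Round-half-to-even ("ties to even") on decimal strings, using integer
// arithmetic so binary floating point cannot interfere. Sketch only:
// assumes a non-negative input and places >= 1.
function roundHalfEven(s, places) {
  const [intPart, fracPart = ''] = s.split('.');
  const frac = fracPart.padEnd(places + 1, '0');
  let kept = Number(intPart + frac.slice(0, places)); // value scaled by 10^places
  const rest = frac.slice(places);                    // digits being dropped
  const half = '5'.padEnd(rest.length, '0');          // exact tie marker
  // Round up when above the tie, or at the tie when the kept value is odd.
  if (rest > half || (rest === half && kept % 2 === 1)) kept += 1;
  const digits = String(kept).padStart(places + 1, '0');
  return digits.slice(0, -places) + '.' + digits.slice(-places);
}

console.log(roundHalfEven('1.125', 2)); // "1.12" — tie, 112 is even, stays
console.log(roundHalfEven('1.135', 2)); // "1.14" — tie, 113 is odd, rounds up
console.log(roundHalfEven('1.25', 1));  // "1.2"  — tie, 12 is even, stays
```

All six values from the report come out as observed, so the behaviour complained about is ties-to-even rounding rather than an arithmetic error.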
72,370
3,384,866,826
IssuesEvent
2015-11-27 07:49:03
sphereio/sphere-jvm-sdk
https://api.github.com/repos/sphereio/sphere-jvm-sdk
opened
Add new payment fields
priority task
see http://dev.commercetools.com/release-notes.html#all-releases ``` [SPHERE.IO API] Transactions in PaymentsBETA has the new fields id and state. The field timestamp is now optional. [SPHERE.IO API] Payment has three new update actions for transactions: changeTransactionState, changeTransactionTimestamp and changeTransactionInteractionId. [SPHERE.IO API] New message PaymentTransactionStateChanged [DOC]. ```
1.0
Add new payment fields - see http://dev.commercetools.com/release-notes.html#all-releases ``` [SPHERE.IO API] Transactions in PaymentsBETA has the new fields id and state. The field timestamp is now optional. [SPHERE.IO API] Payment has three new update actions for transactions: changeTransactionState, changeTransactionTimestamp and changeTransactionInteractionId. [SPHERE.IO API] New message PaymentTransactionStateChanged [DOC]. ```
priority
add new payment fields see transactions in paymentsbeta has the new fields id and state the field timestamp is now optional payment has three new update actions for transactions changetransactionstate changetransactiontimestamp and changetransactioninteractionid new message paymenttransactionstatechanged
1
238,083
19,697,146,829
IssuesEvent
2022-01-12 13:20:54
backend-br/vagas
https://api.github.com/repos/backend-br/vagas
closed
[REMOTO] Back-end Developer Ruby on Rails @ James Delivery
CLT JavaScript Ruby Remoto AWS PostgreSQL Testes Unitários Git Rest Linux Stale
## James Delivery Por meio do meu app levo praticidade e conforto à vida das pessoas! Com apenas alguns cliques, ofereço a maneira mais prática de receber seu delivery, que vai muito além de comida. É do produto que você quiser e de qualquer lugar da cidade! E, para que isso tudo funcione da melhor forma, estamos em busca de uma pessoa para fazer parte do nosso time! ## Descrição da vaga Com o desenvolvimento Backend, você será responsável pelo levantamento e análise do banco de dados, pela atualização, correção, manutenção e sustentação do nosso sistema. Nossas squads tem 7-8 pessoas focadas nas melhores soluções para nossos apps e trabalhamos com metodologia ágil. ## Local - Remoto - temos colaboradores de todo canto do Brasil e fornecemos os equipamentos para trabalhar de casa ☺ ## Requisitos - Experiência com Ruby; - Experiência com Ruby on Rails; - Experiência com endpoints padrão REST; - Experiência com banco de dados PostgreSQL; - Rigor na organização em repositório GIT; - Experiência razoável com utilização de terminal Linux; - Experiência razoável com Ubuntu; - Rigor na produção de código: usar comentários, seguir a arquitetura do projeto, usar nomes de variáveis e métodos que fazem sentido, etc. **Diferenciais:** - Implementação de testes unitários para Ruby; - Experiência JavaScript (node.js); - Conhecimentos em AWS (ec2, rds, lambda, S3, cloudwatch, etc). 
## Benefícios - Plano de saúde (Unimed Nacional); - Plano odontológico (DentalUni); - Seguro de vida (sem custo); - Birthday-off; - Gympass - No Dress Code, fique à vontade em casa; **Diferenciais:** - PLR por atingimento de metas; - James Prime para todos os colaboradores; - VR ou VA de R$ 25,00/dia através de cartão de crédito (flex food); - Horário semi flexível; - Convênio com Clínica de Psicologia (Pulsão e Vida); - Plano veterinário (Petwell); ## Contratação CLT ## Como se candidatar [Pela Gupy ](https://james.gupy.io/jobs/1289910)ou clique no link> https://james.gupy.io/jobs/1289910 ## Tempo médio de feedbacks 1-2 dias. Sendo o retorno positivo para trabalhar conosco ou negativo. Costumo elaborar um feedback construtivo para todas as pessoas. #### Regime - CLT - 40h/semanais #### Nível - Sênior
1.0
[REMOTO] Back-end Developer Ruby on Rails @ James Delivery - ## James Delivery Por meio do meu app levo praticidade e conforto à vida das pessoas! Com apenas alguns cliques, ofereço a maneira mais prática de receber seu delivery, que vai muito além de comida. É do produto que você quiser e de qualquer lugar da cidade! E, para que isso tudo funcione da melhor forma, estamos em busca de uma pessoa para fazer parte do nosso time! ## Descrição da vaga Com o desenvolvimento Backend, você será responsável pelo levantamento e análise do banco de dados, pela atualização, correção, manutenção e sustentação do nosso sistema. Nossas squads tem 7-8 pessoas focadas nas melhores soluções para nossos apps e trabalhamos com metodologia ágil. ## Local - Remoto - temos colaboradores de todo canto do Brasil e fornecemos os equipamentos para trabalhar de casa ☺ ## Requisitos - Experiência com Ruby; - Experiência com Ruby on Rails; - Experiência com endpoints padrão REST; - Experiência com banco de dados PostgreSQL; - Rigor na organização em repositório GIT; - Experiência razoável com utilização de terminal Linux; - Experiência razoável com Ubuntu; - Rigor na produção de código: usar comentários, seguir a arquitetura do projeto, usar nomes de variáveis e métodos que fazem sentido, etc. **Diferenciais:** - Implementação de testes unitários para Ruby; - Experiência JavaScript (node.js); - Conhecimentos em AWS (ec2, rds, lambda, S3, cloudwatch, etc). 
## Benefícios - Plano de saúde (Unimed Nacional); - Plano odontológico (DentalUni); - Seguro de vida (sem custo); - Birthday-off; - Gympass - No Dress Code, fique à vontade em casa; **Diferenciais:** - PLR por atingimento de metas; - James Prime para todos os colaboradores; - VR ou VA de R$ 25,00/dia através de cartão de crédito (flex food); - Horário semi flexível; - Convênio com Clínica de Psicologia (Pulsão e Vida); - Plano veterinário (Petwell); ## Contratação CLT ## Como se candidatar [Pela Gupy ](https://james.gupy.io/jobs/1289910)ou clique no link> https://james.gupy.io/jobs/1289910 ## Tempo médio de feedbacks 1-2 dias. Sendo o retorno positivo para trabalhar conosco ou negativo. Costumo elaborar um feedback construtivo para todas as pessoas. #### Regime - CLT - 40h/semanais #### Nível - Sênior
non_priority
back end developer ruby on rails james delivery james delivery por meio do meu app levo praticidade e conforto à vida das pessoas com apenas alguns cliques ofereço a maneira mais prática de receber seu delivery que vai muito além de comida é do produto que você quiser e de qualquer lugar da cidade e para que isso tudo funcione da melhor forma estamos em busca de uma pessoa para fazer parte do nosso time descrição da vaga com o desenvolvimento backend você será responsável pelo levantamento e análise do banco de dados pela atualização correção manutenção e sustentação do nosso sistema nossas squads tem pessoas focadas nas melhores soluções para nossos apps e trabalhamos com metodologia ágil local remoto temos colaboradores de todo canto do brasil e fornecemos os equipamentos para trabalhar de casa ☺ requisitos experiência com ruby experiência com ruby on rails experiência com endpoints padrão rest experiência com banco de dados postgresql rigor na organização em repositório git experiência razoável com utilização de terminal linux experiência razoável com ubuntu rigor na produção de código usar comentários seguir a arquitetura do projeto usar nomes de variáveis e métodos que fazem sentido etc diferenciais implementação de testes unitários para ruby experiência javascript node js conhecimentos em aws rds lambda cloudwatch etc benefícios plano de saúde unimed nacional plano odontológico dentaluni seguro de vida sem custo birthday off gympass no dress code fique à vontade em casa diferenciais plr por atingimento de metas james prime para todos os colaboradores vr ou va de r dia através de cartão de crédito flex food horário semi flexível convênio com clínica de psicologia pulsão e vida plano veterinário petwell contratação clt como se candidatar clique no link tempo médio de feedbacks dias sendo o retorno positivo para trabalhar conosco ou negativo costumo elaborar um feedback construtivo para todas as pessoas regime clt semanais nível sênior
0
206,310
15,724,819,528
IssuesEvent
2021-03-29 09:15:29
ubtue/DatenProbleme
https://api.github.com/repos/ubtue/DatenProbleme
closed
Brill ORCID
Einspielung_Zotero_AUTO Zotero_AUTO_RSS ready for testing
**URL** https://doi.org/10.1163/15700747-bja10031 IxTheo#2021-03-25#EDC621C8CBE1FE9435F1AA4F37C1DFCBBA304D79 **Ausführliche Problembeschreibung** Hier wurde die ORCID-ID nicht als Hinweis ins entsprechende Feld eingespielt.
1.0
Brill ORCID - **URL** https://doi.org/10.1163/15700747-bja10031 IxTheo#2021-03-25#EDC621C8CBE1FE9435F1AA4F37C1DFCBBA304D79 **Ausführliche Problembeschreibung** Hier wurde die ORCID-ID nicht als Hinweis ins entsprechende Feld eingespielt.
non_priority
brill orcid url ixtheo ausführliche problembeschreibung hier wurde die orcid id nicht als hinweis ins entsprechende feld eingespielt
0
120,886
25,887,258,407
IssuesEvent
2022-12-14 15:21:51
pulumi/pulumi
https://api.github.com/repos/pulumi/pulumi
closed
Improve validation and linting for schema as part of codegen
kind/enhancement impact/usability area/codegen
## Hello! <!-- Please leave this section as-is, it's designed to help others in the community know how to interact with our GitHub issues. --> - Vote on this issue by adding a 👍 reaction - If you want to implement this feature, comment to let us know (we'll work with you on design, scheduling, etc.) ## Issue details This is a follow up from the postmortem for https://github.com/pulumi/pulumi-azure-native/issues/1309 where credentials were inadvertently leaked to the state in clear text through a potentially dangerous mix of schema options chosen by the schema authors. Given the schema definitions are an external interface for interacting with users (provider or MLC implementors), we should improve validation logic in codegen or add additional schema sanity checks to make sure invalid or dangerous schema definitions are appropriately surfaced to implementors during the provider development process. <!-- Enhancement requests are most helpful when they describe the problem you're having as well as articulating the potential solution you'd like to see built. --> ### Affected area/feature * codegen * package authoring <!-- If you know the specific area where this feature request would go (e.g. Automation API, the Pulumi Service, the Terraform bridge, etc.), feel free to put that area here. -->
1.0
Improve validation and linting for schema as part of codegen - ## Hello! <!-- Please leave this section as-is, it's designed to help others in the community know how to interact with our GitHub issues. --> - Vote on this issue by adding a 👍 reaction - If you want to implement this feature, comment to let us know (we'll work with you on design, scheduling, etc.) ## Issue details This is a follow up from the postmortem for https://github.com/pulumi/pulumi-azure-native/issues/1309 where credentials were inadvertently leaked to the state in clear text through a potentially dangerous mix of schema options chosen by the schema authors. Given the schema definitions are an external interface for interacting with users (provider or MLC implementors), we should improve validation logic in codegen or add additional schema sanity checks to make sure invalid or dangerous schema definitions are appropriately surfaced to implementors during the provider development process. <!-- Enhancement requests are most helpful when they describe the problem you're having as well as articulating the potential solution you'd like to see built. --> ### Affected area/feature * codegen * package authoring <!-- If you know the specific area where this feature request would go (e.g. Automation API, the Pulumi Service, the Terraform bridge, etc.), feel free to put that area here. -->
non_priority
improve validation and linting for schema as part of codegen hello vote on this issue by adding a 👍 reaction if you want to implement this feature comment to let us know we ll work with you on design scheduling etc issue details this is a follow up from the postmortem for where credentials were inadvertently leaked to the state in clear text through a potentially dangerous mix of schema options chosen by the schema authors given the schema definitions are an external interface for interacting with users provider or mlc implementors we should improve validation logic in codegen or add additional schema sanity checks to make sure invalid or dangerous schema definitions are appropriately surfaced to implementors during the provider development process affected area feature codegen package authoring
0
595,937
18,091,069,252
IssuesEvent
2021-09-22 01:43:18
knative/docs
https://api.github.com/repos/knative/docs
closed
[1.0] Fully document pingsources.sources.knative.dev
priority/high kind/eventing
@n3wscott commented on [Mon Feb 22 2021](https://github.com/knative/eventing/issues/4916) Related to https://github.com/knative/docs/issues/3268 The following work needs to be validated for pingsources.sources.knative.dev to be ready for 1.0: - [x] Getting Started demo using the component. - [x] Sample minimum of component - [ ] Sample complete resource of component - [ ] Fully documented features and options, including - [ ] Explanation of resource. - [ ] All spec fields - [ ] All status fields - [ ] Any magic labels or annotations that apply. - [x] With kubectl cli - [x] With kn cli
1.0
[1.0] Fully document pingsources.sources.knative.dev - @n3wscott commented on [Mon Feb 22 2021](https://github.com/knative/eventing/issues/4916) Related to https://github.com/knative/docs/issues/3268 The following work needs to be validated for pingsources.sources.knative.dev to be ready for 1.0: - [x] Getting Started demo using the component. - [x] Sample minimum of component - [ ] Sample complete resource of component - [ ] Fully documented features and options, including - [ ] Explanation of resource. - [ ] All spec fields - [ ] All status fields - [ ] Any magic labels or annotations that apply. - [x] With kubectl cli - [x] With kn cli
priority
fully document pingsources sources knative dev commented on related to the following work needs to be validated for pingsources sources knative dev to be ready for getting started demo using the component sample minimum of component sample complete resource of component fully documented features and options including explanation of resource all spec fields all status fields any magic labels or annotations that apply with kubectl cli with kn cli
1
49,880
20,970,921,463
IssuesEvent
2022-03-28 11:17:25
microsoft/vscode-cpptools
https://api.github.com/repos/microsoft/vscode-cpptools
closed
Error when using MACRO with VA_ARGS
bug Language Service more info needed not reproing
vscode version used: 1.61.2 C/C++ version used: v1.7.1 When we use a MACRO with VA_ARGS with just the 1 argument, the IntelliSense gives an Error: `expected an expression` ``` static void func(const char *format, ...) { va_list args; va_start(args,format); vfprintf(stdout,format,args); va_end(args); } #define CUSTOM_MACRO(format,...) func(format , ##__VA_ARGS__ ) static void usage() { func("test"); //OK func("%s", "test"); //OK CUSTOM_MACRO("%s", "test"); //OK CUSTOM_MACRO("test"); // ERROR } ```
1.0
Error when using MACRO with VA_ARGS - vscode version used: 1.61.2 C/C++ version used: v1.7.1 When we use a MACRO with VA_ARGS with just the 1 argument, the IntelliSense gives an Error: `expected an expression` ``` static void func(const char *format, ...) { va_list args; va_start(args,format); vfprintf(stdout,format,args); va_end(args); } #define CUSTOM_MACRO(format,...) func(format , ##__VA_ARGS__ ) static void usage() { func("test"); //OK func("%s", "test"); //OK CUSTOM_MACRO("%s", "test"); //OK CUSTOM_MACRO("test"); // ERROR } ```
non_priority
error when using macro with va args vscode version used c c version used when we use a macro with va args with just the argument the intellisense gives an error expected an expression static void func const char format va list args va start args format vfprintf stdout format args va end args define custom macro format func format va args static void usage func test ok func s test ok custom macro s test ok custom macro test error
0
8,810
4,328,527,184
IssuesEvent
2016-07-26 14:19:33
GitTools/GitVersion
https://api.github.com/repos/GitTools/GitVersion
closed
Cache broken with GitVersionTask 3.6.0
pr-open Type: MSBuild Task
I've just updated the MSBuild task package and apparently the cache is broken. Each target (`WriteVersionInfoToBuildLog`, `WriteVersionInfoToBuildLog` and `GitVersion`) in each built project recalculates the version number from scratch. Every time I see ``` Cache file C:\TeamCity\buildAgent\work\aaa244402599d927\.git\gitversion_cache\XYZ123.yml not found ``` And every time (during one build) the `XYZ123` part is different. Worked good in version 3.5.4
1.0
Cache broken with GitVersionTask 3.6.0 - I've just updated the MSBuild task package and apparently the cache is broken. Each target (`WriteVersionInfoToBuildLog`, `WriteVersionInfoToBuildLog` and `GitVersion`) in each built project recalculates the version number from scratch. Every time I see ``` Cache file C:\TeamCity\buildAgent\work\aaa244402599d927\.git\gitversion_cache\XYZ123.yml not found ``` And every time (during one build) the `XYZ123` part is different. Worked good in version 3.5.4
non_priority
cache broken with gitversiontask i ve just updated the msbuild task package and apparently the cache is broken each target writeversioninfotobuildlog writeversioninfotobuildlog and gitversion in each built project recalculates the version number from scratch every time i see cache file c teamcity buildagent work git gitversion cache yml not found and every time during one build the part is different worked good in version
0
20,135
6,822,400,632
IssuesEvent
2017-11-07 19:56:56
htacg/tidy-html5
https://api.github.com/repos/htacg/tidy-html5
closed
`sprtf` library should work cross platform
Build/Install
Although this is intended for debugging, its features should be platform agnostic. It doesn't affect release builds of Tidy, so this isn't a hard milestone.
1.0
`sprtf` library should work cross platform - Although this is intended for debugging, its features should be platform agnostic. It doesn't affect release builds of Tidy, so this isn't a hard milestone.
non_priority
sprtf library should work cross platform although this is intended for debugging its features should be platform agnostic it doesn t affect release builds of tidy so this isn t a hard milestone
0
86,848
17,091,076,464
IssuesEvent
2021-07-08 17:32:18
danieljharvey/mimsa
https://api.github.com/repos/danieljharvey/mimsa
closed
add let pattern
as a treat code hygiene
Allow destructuring with a `let` binding using patterns from pattern matching. Means we can bin off `LetPair`. ```haskell let { dog: d, cat: c } = { dog: 1, cat: "poo" } ``` This would be like doing `let d = 1; let c = "poo"` Only works when the pattern is exhaustive on it's own, this would not work because `Nothing` is not covered. ```haskell let (Just a) = Just 1 ```
1.0
add let pattern - Allow destructuring with a `let` binding using patterns from pattern matching. Means we can bin off `LetPair`. ```haskell let { dog: d, cat: c } = { dog: 1, cat: "poo" } ``` This would be like doing `let d = 1; let c = "poo"` Only works when the pattern is exhaustive on it's own, this would not work because `Nothing` is not covered. ```haskell let (Just a) = Just 1 ```
non_priority
add let pattern allow destructuring with a let binding using patterns from pattern matching means we can bin off letpair haskell let dog d cat c dog cat poo this would be like doing let d let c poo only works when the pattern is exhaustive on it s own this would not work because nothing is not covered haskell let just a just
0
175,876
14,543,752,236
IssuesEvent
2020-12-15 17:15:48
gabbhack/deser
https://api.github.com/repos/gabbhack/deser
opened
More information of {.flat.} behavior
documentation
Currently an important, but not obvious thing is not described - `deser` can override "global" pragmas: - https://github.com/gabbhack/deser/blob/1dbdf74fde55d1add9fbaf213c3590e9a9c288e6/tests/serialize/flat/rename_all/tlevels_replace.nim - https://github.com/gabbhack/deser/blob/1dbdf74fde55d1add9fbaf213c3590e9a9c288e6/tests/serialize/flat/skip_serialize_if/tlevels_replace.nim - https://github.com/gabbhack/deser/blob/1dbdf74fde55d1add9fbaf213c3590e9a9c288e6/tests/serialize/flat/with/tlevels_replace.nim etc
1.0
More information of {.flat.} behavior - Currently an important, but not obvious thing is not described - `deser` can override "global" pragmas: - https://github.com/gabbhack/deser/blob/1dbdf74fde55d1add9fbaf213c3590e9a9c288e6/tests/serialize/flat/rename_all/tlevels_replace.nim - https://github.com/gabbhack/deser/blob/1dbdf74fde55d1add9fbaf213c3590e9a9c288e6/tests/serialize/flat/skip_serialize_if/tlevels_replace.nim - https://github.com/gabbhack/deser/blob/1dbdf74fde55d1add9fbaf213c3590e9a9c288e6/tests/serialize/flat/with/tlevels_replace.nim etc
non_priority
more information of flat behavior currently an important but not obvious thing is not described deser can override global pragmas etc
0
263,937
23,091,672,708
IssuesEvent
2022-07-26 15:41:33
elastic/beats
https://api.github.com/repos/elastic/beats
closed
TestLinkedListRemoveOlderNoAllocs – github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/helper
flaky-test Team:Security-External Integrations
## Flaky Test * **Test Name:** `TestLinkedListRemoveOlderNoAllocs – github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/helper` * **Link:** Link to file/line number in github. * **Branch:** Git branch the test was seen in. If a PR, the branch the PR was based off. * **Artifact Link:** https://github.com/elastic/beats/issues/32496 * **Notes:** `x-pack/auditbeat-windows-10-windows-10` and `x-pack/auditbeat-windows-11-windows-11` / ### Stack Trace ``` ExtendedWin / x-pack/auditbeat-windows-10-windows-10 / TestLinkedListRemoveOlderNoAllocs – github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/helper Expand to view the error details Expand to view the stacktrace === RUN TestLinkedListRemoveOlderNoAllocs linkedlist_test.go:111: Error Trace: linkedlist_test.go:111 Error: Should be zero, but was 1 Test: TestLinkedListRemoveOlderNoAllocs --- FAIL: TestLinkedListRemoveOlderNoAllocs (1.54s) ``` ``` [2022-07-26T02:25:34.930Z] [2022-07-26T02:25:34.930Z] C:\Users\jenkins\workspace\main-712-5544bc47-84e1-4c25-a8ce-3e10af9fc7c2\src\github.com\elastic\beats\x-pack\auditbeat>mage build unitTest [2022-07-26T02:25:46.609Z] >> build: Building auditbeat [2022-07-26T02:25:55.304Z] >> go test: Unit Testing [2022-07-26T02:25:55.304Z] exec: gotestsum --no-color -f standard-quiet --junitfile build/TEST-go-unit.xml --jsonfile build/TEST-go-unit.out.json -- -tags null oracle -covermode=atomic -coverprofile=build\TEST-go-unit.cov ./... [2022-07-26T02:25:59.768Z] ok github.com/elastic/beats/v7/x-pack/auditbeat/cache 0.557s coverage: 100.0% of statements [2022-07-26T02:26:00.350Z] ok github.com/elastic/beats/v7/x-pack/auditbeat/module/system/host 0.615s coverage: 58.5% of statements [2022-07-26T02:26:00.350Z] ok github.com/elastic/beats/v7/x-pack/auditbeat/module/system/process 0.685s coverage: 62.6% of statements [2022-07-26T02:26:03.662Z] ok github.com/elastic/beats/v7/x-pack/auditbeat 0.343s coverage: 0.0% of statements [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/cmd [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/include [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/login [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/package [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/dns [no test files] [2022-07-26T02:26:07.883Z] FAIL [2022-07-26T02:26:07.883Z] FAIL github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/helper 8.560s [2022-07-26T02:26:07.883Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/user [no test files] [2022-07-26T02:26:07.883Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/tracing [no test files] [2022-07-26T02:26:07.883Z] [2022-07-26T02:26:07.883Z] === Failed [2022-07-26T02:26:07.883Z] === FAIL: x-pack/auditbeat/module/system/socket/helper TestLinkedListRemoveOlderNoAllocs (1.54s) [2022-07-26T02:26:07.883Z] linkedlist_test.go:111: [2022-07-26T02:26:07.883Z] Error Trace: linkedlist_test.go:111 [2022-07-26T02:26:07.883Z] Error: Should be zero, but was 1 [2022-07-26T02:26:07.883Z] Test: TestLinkedListRemoveOlderNoAllocs [2022-07-26T02:26:07.883Z] [2022-07-26T02:26:07.883Z] DONE 20 tests, 1 failure in 13.002s [2022-07-26T02:26:07.883Z] Error: failed to execute go: exit status 1 script returned exit code 1 ```
1.0
TestLinkedListRemoveOlderNoAllocs – github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/helper - ## Flaky Test * **Test Name:** `TestLinkedListRemoveOlderNoAllocs – github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/helper` * **Link:** Link to file/line number in github. * **Branch:** Git branch the test was seen in. If a PR, the branch the PR was based off. * **Artifact Link:** https://github.com/elastic/beats/issues/32496 * **Notes:** `x-pack/auditbeat-windows-10-windows-10` and `x-pack/auditbeat-windows-11-windows-11` / ### Stack Trace ``` ExtendedWin / x-pack/auditbeat-windows-10-windows-10 / TestLinkedListRemoveOlderNoAllocs – github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/helper Expand to view the error details Expand to view the stacktrace === RUN TestLinkedListRemoveOlderNoAllocs linkedlist_test.go:111: Error Trace: linkedlist_test.go:111 Error: Should be zero, but was 1 Test: TestLinkedListRemoveOlderNoAllocs --- FAIL: TestLinkedListRemoveOlderNoAllocs (1.54s) ``` ``` [2022-07-26T02:25:34.930Z] [2022-07-26T02:25:34.930Z] C:\Users\jenkins\workspace\main-712-5544bc47-84e1-4c25-a8ce-3e10af9fc7c2\src\github.com\elastic\beats\x-pack\auditbeat>mage build unitTest [2022-07-26T02:25:46.609Z] >> build: Building auditbeat [2022-07-26T02:25:55.304Z] >> go test: Unit Testing [2022-07-26T02:25:55.304Z] exec: gotestsum --no-color -f standard-quiet --junitfile build/TEST-go-unit.xml --jsonfile build/TEST-go-unit.out.json -- -tags null oracle -covermode=atomic -coverprofile=build\TEST-go-unit.cov ./... [2022-07-26T02:25:59.768Z] ok github.com/elastic/beats/v7/x-pack/auditbeat/cache 0.557s coverage: 100.0% of statements [2022-07-26T02:26:00.350Z] ok github.com/elastic/beats/v7/x-pack/auditbeat/module/system/host 0.615s coverage: 58.5% of statements [2022-07-26T02:26:00.350Z] ok github.com/elastic/beats/v7/x-pack/auditbeat/module/system/process 0.685s coverage: 62.6% of statements [2022-07-26T02:26:03.662Z] ok github.com/elastic/beats/v7/x-pack/auditbeat 0.343s coverage: 0.0% of statements [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/cmd [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/include [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/login [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/package [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket [no test files] [2022-07-26T02:26:03.663Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/dns [no test files] [2022-07-26T02:26:07.883Z] FAIL [2022-07-26T02:26:07.883Z] FAIL github.com/elastic/beats/v7/x-pack/auditbeat/module/system/socket/helper 8.560s [2022-07-26T02:26:07.883Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/module/system/user [no test files] [2022-07-26T02:26:07.883Z] ? github.com/elastic/beats/v7/x-pack/auditbeat/tracing [no test files] [2022-07-26T02:26:07.883Z] [2022-07-26T02:26:07.883Z] === Failed [2022-07-26T02:26:07.883Z] === FAIL: x-pack/auditbeat/module/system/socket/helper TestLinkedListRemoveOlderNoAllocs (1.54s) [2022-07-26T02:26:07.883Z] linkedlist_test.go:111: [2022-07-26T02:26:07.883Z] Error Trace: linkedlist_test.go:111 [2022-07-26T02:26:07.883Z] Error: Should be zero, but was 1 [2022-07-26T02:26:07.883Z] Test: TestLinkedListRemoveOlderNoAllocs [2022-07-26T02:26:07.883Z] [2022-07-26T02:26:07.883Z] DONE 20 tests, 1 failure in 13.002s [2022-07-26T02:26:07.883Z] Error: failed to execute go: exit status 1 script returned exit code 1 ```
non_priority
testlinkedlistremoveoldernoallocs – github com elastic beats x pack auditbeat module system socket helper flaky test test name testlinkedlistremoveoldernoallocs – github com elastic beats x pack auditbeat module system socket helper link link to file line number in github branch git branch the test was seen in if a pr the branch the pr was based off artifact link notes x pack auditbeat windows windows and x pack auditbeat windows windows stack trace extendedwin x pack auditbeat windows windows testlinkedlistremoveoldernoallocs – github com elastic beats x pack auditbeat module system socket helper expand to view the error details expand to view the stacktrace run testlinkedlistremoveoldernoallocs linkedlist test go error trace linkedlist test go error should be zero but was test testlinkedlistremoveoldernoallocs fail testlinkedlistremoveoldernoallocs c users jenkins workspace main src github com elastic beats x pack auditbeat mage build unittest build building auditbeat go test unit testing exec gotestsum no color f standard quiet junitfile build test go unit xml jsonfile build test go unit out json tags null oracle covermode atomic coverprofile build test go unit cov ok github com elastic beats x pack auditbeat cache coverage of statements ok github com elastic beats x pack auditbeat module system host coverage of statements ok github com elastic beats x pack auditbeat module system process coverage of statements ok github com elastic beats x pack auditbeat coverage of statements github com elastic beats x pack auditbeat cmd github com elastic beats x pack auditbeat include github com elastic beats x pack auditbeat module system github com elastic beats x pack auditbeat module system login github com elastic beats x pack auditbeat module system package github com elastic beats x pack auditbeat module system socket github com elastic beats x pack auditbeat module system socket dns fail fail github com elastic beats x pack auditbeat module system socket helper github com elastic beats x pack auditbeat module system user github com elastic beats x pack auditbeat tracing failed fail x pack auditbeat module system socket helper testlinkedlistremoveoldernoallocs linkedlist test go error trace linkedlist test go error should be zero but was test testlinkedlistremoveoldernoallocs done tests failure in error failed to execute go exit status script returned exit code
0
421,085
12,248,562,127
IssuesEvent
2020-05-05 17:43:00
grpc/grpc
https://api.github.com/repos/grpc/grpc
closed
Unity build linker error for Windows Standalone/HoloLens 1 & 2 ( x86 & x86_64 )
kind/bug lang/C# priority/P2
### What version of gRPC and what language are you using? * C# * 1.25-dev * Unity 2019.2.6f1 ### What operating system (Linux, Windows,...) and version? * Windows 10 * SDK 10.0.17763.0 (x86) * SDK 10.0.18362.0 (x64) ### What runtime / compiler are you using (e.g. python version or version of gcc) * VS Toolset v142 ### What did you do? * Created empty Unity project * Copied Plugins folder from this [artifact](https://packages.grpc.io/archive/2019/09/5f11fae4d20aba502b317919a051b18056c8ae17-f25d2c07-b548-44a1-870c-f2c2d464a034/csharp/grpc_unity_package.2.25.0-dev.zip) of this [build](https://packages.grpc.io/archive/2019/09/5f11fae4d20aba502b317919a051b18056c8ae17-f25d2c07-b548-44a1-870c-f2c2d464a034/index.xml#) into Unity Assets folder. * Added `Plugins\Grpc.Core\runtimes\win\x86\grpc_csharp_ext` to 'Any Platform -> x86' * Added `Plugins\Grpc.Core\runtimes\win\x64\grpc_csharp_ext` to 'Any Platform -> x64' * Added `Plugins\Grpc.Core\runtimes\grpc_csharp_ext_dummy` to `Standalon` and `WSA` with `Any SDK` * Added `RPCServer.cs` to `Assets\Scripts` and added it to the Scene: ``` using UnityEngine; using Grpc.Core; public class RPCServer: MonoBehaviour { public Server _server; protected void Awake() { Debug.Log("Init RPC Server"); _server = new Server { Ports = { new ServerPort("localhost", 12345, ServerCredentials.Insecure) } }; Debug.Log("RPC Server initialized"); } } ``` ### What did you expect to see? * Compilation success ### What did you see instead? x86: ``` 1>Creating library D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\Win32\Debug\linkresult_D88CFD5142EDC344E98E9548C1260915\GameAssembly.lib and object D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\Win32\Debug\linkresult_D88CFD5142EDC344E98E9548C1260915\GameAssembly.exp 1>6860474D8DC7A65FE4EE10F1482E5E5F.obj : error LNK2019: unresolved external symbol _grpcsharp_init@0 referenced in function _DllImportsFromStaticLib_grpcsharp_init_mD33485A76947BECB3FB8D588729F5A75DBF058E7 1> Hint on symbols that are defined and could potentially match: 1> _grpcsharp_init 1>6860474D8DC7A65FE4EE10F1482E5E5F.obj : error LNK2019: unresolved external symbol _grpcsharp_shutdown@0 referenced in function _DllImportsFromStaticLib_grpcsharp_shutdown_m2238A66AFFCC96D5BD4A30FBD2A5DE8FD10D50F5 1> Hint on symbols that are defined and could potentially match: 1> _grpcsharp_shutdown 1>6860474D8DC7A65FE4EE10F1482E5E5F.obj : error LNK2019: unresolved external symbol _grpcsharp_version_string@0 referenced in function _DllImportsFromStaticLib_grpcsharp_version_string_mBA017FAEEF230F896AFE112589C7758B2A1E020A 1> Hint on symbols that are defined and could potentially match: 1> _grpcsharp_version_string ... 1>6860474D8DC7A65FE4EE10F1482E5E5F.obj : error LNK2019: unresolved external symbol _grpcsharp_test_call_start_unary_echo@24 referenced in function _DllImportsFromStaticLib_grpcsharp_test_call_start_unary_echo_mACA7A3152CFE80D962D680C17AB5E0FBD6CAAC3B 1> Hint on symbols that are defined and could potentially match: 1> _grpcsharp_test_call_start_unary_echo 1>7A26A50001DCF70793854F547D470510.obj : error LNK2019: unresolved external symbol _dlopen@8 referenced in function _Mono_dlopen_mD32390860E62629D34C5680BB1A4D8737986E725 1>7A26A50001DCF70793854F547D470510.obj : error LNK2019: unresolved external symbol _dlerror@0 referenced in function _Mono_dlerror_m7A2342CF799D2F296D8FEE7B9AA56FCA687913F5 1>7A26A50001DCF70793854F547D470510.obj : error LNK2019: unresolved external symbol _dlsym@8 referenced in function _Mono_dlsym_m3BFEF9952963669C7B0FC7AA527945E796CF9BCC 1>D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\Win32\Debug\linkresult_D88CFD5142EDC344E98E9548C1260915\GameAssembly.dll : fatal error LNK1120: 106 unresolved externals 1> 1> at Unity.IL2CPP.Building.CppProgramBuilder.PostprocessObjectFiles(HashSet`1 objectFiles, CppToolChainContext toolChainContext) 1> at Unity.IL2CPP.Building.CppProgramBuilder.Build(IBuildStatistics& statistics) 1> at il2cpp.Program.DoRun(String[] args) 1> at il2cpp.Program.Run(String[] args) 1> at il2cpp.Program.Main(String[] args) 1> 1>Unhandled Exception: 1>C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Microsoft\VC\v160\Microsoft.MakeFile.Targets(49,5): error MSB3073: The command ""D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\IL2CPP\build\il2cpp.exe" --libil2cpp-static --compile-cpp -architecture=x86 -configuration=Debug -platform=winrt -outputpath="D:\workspace\GrpcUnity\Builds\\build\bin\Win32\Debug\GameAssembly.dll" --data-folder="D:\workspace\GrpcUnity\Builds\\build\bin\Win32\Debug\\" -cachedirectory="D:\workspace\GrpcUnity\Builds\\build\obj\il2cppOutputProject\Win32\Debug\\" -generatedcppdir="D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\Source" --profiler-report --additional-defines=WINDOWS_UWP --additional-defines=UNITY_UWP --additional-defines=UNITY_WSA_10_0 --additional-defines=UNITY_WSA --additional-defines=UNITY_WINRT --additional-defines=PLATFORM_WINRT -dotnetprofile=unityaot -verbose --map-file-parser="D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\IL2CPP\MapFileParser\MapFileParser.exe" -forcerebuild" exited with code -532462766. 1>Done building project "Il2CppOutputProject.vcxproj" -- FAILED. ``` x64 ``` 1>Creating library D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.lib and object D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.exp 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlopen referenced in function Mono_dlopen_mD32390860E62629D34C5680BB1A4D8737986E725 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlerror referenced in function Mono_dlerror_m7A2342CF799D2F296D8FEE7B9AA56FCA687913F5 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlsym referenced in function Mono_dlsym_m3BFEF9952963669C7B0FC7AA527945E796CF9BCC 1>D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.dll : fatal error LNK1120: 3 unresolved externals 1> 1> at Unity.IL2CPP.Building.CppProgramBuilder.PostprocessObjectFiles(HashSet`1 objectFiles, CppToolChainContext toolChainContext) 1> at Unity.IL2CPP.Building.CppProgramBuilder.Build(IBuildStatistics& statistics) 1> at il2cpp.Program.DoRun(String[] args) 1> at il2cpp.Program.Run(String[] args) 1> at il2cpp.Program.Main(String[] args) 1> 1>Unhandled Exception: Unity.IL2CPP.Building.BuilderFailedException: C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.22.27905\bin\HostX64\x64\link.exe /out:"D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.dll" /DEBUG:FASTLINK /INCREMENTAL:NO /LARGEADDRESSAWARE /NXCOMPAT /DYNAMICBASE /NOLOGO /TLBID:1 /HIGHENTROPYVA /DLL /NODEFAULTLIB:uuid.lib "Shcore.lib" "WindowsApp.lib" "Crypt32.lib" /LIBPATH:"C:\Program Files (x86)\Windows Kits\10\Lib\10.0.18362.0\um\x64" /LIBPATH:"C:\Program Files (x86)\Windows Kits\10\Lib\10.0.18362.0\ucrt\x64" /LIBPATH:"C:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\lib\um\x64" /LIBPATH:"C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.22.27905\lib\x64\store" /APPCONTAINER /SUBSYSTEM:WINDOWS /NODEFAULTLIB:ole32.lib /NODEFAULTLIB:kernel32.lib /NODEFAULTLIB:msvcrt.lib @"C:\Users\avikom_admin\AppData\Local\Temp\tmpEA7A.tmp" 1> 1>Creating library D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.lib and object D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.exp 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlopen referenced in function Mono_dlopen_mD32390860E62629D34C5680BB1A4D8737986E725 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlerror referenced in function Mono_dlerror_m7A2342CF799D2F296D8FEE7B9AA56FCA687913F5 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlsym referenced in function Mono_dlsym_m3BFEF9952963669C7B0FC7AA527945E796CF9BCC 1>D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.dll : fatal error LNK1120: 3 unresolved externals 1> 1> at Unity.IL2CPP.Building.CppProgramBuilder.PostprocessObjectFiles(HashSet`1 objectFiles, CppToolChainContext toolChainContext) 1> at Unity.IL2CPP.Building.CppProgramBuilder.Build(IBuildStatistics& statistics) 1> at il2cpp.Program.DoRun(String[] args) 1> at il2cpp.Program.Run(String[] args) 1> at il2cpp.Program.Main(String[] args) 1>C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Microsoft\VC\v160\Microsoft.MakeFile.Targets(49,5): error MSB3073: The command ""D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\IL2CPP\build\il2cpp.exe" --libil2cpp-static --compile-cpp -architecture=x64 -configuration=Debug -platform=winrt -outputpath="D:\workspace\GrpcUnity\Builds\\build\bin\x64\Debug\GameAssembly.dll" --data-folder="D:\workspace\GrpcUnity\Builds\\build\bin\x64\Debug\\" -cachedirectory="D:\workspace\GrpcUnity\Builds\\build\obj\il2cppOutputProject\x64\Debug\\" -generatedcppdir="D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\Source" --profiler-report --additional-defines=WINDOWS_UWP --additional-defines=UNITY_UWP --additional-defines=UNITY_WSA_10_0 --additional-defines=UNITY_WSA --additional-defines=UNITY_WINRT --additional-defines=PLATFORM_WINRT -dotnetprofile=unityaot -verbose --map-file-parser="D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\IL2CPP\MapFileParser\MapFileParser.exe" -forcerebuild" exited with code -532462766. 1>Done building project "Il2CppOutputProject.vcxproj" -- FAILED. ``` ### Anything else we should know about your project / environment? I tried to add `grpc_csharp_ext_dummy` as this used to solve issues for Android and Il2CPP. As seen in the output for x86, it is recognized as a candidate but not used. The `x64` does not have the linker issues related to grpc functions but still returns linking issues with `_Mono_dlsym`, `_dlerror`, `_dlopen` In case I should try something: I managed to build `grpc_csharp_ext` for `x64 (net45)` with `\tools\run_tests\run_tests.py` but could not build with arch x86/win32 since the scripts reports that this arch is not supported.
1.0
Unity build linker error for Windows Standalone/HoloLens 1 & 2 ( x86 & x86_64 ) - ### What version of gRPC and what language are you using? * C# * 1.25-dev * Unity 2019.2.6f1 ### What operating system (Linux, Windows,...) and version? * Windows 10 * SDK 10.0.17763.0 (x86) * SDK 10.0.18362.0 (x64) ### What runtime / compiler are you using (e.g. python version or version of gcc) * VS Toolset v142 ### What did you do? * Created empty Unity project * Copied Plugins folder from this [artifact](https://packages.grpc.io/archive/2019/09/5f11fae4d20aba502b317919a051b18056c8ae17-f25d2c07-b548-44a1-870c-f2c2d464a034/csharp/grpc_unity_package.2.25.0-dev.zip) of this [build](https://packages.grpc.io/archive/2019/09/5f11fae4d20aba502b317919a051b18056c8ae17-f25d2c07-b548-44a1-870c-f2c2d464a034/index.xml#) into Unity Assets folder. * Added `Plugins\Grpc.Core\runtimes\win\x86\grpc_csharp_ext` to 'Any Platform -> x86' * Added `Plugins\Grpc.Core\runtimes\win\x64\grpc_csharp_ext` to 'Any Platform -> x64' * Added `Plugins\Grpc.Core\runtimes\grpc_csharp_ext_dummy` to `Standalon` and `WSA` with `Any SDK` * Added `RPCServer.cs` to `Assets\Scripts` and added it to the Scene: ``` using UnityEngine; using Grpc.Core; public class RPCServer: MonoBehaviour { public Server _server; protected void Awake() { Debug.Log("Init RPC Server"); _server = new Server { Ports = { new ServerPort("localhost", 12345, ServerCredentials.Insecure) } }; Debug.Log("RPC Server initialized"); } } ``` ### What did you expect to see? * Compilation success ### What did you see instead? 
x86: ``` 1>Creating library D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\Win32\Debug\linkresult_D88CFD5142EDC344E98E9548C1260915\GameAssembly.lib and object D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\Win32\Debug\linkresult_D88CFD5142EDC344E98E9548C1260915\GameAssembly.exp 1>6860474D8DC7A65FE4EE10F1482E5E5F.obj : error LNK2019: unresolved external symbol _grpcsharp_init@0 referenced in function _DllImportsFromStaticLib_grpcsharp_init_mD33485A76947BECB3FB8D588729F5A75DBF058E7 1> Hint on symbols that are defined and could potentially match: 1> _grpcsharp_init 1>6860474D8DC7A65FE4EE10F1482E5E5F.obj : error LNK2019: unresolved external symbol _grpcsharp_shutdown@0 referenced in function _DllImportsFromStaticLib_grpcsharp_shutdown_m2238A66AFFCC96D5BD4A30FBD2A5DE8FD10D50F5 1> Hint on symbols that are defined and could potentially match: 1> _grpcsharp_shutdown 1>6860474D8DC7A65FE4EE10F1482E5E5F.obj : error LNK2019: unresolved external symbol _grpcsharp_version_string@0 referenced in function _DllImportsFromStaticLib_grpcsharp_version_string_mBA017FAEEF230F896AFE112589C7758B2A1E020A 1> Hint on symbols that are defined and could potentially match: 1> _grpcsharp_version_string ... 
1>6860474D8DC7A65FE4EE10F1482E5E5F.obj : error LNK2019: unresolved external symbol _grpcsharp_test_call_start_unary_echo@24 referenced in function _DllImportsFromStaticLib_grpcsharp_test_call_start_unary_echo_mACA7A3152CFE80D962D680C17AB5E0FBD6CAAC3B 1> Hint on symbols that are defined and could potentially match: 1> _grpcsharp_test_call_start_unary_echo 1>7A26A50001DCF70793854F547D470510.obj : error LNK2019: unresolved external symbol _dlopen@8 referenced in function _Mono_dlopen_mD32390860E62629D34C5680BB1A4D8737986E725 1>7A26A50001DCF70793854F547D470510.obj : error LNK2019: unresolved external symbol _dlerror@0 referenced in function _Mono_dlerror_m7A2342CF799D2F296D8FEE7B9AA56FCA687913F5 1>7A26A50001DCF70793854F547D470510.obj : error LNK2019: unresolved external symbol _dlsym@8 referenced in function _Mono_dlsym_m3BFEF9952963669C7B0FC7AA527945E796CF9BCC 1>D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\Win32\Debug\linkresult_D88CFD5142EDC344E98E9548C1260915\GameAssembly.dll : fatal error LNK1120: 106 unresolved externals 1> 1> at Unity.IL2CPP.Building.CppProgramBuilder.PostprocessObjectFiles(HashSet`1 objectFiles, CppToolChainContext toolChainContext) 1> at Unity.IL2CPP.Building.CppProgramBuilder.Build(IBuildStatistics& statistics) 1> at il2cpp.Program.DoRun(String[] args) 1> at il2cpp.Program.Run(String[] args) 1> at il2cpp.Program.Main(String[] args) 1> 1>Unhandled Exception: 1>C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Microsoft\VC\v160\Microsoft.MakeFile.Targets(49,5): error MSB3073: The command ""D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\IL2CPP\build\il2cpp.exe" --libil2cpp-static --compile-cpp -architecture=x86 -configuration=Debug -platform=winrt -outputpath="D:\workspace\GrpcUnity\Builds\\build\bin\Win32\Debug\GameAssembly.dll" --data-folder="D:\workspace\GrpcUnity\Builds\\build\bin\Win32\Debug\\" -cachedirectory="D:\workspace\GrpcUnity\Builds\\build\obj\il2cppOutputProject\Win32\Debug\\" 
-generatedcppdir="D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\Source" --profiler-report --additional-defines=WINDOWS_UWP --additional-defines=UNITY_UWP --additional-defines=UNITY_WSA_10_0 --additional-defines=UNITY_WSA --additional-defines=UNITY_WINRT --additional-defines=PLATFORM_WINRT -dotnetprofile=unityaot -verbose --map-file-parser="D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\IL2CPP\MapFileParser\MapFileParser.exe" -forcerebuild" exited with code -532462766. 1>Done building project "Il2CppOutputProject.vcxproj" -- FAILED. ``` x64 ``` 1>Creating library D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.lib and object D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.exp 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlopen referenced in function Mono_dlopen_mD32390860E62629D34C5680BB1A4D8737986E725 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlerror referenced in function Mono_dlerror_m7A2342CF799D2F296D8FEE7B9AA56FCA687913F5 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlsym referenced in function Mono_dlsym_m3BFEF9952963669C7B0FC7AA527945E796CF9BCC 1>D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.dll : fatal error LNK1120: 3 unresolved externals 1> 1> at Unity.IL2CPP.Building.CppProgramBuilder.PostprocessObjectFiles(HashSet`1 objectFiles, CppToolChainContext toolChainContext) 1> at Unity.IL2CPP.Building.CppProgramBuilder.Build(IBuildStatistics& statistics) 1> at il2cpp.Program.DoRun(String[] args) 1> at il2cpp.Program.Run(String[] args) 1> at il2cpp.Program.Main(String[] args) 1> 1>Unhandled Exception: Unity.IL2CPP.Building.BuilderFailedException: C:\Program Files (x86)\Microsoft Visual 
Studio\2019\Community\VC\Tools\MSVC\14.22.27905\bin\HostX64\x64\link.exe /out:"D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.dll" /DEBUG:FASTLINK /INCREMENTAL:NO /LARGEADDRESSAWARE /NXCOMPAT /DYNAMICBASE /NOLOGO /TLBID:1 /HIGHENTROPYVA /DLL /NODEFAULTLIB:uuid.lib "Shcore.lib" "WindowsApp.lib" "Crypt32.lib" /LIBPATH:"C:\Program Files (x86)\Windows Kits\10\Lib\10.0.18362.0\um\x64" /LIBPATH:"C:\Program Files (x86)\Windows Kits\10\Lib\10.0.18362.0\ucrt\x64" /LIBPATH:"C:\Program Files (x86)\Windows Kits\NETFXSDK\4.6.1\lib\um\x64" /LIBPATH:"C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Tools\MSVC\14.22.27905\lib\x64\store" /APPCONTAINER /SUBSYSTEM:WINDOWS /NODEFAULTLIB:ole32.lib /NODEFAULTLIB:kernel32.lib /NODEFAULTLIB:msvcrt.lib @"C:\Users\avikom_admin\AppData\Local\Temp\tmpEA7A.tmp" 1> 1>Creating library D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.lib and object D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.exp 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlopen referenced in function Mono_dlopen_mD32390860E62629D34C5680BB1A4D8737986E725 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlerror referenced in function Mono_dlerror_m7A2342CF799D2F296D8FEE7B9AA56FCA687913F5 1>3B616D723DD5AFFC3EEB7E766F8D9807.obj : error LNK2019: unresolved external symbol dlsym referenced in function Mono_dlsym_m3BFEF9952963669C7B0FC7AA527945E796CF9BCC 1>D:\workspace\GrpcUnity\Builds\build\obj\il2cppOutputProject\x64\Debug\linkresult_E66935DFECF713A67C1DEF87496019C7\GameAssembly.dll : fatal error LNK1120: 3 unresolved externals 1> 1> at Unity.IL2CPP.Building.CppProgramBuilder.PostprocessObjectFiles(HashSet`1 objectFiles, CppToolChainContext toolChainContext) 
1> at Unity.IL2CPP.Building.CppProgramBuilder.Build(IBuildStatistics& statistics) 1> at il2cpp.Program.DoRun(String[] args) 1> at il2cpp.Program.Run(String[] args) 1> at il2cpp.Program.Main(String[] args) 1>C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\MSBuild\Microsoft\VC\v160\Microsoft.MakeFile.Targets(49,5): error MSB3073: The command ""D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\IL2CPP\build\il2cpp.exe" --libil2cpp-static --compile-cpp -architecture=x64 -configuration=Debug -platform=winrt -outputpath="D:\workspace\GrpcUnity\Builds\\build\bin\x64\Debug\GameAssembly.dll" --data-folder="D:\workspace\GrpcUnity\Builds\\build\bin\x64\Debug\\" -cachedirectory="D:\workspace\GrpcUnity\Builds\\build\obj\il2cppOutputProject\x64\Debug\\" -generatedcppdir="D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\Source" --profiler-report --additional-defines=WINDOWS_UWP --additional-defines=UNITY_UWP --additional-defines=UNITY_WSA_10_0 --additional-defines=UNITY_WSA --additional-defines=UNITY_WINRT --additional-defines=PLATFORM_WINRT -dotnetprofile=unityaot -verbose --map-file-parser="D:\workspace\GrpcUnity\Builds\Il2CppOutputProject\\IL2CPP\MapFileParser\MapFileParser.exe" -forcerebuild" exited with code -532462766. 1>Done building project "Il2CppOutputProject.vcxproj" -- FAILED. ``` ### Anything else we should know about your project / environment? I tried to add `grpc_csharp_ext_dummy` as this used to solve issues for Android and Il2CPP. As seen in the output for x86, it is recognized as a candidate but not used. The `x64` does not have the linker issues related to grpc functions but still returns linking issues with `_Mono_dlsym`, `_dlerror`, `_dlopen` In case I should try something: I managed to build `grpc_csharp_ext` for `x64 (net45)` with `\tools\run_tests\run_tests.py` but could not build with arch x86/win32 since the scripts reports that this arch is not supported.
priority
unity build linker error for windows standalone hololens what version of grpc and what language are you using c dev unity what operating system linux windows and version windows sdk sdk what runtime compiler are you using e g python version or version of gcc vs toolset what did you do created empty unity project copied plugins folder from this of this into unity assets folder added plugins grpc core runtimes win grpc csharp ext to any platform added plugins grpc core runtimes win grpc csharp ext to any platform added plugins grpc core runtimes grpc csharp ext dummy to standalon and wsa with any sdk added rpcserver cs to assets scripts and added it to the scene using unityengine using grpc core public class rpcserver monobehaviour public server server protected void awake debug log init rpc server server new server ports new serverport localhost servercredentials insecure debug log rpc server initialized what did you expect to see compilation success what did you see instead creating library d workspace grpcunity builds build obj debug linkresult gameassembly lib and object d workspace grpcunity builds build obj debug linkresult gameassembly exp obj error unresolved external symbol grpcsharp init referenced in function dllimportsfromstaticlib grpcsharp init hint on symbols that are defined and could potentially match grpcsharp init obj error unresolved external symbol grpcsharp shutdown referenced in function dllimportsfromstaticlib grpcsharp shutdown hint on symbols that are defined and could potentially match grpcsharp shutdown obj error unresolved external symbol grpcsharp version string referenced in function dllimportsfromstaticlib grpcsharp version string hint on symbols that are defined and could potentially match grpcsharp version string obj error unresolved external symbol grpcsharp test call start unary echo referenced in function dllimportsfromstaticlib grpcsharp test call start unary echo hint on symbols that are defined and could potentially match 
grpcsharp test call start unary echo obj error unresolved external symbol dlopen referenced in function mono dlopen obj error unresolved external symbol dlerror referenced in function mono dlerror obj error unresolved external symbol dlsym referenced in function mono dlsym d workspace grpcunity builds build obj debug linkresult gameassembly dll fatal error unresolved externals at unity building cppprogrambuilder postprocessobjectfiles hashset objectfiles cpptoolchaincontext toolchaincontext at unity building cppprogrambuilder build ibuildstatistics statistics at program dorun string args at program run string args at program main string args unhandled exception c program files microsoft visual studio community msbuild microsoft vc microsoft makefile targets error the command d workspace grpcunity builds build exe static compile cpp architecture configuration debug platform winrt outputpath d workspace grpcunity builds build bin debug gameassembly dll data folder d workspace grpcunity builds build bin debug cachedirectory d workspace grpcunity builds build obj debug generatedcppdir d workspace grpcunity builds source profiler report additional defines windows uwp additional defines unity uwp additional defines unity wsa additional defines unity wsa additional defines unity winrt additional defines platform winrt dotnetprofile unityaot verbose map file parser d workspace grpcunity builds mapfileparser mapfileparser exe forcerebuild exited with code done building project vcxproj failed creating library d workspace grpcunity builds build obj debug linkresult gameassembly lib and object d workspace grpcunity builds build obj debug linkresult gameassembly exp obj error unresolved external symbol dlopen referenced in function mono dlopen obj error unresolved external symbol dlerror referenced in function mono dlerror obj error unresolved external symbol dlsym referenced in function mono dlsym d workspace grpcunity builds build obj debug linkresult gameassembly dll fatal 
error unresolved externals at unity building cppprogrambuilder postprocessobjectfiles hashset objectfiles cpptoolchaincontext toolchaincontext at unity building cppprogrambuilder build ibuildstatistics statistics at program dorun string args at program run string args at program main string args unhandled exception unity building builderfailedexception c program files microsoft visual studio community vc tools msvc bin link exe out d workspace grpcunity builds build obj debug linkresult gameassembly dll debug fastlink incremental no largeaddressaware nxcompat dynamicbase nologo tlbid highentropyva dll nodefaultlib uuid lib shcore lib windowsapp lib lib libpath c program files windows kits lib um libpath c program files windows kits lib ucrt libpath c program files windows kits netfxsdk lib um libpath c program files microsoft visual studio community vc tools msvc lib store appcontainer subsystem windows nodefaultlib lib nodefaultlib lib nodefaultlib msvcrt lib c users avikom admin appdata local temp tmp creating library d workspace grpcunity builds build obj debug linkresult gameassembly lib and object d workspace grpcunity builds build obj debug linkresult gameassembly exp obj error unresolved external symbol dlopen referenced in function mono dlopen obj error unresolved external symbol dlerror referenced in function mono dlerror obj error unresolved external symbol dlsym referenced in function mono dlsym d workspace grpcunity builds build obj debug linkresult gameassembly dll fatal error unresolved externals at unity building cppprogrambuilder postprocessobjectfiles hashset objectfiles cpptoolchaincontext toolchaincontext at unity building cppprogrambuilder build ibuildstatistics statistics at program dorun string args at program run string args at program main string args c program files microsoft visual studio community msbuild microsoft vc microsoft makefile targets error the command d workspace grpcunity builds build exe static compile cpp architecture 
configuration debug platform winrt outputpath d workspace grpcunity builds build bin debug gameassembly dll data folder d workspace grpcunity builds build bin debug cachedirectory d workspace grpcunity builds build obj debug generatedcppdir d workspace grpcunity builds source profiler report additional defines windows uwp additional defines unity uwp additional defines unity wsa additional defines unity wsa additional defines unity winrt additional defines platform winrt dotnetprofile unityaot verbose map file parser d workspace grpcunity builds mapfileparser mapfileparser exe forcerebuild exited with code done building project vcxproj failed anything else we should know about your project environment i tried to add grpc csharp ext dummy as this used to solve issues for android and as seen in the output for it is recognized as a candidate but not used the does not have the linker issues related to grpc functions but still returns linking issues with mono dlsym dlerror dlopen in case i should try something i managed to build grpc csharp ext for with tools run tests run tests py but could not build with arch since the scripts reports that this arch is not supported
1
740,039
25,733,538,586
IssuesEvent
2022-12-07 22:22:05
open-telemetry/opentelemetry-collector-contrib
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
opened
make generate-gh-issue-templates changes the current components list
bug priority:p2
### Component(s) _No response_ ### What happened? We need to update it and investigate why https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/551e5c97b32588c33a3852f2635052813e727c2e/.github/workflows/build-and-test.yml#L186 is not blocking the CI ### Collector version latest ### Environment information _No response_ ### OpenTelemetry Collector configuration _No response_ ### Log output _No response_ ### Additional context _No response_
1.0
make generate-gh-issue-templates changes the current components list - ### Component(s) _No response_ ### What happened? We need to update it and investigate why https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/551e5c97b32588c33a3852f2635052813e727c2e/.github/workflows/build-and-test.yml#L186 is not blocking the CI ### Collector version latest ### Environment information _No response_ ### OpenTelemetry Collector configuration _No response_ ### Log output _No response_ ### Additional context _No response_
priority
make generate gh issue templates changes the current components list component s no response what happened we need to update it and investigate why is not blocking the ci collector version latest environment information no response opentelemetry collector configuration no response log output no response additional context no response
1
150,248
11,953,519,580
IssuesEvent
2020-04-03 21:05:10
open-cluster-management/rhacm-docs
https://api.github.com/repos/open-cluster-management/rhacm-docs
closed
Sample Subscription YAML - timeWindow
bug squad:doc testathon
<!-- Use the [summary.md](https://github.com/open-cluster-management/rhacm-docs/blob/doc_stage/summary.md) file as the table of contents. The `summary.md` file provides direct linking within the repository to the corresponding files. 1. Provide detailed descriptions of the changes. If you provide your contact information, we can contact you with any questions related to your issue. 2. Add _squad-doc_ label to the issue. 3. Submit the issue. Please add `squad:doc` label. The ID team adds the `in-review` label when it is time to start reviewing the changes. --> ## Documentation Report The YAML samples for creating a subscription that include a "timeWindow" should not be indented. They should be the root element. https://github.com/open-cluster-management/rhacm-docs/blob/doc_stage/manage_applications/managing_subscriptions.md ## Product release(s) - [ ] [ACM 4.4] **Note:** ID will update the current version and the two previous versions (n-2). For earlier versions, we will address only P1 & P2 Doc APARS for releases in support. ## Type of documentation change - [ ] New topic - [x] Update to an existing topic ## Link to the topic(s) that require an update <!--If a new topic is required you can leave this section blank. 
Use a link from the summary.md file.--> * https://github.com/open-cluster-management/rhacm-docs/blob/doc_stage/manage_applications/managing_subscriptions.md ## Draft content or detailed description to get ID started ``` apiVersion: apps.open-cluster-management.io/v1 kind: Subscription metadata: name: nginx namespace: my-channel-subscription-nginx labels: app: nginx-app-details spec: channel: my-channel-namespace/my-development-channel name: nginx-ingress packageFilter: version: "1.20.x" placement: placementRef: kind: PlacementRule name: my-placement-rule **timeWindow: type: "block"/"active" location: "America/Los_Angeles" daysofweek: ["Monday", "Wednesday", "Friday"] hours: - start: "10:20AM" end: "10:30AM" - start: "12:40PM" end: "1:40PM"** ``` ``` apiVersion: apps.open-cluster-management.io/v1 kind: Subscription metadata: name: namespace: labels: spec: sourceNamespace: source: channel: name: packageFilter: version: labelSelector: matchLabels: package: component: annotations: packageOverrides: - packageName: packageAlias: - path: value: placement: local: clusters: name: clusterSelector: placementRef: name: kind: PlacementRule overrides: clusterName: clusterOverrides: path: value: **timeWindow: type: location: daysofweek: hours: - start: end:** ```
1.0
Sample Subscription YAML - timeWindow - <!-- Use the [summary.md](https://github.com/open-cluster-management/rhacm-docs/blob/doc_stage/summary.md) file as the table of contents. The `summary.md` file provides direct linking within the repository to the corresponding files. 1. Provide detailed descriptions of the changes. If you provide your contact information, we can contact you with any questions related to your issue. 2. Add _squad-doc_ label to the issue. 3. Submit the issue. Please add `squad:doc` label. The ID team adds the `in-review` label when it is time to start reviewing the changes. --> ## Documentation Report The YAML samples for creating a subscription that include a "timeWindow" should not be indented. They should be the root element. https://github.com/open-cluster-management/rhacm-docs/blob/doc_stage/manage_applications/managing_subscriptions.md ## Product release(s) - [ ] [ACM 4.4] **Note:** ID will update the current version and the two previous versions (n-2). For earlier versions, we will address only P1 & P2 Doc APARS for releases in support. ## Type of documentation change - [ ] New topic - [x] Update to an existing topic ## Link to the topic(s) that require an update <!--If a new topic is required you can leave this section blank. 
Use a link from the summary.md file.--> * https://github.com/open-cluster-management/rhacm-docs/blob/doc_stage/manage_applications/managing_subscriptions.md ## Draft content or detailed description to get ID started ``` apiVersion: apps.open-cluster-management.io/v1 kind: Subscription metadata: name: nginx namespace: my-channel-subscription-nginx labels: app: nginx-app-details spec: channel: my-channel-namespace/my-development-channel name: nginx-ingress packageFilter: version: "1.20.x" placement: placementRef: kind: PlacementRule name: my-placement-rule **timeWindow: type: "block"/"active" location: "America/Los_Angeles" daysofweek: ["Monday", "Wednesday", "Friday"] hours: - start: "10:20AM" end: "10:30AM" - start: "12:40PM" end: "1:40PM"** ``` ``` apiVersion: apps.open-cluster-management.io/v1 kind: Subscription metadata: name: namespace: labels: spec: sourceNamespace: source: channel: name: packageFilter: version: labelSelector: matchLabels: package: component: annotations: packageOverrides: - packageName: packageAlias: - path: value: placement: local: clusters: name: clusterSelector: placementRef: name: kind: PlacementRule overrides: clusterName: clusterOverrides: path: value: **timeWindow: type: location: daysofweek: hours: - start: end:** ```
non_priority
sample subscription yaml timewindow use the file as the table of contents the summary md file provides direct linking within the repository to the corresponding files provide detailed descriptions of the changes if you provide your contact information we can contact you with any questions related to your issue add squad doc label to the issue submit the issue please add squad doc label the id team adds the in review label when it is time to start reviewing the changes documentation report the yaml samples for creating a subscription that include a timewindow should not be indented they should be the root element product release s note id will update the current version and the two previous versions n for earlier versions we will address only doc apars for releases in support type of documentation change new topic update to an existing topic link to the topic s that require an update draft content or detailed description to get id started apiversion apps open cluster management io kind subscription metadata name nginx namespace my channel subscription nginx labels app nginx app details spec channel my channel namespace my development channel name nginx ingress packagefilter version x placement placementref kind placementrule name my placement rule timewindow type block active location america los angeles daysofweek hours start end start end apiversion apps open cluster management io kind subscription metadata name namespace labels spec sourcenamespace source channel name packagefilter version labelselector matchlabels package component annotations packageoverrides packagename packagealias path value placement local clusters name clusterselector placementref name kind placementrule overrides clustername clusteroverrides path value timewindow type location daysofweek hours start end
0
4,005
4,154,033,617
IssuesEvent
2016-06-16 10:00:22
bartdag/py4j
https://api.github.com/repos/bartdag/py4j
closed
Replace check_connection by so_linger
performance
Essentially, so linger at true with timeout 0 will send a RST to indicate that the connection ended with an error. This should only be used on timeout. We should never use it for other reasons. This will make a send fails without needing to read first. I believe this should make the performance go back to what it was before introducing check connection. If that's the case, the optional retry could wait until 0.10.3.
True
Replace check_connection by so_linger - Essentially, so linger at true with timeout 0 will send a RST to indicate that the connection ended with an error. This should only be used on timeout. We should never use it for other reasons. This will make a send fails without needing to read first. I believe this should make the performance go back to what it was before introducing check connection. If that's the case, the optional retry could wait until 0.10.3.
non_priority
replace check connection by so linger essentially so linger at true with timeout will send a rst to indicate that the connection ended with an error this should only be used on timeout we should never use it for other reasons this will make a send fails without needing to read first i believe this should make the performance go back to what it was before introducing check connection if that s the case the optional retry could wait until
0
334,980
30,001,722,825
IssuesEvent
2023-06-26 09:43:59
brave/brave-browser
https://api.github.com/repos/brave/brave-browser
closed
Backspacing after searching for non-existent feed source seems to "hang" search
bug priority/P5 QA/Yes QA/Test-Plan-Specified OS/Desktop feature/brave-news
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue. PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE. INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED--> ## Description <!--Provide a brief description of the issue--> Backspacing after searching for non-existent feed source seems to "hang" search ## Steps to Reproduce <!--Please add a series of steps to reproduce the issue--> 1. install `1.47.20` 2. launch Brave 3. open `brave://flags` 4. set `brave://flags/#brave-news-v2` to `Enabled` 5. open a new-tab page 6. click on `Customize` 7. click on `Brave News` 8. click on `Turn on Brave News` 9. type `https://brave.com` 10. click on the `Get feeds from https://brave.com` 11. now, backspace/delete the `m` 12. wait... ## Actual result: <!--Please add screenshots if needed--> step 9 | step 10 | step 11 -----|-----|----- <img width="1312" alt="Screen Shot 2022-10-26 at 4 10 31 PM" src="https://user-images.githubusercontent.com/387249/198155623-4a2ebf49-20a8-4d1d-9720-dc0dec923b4b.png"> | <img width="1312" alt="Screen Shot 2022-10-26 at 4 10 35 PM" src="https://user-images.githubusercontent.com/387249/198155624-5af54b36-dd09-4dff-8ff9-51d1933b3d23.png"> | <img width="1312" alt="Screen Shot 2022-10-26 at 4 10 38 PM" src="https://user-images.githubusercontent.com/387249/198155629-af0f5930-b7a5-438f-a68c-9b138fd7285b.png"> ## Expected result: <img width="1312" alt="Screen Shot 2022-10-26 at 4 12 05 PM" src="https://user-images.githubusercontent.com/387249/198155775-ce99c408-23bf-4f25-b132-3d9096c0e7ad.png"> ## Reproduces how often: <!--[Easily reproduced/Intermittent issue/No steps to reproduce]--> 100% ## Brave version (brave://version info) <!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. 
If building from source please mention it along with brave://version details--> Brave | 1.47.20 Chromium: 107.0.5304.68 (Official Build) nightly (x86_64) -- | -- Revision | a4e93e89d3b3df1be22214603fba846ad0183ca5-refs/branch-heads/5304@{#991} OS | macOS Version 11.7.1 (Build 20G918) ## Version/Channel Information: <!--Does this issue happen on any other channels? Or is it specific to a certain channel?--> - Can you reproduce this issue with the current release? `No` - Can you reproduce this issue with the beta channel? `Yes` - Can you reproduce this issue with the nightly channel? `Yes` ## Other Additional Information: - Does the issue resolve itself when disabling Brave Shields? - Does the issue resolve itself when disabling Brave Rewards? - Is the issue reproducible on the latest version of Chrome? ## Miscellaneous Information: <!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue--> cc @fallaciousreasoning @mattmcalister
1.0
Backspacing after searching for non-existent feed source seems to "hang" search - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue. PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE. INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED--> ## Description <!--Provide a brief description of the issue--> Backspacing after searching for non-existent feed source seems to "hang" search ## Steps to Reproduce <!--Please add a series of steps to reproduce the issue--> 1. install `1.47.20` 2. launch Brave 3. open `brave://flags` 4. set `brave://flags/#brave-news-v2` to `Enabled` 5. open a new-tab page 6. click on `Customize` 7. click on `Brave News` 8. click on `Turn on Brave News` 9. type `https://brave.com` 10. click on the `Get feeds from https://brave.com` 11. now, backspace/delete the `m` 12. wait... ## Actual result: <!--Please add screenshots if needed--> step 9 | step 10 | step 11 -----|-----|----- <img width="1312" alt="Screen Shot 2022-10-26 at 4 10 31 PM" src="https://user-images.githubusercontent.com/387249/198155623-4a2ebf49-20a8-4d1d-9720-dc0dec923b4b.png"> | <img width="1312" alt="Screen Shot 2022-10-26 at 4 10 35 PM" src="https://user-images.githubusercontent.com/387249/198155624-5af54b36-dd09-4dff-8ff9-51d1933b3d23.png"> | <img width="1312" alt="Screen Shot 2022-10-26 at 4 10 38 PM" src="https://user-images.githubusercontent.com/387249/198155629-af0f5930-b7a5-438f-a68c-9b138fd7285b.png"> ## Expected result: <img width="1312" alt="Screen Shot 2022-10-26 at 4 12 05 PM" src="https://user-images.githubusercontent.com/387249/198155775-ce99c408-23bf-4f25-b132-3d9096c0e7ad.png"> ## Reproduces how often: <!--[Easily reproduced/Intermittent issue/No steps to reproduce]--> 100% ## Brave version (brave://version info) <!--For installed build, please copy Brave, Revision and OS from brave://version and paste 
here. If building from source please mention it along with brave://version details--> Brave | 1.47.20 Chromium: 107.0.5304.68 (Official Build) nightly (x86_64) -- | -- Revision | a4e93e89d3b3df1be22214603fba846ad0183ca5-refs/branch-heads/5304@{#991} OS | macOS Version 11.7.1 (Build 20G918) ## Version/Channel Information: <!--Does this issue happen on any other channels? Or is it specific to a certain channel?--> - Can you reproduce this issue with the current release? `No` - Can you reproduce this issue with the beta channel? `Yes` - Can you reproduce this issue with the nightly channel? `Yes` ## Other Additional Information: - Does the issue resolve itself when disabling Brave Shields? - Does the issue resolve itself when disabling Brave Rewards? - Is the issue reproducible on the latest version of Chrome? ## Miscellaneous Information: <!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue--> cc @fallaciousreasoning @mattmcalister
non_priority
backspacing after searching for non existent feed source seems to hang search have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description backspacing after searching for non existent feed source seems to hang search steps to reproduce install launch brave open brave flags set brave flags brave news to enabled open a new tab page click on customize click on brave news click on turn on brave news type click on the get feeds from now backspace delete the m wait actual result step step step img width alt screen shot at pm src img width alt screen shot at pm src img width alt screen shot at pm src expected result img width alt screen shot at pm src reproduces how often brave version brave version info brave chromium   official build  nightly  revision refs branch heads os macos version build version channel information can you reproduce this issue with the current release no can you reproduce this issue with the beta channel yes can you reproduce this issue with the nightly channel yes other additional information does the issue resolve itself when disabling brave shields does the issue resolve itself when disabling brave rewards is the issue reproducible on the latest version of chrome miscellaneous information cc fallaciousreasoning mattmcalister
0
621,997
19,603,481,047
IssuesEvent
2022-01-06 05:53:21
RobotLocomotion/drake
https://api.github.com/repos/RobotLocomotion/drake
closed
[multibody] Hydroelastics with polygon contact surfaces crash continuous systems.
type: bug priority: low team: dynamics component: multibody plant
After PR #16238 was merged, users of hydroelastic contact models can freely mix-and-match {continuous, discrete} system with {triangle,polygon} contact surfaces by using this switch in MultibodyPlant: ```C++ // Use polygon contact surfaces. plant.set_contact_surface_representation( geometry::HydroelasticContactRepresentation::kPolygon); // Use triangle contact surfaces. plant.set_contact_surface_representation( geometry::HydroelasticContactRepresentation::kTriangle); ``` I found a simple example that when I used a continuous system with hydroelastics and kPolygon, it crashed with this error message: ``` Encountered singular articulated body hinge inertia for body node index 1. Please ensure that this body has non-zero inertia along all axes of motion. ``` <h2>Reproduce: Continuous System Crash</h2> Apply this patch on drake/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc to use polygon contact surfaces. ``` diff --git a/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc b/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc index 7fb9773d4..151a35495 100644 --- a/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc +++ b/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc @@ -138,6 +138,8 @@ int do_main() { // Set contact model and parameters. if (FLAGS_contact_model == "hydroelastic") { plant.set_contact_model(ContactModel::kHydroelastic); + plant.set_contact_surface_representation( + geometry::HydroelasticContactRepresentation::kPolygon); plant.Finalize(); } else if (FLAGS_contact_model == "point") { // Plant must be finalized before setting the penetration allowance. 
``` Then, run this command to get the error (mbp_dt=0 selects continuous system): ``` (master)*$ bazel run //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics -- --contact_model=hydroelastic --mbp_dt=0.0 INFO: Analyzed target //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics (0 packages loaded, 0 targets configured). INFO: Found 1 target... Target //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics up-to-date: bazel-bin/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics INFO: Elapsed time: 0.508s, Critical Path: 0.00s INFO: 1 process: 1 internal. INFO: Build completed successfully, 1 total action INFO: Build completed successfully, 1 total action terminate called after throwing an instance of 'std::runtime_error' what(): Encountered singular articulated body hinge inertia for body node index 1. Please ensure that this body has non-zero inertia along all axes of motion. Aborted (core dumped) ``` <h2> Verify: Discrete System OK </h2> By changing `mbp_dt` to non-zero (1e-3 seconds), we will get a discrete system that runs fine like this: ``` (master)*$ bazel run //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics -- --contact_model=hydroelastic --mbp_dt=1e-3 INFO: Analyzed target //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics (0 packages loaded, 0 targets configured). INFO: Found 1 target... Target //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics up-to-date: bazel-bin/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics INFO: Elapsed time: 0.387s, Critical Path: 0.00s INFO: 1 process: 1 internal. INFO: Build completed successfully, 1 total action INFO: Build completed successfully, 1 total action Simulator::AdvanceTo() wall clock time: 9.995 seconds. 
General stats regarding discrete updates: Number of time steps taken (simulator stats) = 2112 Simulator publishes every time step: false Number of publishes = 129 Number of discrete updates = 2000 Number of "unrestricted" updates = 0 Note: the following integrator took zero steps. The simulator exclusively used the discrete solver. Stats for integrator RungeKutta3Integrator with error control: Number of time steps taken (integrator stats) = 0 Initial time step taken = nan s Largest time step taken = nan s Smallest adapted step size = nan s Number of steps shrunk due to error control = 0 Number of derivative evaluations = 0 Number of steps shrunk due to convergence-based failure = 0 Number of convergence-based step failures (should match) = 0 ``` <h2>Post Script</h2> Luckily we do not expect to use continuous hydroelastics with polygons frequently. By default, continuous systems use triangles, and discrete systems use polygons.
1.0
[multibody] Hydroelastics with polygon contact surfaces crash continuous systems. - After PR #16238 was merged, users of hydroelastic contact models can freely mix-and-match {continuous, discrete} system with {triangle,polygon} contact surfaces by using this switch in MultibodyPlant: ```C++ // Use polygon contact surfaces. plant.set_contact_surface_representation( geometry::HydroelasticContactRepresentation::kPolygon); // Use triangle contact surfaces. plant.set_contact_surface_representation( geometry::HydroelasticContactRepresentation::kTriangle); ``` I found a simple example that when I used a continuous system with hydroelastics and kPolygon, it crashed with this error message: ``` Encountered singular articulated body hinge inertia for body node index 1. Please ensure that this body has non-zero inertia along all axes of motion. ``` <h2>Reproduce: Continuous System Crash</h2> Apply this patch on drake/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc to use polygon contact surfaces. ``` diff --git a/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc b/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc index 7fb9773d4..151a35495 100644 --- a/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc +++ b/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics.cc @@ -138,6 +138,8 @@ int do_main() { // Set contact model and parameters. if (FLAGS_contact_model == "hydroelastic") { plant.set_contact_model(ContactModel::kHydroelastic); + plant.set_contact_surface_representation( + geometry::HydroelasticContactRepresentation::kPolygon); plant.Finalize(); } else if (FLAGS_contact_model == "point") { // Plant must be finalized before setting the penetration allowance. 
``` Then, run this command to get the error (mbp_dt=0 selects continuous system): ``` (master)*$ bazel run //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics -- --contact_model=hydroelastic --mbp_dt=0.0 INFO: Analyzed target //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics (0 packages loaded, 0 targets configured). INFO: Found 1 target... Target //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics up-to-date: bazel-bin/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics INFO: Elapsed time: 0.508s, Critical Path: 0.00s INFO: 1 process: 1 internal. INFO: Build completed successfully, 1 total action INFO: Build completed successfully, 1 total action terminate called after throwing an instance of 'std::runtime_error' what(): Encountered singular articulated body hinge inertia for body node index 1. Please ensure that this body has non-zero inertia along all axes of motion. Aborted (core dumped) ``` <h2> Verify: Discrete System OK </h2> By changing `mbp_dt` to non-zero (1e-3 seconds), we will get a discrete system that runs fine like this: ``` (master)*$ bazel run //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics -- --contact_model=hydroelastic --mbp_dt=1e-3 INFO: Analyzed target //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics (0 packages loaded, 0 targets configured). INFO: Found 1 target... Target //examples/multibody/rolling_sphere:rolling_sphere_run_dynamics up-to-date: bazel-bin/examples/multibody/rolling_sphere/rolling_sphere_run_dynamics INFO: Elapsed time: 0.387s, Critical Path: 0.00s INFO: 1 process: 1 internal. INFO: Build completed successfully, 1 total action INFO: Build completed successfully, 1 total action Simulator::AdvanceTo() wall clock time: 9.995 seconds. 
General stats regarding discrete updates: Number of time steps taken (simulator stats) = 2112 Simulator publishes every time step: false Number of publishes = 129 Number of discrete updates = 2000 Number of "unrestricted" updates = 0 Note: the following integrator took zero steps. The simulator exclusively used the discrete solver. Stats for integrator RungeKutta3Integrator with error control: Number of time steps taken (integrator stats) = 0 Initial time step taken = nan s Largest time step taken = nan s Smallest adapted step size = nan s Number of steps shrunk due to error control = 0 Number of derivative evaluations = 0 Number of steps shrunk due to convergence-based failure = 0 Number of convergence-based step failures (should match) = 0 ``` <h2>Post Script</h2> Luckily we do not expect to use continuous hydroelastics with polygons frequently. By default, continuous systems use triangles, and discrete systems use polygons.
priority
hydroelastics with polygon contact surfaces crash continuous systems after pr was merged users of hydroelastic contact models can freely mix and match continuous discrete system with triangle polygon contact surfaces by using this switch in multibodyplant c use polygon contact surfaces plant set contact surface representation geometry hydroelasticcontactrepresentation kpolygon use triangle contact surfaces plant set contact surface representation geometry hydroelasticcontactrepresentation ktriangle i found a simple example that when i used a continuous system with hydroelastics and kpolygon it crashed with this error message encountered singular articulated body hinge inertia for body node index please ensure that this body has non zero inertia along all axes of motion reproduce continuous system crash apply this patch on drake examples multibody rolling sphere rolling sphere run dynamics cc to use polygon contact surfaces diff git a examples multibody rolling sphere rolling sphere run dynamics cc b examples multibody rolling sphere rolling sphere run dynamics cc index a examples multibody rolling sphere rolling sphere run dynamics cc b examples multibody rolling sphere rolling sphere run dynamics cc int do main set contact model and parameters if flags contact model hydroelastic plant set contact model contactmodel khydroelastic plant set contact surface representation geometry hydroelasticcontactrepresentation kpolygon plant finalize else if flags contact model point plant must be finalized before setting the penetration allowance then run this command to get the error mbp dt selects continuous system master bazel run examples multibody rolling sphere rolling sphere run dynamics contact model hydroelastic mbp dt info analyzed target examples multibody rolling sphere rolling sphere run dynamics packages loaded targets configured info found target target examples multibody rolling sphere rolling sphere run dynamics up to date bazel bin examples multibody rolling 
sphere rolling sphere run dynamics info elapsed time critical path info process internal info build completed successfully total action info build completed successfully total action terminate called after throwing an instance of std runtime error what encountered singular articulated body hinge inertia for body node index please ensure that this body has non zero inertia along all axes of motion aborted core dumped verify discrete system ok by changing mbp dt to non zero seconds we will get a discrete system that runs fine like this master bazel run examples multibody rolling sphere rolling sphere run dynamics contact model hydroelastic mbp dt info analyzed target examples multibody rolling sphere rolling sphere run dynamics packages loaded targets configured info found target target examples multibody rolling sphere rolling sphere run dynamics up to date bazel bin examples multibody rolling sphere rolling sphere run dynamics info elapsed time critical path info process internal info build completed successfully total action info build completed successfully total action simulator advanceto wall clock time seconds general stats regarding discrete updates number of time steps taken simulator stats simulator publishes every time step false number of publishes number of discrete updates number of unrestricted updates note the following integrator took zero steps the simulator exclusively used the discrete solver stats for integrator with error control number of time steps taken integrator stats initial time step taken nan s largest time step taken nan s smallest adapted step size nan s number of steps shrunk due to error control number of derivative evaluations number of steps shrunk due to convergence based failure number of convergence based step failures should match post script luckily we do not expect to use continuous hydroelastics with polygons frequently by default continuous systems use triangles and discrete systems use polygons
1
743,548
25,904,135,585
IssuesEvent
2022-12-15 08:53:54
vyper-protocol/vyper-cli
https://api.github.com/repos/vyper-protocol/vyper-cli
closed
Pyth Rate plugin support
enhancement high priority
- Create new Pyth rate plugin - Fetch existing pyth rate plugin Ref: #24
1.0
Pyth Rate plugin support - - Create new Pyth rate plugin - Fetch existing pyth rate plugin Ref: #24
priority
pyth rate plugin support create new pyth rate plugin fetch existing pyth rate plugin ref
1
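Each record above pairs a raw GitHub labels field (e.g. `priority: low`, `high priority`, `priority/P5`) with a `priority`/`non_priority` class and a matching binary label. The dataset does not document the mapping rule; the sketch below is an assumption inferred from these four rows — a standalone `priority` token (optionally colon-suffixed) marks the positive class, while slash-forms like `priority/P5` do not. The function name `classify_labels` is hypothetical.

```python
def classify_labels(labels: str) -> tuple[str, int]:
    """Map a raw labels string to (class_name, binary_label).

    Assumed rule (inferred from the surrounding records, not documented):
    a whitespace-delimited token equal to 'priority' -- with any trailing
    colon stripped -- yields the 'priority' class and binary label 1.
    Slash-qualified tokens such as 'priority/P5' deliberately do not
    match, since that record is classed 'non_priority' above.
    """
    tokens = [token.rstrip(":") for token in labels.split()]
    if "priority" in tokens:
        return "priority", 1
    return "non_priority", 0
```

Against the records in this chunk, the sketch reproduces all four class/label pairs (Drake and Vyper map to `("priority", 1)`; Brave and Dagger map to `("non_priority", 0)`), but it is only a plausible reconstruction, not the dataset's actual pipeline.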
18,909
5,732,917,803
IssuesEvent
2017-04-21 15:55:09
google/dagger
https://api.github.com/repos/google/dagger
closed
provided Map and Sets will be recreated in Subcomponents
area: code-generation status: researching type: bug
This component does create extra providers for set and maps in the subcomponent, although they are not used. If you have many or large provided maps or sets and your subcomponent is used with a scope that is created on request, dagger recreates all the maps and sets on each request, although they are never injected. ``` java @javax.inject.Singleton @dagger.Component(modules = MapBugSubcomponent.MyModule.class) public interface MapBugSubcomponent { MainInjectable getMainInjectable(); MySubComponent getSubComponent(); @dagger.Subcomponent interface MySubComponent { // SubComponent does not need any provided Map and Set but dagger generates unused map and set providers for the Subcomponent // this only happens with map and sets. The integer provider from the main component is correctly reused without regeneration // in the subcomponent SubComponentInjectable getSubInjectable(); } // Provides all the objects for MainInjectable @dagger.Module class MyModule { @javax.inject.Singleton @dagger.Provides Integer provideInt() { return 1; } @javax.inject.Singleton @dagger.Provides(type = dagger.Provides.Type.SET_VALUES) java.util.Set<String> provideSet() { return java.util.Collections.singleton("SetEntry"); } @MyMapKey("1") @javax.inject.Singleton @dagger.Provides(type = dagger.Provides.Type.MAP) String provideEntry() { return "MapEntry"; } } @dagger.MapKey(unwrapValue = true) public @interface MyMapKey { String value(); } class MainInjectable { @javax.inject.Inject MainInjectable(java.util.Map<String, String> map, java.util.Set<String> set, Integer integer) { } } class SubComponentInjectable { @javax.inject.Inject SubComponentInjectable(Integer integer) { } } } ``` Note that in the generated code ``` private Provider<Map<String, Provider<String>>> mapOfStringAndProviderOfStringProvider; private Provider<Map<String, String>> mapOfStringAndStringProvider; private Provider<Set<String>> setOfStringProvider; ``` are recreated although they are not used in the Objects created by 
the subcomponent. Also note that provideIntProvider that is used in the subcomponent is not recreated and correctly used from the main component. ``` java @Generated("dagger.internal.codegen.ComponentProcessor") public final class DaggerMapBugSubcomponent implements MapBugSubcomponent { private Provider<String> mapOfStringAndProviderOfStringContribution1; private Provider<Map<String, Provider<String>>> mapOfStringAndProviderOfStringProvider; private Provider<Map<String, String>> mapOfStringAndStringProvider; private Provider<Set<String>> setOfStringContribution1Provider; private Provider<Set<String>> setOfStringProvider; private Provider<Integer> provideIntProvider; private Provider<MainInjectable> mainInjectableProvider; private DaggerMapBugSubcomponent(Builder builder) { assert builder != null; initialize(builder); } public static Builder builder() { return new Builder(); } public static MapBugSubcomponent create() { return builder().build(); } private void initialize(final Builder builder) { this.mapOfStringAndProviderOfStringContribution1 = ScopedProvider.create(MapBugSubcomponent$MyModule_ProvideEntryFactory.create(builder.myModule)); this.mapOfStringAndProviderOfStringProvider = MapProviderFactory.<String, String>builder(1) .put("1", mapOfStringAndProviderOfStringContribution1) .build(); this.mapOfStringAndStringProvider = MapFactory.create(mapOfStringAndProviderOfStringProvider); this.setOfStringContribution1Provider = ScopedProvider.create(MapBugSubcomponent$MyModule_ProvideSetFactory.create(builder.myModule)); this.setOfStringProvider = SetFactory.create(setOfStringContribution1Provider); this.provideIntProvider = ScopedProvider.create(MapBugSubcomponent$MyModule_ProvideIntFactory.create(builder.myModule)); this.mainInjectableProvider = MapBugSubcomponent$MainInjectable_Factory.create(mapOfStringAndStringProvider, setOfStringProvider, provideIntProvider); } @Override public MainInjectable getMainInjectable() { return mainInjectableProvider.get(); } 
@Override public MySubComponent getSubComponent() { return new MySubComponentImpl(); } public static final class Builder { private MyModule myModule; private Builder() { } public MapBugSubcomponent build() { if (myModule == null) { this.myModule = new MyModule(); } return new DaggerMapBugSubcomponent(this); } public Builder myModule(MyModule myModule) { if (myModule == null) { throw new NullPointerException("myModule"); } this.myModule = myModule; return this; } } private final class MySubComponentImpl implements MySubComponent { private Provider<SubComponentInjectable> subComponentInjectableProvider; private Provider<Map<String, Provider<String>>> mapOfStringAndProviderOfStringProvider; private Provider<Map<String, String>> mapOfStringAndStringProvider; private Provider<Set<String>> setOfStringProvider; private MySubComponentImpl() { initialize(); } private void initialize() { this.subComponentInjectableProvider = MapBugSubcomponent$SubComponentInjectable_Factory.create(DaggerMapBugSubcomponent.this.provideIntProvider); this.mapOfStringAndProviderOfStringProvider = MapProviderFactory.<String, String>builder(1) .put("1", mapOfStringAndProviderOfStringContribution1) .build(); this.mapOfStringAndStringProvider = MapFactory.create(mapOfStringAndProviderOfStringProvider); this.setOfStringProvider = SetFactory.create(setOfStringContribution1Provider); } @Override public SubComponentInjectable getSubInjectable() { return subComponentInjectableProvider.get(); } } } ```
1.0
provided Map and Sets will be recreated in Subcomponents - This component does create extra providers for set and maps in the subcomponent, although they are not used. If you have many or large provided maps or sets and your subcomponent is used with a scope that is created on request, dagger recreates all the maps and sets on each request, although they are never injected. ``` java @javax.inject.Singleton @dagger.Component(modules = MapBugSubcomponent.MyModule.class) public interface MapBugSubcomponent { MainInjectable getMainInjectable(); MySubComponent getSubComponent(); @dagger.Subcomponent interface MySubComponent { // SubComponent does not need any provided Map and Set but dagger generates unused map and set providers for the Subcomponent // this only happens with map and sets. The integer provider from the main component is correctly reused without regeneration // in the subcomponent SubComponentInjectable getSubInjectable(); } // Provides all the objects for MainInjectable @dagger.Module class MyModule { @javax.inject.Singleton @dagger.Provides Integer provideInt() { return 1; } @javax.inject.Singleton @dagger.Provides(type = dagger.Provides.Type.SET_VALUES) java.util.Set<String> provideSet() { return java.util.Collections.singleton("SetEntry"); } @MyMapKey("1") @javax.inject.Singleton @dagger.Provides(type = dagger.Provides.Type.MAP) String provideEntry() { return "MapEntry"; } } @dagger.MapKey(unwrapValue = true) public @interface MyMapKey { String value(); } class MainInjectable { @javax.inject.Inject MainInjectable(java.util.Map<String, String> map, java.util.Set<String> set, Integer integer) { } } class SubComponentInjectable { @javax.inject.Inject SubComponentInjectable(Integer integer) { } } } ``` Note that in the generated code ``` private Provider<Map<String, Provider<String>>> mapOfStringAndProviderOfStringProvider; private Provider<Map<String, String>> mapOfStringAndStringProvider; private Provider<Set<String>> setOfStringProvider; ``` are 
recreated although they are not used in the Objects created by the subcomponent. Also note that provideIntProvider that is used in the subcomponent is not recreated and correctly used from the main component. ``` java @Generated("dagger.internal.codegen.ComponentProcessor") public final class DaggerMapBugSubcomponent implements MapBugSubcomponent { private Provider<String> mapOfStringAndProviderOfStringContribution1; private Provider<Map<String, Provider<String>>> mapOfStringAndProviderOfStringProvider; private Provider<Map<String, String>> mapOfStringAndStringProvider; private Provider<Set<String>> setOfStringContribution1Provider; private Provider<Set<String>> setOfStringProvider; private Provider<Integer> provideIntProvider; private Provider<MainInjectable> mainInjectableProvider; private DaggerMapBugSubcomponent(Builder builder) { assert builder != null; initialize(builder); } public static Builder builder() { return new Builder(); } public static MapBugSubcomponent create() { return builder().build(); } private void initialize(final Builder builder) { this.mapOfStringAndProviderOfStringContribution1 = ScopedProvider.create(MapBugSubcomponent$MyModule_ProvideEntryFactory.create(builder.myModule)); this.mapOfStringAndProviderOfStringProvider = MapProviderFactory.<String, String>builder(1) .put("1", mapOfStringAndProviderOfStringContribution1) .build(); this.mapOfStringAndStringProvider = MapFactory.create(mapOfStringAndProviderOfStringProvider); this.setOfStringContribution1Provider = ScopedProvider.create(MapBugSubcomponent$MyModule_ProvideSetFactory.create(builder.myModule)); this.setOfStringProvider = SetFactory.create(setOfStringContribution1Provider); this.provideIntProvider = ScopedProvider.create(MapBugSubcomponent$MyModule_ProvideIntFactory.create(builder.myModule)); this.mainInjectableProvider = MapBugSubcomponent$MainInjectable_Factory.create(mapOfStringAndStringProvider, setOfStringProvider, provideIntProvider); } @Override public MainInjectable 
getMainInjectable() { return mainInjectableProvider.get(); } @Override public MySubComponent getSubComponent() { return new MySubComponentImpl(); } public static final class Builder { private MyModule myModule; private Builder() { } public MapBugSubcomponent build() { if (myModule == null) { this.myModule = new MyModule(); } return new DaggerMapBugSubcomponent(this); } public Builder myModule(MyModule myModule) { if (myModule == null) { throw new NullPointerException("myModule"); } this.myModule = myModule; return this; } } private final class MySubComponentImpl implements MySubComponent { private Provider<SubComponentInjectable> subComponentInjectableProvider; private Provider<Map<String, Provider<String>>> mapOfStringAndProviderOfStringProvider; private Provider<Map<String, String>> mapOfStringAndStringProvider; private Provider<Set<String>> setOfStringProvider; private MySubComponentImpl() { initialize(); } private void initialize() { this.subComponentInjectableProvider = MapBugSubcomponent$SubComponentInjectable_Factory.create(DaggerMapBugSubcomponent.this.provideIntProvider); this.mapOfStringAndProviderOfStringProvider = MapProviderFactory.<String, String>builder(1) .put("1", mapOfStringAndProviderOfStringContribution1) .build(); this.mapOfStringAndStringProvider = MapFactory.create(mapOfStringAndProviderOfStringProvider); this.setOfStringProvider = SetFactory.create(setOfStringContribution1Provider); } @Override public SubComponentInjectable getSubInjectable() { return subComponentInjectableProvider.get(); } } } ```
non_priority
provided map and sets will be recreated in subcomponents this component does create extra providers for set and maps in the subcomponent although they are not used if you have many or large provided maps or sets and your subcomponent is used with a scope that is created on request dagger recreates all the maps and sets on each request although they are never injected java javax inject singleton dagger component modules mapbugsubcomponent mymodule class public interface mapbugsubcomponent maininjectable getmaininjectable mysubcomponent getsubcomponent dagger subcomponent interface mysubcomponent subcomponent does not need any provided map and set but dagger generates unused map and set providers for the subcomponent this only happens with map and sets the integer provider from the main component is correctly reused without regeneration in the subcomponent subcomponentinjectable getsubinjectable provides all the objects for maininjectable dagger module class mymodule javax inject singleton dagger provides integer provideint return javax inject singleton dagger provides type dagger provides type set values java util set provideset return java util collections singleton setentry mymapkey javax inject singleton dagger provides type dagger provides type map string provideentry return mapentry dagger mapkey unwrapvalue true public interface mymapkey string value class maininjectable javax inject inject maininjectable java util map map java util set set integer integer class subcomponentinjectable javax inject inject subcomponentinjectable integer integer note that in the generated code private provider mapofstringandproviderofstringprovider private provider mapofstringandstringprovider private provider setofstringprovider are recreated although they are not used in the objects created by the subcomponent also note that provideintprovider that is used in the subcomponent is not recreated and correctly used from the main component java generated dagger internal codegen 
componentprocessor public final class daggermapbugsubcomponent implements mapbugsubcomponent private provider private provider mapofstringandproviderofstringprovider private provider mapofstringandstringprovider private provider private provider setofstringprovider private provider provideintprovider private provider maininjectableprovider private daggermapbugsubcomponent builder builder assert builder null initialize builder public static builder builder return new builder public static mapbugsubcomponent create return builder build private void initialize final builder builder this scopedprovider create mapbugsubcomponent mymodule provideentryfactory create builder mymodule this mapofstringandproviderofstringprovider mapproviderfactory builder put build this mapofstringandstringprovider mapfactory create mapofstringandproviderofstringprovider this scopedprovider create mapbugsubcomponent mymodule providesetfactory create builder mymodule this setofstringprovider setfactory create this provideintprovider scopedprovider create mapbugsubcomponent mymodule provideintfactory create builder mymodule this maininjectableprovider mapbugsubcomponent maininjectable factory create mapofstringandstringprovider setofstringprovider provideintprovider override public maininjectable getmaininjectable return maininjectableprovider get override public mysubcomponent getsubcomponent return new mysubcomponentimpl public static final class builder private mymodule mymodule private builder public mapbugsubcomponent build if mymodule null this mymodule new mymodule return new daggermapbugsubcomponent this public builder mymodule mymodule mymodule if mymodule null throw new nullpointerexception mymodule this mymodule mymodule return this private final class mysubcomponentimpl implements mysubcomponent private provider subcomponentinjectableprovider private provider mapofstringandproviderofstringprovider private provider mapofstringandstringprovider private provider setofstringprovider 
private mysubcomponentimpl initialize private void initialize this subcomponentinjectableprovider mapbugsubcomponent subcomponentinjectable factory create daggermapbugsubcomponent this provideintprovider this mapofstringandproviderofstringprovider mapproviderfactory builder put build this mapofstringandstringprovider mapfactory create mapofstringandproviderofstringprovider this setofstringprovider setfactory create override public subcomponentinjectable getsubinjectable return subcomponentinjectableprovider get
0
373,359
11,042,653,784
IssuesEvent
2019-12-09 09:36:36
xwikisas/application-forum
https://api.github.com/repos/xwikisas/application-forum
closed
Creating and commenting answers events are not displayed properly in Notifications list after Forum Pro is uninstalled
Priority: Minor Type: Bug
STEPS TO REPRODUCE 1. Login as Admin 2. Set Pages Notifications to ON (and Watch the wiki) 3. Create an user (U1) 4. Login with U1 5. Create a forum, add a topic, add an answer on that topic, add a comment to the answer 6. Login with Admin 7. Observe the Notifications list 8. Uninstall Forum Pro 9. Observe the Notifications list EXPECTED RESULTS The name of events related to creating/commenting Answers are displayed in a nice manner (like in Step 7). ACTUAL RESULTS The name of events related to creating/commenting Answers are displayed with many numbers in the title. Notifications list **before** uninstalling Forum Pro (Step 7): ![Answers_Notifications_5b](https://user-images.githubusercontent.com/32512703/64624224-fbb4fe80-d3f2-11e9-99db-2b2f4b3897d8.jpg) Notifications list **after** uninstalling Forum Pro: ![Answers_Notifications_4b](https://user-images.githubusercontent.com/32512703/64624260-0b344780-d3f3-11e9-9829-e6802b1f8f42.jpg) Environment: Windows 10 64bit, Chrome 76, using a local instance of XWiki 11.3.4 on Oracle 12c (Forum Pro v. 2.6.2)
1.0
Creating and commenting answers events are not displayed properly in Notifications list after Forum Pro is uninstalled - STEPS TO REPRODUCE 1. Login as Admin 2. Set Pages Notifications to ON (and Watch the wiki) 3. Create an user (U1) 4. Login with U1 5. Create a forum, add a topic, add an answer on that topic, add a comment to the answer 6. Login with Admin 7. Observe the Notifications list 8. Uninstall Forum Pro 9. Observe the Notifications list EXPECTED RESULTS The name of events related to creating/commenting Answers are displayed in a nice manner (like in Step 7). ACTUAL RESULTS The name of events related to creating/commenting Answers are displayed with many numbers in the title. Notifications list **before** uninstalling Forum Pro (Step 7): ![Answers_Notifications_5b](https://user-images.githubusercontent.com/32512703/64624224-fbb4fe80-d3f2-11e9-99db-2b2f4b3897d8.jpg) Notifications list **after** uninstalling Forum Pro: ![Answers_Notifications_4b](https://user-images.githubusercontent.com/32512703/64624260-0b344780-d3f3-11e9-9829-e6802b1f8f42.jpg) Environment: Windows 10 64bit, Chrome 76, using a local instance of XWiki 11.3.4 on Oracle 12c (Forum Pro v. 2.6.2)
priority
creating and commenting answers events are not displayed properly in notifications list after forum pro is uninstalled steps to reproduce login as admin set pages notifications to on and watch the wiki create an user login with create a forum add a topic add an answer on that topic add a comment to the answer login with admin observe the notifications list uninstall forum pro observe the notifications list expected results the name of events related to creating commenting answers are displayed in a nice manner like in step actual results the name of events related to creating commenting answers are displayed with many numbers in the title notifications list before uninstalling forum pro step notifications list after uninstalling forum pro environment windows chrome using a local instance of xwiki on oracle forum pro v
1
382,484
11,307,153,650
IssuesEvent
2020-01-18 18:57:34
vacuumlabs/fine-tracker
https://api.github.com/repos/vacuumlabs/fine-tracker
opened
Show all shortcuts
:arrow_right: Frontend Priority: Medium Type: Bug :bug:
There are a little more shortcuts now as there were half a year ago. Hence, there is ~10px that I can scroll when I open shortcuts. It is kind of irritating to me :cry: I would probably just make the shortcuts text smaller so that it fits without scrolling.
1.0
Show all shortcuts - There are a little more shortcuts now as there were half a year ago. Hence, there is ~10px that I can scroll when I open shortcuts. It is kind of irritating to me :cry: I would probably just make the shortcuts text smaller so that it fits without scrolling.
priority
show all shortcuts there are a little more shortcuts now as there were half a year ago hence there is that i can scroll when i open shortcuts it is kind of irritating to me cry i would probably just make the shortcuts text smaller so that it fits without scrolling
1
7,423
3,546,065,189
IssuesEvent
2016-01-20 00:15:54
Chantilly612Code/612-2016
https://api.github.com/repos/Chantilly612Code/612-2016
opened
Write Drivetrain DriveJoystick command.
driver controls drivetrain enhancement help wanted robot code
Use the wiki docs for more info. Note: This one is rather hard, compared to the others.
1.0
Write Drivetrain DriveJoystick command. - Use the wiki docs for more info. Note: This one is rather hard, compared to the others.
non_priority
write drivetrain drivejoystick command use the wiki docs for more info note this one is rather hard compared to the others
0
81,878
23,603,969,221
IssuesEvent
2022-08-24 06:28:32
PaddlePaddle/Paddle
https://api.github.com/repos/PaddlePaddle/Paddle
opened
动态图本地模型保存报错,Exception has occurred: ValueError In transformed code
status/new-issue type/build
### 问题描述 Issue Description 指南的模型开发入门的模型保存与载入,推荐使用paddle.jit.save/load(动态图),但是报错 Exception has occurred: ValueError In transformed code: File "XXXXXX.py", line 76, in forward x = F.relu(x) x = self.conv1(x) ~~~~~~~~~~~~~~~~~ <--- HERE x = x + x1 x = F.relu(x) 这是代码原文,那个报错的HERE指的地方没有什么啊, # save path = "H:/Desktop/XXX/XXX/evNet" paddle.jit.save( layer=layer, path=path, input_spec=[InputSpec(shape=[None, 15], dtype='float32')]) 这是保存模型用的代码,也是直接从帮助里复制过来的 ### 版本&环境信息 Version & Environment Information **************************************** Paddle version: 2.3.1 Paddle With CUDA: True OS: Windows 10 Python version: 3.7.13 CUDA version: 11.2.152 Build cuda_11.2.r11.2/compiler.29618528_0 cuDNN version: None.None.None Nvidia driver version: None ****************************************
1.0
动态图本地模型保存报错,Exception has occurred: ValueError In transformed code - ### 问题描述 Issue Description 指南的模型开发入门的模型保存与载入,推荐使用paddle.jit.save/load(动态图),但是报错 Exception has occurred: ValueError In transformed code: File "XXXXXX.py", line 76, in forward x = F.relu(x) x = self.conv1(x) ~~~~~~~~~~~~~~~~~ <--- HERE x = x + x1 x = F.relu(x) 这是代码原文,那个报错的HERE指的地方没有什么啊, # save path = "H:/Desktop/XXX/XXX/evNet" paddle.jit.save( layer=layer, path=path, input_spec=[InputSpec(shape=[None, 15], dtype='float32')]) 这是保存模型用的代码,也是直接从帮助里复制过来的 ### 版本&环境信息 Version & Environment Information **************************************** Paddle version: 2.3.1 Paddle With CUDA: True OS: Windows 10 Python version: 3.7.13 CUDA version: 11.2.152 Build cuda_11.2.r11.2/compiler.29618528_0 cuDNN version: None.None.None Nvidia driver version: None ****************************************
non_priority
动态图本地模型保存报错,exception has occurred valueerror in transformed code 问题描述 issue description 指南的模型开发入门的模型保存与载入,推荐使用paddle jit save load(动态图),但是报错 exception has occurred valueerror in transformed code file xxxxxx py line in forward x f relu x x self x here x x x f relu x 这是代码原文,那个报错的here指的地方没有什么啊, save path h desktop xxx xxx evnet paddle jit save layer layer path path input spec dtype 这是保存模型用的代码,也是直接从帮助里复制过来的 版本 环境信息 version environment information paddle version paddle with cuda true os windows python version cuda version build cuda compiler cudnn version none none none nvidia driver version none
0
133,736
29,509,967,307
IssuesEvent
2023-06-03 20:05:30
SkyEyeWeather/SunBot
https://api.github.com/repos/SkyEyeWeather/SunBot
closed
Transform the `SunController` class into a discord cog
enhancement code-cleaning
## Task description The `SunController` can be seen as a collection of commands and event listeners. This corresponds to the definition of a `commands.Cog` in the discord API. Thus, one improvement for the `SunController` might be to make it inherit from the `commands.Cog` class, as indicated in the [documentation](https://discordpy.readthedocs.io/en/stable/ext/commands/cogs.html). Do that would allow to reduce amount of code in the `main` script. ## Validation criteria * [x] : Inherit `SunController` class from `commands.Cog` and add needed decorator as described in the documentation * [x] : Update the code in the `main.py` python script to take into account `SunController` modification
1.0
Transform the `SunController` class into a discord cog - ## Task description The `SunController` can be seen as a collection of commands and event listeners. This corresponds to the definition of a `commands.Cog` in the discord API. Thus, one improvement for the `SunController` might be to make it inherit from the `commands.Cog` class, as indicated in the [documentation](https://discordpy.readthedocs.io/en/stable/ext/commands/cogs.html). Do that would allow to reduce amount of code in the `main` script. ## Validation criteria * [x] : Inherit `SunController` class from `commands.Cog` and add needed decorator as described in the documentation * [x] : Update the code in the `main.py` python script to take into account `SunController` modification
non_priority
transform the suncontroller class into a discord cog task description the suncontroller can be seen as a collection of commands and event listeners this corresponds to the definition of a commands cog in the discord api thus one improvement for the suncontroller might be to make it inherit from the commands cog class as indicated in the do that would allow to reduce amount of code in the main script validation criteria inherit suncontroller class from commands cog and add needed decorator as described in the documentation update the code in the main py python script to take into account suncontroller modification
0
104,862
16,622,403,003
IssuesEvent
2021-06-03 04:22:01
gms-ws-sandbox/NodeGoat
https://api.github.com/repos/gms-ws-sandbox/NodeGoat
opened
CVE-2019-1010266 (Medium) detected in lodash-4.13.1.tgz, lodash-2.4.2.tgz
security vulnerability
## CVE-2019-1010266 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.13.1.tgz</b>, <b>lodash-2.4.2.tgz</b></p></summary> <p> <details><summary><b>lodash-4.13.1.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.13.1.tgz">https://registry.npmjs.org/lodash/-/lodash-4.13.1.tgz</a></p> <p>Path to dependency file: NodeGoat/package.json</p> <p>Path to vulnerable library: NodeGoat/node_modules/nyc/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - grunt-if-0.2.0.tgz (Root Library) - grunt-contrib-nodeunit-1.0.0.tgz - nodeunit-0.9.5.tgz - tap-7.1.2.tgz - nyc-7.1.0.tgz - istanbul-lib-instrument-1.1.0-alpha.4.tgz - babel-generator-6.11.4.tgz - :x: **lodash-4.13.1.tgz** (Vulnerable Library) </details> <details><summary><b>lodash-2.4.2.tgz</b></p></summary> <p>A utility library delivering consistency, customization, performance, & extras.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz">https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz</a></p> <p>Path to dependency file: NodeGoat/package.json</p> <p>Path to vulnerable library: NodeGoat/node_modules/zaproxy/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - zaproxy-0.2.0.tgz (Root Library) - :x: **lodash-2.4.2.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/gms-ws-sandbox/NodeGoat/commit/c221163763b7f2f5d9c526f553b11a21602caa30">c221163763b7f2f5d9c526f553b11a21602caa30</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. 
The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11. <p>Publish Date: 2019-07-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1010266>CVE-2019-1010266</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266</a></p> <p>Release Date: 2019-07-17</p> <p>Fix Resolution: 4.17.11</p> </p> </details> <p></p> <!-- 
<REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.13.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-if:0.2.0;grunt-contrib-nodeunit:1.0.0;nodeunit:0.9.5;tap:7.1.2;nyc:7.1.0;istanbul-lib-instrument:1.1.0-alpha.4;babel-generator:6.11.4;lodash:4.13.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.17.11"},{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"2.4.2","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"zaproxy:0.2.0;lodash:2.4.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.17.11"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2019-1010266","vulnerabilityDetails":"lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1010266","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2019-1010266 (Medium) detected in lodash-4.13.1.tgz, lodash-2.4.2.tgz - ## CVE-2019-1010266 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.13.1.tgz</b>, <b>lodash-2.4.2.tgz</b></p></summary> <p> <details><summary><b>lodash-4.13.1.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.13.1.tgz">https://registry.npmjs.org/lodash/-/lodash-4.13.1.tgz</a></p> <p>Path to dependency file: NodeGoat/package.json</p> <p>Path to vulnerable library: NodeGoat/node_modules/nyc/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - grunt-if-0.2.0.tgz (Root Library) - grunt-contrib-nodeunit-1.0.0.tgz - nodeunit-0.9.5.tgz - tap-7.1.2.tgz - nyc-7.1.0.tgz - istanbul-lib-instrument-1.1.0-alpha.4.tgz - babel-generator-6.11.4.tgz - :x: **lodash-4.13.1.tgz** (Vulnerable Library) </details> <details><summary><b>lodash-2.4.2.tgz</b></p></summary> <p>A utility library delivering consistency, customization, performance, & extras.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz">https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz</a></p> <p>Path to dependency file: NodeGoat/package.json</p> <p>Path to vulnerable library: NodeGoat/node_modules/zaproxy/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - zaproxy-0.2.0.tgz (Root Library) - :x: **lodash-2.4.2.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/gms-ws-sandbox/NodeGoat/commit/c221163763b7f2f5d9c526f553b11a21602caa30">c221163763b7f2f5d9c526f553b11a21602caa30</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> lodash prior to 4.17.11 
is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11. <p>Publish Date: 2019-07-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1010266>CVE-2019-1010266</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266</a></p> <p>Release Date: 2019-07-17</p> <p>Fix Resolution: 4.17.11</p> </p> </details> <p></p> <!-- 
<REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.13.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-if:0.2.0;grunt-contrib-nodeunit:1.0.0;nodeunit:0.9.5;tap:7.1.2;nyc:7.1.0;istanbul-lib-instrument:1.1.0-alpha.4;babel-generator:6.11.4;lodash:4.13.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.17.11"},{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"2.4.2","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"zaproxy:0.2.0;lodash:2.4.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.17.11"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2019-1010266","vulnerabilityDetails":"lodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-1010266","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_priority
cve medium detected in lodash tgz lodash tgz cve medium severity vulnerability vulnerable libraries lodash tgz lodash tgz lodash tgz lodash modular utilities library home page a href path to dependency file nodegoat package json path to vulnerable library nodegoat node modules nyc node modules lodash package json dependency hierarchy grunt if tgz root library grunt contrib nodeunit tgz nodeunit tgz tap tgz nyc tgz istanbul lib instrument alpha tgz babel generator tgz x lodash tgz vulnerable library lodash tgz a utility library delivering consistency customization performance extras library home page a href path to dependency file nodegoat package json path to vulnerable library nodegoat node modules zaproxy node modules lodash package json dependency hierarchy zaproxy tgz root library x lodash tgz vulnerable library found in head commit a href found in base branch master vulnerability details lodash prior to is affected by cwe uncontrolled resource consumption the impact is denial of service the component is date handler the attack vector is attacker provides very long strings which the library attempts to match using a regular expression the fixed version is publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree grunt if grunt contrib nodeunit nodeunit tap nyc istanbul lib instrument alpha babel generator lodash isminimumfixversionavailable true minimumfixversion packagetype javascript node js packagename lodash packageversion packagefilepaths istransitivedependency true dependencytree zaproxy lodash 
isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails lodash prior to is affected by cwe uncontrolled resource consumption the impact is denial of service the component is date handler the attack vector is attacker provides very long strings which the library attempts to match using a regular expression the fixed version is vulnerabilityurl
0
35,167
4,634,301,087
IssuesEvent
2016-09-29 00:27:00
dotnet/roslyn
https://api.github.com/repos/dotnet/roslyn
closed
Interface with setter and getter inherited from different types can't be retrieved or assigned
Area-Language Design Resolution-By Design
**Version Used**: Microsoft Visual Studio Community 2015: Version 14.0.25431.01 Update 3 Visual C# 2015 00322-20000-00000-AA851 **Steps to Reproduce**: 1. Attempt to compile the following: ``` interface Iterable<out T> { T this[int index] { get; } int Length { get; } } interface NumericMap<in T> { T this[int index] { set; } } interface Array<T> : Iterable<T>, NumericMap<T> { } public class Program { public static void Main(string[] args) { Array<string> items = null; string item = null; items[0] = item; } } ``` **Expected Behavior**: The assignment dispatches to NumericMap[int].set, and the code compiles correctly. **Actual Behavior**: I receive the following compiler error: > CS0121 The call is ambiguous between the following methods or properties: 'Iterable<T>.this[int]' and 'NumericMap<T>.this[int]' on the line where this assignment is performed: items[0] = item; **Notes**: Note that this isn't really specific to indexers, this is just the case I was fiddling with when I encountered the compiler error. You get the same error if you instead inherit `T Foo { get; }` and `T Foo { set; }` from different interfaces. It doesn't seem like there should be any ambiguity here in principle; since I am using the setter, so the call should be dispatched to whatever the implementation of the setter method is. The workarounds I can think of are: - Use casts or assign to a variable of one of the parent interface types, but depending on your use case this can get inconvenient quickly - Use methods, and rely on overload resolution to properly distinguish which interface you're working with
2.0
Interface with setter and getter inherited from different types can't be retrieved or assigned - **Version Used**: Microsoft Visual Studio Community 2015: Version 14.0.25431.01 Update 3 Visual C# 2015 00322-20000-00000-AA851 **Steps to Reproduce**: 1. Attempt to compile the following: ``` interface Iterable<out T> { T this[int index] { get; } int Length { get; } } interface NumericMap<in T> { T this[int index] { set; } } interface Array<T> : Iterable<T>, NumericMap<T> { } public class Program { public static void Main(string[] args) { Array<string> items = null; string item = null; items[0] = item; } } ``` **Expected Behavior**: The assignment dispatches to NumericMap[int].set, and the code compiles correctly. **Actual Behavior**: I receive the following compiler error: > CS0121 The call is ambiguous between the following methods or properties: 'Iterable<T>.this[int]' and 'NumericMap<T>.this[int]' on the line where this assignment is performed: items[0] = item; **Notes**: Note that this isn't really specific to indexers, this is just the case I was fiddling with when I encountered the compiler error. You get the same error if you instead inherit `T Foo { get; }` and `T Foo { set; }` from different interfaces. It doesn't seem like there should be any ambiguity here in principle; since I am using the setter, so the call should be dispatched to whatever the implementation of the setter method is. The workarounds I can think of are: - Use casts or assign to a variable of one of the parent interface types, but depending on your use case this can get inconvenient quickly - Use methods, and rely on overload resolution to properly distinguish which interface you're working with
non_priority
interface with setter and getter inherited from different types can t be retrieved or assigned version used microsoft visual studio community version update visual c steps to reproduce attempt to compile the following interface iterable t this get int length get interface numericmap t this set interface array iterable numericmap public class program public static void main string args array items null string item null items item expected behavior the assignment dispatches to numericmap set and the code compiles correctly actual behavior i receive the following compiler error the call is ambiguous between the following methods or properties iterable this and numericmap this on the line where this assignment is performed items item notes note that this isn t really specific to indexers this is just the case i was fiddling with when i encountered the compiler error you get the same error if you instead inherit t foo get and t foo set from different interfaces it doesn t seem like there should be any ambiguity here in principle since i am using the setter so the call should be dispatched to whatever the implementation of the setter method is the workarounds i can think of are use casts or assign to a variable of one of the parent interface types but depending on your use case this can get inconvenient quickly use methods and rely on overload resolution to properly distinguish which interface you re working with
0
5,149
7,689,143,767
IssuesEvent
2018-05-17 11:47:52
unb-cic-esw/youtube-data-monitor
https://api.github.com/repos/unb-cic-esw/youtube-data-monitor
opened
Informar inexistência de certos dados
functional requirements
EU : desenvolvedor(a) QUERO : Informar a inexistência de determinados dados: * Criar um arquivo .txt informando os canais que não possuem vídeos criados a partir de 01.01.2018. * Nos arquivos youtube.csv informar a inexistência dos dados. PARA : Melhorar a visibilidade das informações obtidas da API. Testes de aceitação: * Criar um arquivo .txt informando os canais que não possuem vídeos criados a partir de 01.01.2018, bem como a criação de testes para validar essa informação; * Nos arquivos youtube.csv informar a inexistência dos dados, bem como a criação de testes para validar essa informação.
1.0
Informar inexistência de certos dados - EU : desenvolvedor(a) QUERO : Informar a inexistência de determinados dados: * Criar um arquivo .txt informando os canais que não possuem vídeos criados a partir de 01.01.2018. * Nos arquivos youtube.csv informar a inexistência dos dados. PARA : Melhorar a visibilidade das informações obtidas da API. Testes de aceitação: * Criar um arquivo .txt informando os canais que não possuem vídeos criados a partir de 01.01.2018, bem como a criação de testes para validar essa informação; * Nos arquivos youtube.csv informar a inexistência dos dados, bem como a criação de testes para validar essa informação.
non_priority
informar inexistência de certos dados eu desenvolvedor a quero informar a inexistência de determinados dados criar um arquivo txt informando os canais que não possuem vídeos criados a partir de nos arquivos youtube csv informar a inexistência dos dados para melhorar a visibilidade das informações obtidas da api testes de aceitação criar um arquivo txt informando os canais que não possuem vídeos criados a partir de bem como a criação de testes para validar essa informação nos arquivos youtube csv informar a inexistência dos dados bem como a criação de testes para validar essa informação
0
707,726
24,316,106,022
IssuesEvent
2022-09-30 06:28:30
kubesphere/kubesphere
https://api.github.com/repos/kubesphere/kubesphere
closed
Proxy connected clusters should not have an update kubeconfig button
kind/bug priority/medium kind/need-to-verify
**Describe the Bug** 1、There is a multi-cluster environment, the mem cluster is a proxy connection, enter the cluster - "cluster settings -" basic information, click management, and display the update kubeconfig button ![image](https://user-images.githubusercontent.com/88183150/190955963-b2c02b56-562a-48b7-9e77-42fc0fd895ab.png)\ **Versions Used** KubeSphere: v3.3.1-rc.2 ks-console: weili520/ks-console:latest /king bug /assign @weili520 /priority medium
1.0
Proxy connected clusters should not have an update kubeconfig button - **Describe the Bug** 1、There is a multi-cluster environment, the mem cluster is a proxy connection, enter the cluster - "cluster settings -" basic information, click management, and display the update kubeconfig button ![image](https://user-images.githubusercontent.com/88183150/190955963-b2c02b56-562a-48b7-9e77-42fc0fd895ab.png)\ **Versions Used** KubeSphere: v3.3.1-rc.2 ks-console: weili520/ks-console:latest /king bug /assign @weili520 /priority medium
priority
proxy connected clusters should not have an update kubeconfig button describe the bug 、there is a multi cluster environment the mem cluster is a proxy connection enter the cluster cluster settings basic information click management and display the update kubeconfig button versions used kubesphere rc ks console ks console latest king bug assign priority medium
1
820,792
30,789,298,361
IssuesEvent
2023-07-31 15:06:56
BenWestgate/Bails
https://api.github.com/repos/BenWestgate/Bails
opened
Harvest entropy from the 'verify download' step
enhancement good first issue priority: low
A variable can store the yes/no choices to the verify procedure for several bits of entropy that can be added to the seed later. Other dialogs that loop themselves if closed with the "X" or Esc key can also append their return codes to this variable. The passphrase from early setup may be used as well but that likely is better to unset and store a KDF hash of instead.
1.0
Harvest entropy from the 'verify download' step - A variable can store the yes/no choices to the verify procedure for several bits of entropy that can be added to the seed later. Other dialogs that loop themselves if closed with the "X" or Esc key can also append their return codes to this variable. The passphrase from early setup may be used as well but that likely is better to unset and store a KDF hash of instead.
priority
harvest entropy from the verify download step a variable can store the yes no choices to the verify procedure for several bits of entropy that can be added to the seed later other dialogs that loop themselves if closed with the x or esc key can also append their return codes to this variable the passphrase from early setup may be used as well but that likely is better to unset and store a kdf hash of instead
1
64,804
7,842,482,335
IssuesEvent
2018-06-18 23:47:35
sul-dlss/SearchWorks
https://api.github.com/repos/sul-dlss/SearchWorks
closed
Set focus outline colour
visual design
Replace **-webkit-focus-ring-color** with **#EAAB00** ``` @mixin tab-focus() { // Default outline: thin dotted; // WebKit outline: 5px auto -webkit-focus-ring-color; outline-offset: -2px; } ```
1.0
Set focus outline colour - Replace **-webkit-focus-ring-color** with **#EAAB00** ``` @mixin tab-focus() { // Default outline: thin dotted; // WebKit outline: 5px auto -webkit-focus-ring-color; outline-offset: -2px; } ```
non_priority
set focus outline colour replace webkit focus ring color with mixin tab focus default outline thin dotted webkit outline auto webkit focus ring color outline offset
0
218,295
16,758,589,798
IssuesEvent
2021-06-13 10:09:32
bounswe/2021SpringGroup6
https://api.github.com/repos/bounswe/2021SpringGroup6
closed
Contribute to the Group Report
Priority: High Status: Complete Type: Documentation
- [x] Fill in the column of my personal effort. - [x] Add the information about a public API that used in the app. - [x] Review the parts ot the other teammates. - [x] Add the codes of back-end API that I created.
1.0
Contribute to the Group Report - - [x] Fill in the column of my personal effort. - [x] Add the information about a public API that used in the app. - [x] Review the parts ot the other teammates. - [x] Add the codes of back-end API that I created.
non_priority
contribute to the group report fill in the column of my personal effort add the information about a public api that used in the app review the parts ot the other teammates add the codes of back end api that i created
0
728,059
25,064,357,532
IssuesEvent
2022-11-07 06:43:43
flant/negentropy
https://api.github.com/repos/flant/negentropy
opened
fix (refactoring) message loop on jwks kafka topic
area/iam area/iam_auth source/security-team priority/backlog type/chore type/refactoring
file: vault-plugins/shared/jwt/kafka/jwks_source.go 1) It seems to be the same handler for messages at usual message loop and restoration loop, but two different functions are written 2) Deletion is processed by wrong way at usual message loop. Now we have no problems due to no deletions are at this topic
1.0
fix (refactoring) message loop on jwks kafka topic - file: vault-plugins/shared/jwt/kafka/jwks_source.go 1) It seems to be the same handler for messages at usual message loop and restoration loop, but two different functions are written 2) Deletion is processed by wrong way at usual message loop. Now we have no problems due to no deletions are at this topic
priority
fix refactoring message loop on jwks kafka topic file vault plugins shared jwt kafka jwks source go it seems to be the same handler for messages at usual message loop and restoration loop but two different functions are written deletion is processed by wrong way at usual message loop now we have no problems due to no deletions are at this topic
1
725,947
24,982,041,850
IssuesEvent
2022-11-02 12:30:35
IDAES/idaes-pse
https://api.github.com/repos/IDAES/idaes-pse
closed
Request for documentation/examples for new ALAMOpy interface.
Priority:Normal
I followed the AlamoPy example at the following link, and I received the error that alamopy has no attribute alamo. https://idaes-pse.readthedocs.io/en/stable/explanations/modeling_extensions/surrogate/alamopy/index.html When I investigated the submodules in alamopy, alamo does not appear either: Found submodule almain (is a package: False) Found submodule almconfidence (is a package: False) Found submodule almexec (is a package: False) Found submodule almlayer (is a package: False) Found submodule almopts (is a package: False) Found submodule almplot (is a package: False) Found submodule almread (is a package: False) Found submodule almutils (is a package: False) Found submodule almwrite (is a package: False) Found submodule example (is a package: False) Found submodule optsTest (is a package: False) Found submodule utilsTest (is a package: False) Do you have an updated example that works with the current version of alamopy which does not have a function called "alamo"?
1.0
Request for documentation/examples for new ALAMOpy interface. - I followed the AlamoPy example at the following link, and I received the error that alamopy has no attribute alamo. https://idaes-pse.readthedocs.io/en/stable/explanations/modeling_extensions/surrogate/alamopy/index.html When I investigated the submodules in alamopy, alamo does not appear either: Found submodule almain (is a package: False) Found submodule almconfidence (is a package: False) Found submodule almexec (is a package: False) Found submodule almlayer (is a package: False) Found submodule almopts (is a package: False) Found submodule almplot (is a package: False) Found submodule almread (is a package: False) Found submodule almutils (is a package: False) Found submodule almwrite (is a package: False) Found submodule example (is a package: False) Found submodule optsTest (is a package: False) Found submodule utilsTest (is a package: False) Do you have an updated example that works with the current version of alamopy which does not have a function called "alamo"?
priority
request for documentation examples for new alamopy interface i followed the alamopy example at the following link and i received the error that alamopy has no attribute alamo when i investigated the submodules in alamopy alamo does not appear either found submodule almain is a package false found submodule almconfidence is a package false found submodule almexec is a package false found submodule almlayer is a package false found submodule almopts is a package false found submodule almplot is a package false found submodule almread is a package false found submodule almutils is a package false found submodule almwrite is a package false found submodule example is a package false found submodule optstest is a package false found submodule utilstest is a package false do you have an updated example that works with the current version of alamopy which does not have a function called alamo
1
185,071
21,785,060,921
IssuesEvent
2022-05-14 02:20:10
Yash-Handa/GitHub-Org-Geographics
https://api.github.com/repos/Yash-Handa/GitHub-Org-Geographics
closed
CVE-2021-23368 (Medium) detected in postcss-7.0.14.tgz - autoclosed
security vulnerability
## CVE-2021-23368 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>postcss-7.0.14.tgz</b></p></summary> <p>Tool for transforming styles with JS plugins</p> <p>Library home page: <a href="https://registry.npmjs.org/postcss/-/postcss-7.0.14.tgz">https://registry.npmjs.org/postcss/-/postcss-7.0.14.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/postcss/package.json</p> <p> Dependency Hierarchy: - build-angular-0.13.8.tgz (Root Library) - :x: **postcss-7.0.14.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Yash-Handa/GitHub-Org-Geographics/commit/8541f31e4773acce1892733eceadb747a827e40f">8541f31e4773acce1892733eceadb747a827e40f</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package postcss from 7.0.0 and before 8.2.10 are vulnerable to Regular Expression Denial of Service (ReDoS) during source map parsing. <p>Publish Date: 2021-04-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23368>CVE-2021-23368</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23368">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23368</a></p> <p>Release Date: 2021-04-12</p> <p>Fix Resolution: postcss -8.2.10</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-23368 (Medium) detected in postcss-7.0.14.tgz - autoclosed - ## CVE-2021-23368 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>postcss-7.0.14.tgz</b></p></summary> <p>Tool for transforming styles with JS plugins</p> <p>Library home page: <a href="https://registry.npmjs.org/postcss/-/postcss-7.0.14.tgz">https://registry.npmjs.org/postcss/-/postcss-7.0.14.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/postcss/package.json</p> <p> Dependency Hierarchy: - build-angular-0.13.8.tgz (Root Library) - :x: **postcss-7.0.14.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Yash-Handa/GitHub-Org-Geographics/commit/8541f31e4773acce1892733eceadb747a827e40f">8541f31e4773acce1892733eceadb747a827e40f</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package postcss from 7.0.0 and before 8.2.10 are vulnerable to Regular Expression Denial of Service (ReDoS) during source map parsing. <p>Publish Date: 2021-04-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23368>CVE-2021-23368</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23368">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23368</a></p> <p>Release Date: 2021-04-12</p> <p>Fix Resolution: postcss -8.2.10</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve medium detected in postcss tgz autoclosed cve medium severity vulnerability vulnerable library postcss tgz tool for transforming styles with js plugins library home page a href path to dependency file package json path to vulnerable library node modules postcss package json dependency hierarchy build angular tgz root library x postcss tgz vulnerable library found in head commit a href vulnerability details the package postcss from and before are vulnerable to regular expression denial of service redos during source map parsing publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution postcss step up your open source security game with whitesource
0
65,515
12,604,755,818
IssuesEvent
2020-06-11 15:26:09
glific/glific-frontend
https://api.github.com/repos/glific/glific-frontend
closed
Convert UI components to functional components
code cleanup
**Describe the task** We have several UI components like Button, Checkbox, Dropdown, TextField that use class-based component definition. For consistency purposes, we need to convert them to functional components. ```js class TextFieldElement extends React.Component<TextFieldElementProps> { ``` **Expected behavior** * All the UI elements are functional components * The functionality works as expected. No breaking changes.
1.0
Convert UI components to functional components - **Describe the task** We have several UI components like Button, Checkbox, Dropdown, TextField that use class-based component definition. For consistency purposes, we need to convert them to functional components. ```js class TextFieldElement extends React.Component<TextFieldElementProps> { ``` **Expected behavior** * All the UI elements are functional components * The functionality works as expected. No breaking changes.
non_priority
convert ui components to functional components describe the task we have several ui components like button checkbox dropdown textfield that use class based component definition for consistency purposes we need to convert them to functional components js class textfieldelement extends react component expected behavior all the ui elements are functional components the functionality works as expected no breaking changes
0
19,563
10,369,104,674
IssuesEvent
2019-09-07 23:03:48
IncPlusPlus/betterstat-server
https://api.github.com/repos/IncPlusPlus/betterstat-server
closed
CVE-2018-14718 (High) detected in jackson-databind-2.9.6.jar
security vulnerability
## CVE-2018-14718 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /betterstat-server/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar</p> <p> Dependency Hierarchy: - geoip2-2.12.0.jar (Root Library) - :x: **jackson-databind-2.9.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/IncPlusPlus/betterstat-server/commit/6c026eb8b67245c860f7cfd3312980b3081c3cbe">6c026eb8b67245c860f7cfd3312980b3081c3cbe</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.7 might allow remote attackers to execute arbitrary code by leveraging failure to block the slf4j-ext class from polymorphic deserialization. <p>Publish Date: 2019-01-02 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-14718>CVE-2018-14718</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-14718">https://nvd.nist.gov/vuln/detail/CVE-2018-14718</a></p> <p>Release Date: 2019-01-02</p> <p>Fix Resolution: 2.9.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-14718 (High) detected in jackson-databind-2.9.6.jar - ## CVE-2018-14718 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /betterstat-server/pom.xml</p> <p>Path to vulnerable library: /root/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar</p> <p> Dependency Hierarchy: - geoip2-2.12.0.jar (Root Library) - :x: **jackson-databind-2.9.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/IncPlusPlus/betterstat-server/commit/6c026eb8b67245c860f7cfd3312980b3081c3cbe">6c026eb8b67245c860f7cfd3312980b3081c3cbe</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.7 might allow remote attackers to execute arbitrary code by leveraging failure to block the slf4j-ext class from polymorphic deserialization. <p>Publish Date: 2019-01-02 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-14718>CVE-2018-14718</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-14718">https://nvd.nist.gov/vuln/detail/CVE-2018-14718</a></p> <p>Release Date: 2019-01-02</p> <p>Fix Resolution: 2.9.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file betterstat server pom xml path to vulnerable library root repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy jar root library x jackson databind jar vulnerable library found in head commit a href vulnerability details fasterxml jackson databind x before might allow remote attackers to execute arbitrary code by leveraging failure to block the ext class from polymorphic deserialization publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
132,813
18,765,210,584
IssuesEvent
2021-11-05 22:23:13
Classroom-Clone/Mobile
https://api.github.com/repos/Classroom-Clone/Mobile
opened
Widok klasy
design new feature
Widok dla ucznia: ![widok_klasy_uczen_glowny](https://user-images.githubusercontent.com/48356242/140584888-2dfe6370-57f8-44eb-90f8-e3c913f091ea.png) ![widok_klasy_uczen_zadania](https://user-images.githubusercontent.com/48356242/140584899-ae26153f-a2f7-40cc-ab02-611c6d24a6f6.png) ![widok_klasy_uczen_osoby](https://user-images.githubusercontent.com/48356242/140584906-76fb9514-123d-4f55-9ac4-fb1a0c351454.png) Widok dla nauczyciela: ![widok_klasy_nauczyciel_glowny](https://user-images.githubusercontent.com/48356242/140584963-a61601a6-9964-40f3-af2f-dbeedcbc5aa6.png) ![widok_klasy_nauczyciel_zadania](https://user-images.githubusercontent.com/48356242/140584972-ea2941ef-e416-4581-b5be-a4a9b70378bd.png) ![widok_klasy_nauczyciel_osoby](https://user-images.githubusercontent.com/48356242/140584980-4cc1b568-d10f-4966-a92a-ce58d7da0432.png)
1.0
Widok klasy - Widok dla ucznia: ![widok_klasy_uczen_glowny](https://user-images.githubusercontent.com/48356242/140584888-2dfe6370-57f8-44eb-90f8-e3c913f091ea.png) ![widok_klasy_uczen_zadania](https://user-images.githubusercontent.com/48356242/140584899-ae26153f-a2f7-40cc-ab02-611c6d24a6f6.png) ![widok_klasy_uczen_osoby](https://user-images.githubusercontent.com/48356242/140584906-76fb9514-123d-4f55-9ac4-fb1a0c351454.png) Widok dla nauczyciela: ![widok_klasy_nauczyciel_glowny](https://user-images.githubusercontent.com/48356242/140584963-a61601a6-9964-40f3-af2f-dbeedcbc5aa6.png) ![widok_klasy_nauczyciel_zadania](https://user-images.githubusercontent.com/48356242/140584972-ea2941ef-e416-4581-b5be-a4a9b70378bd.png) ![widok_klasy_nauczyciel_osoby](https://user-images.githubusercontent.com/48356242/140584980-4cc1b568-d10f-4966-a92a-ce58d7da0432.png)
non_priority
widok klasy widok dla ucznia widok dla nauczyciela
0
174,682
21,300,341,118
IssuesEvent
2022-04-15 01:39:35
MidnightBSD/midnightbsd-app-store
https://api.github.com/repos/MidnightBSD/midnightbsd-app-store
opened
CVE-2021-43138 (High) detected in multiple libraries
security vulnerability
## CVE-2021-43138 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>async-0.1.22.tgz</b>, <b>async-0.2.10.tgz</b>, <b>async-0.1.15.tgz</b></p></summary> <p> <details><summary><b>async-0.1.22.tgz</b></p></summary> <p>Higher-order functions and common patterns for asynchronous code</p> <p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.1.22.tgz">https://registry.npmjs.org/async/-/async-0.1.22.tgz</a></p> <p>Path to dependency file: /src/main/resources/static/components/angular-ui-bootstrap/package.json</p> <p>Path to vulnerable library: /src/main/resources/static/components/angular-ui-bootstrap/node_modules/async/package.json,/src/main/resources/static/components/angular-ui/node_modules/async/package.json</p> <p> Dependency Hierarchy: - :x: **async-0.1.22.tgz** (Vulnerable Library) </details> <details><summary><b>async-0.2.10.tgz</b></p></summary> <p>Higher-order functions and common patterns for asynchronous code</p> <p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.2.10.tgz">https://registry.npmjs.org/async/-/async-0.2.10.tgz</a></p> <p>Path to dependency file: /src/main/resources/static/components/angular-ui/package.json</p> <p>Path to vulnerable library: /src/main/resources/static/components/angular-ui/node_modules/uglify-js/node_modules/async/package.json,/src/main/resources/static/components/angular-ui-bootstrap/node_modules/utile/node_modules/async/package.json,/src/main/resources/static/components/angular-ui-bootstrap/node_modules/uglify-js/node_modules/async/package.json,/src/main/resources/static/components/angular-ui-bootstrap/node_modules/handlebars/node_modules/async/package.json,/src/main/resources/static/components/angular-ui-bootstrap/node_modules/istanbul/node_modules/async/package.json</p> <p> Dependency Hierarchy: - grunt-contrib-uglify-0.3.3.tgz (Root Library) - uglify-js-2.4.24.tgz - :x: **async-0.2.10.tgz** (Vulnerable Library) </details> <details><summary><b>async-0.1.15.tgz</b></p></summary> <p>Higher-order functions and common patterns for asynchronous code</p> <p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.1.15.tgz">https://registry.npmjs.org/async/-/async-0.1.15.tgz</a></p> <p>Path to dependency file: /src/main/resources/static/components/angular-ui/package.json</p> <p>Path to vulnerable library: /src/main/resources/static/components/angular-ui/node_modules/log4js/node_modules/async/package.json</p> <p> Dependency Hierarchy: - testacular-0.5.11.tgz (Root Library) - log4js-0.5.6.tgz - :x: **async-0.1.15.tgz** (Vulnerable Library) </details> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A vulnerability exists in Async through 3.2.1 (fixed in 3.2.2) , which could let a malicious user obtain privileges via the mapValues() method. <p>Publish Date: 2022-04-06 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p> <p>Release Date: 2022-04-06</p> <p>Fix Resolution (async): 3.2.2</p> <p>Direct dependency fix Resolution (grunt-contrib-uglify): 0.4.0</p><p>Fix Resolution (async): 3.2.2</p> <p>Direct dependency fix Resolution (testacular): 0.6.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-43138 (High) detected in multiple libraries - ## CVE-2021-43138 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>async-0.1.22.tgz</b>, <b>async-0.2.10.tgz</b>, <b>async-0.1.15.tgz</b></p></summary> <p> <details><summary><b>async-0.1.22.tgz</b></p></summary> <p>Higher-order functions and common patterns for asynchronous code</p> <p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.1.22.tgz">https://registry.npmjs.org/async/-/async-0.1.22.tgz</a></p> <p>Path to dependency file: /src/main/resources/static/components/angular-ui-bootstrap/package.json</p> <p>Path to vulnerable library: /src/main/resources/static/components/angular-ui-bootstrap/node_modules/async/package.json,/src/main/resources/static/components/angular-ui/node_modules/async/package.json</p> <p> Dependency Hierarchy: - :x: **async-0.1.22.tgz** (Vulnerable Library) </details> <details><summary><b>async-0.2.10.tgz</b></p></summary> <p>Higher-order functions and common patterns for asynchronous code</p> <p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.2.10.tgz">https://registry.npmjs.org/async/-/async-0.2.10.tgz</a></p> <p>Path to dependency file: /src/main/resources/static/components/angular-ui/package.json</p> <p>Path to vulnerable library: /src/main/resources/static/components/angular-ui/node_modules/uglify-js/node_modules/async/package.json,/src/main/resources/static/components/angular-ui-bootstrap/node_modules/utile/node_modules/async/package.json,/src/main/resources/static/components/angular-ui-bootstrap/node_modules/uglify-js/node_modules/async/package.json,/src/main/resources/static/components/angular-ui-bootstrap/node_modules/handlebars/node_modules/async/package.json,/src/main/resources/static/components/angular-ui-bootstrap/node_modules/istanbul/node_modules/async/package.json</p> <p> Dependency Hierarchy: - grunt-contrib-uglify-0.3.3.tgz (Root Library) - uglify-js-2.4.24.tgz - :x: **async-0.2.10.tgz** (Vulnerable Library) </details> <details><summary><b>async-0.1.15.tgz</b></p></summary> <p>Higher-order functions and common patterns for asynchronous code</p> <p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.1.15.tgz">https://registry.npmjs.org/async/-/async-0.1.15.tgz</a></p> <p>Path to dependency file: /src/main/resources/static/components/angular-ui/package.json</p> <p>Path to vulnerable library: /src/main/resources/static/components/angular-ui/node_modules/log4js/node_modules/async/package.json</p> <p> Dependency Hierarchy: - testacular-0.5.11.tgz (Root Library) - log4js-0.5.6.tgz - :x: **async-0.1.15.tgz** (Vulnerable Library) </details> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A vulnerability exists in Async through 3.2.1 (fixed in 3.2.2) , which could let a malicious user obtain privileges via the mapValues() method. <p>Publish Date: 2022-04-06 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p> <p>Release Date: 2022-04-06</p> <p>Fix Resolution (async): 3.2.2</p> <p>Direct dependency fix Resolution (grunt-contrib-uglify): 0.4.0</p><p>Fix Resolution (async): 3.2.2</p> <p>Direct dependency fix Resolution (testacular): 0.6.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_priority
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries async tgz async tgz async tgz async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file src main resources static components angular ui bootstrap package json path to vulnerable library src main resources static components angular ui bootstrap node modules async package json src main resources static components angular ui node modules async package json dependency hierarchy x async tgz vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file src main resources static components angular ui package json path to vulnerable library src main resources static components angular ui node modules uglify js node modules async package json src main resources static components angular ui bootstrap node modules utile node modules async package json src main resources static components angular ui bootstrap node modules uglify js node modules async package json src main resources static components angular ui bootstrap node modules handlebars node modules async package json src main resources static components angular ui bootstrap node modules istanbul node modules async package json dependency hierarchy grunt contrib uglify tgz root library uglify js tgz x async tgz vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file src main resources static components angular ui package json path to vulnerable library src main resources static components angular ui node modules node modules async package json dependency hierarchy testacular tgz root library tgz x async tgz vulnerable library found in base branch master vulnerability details a vulnerability exists in async through fixed in which could let a malicious user obtain privileges via the mapvalues method 
publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution async direct dependency fix resolution grunt contrib uglify fix resolution async direct dependency fix resolution testacular step up your open source security game with whitesource
0
268,178
20,261,020,536
IssuesEvent
2022-02-15 07:25:57
LukeShortCloud/winesapOS
https://api.github.com/repos/LukeShortCloud/winesapOS
opened
Add a readme file to the desktop
documentation
Thank users for trying out winesapOS, link to GitHub for bug and feature reports, and explain the usage of all of the software / desktop shortcuts
1.0
Add a readme file to the desktop - Thank users for trying out winesapOS, link to GitHub for bug and feature reports, and explain the usage of all of the software / desktop shortcuts
non_priority
add a readme file to the desktop thank users for trying out winesapos link to github for bug and feature reports and explain the usage of all of the software desktop shortcuts
0
17,855
2,615,157,610
IssuesEvent
2015-03-01 06:35:56
chrsmith/html5rocks
https://api.github.com/repos/chrsmith/html5rocks
closed
Pressing cursor keys twice causes JavaScript exception and breaks presentation
auto-migrated Priority-P2 Type-Bug
``` 1. Goto any slide. 2. Press left or right mouse button twice quickly (before the page has fully transitioned to the next). 3. Presentation is now broken: pressing mouse cursors no longer changes visible page, but some remnants of the page transitions are still visible behind the current page. (You may have to try a few times to get the timing right). During step 2, a JavaScript error shows up in the console: Uncaught TypeError: Cannot call method 'buildNext' of undefined SlideShow.next/#slide2:2907 SlideShow.handleKeys/#slide2:2963 SlideShow/#slide2:2873 Tested with Google Chrome 8.0.552.224 ``` Original issue reported on code.google.com by `skylined@chromium.org` on 12 Jan 2011 at 8:42
1.0
Pressing cursor keys twice causes JavaScript exception and breaks presentation - ``` 1. Goto any slide. 2. Press left or right mouse button twice quickly (before the page has fully transitioned to the next). 3. Presentation is now broken: pressing mouse cursors no longer changes visible page, but some remnants of the page transitions are still visible behind the current page. (You may have to try a few times to get the timing right). During step 2, a JavaScript error shows up in the console: Uncaught TypeError: Cannot call method 'buildNext' of undefined SlideShow.next/#slide2:2907 SlideShow.handleKeys/#slide2:2963 SlideShow/#slide2:2873 Tested with Google Chrome 8.0.552.224 ``` Original issue reported on code.google.com by `skylined@chromium.org` on 12 Jan 2011 at 8:42
priority
pressing cursor keys twice causes javascript exception and breaks presentation goto any slide press left or right mouse button twice quickly before the page has fully transitioned to the next presentation is now broken pressing mouse cursors no longer changes visible page but some remnants of the page transitions are still visible behind the current page you may have to try a few times to get the timing right during step a javascript error shows up in the console uncaught typeerror cannot call method buildnext of undefined slideshow next slideshow handlekeys slideshow tested with google chrome original issue reported on code google com by skylined chromium org on jan at
1
338,834
24,601,038,694
IssuesEvent
2022-10-14 12:32:12
voxel51/fiftyone
https://api.github.com/repos/voxel51/fiftyone
opened
Interactive Plots: open fifty open gui in separate jupyter lab tab?
documentation
Hello, I'm also using jupyterlab and I'm interested in opening the "fiftyone"-GUI in a separate "jupyter lab" tab as visualized here: https://voxel51.com/docs/fiftyone/user_guide/plots.html#view-plots There it seems to be a separate jupyter lab tab? I only know #session.open_tab(), but this opens a separate browser tab instead of a jupyter lab tab and I really want to see both elements side by side? Thank you
1.0
Interactive Plots: open fifty open gui in separate jupyter lab tab? - Hello, I'm also using jupyterlab and I'm interested in opening the "fiftyone"-GUI in a separate "jupyter lab" tab as visualized here: https://voxel51.com/docs/fiftyone/user_guide/plots.html#view-plots There it seems to be a separate jupyter lab tab? I only know #session.open_tab(), but this opens a separate browser tab instead of a jupyter lab tab and I really want to see both elements side by side? Thank you
non_priority
interactive plots open fifty open gui in separate jupyter lab tab hello i m also using jupyterlab and i m interested in opening the fiftyone gui in a separate jupyter lab tab as visualized here there it seems to be a separate jupyter lab tab i only know session open tab but this opens a separate browser tab instead of a jupyter lab tab and i really want to see both elements side by side thank you
0
29,941
4,542,177,051
IssuesEvent
2016-09-09 20:18:19
18F/doi-extractives-data
https://api.github.com/repos/18F/doi-extractives-data
closed
Bug: national revenue charts, asphalt has 'all's data
type:bug workflow:testing
Looks like we might be chunked over one for our charts. All commodities should have the data that's coming up for asphalt. Maybe all our data is moved over one slot? ![screenshot 2016-09-08 12 35 37](https://cloud.githubusercontent.com/assets/4827522/18363911/fbf6b250-75c0-11e6-98c8-5d588a4202d3.png)
1.0
Bug: national revenue charts, asphalt has 'all's data - Looks like we might be chunked over one for our charts. All commodities should have the data that's coming up for asphalt. Maybe all our data is moved over one slot? ![screenshot 2016-09-08 12 35 37](https://cloud.githubusercontent.com/assets/4827522/18363911/fbf6b250-75c0-11e6-98c8-5d588a4202d3.png)
non_priority
bug national revenue charts asphalt has all s data looks like we might be chunked over one for our charts all commodities should have the data that s coming up for asphalt maybe all our data is moved over one slot
0
66,311
20,147,663,451
IssuesEvent
2022-02-09 09:17:33
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
closed
Thread view shows me the wrong room
T-Defect
### Steps to reproduce I just stumbled upon that. 1. I have threads in rooms A and B. 2. I recently (2h ago) wrote something in a thread on room A. 3. Last time I wrote something in a thread in room B was yesterday and I have restarted my browser since then. 4. I just switched to room B and clicked on the *Threads* icon (top right of the tab). 5. This showed me my thread from room A instead. ### Outcome I expected my threads from room B. ### Operating system Ubuntu ### Browser information Firefox 96.0 (64-bit) ### URL for webapp develop.element.io ### Application version Element version: 64242a004eb7-react-4501308b4722-js-07171a95c4f0 Olm version: 3.2.8 ### Homeserver matrix.org ### Will you send logs? Yes
1.0
Thread view shows me the wrong room - ### Steps to reproduce I just stumbled upon that. 1. I have threads in rooms A and B. 2. I recently (2h ago) wrote something in a thread on room A. 3. Last time I wrote something in a thread in room B was yesterday and I have restarted my browser since then. 4. I just switched to room B and clicked on the *Threads* icon (top right of the tab). 5. This showed me my thread from room A instead. ### Outcome I expected my threads from room B. ### Operating system Ubuntu ### Browser information Firefox 96.0 (64-bit) ### URL for webapp develop.element.io ### Application version Element version: 64242a004eb7-react-4501308b4722-js-07171a95c4f0 Olm version: 3.2.8 ### Homeserver matrix.org ### Will you send logs? Yes
non_priority
thread view shows me the wrong room steps to reproduce i just stumbled upon that i have threads in rooms a and b i recently ago wrote something in a thread on room a last time i wrote something in a thread in room b was yesterday and i have restarted my browser since then i just switched to room b and clicked on the threads icon top right of the tab this showed me my thread from room a instead outcome i expected my threads from room b operating system ubuntu browser information firefox bit url for webapp develop element io application version element version react js olm version homeserver matrix org will you send logs yes
0
166,718
6,310,468,496
IssuesEvent
2017-07-23 10:41:15
oSoc17/rideaway-data
https://api.github.com/repos/oSoc17/rideaway-data
opened
Put Django in production mode
Priority: High Status: Confirmed
Django should be placed in production mode to prevent debug information to be shown to the public.
1.0
Put Django in production mode - Django should be placed in production mode to prevent debug information to be shown to the public.
priority
put django in production mode django should be placed in production mode to prevent debug information to be shown to the public
1
274,803
8,568,044,901
IssuesEvent
2018-11-10 17:44:20
CS2113-AY1819S1-T13-1/main
https://api.github.com/repos/CS2113-AY1819S1-T13-1/main
closed
Can updateTask with invalid date
priority.High type.bug
**Describe the bug** User is allowed to update an invalid date **To Reproduce** Steps to reproduce the behavior: 1. Key in "updateTask 1 s/00/00" 2. See error **Expected behavior** An error message should be shown when invalid dates are inputted. **Screenshots** ![screenshot 8](https://user-images.githubusercontent.com/35787585/47906480-4bb66100-dec4-11e8-8e1c-d03393723338.png) **Additional context** <hr> **Reported by:** @kelvintankaiboon **Severity:** `Low` <sub>[original: nusCS2113-AY1819S1/pe-1#244]</sub>
1.0
Can updateTask with invalid date - **Describe the bug** User is allowed to update an invalid date **To Reproduce** Steps to reproduce the behavior: 1. Key in "updateTask 1 s/00/00" 2. See error **Expected behavior** An error message should be shown when invalid dates are inputted. **Screenshots** ![screenshot 8](https://user-images.githubusercontent.com/35787585/47906480-4bb66100-dec4-11e8-8e1c-d03393723338.png) **Additional context** <hr> **Reported by:** @kelvintankaiboon **Severity:** `Low` <sub>[original: nusCS2113-AY1819S1/pe-1#244]</sub>
priority
can updatetask with invalid date describe the bug user is allowed to update an invalid date to reproduce steps to reproduce the behavior key in updatetask s see error expected behavior an error message should be shown when invalid dates are inputted screenshots additional context reported by kelvintankaiboon severity low
1
50,628
3,006,520,741
IssuesEvent
2015-07-27 10:58:27
Itseez/opencv
https://api.github.com/repos/Itseez/opencv
opened
Create a module to read/write AVIs using Directshow instead of VFW
auto-transferred category: highgui-video feature priority: normal
Transferred from http://code.opencv.org/issues/4075 ``` || Christophe Caltagirone on 2014-12-18 21:59 || Priority: Normal || Affected: None || Category: highgui-video || Tracker: Feature || Difficulty: || PR: || Platform: None / None ``` Create a module to read/write AVIs using Directshow instead of VFW ----------- ``` The current OpenCV implementation for reading and writing AVIs is based on Video for Windows, which does not support files over 2GB. It would be nice to create a capture plugin to read/write AVIs based on DirectShow, which supports OpenDML AVI formats allowing bigger files. ``` History -------
1.0
Create a module to read/write AVIs using Directshow instead of VFW - Transferred from http://code.opencv.org/issues/4075 ``` || Christophe Caltagirone on 2014-12-18 21:59 || Priority: Normal || Affected: None || Category: highgui-video || Tracker: Feature || Difficulty: || PR: || Platform: None / None ``` Create a module to read/write AVIs using Directshow instead of VFW ----------- ``` The current OpenCV implementation for reading and writing AVIs is based on Video for Windows, which does not support files over 2GB. It would be nice to create a capture plugin to read/write AVIs based on DirectShow, which supports OpenDML AVI formats allowing bigger files. ``` History -------
priority
create a module to read write avis using directshow instead of vfw transferred from christophe caltagirone on priority normal affected none category highgui video tracker feature difficulty pr platform none none create a module to read write avis using directshow instead of vfw the current opencv implementation for reading and writing avis is based on video for windows which does not support files over it would be nice to create a capture plugin to read write avis based on directshow which supports opendml avi formats allowing bigger files history
1
462,218
13,243,028,618
IssuesEvent
2020-08-19 10:43:20
magento/adobe-stock-integration
https://api.github.com/repos/magento/adobe-stock-integration
closed
The deleted tags are not removed from Tags drop-down until the web page is refreshed
Priority: P2 Progress: dev in progress Severity: S4 bug
### Preconditions 1. Install Magento with Adobe Stock Integration 2. Configured integration in Stores -> Configuration -> Advanced-> System -> Adobe Stock Integration fieldset 3. Enhanced Media Gallery Enabled 4. Uploaded image ### Steps to reproduce (*) 1. Select the uploaded image by clicking on it 2. Click "three dots", select **Edit** 3. Add three new **Tags**, for example. Click **Save** 4. Click "three dots" again, and select **Edit** 5. Remove the previously added Tags by clicking on the cross (x) near them 6. Click **Save** 7. Click "three dots" again, and select **Edit** ### Expected result (*) The Tags drop-down is empty ![tags3](https://user-images.githubusercontent.com/45624059/89173891-9bd8c080-d58d-11ea-93cc-cb1a175f2b93.png) ### Actual result (*) The Tags drop-down still contains the previously deleted tags ![tags4](https://user-images.githubusercontent.com/45624059/89173593-2ec52b00-d58d-11ea-865b-ba6cd837df2f.png) ### Note The tags are removed from View Details and will disappear after the web page is refreshed. It's just might be confusing for the user. Besides, the previously deleted Tags will be saved to Image Details if click Save after step number 7 **(The Tags are not removed from Database)**
1.0
The deleted tags are not removed from Tags drop-down until the web page is refreshed - ### Preconditions 1. Install Magento with Adobe Stock Integration 2. Configured integration in Stores -> Configuration -> Advanced-> System -> Adobe Stock Integration fieldset 3. Enhanced Media Gallery Enabled 4. Uploaded image ### Steps to reproduce (*) 1. Select the uploaded image by clicking on it 2. Click "three dots", select **Edit** 3. Add three new **Tags**, for example. Click **Save** 4. Click "three dots" again, and select **Edit** 5. Remove the previously added Tags by clicking on the cross (x) near them 6. Click **Save** 7. Click "three dots" again, and select **Edit** ### Expected result (*) The Tags drop-down is empty ![tags3](https://user-images.githubusercontent.com/45624059/89173891-9bd8c080-d58d-11ea-93cc-cb1a175f2b93.png) ### Actual result (*) The Tags drop-down still contains the previously deleted tags ![tags4](https://user-images.githubusercontent.com/45624059/89173593-2ec52b00-d58d-11ea-865b-ba6cd837df2f.png) ### Note The tags are removed from View Details and will disappear after the web page is refreshed. It's just might be confusing for the user. Besides, the previously deleted Tags will be saved to Image Details if click Save after step number 7 **(The Tags are not removed from Database)**
priority
the deleted tags are not removed from tags drop down until the web page is refreshed preconditions install magento with adobe stock integration configured integration in stores configuration advanced system adobe stock integration fieldset enhanced media gallery enabled uploaded image steps to reproduce select the uploaded image by clicking on it click three dots select edit add three new tags for example click save click three dots again and select edit remove the previously added tags by clicking on the cross x near them click save click three dots again and select edit expected result the tags drop down is empty actual result the tags drop down still contains the previously deleted tags note the tags are removed from view details and will disappear after the web page is refreshed it s just might be confusing for the user besides the previously deleted tags will be saved to image details if click save after step number the tags are not removed from database
1
494,172
14,246,422,792
IssuesEvent
2020-11-19 10:02:54
r-lib/styler
https://api.github.com/repos/r-lib/styler
closed
dplyr pipe without brackets fails
Complexity: Medium Priority: High Status: Unassigned Type: Bug
The following fails ``` style_text('x %>% View') style_text('x %>% unique %>% View()') ``` with ``` Error: Columns `token_before`, `token_after`, `spaces`, `indention_ref_pos_id`, `indent`, `stylerignore`, `block`, `is_cached` must be length 2, not 1, 1, 1, 1, 1, 1, 1, 1 ``` adding "()" solves the error ``` style_text('x %>% View()') style_text('x %>% unique() %>% View()') ``` Regards
1.0
dplyr pipe without brackets fails - The following fails ``` style_text('x %>% View') style_text('x %>% unique %>% View()') ``` with ``` Error: Columns `token_before`, `token_after`, `spaces`, `indention_ref_pos_id`, `indent`, `stylerignore`, `block`, `is_cached` must be length 2, not 1, 1, 1, 1, 1, 1, 1, 1 ``` adding "()" solves the error ``` style_text('x %>% View()') style_text('x %>% unique() %>% View()') ``` Regards
priority
dplyr pipe without brackets fails the following fails style text x view style text x unique view with error columns token before token after spaces indention ref pos id indent stylerignore block is cached must be length not adding solves the error style text x view style text x unique view regards
1