**Dataset schema** (15 columns):

| column | dtype | range / distinct values |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 distinct value |
| created_at | string | lengths 19 – 19 |
| repo | string | lengths 7 – 112 |
| repo_url | string | lengths 36 – 141 |
| action | string | 3 distinct values |
| title | string | lengths 1 – 744 |
| labels | string | lengths 4 – 574 |
| body | string | lengths 9 – 211k |
| index | string | 10 distinct values |
| text_combine | string | lengths 96 – 211k |
| label | string | 2 distinct values |
| text | string | lengths 96 – 188k |
| binary_label | int64 | 0 – 1 |
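The last four columns look derived: `text_combine` concatenates `title` and `body`, `text` is a normalized form of `text_combine` (lowercased, `[bracketed]` tags and URLs removed, punctuation turned into spaces, digit-bearing tokens dropped), and `binary_label` encodes `label` (`process` → 1, `non_process` → 0). A plausible reconstruction of that preprocessing, inferred from the sample rows below rather than taken from any published pipeline (the exact unicode handling evidently varies — e.g. row 20,443 keeps the curly apostrophe in `d’erreur`):

```python
import re

def normalize(text_combine: str) -> str:
    """Approximate the `text` column from `text_combine`.

    Inferred from the sample rows: strip [bracketed] tags and bare URLs,
    lowercase, turn punctuation into spaces, and drop any token containing
    a digit (so handles like @jhux2 disappear entirely).
    """
    s = re.sub(r"\[.*?\]", " ", text_combine)   # [Docs], [GCI], [internal link]
    s = re.sub(r"https?://\S+", " ", s)         # bare URLs
    s = re.sub(r"[^\w\s]", " ", s.lower())      # punctuation -> space
    tokens = [t for t in s.split() if not re.search(r"\d", t)]
    return " ".join(tokens)

def binary_label(label: str) -> int:
    """`binary_label` is 1 for process-related issues, 0 otherwise."""
    return 1 if label == "process" else 0
```

Applied to row 20,806 below, `normalize` reproduces its `text` value exactly, and the `label`/`binary_label` pairing holds for every row shown.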
---

**Row 20,443**
- **id:** 27,100,574,062
- **type:** IssuesEvent
- **created_at:** 2023-02-15 08:19:04
- **repo:** billingran/Newsletter
- **repo_url:** https://api.github.com/repos/billingran/Newsletter
- **action:** closed
- **title:** Afficher un message d’erreur compréhensible pour l'inscription avec la même adresse email. ("Display an understandable error message when signing up with an already-used email address.")
- **labels:** processing... Brief 2
- **body:** - [ ] Afficher un message d’erreur compréhensible pour l'inscription avec la même adresse email.
- **index:** 1.0
- **text_combine:** Afficher un message d’erreur compréhensible pour l'inscription avec la même adresse email. - - [ ] Afficher un message d’erreur compréhensible pour l'inscription avec la même adresse email.
- **label:** process
- **text:** afficher un message d’erreur compréhensible pour l inscription avec la même adresse email afficher un message d’erreur compréhensible pour l inscription avec la même adresse email
- **binary_label:** 1
---

**Row 41,058**
- **id:** 16,617,283,638
- **type:** IssuesEvent
- **created_at:** 2021-06-02 18:26:11
- **repo:** microsoft/botframework-solutions
- **repo_url:** https://api.github.com/repos/microsoft/botframework-solutions
- **action:** closed
- **title:** [Docs] Missing " in dispatch add --type "qna" --name "kb-name `
- **labels:** Bot Services Type: Docs customer-replied-to customer-reported
- **body:** **Missing " in dispatch add --type "qna" --name "kb-name `**
  **https://microsoft.github.io/botframework-solutions/virtual-assistant/tutorials/deploy-assistant/cli/7-create-dispatch-model/**
- **index:** 1.0
- **text_combine:** [Docs] Missing " in dispatch add --type "qna" --name "kb-name ` - **Missing " in dispatch add --type "qna" --name "kb-name `**
  **https://microsoft.github.io/botframework-solutions/virtual-assistant/tutorials/deploy-assistant/cli/7-create-dispatch-model/**
- **label:** non_process
- **text:** missing in dispatch add type qna name kb name missing in dispatch add type qna name kb name
- **binary_label:** 0
---

**Row 20,806**
- **id:** 27,566,827,980
- **type:** IssuesEvent
- **created_at:** 2023-03-08 05:01:55
- **repo:** bazelbuild/bazel
- **repo_url:** https://api.github.com/repos/bazelbuild/bazel
- **action:** reopened
- **title:** Clarify the status of ctx.tokenize
- **labels:** P4 type: process untriaged stale team-Rules-API
- **body:** It's not documented, but it's still present.
  cc @laszlocsomor
- **index:** 1.0
- **text_combine:** Clarify the status of ctx.tokenize - It's not documented, but it's still present.
  cc @laszlocsomor
- **label:** process
- **text:** clarify the status of ctx tokenize it s not documented but it s still present cc laszlocsomor
- **binary_label:** 1
---

**Row 273,828**
- **id:** 23,788,438,553
- **type:** IssuesEvent
- **created_at:** 2022-09-02 12:27:05
- **repo:** Exawind/nalu-wind
- **repo_url:** https://api.github.com/repos/Exawind/nalu-wind
- **action:** closed
- **title:** edgeHybridFluids test diffing
- **labels:** failing-tests
- **body:** The edgeHybridFluids test is diffing on the SNL nightly dashboard ([internal link]()) since Aug. 25 at 10pm MDT. The diff seems to not be large, and the fact that it doesn't appear on the NREL dashboard implies that either something has changed slightly in Trilinos develop or it's system noise on the SNL machine.
  @jhux2 were there any Trilinos solver changes of note around that time?
  @psakievich who owns this test and can confirm whether a rebless is appropriate?
- **index:** 1.0
- **text_combine:** edgeHybridFluids test diffing - The edgeHybridFluids test is diffing on the SNL nightly dashboard ([internal link]()) since Aug. 25 at 10pm MDT. The diff seems to not be large, and the fact that it doesn't appear on the NREL dashboard implies that either something has changed slightly in Trilinos develop or it's system noise on the SNL machine.
  @jhux2 were there any Trilinos solver changes of note around that time?
  @psakievich who owns this test and can confirm whether a rebless is appropriate?
- **label:** non_process
- **text:** edgehybridfluids test diffing the edgehybridfluids test is diffing on the snl nightly dashboard since aug at mdt the diff seems to not be large and the fact that it doesn t appear on the nrel dashboard implies that either something has changed slightly in trilinos develop or it s system noise on the snl machine were there any trilinos solver changes of note around that time psakievich who owns this test and can confirm whether a rebless is appropriate
- **binary_label:** 0
---

**Row 19,286**
- **id:** 25,466,249,303
- **type:** IssuesEvent
- **created_at:** 2022-11-25 04:53:40
- **repo:** GoogleCloudPlatform/fda-mystudies
- **repo_url:** https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
- **action:** closed
- **title:** [GCI] [PM] UI issue in the sign in screen when entered invalid credentials
- **labels:** Bug P0 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
- **body:** UI issue is observed in the sign in screen when entered invalid credentials.
- **index:** 3.0
- **text_combine:** [GCI] [PM] UI issue in the sign in screen when entered invalid credentials - UI issue is observed in the sign in screen when entered invalid credentials.
- **label:** process
- **text:** ui issue in the sign in screen when entered invalid credentials ui issue is observed in the sign in screen when entered invalid credentials
- **binary_label:** 1
---

**Row 20,976**
- **id:** 27,829,066,721
- **type:** IssuesEvent
- **created_at:** 2023-03-20 02:00:08
- **repo:** lizhihao6/get-daily-arxiv-noti
- **repo_url:** https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
- **action:** opened
- **title:** New submissions for Mon, 20 Mar 23
- **labels:** event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
- **body:**
## Keyword: events
### Classification of Primitive Manufacturing Tasks from Filtered Event Data
- **Authors:** Laura Duarte, Pedro Neto
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Robotics (cs.RO)
- **Arxiv link:** https://arxiv.org/abs/2303.09558
- **Pdf link:** https://arxiv.org/pdf/2303.09558
- **Abstract**
Collaborative robots are increasingly present in industry to support human activities. However, to make the human-robot collaborative process more effective, there are several challenges to be addressed. Collaborative robotic systems need to be aware of the human activities to (1) anticipate collaborative/assistive actions, (2) learn by demonstration, and (3) activate safety procedures in shared workspace. This study proposes an action classification system to recognize primitive assembly tasks from human motion events data captured by a Dynamic and Active-pixel Vision Sensor (DAVIS). Several filters are compared and combined to remove event data noise. Task patterns are classified from a continuous stream of event data using advanced deep learning and recurrent networks to classify spatial and temporal features. Experiments were conducted on a novel dataset, the dataset of manufacturing tasks (DMT22), featuring 5 classes of representative manufacturing primitives (PickUp, Place, Screw, Hold, Idle) from 5 participants. Results show that the proposed filters remove about 65\% of all events (noise) per recording, conducting to a classification accuracy up to 99,37\% for subjects that trained the system and 97.08\% for new subjects. Data from a left-handed subject were successfully classified using only right-handed training data. These results are object independent.
### Event-based Human Pose Tracking by Spiking Spatiotemporal Transformer
- **Authors:** Shihao Zou, Yuxuan Mu, Xinxin Zuo, Sen Wang, Li Cheng
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09681
- **Pdf link:** https://arxiv.org/pdf/2303.09681
- **Abstract**
Event camera, as an emerging biologically-inspired vision sensor for capturing motion dynamics, presents new potential for 3D human pose tracking, or video-based 3D human pose estimation. However, existing works in pose tracking either require the presence of additional gray-scale images to establish a solid starting pose, or ignore the temporal dependencies all together by collapsing segments of event streams to form static image frames. Meanwhile, although the effectiveness of Artificial Neural Networks (ANNs, a.k.a. dense deep learning) has been showcased in many event-based tasks, the use of ANNs tends to neglect the fact that compared to the dense frame-based image sequences, the occurrence of events from an event camera is spatiotemporally much sparser. Motivated by the above mentioned issues, we present in this paper a dedicated end-to-end \textit{sparse deep learning} approach for event-based pose tracking: 1) to our knowledge this is the first time that 3D human pose tracking is obtained from events only, thus eliminating the need of accessing to any frame-based images as part of input; 2) our approach is based entirely upon the framework of Spiking Neural Networks (SNNs), which consists of Spike-Element-Wise (SEW) ResNet and our proposed spiking spatiotemporal transformer; 3) a large-scale synthetic dataset is constructed that features a broad and diverse set of annotated 3D human motions, as well as longer hours of event stream data, named SynEventHPD. Empirical experiments demonstrate the superiority of our approach in both performance and efficiency measures. For example, with comparable performance to the state-of-the-art ANNs counterparts, our approach achieves a computation reduction of 20\% in FLOPS. Our implementation is made available at https://github.com/JimmyZou/HumanPoseTracking_SNN and dataset will be released upon paper acceptance.
## Keyword: event camera
### Event-based Human Pose Tracking by Spiking Spatiotemporal Transformer
- **Authors:** Shihao Zou, Yuxuan Mu, Xinxin Zuo, Sen Wang, Li Cheng
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09681
- **Pdf link:** https://arxiv.org/pdf/2303.09681
- **Abstract**
Event camera, as an emerging biologically-inspired vision sensor for capturing motion dynamics, presents new potential for 3D human pose tracking, or video-based 3D human pose estimation. However, existing works in pose tracking either require the presence of additional gray-scale images to establish a solid starting pose, or ignore the temporal dependencies all together by collapsing segments of event streams to form static image frames. Meanwhile, although the effectiveness of Artificial Neural Networks (ANNs, a.k.a. dense deep learning) has been showcased in many event-based tasks, the use of ANNs tends to neglect the fact that compared to the dense frame-based image sequences, the occurrence of events from an event camera is spatiotemporally much sparser. Motivated by the above mentioned issues, we present in this paper a dedicated end-to-end \textit{sparse deep learning} approach for event-based pose tracking: 1) to our knowledge this is the first time that 3D human pose tracking is obtained from events only, thus eliminating the need of accessing to any frame-based images as part of input; 2) our approach is based entirely upon the framework of Spiking Neural Networks (SNNs), which consists of Spike-Element-Wise (SEW) ResNet and our proposed spiking spatiotemporal transformer; 3) a large-scale synthetic dataset is constructed that features a broad and diverse set of annotated 3D human motions, as well as longer hours of event stream data, named SynEventHPD. Empirical experiments demonstrate the superiority of our approach in both performance and efficiency measures. For example, with comparable performance to the state-of-the-art ANNs counterparts, our approach achieves a computation reduction of 20\% in FLOPS. Our implementation is made available at https://github.com/JimmyZou/HumanPoseTracking_SNN and dataset will be released upon paper acceptance.
### Dual Memory Aggregation Network for Event-Based Object Detection with Learnable Representation
- **Authors:** Dongsheng Wang, Xu Jia, Yang Zhang, Xinyu Zhang, Yaoyuan Wang, Ziyang Zhang, Dong Wang, Huchuan Lu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09919
- **Pdf link:** https://arxiv.org/pdf/2303.09919
- **Abstract**
Event-based cameras are bio-inspired sensors that capture brightness change of every pixel in an asynchronous manner. Compared with frame-based sensors, event cameras have microsecond-level latency and high dynamic range, hence showing great potential for object detection under high-speed motion and poor illumination conditions. Due to sparsity and asynchronism nature with event streams, most of existing approaches resort to hand-crafted methods to convert event data into 2D grid representation. However, they are sub-optimal in aggregating information from event stream for object detection. In this work, we propose to learn an event representation optimized for event-based object detection. Specifically, event streams are divided into grids in the x-y-t coordinates for both positive and negative polarity, producing a set of pillars as 3D tensor representation. To fully exploit information with event streams to detect objects, a dual-memory aggregation network (DMANet) is proposed to leverage both long and short memory along event streams to aggregate effective information for object detection. Long memory is encoded in the hidden state of adaptive convLSTMs while short memory is modeled by computing spatial-temporal correlation between event pillars at neighboring time intervals. Extensive experiments on the recently released event-based automotive detection dataset demonstrate the effectiveness of the proposed method.
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
There is no result
## Keyword: ISP
There is no result
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### MMFace4D: A Large-Scale Multi-Modal 4D Face Dataset for Audio-Driven 3D Face Animation
- **Authors:** Haozhe Wu, Jia Jia, Junliang Xing, Hongwei Xu, Xiangyuan Wang, Jelo Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09797
- **Pdf link:** https://arxiv.org/pdf/2303.09797
- **Abstract**
Audio-Driven Face Animation is an eagerly anticipated technique for applications such as VR/AR, games, and movie making. With the rapid development of 3D engines, there is an increasing demand for driving 3D faces with audio. However, currently available 3D face animation datasets are either scale-limited or quality-unsatisfied, which hampers further developments of audio-driven 3D face animation. To address this challenge, we propose MMFace4D, a large-scale multi-modal 4D (3D sequence) face dataset consisting of 431 identities, 35,904 sequences, and 3.9 million frames. MMFace4D has three appealing characteristics: 1) highly diversified subjects and corpus, 2) synchronized audio and 3D mesh sequence with high-resolution face details, and 3) low storage cost with a new efficient compression algorithm on 3D mesh sequences. These characteristics enable the training of high-fidelity, expressive, and generalizable face animation models. Upon MMFace4D, we construct a challenging benchmark of audio-driven 3D face animation with a strong baseline, which enables non-autoregressive generation with fast inference speed and outperforms the state-of-the-art autoregressive method. The whole benchmark will be released.
### Learning Data-Driven Vector-Quantized Degradation Model for Animation Video Super-Resolution
- **Authors:** Zixi Tuo, Huan Yang, Jianlong Fu, Yujie Dun, Xueming Qian
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09826
- **Pdf link:** https://arxiv.org/pdf/2303.09826
- **Abstract**
Existing real-world video super-resolution (VSR) methods focus on designing a general degradation pipeline for open-domain videos while ignoring data intrinsic characteristics which strongly limit their performance when applying to some specific domains (e.g. animation videos). In this paper, we thoroughly explore the characteristics of animation videos and leverage the rich priors in real-world animation data for a more practical animation VSR model. In particular, we propose a multi-scale Vector-Quantized Degradation model for animation video Super-Resolution (VQD-SR) to decompose the local details from global structures and transfer the degradation priors in real-world animation videos to a learned vector-quantized codebook for degradation modeling. A rich-content Real Animation Low-quality (RAL) video dataset is collected for extracting the priors. We further propose a data enhancement strategy for high-resolution (HR) training videos based on our observation that existing HR videos are mostly collected from the Web which contains conspicuous compression artifacts. The proposed strategy is valid to lift the upper bound of animation VSR performance, regardless of the specific VSR model. Experimental results demonstrate the superiority of the proposed VQD-SR over state-of-the-art methods, through extensive quantitative and qualitative evaluations of the latest animation video super-resolution benchmark.
### Toward Super-Resolution for Appearance-Based Gaze Estimation
- **Authors:** Galen O'Shea, Majid Komeili
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.10151
- **Pdf link:** https://arxiv.org/pdf/2303.10151
- **Abstract**
Gaze tracking is a valuable tool with a broad range of applications in various fields, including medicine, psychology, virtual reality, marketing, and safety. Therefore, it is essential to have gaze tracking software that is cost-efficient and high-performing. Accurately predicting gaze remains a difficult task, particularly in real-world situations where images are affected by motion blur, video compression, and noise. Super-resolution has been shown to improve image quality from a visual perspective. This work examines the usefulness of super-resolution for improving appearance-based gaze tracking. We show that not all SR models preserve the gaze direction. We propose a two-step framework based on SwinIR super-resolution model. The proposed method consistently outperforms the state-of-the-art, particularly in scenarios involving low-resolution or degraded images. Furthermore, we examine the use of super-resolution through the lens of self-supervised learning for gaze prediction. Self-supervised learning aims to learn from unlabelled data to reduce the amount of required labeled data for downstream tasks. We propose a novel architecture called SuperVision by fusing an SR backbone network to a ResNet18 (with some skip connections). The proposed SuperVision method uses 5x less labeled data and yet outperforms, by 15%, the state-of-the-art method of GazeTR which uses 100% of training data.
## Keyword: RAW
### DS-Fusion: Artistic Typography via Discriminated and Stylized Diffusion
- **Authors:** Maham Tanveer, Yizhi Wang, Ali Mahdavi-Amiri, Hao Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Graphics (cs.GR)
- **Arxiv link:** https://arxiv.org/abs/2303.09604
- **Pdf link:** https://arxiv.org/pdf/2303.09604
- **Abstract**
We introduce a novel method to automatically generate an artistic typography by stylizing one or more letter fonts to visually convey the semantics of an input word, while ensuring that the output remains readable. To address an assortment of challenges with our task at hand including conflicting goals (artistic stylization vs. legibility), lack of ground truth, and immense search space, our approach utilizes large language models to bridge texts and visual images for stylization and build an unsupervised generative model with a diffusion model backbone. Specifically, we employ the denoising generator in Latent Diffusion Model (LDM), with the key addition of a CNN-based discriminator to adapt the input style onto the input text. The discriminator uses rasterized images of a given letter/word font as real samples and output of the denoising generator as fake samples. Our model is coined DS-Fusion for discriminated and stylized diffusion. We showcase the quality and versatility of our method through numerous examples, qualitative and quantitative evaluation, as well as ablation studies. User studies comparing to strong baselines including CLIPDraw and DALL-E 2, as well as artist-crafted typographies, demonstrate strong performance of DS-Fusion.
### TBP-Former: Learning Temporal Bird's-Eye-View Pyramid for Joint Perception and Prediction in Vision-Centric Autonomous Driving
- **Authors:** Shaoheng Fang, Zi Wang, Yiqi Zhong, Junhao Ge, Siheng Chen, Yanfeng Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09998
- **Pdf link:** https://arxiv.org/pdf/2303.09998
- **Abstract**
Vision-centric joint perception and prediction (PnP) has become an emerging trend in autonomous driving research. It predicts the future states of the traffic participants in the surrounding environment from raw RGB images. However, it is still a critical challenge to synchronize features obtained at multiple camera views and timestamps due to inevitable geometric distortions and further exploit those spatial-temporal features. To address this issue, we propose a temporal bird's-eye-view pyramid transformer (TBP-Former) for vision-centric PnP, which includes two novel designs. First, a pose-synchronized BEV encoder is proposed to map raw image inputs with any camera pose at any time to a shared and synchronized BEV space for better spatial-temporal synchronization. Second, a spatial-temporal pyramid transformer is introduced to comprehensively extract multi-scale BEV features and predict future BEV states with the support of spatial-temporal priors. Extensive experiments on nuScenes dataset show that our proposed framework overall outperforms all state-of-the-art vision-based prediction methods.
## Keyword: raw image
### TBP-Former: Learning Temporal Bird's-Eye-View Pyramid for Joint Perception and Prediction in Vision-Centric Autonomous Driving
- **Authors:** Shaoheng Fang, Zi Wang, Yiqi Zhong, Junhao Ge, Siheng Chen, Yanfeng Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09998
- **Pdf link:** https://arxiv.org/pdf/2303.09998
- **Abstract**
Vision-centric joint perception and prediction (PnP) has become an emerging trend in autonomous driving research. It predicts the future states of the traffic participants in the surrounding environment from raw RGB images. However, it is still a critical challenge to synchronize features obtained at multiple camera views and timestamps due to inevitable geometric distortions and further exploit those spatial-temporal features. To address this issue, we propose a temporal bird's-eye-view pyramid transformer (TBP-Former) for vision-centric PnP, which includes two novel designs. First, a pose-synchronized BEV encoder is proposed to map raw image inputs with any camera pose at any time to a shared and synchronized BEV space for better spatial-temporal synchronization. Second, a spatial-temporal pyramid transformer is introduced to comprehensively extract multi-scale BEV features and predict future BEV states with the support of spatial-temporal priors. Extensive experiments on nuScenes dataset show that our proposed framework overall outperforms all state-of-the-art vision-based prediction methods.
- **index:** 2.0
- **text_combine:**
New submissions for Mon, 20 Mar 23 - ## Keyword: events
### Classification of Primitive Manufacturing Tasks from Filtered Event Data
- **Authors:** Laura Duarte, Pedro Neto
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Robotics (cs.RO)
- **Arxiv link:** https://arxiv.org/abs/2303.09558
- **Pdf link:** https://arxiv.org/pdf/2303.09558
- **Abstract**
Collaborative robots are increasingly present in industry to support human activities. However, to make the human-robot collaborative process more effective, there are several challenges to be addressed. Collaborative robotic systems need to be aware of the human activities to (1) anticipate collaborative/assistive actions, (2) learn by demonstration, and (3) activate safety procedures in shared workspace. This study proposes an action classification system to recognize primitive assembly tasks from human motion events data captured by a Dynamic and Active-pixel Vision Sensor (DAVIS). Several filters are compared and combined to remove event data noise. Task patterns are classified from a continuous stream of event data using advanced deep learning and recurrent networks to classify spatial and temporal features. Experiments were conducted on a novel dataset, the dataset of manufacturing tasks (DMT22), featuring 5 classes of representative manufacturing primitives (PickUp, Place, Screw, Hold, Idle) from 5 participants. Results show that the proposed filters remove about 65\% of all events (noise) per recording, conducting to a classification accuracy up to 99,37\% for subjects that trained the system and 97.08\% for new subjects. Data from a left-handed subject were successfully classified using only right-handed training data. These results are object independent.
### Event-based Human Pose Tracking by Spiking Spatiotemporal Transformer
- **Authors:** Shihao Zou, Yuxuan Mu, Xinxin Zuo, Sen Wang, Li Cheng
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09681
- **Pdf link:** https://arxiv.org/pdf/2303.09681
- **Abstract**
Event camera, as an emerging biologically-inspired vision sensor for capturing motion dynamics, presents new potential for 3D human pose tracking, or video-based 3D human pose estimation. However, existing works in pose tracking either require the presence of additional gray-scale images to establish a solid starting pose, or ignore the temporal dependencies all together by collapsing segments of event streams to form static image frames. Meanwhile, although the effectiveness of Artificial Neural Networks (ANNs, a.k.a. dense deep learning) has been showcased in many event-based tasks, the use of ANNs tends to neglect the fact that compared to the dense frame-based image sequences, the occurrence of events from an event camera is spatiotemporally much sparser. Motivated by the above mentioned issues, we present in this paper a dedicated end-to-end \textit{sparse deep learning} approach for event-based pose tracking: 1) to our knowledge this is the first time that 3D human pose tracking is obtained from events only, thus eliminating the need of accessing to any frame-based images as part of input; 2) our approach is based entirely upon the framework of Spiking Neural Networks (SNNs), which consists of Spike-Element-Wise (SEW) ResNet and our proposed spiking spatiotemporal transformer; 3) a large-scale synthetic dataset is constructed that features a broad and diverse set of annotated 3D human motions, as well as longer hours of event stream data, named SynEventHPD. Empirical experiments demonstrate the superiority of our approach in both performance and efficiency measures. For example, with comparable performance to the state-of-the-art ANNs counterparts, our approach achieves a computation reduction of 20\% in FLOPS. Our implementation is made available at https://github.com/JimmyZou/HumanPoseTracking_SNN and dataset will be released upon paper acceptance.
## Keyword: event camera
### Event-based Human Pose Tracking by Spiking Spatiotemporal Transformer
- **Authors:** Shihao Zou, Yuxuan Mu, Xinxin Zuo, Sen Wang, Li Cheng
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09681
- **Pdf link:** https://arxiv.org/pdf/2303.09681
- **Abstract**
Event camera, as an emerging biologically-inspired vision sensor for capturing motion dynamics, presents new potential for 3D human pose tracking, or video-based 3D human pose estimation. However, existing works in pose tracking either require the presence of additional gray-scale images to establish a solid starting pose, or ignore the temporal dependencies all together by collapsing segments of event streams to form static image frames. Meanwhile, although the effectiveness of Artificial Neural Networks (ANNs, a.k.a. dense deep learning) has been showcased in many event-based tasks, the use of ANNs tends to neglect the fact that compared to the dense frame-based image sequences, the occurrence of events from an event camera is spatiotemporally much sparser. Motivated by the above mentioned issues, we present in this paper a dedicated end-to-end \textit{sparse deep learning} approach for event-based pose tracking: 1) to our knowledge this is the first time that 3D human pose tracking is obtained from events only, thus eliminating the need of accessing to any frame-based images as part of input; 2) our approach is based entirely upon the framework of Spiking Neural Networks (SNNs), which consists of Spike-Element-Wise (SEW) ResNet and our proposed spiking spatiotemporal transformer; 3) a large-scale synthetic dataset is constructed that features a broad and diverse set of annotated 3D human motions, as well as longer hours of event stream data, named SynEventHPD. Empirical experiments demonstrate the superiority of our approach in both performance and efficiency measures. For example, with comparable performance to the state-of-the-art ANNs counterparts, our approach achieves a computation reduction of 20\% in FLOPS. Our implementation is made available at https://github.com/JimmyZou/HumanPoseTracking_SNN and dataset will be released upon paper acceptance.
### Dual Memory Aggregation Network for Event-Based Object Detection with Learnable Representation
- **Authors:** Dongsheng Wang, Xu Jia, Yang Zhang, Xinyu Zhang, Yaoyuan Wang, Ziyang Zhang, Dong Wang, Huchuan Lu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09919
- **Pdf link:** https://arxiv.org/pdf/2303.09919
- **Abstract**
Event-based cameras are bio-inspired sensors that capture brightness change of every pixel in an asynchronous manner. Compared with frame-based sensors, event cameras have microsecond-level latency and high dynamic range, hence showing great potential for object detection under high-speed motion and poor illumination conditions. Due to sparsity and asynchronism nature with event streams, most of existing approaches resort to hand-crafted methods to convert event data into 2D grid representation. However, they are sub-optimal in aggregating information from event stream for object detection. In this work, we propose to learn an event representation optimized for event-based object detection. Specifically, event streams are divided into grids in the x-y-t coordinates for both positive and negative polarity, producing a set of pillars as 3D tensor representation. To fully exploit information with event streams to detect objects, a dual-memory aggregation network (DMANet) is proposed to leverage both long and short memory along event streams to aggregate effective information for object detection. Long memory is encoded in the hidden state of adaptive convLSTMs while short memory is modeled by computing spatial-temporal correlation between event pillars at neighboring time intervals. Extensive experiments on the recently released event-based automotive detection dataset demonstrate the effectiveness of the proposed method.
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
There is no result
## Keyword: ISP
There is no result
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### MMFace4D: A Large-Scale Multi-Modal 4D Face Dataset for Audio-Driven 3D Face Animation
- **Authors:** Haozhe Wu, Jia Jia, Junliang Xing, Hongwei Xu, Xiangyuan Wang, Jelo Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09797
- **Pdf link:** https://arxiv.org/pdf/2303.09797
- **Abstract**
Audio-Driven Face Animation is an eagerly anticipated technique for applications such as VR/AR, games, and movie making. With the rapid development of 3D engines, there is an increasing demand for driving 3D faces with audio. However, currently available 3D face animation datasets are either scale-limited or quality-unsatisfied, which hampers further developments of audio-driven 3D face animation. To address this challenge, we propose MMFace4D, a large-scale multi-modal 4D (3D sequence) face dataset consisting of 431 identities, 35,904 sequences, and 3.9 million frames. MMFace4D has three appealing characteristics: 1) highly diversified subjects and corpus, 2) synchronized audio and 3D mesh sequence with high-resolution face details, and 3) low storage cost with a new efficient compression algorithm on 3D mesh sequences. These characteristics enable the training of high-fidelity, expressive, and generalizable face animation models. Upon MMFace4D, we construct a challenging benchmark of audio-driven 3D face animation with a strong baseline, which enables non-autoregressive generation with fast inference speed and outperforms the state-of-the-art autoregressive method. The whole benchmark will be released.
### Learning Data-Driven Vector-Quantized Degradation Model for Animation Video Super-Resolution
- **Authors:** Zixi Tuo, Huan Yang, Jianlong Fu, Yujie Dun, Xueming Qian
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09826
- **Pdf link:** https://arxiv.org/pdf/2303.09826
- **Abstract**
Existing real-world video super-resolution (VSR) methods focus on designing a general degradation pipeline for open-domain videos while ignoring data intrinsic characteristics which strongly limit their performance when applying to some specific domains (e.g. animation videos). In this paper, we thoroughly explore the characteristics of animation videos and leverage the rich priors in real-world animation data for a more practical animation VSR model. In particular, we propose a multi-scale Vector-Quantized Degradation model for animation video Super-Resolution (VQD-SR) to decompose the local details from global structures and transfer the degradation priors in real-world animation videos to a learned vector-quantized codebook for degradation modeling. A rich-content Real Animation Low-quality (RAL) video dataset is collected for extracting the priors. We further propose a data enhancement strategy for high-resolution (HR) training videos based on our observation that existing HR videos are mostly collected from the Web which contains conspicuous compression artifacts. The proposed strategy is valid to lift the upper bound of animation VSR performance, regardless of the specific VSR model. Experimental results demonstrate the superiority of the proposed VQD-SR over state-of-the-art methods, through extensive quantitative and qualitative evaluations of the latest animation video super-resolution benchmark.
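The codebook lookup at the heart of vector quantization — assigning each feature to its nearest codeword — can be sketched as below. The toy 2-D codewords and features are illustrative only and are not taken from VQD-SR, whose codebook is learned from real animation data.

```python
import math

def quantize(vectors, codebook):
    """Replace each vector by the index of its nearest codeword
    (Euclidean distance): the basic lookup behind a learned VQ codebook."""
    indices = []
    for v in vectors:
        best = min(range(len(codebook)),
                   key=lambda i: math.dist(v, codebook[i]))
        indices.append(best)
    return indices

codebook = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]   # toy "degradation" codewords
feats = [(0.1, -0.2), (0.9, 0.1), (0.2, 0.8)]
codes = quantize(feats, codebook)
```

During degradation modeling, the continuous feature is then replaced by `codebook[code]`, so every degradation seen at train time snaps to one of the learned prototypes.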
### Toward Super-Resolution for Appearance-Based Gaze Estimation
- **Authors:** Galen O'Shea, Majid Komeili
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.10151
- **Pdf link:** https://arxiv.org/pdf/2303.10151
- **Abstract**
Gaze tracking is a valuable tool with a broad range of applications in various fields, including medicine, psychology, virtual reality, marketing, and safety. Therefore, it is essential to have gaze tracking software that is cost-efficient and high-performing. Accurately predicting gaze remains a difficult task, particularly in real-world situations where images are affected by motion blur, video compression, and noise. Super-resolution has been shown to improve image quality from a visual perspective. This work examines the usefulness of super-resolution for improving appearance-based gaze tracking. We show that not all SR models preserve the gaze direction. We propose a two-step framework based on SwinIR super-resolution model. The proposed method consistently outperforms the state-of-the-art, particularly in scenarios involving low-resolution or degraded images. Furthermore, we examine the use of super-resolution through the lens of self-supervised learning for gaze prediction. Self-supervised learning aims to learn from unlabelled data to reduce the amount of required labeled data for downstream tasks. We propose a novel architecture called SuperVision by fusing an SR backbone network to a ResNet18 (with some skip connections). The proposed SuperVision method uses 5x less labeled data and yet outperforms, by 15%, the state-of-the-art method of GazeTR which uses 100% of training data.
## Keyword: RAW
### DS-Fusion: Artistic Typography via Discriminated and Stylized Diffusion
- **Authors:** Maham Tanveer, Yizhi Wang, Ali Mahdavi-Amiri, Hao Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Graphics (cs.GR)
- **Arxiv link:** https://arxiv.org/abs/2303.09604
- **Pdf link:** https://arxiv.org/pdf/2303.09604
- **Abstract**
We introduce a novel method to automatically generate an artistic typography by stylizing one or more letter fonts to visually convey the semantics of an input word, while ensuring that the output remains readable. To address an assortment of challenges with our task at hand including conflicting goals (artistic stylization vs. legibility), lack of ground truth, and immense search space, our approach utilizes large language models to bridge texts and visual images for stylization and build an unsupervised generative model with a diffusion model backbone. Specifically, we employ the denoising generator in Latent Diffusion Model (LDM), with the key addition of a CNN-based discriminator to adapt the input style onto the input text. The discriminator uses rasterized images of a given letter/word font as real samples and output of the denoising generator as fake samples. Our model is coined DS-Fusion for discriminated and stylized diffusion. We showcase the quality and versatility of our method through numerous examples, qualitative and quantitative evaluation, as well as ablation studies. User studies comparing to strong baselines including CLIPDraw and DALL-E 2, as well as artist-crafted typographies, demonstrate strong performance of DS-Fusion.
### TBP-Former: Learning Temporal Bird's-Eye-View Pyramid for Joint Perception and Prediction in Vision-Centric Autonomous Driving
- **Authors:** Shaoheng Fang, Zi Wang, Yiqi Zhong, Junhao Ge, Siheng Chen, Yanfeng Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09998
- **Pdf link:** https://arxiv.org/pdf/2303.09998
- **Abstract**
Vision-centric joint perception and prediction (PnP) has become an emerging trend in autonomous driving research. It predicts the future states of the traffic participants in the surrounding environment from raw RGB images. However, it is still a critical challenge to synchronize features obtained at multiple camera views and timestamps due to inevitable geometric distortions and further exploit those spatial-temporal features. To address this issue, we propose a temporal bird's-eye-view pyramid transformer (TBP-Former) for vision-centric PnP, which includes two novel designs. First, a pose-synchronized BEV encoder is proposed to map raw image inputs with any camera pose at any time to a shared and synchronized BEV space for better spatial-temporal synchronization. Second, a spatial-temporal pyramid transformer is introduced to comprehensively extract multi-scale BEV features and predict future BEV states with the support of spatial-temporal priors. Extensive experiments on nuScenes dataset show that our proposed framework overall outperforms all state-of-the-art vision-based prediction methods.
## Keyword: raw image
### TBP-Former: Learning Temporal Bird's-Eye-View Pyramid for Joint Perception and Prediction in Vision-Centric Autonomous Driving
- **Authors:** Shaoheng Fang, Zi Wang, Yiqi Zhong, Junhao Ge, Siheng Chen, Yanfeng Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2303.09998
- **Pdf link:** https://arxiv.org/pdf/2303.09998
- **Abstract**
Vision-centric joint perception and prediction (PnP) has become an emerging trend in autonomous driving research. It predicts the future states of the traffic participants in the surrounding environment from raw RGB images. However, it is still a critical challenge to synchronize features obtained at multiple camera views and timestamps due to inevitable geometric distortions and further exploit those spatial-temporal features. To address this issue, we propose a temporal bird's-eye-view pyramid transformer (TBP-Former) for vision-centric PnP, which includes two novel designs. First, a pose-synchronized BEV encoder is proposed to map raw image inputs with any camera pose at any time to a shared and synchronized BEV space for better spatial-temporal synchronization. Second, a spatial-temporal pyramid transformer is introduced to comprehensively extract multi-scale BEV features and predict future BEV states with the support of spatial-temporal priors. Extensive experiments on nuScenes dataset show that our proposed framework overall outperforms all state-of-the-art vision-based prediction methods.
### Problem: hermes is not supported on 'x86_64-darwin'
- **Repo:** crypto-org-chain/cronos (https://api.github.com/repos/crypto-org-chain/cronos)
- **Created:** 2021-11-24 01:21:42 (issue closed)
- **Labels:** tests
**Describe the bug**
After pulling commit 9c3ba89b259f6da54a2c9f6096c8f0dd496efdc4 and running `nix-shell integration_tests/shell.nix`, the following error is printed:
```
error: Package ‘hermes’ in /Users/damonchen/Documents/cronos/nix/hermes.nix:4 is not supported on ‘x86_64-darwin’, refusing to evaluate.
a) To temporarily allow packages that are unsupported for this system, you can use an environment variable
for a single invocation of the nix tools.
$ export NIXPKGS_ALLOW_UNSUPPORTED_SYSTEM=1
b) For `nixos-rebuild` you can set
{ nixpkgs.config.allowUnsupportedSystem = true; }
in configuration.nix to override this.
c) For `nix-env`, `nix-build`, `nix-shell` or any other Nix command you can add
{ allowUnsupportedSystem = true; }
to ~/.config/nixpkgs/config.nix.
(use '--show-trace' to show detailed location information)
```
As the message states, running `export NIXPKGS_ALLOW_UNSUPPORTED_SYSTEM=1` temporarily works around it.
**Desktop (please complete the following information):**
- OS: macOS Big Sur
- Version 11.6.1
### A migration of `@default(cuid())` inserts `null` values to the DB
- **Repo:** prisma/migrate (https://api.github.com/repos/prisma/migrate)
- **Created:** 2020-09-02 09:34:25 (issue closed)
- **Labels:** bug/1-repro-available, engines/migration engine, kind/bug, process/candidate
## Bug description
Adding a new column with `@default(cuid())` ends up with null values in the new column after migration.
## How to reproduce
Add a new column to an existing table, e.g.:
```
model Folder {
...
uid String? @default(cuid())
...
}
```
Run `prisma migrate save --experimental` followed by `prisma migrate up --experimental`
Notes:
1. Tried with and without `@unique` and the result is the same
2. `String?` is required as the type, because plain `String` crashes the migration (reason unclear). Ideally the column could be made mandatory.
---
### Edit:
With `String` as the type, the migration fails on the following error, which is also described in #527:
```
Added the required column X to the Y table without a default value. There are N rows in this table, it is not possible to execute this migration.
```
---
## Expected behavior
The `Folder` table should contain a new `uid` column with generated `cuid` values.
## Prisma information
N/A
## Environment & setup
- OS: Mac OS
- Database: PostgreSQL
- Node.js version: v12.18.3
- Prisma version:
```
@prisma/cli : 2.5.1
Current platform : darwin
Query Engine : query-engine c88925ce44a9b89b4351aec85ba6a28979d2658e (at node_modules/@prisma/cli/query-engine-darwin)
Migration Engine : migration-engine-cli c88925ce44a9b89b4351aec85ba6a28979d2658e (at node_modules/@prisma/cli/migration-engine-darwin)
Introspection Engine : introspection-core c88925ce44a9b89b4351aec85ba6a28979d2658e (at node_modules/@prisma/cli/introspection-engine-darwin)
Format Binary : prisma-fmt c88925ce44a9b89b4351aec85ba6a28979d2658e (at node_modules/@prisma/cli/prisma-fmt-darwin)
Studio : 0.261.0
```
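One hypothetical workaround, until the migration engine backfills defaults itself, is to populate the missing `uid` values with a one-off script after the migration runs. The sketch below is an assumption, not part of Prisma: `_cuid_like()` merely imitates the shape of a cuid (`c` + timestamp + random suffix, not spec-compliant), and the in-memory rows stand in for a real database query and update.

```python
import secrets
import time

def _cuid_like():
    """Collision-resistant id imitating a cuid's shape (NOT spec-compliant):
    'c' + millisecond timestamp in hex + random hex suffix."""
    ts = format(int(time.time() * 1000), 'x')
    return 'c' + ts + secrets.token_hex(8)

def backfill_uids(rows):
    """Assign a fresh id to every row whose 'uid' is still None (NULL)."""
    for row in rows:
        if row.get("uid") is None:
            row["uid"] = _cuid_like()
    return rows

# Stand-in for: SELECT id, uid FROM "Folder"; one row migrated with NULL.
rows = [{"id": 1, "uid": None}, {"id": 2, "uid": "ckexisting"}]
backfill_uids(rows)
```

In a real setup the rows would come from a `SELECT id FROM "Folder" WHERE uid IS NULL` query and each generated value would be written back with an `UPDATE`, after which the column can safely be made required.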
| 1
|
21,165
| 11,137,730,305
|
IssuesEvent
|
2019-12-20 20:11:19
|
couchbase/sync_gateway
|
https://api.github.com/repos/couchbase/sync_gateway
|
closed
|
Store attachments outside the bucket
|
enhancement icebox known-issue performance
|
Currently we store attachments as individual binary docs in the bucket. That's problematic, especially because there's a 20mb size limit.
We should switch to external storage, probably CBFS which was created specifically to solve this problem.
I think CBFS will make it easier to GC attachments too.
|
True
|
Store attachments outside the bucket - Currently we store attachments as individual binary docs in the bucket. That's problematic, especially because there's a 20mb size limit.
We should switch to external storage, probably CBFS which was created specifically to solve this problem.
I think CBFS will make it easier to GC attachments too.
|
non_process
|
store attachments outside the bucket currently we store attachments as individual binary docs in the bucket that s problematic especially because there s a size limit we should switch to external storage probably cbfs which was created specifically to solve this problem i think cbfs will make it easier to gc attachments too
| 0
|
7,474
| 10,568,580,002
|
IssuesEvent
|
2019-10-06 14:01:21
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
Mid statement is incorrectly identified as a Function in VBA library (in the Strings module)
|
bug hacktoberfest navigation parse-tree-processing
|
While having the same appearance with regard to their names and parameters, the `Mid` statement is part of the `VBA` language, while the `Mid` function is (after aliasing) a function in the `VBA` type library.
RD should be able to add a `SubroutineDeclaration` for `Mid`, and then make the distinction between a statement call and a function call, and therefore correctly resolve usages of `Mid` as a Statement or Function respectively.
|
1.0
|
Mid statement is incorrectly identified as a Function in VBA library (in the Strings module) - While having the same appearance with regard to their names and parameters, the `Mid` statement is part of the `VBA` language, while the `Mid` function is (after aliasing) a function in the `VBA` type library.
RD should be able to add a `SubroutineDeclaration` for `Mid`, and then make the distinction between a statement call and a function call, and therefore correctly resolve usages of `Mid` as a Statement or Function respectively.
|
process
|
mid statement is incorrectly identified as a function in vba library in the strings module while having the same appearance with regard to their names and parameters the mid statement is part of the vba language while the mid function is after aliasing a function in the vba type library rd should be able to add a subroutinedeclaration for mid and then make the distinction between a statement call and a function call and therefore correctly resolve usages of mid as a statement or function respectively
| 1
|
670,777
| 22,703,657,430
|
IssuesEvent
|
2022-07-05 12:58:33
|
easystats/report
|
https://api.github.com/repos/easystats/report
|
closed
|
Fixing build failures
|
bug :bug: high priority :runner:
|
Builds seem to be failing with the following error:
```r
Quitting from lines 105-108 (report.Rmd)
Error: Error: processing vignette 'report.Rmd' failed with diagnostics:
argument is of length zero
--- failed re-building ‘report.Rmd’
SUMMARY: processing the following file failed:
‘report.Rmd’
Error: Error: Vignette re-building failed.
Execution halted
```
But if I locally knit this vignette, I see no error.
Does this have to do with updates to any of the dependencies? I bumped all _easystats_ dependency versions, but the issue is still there.
|
1.0
|
Fixing build failures - Builds seem to be failing with the following error:
```r
Quitting from lines 105-108 (report.Rmd)
Error: Error: processing vignette 'report.Rmd' failed with diagnostics:
argument is of length zero
--- failed re-building ‘report.Rmd’
SUMMARY: processing the following file failed:
‘report.Rmd’
Error: Error: Vignette re-building failed.
Execution halted
```
But if I locally knit this vignette, I see no error.
Does this have to do with updates to any of the dependencies? I bumped all _easystats_ dependency versions, but the issue is still there.
|
non_process
|
fixing build failures builds seem to be failing with the following error r quitting from lines report rmd error error processing vignette report rmd failed with diagnostics argument is of length zero failed re building ‘report rmd’ summary processing the following file failed ‘report rmd’ error error vignette re building failed execution halted but if i locally knit this vignette i see no error does this have to do with updates to any of the dependencies i bumped all easystats dependency versions but the issue is still there
| 0
|
2,264
| 5,095,516,729
|
IssuesEvent
|
2017-01-03 15:27:35
|
MTG/freesound
|
https://api.github.com/repos/MTG/freesound
|
closed
|
Refactor sound processing state handling
|
Improvement _Processing
|
The current way in which we handle processing state and processing of sounds does not allow for optimised workflows when we want to reprocess sounds. To improve this, the processing_state field in the sound model should only be used to say that a sound has either failed processing or being successfully processed (similarity to the moderation state). Another property 'processing_ongoing_state' should be used to set the state of the actual processing activity (either queued, processing or finished).
This will allow to better handle reprocessing of sounds and optimise other related stuff such as removing sounds from solr and similarity when they fail processing and updating packs, num_sounds on user profiles and other related stuff when sounds change their processing_state.
|
1.0
|
Refactor sound processing state handling - The current way in which we handle processing state and processing of sounds does not allow for optimised workflows when we want to reprocess sounds. To improve this, the processing_state field in the sound model should only be used to say that a sound has either failed processing or being successfully processed (similarity to the moderation state). Another property 'processing_ongoing_state' should be used to set the state of the actual processing activity (either queued, processing or finished).
This will allow to better handle reprocessing of sounds and optimise other related stuff such as removing sounds from solr and similarity when they fail processing and updating packs, num_sounds on user profiles and other related stuff when sounds change their processing_state.
|
process
|
refactor sound processing state handling the current way in which we handle processing state and processing of sounds does not allow for optimised workflows when we want to reprocess sounds to improve this the processing state field in the sound model should only be used to say that a sound has either failed processing or being successfully processed similarity to the moderation state another property processing ongoing state should be used to set the state of the actual processing activity either queued processing or finished this will allow to better handle reprocessing of sounds and optimise other related stuff such as removing sounds from solr and similarity when they fail processing and updating packs num sounds on user profiles and other related stuff when sounds change their processing state
| 1
|
67,100
| 16,818,686,674
|
IssuesEvent
|
2021-06-17 10:25:08
|
status-im/StatusQ
|
https://api.github.com/repos/status-im/StatusQ
|
opened
|
Extend sandbox build script to allow for qmake/qt version specification
|
type: build type: feature
|
Right now the build script simply uses `qmake` which is provided by whatever Qt version it is linked to. This makes it hard to build sandbox for different Qt versions to test upgrades etc.
Ideally, the script would rely on a global environment variable (if it exists), most likely `QT_PATH`, which should come with a `qmake` executed.
That way, switching the Qt version is as simple as updated the environment variable. This can also be done on a per command basis a la:
```
QT_PATH=path/to/qt ./scripts/build
```
In addition, there could be an accepted qt path parameter being passed to the script:
```
./scripts/build --qt-path=path/to/qt
```
The reason we need to specific an entire path and not just a version number, is because users are free to install Qt in whatever location they want, so we can't expect a particular location.
|
1.0
|
Extend sandbox build script to allow for qmake/qt version specification - Right now the build script simply uses `qmake` which is provided by whatever Qt version it is linked to. This makes it hard to build sandbox for different Qt versions to test upgrades etc.
Ideally, the script would rely on a global environment variable (if it exists), most likely `QT_PATH`, which should come with a `qmake` executed.
That way, switching the Qt version is as simple as updated the environment variable. This can also be done on a per command basis a la:
```
QT_PATH=path/to/qt ./scripts/build
```
In addition, there could be an accepted qt path parameter being passed to the script:
```
./scripts/build --qt-path=path/to/qt
```
The reason we need to specific an entire path and not just a version number, is because users are free to install Qt in whatever location they want, so we can't expect a particular location.
|
non_process
|
extend sandbox build script to allow for qmake qt version specification right now the build script simply uses qmake which is provided by whatever qt version it is linked to this makes it hard to build sandbox for different qt versions to test upgrades etc ideally the script would rely on a global environment variable if it exists most likely qt path which should come with a qmake executed that way switching the qt version is as simple as updated the environment variable this can also be done on a per command basis a la qt path path to qt scripts build in addition there could be an accepted qt path parameter being passed to the script scripts build qt path path to qt the reason we need to specific an entire path and not just a version number is because users are free to install qt in whatever location they want so we can t expect a particular location
| 0
|
22,584
| 31,811,050,055
|
IssuesEvent
|
2023-09-13 16:51:38
|
alphagov/govuk-design-system
|
https://api.github.com/repos/alphagov/govuk-design-system
|
reopened
|
Add Kam to the team page on Design System website
|
process
|
## What
Add Kam to the list of Design System team members on the website.
## Why
To tell everyone that Kam has joined the team.
## Who needs to work on this
Kelly & Kam
## Who needs to review this
Anybody on the team
## Done when
- [ ] created a pull request to add Kam's name to the list
|
1.0
|
Add Kam to the team page on Design System website - ## What
Add Kam to the list of Design System team members on the website.
## Why
To tell everyone that Kam has joined the team.
## Who needs to work on this
Kelly & Kam
## Who needs to review this
Anybody on the team
## Done when
- [ ] created a pull request to add Kam's name to the list
|
process
|
add kam to the team page on design system website what add kam to the list of design system team members on the website why to tell everyone that kam has joined the team who needs to work on this kelly kam who needs to review this anybody on the team done when created a pull request to add kam s name to the list
| 1
|
22,739
| 32,056,195,084
|
IssuesEvent
|
2023-09-24 05:19:39
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
[processor/metricstransform] Include Splitby operation
|
enhancement Stale processor/metricstransform closed as inactive
|
### Component(s)
processor/metricstransform
### Is your feature request related to a problem? Please describe.
In the components current form, it is not possible without some complicated aggregation logic to convert an attribute to a new metric.
### Describe the solution you'd like
What I'd like to do is the following:
```yaml
processors:
metricstransform:
transform:
- include: system.cpu.utilization
action: update
operations:
- action: split_metric
new_name: cpu.idle # potential worth reusing `new_value`
label: state
label_value: idle
```
### Describe alternatives you've considered
At this point in time, to make `split_metric` logic work, I have either done some combination of label aggregation or using the `signalfx exporter` to perform this operation.
### Additional context
While A solution exists and works, it is split across two components so keeping metric names consistent across different systems isn't possible.
|
1.0
|
[processor/metricstransform] Include Splitby operation - ### Component(s)
processor/metricstransform
### Is your feature request related to a problem? Please describe.
In the components current form, it is not possible without some complicated aggregation logic to convert an attribute to a new metric.
### Describe the solution you'd like
What I'd like to do is the following:
```yaml
processors:
metricstransform:
transform:
- include: system.cpu.utilization
action: update
operations:
- action: split_metric
new_name: cpu.idle # potential worth reusing `new_value`
label: state
label_value: idle
```
### Describe alternatives you've considered
At this point in time, to make `split_metric` logic work, I have either done some combination of label aggregation or using the `signalfx exporter` to perform this operation.
### Additional context
While A solution exists and works, it is split across two components so keeping metric names consistent across different systems isn't possible.
|
process
|
include splitby operation component s processor metricstransform is your feature request related to a problem please describe in the components current form it is not possible without some complicated aggregation logic to convert an attribute to a new metric describe the solution you d like what i d like to do is the following yaml processors metricstransform transform include system cpu utilization action update operations action split metric new name cpu idle potential worth reusing new value label state label value idle describe alternatives you ve considered at this point in time to make split metric logic work i have either done some combination of label aggregation or using the signalfx exporter to perform this operation additional context while a solution exists and works it is split across two components so keeping metric names consistent across different systems isn t possible
| 1
|
826,291
| 31,563,938,640
|
IssuesEvent
|
2023-09-03 15:33:12
|
NikkelM/Outlook-Mail-Notes
|
https://api.github.com/repos/NikkelM/Outlook-Mail-Notes
|
closed
|
Export notes to CSV
|
UI feature Priority: Normal
|
It would be best if users could decide to export all notes, or just those specific to the current mail, conversation or sender.
|
1.0
|
Export notes to CSV - It would be best if users could decide to export all notes, or just those specific to the current mail, conversation or sender.
|
non_process
|
export notes to csv it would be best if users could decide to export all notes or just those specific to the current mail conversation or sender
| 0
|
5,936
| 8,757,456,741
|
IssuesEvent
|
2018-12-14 21:19:48
|
kythe/kythe
|
https://api.github.com/repos/kythe/kythe
|
opened
|
Determine how to collect and serve available build configuration
|
API post-processing
|
#3309 discusses how to represent build config in the graph, but does not go into how to post-process and serve this information to make it relevant to users. This should be determined and documented.
|
1.0
|
Determine how to collect and serve available build configuration - #3309 discusses how to represent build config in the graph, but does not go into how to post-process and serve this information to make it relevant to users. This should be determined and documented.
|
process
|
determine how to collect and serve available build configuration discusses how to represent build config in the graph but does not go into how to post process and serve this information to make it relevant to users this should be determined and documented
| 1
|
17,855
| 23,800,775,703
|
IssuesEvent
|
2022-09-03 08:45:14
|
RobertCraigie/prisma-client-py
|
https://api.github.com/repos/RobertCraigie/prisma-client-py
|
closed
|
Make the custom `Base64` type compatible with OpenAPI generation
|
bug/2-confirmed kind/bug process/candidate topic: client level/beginner priority/medium
|
<!--
Thanks for helping us improve Prisma Client Python! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by enabling additional logging output.
See https://prisma-client-py.readthedocs.io/en/stable/reference/logging/ for how to enable additional logging output.
-->
## Bug description
<!-- A clear and concise description of what the bug is. -->
It is currently not possible to define a FastAPI endpoint that includes our custom `Base64` type, https://github.com/RobertCraigie/prisma-client-py/issues/318#issuecomment-1214218519.
## How to reproduce
<!--
Steps to reproduce the behavior:
1. Go to '...'
2. Change '....'
3. Run '....'
4. See error
-->
See https://github.com/RobertCraigie/prisma-client-py/issues/318#issuecomment-1214393759
## Expected behaviour
<!-- A clear and concise description of what you expected to happen. -->
FastAPI should generate an OpenAPI spec converting `Base64` inputs to a `string`.
|
1.0
|
Make the custom `Base64` type compatible with OpenAPI generation - <!--
Thanks for helping us improve Prisma Client Python! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by enabling additional logging output.
See https://prisma-client-py.readthedocs.io/en/stable/reference/logging/ for how to enable additional logging output.
-->
## Bug description
<!-- A clear and concise description of what the bug is. -->
It is currently not possible to define a FastAPI endpoint that includes our custom `Base64` type, https://github.com/RobertCraigie/prisma-client-py/issues/318#issuecomment-1214218519.
## How to reproduce
<!--
Steps to reproduce the behavior:
1. Go to '...'
2. Change '....'
3. Run '....'
4. See error
-->
See https://github.com/RobertCraigie/prisma-client-py/issues/318#issuecomment-1214393759
## Expected behaviour
<!-- A clear and concise description of what you expected to happen. -->
FastAPI should generate an OpenAPI spec converting `Base64` inputs to a `string`.
|
process
|
make the custom type compatible with openapi generation thanks for helping us improve prisma client python 🙏 please follow the sections in the template and provide as much information as possible about your problem e g by enabling additional logging output see for how to enable additional logging output bug description it is currently not possible to define a fastapi endpoint that includes our custom type how to reproduce steps to reproduce the behavior go to change run see error see expected behaviour fastapi should generate an openapi spec converting inputs to a string
| 1
|
9,776
| 2,615,174,521
|
IssuesEvent
|
2015-03-01 06:57:31
|
chrsmith/reaver-wps
|
https://api.github.com/repos/chrsmith/reaver-wps
|
opened
|
wps locked yes after 20 tryings
|
auto-migrated Priority-Triage Type-Defect
|
```
1. What operating system are you using (Linux is the only supported OS)?
reaver 1.4 backtrack
2. Is your wireless card in monitor mode (yes/no)?
rtl8187 monitor enable on mon0
3. What is the signal strength of the Access Point you are trying to crack?
60%
4. What is the manufacturer and model # of the device you are trying to
crack?
sagem f@st 1704
5. What is the entire command line string you are supplying to reaver?
reaver -i mon0 -b xx:xx:xx:xx:xx -vv
6. Please describe what you think the issue is.
alot
7. Paste the output from Reaver below.
```
Original issue reported on code.google.com by `med.ser...@gmail.com` on 10 Jan 2014 at 5:12
|
1.0
|
wps locked yes after 20 tryings - ```
1. What operating system are you using (Linux is the only supported OS)?
reaver 1.4 backtrack
2. Is your wireless card in monitor mode (yes/no)?
rtl8187 monitor enable on mon0
3. What is the signal strength of the Access Point you are trying to crack?
60%
4. What is the manufacturer and model # of the device you are trying to
crack?
sagem f@st 1704
5. What is the entire command line string you are supplying to reaver?
reaver -i mon0 -b xx:xx:xx:xx:xx -vv
6. Please describe what you think the issue is.
alot
7. Paste the output from Reaver below.
```
Original issue reported on code.google.com by `med.ser...@gmail.com` on 10 Jan 2014 at 5:12
|
non_process
|
wps locked yes after tryings what operating system are you using linux is the only supported os reaver backtrack is your wireless card in monitor mode yes no monitor enable on what is the signal strength of the access point you are trying to crack what is the manufacturer and model of the device you are trying to crack sagem f st what is the entire command line string you are supplying to reaver reaver i b xx xx xx xx xx vv please describe what you think the issue is alot paste the output from reaver below original issue reported on code google com by med ser gmail com on jan at
| 0
|
12,743
| 15,106,548,101
|
IssuesEvent
|
2021-02-08 14:25:58
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Update l.gcr.io/google/bazel:latest bazel docker image to bazel version 4.0.0
|
team-XProduct type: process untriaged
|
> ATTENTION! Please read and follow:
> - if this is a _question_ about how to build / test / query / deploy using Bazel, or a _discussion starter_, send it to bazel-discuss@googlegroups.com
> - if this is a _bug_ or _feature request_, fill the form below as best as you can.
### Description of the problem / feature request:
`l.gcr.io/google/bazel:latest` is currently using bazel 3.5.0, let's get this upgraded to 4.0.0
### Feature requests: what underlying problem are you trying to solve with this feature?
Wanting to use bazel 4.0.0
### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
```
$ docker run l.gcr.io/google/bazel:latest --version
bazel 3.5.0
```
### What operating system are you running Bazel on?
Ubuntu 20.04
### What's the output of `bazel info release`?
N/A
### If `bazel info release` returns "development version" or "(@non-git)", tell us how you built Bazel.
N/A
### What's the output of `git remote get-url origin ; git rev-parse master ; git rev-parse HEAD` ?
N/A
### Have you found anything relevant by searching the web?
No
### Any other information, logs, or outputs that you want to share?
No
|
1.0
|
Update l.gcr.io/google/bazel:latest bazel docker image to bazel version 4.0.0 - > ATTENTION! Please read and follow:
> - if this is a _question_ about how to build / test / query / deploy using Bazel, or a _discussion starter_, send it to bazel-discuss@googlegroups.com
> - if this is a _bug_ or _feature request_, fill the form below as best as you can.
### Description of the problem / feature request:
`l.gcr.io/google/bazel:latest` is currently using bazel 3.5.0, let's get this upgraded to 4.0.0
### Feature requests: what underlying problem are you trying to solve with this feature?
Wanting to use bazel 4.0.0
### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
```
$ docker run l.gcr.io/google/bazel:latest --version
bazel 3.5.0
```
### What operating system are you running Bazel on?
Ubuntu 20.04
### What's the output of `bazel info release`?
N/A
### If `bazel info release` returns "development version" or "(@non-git)", tell us how you built Bazel.
N/A
### What's the output of `git remote get-url origin ; git rev-parse master ; git rev-parse HEAD` ?
N/A
### Have you found anything relevant by searching the web?
No
### Any other information, logs, or outputs that you want to share?
No
|
process
|
update l gcr io google bazel latest bazel docker image to bazel version attention please read and follow if this is a question about how to build test query deploy using bazel or a discussion starter send it to bazel discuss googlegroups com if this is a bug or feature request fill the form below as best as you can description of the problem feature request l gcr io google bazel latest is currently using bazel let s get this upgraded to feature requests what underlying problem are you trying to solve with this feature wanting to use bazel bugs what s the simplest easiest way to reproduce this bug please provide a minimal example if possible docker run l gcr io google bazel latest version bazel what operating system are you running bazel on ubuntu what s the output of bazel info release n a if bazel info release returns development version or non git tell us how you built bazel n a what s the output of git remote get url origin git rev parse master git rev parse head n a have you found anything relevant by searching the web no any other information logs or outputs that you want to share no
| 1
|
565,836
| 16,771,123,055
|
IssuesEvent
|
2021-06-14 14:54:16
|
Brevada/brv
|
https://api.github.com/repos/Brevada/brv
|
closed
|
Remove Decimal Point From Data Presentation
|
enhancement high priority ux
|
I don't think we need to show the scores with a decimal point - on an 80 point scale its meaningless.
@RobbieGoldfarb @noahnu
|
1.0
|
Remove Decimal Point From Data Presentation - I don't think we need to show the scores with a decimal point - on an 80 point scale its meaningless.
@RobbieGoldfarb @noahnu
|
non_process
|
remove decimal point from data presentation i don t think we need to show the scores with a decimal point on an point scale its meaningless robbiegoldfarb noahnu
| 0
|
6,019
| 8,822,900,713
|
IssuesEvent
|
2019-01-02 11:20:21
|
linnovate/root
|
https://api.github.com/repos/linnovate/root
|
opened
|
Search: can't delete in multiple select just after refresh/go to other tab
|
2.0.6 Process bug bug
|
open a few items with name of test.
search in the search line : "test".
click on the multiple select and choose all items.

click on delete.

the items didn't delete until you refresh or going to other tab.
|
1.0
|
Search: can't delete in multiple select just after refresh/go to other tab - open a few items with name of test.
search in the search line : "test".
click on the multiple select and choose all items.

click on delete.

the items didn't delete until you refresh or going to other tab.
|
process
|
search can t delete in multiple select just after refresh go to other tab open a few items with name of test search in the search line test click on the multiple select and choose all items click on delete the items didn t delete until you refresh or going to other tab
| 1
|
80,801
| 10,210,774,378
|
IssuesEvent
|
2019-08-14 15:26:45
|
google/nixery
|
https://api.github.com/repos/google/nixery
|
closed
|
Pin specific nixpkgs checkout
|
documentation enhancement
|
For the sake of reproducibility and stability, it would be great if we could specify a specific nixpkgs checkout version that should be used while assembling the container.
|
1.0
|
Pin specific nixpkgs checkout - For the sake of reproducibility and stability, it would be great if we could specify a specific nixpkgs checkout version that should be used while assembling the container.
|
non_process
|
pin specific nixpkgs checkout for the sake of reproducibility and stability it would be great if we could specify a specific nixpkgs checkout version that should be used while assembling the container
| 0
|
390,914
| 11,565,696,595
|
IssuesEvent
|
2020-02-20 10:59:32
|
wso2/product-apim
|
https://api.github.com/repos/wso2/product-apim
|
closed
|
No way to download WSDL file from Dev Portal
|
3.1.0 Priority/Normal Type/Improvement
|
### Describe your problem(s)
In 2.6.0, WSDL can be downloaded from Store for the REST APIs generated from SOAP back-end. Refer the following screenshot:

In 3.1.0 Dev Portal, no UI option to download WSDL file.
|
1.0
|
No way to download WSDL file from Dev Portal - ### Describe your problem(s)
In 2.6.0, WSDL can be downloaded from Store for the REST APIs generated from SOAP back-end. Refer the following screenshot:

In 3.1.0 Dev Portal, no UI option to download WSDL file.
|
non_process
|
no way to download wsdl file from dev portal describe your problem s in wsdl can be downloaded from store for the rest apis generated from soap back end refer the following screenshot in dev portal no ui option to download wsdl file
| 0
|
19,306
| 25,466,659,346
|
IssuesEvent
|
2022-11-25 05:34:03
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[IDP] [PM] Submit button is disabled in the account setup screen
|
Bug Blocker P0 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
|
**Steps:**
1. Add admins in the participant manager by entering the value in the phone number field which should be as similar in the format displayed in the UI
2. After getting account setup link in an email, click on that link
3. Complete all the fields and click on Submit button and Verify
**AR:** Submit button is disabled in the account setup screen
**ER:** Submit button should not be disabled in the account setup screen
[screen-capture (99).webm](https://user-images.githubusercontent.com/86007179/194277119-34e340cb-2d15-42a1-9e91-3b04a01b5596.webm)
|
3.0
|
[IDP] [PM] Submit button is disabled in the account setup screen - **Steps:**
1. Add admins in the participant manager by entering the value in the phone number field which should be as similar in the format displayed in the UI
2. After getting account setup link in an email, click on that link
3. Complete all the fields and click on Submit button and Verify
**AR:** Submit button is disabled in the account setup screen
**ER:** Submit button should not be disabled in the account setup screen
[screen-capture (99).webm](https://user-images.githubusercontent.com/86007179/194277119-34e340cb-2d15-42a1-9e91-3b04a01b5596.webm)
|
process
|
submit button is disabled in the account setup screen steps add admins in the participant manager by entering the value in the phone number field which should be as similar in the format displayed in the ui after getting account setup link in an email click on that link complete all the fields and click on submit button and verify ar submit button is disabled in the account setup screen er submit button should not be disabled in the account setup screen
| 1
|
120,120
| 25,743,610,461
|
IssuesEvent
|
2022-12-08 08:16:59
|
cython/cython
|
https://api.github.com/repos/cython/cython
|
closed
|
Illegal C code created with double complex
|
defect Code Generation
|
I am using `double complex` numbers in my code in various places, e.g., as array types, with a custom type defined by
```
ctypedef double complex dcomplex
```
This results in faulty C code:
```
/* Declarations.proto */
#if CYTHON_CCOMPLEX
#ifdef __cplusplus
typedef ::std::complex< npy_float64 > __pyx_t_npy_float64_complex;
#else
typedef npy_float64 _Complex __pyx_t_npy_float64_complex;
#endif
#else
typedef struct { npy_float64 real, imag; } __pyx_t_npy_float64_complex;
#endif
static CYTHON_INLINE __pyx_t_npy_float64_complex __pyx_t_npy_float64_complex_from_parts(npy_float64, npy_float
```
The ` typedef npy_float64 _Complex` is invalid as `_Complex` is only allowed to be used with basic types like `float` and `double`. If I manually change `npy_float64` to `double` everything works fine.
Interestingly, the code generated for `float complex` is correct:
```
/* Declarations.proto */
#if CYTHON_CCOMPLEX
#ifdef __cplusplus
typedef ::std::complex< float > __pyx_t_float_complex;
#else
typedef float _Complex __pyx_t_float_complex;
#endif
#else
typedef struct { float real, imag; } __pyx_t_float_complex;
#endif
static CYTHON_INLINE __pyx_t_float_complex __pyx_t_float_complex_from_parts(float, float);
```
This is observed with Cython 0.25.2 on macOS with both gcc 6.3.0 and Apple LLVM's clang 8.1.0.
|
1.0
|
Illegal C code created with double complex - I am using `double complex` numbers in my code in various places, e.g., as array types, with a custom type defined by
```
ctypedef double complex dcomplex
```
This results in faulty C code:
```
/* Declarations.proto */
#if CYTHON_CCOMPLEX
#ifdef __cplusplus
typedef ::std::complex< npy_float64 > __pyx_t_npy_float64_complex;
#else
typedef npy_float64 _Complex __pyx_t_npy_float64_complex;
#endif
#else
typedef struct { npy_float64 real, imag; } __pyx_t_npy_float64_complex;
#endif
static CYTHON_INLINE __pyx_t_npy_float64_complex __pyx_t_npy_float64_complex_from_parts(npy_float64, npy_float
```
The ` typedef npy_float64 _Complex` is invalid as `_Complex` is only allowed to be used with basic types like `float` and `double`. If I manually change `npy_float64` to `double` everything works fine.
Interestingly, the code generated for `float complex` is correct:
```
/* Declarations.proto */
#if CYTHON_CCOMPLEX
#ifdef __cplusplus
typedef ::std::complex< float > __pyx_t_float_complex;
#else
typedef float _Complex __pyx_t_float_complex;
#endif
#else
typedef struct { float real, imag; } __pyx_t_float_complex;
#endif
static CYTHON_INLINE __pyx_t_float_complex __pyx_t_float_complex_from_parts(float, float);
```
This is observed with Cython 0.25.2 on macOS with both gcc 6.3.0 and Apple LLVM's clang 8.1.0.
|
non_process
|
illegal c code created with double complex i am using double complex numbers in my code in various places e g as array types with a custom type defined by ctypedef double complex dcomplex this results in faulty c code declarations proto if cython ccomplex ifdef cplusplus typedef std complex pyx t npy complex else typedef npy complex pyx t npy complex endif else typedef struct npy real imag pyx t npy complex endif static cython inline pyx t npy complex pyx t npy complex from parts npy npy float the typedef npy complex is invalid as complex is only allowed to be used with basic types like float and double if i manually change npy to double everything works fine interestingly the code generated for float complex is correct declarations proto if cython ccomplex ifdef cplusplus typedef std complex pyx t float complex else typedef float complex pyx t float complex endif else typedef struct float real imag pyx t float complex endif static cython inline pyx t float complex pyx t float complex from parts float float this is observed with cython on macos with both gcc and apple llvm s clang
| 0
|
807,971
| 30,028,741,417
|
IssuesEvent
|
2023-06-27 08:11:38
|
microsoft/PowerToys
|
https://api.github.com/repos/microsoft/PowerToys
|
closed
|
powerlauncher preventing powershell 7 uninstall/update
|
Issue-Bug Product-PowerToys Run Priority-1 Needs-Triage Needs-Team-Response
|
<!--
**Important: When reporting BSODs or security issues, DO NOT attach memory dumps, logs, or traces to Github issues**.
Instead, send dumps/traces to secure@microsoft.com, referencing this GitHub issue.
-->
## ℹ Computer information
- PowerToys version: 0.33.1
- PowerToy Utility: powerlauncher
- Running PowerToys as Admin: yes
- Windows build number: 10.0.19043.867
## 📝 Provide detailed reproduction steps (if any)
1. have PowerShell 7 installed using the msi package (mine is x64)
2. have powertoys running (in my testing i didn't need to open/interact with powerlauncher to have it happen)
3. try to uninstall PowerShell 7 from control panel or try to update it with a newer msi package
### ✔️ Expected result
should complete uninstalling/updating without problems
### ❌ Actual result
it says that powerlauncher process should be terminated before proceeding.
## 📷 Screenshots

|
1.0
|
powerlauncher preventing powershell 7 uninstall/update - <!--
**Important: When reporting BSODs or security issues, DO NOT attach memory dumps, logs, or traces to Github issues**.
Instead, send dumps/traces to secure@microsoft.com, referencing this GitHub issue.
-->
## ℹ Computer information
- PowerToys version: 0.33.1
- PowerToy Utility: powerlauncher
- Running PowerToys as Admin: yes
- Windows build number: 10.0.19043.867
## 📝 Provide detailed reproduction steps (if any)
1. have PowerShell 7 installed using the msi package (mine is x64)
2. have powertoys running (in my testing i didn't need to open/interact with powerlauncher to have it happen)
3. try to uninstall PowerShell 7 from control panel or try to update it with a newer msi package
### ✔️ Expected result
should complete uninstalling/updating without problems
### ❌ Actual result
it says that powerlauncher process should be terminated before proceeding.
## 📷 Screenshots

|
non_process
|
powerlauncher preventing powershell uninstall update important when reporting bsods or security issues do not attach memory dumps logs or traces to github issues instead send dumps traces to secure microsoft com referencing this github issue ℹ computer information powertoys version powertoy utility powerlauncher running powertoys as admin yes windows build number 📝 provide detailed reproduction steps if any have powershell installed using the msi package mine is have powertoys running in my testing i didn t need to open interact with powerlauncher to have it happen try to uninstall powershell from control panel or try to update it with a newer msi package ✔️ expected result should complete uninstaling updating without problems ❌ actual result it says that powerlauncher process should be terminated before proceeding 📷 screenshots
| 0
|
13,790
| 16,550,848,877
|
IssuesEvent
|
2021-05-28 08:24:55
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Mobile] Studies are not displaying in Study list screen in mobile
|
Blocker Bug P0 Process: Fixed Process: Tested QA Unknown backend
|
Issue is observed in QA instance

|
2.0
|
[Mobile] Studies are not displaying in Study list screen in mobile - Issue is observed in QA instance

|
process
|
studies are not displaying in study list screen in mobile issue is observed in qa instance
| 1
|
241,449
| 26,256,798,503
|
IssuesEvent
|
2023-01-06 01:58:59
|
DavidSpek/kubeflow
|
https://api.github.com/repos/DavidSpek/kubeflow
|
opened
|
CVE-2022-0155 (Medium) detected in follow-redirects-1.9.0.tgz
|
security vulnerability
|
## CVE-2022-0155 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>follow-redirects-1.9.0.tgz</b></p></summary>
<p>HTTP and HTTPS modules that follow redirects.</p>
<p>Library home page: <a href="https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.9.0.tgz">https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.9.0.tgz</a></p>
<p>Path to dependency file: /components/centraldashboard/package.json</p>
<p>Path to vulnerable library: /components/centraldashboard/node_modules/follow-redirects/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.3.0.tgz (Root Library)
- http-proxy-1.18.0.tgz
- :x: **follow-redirects-1.9.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
follow-redirects is vulnerable to Exposure of Private Personal Information to an Unauthorized Actor
<p>Publish Date: 2022-01-10
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-0155>CVE-2022-0155</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/fc524e4b-ebb6-427d-ab67-a64181020406/">https://huntr.dev/bounties/fc524e4b-ebb6-427d-ab67-a64181020406/</a></p>
<p>Release Date: 2022-01-10</p>
<p>Fix Resolution (follow-redirects): 1.14.7</p>
<p>Direct dependency fix Resolution (karma): 4.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-0155 (Medium) detected in follow-redirects-1.9.0.tgz - ## CVE-2022-0155 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>follow-redirects-1.9.0.tgz</b></p></summary>
<p>HTTP and HTTPS modules that follow redirects.</p>
<p>Library home page: <a href="https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.9.0.tgz">https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.9.0.tgz</a></p>
<p>Path to dependency file: /components/centraldashboard/package.json</p>
<p>Path to vulnerable library: /components/centraldashboard/node_modules/follow-redirects/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.3.0.tgz (Root Library)
- http-proxy-1.18.0.tgz
- :x: **follow-redirects-1.9.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
follow-redirects is vulnerable to Exposure of Private Personal Information to an Unauthorized Actor
<p>Publish Date: 2022-01-10
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-0155>CVE-2022-0155</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/fc524e4b-ebb6-427d-ab67-a64181020406/">https://huntr.dev/bounties/fc524e4b-ebb6-427d-ab67-a64181020406/</a></p>
<p>Release Date: 2022-01-10</p>
<p>Fix Resolution (follow-redirects): 1.14.7</p>
<p>Direct dependency fix Resolution (karma): 4.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in follow redirects tgz cve medium severity vulnerability vulnerable library follow redirects tgz http and https modules that follow redirects library home page a href path to dependency file components centraldashboard package json path to vulnerable library components centraldashboard node modules follow redirects package json dependency hierarchy karma tgz root library http proxy tgz x follow redirects tgz vulnerable library found in base branch master vulnerability details follow redirects is vulnerable to exposure of private personal information to an unauthorized actor publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution follow redirects direct dependency fix resolution karma step up your open source security game with mend
| 0
|
1,415
| 3,979,671,041
|
IssuesEvent
|
2016-05-06 01:17:01
|
neuropoly/spinalcordtoolbox
|
https://api.github.com/repos/neuropoly/spinalcordtoolbox
|
opened
|
minor improvements on EXCEL output
|
enhancement sct_process_segmentation
|
- move column F between column A and B
- when "z" is selected, the column "Vertebral level" shows "ALL", which is wrong. Please change for "NaN"
|
1.0
|
minor improvements on EXCEL output - - move column F between column A and B
- when "z" is selected, the column "Vertebral level" shows "ALL", which is wrong. Please change for "NaN"
|
process
|
minor improvements on excel output move column f between column a and b when z is selected the column vertebral level shows all which is wrong please change for nan
| 1
|
9,235
| 12,264,075,452
|
IssuesEvent
|
2020-05-07 03:03:51
|
hashicorp/packer
|
https://api.github.com/repos/hashicorp/packer
|
closed
|
vmware-iso builder losing artifact before post-processor can use it
|
bug post-processor/vsphere
|
#### Overview of the Issue
I can't put my finger on it, but I have a hunch that the `vmware-iso` builder is losing the `ova` artifact before the `vsphere` post-processor can use it.
I was getting the error `\* Post-processor failed: Error uploading virtual machine: exit status 1`, and saw https://github.com/hashicorp/packer/issues/7368 was an issue, so I moved the `output_folder` to the current working directory, but now I'm getting a new error `cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)"}`.
My goal is to upload the OVA to a location in the datastore.
#### Reproduction Steps
1. `packer build`
2. Errors on post-processor
### Packer version
1.5.5
### Simplified Packer Buildfile
```
{
"builders": [{
"boot_wait": "5s",
"boot_command": [
"<esc>",
"<wait>linux inst.ks=hd:/dev/fd0:ks.cfg<enter>"
],
"floppy_files": [
"ks.cfg"
],
"type": "vmware-iso",
"remote_username": "REDACTED",
"remote_password": "{{user `vcenter_password`}}",
"iso_checksum_type": "file",
"iso_checksum": "http://spout.ussg.indiana.edu/linux/centos/7.8.2003/isos/x86_64/sha256sum.txt.asc",
"guest_os_type": "centos7_64Guest",
"iso_urls": "http://spout.ussg.indiana.edu/linux/centos/7.8.2003/isos/x86_64/CentOS-7-x86_64-Minimal-2003.iso",
"vm_name": "devops-packer-base-centos-{{user `unix_timestamp`}}",
"ssh_username": "REDACTED",
"ssh_password": "REDACTED",
"disk_size": "40000",
"remote_host": "REDACTED",
"remote_datastore": "REDACTED",
"remote_cache_datastore": "REDACTED",
"remote_cache_directory": "REDACTED",
"network_adapter_type": "vmxnet3",
"memory": "8000",
"remote_type": "esx5",
"format": "ova",
"vnc_disable_password": true,
"ssh_timeout": "40m",
"keep_registered": false,
"vmx_data": {
"ethernet0.networkName": "VM Network"
},
"ovftool_options": [
"--X:logLevel=trivia",
"--X:logToConsole",
"--X:noPrompting"
],
"output_directory":"."
}
],
"provisioners" :[{
"type": "inspec",
"profile": "inspec/base-os"
}
],
"post-processors": [{
"type": "vsphere",
"cluster":"REDACTED",
"datacenter":"ninja-lab",
"host": "REDACTED",
"password": "{{user `vcenter_password`}}",
"username": "REDACTED",
"datastore": "REDACTED",
"vm_name": "devops-packer-base-centos-{{user `unix_timestamp`}}",
"insecure": "true",
"options":[
"--X:logToConsole",
"--X:logLevel=trivia",
"--X:noPrompting"
],
"vm_folder": "base-ova",
"keep_input_artifact": true
}]
}
```
### Operating system and Environment details
macOS Catalina 10.15.4
### Log Fragments and crash.log files
```
==> vmware-iso: Running post-processor: vsphere
vmware-iso (vsphere): Uploading devops-packer-base-centos-1588712204.ova to vSphere
2020/05/05 17:30:07 packer-post-processor-vsphere plugin: Starting ovftool with parameters: --acceptAllEulas --name=devops-packer-base-centos-1588713348 --datastore=ISOs --noSSLVerify=true --diskMode=thick --vmFolder=base-ova --X:logToConsole --X:logLevel=trivia --X:noPrompting devops-packer-base-centos-1588712204.ova vi://REDACTED:<password>@REDACTED
2020/05/05 17:30:10 [INFO] (telemetry) ending vsphere
2020/05/05 17:30:10 Deleting original artifact for build 'vmware-iso'
* Post-processor failed: Error uploading virtual machine: exit status 1
* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)
2020/05/05 17:30:10 machine readable: error-count []string{"1"}
==> Some builds didn't complete successfully and had errors:
2020/05/05 17:30:10 machine readable: vmware-iso,error []string{"2 error(s) occurred:\n\n* Post-processor failed: Error uploading virtual machine: exit status 1\n\n\n* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)"}
* Post-processor failed: Error uploading virtual machine: exit status 1
* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)
==> Builds finished but no artifacts were created.
2020/05/05 17:30:10 [INFO] (telemetry) Finalizing.
Build 'vmware-iso' errored: 2 error(s) occurred:
* Post-processor failed: Error uploading virtual machine: exit status 1
* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)
==> Some builds didn't complete successfully and had errors:
--> vmware-iso: 2 error(s) occurred:
* Post-processor failed: Error uploading virtual machine: exit status 1
* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)
```
|
1.0
|
vmware-iso builder losing artifact before post-processor can use it - #### Overview of the Issue
I can't put my finger on it, but I have a hunch that the `vmware-iso` builder is losing the `ova` artifact before the `vsphere` post-processor can use it.
I was getting the error `\* Post-processor failed: Error uploading virtual machine: exit status 1`, and saw https://github.com/hashicorp/packer/issues/7368 was an issue, so I moved the `output_folder` to the current working directory, but now I'm getting a new error `cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)"}`.
My goal is to upload the OVA to a location in the datastore.
#### Reproduction Steps
1. `packer build`
2. Errors on post-processor
### Packer version
1.5.5
### Simplified Packer Buildfile
```
{
"builders": [{
"boot_wait": "5s",
"boot_command": [
"<esc>",
"<wait>linux inst.ks=hd:/dev/fd0:ks.cfg<enter>"
],
"floppy_files": [
"ks.cfg"
],
"type": "vmware-iso",
"remote_username": "REDACTED",
"remote_password": "{{user `vcenter_password`}}",
"iso_checksum_type": "file",
"iso_checksum": "http://spout.ussg.indiana.edu/linux/centos/7.8.2003/isos/x86_64/sha256sum.txt.asc",
"guest_os_type": "centos7_64Guest",
"iso_urls": "http://spout.ussg.indiana.edu/linux/centos/7.8.2003/isos/x86_64/CentOS-7-x86_64-Minimal-2003.iso",
"vm_name": "devops-packer-base-centos-{{user `unix_timestamp`}}",
"ssh_username": "REDACTED",
"ssh_password": "REDACTED",
"disk_size": "40000",
"remote_host": "REDACTED",
"remote_datastore": "REDACTED",
"remote_cache_datastore": "REDACTED",
"remote_cache_directory": "REDACTED",
"network_adapter_type": "vmxnet3",
"memory": "8000",
"remote_type": "esx5",
"format": "ova",
"vnc_disable_password": true,
"ssh_timeout": "40m",
"keep_registered": false,
"vmx_data": {
"ethernet0.networkName": "VM Network"
},
"ovftool_options": [
"--X:logLevel=trivia",
"--X:logToConsole",
"--X:noPrompting"
],
"output_directory":"."
}
],
"provisioners" :[{
"type": "inspec",
"profile": "inspec/base-os"
}
],
"post-processors": [{
"type": "vsphere",
"cluster":"REDACTED",
"datacenter":"ninja-lab",
"host": "REDACTED",
"password": "{{user `vcenter_password`}}",
"username": "REDACTED",
"datastore": "REDACTED",
"vm_name": "devops-packer-base-centos-{{user `unix_timestamp`}}",
"insecure": "true",
"options":[
"--X:logToConsole",
"--X:logLevel=trivia",
"--X:noPrompting"
],
"vm_folder": "base-ova",
"keep_input_artifact": true
}]
}
```
### Operating system and Environment details
macOS Catalina 10.15.4
### Log Fragments and crash.log files
```
==> vmware-iso: Running post-processor: vsphere
vmware-iso (vsphere): Uploading devops-packer-base-centos-1588712204.ova to vSphere
2020/05/05 17:30:07 packer-post-processor-vsphere plugin: Starting ovftool with parameters: --acceptAllEulas --name=devops-packer-base-centos-1588713348 --datastore=ISOs --noSSLVerify=true --diskMode=thick --vmFolder=base-ova --X:logToConsole --X:logLevel=trivia --X:noPrompting devops-packer-base-centos-1588712204.ova vi://REDACTED:<password>@REDACTED
2020/05/05 17:30:10 [INFO] (telemetry) ending vsphere
2020/05/05 17:30:10 Deleting original artifact for build 'vmware-iso'
* Post-processor failed: Error uploading virtual machine: exit status 1
* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)
2020/05/05 17:30:10 machine readable: error-count []string{"1"}
==> Some builds didn't complete successfully and had errors:
2020/05/05 17:30:10 machine readable: vmware-iso,error []string{"2 error(s) occurred:\n\n* Post-processor failed: Error uploading virtual machine: exit status 1\n\n\n* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)"}
* Post-processor failed: Error uploading virtual machine: exit status 1
* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)
==> Builds finished but no artifacts were created.
2020/05/05 17:30:10 [INFO] (telemetry) Finalizing.
Build 'vmware-iso' errored: 2 error(s) occurred:
* Post-processor failed: Error uploading virtual machine: exit status 1
* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)
==> Some builds didn't complete successfully and had errors:
--> vmware-iso: 2 error(s) occurred:
* Post-processor failed: Error uploading virtual machine: exit status 1
* Error destroying builder artifact: reading body [pos 106559]: cannot decode non-nil codec value into nil error (1 methods); bad artifact: []string(nil)
```
|
process
|
vmware iso builder losing artifact before post processor can use it overview of the issue i can t put my finger on it but i have a hunch that the vmware iso builder is losing the ova artifact before the vsphere post processor can use it i was getting the error post processor failed error uploading virtual machine exit status and saw was an issue so i moved the output folder to the current working directory but now i m getting a new error cannot decode non nil codec value into nil error methods bad artifact string nil my goal is to upload the ova to a location in the datastore reproduction steps packer build errors on post processor packer version simplified packer buildfile builders boot wait boot command linux inst ks hd dev ks cfg floppy files ks cfg type vmware iso remote username redacted remote password user vcenter password iso checksum type file iso checksum guest os type iso urls vm name devops packer base centos user unix timestamp ssh username redacted ssh password redacted disk size remote host redacted remote datastore redacted remote cache datastore redacted remote cache directory redacted network adapter type memory remote type format ova vnc disable password true ssh timeout keep registered false vmx data networkname vm network ovftool options x loglevel trivia x logtoconsole x noprompting output directory provisioners type inspec profile inspec base os post processors type vsphere cluster redacted datacenter ninja lab host redacted password user vcenter password username redacted datastore redacted vm name devops packer base centos user unix timestamp insecure true options x logtoconsole x loglevel trivia x noprompting vm folder base ova keep input artifact true operating system and environment details macos catalina log fragments and crash log files vmware iso running post processor vsphere vmware iso vsphere uploading devops packer base centos ova to vsphere packer post processor vsphere plugin starting ovftool with parameters acceptalleulas name devops packer base centos datastore isos nosslverify true diskmode thick vmfolder base ova x logtoconsole x loglevel trivia x noprompting devops packer base centos ova vi redacted redacted telemetry ending vsphere deleting original artifact for build vmware iso post processor failed error uploading virtual machine exit status error destroying builder artifact reading body cannot decode non nil codec value into nil error methods bad artifact string nil machine readable error count string some builds didn t complete successfully and had errors machine readable vmware iso error string error s occurred n n post processor failed error uploading virtual machine exit status n n n error destroying builder artifact reading body cannot decode non nil codec value into nil error methods bad artifact string nil post processor failed error uploading virtual machine exit status error destroying builder artifact reading body cannot decode non nil codec value into nil error methods bad artifact string nil builds finished but no artifacts were created telemetry finalizing build vmware iso errored error s occurred post processor failed error uploading virtual machine exit status error destroying builder artifact reading body cannot decode non nil codec value into nil error methods bad artifact string nil some builds didn t complete successfully and had errors vmware iso error s occurred post processor failed error uploading virtual machine exit status error destroying builder artifact reading body cannot decode non nil codec value into nil error methods bad artifact string nil
| 1
|
339,617
| 10,256,870,340
|
IssuesEvent
|
2019-08-21 18:42:33
|
byu-animation/dccpipe
|
https://api.github.com/repos/byu-animation/dccpipe
|
opened
|
Optimize which departments are shown in lists when publishing
|
Maya enhancement good first issue priority: medium
|
We don't want users publishing to an un-used department for a prop, like cfx or fx when they are creating a static prop.
|
1.0
|
Optimize which departments are shown in lists when publishing - We don't want users publishing to an un-used department for a prop, like cfx or fx when they are creating a static prop.
|
non_process
|
optimize which departments are shown in lists when publishing we don t want users publishing to an un used department for a prop like cfx or fx when they are creating a static prop
| 0
|
44,273
| 2,902,837,296
|
IssuesEvent
|
2015-06-18 09:42:37
|
OpenDataNode/open-data-node
|
https://api.github.com/repos/OpenDataNode/open-data-node
|
opened
|
CKAN datastore: no view created
|
priority: High severity: bug
|
In 2.3 CKAN, there is no view generated in resource detail when using dpus:
* relationalToCkan
* relationalDiffToCkan
The resource is created correctly and data is inserted too, only the recline view isn't generated automatically and has to be added manually.
|
1.0
|
CKAN datastore: no view created - In 2.3 CKAN, there is no view generated in resource detail when using dpus:
* relationalToCkan
* relationalDiffToCkan
The resource is created correctly and data is inserted too, only the recline view isn't generated automatically and has to be added manually.
|
non_process
|
ckan datastore no view created in ckan there is no view generated in resource detail when using dpus relationaltockan relationaldifftockan the resource is created correctly and data is inserted too only the recline view isn t generated automatically and has to be added manually
| 0
|
99,184
| 4,049,127,056
|
IssuesEvent
|
2016-05-23 13:09:51
|
duckduckgo/community-platform
|
https://api.github.com/repos/duckduckgo/community-platform
|
closed
|
insert image does not allow upload
|
Feature Forum Improvement Priority: Low
|
current forum post `insert image` button prompts for an image url but does not allow a new image to be uploaded to dukgo.com
|
1.0
|
insert image does not allow upload - current forum post `insert image` button prompts for an image url but does not allow a new image to be uploaded to dukgo.com
|
non_process
|
insert image does not allow upload current forum post insert image button prompts for an image url but does not allow a new image to be uploaded to dukgo com
| 0
|
13,975
| 16,748,118,543
|
IssuesEvent
|
2021-06-11 18:23:11
|
sysflow-telemetry/sf-docs
|
https://api.github.com/repos/sysflow-telemetry/sf-docs
|
closed
|
Pull and update policies from S3/object store bucket
|
enhancement sf-processor
|
Implement a mechanism for pulling and updating policies from S3. Things to consider:
- configurable update/pull checks
- checksum/hash checking during updates
- policy syntax check before updating (use `compile` function from policy engine)
- use secrets wrapper to read S3 access/secret keys from container vault (also, support key setting in the pipeline configuration for testing/debugging)
|
1.0
|
Pull and update policies from S3/object store bucket - Implement a mechanism for pulling and updating policies from S3. Things to consider:
- configurable update/pull checks
- checksum/hash checking during updates
- policy syntax check before updating (use `compile` function from policy engine)
- use secrets wrapper to read S3 access/secret keys from container vault (also, support key setting in the pipeline configuration for testing/debugging)
|
process
|
pull and update policies from object store bucket implement a mechanism for pulling and updating policies from things to consider configurable update pull checks checksum hash checking during updates policy syntax check before updating use compile function from policy engine use secrets wrapper to read access secret keys from container vault also support key setting in the pipeline configuration for testing debugging
| 1
|
8,079
| 20,822,064,613
|
IssuesEvent
|
2022-03-18 16:20:30
|
RuanScherer/api-health-checker
|
https://api.github.com/repos/RuanScherer/api-health-checker
|
closed
|
Prepare backend base environment
|
architecture backend
|
**Describe the solution you'd like**
Prepare the backend development environment using:
- Node.js
- Typescript
- Express.js
- CORS
|
1.0
|
Prepare backend base environment - **Describe the solution you'd like**
Prepare the backend development environment using:
- Node.js
- Typescript
- Express.js
- CORS
|
non_process
|
prepare backend base environment describe the solution you d like prepare the backend development environment using node js typescript express js cors
| 0
|
13,571
| 16,109,056,243
|
IssuesEvent
|
2021-04-27 18:31:47
|
hashgraph/hedera-mirror-node
|
https://api.github.com/repos/hashgraph/hedera-mirror-node
|
closed
|
Optimize Timescaledb migration script
|
P3 database enhancement process
|
**Problem**
[This PR](https://github.com/hashgraph/hedera-mirror-node/pull/1798) added some fixes to the timescaledb migration script, however some optimization could still be done.
**Solution**
- Find a way to handle the column order mismatch that does not require an explicit column list in the restore script (this becomes a maintenance burden as time goes on). Some suggestions include using pg_dump instead of COPY commands, as it provides a way to pull the column headers with the data, as well as using `header true` for the copy commands.
- It may also be worthwhile to investigate using the Flyway CLI for parts of the migration, instead of just referencing the Flyway migration in part of the script, as this causes the Flyway migration to be run twice (the migrations are written in a way that this does not impact anything, so this is not as critical).
|
1.0
|
Optimize Timescaledb migration script - **Problem**
[This PR](https://github.com/hashgraph/hedera-mirror-node/pull/1798) added some fixes to the timescaledb migration script, however some optimization could still be done.
**Solution**
- Find a way to handle the column order mismatch that does not require an explicit column list in the restore script (this becomes a maintenance burden as time goes on). Some suggestions include using pg_dump instead of COPY commands, as it provides a way to pull the column headers with the data, as well as using `header true` for the copy commands.
- It may also be worthwhile to investigate using the Flyway CLI for parts of the migration, instead of just referencing the Flyway migration in part of the script, as this causes the Flyway migration to be run twice (the migrations are written in a way that this does not impact anything, so this is not as critical).
|
process
|
optimize timescaledb migration script problem added some fixes to the timescaledb migration script however some optimization could still be done solution find a way to handle the column order mismatch that does not require an explicit column list in the restore script this becomes a maintenance burden as time goes on some suggestions include using pg dump instead of copy commands as it provides a way to pull the column headers with the data as well as using header true for the copy commands it may also be worthwhile to investigate using the flyway cli for parts of the migration instead of just referencing the flyway migration in part of the script as this causes the flyway migration to be run twice the migrations are written in a way that this does not impact anything so this is not as critical
| 1
|
324,187
| 23,987,561,588
|
IssuesEvent
|
2022-09-13 20:34:41
|
sloik/Major.Minor.Patch
|
https://api.github.com/repos/sloik/Major.Minor.Patch
|
closed
|
Expand README.md
|
documentation good first issue
|
# What to do
Add more information sections to `README.md`:
## Acceptance Criteria
Sections that are added:
* Installation
* short description about semantic versioning and link to the specification
* some examples of usage
|
1.0
|
Expand README.md - # What to do
Add more information sections to `README.md`:
## Acceptance Criteria
Sections that are added:
* Installation
* short description about semantic versioning and link to the specification
* some examples of usage
|
non_process
|
expand readme md what to do add more information sections to readme md acceptance criteria sections that are added installation short description about semantic versioning and link to the specification some examples of usage
| 0
|
162,062
| 20,164,376,493
|
IssuesEvent
|
2022-02-10 01:46:50
|
kapseliboi/dapp
|
https://api.github.com/repos/kapseliboi/dapp
|
opened
|
CVE-2021-3807 (High) detected in multiple libraries
|
security vulnerability
|
## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-5.0.0.tgz</b>, <b>ansi-regex-3.0.0.tgz</b>, <b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/react-dev-utils/node_modules/strip-ansi/node_modules/ansi-regex/package.json,/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- react-9.5.0.tgz (Root Library)
- dom-6.16.0.tgz
- pretty-format-25.5.0.tgz
- :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/string-length/node_modules/ansi-regex/package.json,/node_modules/log-update/node_modules/ansi-regex/package.json,/node_modules/@oclif/command/node_modules/wrap-ansi/node_modules/ansi-regex/package.json,/node_modules/@oclif/plugin-help/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- apollo-2.30.2.tgz (Root Library)
- listr-0.14.3.tgz
- listr-update-renderer-0.5.0.tgz
- log-update-2.3.0.tgz
- wrap-ansi-3.0.1.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/pretty-format/node_modules/ansi-regex/package.json,/node_modules/react-dev-utils/node_modules/ansi-regex/package.json,/node_modules/strip-ansi/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- jest-dom-4.2.4.tgz (Root Library)
- pretty-format-24.9.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution (ansi-regex): 5.0.1</p>
<p>Direct dependency fix Resolution (@testing-library/react): 10.0.0</p><p>Fix Resolution (ansi-regex): 5.0.1</p>
<p>Direct dependency fix Resolution (@testing-library/jest-dom): 5.1.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-3807 (High) detected in multiple libraries - ## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-5.0.0.tgz</b>, <b>ansi-regex-3.0.0.tgz</b>, <b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/react-dev-utils/node_modules/strip-ansi/node_modules/ansi-regex/package.json,/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- react-9.5.0.tgz (Root Library)
- dom-6.16.0.tgz
- pretty-format-25.5.0.tgz
- :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/string-length/node_modules/ansi-regex/package.json,/node_modules/log-update/node_modules/ansi-regex/package.json,/node_modules/@oclif/command/node_modules/wrap-ansi/node_modules/ansi-regex/package.json,/node_modules/@oclif/plugin-help/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- apollo-2.30.2.tgz (Root Library)
- listr-0.14.3.tgz
- listr-update-renderer-0.5.0.tgz
- log-update-2.3.0.tgz
- wrap-ansi-3.0.1.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/pretty-format/node_modules/ansi-regex/package.json,/node_modules/react-dev-utils/node_modules/ansi-regex/package.json,/node_modules/strip-ansi/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- jest-dom-4.2.4.tgz (Root Library)
- pretty-format-24.9.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution (ansi-regex): 5.0.1</p>
<p>Direct dependency fix Resolution (@testing-library/react): 10.0.0</p><p>Fix Resolution (ansi-regex): 5.0.1</p>
<p>Direct dependency fix Resolution (@testing-library/jest-dom): 5.1.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries ansi regex tgz ansi regex tgz ansi regex tgz ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules react dev utils node modules strip ansi node modules ansi regex package json node modules ansi regex package json dependency hierarchy react tgz root library dom tgz pretty format tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules string length node modules ansi regex package json node modules log update node modules ansi regex package json node modules oclif command node modules wrap ansi node modules ansi regex package json node modules oclif plugin help node modules ansi regex package json dependency hierarchy apollo tgz root library listr tgz listr update renderer tgz log update tgz wrap ansi tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules pretty format node modules ansi regex package json node modules react dev utils node modules ansi regex package json node modules strip ansi node modules ansi regex package json dependency hierarchy jest dom tgz root library pretty format tgz x ansi regex tgz vulnerable library found in base branch master vulnerability details ansi regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ansi regex direct dependency fix resolution testing library react fix resolution ansi regex direct dependency fix resolution testing library jest dom step up your open source security game with whitesource
| 0
|
19,604
| 25,959,130,910
|
IssuesEvent
|
2022-12-18 16:54:11
|
tokio-rs/tokio
|
https://api.github.com/repos/tokio-rs/tokio
|
closed
|
Possible bug in tokio::process::Command
|
C-bug A-tokio M-process
|
**Version**
tokio = { version = "1.23.0", features = ["full"] }
**Platform**
Linux 160R 5.19.17-2-MANJARO
**Description**
From documentation of `tokio::process::Command::wait`:
> The stdin handle to the child process, if any, will be closed
> before waiting. This helps avoid deadlock: it ensures that the
> child does not block waiting for input from the parent, while
> the parent waits for the child to exit.
> If the caller wishes to explicitly control when the child's stdin
> handle is closed, they may `.take()` it before calling `.wait()`:
>```rust
> use tokio::io::AsyncWriteExt;
> use tokio::process::Command;
> use std::process::Stdio;
>
> #[tokio::main]
> async fn main() {
> let mut child = Command::new("cat")
> .stdin(Stdio::piped())
> .spawn()
> .unwrap();
>
> let mut stdin = child.stdin.take().unwrap();
> tokio::spawn(async move {
> // do something with stdin here...
> stdin.write_all(b"hello world\n").await.unwrap();
>
> // then drop when finished
> drop(stdin);
> });
>
> // wait for the process to complete
> let _ = child.wait().await;
>}
>```
But there seems to be something wrong.
I tried this code:
```rust
#[tokio::main]
async fn main() {
let mut p = tokio::process::Command::new("cat")
.stdin(std::fs::OpenOptions::new()
.read(true)
.open("/dev/tty")
.expect("cannot open tty")
)
.spawn()
.expect("cannot spawn");
p.stdin
.take()
.expect("cannot get stdin");
}
```
I expected it to execute without panics.
Instead, it panicked with `cannot get stdin` message.
Is it okay for `p.stdin` to be `None` in this case?
|
1.0
|
Possible bug in tokio::process::Command - **Version**
tokio = { version = "1.23.0", features = ["full"] }
**Platform**
Linux 160R 5.19.17-2-MANJARO
**Description**
From documentation of `tokio::process::Command::wait`:
> The stdin handle to the child process, if any, will be closed
> before waiting. This helps avoid deadlock: it ensures that the
> child does not block waiting for input from the parent, while
> the parent waits for the child to exit.
> If the caller wishes to explicitly control when the child's stdin
> handle is closed, they may `.take()` it before calling `.wait()`:
>```rust
> use tokio::io::AsyncWriteExt;
> use tokio::process::Command;
> use std::process::Stdio;
>
> #[tokio::main]
> async fn main() {
> let mut child = Command::new("cat")
> .stdin(Stdio::piped())
> .spawn()
> .unwrap();
>
> let mut stdin = child.stdin.take().unwrap();
> tokio::spawn(async move {
> // do something with stdin here...
> stdin.write_all(b"hello world\n").await.unwrap();
>
> // then drop when finished
> drop(stdin);
> });
>
> // wait for the process to complete
> let _ = child.wait().await;
>}
>```
But there seems to be something wrong.
I tried this code:
```rust
#[tokio::main]
async fn main() {
let mut p = tokio::process::Command::new("cat")
.stdin(std::fs::OpenOptions::new()
.read(true)
.open("/dev/tty")
.expect("cannot open tty")
)
.spawn()
.expect("cannot spawn");
p.stdin
.take()
.expect("cannot get stdin");
}
```
I expected it to execute without panics.
Instead, it panicked with `cannot get stdin` message.
Is it okay for `p.stdin` to be `None` in this case?
|
process
|
possible bug in tokio process command version tokio version features platform linux manjaro description from documentation of tokio process command wait the stdin handle to the child process if any will be closed before waiting this helps avoid deadlock it ensures that the child does not block waiting for input from the parent while the parent waits for the child to exit if the caller wishes to explicitly control when the child s stdin handle is closed they may take it before calling wait rust use tokio io asyncwriteext use tokio process command use std process stdio async fn main let mut child command new cat stdin stdio piped spawn unwrap let mut stdin child stdin take unwrap tokio spawn async move do something with stdin here stdin write all b hello world n await unwrap then drop when finished drop stdin wait for the process to complete let child wait await but there seems to be something wrong i tried this code rust async fn main let mut p tokio process command new cat stdin std fs openoptions new read true open dev tty expect cannot open tty spawn expect cannot spawn p stdin take expect cannot get stdin i expected it to execute without panics instead it panicked with cannot get stdin message is it okay for p stdin to be none in this case
| 1
|
102,243
| 16,550,354,059
|
IssuesEvent
|
2021-05-28 07:52:17
|
Check-den-Fakt/Frontend
|
https://api.github.com/repos/Check-den-Fakt/Frontend
|
closed
|
CVE-2021-24033 (Medium) detected in react-dev-utils-10.2.1.tgz
|
security vulnerability wontfix
|
## CVE-2021-24033 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>react-dev-utils-10.2.1.tgz</b></p></summary>
<p>webpack utilities used by Create React App</p>
<p>Library home page: <a href="https://registry.npmjs.org/react-dev-utils/-/react-dev-utils-10.2.1.tgz">https://registry.npmjs.org/react-dev-utils/-/react-dev-utils-10.2.1.tgz</a></p>
<p>Path to dependency file: Frontend/package.json</p>
<p>Path to vulnerable library: Frontend/node_modules/react-dev-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.4.1.tgz (Root Library)
- :x: **react-dev-utils-10.2.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
react-dev-utils prior to v11.0.4 exposes a function, getProcessForPort, where an input argument is concatenated into a command string to be executed. This function is typically used from react-scripts (in Create React App projects), where the usage is safe. Only when this function is manually invoked with user-provided values (ie: by custom code) is there the potential for command injection. If you're consuming it from react-scripts then this issue does not affect you.
<p>Publish Date: 2021-03-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-24033>CVE-2021-24033</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.facebook.com/security/advisories/cve-2021-24033">https://www.facebook.com/security/advisories/cve-2021-24033</a></p>
<p>Release Date: 2021-03-09</p>
<p>Fix Resolution: react-dev-utils-11.0.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-24033 (Medium) detected in react-dev-utils-10.2.1.tgz - ## CVE-2021-24033 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>react-dev-utils-10.2.1.tgz</b></p></summary>
<p>webpack utilities used by Create React App</p>
<p>Library home page: <a href="https://registry.npmjs.org/react-dev-utils/-/react-dev-utils-10.2.1.tgz">https://registry.npmjs.org/react-dev-utils/-/react-dev-utils-10.2.1.tgz</a></p>
<p>Path to dependency file: Frontend/package.json</p>
<p>Path to vulnerable library: Frontend/node_modules/react-dev-utils/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.4.1.tgz (Root Library)
- :x: **react-dev-utils-10.2.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
react-dev-utils prior to v11.0.4 exposes a function, getProcessForPort, where an input argument is concatenated into a command string to be executed. This function is typically used from react-scripts (in Create React App projects), where the usage is safe. Only when this function is manually invoked with user-provided values (ie: by custom code) is there the potential for command injection. If you're consuming it from react-scripts then this issue does not affect you.
<p>Publish Date: 2021-03-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-24033>CVE-2021-24033</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.facebook.com/security/advisories/cve-2021-24033">https://www.facebook.com/security/advisories/cve-2021-24033</a></p>
<p>Release Date: 2021-03-09</p>
<p>Fix Resolution: react-dev-utils-11.0.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in react dev utils tgz cve medium severity vulnerability vulnerable library react dev utils tgz webpack utilities used by create react app library home page a href path to dependency file frontend package json path to vulnerable library frontend node modules react dev utils package json dependency hierarchy react scripts tgz root library x react dev utils tgz vulnerable library vulnerability details react dev utils prior to exposes a function getprocessforport where an input argument is concatenated into a command string to be executed this function is typically used from react scripts in create react app projects where the usage is safe only when this function is manually invoked with user provided values ie by custom code is there the potential for command injection if you re consuming it from react scripts then this issue does not affect you publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution react dev utils step up your open source security game with whitesource
| 0
|
510,640
| 14,813,693,712
|
IssuesEvent
|
2021-01-14 02:44:35
|
google/knative-gcp
|
https://api.github.com/repos/google/knative-gcp
|
closed
|
Evaluate the impact of initial delivery timeout on throughput
|
area/broker kind/bug lifecycle/stale priority/2 release/2
|
Currently, BrokerCell deployments have a default timeout of 10 minutes per event. There is also a cap on the number of concurrent open connections. When there is a mix of fast and slow clients, from throughput perspective, it may not always be optimal to hang on to slow clients for that long. For example, if 10 clients out of 1000 are very slow and they happen to be at the beginning of the delivery queue, then there may be notably higher latency observed by the rest of the clients, compared to the case when the slow clients are at the end of the delivery queue. It is likely possible to come up with a more optimal policy, where we would prioritize more responsive clients by keeping it with a shorter timeout on the initial delivery attempt and handing over the slower ones to the retry queue. This task is about quantifying the effect and, if it is meaningful, applying the change.
|
1.0
|
Evaluate the impact of initial delivery timeout on throughput - Currently, BrokerCell deployments have a default timeout of 10 minutes per event. There is also a cap on the number of concurrent open connections. When there is a mix of fast and slow clients, from throughput perspective, it may not always be optimal to hang on to slow clients for that long. For example, if 10 clients out of 1000 are very slow and they happen to be at the beginning of the delivery queue, then there may be notably higher latency observed by the rest of the clients, compared to the case when the slow clients are at the end of the delivery queue. It is likely possible to come up with a more optimal policy, where we would prioritize more responsive clients by keeping it with a shorter timeout on the initial delivery attempt and handing over the slower ones to the retry queue. This task is about quantifying the effect and, if it is meaningful, applying the change.
|
non_process
|
evaluate the impact of initial delivery timeout on throughput currently brokercell deployments have a default timeout of minutes per event there is also a cap on the number of concurrent open connections when there is a mix of fast and slow clients from throughput perspective it may not always be optimal to hang on to slow clients for that long for example if clients out of are very slow and they happen to be at the beginning of the delivery queue then there may be notably higher latency observed by the rest of the clients compared to the case when the slow clients are at the end of the delivery queue it is likely possible to come up with a more optimal policy where we would prioritize more responsive clients by keeping it with a shorter timeout on the initial delivery attempt and handing over the slower ones to the retry queue this task is about quantifying the effect and if it is meaningful applying the change
| 0
|
7,911
| 11,092,068,876
|
IssuesEvent
|
2019-12-15 16:30:37
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Single most recently added Custom GDAL "Create options" profile does not work in Processing
|
Bug Processing
|
**New description**
Go in options > settings > GDAL
and create 2 or more custom "create options" profiles.
Then try to use them in Processing (GDAL tools) or in the "save as..." dialog (for rasters of course). Only the latter will work, while in the former only the 1st custom profile will work, the others shows empty options.
**Describe the bug**
A valid Create Options profile for GeoTIFF that is usable in the Save Raster Layer as... dialog is broken for the Translate (Convert Format) dialog.
**Valid Creation Profile**

**Usage In Save Raster Layer As...**

**How to Reproduce**
Attempt to use by what all appearances appears to be a valid Creation Profile in the Translate (Convert Format) dialog, see that no configuration options are passed to gdal_translate.
Add ANOTHER new valid profile, and prior invalid profile is now usable.



**Broken Profile in Translate (Convert Format)**

**QGIS and OS versions**
QGIS version | 3.8.2-Zanzibar | QGIS code revision | 4470baa1a3
-- | -- | -- | --
Compiled against Qt | 5.11.2 | Running against Qt | 5.11.2
Compiled against GDAL/OGR | 2.4.1 | Running against GDAL/OGR | 2.4.1
Compiled against GEOS | 3.7.2-CAPI-1.11.0 | Running against GEOS | 3.7.2-CAPI-1.11.0 b55d2125
PostgreSQL Client Version | 10.8 | SpatiaLite Version | 4.3.0
QWT Version | 6.1.3 | QScintilla2 Version | 2.10.8
Compiled against PROJ | 5.2.0 | Running against PROJ | Rel. 5.2.0, September 15th, 2018
OS Version | Windows 10 (10.0) | |
**Additional context**
Output of the Translate (Convert Format) dialog using my 1-bit Image Compression - ArcGIS Compatible profile from above
```
QGIS version: 3.8.2-Zanzibar
QGIS code revision: 4470baa1a3
Qt version: 5.11.2
GDAL version: 2.4.1
GEOS version: 3.7.2-CAPI-1.11.0 b55d2125
PROJ version: Rel. 5.2.0, September 15th, 2018
Processing algorithm…
Algorithm 'Translate (convert format)' starting…
Input parameters:
{ 'COPY_SUBDATASETS' : False, 'DATA_TYPE' : 0, 'INPUT' : 'C:/Users/username/Desktop/GeoTIFF_compression_test/test_deflate.tif', 'NODATA' : None, 'OPTIONS' : '', 'OUTPUT' : 'TEMPORARY_OUTPUT', 'TARGET_CRS' : None }
GDAL command:
gdal_translate -of GTiff C:/Users/username/Desktop/GeoTIFF_compression_test/test_deflate.tif C:/Users/username/AppData/Local/Temp/processing_47e9e70b244741bca9e4d08e35bd8e73/bf29ae54ae9843639631c692644d1f50/OUTPUT.tif
GDAL command output:
Input file size is 14016, 9920
0...10...20...30...40...50...60...70...80...90...100 - done.
Execution completed in 2.45 seconds
Results:
{'OUTPUT': 'C:/Users/username/AppData/Local/Temp/processing_47e9e70b244741bca9e4d08e35bd8e73/bf29ae54ae9843639631c692644d1f50/OUTPUT.tif'}
Loading resulting layers
Algorithm 'Translate (convert format)' finished
```
It would appear that this is not simply a display issue as no config information was passed to the tool, and the output raster was the same bitdepth as the input raster (32bit float), not 1-bit as anticipated.
It seems like whatever index is being used to store/recall the Creation Profiles is too short? N-1 or something, because adding another new profile invalidates the newest, but promotes the prior broken profile to working status. So, the list of working profiles is always one entry (the latest entry) short of the full list of stored profiles.
|
1.0
|
Single most recently added Custom GDAL "Create options" profile does not work in Processing - **New description**
Go in options > settings > GDAL
and create 2 or more custom "create options" profiles.
Then try to use them in Processing (GDAL tools) or in the "save as..." dialog (for rasters of course). Only the latter will work, while in the former only the 1st custom profile will work, the others shows empty options.
**Describe the bug**
A valid Create Options profile for GeoTIFF that is usable in the Save Raster Layer as... dialog is broken for the Translate (Convert Format) dialog.
**Valid Creation Profile**

**Usage In Save Raster Layer As...**

**How to Reproduce**
Attempt to use by what all appearances appears to be a valid Creation Profile in the Translate (Convert Format) dialog, see that no configuration options are passed to gdal_translate.
Add ANOTHER new valid profile, and prior invalid profile is now usable.



**Broken Profile in Translate (Convert Format)**

**QGIS and OS versions**
QGIS version | 3.8.2-Zanzibar | QGIS code revision | 4470baa1a3
-- | -- | -- | --
Compiled against Qt | 5.11.2 | Running against Qt | 5.11.2
Compiled against GDAL/OGR | 2.4.1 | Running against GDAL/OGR | 2.4.1
Compiled against GEOS | 3.7.2-CAPI-1.11.0 | Running against GEOS | 3.7.2-CAPI-1.11.0 b55d2125
PostgreSQL Client Version | 10.8 | SpatiaLite Version | 4.3.0
QWT Version | 6.1.3 | QScintilla2 Version | 2.10.8
Compiled against PROJ | 5.2.0 | Running against PROJ | Rel. 5.2.0, September 15th, 2018
OS Version | Windows 10 (10.0) | |
**Additional context**
Output of the Translate (Convert Format) dialog using my 1-bit Image Compression - ArcGIS Compatible profile from above
```
QGIS version: 3.8.2-Zanzibar
QGIS code revision: 4470baa1a3
Qt version: 5.11.2
GDAL version: 2.4.1
GEOS version: 3.7.2-CAPI-1.11.0 b55d2125
PROJ version: Rel. 5.2.0, September 15th, 2018
Processing algorithm…
Algorithm 'Translate (convert format)' starting…
Input parameters:
{ 'COPY_SUBDATASETS' : False, 'DATA_TYPE' : 0, 'INPUT' : 'C:/Users/username/Desktop/GeoTIFF_compression_test/test_deflate.tif', 'NODATA' : None, 'OPTIONS' : '', 'OUTPUT' : 'TEMPORARY_OUTPUT', 'TARGET_CRS' : None }
GDAL command:
gdal_translate -of GTiff C:/Users/username/Desktop/GeoTIFF_compression_test/test_deflate.tif C:/Users/username/AppData/Local/Temp/processing_47e9e70b244741bca9e4d08e35bd8e73/bf29ae54ae9843639631c692644d1f50/OUTPUT.tif
GDAL command output:
Input file size is 14016, 9920
0...10...20...30...40...50...60...70...80...90...100 - done.
Execution completed in 2.45 seconds
Results:
{'OUTPUT': 'C:/Users/username/AppData/Local/Temp/processing_47e9e70b244741bca9e4d08e35bd8e73/bf29ae54ae9843639631c692644d1f50/OUTPUT.tif'}
Loading resulting layers
Algorithm 'Translate (convert format)' finished
```
It would appear that this is not simply a display issue as no config information was passed to the tool, and the output raster was the same bitdepth as the input raster (32bit float), not 1-bit as anticipated.
It seems like whatever index is being used to store/recall the Creation Profiles is too short? N-1 or something, because adding another new profile invalidates the newest, but promotes the prior broken profile to working status. So, the list of working profiles is always one entry (the latest entry) short of the full list of stored profiles.
|
process
|
single most recently added custom gdal create options profile does not work in processing new description go in options settings gdal and create or more custom create options profiles then try to use them in processing gdal tools or in the save as dialog for rasters of course only the latter will work while in the former only the custom profile will work the others shows empty options describe the bug a valid create options profile for geotiff that is usable in the save raster layer as dialog is broken for the translate convert format dialog valid creation profile usage in save raster layer as how to reproduce attempt to use by what all appearances appears to be a valid creation profile in the translate convert format dialog see that no configuration options are passed to gdal translate add another new valid profile and prior invalid profile is now usable broken profile in translate convert format qgis and os versions qgis version zanzibar qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi postgresql client version spatialite version qwt version version compiled against proj running against proj rel september os version windows additional context output of the translate convert format dialog using my bit image compression arcgis compatible profile from above qgis version zanzibar qgis code revision qt version gdal version geos version capi proj version rel september processing algorithm… algorithm translate convert format starting… input parameters copy subdatasets false data type input c users username desktop geotiff compression test test deflate tif nodata none options output temporary output target crs none gdal command gdal translate of gtiff c users username desktop geotiff compression test test deflate tif c users username appdata local temp processing output tif gdal command output input file size is done execution completed in seconds results 
output c users username appdata local temp processing output tif loading resulting layers algorithm translate convert format finished it would appear that this is not simply a display issue as no config information was passed to the tool and the output raster was the same bitdepth as the input raster float not bit as anticipated it seems like whatever index is being used to store recall the creation profiles is too short n or something because adding another new profile invalidates the newest but promotes the prior broken profile to working status so the list of working profiles is always one entry the latest entry short of the full list of stored profiles
| 1
|
8,266
| 11,428,819,188
|
IssuesEvent
|
2020-02-04 06:05:06
|
kubeflow/manifests
|
https://api.github.com/repos/kubeflow/manifests
|
closed
|
Retag notebook images to v1.0 and update the configmap
|
area/jupyter feature kind/process priority/p0
|
We should retag our existing jupyter images to be v1.0 and then update the configmap that lists them for the jupyter web app.
Rebuilding the docker images I think only makes sense if we are bumping the TF versions and thus the base images. Ideally we would do that but I don't think we should block on it.
|
1.0
|
Retag notebook images to v1.0 and update the configmap - We should retag our existing jupyter images to be v1.0 and then update the configmap that lists them for the jupyter web app.
Rebuilding the docker images I think only makes sense if we are bumping the TF versions and thus the base images. Ideally we would do that but I don't think we should block on it.
|
process
|
retag notebook images to and update the configmap we should retag our existing jupyter images to be and then update the configmap that lists them for the jupyter web app rebuilding the docker images i think only makes sense if we are bumping the tf versions and thus the base images ideally we would do that but i don t think we should block on it
| 1
|
92,413
| 15,857,075,330
|
IssuesEvent
|
2021-04-08 03:54:17
|
f00b4rb00f/test-project
|
https://api.github.com/repos/f00b4rb00f/test-project
|
opened
|
WS-2020-0042 (High) detected in acorn-5.7.3.tgz
|
security vulnerability
|
## WS-2020-0042 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>acorn-5.7.3.tgz</b></p></summary>
<p>ECMAScript parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/acorn/-/acorn-5.7.3.tgz">https://registry.npmjs.org/acorn/-/acorn-5.7.3.tgz</a></p>
<p>Path to dependency file: /test-project/package.json</p>
<p>Path to vulnerable library: test-project/attacker-app/node_modules/acorn/package.json,test-project/attacker-app/node_modules/acorn/package.json</p>
<p>
Dependency Hierarchy:
- plato-1.7.0.tgz (Root Library)
- eslint-3.0.1.tgz
- espree-3.5.4.tgz
- :x: **acorn-5.7.3.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
acorn is vulnerable to REGEX DoS. A regex of the form /[x-\ud800]/u causes the parser to enter an infinite loop. attackers may leverage the vulnerability leading to a Denial of Service since the string is not valid UTF16 and it results in it being sanitized before reaching the parser.
<p>Publish Date: 2020-03-01
<p>URL: <a href=https://github.com/acornjs/acorn/commit/b5c17877ac0511e31579ea31e7650ba1a5871e51>WS-2020-0042</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1488">https://www.npmjs.com/advisories/1488</a></p>
<p>Release Date: 2020-03-08</p>
<p>Fix Resolution: 7.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2020-0042 (High) detected in acorn-5.7.3.tgz - ## WS-2020-0042 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>acorn-5.7.3.tgz</b></p></summary>
<p>ECMAScript parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/acorn/-/acorn-5.7.3.tgz">https://registry.npmjs.org/acorn/-/acorn-5.7.3.tgz</a></p>
<p>Path to dependency file: /test-project/package.json</p>
<p>Path to vulnerable library: test-project/attacker-app/node_modules/acorn/package.json,test-project/attacker-app/node_modules/acorn/package.json</p>
<p>
Dependency Hierarchy:
- plato-1.7.0.tgz (Root Library)
- eslint-3.0.1.tgz
- espree-3.5.4.tgz
- :x: **acorn-5.7.3.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
acorn is vulnerable to REGEX DoS. A regex of the form /[x-\ud800]/u causes the parser to enter an infinite loop. attackers may leverage the vulnerability leading to a Denial of Service since the string is not valid UTF16 and it results in it being sanitized before reaching the parser.
<p>Publish Date: 2020-03-01
<p>URL: <a href=https://github.com/acornjs/acorn/commit/b5c17877ac0511e31579ea31e7650ba1a5871e51>WS-2020-0042</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1488">https://www.npmjs.com/advisories/1488</a></p>
<p>Release Date: 2020-03-08</p>
<p>Fix Resolution: 7.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws high detected in acorn tgz ws high severity vulnerability vulnerable library acorn tgz ecmascript parser library home page a href path to dependency file test project package json path to vulnerable library test project attacker app node modules acorn package json test project attacker app node modules acorn package json dependency hierarchy plato tgz root library eslint tgz espree tgz x acorn tgz vulnerable library vulnerability details acorn is vulnerable to regex dos a regex of the form u causes the parser to enter an infinite loop attackers may leverage the vulnerability leading to a denial of service since the string is not valid and it results in it being sanitized before reaching the parser publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
1,145
| 3,632,358,534
|
IssuesEvent
|
2016-02-11 09:30:24
|
DevExpress/testcafe-hammerhead
|
https://api.github.com/repos/DevExpress/testcafe-hammerhead
|
closed
|
Setting up an href attribute for a non-added to DOM "base" tag should not cause urlResolver to update.
|
AREA: client SYSTEM: URL processing TYPE: bug
|
see https://testcafe-hhhm.devexpress.com/history?timestamp=1455001095958&siteUrl=http://www.microsoft.com/ru-ru/default.aspx
|
1.0
|
Setting up an href attribute for a non-added to DOM "base" tag should not cause urlResolver to update. - see https://testcafe-hhhm.devexpress.com/history?timestamp=1455001095958&siteUrl=http://www.microsoft.com/ru-ru/default.aspx
|
process
|
setting up an href attribute for a non added to dom base tag should not cause urlresolver to update see
| 1
|
1,266
| 3,798,609,854
|
IssuesEvent
|
2016-03-23 13:17:58
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
closed
|
Be more flexible with calib_smoother command
|
enhancement preprocessor
|
Currently, we do not allow ```prefilter```, ```loglinear```, and ```first_obs```. I think they are all valid options that can be easily handled within the current framework. The first step would be to add the preprocessor interface allowing these options.
|
1.0
|
Be more flexible with calib_smoother command - Currently, we do not allow ```prefilter```, ```loglinear```, and ```first_obs```. I think they are all valid options that can be easily handled within the current framework. The first step would be to add the preprocessor interface allowing these options.
|
process
|
be more flexible with calib smoother command currently we do not allow prefilter loglinear and first obs i think they are all valid options that can be easily handled within the current framework the first step would be to add the preprocessor interface allowing these options
| 1
|
13,006
| 15,364,691,598
|
IssuesEvent
|
2021-03-01 22:21:51
|
nextgenhealthcare/connect
|
https://api.github.com/repos/nextgenhealthcare/connect
|
closed
|
When reprocessing messages, add a button to clear all the checkboxes for the destinations
|
checkboxes closed-due-to-inactivity reprocess
|
I have a number of channels with a large number of destinations. Sometimes I wish to reprocess selected messages to just one or two destinations. The default is that all destination checkboxes are selected. I should like to be able to clear this in one go, and just tick the detinations I need rather than uncheck many destinations each time.
Imported Issue. Original Details:
Jira Issue Key: MIRTH-3367
Reporter: seaston
Created: 2014-07-15T14:00:59.000-0700
|
1.0
|
When reprocessing messages, add a button to clear all the checkboxes for the destinations - I have a number of channels with a large number of destinations. Sometimes I wish to reprocess selected messages to just one or two destinations. The default is that all destination checkboxes are selected. I should like to be able to clear this in one go, and just tick the detinations I need rather than uncheck many destinations each time.
Imported Issue. Original Details:
Jira Issue Key: MIRTH-3367
Reporter: seaston
Created: 2014-07-15T14:00:59.000-0700
|
process
|
when reprocessing messages add a button to clear all the checkboxes for the destinations i have a number of channels with a large number of destinations sometimes i wish to reprocess selected messages to just one or two destinations the default is that all destination checkboxes are selected i should like to be able to clear this in one go and just tick the detinations i need rather than uncheck many destinations each time imported issue original details jira issue key mirth reporter seaston created
| 1
|
42,827
| 2,874,573,734
|
IssuesEvent
|
2015-06-08 23:31:22
|
NuGet/Home
|
https://api.github.com/repos/NuGet/Home
|
closed
|
Inconsistency in the way UI window is launched for projects and solutions. Either only 1 window should show up or multiple windows should show up in all the cases
|
2 - Working Area:VS.Client Priority:2 Type:Bug
|
a. Repro steps
i. Create a console application
ii. Launch the 'Manage NuGet Packages' for solution. It opens a new window
iii. Launch the 'Manage NuGet Packages' for a project.
iv. Expected: It opens a new window
iv. Actual: It selects the 'Manage NuGet Packages' for solution rather than open a new window.
b. NOTE: If this is by design, note that 2 windows can be opened if the window for the project is opened first. Also, note that the project can be renamed to something other than the name of the solution. Does not matter. Same inconsistency observed.
|
1.0
|
Inconsistency in the way UI window is launched for projects and solutions. Either only 1 window should show up or multiple windows should show up in all the cases - a. Repro steps
i. Create a console application
ii. Launch the 'Manage NuGet Packages' for solution. It opens a new window
iii. Launch the 'Manage NuGet Packages' for a project.
iv. Expected: It opens a new window
iv. Actual: It selects the 'Manage NuGet Packages' for solution rather than open a new window.
b. NOTE: If this is by design, note that 2 windows can be opened if the window for the project is opened first. Also, note that the project can be renamed to something other than the name of the solution. Does not matter. Same inconsistency observed.
|
non_process
|
inconsistency in the way ui window is launched for projects and solutions either only window should show up or multiple windows should show up in all the cases a repro steps i create a console application ii launch the manage nuget packages for solution it opens a new window iii launch the manage nuget packages for a project iv expected it opens a new window iv actual it selects the manage nuget packages for solution rather than open a new window b note if this is by design note that windows can be opened if the window for the project is opened first also note that the project can be renamed to something other than the name of the solution does not matter same inconsistency observed
| 0
|
2,554
| 5,310,929,815
|
IssuesEvent
|
2017-02-13 00:06:46
|
csstree/stylelint-validator
|
https://api.github.com/repos/csstree/stylelint-validator
|
closed
|
Ignoring Less variables
|
enhancement preprocessors
|
I took this lovely plugin for a whirl today, it correctly discovered some issues where my team had written invalid syntax, however the majority of the errors it reported was the `csstree/validator` rule choking on Less variables, ex:
```
src/Component/Component.less
28:13 ✖ Can't parse value "@link-hover-color" csstree/validator
54:9 ✖ Can't parse value "@icon-color" csstree/validator
```
Any idea how to mute or ignore Less variables (I'm assuming the same goes for Sass variables).
Thanks!
|
1.0
|
Ignoring Less variables - I took this lovely plugin for a whirl today, it correctly discovered some issues where my team had written invalid syntax, however the majority of the errors it reported was the `csstree/validator` rule choking on Less variables, ex:
```
src/Component/Component.less
28:13 ✖ Can't parse value "@link-hover-color" csstree/validator
54:9 ✖ Can't parse value "@icon-color" csstree/validator
```
Any idea how to mute or ignore Less variables (I'm assuming the same goes for Sass variables).
Thanks!
|
process
|
ignoring less variables i took this lovely plugin for a whirl today it correctly discovered some issues where my team had written invalid syntax however the majority of the errors it reported was the csstree validator rule choking on less variables ex src component component less ✖ can t parse value link hover color csstree validator ✖ can t parse value icon color csstree validator any idea how to mute or ignore less variables i m assuming the same goes for sass variables thanks
| 1
|
309,841
| 23,307,461,493
|
IssuesEvent
|
2022-08-08 03:45:30
|
stoneatom/stonedb
|
https://api.github.com/repos/stoneatom/stonedb
|
closed
|
bug: Pictures width exceeds content area in docs site.
|
A-bug A-documentation
|
### Describe the problem
for example:
https://stonedb.io/docs/O&M-Guide/monitoring-and-alerting/prometheus+grafana-monitor/
For all the contents in the document that contain pictures, the width of the pictures exceeds the content area.
### Expected behavior
The width of the picture can be automatically adapted.
### How To Reproduce
See this page:https://stonedb.io/docs/O&M-Guide/monitoring-and-alerting/prometheus+grafana-monitor/
### Environment
_No response_
### Are you interested in submitting a PR to solve the problem?
- [X] Yes, I will!
|
1.0
|
bug: Pictures width exceeds content area in docs site. - ### Describe the problem
for example:
https://stonedb.io/docs/O&M-Guide/monitoring-and-alerting/prometheus+grafana-monitor/
For all the contents in the document that contain pictures, the width of the pictures exceeds the content area.
### Expected behavior
The width of the picture can be automatically adapted.
### How To Reproduce
See this page:https://stonedb.io/docs/O&M-Guide/monitoring-and-alerting/prometheus+grafana-monitor/
### Environment
_No response_
### Are you interested in submitting a PR to solve the problem?
- [X] Yes, I will!
|
non_process
|
bug pictures width exceeds content area in docs site describe the problem for example for all the contents in the document that contain pictures the width of the pictures exceeds the content area expected behavior the width of the picture can be automatically adapted how to reproduce see this page: environment no response are you interested in submitting a pr to solve the problem yes i will
| 0
|
12,175
| 14,741,929,772
|
IssuesEvent
|
2021-01-07 11:24:23
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Angel Care 045-W00922 - Billing Issues
|
anc-process anp-important ant-bug ant-enhancement ant-parent/primary
|
In GitLab by @kdjstudios on Feb 26, 2019, 15:43
**Submitted by:** "Kimberly Gagner" <kimberly.gagner@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-02-26-38195/conversation
**Server:** Internal
**Client/Site:** Orlando
**Account:** Angel Care 045-W00922
**Issue:**
I received an inquiry from our client, Angel Care, LLC, regarding the amount of their bill increasing from the previous month. The increase is partially due to the billing code being recently changed from 55084 to 55099 to include agent work time. I have spoken with the client and have agreed to credit back only the agent work time (Column AE in the export). However, I am noticing multiple discrepancies and need your assistance.
The billing export is showing two lines for Angel Care (Line 16 and Line 153). The usage in each is different in each line item. Also, in SAB in the usage history I can see that it pulled usage for both codes but did it apply them both?
The invoice states the overage was 769. The plan they have includes 250 so that would be a total of 1019 units used for the month. When I compared that to the billing export by adding columns H,J,P,R, & AE I am not coming up with that total for either line in the export.
Also, I ran VCC reports a detailed client call and the inbound and outbound times do not match either. I need to credit back the agent work time in column AE and bill for the original code 55099 in and out talk time but when I run reports to and or minus the AE amount it does not match the reports. For example, when we looked at column AE and saw the total 13756 we agreed to credit back 229 but then when you look at columns H, J, P, & R and the total is 38236 which would be 637 the VCC reports do not match up with this amount and neither does the invoice. If I were to subtract the 229 units (229 x.96 =$219.84) and apply a credit of $219.84 for the agent work time that leaves the talk time at 790 units which still does not match the reports or the export file.
Lastly, I looked at the human readable report and see what each code is pulling and I noticed 55052 is pulling patch time as well as 55099 are we billing twice for patch time? Also, the client asked if she is being billed for hold times? I know this is something we do not typically bill for but can we verify this is the case.
I am hoping to respond back to the client on this immediately. If it would be easier to join a call to discuss this, please let me know your availability.
|
1.0
|
Angel Care 045-W00922 - Billing Issues - In GitLab by @kdjstudios on Feb 26, 2019, 15:43
**Submitted by:** "Kimberly Gagner" <kimberly.gagner@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-02-26-38195/conversation
**Server:** Internal
**Client/Site:** Orlando
**Account:** Angel Care 045-W00922
**Issue:**
I received an inquiry from our client, Angel Care, LLC, regarding the amount of their bill increasing from the previous month. The increase is partially due to the billing code being recently changed from 55084 to 55099 to include agent work time. I have spoken with the client and have agreed to credit back only the agent work time (Column AE in the export). However, I am noticing multiple discrepancies and need your assistance.
The billing export is showing two lines for Angel Care (Line 16 and Line 153). The usage in each is different in each line item. Also, in SAB in the usage history I can see that it pulled usage for both codes but did it apply them both?
The invoice states the overage was 769. The plan they have includes 250 so that would be a total of 1019 units used for the month. When I compared that to the billing export by adding columns H,J,P,R, & AE I am not coming up with that total for either line in the export.
Also, I ran VCC reports a detailed client call and the inbound and outbound times do not match either. I need to credit back the agent work time in column AE and bill for the original code 55099 in and out talk time but when I run reports to and or minus the AE amount it does not match the reports. For example, when we looked at column AE and saw the total 13756 we agreed to credit back 229 but then when you look at columns H, J, P, & R and the total is 38236 which would be 637 the VCC reports do not match up with this amount and neither does the invoice. If I were to subtract the 229 units (229 x.96 =$219.84) and apply a credit of $219.84 for the agent work time that leaves the talk time at 790 units which still does not match the reports or the export file.
Lastly, I looked at the human readable report and see what each code is pulling and I noticed 55052 is pulling patch time as well as 55099 are we billing twice for patch time? Also, the client asked if she is being billed for hold times? I know this is something we do not typically bill for but can we verify this is the case.
I am hoping to respond back to the client on this immediately. If it would be easier to join a call to discuss this, please let me know your availability.
|
process
|
angel care billing issues in gitlab by kdjstudios on feb submitted by kimberly gagner helpdesk server internal client site orlando account angel care issue i received an inquiry from our client angel care llc regarding the amount of their bill increasing from the previous month the increase is partially due to the billing code being recently changed from to to include agent work time i have spoken with the client and have agreed to credit back only the agent work time column ae in the export however i am noticing multiple discrepancies and need your assistance the billing export is showing two lines for angel care line and line the usage in each is different in each line item also in sab in the usage history i can see that it pulled usage for both codes but did it apply them both the invoice states the overage was the plan they have includes so that would be a total of units used for the month when i compared that to the billing export by adding columns h j p r ae i am not coming up with that total for either line in the export also i ran vcc reports a detailed client call and the inbound and outbound times do not match either i need to credit back the agent work time in column ae and bill for the original code in and out talk time but when i run reports to and or minus the ae amount it does not match the reports for example when we looked at column ae and saw the total we agreed to credit back but then when you look at columns h j p r and the total is which would be the vcc reports do not match up with this amount and neither does the invoice if i were to subtract the units x and apply a credit of for the agent work time that leaves the talk time at units which still does not match the reports or the export file lastly i looked at the human readable report and see what each code is pulling and i noticed is pulling patch time as well as are we billing twice for patch time also the client asked if she is being billed for hold times i know this is something we do not 
typically bill for but can we verify this is the case i am hoping to respond back to the client on this immediately if it would be easier to join a call to discuss this please let me know your availability
| 1
|
14,962
| 18,452,557,101
|
IssuesEvent
|
2021-10-15 12:43:03
|
mt-ag/apex-flowsforapex
|
https://api.github.com/repos/mt-ag/apex-flowsforapex
|
opened
|
[enhancement]: Add "Start Step" to Manage Flow Instance Step plug-in
|
enhancement process-plugin
|
### Feature description
Ability to select "Start Step" as Action in the plug-in Manage Flow Instance Step.
This action should call the `flow_api_pkg.flow_start_step` procedure.
### Describe alternatives you've considered
_No response_
### Additional context
_No response_
|
1.0
|
[enhancement]: Add "Start Step" to Manage Flow Instance Step plug-in - ### Feature description
Ability to select "Start Step" as Action in the plug-in Manage Flow Instance Step.
This action should call the `flow_api_pkg.flow_start_step` procedure.
### Describe alternatives you've considered
_No response_
### Additional context
_No response_
|
process
|
add start step to manage flow instance step plug in feature description ability to select start step as action in the plug in manage flow instance step this action should call the flow api pkg flow start step procedure describe alternatives you ve considered no response additional context no response
| 1
|
17,790
| 23,716,175,880
|
IssuesEvent
|
2022-08-30 11:59:10
|
wp-media/wp-rocket
|
https://api.github.com/repos/wp-media/wp-rocket
|
closed
|
PHP warning: Invalid argument supplied for foreach() in wp-background-process.php
|
type: bug module: preload tool: background process
|
**Before submitting an issue please check that you’ve completed the following steps:**
- Made sure you’re on the latest version - 3.9.0.5
- Used the search feature to ensure that the bug hasn’t been reported before - yes
**Describe the bug**
WP_DEBUG logs multiple errors of the following content:
`PHP Warning: Invalid argument supplied for foreach() in /www/wp-content/plugins/wp-rocket/inc/classes/dependencies/wp-media/background-processing/wp-background-process.php on line 314`
The current issue appeared due to the rocket_partial_preload batches that contain serialized data.
The value of `$batch->data` logged after this line: https://github.com/wp-media/wp-rocket/blob/2d094e01278e3e911d50cdded013f22f53ea4e70/inc/classes/dependencies/wp-media/background-processing/wp-background-process.php#L312 is the following:

Link to the note with full value: https://secure.helpscout.net/conversation/1561866652/276589?folderId=3864740#thread-4543785933
**To Reproduce**
The issue was reproduced on a customer's site. The steps are unclear.
**Expected behavior**
No warnings in WP_DEBUG.
We should check if the `$data` is of the correct type before doing the `foreach` loop in the background process.
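The proposed guard can be sketched as follows. This is a hypothetical Python analogue of a PHP `is_array()` check, not the actual wp-rocket code; the `process_batch` helper is an assumption for illustration:

```python
# Hypothetical sketch (Python analogue of a PHP is_array() guard):
# only iterate when the unserialized batch data is actually a list,
# so malformed serialized batches are skipped instead of raising a warning.
def process_batch(data):
    if not isinstance(data, list):
        return []  # skip malformed batch data rather than iterating it
    return [item for item in data]
```

With a guard like this, a batch whose stored data deserialized to `false` (as in the logged value) would simply yield no items instead of triggering the foreach warning.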
**Additional context**
Related ticket: https://secure.helpscout.net/conversation/1561866652/276589
More details in the note: https://secure.helpscout.net/conversation/1561866652/276589?folderId=3864740#thread-4542818897
Slack thread: https://wp-media.slack.com/archives/C43T1AYMQ/p1625397863006800
**Backlog Grooming (for WP Media dev team use only)**
- [ ] Reproduce the problem
- [ ] Identify the root cause
- [ ] Scope a solution
- [ ] Estimate the effort
|
1.0
|
PHP warning: Invalid argument supplied for foreach() in wp-background-process.php - **Before submitting an issue please check that you’ve completed the following steps:**
- Made sure you’re on the latest version - 3.9.0.5
- Used the search feature to ensure that the bug hasn’t been reported before - yes
**Describe the bug**
WP_DEBUG logs multiple errors of the following content:
`PHP Warning: Invalid argument supplied for foreach() in /www/wp-content/plugins/wp-rocket/inc/classes/dependencies/wp-media/background-processing/wp-background-process.php on line 314`
The current issue appeared due to the rocket_partial_preload batches that contain serialized data.
The value of `$batch->data` logged after this line: https://github.com/wp-media/wp-rocket/blob/2d094e01278e3e911d50cdded013f22f53ea4e70/inc/classes/dependencies/wp-media/background-processing/wp-background-process.php#L312 is the following:

Link to the note with full value: https://secure.helpscout.net/conversation/1561866652/276589?folderId=3864740#thread-4543785933
**To Reproduce**
The issue was reproduced on a customer's site. The steps are unclear.
**Expected behavior**
No warnings in WP_DEBUG.
We should check if the `$data` is of the correct type before doing the `foreach` loop in the background process.
**Additional context**
Related ticket: https://secure.helpscout.net/conversation/1561866652/276589
More details in the note: https://secure.helpscout.net/conversation/1561866652/276589?folderId=3864740#thread-4542818897
Slack thread: https://wp-media.slack.com/archives/C43T1AYMQ/p1625397863006800
**Backlog Grooming (for WP Media dev team use only)**
- [ ] Reproduce the problem
- [ ] Identify the root cause
- [ ] Scope a solution
- [ ] Estimate the effort
|
process
|
php warning invalid argument supplied for foreach in wp background process php before submitting an issue please check that you’ve completed the following steps made sure you’re on the latest version used the search feature to ensure that the bug hasn’t been reported before yes describe the bug wp debug logs multiple errors of the following content php warning invalid argument supplied for foreach in www wp content plugins wp rocket inc classes dependencies wp media background processing wp background process php on line the current issue appeared due to the rocket partial preload batches that contain serialized data the value of batch data logged after this line is the following link to the note with full value to reproduce the issue was reproduced on a customer s site the steps are unclear expected behavior no warnings in wp debug we should check if the data is of the correct type before doing the foreach loop in the background process additional context related ticket more details in the note slack thread backlog grooming for wp media dev team use only reproduce the problem identify the root cause scope a solution estimate the effort
| 1
|
378,777
| 26,337,545,349
|
IssuesEvent
|
2023-01-10 15:21:54
|
RandomFractals/geo-data-viewer
|
https://api.github.com/repos/RandomFractals/geo-data-viewer
|
closed
|
Refine documentation sections in README.md
|
documentation
|
Update extension description, recommended extensions list, and other sections in docs.
|
1.0
|
Refine documentation sections in README.md - Update extension description, recommended extensions list, and other sections in docs.
|
non_process
|
refine documentation sections in readme md update extension description recommended extensions list and other sections in docs
| 0
|
2,487
| 5,266,003,035
|
IssuesEvent
|
2017-02-04 07:18:45
|
mitchellh/packer
|
https://api.github.com/repos/mitchellh/packer
|
closed
|
Support replacing placeholders in vagrantfile with user variables
|
enhancement post-processor/vagrant
|
I would really love to have support for replacing variables in my vagrant template file with user variables.
Vagrantfile.template
``` ruby
config.vm.define {{boxname}}
config.vm.box {{boxname}}
config.ssh.forward_agent {{forwardSshAgent}}
```
packer.json
``` json
{
"variables": {
"boxname": 1,
"memory": 1024,
"forwardSshAgent": true,
"vram": 12,
"disk_size": 20000
},
"builders": [{
"vm_name": "{{user `boxname`}}",
```
This way you can reuse a lot of logic and configuration already in the packer file and have them automatically with the same values in the vagrant box Vagrantfile.
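The requested substitution can be illustrated with a small sketch. This is a hypothetical Python model of the behavior, not Packer's actual template engine; the `render` helper and its regex are assumptions:

```python
import re

# Hypothetical sketch of replacing {{name}} placeholders in a
# Vagrantfile template with packer-style user variables.
def render(template: str, variables: dict) -> str:
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(variables[m.group(1)]),
                  template)

template = ("config.vm.define {{boxname}}\n"
            "config.ssh.forward_agent {{forwardSshAgent}}")
print(render(template, {"boxname": "mybox", "forwardSshAgent": "true"}))
```

The same `variables` dict that drives the builders section would then drive the Vagrantfile template, keeping both in sync.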
|
1.0
|
Support replacing placeholders in vagrantfile with user variables - I would really love to have support for replacing variables in my vagrant template file with user variables.
Vagrantfile.template
``` ruby
config.vm.define {{boxname}}
config.vm.box {{boxname}}
config.ssh.forward_agent {{forwardSshAgent}}
```
packer.json
``` json
{
"variables": {
"boxname": 1,
"memory": 1024,
"forwardSshAgent": true,
"vram": 12,
"disk_size": 20000
},
"builders": [{
"vm_name": "{{user `boxname`}}",
```
This way you can reuse a lot of logic and configuration already in the packer file and have them automatically with the same values in the vagrant box Vagrantfile.
|
process
|
support replacing placeholders in vagrantfile with user variables i would really love to have support for replacing variables in my vagrant template file with user variables vagrantfile template ruby config vm define boxname config vm box boxname config ssh forward agent forwardsshagent packer json json variables boxname memory forwardsshagent true vram disk size builders vm name user boxname this way you can reuse a lot of logic and configuration already in the packer file and have them automatically with the same values in the vagrant box vagrantfile
| 1
|
19,211
| 25,344,555,139
|
IssuesEvent
|
2022-11-19 03:42:18
|
Ultimate-Hosts-Blacklist/whitelist
|
https://api.github.com/repos/Ultimate-Hosts-Blacklist/whitelist
|
closed
|
[FALSE-POSITIVE?] bkrenergy.ca
|
whitelisting process
|
**Domains or links**
<!-- Please list below any domains and links listed here which you believe are a false positive. -->
- bkrenergy.ca 172.64.80.1
**More Information**
1. This must be a new addition, I stopped being able to access my thermostat through my app
2. I then found the IP of the company, found it in your list, tested by disabling the list and trying again and it worked
**Additional context**
Not sure how this was flagged originally, but just know it is a canadian energy company that helps manage heat pump vs furnace usage in homes. With this IP blocked the user can't use the phone app, at the very least, I'm not sure if it's stopping other comms in the background as well
|
1.0
|
[FALSE-POSITIVE?] bkrenergy.ca - **Domains or links**
<!-- Please list below any domains and links listed here which you believe are a false positive. -->
- bkrenergy.ca 172.64.80.1
**More Information**
1. This must be a new addition, I stopped being able to access my thermostat through my app
2. I then found the IP of the company, found it in your list, tested by disabling the list and trying again and it worked
**Additional context**
Not sure how this was flagged originally, but just know it is a canadian energy company that helps manage heat pump vs furnace usage in homes. With this IP blocked the user can't use the phone app, at the very least, I'm not sure if it's stopping other comms in the background as well
|
process
|
bkrenergy ca domains or links bkrenergy ca more information this must be a new addition i stopped being able to access my thermostat through my app i then found the ip of the company found it in your list tested by disabling the list and trying again and it worked additional context not sure how this was flagged originally but just know it is a canadian energy company that helps manage heat pump vs furnace usage in homes with this ip blocked the user can t use the phone app at the very least i m not sure if it s stopping other comms in the background as well
| 1
|
10,445
| 13,224,700,491
|
IssuesEvent
|
2020-08-17 19:40:07
|
jyn514/saltwater
|
https://api.github.com/repos/jyn514/saltwater
|
opened
|
The order of preprocessor stringifying is wrong.
|
bug preprocessor
|
Stringify should be applied before doing replacement on the function parameters, but that replacement should occur before passing them to deeper functions (I think). I expect this has been broken since #456 but I will dig deeper.
# Sample Code
```
#define str(a) # a
#define xstr(a) str(a)
#define f(x) [x]
str(f(5))
xstr(f(5))
```
# Expected Behavior
```
"f(5)"
"[5]"
```
# Actual Behavior
```
"f(5)"
"f(5)"
```
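As a toy illustration of the expected expansion order, here is a hypothetical Python model (not the saltwater code): `#` stringifies the raw, unexpanded argument, while the `xstr` wrapper forces macro expansion first.

```python
# Toy model of C preprocessor stringizing (hypothetical, for illustration):
# stringize(raw)          models str(a)  -> "#" on the unexpanded argument
# stringize(expand(raw))  models xstr(a) -> argument expanded, then stringized
MACROS = {"f": lambda x: f"[{x}]"}

def stringize(raw: str) -> str:
    return f'"{raw}"'

def expand(raw: str) -> str:
    name, rest = raw.split("(", 1)
    return MACROS[name](rest.rstrip(")"))

print(stringize("f(5)"))          # str(f(5))  -> "f(5)"
print(stringize(expand("f(5)")))  # xstr(f(5)) -> "[5]"
```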
|
1.0
|
The order of preprocessor stringifying is wrong. - Stringify should be applied before doing replacement on the function parameters, but that replacement should occur before passing them to deeper functions (I think). I expect this has been broken since #456 but I will dig deeper.
# Sample Code
```
#define str(a) # a
#define xstr(a) str(a)
#define f(x) [x]
str(f(5))
xstr(f(5))
```
# Expected Behavior
```
"f(5)"
"[5]"
```
# Actual Behavior
```
"f(5)"
"f(5)"
```
|
process
|
the order of preprocessor stringifying is wrong stringify should be applied before doing replacement on the function parameters but that replacement should occur before passing them to deeper functions i think i expect this has been broken since but i will dig deeper sample code define str a a define xstr a str a define f x str f xstr f expected behavior f actual behavior f f
| 1
|
301,465
| 9,220,702,193
|
IssuesEvent
|
2019-03-11 18:07:07
|
python/mypy
|
https://api.github.com/repos/python/mypy
|
closed
|
attr converter with default value - false error
|
priority-1-normal topic-attrs
|
When using an `attr` attribute with both `converter` and `default`, a wrong expectation is applied to the `default` type with mypy == 0.670
The valid code
```python
import attr
import typing
def to_int(x: typing.Any) -> int:
return int(x)
@attr.s
class C:
a: int = attr.ib(converter=to_int, default="1")
```
results in:
```
main.py:9: error: Incompatible types in assignment (expression has type "str", variable has type "int")
main.py:9: error: Argument "converter" has incompatible type "Callable[[Any], int]"; expected "Optional[Callable[[Any], str]]"
```
This is not right, because the `converter` is applied to the `default` value as well.
Of course the hotfix is to convert the default value as well.
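The runtime behavior, and the mentioned hotfix, can be modeled with a minimal stdlib sketch. This is a hypothetical stand-in for attrs' converter handling, not the attrs implementation:

```python
# Minimal stdlib model (hypothetical) of attrs applying the converter
# to the default value as well, which is why default="1" is valid at
# runtime even though mypy flags it.
class Field:
    def __init__(self, converter=None, default=None):
        # the converter runs on the default, so the stored default
        # already has the converted type
        self.default = converter(default) if converter else default

f = Field(converter=int, default="1")
print(f.default)  # the stored default is the int 1, not the string "1"
```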
|
1.0
|
attr converter with default value - false error - When using an `attr` attribute with both `converter` and `default`, a wrong expectation is applied to the `default` type with mypy == 0.670
The valid code
```python
import attr
import typing
def to_int(x: typing.Any) -> int:
return int(x)
@attr.s
class C:
a: int = attr.ib(converter=to_int, default="1")
```
results in:
```
main.py:9: error: Incompatible types in assignment (expression has type "str", variable has type "int")
main.py:9: error: Argument "converter" has incompatible type "Callable[[Any], int]"; expected "Optional[Callable[[Any], str]]"
```
This is not right, because the `converter` is applied to the `default` value as well.
Of course the hotfix is to convert the default value as well.
|
non_process
|
attr converter with default value false error when using attr attribute with both converter and default wrong expectations is applied on default type with mypy the valid code python import attr import typing def to int x typing any int return int x attr s class c a int attr ib converter to int default results in main py error incompatible types in assignment expression has type str variable has type int main py error argument converter has incompatible type callable int expected optional str is not right because the converter is applied on default value as well of course the hotfix is to convert the default value as well
| 0
|
31,855
| 7,460,097,830
|
IssuesEvent
|
2018-03-30 18:08:48
|
google/shaka-player
|
https://api.github.com/repos/google/shaka-player
|
opened
|
Complete implementation of TS stream generator
|
code health enhancement
|
For our integration tests, we have a stream generator that generates DASH-style streams of Mp4 files. With the recent integration test for CEA-708 captions, a stream generator was created for HLS-style streams of TS files, but the implementation is very stripped-down. It makes a "stream" with a single segment.
A more complete implementation would be useful, as it would allow for the creation of new integration tests for TS content.
|
1.0
|
Complete implementation of TS stream generator - For our integration tests, we have a stream generator that generates DASH-style streams of Mp4 files. With the recent integration test for CEA-708 captions, a stream generator was created for HLS-style streams of TS files, but the implementation is very stripped-down. It makes a "stream" with a single segment.
A more complete implementation would be useful, as it would allow for the creation of new integration tests for TS content.
|
non_process
|
complete implementation of ts stream generator for our integration tests we have a stream generator that generates dash style streams of files with the recent integration test for cea captions a stream generator was created for hls style streams of ts files but the implementation is very stripped down it makes a stream with a single segment a more complete implementation would be useful as it would allow for the creation of new integration tests for ts content
| 0
|
5,423
| 8,269,188,675
|
IssuesEvent
|
2018-09-15 02:28:56
|
shirou/gopsutil
|
https://api.github.com/repos/shirou/gopsutil
|
closed
|
[process] get invalid pid by Processes in darwin_amd64
|
os:darwin package:process
|
When I run code:
```
all, err := process.Processes()
if err != nil {
panic(err)
}
for _, p := range all {
fmt.Println("pid", p.Pid)
}
```
It will show some invalid pid, such as
```
pid 1531858944
```
using gopsutil: [ef54649](https://github.com/shirou/gopsutil/commit/ef54649286408053c0620d366dc06ead7d7cae5a)
|
1.0
|
[process] get invalid pid by Processes in darwin_amd64 - When I run code:
```
all, err := process.Processes()
if err != nil {
panic(err)
}
for _, p := range all {
fmt.Println("pid", p.Pid)
}
```
It will show some invalid pid, such as
```
pid 1531858944
```
using gopsutil: [ef54649](https://github.com/shirou/gopsutil/commit/ef54649286408053c0620d366dc06ead7d7cae5a)
|
process
|
get invalid pid by processes in darwin when i run code all err process processes if err nil panic err for p range all fmt println pid p pid it will show some invalid pid such as pid using gopsutil
| 1
|
126,503
| 4,996,644,099
|
IssuesEvent
|
2016-12-09 14:32:57
|
HBHWoolacotts/RPii
|
https://api.github.com/repos/HBHWoolacotts/RPii
|
closed
|
"Unconfirm Selected Line" is not available if the sale is on a HP
|
FIXED - HBH Live Label: General RP Bugs and Support Priority - Low
|
"Unconfirm Selected Line" is not available if the sale is on a HP

|
1.0
|
"Unconfirm Selected Line" is not available if the sale is on a HP - "Unconfirm Selected Line" is not available if the sale is on a HP

|
non_process
|
unconfirm selected line is not available if the sale is on a hp unconfirm selected line is not available if the sale is on a hp
| 0
|
3,596
| 6,625,802,318
|
IssuesEvent
|
2017-09-22 16:50:22
|
wpninjas/ninja-forms
|
https://api.github.com/repos/wpninjas/ninja-forms
|
closed
|
Required Fields that have not been focused do not get caught as required
|
DIFFICULTY: Easy FRONT: Processing PRIORITY: High VALUE: Painless
|
Any required field that has been conditionally shown (part or individual field) is not being treated as "required" for Form Submission to take place.
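The expected behavior can be sketched as follows: every currently-visible required field blocks submission while empty, whether or not it was ever focused. This is a hypothetical Python model, not Ninja Forms code:

```python
# Hypothetical model: a field blocks submission if it is required,
# currently shown (conditionally or not), and still empty -- focus
# state is irrelevant to the check.
def missing_required(fields):
    return [f["name"] for f in fields
            if f.get("required") and f.get("visible", True) and not f.get("value")]

fields = [
    {"name": "email", "required": True, "visible": True, "value": ""},
    {"name": "phone", "required": True, "visible": False, "value": ""},  # hidden: skipped
]
print(missing_required(fields))  # ['email']
```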
|
1.0
|
Required Fields that have not been focused do not get caught as required - Any required field that has been conditionally shown (part or individual field) is not being treated as "required" for Form Submission to take place.
|
process
|
required fields that have not been focused do not get caught as required any required field that has been conditionally shown part or individual field is not being treated as required for form submission to take place
| 1
|
3,730
| 6,733,142,355
|
IssuesEvent
|
2017-10-18 13:58:37
|
york-region-tpss/stp
|
https://api.github.com/repos/york-region-tpss/stp
|
closed
|
Contract preparation single item view - Process Data
|
process workflow
|
Flush the data in form and collection to the database.
|
1.0
|
Contract preparation single item view - Process Data - Flush the data in form and collection to the database.
|
process
|
contract preparation single item view process data flush the data in form and collection to the database
| 1
|
97,207
| 20,194,231,642
|
IssuesEvent
|
2022-02-11 09:10:39
|
Regalis11/Barotrauma
|
https://api.github.com/repos/Regalis11/Barotrauma
|
opened
|
Ballast flora improvements part 2
|
Code Design Design pending
|
Split from the ticket https://github.com/regalis11/barotrauma-development/issues/2992
> Investigate a way in which keeping the organism is a valid tactic, albeit one that requires trimming of the plant from electronics. Consider player incentives.
|
1.0
|
Ballast flora improvements part 2 - Split from the ticket https://github.com/regalis11/barotrauma-development/issues/2992
> Investigate a way in which keeping the organism is a valid tactic, albeit one that requires trimming of the plant from electronics. Consider player incentives.
|
non_process
|
ballast flora improvements part split from the ticket investigate a way in which keeping the organism is a valid tactic albeit one that requires trimming of the plant from electronics consider player incentives
| 0
|
14,249
| 17,185,235,782
|
IssuesEvent
|
2021-07-16 00:06:32
|
filecoin-project/lotus
|
https://api.github.com/repos/filecoin-project/lotus
|
closed
|
[Deal Making Issue]
|
area/markets epic/seal-unseal-process hint/needs-author-input kind/stale
|
**Basic Information**
Client and Miner side for online retrieval deal.
**Describe the problem**
The client is trying to retrieve an online storage deal and is running into "ERROR: retrieval failed: Retrieve: Retrieval Error: error generated by data transfer: deal data transfer failed: incomplete response".
**Version**
lotus version 1.5.3+mainnet+git.358773e2b
**Setup**
GeForce RTX 2080Ti
256 GB RAM
2 TB NVMe M.2
AMD EPYC 7F72 24-Core Processor
**To Reproduce**
Steps to reproduce the behavior:
1. lotus client retrieve --miner f066596 bafyaa6qsgafcmalqudsaeiekkmiypfxq6vgdwepdafg6jyacye46gurjichuapbzxwqhoywq7yjaagelucbyabasgafcmalqudsaeif3e6tsyc7c3anlmobix3744duq7cyasjsmrsmbkdn4besizu5ffyjaagf6zcc36aykcqeaeghh3sbl6bzaqcaibaaeedt5zav7am output-test.car
2. See error
**Deal status**
Creation Verified ProposalCid DealId State Client
Size Price Duration TransferChannelID Message
Mar 31 18:26:52 false bafyreia2kdy6sad3be2hnb5ywvno7jrxcvr7sa5q7qytuv2lxtqxvhlzcq 1653845 StorageDealActive
f3rxhdrohdvzunxweebfvnch6izl4rron7qo7gb5x5czu47xdg25g3ys57tovcwoz2wgsr2ystugfyimcmhpbq 2GiB 0.0104405 FIL 522025 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-0
**Lotus daemon and miner logs**
**Client Logs**
lotus client retrieve --miner f066596 bafyaa6qsgafcmalqudsaeiekkmiypfxq6vgdwepdafg6jyacye46gurjichuapbzxwqhoywq7yjaagelucbyabasgafcmalqudsaeif3e6tsyc7c3anlmobix3744duq7cyasjsmrsmbkdn4besizu5ffyjaagf6zcc36aykcqeaeghh3sbl6bzaqcaibaaeedt5zav7am output-test.car
> Recv: 0 B, Paid 0 FIL, ClientEventOpen (DealStatusNew)
> Recv: 0 B, Paid 0 FIL, ClientEventDealProposed (DealStatusWaitForAcceptance)
> Recv: 122 B, Paid 0 FIL, ClientEventBlocksReceived (DealStatusWaitForAcceptance)
> Recv: 122 B, Paid 0 FIL, ClientEventDealAccepted (DealStatusAccepted)
> Recv: 122 B, Paid 0 FIL, ClientEventPaymentChannelCreateInitiated (DealStatusPaymentChannelCreating)
> Recv: 122 B, Paid 0 FIL, ClientEventDataTransferError (DealStatusErrored)
> Recv: 0 B, Paid 0 FIL, ClientEventOpen (DealStatusNew)
ERROR: retrieval failed: Retrieve: Retrieval Error: error generated by data transfer: deal data transfer failed: incomplete response


**Miner Logs**
2021-04-05T23:31:51.091-0700 INFO dt-impl impl/events.go:255 received new channel request from 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp
2021-04-05T23:31:51.092-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventOpen", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusNew", "message": ""}
2021-04-05T23:31:51.093-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventDealAccepted", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusUnsealing", "message": ""}
2021-04-05T23:32:02.362-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventUnsealComplete", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusUnsealed", "message": ""}
2021-04-05T23:32:02.362-0700 INFO dt-impl impl/impl.go:386 resume channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-2
2021-04-05T23:32:02.363-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventBlockSent", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusOngoing", "message": ""}
2021-04-05T23:32:02.364-0700 WARN dt-impl impl/events.go:235 data transfer channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-2 failed to transfer data: graphsync response to peer 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp did not complete: response status code RequestCompletedPartial
2021-04-05T23:32:02.364-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventDataTransferError", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusErrored", "message": "deal data transfer failed: data transfer channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-2 failed to transfer data: graphsync response to peer 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp did not complete: response status code RequestCompletedPartial"}
2021-04-05T23:32:44.771-0700 INFO graphsync impl/graphsync.go:309 Graphsync ReceiveError from 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp: stream reset
2021-04-06T01:59:33.973-0700 INFO dt-impl impl/events.go:255 received new channel request from 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp
2021-04-06T01:59:33.975-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventOpen", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusNew", "message": ""}
2021-04-06T01:59:33.976-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventDealAccepted", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusUnsealing", "message": ""}
2021-04-06T01:59:43.948-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventUnsealComplete", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusUnsealed", "message": ""}
2021-04-06T01:59:43.948-0700 INFO dt-impl impl/impl.go:386 resume channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-3
2021-04-06T01:59:43.948-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventBlockSent", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusOngoing", "message": ""}
2021-04-06T01:59:43.949-0700 WARN dt-impl impl/events.go:235 data transfer channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-3 failed to transfer data: graphsync response to peer 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp did not complete: response status code RequestCompletedPartial
2021-04-06T01:59:43.949-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventDataTransferError", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusErrored", "message": "deal data transfer failed: data transfer channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-3 failed to transfer data: graphsync response to peer 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp did not complete: response status code RequestCompletedPartial"}
2021-04-06T02:14:14.277-0700 INFO graphsync impl/graphsync.go:309 Graphsync ReceiveError from 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp: stream reset
** Code modifications **
No code modifications
|
1.0
|
[Deal Making Issue] - **Basic Information**
Client and Miner side for online retrieval deal.
**Describe the problem**
The client is trying to retrieve an online storage deal and is running into "ERROR: retrieval failed: Retrieve: Retrieval Error: error generated by data transfer: deal data transfer failed: incomplete response".
**Version**
lotus version 1.5.3+mainnet+git.358773e2b
**Setup**
GeForce RTX 2080Ti
256 GB RAM
2 TB NVMe M.2
AMD EPYC 7F72 24-Core Processor
**To Reproduce**
Steps to reproduce the behavior:
1. lotus client retrieve --miner f066596 bafyaa6qsgafcmalqudsaeiekkmiypfxq6vgdwepdafg6jyacye46gurjichuapbzxwqhoywq7yjaagelucbyabasgafcmalqudsaeif3e6tsyc7c3anlmobix3744duq7cyasjsmrsmbkdn4besizu5ffyjaagf6zcc36aykcqeaeghh3sbl6bzaqcaibaaeedt5zav7am output-test.car
2. See error
**Deal status**
Creation Verified ProposalCid DealId State Client
Size Price Duration TransferChannelID Message
Mar 31 18:26:52 false bafyreia2kdy6sad3be2hnb5ywvno7jrxcvr7sa5q7qytuv2lxtqxvhlzcq 1653845 StorageDealActive
f3rxhdrohdvzunxweebfvnch6izl4rron7qo7gb5x5czu47xdg25g3ys57tovcwoz2wgsr2ystugfyimcmhpbq 2GiB 0.0104405 FIL 522025 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-0
**Lotus daemon and miner logs**
**Client Logs**
lotus client retrieve --miner f066596 bafyaa6qsgafcmalqudsaeiekkmiypfxq6vgdwepdafg6jyacye46gurjichuapbzxwqhoywq7yjaagelucbyabasgafcmalqudsaeif3e6tsyc7c3anlmobix3744duq7cyasjsmrsmbkdn4besizu5ffyjaagf6zcc36aykcqeaeghh3sbl6bzaqcaibaaeedt5zav7am output-test.car
> Recv: 0 B, Paid 0 FIL, ClientEventOpen (DealStatusNew)
> Recv: 0 B, Paid 0 FIL, ClientEventDealProposed (DealStatusWaitForAcceptance)
> Recv: 122 B, Paid 0 FIL, ClientEventBlocksReceived (DealStatusWaitForAcceptance)
> Recv: 122 B, Paid 0 FIL, ClientEventDealAccepted (DealStatusAccepted)
> Recv: 122 B, Paid 0 FIL, ClientEventPaymentChannelCreateInitiated (DealStatusPaymentChannelCreating)
> Recv: 122 B, Paid 0 FIL, ClientEventDataTransferError (DealStatusErrored)
> Recv: 0 B, Paid 0 FIL, ClientEventOpen (DealStatusNew)
ERROR: retrieval failed: Retrieve: Retrieval Error: error generated by data transfer: deal data transfer failed: incomplete response


**Miner Logs**
2021-04-05T23:31:51.091-0700 INFO dt-impl impl/events.go:255 received new channel request from 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp
2021-04-05T23:31:51.092-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventOpen", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusNew", "message": ""}
2021-04-05T23:31:51.093-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventDealAccepted", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusUnsealing", "message": ""}
2021-04-05T23:32:02.362-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventUnsealComplete", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusUnsealed", "message": ""}
2021-04-05T23:32:02.362-0700 INFO dt-impl impl/impl.go:386 resume channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-2
2021-04-05T23:32:02.363-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventBlockSent", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusOngoing", "message": ""}
2021-04-05T23:32:02.364-0700 WARN dt-impl impl/events.go:235 data transfer channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-2 failed to transfer data: graphsync response to peer 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp did not complete: response status code RequestCompletedPartial
2021-04-05T23:32:02.364-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventDataTransferError", "deal ID": "1", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusErrored", "message": "deal data transfer failed: data transfer channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-2 failed to transfer data: graphsync response to peer 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp did not complete: response status code RequestCompletedPartial"}
2021-04-05T23:32:44.771-0700 INFO graphsync impl/graphsync.go:309 Graphsync ReceiveError from 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp: stream reset
2021-04-06T01:59:33.973-0700 INFO dt-impl impl/events.go:255 received new channel request from 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp
2021-04-06T01:59:33.975-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventOpen", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusNew", "message": ""}
2021-04-06T01:59:33.976-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventDealAccepted", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusUnsealing", "message": ""}
2021-04-06T01:59:43.948-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventUnsealComplete", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusUnsealed", "message": ""}
2021-04-06T01:59:43.948-0700 INFO dt-impl impl/impl.go:386 resume channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-3
2021-04-06T01:59:43.948-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventBlockSent", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusOngoing", "message": ""}
2021-04-06T01:59:43.949-0700 WARN dt-impl impl/events.go:235 data transfer channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-3 failed to transfer data: graphsync response to peer 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp did not complete: response status code RequestCompletedPartial
2021-04-06T01:59:43.949-0700 INFO markets loggers/loggers.go:30 retrieval provider event {"name": "ProviderEventDataTransferError", "deal ID": "2", "receiver": "12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp", "state": "DealStatusErrored", "message": "deal data transfer failed: data transfer channel 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp-12D3KooWH3mBSF2RFQJFZdqzkVBqSdspoAAunHkViTf5AQhubv1c-3 failed to transfer data: graphsync response to peer 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp did not complete: response status code RequestCompletedPartial"}
2021-04-06T02:14:14.277-0700 INFO graphsync impl/graphsync.go:309 Graphsync ReceiveError from 12D3KooWAWe8axhW2GM6MmuDSzE1Ln5BZdgvX9swhCsUySUncTHp: stream reset
** Code modifications **
No code modifications
|
process
|
basic information client and miner side for online retrieval deal describe the problem the client is trying to retrieval an online storage deal and is running into error retrieval failed retrieve retrieval error error generated by data transfer deal data transfer failed incomplete response version lotus version mainnet git setup geforce rtx gb ram tb nvme m amd epyc core processor to reproduce steps to reproduce the behavior lotus client retrieve miner output test car see error deal status creation verified proposalcid dealid state client size price duration transferchannelid message mar false storagedealactive fil lotus daemon and miner logs client logs lotus client retrieve miner output test car recv b paid fil clienteventopen dealstatusnew recv b paid fil clienteventdealproposed dealstatuswaitforacceptance recv b paid fil clienteventblocksreceived dealstatuswaitforacceptance recv b paid fil clienteventdealaccepted dealstatusaccepted recv b paid fil clienteventpaymentchannelcreateinitiated dealstatuspaymentchannelcreating recv b paid fil clienteventdatatransfererror dealstatuserrored recv b paid fil clienteventopen dealstatusnew error retrieval failed retrieve retrieval error error generated by data transfer deal data transfer failed incomplete response miner logs info dt impl impl events go received new channel request from info markets loggers loggers go retrieval provider event name providereventopen deal id receiver state dealstatusnew message info markets loggers loggers go retrieval provider event name providereventdealaccepted deal id receiver state dealstatusunsealing message info markets loggers loggers go retrieval provider event name providereventunsealcomplete deal id receiver state dealstatusunsealed message info dt impl impl impl go resume channel info markets loggers loggers go retrieval provider event name providereventblocksent deal id receiver state dealstatusongoing message warn dt impl impl events go data transfer channel failed to transfer 
data graphsync response to peer did not complete response status code requestcompletedpartial info markets loggers loggers go retrieval provider event name providereventdatatransfererror deal id receiver state dealstatuserrored message deal data transfer failed data transfer channel failed to transfer data graphsync response to peer did not complete response status code requestcompletedpartial info graphsync impl graphsync go graphsync receiveerror from stream reset info dt impl impl events go received new channel request from info markets loggers loggers go retrieval provider event name providereventopen deal id receiver state dealstatusnew message info markets loggers loggers go retrieval provider event name providereventdealaccepted deal id receiver state dealstatusunsealing message info markets loggers loggers go retrieval provider event name providereventunsealcomplete deal id receiver state dealstatusunsealed message info dt impl impl impl go resume channel info markets loggers loggers go retrieval provider event name providereventblocksent deal id receiver state dealstatusongoing message warn dt impl impl events go data transfer channel failed to transfer data graphsync response to peer did not complete response status code requestcompletedpartial info markets loggers loggers go retrieval provider event name providereventdatatransfererror deal id receiver state dealstatuserrored message deal data transfer failed data transfer channel failed to transfer data graphsync response to peer did not complete response status code requestcompletedpartial info graphsync impl graphsync go graphsync receiveerror from stream reset code modifications no code modifications
| 1
|
18,729
| 4,308,328,203
|
IssuesEvent
|
2016-07-21 12:37:19
|
spring-cloud/spring-cloud-netflix
|
https://api.github.com/repos/spring-cloud/spring-cloud-netflix
|
reopened
|
Feign: setting header in feign RequestInterceptor does not work in BRIXTON.SR3
|
documentation
|
We want to switch from spring-cloud-1.0.0-RELEASE to spring-cloud-Brixton.SR3 and find that calls to @FeignClient annotated service interfaces are executed in an async manner (hystrix command).
This way our registered feign RequestInterceptors, which use ThreadLocal-bound headers to be forwarded to the receiver, do not work as expected.
Is there a way to avoid the async execution, or to put the required headers into a ThreadLocal within the thread which executes the hystrix command?
|
1.0
|
Feign: setting header in feign RequestInterceptor does not work in BRIXTON.SR3 - We want to switch from spring-cloud-1.0.0-RELEASE to spring-cloud-Brixton.SR3 and find that calls to @FeignClient annotated service interfaces are executed in an async manner (hystrix command).
This way our registered feign RequestInterceptors, which use ThreadLocal-bound headers to be forwarded to the receiver, do not work as expected.
Is there a way to avoid the async execution, or to put the required headers into a ThreadLocal within the thread which executes the hystrix command?
|
non_process
|
feign setting header in feign requestinterceptor does not work in brixton we want to switch from spring cloud release to spring cloud brixton and and find that calls to feignclient annotated services interfaces are executed in an async manner hystrix command this way our registred feign receptinterceptors which uses threadlocal bound headers to be forwarded to the receiver does not work as expected is there a way to avoid the async execution or to put the required heders into a threadlocal within the thread which executes the hystrix command
| 0
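The ThreadLocal problem this issue describes — values set on the calling thread are invisible inside the thread that runs the Hystrix command — is the classic capture-and-restore pattern. In Java it is typically addressed with a custom Hystrix concurrency strategy that wraps the callable; the mechanics can be sketched in Python (all names here are illustrative, not Feign/Hystrix APIs):

```python
import threading

# Thread-local storage standing in for the request-scoped headers
# the interceptor reads (illustrative, not the actual Feign machinery).
_context = threading.local()

def set_header(value):
    _context.header = value

def get_header():
    return getattr(_context, "header", None)

def run_with_captured_context(fn):
    """Capture the caller's thread-local header and re-install it
    inside whatever thread actually runs `fn`."""
    captured = get_header()

    def wrapper(*args, **kwargs):
        set_header(captured)
        return fn(*args, **kwargs)

    return wrapper

# Usage: the submitting thread sets a header; the worker thread sees it
# only because the wrapper re-installs the captured value.
set_header("auth-token-123")
result = {}

def call_downstream():
    result["header"] = get_header()

worker = threading.Thread(target=run_with_captured_context(call_downstream))
worker.start()
worker.join()
```

The key design point is that the capture happens at submission time, on the caller's thread, while the restore happens at execution time, on the worker's thread — which is exactly the hook a wrapping concurrency strategy provides.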
|
9,157
| 12,217,314,720
|
IssuesEvent
|
2020-05-01 16:56:15
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
partially successful/warning condition
|
Pri1 devops-cicd-process/tech devops/prod doc-bug
|
If a task fails with continue on error the job is triggered as partially successful. Is there a condition to pick up on this? I tried with failed() which didn't trigger and succeeded() which worked. But i don't really want it to trigger on true succeeded...
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 3f151218-9a11-0078-e038-f96198a76143
* Version Independent ID: 09c4d032-62f3-d97c-79d7-6fbfd89910e9
* Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml#feedback)
* Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/conditions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
partially successful/warning condition - If a task fails with continue on error the job is triggered as partially successful. Is there a condition to pick up on this? I tried with failed() which didn't trigger and succeeded() which worked. But i don't really want it to trigger on true succeeded...
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 3f151218-9a11-0078-e038-f96198a76143
* Version Independent ID: 09c4d032-62f3-d97c-79d7-6fbfd89910e9
* Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml#feedback)
* Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/conditions.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
partially successful warning condition if a task fails with continue on error the job is triggered as partially successful is there a condition to pick up on this i tried with failed which didn t trigger and succeeded which worked but i don t really want it to trigger on true succeeded document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
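The behavior the reporter sees matches the documented condition semantics: a job whose task fails under continueOnError finishes as SucceededWithIssues, which `succeeded()` treats as a success, while `failed()` does not fire; to trigger only on the partial-success case, the result has to be tested directly. A small Python model of those semantics (a simplification of the real server-side expression evaluator; function names mirror the YAML condition functions):

```python
# Simplified model of Azure Pipelines job-status condition functions.
# Real evaluation happens server-side; this only mirrors the documented
# mapping of job results to condition outcomes.

def succeeded(result):
    # succeeded() is true for clean and partially successful runs
    return result in ("Succeeded", "SucceededWithIssues")

def failed(result):
    return result == "Failed"

def succeeded_or_failed(result):
    # everything except a canceled/skipped run
    return result in ("Succeeded", "SucceededWithIssues", "Failed")

def partially_succeeded(result):
    # to fire *only* on the continueOnError case, test the result
    # directly, e.g. eq(dependencies.A.result, 'SucceededWithIssues')
    return result == "SucceededWithIssues"

# A continueOnError task failure yields this job result:
job_result = "SucceededWithIssues"
```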
|
9,596
| 12,543,313,730
|
IssuesEvent
|
2020-06-05 15:21:00
|
knative/serving
|
https://api.github.com/repos/knative/serving
|
closed
|
Kubernetes 1.16 minimum version
|
area/API area/test-and-release kind/feature kind/process
|
/area API
/area test-and-release
/kind process
## Describe the feature
This happens in knative/pkg, but opening this here to track.
This is in accordance with our Release Principles.
|
1.0
|
Kubernetes 1.16 minimum version - /area API
/area test-and-release
/kind process
## Describe the feature
This happens in knative/pkg, but opening this here to track.
This is in accordance with our Release Principles.
|
process
|
kubernetes minimum version area api area test and release kind process describe the feature this happens in knative pkg but opening this here to track this is in accordance with our release principles
| 1
|
1,285
| 3,822,507,638
|
IssuesEvent
|
2016-03-30 01:36:50
|
dataproofer/Dataproofer
|
https://api.github.com/repos/dataproofer/Dataproofer
|
opened
|
Read in Excel file data
|
engine: processing large
|
Currently the app handles data from a CSV file or a Google Spreadsheet tab. We'd need to add this functionality into the logic in `src/` and some UI into `electron/`. Dataproofer reads spreadsheets in as an array of row objects, so we'll have to convert it if it comes in as an array of row arrays.
## Possible Fix
[Node XLSX](https://www.npmjs.com/package/node-xlsx)
|
1.0
|
Read in Excel file data - Currently the app handles data from a CSV file or a Google Spreadsheet tab. We'd need to add this functionality into the logic in `src/` and some UI into `electron/`. Dataproofer reads spreadsheets in as an array of row objects, so we'll have to convert it if it comes in as an array of row arrays.
## Possible Fix
[Node XLSX](https://www.npmjs.com/package/node-xlsx)
|
process
|
read in excel file data currently the app handles data from a csv file or a google spreadsheet tab we d need to add this functionality into the logic in src and some ui into electron dataproofer reads spreadsheets in as an array of row objects so we ll have to convert it if it comes in as an array of row arrays possible fix
| 1
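The conversion this issue calls for — turning an array of row arrays (how XLSX parsers often emit a sheet) into the array of row objects Dataproofer expects — is straightforward once the header row is split off. A sketch (function name and sample data are illustrative, not Dataproofer's actual API):

```python
def rows_to_objects(rows):
    """Convert [[header...], [cells...], ...] into a list of dicts,
    one per data row, keyed by the header row."""
    if not rows:
        return []
    header, *data = rows
    return [dict(zip(header, row)) for row in data]

# Example: node-xlsx style array-of-arrays output, and the converted form.
sheet = [
    ["name", "age"],
    ["ada", 36],
    ["alan", 41],
]
records = rows_to_objects(sheet)
```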
|
2,931
| 5,917,225,949
|
IssuesEvent
|
2017-05-22 12:45:16
|
intelsdi-x/snap
|
https://api.github.com/repos/intelsdi-x/snap
|
closed
|
Plugin wanted: logs filter
|
plugin-wishlist/processor
|
I'd like to have a processor plugin to filter logs (using tags) collected by [snap-plugin-collector-logs](https://github.com/intelsdi-x/snap-plugin-collector-logs).
|
1.0
|
Plugin wanted: logs filter - I'd like to have a processor plugin to filter logs (using tags) collected by [snap-plugin-collector-logs](https://github.com/intelsdi-x/snap-plugin-collector-logs).
|
process
|
plugin wanted logs filter i d like to have a processor plugin to filter logs using tags collected by
| 1
|
4,959
| 7,802,863,316
|
IssuesEvent
|
2018-06-10 17:12:25
|
StrikeNP/trac_test
|
https://api.github.com/repos/StrikeNP/trac_test
|
closed
|
Add useful plots and get rid of useless plots on plotgen (Trac #824)
|
Migrated from Trac bmg2@uwm.edu post_processing task
|
Plotgen could be improved by adding more useful panels to it. However, that also increases the amount of space that each plotgen ```.maff``` files takes up, and we don't want to go over the attachment limit for Trac. In order to make room for new plots, useless panels should be taken off plotgen. An example of a useless panel would be graupel mixing ratio for the FIRE stratocumulus case.
Attachments:
http://carson.math.uwm.edu/trac/clubb/attachment/ticket/824/plotgen_test_r8643.maff
http://carson.math.uwm.edu/trac/clubb/attachment/ticket/824/plotgen_test_r8655.maff
Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/824
```json
{
"status": "closed",
"changetime": "2018-06-05T18:22:00",
"description": "Plotgen could be improved by adding more useful panels to it. However, that also increases the amount of space that each plotgen {{{.maff}}} files takes up, and we don't want to go over the attachment limit for Trac. In order to make room for new plots, useless panels should be taken off plotgen. An example of a useless panel would be graupel mixing ratio for the FIRE stratocumulus case.",
"reporter": "bmg2@uwm.edu",
"cc": "vlarson@uwm.edu",
"resolution": "fixed",
"_ts": "1528222920626922",
"component": "post_processing",
"summary": "Add useful plots and get rid of useless plots on plotgen",
"priority": "minor",
"keywords": "new plots",
"time": "2018-04-19T21:46:14",
"milestone": "Improve Plotgen",
"owner": "bmg2@uwm.edu",
"type": "task"
}
```
|
1.0
|
Add useful plots and get rid of useless plots on plotgen (Trac #824) - Plotgen could be improved by adding more useful panels to it. However, that also increases the amount of space that each plotgen ```.maff``` files takes up, and we don't want to go over the attachment limit for Trac. In order to make room for new plots, useless panels should be taken off plotgen. An example of a useless panel would be graupel mixing ratio for the FIRE stratocumulus case.
Attachments:
http://carson.math.uwm.edu/trac/clubb/attachment/ticket/824/plotgen_test_r8643.maff
http://carson.math.uwm.edu/trac/clubb/attachment/ticket/824/plotgen_test_r8655.maff
Migrated from http://carson.math.uwm.edu/trac/clubb/ticket/824
```json
{
"status": "closed",
"changetime": "2018-06-05T18:22:00",
"description": "Plotgen could be improved by adding more useful panels to it. However, that also increases the amount of space that each plotgen {{{.maff}}} files takes up, and we don't want to go over the attachment limit for Trac. In order to make room for new plots, useless panels should be taken off plotgen. An example of a useless panel would be graupel mixing ratio for the FIRE stratocumulus case.",
"reporter": "bmg2@uwm.edu",
"cc": "vlarson@uwm.edu",
"resolution": "fixed",
"_ts": "1528222920626922",
"component": "post_processing",
"summary": "Add useful plots and get rid of useless plots on plotgen",
"priority": "minor",
"keywords": "new plots",
"time": "2018-04-19T21:46:14",
"milestone": "Improve Plotgen",
"owner": "bmg2@uwm.edu",
"type": "task"
}
```
|
process
|
add useful plots and get rid of useless plots on plotgen trac plotgen could be improved by adding more useful panels to it however that also increases the amount of space that each plotgen maff files takes up and we don t want to go over the attachment limit for trac in order to make room for new plots useless panels should be taken off plotgen an example of a useless panel would be graupel mixing ratio for the fire stratocumulus case attachments migrated from json status closed changetime description plotgen could be improved by adding more useful panels to it however that also increases the amount of space that each plotgen maff files takes up and we don t want to go over the attachment limit for trac in order to make room for new plots useless panels should be taken off plotgen an example of a useless panel would be graupel mixing ratio for the fire stratocumulus case reporter uwm edu cc vlarson uwm edu resolution fixed ts component post processing summary add useful plots and get rid of useless plots on plotgen priority minor keywords new plots time milestone improve plotgen owner uwm edu type task
| 1
|
1,888
| 2,578,103,259
|
IssuesEvent
|
2015-02-12 21:05:20
|
SCIInstitute/SCIRun
|
https://api.github.com/repos/SCIInstitute/SCIRun
|
opened
|
Unstable streamline computation methods in GenerateStreamLines module
|
Modules Priority-Medium Testing
|
Cell walk is definitely unstable (see SCIRun 5's GenerateStreamLines-cellwalk-partial.srn5 vs SCIRun 4's GenerateStreamLines-cellwalk.srn). Other computation methods need testing too.
|
1.0
|
Unstable streamline computation methods in GenerateStreamLines module - Cell walk is definitely unstable (see SCIRun 5's GenerateStreamLines-cellwalk-partial.srn5 vs SCIRun 4's GenerateStreamLines-cellwalk.srn). Other computation methods need testing too.
|
non_process
|
unstable streamline computation methods in generatestreamlines module cell walk is definitely unstable see scirun s generatestreamlines cellwalk partial vs scirun s generatestreamlines cellwalk srn other computation methods need testing too
| 0
|
18,362
| 24,492,287,732
|
IssuesEvent
|
2022-10-10 04:16:50
|
phamtanduongtk29/html-css-training
|
https://api.github.com/repos/phamtanduongtk29/html-css-training
|
opened
|
Create and Responsive pricing section
|
not yet processing
|
- Estimates: 7h
- Create heading
- Create table price list :
- Price
- Description
|
1.0
|
Create and Responsive pricing section - - Estimates: 7h
- Create heading
- Create table price list :
- Price
- Description
|
process
|
create and responsive pricing section estimates create heading create table price list price description
| 1
|
84,068
| 10,470,921,084
|
IssuesEvent
|
2019-09-23 06:11:38
|
ADK-Lulu/LMS-Grupp-awesome
|
https://api.github.com/repos/ADK-Lulu/LMS-Grupp-awesome
|
opened
|
6.1 The "Logga ut" (log out) menu is not optimized for Chrome on Android
|
Design flaw Minor
|
When you log out from Chrome on Android, a menu appears that is not optimized for Android. The "Avbryt" (cancel) and "Byt lösenord" (change password) buttons work, but the whole "Logga ut" (log out) button is not visible. If you drag the menu slightly to the side you can see the whole button, but then part of the white background disappears (see the picture). This could be improved by either shrinking the buttons a little or placing them under one another.

|
1.0
|
6.1 The "Logga ut" (log out) menu is not optimized for Chrome on Android - When you log out from Chrome on Android, a menu appears that is not optimized for Android. The "Avbryt" (cancel) and "Byt lösenord" (change password) buttons work, but the whole "Logga ut" (log out) button is not visible. If you drag the menu slightly to the side you can see the whole button, but then part of the white background disappears (see the picture). This could be improved by either shrinking the buttons a little or placing them under one another.

|
non_process
|
logga ut menyn är inte optimerad för chrome på android när man ska logga ut från chrome android får man upp en meny som inte är optimerad för android knapparna avbryt och byt lösenord fungerar men man ser inte hela logga ut knappen om man drar menyn lite åt sidan får man hela knappen men då försvinner lite av den vita bakgrunden se bilden man skulle kunna optimera detta genom att antingen förminska knapparna lite eller lägga de under varandra
| 0
|
155
| 2,581,503,469
|
IssuesEvent
|
2015-02-14 03:33:01
|
tinkerpop/tinkerpop3
|
https://api.github.com/repos/tinkerpop/tinkerpop3
|
opened
|
Add RangeLocalStep (and called RangeStep, RangeGlobalStep)
|
enhancement process
|
I need:
```groovy
__inject(m).order(local).by(keyDecr).limit(local,10)
```
Can you do this @dkuppitz ?
|
1.0
|
Add RangeLocalStep (and called RangeStep, RangeGlobalStep) - I need:
```groovy
__inject(m).order(local).by(keyDecr).limit(local,10)
```
Can you do this @dkuppitz ?
|
process
|
add rangelocalstep and called rangestep rangeglobalstep i need groovy inject m order local by keydecr limit local can you do this dkuppitz
| 1
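The distinction being requested — a local range applied inside each traverser's collection versus the existing global RangeStep applied across the whole stream — can be sketched in Python (Gremlin's `by(keyDecr)` over maps is simplified here to sorting plain lists of comparable values):

```python
def order_local_by_decr(collections):
    # order(local).by(decr): sort inside each incoming collection
    return [sorted(c, reverse=True) for c in collections]

def limit_local(collections, n):
    # limit(local, n): take the first n elements of each collection
    return [c[:n] for c in collections]

def limit_global(collections, n):
    # plain limit(n): take the first n traversers of the whole stream
    return collections[:n]

# __.inject(m).order(local).by(keyDecr).limit(local, 10) acts
# per-collection, as in this miniature version:
m = [[5, 1, 9, 3], [8, 2]]
top2_each = limit_local(order_local_by_decr(m), 2)
```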
|
12,558
| 14,979,103,221
|
IssuesEvent
|
2021-01-28 11:46:10
|
panther-labs/panther
|
https://api.github.com/repos/panther-labs/panther
|
opened
|
Panther should refresh the S3 role IAM credentials
|
bug p1 team:data processing
|
### Describe the bug
Log processor will fail to process incoming events from S3 with `"InvalidAccessKeyId: The AWS Access Key Id you provided does not exist in our records"`
### Steps to reproduce
Steps to reproduce the behavior:
1. Onboard an S3 source
2. Send some data to it, make sure Panther has processed them.
3. Delete the CFN stack. Redeploy a new stack
4. Check log processor for errors
### Expected behavior
Log process should pick up the new IAM creds from the source
### Environment
How are you deploying or using Panther?
- Panther version or commit: 1.15.2
### Additional context
Panther Log process caches the IAM role credentials indefinitely and only refreshes the STS credentials. We should refresh the IAM role credentials if we encounter `InvalidAccessKeyId` exception.
|
1.0
|
Panther should refresh the S3 role IAM credentials - ### Describe the bug
Log processor will fail to process incoming events from S3 with `"InvalidAccessKeyId: The AWS Access Key Id you provided does not exist in our records"`
### Steps to reproduce
Steps to reproduce the behavior:
1. Onboard an S3 source
2. Send some data to it, make sure Panther has processed them.
3. Delete the CFN stack. Redeploy a new stack
4. Check log processor for errors
### Expected behavior
Log process should pick up the new IAM creds from the source
### Environment
How are you deploying or using Panther?
- Panther version or commit: 1.15.2
### Additional context
Panther Log process caches the IAM role credentials indefinitely and only refreshes the STS credentials. We should refresh the IAM role credentials if we encounter `InvalidAccessKeyId` exception.
|
process
|
panther should refresh the role iam credentials describe the bug log processor will fail to process incoming events from with invalidaccesskeyid the aws access key id you provided does not exist in our records steps to reproduce steps to reproduce the behavior onboard an source send some data to it make sure panther has processed them delete the cfn stack redeploy a new stack check log processor for errors expected behavior log process should pick up the new iam creds from the source environment how are you deploying or using panther panther version or commit additional context panther log process caches the iam role credentials indefinitely and only refreshes the sts credentials we should refresh the iam role credentials if we encounter invalidaccesskeyid exception
| 1
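The fix this issue proposes — drop the cached IAM role credentials and re-assume the role when S3 rejects them — is a standard invalidate-on-error pattern. A Python sketch with a stubbed client (Panther's log processor is Go and uses the AWS SDK; everything here, including the exception type, is illustrative):

```python
class CredentialCache:
    """Caches role credentials and invalidates them when the
    backing store reports InvalidAccessKeyId."""

    def __init__(self, fetch_credentials):
        self._fetch = fetch_credentials   # e.g. an sts:AssumeRole call
        self._creds = None

    def get(self):
        if self._creds is None:
            self._creds = self._fetch()
        return self._creds

    def invalidate(self):
        self._creds = None

def read_object(cache, client):
    """Read with one retry after refreshing stale credentials."""
    for attempt in (1, 2):
        try:
            return client.get(cache.get())
        except PermissionError as err:   # stands in for the AWS error
            if "InvalidAccessKeyId" in str(err) and attempt == 1:
                cache.invalidate()       # force a fresh AssumeRole
                continue
            raise

# Stub client: rejects the first (stale) key, accepts the refreshed one.
class StubClient:
    def __init__(self):
        self.valid = "key-2"
    def get(self, creds):
        if creds != self.valid:
            raise PermissionError("InvalidAccessKeyId: stale credentials")
        return "object-bytes"

keys = iter(["key-1", "key-2"])
cache = CredentialCache(lambda: next(keys))
data = read_object(cache, StubClient())
```

The single-retry bound matters: if the refreshed credentials are also rejected, the error propagates instead of looping.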
|
14,170
| 8,494,404,616
|
IssuesEvent
|
2018-10-28 21:11:01
|
mapbox/mapbox-navigation-ios
|
https://api.github.com/repos/mapbox/mapbox-navigation-ios
|
opened
|
Road name / way name label should only be updated once per maneuver
|
- performance - refactor UI
|
Following on https://github.com/mapbox/mapbox-navigation-ios/pull/1576, it has been suggested that we introduce an optimization to only perform these calculations on each maneuver as opposed to on each location update.
We would also benefit from profiling the CPU time before and after such a change.
|
True
|
Road name / way name label should only be updated once per maneuver - Following on https://github.com/mapbox/mapbox-navigation-ios/pull/1576, it has been suggested that we introduce an optimization to only perform these calculations on each maneuver as opposed to on each location update.
We would also benefit from profiling the CPU time before and after such a change.
|
non_process
|
road name way name label should only be updated once per maneuver following on it has been suggested that we introduce an optimization to only perform these calculations on each maneuver as opposed to on each location update we would also benefit from profiling the cpu time before and after such a change
| 0
|
714
| 3,206,073,707
|
IssuesEvent
|
2015-10-04 18:09:21
|
symfony/symfony
|
https://api.github.com/repos/symfony/symfony
|
closed
|
Missing check in Finder component for WindowsPipes.php
|
Bug Process
|
Composer does not work anymore here, i get this message:
```
←[37;41m ←[39;49m
←[37;41m [ErrorException] ←[39;49m
←[37;41m fopen(): Filename cannot be empty ←[39;49m
←[37;41m ←[39;49m
←[32minstall [--prefer-source] [--prefer-dist] [--dry-run] [--dev] [--no-dev] [--no-plugins] [--no-custom-installers] [--no-autoloader] [--no-scripts] [--no-progress] [-v|vv|vvv|--verbose] [-o|--optimize-autoloader] [--ignore-platform-reqs] [packages1] ... [packagesN]←[39m
```
(not sure why i am getting terminal codes, that's another issue)
I extracted the `composer.phar` and placed some debug points around fopen statements. Seems that this call fails `tempnam(sys_get_temp_dir(), 'sf_proc_stdout')` [here](https://github.com/symfony/Process/blob/e42521d38b122a2ad8a0efb4f9ccaad9dd04b90b/Pipes/WindowsPipes.php#L51-L52)
There is no check if the function returns `false` and no exception is thrown.
Ini setting: `sys_temp_dir = "D:\dev\php\tmp"`. I forgot to recreate this directory when changing from a 64bits php to 32bits php.
PHP manual [states](https://secure.php.net/manual/en/function.tempnam.php): `If the directory does not exist or is not writable, tempnam() may generate a file in the system's temporary directory, and return the full path to that file, including its name.`
That `may` in `may generate a file` is not a promise that can be relied on, apparently
|
1.0
|
Missing check in Finder component for WindowsPipes.php - Composer does not work anymore here, i get this message:
```
←[37;41m ←[39;49m
←[37;41m [ErrorException] ←[39;49m
←[37;41m fopen(): Filename cannot be empty ←[39;49m
←[37;41m ←[39;49m
←[32minstall [--prefer-source] [--prefer-dist] [--dry-run] [--dev] [--no-dev] [--no-plugins] [--no-custom-installers] [--no-autoloader] [--no-scripts] [--no-progress] [-v|vv|vvv|--verbose] [-o|--optimize-autoloader] [--ignore-platform-reqs] [packages1] ... [packagesN]←[39m
```
(not sure why i am getting terminal codes, that's another issue)
I extracted the `composer.phar` and placed some debug points around fopen statements. Seems that this call fails `tempnam(sys_get_temp_dir(), 'sf_proc_stdout')` [here](https://github.com/symfony/Process/blob/e42521d38b122a2ad8a0efb4f9ccaad9dd04b90b/Pipes/WindowsPipes.php#L51-L52)
There is no check if the function returns `false` and no exception is thrown.
Ini setting: `sys_temp_dir = "D:\dev\php\tmp"`. I forgot to recreate this directory when changing from a 64bits php to 32bits php.
PHP manual [states](https://secure.php.net/manual/en/function.tempnam.php): `If the directory does not exist or is not writable, tempnam() may generate a file in the system's temporary directory, and return the full path to that file, including its name.`
That `may` in `may generate a file` is not a promise that can be relied on, apparently
|
process
|
missing check in finder component for windowspipes php composer does not work anymore here i get this message ← ← ← ← ← fopen filename cannot be empty ← ← ← ← o optim ize autoloader ← not sure why i am getting terminal codes that s another issue i extracted the composer phar and placed some debug points around fopen statements seems that this call fails tempnam sys get temp dir sf proc stdout there is no check if the function returns false and no exception is thrown ini setting sys temp dir d dev php tmp i forgot to recreate this directory when changing from a php to php php manual if the directory does not exist or is not writable tempnam may generate a file in the system s temporary directory and return the full path to that file including its name that may in may generate a file is not a promise that be relied on apparently
| 1
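The missing check described above — `tempnam()` can return false, and the pipes code passed that result straight to `fopen()` — boils down to treating a falsy temp path as an error instead of using it. A Python rendering of that guard (Python's `tempfile` raises on failure, so the false return is simulated with an injected primitive; the names are illustrative, not Symfony's):

```python
import os
import tempfile

def default_tempnam():
    # Python analogue of tempnam(sys_get_temp_dir(), 'sf_proc_stdout')
    fd, path = tempfile.mkstemp(prefix="sf_proc_stdout")
    os.close(fd)
    return path

def make_pipe_file(tempnam=default_tempnam):
    """Create a temp file for process output, failing loudly if the
    underlying primitive could not produce a path (the unguarded case
    in WindowsPipes.php)."""
    path = tempnam()
    if not path:
        raise RuntimeError(
            "A temporary file could not be created to handle process output"
        )
    return path

path = make_pipe_file()
os.remove(path)

# Simulating the PHP failure mode: the primitive returning false.
try:
    make_pipe_file(tempnam=lambda: False)
    guarded = False
except RuntimeError:
    guarded = True
```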
|
10,961
| 13,767,221,144
|
IssuesEvent
|
2020-10-07 15:28:20
|
Arch666Angel/mods
|
https://api.github.com/repos/Arch666Angel/mods
|
closed
|
Cannot manually craft wood from trees with bob character classes
|
Angels Bio Processing Impact: Bug
|
**Describe the bug**
See [forums](https://forums.factorio.com/viewtopic.php?p=508703#p508703), when playing with bobs character classes, the manual crafting of trees to wood is not possible as the manual crafting category is not added to bobs character classes.
https://github.com/Arch666Angel/mods/blob/b0c1ecad3cb397e0591209f289a96a8e6440f86d/angelsbioprocessing/prototypes/bio-processing-override.lua#L346-L348
Just need to find where we add it to the base game character...
|
1.0
|
Cannot manually craft wood from trees with bob character classes - **Describe the bug**
See [forums](https://forums.factorio.com/viewtopic.php?p=508703#p508703), when playing with bobs character classes, the manual crafting of trees to wood is not possible as the manual crafting category is not added to bobs character classes.
https://github.com/Arch666Angel/mods/blob/b0c1ecad3cb397e0591209f289a96a8e6440f86d/angelsbioprocessing/prototypes/bio-processing-override.lua#L346-L348
Just need to find where we add it to the base game character...
|
process
|
cannot manually craft wood from trees with bob character classes describe the bug see when playing with bobs character classes the manual crafting of trees to wood is not possible as the manual crafting category is not added to bobs character classes just need to find where we add it to the base game character
| 1
|
12,037
| 14,738,707,837
|
IssuesEvent
|
2021-01-07 05:30:55
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
RE: Speed-E'z Exchange - Not saving activities.
|
anc-external anc-ops anc-process anp-important ant-bug ant-support
|
In GitLab by @kdjstudios on Jul 11, 2018, 09:47
**Submitted by:** Jo Ann Browne <joann@speedez.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-07-11-77793/conversation
**Server:** External
**Client/Site:** Speedez
**Account:** 5412
**Issue:**
I am getting very tired of having these strange things that are happening to my account – It scares me when information just comes up missing and NOW I cannot even put in any ACTIVITIES on account 5412 so I can bill them. My goodness someone needs to help me, this is crazy – I have tried 10 times putting in the ACTIVITIES and clicked on SAVE and then nothing is there – it took me 6 times of putting in the RECURRING items before it showed up on the screen. Just maybe my account needs to be reset, but, I still have not heard from anyone in Support as to why the other two accounts lost all of the information that was in there to bill on. So, just saying – I am still waiting.
|
1.0
|
RE: Speed-E'z Exchange - Not saving activities. - In GitLab by @kdjstudios on Jul 11, 2018, 09:47
**Submitted by:** Jo Ann Browne <joann@speedez.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-07-11-77793/conversation
**Server:** External
**Client/Site:** Speedez
**Account:** 5412
**Issue:**
I am getting very tired of having these strange things that are happening to my account – It scares me when information just comes up missing and NOW I cannot even put in any ACTIVITIES on account 5412 so I can bill them. My goodness someone needs to help me, this is crazy – I have tried 10 times putting in the ACTIVITIES and clicked on SAVE and then nothing is there – it took me 6 times of putting in the RECURRING items before it showed up on the screen. Just maybe my account needs to be reset, but, I still have not heard from anyone in Support as to why the other two accounts lost all of the information that was in there to bill on. So, just saying – I am still waiting.
|
process
|
re speed e z exchange not saving activities in gitlab by kdjstudios on jul submitted by jo ann browne helpdesk server external client site speedez account issue i am getting very tired of having these strange things that are happening to my account – it scares me when information just comes up missing and now i cannot even put in any activities on account so i can bill them my goodness someone needs to help me this is crazy – i have tried times putting in the activities and clicked on save and then nothing is there – it took me times of putting in the recurring items before it showed up on the screen just maybe my account needs to be reset but i still have not heard from anyone in support as to why the other two accounts lost all of the information that was in there to bill on so just saying – i am still waiting
| 1
|
11,225
| 14,004,140,302
|
IssuesEvent
|
2020-10-28 16:43:47
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
closed
|
Native Types Postgres: prisma.types.create() with Decimal[] results in runtime error
|
bug/2-confirmed kind/bug process/candidate topic: native database types topic: prisma-client
|
<!--
Thanks for helping us improve Prisma! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by setting the `DEBUG="*"` environment variable and enabling additional logging output in Prisma Client.
Learn more about writing proper bug reports here: https://pris.ly/d/bug-reports
-->
## Bug description
`prisma.types.create()` with `Decimal[]` in Postgres results in runtime error
## How to reproduce
```prisma
model types {
dl Decimal[] @db.Numeric(65, 30)
}
```
```ts
import { PrismaClient, Decimal } from '@prisma/client'
const prisma = new PrismaClient()
async function main() {
await prisma.$connect()
await prisma.types.create({
dl: new Decimal(0) // in Schema, dl Decimal[] @db.Numeric(65, 30)
})
})
```
```
TypeError: arg.inputTypes is not iterable
at valueToArg (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14572:31)
at /Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14753:17
at Array.reduce (<anonymous>)
at objectToArgs (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14730:28)
at tryInferArgs (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14641:40)
at valueToArg (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14573:16)
at /Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14753:17
at Array.reduce (<anonymous>)
at objectToArgs (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14730:28)
at tryInferArgs (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14641:40) {
clientVersion: '2.10.0-dev.58'
}
```
## Expected behavior
Creating lists of decimals should work.
## Environment & setup
<!-- In which environment does the problem occur -->
```
Environment variables loaded from ./prisma/.env
@prisma/cli : 2.10.0-dev.74
@prisma/client : 2.10.0-dev.58
Current platform : darwin
Query Engine : query-engine 77abecc4840127ebdcc02b83ee1e2c9cc27009f2 (at ../../../../../.npm/_npx/6106/lib/node_modules/@prisma/cli/query-engine-darwin)
Migration Engine : migration-engine-cli 77abecc4840127ebdcc02b83ee1e2c9cc27009f2 (at ../../../../../.npm/_npx/6106/lib/node_modules/@prisma/cli/migration-engine-darwin)
Introspection Engine : introspection-core 77abecc4840127ebdcc02b83ee1e2c9cc27009f2 (at ../../../../../.npm/_npx/6106/lib/node_modules/@prisma/cli/introspection-engine-darwin)
Format Binary : prisma-fmt 77abecc4840127ebdcc02b83ee1e2c9cc27009f2 (at ../../../../../.npm/_npx/6106/lib/node_modules/@prisma/cli/prisma-fmt-darwin)
Studio : 0.304.0
Preview Features : nativeTypes
```
|
1.0
|
Native Types Postgres: prisma.types.create() with Decimal[] results in runtime error - <!--
Thanks for helping us improve Prisma! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by setting the `DEBUG="*"` environment variable and enabling additional logging output in Prisma Client.
Learn more about writing proper bug reports here: https://pris.ly/d/bug-reports
-->
## Bug description
`prisma.types.create()` with `Decimal[]` in Postgres results in runtime error
## How to reproduce
```prisma
model types {
dl Decimal[] @db.Numeric(65, 30)
}
```
```ts
import { PrismaClient, Decimal } from '@prisma/client'
const prisma = new PrismaClient()
async function main() {
await prisma.$connect()
await prisma.types.create({
dl: new Decimal(0) // in Schema, dl Decimal[] @db.Numeric(65, 30)
})
})
```
```
TypeError: arg.inputTypes is not iterable
at valueToArg (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14572:31)
at /Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14753:17
at Array.reduce (<anonymous>)
at objectToArgs (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14730:28)
at tryInferArgs (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14641:40)
at valueToArg (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14573:16)
at /Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14753:17
at Array.reduce (<anonymous>)
at objectToArgs (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14730:28)
at tryInferArgs (/Users/m/Go/src/github.com/prisma/qa-native-types/node_modules/@prisma/client/runtime/index.js:14641:40) {
clientVersion: '2.10.0-dev.58'
}
```
## Expected behavior
Creating lists of decimals should work.
## Environment & setup
<!-- In which environment does the problem occur -->
```
Environment variables loaded from ./prisma/.env
@prisma/cli : 2.10.0-dev.74
@prisma/client : 2.10.0-dev.58
Current platform : darwin
Query Engine : query-engine 77abecc4840127ebdcc02b83ee1e2c9cc27009f2 (at ../../../../../.npm/_npx/6106/lib/node_modules/@prisma/cli/query-engine-darwin)
Migration Engine : migration-engine-cli 77abecc4840127ebdcc02b83ee1e2c9cc27009f2 (at ../../../../../.npm/_npx/6106/lib/node_modules/@prisma/cli/migration-engine-darwin)
Introspection Engine : introspection-core 77abecc4840127ebdcc02b83ee1e2c9cc27009f2 (at ../../../../../.npm/_npx/6106/lib/node_modules/@prisma/cli/introspection-engine-darwin)
Format Binary : prisma-fmt 77abecc4840127ebdcc02b83ee1e2c9cc27009f2 (at ../../../../../.npm/_npx/6106/lib/node_modules/@prisma/cli/prisma-fmt-darwin)
Studio : 0.304.0
Preview Features : nativeTypes
```
|
process
|
native types postgres prisma types create with decimal results in runtime error thanks for helping us improve prisma 🙏 please follow the sections in the template and provide as much information as possible about your problem e g by setting the debug environment variable and enabling additional logging output in prisma client learn more about writing proper bug reports here bug description prisma types create with decimal in postgres results in runtime error how to reproduce prisma model types dl decimal db numeric ts import prismaclient decimal from prisma client const prisma new prismaclient async function main await prisma connect await prisma types create dl new decimal in schema dl decimal db numeric typeerror arg inputtypes is not iterable at valuetoarg users m go src github com prisma qa native types node modules prisma client runtime index js at users m go src github com prisma qa native types node modules prisma client runtime index js at array reduce at objecttoargs users m go src github com prisma qa native types node modules prisma client runtime index js at tryinferargs users m go src github com prisma qa native types node modules prisma client runtime index js at valuetoarg users m go src github com prisma qa native types node modules prisma client runtime index js at users m go src github com prisma qa native types node modules prisma client runtime index js at array reduce at objecttoargs users m go src github com prisma qa native types node modules prisma client runtime index js at tryinferargs users m go src github com prisma qa native types node modules prisma client runtime index js clientversion dev expected behavior creating lists of decimals should work environment setup environment variables loaded from prisma env prisma cli dev prisma client dev current platform darwin query engine query engine at npm npx lib node modules prisma cli query engine darwin migration engine migration engine cli at npm npx lib node modules prisma cli migration engine darwin introspection engine introspection core at npm npx lib node modules prisma cli introspection engine darwin format binary prisma fmt at npm npx lib node modules prisma cli prisma fmt darwin studio preview features nativetypes
| 1
|
19,588
| 25,923,546,120
|
IssuesEvent
|
2022-12-16 00:57:09
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
closed
|
DISABLED test_exception_single (__main__.SpawnTest)
|
module: multiprocessing triaged module: flaky-tests skipped
|
Platforms: linux
This test was disabled because it is failing in CI. See [recent examples](http://torch-ci.com/failure/test_exception_single%2C%20SpawnTest) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/6764234515).
Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 red and 6 green.
cc @VitalyFedyunin
|
1.0
|
DISABLED test_exception_single (__main__.SpawnTest) - Platforms: linux
This test was disabled because it is failing in CI. See [recent examples](http://torch-ci.com/failure/test_exception_single%2C%20SpawnTest) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/6764234515).
Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 red and 6 green.
cc @VitalyFedyunin
|
process
|
disabled test exception single main spawntest platforms linux this test was disabled because it is failing in ci see and the most recent trunk over the past hours it has been determined flaky in workflow s with red and green cc vitalyfedyunin
| 1
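Comparing each record's `text` column with its `text_combine` column (for example, "DISABLED test_exception_single ... 2 red and 6 green" becoming "disabled test exception single ... red and green" above) suggests the derivation: lowercase, strip URLs, drop digits and punctuation, collapse whitespace. A minimal sketch of that inferred normalization, assuming this reconstruction of the pipeline; the function name `normalize` is hypothetical, not from the dataset:

```python
import re


def normalize(text_combine: str) -> str:
    # Hypothetical reconstruction of how the `text` column appears to be
    # derived from `text_combine` in this dataset: lowercase, URLs removed,
    # digits and punctuation dropped, whitespace collapsed.
    t = text_combine.lower()
    t = re.sub(r"https?://\S+", " ", t)    # drop bare URLs
    t = re.sub(r"[0-9]+", " ", t)          # drop numbers
    t = re.sub(r"[^a-z\s]", " ", t)        # drop punctuation and symbols
    return re.sub(r"\s+", " ", t).strip()  # collapse runs of whitespace


print(normalize("DISABLED test_exception_single (2 red and 6 green)"))
```

This reproduces the visible pattern (underscores and counts vanish, words survive), but edge cases such as emoji or non-Latin text may be handled differently by the real pipeline.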
|
320,209
| 23,804,612,450
|
IssuesEvent
|
2022-09-03 21:05:24
|
sveltejs/kit
|
https://api.github.com/repos/sveltejs/kit
|
closed
|
Arrays returned by endpoints (page or layout) are transformed into objects
|
documentation p3-edge-case
|
### Describe the bug
I have this `+page.svelte`:
```svelte
<script lang="ts">
import type { PageData } from './$types';
export let data: PageData;
</script>
<pre>{JSON.stringify(data, null, 2)}</pre>
```
And this `+page.ts`:
```ts
/** @type {import('./$types').PageLoad} */
export function load() {
return ['42', '9000'];
}
```
In my page, I get this:
<img width="144" alt="image" src="https://user-images.githubusercontent.com/537567/186864314-de6c1780-4f13-46b5-906a-079a8c3272b6.png">
Since layout data are merged into children pages data, I understand that we must work with objects, but I would love to see a note in the docs about this.
### Reproduction
https://github.com/SiegfriedEhret/svelte-request/tree/main/src/routes/test
### Logs
_No response_
### System Info
```shell
❯ npx envinfo --system --binaries --browsers --npmPackages "{svelte,@sveltejs/*,vite}"
System:
OS: macOS 12.5
CPU: (8) x64 Apple M1
Memory: 216.27 MB / 16.00 GB
Shell: 3.2.2 - /usr/local/bin/fish
Binaries:
Node: 16.17.0 - ~/.local/share/nvm/v16.17.0/bin/node
npm: 8.18.0 - ~/.local/share/nvm/v16.17.0/bin/npm
Browsers:
Chrome: 104.0.5112.101
Firefox Developer Edition: 105.0
Safari: 15.6
npmPackages:
@sveltejs/adapter-auto: next => 1.0.0-next.66
@sveltejs/kit: next => 1.0.0-next.442
svelte: ^3.46.0 => 3.49.0
vite: ^3.0.8 => 3.0.9
```
### Severity
annoyance
### Additional Information
_No response_
|
1.0
|
Arrays returned by endpoints (page or layout) are transformed into objects - ### Describe the bug
I have this `+page.svelte`:
```svelte
<script lang="ts">
import type { PageData } from './$types';
export let data: PageData;
</script>
<pre>{JSON.stringify(data, null, 2)}</pre>
```
And this `+page.ts`:
```ts
/** @type {import('./$types').PageLoad} */
export function load() {
return ['42', '9000'];
}
```
In my page, I get this:
<img width="144" alt="image" src="https://user-images.githubusercontent.com/537567/186864314-de6c1780-4f13-46b5-906a-079a8c3272b6.png">
Since layout data are merged into children pages data, I understand that we must work with objects, but I would love to see a note in the docs about this.
### Reproduction
https://github.com/SiegfriedEhret/svelte-request/tree/main/src/routes/test
### Logs
_No response_
### System Info
```shell
❯ npx envinfo --system --binaries --browsers --npmPackages "{svelte,@sveltejs/*,vite}"
System:
OS: macOS 12.5
CPU: (8) x64 Apple M1
Memory: 216.27 MB / 16.00 GB
Shell: 3.2.2 - /usr/local/bin/fish
Binaries:
Node: 16.17.0 - ~/.local/share/nvm/v16.17.0/bin/node
npm: 8.18.0 - ~/.local/share/nvm/v16.17.0/bin/npm
Browsers:
Chrome: 104.0.5112.101
Firefox Developer Edition: 105.0
Safari: 15.6
npmPackages:
@sveltejs/adapter-auto: next => 1.0.0-next.66
@sveltejs/kit: next => 1.0.0-next.442
svelte: ^3.46.0 => 3.49.0
vite: ^3.0.8 => 3.0.9
```
### Severity
annoyance
### Additional Information
_No response_
|
non_process
|
arrays returned by endpoints page or layout are transformed into objects describe the bug i have this page svelte svelte import type pagedata from types export let data pagedata json stringify data null and this page ts ts type import types pageload export function load return in my page i get this img width alt image src since layout data are merged into children pages data i understand that we must work with objects but i would love to see a note in the docs about this reproduction logs no response system info shell ❯ npx envinfo system binaries browsers npmpackages svelte sveltejs vite system os macos cpu apple memory mb gb shell usr local bin fish binaries node local share nvm bin node npm local share nvm bin npm browsers chrome firefox developer edition safari npmpackages sveltejs adapter auto next next sveltejs kit next next svelte vite severity annoyance additional information no response
| 0
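The rows above show `index` "process" paired with label 1.0 and `binary_label` 1, while this "non_process" row carries `binary_label` 0, which suggests `binary_label` is a direct encoding of `index`. A sketch of that mapping, assuming no other `index` values exist in the dataset; `binarize` is a hypothetical name:

```python
def binarize(index_label: str) -> int:
    # Assumption from the visible rows: the `binary_label` column encodes
    # the `index` column as "process" -> 1, anything else -> 0.
    return 1 if index_label == "process" else 0
```

Usage: `binarize("process")` gives 1 and `binarize("non_process")` gives 0, matching the records shown here.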
|
485
| 2,921,750,226
|
IssuesEvent
|
2015-06-25 04:49:32
|
mitchellh/packer
|
https://api.github.com/repos/mitchellh/packer
|
closed
|
Unexpected EOF in qemu builder causes crash in Packer 0.8.0
|
bug post-processor/compress
|
My template works fine with Packer 0.7.5 but crashes with 0.8.0.
https://github.com/tylert/packer-build/blob/master/debian/jessie/base-64.json
On Mac OS X 10.10.3, "packer validate" causes a crash (no qemu binary there to test further). On Debian 8.1.0 for amd64, "packer validate" and "packer build -only=qemu" both cause the same crash.
```
Template validation failed. Errors are shown below.
Errors validating build 'qemu'. unexpected EOF
!!!!!!!!!!!!!!!!!!!!!!!!!!! PACKER CRASH !!!!!!!!!!!!!!!!!!!!!!!!!!!!
Packer crashed! This is always indicative of a bug within Packer.
A crash log has been placed at "crash.log" relative to your current
working directory. It would be immensely helpful if you could please
report the crash with Packer[1] so that we can fix this.
[1]: https://github.com/mitchellh/packer/issues
!!!!!!!!!!!!!!!!!!!!!!!!!!! PACKER CRASH !!!!!!!!!!!!!!!!!!!!!!!!!!!!
```
Contents of crash.log:
https://gist.github.com/tylert/8f6c157bf3d09c09cf9e
Please confirm this is not simply a "User Too Stupid Error".
|
1.0
|
Unexpected EOF in qemu builder causes crash in Packer 0.8.0 - My template works fine with Packer 0.7.5 but crashes with 0.8.0.
https://github.com/tylert/packer-build/blob/master/debian/jessie/base-64.json
On Mac OS X 10.10.3, "packer validate" causes a crash (no qemu binary there to test further). On Debian 8.1.0 for amd64, "packer validate" and "packer build -only=qemu" both cause the same crash.
```
Template validation failed. Errors are shown below.
Errors validating build 'qemu'. unexpected EOF
!!!!!!!!!!!!!!!!!!!!!!!!!!! PACKER CRASH !!!!!!!!!!!!!!!!!!!!!!!!!!!!
Packer crashed! This is always indicative of a bug within Packer.
A crash log has been placed at "crash.log" relative to your current
working directory. It would be immensely helpful if you could please
report the crash with Packer[1] so that we can fix this.
[1]: https://github.com/mitchellh/packer/issues
!!!!!!!!!!!!!!!!!!!!!!!!!!! PACKER CRASH !!!!!!!!!!!!!!!!!!!!!!!!!!!!
```
Contents of crash.log:
https://gist.github.com/tylert/8f6c157bf3d09c09cf9e
Please confirm this is not simply a "User Too Stupid Error".
|
process
|
unexpected eof in qemu builder causes crash in packer my template works fine with packer but crashes with on mac os x packer validate causes a crash no qemu binary there to test further on debian for packer validate and packer build only qemu both cause the same crash template validation failed errors are shown below errors validating build qemu unexpected eof packer crash packer crashed this is always indicative of a bug within packer a crash log has been placed at crash log relative to your current working directory it would be immensely helpful if you could please report the crash with packer so that we can fix this packer crash contents of crash log please confirm this is not simply a user too stupid error
| 1
|
12,663
| 15,035,580,588
|
IssuesEvent
|
2021-02-02 14:18:39
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
reopened
|
'PostgreSQL execute SQL' does not accept algorithm output as SQL query
|
Bug Feedback Processing
|
When using the algorithm 'PostgreSQL execute SQL', it is not possible to select an algorithm output as input in the SQL query. Also the 'pre-calculated value' gives an empty SQL-query in the log:

more info: https://gis.stackexchange.com/questions/352724/sql-insert-into-tablevalues-using-qgis-modeller
I'm using QGIS 3.16 on windows
|
1.0
|
'PostgreSQL execute SQL' does not accept algorithm output as SQL query - When using the algorithm 'PostgreSQL execute SQL', it is not possible to select an algorithm output as input in the SQL query. Also the 'pre-calculated value' gives an empty SQL-query in the log:

more info: https://gis.stackexchange.com/questions/352724/sql-insert-into-tablevalues-using-qgis-modeller
I'm using QGIS 3.16 on windows
|
process
|
postgresql execute sql does not accept algorithm output as sql query when using the algorithm postgresql execute sql it is not possible to select an algorithm output as input in the sql query also the pre calculated value gives an empty sql query in the log more info i m using qgis on windows
| 1
|
9,259
| 7,644,973,246
|
IssuesEvent
|
2018-05-08 17:05:10
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
opened
|
Microsoft Security Advisory CVE-2018-0765: .NET Core Denial Of Service Vulnerability
|
Security
|
# Microsoft Security Advisory CVE-2018-0765: .NET Core Denial Of Service Vulnerability
## <a name="executive-summary"></a>Executive summary
Microsoft is releasing this security advisory to provide information about a vulnerability in .NET Core and .NET native version 2.0. This advisory also provides guidance on what developers can do to update their applications to remove this vulnerability.
Microsoft is aware of a denial of service vulnerability that exists when .NET Framework and .NET Core improperly process XML documents. An attacker who successfully exploited this vulnerability could cause a denial of service against a .NET Framework, .NET Core, or .NET native application.
The update addresses the vulnerability by correcting how .NET Framework, .NET Core, and .NET native applications handle XML document processing.
If your application is an ASP.NET Core application, developers are also advised to update to ASP.NET Core 2.0.8.
## Announcement
The original announcement for this issue can be found at https://github.com/dotnet/announcements/issues/67
### <a name="mitigation-factors"></a>Mitigation factors
* If an application does not process signed XML, it is not affected by this vulnerability.
* If your application targets .NET Core 1.x or .NET native 1.x, it is not affected as the vulnerable package is not available for these platforms.
## <a name="affected-software"></a>Affected software
Any .NET Core, .NET native, or ASP.NET Core based application that uses System.Security.Cryptography.Xml with a version of 4.4.1 or earlier.
Package name | Vulnerable versions | Secure versions
------------ | ---------------- | -------------------------
System.Security.Cryptography.Xml | ≤4.4.1 | 4.4.2 or later
## <a name="advisory-faq"></a>Advisory FAQ
### <a name="how-affected"></a>How do I know if I am affected?
.NET Core has two types of dependencies: direct and transitive. Direct dependencies are dependencies where you specifically add a package to your project, transitive dependencies occur when you add a package to your project that in turn relies on another package.
For example, the `Microsoft.AspNetCore.Mvc` package depends on the `Microsoft.AspNetCore.Mvc.Core` package. When you add a dependency on `Microsoft.AspNetCore.Mvc` in your project, you're taking a transitive dependency on `Microsoft.AspNetCore.Mvc.Core`.
Any application that has a direct or transitive dependency on the [affected package](#affected-software) can be exposed to the vulnerability if it does not meet any of the [mitigation factors](#mitigation-factors).
### <a name="how-fix"></a>How do I fix the issue?
---
If you're targeting ASP.NET Core 2.0 and using the `Microsoft.AspNetCore.All` metapackage:
* Update its version number to 2.0.8 to pull in updated packages that update their transitive dependencies to the fixed version of `System.Security.Cryptography.Xml`.
---
.NET Core projects have two types of dependencies: direct and transitive. You must update your projects using the following instructions to address both types of dependency.
#### Direct dependencies
Direct dependencies are discoverable by examining your *csproj* file. They can be fixed by [editing the csproj file](#direct-dependencies) or using NuGet to update the dependency.
#### Transitive dependencies
Transitive dependencies occur when you add a package to your project that in turn relies on another package. For example, if Contoso publishes a package `Contoso.Utility` which, in turn, depends on `Contoso.Internals` and you add the `Contoso.Utility` package to your project, now your project has a direct dependency on `Contoso.Utility` and, because `Contoso.Utility` depends on `Contoso.Internals`, your application gains a transitive dependency on the `Contoso.Internals` package.
Transitive dependencies are reviewable in two ways:
* In the Visual Studio Solution Explorer window, which supports searching.
* By examining the *project.assets.json* file contained in the obj directory of your project.
The *project.assets.json* files are the authoritative list of all packages used by your project, containing both direct and transitive dependencies.
#### <a name="direct-dependencies"></a>Fixing direct dependencies
Open *projectname.csproj* in your editor. If you're using Visual Studio, right-click the project and choose **Edit projectname.csproj** from the context menu, where projectname is the name of your project. Look for `PackageReference` elements. The following shows an example project file:
```xml
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netcoreapp2.0</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="System.Security.Cryptography.Xml" Version="4.4.1" />
</ItemGroup>
</Project>
```
The preceding example has a reference to the [vulnerable package](#affected-software), as seen by the single `PackageReference` element. The name of the package is in the `Include` attribute.
The package version number is in the `Version` attribute. The previous example shows a single direct dependency on `System.Security.Cryptography.Xml` version 4.4.1.
To update the version to the secure package, change the version number to the updated package version as listed on the table [previously](#affected-software).
In this example, update `System.Security.Cryptography.Xml` to a [fixed package number](#affected-software). Save the *csproj* file. The example *csproj* now looks as follows:
```xml
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netcoreapp2.0</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="System.Security.Cryptography.Xml" Version="4.4.2" />
</ItemGroup>
</Project>
```
If you're using Visual Studio and you save your updated *csproj* file, Visual Studio will restore the new package version.
You can see the restore results by opening the **Output** window (Ctrl+Alt+O) and changing the **Show output from** drop-down list to **Package Manager**.
If you're not using Visual Studio, open a command line and change to your project directory. Execute the `dotnet restore` command to restore the updated dependencies.
Now recompile your application. If after recompilation you see a *Dependency conflict warning*, you must update your other direct dependencies to versions that take a dependency on the updated package.
After you've addressed all of your direct dependencies, you must review your transitive dependencies.
#### Discovering and fixing transitive dependencies
There are two ways to view transitive dependencies. You can either [use Visual Studio’s Solution Explorer](#vs-solution-explorer), or you can [review the *project.assets.json* file](#project-assets-json).
##### <a name="vs-solution-explorer"></a>Using Visual Studio Solution Explorer
To use Solution Explorer, open the project in Visual Studio 2017, and then press Ctrl+; to activate the search in Solution Explorer. Search for the [vulnerable package](#affected-software) and make a note of the version numbers of any results you find.
For example, searching for `Microsoft.AspNetCore.Mvc.Core` in an example project that contains a package that takes a dependency on `Microsoft.AspNetCore.Mvc` shows the following results in Visual Studio 2017:

The search results appear as a tree. In the previous results, you can see that a reference to `Microsoft.AspNetCore.Mvc.Core` version 1.1.2 is discovered.
Under the Dependencies node is a NuGet node. Under the NuGet node is the list of packages you have directly taken a dependency on and their versions.
In screenshot, the application takes a direct dependency on `Microsoft.AspNetCore.Mvc`. `Microsoft.AspNetCore.Mvc` in turn has leaf nodes that list its dependencies and their versions.
The `Microsoft.AspNetCore.Mvc` package takes a dependency on a version of `Microsoft.AspNetCore.Mvc.ApiExplorer`, that in turn takes a dependency on a version of `Microsoft.AspNetCore.Mvc.Core`.
##### <a name="project-assets-json"></a>Manually reviewing project.assets.json
Open the *project.assets.json* file from your project’s obj directory in your editor. We suggest you use an editor that understands JSON and allows you to collapse and expand nodes to review this file.
Visual Studio and Visual Studio Code provide JSON friendly editing.
Search the *project.assets.json* file for the [vulnerable package](#affected-software), using the format `packagename/` for each of the package names from the preceding table. If you find the assembly name in your search:
* Examine the line on which they are found, the version number is after the `/`.
* Compare to the [vulnerable versions table](#affected-software).
For example, a search result that shows `System.Security.Cryptography.Xml/4.4.0` is a reference to version 4.4.0 of `System.Security.Cryptography.Xml`.
If your *project.assets.json* file includes references to the [vulnerable package](#affected-software), then you need to fix the transitive dependencies.
##### Fixing transitive dependencies
If you have not found any reference to any vulnerable packages, this means either
* None of your direct dependencies depend on any vulnerable packages, or
* You have already fixed the problem by updating the direct dependencies.
If your transitive dependency review found references to the [vulnerable package](#affected-software), you must add a direct dependency to the updated package to your *csproj* file to override the transitive dependency.
Open *projectname.csproj* in your editor. If you're using Visual Studio, right-click the project and choose **Edit projectname.csproj** from the context menu, where projectname is the name of your project.
Look for `PackageReference` nodes, for example:
```xml
<Project Sdk="Microsoft.NET.Sdk.">
<PropertyGroup>
<TargetFramework>net461</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="ThirdParty.NotUpdatedYet" Version="2.0.0" />
</ItemGroup>
</Project>
```
You must add a direct dependency to the updated version of the [vulnerable package](#affected-software) by adding it to the *csproj* file.
You do this by adding a new line to the dependencies section, referencing the fixed version.
For example, if your search showed a transitive reference to a vulnerable `System.Security.Cryptography.Xml` version, you'd add a reference to the [fixed package number](#affected-software).
```xml
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net461</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="System.Security.Cryptography.Xml" Version="4.4.2" />
<PackageReference Include="ThirdParty.NotUpdatedYet" Version="2.0.0" />
</ItemGroup>
</Project>
```
After you've added the direct dependency reference, save your *csproj* file.
If you're using Visual Studio, save your updated *csproj* file and Visual Studio will restore the new package versions.
You can see the restore results by opening the **Output** window (Ctrl+Alt+O) and changing the **Show output from** drop-down list to **Package Manager**.
If you're not using Visual Studio, open a command line and change to your project directory. Execute the `dotnet restore` command to restore the new dependencies.
#### Rebuilding your application
Finally you must rebuild your application, test, and redeploy.
---
If you're targeting .NET native for a Universal Windows Platform (UWP) application, you should update your dependencies and republish your application.
If you submit an application to the Windows Store with an outdated reference, the store will update the reference automatically, which has the potential for compatibility issues. Thus, we recommend updating your application directly and testing its functionality before resubmitting.
---
## Other Information
### Reporting Security Issues
If you have found a potential security issue in .NET Core, please email details to secure@microsoft.com. Reports may qualify for the .NET Core Bug Bounty. Details of the .NET Core Bug Bounty including terms and conditions are at [https://aka.ms/corebounty](https://aka.ms/corebounty).
### Support
You can ask questions about this issue on GitHub in the .NET Core or ASP.NET Core organizations. These are located at https://github.com/dotnet/ and https://github.com/aspnet/. The Announcements repo for each product (https://github.com/dotnet/Announcements and https://github.com/aspnet/Announcements) will contain this bulletin as an issue and will include a link to a discussion issue. You can ask questions in the discussion issue.
### Disclaimer
The information provided in this advisory is provided "as is" without warranty of any kind. Microsoft disclaims all warranties, either express or implied, including the warranties of merchantability and fitness for a particular purpose. In no event shall Microsoft Corporation or its suppliers be liable for any damages whatsoever including direct, indirect, incidental, consequential, loss of business profits or special damages, even if Microsoft Corporation or its suppliers have been advised of the possibility of such damages. Some states do not allow the exclusion or limitation of liability for consequential or incidental damages so the foregoing limitation may not apply.
### External Links
[CVE-2018-0765](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-0765)
### Revisions
V1.0 (May 8, 2018): Advisory published.
_Version 1.0_
_Last Updated 2018-05-08_
|
True
|
Microsoft Security Advisory CVE-2018-0765: .NET Core Denial Of Service Vulnerability - # Microsoft Security Advisory CVE-2018-0765: .NET Core Denial Of Service Vulnerability
## <a name="executive-summary"></a>Executive summary
Microsoft is releasing this security advisory to provide information about a vulnerability in .NET Core and .NET native version 2.0. This advisory also provides guidance on what developers can do to update their applications to remove this vulnerability.
Microsoft is aware of a denial of service vulnerability that exists when .NET Framework and .NET Core improperly process XML documents. An attacker who successfully exploited this vulnerability could cause a denial of service against a .NET Framework, .NET Core, or .NET native application.
The update addresses the vulnerability by correcting how .NET Framework, .NET Core, and .NET native applications handle XML document processing.
If your application is an ASP.NET Core application, developers are also advised to update to ASP.NET Core 2.0.8.
## Announcement
The original announcement for this issue can be found at https://github.com/dotnet/announcements/issues/67
### <a name="mitigation-factors"></a>Mitigation factors
* If an application does not process signed XML, it is not affected by this vulnerability.
* If your application targets .NET Core 1.x or .NET native 1.x, it is not affected as the vulnerable package is not available for these platforms.
## <a name="affected-software"></a>Affected software
Any .NET Core, .NET native, or ASP.NET Core based application that uses System.Security.Cryptography.Xml with a version of 4.4.1 or earlier.
Package name | Vulnerable versions | Secure versions
------------ | ---------------- | -------------------------
System.Security.Cryptography.Xml | ≤4.4.1 | 4.4.2 or later
## <a name="advisory-faq"></a>Advisory FAQ
### <a name="how-affected"></a>How do I know if I am affected?
.NET Core has two types of dependencies: direct and transitive. Direct dependencies are dependencies where you specifically add a package to your project, transitive dependencies occur when you add a package to your project that in turn relies on another package.
For example, the `Microsoft.AspNetCore.Mvc` package depends on the `Microsoft.AspNetCore.Mvc.Core` package. When you add a dependency on `Microsoft.AspNetCore.Mvc` in your project, you're taking a transitive dependency on `Microsoft.AspNetCore.Mvc.Core`.
Any application that has a direct or transitive dependency on the [affected package](#affected-software) can be exposed to the vulnerability if it does not meet any of the [mitigation factors](#mitigation-factors).
### <a name="how-fix"></a>How do I fix the issue?
---
If you're targeting ASP.NET Core 2.0 and using the `Microsoft.AspNetCore.All` metapackage:
* Update its version number to 2.0.8 to pull in updated packages that update their transitive dependencies to the fixed version of `System.Security.Cryptography.Xml`.
---
.NET Core projects have two types of dependencies: direct and transitive. You must update your projects using the following instructions to address both types of dependency.
#### Direct dependencies
Direct dependencies are discoverable by examining your *csproj* file. They can be fixed by [editing the csproj file](#direct-dependencies) or using NuGet to update the dependency.
#### Transitive dependencies
Transitive dependencies occur when you add a package to your project that in turn relies on another package. For example, if Contoso publishes a package `Contoso.Utility` which, in turn, depends on `Contoso.Internals`, and you add the `Contoso.Utility` package to your project, then your project has a direct dependency on `Contoso.Utility`; and because `Contoso.Utility` depends on `Contoso.Internals`, your application gains a transitive dependency on the `Contoso.Internals` package.
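The Contoso example can be sketched as a tiny graph walk. The package names and the `DEPENDS_ON` table below are illustrative stand-ins; in a real project this resolution is performed by NuGet, not by you:

```python
# Toy dependency table mirroring the Contoso example; these names are
# purely illustrative and the real data comes from NuGet's resolver.
DEPENDS_ON = {
    "Contoso.Utility": ["Contoso.Internals"],
    "Contoso.Internals": [],
}

def all_dependencies(direct):
    """Expand a list of direct dependencies into the full direct+transitive set."""
    seen = set()
    stack = list(direct)
    while stack:
        pkg = stack.pop()
        if pkg not in seen:
            seen.add(pkg)
            stack.extend(DEPENDS_ON.get(pkg, []))
    return seen

# Adding only Contoso.Utility still pulls in Contoso.Internals transitively.
print(sorted(all_dependencies(["Contoso.Utility"])))  # → ['Contoso.Internals', 'Contoso.Utility']
```

Taking only `Contoso.Utility` as a direct dependency still surfaces `Contoso.Internals`, which is exactly how a vulnerable package can enter a project unnoticed.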
Transitive dependencies are reviewable in two ways:
* In the Visual Studio Solution Explorer window, which supports searching.
* By examining the *project.assets.json* file contained in the obj directory of your project.
The *project.assets.json* files are the authoritative list of all packages used by your project, containing both direct and transitive dependencies.
#### <a name="direct-dependencies"></a>Fixing direct dependencies
Open *projectname.csproj* in your editor. If you're using Visual Studio, right-click the project and choose **Edit projectname.csproj** from the context menu, where projectname is the name of your project. Look for `PackageReference` elements. The following shows an example project file:
```xml
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netcoreapp2.0</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="System.Security.Cryptography.Xml" Version="4.4.1" />
</ItemGroup>
</Project>
```
The preceding example has a reference to the [vulnerable package](#affected-software), as seen by the single `PackageReference` element. The name of the package is in the `Include` attribute.
The package version number is in the `Version` attribute. The previous example shows a single direct dependency on `System.Security.Cryptography.Xml` version 4.4.1.
To update the version to the secure package, change the version number to the updated package version as listed on the table [previously](#affected-software).
In this example, update `System.Security.Cryptography.Xml` to a [fixed package number](#affected-software). Save the *csproj* file. The example *csproj* now looks as follows:
```xml
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netcoreapp2.0</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="System.Security.Cryptography.Xml" Version="4.4.2" />
</ItemGroup>
</Project>
```
If you're using Visual Studio and you save your updated *csproj* file, Visual Studio will restore the new package version.
You can see the restore results by opening the **Output** window (Ctrl+Alt+O) and changing the **Show output from** drop-down list to **Package Manager**.
If you're not using Visual Studio, open a command line and change to your project directory. Execute the `dotnet restore` command to restore the updated dependencies.
Now recompile your application. If after recompilation you see a *Dependency conflict warning*, you must update your other direct dependencies to versions that take a dependency on the updated package.
After you've addressed all of your direct dependencies, you must review your transitive dependencies.
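As a quick sanity check before moving on, the direct `PackageReference` entries can also be listed with a short script. This is only a sketch against a throwaway sample file written to a temporary directory; point the same logic at your own *csproj*, and note the regular expression assumes the common `Include=... Version=...` attribute order:

```python
import re
import tempfile
from pathlib import Path

# Throwaway sample csproj; in a real project, read your own file instead.
SAMPLE_CSPROJ = """\
<Project Sdk="Microsoft.NET.Sdk">
  <ItemGroup>
    <PackageReference Include="System.Security.Cryptography.Xml" Version="4.4.1" />
    <PackageReference Include="ThirdParty.NotUpdatedYet" Version="2.0.0" />
  </ItemGroup>
</Project>
"""

def direct_dependencies(csproj_text):
    """Return (package, version) pairs for each PackageReference element."""
    return re.findall(
        r'<PackageReference\s+Include="([^"]+)"\s+Version="([^"]+)"', csproj_text
    )

csproj = Path(tempfile.mkdtemp()) / "app.csproj"
csproj.write_text(SAMPLE_CSPROJ)

# Prints each direct dependency together with its declared version.
for name, version in direct_dependencies(csproj.read_text()):
    print(name, version)
```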
#### Discovering and fixing transitive dependencies
There are two ways to view transitive dependencies. You can either [use Visual Studio’s Solution Explorer](#vs-solution-explorer), or you can [review the *project.assets.json* file](#project-assets-json).
##### <a name="vs-solution-explorer"></a>Using Visual Studio Solution Explorer
To use Solution Explorer, open the project in Visual Studio 2017, and then press Ctrl+; to activate the search in Solution Explorer. Search for the [vulnerable package](#affected-software) and make a note of the version numbers of any results you find.
For example, searching for `Microsoft.AspNetCore.Mvc.Core` in an example project that contains a package that takes a dependency on `Microsoft.AspNetCore.Mvc` shows the following results in Visual Studio 2017:

The search results appear as a tree. In the previous results, you can see that a reference to `Microsoft.AspNetCore.Mvc.Core` version 1.1.2 is discovered.
Under the Dependencies node is a NuGet node. Under the NuGet node is the list of packages you have directly taken a dependency on and their versions.
In the screenshot, the application takes a direct dependency on `Microsoft.AspNetCore.Mvc`. `Microsoft.AspNetCore.Mvc` in turn has leaf nodes that list its dependencies and their versions.
The `Microsoft.AspNetCore.Mvc` package takes a dependency on a version of `Microsoft.AspNetCore.Mvc.ApiExplorer`, that in turn takes a dependency on a version of `Microsoft.AspNetCore.Mvc.Core`.
##### <a name="project-assets-json"></a>Manually reviewing project.assets.json
Open the *project.assets.json* file from your project’s obj directory in your editor. We suggest you use an editor that understands JSON and allows you to collapse and expand nodes to review this file.
Visual Studio and Visual Studio Code provide JSON friendly editing.
Search the *project.assets.json* file for the [vulnerable package](#affected-software), using the format `packagename/` for each of the package names from the preceding table. If you find the assembly name in your search:
* Examine the line on which they are found, the version number is after the `/`.
* Compare to the [vulnerable versions table](#affected-software).
For example, a search result that shows `System.Security.Cryptography.Xml/4.4.0` is a reference to version 4.4.0 of `System.Security.Cryptography.Xml`.
If your *project.assets.json* file includes references to the [vulnerable package](#affected-software), then you need to fix the transitive dependencies.
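The manual search just described can be sketched as a short script. The sample *project.assets.json* content below is made up for illustration; a real file sits under your project's obj directory and is far larger, but its `"libraries"` keys have the same `package/version` shape:

```python
import json
import tempfile
from pathlib import Path

VULNERABLE_PACKAGE = "System.Security.Cryptography.Xml"
FIRST_FIXED = (4, 4, 2)  # first secure version from the advisory table

# Made-up minimal sample standing in for a real project.assets.json.
SAMPLE = {
    "libraries": {
        "System.Security.Cryptography.Xml/4.4.0": {},
        "Microsoft.AspNetCore.Mvc/2.0.0": {},
    }
}

def vulnerable_refs(assets):
    """Return 'package/version' keys whose version predates the fixed release."""
    hits = []
    for key in assets["libraries"]:
        name, _, version = key.partition("/")
        if name == VULNERABLE_PACKAGE:
            if tuple(int(p) for p in version.split(".")) < FIRST_FIXED:
                hits.append(key)
    return hits

path = Path(tempfile.mkdtemp()) / "project.assets.json"
path.write_text(json.dumps(SAMPLE))

print(vulnerable_refs(json.loads(path.read_text())))  # → ['System.Security.Cryptography.Xml/4.4.0']
```

Any hit means a direct or transitive reference to a vulnerable version and needs the fix described next.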
##### Fixing transitive dependencies
If you have not found any reference to any vulnerable packages, this means either
* None of your direct dependencies depend on any vulnerable packages, or
* You have already fixed the problem by updating the direct dependencies.
If your transitive dependency review found references to the [vulnerable package](#affected-software), you must add a direct dependency on the updated package to your *csproj* file to override the transitive dependency.
Open *projectname.csproj* in your editor. If you're using Visual Studio, right-click the project and choose **Edit projectname.csproj** from the context menu, where projectname is the name of your project.
Look for `PackageReference` nodes, for example:
```xml
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net461</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="ThirdParty.NotUpdatedYet" Version="2.0.0" />
</ItemGroup>
</Project>
```
You must add a direct dependency on the updated version of the [vulnerable package](#affected-software) to the *csproj* file.
You do this by adding a new line to the dependencies section, referencing the fixed version.
For example, if your search showed a transitive reference to a vulnerable `System.Security.Cryptography.Xml` version, you'd add a reference to the [fixed package number](#affected-software).
```xml
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net461</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="System.Security.Cryptography.Xml" Version="4.4.2" />
<PackageReference Include="ThirdParty.NotUpdatedYet" Version="2.0.0" />
</ItemGroup>
</Project>
```
After you've added the direct dependency reference, save your *csproj* file.
If you're using Visual Studio, save your updated *csproj* file and Visual Studio will restore the new package versions.
You can see the restore results by opening the **Output** window (Ctrl+Alt+O) and changing the **Show output from** drop-down list to **Package Manager**.
If you're not using Visual Studio, open a command line and change to your project directory. Execute the `dotnet restore` command to restore the new dependencies.
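If you prefer to script the override rather than hand-edit the file, here is a sketch using the standard-library XML parser. The file contents and package names are illustrative, and a real *csproj* may contain several `ItemGroup` elements, so this simply targets the first one it finds:

```python
import xml.etree.ElementTree as ET

# Illustrative csproj that only references the not-yet-updated third-party
# package; treat this as a starting point, not a robust csproj editor.
SAMPLE = """\
<Project Sdk="Microsoft.NET.Sdk">
  <ItemGroup>
    <PackageReference Include="ThirdParty.NotUpdatedYet" Version="2.0.0" />
  </ItemGroup>
</Project>
"""

def add_override(csproj_text, package, version):
    """Append a direct PackageReference that overrides a transitive dependency."""
    root = ET.fromstring(csproj_text)
    group = root.find("ItemGroup")
    ET.SubElement(group, "PackageReference", Include=package, Version=version)
    return ET.tostring(root, encoding="unicode")

updated = add_override(SAMPLE, "System.Security.Cryptography.Xml", "4.4.2")
print("System.Security.Cryptography.Xml" in updated)  # → True
```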
#### Rebuilding your application
Finally you must rebuild your application, test, and redeploy.
---
If you're targeting .NET native for a Universal Windows Platform (UWP) application, you should update your dependencies and republish your application.
If you submit an application to the Windows Store with an outdated reference, the store will update the reference automatically, which has the potential for compatibility issues. Thus, we recommend updating your application directly and testing its functionality before resubmitting.
---
## Other Information
### Reporting Security Issues
If you have found a potential security issue in .NET Core, please email details to secure@microsoft.com. Reports may qualify for the .NET Core Bug Bounty. Details of the .NET Core Bug Bounty including terms and conditions are at [https://aka.ms/corebounty](https://aka.ms/corebounty).
### Support
You can ask questions about this issue on GitHub in the .NET Core or ASP.NET Core organizations. These are located at https://github.com/dotnet/ and https://github.com/aspnet/. The Announcements repo for each product (https://github.com/dotnet/Announcements and https://github.com/aspnet/Announcements) will contain this bulletin as an issue and will include a link to a discussion issue. You can ask questions in the discussion issue.
### Disclaimer
The information provided in this advisory is provided "as is" without warranty of any kind. Microsoft disclaims all warranties, either express or implied, including the warranties of merchantability and fitness for a particular purpose. In no event shall Microsoft Corporation or its suppliers be liable for any damages whatsoever including direct, indirect, incidental, consequential, loss of business profits or special damages, even if Microsoft Corporation or its suppliers have been advised of the possibility of such damages. Some states do not allow the exclusion or limitation of liability for consequential or incidental damages so the foregoing limitation may not apply.
### External Links
[CVE-2018-0765](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-0765)
### Revisions
V1.0 (May 8, 2018): Advisory published.
_Version 1.0_
_Last Updated 2018-05-08_
|
non_process
|
microsoft security advisory cve net core denial of service vulnerability microsoft security advisory cve net core denial of service vulnerability executive summary microsoft is releasing this security advisory to provide information about a vulnerability in net core and net native version this advisory also provides guidance on what developers can do to update their applications to remove this vulnerability microsoft is aware of a denial of service vulnerability that exists when net framework and net core improperly process xml documents an attacker who successfully exploited this vulnerability could cause a denial of service against a net framework net core or net native application the update addresses the vulnerability by correcting how net framework net core and net native applications handle xml document processing if your application is an asp net core application developers are also advised to update to asp net core announcement the original announcement for this issue can be found at mitigation factors if an application does not process signed xml it is not affected by this vulnerability if your application targets net core x or net native x it is not affected as the vulnerable package is not available for these platforms affected software any net core net native or asp net core based application that uses system security cryptography xml with a version of or earlier package name vulnerable versions secure versions system security cryptography xml ≤ or later advisory faq how do i know if i am affected net core has two types of dependencies direct and transitive direct dependencies are dependencies where you specifically add a package to your project transitive dependencies occur when you add a package to your project that in turn relies on another package for example the microsoft aspnetcore mvc package depends on the microsoft aspnetcore mvc core package when you add a dependency on microsoft aspnetcore mvc in your project you re taking a transitive 
dependency on microsoft aspnetcore mvc core any application that has a direct or transitive dependency on the affected software can be exposed to the vulnerability if it does not meet any of the mitigation factors how do i fix the issue if you re targeting asp net core and using the microsoft aspnetcore all metapackage update its version number to to pull in updated packages that update their transitive dependencies to the fixed version of system security cryptography xml net core projects have two types of dependencies direct and transitive you must update your projects using the following instructions to address both types of dependency direct dependencies direct dependencies are discoverable by examining your csproj file they can be fixed by direct dependencies or using nuget to update the dependency transitive dependencies transitive dependencies occur when you add a package to your project that in turn relies on another package for example if contoso publishes a package contoso utility which in turn depends on contoso internals and you add the contoso utility package to your project now your project has a direct dependency on contoso utility and because contoso utility depends contoso internals your application gains a transitive dependency on the contoso internals package transitive dependencies are reviewable in two ways in the visual studio solution explorer window which supports searching by examining the project assets json file contained in the obj directory of your project the project assets json files are the authoritative list of all packages used by your project containing both direct and transitive dependencies fixing direct dependencies open projectname csproj in your editor if you re using visual studio right click the project and choose edit projectname csproj from the context menu where projectname is the name of your project look for packagereference elements the following shows an example project file xml the preceding example has a reference 
to the affected software as seen by the single packagereference element the name of the package is in the include attribute the package version number is in the version attribute the previous example shows a single direct dependency on system security cryptography xml version to update the version to the secure package change the version number to the updated package version as listed on the table affected software in this example update system security cryptography xml to a affected software save the csproj file the example csproj now looks as follows xml if you re using visual studio and you save your updated csproj file visual studio will restore the new package version you can see the restore results by opening the output window ctrl alt o and changing the show output from drop down list to package manager if you re not using visual studio open a command line and change to your project directory execute the dotnet restore command to restore the updated dependencies now recompile your application if after recompilation you see a dependency conflict warning you must update your other direct dependencies to versions that take a dependency on the updated package after you ve addressed all of your direct dependencies you must review your transitive dependencies discovering and fixing transitive dependencies there are two ways to view transitive dependencies you can either vs solution explorer or you can project assets json using visual studio solution explorer to use solution explorer open the project in visual studio and then press ctrl to activate the search in solution explorer search for the affected software and make a note of the version numbers of any results you find for example searching for microsoft aspnetcore mvc core in an example project that contains a package that takes a dependency on microsoft aspnetcore mvc shows the following results in visual studio the search results appear as a tree in the previous results you can see that a reference to 
microsoft aspnetcore mvc core version is discovered under the dependencies node is a nuget node under the nuget node is the list of packages you have directly taken a dependency on and their versions in screenshot the application takes a direct dependency on microsoft aspnetcore mvc microsoft aspnetcore mvc in turn has leaf nodes that list its dependencies and their versions the microsoft aspnetcore mvc package takes a dependency on a version of microsoft aspnetcore mvc apiexplorer that in turn takes a dependency on a version of microsoft aspnetcore mvc core manually reviewing project assets json open the project assets json file from your project’s obj directory in your editor we suggest you use an editor that understands json and allows you to collapse and expand nodes to review this file visual studio and visual studio code provide json friendly editing search the project assets json file for the affected software using the format packagename for each of the package names from the preceding table if you find the assembly name in your search examine the line on which they are found the version number is after the compare to the affected software for example a search result that shows system security cryptography xml is a reference to version of system security cryptography xml if your project assets json file includes references to the affected software then you need to fix the transitive dependencies fixing transitive dependencies if you have not found any reference to any vulnerable packages this means either none of your direct dependencies depend on any vulnerable packages or you have already fixed the problem by updating the direct dependencies if your transitive dependency review found references to the affected software you must add a direct dependency to the updated package to your csproj file to override the transitive dependency open projectname csproj in your editor if you re using visual studio right click the project and choose edit projectname 
csproj from the context menu where projectname is the name of your project look for packagereference nodes for example xml you must add a direct dependency to the updated version of the affected software by adding it to the csproj file you do this by adding a new line to the dependencies section referencing the fixed version for example if your search showed a transitive reference to a vulnerable system security cryptography xml version you d add a reference to the affected software xml after you ve added the direct dependency reference save your csproj file if you re using visual studio save your updated csproj file and visual studio will restore the new package versions you can see the restore results by opening the output window ctrl alt o and changing the show output from drop down list to package manager if you re not using visual studio open a command line and change to your project directory execute the dotnet restore command to restore the new dependencies rebuilding your application finally you must rebuild your application test and redeploy if you re targeting net native for a universal windows platform uwp application you should update your dependencies and republish your application if you submit an application to the windows store with an outdated reference the store will update the reference automatically which has the potential for compatibility issues thus we recommend updating your application directly and testing its functionality before resubmitting other information reporting security issues if you have found a potential security issue in net core please email details to secure microsoft com reports may qualify for the net core bug bounty details of the net core bug bounty including terms and conditions are at support you can ask questions about this issue on github in the net core or asp net core organizations these are located at and the announcements repo for each product and will contain this bulletin as an issue and will include a link to a 
discussion issue you can ask questions in the discussion issue disclaimer the information provided in this advisory is provided as is without warranty of any kind microsoft disclaims all warranties either express or implied including the warranties of merchantability and fitness for a particular purpose in no event shall microsoft corporation or its suppliers be liable for any damages whatsoever including direct indirect incidental consequential loss of business profits or special damages even if microsoft corporation or its suppliers have been advised of the possibility of such damages some states do not allow the exclusion or limitation of liability for consequential or incidental damages so the foregoing limitation may not apply external links revisions may advisory published version last updated
| 0
|
67,799
| 21,161,423,573
|
IssuesEvent
|
2022-04-07 09:42:12
|
vector-im/element-ios
|
https://api.github.com/repos/vector-im/element-ios
|
closed
|
Restore collection view offsets after table view reload
|
T-Defect A-Room-List S-Minor O-Frequent
|
### Steps to reproduce
1. Setup the home screen in such a way that each section (favourites, people ...) has more items than can fit horizontally on screen
2. Scroll each row arbitrarily so that none of them are at the beginning
3. Close / Open different sections
=> Notice that each section is jumping around horizontally
### Outcome
#### What did you expect?
Sections that are not interacted with should stay at the same content offset as before
#### What happened instead?
Sections are jumping back and forth horizontally
### Your phone model
_No response_
### Operating system version
_No response_
### Application version
_No response_
### Homeserver
_No response_
### Will you send logs?
No
|
1.0
|
Restore collection view offsets after table view reload - ### Steps to reproduce
1. Setup the home screen in such a way that each section (favourites, people ...) has more items than can fit horizontally on screen
2. Scroll each row arbitrarily so that none of them are at the beginning
3. Close / Open different sections
=> Notice that each section is jumping around horizontally
### Outcome
#### What did you expect?
Sections that are not interacted with should stay at the same content offset as before
#### What happened instead?
Sections are jumping back and forth horizontally
### Your phone model
_No response_
### Operating system version
_No response_
### Application version
_No response_
### Homeserver
_No response_
### Will you send logs?
No
|
non_process
|
restore collection view offsets after table view reload steps to reproduce setup the home screen in such a way that each section favourites people has more items than can fit horizontally on screen scroll each row arbitrarily so that none of them are at the beginning close open different sections notice that each section is jumping around horizontally outcome what did you expect sections that are not interacted with should stay at the same content offset as before what happened instead sections are jumping back and forth horizontally your phone model no response operating system version no response application version no response homeserver no response will you send logs no
| 0
|
381,532
| 11,276,553,796
|
IssuesEvent
|
2020-01-14 23:36:14
|
googleapis/google-api-java-client-services
|
https://api.github.com/repos/googleapis/google-api-java-client-services
|
closed
|
Synthesis failed for tasks
|
autosynth failure priority: p1 type: bug
|
Hello! Autosynth couldn't regenerate tasks. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Checking out files: 25% (16421/65361)
Checking out files: 26% (16994/65361)
Checking out files: 27% (17648/65361)
Checking out files: 28% (18302/65361)
Checking out files: 29% (18955/65361)
Checking out files: 30% (19609/65361)
Checking out files: 31% (20262/65361)
Checking out files: 32% (20916/65361)
Checking out files: 100% (65361/65361), done.
Switched to branch 'autosynth-tasks'
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 256, in <module>
main()
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 196, in main
last_synth_commit_hash = get_last_metadata_commit(args.metadata_path)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 149, in get_last_metadata_commit
text=True,
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 403, in run
with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'text'
```
Google internal developers can see the full log [here](https://sponge/40f694d4-43de-41f0-b993-f4694e4a45de).
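The traceback above is a Python version mismatch rather than a bug in the synth script: the `text=True` keyword of `subprocess.run` only exists on Python 3.7+, while the log shows the script running under 3.6.1. On 3.6 the same behavior is spelled `universal_newlines=True`. A minimal compatibility sketch (the `run_text` wrapper name is ours for illustration, not part of autosynth):

```python
import subprocess
import sys

def run_text(args):
    """Run a command capturing stdout as str on any Python 3.x.

    'text' was added to subprocess in Python 3.7; 'universal_newlines'
    is the pre-3.7 spelling of the same option.
    """
    kwargs = {"stdout": subprocess.PIPE}
    if sys.version_info >= (3, 7):
        kwargs["text"] = True
    else:
        kwargs["universal_newlines"] = True
    return subprocess.run(args, **kwargs)

result = run_text([sys.executable, "-c", "print('ok')"])
print(result.stdout.strip())  # prints "ok"
```

Since `universal_newlines=True` is still accepted on 3.7+, passing it unconditionally is the simplest portable choice; the branch above just makes the version boundary explicit.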
|
1.0
|
Synthesis failed for tasks - Hello! Autosynth couldn't regenerate tasks. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Checking out files: 25% (16421/65361)
Checking out files: 100% (65361/65361), done.
Switched to branch 'autosynth-tasks'
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 256, in <module>
main()
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 196, in main
last_synth_commit_hash = get_last_metadata_commit(args.metadata_path)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 149, in get_last_metadata_commit
text=True,
File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/subprocess.py", line 403, in run
with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'text'
```
Google internal developers can see the full log [here](https://sponge/40f694d4-43de-41f0-b993-f4694e4a45de).
|
non_process
|
synthesis failed for tasks hello autosynth couldn t regenerate tasks broken heart here s the output from running synth py cloning into working repo checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files checking out files done switched to branch autosynth tasks traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth autosynth synth py line in main file tmpfs src git autosynth autosynth 
synth py line in main last synth commit hash get last metadata commit args metadata path file tmpfs src git autosynth autosynth synth py line in get last metadata commit text true file home kbuilder pyenv versions lib subprocess py line in run with popen popenargs kwargs as process typeerror init got an unexpected keyword argument text google internal developers can see the full log
| 0
|
80,104
| 9,979,738,738
|
IssuesEvent
|
2019-07-10 00:13:56
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
closed
|
Docked extended FAB text gets clipped when returning to the page
|
a: animation f: material design framework
|
<!-- Thank you for using Flutter!
If you are looking for support, please check out our documentation
or consider asking a question on Stack Overflow:
* https://flutter.io/
* https://docs.flutter.io/
* https://stackoverflow.com/questions/tagged/flutter?sort=frequent
If you have found a bug or if our documentation doesn't have an answer
to what you're looking for, then fill out the template below. Please read
our guide to filing a bug first: https://flutter.io/bug-reports/
-->
## Steps to Reproduce
When navigating from a screen with a center-docked extended FAB to another screen whose right-docked extended FAB has a shorter label, the text at the right end of the FAB gets clipped on returning to the first screen. See the imgur video below:
http://imgur.com/gallery/DQP8Vh7
## Logs
<!--
Run your application with `flutter run --verbose` and attach all the
log output below between the lines with the backticks. If there is an
exception, please see if the error message includes enough information
to explain how to solve the issue.
-->
```
C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker>flutter run --verbose
[ +46 ms] [C:\flutter\] git rev-parse --abbrev-ref --symbolic @{u}
[ +241 ms] Exit code 0 from: git rev-parse --abbrev-ref --symbolic @{u}
[ ] origin/beta
[ ] [C:\flutter\] git rev-parse --abbrev-ref HEAD
[ +244 ms] Exit code 0 from: git rev-parse --abbrev-ref HEAD
[ ] beta
[ ] [C:\flutter\] git ls-remote --get-url origin
[ +244 ms] Exit code 0 from: git ls-remote --get-url origin
[ ] https://github.com/flutter/flutter.git
[ ] [C:\flutter\] git log -n 1 --pretty=format:%H
[ +204 ms] Exit code 0 from: git log -n 1 --pretty=format:%H
[ ] c7ea3ca377e909469c68f2ab878a5bc53d3cf66b
[ ] [C:\flutter\] git log -n 1 --pretty=format:%ar
[ +168 ms] Exit code 0 from: git log -n 1 --pretty=format:%ar
[ ] 2 months ago
[ +1 ms] [C:\flutter\] git describe --match v*.*.* --first-parent --long --tags
[ +175 ms] Exit code 0 from: git describe --match v*.*.* --first-parent --long --tags
[ ] v0.5.1-0-gc7ea3ca37
[+1152 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb devices -l
[ +135 ms] Exit code 0 from: C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb devices -l
[ ] List of devices attached
HT74J0200552 device product:marlin model:Pixel_XL device:marlin transport_id:2
[ +543 ms] Found plugin google_places_picker at C:\flutter\.pub-cache\hosted\pub.dartlang.org\google_places_picker-0.0.4\
[ +570 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell getprop
[ +167 ms] ro.hardware = marlin
[ ] ro.build.characteristics = nosdcard
[+6176 ms] Launching lib/main.dart on Pixel XL in debug mode...
[ +28 ms] Initializing gradle...
[ +1 ms] Using gradle from C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\gradlew.bat.
[+1234 ms] C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\gradlew.bat -v
[+28800 ms]
------------------------------------------------------------
Gradle 4.4
------------------------------------------------------------
Build time: 2017-12-06 09:05:06 UTC
Revision: cf7821a6f79f8e2a598df21780e3ff7ce8db2b82
Groovy: 2.4.12
Ant: Apache Ant(TM) version 1.9.9 compiled on February 2 2017
JVM: 1.8.0_152-release (JetBrains s.r.o 25.152-b04)
OS: Windows 10 10.0 amd64
[ +11 ms] Resolving dependencies...
[ +1 ms] [android\] C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\gradlew.bat app:properties
[+131481 ms] Starting a Gradle Daemon, 1 incompatible and 1 stopped Daemons could not be reused, use --status for details
:app:properties
------------------------------------------------------------
Project :app
------------------------------------------------------------
allprojects: [project ':app']
android: com.android.build.gradle.AppExtension_Decorated@378812f6
androidDependencies: task ':app:androidDependencies'
ant: org.gradle.api.internal.project.DefaultAntBuilder@2113d18
antBuilderFactory: org.gradle.api.internal.project.DefaultAntBuilderFactory@b4b4aee
archivesBaseName: app
artifacts: org.gradle.api.internal.artifacts.dsl.DefaultArtifactHandler_Decorated@4261b282
asDynamicObject: DynamicObject for project ':app'
assemble: task ':app:assemble'
assembleAndroidTest: task ':app:assembleAndroidTest'
assembleDebug: task ':app:assembleDebug'
assembleDebugAndroidTest: task ':app:assembleDebugAndroidTest'
assembleDebugUnitTest: task ':app:assembleDebugUnitTest'
assembleProfile: task ':app:assembleProfile'
assembleProfileUnitTest: task ':app:assembleProfileUnitTest'
assembleRelease: task ':app:assembleRelease'
assembleReleaseUnitTest: task ':app:assembleReleaseUnitTest'
baseClassLoaderScope: org.gradle.api.internal.initialization.DefaultClassLoaderScope@3509079a
buildDependents: task ':app:buildDependents'
buildDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app
buildFile: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\app\build.gradle
buildNeeded: task ':app:buildNeeded'
buildOutputs: BaseVariantOutput container
buildPath: :
buildScriptSource: org.gradle.groovy.scripts.TextResourceScriptSource@3e7b05de
buildscript: org.gradle.api.internal.initialization.DefaultScriptHandler@6f3634c5
bundleAppClassesDebug: task ':app:bundleAppClassesDebug'
bundleAppClassesDebugAndroidTest: task ':app:bundleAppClassesDebugAndroidTest'
bundleAppClassesDebugUnitTest: task ':app:bundleAppClassesDebugUnitTest'
bundleAppClassesProfile: task ':app:bundleAppClassesProfile'
bundleAppClassesProfileUnitTest: task ':app:bundleAppClassesProfileUnitTest'
bundleAppClassesRelease: task ':app:bundleAppClassesRelease'
bundleAppClassesReleaseUnitTest: task ':app:bundleAppClassesReleaseUnitTest'
bundleDebugAndroidTestResources: task ':app:bundleDebugAndroidTestResources'
bundleDebugResources: task ':app:bundleDebugResources'
bundleProfileResources: task ':app:bundleProfileResources'
bundleReleaseResources: task ':app:bundleReleaseResources'
check: task ':app:check'
checkDebugManifest: task ':app:checkDebugManifest'
checkProfileManifest: task ':app:checkProfileManifest'
checkReleaseManifest: task ':app:checkReleaseManifest'
childProjects: {}
class: class org.gradle.api.internal.project.DefaultProject_Decorated
classLoaderScope: org.gradle.api.internal.initialization.DefaultClassLoaderScope@208c7e03
cleanBuildCache: task ':app:cleanBuildCache'
compileDebugAidl: task ':app:compileDebugAidl'
compileDebugAndroidTestAidl: task ':app:compileDebugAndroidTestAidl'
compileDebugAndroidTestJavaWithJavac: task ':app:compileDebugAndroidTestJavaWithJavac'
compileDebugAndroidTestNdk: task ':app:compileDebugAndroidTestNdk'
compileDebugAndroidTestRenderscript: task ':app:compileDebugAndroidTestRenderscript'
compileDebugAndroidTestShaders: task ':app:compileDebugAndroidTestShaders'
compileDebugAndroidTestSources: task ':app:compileDebugAndroidTestSources'
compileDebugJavaWithJavac: task ':app:compileDebugJavaWithJavac'
compileDebugNdk: task ':app:compileDebugNdk'
compileDebugRenderscript: task ':app:compileDebugRenderscript'
compileDebugShaders: task ':app:compileDebugShaders'
compileDebugSources: task ':app:compileDebugSources'
compileDebugUnitTestJavaWithJavac: task ':app:compileDebugUnitTestJavaWithJavac'
compileDebugUnitTestSources: task ':app:compileDebugUnitTestSources'
compileLint: task ':app:compileLint'
compileProfileAidl: task ':app:compileProfileAidl'
compileProfileJavaWithJavac: task ':app:compileProfileJavaWithJavac'
compileProfileNdk: task ':app:compileProfileNdk'
compileProfileRenderscript: task ':app:compileProfileRenderscript'
compileProfileShaders: task ':app:compileProfileShaders'
compileProfileSources: task ':app:compileProfileSources'
compileProfileUnitTestJavaWithJavac: task ':app:compileProfileUnitTestJavaWithJavac'
compileProfileUnitTestSources: task ':app:compileProfileUnitTestSources'
compileReleaseAidl: task ':app:compileReleaseAidl'
compileReleaseJavaWithJavac: task ':app:compileReleaseJavaWithJavac'
compileReleaseNdk: task ':app:compileReleaseNdk'
compileReleaseRenderscript: task ':app:compileReleaseRenderscript'
compileReleaseShaders: task ':app:compileReleaseShaders'
compileReleaseSources: task ':app:compileReleaseSources'
compileReleaseUnitTestJavaWithJavac: task ':app:compileReleaseUnitTestJavaWithJavac'
compileReleaseUnitTestSources: task ':app:compileReleaseUnitTestSources'
components: SoftwareComponentInternal set
configurationActions: org.gradle.configuration.project.DefaultProjectConfigurationActionContainer@5dc4885f
configurationTargetIdentifier: org.gradle.configuration.ConfigurationTargetIdentifier$1@3bf733cc
configurations: configuration container
connectedAndroidTest: task ':app:connectedAndroidTest'
connectedCheck: task ':app:connectedCheck'
connectedDebugAndroidTest: task ':app:connectedDebugAndroidTest'
consumeConfigAttr: task ':app:consumeConfigAttr'
convention: org.gradle.api.internal.plugins.DefaultConvention@6539a86b
copyFlutterAssetsDebug: task ':app:copyFlutterAssetsDebug'
copyFlutterAssetsProfile: task ':app:copyFlutterAssetsProfile'
copyFlutterAssetsRelease: task ':app:copyFlutterAssetsRelease'
createDebugCompatibleScreenManifests: task ':app:createDebugCompatibleScreenManifests'
createProfileCompatibleScreenManifests: task ':app:createProfileCompatibleScreenManifests'
createReleaseCompatibleScreenManifests: task ':app:createReleaseCompatibleScreenManifests'
defaultArtifacts: org.gradle.api.internal.plugins.DefaultArtifactPublicationSet_Decorated@2686aea9
defaultTasks: []
deferredProjectConfiguration: org.gradle.api.internal.project.DeferredProjectConfiguration@5ff95c56
dependencies: org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler_Decorated@5d927d36
depth: 1
description: null
deviceAndroidTest: task ':app:deviceAndroidTest'
deviceCheck: task ':app:deviceCheck'
displayName: project ':app'
distsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\distributions
distsDirName: distributions
docsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\docs
docsDirName: docs
ext: org.gradle.api.internal.plugins.DefaultExtraPropertiesExtension@75c0ec1f
extensions: org.gradle.api.internal.plugins.DefaultConvention@6539a86b
extractProguardFiles: task ':app:extractProguardFiles'
fileOperations: org.gradle.api.internal.file.DefaultFileOperations@6620e7e8
fileResolver: org.gradle.api.internal.file.BaseDirFileResolver@732e0c96
flutter: FlutterExtension_Decorated@3d1ba408
flutterBuildDebug: task ':app:flutterBuildDebug'
flutterBuildProfile: task ':app:flutterBuildProfile'
flutterBuildRelease: task ':app:flutterBuildRelease'
flutterBuildX86Jar: task ':app:flutterBuildX86Jar'
generateDebugAndroidTestAssets: task ':app:generateDebugAndroidTestAssets'
generateDebugAndroidTestBuildConfig: task ':app:generateDebugAndroidTestBuildConfig'
generateDebugAndroidTestResValues: task ':app:generateDebugAndroidTestResValues'
generateDebugAndroidTestResources: task ':app:generateDebugAndroidTestResources'
generateDebugAndroidTestSources: task ':app:generateDebugAndroidTestSources'
generateDebugAssets: task ':app:generateDebugAssets'
generateDebugBuildConfig: task ':app:generateDebugBuildConfig'
generateDebugResValues: task ':app:generateDebugResValues'
generateDebugResources: task ':app:generateDebugResources'
generateDebugSources: task ':app:generateDebugSources'
generateProfileAssets: task ':app:generateProfileAssets'
generateProfileBuildConfig: task ':app:generateProfileBuildConfig'
generateProfileResValues: task ':app:generateProfileResValues'
generateProfileResources: task ':app:generateProfileResources'
generateProfileSources: task ':app:generateProfileSources'
generateReleaseAssets: task ':app:generateReleaseAssets'
generateReleaseBuildConfig: task ':app:generateReleaseBuildConfig'
generateReleaseResValues: task ':app:generateReleaseResValues'
generateReleaseResources: task ':app:generateReleaseResources'
generateReleaseSources: task ':app:generateReleaseSources'
gradle: build 'android'
group: android
identityPath: :app
inheritedScope: org.gradle.api.internal.ExtensibleDynamicObject$InheritedDynamicObject@4f4907a0
installDebug: task ':app:installDebug'
installDebugAndroidTest: task ':app:installDebugAndroidTest'
installProfile: task ':app:installProfile'
installRelease: task ':app:installRelease'
javaPreCompileDebug: task ':app:javaPreCompileDebug'
javaPreCompileDebugAndroidTest: task ':app:javaPreCompileDebugAndroidTest'
javaPreCompileDebugUnitTest: task ':app:javaPreCompileDebugUnitTest'
javaPreCompileProfile: task ':app:javaPreCompileProfile'
javaPreCompileProfileUnitTest: task ':app:javaPreCompileProfileUnitTest'
javaPreCompileRelease: task ':app:javaPreCompileRelease'
javaPreCompileReleaseUnitTest: task ':app:javaPreCompileReleaseUnitTest'
layout: org.gradle.api.internal.file.DefaultProjectLayout@2eb7b606
libsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\libs
libsDirName: libs
lint: task ':app:lint'
lintDebug: task ':app:lintDebug'
lintProfile: task ':app:lintProfile'
lintRelease: task ':app:lintRelease'
lintVitalRelease: task ':app:lintVitalRelease'
logger: org.gradle.internal.logging.slf4j.OutputEventListenerBackedLogger@2d7e55fa
logging: org.gradle.internal.logging.services.DefaultLoggingManager@4d258fb9
mainApkListPersistenceDebug: task ':app:mainApkListPersistenceDebug'
mainApkListPersistenceDebugAndroidTest: task ':app:mainApkListPersistenceDebugAndroidTest'
mainApkListPersistenceProfile: task ':app:mainApkListPersistenceProfile'
mainApkListPersistenceRelease: task ':app:mainApkListPersistenceRelease'
mergeDebugAndroidTestAssets: task ':app:mergeDebugAndroidTestAssets'
mergeDebugAndroidTestJniLibFolders: task ':app:mergeDebugAndroidTestJniLibFolders'
mergeDebugAndroidTestResources: task ':app:mergeDebugAndroidTestResources'
mergeDebugAndroidTestShaders: task ':app:mergeDebugAndroidTestShaders'
mergeDebugAssets: task ':app:mergeDebugAssets'
mergeDebugJniLibFolders: task ':app:mergeDebugJniLibFolders'
mergeDebugResources: task ':app:mergeDebugResources'
mergeDebugShaders: task ':app:mergeDebugShaders'
mergeProfileAssets: task ':app:mergeProfileAssets'
mergeProfileJniLibFolders: task ':app:mergeProfileJniLibFolders'
mergeProfileResources: task ':app:mergeProfileResources'
mergeProfileShaders: task ':app:mergeProfileShaders'
mergeReleaseAssets: task ':app:mergeReleaseAssets'
mergeReleaseJniLibFolders: task ':app:mergeReleaseJniLibFolders'
mergeReleaseResources: task ':app:mergeReleaseResources'
mergeReleaseShaders: task ':app:mergeReleaseShaders'
mockableAndroidJar: task ':app:mockableAndroidJar'
modelRegistry: org.gradle.model.internal.registry.DefaultModelRegistry@4a56f5ba
modelSchemaStore: org.gradle.model.internal.manage.schema.extract.DefaultModelSchemaStore@59427d70
module: org.gradle.api.internal.artifacts.ProjectBackedModule@2625c26
name: app
normalization: org.gradle.normalization.internal.DefaultInputNormalizationHandler_Decorated@64c2227
objects: org.gradle.api.internal.model.DefaultObjectFactory@67752d10
org.gradle.jvmargs: -Xmx1536M
packageDebug: task ':app:packageDebug'
packageDebugAndroidTest: task ':app:packageDebugAndroidTest'
packageProfile: task ':app:packageProfile'
packageRelease: task ':app:packageRelease'
parent: root project 'android'
parentIdentifier: root project 'android'
path: :app
platformAttrExtractor: task ':app:platformAttrExtractor'
pluginManager: org.gradle.api.internal.plugins.DefaultPluginManager_Decorated@1a73ef61
plugins: [org.gradle.api.plugins.HelpTasksPlugin@53d57a54, com.android.build.gradle.api.AndroidBasePlugin@768ed0ad, org.gradle.language.base.plugins.LifecycleBasePlugin@6a5fc451, org.gradle.api.plugins.BasePl
ugin@4fddcc94, org.gradle.api.plugins.ReportingBasePlugin@2af1694d, org.gradle.platform.base.plugins.ComponentBasePlugin@697f1ec3, org.gradle.language.base.plugins.LanguageBasePlugin@26c95273, org.gradle.platform.base.plugins.Bin
aryBasePlugin@5b9560a6, org.gradle.api.plugins.JavaBasePlugin@1c6bd387, com.android.build.gradle.AppPlugin@949683, FlutterPlugin@7eae9d42]
preBuild: task ':app:preBuild'
preDebugAndroidTestBuild: task ':app:preDebugAndroidTestBuild'
preDebugBuild: task ':app:preDebugBuild'
preDebugUnitTestBuild: task ':app:preDebugUnitTestBuild'
preProfileBuild: task ':app:preProfileBuild'
preProfileUnitTestBuild: task ':app:preProfileUnitTestBuild'
preReleaseBuild: task ':app:preReleaseBuild'
preReleaseUnitTestBuild: task ':app:preReleaseUnitTestBuild'
prepareLintJar: task ':app:prepareLintJar'
preparePUBLISHED_DEXDebugAndroidTestForPublishing: task ':app:preparePUBLISHED_DEXDebugAndroidTestForPublishing'
preparePUBLISHED_DEXDebugForPublishing: task ':app:preparePUBLISHED_DEXDebugForPublishing'
preparePUBLISHED_DEXProfileForPublishing: task ':app:preparePUBLISHED_DEXProfileForPublishing'
preparePUBLISHED_DEXReleaseForPublishing: task ':app:preparePUBLISHED_DEXReleaseForPublishing'
preparePUBLISHED_JAVA_RESDebugAndroidTestForPublishing: task ':app:preparePUBLISHED_JAVA_RESDebugAndroidTestForPublishing'
preparePUBLISHED_JAVA_RESDebugForPublishing: task ':app:preparePUBLISHED_JAVA_RESDebugForPublishing'
preparePUBLISHED_JAVA_RESProfileForPublishing: task ':app:preparePUBLISHED_JAVA_RESProfileForPublishing'
preparePUBLISHED_JAVA_RESReleaseForPublishing: task ':app:preparePUBLISHED_JAVA_RESReleaseForPublishing'
preparePUBLISHED_NATIVE_LIBSDebugAndroidTestForPublishing: task ':app:preparePUBLISHED_NATIVE_LIBSDebugAndroidTestForPublishing'
preparePUBLISHED_NATIVE_LIBSDebugForPublishing: task ':app:preparePUBLISHED_NATIVE_LIBSDebugForPublishing'
preparePUBLISHED_NATIVE_LIBSProfileForPublishing: task ':app:preparePUBLISHED_NATIVE_LIBSProfileForPublishing'
preparePUBLISHED_NATIVE_LIBSReleaseForPublishing: task ':app:preparePUBLISHED_NATIVE_LIBSReleaseForPublishing'
processDebugAndroidTestJavaRes: task ':app:processDebugAndroidTestJavaRes'
processDebugAndroidTestManifest: task ':app:processDebugAndroidTestManifest'
processDebugAndroidTestResources: task ':app:processDebugAndroidTestResources'
processDebugJavaRes: task ':app:processDebugJavaRes'
processDebugManifest: task ':app:processDebugManifest'
processDebugResources: task ':app:processDebugResources'
processDebugUnitTestJavaRes: task ':app:processDebugUnitTestJavaRes'
processOperations: org.gradle.api.internal.file.DefaultFileOperations@6620e7e8
processProfileJavaRes: task ':app:processProfileJavaRes'
processProfileManifest: task ':app:processProfileManifest'
processProfileResources: task ':app:processProfileResources'
processProfileUnitTestJavaRes: task ':app:processProfileUnitTestJavaRes'
processReleaseJavaRes: task ':app:processReleaseJavaRes'
processReleaseManifest: task ':app:processReleaseManifest'
processReleaseResources: task ':app:processReleaseResources'
processReleaseUnitTestJavaRes: task ':app:processReleaseUnitTestJavaRes'
project: project ':app'
projectConfigurator: org.gradle.api.internal.project.BuildOperationCrossProjectConfigurator@5090b8a2
projectDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\app
projectEvaluationBroadcaster: ProjectEvaluationListener broadcast
projectEvaluator: org.gradle.configuration.project.LifecycleProjectEvaluator@372af58d
projectPath: :app
projectRegistry: org.gradle.api.internal.project.DefaultProjectRegistry@797953d5
properties: {...}
providers: org.gradle.api.internal.provider.DefaultProviderFactory@7cf4a37c
reportBuildArtifactsDebug: task ':app:reportBuildArtifactsDebug'
reportBuildArtifactsProfile: task ':app:reportBuildArtifactsProfile'
reportBuildArtifactsRelease: task ':app:reportBuildArtifactsRelease'
reporting: org.gradle.api.reporting.ReportingExtension_Decorated@297451e8
reportsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\reports
repositories: repository container
resolveConfigAttr: task ':app:resolveConfigAttr'
resourceLoader: org.gradle.internal.resource.transfer.DefaultUriTextResourceLoader@49125fd6
resources: org.gradle.api.internal.resources.DefaultResourceHandler@7ac64ada
rootDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android
rootProject: root project 'android'
script: false
scriptHandlerFactory: org.gradle.api.internal.initialization.DefaultScriptHandlerFactory@23f0e9bd
scriptPluginFactory: org.gradle.configuration.ScriptPluginFactorySelector@2800173a
serviceRegistryFactory: org.gradle.internal.service.scopes.ProjectScopeServices$4@73370103
services: ProjectScopeServices
signingReport: task ':app:signingReport'
sourceCompatibility: 1.8
sourceSets: SourceSet container
splitsDiscoveryTaskDebug: task ':app:splitsDiscoveryTaskDebug'
splitsDiscoveryTaskProfile: task ':app:splitsDiscoveryTaskProfile'
splitsDiscoveryTaskRelease: task ':app:splitsDiscoveryTaskRelease'
standardOutputCapture: org.gradle.internal.logging.services.DefaultLoggingManager@4d258fb9
state: project state 'EXECUTED'
status: integration
subprojects: []
targetCompatibility: 1.8
tasks: task set
test: task ':app:test'
testDebugUnitTest: task ':app:testDebugUnitTest'
testProfileUnitTest: task ':app:testProfileUnitTest'
testReleaseUnitTest: task ':app:testReleaseUnitTest'
testReportDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\reports\tests
testReportDirName: tests
testResultsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\test-results
testResultsDirName: test-results
transformClassesWithDexBuilderForDebug: task ':app:transformClassesWithDexBuilderForDebug'
transformClassesWithDexBuilderForDebugAndroidTest: task ':app:transformClassesWithDexBuilderForDebugAndroidTest'
transformClassesWithDexBuilderForProfile: task ':app:transformClassesWithDexBuilderForProfile'
transformClassesWithDexBuilderForRelease: task ':app:transformClassesWithDexBuilderForRelease'
transformDexArchiveWithDexMergerForDebug: task ':app:transformDexArchiveWithDexMergerForDebug'
transformDexArchiveWithDexMergerForDebugAndroidTest: task ':app:transformDexArchiveWithDexMergerForDebugAndroidTest'
transformDexArchiveWithDexMergerForProfile: task ':app:transformDexArchiveWithDexMergerForProfile'
transformDexArchiveWithDexMergerForRelease: task ':app:transformDexArchiveWithDexMergerForRelease'
transformDexArchiveWithExternalLibsDexMergerForDebug: task ':app:transformDexArchiveWithExternalLibsDexMergerForDebug'
transformDexArchiveWithExternalLibsDexMergerForDebugAndroidTest: task ':app:transformDexArchiveWithExternalLibsDexMergerForDebugAndroidTest'
transformDexArchiveWithExternalLibsDexMergerForProfile: task ':app:transformDexArchiveWithExternalLibsDexMergerForProfile'
transformDexArchiveWithExternalLibsDexMergerForRelease: task ':app:transformDexArchiveWithExternalLibsDexMergerForRelease'
transformNativeLibsWithMergeJniLibsForDebug: task ':app:transformNativeLibsWithMergeJniLibsForDebug'
transformNativeLibsWithMergeJniLibsForDebugAndroidTest: task ':app:transformNativeLibsWithMergeJniLibsForDebugAndroidTest'
transformNativeLibsWithMergeJniLibsForProfile: task ':app:transformNativeLibsWithMergeJniLibsForProfile'
transformNativeLibsWithMergeJniLibsForRelease: task ':app:transformNativeLibsWithMergeJniLibsForRelease'
transformResourcesWithMergeJavaResForDebug: task ':app:transformResourcesWithMergeJavaResForDebug'
transformResourcesWithMergeJavaResForDebugAndroidTest: task ':app:transformResourcesWithMergeJavaResForDebugAndroidTest'
transformResourcesWithMergeJavaResForDebugUnitTest: task ':app:transformResourcesWithMergeJavaResForDebugUnitTest'
transformResourcesWithMergeJavaResForProfile: task ':app:transformResourcesWithMergeJavaResForProfile'
transformResourcesWithMergeJavaResForProfileUnitTest: task ':app:transformResourcesWithMergeJavaResForProfileUnitTest'
transformResourcesWithMergeJavaResForRelease: task ':app:transformResourcesWithMergeJavaResForRelease'
transformResourcesWithMergeJavaResForReleaseUnitTest: task ':app:transformResourcesWithMergeJavaResForReleaseUnitTest'
uninstallAll: task ':app:uninstallAll'
uninstallDebug: task ':app:uninstallDebug'
uninstallDebugAndroidTest: task ':app:uninstallDebugAndroidTest'
uninstallProfile: task ':app:uninstallProfile'
uninstallRelease: task ':app:uninstallRelease'
validateSigningDebug: task ':app:validateSigningDebug'
validateSigningDebugAndroidTest: task ':app:validateSigningDebugAndroidTest'
validateSigningProfile: task ':app:validateSigningProfile'
validateSigningRelease: task ':app:validateSigningRelease'
version: unspecified
writeDebugApplicationId: task ':app:writeDebugApplicationId'
writeProfileApplicationId: task ':app:writeProfileApplicationId'
writeReleaseApplicationId: task ':app:writeReleaseApplicationId'
BUILD SUCCESSFUL in 2m 9s
1 actionable task: 1 executed
[ +9 ms] C:\Users\groov\AppData\Local\Android\sdk\build-tools\28.0.0\aapt dump badging build\app\outputs\apk\app.apk
[ +873 ms] Exit code 0 from: C:\Users\groov\AppData\Local\Android\sdk\build-tools\28.0.0\aapt dump badging build\app\outputs\apk\app.apk
[ ] package: name='com.example.guitarstudenttracker' versionCode='1' versionName='1.0'
sdkVersion:'16'
targetSdkVersion:'27'
uses-permission: name='android.permission.INTERNET'
uses-permission: name='android.permission.ACCESS_NETWORK_STATE'
application-label:'guitar_student_tracker'
application-label-af:'guitar_student_tracker'
application-label-am:'guitar_student_tracker'
application-label-ar:'guitar_student_tracker'
application-label-az:'guitar_student_tracker'
application-label-be:'guitar_student_tracker'
application-label-bg:'guitar_student_tracker'
application-label-bn:'guitar_student_tracker'
application-label-bs:'guitar_student_tracker'
application-label-ca:'guitar_student_tracker'
application-label-cs:'guitar_student_tracker'
application-label-da:'guitar_student_tracker'
application-label-de:'guitar_student_tracker'
application-label-el:'guitar_student_tracker'
application-label-en-AU:'guitar_student_tracker'
application-label-en-GB:'guitar_student_tracker'
application-label-en-IN:'guitar_student_tracker'
application-label-es:'guitar_student_tracker'
application-label-es-US:'guitar_student_tracker'
application-label-et:'guitar_student_tracker'
application-label-eu:'guitar_student_tracker'
application-label-fa:'guitar_student_tracker'
application-label-fi:'guitar_student_tracker'
application-label-fr:'guitar_student_tracker'
application-label-fr-CA:'guitar_student_tracker'
application-label-gl:'guitar_student_tracker'
application-label-gu:'guitar_student_tracker'
application-label-hi:'guitar_student_tracker'
application-label-hr:'guitar_student_tracker'
application-label-hu:'guitar_student_tracker'
application-label-hy:'guitar_student_tracker'
application-label-in:'guitar_student_tracker'
application-label-is:'guitar_student_tracker'
application-label-it:'guitar_student_tracker'
application-label-iw:'guitar_student_tracker'
application-label-ja:'guitar_student_tracker'
application-label-ka:'guitar_student_tracker'
application-label-kk:'guitar_student_tracker'
application-label-km:'guitar_student_tracker'
application-label-kn:'guitar_student_tracker'
application-label-ko:'guitar_student_tracker'
application-label-ky:'guitar_student_tracker'
application-label-lo:'guitar_student_tracker'
application-label-lt:'guitar_student_tracker'
application-label-lv:'guitar_student_tracker'
application-label-mk:'guitar_student_tracker'
application-label-ml:'guitar_student_tracker'
application-label-mn:'guitar_student_tracker'
application-label-mr:'guitar_student_tracker'
application-label-ms:'guitar_student_tracker'
application-label-my:'guitar_student_tracker'
application-label-nb:'guitar_student_tracker'
application-label-ne:'guitar_student_tracker'
application-label-nl:'guitar_student_tracker'
application-label-pa:'guitar_student_tracker'
application-label-pl:'guitar_student_tracker'
application-label-pt:'guitar_student_tracker'
application-label-pt-BR:'guitar_student_tracker'
application-label-pt-PT:'guitar_student_tracker'
application-label-ro:'guitar_student_tracker'
application-label-ru:'guitar_student_tracker'
application-label-si:'guitar_student_tracker'
application-label-sk:'guitar_student_tracker'
application-label-sl:'guitar_student_tracker'
application-label-sq:'guitar_student_tracker'
application-label-sr:'guitar_student_tracker'
application-label-sr-Latn:'guitar_student_tracker'
application-label-sv:'guitar_student_tracker'
application-label-sw:'guitar_student_tracker'
application-label-ta:'guitar_student_tracker'
application-label-te:'guitar_student_tracker'
application-label-th:'guitar_student_tracker'
application-label-tl:'guitar_student_tracker'
application-label-tr:'guitar_student_tracker'
application-label-uk:'guitar_student_tracker'
application-label-ur:'guitar_student_tracker'
application-label-uz:'guitar_student_tracker'
application-label-vi:'guitar_student_tracker'
application-label-zh-CN:'guitar_student_tracker'
application-label-zh-HK:'guitar_student_tracker'
application-label-zh-TW:'guitar_student_tracker'
application-label-zu:'guitar_student_tracker'
application-icon-160:'res/mipmap-mdpi-v4/ic_launcher.png'
application-icon-240:'res/mipmap-hdpi-v4/ic_launcher.png'
application-icon-320:'res/mipmap-xhdpi-v4/ic_launcher.png'
application-icon-480:'res/mipmap-xxhdpi-v4/ic_launcher.png'
application-icon-640:'res/mipmap-xxxhdpi-v4/ic_launcher.png'
application: label='guitar_student_tracker' icon='res/mipmap-mdpi-v4/ic_launcher.png'
launchable-activity: name='com.example.guitarstudenttracker.MainActivity' label='' icon=''
feature-group: label=''
uses-gl-es: '0x20000'
uses-feature: name='android.hardware.faketouch'
uses-implied-feature: name='android.hardware.faketouch' reason='default feature for all apps'
main
other-activities
supports-screens: 'small' 'normal' 'large' 'xlarge'
supports-any-density: 'true'
locales: '--_--' 'af' 'am' 'ar' 'az' 'be' 'bg' 'bn' 'bs' 'ca' 'cs' 'da' 'de' 'el' 'en-AU' 'en-GB' 'en-IN' 'es' 'es-US' 'et' 'eu' 'fa' 'fi' 'fr' 'fr-CA' 'gl' 'gu' 'hi' 'hr' 'hu' 'hy' 'in' 'is' 'it' 'iw' 'ja' 'ka' 'kk' '
km' 'kn' 'ko' 'ky' 'lo' 'lt' 'lv' 'mk' 'ml' 'mn' 'mr' 'ms' 'my' 'nb' 'ne' 'nl' 'pa' 'pl' 'pt' 'pt-BR' 'pt-PT' 'ro' 'ru' 'si' 'sk' 'sl' 'sq' 'sr' 'sr-Latn' 'sv' 'sw' 'ta' 'te' 'th' 'tl' 'tr' 'uk' 'ur' 'uz' 'vi' 'zh-CN' 'zh-HK' 'zh
-TW' 'zu'
densities: '160' '240' '320' '480' '640'
native-code: 'arm64-v8a'
[ +10 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 logcat -v time -t 1
[ +335 ms] Exit code 0 from: C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 logcat -v time -t 1
[ +1 ms] --------- beginning of main
08-09 21:43:33.808 D/NewStreamAdapter( 2543): Publishing revision #2270812 to adapter clients
[ +8 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 logcat -v time
[ +419 ms] DependencyChecker: nothing is modified after 2018-08-09 21:07:09.000.
[ +4 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb version
[ +143 ms] Android Debug Bridge version 1.0.40
Version 4797878
Installed as C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb.EXE
[ +3 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb start-server
[ +66 ms] Building APK
[ +66 ms] Running 'gradlew assembleDebug'...
[ +2 ms] [android\] C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\gradlew.bat -Ptarget=C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\lib/main.dart -Ppreview-dart-2=true -Ptarget-platform=android-arm64
assembleDebug
[+4194 ms] :app:preBuild UP-TO-DATE
[ ] :google_places_picker:preBuild UP-TO-DATE
[ ] :google_places_picker:preDebugBuild UP-TO-DATE
[ +252 ms] :google_places_picker:checkDebugManifest UP-TO-DATE
[ +199 ms] :google_places_picker:processDebugManifest UP-TO-DATE
[+3580 ms] :app:preDebugBuild UP-TO-DATE
[ +66 ms] :google_places_picker:compileDebugAidl UP-TO-DATE
[ +10 ms] :app:compileDebugAidl UP-TO-DATE
[ ] :google_places_picker:packageDebugRenderscript NO-SOURCE
[ +33 ms] :app:compileDebugRenderscript UP-TO-DATE
[ +10 ms] :app:flutterBuildX86Jar UP-TO-DATE
[ +10 ms] :app:checkDebugManifest UP-TO-DATE
[ ] :app:generateDebugBuildConfig UP-TO-DATE
[ ] :app:prepareLintJar UP-TO-DATE
[ +162 ms] :app:cleanMergeDebugAssets
[+13601 ms] :app:flutterBuildDebug
[ +199 ms] :app:mergeDebugShaders UP-TO-DATE
[ ] :app:compileDebugShaders UP-TO-DATE
[ ] :app:generateDebugAssets UP-TO-DATE
[ +10 ms] :google_places_picker:mergeDebugShaders UP-TO-DATE
[ ] :google_places_picker:compileDebugShaders UP-TO-DATE
[ ] :google_places_picker:generateDebugAssets UP-TO-DATE
[ ] :google_places_picker:packageDebugAssets UP-TO-DATE
[ +494 ms] :app:mergeDebugAssets
[ +681 ms] :app:copyFlutterAssetsDebug
[ +96 ms] :app:mainApkListPersistenceDebug UP-TO-DATE
[ +1 ms] :app:generateDebugResValues UP-TO-DATE
[ ] :app:generateDebugResources UP-TO-DATE
[ +9 ms] :google_places_picker:compileDebugRenderscript UP-TO-DATE
[ +10 ms] :google_places_picker:generateDebugResValues UP-TO-DATE
[ ] :google_places_picker:generateDebugResources UP-TO-DATE
[ +50 ms] :google_places_picker:packageDebugResources UP-TO-DATE
[ +919 ms] :app:mergeDebugResources UP-TO-DATE
[ +62 ms] :app:createDebugCompatibleScreenManifests UP-TO-DATE
[ +11 ms] :app:processDebugManifest UP-TO-DATE
[ ] :app:splitsDiscoveryTaskDebug UP-TO-DATE
[ +10 ms] :google_places_picker:platformAttrExtractor UP-TO-DATE
[ +518 ms] :google_places_picker:generateDebugRFile UP-TO-DATE
[ +55 ms] :app:processDebugResources UP-TO-DATE
[ +1 ms] :app:generateDebugSources UP-TO-DATE
[ ] :google_places_picker:generateDebugBuildConfig UP-TO-DATE
[ +249 ms] :google_places_picker:compileDebugKotlin UP-TO-DATE
[ +2 ms] :google_places_picker:prepareLintJar UP-TO-DATE
[ +9 ms] :google_places_picker:generateDebugSources UP-TO-DATE
[ ] :google_places_picker:javaPreCompileDebug UP-TO-DATE
[ +98 ms] :google_places_picker:compileDebugJavaWithJavac UP-TO-DATE
[ +3 ms] :google_places_picker:processDebugJavaRes NO-SOURCE
[ ] :google_places_picker:transformClassesAndResourcesWithPrepareIntermediateJarsForDebug UP-TO-DATE
[ +7 ms] :app:javaPreCompileDebug UP-TO-DATE
[ +75 ms] :app:compileDebugJavaWithJavac UP-TO-DATE
[ ] :app:compileDebugNdk NO-SOURCE
[ ] :app:compileDebugSources UP-TO-DATE
[ +92 ms] :app:transformClassesWithDexBuilderForDebug UP-TO-DATE
[ +17 ms] :app:transformDexArchiveWithExternalLibsDexMergerForDebug UP-TO-DATE
[ +177 ms] :app:transformDexArchiveWithDexMergerForDebug UP-TO-DATE
[ +43 ms] :app:mergeDebugJniLibFolders UP-TO-DATE
[ +1 ms] :google_places_picker:compileDebugNdk NO-SOURCE
[ ] :google_places_picker:mergeDebugJniLibFolders UP-TO-DATE
[ +9 ms] :google_places_picker:transformNativeLibsWithMergeJniLibsForDebug UP-TO-DATE
[ ] :google_places_picker:transformNativeLibsWithIntermediateJniLibsForDebug UP-TO-DATE
[ +190 ms] :app:transformNativeLibsWithMergeJniLibsForDebug UP-TO-DATE
[ ] :app:processDebugJavaRes NO-SOURCE
[ +54 ms] :app:transformResourcesWithMergeJavaResForDebug UP-TO-DATE
[ +54 ms] :app:validateSigningDebug UP-TO-DATE
[+8945 ms] :app:packageDebug
[ ] :app:assembleDebug
[+1221 ms] :google_places_picker:extractDebugAnnotations UP-TO-DATE
[ ] :google_places_picker:mergeDebugConsumerProguardFiles UP-TO-DATE
[ +51 ms] :google_places_picker:transformResourcesWithMergeJavaResForDebug UP-TO-DATE
[ +216 ms] :google_places_picker:transformClassesAndResourcesWithSyncLibJarsForDebug UP-TO-DATE
[ +11 ms] :google_places_picker:transformNativeLibsWithSyncJniLibsForDebug UP-TO-DATE
[ ] :google_places_picker:bundleDebug UP-TO-DATE
[ ] :google_places_picker:compileDebugSources UP-TO-DATE
[ ] :google_places_picker:assembleDebug UP-TO-DATE
[ +9 ms] BUILD SUCCESSFUL in 36s
[ ] 56 actionable tasks: 5 executed, 51 up-to-date
[ +942 ms] calculateSha: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\outputs\apk/app.apk
[ +549 ms] Built build\app\outputs\apk\debug\app-debug.apk.
[ +2 ms] C:\Users\groov\AppData\Local\Android\sdk\build-tools\28.0.0\aapt dump badging build\app\outputs\apk\app.apk
[ +121 ms] Exit code 0 from: C:\Users\groov\AppData\Local\Android\sdk\build-tools\28.0.0\aapt dump badging build\app\outputs\apk\app.apk
[ ] package: name='com.example.guitarstudenttracker' versionCode='1' versionName='1.0'
sdkVersion:'16'
targetSdkVersion:'27'
uses-permission: name='android.permission.INTERNET'
uses-permission: name='android.permission.ACCESS_NETWORK_STATE'
application-label:'guitar_student_tracker'
application-label-af:'guitar_student_tracker'
application-label-am:'guitar_student_tracker'
application-label-ar:'guitar_student_tracker'
application-label-az:'guitar_student_tracker'
application-label-be:'guitar_student_tracker'
application-label-bg:'guitar_student_tracker'
application-label-bn:'guitar_student_tracker'
application-label-bs:'guitar_student_tracker'
application-label-ca:'guitar_student_tracker'
application-label-cs:'guitar_student_tracker'
application-label-da:'guitar_student_tracker'
application-label-de:'guitar_student_tracker'
application-label-el:'guitar_student_tracker'
application-label-en-AU:'guitar_student_tracker'
application-label-en-GB:'guitar_student_tracker'
application-label-en-IN:'guitar_student_tracker'
application-label-es:'guitar_student_tracker'
application-label-es-US:'guitar_student_tracker'
application-label-et:'guitar_student_tracker'
application-label-eu:'guitar_student_tracker'
application-label-fa:'guitar_student_tracker'
application-label-fi:'guitar_student_tracker'
application-label-fr:'guitar_student_tracker'
application-label-fr-CA:'guitar_student_tracker'
application-label-gl:'guitar_student_tracker'
application-label-gu:'guitar_student_tracker'
application-label-hi:'guitar_student_tracker'
application-label-hr:'guitar_student_tracker'
application-label-hu:'guitar_student_tracker'
application-label-hy:'guitar_student_tracker'
application-label-in:'guitar_student_tracker'
application-label-is:'guitar_student_tracker'
application-label-it:'guitar_student_tracker'
application-label-iw:'guitar_student_tracker'
application-label-ja:'guitar_student_tracker'
application-label-ka:'guitar_student_tracker'
application-label-kk:'guitar_student_tracker'
application-label-km:'guitar_student_tracker'
application-label-kn:'guitar_student_tracker'
application-label-ko:'guitar_student_tracker'
application-label-ky:'guitar_student_tracker'
application-label-lo:'guitar_student_tracker'
application-label-lt:'guitar_student_tracker'
application-label-lv:'guitar_student_tracker'
application-label-mk:'guitar_student_tracker'
application-label-ml:'guitar_student_tracker'
application-label-mn:'guitar_student_tracker'
application-label-mr:'guitar_student_tracker'
application-label-ms:'guitar_student_tracker'
application-label-my:'guitar_student_tracker'
application-label-nb:'guitar_student_tracker'
application-label-ne:'guitar_student_tracker'
application-label-nl:'guitar_student_tracker'
application-label-pa:'guitar_student_tracker'
application-label-pl:'guitar_student_tracker'
application-label-pt:'guitar_student_tracker'
application-label-pt-BR:'guitar_student_tracker'
application-label-pt-PT:'guitar_student_tracker'
application-label-ro:'guitar_student_tracker'
application-label-ru:'guitar_student_tracker'
application-label-si:'guitar_student_tracker'
application-label-sk:'guitar_student_tracker'
application-label-sl:'guitar_student_tracker'
application-label-sq:'guitar_student_tracker'
application-label-sr:'guitar_student_tracker'
application-label-sr-Latn:'guitar_student_tracker'
application-label-sv:'guitar_student_tracker'
application-label-sw:'guitar_student_tracker'
application-label-ta:'guitar_student_tracker'
application-label-te:'guitar_student_tracker'
application-label-th:'guitar_student_tracker'
application-label-tl:'guitar_student_tracker'
application-label-tr:'guitar_student_tracker'
application-label-uk:'guitar_student_tracker'
application-label-ur:'guitar_student_tracker'
application-label-uz:'guitar_student_tracker'
application-label-vi:'guitar_student_tracker'
application-label-zh-CN:'guitar_student_tracker'
application-label-zh-HK:'guitar_student_tracker'
application-label-zh-TW:'guitar_student_tracker'
application-label-zu:'guitar_student_tracker'
application-icon-160:'res/mipmap-mdpi-v4/ic_launcher.png'
application-icon-240:'res/mipmap-hdpi-v4/ic_launcher.png'
application-icon-320:'res/mipmap-xhdpi-v4/ic_launcher.png'
application-icon-480:'res/mipmap-xxhdpi-v4/ic_launcher.png'
application-icon-640:'res/mipmap-xxxhdpi-v4/ic_launcher.png'
application: label='guitar_student_tracker' icon='res/mipmap-mdpi-v4/ic_launcher.png'
application-debuggable
launchable-activity: name='com.example.guitarstudenttracker.MainActivity' label='' icon=''
feature-group: label=''
uses-gl-es: '0x20000'
uses-feature: name='android.hardware.faketouch'
uses-implied-feature: name='android.hardware.faketouch' reason='default feature for all apps'
main
other-activities
supports-screens: 'small' 'normal' 'large' 'xlarge'
supports-any-density: 'true'
locales: '--_--' 'af' 'am' 'ar' 'az' 'be' 'bg' 'bn' 'bs' 'ca' 'cs' 'da' 'de' 'el' 'en-AU' 'en-GB' 'en-IN' 'es' 'es-US' 'et' 'eu' 'fa' 'fi' 'fr' 'fr-CA' 'gl' 'gu' 'hi' 'hr' 'hu' 'hy' 'in' 'is' 'it' 'iw' 'ja' 'ka' 'kk' '
km' 'kn' 'ko' 'ky' 'lo' 'lt' 'lv' 'mk' 'ml' 'mn' 'mr' 'ms' 'my' 'nb' 'ne' 'nl' 'pa' 'pl' 'pt' 'pt-BR' 'pt-PT' 'ro' 'ru' 'si' 'sk' 'sl' 'sq' 'sr' 'sr-Latn' 'sv' 'sw' 'ta' 'te' 'th' 'tl' 'tr' 'uk' 'ur' 'uz' 'vi' 'zh-CN' 'zh-HK' 'zh
-TW' 'zu'
densities: '160' '240' '320' '480' '640'
native-code: 'arm64-v8a' 'x86' 'x86_64'
[ +1 ms] Stopping app 'app.apk' on Pixel XL.
[ ] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell am force-stop com.example.guitarstudenttracker
[ +271 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell pm list packages com.example.guitarstudenttracker
[ +131 ms] package:com.example.guitarstudenttracker
[ +2 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell cat /data/local/tmp/sky.com.example.guitarstudenttracker.sha1
[ +128 ms] 571c14499f0edade24cc0ec188e907d6a4735f25
[ +1 ms] Installing APK.
[ +2 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb version
[ +153 ms] Android Debug Bridge version 1.0.40
Version 4797878
Installed as C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb.EXE
[ ] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb start-server
[ +217 ms] Installing build\app\outputs\apk\app.apk...
[ ] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 install -r build\app\outputs\apk\app.apk
[+7024 ms] Success
[ +1 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell echo -n ce373916fd64d2f3c5293455f6d34f96895df215 > /data/local/tmp/sky.com.example.guitarstudenttracker.sha1
[ +114 ms] Pixel XL startApp
[ +2 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell am start -a android.intent.action.RUN -f 0x20000000 --ez enable-background-compilation true --ez enable-dart-profiling true --ez enable-
checked-mode true com.example.guitarstudenttracker/com.example.guitarstudenttracker.MainActivity
[ +152 ms] Starting: Intent { act=android.intent.action.RUN flg=0x20000000 cmp=com.example.guitarstudenttracker/.MainActivity (has extras) }
[ ] Waiting for observatory port to be available...
[ +982 ms] I/FlutterActivityDelegate( 6562): onResume setting current activity to this
[ +187 ms] Observatory URL on device: http://127.0.0.1:38780/
[ +6 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 forward tcp:8100 tcp:38780
[ +86 ms] Forwarded host port 8100 to device port 38780 for Observatory
[ +7 ms] Connecting to service protocol: http://127.0.0.1:8100/
[ +436 ms] Successfully connected to service protocol: http://127.0.0.1:8100/
[ +3 ms] getVM: {}
[ +18 ms] getIsolate: {isolateId: isolates/680595021}
[ +2 ms] _flutter.listViews: {isolateId: isolates/680595021}
[ +61 ms] DevFS: Creating new filesystem on the device (null)
[ ] _createDevFS: {fsName: Guitar-Student-Tracker}
[ +74 ms] DevFS: Created new filesystem on the device (file:///data/user/0/com.example.guitarstudenttracker/cache/Guitar-Student-TrackerRVDOHP/Guitar-Student-Tracker/)
[ +2 ms] Updating assets
[ +266 ms] Syncing files to device Pixel XL...
[ +8 ms] DevFS: Starting sync from LocalDirectory: 'C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker'
[ ] Scanning project files
[ +4 ms] Scanning package files
[ +88 ms] Scanning asset files
[ ] Scanning for deleted files
[ +35 ms] Compiling dart to kernel with 444 updated files
[ +4 ms] C:\flutter\bin\cache\dart-sdk\bin\dart C:\flutter\bin\cache\artifacts\engine\windows-x64\frontend_server.dart.snapshot --sdk-root C:\flutter\bin\cache\artifacts\engine\common\flutter_patched_sdk/ --incremental --strong
--target=flutter --output-dill build\app.dill --packages C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\.packages --filesystem-scheme org-dartlang-root
[+4067 ms] Updating files
[ +879 ms] DevFS: Sync finished
[ ] Synced 1.4MB.
[ +2 ms] _flutter.listViews: {isolateId: isolates/680595021}
[ +12 ms] Connected to _flutterView/0x7b10a36c18.
[ +1 ms] 🔥 To hot reload changes while running, press "r". To hot restart (and rebuild state), press "R".
[ +1 ms] An Observatory debugger and profiler on Pixel XL is available at: http://127.0.0.1:8100/
[ ] For a more detailed help message, press "h". To quit, press "q".
[+33515 ms] I/flutter ( 6562): ══╡ EXCEPTION CAUGHT BY RENDERING LIBRARY ╞═════════════════════════════════════════════════════════
[ +7 ms] I/flutter ( 6562): The following message was thrown during layout:
[ +1 ms] I/flutter ( 6562): A RenderFlex overflowed by 49 pixels on the right.
[ +26 ms] I/flutter ( 6562):
[ +1 ms] I/flutter ( 6562): The overflowing RenderFlex has an orientation of Axis.horizontal.
[ ] I/flutter ( 6562): The edge of the RenderFlex that is overflowing has been marked in the rendering with a yellow and
[ ] I/flutter ( 6562): black striped pattern. This is usually caused by the contents being too big for the RenderFlex.
[ ] I/flutter ( 6562): Consider applying a flex factor (e.g. using an Expanded widget) to force the children of the
[ ] I/flutter ( 6562): RenderFlex to fit within the available space instead of being sized to their natural size.
[ ] I/flutter ( 6562): This is considered an error condition because it indicates that there is content that cannot be
[ ] I/flutter ( 6562): seen. If the content is legitimately bigger than the available space, consider clipping it with a
[ ] I/flutter ( 6562): ClipRect widget before putting it in the flex, or using a scrollable container rather than a Flex,
[ ] I/flutter ( 6562): like a ListView.
[ ] I/flutter ( 6562): The specific RenderFlex in question is:
[ ] I/flutter ( 6562): RenderFlex#9f4e4 relayoutBoundary=up1 OVERFLOWING
[ ] I/flutter ( 6562): creator: Row ← IconTheme ← Builder ← Center ← Padding ← Container ← IconTheme ← Builder ← Listener
[ +58 ms] I/flutter ( 6562): ← _GestureSemantics ← RawGestureDetector ← GestureDetector ← ⋯
[ +2 ms] I/flutter ( 6562): parentData: offset=Offset(0.0, 12.0) (can use size)
[ ] I/flutter ( 6562): constraints: BoxConstraints(0.0<=w<=110.4, 0.0<=h<=48.0)
[ ] I/flutter ( 6562): size: Size(110.4, 24.0)
[ ] I/flutter ( 6562): direction: horizontal
[ ] I/flutter ( 6562): mainAxisAlignment: start
[ ] I/flutter ( 6562): mainAxisSize: min
[ ] I/flutter ( 6562): crossAxisAlignment: center
[ ] I/flutter ( 6562): textDirection: ltr
[ ] I/flutter ( 6562): verticalDirection: down
[ ] I/flutter ( 6562): ◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤
[ ] I/flutter ( 6562): ════════════════════════════════════════════════════════════════════════════════════════════════════
```
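The RenderFlex overflow in the log above suggests its own fix: give the overflowing child of the horizontal `Row` a flex factor. A minimal sketch of that fix (the widget names and label text here are illustrative, not taken from the project's source):

```dart
import 'package:flutter/material.dart';

// Sketch only: wrapping the Row's text child in Expanded makes it shrink to
// the available width instead of overflowing past the 110.4px constraint
// reported in the error.
Widget buildLabel() {
  return Row(
    children: <Widget>[
      const Icon(Icons.person),
      // Expanded forces this child to fit the remaining horizontal space;
      // TextOverflow.ellipsis truncates gracefully instead of painting the
      // yellow/black overflow stripes.
      const Expanded(
        child: Text(
          'A student name that may be too long',
          overflow: TextOverflow.ellipsis,
        ),
      ),
    ],
  );
}
```

Alternatively, as the error text notes, the content can be wrapped in a scrollable container (e.g. a `ListView`) if it is legitimately wider than the available space.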
<!--
Run `flutter analyze` and attach any output of that command below.
If there are any analysis errors, try resolving them before filing this issue.
-->
```
C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker>flutter analyze
Analyzing Guitar-Student-Tracker...
info - Unused import: 'package:flutter/material.dart' - lib\addNewLessonPage.dart:1:8
info - Unused import: 'package:google_places_picker/google_places_picker.dart' - lib\addNewStudentPage.dart:3:8
info - Name non-constant identifiers using lowerCamelCase - lib\bottomBar.dart:4:8
info - Unused import: 'package:guitar_student_tracker/addNewLessonPage.dart' - lib\calendarPage.dart:2:8
info - Unused import: 'package:guitar_student_tracker/calendarPage.dart' - lib\main.dart:5:8
info - Unused import: 'package:guitar_student_tracker/globals.dart' - lib\main.dart:6:8
6 issues found. (ran in 23.4s)
```
<!-- Finally, paste the output of running `flutter doctor -v` here. -->
```
C:\flutter\bin\flutter.bat --no-color doctor
Doctor summary (to see all details, run flutter doctor -v):
[√] Flutter (Channel beta, v0.5.1, on Microsoft Windows [Version 10.0.17134.112], locale en-US)
[√] Android toolchain - develop for Android devices (Android SDK 28.0.0)
[√] Android Studio (version 3.1)
[√] Android Studio (version 3.2)
X Flutter plugin not installed; this adds Flutter specific functionality.
X Dart plugin not installed; this adds Dart specific functionality.
[√] Connected devices (2 available)
• No issues found!
Process finished with exit code 0
```
Docked extended FAB text gets clipped when returning to the page - <!-- Thank you for using Flutter!
If you are looking for support, please check out our documentation
or consider asking a question on Stack Overflow:
* https://flutter.io/
* https://docs.flutter.io/
* https://stackoverflow.com/questions/tagged/flutter?sort=frequent
If you have found a bug or if our documentation doesn't have an answer
to what you're looking for, then fill out the template below. Please read
our guide to filing a bug first: https://flutter.io/bug-reports/
-->
## Steps to Reproduce
When navigating from a screen with a center-docked extended FAB to another screen whose right-docked extended FAB has a shorter label, the text at the right end of the first FAB gets clipped when returning to the first screen. See the imgur video below:
http://imgur.com/gallery/DQP8Vh7
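The setup described above can be sketched roughly as follows (hypothetical page widgets and labels; a sketch of the reported configuration, not the project's actual source):

```dart
import 'package:flutter/material.dart';

// First page: center-docked extended FAB with a longer label.
class FirstPage extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      floatingActionButtonLocation: FloatingActionButtonLocation.centerDocked,
      floatingActionButton: FloatingActionButton.extended(
        icon: const Icon(Icons.add),
        label: const Text('Add a new student'),
        onPressed: () => Navigator.push(
              context,
              MaterialPageRoute(builder: (_) => SecondPage()),
            ),
      ),
      bottomNavigationBar: const BottomAppBar(child: SizedBox(height: 48.0)),
      body: const Center(child: Text('First page')),
    );
  }
}

// Second page: end-docked extended FAB with a shorter label. Popping back to
// FirstPage is where the right end of its FAB label appears clipped.
class SecondPage extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      floatingActionButtonLocation: FloatingActionButtonLocation.endDocked,
      floatingActionButton: FloatingActionButton.extended(
        icon: const Icon(Icons.save),
        label: const Text('Save'),
        onPressed: () => Navigator.pop(context),
      ),
      bottomNavigationBar: const BottomAppBar(child: SizedBox(height: 48.0)),
      body: const Center(child: Text('Second page')),
    );
  }
}
```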
## Logs
<!--
Run your application with `flutter run --verbose` and attach all the
log output below between the lines with the backticks. If there is an
exception, please see if the error message includes enough information
to explain how to solve the issue.
-->
```
C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker>flutter run --verbose
[ +46 ms] [C:\flutter\] git rev-parse --abbrev-ref --symbolic @{u}
[ +241 ms] Exit code 0 from: git rev-parse --abbrev-ref --symbolic @{u}
[ ] origin/beta
[ ] [C:\flutter\] git rev-parse --abbrev-ref HEAD
[ +244 ms] Exit code 0 from: git rev-parse --abbrev-ref HEAD
[ ] beta
[ ] [C:\flutter\] git ls-remote --get-url origin
[ +244 ms] Exit code 0 from: git ls-remote --get-url origin
[ ] https://github.com/flutter/flutter.git
[ ] [C:\flutter\] git log -n 1 --pretty=format:%H
[ +204 ms] Exit code 0 from: git log -n 1 --pretty=format:%H
[ ] c7ea3ca377e909469c68f2ab878a5bc53d3cf66b
[ ] [C:\flutter\] git log -n 1 --pretty=format:%ar
[ +168 ms] Exit code 0 from: git log -n 1 --pretty=format:%ar
[ ] 2 months ago
[ +1 ms] [C:\flutter\] git describe --match v*.*.* --first-parent --long --tags
[ +175 ms] Exit code 0 from: git describe --match v*.*.* --first-parent --long --tags
[ ] v0.5.1-0-gc7ea3ca37
[+1152 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb devices -l
[ +135 ms] Exit code 0 from: C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb devices -l
[ ] List of devices attached
HT74J0200552 device product:marlin model:Pixel_XL device:marlin transport_id:2
[ +543 ms] Found plugin google_places_picker at C:\flutter\.pub-cache\hosted\pub.dartlang.org\google_places_picker-0.0.4\
[ +570 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell getprop
[ +167 ms] ro.hardware = marlin
[ ] ro.build.characteristics = nosdcard
[+6176 ms] Launching lib/main.dart on Pixel XL in debug mode...
[ +28 ms] Initializing gradle...
[ +1 ms] Using gradle from C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\gradlew.bat.
[+1234 ms] C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\gradlew.bat -v
[+28800 ms]
------------------------------------------------------------
Gradle 4.4
------------------------------------------------------------
Build time: 2017-12-06 09:05:06 UTC
Revision: cf7821a6f79f8e2a598df21780e3ff7ce8db2b82
Groovy: 2.4.12
Ant: Apache Ant(TM) version 1.9.9 compiled on February 2 2017
JVM: 1.8.0_152-release (JetBrains s.r.o 25.152-b04)
OS: Windows 10 10.0 amd64
[ +11 ms] Resolving dependencies...
[ +1 ms] [android\] C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\gradlew.bat app:properties
[+131481 ms] Starting a Gradle Daemon, 1 incompatible and 1 stopped Daemons could not be reused, use --status for details
:app:properties
------------------------------------------------------------
Project :app
------------------------------------------------------------
allprojects: [project ':app']
android: com.android.build.gradle.AppExtension_Decorated@378812f6
androidDependencies: task ':app:androidDependencies'
ant: org.gradle.api.internal.project.DefaultAntBuilder@2113d18
antBuilderFactory: org.gradle.api.internal.project.DefaultAntBuilderFactory@b4b4aee
archivesBaseName: app
artifacts: org.gradle.api.internal.artifacts.dsl.DefaultArtifactHandler_Decorated@4261b282
asDynamicObject: DynamicObject for project ':app'
assemble: task ':app:assemble'
assembleAndroidTest: task ':app:assembleAndroidTest'
assembleDebug: task ':app:assembleDebug'
assembleDebugAndroidTest: task ':app:assembleDebugAndroidTest'
assembleDebugUnitTest: task ':app:assembleDebugUnitTest'
assembleProfile: task ':app:assembleProfile'
assembleProfileUnitTest: task ':app:assembleProfileUnitTest'
assembleRelease: task ':app:assembleRelease'
assembleReleaseUnitTest: task ':app:assembleReleaseUnitTest'
baseClassLoaderScope: org.gradle.api.internal.initialization.DefaultClassLoaderScope@3509079a
buildDependents: task ':app:buildDependents'
buildDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app
buildFile: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\app\build.gradle
buildNeeded: task ':app:buildNeeded'
buildOutputs: BaseVariantOutput container
buildPath: :
buildScriptSource: org.gradle.groovy.scripts.TextResourceScriptSource@3e7b05de
buildscript: org.gradle.api.internal.initialization.DefaultScriptHandler@6f3634c5
bundleAppClassesDebug: task ':app:bundleAppClassesDebug'
bundleAppClassesDebugAndroidTest: task ':app:bundleAppClassesDebugAndroidTest'
bundleAppClassesDebugUnitTest: task ':app:bundleAppClassesDebugUnitTest'
bundleAppClassesProfile: task ':app:bundleAppClassesProfile'
bundleAppClassesProfileUnitTest: task ':app:bundleAppClassesProfileUnitTest'
bundleAppClassesRelease: task ':app:bundleAppClassesRelease'
bundleAppClassesReleaseUnitTest: task ':app:bundleAppClassesReleaseUnitTest'
bundleDebugAndroidTestResources: task ':app:bundleDebugAndroidTestResources'
bundleDebugResources: task ':app:bundleDebugResources'
bundleProfileResources: task ':app:bundleProfileResources'
bundleReleaseResources: task ':app:bundleReleaseResources'
check: task ':app:check'
checkDebugManifest: task ':app:checkDebugManifest'
checkProfileManifest: task ':app:checkProfileManifest'
checkReleaseManifest: task ':app:checkReleaseManifest'
childProjects: {}
class: class org.gradle.api.internal.project.DefaultProject_Decorated
classLoaderScope: org.gradle.api.internal.initialization.DefaultClassLoaderScope@208c7e03
cleanBuildCache: task ':app:cleanBuildCache'
compileDebugAidl: task ':app:compileDebugAidl'
compileDebugAndroidTestAidl: task ':app:compileDebugAndroidTestAidl'
compileDebugAndroidTestJavaWithJavac: task ':app:compileDebugAndroidTestJavaWithJavac'
compileDebugAndroidTestNdk: task ':app:compileDebugAndroidTestNdk'
compileDebugAndroidTestRenderscript: task ':app:compileDebugAndroidTestRenderscript'
compileDebugAndroidTestShaders: task ':app:compileDebugAndroidTestShaders'
compileDebugAndroidTestSources: task ':app:compileDebugAndroidTestSources'
compileDebugJavaWithJavac: task ':app:compileDebugJavaWithJavac'
compileDebugNdk: task ':app:compileDebugNdk'
compileDebugRenderscript: task ':app:compileDebugRenderscript'
compileDebugShaders: task ':app:compileDebugShaders'
compileDebugSources: task ':app:compileDebugSources'
compileDebugUnitTestJavaWithJavac: task ':app:compileDebugUnitTestJavaWithJavac'
compileDebugUnitTestSources: task ':app:compileDebugUnitTestSources'
compileLint: task ':app:compileLint'
compileProfileAidl: task ':app:compileProfileAidl'
compileProfileJavaWithJavac: task ':app:compileProfileJavaWithJavac'
compileProfileNdk: task ':app:compileProfileNdk'
compileProfileRenderscript: task ':app:compileProfileRenderscript'
compileProfileShaders: task ':app:compileProfileShaders'
compileProfileSources: task ':app:compileProfileSources'
compileProfileUnitTestJavaWithJavac: task ':app:compileProfileUnitTestJavaWithJavac'
compileProfileUnitTestSources: task ':app:compileProfileUnitTestSources'
compileReleaseAidl: task ':app:compileReleaseAidl'
compileReleaseJavaWithJavac: task ':app:compileReleaseJavaWithJavac'
compileReleaseNdk: task ':app:compileReleaseNdk'
compileReleaseRenderscript: task ':app:compileReleaseRenderscript'
compileReleaseShaders: task ':app:compileReleaseShaders'
compileReleaseSources: task ':app:compileReleaseSources'
compileReleaseUnitTestJavaWithJavac: task ':app:compileReleaseUnitTestJavaWithJavac'
compileReleaseUnitTestSources: task ':app:compileReleaseUnitTestSources'
components: SoftwareComponentInternal set
configurationActions: org.gradle.configuration.project.DefaultProjectConfigurationActionContainer@5dc4885f
configurationTargetIdentifier: org.gradle.configuration.ConfigurationTargetIdentifier$1@3bf733cc
configurations: configuration container
connectedAndroidTest: task ':app:connectedAndroidTest'
connectedCheck: task ':app:connectedCheck'
connectedDebugAndroidTest: task ':app:connectedDebugAndroidTest'
consumeConfigAttr: task ':app:consumeConfigAttr'
convention: org.gradle.api.internal.plugins.DefaultConvention@6539a86b
copyFlutterAssetsDebug: task ':app:copyFlutterAssetsDebug'
copyFlutterAssetsProfile: task ':app:copyFlutterAssetsProfile'
copyFlutterAssetsRelease: task ':app:copyFlutterAssetsRelease'
createDebugCompatibleScreenManifests: task ':app:createDebugCompatibleScreenManifests'
createProfileCompatibleScreenManifests: task ':app:createProfileCompatibleScreenManifests'
createReleaseCompatibleScreenManifests: task ':app:createReleaseCompatibleScreenManifests'
defaultArtifacts: org.gradle.api.internal.plugins.DefaultArtifactPublicationSet_Decorated@2686aea9
defaultTasks: []
deferredProjectConfiguration: org.gradle.api.internal.project.DeferredProjectConfiguration@5ff95c56
dependencies: org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler_Decorated@5d927d36
depth: 1
description: null
deviceAndroidTest: task ':app:deviceAndroidTest'
deviceCheck: task ':app:deviceCheck'
displayName: project ':app'
distsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\distributions
distsDirName: distributions
docsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\docs
docsDirName: docs
ext: org.gradle.api.internal.plugins.DefaultExtraPropertiesExtension@75c0ec1f
extensions: org.gradle.api.internal.plugins.DefaultConvention@6539a86b
extractProguardFiles: task ':app:extractProguardFiles'
fileOperations: org.gradle.api.internal.file.DefaultFileOperations@6620e7e8
fileResolver: org.gradle.api.internal.file.BaseDirFileResolver@732e0c96
flutter: FlutterExtension_Decorated@3d1ba408
flutterBuildDebug: task ':app:flutterBuildDebug'
flutterBuildProfile: task ':app:flutterBuildProfile'
flutterBuildRelease: task ':app:flutterBuildRelease'
flutterBuildX86Jar: task ':app:flutterBuildX86Jar'
generateDebugAndroidTestAssets: task ':app:generateDebugAndroidTestAssets'
generateDebugAndroidTestBuildConfig: task ':app:generateDebugAndroidTestBuildConfig'
generateDebugAndroidTestResValues: task ':app:generateDebugAndroidTestResValues'
generateDebugAndroidTestResources: task ':app:generateDebugAndroidTestResources'
generateDebugAndroidTestSources: task ':app:generateDebugAndroidTestSources'
generateDebugAssets: task ':app:generateDebugAssets'
generateDebugBuildConfig: task ':app:generateDebugBuildConfig'
generateDebugResValues: task ':app:generateDebugResValues'
generateDebugResources: task ':app:generateDebugResources'
generateDebugSources: task ':app:generateDebugSources'
generateProfileAssets: task ':app:generateProfileAssets'
generateProfileBuildConfig: task ':app:generateProfileBuildConfig'
generateProfileResValues: task ':app:generateProfileResValues'
generateProfileResources: task ':app:generateProfileResources'
generateProfileSources: task ':app:generateProfileSources'
generateReleaseAssets: task ':app:generateReleaseAssets'
generateReleaseBuildConfig: task ':app:generateReleaseBuildConfig'
generateReleaseResValues: task ':app:generateReleaseResValues'
generateReleaseResources: task ':app:generateReleaseResources'
generateReleaseSources: task ':app:generateReleaseSources'
gradle: build 'android'
group: android
identityPath: :app
inheritedScope: org.gradle.api.internal.ExtensibleDynamicObject$InheritedDynamicObject@4f4907a0
installDebug: task ':app:installDebug'
installDebugAndroidTest: task ':app:installDebugAndroidTest'
installProfile: task ':app:installProfile'
installRelease: task ':app:installRelease'
javaPreCompileDebug: task ':app:javaPreCompileDebug'
javaPreCompileDebugAndroidTest: task ':app:javaPreCompileDebugAndroidTest'
javaPreCompileDebugUnitTest: task ':app:javaPreCompileDebugUnitTest'
javaPreCompileProfile: task ':app:javaPreCompileProfile'
javaPreCompileProfileUnitTest: task ':app:javaPreCompileProfileUnitTest'
javaPreCompileRelease: task ':app:javaPreCompileRelease'
javaPreCompileReleaseUnitTest: task ':app:javaPreCompileReleaseUnitTest'
layout: org.gradle.api.internal.file.DefaultProjectLayout@2eb7b606
libsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\libs
libsDirName: libs
lint: task ':app:lint'
lintDebug: task ':app:lintDebug'
lintProfile: task ':app:lintProfile'
lintRelease: task ':app:lintRelease'
lintVitalRelease: task ':app:lintVitalRelease'
logger: org.gradle.internal.logging.slf4j.OutputEventListenerBackedLogger@2d7e55fa
logging: org.gradle.internal.logging.services.DefaultLoggingManager@4d258fb9
mainApkListPersistenceDebug: task ':app:mainApkListPersistenceDebug'
mainApkListPersistenceDebugAndroidTest: task ':app:mainApkListPersistenceDebugAndroidTest'
mainApkListPersistenceProfile: task ':app:mainApkListPersistenceProfile'
mainApkListPersistenceRelease: task ':app:mainApkListPersistenceRelease'
mergeDebugAndroidTestAssets: task ':app:mergeDebugAndroidTestAssets'
mergeDebugAndroidTestJniLibFolders: task ':app:mergeDebugAndroidTestJniLibFolders'
mergeDebugAndroidTestResources: task ':app:mergeDebugAndroidTestResources'
mergeDebugAndroidTestShaders: task ':app:mergeDebugAndroidTestShaders'
mergeDebugAssets: task ':app:mergeDebugAssets'
mergeDebugJniLibFolders: task ':app:mergeDebugJniLibFolders'
mergeDebugResources: task ':app:mergeDebugResources'
mergeDebugShaders: task ':app:mergeDebugShaders'
mergeProfileAssets: task ':app:mergeProfileAssets'
mergeProfileJniLibFolders: task ':app:mergeProfileJniLibFolders'
mergeProfileResources: task ':app:mergeProfileResources'
mergeProfileShaders: task ':app:mergeProfileShaders'
mergeReleaseAssets: task ':app:mergeReleaseAssets'
mergeReleaseJniLibFolders: task ':app:mergeReleaseJniLibFolders'
mergeReleaseResources: task ':app:mergeReleaseResources'
mergeReleaseShaders: task ':app:mergeReleaseShaders'
mockableAndroidJar: task ':app:mockableAndroidJar'
modelRegistry: org.gradle.model.internal.registry.DefaultModelRegistry@4a56f5ba
modelSchemaStore: org.gradle.model.internal.manage.schema.extract.DefaultModelSchemaStore@59427d70
module: org.gradle.api.internal.artifacts.ProjectBackedModule@2625c26
name: app
normalization: org.gradle.normalization.internal.DefaultInputNormalizationHandler_Decorated@64c2227
objects: org.gradle.api.internal.model.DefaultObjectFactory@67752d10
org.gradle.jvmargs: -Xmx1536M
packageDebug: task ':app:packageDebug'
packageDebugAndroidTest: task ':app:packageDebugAndroidTest'
packageProfile: task ':app:packageProfile'
packageRelease: task ':app:packageRelease'
parent: root project 'android'
parentIdentifier: root project 'android'
path: :app
platformAttrExtractor: task ':app:platformAttrExtractor'
pluginManager: org.gradle.api.internal.plugins.DefaultPluginManager_Decorated@1a73ef61
plugins: [org.gradle.api.plugins.HelpTasksPlugin@53d57a54, com.android.build.gradle.api.AndroidBasePlugin@768ed0ad, org.gradle.language.base.plugins.LifecycleBasePlugin@6a5fc451, org.gradle.api.plugins.BasePlugin@4fddcc94, org.gradle.api.plugins.ReportingBasePlugin@2af1694d, org.gradle.platform.base.plugins.ComponentBasePlugin@697f1ec3, org.gradle.language.base.plugins.LanguageBasePlugin@26c95273, org.gradle.platform.base.plugins.BinaryBasePlugin@5b9560a6, org.gradle.api.plugins.JavaBasePlugin@1c6bd387, com.android.build.gradle.AppPlugin@949683, FlutterPlugin@7eae9d42]
preBuild: task ':app:preBuild'
preDebugAndroidTestBuild: task ':app:preDebugAndroidTestBuild'
preDebugBuild: task ':app:preDebugBuild'
preDebugUnitTestBuild: task ':app:preDebugUnitTestBuild'
preProfileBuild: task ':app:preProfileBuild'
preProfileUnitTestBuild: task ':app:preProfileUnitTestBuild'
preReleaseBuild: task ':app:preReleaseBuild'
preReleaseUnitTestBuild: task ':app:preReleaseUnitTestBuild'
prepareLintJar: task ':app:prepareLintJar'
preparePUBLISHED_DEXDebugAndroidTestForPublishing: task ':app:preparePUBLISHED_DEXDebugAndroidTestForPublishing'
preparePUBLISHED_DEXDebugForPublishing: task ':app:preparePUBLISHED_DEXDebugForPublishing'
preparePUBLISHED_DEXProfileForPublishing: task ':app:preparePUBLISHED_DEXProfileForPublishing'
preparePUBLISHED_DEXReleaseForPublishing: task ':app:preparePUBLISHED_DEXReleaseForPublishing'
preparePUBLISHED_JAVA_RESDebugAndroidTestForPublishing: task ':app:preparePUBLISHED_JAVA_RESDebugAndroidTestForPublishing'
preparePUBLISHED_JAVA_RESDebugForPublishing: task ':app:preparePUBLISHED_JAVA_RESDebugForPublishing'
preparePUBLISHED_JAVA_RESProfileForPublishing: task ':app:preparePUBLISHED_JAVA_RESProfileForPublishing'
preparePUBLISHED_JAVA_RESReleaseForPublishing: task ':app:preparePUBLISHED_JAVA_RESReleaseForPublishing'
preparePUBLISHED_NATIVE_LIBSDebugAndroidTestForPublishing: task ':app:preparePUBLISHED_NATIVE_LIBSDebugAndroidTestForPublishing'
preparePUBLISHED_NATIVE_LIBSDebugForPublishing: task ':app:preparePUBLISHED_NATIVE_LIBSDebugForPublishing'
preparePUBLISHED_NATIVE_LIBSProfileForPublishing: task ':app:preparePUBLISHED_NATIVE_LIBSProfileForPublishing'
preparePUBLISHED_NATIVE_LIBSReleaseForPublishing: task ':app:preparePUBLISHED_NATIVE_LIBSReleaseForPublishing'
processDebugAndroidTestJavaRes: task ':app:processDebugAndroidTestJavaRes'
processDebugAndroidTestManifest: task ':app:processDebugAndroidTestManifest'
processDebugAndroidTestResources: task ':app:processDebugAndroidTestResources'
processDebugJavaRes: task ':app:processDebugJavaRes'
processDebugManifest: task ':app:processDebugManifest'
processDebugResources: task ':app:processDebugResources'
processDebugUnitTestJavaRes: task ':app:processDebugUnitTestJavaRes'
processOperations: org.gradle.api.internal.file.DefaultFileOperations@6620e7e8
processProfileJavaRes: task ':app:processProfileJavaRes'
processProfileManifest: task ':app:processProfileManifest'
processProfileResources: task ':app:processProfileResources'
processProfileUnitTestJavaRes: task ':app:processProfileUnitTestJavaRes'
processReleaseJavaRes: task ':app:processReleaseJavaRes'
processReleaseManifest: task ':app:processReleaseManifest'
processReleaseResources: task ':app:processReleaseResources'
processReleaseUnitTestJavaRes: task ':app:processReleaseUnitTestJavaRes'
project: project ':app'
projectConfigurator: org.gradle.api.internal.project.BuildOperationCrossProjectConfigurator@5090b8a2
projectDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\app
projectEvaluationBroadcaster: ProjectEvaluationListener broadcast
projectEvaluator: org.gradle.configuration.project.LifecycleProjectEvaluator@372af58d
projectPath: :app
projectRegistry: org.gradle.api.internal.project.DefaultProjectRegistry@797953d5
properties: {...}
providers: org.gradle.api.internal.provider.DefaultProviderFactory@7cf4a37c
reportBuildArtifactsDebug: task ':app:reportBuildArtifactsDebug'
reportBuildArtifactsProfile: task ':app:reportBuildArtifactsProfile'
reportBuildArtifactsRelease: task ':app:reportBuildArtifactsRelease'
reporting: org.gradle.api.reporting.ReportingExtension_Decorated@297451e8
reportsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\reports
repositories: repository container
resolveConfigAttr: task ':app:resolveConfigAttr'
resourceLoader: org.gradle.internal.resource.transfer.DefaultUriTextResourceLoader@49125fd6
resources: org.gradle.api.internal.resources.DefaultResourceHandler@7ac64ada
rootDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android
rootProject: root project 'android'
script: false
scriptHandlerFactory: org.gradle.api.internal.initialization.DefaultScriptHandlerFactory@23f0e9bd
scriptPluginFactory: org.gradle.configuration.ScriptPluginFactorySelector@2800173a
serviceRegistryFactory: org.gradle.internal.service.scopes.ProjectScopeServices$4@73370103
services: ProjectScopeServices
signingReport: task ':app:signingReport'
sourceCompatibility: 1.8
sourceSets: SourceSet container
splitsDiscoveryTaskDebug: task ':app:splitsDiscoveryTaskDebug'
splitsDiscoveryTaskProfile: task ':app:splitsDiscoveryTaskProfile'
splitsDiscoveryTaskRelease: task ':app:splitsDiscoveryTaskRelease'
standardOutputCapture: org.gradle.internal.logging.services.DefaultLoggingManager@4d258fb9
state: project state 'EXECUTED'
status: integration
subprojects: []
targetCompatibility: 1.8
tasks: task set
test: task ':app:test'
testDebugUnitTest: task ':app:testDebugUnitTest'
testProfileUnitTest: task ':app:testProfileUnitTest'
testReleaseUnitTest: task ':app:testReleaseUnitTest'
testReportDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\reports\tests
testReportDirName: tests
testResultsDir: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\test-results
testResultsDirName: test-results
transformClassesWithDexBuilderForDebug: task ':app:transformClassesWithDexBuilderForDebug'
transformClassesWithDexBuilderForDebugAndroidTest: task ':app:transformClassesWithDexBuilderForDebugAndroidTest'
transformClassesWithDexBuilderForProfile: task ':app:transformClassesWithDexBuilderForProfile'
transformClassesWithDexBuilderForRelease: task ':app:transformClassesWithDexBuilderForRelease'
transformDexArchiveWithDexMergerForDebug: task ':app:transformDexArchiveWithDexMergerForDebug'
transformDexArchiveWithDexMergerForDebugAndroidTest: task ':app:transformDexArchiveWithDexMergerForDebugAndroidTest'
transformDexArchiveWithDexMergerForProfile: task ':app:transformDexArchiveWithDexMergerForProfile'
transformDexArchiveWithDexMergerForRelease: task ':app:transformDexArchiveWithDexMergerForRelease'
transformDexArchiveWithExternalLibsDexMergerForDebug: task ':app:transformDexArchiveWithExternalLibsDexMergerForDebug'
transformDexArchiveWithExternalLibsDexMergerForDebugAndroidTest: task ':app:transformDexArchiveWithExternalLibsDexMergerForDebugAndroidTest'
transformDexArchiveWithExternalLibsDexMergerForProfile: task ':app:transformDexArchiveWithExternalLibsDexMergerForProfile'
transformDexArchiveWithExternalLibsDexMergerForRelease: task ':app:transformDexArchiveWithExternalLibsDexMergerForRelease'
transformNativeLibsWithMergeJniLibsForDebug: task ':app:transformNativeLibsWithMergeJniLibsForDebug'
transformNativeLibsWithMergeJniLibsForDebugAndroidTest: task ':app:transformNativeLibsWithMergeJniLibsForDebugAndroidTest'
transformNativeLibsWithMergeJniLibsForProfile: task ':app:transformNativeLibsWithMergeJniLibsForProfile'
transformNativeLibsWithMergeJniLibsForRelease: task ':app:transformNativeLibsWithMergeJniLibsForRelease'
transformResourcesWithMergeJavaResForDebug: task ':app:transformResourcesWithMergeJavaResForDebug'
transformResourcesWithMergeJavaResForDebugAndroidTest: task ':app:transformResourcesWithMergeJavaResForDebugAndroidTest'
transformResourcesWithMergeJavaResForDebugUnitTest: task ':app:transformResourcesWithMergeJavaResForDebugUnitTest'
transformResourcesWithMergeJavaResForProfile: task ':app:transformResourcesWithMergeJavaResForProfile'
transformResourcesWithMergeJavaResForProfileUnitTest: task ':app:transformResourcesWithMergeJavaResForProfileUnitTest'
transformResourcesWithMergeJavaResForRelease: task ':app:transformResourcesWithMergeJavaResForRelease'
transformResourcesWithMergeJavaResForReleaseUnitTest: task ':app:transformResourcesWithMergeJavaResForReleaseUnitTest'
uninstallAll: task ':app:uninstallAll'
uninstallDebug: task ':app:uninstallDebug'
uninstallDebugAndroidTest: task ':app:uninstallDebugAndroidTest'
uninstallProfile: task ':app:uninstallProfile'
uninstallRelease: task ':app:uninstallRelease'
validateSigningDebug: task ':app:validateSigningDebug'
validateSigningDebugAndroidTest: task ':app:validateSigningDebugAndroidTest'
validateSigningProfile: task ':app:validateSigningProfile'
validateSigningRelease: task ':app:validateSigningRelease'
version: unspecified
writeDebugApplicationId: task ':app:writeDebugApplicationId'
writeProfileApplicationId: task ':app:writeProfileApplicationId'
writeReleaseApplicationId: task ':app:writeReleaseApplicationId'
BUILD SUCCESSFUL in 2m 9s
1 actionable task: 1 executed
[ +9 ms] C:\Users\groov\AppData\Local\Android\sdk\build-tools\28.0.0\aapt dump badging build\app\outputs\apk\app.apk
[ +873 ms] Exit code 0 from: C:\Users\groov\AppData\Local\Android\sdk\build-tools\28.0.0\aapt dump badging build\app\outputs\apk\app.apk
[ ] package: name='com.example.guitarstudenttracker' versionCode='1' versionName='1.0'
sdkVersion:'16'
targetSdkVersion:'27'
uses-permission: name='android.permission.INTERNET'
uses-permission: name='android.permission.ACCESS_NETWORK_STATE'
application-label:'guitar_student_tracker'
application-label-af:'guitar_student_tracker'
application-label-am:'guitar_student_tracker'
application-label-ar:'guitar_student_tracker'
application-label-az:'guitar_student_tracker'
application-label-be:'guitar_student_tracker'
application-label-bg:'guitar_student_tracker'
application-label-bn:'guitar_student_tracker'
application-label-bs:'guitar_student_tracker'
application-label-ca:'guitar_student_tracker'
application-label-cs:'guitar_student_tracker'
application-label-da:'guitar_student_tracker'
application-label-de:'guitar_student_tracker'
application-label-el:'guitar_student_tracker'
application-label-en-AU:'guitar_student_tracker'
application-label-en-GB:'guitar_student_tracker'
application-label-en-IN:'guitar_student_tracker'
application-label-es:'guitar_student_tracker'
application-label-es-US:'guitar_student_tracker'
application-label-et:'guitar_student_tracker'
application-label-eu:'guitar_student_tracker'
application-label-fa:'guitar_student_tracker'
application-label-fi:'guitar_student_tracker'
application-label-fr:'guitar_student_tracker'
application-label-fr-CA:'guitar_student_tracker'
application-label-gl:'guitar_student_tracker'
application-label-gu:'guitar_student_tracker'
application-label-hi:'guitar_student_tracker'
application-label-hr:'guitar_student_tracker'
application-label-hu:'guitar_student_tracker'
application-label-hy:'guitar_student_tracker'
application-label-in:'guitar_student_tracker'
application-label-is:'guitar_student_tracker'
application-label-it:'guitar_student_tracker'
application-label-iw:'guitar_student_tracker'
application-label-ja:'guitar_student_tracker'
application-label-ka:'guitar_student_tracker'
application-label-kk:'guitar_student_tracker'
application-label-km:'guitar_student_tracker'
application-label-kn:'guitar_student_tracker'
application-label-ko:'guitar_student_tracker'
application-label-ky:'guitar_student_tracker'
application-label-lo:'guitar_student_tracker'
application-label-lt:'guitar_student_tracker'
application-label-lv:'guitar_student_tracker'
application-label-mk:'guitar_student_tracker'
application-label-ml:'guitar_student_tracker'
application-label-mn:'guitar_student_tracker'
application-label-mr:'guitar_student_tracker'
application-label-ms:'guitar_student_tracker'
application-label-my:'guitar_student_tracker'
application-label-nb:'guitar_student_tracker'
application-label-ne:'guitar_student_tracker'
application-label-nl:'guitar_student_tracker'
application-label-pa:'guitar_student_tracker'
application-label-pl:'guitar_student_tracker'
application-label-pt:'guitar_student_tracker'
application-label-pt-BR:'guitar_student_tracker'
application-label-pt-PT:'guitar_student_tracker'
application-label-ro:'guitar_student_tracker'
application-label-ru:'guitar_student_tracker'
application-label-si:'guitar_student_tracker'
application-label-sk:'guitar_student_tracker'
application-label-sl:'guitar_student_tracker'
application-label-sq:'guitar_student_tracker'
application-label-sr:'guitar_student_tracker'
application-label-sr-Latn:'guitar_student_tracker'
application-label-sv:'guitar_student_tracker'
application-label-sw:'guitar_student_tracker'
application-label-ta:'guitar_student_tracker'
application-label-te:'guitar_student_tracker'
application-label-th:'guitar_student_tracker'
application-label-tl:'guitar_student_tracker'
application-label-tr:'guitar_student_tracker'
application-label-uk:'guitar_student_tracker'
application-label-ur:'guitar_student_tracker'
application-label-uz:'guitar_student_tracker'
application-label-vi:'guitar_student_tracker'
application-label-zh-CN:'guitar_student_tracker'
application-label-zh-HK:'guitar_student_tracker'
application-label-zh-TW:'guitar_student_tracker'
application-label-zu:'guitar_student_tracker'
application-icon-160:'res/mipmap-mdpi-v4/ic_launcher.png'
application-icon-240:'res/mipmap-hdpi-v4/ic_launcher.png'
application-icon-320:'res/mipmap-xhdpi-v4/ic_launcher.png'
application-icon-480:'res/mipmap-xxhdpi-v4/ic_launcher.png'
application-icon-640:'res/mipmap-xxxhdpi-v4/ic_launcher.png'
application: label='guitar_student_tracker' icon='res/mipmap-mdpi-v4/ic_launcher.png'
launchable-activity: name='com.example.guitarstudenttracker.MainActivity' label='' icon=''
feature-group: label=''
uses-gl-es: '0x20000'
uses-feature: name='android.hardware.faketouch'
uses-implied-feature: name='android.hardware.faketouch' reason='default feature for all apps'
main
other-activities
supports-screens: 'small' 'normal' 'large' 'xlarge'
supports-any-density: 'true'
locales: '--_--' 'af' 'am' 'ar' 'az' 'be' 'bg' 'bn' 'bs' 'ca' 'cs' 'da' 'de' 'el' 'en-AU' 'en-GB' 'en-IN' 'es' 'es-US' 'et' 'eu' 'fa' 'fi' 'fr' 'fr-CA' 'gl' 'gu' 'hi' 'hr' 'hu' 'hy' 'in' 'is' 'it' 'iw' 'ja' 'ka' 'kk' 'km' 'kn' 'ko' 'ky' 'lo' 'lt' 'lv' 'mk' 'ml' 'mn' 'mr' 'ms' 'my' 'nb' 'ne' 'nl' 'pa' 'pl' 'pt' 'pt-BR' 'pt-PT' 'ro' 'ru' 'si' 'sk' 'sl' 'sq' 'sr' 'sr-Latn' 'sv' 'sw' 'ta' 'te' 'th' 'tl' 'tr' 'uk' 'ur' 'uz' 'vi' 'zh-CN' 'zh-HK' 'zh-TW' 'zu'
densities: '160' '240' '320' '480' '640'
native-code: 'arm64-v8a'
[ +10 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 logcat -v time -t 1
[ +335 ms] Exit code 0 from: C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 logcat -v time -t 1
[ +1 ms] --------- beginning of main
08-09 21:43:33.808 D/NewStreamAdapter( 2543): Publishing revision #2270812 to adapter clients
[ +8 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 logcat -v time
[ +419 ms] DependencyChecker: nothing is modified after 2018-08-09 21:07:09.000.
[ +4 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb version
[ +143 ms] Android Debug Bridge version 1.0.40
Version 4797878
Installed as C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb.EXE
[ +3 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb start-server
[ +66 ms] Building APK
[ +66 ms] Running 'gradlew assembleDebug'...
[ +2 ms] [android\] C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\android\gradlew.bat -Ptarget=C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\lib/main.dart -Ppreview-dart-2=true -Ptarget-platform=android-arm64 assembleDebug
[+4194 ms] :app:preBuild UP-TO-DATE
[ ] :google_places_picker:preBuild UP-TO-DATE
[ ] :google_places_picker:preDebugBuild UP-TO-DATE
[ +252 ms] :google_places_picker:checkDebugManifest UP-TO-DATE
[ +199 ms] :google_places_picker:processDebugManifest UP-TO-DATE
[+3580 ms] :app:preDebugBuild UP-TO-DATE
[ +66 ms] :google_places_picker:compileDebugAidl UP-TO-DATE
[ +10 ms] :app:compileDebugAidl UP-TO-DATE
[ ] :google_places_picker:packageDebugRenderscript NO-SOURCE
[ +33 ms] :app:compileDebugRenderscript UP-TO-DATE
[ +10 ms] :app:flutterBuildX86Jar UP-TO-DATE
[ +10 ms] :app:checkDebugManifest UP-TO-DATE
[ ] :app:generateDebugBuildConfig UP-TO-DATE
[ ] :app:prepareLintJar UP-TO-DATE
[ +162 ms] :app:cleanMergeDebugAssets
[+13601 ms] :app:flutterBuildDebug
[ +199 ms] :app:mergeDebugShaders UP-TO-DATE
[ ] :app:compileDebugShaders UP-TO-DATE
[ ] :app:generateDebugAssets UP-TO-DATE
[ +10 ms] :google_places_picker:mergeDebugShaders UP-TO-DATE
[ ] :google_places_picker:compileDebugShaders UP-TO-DATE
[ ] :google_places_picker:generateDebugAssets UP-TO-DATE
[ ] :google_places_picker:packageDebugAssets UP-TO-DATE
[ +494 ms] :app:mergeDebugAssets
[ +681 ms] :app:copyFlutterAssetsDebug
[ +96 ms] :app:mainApkListPersistenceDebug UP-TO-DATE
[ +1 ms] :app:generateDebugResValues UP-TO-DATE
[ ] :app:generateDebugResources UP-TO-DATE
[ +9 ms] :google_places_picker:compileDebugRenderscript UP-TO-DATE
[ +10 ms] :google_places_picker:generateDebugResValues UP-TO-DATE
[ ] :google_places_picker:generateDebugResources UP-TO-DATE
[ +50 ms] :google_places_picker:packageDebugResources UP-TO-DATE
[ +919 ms] :app:mergeDebugResources UP-TO-DATE
[ +62 ms] :app:createDebugCompatibleScreenManifests UP-TO-DATE
[ +11 ms] :app:processDebugManifest UP-TO-DATE
[ ] :app:splitsDiscoveryTaskDebug UP-TO-DATE
[ +10 ms] :google_places_picker:platformAttrExtractor UP-TO-DATE
[ +518 ms] :google_places_picker:generateDebugRFile UP-TO-DATE
[ +55 ms] :app:processDebugResources UP-TO-DATE
[ +1 ms] :app:generateDebugSources UP-TO-DATE
[ ] :google_places_picker:generateDebugBuildConfig UP-TO-DATE
[ +249 ms] :google_places_picker:compileDebugKotlin UP-TO-DATE
[ +2 ms] :google_places_picker:prepareLintJar UP-TO-DATE
[ +9 ms] :google_places_picker:generateDebugSources UP-TO-DATE
[ ] :google_places_picker:javaPreCompileDebug UP-TO-DATE
[ +98 ms] :google_places_picker:compileDebugJavaWithJavac UP-TO-DATE
[ +3 ms] :google_places_picker:processDebugJavaRes NO-SOURCE
[ ] :google_places_picker:transformClassesAndResourcesWithPrepareIntermediateJarsForDebug UP-TO-DATE
[ +7 ms] :app:javaPreCompileDebug UP-TO-DATE
[ +75 ms] :app:compileDebugJavaWithJavac UP-TO-DATE
[ ] :app:compileDebugNdk NO-SOURCE
[ ] :app:compileDebugSources UP-TO-DATE
[ +92 ms] :app:transformClassesWithDexBuilderForDebug UP-TO-DATE
[ +17 ms] :app:transformDexArchiveWithExternalLibsDexMergerForDebug UP-TO-DATE
[ +177 ms] :app:transformDexArchiveWithDexMergerForDebug UP-TO-DATE
[ +43 ms] :app:mergeDebugJniLibFolders UP-TO-DATE
[ +1 ms] :google_places_picker:compileDebugNdk NO-SOURCE
[ ] :google_places_picker:mergeDebugJniLibFolders UP-TO-DATE
[ +9 ms] :google_places_picker:transformNativeLibsWithMergeJniLibsForDebug UP-TO-DATE
[ ] :google_places_picker:transformNativeLibsWithIntermediateJniLibsForDebug UP-TO-DATE
[ +190 ms] :app:transformNativeLibsWithMergeJniLibsForDebug UP-TO-DATE
[ ] :app:processDebugJavaRes NO-SOURCE
[ +54 ms] :app:transformResourcesWithMergeJavaResForDebug UP-TO-DATE
[ +54 ms] :app:validateSigningDebug UP-TO-DATE
[+8945 ms] :app:packageDebug
[ ] :app:assembleDebug
[+1221 ms] :google_places_picker:extractDebugAnnotations UP-TO-DATE
[ ] :google_places_picker:mergeDebugConsumerProguardFiles UP-TO-DATE
[ +51 ms] :google_places_picker:transformResourcesWithMergeJavaResForDebug UP-TO-DATE
[ +216 ms] :google_places_picker:transformClassesAndResourcesWithSyncLibJarsForDebug UP-TO-DATE
[ +11 ms] :google_places_picker:transformNativeLibsWithSyncJniLibsForDebug UP-TO-DATE
[ ] :google_places_picker:bundleDebug UP-TO-DATE
[ ] :google_places_picker:compileDebugSources UP-TO-DATE
[ ] :google_places_picker:assembleDebug UP-TO-DATE
[ +9 ms] BUILD SUCCESSFUL in 36s
[ ] 56 actionable tasks: 5 executed, 51 up-to-date
[ +942 ms] calculateSha: C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\build\app\outputs\apk/app.apk
[ +549 ms] Built build\app\outputs\apk\debug\app-debug.apk.
[ +2 ms] C:\Users\groov\AppData\Local\Android\sdk\build-tools\28.0.0\aapt dump badging build\app\outputs\apk\app.apk
[ +121 ms] Exit code 0 from: C:\Users\groov\AppData\Local\Android\sdk\build-tools\28.0.0\aapt dump badging build\app\outputs\apk\app.apk
[ ] package: name='com.example.guitarstudenttracker' versionCode='1' versionName='1.0'
sdkVersion:'16'
targetSdkVersion:'27'
uses-permission: name='android.permission.INTERNET'
uses-permission: name='android.permission.ACCESS_NETWORK_STATE'
application-label:'guitar_student_tracker'
application-label-af:'guitar_student_tracker'
application-label-am:'guitar_student_tracker'
application-label-ar:'guitar_student_tracker'
application-label-az:'guitar_student_tracker'
application-label-be:'guitar_student_tracker'
application-label-bg:'guitar_student_tracker'
application-label-bn:'guitar_student_tracker'
application-label-bs:'guitar_student_tracker'
application-label-ca:'guitar_student_tracker'
application-label-cs:'guitar_student_tracker'
application-label-da:'guitar_student_tracker'
application-label-de:'guitar_student_tracker'
application-label-el:'guitar_student_tracker'
application-label-en-AU:'guitar_student_tracker'
application-label-en-GB:'guitar_student_tracker'
application-label-en-IN:'guitar_student_tracker'
application-label-es:'guitar_student_tracker'
application-label-es-US:'guitar_student_tracker'
application-label-et:'guitar_student_tracker'
application-label-eu:'guitar_student_tracker'
application-label-fa:'guitar_student_tracker'
application-label-fi:'guitar_student_tracker'
application-label-fr:'guitar_student_tracker'
application-label-fr-CA:'guitar_student_tracker'
application-label-gl:'guitar_student_tracker'
application-label-gu:'guitar_student_tracker'
application-label-hi:'guitar_student_tracker'
application-label-hr:'guitar_student_tracker'
application-label-hu:'guitar_student_tracker'
application-label-hy:'guitar_student_tracker'
application-label-in:'guitar_student_tracker'
application-label-is:'guitar_student_tracker'
application-label-it:'guitar_student_tracker'
application-label-iw:'guitar_student_tracker'
application-label-ja:'guitar_student_tracker'
application-label-ka:'guitar_student_tracker'
application-label-kk:'guitar_student_tracker'
application-label-km:'guitar_student_tracker'
application-label-kn:'guitar_student_tracker'
application-label-ko:'guitar_student_tracker'
application-label-ky:'guitar_student_tracker'
application-label-lo:'guitar_student_tracker'
application-label-lt:'guitar_student_tracker'
application-label-lv:'guitar_student_tracker'
application-label-mk:'guitar_student_tracker'
application-label-ml:'guitar_student_tracker'
application-label-mn:'guitar_student_tracker'
application-label-mr:'guitar_student_tracker'
application-label-ms:'guitar_student_tracker'
application-label-my:'guitar_student_tracker'
application-label-nb:'guitar_student_tracker'
application-label-ne:'guitar_student_tracker'
application-label-nl:'guitar_student_tracker'
application-label-pa:'guitar_student_tracker'
application-label-pl:'guitar_student_tracker'
application-label-pt:'guitar_student_tracker'
application-label-pt-BR:'guitar_student_tracker'
application-label-pt-PT:'guitar_student_tracker'
application-label-ro:'guitar_student_tracker'
application-label-ru:'guitar_student_tracker'
application-label-si:'guitar_student_tracker'
application-label-sk:'guitar_student_tracker'
application-label-sl:'guitar_student_tracker'
application-label-sq:'guitar_student_tracker'
application-label-sr:'guitar_student_tracker'
application-label-sr-Latn:'guitar_student_tracker'
application-label-sv:'guitar_student_tracker'
application-label-sw:'guitar_student_tracker'
application-label-ta:'guitar_student_tracker'
application-label-te:'guitar_student_tracker'
application-label-th:'guitar_student_tracker'
application-label-tl:'guitar_student_tracker'
application-label-tr:'guitar_student_tracker'
application-label-uk:'guitar_student_tracker'
application-label-ur:'guitar_student_tracker'
application-label-uz:'guitar_student_tracker'
application-label-vi:'guitar_student_tracker'
application-label-zh-CN:'guitar_student_tracker'
application-label-zh-HK:'guitar_student_tracker'
application-label-zh-TW:'guitar_student_tracker'
application-label-zu:'guitar_student_tracker'
application-icon-160:'res/mipmap-mdpi-v4/ic_launcher.png'
application-icon-240:'res/mipmap-hdpi-v4/ic_launcher.png'
application-icon-320:'res/mipmap-xhdpi-v4/ic_launcher.png'
application-icon-480:'res/mipmap-xxhdpi-v4/ic_launcher.png'
application-icon-640:'res/mipmap-xxxhdpi-v4/ic_launcher.png'
application: label='guitar_student_tracker' icon='res/mipmap-mdpi-v4/ic_launcher.png'
application-debuggable
launchable-activity: name='com.example.guitarstudenttracker.MainActivity' label='' icon=''
feature-group: label=''
uses-gl-es: '0x20000'
uses-feature: name='android.hardware.faketouch'
uses-implied-feature: name='android.hardware.faketouch' reason='default feature for all apps'
main
other-activities
supports-screens: 'small' 'normal' 'large' 'xlarge'
supports-any-density: 'true'
locales: '--_--' 'af' 'am' 'ar' 'az' 'be' 'bg' 'bn' 'bs' 'ca' 'cs' 'da' 'de' 'el' 'en-AU' 'en-GB' 'en-IN' 'es' 'es-US' 'et' 'eu' 'fa' 'fi' 'fr' 'fr-CA' 'gl' 'gu' 'hi' 'hr' 'hu' 'hy' 'in' 'is' 'it' 'iw' 'ja' 'ka' 'kk' '
km' 'kn' 'ko' 'ky' 'lo' 'lt' 'lv' 'mk' 'ml' 'mn' 'mr' 'ms' 'my' 'nb' 'ne' 'nl' 'pa' 'pl' 'pt' 'pt-BR' 'pt-PT' 'ro' 'ru' 'si' 'sk' 'sl' 'sq' 'sr' 'sr-Latn' 'sv' 'sw' 'ta' 'te' 'th' 'tl' 'tr' 'uk' 'ur' 'uz' 'vi' 'zh-CN' 'zh-HK' 'zh
-TW' 'zu'
densities: '160' '240' '320' '480' '640'
native-code: 'arm64-v8a' 'x86' 'x86_64'
[ +1 ms] Stopping app 'app.apk' on Pixel XL.
[ ] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell am force-stop com.example.guitarstudenttracker
[ +271 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell pm list packages com.example.guitarstudenttracker
[ +131 ms] package:com.example.guitarstudenttracker
[ +2 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell cat /data/local/tmp/sky.com.example.guitarstudenttracker.sha1
[ +128 ms] 571c14499f0edade24cc0ec188e907d6a4735f25
[ +1 ms] Installing APK.
[ +2 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb version
[ +153 ms] Android Debug Bridge version 1.0.40
Version 4797878
Installed as C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb.EXE
[ ] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb start-server
[ +217 ms] Installing build\app\outputs\apk\app.apk...
[ ] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 install -r build\app\outputs\apk\app.apk
[+7024 ms] Success
[ +1 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell echo -n ce373916fd64d2f3c5293455f6d34f96895df215 > /data/local/tmp/sky.com.example.guitarstudenttracker.sha1
[ +114 ms] Pixel XL startApp
[ +2 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 shell am start -a android.intent.action.RUN -f 0x20000000 --ez enable-background-compilation true --ez enable-dart-profiling true --ez enable-
checked-mode true com.example.guitarstudenttracker/com.example.guitarstudenttracker.MainActivity
[ +152 ms] Starting: Intent { act=android.intent.action.RUN flg=0x20000000 cmp=com.example.guitarstudenttracker/.MainActivity (has extras) }
[ ] Waiting for observatory port to be available...
[ +982 ms] I/FlutterActivityDelegate( 6562): onResume setting current activity to this
[ +187 ms] Observatory URL on device: http://127.0.0.1:38780/
[ +6 ms] C:\Users\groov\AppData\Local\Android\sdk\platform-tools\adb -s HT74J0200552 forward tcp:8100 tcp:38780
[ +86 ms] Forwarded host port 8100 to device port 38780 for Observatory
[ +7 ms] Connecting to service protocol: http://127.0.0.1:8100/
[ +436 ms] Successfully connected to service protocol: http://127.0.0.1:8100/
[ +3 ms] getVM: {}
[ +18 ms] getIsolate: {isolateId: isolates/680595021}
[ +2 ms] _flutter.listViews: {isolateId: isolates/680595021}
[ +61 ms] DevFS: Creating new filesystem on the device (null)
[ ] _createDevFS: {fsName: Guitar-Student-Tracker}
[ +74 ms] DevFS: Created new filesystem on the device (file:///data/user/0/com.example.guitarstudenttracker/cache/Guitar-Student-TrackerRVDOHP/Guitar-Student-Tracker/)
[ +2 ms] Updating assets
[ +266 ms] Syncing files to device Pixel XL...
[ +8 ms] DevFS: Starting sync from LocalDirectory: 'C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker'
[ ] Scanning project files
[ +4 ms] Scanning package files
[ +88 ms] Scanning asset files
[ ] Scanning for deleted files
[ +35 ms] Compiling dart to kernel with 444 updated files
[ +4 ms] C:\flutter\bin\cache\dart-sdk\bin\dart C:\flutter\bin\cache\artifacts\engine\windows-x64\frontend_server.dart.snapshot --sdk-root C:\flutter\bin\cache\artifacts\engine\common\flutter_patched_sdk/ --incremental --strong
--target=flutter --output-dill build\app.dill --packages C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker\.packages --filesystem-scheme org-dartlang-root
[+4067 ms] Updating files
[ +879 ms] DevFS: Sync finished
[ ] Synced 1.4MB.
[ +2 ms] _flutter.listViews: {isolateId: isolates/680595021}
[ +12 ms] Connected to _flutterView/0x7b10a36c18.
[ +1 ms] 🔥 To hot reload changes while running, press "r". To hot restart (and rebuild state), press "R".
[ +1 ms] An Observatory debugger and profiler on Pixel XL is available at: http://127.0.0.1:8100/
[ ] For a more detailed help message, press "h". To quit, press "q".
[+33515 ms] I/flutter ( 6562): ══╡ EXCEPTION CAUGHT BY RENDERING LIBRARY ╞═════════════════════════════════════════════════════════
[ +7 ms] I/flutter ( 6562): The following message was thrown during layout:
[ +1 ms] I/flutter ( 6562): A RenderFlex overflowed by 49 pixels on the right.
[ +26 ms] I/flutter ( 6562):
[ +1 ms] I/flutter ( 6562): The overflowing RenderFlex has an orientation of Axis.horizontal.
[ ] I/flutter ( 6562): The edge of the RenderFlex that is overflowing has been marked in the rendering with a yellow and
[ ] I/flutter ( 6562): black striped pattern. This is usually caused by the contents being too big for the RenderFlex.
[ ] I/flutter ( 6562): Consider applying a flex factor (e.g. using an Expanded widget) to force the children of the
[ ] I/flutter ( 6562): RenderFlex to fit within the available space instead of being sized to their natural size.
[ ] I/flutter ( 6562): This is considered an error condition because it indicates that there is content that cannot be
[ ] I/flutter ( 6562): seen. If the content is legitimately bigger than the available space, consider clipping it with a
[ ] I/flutter ( 6562): ClipRect widget before putting it in the flex, or using a scrollable container rather than a Flex,
[ ] I/flutter ( 6562): like a ListView.
[ ] I/flutter ( 6562): The specific RenderFlex in question is:
[ ] I/flutter ( 6562): RenderFlex#9f4e4 relayoutBoundary=up1 OVERFLOWING
[ ] I/flutter ( 6562): creator: Row ← IconTheme ← Builder ← Center ← Padding ← Container ← IconTheme ← Builder ← Listener
[ +58 ms] I/flutter ( 6562): ← _GestureSemantics ← RawGestureDetector ← GestureDetector ← ⋯
[ +2 ms] I/flutter ( 6562): parentData: offset=Offset(0.0, 12.0) (can use size)
[ ] I/flutter ( 6562): constraints: BoxConstraints(0.0<=w<=110.4, 0.0<=h<=48.0)
[ ] I/flutter ( 6562): size: Size(110.4, 24.0)
[ ] I/flutter ( 6562): direction: horizontal
[ ] I/flutter ( 6562): mainAxisAlignment: start
[ ] I/flutter ( 6562): mainAxisSize: min
[ ] I/flutter ( 6562): crossAxisAlignment: center
[ ] I/flutter ( 6562): textDirection: ltr
[ ] I/flutter ( 6562): verticalDirection: down
[ ] I/flutter ( 6562): ◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤
[ ] I/flutter ( 6562): ════════════════════════════════════════════════════════════════════════════════════════════════════
```
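The RenderFlex overflow message in the log above points at the FAB's internal `Row` (icon plus label) being wider than the constraints it receives during the route transition, and suggests letting a flex child absorb the difference. A minimal sketch of that workaround, assuming the label is laid out in a custom `Row` (the widget and field names here are illustrative, not taken from the Guitar-Student-Tracker project; written against the pre-null-safety API matching the Flutter v0.5.1 beta in the doctor output below):

```dart
import 'package:flutter/material.dart';

// Hypothetical stand-in for the overflowing FAB content: an icon and a
// label in a Row. Wrapping the label in Flexible lets the text shrink and
// ellipsize instead of overflowing when the returning route briefly gives
// the FAB a narrower width than its natural size.
class SafeFabLabel extends StatelessWidget {
  const SafeFabLabel({Key key, this.icon, this.label}) : super(key: key);

  final Widget icon;
  final String label;

  @override
  Widget build(BuildContext context) {
    return Row(
      mainAxisSize: MainAxisSize.min,
      children: <Widget>[
        icon,
        const SizedBox(width: 8.0),
        // Flexible (rather than Expanded) keeps the Row at its minimum
        // intrinsic width when space allows, while still letting the text
        // give up width under tight constraints instead of overflowing.
        Flexible(
          child: Text(label, overflow: TextOverflow.ellipsis, maxLines: 1),
        ),
      ],
    );
  }
}
```

This only masks the symptom during the transition; the underlying layout behavior of the docked extended FAB is the bug being reported.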
<!--
Run `flutter analyze` and attach any output of that command below.
If there are any analysis errors, try resolving them before filing this issue.
-->
```
C:\Users\groov\Flutter_Projects\Guitar-Student-Tracker>flutter analyze
Analyzing Guitar-Student-Tracker...
info - Unused import: 'package:flutter/material.dart' - lib\addNewLessonPage.dart:1:8
info - Unused import: 'package:google_places_picker/google_places_picker.dart' - lib\addNewStudentPage.dart:3:8
info - Name non-constant identifiers using lowerCamelCase - lib\bottomBar.dart:4:8
info - Unused import: 'package:guitar_student_tracker/addNewLessonPage.dart' - lib\calendarPage.dart:2:8
info - Unused import: 'package:guitar_student_tracker/calendarPage.dart' - lib\main.dart:5:8
info - Unused import: 'package:guitar_student_tracker/globals.dart' - lib\main.dart:6:8
6 issues found. (ran in 23.4s)
```
<!-- Finally, paste the output of running `flutter doctor -v` here. -->
```
C:\flutter\bin\flutter.bat --no-color doctor
Doctor summary (to see all details, run flutter doctor -v):
[√] Flutter (Channel beta, v0.5.1, on Microsoft Windows [Version 10.0.17134.112], locale en-US)
[√] Android toolchain - develop for Android devices (Android SDK 28.0.0)
[√] Android Studio (version 3.1)
[√] Android Studio (version 3.2)
X Flutter plugin not installed; this adds Flutter specific functionality.
X Dart plugin not installed; this adds Dart specific functionality.
[√] Connected devices (2 available)
• No issues found!
Process finished with exit code 0
```
**Docked extended FAB text gets clipped when returning to the page**

Steps to reproduce: navigate from a screen that has a center-docked extended FAB to another screen whose end-docked extended FAB has a shorter label. On returning to the first screen, the text at the right end of the FAB gets clipped (an Imgur video demonstrating this was attached to the original report).
guitar student tracker application label kn guitar student tracker application label ko guitar student tracker application label ky guitar student tracker application label lo guitar student tracker application label lt guitar student tracker application label lv guitar student tracker application label mk guitar student tracker application label ml guitar student tracker application label mn guitar student tracker application label mr guitar student tracker application label ms guitar student tracker application label my guitar student tracker application label nb guitar student tracker application label ne guitar student tracker application label nl guitar student tracker application label pa guitar student tracker application label pl guitar student tracker application label pt guitar student tracker application label pt br guitar student tracker application label pt pt guitar student tracker application label ro guitar student tracker application label ru guitar student tracker application label si guitar student tracker application label sk guitar student tracker application label sl guitar student tracker application label sq guitar student tracker application label sr guitar student tracker application label sr latn guitar student tracker application label sv guitar student tracker application label sw guitar student tracker application label ta guitar student tracker application label te guitar student tracker application label th guitar student tracker application label tl guitar student tracker application label tr guitar student tracker application label uk guitar student tracker application label ur guitar student tracker application label uz guitar student tracker application label vi guitar student tracker application label zh cn guitar student tracker application label zh hk guitar student tracker application label zh tw guitar student tracker application label zu guitar student tracker application icon res mipmap mdpi ic launcher png application 
icon res mipmap hdpi ic launcher png application icon res mipmap xhdpi ic launcher png application icon res mipmap xxhdpi ic launcher png application icon res mipmap xxxhdpi ic launcher png application label guitar student tracker icon res mipmap mdpi ic launcher png launchable activity name com example guitarstudenttracker mainactivity label icon feature group label uses gl es uses feature name android hardware faketouch uses implied feature name android hardware faketouch reason default feature for all apps main other activities supports screens small normal large xlarge supports any density true locales af am ar az be bg bn bs ca cs da de el en au en gb en in es es us et eu fa fi fr fr ca gl gu hi hr hu hy in is it iw ja ka kk km kn ko ky lo lt lv mk ml mn mr ms my nb ne nl pa pl pt pt br pt pt ro ru si sk sl sq sr sr latn sv sw ta te th tl tr uk ur uz vi zh cn zh hk zh tw zu densities native code c users groov appdata local android sdk platform tools adb s logcat v time t exit code from c users groov appdata local android sdk platform tools adb s logcat v time t beginning of main d newstreamadapter publishing revision to adapter clients c users groov appdata local android sdk platform tools adb s logcat v time dependencychecker nothing is modified after c users groov appdata local android sdk platform tools adb version android debug bridge version version installed as c users groov appdata local android sdk platform tools adb exe c users groov appdata local android sdk platform tools adb start server building apk running gradlew assembledebug c users groov flutter projects guitar student tracker android gradlew bat ptarget c users groov flutter projects guitar student tracker lib main dart ppreview dart true ptarget platform android assembledebug app prebuild up to date google places picker prebuild up to date google places picker predebugbuild up to date google places picker checkdebugmanifest up to date google places picker processdebugmanifest up to date app 
predebugbuild up to date google places picker compiledebugaidl up to date app compiledebugaidl up to date google places picker packagedebugrenderscript no source app compiledebugrenderscript up to date app up to date app checkdebugmanifest up to date app generatedebugbuildconfig up to date app preparelintjar up to date app cleanmergedebugassets app flutterbuilddebug app mergedebugshaders up to date app compiledebugshaders up to date app generatedebugassets up to date google places picker mergedebugshaders up to date google places picker compiledebugshaders up to date google places picker generatedebugassets up to date google places picker packagedebugassets up to date app mergedebugassets app copyflutterassetsdebug app mainapklistpersistencedebug up to date app generatedebugresvalues up to date app generatedebugresources up to date google places picker compiledebugrenderscript up to date google places picker generatedebugresvalues up to date google places picker generatedebugresources up to date google places picker packagedebugresources up to date app mergedebugresources up to date app createdebugcompatiblescreenmanifests up to date app processdebugmanifest up to date app splitsdiscoverytaskdebug up to date google places picker platformattrextractor up to date google places picker generatedebugrfile up to date app processdebugresources up to date app generatedebugsources up to date google places picker generatedebugbuildconfig up to date google places picker compiledebugkotlin up to date google places picker preparelintjar up to date google places picker generatedebugsources up to date google places picker javaprecompiledebug up to date google places picker compiledebugjavawithjavac up to date google places picker processdebugjavares no source google places picker transformclassesandresourceswithprepareintermediatejarsfordebug up to date app javaprecompiledebug up to date app compiledebugjavawithjavac up to date app compiledebugndk no source app 
compiledebugsources up to date app transformclasseswithdexbuilderfordebug up to date app transformdexarchivewithexternallibsdexmergerfordebug up to date app transformdexarchivewithdexmergerfordebug up to date app mergedebugjnilibfolders up to date google places picker compiledebugndk no source google places picker mergedebugjnilibfolders up to date google places picker transformnativelibswithmergejnilibsfordebug up to date google places picker transformnativelibswithintermediatejnilibsfordebug up to date app transformnativelibswithmergejnilibsfordebug up to date app processdebugjavares no source app transformresourceswithmergejavaresfordebug up to date app validatesigningdebug up to date app packagedebug app assembledebug google places picker extractdebugannotations up to date google places picker mergedebugconsumerproguardfiles up to date google places picker transformresourceswithmergejavaresfordebug up to date google places picker transformclassesandresourceswithsynclibjarsfordebug up to date google places picker transformnativelibswithsyncjnilibsfordebug up to date google places picker bundledebug up to date google places picker compiledebugsources up to date google places picker assembledebug up to date build successful in actionable tasks executed up to date calculatesha c users groov flutter projects guitar student tracker build app outputs apk app apk built build app outputs apk debug app debug apk c users groov appdata local android sdk build tools aapt dump badging build app outputs apk app apk exit code from c users groov appdata local android sdk build tools aapt dump badging build app outputs apk app apk package name com example guitarstudenttracker versioncode versionname sdkversion targetsdkversion uses permission name android permission internet uses permission name android permission access network state application label guitar student tracker application label af guitar student tracker application label am guitar student tracker application label 
ar guitar student tracker application label az guitar student tracker application label be guitar student tracker application label bg guitar student tracker application label bn guitar student tracker application label bs guitar student tracker application label ca guitar student tracker application label cs guitar student tracker application label da guitar student tracker application label de guitar student tracker application label el guitar student tracker application label en au guitar student tracker application label en gb guitar student tracker application label en in guitar student tracker application label es guitar student tracker application label es us guitar student tracker application label et guitar student tracker application label eu guitar student tracker application label fa guitar student tracker application label fi guitar student tracker application label fr guitar student tracker application label fr ca guitar student tracker application label gl guitar student tracker application label gu guitar student tracker application label hi guitar student tracker application label hr guitar student tracker application label hu guitar student tracker application label hy guitar student tracker application label in guitar student tracker application label is guitar student tracker application label it guitar student tracker application label iw guitar student tracker application label ja guitar student tracker application label ka guitar student tracker application label kk guitar student tracker application label km guitar student tracker application label kn guitar student tracker application label ko guitar student tracker application label ky guitar student tracker application label lo guitar student tracker application label lt guitar student tracker application label lv guitar student tracker application label mk guitar student tracker application label ml guitar student tracker application label mn guitar student tracker application label mr 
guitar student tracker application label ms guitar student tracker application label my guitar student tracker application label nb guitar student tracker application label ne guitar student tracker application label nl guitar student tracker application label pa guitar student tracker application label pl guitar student tracker application label pt guitar student tracker application label pt br guitar student tracker application label pt pt guitar student tracker application label ro guitar student tracker application label ru guitar student tracker application label si guitar student tracker application label sk guitar student tracker application label sl guitar student tracker application label sq guitar student tracker application label sr guitar student tracker application label sr latn guitar student tracker application label sv guitar student tracker application label sw guitar student tracker application label ta guitar student tracker application label te guitar student tracker application label th guitar student tracker application label tl guitar student tracker application label tr guitar student tracker application label uk guitar student tracker application label ur guitar student tracker application label uz guitar student tracker application label vi guitar student tracker application label zh cn guitar student tracker application label zh hk guitar student tracker application label zh tw guitar student tracker application label zu guitar student tracker application icon res mipmap mdpi ic launcher png application icon res mipmap hdpi ic launcher png application icon res mipmap xhdpi ic launcher png application icon res mipmap xxhdpi ic launcher png application icon res mipmap xxxhdpi ic launcher png application label guitar student tracker icon res mipmap mdpi ic launcher png application debuggable launchable activity name com example guitarstudenttracker mainactivity label icon feature group label uses gl es uses feature name android hardware 
faketouch uses implied feature name android hardware faketouch reason default feature for all apps main other activities supports screens small normal large xlarge supports any density true locales af am ar az be bg bn bs ca cs da de el en au en gb en in es es us et eu fa fi fr fr ca gl gu hi hr hu hy in is it iw ja ka kk km kn ko ky lo lt lv mk ml mn mr ms my nb ne nl pa pl pt pt br pt pt ro ru si sk sl sq sr sr latn sv sw ta te th tl tr uk ur uz vi zh cn zh hk zh tw zu densities native code stopping app app apk on pixel xl c users groov appdata local android sdk platform tools adb s shell am force stop com example guitarstudenttracker c users groov appdata local android sdk platform tools adb s shell pm list packages com example guitarstudenttracker package com example guitarstudenttracker c users groov appdata local android sdk platform tools adb s shell cat data local tmp sky com example guitarstudenttracker installing apk c users groov appdata local android sdk platform tools adb version android debug bridge version version installed as c users groov appdata local android sdk platform tools adb exe c users groov appdata local android sdk platform tools adb start server installing build app outputs apk app apk c users groov appdata local android sdk platform tools adb s install r build app outputs apk app apk success c users groov appdata local android sdk platform tools adb s shell echo n data local tmp sky com example guitarstudenttracker pixel xl startapp c users groov appdata local android sdk platform tools adb s shell am start a android intent action run f ez enable background compilation true ez enable dart profiling true ez enable checked mode true com example guitarstudenttracker com example guitarstudenttracker mainactivity starting intent act android intent action run flg cmp com example guitarstudenttracker mainactivity has extras waiting for observatory port to be available i flutteractivitydelegate onresume setting current activity to this 
observatory url on device c users groov appdata local android sdk platform tools adb s forward tcp tcp forwarded host port to device port for observatory connecting to service protocol successfully connected to service protocol getvm getisolate isolateid isolates flutter listviews isolateid isolates devfs creating new filesystem on the device null createdevfs fsname guitar student tracker devfs created new filesystem on the device file data user com example guitarstudenttracker cache guitar student trackerrvdohp guitar student tracker updating assets syncing files to device pixel xl devfs starting sync from localdirectory c users groov flutter projects guitar student tracker scanning project files scanning package files scanning asset files scanning for deleted files compiling dart to kernel with updated files c flutter bin cache dart sdk bin dart c flutter bin cache artifacts engine windows frontend server dart snapshot sdk root c flutter bin cache artifacts engine common flutter patched sdk incremental strong target flutter output dill build app dill packages c users groov flutter projects guitar student tracker packages filesystem scheme org dartlang root updating files devfs sync finished synced flutter listviews isolateid isolates connected to flutterview 🔥 to hot reload changes while running press r to hot restart and rebuild state press r an observatory debugger and profiler on pixel xl is available at for a more detailed help message press h to quit press q i flutter ══╡ exception caught by rendering library ╞═════════════════════════════════════════════════════════ i flutter the following message was thrown during layout i flutter a renderflex overflowed by pixels on the right i flutter i flutter the overflowing renderflex has an orientation of axis horizontal i flutter the edge of the renderflex that is overflowing has been marked in the rendering with a yellow and i flutter black striped pattern this is usually caused by the contents being too big for 
the renderflex i flutter consider applying a flex factor e g using an expanded widget to force the children of the i flutter renderflex to fit within the available space instead of being sized to their natural size i flutter this is considered an error condition because it indicates that there is content that cannot be i flutter seen if the content is legitimately bigger than the available space consider clipping it with a i flutter cliprect widget before putting it in the flex or using a scrollable container rather than a flex i flutter like a listview i flutter the specific renderflex in question is i flutter renderflex relayoutboundary overflowing i flutter creator row ← icontheme ← builder ← center ← padding ← container ← icontheme ← builder ← listener i flutter ← gesturesemantics ← rawgesturedetector ← gesturedetector ← ⋯ i flutter parentdata offset offset can use size i flutter constraints boxconstraints w h i flutter size size i flutter direction horizontal i flutter mainaxisalignment start i flutter mainaxissize min i flutter crossaxisalignment center i flutter textdirection ltr i flutter verticaldirection down i flutter ◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤◢◤ i flutter ════════════════════════════════════════════════════════════════════════════════════════════════════ run flutter analyze and attach any output of that command below if there are any analysis errors try resolving them before filing this issue c users groov flutter projects guitar student tracker flutter analyze analyzing guitar student tracker info unused import package flutter material dart lib addnewlessonpage dart info unused import package google places picker google places picker dart lib addnewstudentpage dart info name non constant identifiers using lowercamelcase lib bottombar dart info unused import package guitar student tracker addnewlessonpage dart lib calendarpage dart info unused import package guitar student tracker 
calendarpage dart lib main dart info unused import package guitar student tracker globals dart lib main dart issues found ran in c flutter bin flutter bat no color doctor doctor summary to see all details run flutter doctor v flutter channel beta on microsoft windows locale en us android toolchain develop for android devices android sdk android studio version android studio version x flutter plugin not installed this adds flutter specific functionality x dart plugin not installed this adds dart specific functionality connected devices available • no issues found process finished with exit code
| 0
|
794,895
| 28,054,035,571
|
IssuesEvent
|
2023-03-29 08:11:32
|
hotosm/odkconvert
|
https://api.github.com/repos/hotosm/odkconvert
|
opened
|
Support using Overpass for data extracts
|
enhancement progress Priority: Must have
|
Some people will prefer to use Overpass to make data extracts, and FMTM needs this till the Underpass database is fully populated.
|
1.0
|
Support using Overpass for data extracts - Some people will prefer to use Overpass to make data extracts, and FMTM needs this till the Underpass database is fully populated.
|
non_process
|
support using overpass for data extracts some people will prefer to use overpass to make data extracts and fmtm needs this till the underpass database is fully populated
| 0
|
821,320
| 30,817,098,593
|
IssuesEvent
|
2023-08-01 14:06:33
|
sygmaprotocol/explorer-ui
|
https://api.github.com/repos/sygmaprotocol/explorer-ui
|
closed
|
Time formating
|
Priority: P2
|
<!--- Provide a general summary of the issue in the Title above -->
We should use some library for time (last I remember Moment does this but it might be outdated by this point) to have formatted time on transfers (5 minutes ago, 1month ago...)
## Implementation details
<!-- Enter description of implementation that may help dev team -->
Transfer list should just have formmated time and the transfer detail should have full time and date.
Etherscan for example
Detail: https://etherscan.io/tx/0xa6dbae65d8a5383aa9846fdb12009928098f615359ad1595e709672e2a172e57
List https://etherscan.io/token/0x6c5ba91642f10282b576d91922ae6448c9d52f4e?a=0x4038d6a58d58d2560554cf82002cde9cfaab0db1
## Testing details
<!-- Enter description of special test-cases-->
## Acceptance Criteria
<!-- Enter the conditions of satisfaction here. That is, the conditions that will satisfy the user/persona that the goal/benefit/value has been achieved -->
|
1.0
|
Time formating - <!--- Provide a general summary of the issue in the Title above -->
We should use some library for time (last I remember Moment does this but it might be outdated by this point) to have formatted time on transfers (5 minutes ago, 1month ago...)
## Implementation details
<!-- Enter description of implementation that may help dev team -->
Transfer list should just have formmated time and the transfer detail should have full time and date.
Etherscan for example
Detail: https://etherscan.io/tx/0xa6dbae65d8a5383aa9846fdb12009928098f615359ad1595e709672e2a172e57
List https://etherscan.io/token/0x6c5ba91642f10282b576d91922ae6448c9d52f4e?a=0x4038d6a58d58d2560554cf82002cde9cfaab0db1
## Testing details
<!-- Enter description of special test-cases-->
## Acceptance Criteria
<!-- Enter the conditions of satisfaction here. That is, the conditions that will satisfy the user/persona that the goal/benefit/value has been achieved -->
|
non_process
|
time formating we should use some library for time last i remember moment does this but it might be outdated by this point to have formatted time on transfers minutes ago ago implementation details transfer list should just have formmated time and the transfer detail should have full time and date etherscan for example detail list testing details acceptance criteria
| 0
|
10,503
| 13,262,215,674
|
IssuesEvent
|
2020-08-20 21:19:40
|
google/verible
|
https://api.github.com/repos/google/verible
|
closed
|
strip comments tool, based on lexer
|
good first issue help wanted preprocessor
|
Task:
Create a library function (text-in, text-out) and standalone tool to strip `//comments` and `/*comments*/` from source code, starting with the lexer's output, a token stream.
Suggest creating a new binary tool: `verible-verilog-preprocess` whose first positional argument includes the subcommand `strip-comments` for this task. Classifying this under 'preprocessor' because in other languages, it is the preprocessor that strips comments before feeding a parser.
Lexical errors may be ignored, as long as their text is preserved in the output.
Options:
* recursive: apply to comments inside un-lexed text, such as macro arguments and macro definition bodies
* `white-out` vs. `delete`. `white-out` replaces the affected text byte-for-byte with spaces (preserving newlines and tabs). This is useful for producing text whose byte offsets and line numbers are consistent with the original, for diagnostic purposes for example.
May result in text with trailing spaces.
Tips:
* Example of recursive lexing: https://github.com/google/verible/blob/master/verilog/analysis/verilog_equivalence.cc
* Example of a tool with subcommands: https://github.com/google/verible/blob/master/common/tools/patch_tool.cc
|
1.0
|
strip comments tool, based on lexer - Task:
Create a library function (text-in, text-out) and standalone tool to strip `//comments` and `/*comments*/` from source code, starting with the lexer's output, a token stream.
Suggest creating a new binary tool: `verible-verilog-preprocess` whose first positional argument includes the subcommand `strip-comments` for this task. Classifying this under 'preprocessor' because in other languages, it is the preprocessor that strips comments before feeding a parser.
Lexical errors may be ignored, as long as their text is preserved in the output.
Options:
* recursive: apply to comments inside un-lexed text, such as macro arguments and macro definition bodies
* `white-out` vs. `delete`. `white-out` replaces the affected text byte-for-byte with spaces (preserving newlines and tabs). This is useful for producing text whose byte offsets and line numbers are consistent with the original, for diagnostic purposes for example.
May result in text with trailing spaces.
Tips:
* Example of recursive lexing: https://github.com/google/verible/blob/master/verilog/analysis/verilog_equivalence.cc
* Example of a tool with subcommands: https://github.com/google/verible/blob/master/common/tools/patch_tool.cc
|
process
|
strip comments tool based on lexer task create a library function text in text out and standalone tool to strip comments and comments from source code starting with the lexer s output a token stream suggest creating a new binary tool verible verilog preprocess whose first positional argument includes the subcommand strip comments for this task classifying this under preprocessor because in other languages it is the preprocessor that strips comments before feeding a parser lexical errors may be ignored as long as their text is preserved in the output options recursive apply to comments inside un lexed text such as macro arguments and macro definition bodies white out vs delete white out replaces the affected text byte for byte with spaces preserving newlines and tabs this is useful for producing text whose byte offsets and line numbers are consistent with the original for diagnostic purposes for example may result in text with trailing spaces tips example of recursive lexing example of a tool with subcommands
| 1
|
174,699
| 21,300,371,320
|
IssuesEvent
|
2022-04-15 01:43:36
|
JoseSunoj/markdown-here
|
https://api.github.com/repos/JoseSunoj/markdown-here
|
opened
|
CVE-2021-43138 (High) detected in async-2.6.1.tgz
|
security vulnerability
|
## CVE-2021-43138 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>async-2.6.1.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-2.6.1.tgz">https://registry.npmjs.org/async/-/async-2.6.1.tgz</a></p>
<p>Path to dependency file: /utils/package.json</p>
<p>Path to vulnerable library: /utils/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- archiver-3.0.0.tgz (Root Library)
- :x: **async-2.6.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability exists in Async through 3.2.1 (fixed in 3.2.2) , which could let a malicious user obtain privileges via the mapValues() method.
<p>Publish Date: 2022-04-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p>
<p>Release Date: 2022-04-06</p>
<p>Fix Resolution (async): 3.2.2</p>
<p>Direct dependency fix Resolution (archiver): 4.0.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-43138 (High) detected in async-2.6.1.tgz - ## CVE-2021-43138 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>async-2.6.1.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-2.6.1.tgz">https://registry.npmjs.org/async/-/async-2.6.1.tgz</a></p>
<p>Path to dependency file: /utils/package.json</p>
<p>Path to vulnerable library: /utils/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- archiver-3.0.0.tgz (Root Library)
- :x: **async-2.6.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability exists in Async through 3.2.1 (fixed in 3.2.2) , which could let a malicious user obtain privileges via the mapValues() method.
<p>Publish Date: 2022-04-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p>
<p>Release Date: 2022-04-06</p>
<p>Fix Resolution (async): 3.2.2</p>
<p>Direct dependency fix Resolution (archiver): 4.0.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in async tgz cve high severity vulnerability vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file utils package json path to vulnerable library utils node modules async package json dependency hierarchy archiver tgz root library x async tgz vulnerable library found in base branch master vulnerability details a vulnerability exists in async through fixed in which could let a malicious user obtain privileges via the mapvalues method publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution async direct dependency fix resolution archiver step up your open source security game with whitesource
| 0
|
21,537
| 29,837,211,669
|
IssuesEvent
|
2023-06-19 00:37:43
|
devssa/onde-codar-em-salvador
|
https://api.github.com/repos/devssa/onde-codar-em-salvador
|
closed
|
[Hibrido / Belo Horizonte] Fullstack Developer (Asp.Net/Javascript) na Coodesh
|
SALVADOR JAVASCRIPT FULL-STACK MVC HTML JQUERY SQL BOOTSTRAP JSON XML REQUISITOS ASP.NET PROCESSOS GITHUB CI UMA TFS C QUALIDADE PADRÕ WEB FORMS ALOCADO Stale
|
## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/jobs/desenvolvedor-c-111415427?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋
<p>A Revelar RH está em busca de Fullstack Developer (Asp.Net/Javascript) para compor seu time!</p>
<p>Responsabilidades:</p>
<ul>
<li>Irá atuar no time de desenvolvimento na Análise, desenvolvimento e sustentação do sistema na plataforma Asp.Net C# seguindo os padrões de código de qualidade estabelecidos.</li>
</ul>
<p>Horário de segunda à sexta, das 8h às 17h. </p>
## Revelar RH:
<p>A Revelar RH é uma empresa inovadora que busca alinhar os objetivos das Empresas que têm carência de profissionais qualificados, com os das pessoas que estão em busca de uma oportunidades no mercado de trabalho. Nosso trabalho visa o aprimoramento e racionalização de processos voltados à Gestão de Pessoas, sendo realizado de forma personalizada para cada cliente. Atualmente atendemos várias empresas de tecnologia e softwares house. </p></p>
## Habilidades:
- .NET
- C#
- Asp.Net MVC
## Local:
Belo Horizonte
## Requisitos:
- Superior em curso em Ciência da Computação, Sistemas de Informação e áreas afins;
- Conhecimentos nivel júnior em: .NET Framework 3.5 e superior, Asp.Net C#, Web Forms, MVC;
- Conhecimentos nivel júnior em: JSON, XML, HTML, CSS, JavaScript, Jquery, Bootstrap;
- Conhecimentos nivel júnior em SQL (Querys Language);
- Controle de Versão (TFS).
## Benefícios:
- Plano de saúde;
- Plano odontológico;
- Seguro de vida;
- Cartão de benefícios Swile (alimentação, auxílio internet, desconto em academia).
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Fullstack Developer (Asp.Net/Javascript) na Revelar RH](https://coodesh.com/jobs/desenvolvedor-c-111415427?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Alocado
#### Regime
CLT
#### Categoria
Full-Stack
|
1.0
|
[Hibrido / Belo Horizonte] Fullstack Developer (Asp.Net/Javascript) na Coodesh - ## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/jobs/desenvolvedor-c-111415427?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋
<p>A Revelar RH está em busca de Fullstack Developer (Asp.Net/Javascript) para compor seu time!</p>
<p>Responsabilidades:</p>
<ul>
<li>Irá atuar no time de desenvolvimento na Análise, desenvolvimento e sustentação do sistema na plataforma Asp.Net C# seguindo os padrões de código de qualidade estabelecidos.</li>
</ul>
<p>Horário de segunda à sexta, das 8h às 17h. </p>
## Revelar RH:
<p>A Revelar RH é uma empresa inovadora que busca alinhar os objetivos das Empresas que têm carência de profissionais qualificados, com os das pessoas que estão em busca de uma oportunidades no mercado de trabalho. Nosso trabalho visa o aprimoramento e racionalização de processos voltados à Gestão de Pessoas, sendo realizado de forma personalizada para cada cliente. Atualmente atendemos várias empresas de tecnologia e softwares house. </p></p>
## Habilidades:
- .NET
- C#
- Asp.Net MVC
## Local:
Belo Horizonte
## Requisitos:
- Superior em curso em Ciência da Computação, Sistemas de Informação e áreas afins;
- Conhecimentos nivel júnior em: .NET Framework 3.5 e superior, Asp.Net C#, Web Forms, MVC;
- Conhecimentos nivel júnior em: JSON, XML, HTML, CSS, JavaScript, Jquery, Bootstrap;
- Conhecimentos nivel júnior em SQL (Querys Language);
- Controle de Versão (TFS).
## Benefícios:
- Plano de saúde;
- Plano odontológico;
- Seguro de vida;
- Cartão de benefícios Swile (alimentação, auxílio internet, desconto em academia).
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Fullstack Developer (Asp.Net/Javascript) na Revelar RH](https://coodesh.com/jobs/desenvolvedor-c-111415427?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Alocado
#### Regime
CLT
#### Categoria
Full-Stack
|
process
|
fullstack developer asp net javascript na coodesh descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a revelar rh está em busca de fullstack developer asp net javascript para compor seu time responsabilidades irá atuar no time de desenvolvimento na análise desenvolvimento e sustentação do sistema na plataforma asp net c seguindo os padrões de código de qualidade estabelecidos horário de segunda à sexta das às nbsp revelar rh a revelar rh é uma empresa inovadora que busca alinhar os objetivos das empresas que têm carência de profissionais qualificados com os das pessoas que estão em busca de uma oportunidades no mercado de trabalho nosso trabalho visa o aprimoramento e racionalização de processos voltados à gestão de pessoas sendo realizado de forma personalizada para cada cliente atualmente atendemos várias empresas de tecnologia e softwares house nbsp habilidades net c asp net mvc local belo horizonte requisitos superior em curso em ciência da computação sistemas de informação e áreas afins conhecimentos nivel júnior em net framework e superior asp net c web forms mvc conhecimentos nivel júnior em json xml html css javascript jquery bootstrap conhecimentos nivel júnior em sql querys language controle de versão tfs benefícios plano de saúde plano odontológico seguro de vida cartão de benefícios swile alimentação auxílio internet desconto em academia como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação 
labels alocação alocado regime clt categoria full stack
| 1
|
80,819
| 15,587,285,172
|
IssuesEvent
|
2021-03-18 03:58:26
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
closed
|
Misguided code lab - first flutter app pt1 - invalid constant value
|
d: codelabs d: website - content documentation easy fix passed first triage
|
Step 3 for this code lab is using the "const" keyword for the "Center" widget. However, the next step (step 4) is using an external package (english_words) that is causing the code to generate an "invalid constant value" error if the person following the lab does not remove the "const" keyword. The actual code for step 4 is correct in the code lab, however, the instructions do not indicate to the user that they need to remove the "const" keyword. My suggestion is to simply remove the "const" keyword from the step 3 of the lab. This will prevent the error from occurring if the user is trying to follow along with the code lab.
Reference: https://codelabs.developers.google.com/codelabs/first-flutter-app-pt1
Screenshot for step 3:

Screenshot for step 4 (instructions do not say to remove "const"):

|
1.0
|
Misguided code lab - first flutter app pt1 - invalid constant value - Step 3 for this code lab is using the "const" keyword for the "Center" widget. However, the next step (step 4) is using an external package (english_words) that is causing the code to generate an "invalid constant value" error if the person following the lab does not remove the "const" keyword. The actual code for step 4 is correct in the code lab, however, the instructions do not indicate to the user that they need to remove the "const" keyword. My suggestion is to simply remove the "const" keyword from the step 3 of the lab. This will prevent the error from occurring if the user is trying to follow along with the code lab.
Reference: https://codelabs.developers.google.com/codelabs/first-flutter-app-pt1
Screenshot for step 3:

Screenshot for step 4 (instructions do not say to remove "const"):

|
non_process
|
misguided code lab first flutter app invalid constant value step for this code lab is using the const keyword for the center widget however the next step step is using an external package english words that is causing the code to generate an invalid constant value error if the person following the lab does not remove the const keyword the actual code for step is correct in the code lab however the instructions do not indicate to the user that they need to remove the const keyword my suggestion is to simply remove the const keyword from the step of the lab this will prevent the error from occurring if the user is trying to follow along with the code lab reference screenshot for step screenshot for step instructions do not say to remove const
| 0
|
10,250
| 13,105,149,188
|
IssuesEvent
|
2020-08-04 11:38:42
|
Explore-AI/test-repo
|
https://api.github.com/repos/Explore-AI/test-repo
|
closed
|
pre-pro test
|
bug content-type:pre-processing student-submitted
|
Content Type: Pre-Processing
Content Name: pre-pro test
Problem: pre-pro desc
Details:
pre-pro desc
Supporting Files: https://drive.google.com/open?id=1XuatP5O6ODNLTwM2-qbtHPXEj0oZb87z
Reported by: jacob@explore-ai.net
|
1.0
|
pre-pro test - Content Type: Pre-Processing
Content Name: pre-pro test
Problem: pre-pro desc
Details:
pre-pro desc
Supporting Files: https://drive.google.com/open?id=1XuatP5O6ODNLTwM2-qbtHPXEj0oZb87z
Reported by: jacob@explore-ai.net
|
process
|
pre pro test content type pre processing content name pre pro test problem pre pro desc details pre pro desc supporting files reported by jacob explore ai net
| 1
|
123,231
| 12,195,411,117
|
IssuesEvent
|
2020-04-29 17:19:47
|
trussworks/react-uswds
|
https://api.github.com/repos/trussworks/react-uswds
|
closed
|
Set up Storybook site as public Github page
|
documentation
|
In order to make this library more accessible and so it's easier to view the components, we should set up the Storybook instance as a public Github page. More info https://storybook.js.org/docs/basics/exporting-storybook/#deploying-to-github-pages
|
1.0
|
Set up Storybook site as public Github page - In order to make this library more accessible and so it's easier to view the components, we should set up the Storybook instance as a public Github page. More info https://storybook.js.org/docs/basics/exporting-storybook/#deploying-to-github-pages
|
non_process
|
set up storybook site as public github page in order to make this library more accessible and so it s easier to view the components we should set up the storybook instance as a public github page more info
| 0
|
20,297
| 26,935,839,537
|
IssuesEvent
|
2023-02-07 20:34:19
|
benthosdev/benthos
|
https://api.github.com/repos/benthosdev/benthos
|
closed
|
ZSTD compresssion support
|
enhancement processors
|
Hey,
It would be great if compress/decompress processor would support ZSTD as well.
Thanks
|
1.0
|
ZSTD compresssion support - Hey,
It would be great if compress/decompress processor would support ZSTD as well.
Thanks
|
process
|
zstd compresssion support hey it would be great if compress decompress processor would support zstd as well thanks
| 1
|
14,213
| 17,111,349,329
|
IssuesEvent
|
2021-07-10 11:18:11
|
sujaldev/skylon
|
https://api.github.com/repos/sujaldev/skylon
|
closed
|
Preprocessor does not generate parse errors
|
Preprocessor
|
# WHAT FAILS
No parse errors are being generated
# TESTS TO REPRODUCE
tokenizer@unicodeChars#10
tokenizer@unicodeChars#11
tokenizer@unicodeChars#62
tokenizer@unicodeChars#79
tokenizer@test3#356
tokenizer@test3#557
and many many more.
# POSSIBLE FIX
Append parse errors generated by the preprocessing stage to the tokenizer's parse errors.
|
1.0
|
Preprocessor does not generate parse errors - # WHAT FAILS
No parse errors are being generated
# TESTS TO REPRODUCE
tokenizer@unicodeChars#10
tokenizer@unicodeChars#11
tokenizer@unicodeChars#62
tokenizer@unicodeChars#79
tokenizer@test3#356
tokenizer@test3#557
and many many more.
# POSSIBLE FIX
Append parse errors generated by the preprocessing stage to the tokenizer's parse errors.
|
process
|
preprocessor does not generate parse errors what fails no parse errors are being generated tests to reproduce tokenizer unicodechars tokenizer unicodechars tokenizer unicodechars tokenizer unicodechars tokenizer tokenizer and many many more possible fix append parse errors generated by the preprocessing stage to the tokenizer s parse errors
| 1