Dataset schema (column name, dtype, and value range or number of distinct classes):

| Column | Dtype | Range / classes |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 to 19 |
| repo | stringlengths | 7 to 112 |
| repo_url | stringlengths | 36 to 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 to 853 |
| labels | stringlengths | 4 to 898 |
| body | stringlengths | 2 to 262k |
| index | stringclasses | 13 values |
| text_combine | stringlengths | 96 to 262k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 to 250k |
| binary_label | int64 | 0 to 1 |
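The schema above is the kind of summary a dataframe viewer prints for this issue-classification dataset. As a minimal sketch of how the columns fit together, the illustrative pandas snippet below rebuilds the metadata fields of two sample rows that appear later in this dump and filters on `binary_label`; the dataset's actual file name and loading path are not given here, so the frame is constructed inline rather than loaded from disk.

```python
import pandas as pd

# Two of the sample records from this dump, metadata fields only
# (long text columns like body/text_combine/text are omitted for brevity).
df = pd.DataFrame([
    {
        "id": 22740492328,
        "type": "IssuesEvent",
        "created_at": "2022-07-07 02:58:34",
        "repo": "BioArchLinux/Packages",
        "action": "closed",
        "title": "[MAINTAIN] RESTfulSummarizedExperiment error load",
        "index": "non_build",
        "binary_label": 0,
    },
    {
        "id": 13252911997,
        "type": "IssuesEvent",
        "created_at": "2020-08-20 06:33:00",
        "repo": "microsoft/onnxruntime",
        "action": "closed",
        "title": "Failed to build onnxruntime with TensorRT on Windows 10",
        "index": "build",
        "binary_label": 1,
    },
])

# Select build-related issues via the binary label column.
build_issues = df[df["binary_label"] == 1]
print(build_issues["repo"].tolist())  # ['microsoft/onnxruntime']
```

The `index` string class (`build` vs `non_build`) and the `binary_label` integer carry the same information in these rows, so either can drive the filter.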
---
Record (Unnamed: 0 = 4,417)
id: 22,740,492,328
type: IssuesEvent
created_at: 2022-07-07 02:58:34
repo: BioArchLinux/Packages
repo_url: https://api.github.com/repos/BioArchLinux/Packages
action: closed
title: [MAINTAIN] RESTfulSummarizedExperiment error load
labels: maintain
body:
<!-- Please report the error of one package in one issue! Use multi issues to report multi bugs. Thanks! --> **Log of the bug** <details> ``` * installing *source* package ‘tenXplore’ ... ** using staged installation ** R ** data ** inst ** byte-compile and prepare package for lazy loading Error: class "RESTfulSummarizedExperiment" is not exported by 'namespace:restfulSE' Execution halted ERROR: lazy loading failed for package ‘tenXplore’ * removing ‘/build/r-tenxplore/src/tenXplore’ * restoring previous ‘/build/r-tenxplore/src/tenXplore’ ``` </details> **Packages (please complete the following information):** - Package Name: [e.g. iqtree] **Description** Add any other context about the problem here.
label: True
text_combine:
[MAINTAIN] RESTfulSummarizedExperiment error load - <!-- Please report the error of one package in one issue! Use multi issues to report multi bugs. Thanks! --> **Log of the bug** <details> ``` * installing *source* package ‘tenXplore’ ... ** using staged installation ** R ** data ** inst ** byte-compile and prepare package for lazy loading Error: class "RESTfulSummarizedExperiment" is not exported by 'namespace:restfulSE' Execution halted ERROR: lazy loading failed for package ‘tenXplore’ * removing ‘/build/r-tenxplore/src/tenXplore’ * restoring previous ‘/build/r-tenxplore/src/tenXplore’ ``` </details> **Packages (please complete the following information):** - Package Name: [e.g. iqtree] **Description** Add any other context about the problem here.
index: non_build
text:
restfulsummarizedexperiment error load please report the error of one package in one issue use multi issues to report multi bugs thanks log of the bug installing source package ‘tenxplore’ using staged installation r data inst byte compile and prepare package for lazy loading error class restfulsummarizedexperiment is not exported by namespace restfulse execution halted error lazy loading failed for package ‘tenxplore’ removing ‘ build r tenxplore src tenxplore’ restoring previous ‘ build r tenxplore src tenxplore’ packages please complete the following information package name description add any other context about the problem here
binary_label: 0
---
Record (Unnamed: 0 = 746,967)
id: 26,052,189,195
type: IssuesEvent
created_at: 2022-12-22 19:59:04
repo: bounswe/bounswe2022group8
repo_url: https://api.github.com/repos/bounswe/bounswe2022group8
action: opened
title: BE-40: Recommendation System
labels: Effort: High Priority: Medium Status: In Progress Coding Team: Backend
body:
### What's up? As a part of our requirements we will create a recommendation system that will recommend to registered users, similar art item content. While deciding the similarity, various parameters will be taken into consideration, such as category and tags of art items along with likes and history of the user. ### To Do * [ ] Research useful technologies and tools that can be used for building a recommendation system. * [ ] Decide on the parameters we will use. * [ ] Prepare helpful structures for recommendation. (Such as [category](https://github.com/bounswe/bounswe2022group8/issues/357) and [user history](https://github.com/bounswe/bounswe2022group8/issues/353)) * [ ] Implement functionality for deciding on the similarity of content. * [ ] Implement functionality for matching the content to user. * [ ] Implement necessary API(s). ### Deadline 25.12.2022 @23.59 ### Additional Information _No response_ ### Reviewers @KarahanS @se
label: 1.0
text_combine:
BE-40: Recommendation System - ### What's up? As a part of our requirements we will create a recommendation system that will recommend to registered users, similar art item content. While deciding the similarity, various parameters will be taken into consideration, such as category and tags of art items along with likes and history of the user. ### To Do * [ ] Research useful technologies and tools that can be used for building a recommendation system. * [ ] Decide on the parameters we will use. * [ ] Prepare helpful structures for recommendation. (Such as [category](https://github.com/bounswe/bounswe2022group8/issues/357) and [user history](https://github.com/bounswe/bounswe2022group8/issues/353)) * [ ] Implement functionality for deciding on the similarity of content. * [ ] Implement functionality for matching the content to user. * [ ] Implement necessary API(s). ### Deadline 25.12.2022 @23.59 ### Additional Information _No response_ ### Reviewers @KarahanS @se
index: non_build
text:
be recommendation system what s up as a part of our requirements we will create a recommendation system that will recommend to registered users similar art item content while deciding the similarity various parameters will be taken into consideration such as category and tags of art items along with likes and history of the user to do research useful technologies and tools that can be used for building a recommendation system decide on the parameters we will use prepare helpful structures for recommendation such as and implement functionality for deciding on the similarity of content implement functionality for matching the content to user implement necessary api s deadline additional information no response reviewers karahans se
binary_label: 0
---
Record (Unnamed: 0 = 96,059)
id: 19,862,342,875
type: IssuesEvent
created_at: 2022-01-22 02:44:17
repo: trezor/trezor-suite
repo_url: https://api.github.com/repos/trezor/trezor-suite
action: opened
title: Clean up messages
labels: code
body:
The following merge improved messages.ts and en.json. https://github.com/trezor/trezor-suite/pull/4777 https://github.com/trezor/trezor-suite/pull/4778 Therefore, I considered cleaning up message files. I've written python code that does a full-text search of all the files to see if the keys contained in the messages.ts file are actually used. This code will eventually output new messages.ts and master.json based on the key actually used. I will attach this code: [cleanup_messages.zip](https://github.com/trezor/trezor-suite/files/7917482/cleanup_messages.zip) How to use ---------- Create a work folder in the folder where the trezor-suite repository is located and place `cleanup_messages.py`. Make sure the repository HEAD is on the latest develop branch. ``` cd work cp -r ../trezor-suite/packages/components . cp -r ../trezor-suite/packages/suite . cp -r ../trezor-suite/packages/suite-desktop . cp -r ../trezor-suite/packages/suite-web-landing . cp ../trezor-suite/packages/suite/src/support/messages.ts . python3 cleanup_messages.py ``` Cleaned up files will be output in about an hour. The result I did was that the latest message.ts contained 1680 keys, but after cleanup it was 1402. As a result, 278 unused keys have been removed. Uploading the output new_master.json to crowdin will reduce the translation effort by 17%. Please consider it. Thanks, motty fujicoin.org
label: 1.0
text_combine:
Clean up messages - The following merge improved messages.ts and en.json. https://github.com/trezor/trezor-suite/pull/4777 https://github.com/trezor/trezor-suite/pull/4778 Therefore, I considered cleaning up message files. I've written python code that does a full-text search of all the files to see if the keys contained in the messages.ts file are actually used. This code will eventually output new messages.ts and master.json based on the key actually used. I will attach this code: [cleanup_messages.zip](https://github.com/trezor/trezor-suite/files/7917482/cleanup_messages.zip) How to use ---------- Create a work folder in the folder where the trezor-suite repository is located and place `cleanup_messages.py`. Make sure the repository HEAD is on the latest develop branch. ``` cd work cp -r ../trezor-suite/packages/components . cp -r ../trezor-suite/packages/suite . cp -r ../trezor-suite/packages/suite-desktop . cp -r ../trezor-suite/packages/suite-web-landing . cp ../trezor-suite/packages/suite/src/support/messages.ts . python3 cleanup_messages.py ``` Cleaned up files will be output in about an hour. The result I did was that the latest message.ts contained 1680 keys, but after cleanup it was 1402. As a result, 278 unused keys have been removed. Uploading the output new_master.json to crowdin will reduce the translation effort by 17%. Please consider it. Thanks, motty fujicoin.org
index: non_build
text:
clean up messages the following merge improved messages ts and en json therefore i considered cleaning up message files i ve written python code that does a full text search of all the files to see if the keys contained in the messages ts file are actually used this code will eventually output new messages ts and master json based on the key actually used i will attach this code how to use create a work folder in the folder where the trezor suite repository is located and place cleanup messages py make sure the repository head is on the latest develop branch cd work cp r trezor suite packages components cp r trezor suite packages suite cp r trezor suite packages suite desktop cp r trezor suite packages suite web landing cp trezor suite packages suite src support messages ts cleanup messages py cleaned up files will be output in about an hour the result i did was that the latest message ts contained keys but after cleanup it was as a result unused keys have been removed uploading the output new master json to crowdin will reduce the translation effort by please consider it thanks motty fujicoin org
binary_label: 0
---
Record (Unnamed: 0 = 722,838)
id: 24,875,694,421
type: IssuesEvent
created_at: 2022-10-27 18:53:11
repo: AY2223S1-CS2103T-T12-2/tp
repo_url: https://api.github.com/repos/AY2223S1-CS2103T-T12-2/tp
action: closed
title: 📝 Docs: update `readme.md`
labels: priority.Medium
body:
## Tasks - Update new UI photo - Update acknowledgements to include Tailwind for UI inspiration
label: 1.0
text_combine:
📝 Docs: update `readme.md` - ## Tasks - Update new UI photo - Update acknowledgements to include Tailwind for UI inspiration
index: non_build
text:
📝 docs update readme md tasks update new ui photo update acknowledgements to include tailwind for ui inspiration
binary_label: 0
---
Record (Unnamed: 0 = 54,105)
id: 13,252,911,997
type: IssuesEvent
created_at: 2020-08-20 06:33:00
repo: microsoft/onnxruntime
repo_url: https://api.github.com/repos/microsoft/onnxruntime
action: closed
title: Failed to build onnxruntime with TensorRT on Windows 10
labels: component:build ep:TensorRT
body:
**Describe the bug** I'm trying to build onnxruntime with tensorRT, and I'm getting errors as I will show. (Similar to [this](https://github.com/microsoft/onnxruntime/issues/3630#issue-604657058) issue) **Urgency** none (But I've been working on this issue for about a week, but still can't figure it out....) **System information** - OS Platform and Distribution: Windows10 - ONNX Runtime installed from (source or binary): source - ONNX Runtime version: 1.4.0 - Python version: 3.7.0 - Visual Studio version (if applicable): VS2019 Community 16.6.5 - GCC/Compiler version (if compiling from source): - - CUDA/cuDNN version: 10.2/ 7.6.5 - TensorRT version: 7.0.0.11 with CUDA10.2 - GPU model and memory: Nvidia RTX 2080 Ti 11G **To Reproduce** Go to onnxruntime's directory, open terminal and type command as below: `build.bat --config Release --parallel --build_shared_lib --use_cuda --cuda_version 10.2 --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.2" --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.2" --use_tensorrt --tensorrt_home "D:\Coco\Libs\TensorRT-7.0.0.11_cuda10.2" --cmake_generator "Visual Studio 16 2019"` **Expected behavior** Finish the build successfully. **Screenshots** ![image](https://user-images.githubusercontent.com/37019005/89774383-593c5880-db38-11ea-9f8d-43e15eae34a0.png) **Additional context** I've tried to build with CUDA10.0 succefully without build with tensorRT with the command below in build-in terminal: `build.bat --config Release --build_shared_lib --parallel --use_cuda --cuda_version 10.0 --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.0" --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.0" --cmake_generator "Visual Studio 15 2017"` (The building also failed if with tensorRT) Due to the usage of TensorRT, I can build onnxruntime with CUDA10.2 with VS2019 by similar command mentioned above. But I cannot build successfully if it's with TensorRT. 
I also tried to clone the repo with --recursive, but the error seems the same... Here's some log file during the build: [log_with_trt_vs2019.txt](https://drive.google.com/file/d/10Rg3kHeVEl_oBiwYwe73LAT0qd6fqxBT/view?usp=sharing) Thanks in advance for any help!
label: 1.0
text_combine:
Failed to build onnxruntime with TensorRT on Windows 10 - **Describe the bug** I'm trying to build onnxruntime with tensorRT, and I'm getting errors as I will show. (Similar to [this](https://github.com/microsoft/onnxruntime/issues/3630#issue-604657058) issue) **Urgency** none (But I've been working on this issue for about a week, but still can't figure it out....) **System information** - OS Platform and Distribution: Windows10 - ONNX Runtime installed from (source or binary): source - ONNX Runtime version: 1.4.0 - Python version: 3.7.0 - Visual Studio version (if applicable): VS2019 Community 16.6.5 - GCC/Compiler version (if compiling from source): - - CUDA/cuDNN version: 10.2/ 7.6.5 - TensorRT version: 7.0.0.11 with CUDA10.2 - GPU model and memory: Nvidia RTX 2080 Ti 11G **To Reproduce** Go to onnxruntime's directory, open terminal and type command as below: `build.bat --config Release --parallel --build_shared_lib --use_cuda --cuda_version 10.2 --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.2" --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.2" --use_tensorrt --tensorrt_home "D:\Coco\Libs\TensorRT-7.0.0.11_cuda10.2" --cmake_generator "Visual Studio 16 2019"` **Expected behavior** Finish the build successfully. **Screenshots** ![image](https://user-images.githubusercontent.com/37019005/89774383-593c5880-db38-11ea-9f8d-43e15eae34a0.png) **Additional context** I've tried to build with CUDA10.0 succefully without build with tensorRT with the command below in build-in terminal: `build.bat --config Release --build_shared_lib --parallel --use_cuda --cuda_version 10.0 --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.0" --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.0" --cmake_generator "Visual Studio 15 2017"` (The building also failed if with tensorRT) Due to the usage of TensorRT, I can build onnxruntime with CUDA10.2 with VS2019 by similar command mentioned above. 
But I cannot build successfully if it's with TensorRT. I also tried to clone the repo with --recursive, but the error seems the same... Here's some log file during the build: [log_with_trt_vs2019.txt](https://drive.google.com/file/d/10Rg3kHeVEl_oBiwYwe73LAT0qd6fqxBT/view?usp=sharing) Thanks in advance for any help!
index: build
text:
failed to build onnxruntime with tensorrt on windows describe the bug i m trying to build onnxruntime with tensorrt and i m getting errors as i will show similar to issue urgency none but i ve been working on this issue for about a week but still can t figure it out system information os platform and distribution onnx runtime installed from source or binary source onnx runtime version python version visual studio version if applicable community gcc compiler version if compiling from source cuda cudnn version tensorrt version with gpu model and memory nvidia rtx ti to reproduce go to onnxruntime s directory open terminal and type command as below build bat config release parallel build shared lib use cuda cuda version cudnn home c program files nvidia gpu computing toolkit cuda cuda home c program files nvidia gpu computing toolkit cuda use tensorrt tensorrt home d coco libs tensorrt cmake generator visual studio expected behavior finish the build successfully screenshots additional context i ve tried to build with succefully without build with tensorrt with the command below in build in terminal build bat config release build shared lib parallel use cuda cuda version cuda home c program files nvidia gpu computing toolkit cuda cudnn home c program files nvidia gpu computing toolkit cuda cmake generator visual studio the building also failed if with tensorrt due to the usage of tensorrt i can build onnxruntime with with by similar command mentioned above but i cannot build successfully if it s with tensorrt i also tried to clone the repo with recursive but the error seems the same here s some log file during the build thanks in advance for any help
binary_label: 1
---
Record (Unnamed: 0 = 239,151)
id: 26,204,350,593
type: IssuesEvent
created_at: 2023-01-03 20:49:38
repo: istio/ztunnel
repo_url: https://api.github.com/repos/istio/ztunnel
action: closed
title: Implement (L4) RBAC
labels: area/security P0 OSS Alpha
body:
This was an initial attempt in Go: ```go unc (p *Proxy) AssertRBAC(r *http.Request) error { ip, dport, err := net.SplitHostPort(r.Host) if err != nil { return err } pip, err := netip.ParseAddr(ip) if err != nil { return err } wl := p.ConnectionManager.FindWorkloadByAddr(pip) var identity string if r.TLS != nil && len(r.TLS.PeerCertificates) > 0 { n, err := util.ExtractIDs(r.TLS.PeerCertificates[0].Extensions) if err != nil { return err } if len(n) > 0 { identity = n[0] } } if wl.RBAC == nil { return nil } var namespace string if identity != "" { s, err := spiffe.ParseIdentity(identity) if err != nil { return err } namespace = s.Namespace } log := log.WithLabels("ip", pip.String(), "ident", identity, "port", dport) // This is from the PEP, which handles this already // TODO: make this check more robust if wl.RemoteProxy != (netip.Addr{}) && identity == wl.Identity { if len(wl.RBAC.Allow) == 0 { log.Debugf("allow (no policies)") } else { log.Infof("allow (from remote)") } return nil } deny := rbacMatch(wl.RBAC.Deny, namespace, identity, dport) if deny { // TODO context return fmt.Errorf("RBAC: deny") } if len(wl.RBAC.Allow) == 0 { log.Debug("allow (no policies)") return nil } allow := rbacMatch(wl.RBAC.Allow, namespace, identity, dport) if allow { log.Info("allow") return nil } return fmt.Errorf("RBAC: no allow matched") } func rbacMatch(pol []*uproxyapi.Policy, namespace string, identity string, dports string) bool { dport, _ := strconv.Atoi(dports) for _, pol := range pol { // TODO pol.When ruleMatch := false for _, rule := range pol.Rule { rmatch := true if rule.Namespace != "" && rule.Namespace != namespace { rmatch = false } if rule.Identity != "" && "spiffe://"+rule.Identity != identity { rmatch = false } if rule.Invert { rmatch = !rmatch } ruleMatch = ruleMatch || rmatch } whenMatch := false if len(pol.When) == 0 { whenMatch = true } for _, when := range pol.When { rmatch := true if when.Port != 0 && when.Port != uint32(dport) { rmatch = false } if when.Invert 
{ rmatch = !rmatch } whenMatch = whenMatch || rmatch } if ruleMatch && whenMatch { return true } } return false } ``` Likely needs a lot of work though.
label: True
text_combine:
Implement (L4) RBAC - This was an initial attempt in Go: ```go unc (p *Proxy) AssertRBAC(r *http.Request) error { ip, dport, err := net.SplitHostPort(r.Host) if err != nil { return err } pip, err := netip.ParseAddr(ip) if err != nil { return err } wl := p.ConnectionManager.FindWorkloadByAddr(pip) var identity string if r.TLS != nil && len(r.TLS.PeerCertificates) > 0 { n, err := util.ExtractIDs(r.TLS.PeerCertificates[0].Extensions) if err != nil { return err } if len(n) > 0 { identity = n[0] } } if wl.RBAC == nil { return nil } var namespace string if identity != "" { s, err := spiffe.ParseIdentity(identity) if err != nil { return err } namespace = s.Namespace } log := log.WithLabels("ip", pip.String(), "ident", identity, "port", dport) // This is from the PEP, which handles this already // TODO: make this check more robust if wl.RemoteProxy != (netip.Addr{}) && identity == wl.Identity { if len(wl.RBAC.Allow) == 0 { log.Debugf("allow (no policies)") } else { log.Infof("allow (from remote)") } return nil } deny := rbacMatch(wl.RBAC.Deny, namespace, identity, dport) if deny { // TODO context return fmt.Errorf("RBAC: deny") } if len(wl.RBAC.Allow) == 0 { log.Debug("allow (no policies)") return nil } allow := rbacMatch(wl.RBAC.Allow, namespace, identity, dport) if allow { log.Info("allow") return nil } return fmt.Errorf("RBAC: no allow matched") } func rbacMatch(pol []*uproxyapi.Policy, namespace string, identity string, dports string) bool { dport, _ := strconv.Atoi(dports) for _, pol := range pol { // TODO pol.When ruleMatch := false for _, rule := range pol.Rule { rmatch := true if rule.Namespace != "" && rule.Namespace != namespace { rmatch = false } if rule.Identity != "" && "spiffe://"+rule.Identity != identity { rmatch = false } if rule.Invert { rmatch = !rmatch } ruleMatch = ruleMatch || rmatch } whenMatch := false if len(pol.When) == 0 { whenMatch = true } for _, when := range pol.When { rmatch := true if when.Port != 0 && when.Port != uint32(dport) { rmatch = 
false } if when.Invert { rmatch = !rmatch } whenMatch = whenMatch || rmatch } if ruleMatch && whenMatch { return true } } return false } ``` Likely needs a lot of work though.
index: non_build
text:
implement rbac this was an initial attempt in go go unc p proxy assertrbac r http request error ip dport err net splithostport r host if err nil return err pip err netip parseaddr ip if err nil return err wl p connectionmanager findworkloadbyaddr pip var identity string if r tls nil len r tls peercertificates n err util extractids r tls peercertificates extensions if err nil return err if len n identity n if wl rbac nil return nil var namespace string if identity s err spiffe parseidentity identity if err nil return err namespace s namespace log log withlabels ip pip string ident identity port dport this is from the pep which handles this already todo make this check more robust if wl remoteproxy netip addr identity wl identity if len wl rbac allow log debugf allow no policies else log infof allow from remote return nil deny rbacmatch wl rbac deny namespace identity dport if deny todo context return fmt errorf rbac deny if len wl rbac allow log debug allow no policies return nil allow rbacmatch wl rbac allow namespace identity dport if allow log info allow return nil return fmt errorf rbac no allow matched func rbacmatch pol uproxyapi policy namespace string identity string dports string bool dport strconv atoi dports for pol range pol todo pol when rulematch false for rule range pol rule rmatch true if rule namespace rule namespace namespace rmatch false if rule identity spiffe rule identity identity rmatch false if rule invert rmatch rmatch rulematch rulematch rmatch whenmatch false if len pol when whenmatch true for when range pol when rmatch true if when port when port dport rmatch false if when invert rmatch rmatch whenmatch whenmatch rmatch if rulematch whenmatch return true return false likely needs a lot of work though
binary_label: 0
---
Record (Unnamed: 0 = 490,164)
id: 14,116,139,509
type: IssuesEvent
created_at: 2020-11-08 01:05:14
repo: drashland/sockets
repo_url: https://api.github.com/repos/drashland/sockets
action: closed
title: Remove `openChannel`?
labels: Priority: Low Type: Question
body:
## Summary What: Remove `openChannel`, then `on` would 'open' it itself. Why: I feel openChannel is unnecessary, channels are always opened right before it's listener method - i feel using `on` should create thee listener itself Thoughts? ## Acceptance Criteria - [x] Determine whether to remove `onChannel`, and open channels from within the `on` method * Result: ## Example Pseudo Code (for implementation) ```typescript public on (name) { this.listeners.set(name, ...) ... } ``` - [ ] Update to sockets documentation on website is required (https://github.com/drashland/website/issues/226)
label: 1.0
text_combine:
Remove `openChannel`? - ## Summary What: Remove `openChannel`, then `on` would 'open' it itself. Why: I feel openChannel is unnecessary, channels are always opened right before it's listener method - i feel using `on` should create thee listener itself Thoughts? ## Acceptance Criteria - [x] Determine whether to remove `onChannel`, and open channels from within the `on` method * Result: ## Example Pseudo Code (for implementation) ```typescript public on (name) { this.listeners.set(name, ...) ... } ``` - [ ] Update to sockets documentation on website is required (https://github.com/drashland/website/issues/226)
index: non_build
text:
remove openchannel summary what remove openchannel then on would open it itself why i feel openchannel is unnecessary channels are always opened right before it s listener method i feel using on should create thee listener itself thoughts acceptance criteria determine whether to remove onchannel and open channels from within the on method result example pseudo code for implementation typescript public on name this listeners set name update to sockets documentation on website is required
binary_label: 0
---
Record (Unnamed: 0 = 23,468)
id: 7,331,899,609
type: IssuesEvent
created_at: 2018-03-05 14:54:27
repo: alibaba/ice
repo_url: https://api.github.com/repos/alibaba/ice
action: closed
title: Module parse failed: Unexpected character '�' (1:0)
labels: build
body:
import img from 'xxx.png'; ------------------------- Module parse failed: Unexpected character '�' (1:0) You may need an appropriate loader to handle this file type. (Source code omitted for this binary file)
label: 1.0
text_combine:
Module parse failed: Unexpected character '�' (1:0) - import img from 'xxx.png'; ------------------------- Module parse failed: Unexpected character '�' (1:0) You may need an appropriate loader to handle this file type. (Source code omitted for this binary file)
index: build
text:
module parse failed unexpected character � import img from xxx png module parse failed unexpected character � you may need an appropriate loader to handle this file type source code omitted for this binary file
binary_label: 1
---
Record (Unnamed: 0 = 8,987)
id: 23,927,859,716
type: IssuesEvent
created_at: 2022-09-10 04:23:36
repo: kubernetes/enhancements
repo_url: https://api.github.com/repos/kubernetes/enhancements
action: reopened
title: Pod level resource limits
labels: sig/node sig/cli sig/architecture lifecycle/rotten tracked/no sig/cloud-provider
body:
### Enhancement Description - One-line enhancement description (can be used as a release note): Allow setting of resource requests & limits at Pod level - Kubernetes Enhancement Proposal: <!-- link to kubernetes/enhancements file; if none yet, link to PR --> https://github.com/kubernetes/enhancements/pull/1592 - Discussion Link: <!-- link to SIG mailing list thread, meeting, or recording where the Enhancement was discussed before KEP creation -->https://www.youtube.com/watch?v=3cU56ZiUZ8w&list=PL69nYSiGNLP1wJPj5DYWXjiArF-MJ5fNG&index=100&ab_channel=DerekCarr - Primary contact (assignee): @n4j - Responsible SIGs: `sig/node` - Enhancement target (which target equals to which milestone): - Alpha release target (x.y): 1.23 - Beta release target (x.y): 1.23 - Stable release target (x.y): 1.23 - [ ] Alpha - [x] KEP (`k/enhancements`) update PR(s): https://github.com/kubernetes/enhancements/pull/1592 - [ ] Code (`k/k`) update PR(s): - [ ] Docs (`k/website`) update PR(s): <!-- Uncomment these as you prepare the enhancement for the next stage - [ ] Beta - [ ] KEP (`k/enhancements`) update PR(s): - [ ] Code (`k/k`) update PR(s): - [ ] Docs (`k/website`) update(s): - [ ] Stable - [ ] KEP (`k/enhancements`) update PR(s): - [ ] Code (`k/k`) update PR(s): - [ ] Docs (`k/website`) update(s): --> _Please keep this description up to date. This will help the Enhancement Team to track the evolution of the enhancement efficiently._
label: 1.0
text_combine:
Pod level resource limits - ### Enhancement Description - One-line enhancement description (can be used as a release note): Allow setting of resource requests & limits at Pod level - Kubernetes Enhancement Proposal: <!-- link to kubernetes/enhancements file; if none yet, link to PR --> https://github.com/kubernetes/enhancements/pull/1592 - Discussion Link: <!-- link to SIG mailing list thread, meeting, or recording where the Enhancement was discussed before KEP creation -->https://www.youtube.com/watch?v=3cU56ZiUZ8w&list=PL69nYSiGNLP1wJPj5DYWXjiArF-MJ5fNG&index=100&ab_channel=DerekCarr - Primary contact (assignee): @n4j - Responsible SIGs: `sig/node` - Enhancement target (which target equals to which milestone): - Alpha release target (x.y): 1.23 - Beta release target (x.y): 1.23 - Stable release target (x.y): 1.23 - [ ] Alpha - [x] KEP (`k/enhancements`) update PR(s): https://github.com/kubernetes/enhancements/pull/1592 - [ ] Code (`k/k`) update PR(s): - [ ] Docs (`k/website`) update PR(s): <!-- Uncomment these as you prepare the enhancement for the next stage - [ ] Beta - [ ] KEP (`k/enhancements`) update PR(s): - [ ] Code (`k/k`) update PR(s): - [ ] Docs (`k/website`) update(s): - [ ] Stable - [ ] KEP (`k/enhancements`) update PR(s): - [ ] Code (`k/k`) update PR(s): - [ ] Docs (`k/website`) update(s): --> _Please keep this description up to date. This will help the Enhancement Team to track the evolution of the enhancement efficiently._
index: non_build
text:
pod level resource limits enhancement description one line enhancement description can be used as a release note allow setting of resource requests limits at pod level kubernetes enhancement proposal discussion link primary contact assignee responsible sigs sig node enhancement target which target equals to which milestone alpha release target x y beta release target x y stable release target x y alpha kep k enhancements update pr s code k k update pr s docs k website update pr s uncomment these as you prepare the enhancement for the next stage beta kep k enhancements update pr s code k k update pr s docs k website update s stable kep k enhancements update pr s code k k update pr s docs k website update s please keep this description up to date this will help the enhancement team to track the evolution of the enhancement efficiently
binary_label: 0
---
Record (Unnamed: 0 = 725,980)
id: 24,983,279,234
type: IssuesEvent
created_at: 2022-11-02 13:23:35
repo: teogor/ceres
repo_url: https://api.github.com/repos/teogor/ceres
action: closed
title: Crash: `BaseTransientBottomBar.java line 146`
labels: @bug @priority-critical
body:
Crash: `BaseTransientBottomBar.java line 146` ```txt Caused by java.lang.reflect.InvocationTargetException at java.lang.reflect.Constructor.newInstance0(Constructor.java) at java.lang.reflect.Constructor.newInstance(Constructor.java:343) at android.view.LayoutInflater.createView(LayoutInflater.java:858) at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:1010) at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:965) at android.view.LayoutInflater.inflate(LayoutInflater.java:663) at android.view.LayoutInflater.inflate(LayoutInflater.java:538) at dev.teogor.ceres.m3.snackbar.BaseTransientBottomBar.<init>(BaseTransientBottomBar.java:36) at dev.teogor.ceres.m3.snackbar.Snackbar.<init>(Snackbar.java) at dev.teogor.ceres.m3.snackbar.Snackbar.makeInternal(Snackbar.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.buildSnackbar(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.buildSnackbar$default(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.<init>(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.<init>(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3$Builder.prepareSnackbar(SnackbarBetaM3.java:137) at dev.teogor.ceres.m3.app.BaseActivityM3.processUiEvent(BaseActivityM3.java:137) at dev.teogor.ceres.components.app.BaseFragment.processUiEvent(BaseFragment.java:33) at dev.teogor.ceres.components.app.BaseFragment$$InternalSyntheticLambda$1$93e51e306c0a7d91c4b850e5dc67e12181e9dcc370125f1a29100ef6cd2df40f$2.onChanged(BaseFragment.java:33) at com.google.android.datatransport.runtime.scheduling.persistence.SQLiteEventStore$$ExternalSyntheticLambda5.d(R8$$SyntheticClass:30) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at 
dev.teogor.ceres.components.events.SingleLiveEvent.setValue(SingleLiveEvent.java:6) at com.zeoowl.lwp.aquarium.presentation.feature.home.HomeFragment.setupObservers$lambda-0(HomeFragment.java:111) at com.zeoowl.lwp.aquarium.presentation.feature.home.HomeFragment$$InternalSyntheticLambda$1$d0691176a84501196567b810ae9f295ada733c275072e27364673e0325f59bb2$0.onChanged(HomeFragment.java:111) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at dev.teogor.ceres.ads.view.NativeAdView.prepNativeAd$lambda-0(NativeAdView.java:71) at dev.teogor.ceres.ads.view.NativeAdView$$InternalSyntheticLambda$1$3a59988f7bfb7fb66125d11b40a4f22e0b10de53fdf4ff5ea0c81630a2e5acf7$0.onChanged(NativeAdView.java:71) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at androidx.lifecycle.LiveData$1.run(LiveData.java:18) at android.os.Handler.handleCallback(Handler.java:938) at android.os.Handler.dispatchMessage(Handler.java:99) at android.os.Looper.loopOnce(Looper.java:226) at android.os.Looper.loop(Looper.java:313) at android.app.ActivityThread.main(ActivityThread.java:8751) at java.lang.reflect.Method.invoke(Method.java) at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:571) at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1135) ``` ```txt Caused by java.lang.IllegalArgumentException: dev.teogor.ceres.m3.snackbar.Snackbar.SnackbarLayout requires a value for the com.zeoowl.lwp.aquarium:attr/colorSurface attribute to be set in your app theme. 
You can either set the attribute in your theme or update your theme to inherit from Theme.MaterialComponents (or a descendant). at com.google.android.material.resources.MaterialAttributes.resolveTypedValueOrThrow(MaterialAttributes.java:32) at com.google.android.material.resources.MaterialAttributes.resolveTypedValueOrThrow(MaterialAttributes.java:17) at com.google.android.material.color.MaterialColors.getColor(MaterialColors.java:17) at com.google.android.material.color.MaterialColors.layer(MaterialColors.java:146) at dev.teogor.ceres.m3.snackbar.BaseTransientBottomBar$SnackbarBaseLayout.createThemedBackground(BaseTransientBottomBar.java:146) at dev.teogor.ceres.m3.snackbar.BaseTransientBottomBar$SnackbarBaseLayout.<init>(BaseTransientBottomBar.java:146) at dev.teogor.ceres.m3.snackbar.Snackbar$SnackbarLayout.<init>(Snackbar.java) at java.lang.reflect.Constructor.newInstance0(Constructor.java) at java.lang.reflect.Constructor.newInstance(Constructor.java:343) at android.view.LayoutInflater.createView(LayoutInflater.java:858) at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:1010) at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:965) at android.view.LayoutInflater.inflate(LayoutInflater.java:663) at android.view.LayoutInflater.inflate(LayoutInflater.java:538) at dev.teogor.ceres.m3.snackbar.BaseTransientBottomBar.<init>(BaseTransientBottomBar.java:36) at dev.teogor.ceres.m3.snackbar.Snackbar.<init>(Snackbar.java) at dev.teogor.ceres.m3.snackbar.Snackbar.makeInternal(Snackbar.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.buildSnackbar(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.buildSnackbar$default(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.<init>(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.<init>(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3$Builder.prepareSnackbar(SnackbarBetaM3.java:137) at 
dev.teogor.ceres.m3.app.BaseActivityM3.processUiEvent(BaseActivityM3.java:137) at dev.teogor.ceres.components.app.BaseFragment.processUiEvent(BaseFragment.java:33) at dev.teogor.ceres.components.app.BaseFragment$$InternalSyntheticLambda$1$93e51e306c0a7d91c4b850e5dc67e12181e9dcc370125f1a29100ef6cd2df40f$2.onChanged(BaseFragment.java:33) at com.google.android.datatransport.runtime.scheduling.persistence.SQLiteEventStore$$ExternalSyntheticLambda5.d(R8$$SyntheticClass:30) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at dev.teogor.ceres.components.events.SingleLiveEvent.setValue(SingleLiveEvent.java:6) at com.zeoowl.lwp.aquarium.presentation.feature.home.HomeFragment.setupObservers$lambda-0(HomeFragment.java:111) at com.zeoowl.lwp.aquarium.presentation.feature.home.HomeFragment$$InternalSyntheticLambda$1$d0691176a84501196567b810ae9f295ada733c275072e27364673e0325f59bb2$0.onChanged(HomeFragment.java:111) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at dev.teogor.ceres.ads.view.NativeAdView.prepNativeAd$lambda-0(NativeAdView.java:71) at dev.teogor.ceres.ads.view.NativeAdView$$InternalSyntheticLambda$1$3a59988f7bfb7fb66125d11b40a4f22e0b10de53fdf4ff5ea0c81630a2e5acf7$0.onChanged(NativeAdView.java:71) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at androidx.lifecycle.LiveData$1.run(LiveData.java:18) at 
android.os.Handler.handleCallback(Handler.java:938) at android.os.Handler.dispatchMessage(Handler.java:99) at android.os.Looper.loopOnce(Looper.java:226) at android.os.Looper.loop(Looper.java:313) at android.app.ActivityThread.main(ActivityThread.java:8751) at java.lang.reflect.Method.invoke(Method.java) at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:571) at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1135) ```
1.0
Crash: `BaseTransientBottomBar.java line 146` - Crash: `BaseTransientBottomBar.java line 146` ```txt Caused by java.lang.reflect.InvocationTargetException at java.lang.reflect.Constructor.newInstance0(Constructor.java) at java.lang.reflect.Constructor.newInstance(Constructor.java:343) at android.view.LayoutInflater.createView(LayoutInflater.java:858) at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:1010) at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:965) at android.view.LayoutInflater.inflate(LayoutInflater.java:663) at android.view.LayoutInflater.inflate(LayoutInflater.java:538) at dev.teogor.ceres.m3.snackbar.BaseTransientBottomBar.<init>(BaseTransientBottomBar.java:36) at dev.teogor.ceres.m3.snackbar.Snackbar.<init>(Snackbar.java) at dev.teogor.ceres.m3.snackbar.Snackbar.makeInternal(Snackbar.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.buildSnackbar(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.buildSnackbar$default(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.<init>(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.<init>(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3$Builder.prepareSnackbar(SnackbarBetaM3.java:137) at dev.teogor.ceres.m3.app.BaseActivityM3.processUiEvent(BaseActivityM3.java:137) at dev.teogor.ceres.components.app.BaseFragment.processUiEvent(BaseFragment.java:33) at dev.teogor.ceres.components.app.BaseFragment$$InternalSyntheticLambda$1$93e51e306c0a7d91c4b850e5dc67e12181e9dcc370125f1a29100ef6cd2df40f$2.onChanged(BaseFragment.java:33) at com.google.android.datatransport.runtime.scheduling.persistence.SQLiteEventStore$$ExternalSyntheticLambda5.d(R8$$SyntheticClass:30) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at 
dev.teogor.ceres.components.events.SingleLiveEvent.setValue(SingleLiveEvent.java:6) at com.zeoowl.lwp.aquarium.presentation.feature.home.HomeFragment.setupObservers$lambda-0(HomeFragment.java:111) at com.zeoowl.lwp.aquarium.presentation.feature.home.HomeFragment$$InternalSyntheticLambda$1$d0691176a84501196567b810ae9f295ada733c275072e27364673e0325f59bb2$0.onChanged(HomeFragment.java:111) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at dev.teogor.ceres.ads.view.NativeAdView.prepNativeAd$lambda-0(NativeAdView.java:71) at dev.teogor.ceres.ads.view.NativeAdView$$InternalSyntheticLambda$1$3a59988f7bfb7fb66125d11b40a4f22e0b10de53fdf4ff5ea0c81630a2e5acf7$0.onChanged(NativeAdView.java:71) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at androidx.lifecycle.LiveData$1.run(LiveData.java:18) at android.os.Handler.handleCallback(Handler.java:938) at android.os.Handler.dispatchMessage(Handler.java:99) at android.os.Looper.loopOnce(Looper.java:226) at android.os.Looper.loop(Looper.java:313) at android.app.ActivityThread.main(ActivityThread.java:8751) at java.lang.reflect.Method.invoke(Method.java) at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:571) at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1135) ``` ```txt Caused by java.lang.IllegalArgumentException: dev.teogor.ceres.m3.snackbar.Snackbar.SnackbarLayout requires a value for the com.zeoowl.lwp.aquarium:attr/colorSurface attribute to be set in your app theme. 
You can either set the attribute in your theme or update your theme to inherit from Theme.MaterialComponents (or a descendant). at com.google.android.material.resources.MaterialAttributes.resolveTypedValueOrThrow(MaterialAttributes.java:32) at com.google.android.material.resources.MaterialAttributes.resolveTypedValueOrThrow(MaterialAttributes.java:17) at com.google.android.material.color.MaterialColors.getColor(MaterialColors.java:17) at com.google.android.material.color.MaterialColors.layer(MaterialColors.java:146) at dev.teogor.ceres.m3.snackbar.BaseTransientBottomBar$SnackbarBaseLayout.createThemedBackground(BaseTransientBottomBar.java:146) at dev.teogor.ceres.m3.snackbar.BaseTransientBottomBar$SnackbarBaseLayout.<init>(BaseTransientBottomBar.java:146) at dev.teogor.ceres.m3.snackbar.Snackbar$SnackbarLayout.<init>(Snackbar.java) at java.lang.reflect.Constructor.newInstance0(Constructor.java) at java.lang.reflect.Constructor.newInstance(Constructor.java:343) at android.view.LayoutInflater.createView(LayoutInflater.java:858) at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:1010) at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:965) at android.view.LayoutInflater.inflate(LayoutInflater.java:663) at android.view.LayoutInflater.inflate(LayoutInflater.java:538) at dev.teogor.ceres.m3.snackbar.BaseTransientBottomBar.<init>(BaseTransientBottomBar.java:36) at dev.teogor.ceres.m3.snackbar.Snackbar.<init>(Snackbar.java) at dev.teogor.ceres.m3.snackbar.Snackbar.makeInternal(Snackbar.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.buildSnackbar(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.buildSnackbar$default(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.<init>(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3.<init>(SnackbarBetaM3.java:88) at dev.teogor.ceres.m3.SnackbarBetaM3$Builder.prepareSnackbar(SnackbarBetaM3.java:137) at 
dev.teogor.ceres.m3.app.BaseActivityM3.processUiEvent(BaseActivityM3.java:137) at dev.teogor.ceres.components.app.BaseFragment.processUiEvent(BaseFragment.java:33) at dev.teogor.ceres.components.app.BaseFragment$$InternalSyntheticLambda$1$93e51e306c0a7d91c4b850e5dc67e12181e9dcc370125f1a29100ef6cd2df40f$2.onChanged(BaseFragment.java:33) at com.google.android.datatransport.runtime.scheduling.persistence.SQLiteEventStore$$ExternalSyntheticLambda5.d(R8$$SyntheticClass:30) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at dev.teogor.ceres.components.events.SingleLiveEvent.setValue(SingleLiveEvent.java:6) at com.zeoowl.lwp.aquarium.presentation.feature.home.HomeFragment.setupObservers$lambda-0(HomeFragment.java:111) at com.zeoowl.lwp.aquarium.presentation.feature.home.HomeFragment$$InternalSyntheticLambda$1$d0691176a84501196567b810ae9f295ada733c275072e27364673e0325f59bb2$0.onChanged(HomeFragment.java:111) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at dev.teogor.ceres.ads.view.NativeAdView.prepNativeAd$lambda-0(NativeAdView.java:71) at dev.teogor.ceres.ads.view.NativeAdView$$InternalSyntheticLambda$1$3a59988f7bfb7fb66125d11b40a4f22e0b10de53fdf4ff5ea0c81630a2e5acf7$0.onChanged(NativeAdView.java:71) at androidx.lifecycle.LiveData.considerNotify(LiveData.java:29) at androidx.lifecycle.LiveData.dispatchingValue(LiveData.java:56) at androidx.lifecycle.LiveData.setValue(LiveData.java:14) at androidx.lifecycle.MutableLiveData.setValue(MutableLiveData.java) at androidx.lifecycle.LiveData$1.run(LiveData.java:18) at 
android.os.Handler.handleCallback(Handler.java:938) at android.os.Handler.dispatchMessage(Handler.java:99) at android.os.Looper.loopOnce(Looper.java:226) at android.os.Looper.loop(Looper.java:313) at android.app.ActivityThread.main(ActivityThread.java:8751) at java.lang.reflect.Method.invoke(Method.java) at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:571) at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1135) ```
non_build
crash basetransientbottombar java line crash basetransientbottombar java line txt caused by java lang reflect invocationtargetexception at java lang reflect constructor constructor java at java lang reflect constructor newinstance constructor java at android view layoutinflater createview layoutinflater java at android view layoutinflater createviewfromtag layoutinflater java at android view layoutinflater createviewfromtag layoutinflater java at android view layoutinflater inflate layoutinflater java at android view layoutinflater inflate layoutinflater java at dev teogor ceres snackbar basetransientbottombar basetransientbottombar java at dev teogor ceres snackbar snackbar snackbar java at dev teogor ceres snackbar snackbar makeinternal snackbar java at dev teogor ceres buildsnackbar java at dev teogor ceres buildsnackbar default java at dev teogor ceres java at dev teogor ceres java at dev teogor ceres builder preparesnackbar java at dev teogor ceres app processuievent java at dev teogor ceres components app basefragment processuievent basefragment java at dev teogor ceres components app basefragment internalsyntheticlambda onchanged basefragment java at com google android datatransport runtime scheduling persistence sqliteeventstore d syntheticclass at androidx lifecycle livedata considernotify livedata java at androidx lifecycle livedata dispatchingvalue livedata java at androidx lifecycle livedata setvalue livedata java at androidx lifecycle mutablelivedata setvalue mutablelivedata java at dev teogor ceres components events singleliveevent setvalue singleliveevent java at com zeoowl lwp aquarium presentation feature home homefragment setupobservers lambda homefragment java at com zeoowl lwp aquarium presentation feature home homefragment internalsyntheticlambda onchanged homefragment java at androidx lifecycle livedata considernotify livedata java at androidx lifecycle livedata dispatchingvalue livedata java at androidx lifecycle livedata setvalue livedata 
java at androidx lifecycle mutablelivedata setvalue mutablelivedata java at dev teogor ceres ads view nativeadview prepnativead lambda nativeadview java at dev teogor ceres ads view nativeadview internalsyntheticlambda onchanged nativeadview java at androidx lifecycle livedata considernotify livedata java at androidx lifecycle livedata dispatchingvalue livedata java at androidx lifecycle livedata setvalue livedata java at androidx lifecycle mutablelivedata setvalue mutablelivedata java at androidx lifecycle livedata run livedata java at android os handler handlecallback handler java at android os handler dispatchmessage handler java at android os looper looponce looper java at android os looper loop looper java at android app activitythread main activitythread java at java lang reflect method invoke method java at com android internal os runtimeinit methodandargscaller run runtimeinit java at com android internal os zygoteinit main zygoteinit java txt caused by java lang illegalargumentexception dev teogor ceres snackbar snackbar snackbarlayout requires a value for the com zeoowl lwp aquarium attr colorsurface attribute to be set in your app theme you can either set the attribute in your theme or update your theme to inherit from theme materialcomponents or a descendant at com google android material resources materialattributes resolvetypedvalueorthrow materialattributes java at com google android material resources materialattributes resolvetypedvalueorthrow materialattributes java at com google android material color materialcolors getcolor materialcolors java at com google android material color materialcolors layer materialcolors java at dev teogor ceres snackbar basetransientbottombar snackbarbaselayout createthemedbackground basetransientbottombar java at dev teogor ceres snackbar basetransientbottombar snackbarbaselayout basetransientbottombar java at dev teogor ceres snackbar snackbar snackbarlayout snackbar java at java lang reflect constructor 
constructor java at java lang reflect constructor newinstance constructor java at android view layoutinflater createview layoutinflater java at android view layoutinflater createviewfromtag layoutinflater java at android view layoutinflater createviewfromtag layoutinflater java at android view layoutinflater inflate layoutinflater java at android view layoutinflater inflate layoutinflater java at dev teogor ceres snackbar basetransientbottombar basetransientbottombar java at dev teogor ceres snackbar snackbar snackbar java at dev teogor ceres snackbar snackbar makeinternal snackbar java at dev teogor ceres buildsnackbar java at dev teogor ceres buildsnackbar default java at dev teogor ceres java at dev teogor ceres java at dev teogor ceres builder preparesnackbar java at dev teogor ceres app processuievent java at dev teogor ceres components app basefragment processuievent basefragment java at dev teogor ceres components app basefragment internalsyntheticlambda onchanged basefragment java at com google android datatransport runtime scheduling persistence sqliteeventstore d syntheticclass at androidx lifecycle livedata considernotify livedata java at androidx lifecycle livedata dispatchingvalue livedata java at androidx lifecycle livedata setvalue livedata java at androidx lifecycle mutablelivedata setvalue mutablelivedata java at dev teogor ceres components events singleliveevent setvalue singleliveevent java at com zeoowl lwp aquarium presentation feature home homefragment setupobservers lambda homefragment java at com zeoowl lwp aquarium presentation feature home homefragment internalsyntheticlambda onchanged homefragment java at androidx lifecycle livedata considernotify livedata java at androidx lifecycle livedata dispatchingvalue livedata java at androidx lifecycle livedata setvalue livedata java at androidx lifecycle mutablelivedata setvalue mutablelivedata java at dev teogor ceres ads view nativeadview prepnativead lambda nativeadview java at dev teogor 
ceres ads view nativeadview internalsyntheticlambda onchanged nativeadview java at androidx lifecycle livedata considernotify livedata java at androidx lifecycle livedata dispatchingvalue livedata java at androidx lifecycle livedata setvalue livedata java at androidx lifecycle mutablelivedata setvalue mutablelivedata java at androidx lifecycle livedata run livedata java at android os handler handlecallback handler java at android os handler dispatchmessage handler java at android os looper looponce looper java at android os looper loop looper java at android app activitythread main activitythread java at java lang reflect method invoke method java at com android internal os runtimeinit methodandargscaller run runtimeinit java at com android internal os zygoteinit main zygoteinit java
0
91,098
26,273,865,932
IssuesEvent
2023-01-06 19:48:20
cryostatio/cryostat-web
https://api.github.com/repos/cryostatio/cryostat-web
closed
Rebuilds without source changes should be faster
chore build needs-triage
Currently, running a rebuild without having made any source changes still takes a fairly long time. It should be possible to configure some Webpack caching so that unchanged files are not recompiled. We already have some configuration that should do this, but it doesn't seem to be working as expected, so perhaps some other step in the pipeline is not performing caching and this busts the later caches that are set up. https://webpack.js.org/guides/caching/ https://javascript.plainenglish.io/how-to-improve-webpack-performance-7637db26fa5f
1.0
Rebuilds without source changes should be faster - Currently, running a rebuild without having made any source changes still takes a fairly long time. It should be possible to configure some Webpack caching so that unchanged files are not recompiled. We already have some configuration that should do this, but it doesn't seem to be working as expected, so perhaps some other step in the pipeline is not performing caching and this busts the later caches that are set up. https://webpack.js.org/guides/caching/ https://javascript.plainenglish.io/how-to-improve-webpack-performance-7637db26fa5f
build
rebuilds without source changes should be faster currently running a rebuild without having made any source changes still takes a fairly long time it should be possible to configure some webpack caching so that unchanged files are not recompiled we already have some configuration that should do this but it doesn t seem to be working as expected so perhaps some other step in the pipeline is not performing caching and this busts the later caches that are set up
1
180,386
21,625,723,884
IssuesEvent
2022-05-05 01:40:36
Sh2dowFi3nd/Test_2
https://api.github.com/repos/Sh2dowFi3nd/Test_2
closed
WS-2016-7061 (Medium) detected in poi-3.8.jar - autoclosed
security vulnerability
## WS-2016-7061 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>poi-3.8.jar</b></p></summary> <p>Apache POI - Java API To Access Microsoft Format Files</p> <p>Library home page: <a href="http://poi.apache.org/">http://poi.apache.org/</a></p> <p> Dependency Hierarchy: - :x: **poi-3.8.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Sh2dowFi3nd/Test_2/commit/496ee93a49670cf2906171c7293cf9b50cc09d47">496ee93a49670cf2906171c7293cf9b50cc09d47</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Security vulnerability found in Apache POI before 3.16-beta1. Lack of length sanity check for length of embedded OLE10Native. <p>Publish Date: 2019-09-26 <p>URL: <a href=https://github.com/apache/poi/commit/7f9f8e9afa8160ef401ec8b3416d36428e928e2f>WS-2016-7061</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/apache/poi/commit/7f9f8e9afa8160ef401ec8b3416d36428e928e2f">https://github.com/apache/poi/commit/7f9f8e9afa8160ef401ec8b3416d36428e928e2f</a></p> <p>Release Date: 2019-09-26</p> <p>Fix Resolution: 3.16-beta1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2016-7061 (Medium) detected in poi-3.8.jar - autoclosed - ## WS-2016-7061 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>poi-3.8.jar</b></p></summary> <p>Apache POI - Java API To Access Microsoft Format Files</p> <p>Library home page: <a href="http://poi.apache.org/">http://poi.apache.org/</a></p> <p> Dependency Hierarchy: - :x: **poi-3.8.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Sh2dowFi3nd/Test_2/commit/496ee93a49670cf2906171c7293cf9b50cc09d47">496ee93a49670cf2906171c7293cf9b50cc09d47</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Security vulnerability found in Apache POI before 3.16-beta1. Lack of length sanity check for length of embedded OLE10Native. <p>Publish Date: 2019-09-26 <p>URL: <a href=https://github.com/apache/poi/commit/7f9f8e9afa8160ef401ec8b3416d36428e928e2f>WS-2016-7061</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/apache/poi/commit/7f9f8e9afa8160ef401ec8b3416d36428e928e2f">https://github.com/apache/poi/commit/7f9f8e9afa8160ef401ec8b3416d36428e928e2f</a></p> <p>Release Date: 2019-09-26</p> <p>Fix Resolution: 3.16-beta1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_build
ws medium detected in poi jar autoclosed ws medium severity vulnerability vulnerable library poi jar apache poi java api to access microsoft format files library home page a href dependency hierarchy x poi jar vulnerable library found in head commit a href vulnerability details security vulnerability found in apache poi before lack of length sanity check for length of embedded publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
564,202
16,720,578,999
IssuesEvent
2021-06-10 06:43:14
ita-social-projects/TeachUA
https://api.github.com/repos/ita-social-projects/TeachUA
closed
[Гуртки page] Address is not displayed near the location icon on the club's pop-up window
Priority: High bug
Environment: Windows 7, Service Pack 1, Google Chrome, 90.0.4430.212 (Розробка) (64-розрядна версія). Reproducible: always. Build found: the last commit 07.06.2021 Preconditions Open [https://speak-ukrainian.org.ua/dev/](https://speak-ukrainian.org.ua/dev/) **Steps to reproduce** 1. Go to 'Гуртки' menu navigation tab 2. Click on the club's card and open club's pop-up window (not on 'Детальніше' button) Actual result The address is not displayed near the location icon on the club's pop-up window ![2021-06-07_14h59_02](https://user-images.githubusercontent.com/83178616/121015486-b7ab8500-c7a3-11eb-8257-f9f4e4dcc221.png) Expected result Address of the club should be displayed. If club is without location - 'Доступний онлайн' should be displayed
1.0
[Гуртки page] Address is not displayed near the location icon on the club's pop-up window - Environment: Windows 7, Service Pack 1, Google Chrome, 90.0.4430.212 (Розробка) (64-розрядна версія). Reproducible: always. Build found: the last commit 07.06.2021 Preconditions Open [https://speak-ukrainian.org.ua/dev/](https://speak-ukrainian.org.ua/dev/) **Steps to reproduce** 1. Go to 'Гуртки' menu navigation tab 2. Click on the club's card and open club's pop-up window (not on 'Детальніше' button) Actual result The address is not displayed near the location icon on the club's pop-up window ![2021-06-07_14h59_02](https://user-images.githubusercontent.com/83178616/121015486-b7ab8500-c7a3-11eb-8257-f9f4e4dcc221.png) Expected result Address of the club should be displayed. If club is without location - 'Доступний онлайн' should be displayed
non_build
address is not displayed near the location icon on the club s pop up window environment windows service pack google chrome розробка розрядна версія reproducible always build found the last commit preconditions open steps to reproduce go to гуртки menu navigation tab click on the club s card and open club s pop up window not on детальніше button actual result the address is not displayed near the location icon on the club s pop up window expected result address of the club should be displayed if club is without location доступний онлайн should be displayed
0
200,621
15,801,755,739
IssuesEvent
2021-04-03 06:26:15
w2vgd/ped
https://api.github.com/repos/w2vgd/ped
opened
Incorrect jar file name
severity.VeryLow type.DocumentationBug
UG says the jar file should be `focuris.jar` but the downloaded jar file name is `v1.3.jar` ![image.png](https://raw.githubusercontent.com/w2vgd/ped/main/files/300a6381-6dfe-4176-bfab-f27bc1d224b2.png) <!--session: 1617429886253-dfdc7f54-0b8c-4a03-952b-44062911198f-->
1.0
Incorrect jar file name - UG says the jar file should be `focuris.jar` but the downloaded jar file name is `v1.3.jar` ![image.png](https://raw.githubusercontent.com/w2vgd/ped/main/files/300a6381-6dfe-4176-bfab-f27bc1d224b2.png) <!--session: 1617429886253-dfdc7f54-0b8c-4a03-952b-44062911198f-->
non_build
incorrect jar file name ug says the jar file should be focuris jar but the downloaded jar file name is jar
0
23,504
7,334,538,884
IssuesEvent
2018-03-05 23:12:53
moby/moby
https://api.github.com/repos/moby/moby
closed
docker pull of image built on 18.02 fails on 17.06
area/builder area/distribution kind/bug platform/desktop platform/windows status/confirmed version/17.06
--------------------------------------------------- BUG REPORT INFORMATION --------------------------------------------------- **Description** Using Docker for Windows Edge I built a new asp.net core app on nanoserver. Pushed to Docker Hub. Pull on Windows Server 2016 results in an error. **Steps to reproduce the issue:** On Windows Server 2016: docker pull pdebruin/aspcoremvc1802 or to also build the image 1. on D4W git clone https://github.com/pdebruin/aspnetcoremvc.git (or dotnet new mvc and add the dockerfile) 2. docker build . -t <dockerhubid>/aspnetcoremvc1802 3. docker push <dockerhubid>/aspnetcoremvc1802 4. On WS2016 docker pull <dockerhubid>/aspnetcoremvc1802 **Describe the results you received:** When I pull it to WS2016 I get an error when the first layer is almost done extracting: failed to register layer: re-exec error: exit status 1: output: ProcessUtilityVMImage C:\ProgramData\docker\windowsfilter\6faec334f4ba8d55ae5707f2456e474ee6086307e502463bbb64820951f7ee53\UtilityVM: The system cannot find the path specified. **Describe the results you expected:** docker pull should just work (or if it requires a different engine it should say so) **Additional information you deem important (e.g. issue happens only occasionally):** I can reproduce on another WS2016: Just pull pdebruin/aspcoremvc1802. The exact same build on a 17.06 machine works fine anywhere else. 
**Output of `docker version`:** D4W Win10 ``` Client: Version: 18.02.0-ce-rc1 API version: 1.35 Go version: go1.9.2 Git commit: 5e1d90a Built: Thu Jan 25 00:34:00 2018 OS/Arch: windows/amd64 Experimental: true Orchestrator: kubernetes Server: Engine: Version: 18.02.0-ce-rc1 API version: 1.36 (minimum version 1.24) Go version: go1.9.3 Git commit: 5e1d90a Built: Thu Jan 25 00:47:42 2018 OS/Arch: windows/amd64 Experimental: true ``` **Output of `docker info`:** ``` Containers: 4 Running: 0 Paused: 0 Stopped: 4 Images: 11 Server Version: 18.02.0-ce-rc1 Storage Driver: windowsfilter (windows) lcow (linux) Windows: LCOW: Logging Driver: json-file Plugins: Volume: local Network: ics l2bridge l2tunnel nat null overlay transparent Log: awslogs etwlogs fluentd gelf json-file logentries splunk syslog Swarm: inactive Default Isolation: hyperv Kernel Version: 10.0 16299 (16299.15.amd64fre.rs3_release.170928-1534) Operating System: Windows 10 Pro OSType: windows Architecture: x86_64 CPUs: 4 Total Memory: 15.93GiB Name: DESKTOP-49LCHD9 ID: KZRK:ZMLF:BN5U:OXYU:BRUB:EPCG:FLN4:EKH6:V6H4:6BBG:TWH5:JVAC Docker Root Dir: C:\ProgramData\Docker Debug Mode (client): false Debug Mode (server): true File Descriptors: -1 Goroutines: 29 System Time: 2018-02-02T09:21:14.2276791+01:00 EventsListeners: 1 Registry: https://index.docker.io/v1/ Labels: Experimental: true Insecure Registries: dtrlb-k3o7y3fv3kvgm.westeurope.cloudapp.azure.com 127.0.0.0/8 Live Restore Enabled: false ``` **Output of `docker version`:** WS2016 ``` Client: Version: 17.06.1-ee-2 API version: 1.30 Go version: go1.8.3 Git commit: 8e43158 Built: Wed Aug 23 21:16:53 2017 OS/Arch: windows/amd64 Server: Version: 17.06.1-ee-2 API version: 1.30 (minimum version 1.24) Go version: go1.8.3 Git commit: 8e43158 Built: Wed Aug 23 21:25:53 2017 OS/Arch: windows/amd64 Experimental: false ``` **Output of `docker info`:** ``` Containers: 2 Running: 0 Paused: 0 Stopped: 2 Images: 55 Server Version: 17.06.1-ee-2 Storage Driver: windowsfilter 
Windows: Logging Driver: json-file Plugins: Volume: local Network: l2bridge l2tunnel nat null overlay transparent Log: awslogs etwlogs fluentd json-file logentries splunk syslog Swarm: inactive Default Isolation: process Kernel Version: 10.0 14393 (14393.1715.amd64fre.rs1_release_inmarket.170906-1810) Operating System: Windows Server 2016 Datacenter OSType: windows Architecture: x86_64 CPUs: 1 Total Memory: 3.5GiB Name: pieterd-vsts ID: M6TN:V5PO:2Q2S:DEP7:BY4P:PBZ3:PNHV:3I7M:NX3D:JHZ6:XGNL:Y727 Docker Root Dir: C:\ProgramData\docker Debug Mode (client): false Debug Mode (server): false Registry: https://index.docker.io/v1/ Experimental: false Insecure Registries: dtrlb-k3o7y3fv3kvgm.westeurope.cloudapp.azure.com 127.0.0.0/8 Live Restore Enabled: false ``` **Additional environment details (AWS, VirtualBox, physical, etc.):** D4W on Windows 10 physical. WS2016 physical and another on Azure.
1.0
docker pull of image built on 18.02 fails on 17.06 - --------------------------------------------------- BUG REPORT INFORMATION --------------------------------------------------- **Description** Using Docker for Windows Edge I built a new asp.net core app on nanoserver. Pushed to Docker Hub. Pull on Windows Server 2016 results in an error. **Steps to reproduce the issue:** On Windows Server 2016: docker pull pdebruin/aspcoremvc1802 or to also build the image 1. on D4W git clone https://github.com/pdebruin/aspnetcoremvc.git (or dotnet new mvc and add the dockerfile) 2. docker build . -t <dockerhubid>/aspnetcoremvc1802 3. docker push <dockerhubid>/aspnetcoremvc1802 4. On WS2016 docker pull <dockerhubid>/aspnetcoremvc1802 **Describe the results you received:** When I pull it to WS2016 I get an error when the first layer is almost done extracting: failed to register layer: re-exec error: exit status 1: output: ProcessUtilityVMImage C:\ProgramData\docker\windowsfilter\6faec334f4ba8d55ae5707f2456e474ee6086307e502463bbb64820951f7ee53\UtilityVM: The system cannot find the path specified. **Describe the results you expected:** docker pull should just work (or if it requires a different engine it should say so) **Additional information you deem important (e.g. issue happens only occasionally):** I can reproduce on another WS2016: Just pull pdebruin/aspcoremvc1802. The exact same build on a 17.06 machine works fine anywhere else. 
**Output of `docker version`:** D4W Win10 ``` Client: Version: 18.02.0-ce-rc1 API version: 1.35 Go version: go1.9.2 Git commit: 5e1d90a Built: Thu Jan 25 00:34:00 2018 OS/Arch: windows/amd64 Experimental: true Orchestrator: kubernetes Server: Engine: Version: 18.02.0-ce-rc1 API version: 1.36 (minimum version 1.24) Go version: go1.9.3 Git commit: 5e1d90a Built: Thu Jan 25 00:47:42 2018 OS/Arch: windows/amd64 Experimental: true ``` **Output of `docker info`:** ``` Containers: 4 Running: 0 Paused: 0 Stopped: 4 Images: 11 Server Version: 18.02.0-ce-rc1 Storage Driver: windowsfilter (windows) lcow (linux) Windows: LCOW: Logging Driver: json-file Plugins: Volume: local Network: ics l2bridge l2tunnel nat null overlay transparent Log: awslogs etwlogs fluentd gelf json-file logentries splunk syslog Swarm: inactive Default Isolation: hyperv Kernel Version: 10.0 16299 (16299.15.amd64fre.rs3_release.170928-1534) Operating System: Windows 10 Pro OSType: windows Architecture: x86_64 CPUs: 4 Total Memory: 15.93GiB Name: DESKTOP-49LCHD9 ID: KZRK:ZMLF:BN5U:OXYU:BRUB:EPCG:FLN4:EKH6:V6H4:6BBG:TWH5:JVAC Docker Root Dir: C:\ProgramData\Docker Debug Mode (client): false Debug Mode (server): true File Descriptors: -1 Goroutines: 29 System Time: 2018-02-02T09:21:14.2276791+01:00 EventsListeners: 1 Registry: https://index.docker.io/v1/ Labels: Experimental: true Insecure Registries: dtrlb-k3o7y3fv3kvgm.westeurope.cloudapp.azure.com 127.0.0.0/8 Live Restore Enabled: false ``` **Output of `docker version`:** WS2016 ``` Client: Version: 17.06.1-ee-2 API version: 1.30 Go version: go1.8.3 Git commit: 8e43158 Built: Wed Aug 23 21:16:53 2017 OS/Arch: windows/amd64 Server: Version: 17.06.1-ee-2 API version: 1.30 (minimum version 1.24) Go version: go1.8.3 Git commit: 8e43158 Built: Wed Aug 23 21:25:53 2017 OS/Arch: windows/amd64 Experimental: false ``` **Output of `docker info`:** ``` Containers: 2 Running: 0 Paused: 0 Stopped: 2 Images: 55 Server Version: 17.06.1-ee-2 Storage Driver: windowsfilter 
Windows: Logging Driver: json-file Plugins: Volume: local Network: l2bridge l2tunnel nat null overlay transparent Log: awslogs etwlogs fluentd json-file logentries splunk syslog Swarm: inactive Default Isolation: process Kernel Version: 10.0 14393 (14393.1715.amd64fre.rs1_release_inmarket.170906-1810) Operating System: Windows Server 2016 Datacenter OSType: windows Architecture: x86_64 CPUs: 1 Total Memory: 3.5GiB Name: pieterd-vsts ID: M6TN:V5PO:2Q2S:DEP7:BY4P:PBZ3:PNHV:3I7M:NX3D:JHZ6:XGNL:Y727 Docker Root Dir: C:\ProgramData\docker Debug Mode (client): false Debug Mode (server): false Registry: https://index.docker.io/v1/ Experimental: false Insecure Registries: dtrlb-k3o7y3fv3kvgm.westeurope.cloudapp.azure.com 127.0.0.0/8 Live Restore Enabled: false ``` **Additional environment details (AWS, VirtualBox, physical, etc.):** D4W on Windows 10 physical. WS2016 physical and another on Azure.
build
docker pull of image built on fails on bug report information description using docker for windows edge i built a new asp net core app on nanoserver pushed to docker hub pull on windows server results in an error steps to reproduce the issue on windows server docker pull pdebruin or to also build the image on git clone or dotnet new mvc and add the dockerfile docker build t docker push on docker pull describe the results you received when i pull it to i get an error when the first layer is almost done extracting failed to register layer re exec error exit status output processutilityvmimage c programdata docker windowsfilter utilityvm the system cannot find the path specified describe the results you expected docker pull should just work or if it requires a different engine it should say so additional information you deem important e g issue happens only occasionally i can reproduce on another just pull pdebruin the exact same build on a machine works fine anywhere else output of docker version client version ce api version go version git commit built thu jan os arch windows experimental true orchestrator kubernetes server engine version ce api version minimum version go version git commit built thu jan os arch windows experimental true output of docker info containers running paused stopped images server version ce storage driver windowsfilter windows lcow linux windows lcow logging driver json file plugins volume local network ics nat null overlay transparent log awslogs etwlogs fluentd gelf json file logentries splunk syslog swarm inactive default isolation hyperv kernel version release operating system windows pro ostype windows architecture cpus total memory name desktop id kzrk zmlf oxyu brub epcg jvac docker root dir c programdata docker debug mode client false debug mode server true file descriptors goroutines system time eventslisteners registry labels experimental true insecure registries dtrlb westeurope cloudapp azure com live restore enabled false 
output of docker version client version ee api version go version git commit built wed aug os arch windows server version ee api version minimum version go version git commit built wed aug os arch windows experimental false output of docker info containers running paused stopped images server version ee storage driver windowsfilter windows logging driver json file plugins volume local network nat null overlay transparent log awslogs etwlogs fluentd json file logentries splunk syslog swarm inactive default isolation process kernel version release inmarket operating system windows server datacenter ostype windows architecture cpus total memory name pieterd vsts id pnhv xgnl docker root dir c programdata docker debug mode client false debug mode server false registry experimental false insecure registries dtrlb westeurope cloudapp azure com live restore enabled false additional environment details aws virtualbox physical etc on windows physical physical and another on azure
1
1,921
21,772,828,122
IssuesEvent
2022-05-13 10:48:22
jina-ai/jina
https://api.github.com/repos/jina-ai/jina
closed
Connection failures from the Gateway are confusing
epic/reliability
When the Gateway can't connect to an Executor, the error messages are quite confusing. They also dont include the information which executor actually failed.
True
Connection failures from the Gateway are confusing - When the Gateway can't connect to an Executor, the error messages are quite confusing. They also dont include the information which executor actually failed.
non_build
connection failures from the gateway are confusing when the gateway can t connect to an executor the error messages are quite confusing they also dont include the information which executor actually failed
0
551
8,555,814,137
IssuesEvent
2018-11-08 11:08:54
LiskHQ/lisk
https://api.github.com/repos/LiskHQ/lisk
closed
Node receives blocks during snapshotting
*medium :hammer: reliability chain p2p performance
### Expected behavior The node should not receive blocks/transaction during snapshotting process. ### Actual behavior Node receives blocks during snapshotting ``` [inf] 2018-09-25 08:23:19 | Verify->verifyBlock succeeded for block 8323462238787390927 at height 1616. [inf] 2018-09-25 08:23:20 | Rebuilding accounts states, current round: 17, height: 1617 [WRN] 2018-09-25 08:23:20 | Discarded block that does not match with current chain: 13500374753101519740 height: 6298816 round: 62365 slot: 7375460 generator: 6a8d02899c66dfa2423b125f44d360be6da0669cedadde32e63e629cb2e3195c [inf] 2018-09-25 08:23:20 | Verify->verifyBlock succeeded for block 8197888260741386974 at height 1617. ``` ### Steps to reproduce Run snapshotting on node which is running under devnet/testnet/mainnet (`1.1.0`) ### Which version(s) does this affect? (Environment, OS, etc...)
True
Node receives blocks during snapshotting - ### Expected behavior The node should not receive blocks/transaction during snapshotting process. ### Actual behavior Node receives blocks during snapshotting ``` [inf] 2018-09-25 08:23:19 | Verify->verifyBlock succeeded for block 8323462238787390927 at height 1616. [inf] 2018-09-25 08:23:20 | Rebuilding accounts states, current round: 17, height: 1617 [WRN] 2018-09-25 08:23:20 | Discarded block that does not match with current chain: 13500374753101519740 height: 6298816 round: 62365 slot: 7375460 generator: 6a8d02899c66dfa2423b125f44d360be6da0669cedadde32e63e629cb2e3195c [inf] 2018-09-25 08:23:20 | Verify->verifyBlock succeeded for block 8197888260741386974 at height 1617. ``` ### Steps to reproduce Run snapshotting on node which is running under devnet/testnet/mainnet (`1.1.0`) ### Which version(s) does this affect? (Environment, OS, etc...)
non_build
node receives blocks during snapshotting expected behavior the node should not receive blocks transaction during snapshotting process actual behavior node receives blocks during snapshotting verify verifyblock succeeded for block at height rebuilding accounts states current round height discarded block that does not match with current chain height round slot generator verify verifyblock succeeded for block at height steps to reproduce run snapshotting on node which is running under devnet testnet mainnet which version s does this affect environment os etc
0
33,762
9,205,102,098
IssuesEvent
2019-03-08 09:37:43
qissue-bot/QGIS
https://api.github.com/repos/qissue-bot/QGIS
closed
qgis-0.8.0.dmg - no mountable file systems
Category: Build/Install Component: Affected QGIS version Component: Crashes QGIS or corrupts data Component: Easy fix? Component: Operating System Component: Pull Request or Patch supplied Component: Regression? Component: Resolution Priority: Low Project: QGIS Application Status: Closed Tracker: Bug report
--- Author Name: **mike-coan-gmail-com -** (mike-coan-gmail-com -) Original Redmine Issue: 517, https://issues.qgis.org/issues/517 Original Assignee: nobody - --- Gunzip's correctly but failure to mount "qgis-0.8.0.dmg" - stated reason, "no mountable file systems". Same issue with single large file download, and with bit-torrent download. Os X v10.4.8 on macbook pro (intel core duo)
1.0
qgis-0.8.0.dmg - no mountable file systems - --- Author Name: **mike-coan-gmail-com -** (mike-coan-gmail-com -) Original Redmine Issue: 517, https://issues.qgis.org/issues/517 Original Assignee: nobody - --- Gunzip's correctly but failure to mount "qgis-0.8.0.dmg" - stated reason, "no mountable file systems". Same issue with single large file download, and with bit-torrent download. Os X v10.4.8 on macbook pro (intel core duo)
build
qgis dmg no mountable file systems author name mike coan gmail com mike coan gmail com original redmine issue original assignee nobody gunzip s correctly but failure to mount qgis dmg stated reason no mountable file systems same issue with single large file download and with bit torrent download os x on macbook pro intel core duo
1
315
2,643,324,746
IssuesEvent
2015-03-12 10:06:22
mattrayner/mattrayner
https://api.github.com/repos/mattrayner/mattrayner
closed
Set up monitoring and 'DevOps'
infrastructure task
Set up and integrate: * Slack * CircleCI Slack Notifications * Code Climate * GitHub Issue Creation * Slack Notifications * NewRelic monitoring * Slack Notifications * Email Notifications
1.0
Set up monitoring and 'DevOps' - Set up and integrate: * Slack * CircleCI Slack Notifications * Code Climate * GitHub Issue Creation * Slack Notifications * NewRelic monitoring * Slack Notifications * Email Notifications
non_build
set up monitoring and devops set up and integrate slack circleci slack notifications code climate github issue creation slack notifications newrelic monitoring slack notifications email notifications
0
9,558
4,544,944,890
IssuesEvent
2016-09-11 00:03:34
CleverRaven/Cataclysm-DDA
https://api.github.com/repos/CleverRaven/Cataclysm-DDA
opened
CodeBlocks project Release build weirdness
Build
Specifies both `-Os` and `-O2` flags Doesn't `-Os` include `-O2` anyway? Includes debug symbols in the executable As a result, the executable is almost 300mb.
1.0
CodeBlocks project Release build weirdness - Specifies both `-Os` and `-O2` flags Doesn't `-Os` include `-O2` anyway? Includes debug symbols in the executable As a result, the executable is almost 300mb.
build
codeblocks project release build weirdness specifies both os and flags doesn t os include anyway includes debug symbols in the executable as a result the executable is almost
1
3,885
3,266,269,356
IssuesEvent
2015-10-22 19:55:36
openshift/origin
https://api.github.com/repos/openshift/origin
closed
Internal registry push loading bar
area/usability component/build priority/P3
I want to see pushing to internal registry with loading bar: -bash-4.2$ oc logs -f apaas-1-2708674907-build Internal Error: pod is not in 'Running', 'Succeeded' or 'Failed' state - State: "Pending" -bash-4.2$ oc logs -f apaas-1-2708674907-build Already on 'master' W1022 08:40:30.217551 1 docker.go:304] The 'io.s2i.scripts-url' label is deprecated. Use "io.openshift.s2i.scripts-url" instead. W1022 08:40:30.236089 1 docker.go:304] The 'io.s2i.scripts-url' label is deprecated. Use "io.openshift.s2i.scripts-url" instead. W1022 08:40:30.294958 1 docker.go:304] The 'io.s2i.scripts-url' label is deprecated. Use "io.openshift.s2i.scripts-url" instead. I1022 08:40:31.313216 1 sti.go:407] Copying all WAR artifacts from /home/jboss/source/deployments directory into /opt/webserver/webapps for later deployment... I1022 08:40:31.346271 1 sti.go:407] '/home/jboss/source/deployments/prime-face.war' -> '/opt/webserver/webapps/prime-face.war' I1022 08:40:40.333435 1 sti.go:149] Using provided push secret for pushing 172.17.131.32:5000/mangis/apaas:latest image I1022 08:40:40.333475 1 sti.go:151] Pushing 172.17.131.32:5000/mangis/apaas:latest image ... My images (STI builders) are quite big and sometimes due some performance issues it hand up on push procedure. So status on push state would help to identify hanged builds.
1.0
Internal registry push loading bar - I want to see pushing to internal registry with loading bar: -bash-4.2$ oc logs -f apaas-1-2708674907-build Internal Error: pod is not in 'Running', 'Succeeded' or 'Failed' state - State: "Pending" -bash-4.2$ oc logs -f apaas-1-2708674907-build Already on 'master' W1022 08:40:30.217551 1 docker.go:304] The 'io.s2i.scripts-url' label is deprecated. Use "io.openshift.s2i.scripts-url" instead. W1022 08:40:30.236089 1 docker.go:304] The 'io.s2i.scripts-url' label is deprecated. Use "io.openshift.s2i.scripts-url" instead. W1022 08:40:30.294958 1 docker.go:304] The 'io.s2i.scripts-url' label is deprecated. Use "io.openshift.s2i.scripts-url" instead. I1022 08:40:31.313216 1 sti.go:407] Copying all WAR artifacts from /home/jboss/source/deployments directory into /opt/webserver/webapps for later deployment... I1022 08:40:31.346271 1 sti.go:407] '/home/jboss/source/deployments/prime-face.war' -> '/opt/webserver/webapps/prime-face.war' I1022 08:40:40.333435 1 sti.go:149] Using provided push secret for pushing 172.17.131.32:5000/mangis/apaas:latest image I1022 08:40:40.333475 1 sti.go:151] Pushing 172.17.131.32:5000/mangis/apaas:latest image ... My images (STI builders) are quite big and sometimes due some performance issues it hand up on push procedure. So status on push state would help to identify hanged builds.
build
internal registry push loading bar i want to see pushing to internal registry with loading bar bash oc logs f apaas build internal error pod is not in running succeeded or failed state state pending bash oc logs f apaas build already on master docker go the io scripts url label is deprecated use io openshift scripts url instead docker go the io scripts url label is deprecated use io openshift scripts url instead docker go the io scripts url label is deprecated use io openshift scripts url instead sti go copying all war artifacts from home jboss source deployments directory into opt webserver webapps for later deployment sti go home jboss source deployments prime face war opt webserver webapps prime face war sti go using provided push secret for pushing mangis apaas latest image sti go pushing mangis apaas latest image my images sti builders are quite big and sometimes due some performance issues it hand up on push procedure so status on push state would help to identify hanged builds
1
188,585
14,447,513,163
IssuesEvent
2020-12-08 03:58:54
kalexmills/github-vet-tests-dec2020
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
closed
BinaryTreeNode/KDS: vendor/github.com/gophercloud/gophercloud/acceptance/openstack/dns/v2/recordsets_test.go; 3 LoC
fresh test tiny vendored
Found a possible issue in [BinaryTreeNode/KDS](https://www.github.com/BinaryTreeNode/KDS) at [vendor/github.com/gophercloud/gophercloud/acceptance/openstack/dns/v2/recordsets_test.go](https://github.com/BinaryTreeNode/KDS/blob/6220475814b42733c86ac0005e8548bb9a481c75/vendor/github.com/gophercloud/gophercloud/acceptance/openstack/dns/v2/recordsets_test.go#L36-L38) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to recordset at line 37 may start a goroutine [Click here to see the code in its original context.](https://github.com/BinaryTreeNode/KDS/blob/6220475814b42733c86ac0005e8548bb9a481c75/vendor/github.com/gophercloud/gophercloud/acceptance/openstack/dns/v2/recordsets_test.go#L36-L38) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for _, recordset := range allRecordSets { tools.PrintResource(t, &recordset) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 6220475814b42733c86ac0005e8548bb9a481c75
1.0
BinaryTreeNode/KDS: vendor/github.com/gophercloud/gophercloud/acceptance/openstack/dns/v2/recordsets_test.go; 3 LoC - Found a possible issue in [BinaryTreeNode/KDS](https://www.github.com/BinaryTreeNode/KDS) at [vendor/github.com/gophercloud/gophercloud/acceptance/openstack/dns/v2/recordsets_test.go](https://github.com/BinaryTreeNode/KDS/blob/6220475814b42733c86ac0005e8548bb9a481c75/vendor/github.com/gophercloud/gophercloud/acceptance/openstack/dns/v2/recordsets_test.go#L36-L38) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to recordset at line 37 may start a goroutine [Click here to see the code in its original context.](https://github.com/BinaryTreeNode/KDS/blob/6220475814b42733c86ac0005e8548bb9a481c75/vendor/github.com/gophercloud/gophercloud/acceptance/openstack/dns/v2/recordsets_test.go#L36-L38) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for _, recordset := range allRecordSets { tools.PrintResource(t, &recordset) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 6220475814b42733c86ac0005e8548bb9a481c75
non_build
binarytreenode kds vendor github com gophercloud gophercloud acceptance openstack dns recordsets test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to recordset at line may start a goroutine click here to show the line s of go which triggered the analyzer go for recordset range allrecordsets tools printresource t recordset leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
0
102,151
4,151,284,495
IssuesEvent
2016-06-15 20:06:18
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
closed
Protobuf communications
area/api area/performance priority/P1 team/CSI-API Machinery SIG
re: #3338 Post 1.0 and etcd v3. It would be worth while to investigate the tradeoffs for using protobuf for internal communication whose external representation is json for performance and scale out reasons.
1.0
Protobuf communications - re: #3338 Post 1.0 and etcd v3. It would be worth while to investigate the tradeoffs for using protobuf for internal communication whose external representation is json for performance and scale out reasons.
non_build
protobuf communications re post and etcd it would be worth while to investigate the tradeoffs for using protobuf for internal communication whose external representation is json for performance and scale out reasons
0
7,045
2,596,715,116
IssuesEvent
2015-02-20 22:43:00
chessmasterhong/WaterEmblem
https://api.github.com/repos/chessmasterhong/WaterEmblem
closed
Weapon broke mid-battle, freezing game
bug high priority
I believe my weapon broke mid-battle, causing the errors below: ![screen shot 2015-02-20 at 12 42 49 pm](https://cloud.githubusercontent.com/assets/1226900/6306068/2aec4024-b8fe-11e4-8a0d-649f05c2a95e.png)
1.0
Weapon broke mid-battle, freezing game - I believe my weapon broke mid-battle, causing the errors below: ![screen shot 2015-02-20 at 12 42 49 pm](https://cloud.githubusercontent.com/assets/1226900/6306068/2aec4024-b8fe-11e4-8a0d-649f05c2a95e.png)
non_build
weapon broke mid battle freezing game i believe my weapon broke mid battle causing the errors below
0
77,940
22,047,859,851
IssuesEvent
2022-05-30 05:13:47
google/mediapipe
https://api.github.com/repos/google/mediapipe
closed
<Solved> from mediapipe.python._framework_bindings import
type:build/install stat:awaiting response platform:python stalled
**Versions:** python version - **3.8** (I did not use anaconda) OS - **Windows** system terminal the rest are all based on installation **tutorial** Suppose you are not using 'pip3 install mediapipe' to install mediapipe. Instead, you use python package, and the last step is '**python setup.py install --link-opencv**'. **Errors:** from mediapipe.python._framework_bindings import resource_util ImportError: DLL load failed: The specified module could not be found. **Solutions:** If #1839 #1405 doesn't work for you, please check if you run 'import mediapipe' in the mediapipe repository. (I do not need Visual C++ redistributable packages OR msvc-runtime) **Reminder:** DO NOT run test ‘import mediapipe’ in the mediapipe repository Firstly, you should run '**pip3 install absl-py attrs matplotlib opencv-contrib-python protobuf --user**' in your terminal since requirements.txt lacks some libraries. Then, for me, I run 'import mediapipe' in **IDLE (Python 3.8 64-bit)**. It should work! **Last check step** -> >> pip3 list ``` Package Version --------------------- --------- matplotlib 3.5.1 mediapipe dev numpy 1.22.2 opencv-contrib-python 4.5.5.62 opencv-python 4.5.5.62 ... ``` There should not be version number here. Enjoy editing and creating your own mediapipe :) I made this post since there are still many people asking the same question. I am glad to share my solution which is unqiue but effective.
1.0
<Solved> from mediapipe.python._framework_bindings import - **Versions:** python version - **3.8** (I did not use anaconda) OS - **Windows** system terminal the rest are all based on installation **tutorial** Suppose you are not using 'pip3 install mediapipe' to install mediapipe. Instead, you use python package, and the last step is '**python setup.py install --link-opencv**'. **Errors:** from mediapipe.python._framework_bindings import resource_util ImportError: DLL load failed: The specified module could not be found. **Solutions:** If #1839 #1405 doesn't work for you, please check if you run 'import mediapipe' in the mediapipe repository. (I do not need Visual C++ redistributable packages OR msvc-runtime) **Reminder:** DO NOT run test ‘import mediapipe’ in the mediapipe repository Firstly, you should run '**pip3 install absl-py attrs matplotlib opencv-contrib-python protobuf --user**' in your terminal since requirements.txt lacks some libraries. Then, for me, I run 'import mediapipe' in **IDLE (Python 3.8 64-bit)**. It should work! **Last check step** -> >> pip3 list ``` Package Version --------------------- --------- matplotlib 3.5.1 mediapipe dev numpy 1.22.2 opencv-contrib-python 4.5.5.62 opencv-python 4.5.5.62 ... ``` There should not be version number here. Enjoy editing and creating your own mediapipe :) I made this post since there are still many people asking the same question. I am glad to share my solution which is unqiue but effective.
build
from mediapipe python framework bindings import versions python version i did not use anaconda os windows system terminal the rest are all based on installation tutorial suppose you are not using install mediapipe to install mediapipe instead you use python package and the last step is python setup py install link opencv errors from mediapipe python framework bindings import resource util importerror dll load failed the specified module could not be found solutions if doesn t work for you please check if you run import mediapipe in the mediapipe repository i do not need visual c redistributable packages or msvc runtime reminder do not run test ‘import mediapipe’ in the mediapipe repository firstly you should run install absl py attrs matplotlib opencv contrib python protobuf user in your terminal since requirements txt lacks some libraries then for me i run import mediapipe in idle python bit it should work last check step list package version matplotlib mediapipe dev numpy opencv contrib python opencv python there should not be version number here enjoy editing and creating your own mediapipe i made this post since there are still many people asking the same question i am glad to share my solution which is unqiue but effective
1
18,579
6,625,301,628
IssuesEvent
2017-09-22 14:57:27
csmk/frabjous
https://api.github.com/repos/csmk/frabjous
closed
net-analyzer/prometheus-*: new packages
in progress new ebuild
- [x] [alertmanager](https://github.com/prometheus/alertmanager) > aaa3851 - [x] [node_exporter](https://github.com/prometheus/node_exporter) > 8cfd497 - [x] [mysqld_exporter](https://github.com/prometheus/mysqld_exporter) > 7ff6949 - [x] [memcached_exporter](https://github.com/prometheus/memcached_exporter) > 9054a7b - [x] [statsd_exporter](https://github.com/prometheus/statsd_exporter) > d21878d - [x] [haproxy_exporter](https://github.com/prometheus/haproxy_exporter) > 2d5e6b3 - [x] [blackbox_exporter](https://github.com/prometheus/blackbox_exporter) > 77378ea - [x] [influxdb_exporter](https://github.com/prometheus/influxdb_exporter) > 5405c93 - [x] [collectd_exporter](https://github.com/prometheus/collectd_exporter) > 525cc86 - [x] [graphite_exporter](https://github.com/prometheus/graphite_exporter) > bc4de2b - [x] [redis_exporter](https://github.com/oliver006/redis_exporter) > cca7a84 - [x] [pushgateway](https://github.com/prometheus/pushgateway) > ab3e8e4 - [x] [snmp_exporter](https://github.com/prometheus/snmp_exporter) > abf9b0b - [x] [unbound_exporter](https://github.com/kumina/unbound_exporter) > 76ad323 - [x] [postgres_exporter](https://github.com/wrouesnel/postgres_exporter) > bec3835 - [x] [postfix_exporter](https://github.com/kumina/postfix_exporter) > d6275f3 - [x] [dovecot_exporter](https://github.com/kumina/dovecot_exporter) > 0c765b8 - [x] [openvpn_exporter](https://github.com/kumina/openvpn_exporter) > 8051b3d - [x] [phpfpm_exporter](https://github.com/kumina/phpfpm_exporter) > faa1117 - [x] [nginx-vts-exporter](https://github.com/hnlq715/nginx-vts-exporter) (~~depend on [nginx-module-vts](https://github.com/vozlt/nginx-module-vts)~~ done: c47cd96) > a3b1146 - [x] [process-exporter](https://github.com/ncabatoff/process-exporter) > 9cbf29d - [x] [script_exporter ](https://github.com/adhocteam/script_exporter) > f04369a - [x] [nsq_exporter](https://github.com/lovoo/nsq_exporter) > 75a1fee ~~[procfs](https://github.com/prometheus/procfs) (low 
priority: still a WIP)~~
1.0
net-analyzer/prometheus-*: new packages - - [x] [alertmanager](https://github.com/prometheus/alertmanager) > aaa3851 - [x] [node_exporter](https://github.com/prometheus/node_exporter) > 8cfd497 - [x] [mysqld_exporter](https://github.com/prometheus/mysqld_exporter) > 7ff6949 - [x] [memcached_exporter](https://github.com/prometheus/memcached_exporter) > 9054a7b - [x] [statsd_exporter](https://github.com/prometheus/statsd_exporter) > d21878d - [x] [haproxy_exporter](https://github.com/prometheus/haproxy_exporter) > 2d5e6b3 - [x] [blackbox_exporter](https://github.com/prometheus/blackbox_exporter) > 77378ea - [x] [influxdb_exporter](https://github.com/prometheus/influxdb_exporter) > 5405c93 - [x] [collectd_exporter](https://github.com/prometheus/collectd_exporter) > 525cc86 - [x] [graphite_exporter](https://github.com/prometheus/graphite_exporter) > bc4de2b - [x] [redis_exporter](https://github.com/oliver006/redis_exporter) > cca7a84 - [x] [pushgateway](https://github.com/prometheus/pushgateway) > ab3e8e4 - [x] [snmp_exporter](https://github.com/prometheus/snmp_exporter) > abf9b0b - [x] [unbound_exporter](https://github.com/kumina/unbound_exporter) > 76ad323 - [x] [postgres_exporter](https://github.com/wrouesnel/postgres_exporter) > bec3835 - [x] [postfix_exporter](https://github.com/kumina/postfix_exporter) > d6275f3 - [x] [dovecot_exporter](https://github.com/kumina/dovecot_exporter) > 0c765b8 - [x] [openvpn_exporter](https://github.com/kumina/openvpn_exporter) > 8051b3d - [x] [phpfpm_exporter](https://github.com/kumina/phpfpm_exporter) > faa1117 - [x] [nginx-vts-exporter](https://github.com/hnlq715/nginx-vts-exporter) (~~depend on [nginx-module-vts](https://github.com/vozlt/nginx-module-vts)~~ done: c47cd96) > a3b1146 - [x] [process-exporter](https://github.com/ncabatoff/process-exporter) > 9cbf29d - [x] [script_exporter ](https://github.com/adhocteam/script_exporter) > f04369a - [x] [nsq_exporter](https://github.com/lovoo/nsq_exporter) > 75a1fee 
~~[procfs](https://github.com/prometheus/procfs) (low priority: still a WIP)~~
build
net analyzer prometheus new packages depend on done low priority still a wip
1
21,293
6,133,112,155
IssuesEvent
2017-06-25 10:41:14
yunity/foodsaving-frontend
https://api.github.com/repos/yunity/foodsaving-frontend
closed
Make styl files more uniform
code-improvement code-question
The `.styl` files are in the [stylus](http://stylus-lang.com/) language. The syntax is pretty flexible and allows leaving away `:;{}` etc, but also supports the full CSS syntax. Right now, we have a wild mixture of different styles. I would wish for more uniform code and would be interested in defining a style. There's also https://github.com/SimenB/stylint to enforce this. Optionally, we could also get rid of Stylus and switch to plain CSS if there's interest among you. I quite like the features of Stylus though.
2.0
Make styl files more uniform - The `.styl` files are in the [stylus](http://stylus-lang.com/) language. The syntax is pretty flexible and allows leaving away `:;{}` etc, but also supports the full CSS syntax. Right now, we have a wild mixture of different styles. I would wish for more uniform code and would be interested in defining a style. There's also https://github.com/SimenB/stylint to enforce this. Optionally, we could also get rid of Stylus and switch to plain CSS if there's interest among you. I quite like the features of Stylus though.
non_build
make styl files more uniform the styl files are in the language the syntax is pretty flexible and allows leaving away etc but also supports the full css syntax right now we have wild mixture of different styles i would wish for more uniform code and would be interested in defining a style there s also to enforce this optionally we could also get rid of stylus and switch to plain css if there s interest among you i quite like the features of stylus though
0
60,033
14,697,066,480
IssuesEvent
2021-01-04 01:58:15
ClickHouse/ClickHouse
https://api.github.com/repos/ClickHouse/ClickHouse
closed
FAILED: programs/clickhouse-odbc-bridge when using ninja to build clickhouse
build
**Operating system** WSL2 on Windows 10, Linux version 4.19.104-microsoft-standard **Cmake version** 3.15 **Ninja version** 1.9.0 **Compiler name and version** GCC 10.2.0 **Full cmake and/or ninja output** ```c++ [kangxiang@Wsl[17:26:53]build_20.13.1.1]$ ninja [0/2] Re-checking globbed directories... [1043/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/nfa/limex_simd256.c.o In file included from ../contrib/hyperscan/src/nfa/limex_simd256.c:43: ../contrib/hyperscan/src/util/simd_utils.h: In function ‘testbit256’: ../contrib/hyperscan/src/util/simd_utils.h:684:6: note: the ABI for passing parameters with 32-byte alignment has changed in GCC 4.6 684 | char testbit256(m256 val, unsigned int n) { | ^~~~~~~~~~ [1045/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/nfa/limex_simd512.c.o In file included from ../contrib/hyperscan/src/nfa/limex_simd512.c:43: ../contrib/hyperscan/src/util/simd_utils.h: In function ‘testbit512’: ../contrib/hyperscan/src/util/simd_utils.h:1296:6: note: the ABI for passing parameters with 64-byte alignment has changed in GCC 4.6 1296 | char testbit512(m512 val, unsigned int n) { | ^~~~~~~~~~ [1157/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/rose/program_runtime.c.o In file included from ../contrib/hyperscan/src/rose/counting_miracle.h:36, from ../contrib/hyperscan/src/rose/program_runtime.c:37: ../contrib/hyperscan/src/util/simd_utils.h: In function ‘storeu256’: ../contrib/hyperscan/src/util/simd_utils.h:608:27: note: the ABI for passing parameters with 32-byte alignment has changed in GCC 4.6 608 | static really_inline void storeu256(void *ptr, m256 a) { | ^~~~~~~~~ [1175/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/rose/stream.c.o In file included from ../contrib/hyperscan/src/rose/counting_miracle.h:36, from ../contrib/hyperscan/src/rose/stream.c:30: 
../contrib/hyperscan/src/util/simd_utils.h: In function ‘storeu256’: ../contrib/hyperscan/src/util/simd_utils.h:608:27: note: the ABI for passing parameters with 32-byte alignment has changed in GCC 4.6 608 | static really_inline void storeu256(void *ptr, m256 a) { | ^~~~~~~~~ [1198/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/util/state_compress.c.o ../contrib/hyperscan/src/util/state_compress.c: In function ‘storecompressed256_64bit’: ../contrib/hyperscan/src/util/state_compress.c:216:6: note: the ABI for passing parameters with 32-byte alignment has changed in GCC 4.6 216 | void storecompressed256_64bit(void *ptr, m256 xvec, m256 mvec) { | ^~~~~~~~~~~~~~~~~~~~~~~~ [3524/9432] Generating VCSRevision.h -- Found Git: /home/kangxiang/softwares/bin/git (found version "2.20.1") [4482/9432] Building CXX object contrib/llvm/llvm/lib/Target/X86/CMakeFiles/LLVMX86CodeGen.dir/X86ISelDAGToDAG.cpp.o In file included from ../contrib/llvm/llvm/lib/Target/X86/X86ISelDAGToDAG.cpp:206: contrib/llvm/llvm/lib/Target/X86/X86GenDAGISel.inc: In member function ‘virtual bool {anonymous}::X86DAGToDAGISel::CheckNodePredicate(llvm::SDNode*, unsigned int) const’: contrib/llvm/llvm/lib/Target/X86/X86GenDAGISel.inc:267202: note: ‘-Wmisleading-indentation’ is disabled from this point onwards, since column-tracking was disabled due to the size of the code/headers 267202 | return true; | [9308/9432] Linking CXX executable programs/clickhouse-odbc-bridge FAILED: programs/clickhouse-odbc-bridge : && /home/kangxiang/softwares/gcc10.2.0/bin/g++ -fdiagnostics-color=always -std=gnu++2a -fsized-deallocation -pipe -msse4.1 -msse4.2 -mpopcnt -Wall -Werror -Wextra -Wframe-larger-than=65536 -O2 -g -DNDEBUG -O3 -fno-pie -fuse-ld=gold -rdynamic -Wl,--no-undefined -Wl,-no-pie -Wl,--no-export-dynamic -rdynamic src/CMakeFiles/clickhouse_malloc.dir/Common/malloc.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/ColumnInfoHandler.cpp.o 
programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/getIdentifierQuote.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/HandlerFactory.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/IdentifierQuoteHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/MainHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/ODBCBlockInputStream.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/ODBCBlockOutputStream.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/ODBCBridge.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/PingHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/SchemaAllowedHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/validateODBCConnectionString.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/odbc-bridge.cpp.o -o programs/clickhouse-odbc-bridge src/libclickhouse_new_delete.a base/daemon/libdaemon.a src/libdbms.a src/Parsers/libclickhouse_parsers.a contrib/poco-cmake/Data/lib_poco_data.a contrib/poco-cmake/Data/ODBC/lib_poco_data_odbc.a base/loggers/libloggers.a src/libdbms.a contrib/jemalloc-cmake/libjemalloc.a src/Parsers/New/libclickhouse_parsers_new.a src/Parsers/libclickhouse_parsers.a contrib/antlr4-runtime-cmake/libantlr4-runtime.a contrib/llvm/llvm/lib/libLLVMOrcJIT.a contrib/llvm/llvm/lib/libLLVMJITLink.a contrib/llvm/llvm/lib/libLLVMExecutionEngine.a contrib/llvm/llvm/lib/libLLVMRuntimeDyld.a contrib/llvm/llvm/lib/libLLVMX86CodeGen.a contrib/llvm/llvm/lib/libLLVMX86Desc.a contrib/llvm/llvm/lib/libLLVMX86Info.a contrib/llvm/llvm/lib/libLLVMX86Utils.a contrib/llvm/llvm/lib/libLLVMAsmPrinter.a contrib/llvm/llvm/lib/libLLVMDebugInfoDWARF.a contrib/llvm/llvm/lib/libLLVMGlobalISel.a contrib/llvm/llvm/lib/libLLVMSelectionDAG.a contrib/llvm/llvm/lib/libLLVMMCDisassembler.a contrib/llvm/llvm/lib/libLLVMPasses.a contrib/llvm/llvm/lib/libLLVMCodeGen.a 
contrib/llvm/llvm/lib/libLLVMipo.a contrib/llvm/llvm/lib/libLLVMIRReader.a contrib/llvm/llvm/lib/libLLVMAsmParser.a contrib/llvm/llvm/lib/libLLVMLinker.a contrib/llvm/llvm/lib/libLLVMBitWriter.a contrib/llvm/llvm/lib/libLLVMInstrumentation.a contrib/llvm/llvm/lib/libLLVMScalarOpts.a contrib/llvm/llvm/lib/libLLVMAggressiveInstCombine.a contrib/llvm/llvm/lib/libLLVMInstCombine.a contrib/llvm/llvm/lib/libLLVMVectorize.a contrib/llvm/llvm/lib/libLLVMTransformUtils.a contrib/llvm/llvm/lib/libLLVMTarget.a contrib/llvm/llvm/lib/libLLVMAnalysis.a contrib/llvm/llvm/lib/libLLVMProfileData.a contrib/llvm/llvm/lib/libLLVMObject.a contrib/llvm/llvm/lib/libLLVMBitReader.a contrib/llvm/llvm/lib/libLLVMCore.a contrib/llvm/llvm/lib/libLLVMRemarks.a contrib/llvm/llvm/lib/libLLVMBitstreamReader.a contrib/llvm/llvm/lib/libLLVMMCParser.a contrib/llvm/llvm/lib/libLLVMMC.a contrib/llvm/llvm/lib/libLLVMBinaryFormat.a contrib/llvm/llvm/lib/libLLVMDebugInfoCodeView.a contrib/llvm/llvm/lib/libLLVMDebugInfoMSF.a contrib/llvm/llvm/lib/libLLVMSupport.a contrib/llvm/llvm/lib/libLLVMDemangle.a contrib/cppkafka-cmake/libcppkafka.a contrib/librdkafka-cmake/librdkafka.a contrib/amqpcpp-cmake/libamqp-cpp.a contrib/cyrus-sasl-cmake/libsasl2.a contrib/nuraft-cmake/libnuraft.a contrib/boost-cmake/lib_boost_coroutine.a src/Dictionaries/Embedded/libclickhouse_dictionaries_embedded.a contrib/poco-cmake/MongoDB/lib_poco_mongodb.a base/mysqlxx/libmysqlxx.a contrib/mariadb-connector-c/libmariadb/libmariadbclient.a contrib/icu-cmake/libicui18n.a contrib/icu-cmake/libicuuc.a contrib/icu-cmake/libicudata.a contrib/capnproto-cmake/libcapnpc.a contrib/capnproto-cmake/libcapnp.a contrib/capnproto-cmake/libkj.a contrib/arrow-cmake/libparquet_static.a contrib/arrow-cmake/libarrow_static.a contrib/boost-cmake/lib_boost_filesystem.a contrib/flatbuffers/libflatbuffers.a contrib/arrow-cmake/libthrift_static.a contrib/boost-cmake/lib_boost_regex.a contrib/avro-cmake/libavrocpp.a contrib/boost-cmake/lib_boost_iostreams.a 
contrib/openldap-cmake/libldap_r.a contrib/openldap-cmake/liblber.a src/Server/grpc_protos/libclickhouse_grpc_protos.a contrib/grpc/libgrpc++.a contrib/grpc/libgrpc.a contrib/grpc/third_party/cares/cares/lib/libcares.a contrib/abseil-cpp/absl/status/libabsl_status.a contrib/abseil-cpp/absl/strings/libabsl_cord.a contrib/abseil-cpp/absl/hash/libabsl_hash.a contrib/abseil-cpp/absl/types/libabsl_bad_variant_access.a contrib/abseil-cpp/absl/hash/libabsl_city.a contrib/abseil-cpp/absl/container/libabsl_raw_hash_set.a contrib/abseil-cpp/absl/types/libabsl_bad_optional_access.a contrib/abseil-cpp/absl/container/libabsl_hashtablez_sampler.a contrib/abseil-cpp/absl/base/libabsl_exponential_biased.a contrib/grpc/libaddress_sorting.a contrib/grpc/libupb.a contrib/grpc/libgpr.a contrib/abseil-cpp/absl/synchronization/libabsl_synchronization.a contrib/abseil-cpp/absl/debugging/libabsl_stacktrace.a contrib/abseil-cpp/absl/debugging/libabsl_symbolize.a contrib/abseil-cpp/absl/debugging/libabsl_debugging_internal.a contrib/abseil-cpp/absl/debugging/libabsl_demangle_internal.a contrib/abseil-cpp/absl/synchronization/libabsl_graphcycles_internal.a contrib/abseil-cpp/absl/time/libabsl_time.a contrib/abseil-cpp/absl/time/libabsl_civil_time.a contrib/abseil-cpp/absl/time/libabsl_time_zone.a contrib/abseil-cpp/absl/base/libabsl_malloc_internal.a contrib/abseil-cpp/absl/strings/libabsl_str_format_internal.a contrib/abseil-cpp/absl/strings/libabsl_strings.a contrib/abseil-cpp/absl/strings/libabsl_strings_internal.a contrib/abseil-cpp/absl/numeric/libabsl_int128.a contrib/abseil-cpp/absl/base/libabsl_throw_delegate.a contrib/abseil-cpp/absl/base/libabsl_base.a contrib/abseil-cpp/absl/base/libabsl_raw_logging_internal.a contrib/abseil-cpp/absl/base/libabsl_dynamic_annotations.a contrib/abseil-cpp/absl/base/libabsl_log_severity.a contrib/abseil-cpp/absl/base/libabsl_spinlock_wait.a -lrt contrib/libhdfs3-cmake/libhdfs3.a contrib/protobuf/libprotobuf.a contrib/libgsasl/liblibgsasl.a 
contrib/krb5-cmake/libkrb5.a contrib/libxml2-cmake/liblibxml2.a contrib/cassandra/libcassandra_static.a contrib/libuv/libuv_a.a -lrt contrib/rocksdb-cmake/librocksdb.a contrib/lz4-cmake/liblz4.a contrib/snappy/libsnappy.a contrib/boost-cmake/lib_boost_context.a src/Common/Config/libclickhouse_common_config.a src/Common/ZooKeeper/libclickhouse_common_zookeeper.a src/libclickhouse_common_io.a contrib/boost-cmake/lib_boost_program_options.a contrib/zstd-cmake/libzstd.a base/widechar_width/libwidechar_width.a contrib/double-conversion-cmake/libdouble-conversion.a contrib/dragonbox-cmake/libdragonbox_to_chars.a contrib/re2/libre2.a contrib/re2_st/libre2_st.a contrib/libcpuid-cmake/libcpuid.a contrib/croaring-cmake/libroaring.a contrib/xz/liblzma.a -pthread contrib/aws-s3-cmake/libaws_s3.a contrib/aws-s3-cmake/libaws_s3_checksums.a contrib/brotli-cmake/libbrotli.a src/Common/StringUtils/libstring_utils.a base/common/libcommon.a contrib/boost-cmake/lib_boost_system.a contrib/cityhash102/libcityhash.a contrib/FastMemcpy/libFastMemcpy.a contrib/poco-cmake/Net/SSL/lib_poco_net_ssl.a contrib/poco-cmake/Net/lib_poco_net.a contrib/poco-cmake/Crypto/lib_poco_crypto.a contrib/poco-cmake/Util/lib_poco_util.a contrib/poco-cmake/JSON/lib_poco_json.a contrib/poco-cmake/JSON/lib_poco_json_pdjson.a contrib/poco-cmake/XML/lib_poco_xml.a contrib/poco-cmake/XML/lib_poco_xml_expat.a contrib/replxx-cmake/libreplxx.a contrib/cctz-cmake/libcctz.a -Wl,--whole-archive /home/kangxiang/ClickHouse/build_20.13.1.1/contrib/cctz-cmake/libtzdata.a -Wl,--no-whole-archive contrib/fmtlib-cmake/libfmt.a contrib/sentry-native/libsentry.a contrib/curl-cmake/libcurl.a contrib/boringssl-cmake/libssl.a contrib/boringssl-cmake/libcrypto.a -lpthread -ldl contrib/poco-cmake/Data/lib_poco_data.a contrib/poco-cmake/Foundation/lib_poco_foundation.a contrib/zlib-ng/libzlib.a contrib/poco-cmake/Foundation/lib_poco_foundation_pcre.a contrib/unixodbc-cmake/libunixodbc.a contrib/unixodbc-cmake/libltdl.a -Wl,--start-group 
base/glibc-compatibility/libglibc-compatibility.a contrib/libcxx-cmake/libcxx.a contrib/libcxxabi-cmake/libcxxabi.a contrib/libunwind-cmake/libunwind.a -Wl,--end-group -nodefaultlibs -lgcc -lc -lm -lrt -lpthread -ldl && : /usr/bin/ld.gold: fatal error: programs/clickhouse-odbc-bridge: No space left on device collect2: error: ld returned 1 exit status [9321/9432] Building CXX object src/Storages/System/CMakeFiles/clickhouse_storages_system.dir/StorageSystemDictionaries.cpp.o ninja: build stopped: subcommand failed. ``` Any ideas? Thanks in advance :)
1.0
FAILED: programs/clickhouse-odbc-bridge when using ninja to build clickhouse - **Operating system** WSL2 on Windows 10, Linux version 4.19.104-microsoft-standard **Cmake version** 3.15 **Ninja version** 1.9.0 **Compiler name and version** GCC 10.2.0 **Full cmake and/or ninja output** ```c++ [kangxiang@Wsl[17:26:53]build_20.13.1.1]$ ninja [0/2] Re-checking globbed directories... [1043/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/nfa/limex_simd256.c.o In file included from ../contrib/hyperscan/src/nfa/limex_simd256.c:43: ../contrib/hyperscan/src/util/simd_utils.h: In function ‘testbit256’: ../contrib/hyperscan/src/util/simd_utils.h:684:6: note: the ABI for passing parameters with 32-byte alignment has changed in GCC 4.6 684 | char testbit256(m256 val, unsigned int n) { | ^~~~~~~~~~ [1045/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/nfa/limex_simd512.c.o In file included from ../contrib/hyperscan/src/nfa/limex_simd512.c:43: ../contrib/hyperscan/src/util/simd_utils.h: In function ‘testbit512’: ../contrib/hyperscan/src/util/simd_utils.h:1296:6: note: the ABI for passing parameters with 64-byte alignment has changed in GCC 4.6 1296 | char testbit512(m512 val, unsigned int n) { | ^~~~~~~~~~ [1157/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/rose/program_runtime.c.o In file included from ../contrib/hyperscan/src/rose/counting_miracle.h:36, from ../contrib/hyperscan/src/rose/program_runtime.c:37: ../contrib/hyperscan/src/util/simd_utils.h: In function ‘storeu256’: ../contrib/hyperscan/src/util/simd_utils.h:608:27: note: the ABI for passing parameters with 32-byte alignment has changed in GCC 4.6 608 | static really_inline void storeu256(void *ptr, m256 a) { | ^~~~~~~~~ [1175/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/rose/stream.c.o In file included from 
../contrib/hyperscan/src/rose/counting_miracle.h:36, from ../contrib/hyperscan/src/rose/stream.c:30: ../contrib/hyperscan/src/util/simd_utils.h: In function ‘storeu256’: ../contrib/hyperscan/src/util/simd_utils.h:608:27: note: the ABI for passing parameters with 32-byte alignment has changed in GCC 4.6 608 | static really_inline void storeu256(void *ptr, m256 a) { | ^~~~~~~~~ [1198/9432] Building C object contrib/hyperscan-cmake/CMakeFiles/hyperscan.dir/__/hyperscan/src/util/state_compress.c.o ../contrib/hyperscan/src/util/state_compress.c: In function ‘storecompressed256_64bit’: ../contrib/hyperscan/src/util/state_compress.c:216:6: note: the ABI for passing parameters with 32-byte alignment has changed in GCC 4.6 216 | void storecompressed256_64bit(void *ptr, m256 xvec, m256 mvec) { | ^~~~~~~~~~~~~~~~~~~~~~~~ [3524/9432] Generating VCSRevision.h -- Found Git: /home/kangxiang/softwares/bin/git (found version "2.20.1") [4482/9432] Building CXX object contrib/llvm/llvm/lib/Target/X86/CMakeFiles/LLVMX86CodeGen.dir/X86ISelDAGToDAG.cpp.o In file included from ../contrib/llvm/llvm/lib/Target/X86/X86ISelDAGToDAG.cpp:206: contrib/llvm/llvm/lib/Target/X86/X86GenDAGISel.inc: In member function ‘virtual bool {anonymous}::X86DAGToDAGISel::CheckNodePredicate(llvm::SDNode*, unsigned int) const’: contrib/llvm/llvm/lib/Target/X86/X86GenDAGISel.inc:267202: note: ‘-Wmisleading-indentation’ is disabled from this point onwards, since column-tracking was disabled due to the size of the code/headers 267202 | return true; | [9308/9432] Linking CXX executable programs/clickhouse-odbc-bridge FAILED: programs/clickhouse-odbc-bridge : && /home/kangxiang/softwares/gcc10.2.0/bin/g++ -fdiagnostics-color=always -std=gnu++2a -fsized-deallocation -pipe -msse4.1 -msse4.2 -mpopcnt -Wall -Werror -Wextra -Wframe-larger-than=65536 -O2 -g -DNDEBUG -O3 -fno-pie -fuse-ld=gold -rdynamic -Wl,--no-undefined -Wl,-no-pie -Wl,--no-export-dynamic -rdynamic src/CMakeFiles/clickhouse_malloc.dir/Common/malloc.cpp.o 
programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/ColumnInfoHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/getIdentifierQuote.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/HandlerFactory.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/IdentifierQuoteHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/MainHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/ODBCBlockInputStream.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/ODBCBlockOutputStream.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/ODBCBridge.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/PingHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/SchemaAllowedHandler.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/validateODBCConnectionString.cpp.o programs/odbc-bridge/CMakeFiles/clickhouse-odbc-bridge.dir/odbc-bridge.cpp.o -o programs/clickhouse-odbc-bridge src/libclickhouse_new_delete.a base/daemon/libdaemon.a src/libdbms.a src/Parsers/libclickhouse_parsers.a contrib/poco-cmake/Data/lib_poco_data.a contrib/poco-cmake/Data/ODBC/lib_poco_data_odbc.a base/loggers/libloggers.a src/libdbms.a contrib/jemalloc-cmake/libjemalloc.a src/Parsers/New/libclickhouse_parsers_new.a src/Parsers/libclickhouse_parsers.a contrib/antlr4-runtime-cmake/libantlr4-runtime.a contrib/llvm/llvm/lib/libLLVMOrcJIT.a contrib/llvm/llvm/lib/libLLVMJITLink.a contrib/llvm/llvm/lib/libLLVMExecutionEngine.a contrib/llvm/llvm/lib/libLLVMRuntimeDyld.a contrib/llvm/llvm/lib/libLLVMX86CodeGen.a contrib/llvm/llvm/lib/libLLVMX86Desc.a contrib/llvm/llvm/lib/libLLVMX86Info.a contrib/llvm/llvm/lib/libLLVMX86Utils.a contrib/llvm/llvm/lib/libLLVMAsmPrinter.a contrib/llvm/llvm/lib/libLLVMDebugInfoDWARF.a contrib/llvm/llvm/lib/libLLVMGlobalISel.a contrib/llvm/llvm/lib/libLLVMSelectionDAG.a contrib/llvm/llvm/lib/libLLVMMCDisassembler.a 
contrib/llvm/llvm/lib/libLLVMPasses.a contrib/llvm/llvm/lib/libLLVMCodeGen.a contrib/llvm/llvm/lib/libLLVMipo.a contrib/llvm/llvm/lib/libLLVMIRReader.a contrib/llvm/llvm/lib/libLLVMAsmParser.a contrib/llvm/llvm/lib/libLLVMLinker.a contrib/llvm/llvm/lib/libLLVMBitWriter.a contrib/llvm/llvm/lib/libLLVMInstrumentation.a contrib/llvm/llvm/lib/libLLVMScalarOpts.a contrib/llvm/llvm/lib/libLLVMAggressiveInstCombine.a contrib/llvm/llvm/lib/libLLVMInstCombine.a contrib/llvm/llvm/lib/libLLVMVectorize.a contrib/llvm/llvm/lib/libLLVMTransformUtils.a contrib/llvm/llvm/lib/libLLVMTarget.a contrib/llvm/llvm/lib/libLLVMAnalysis.a contrib/llvm/llvm/lib/libLLVMProfileData.a contrib/llvm/llvm/lib/libLLVMObject.a contrib/llvm/llvm/lib/libLLVMBitReader.a contrib/llvm/llvm/lib/libLLVMCore.a contrib/llvm/llvm/lib/libLLVMRemarks.a contrib/llvm/llvm/lib/libLLVMBitstreamReader.a contrib/llvm/llvm/lib/libLLVMMCParser.a contrib/llvm/llvm/lib/libLLVMMC.a contrib/llvm/llvm/lib/libLLVMBinaryFormat.a contrib/llvm/llvm/lib/libLLVMDebugInfoCodeView.a contrib/llvm/llvm/lib/libLLVMDebugInfoMSF.a contrib/llvm/llvm/lib/libLLVMSupport.a contrib/llvm/llvm/lib/libLLVMDemangle.a contrib/cppkafka-cmake/libcppkafka.a contrib/librdkafka-cmake/librdkafka.a contrib/amqpcpp-cmake/libamqp-cpp.a contrib/cyrus-sasl-cmake/libsasl2.a contrib/nuraft-cmake/libnuraft.a contrib/boost-cmake/lib_boost_coroutine.a src/Dictionaries/Embedded/libclickhouse_dictionaries_embedded.a contrib/poco-cmake/MongoDB/lib_poco_mongodb.a base/mysqlxx/libmysqlxx.a contrib/mariadb-connector-c/libmariadb/libmariadbclient.a contrib/icu-cmake/libicui18n.a contrib/icu-cmake/libicuuc.a contrib/icu-cmake/libicudata.a contrib/capnproto-cmake/libcapnpc.a contrib/capnproto-cmake/libcapnp.a contrib/capnproto-cmake/libkj.a contrib/arrow-cmake/libparquet_static.a contrib/arrow-cmake/libarrow_static.a contrib/boost-cmake/lib_boost_filesystem.a contrib/flatbuffers/libflatbuffers.a contrib/arrow-cmake/libthrift_static.a 
contrib/boost-cmake/lib_boost_regex.a contrib/avro-cmake/libavrocpp.a contrib/boost-cmake/lib_boost_iostreams.a contrib/openldap-cmake/libldap_r.a contrib/openldap-cmake/liblber.a src/Server/grpc_protos/libclickhouse_grpc_protos.a contrib/grpc/libgrpc++.a contrib/grpc/libgrpc.a contrib/grpc/third_party/cares/cares/lib/libcares.a contrib/abseil-cpp/absl/status/libabsl_status.a contrib/abseil-cpp/absl/strings/libabsl_cord.a contrib/abseil-cpp/absl/hash/libabsl_hash.a contrib/abseil-cpp/absl/types/libabsl_bad_variant_access.a contrib/abseil-cpp/absl/hash/libabsl_city.a contrib/abseil-cpp/absl/container/libabsl_raw_hash_set.a contrib/abseil-cpp/absl/types/libabsl_bad_optional_access.a contrib/abseil-cpp/absl/container/libabsl_hashtablez_sampler.a contrib/abseil-cpp/absl/base/libabsl_exponential_biased.a contrib/grpc/libaddress_sorting.a contrib/grpc/libupb.a contrib/grpc/libgpr.a contrib/abseil-cpp/absl/synchronization/libabsl_synchronization.a contrib/abseil-cpp/absl/debugging/libabsl_stacktrace.a contrib/abseil-cpp/absl/debugging/libabsl_symbolize.a contrib/abseil-cpp/absl/debugging/libabsl_debugging_internal.a contrib/abseil-cpp/absl/debugging/libabsl_demangle_internal.a contrib/abseil-cpp/absl/synchronization/libabsl_graphcycles_internal.a contrib/abseil-cpp/absl/time/libabsl_time.a contrib/abseil-cpp/absl/time/libabsl_civil_time.a contrib/abseil-cpp/absl/time/libabsl_time_zone.a contrib/abseil-cpp/absl/base/libabsl_malloc_internal.a contrib/abseil-cpp/absl/strings/libabsl_str_format_internal.a contrib/abseil-cpp/absl/strings/libabsl_strings.a contrib/abseil-cpp/absl/strings/libabsl_strings_internal.a contrib/abseil-cpp/absl/numeric/libabsl_int128.a contrib/abseil-cpp/absl/base/libabsl_throw_delegate.a contrib/abseil-cpp/absl/base/libabsl_base.a contrib/abseil-cpp/absl/base/libabsl_raw_logging_internal.a contrib/abseil-cpp/absl/base/libabsl_dynamic_annotations.a contrib/abseil-cpp/absl/base/libabsl_log_severity.a contrib/abseil-cpp/absl/base/libabsl_spinlock_wait.a 
-lrt contrib/libhdfs3-cmake/libhdfs3.a contrib/protobuf/libprotobuf.a contrib/libgsasl/liblibgsasl.a contrib/krb5-cmake/libkrb5.a contrib/libxml2-cmake/liblibxml2.a contrib/cassandra/libcassandra_static.a contrib/libuv/libuv_a.a -lrt contrib/rocksdb-cmake/librocksdb.a contrib/lz4-cmake/liblz4.a contrib/snappy/libsnappy.a contrib/boost-cmake/lib_boost_context.a src/Common/Config/libclickhouse_common_config.a src/Common/ZooKeeper/libclickhouse_common_zookeeper.a src/libclickhouse_common_io.a contrib/boost-cmake/lib_boost_program_options.a contrib/zstd-cmake/libzstd.a base/widechar_width/libwidechar_width.a contrib/double-conversion-cmake/libdouble-conversion.a contrib/dragonbox-cmake/libdragonbox_to_chars.a contrib/re2/libre2.a contrib/re2_st/libre2_st.a contrib/libcpuid-cmake/libcpuid.a contrib/croaring-cmake/libroaring.a contrib/xz/liblzma.a -pthread contrib/aws-s3-cmake/libaws_s3.a contrib/aws-s3-cmake/libaws_s3_checksums.a contrib/brotli-cmake/libbrotli.a src/Common/StringUtils/libstring_utils.a base/common/libcommon.a contrib/boost-cmake/lib_boost_system.a contrib/cityhash102/libcityhash.a contrib/FastMemcpy/libFastMemcpy.a contrib/poco-cmake/Net/SSL/lib_poco_net_ssl.a contrib/poco-cmake/Net/lib_poco_net.a contrib/poco-cmake/Crypto/lib_poco_crypto.a contrib/poco-cmake/Util/lib_poco_util.a contrib/poco-cmake/JSON/lib_poco_json.a contrib/poco-cmake/JSON/lib_poco_json_pdjson.a contrib/poco-cmake/XML/lib_poco_xml.a contrib/poco-cmake/XML/lib_poco_xml_expat.a contrib/replxx-cmake/libreplxx.a contrib/cctz-cmake/libcctz.a -Wl,--whole-archive /home/kangxiang/ClickHouse/build_20.13.1.1/contrib/cctz-cmake/libtzdata.a -Wl,--no-whole-archive contrib/fmtlib-cmake/libfmt.a contrib/sentry-native/libsentry.a contrib/curl-cmake/libcurl.a contrib/boringssl-cmake/libssl.a contrib/boringssl-cmake/libcrypto.a -lpthread -ldl contrib/poco-cmake/Data/lib_poco_data.a contrib/poco-cmake/Foundation/lib_poco_foundation.a contrib/zlib-ng/libzlib.a 
contrib/poco-cmake/Foundation/lib_poco_foundation_pcre.a contrib/unixodbc-cmake/libunixodbc.a contrib/unixodbc-cmake/libltdl.a -Wl,--start-group base/glibc-compatibility/libglibc-compatibility.a contrib/libcxx-cmake/libcxx.a contrib/libcxxabi-cmake/libcxxabi.a contrib/libunwind-cmake/libunwind.a -Wl,--end-group -nodefaultlibs -lgcc -lc -lm -lrt -lpthread -ldl && : /usr/bin/ld.gold: fatal error: programs/clickhouse-odbc-bridge: No space left on device collect2: error: ld returned 1 exit status [9321/9432] Building CXX object src/Storages/System/CMakeFiles/clickhouse_storages_system.dir/StorageSystemDictionaries.cpp.o ninja: build stopped: subcommand failed. ``` Any ideas? Thanks in advance :)
build
failed programs clickhouse odbc bridge when using ninja to build clickhouse operating system in linux version microsoft standard cmake version ninja version compiler name and version gcc: full cmake and or ninja output c build ninja re checking globbed directories building c object contrib hyperscan cmake cmakefiles hyperscan dir hyperscan src nfa limex c o in file included from contrib hyperscan src nfa limex c contrib hyperscan src util simd utils h in function ‘ ’ contrib hyperscan src util simd utils h note the abi for passing parameters with byte alignment has changed in gcc char val unsigned int n building c object contrib hyperscan cmake cmakefiles hyperscan dir hyperscan src nfa limex c o in file included from contrib hyperscan src nfa limex c contrib hyperscan src util simd utils h in function ‘ ’ contrib hyperscan src util simd utils h note the abi for passing parameters with byte alignment has changed in gcc char val unsigned int n building c object contrib hyperscan cmake cmakefiles hyperscan dir hyperscan src rose program runtime c o in file included from contrib hyperscan src rose counting miracle h from contrib hyperscan src rose program runtime c contrib hyperscan src util simd utils h in function ‘ ’ contrib hyperscan src util simd utils h note the abi for passing parameters with byte alignment has changed in gcc static really inline void void ptr a building c object contrib hyperscan cmake cmakefiles hyperscan dir hyperscan src rose stream c o in file included from contrib hyperscan src rose counting miracle h from contrib hyperscan src rose stream c contrib hyperscan src util simd utils h in function ‘ ’ contrib hyperscan src util simd utils h note the abi for passing parameters with byte alignment has changed in gcc static really inline void void ptr a building c object contrib hyperscan cmake cmakefiles hyperscan dir hyperscan src util state compress c o contrib hyperscan src util state compress c in function ‘ ’ contrib hyperscan src util 
state compress c note the abi for passing parameters with byte alignment has changed in gcc void void ptr xvec mvec generating vcsrevision h found git home kangxiang softwares bin git found version building cxx object contrib llvm llvm lib target cmakefiles dir cpp o in file included from contrib llvm llvm lib target cpp contrib llvm llvm lib target inc in member function ‘virtual bool anonymous checknodepredicate llvm sdnode unsigned int const’ contrib llvm llvm lib target inc note ‘ wmisleading indentation’ is disabled from this point onwards since column tracking was disabled due to the size of the code headers return true linking cxx executable programs clickhouse odbc bridge failed programs clickhouse odbc bridge home kangxiang softwares bin g fdiagnostics color always std gnu fsized deallocation pipe mpopcnt wall werror wextra wframe larger than g dndebug fno pie fuse ld gold rdynamic wl no undefined wl no pie wl no export dynamic rdynamic src cmakefiles clickhouse malloc dir common malloc cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir columninfohandler cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir getidentifierquote cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir handlerfactory cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir identifierquotehandler cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir mainhandler cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir odbcblockinputstream cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir odbcblockoutputstream cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir odbcbridge cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir pinghandler cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir schemaallowedhandler cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir validateodbcconnectionstring cpp o programs odbc bridge cmakefiles clickhouse odbc bridge dir odbc 
bridge cpp o o programs clickhouse odbc bridge src libclickhouse new delete a base daemon libdaemon a src libdbms a src parsers libclickhouse parsers a contrib poco cmake data lib poco data a contrib poco cmake data odbc lib poco data odbc a base loggers libloggers a src libdbms a contrib jemalloc cmake libjemalloc a src parsers new libclickhouse parsers new a src parsers libclickhouse parsers a contrib runtime cmake runtime a contrib llvm llvm lib libllvmorcjit a contrib llvm llvm lib libllvmjitlink a contrib llvm llvm lib libllvmexecutionengine a contrib llvm llvm lib libllvmruntimedyld a contrib llvm llvm lib a contrib llvm llvm lib a contrib llvm llvm lib a contrib llvm llvm lib a contrib llvm llvm lib libllvmasmprinter a contrib llvm llvm lib libllvmdebuginfodwarf a contrib llvm llvm lib libllvmglobalisel a contrib llvm llvm lib libllvmselectiondag a contrib llvm llvm lib libllvmmcdisassembler a contrib llvm llvm lib libllvmpasses a contrib llvm llvm lib libllvmcodegen a contrib llvm llvm lib libllvmipo a contrib llvm llvm lib libllvmirreader a contrib llvm llvm lib libllvmasmparser a contrib llvm llvm lib libllvmlinker a contrib llvm llvm lib libllvmbitwriter a contrib llvm llvm lib libllvminstrumentation a contrib llvm llvm lib libllvmscalaropts a contrib llvm llvm lib libllvmaggressiveinstcombine a contrib llvm llvm lib libllvminstcombine a contrib llvm llvm lib libllvmvectorize a contrib llvm llvm lib libllvmtransformutils a contrib llvm llvm lib libllvmtarget a contrib llvm llvm lib libllvmanalysis a contrib llvm llvm lib libllvmprofiledata a contrib llvm llvm lib libllvmobject a contrib llvm llvm lib libllvmbitreader a contrib llvm llvm lib libllvmcore a contrib llvm llvm lib libllvmremarks a contrib llvm llvm lib libllvmbitstreamreader a contrib llvm llvm lib libllvmmcparser a contrib llvm llvm lib libllvmmc a contrib llvm llvm lib libllvmbinaryformat a contrib llvm llvm lib libllvmdebuginfocodeview a contrib llvm llvm lib libllvmdebuginfomsf a contrib 
llvm llvm lib libllvmsupport a contrib llvm llvm lib libllvmdemangle a contrib cppkafka cmake libcppkafka a contrib librdkafka cmake librdkafka a contrib amqpcpp cmake libamqp cpp a contrib cyrus sasl cmake a contrib nuraft cmake libnuraft a contrib boost cmake lib boost coroutine a src dictionaries embedded libclickhouse dictionaries embedded a contrib poco cmake mongodb lib poco mongodb a base mysqlxx libmysqlxx a contrib mariadb connector c libmariadb libmariadbclient a contrib icu cmake a contrib icu cmake libicuuc a contrib icu cmake libicudata a contrib capnproto cmake libcapnpc a contrib capnproto cmake libcapnp a contrib capnproto cmake libkj a contrib arrow cmake libparquet static a contrib arrow cmake libarrow static a contrib boost cmake lib boost filesystem a contrib flatbuffers libflatbuffers a contrib arrow cmake libthrift static a contrib boost cmake lib boost regex a contrib avro cmake libavrocpp a contrib boost cmake lib boost iostreams a contrib openldap cmake libldap r a contrib openldap cmake liblber a src server grpc protos libclickhouse grpc protos a contrib grpc libgrpc a contrib grpc libgrpc a contrib grpc third party cares cares lib libcares a contrib abseil cpp absl status libabsl status a contrib abseil cpp absl strings libabsl cord a contrib abseil cpp absl hash libabsl hash a contrib abseil cpp absl types libabsl bad variant access a contrib abseil cpp absl hash libabsl city a contrib abseil cpp absl container libabsl raw hash set a contrib abseil cpp absl types libabsl bad optional access a contrib abseil cpp absl container libabsl hashtablez sampler a contrib abseil cpp absl base libabsl exponential biased a contrib grpc libaddress sorting a contrib grpc libupb a contrib grpc libgpr a contrib abseil cpp absl synchronization libabsl synchronization a contrib abseil cpp absl debugging libabsl stacktrace a contrib abseil cpp absl debugging libabsl symbolize a contrib abseil cpp absl debugging libabsl debugging internal a contrib abseil 
cpp absl debugging libabsl demangle internal a contrib abseil cpp absl synchronization libabsl graphcycles internal a contrib abseil cpp absl time libabsl time a contrib abseil cpp absl time libabsl civil time a contrib abseil cpp absl time libabsl time zone a contrib abseil cpp absl base libabsl malloc internal a contrib abseil cpp absl strings libabsl str format internal a contrib abseil cpp absl strings libabsl strings a contrib abseil cpp absl strings libabsl strings internal a contrib abseil cpp absl numeric libabsl a contrib abseil cpp absl base libabsl throw delegate a contrib abseil cpp absl base libabsl base a contrib abseil cpp absl base libabsl raw logging internal a contrib abseil cpp absl base libabsl dynamic annotations a contrib abseil cpp absl base libabsl log severity a contrib abseil cpp absl base libabsl spinlock wait a lrt contrib cmake a contrib protobuf libprotobuf a contrib libgsasl liblibgsasl a contrib cmake a contrib cmake a contrib cassandra libcassandra static a contrib libuv libuv a a lrt contrib rocksdb cmake librocksdb a contrib cmake a contrib snappy libsnappy a contrib boost cmake lib boost context a src common config libclickhouse common config a src common zookeeper libclickhouse common zookeeper a src libclickhouse common io a contrib boost cmake lib boost program options a contrib zstd cmake libzstd a base widechar width libwidechar width a contrib double conversion cmake libdouble conversion a contrib dragonbox cmake libdragonbox to chars a contrib a contrib st st a contrib libcpuid cmake libcpuid a contrib croaring cmake libroaring a contrib xz liblzma a pthread contrib aws cmake libaws a contrib aws cmake libaws checksums a contrib brotli cmake libbrotli a src common stringutils libstring utils a base common libcommon a contrib boost cmake lib boost system a contrib libcityhash a contrib fastmemcpy libfastmemcpy a contrib poco cmake net ssl lib poco net ssl a contrib poco cmake net lib poco net a contrib poco cmake crypto lib 
poco crypto a contrib poco cmake util lib poco util a contrib poco cmake json lib poco json a contrib poco cmake json lib poco json pdjson a contrib poco cmake xml lib poco xml a contrib poco cmake xml lib poco xml expat a contrib replxx cmake libreplxx a contrib cctz cmake libcctz a wl whole archive home kangxiang clickhouse build contrib cctz cmake libtzdata a wl no whole archive contrib fmtlib cmake libfmt a contrib sentry native libsentry a contrib curl cmake libcurl a contrib boringssl cmake libssl a contrib boringssl cmake libcrypto a lpthread ldl contrib poco cmake data lib poco data a contrib poco cmake foundation lib poco foundation a contrib zlib ng libzlib a contrib poco cmake foundation lib poco foundation pcre a contrib unixodbc cmake libunixodbc a contrib unixodbc cmake libltdl a wl start group base glibc compatibility libglibc compatibility a contrib libcxx cmake libcxx a contrib libcxxabi cmake libcxxabi a contrib libunwind cmake libunwind a wl end group nodefaultlibs lgcc lc lm lrt lpthread ldl usr bin ld gold fatal error programs clickhouse odbc bridge no space left on device error ld returned exit status building cxx object src storages system cmakefiles clickhouse storages system dir storagesystemdictionaries cpp o ninja build stopped subcommand failed any idea thank in advance
1
728,329
25,075,445,142
IssuesEvent
2022-11-07 15:10:46
AY2223S1-CS2103T-F12-1/tp
https://api.github.com/repos/AY2223S1-CS2103T-F12-1/tp
closed
[PE-D][Tester E] Insufficient visuals in UG
priority.low ugbug
There is only 1 picture in the UG, and descriptions of commands are all given in text which may be difficult to understand in view of your target audience. <!--session: 1666943752219-ab42b068-3b76-4b0f-978d-954f8e09e177--> <!--Version: Web v3.4.4--> ------------- Labels: `severity.Low` `type.DocumentationBug` original: wrewsama/ped#15
1.0
[PE-D][Tester E] Insufficient visuals in UG - There is only 1 picture in the UG, and descriptions of commands are all given in text which may be difficult to understand in view of your target audience. <!--session: 1666943752219-ab42b068-3b76-4b0f-978d-954f8e09e177--> <!--Version: Web v3.4.4--> ------------- Labels: `severity.Low` `type.DocumentationBug` original: wrewsama/ped#15
non_build
insufficient visuals in ug there is only picture in the ug and descriptions of commands are all given in text which may be difficult to understand in view of your target audience labels severity low type documentationbug original wrewsama ped
0
295,904
25,514,251,184
IssuesEvent
2022-11-28 15:14:02
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
closed
com.hazelcast.mapstore.GenericMapStoreTest.whenMapStoreDestroyOnNonMaster_thenDropMapping [HZ-1816]
Type: Test-Failure Source: Internal Module: IMap Not Release Notes content to-jira Team: Platform
https://github.com/hazelcast/hazelcast/pull/22941#issuecomment-1326597448 (commit 47673944465b32875c60c6125ab6a2e26245947c) Failed on `Hazelcast-pr-builder`: https://jenkins.hazelcast.com/job/Hazelcast-pr-builder/14461/testReport/junit/com.hazelcast.mapstore/GenericMapStoreTest/whenMapStoreDestroyOnNonMaster_thenDropMapping/ <details><summary>Stacktrace:</summary> ``` java.lang.AssertionError: Expecting: <[Row{[__map-store.people_o_3a213dc2_bf2b_47a1_a1ff_6a896375bc2c]}, Row{[__map-store.people_o_5298b7ff_cb8c_4b10_8042_87b96c91b01e]}]> to contain exactly in any order: <[Row{[__map-store.people_o_3a213dc2_bf2b_47a1_a1ff_6a896375bc2c]}]> but the following elements were unexpected: <[Row{[__map-store.people_o_5298b7ff_cb8c_4b10_8042_87b96c91b01e]}]> at com.hazelcast.jet.sql.SqlTestSupport.assertRowsAnyOrder(SqlTestSupport.java:322) at com.hazelcast.jet.sql.SqlTestSupport.assertRowsAnyOrder(SqlTestSupport.java:284) at com.hazelcast.mapstore.GenericMapStoreTest.lambda$awaitMappingCreated$3(GenericMapStoreTest.java:668) at com.hazelcast.test.HazelcastTestSupport.assertTrueEventually(HazelcastTestSupport.java:1236) at com.hazelcast.test.HazelcastTestSupport.assertTrueEventually(HazelcastTestSupport.java:1253) at com.hazelcast.mapstore.GenericMapStoreTest.awaitMappingCreated(GenericMapStoreTest.java:667) at com.hazelcast.mapstore.GenericMapStoreTest.whenMapStoreDestroyOnNonMaster_thenDropMapping(GenericMapStoreTest.java:136) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:115) at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:107) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.lang.Thread.run(Thread.java:750) ``` </details> <details><summary>Standard output:</summary> ``` Finished Running Test: whenSetNonExistingColumnOnSecondMapStore_thenFailToInitialize in 0.039 seconds. Started Running Test: whenMapStoreDestroyOnNonMaster_thenDropMapping 15:29:56,960 DEBUG || - [GenericMapStore] hz.tender_borg.cached.thread-6 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] Initializing for map people_o_3a213dc2_bf2b_47a1_a1ff_6a896375bc2c 15:29:56,963 DEBUG || - [GenericMapStore] hz.tender_borg.cached.thread-7 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] Discovered following mapping columns: id INTEGER, name VARCHAR, age INTEGER 15:29:56,964 WARN || - [InitExecutionOperation] ForkJoinPool.commonPool-worker-2 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] null java.util.concurrent.CancellationException: null at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java:97) ~[hazelcast-sql-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$2(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java:547) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:328) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java:291) 
~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:404) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:895) ~[?:1.8.0_351] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:369) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_351] 15:29:56,964 WARN || - [SubmitJobOperation] hz.tender_borg.cached.thread-8 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] null java.util.concurrent.CancellationException: null at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java:97) ~[hazelcast-sql-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$2(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java:547) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:328) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java:291) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:404) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:895) ~[?:1.8.0_351] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:369) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at 
java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_351] 15:29:56,975 WARN || - [InitExecutionOperation] ForkJoinPool.commonPool-worker-9 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] null java.util.concurrent.CancellationException: null at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java:97) ~[hazelcast-sql-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$2(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java:547) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:328) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java:291) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:404) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:895) ~[?:1.8.0_351] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:369) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_351] 15:29:56,976 WARN || - [SubmitJobOperation] ForkJoinPool.commonPool-worker-2 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] null java.util.concurrent.CancellationException: null at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java:97) ~[hazelcast-sql-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at 
com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$2(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java:547) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:328) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java:291) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:404) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:895) ~[?:1.8.0_351] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:369) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_351] 15:30:02,305 INFO |whenMapStoreDestroyOnNonMaster_thenDropMapping| - [SqlTestSupport] Time-limited test - Removing 0 cached plans in SqlTestSupport.@After 15:30:02,306 INFO |whenMapStoreDestroyOnNonMaster_thenDropMapping| - [SqlTestSupport] Time-limited test - Removing 0 cached plans in SqlTestSupport.@After 15:30:02,318 INFO |whenMapStoreDestroyOnNonMaster_thenDropMapping| - [SimpleTestInClusterSupport] Time-limited test - Ditching 0 jobs in SimpleTestInClusterSupport.@After: [] 15:30:02,320 INFO |whenMapStoreDestroyOnNonMaster_thenDropMapping| - [SimpleTestInClusterSupport] Time-limited test - Destroying 1 distributed objects in SimpleTestInClusterSupport.@After: [hz:impl:mapService/__sql.catalog] BuildInfo right after whenMapStoreDestroyOnNonMaster_thenDropMapping(com.hazelcast.mapstore.GenericMapStoreTest): 
BuildInfo{version='5.3.0-SNAPSHOT', build='20221124', buildNumber=20221124, revision=ada41fc, enterprise=false, serializationVersion=1} Hiccups measured while running test 'whenMapStoreDestroyOnNonMaster_thenDropMapping(com.hazelcast.mapstore.GenericMapStoreTest):' 15:29:55, accumulated pauses: 40 ms, max pause: 7 ms, pauses over 1000 ms: 0 15:30:00, accumulated pauses: 184 ms, max pause: 170 ms, pauses over 1000 ms: 0 No metrics recorded during the test ``` </details>
1.0
com.hazelcast.mapstore.GenericMapStoreTest.whenMapStoreDestroyOnNonMaster_thenDropMapping [HZ-1816] - https://github.com/hazelcast/hazelcast/pull/22941#issuecomment-1326597448 (commit 47673944465b32875c60c6125ab6a2e26245947c) Failed on `Hazelcast-pr-builder`: https://jenkins.hazelcast.com/job/Hazelcast-pr-builder/14461/testReport/junit/com.hazelcast.mapstore/GenericMapStoreTest/whenMapStoreDestroyOnNonMaster_thenDropMapping/ <details><summary>Stacktrace:</summary> ``` java.lang.AssertionError: Expecting: <[Row{[__map-store.people_o_3a213dc2_bf2b_47a1_a1ff_6a896375bc2c]}, Row{[__map-store.people_o_5298b7ff_cb8c_4b10_8042_87b96c91b01e]}]> to contain exactly in any order: <[Row{[__map-store.people_o_3a213dc2_bf2b_47a1_a1ff_6a896375bc2c]}]> but the following elements were unexpected: <[Row{[__map-store.people_o_5298b7ff_cb8c_4b10_8042_87b96c91b01e]}]> at com.hazelcast.jet.sql.SqlTestSupport.assertRowsAnyOrder(SqlTestSupport.java:322) at com.hazelcast.jet.sql.SqlTestSupport.assertRowsAnyOrder(SqlTestSupport.java:284) at com.hazelcast.mapstore.GenericMapStoreTest.lambda$awaitMappingCreated$3(GenericMapStoreTest.java:668) at com.hazelcast.test.HazelcastTestSupport.assertTrueEventually(HazelcastTestSupport.java:1236) at com.hazelcast.test.HazelcastTestSupport.assertTrueEventually(HazelcastTestSupport.java:1253) at com.hazelcast.mapstore.GenericMapStoreTest.awaitMappingCreated(GenericMapStoreTest.java:667) at com.hazelcast.mapstore.GenericMapStoreTest.whenMapStoreDestroyOnNonMaster_thenDropMapping(GenericMapStoreTest.java:136) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:115) at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:107) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.lang.Thread.run(Thread.java:750) ``` </details> <details><summary>Standard output:</summary> ``` Finished Running Test: whenSetNonExistingColumnOnSecondMapStore_thenFailToInitialize in 0.039 seconds. Started Running Test: whenMapStoreDestroyOnNonMaster_thenDropMapping 15:29:56,960 DEBUG || - [GenericMapStore] hz.tender_borg.cached.thread-6 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] Initializing for map people_o_3a213dc2_bf2b_47a1_a1ff_6a896375bc2c 15:29:56,963 DEBUG || - [GenericMapStore] hz.tender_borg.cached.thread-7 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] Discovered following mapping columns: id INTEGER, name VARCHAR, age INTEGER 15:29:56,964 WARN || - [InitExecutionOperation] ForkJoinPool.commonPool-worker-2 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] null java.util.concurrent.CancellationException: null at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java:97) ~[hazelcast-sql-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$2(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java:547) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at 
com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:328) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java:291) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:404) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:895) ~[?:1.8.0_351] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:369) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_351] 15:29:56,964 WARN || - [SubmitJobOperation] hz.tender_borg.cached.thread-8 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] null java.util.concurrent.CancellationException: null at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java:97) ~[hazelcast-sql-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$2(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java:547) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:328) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java:291) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:404) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at 
java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:895) ~[?:1.8.0_351] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:369) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_351] 15:29:56,975 WARN || - [InitExecutionOperation] ForkJoinPool.commonPool-worker-9 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] null java.util.concurrent.CancellationException: null at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java:97) ~[hazelcast-sql-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$2(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java:547) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:328) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java:291) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:404) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:895) ~[?:1.8.0_351] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:369) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_351] 15:29:56,976 WARN || - [SubmitJobOperation] ForkJoinPool.commonPool-worker-2 - [127.0.0.1]:5701 [dev] [5.3.0-SNAPSHOT] null java.util.concurrent.CancellationException: null 
at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java:97) ~[hazelcast-sql-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$2(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java:547) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:334) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java:328) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java:291) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java:404) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java:895) ~[?:1.8.0_351] at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java:369) ~[hazelcast-5.3.0-SNAPSHOT.jar:5.3.0-SNAPSHOT] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_351] 15:30:02,305 INFO |whenMapStoreDestroyOnNonMaster_thenDropMapping| - [SqlTestSupport] Time-limited test - Removing 0 cached plans in SqlTestSupport.@After 15:30:02,306 INFO |whenMapStoreDestroyOnNonMaster_thenDropMapping| - [SqlTestSupport] Time-limited test - Removing 0 cached plans in SqlTestSupport.@After 15:30:02,318 INFO |whenMapStoreDestroyOnNonMaster_thenDropMapping| - [SimpleTestInClusterSupport] Time-limited test - Ditching 0 jobs in SimpleTestInClusterSupport.@After: [] 15:30:02,320 INFO |whenMapStoreDestroyOnNonMaster_thenDropMapping| - [SimpleTestInClusterSupport] Time-limited test - Destroying 1 distributed objects in 
SimpleTestInClusterSupport.@After: [hz:impl:mapService/__sql.catalog] BuildInfo right after whenMapStoreDestroyOnNonMaster_thenDropMapping(com.hazelcast.mapstore.GenericMapStoreTest): BuildInfo{version='5.3.0-SNAPSHOT', build='20221124', buildNumber=20221124, revision=ada41fc, enterprise=false, serializationVersion=1} Hiccups measured while running test 'whenMapStoreDestroyOnNonMaster_thenDropMapping(com.hazelcast.mapstore.GenericMapStoreTest):' 15:29:55, accumulated pauses: 40 ms, max pause: 7 ms, pauses over 1000 ms: 0 15:30:00, accumulated pauses: 184 ms, max pause: 170 ms, pauses over 1000 ms: 0 No metrics recorded during the test ``` </details>
non_build
com hazelcast mapstore genericmapstoretest whenmapstoredestroyonnonmaster thendropmapping commit failed on hazelcast pr builder stacktrace java lang assertionerror expecting row to contain exactly in any order but the following elements were unexpected at com hazelcast jet sql sqltestsupport assertrowsanyorder sqltestsupport java at com hazelcast jet sql sqltestsupport assertrowsanyorder sqltestsupport java at com hazelcast mapstore genericmapstoretest lambda awaitmappingcreated genericmapstoretest java at com hazelcast test hazelcasttestsupport asserttrueeventually hazelcasttestsupport java at com hazelcast test hazelcasttestsupport asserttrueeventually hazelcasttestsupport java at com hazelcast mapstore genericmapstoretest awaitmappingcreated genericmapstoretest java at com hazelcast mapstore genericmapstoretest whenmapstoredestroyonnonmaster thendropmapping genericmapstoretest java at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org junit runners model frameworkmethod runreflectivecall frameworkmethod java at org junit internal runners model reflectivecallable run reflectivecallable java at org junit runners model frameworkmethod invokeexplosively frameworkmethod java at org junit internal runners statements invokemethod evaluate invokemethod java at com hazelcast test failontimeoutstatement callablestatement call failontimeoutstatement java at com hazelcast test failontimeoutstatement callablestatement call failontimeoutstatement java at java util concurrent futuretask run futuretask java at java lang thread run thread java standard output finished running test whensetnonexistingcolumnonsecondmapstore thenfailtoinitialize in seconds started running test whenmapstoredestroyonnonmaster thendropmapping debug hz tender borg cached thread initializing 
… for map People … DEBUG [hz.tender_borg.cached.thread-…]: discovered the following mapping columns: id INTEGER, name VARCHAR, age INTEGER
WARN  [ForkJoinPool.commonPool-worker-…] null
java.util.concurrent.CancellationException: null
	at com.hazelcast.jet.sql.impl.processors.RootResultConsumerSink.tryProcess(RootResultConsumerSink.java)
	at com.hazelcast.jet.impl.execution.ProcessorTasklet.lambda$stateMachineStep$…(ProcessorTasklet.java)
	at com.hazelcast.jet.impl.util.Util.doWithClassLoader(Util.java)
	at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java)
	at com.hazelcast.jet.impl.execution.ProcessorTasklet.stateMachineStep(ProcessorTasklet.java)
	at com.hazelcast.jet.impl.execution.ProcessorTasklet.call(ProcessorTasklet.java)
	at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.runTasklet(TaskletExecutionService.java)
	at java.util.concurrent.CopyOnWriteArrayList.forEach(CopyOnWriteArrayList.java)
	at com.hazelcast.jet.impl.execution.TaskletExecutionService$CooperativeWorker.run(TaskletExecutionService.java)
	at java.lang.Thread.run(Thread.java)
(the identical CancellationException stack trace is logged three more times, from [hz.tender_borg.cached.thread-…] and two further ForkJoinPool.commonPool workers)
INFO  [Time-limited test] whenMapStoreDestroyOnNonMaster_thenDropMapping: removing cached plans in SqlTestSupport @After (logged twice)
INFO  [Time-limited test] whenMapStoreDestroyOnNonMaster_thenDropMapping: ditching jobs in SimpleTestInClusterSupport @After
INFO  [Time-limited test] whenMapStoreDestroyOnNonMaster_thenDropMapping: destroying distributed objects in SimpleTestInClusterSupport @After
BuildInfo right after whenMapStoreDestroyOnNonMaster_thenDropMapping (com.hazelcast.mapstore.GenericMapStoreTest): BuildInfo{version=…-SNAPSHOT, build=…, buildNumber=…, revision=…, enterprise=false, serializationVersion=…}
Hiccups measured while running test whenMapStoreDestroyOnNonMaster_thenDropMapping (com.hazelcast.mapstore.GenericMapStoreTest): accumulated pauses … ms, max pause … ms, pauses over … ms: …; accumulated pauses … ms, max pause … ms, pauses over … ms: …
No metrics recorded during the test
0
1,603
2,790,322,797
IssuesEvent
2015-05-09 04:27:01
driftyco/ionic-cli
https://api.github.com/repos/driftyco/ionic-cli
closed
Release password prompt does not work in android build
build cli
_From @kristianbenoit on November 7, 2014 15:21_ Building with `ionic build --release` while the build system is configured to use a keystore and alias, prompts for a password, but the password is never sent to the underlying tools. To reproduce the problem ===================== create a file `platforms/android/ant.properties` with the following content: key.store=/path/to/release_store.keystore key.alias=alias_name Generate your keystore (see: http://developer.android.com/tools/publishing/app-signing.html#signing-manually): keytool -genkey -v -keystore /path/to/release_store.keystore -alias alias_name -keyalg RSA -keysize 2048 -validity 10000 Build with ionic: ionic build --release Temporary solution =============== Use cordova instead of ionic: cordova build --release _Copied from original issue: driftyco/ionic#2512_
1.0
build
1
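The `ant.properties` workaround in the record above can be extended so that no interactive prompt is needed at all. A sketch — the keystore path, alias, and passwords are placeholders; `key.store.password` and `key.alias.password` are the standard Ant signing properties read by the Cordova Android platform:

```properties
# platforms/android/ant.properties — sketch; path, alias and passwords are placeholders
key.store=/path/to/release_store.keystore
key.alias=alias_name
# Supplying the passwords here lets the underlying Ant tooling sign the APK
# without the interactive prompt that `ionic build --release` fails to forward.
key.store.password=store_password
key.alias.password=alias_password
```

Note that this stores signing passwords in plain text, so the file should be kept out of version control.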
68,198
17,191,487,610
IssuesEvent
2021-07-16 11:39:11
Kotlin/kotlinx.coroutines
https://api.github.com/repos/Kotlin/kotlinx.coroutines
closed
Investigate JS test failures that are silent on TeamCity
build js
Steps to reproduce: `./gradlew :kotlinx-coroutines-core:jsIrTest` Output, a lot of: ``` Too many unhandled exceptions 1, expected 0, got: CompletionHandlerException: Exception in completion handler InvokeOnCompletion@17[job@18] for BroadcastCoroutine{Cancelled}@18: CompletionHandlerException: Exception in completion handler InvokeOnCompletion@17[job@18] for BroadcastCoroutine{Cancelled}@18 CompletionHandlerException: Exception in completion handler InvokeOnCompletion@17[job@18] for BroadcastCoroutine{Cancelled}@18 ```
1.0
build
1
97,392
28,241,691,594
IssuesEvent
2023-04-06 07:42:35
expo/eas-cli
https://api.github.com/repos/expo/eas-cli
closed
EAS build fails for iOS with Bare Workflow
incomplete issue: missing or invalid repro eas build invalid issue: project specific issue
### Build/Submit details page URL https://expo.dev/accounts/csm-developer/projects/pogo-app/builds/a5be8c95-1167-48d0-810f-d0091780bf34 ### Summary The iOS build fails for Expo SDK47. ### Managed or bare? Bare ### Environment expo-env-info 1.0.5 environment info: System: OS: macOS 13.2.1 Shell: 5.8.1 - /bin/zsh Binaries: Node: 16.14.0 - /usr/local/bin/node Yarn: 1.22.19 - /usr/local/bin/yarn npm: 8.12.2 - /usr/local/bin/npm Watchman: 2023.02.20.00 - /usr/local/bin/watchman Managers: CocoaPods: 1.11.2 - /usr/local/bin/pod SDKs: iOS SDK: Platforms: DriverKit 22.2, iOS 16.2, macOS 13.1, tvOS 16.1, watchOS 9.1 Android SDK: API Levels: 31, 33 Build Tools: 30.0.2, 31.0.0, 33.0.0, 33.0.1 System Images: android-30 | Google APIs Intel x86 Atom, android-30 | Google Play Intel x86 Atom IDEs: Android Studio: 2021.2 AI-212.5712.43.2112.8815526 Xcode: 14.2/14C18 - /usr/bin/xcodebuild npmPackages: @expo/metro-config: ^0.5.2 => 0.5.2 expo: ^47.0.13 => 47.0.13 metro: ^0.66.2 => 0.66.2 react: 18.1.0 => 18.1.0 react-dom: 18.1.0 => 18.1.0 react-native: ^0.70.8 => 0.70.8 react-native-web: ^0.18.12 => 0.18.12 npmGlobalPackages: eas-cli: 3.5.2 expo-cli: 6.3.2 Expo Workflow: managed 🎉 Didn't find any issues with the project! ### Error output ⚠️ ld: duplicate method '+moduleName' in ┌─[category]: ExpoBridgeModule-a5938b8bafa3eb034b428d973be07cfa.o ExpoModulesCore/libExpoModulesCore.a └─[class]: ExpoBridgeModule-674e3c7542e496ffb5e7fcb51e49e14d.o ExpoModulesCore/libExpoModulesCore.a ⚠️ ld: method '+UIStatusBarAnimation:' in category from /Users/expo/Library/Developer/Xcode/DerivedData/pogoapp-edbsmuxtvewrvdamrowtwcrdbdwf/Build/Intermediates.noindex/ArchiveIntermediates/pogoapp/BuildProductsPath/Release-iphoneos/React-CoreModules/libReact-CoreModules.a(RCTStatusBarManager.o) conflicts with same method from another category › Generating debug pogoapp » pogo.app.dSYM › Executing pogoapp » Bundle React Native code and images the transform cache was reset. 
❌ error: File /Users/expo/Library/Developer/Xcode/DerivedData/pogoapp-edbsmuxtvewrvdamrowtwcrdbdwf/Build/Intermediates.noindex/ArchiveIntermediates/pogoapp/BuildProductsPath/Release-iphoneos/pogo.app/main.jsbundle does not exist. This must be a bug with React Native, please report it here: https://github.com/facebook/react-native/issues ### Reproducible demo or steps to reproduce from a blank project Upgrade bare workflow Expo project to SDK47.
1.0
build
1
73,449
19,688,033,211
IssuesEvent
2022-01-12 01:34:32
facebook/zstd
https://api.github.com/repos/facebook/zstd
closed
Build failure using Visual Studio 2022
build issue release-blocking
**Describe the bug** When I build zstd using Visual Studio 2022, I got some build error: ``` [1/3] C:\PROGRA~1\MICROS~2\2022\COMMUN~1\VC\Tools\MSVC\1430~1.307\bin\Hostx64\x64\cl.exe -DXXH_NAMESPACE=ZSTD_ -DZSTD_DLL_EXPORT=1 -DZSTD_HEAPMODE=0 -DZSTD_LEGACY_SUPPORT=5 -DZSTD_MULTITHREAD -D_CONSOLE -D_CRT_SECURE_NO_WARNINGS -Dlibzstd_shared_EXPORTS -IC:\src\vcpkg\buildtrees\zstd\x64-windows-dbg\lib -IC:\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\build\cmake\lib -IC:\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\build\cmake\..\..\lib -IC:\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\build\cmake\..\..\lib\common -IC:\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\build\cmake\..\..\lib\legacy -o lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\huf_decompress_amd64.S.obj -c C:\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\huf_decompress_amd64.S Microsoft (R) C/C++ Optimizing Compiler Version 19.30.30706 for x64 Copyright (C) Microsoft Corporation. All rights reserved. cl : Command line warning D9035 : option 'o' has been deprecated and will be removed in a future release cl : Command line warning D9024 : unrecognized source file type 'C:\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\huf_decompress_amd64.S', object file assumed cl : Command line warning D9027 : source file 'C:\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\huf_decompress_amd64.S' ignored cl : Command line warning D9021 : no action performed ... 
LINK Pass 1: command "C:\PROGRA~1\MICROS~2\2022\COMMUN~1\VC\Tools\MSVC\1430~1.307\bin\Hostx64\x64\link.exe lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\common\debug.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\common\entropy_common.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\common\error_private.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\common\fse_decompress.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\common\pool.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\common\threading.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\common\xxhash.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\common\zstd_common.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\fse_compress.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\hist.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\huf_compress.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_compress.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_compress_literals.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_compress_sequences.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_compress_superblock.c.obj 
lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_double_fast.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_fast.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_lazy.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_ldm.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstd_opt.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\compress\zstdmt_compress.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\huf_decompress.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\huf_decompress_amd64.S.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\zstd_ddict.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\zstd_decompress.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\zstd_decompress_block.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\dictBuilder\cover.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\dictBuilder\divsufsort.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\dictBuilder\fastcover.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\dictBuilder\zdict.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\legacy\zstd_v01.c.obj 
lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\legacy\zstd_v02.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\legacy\zstd_v03.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\legacy\zstd_v04.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\legacy\zstd_v05.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\legacy\zstd_v06.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\legacy\zstd_v07.c.obj lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\build\VS2010\libzstd-dll\libzstd-dll.rc.res /out:lib\zstdd.dll /implib:lib\zstdd.lib /pdb:lib\zstdd.pdb /dll /version:1.5 /machine:x64 /nologo /debug /INCREMENTAL kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib /MANIFEST /MANIFESTFILE:lib\CMakeFiles\libzstd_shared.dir/intermediate.manifest lib\CMakeFiles\libzstd_shared.dir/manifest.res" failed (exit code 1104) with the following output: LINK : fatal error LNK1104: cannot open file 'lib\CMakeFiles\libzstd_shared.dir\C_\src\vcpkg\buildtrees\zstd\src\dd82848e01-1d82dd46ac\lib\decompress\huf_decompress_amd64.S.obj' ninja: build stopped: subcommand failed. ``` The cmake options are: ```cmake -DZSTD_BUILD_SHARED=${ZSTD_BUILD_SHARED} -DZSTD_BUILD_STATIC=${ZSTD_BUILD_STATIC} -DZSTD_LEGACY_SUPPORT=1 -DZSTD_BUILD_PROGRAMS=0 -DZSTD_BUILD_TESTS=0 -DZSTD_BUILD_CONTRIB=0 ``` It seems that Visual Studio 2022 doesn't recognize this source file. **Expected behavior** Build success. **Desktop (please complete the following information):** - OS: Windows - Version 10 - Compiler Visual Studio 2022 - Flags see log above - Other relevant hardware specs N/A - Build system CMake **Additional context** N/A
1.0
build
1
414,253
27,982,650,789
IssuesEvent
2023-03-26 10:41:42
ipython/ipython
https://api.github.com/repos/ipython/ipython
closed
History search / tab completion does not behave as expected.
documentation tab-completion autosuggestions
Hello, I am on ipython 8.9.0, using python 3.11.0. This is a fresh install on a completely clean Ubuntu 22.04, using pyenv as the virtual environment manager. This is a setup that I have used elsewhere without (until now) any need for additional settings. On (all of) my systems, I always set bash's history navigation to work in [this way](https://stackoverflow.com/questions/3535192/matlab-like-command-history-retrieval-in-unix-command-line) and the same works within ipython without any extra settings. With the configuration mentioned above, when I start ipython and try to navigate in history, the search works but instead of bringing back the whole line, marking it as "active", it brings back the rest of the line but shaded. For example: ![image](https://user-images.githubusercontent.com/1336337/217814794-ca2e32b7-924a-400d-9433-e7e9bce9e9ba.png) To "activate" that line, I have to press "End"...which is a bit annoying. But perhaps more importantly, tab completion does not work. So, as I stand there looking at that line, tapping "Tab" will not make the rest of the line "active" (in the way I described happening with the "End" key). Any ideas what might be going on here? (I don't remember if I had the libreadline-dev package installed when I installed python 3.11 using pyenv, could this be a factor in this?)
1.0
History search / tab completion does not behave as expected. - Hello, I am on ipython 8.9.0, using python 3.11.0. This is a fresh install on a completely clean Ubuntu 22.04, using pyenv as the virtual environment manager. This is a setup that I have used elsewhere without (until now) any need for additional settings. On (all of) my systems, I always set bash's history navigation to work in [this way](https://stackoverflow.com/questions/3535192/matlab-like-command-history-retrieval-in-unix-command-line) and the same works within ipython without any extra settings. With the configuration mentioned above, when I start ipython and try to navigate in history, the search works but instead of bringing back the whole line, marking it as "active", it brings back the rest of the line but shaded. For example: ![image](https://user-images.githubusercontent.com/1336337/217814794-ca2e32b7-924a-400d-9433-e7e9bce9e9ba.png) To "activate" that line, I have to press "End"...which is a bit annoying. But perhaps more importantly, tab completion does not work. So, as I stand there looking at that line, tapping "Tab" will not make the rest of the line "active" (in the way I described happening with the "End" key). Any ideas what might be going on here? (I don't remember if I had the libreadline-dev package installed when I installed python 3.11 using pyenv, could this be a factor in this?)
non_build
history search tab completion does not behave as expected hello i am on ipython using python this is a fresh install on a completely clean ubuntu using pyenv as the virtual environment manager this is a setup that i have used elsewhere without until now any need for additional settings on all of my systems i always set bash s history navigation to work in and the same works within ipython without any extra settings with the configuration mentioned above when i start ipython and try to navigate in history the search works but instead of bringing back the whole line marking it as active it brings back the rest of the line but shaded for example to activate that line i have to press end which is a bit annoying but perhaps more importantly tab completion does not work so as i stand there looking at that line tapping tab will not make the rest of the line active in the way i described happening with the end key any ideas what might be going on here i don t remember if i had the libreadline dev package installed when i installed python using pyenv could this be a factor in this
0
33,964
9,221,138,472
IssuesEvent
2019-03-11 19:12:32
assimp/assimp
https://api.github.com/repos/assimp/assimp
closed
building issue C2678 and C2088 with Visual Studio 15 2017 and Assimp 4.0.1
bug build
I did the command: cmake . -G "Visual Studio 15 2017 Win64" and then I tried to compile the library, however I got this error: ![image](https://user-images.githubusercontent.com/573633/54090617-1a531380-4344-11e9-9fb0-65d1ca401815.png) I tried to change the C++ standard language and same error, seems a compiler bug or "VS standard bug" 📦 ![image](https://user-images.githubusercontent.com/573633/54090648-4ec6cf80-4344-11e9-860b-2ec32a5b465f.png)
1.0
building issue C2678 and C2088 with Visual Studio 15 2017 and Assimp 4.0.1 - I did the command: cmake . -G "Visual Studio 15 2017 Win64" and then I tried to compile the library, however I got this error: ![image](https://user-images.githubusercontent.com/573633/54090617-1a531380-4344-11e9-9fb0-65d1ca401815.png) I tried to change the C++ standard language and same error, seems a compiler bug or "VS standard bug" 📦 ![image](https://user-images.githubusercontent.com/573633/54090648-4ec6cf80-4344-11e9-860b-2ec32a5b465f.png)
build
building issue and with visual studio and assimp i did the command cmake g visual studio and then i tried to compile the library however i got this error i tried to change the c standard language and same error seems a compiler bug or vs standard bug 📦
1
264,337
28,144,823,611
IssuesEvent
2023-04-02 11:11:59
RG4421/ampere-centos-kernel
https://api.github.com/repos/RG4421/ampere-centos-kernel
opened
CVE-2023-1670 (Medium) detected in linuxv5.2
Mend: dependency security vulnerability
## CVE-2023-1670 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary> <p> <p>Linux kernel source tree</p> <p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p> <p>Found in base branch: <b>amp-centos-8.0-kernel</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/ethernet/xircom/xirc2ps_cs.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/ethernet/xircom/xirc2ps_cs.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/ethernet/xircom/xirc2ps_cs.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw use after free in the Linux kernel Xircom 16-bit PCMCIA (PC-card) Ethernet driver was found.A local user could use this flaw to crash the system or potentially escalate their privileges on the system. 
<p>Publish Date: 2023-03-30 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-1670>CVE-2023-1670</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2023-1670">https://www.linuxkernelcves.com/cves/CVE-2023-1670</a></p> <p>Release Date: 2023-03-30</p> <p>Fix Resolution: v5.15.105,v6.1.22,v6.2.9</p> </p> </details> <p></p>
True
CVE-2023-1670 (Medium) detected in linuxv5.2 - ## CVE-2023-1670 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary> <p> <p>Linux kernel source tree</p> <p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p> <p>Found in base branch: <b>amp-centos-8.0-kernel</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/ethernet/xircom/xirc2ps_cs.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/ethernet/xircom/xirc2ps_cs.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/ethernet/xircom/xirc2ps_cs.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw use after free in the Linux kernel Xircom 16-bit PCMCIA (PC-card) Ethernet driver was found.A local user could use this flaw to crash the system or potentially escalate their privileges on the system. 
<p>Publish Date: 2023-03-30 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-1670>CVE-2023-1670</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2023-1670">https://www.linuxkernelcves.com/cves/CVE-2023-1670</a></p> <p>Release Date: 2023-03-30</p> <p>Fix Resolution: v5.15.105,v6.1.22,v6.2.9</p> </p> </details> <p></p>
non_build
cve medium detected in cve medium severity vulnerability vulnerable library linux kernel source tree library home page a href found in base branch amp centos kernel vulnerable source files drivers net ethernet xircom cs c drivers net ethernet xircom cs c drivers net ethernet xircom cs c vulnerability details a flaw use after free in the linux kernel xircom bit pcmcia pc card ethernet driver was found a local user could use this flaw to crash the system or potentially escalate their privileges on the system publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
0
667,327
22,465,866,489
IssuesEvent
2022-06-22 01:38:54
yukieiji/ExtremeRoles
https://api.github.com/repos/yukieiji/ExtremeRoles
opened
AmongUs v2022.06.21 support status
優先度:最高/Priority:EXTREME バグ/Bug ExtremeSkins
### Extreme Roles does not support AmongUs v2022.06.21; loading the MOD itself fails - Judging from the updated Nuget packages, PlayerControl has been heavily refactored - Some of the PlayerControl patches in the MOD and some of the accesses to PlayerControl members are broken
1.0
AmongUs v2022.06.21 support status - ### Extreme Roles does not support AmongUs v2022.06.21; loading the MOD itself fails - Judging from the updated Nuget packages, PlayerControl has been heavily refactored - Some of the PlayerControl patches in the MOD and some of the accesses to PlayerControl members are broken
non_build
amongus support status extreme roles does not support amongus loading the mod itself fails judging from the updated nuget packages playercontrol has been heavily refactored some of the playercontrol patches in the mod and some of the accesses to playercontrol members are broken
0
109,912
16,920,508,657
IssuesEvent
2021-06-25 04:27:14
dhlinh98/juice-shop
https://api.github.com/repos/dhlinh98/juice-shop
opened
CVE-2020-28469 (High) detected in glob-parent-3.1.0.tgz
security vulnerability
## CVE-2020-28469 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>glob-parent-3.1.0.tgz</b></p></summary> <p>Strips glob magic from a string to provide the parent directory path</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p> <p>Path to dependency file: juice-shop/frontend/package.json</p> <p>Path to vulnerable library: juice-shop/frontend/node_modules/webpack-dev-server/node_modules/glob-parent/package.json,juice-shop/frontend/node_modules/@angular/compiler-cli/node_modules/glob-parent/package.json,juice-shop/frontend/node_modules/watchpack-chokidar2/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - compiler-cli-8.2.14.tgz (Root Library) - chokidar-2.1.8.tgz - :x: **glob-parent-3.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/dhlinh98/juice-shop/commit/6c891606b07d3b3862894a2a57234f708033d6e1">6c891606b07d3b3862894a2a57234f708033d6e1</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator. 
<p>Publish Date: 2021-06-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p> <p>Release Date: 2021-06-03</p> <p>Fix Resolution: glob-parent - 5.1.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-28469 (High) detected in glob-parent-3.1.0.tgz - ## CVE-2020-28469 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>glob-parent-3.1.0.tgz</b></p></summary> <p>Strips glob magic from a string to provide the parent directory path</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p> <p>Path to dependency file: juice-shop/frontend/package.json</p> <p>Path to vulnerable library: juice-shop/frontend/node_modules/webpack-dev-server/node_modules/glob-parent/package.json,juice-shop/frontend/node_modules/@angular/compiler-cli/node_modules/glob-parent/package.json,juice-shop/frontend/node_modules/watchpack-chokidar2/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - compiler-cli-8.2.14.tgz (Root Library) - chokidar-2.1.8.tgz - :x: **glob-parent-3.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/dhlinh98/juice-shop/commit/6c891606b07d3b3862894a2a57234f708033d6e1">6c891606b07d3b3862894a2a57234f708033d6e1</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator. 
<p>Publish Date: 2021-06-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p> <p>Release Date: 2021-06-03</p> <p>Fix Resolution: glob-parent - 5.1.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_build
cve high detected in glob parent tgz cve high severity vulnerability vulnerable library glob parent tgz strips glob magic from a string to provide the parent directory path library home page a href path to dependency file juice shop frontend package json path to vulnerable library juice shop frontend node modules webpack dev server node modules glob parent package json juice shop frontend node modules angular compiler cli node modules glob parent package json juice shop frontend node modules watchpack node modules glob parent package json dependency hierarchy compiler cli tgz root library chokidar tgz x glob parent tgz vulnerable library found in head commit a href vulnerability details this affects the package glob parent before the enclosure regex used to check for strings ending in enclosure containing path separator publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution glob parent step up your open source security game with whitesource
0
77,204
21,699,219,676
IssuesEvent
2022-05-10 00:50:13
ros-noetic-arch/ros-noetic-kdl-parser
https://api.github.com/repos/ros-noetic-arch/ros-noetic-kdl-parser
closed
build error
build error
``` [100%] Linking CXX executable devel/lib/kdl_parser/check_kdl_parser /usr/bin/ld: warning: libglog.so.0, needed by /opt/ros/noetic/lib/librosconsole_glog.so, not found (try using -rpath or -rpath-link) /usr/bin/ld: /opt/ros/noetic/lib/librosconsole_glog.so: undefined reference to `google::LogMessage::stream()' /usr/bin/ld: /opt/ros/noetic/lib/librosconsole_glog.so: undefined reference to `google::LogMessage::LogMessage(char const*, int, int)' /usr/bin/ld: /opt/ros/noetic/lib/librosconsole_glog.so: undefined reference to `google::InitGoogleLogging(char const*)' /usr/bin/ld: /opt/ros/noetic/lib/librosconsole_glog.so: undefined reference to `google::LogMessage::~LogMessage()' collect2: error: ld returned 1 exit status make[2]: *** [CMakeFiles/check_kdl_parser.dir/build.make:162: devel/lib/kdl_parser/check_kdl_parser] Error 1 make[1]: *** [CMakeFiles/Makefile2:725: CMakeFiles/check_kdl_parser.dir/all] Error 2 make: *** [Makefile:136: all] Error 2 ==> ERROR: A failure occurred in build(). Aborting... ==> ERROR: Build failed, check /var/lib/aurbuild/x86_64/acxz/build error: failed to build 'ros-noetic-kdl-parser-1.14.2-1': failed to run: makechrootpkg -r /var/lib/aurbuild/x86_64 -D /home/acxz/.cache/paru/repo -d /var/cache/pacman/pkg/ -- -feA --noconfirm --noprepare --holdver: error: packages failed to build: ros-noetic-kdl-parser-1.14.2-1 paru -S ros-noetic-kdl-parser 39.30s user 12.20s system 100% cpu 51.139 total ``` Maybe this is a rebuild issue? i.e. i should rebuild rosconsole? Did you encounter this @AchmadFathoni ?
1.0
build error - ``` [100%] Linking CXX executable devel/lib/kdl_parser/check_kdl_parser /usr/bin/ld: warning: libglog.so.0, needed by /opt/ros/noetic/lib/librosconsole_glog.so, not found (try using -rpath or -rpath-link) /usr/bin/ld: /opt/ros/noetic/lib/librosconsole_glog.so: undefined reference to `google::LogMessage::stream()' /usr/bin/ld: /opt/ros/noetic/lib/librosconsole_glog.so: undefined reference to `google::LogMessage::LogMessage(char const*, int, int)' /usr/bin/ld: /opt/ros/noetic/lib/librosconsole_glog.so: undefined reference to `google::InitGoogleLogging(char const*)' /usr/bin/ld: /opt/ros/noetic/lib/librosconsole_glog.so: undefined reference to `google::LogMessage::~LogMessage()' collect2: error: ld returned 1 exit status make[2]: *** [CMakeFiles/check_kdl_parser.dir/build.make:162: devel/lib/kdl_parser/check_kdl_parser] Error 1 make[1]: *** [CMakeFiles/Makefile2:725: CMakeFiles/check_kdl_parser.dir/all] Error 2 make: *** [Makefile:136: all] Error 2 ==> ERROR: A failure occurred in build(). Aborting... ==> ERROR: Build failed, check /var/lib/aurbuild/x86_64/acxz/build error: failed to build 'ros-noetic-kdl-parser-1.14.2-1': failed to run: makechrootpkg -r /var/lib/aurbuild/x86_64 -D /home/acxz/.cache/paru/repo -d /var/cache/pacman/pkg/ -- -feA --noconfirm --noprepare --holdver: error: packages failed to build: ros-noetic-kdl-parser-1.14.2-1 paru -S ros-noetic-kdl-parser 39.30s user 12.20s system 100% cpu 51.139 total ``` Maybe this is a rebuild issue? i.e. i should rebuild rosconsole? Did you encounter this @AchmadFathoni ?
build
build error linking cxx executable devel lib kdl parser check kdl parser usr bin ld warning libglog so needed by opt ros noetic lib librosconsole glog so not found try using rpath or rpath link usr bin ld opt ros noetic lib librosconsole glog so undefined reference to google logmessage stream usr bin ld opt ros noetic lib librosconsole glog so undefined reference to google logmessage logmessage char const int int usr bin ld opt ros noetic lib librosconsole glog so undefined reference to google initgooglelogging char const usr bin ld opt ros noetic lib librosconsole glog so undefined reference to google logmessage logmessage error ld returned exit status make error make error make error error a failure occurred in build aborting error build failed check var lib aurbuild acxz build error failed to build ros noetic kdl parser failed to run makechrootpkg r var lib aurbuild d home acxz cache paru repo d var cache pacman pkg fea noconfirm noprepare holdver error packages failed to build ros noetic kdl parser paru s ros noetic kdl parser user system cpu total maybe this is a rebuild issue i e i should rebuild rosconsole did you encounter this achmadfathoni
1
38,275
10,164,462,643
IssuesEvent
2019-08-07 11:45:15
kube-HPC/hkube
https://api.github.com/repos/kube-HPC/hkube
closed
building a node algorithm doesn't work
HKube: Algorithm Builder Type: Bug
**HKube micro-service** algorithm builder **Describe the bug** building a node app dosent work **Expected behavior** it should work and build the algorithm **To Reproduce** Steps to reproduce the behavior: 1. create a node algorithm 2. zip it with the package.json file 3. build the algorithm **Screenshots** If applicable, add screenshots to help explain your problem. **Additional context** " IMAGE_NAME=docker.io/eytang/codeapi:v1.0.0 BUILD_PATH=builds/nodejs/codeapi BASE_IMAGE=docker.io/hkube/nodejs-env:1.1.84 DOCKER_PULL_REGISTRY=docker.io/hkube INSECURE_PULL=false SKIP_TLS_VERIFY_PULL=false DOCKER_PUSH_REGISTRY=docker.io/eytang SKIP_TLS_VERIFY=false INSECURE=false PACKAGES_REGISTRY= REMOVE_IMAGE=True TMP_FOLDER=/tmp Building image docker.io/eytang/codeapi:v1.0.0 copy context from builds/nodejs/codeapi to /tmp/workspace build done INFO[0000] Checking push permissions INFO[0001] Resolved base name ${baseImage} to docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Resolved base name ${baseImage} to docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Resolved base name ${baseImage} to docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Resolved base name ${baseImage} to docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Error while retrieving image from cache: getting file info: stat /cache/sha256:24bd806383ca0f0c7552e5126b0b3abd1eee6de731105d63089a08329e947b61: no such file or directory INFO[0001] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0002] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0003] Error while retrieving image from cache: getting file info: stat /cache/sha256:24bd806383ca0f0c7552e5126b0b3abd1eee6de731105d63089a08329e947b61: no such file or directory INFO[0003] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0003] Built cross stage deps: map[0:[/hkube/algorithm-runner]] INFO[0003] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0004] Error while 
retrieving image from cache: getting file info: stat /cache/sha256:24bd806383ca0f0c7552e5126b0b3abd1eee6de731105d63089a08329e947b61: no such file or directory INFO[0004] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0005] Executing 0 build triggers INFO[0005] Unpacking rootfs as cmd RUN ../docker/requirements.sh ${packagesRegistry} ${packagesToken} requires it. error building image: error building stage: removing whiteout usr/share/doc/nano/.wh..wh..opq: lstat /usr/share/doc/nano/.wh..opq: operation not permitted error: 1 build finished with code 1 "
1.0
building a node algorith, dosent work - **HKube micro-service** algorithm builder **Describe the bug** building a node app dosent work **Expected behavior** it should work and build the algorithm **To Reproduce** Steps to reproduce the behavior: 1. create a node algorithm 2. zip it with the package.json file 3. build the algorithm **Screenshots** If applicable, add screenshots to help explain your problem. **Additional context** " IMAGE_NAME=docker.io/eytang/codeapi:v1.0.0 BUILD_PATH=builds/nodejs/codeapi BASE_IMAGE=docker.io/hkube/nodejs-env:1.1.84 DOCKER_PULL_REGISTRY=docker.io/hkube INSECURE_PULL=false SKIP_TLS_VERIFY_PULL=false DOCKER_PUSH_REGISTRY=docker.io/eytang SKIP_TLS_VERIFY=false INSECURE=false PACKAGES_REGISTRY= REMOVE_IMAGE=True TMP_FOLDER=/tmp Building image docker.io/eytang/codeapi:v1.0.0 copy context from builds/nodejs/codeapi to /tmp/workspace build done INFO[0000] Checking push permissions INFO[0001] Resolved base name ${baseImage} to docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Resolved base name ${baseImage} to docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Resolved base name ${baseImage} to docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Resolved base name ${baseImage} to docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0001] Error while retrieving image from cache: getting file info: stat /cache/sha256:24bd806383ca0f0c7552e5126b0b3abd1eee6de731105d63089a08329e947b61: no such file or directory INFO[0001] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0002] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0003] Error while retrieving image from cache: getting file info: stat /cache/sha256:24bd806383ca0f0c7552e5126b0b3abd1eee6de731105d63089a08329e947b61: no such file or directory INFO[0003] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0003] Built cross stage deps: map[0:[/hkube/algorithm-runner]] INFO[0003] Downloading base image 
docker.io/hkube/nodejs-env:1.1.84 INFO[0004] Error while retrieving image from cache: getting file info: stat /cache/sha256:24bd806383ca0f0c7552e5126b0b3abd1eee6de731105d63089a08329e947b61: no such file or directory INFO[0004] Downloading base image docker.io/hkube/nodejs-env:1.1.84 INFO[0005] Executing 0 build triggers INFO[0005] Unpacking rootfs as cmd RUN ../docker/requirements.sh ${packagesRegistry} ${packagesToken} requires it. error building image: error building stage: removing whiteout usr/share/doc/nano/.wh..wh..opq: lstat /usr/share/doc/nano/.wh..opq: operation not permitted error: 1 build finished with code 1 "
build
building a node algorith dosent work hkube micro service algorithm builder describe the bug building a node app dosent work expected behavior it should work and build the algorithm to reproduce steps to reproduce the behavior create a node algorithm zip it with the package json file build the algorithm screenshots if applicable add screenshots to help explain your problem additional context image name docker io eytang codeapi build path builds nodejs codeapi base image docker io hkube nodejs env docker pull registry docker io hkube insecure pull false skip tls verify pull false docker push registry docker io eytang skip tls verify false insecure false packages registry remove image true tmp folder tmp building image docker io eytang codeapi copy context from builds nodejs codeapi to tmp workspace build done  checking push permissions  resolved base name baseimage to docker io hkube nodejs env  resolved base name baseimage to docker io hkube nodejs env  resolved base name baseimage to docker io hkube nodejs env  resolved base name baseimage to docker io hkube nodejs env  downloading base image docker io hkube nodejs env  error while retrieving image from cache getting file info stat cache no such file or directory  downloading base image docker io hkube nodejs env  downloading base image docker io hkube nodejs env  error while retrieving image from cache getting file info stat cache no such file or directory  downloading base image docker io hkube nodejs env  built cross stage deps map  downloading base image docker io hkube nodejs env  error while retrieving image from cache getting file info stat cache no such file or directory  downloading base image docker io hkube nodejs env  executing build triggers  unpacking rootfs as cmd run docker requirements sh packagesregistry packagestoken requires it error building image error building stage removing whiteout usr share doc nano wh wh opq lstat usr share doc nano wh opq operation not permitted error build finished with 
code
1
39,756
10,374,428,399
IssuesEvent
2019-09-09 09:35:08
widelands/widelands-issue-migration2
https://api.github.com/repos/widelands/widelands-issue-migration2
closed
Install fails (file INSTALL cannot find "..../fonts")
Fix Released High buildsystem linux
When r7320 was merged, the fonts directory was moved into i18n. However, this has made `make install` fail since something is still referring to the old directory. How to reproduce: 1. Build Widelands 2. In the build directory, run `ninja install` (or `make install`) Note: this step will attempt to install the files system-wide on your computer. Only do this on a test system where you can roll back (like a virtual machine) or a system where you don't care about placing these files globally. The install step currently fails with the following error message: CMake Error at cmake_install.cmake:59 (file): file INSTALL cannot find "/build/buildd/widelands-18-ppa0-bzr7321/fonts". This broke the PPA, and while investigating I found that it occurred when calling `make install`, so the packaging was not involved/responsible. I believe the underlying issue is that the fonts directory is mentioned on line 232 in CMakeLists.txt. Should it be replaced with i18n instead? Is that what the game expects now?
1.0
Install fails (file INSTALL cannot find "..../fonts") - When r7320 was merged, the fonts directory was moved into i18n. However, this has made `make install` fail since something is still refering to the old directory. How to reproduce: 1. Build Widelands 2. In the build directory, run `ninja install` (or `make install`) Note: this step will attempt to install the files system-wide on your computer. Only do this on a test system where you can roll back (like a virtual machine) or a system where you don't care about placing these files globally. The install step currently fails with the following error message: CMake Error at cmake_install.cmake:59 (file): file INSTALL cannot find "/build/buildd/widelands-18-ppa0-bzr7321/fonts". This broke the PPA, and while investigating I found that it occured when calling `make install` so the packaging was not involved/responsible. I believe the underlying issue is that the fonts directory is mentioned on line 232 in CMakeLists.txt. Should it be replace with i18n instead, is that what the game expects now?
build
install fails file install cannot find fonts when was merged the fonts directory was moved into however this has made make install fail since something is still refering to the old directory how to reproduce build widelands in the build directory run ninja install or make install note this step will attempt to install the files system wide on your computer only do this on a test system where you can roll back like a virtual machine or a system where you don t care about placing these files globally the install step currently fails with the following error message cmake error at cmake install cmake file file install cannot find build buildd widelands fonts this broke the ppa and while investigating i found that it occured when calling make install so the packaging was not involved responsible i believe the underlying issue is that the fonts directory is mentioned on line in cmakelists txt should it be replace with instead is that what the game expects now
1
50,181
10,467,037,534
IssuesEvent
2019-09-22 00:41:12
icoxfog417/baby-steps-of-rl-ja
https://api.github.com/repos/icoxfog417/baby-steps-of-rl-ja
closed
p107 コード
code
### 指摘事項 ### 指摘箇所 * [ ] Day1: 強化学習の位置づけを知る * [ ] Day2: 強化学習の解法(1): 環境から計画を立てる * [ ] Day3: 強化学習の解法(2): 経験から計画を立てる * [x] Day4: 強化学習に対するニューラルネットワークの適用 * [ ] Day5: 強化学習の弱点 * [ ] Day6: 強化学習の弱点を克服するための手法 * [ ] Day7: 強化学習の活用領域 ページ番号: p107 ### 実行環境 * OS: * Python version: * `pip freeze`の実行結果 (下に添付) ### エラー内容 細かいですが p107 の最後 ```python def initialize(self, experiences): raise Exception("You have to implements estimate method") def estimate(self, s): raise Exception("You have to implements estimate method") ``` となっていますが,正しくは ```python def initialize(self, experiences): raise Exception("You have to implement initialize method") def estimate(self, s): raise Exception("You have to implement estimate method") ``` (~~implements~~ implement,~~estimate~~ initialize) あとこういう場合は Exception ではなく NotImplementedError を使ったほうが良いと思います.
1.0
p107 コード - ### 指摘事項 ### 指摘箇所 * [ ] Day1: 強化学習の位置づけを知る * [ ] Day2: 強化学習の解法(1): 環境から計画を立てる * [ ] Day3: 強化学習の解法(2): 経験から計画を立てる * [x] Day4: 強化学習に対するニューラルネットワークの適用 * [ ] Day5: 強化学習の弱点 * [ ] Day6: 強化学習の弱点を克服するための手法 * [ ] Day7: 強化学習の活用領域 ページ番号: p107 ### 実行環境 * OS: * Python version: * `pip freeze`の実行結果 (下に添付) ### エラー内容 細かいですが p107 の最後 ```python def initialize(self, experiences): raise Exception("You have to implements estimate method") def estimate(self, s): raise Exception("You have to implements estimate method") ``` となっていますが,正しくは ```python def initialize(self, experiences): raise Exception("You have to implement initialize method") def estimate(self, s): raise Exception("You have to implement estimate method") ``` (~~implements~~ implement,~~estimate~~ initialize) あとこういう場合は Exception ではなく NotImplementedError を使ったほうが良いと思います.
non_build
コード 指摘事項 指摘箇所 強化学習の位置づけを知る 強化学習の解法 環境から計画を立てる 強化学習の解法 経験から計画を立てる 強化学習に対するニューラルネットワークの適用 強化学習の弱点 強化学習の弱点を克服するための手法 強化学習の活用領域 ページ番号 実行環境 os python version pip freeze の実行結果 下に添付 エラー内容 細かいですが の最後 python def initialize self experiences raise exception you have to implements estimate method def estimate self s raise exception you have to implements estimate method となっていますが,正しくは python def initialize self experiences raise exception you have to implement initialize method def estimate self s raise exception you have to implement estimate method implements implement, estimate initialize あとこういう場合は exception ではなく notimplementederror を使ったほうが良いと思います.
0
39,484
8,652,834,518
IssuesEvent
2018-11-27 09:16:16
Altinn/altinn-studio
https://api.github.com/repos/Altinn/altinn-studio
opened
Make a golden standard for Sagas
code-quality
**Functional architect/designer:** @TheTechArch **Technical architect:** @-mention **Description** We need to create better, more testable sagas. Instead of refactoring exsisting, we start iteration 22 with writing one golden standard saga This should make it easyer for later refactoring. Could be solved by either team, when the first new saga is created **Technical considerations** Input (beyond tasks) on how the user story should be solved can be put here. **Acceptance criterea** - standard is agreed and communicated - testable sagas - readable sagas - **Tasks** - [ ] Analyse - [ ] Discuss - [ ] Documentation - [ ] Spread info to the other team - [ ] Implemtation
1.0
Make a golden standard for Sagas - **Functional architect/designer:** @TheTechArch **Technical architect:** @-mention **Description** We need to create better, more testable sagas. Instead of refactoring exsisting, we start iteration 22 with writing one golden standard saga This should make it easyer for later refactoring. Could be solved by either team, when the first new saga is created **Technical considerations** Input (beyond tasks) on how the user story should be solved can be put here. **Acceptance criterea** - standard is agreed and communicated - testable sagas - readable sagas - **Tasks** - [ ] Analyse - [ ] Discuss - [ ] Documentation - [ ] Spread info to the other team - [ ] Implemtation
non_build
make a golden standard for sagas functional architect designer thetecharch technical architect mention description we need to create better more testable sagas instead of refactoring exsisting we start iteration with writing one golden standard saga this should make it easyer for later refactoring could be solved by either team when the first new saga is created technical considerations input beyond tasks on how the user story should be solved can be put here acceptance criterea standard is agreed and communicated testable sagas readable sagas tasks analyse discuss documentation spread info to the other team implemtation
0
71,427
18,737,877,838
IssuesEvent
2021-11-04 10:01:07
vaticle/typedb-studio
https://api.github.com/repos/vaticle/typedb-studio
closed
Create a Bazel target for deploying to RPM
type: build status: blocked
Use `//:deploy-rpm` from https://github.com/graknlabs/bazel-distribution to deploy to RPM in repo.grakn.ai
1.0
Create a Bazel target for deploying to RPM - Use `//:deploy-rpm` from https://github.com/graknlabs/bazel-distribution to deploy to RPM in repo.grakn.ai
build
create a bazel target for deploying to rpm use deploy rpm from to deploy to rpm in repo grakn ai
1
52,053
12,854,566,696
IssuesEvent
2020-07-09 02:21:39
microsoft/onnxruntime
https://api.github.com/repos/microsoft/onnxruntime
closed
How to import onnxruntime when I build from source?
builds question
I have build the onnxruntime use "./build.sh --config RelWithDebInfo --build_shared_lib --parallel" successully. Then how to import onnxruntime? or before I import onnxruntime what else should I do?
1.0
How to import onnxruntime when I build from source? - I have build the onnxruntime use "./build.sh --config RelWithDebInfo --build_shared_lib --parallel" successully. Then how to import onnxruntime? or before I import onnxruntime what else should I do?
build
how to import onnxruntime when i build from source i have build the onnxruntime use build sh config relwithdebinfo build shared lib parallel successully then how to import onnxruntime or before i import onnxruntime what else should i do
1
58,402
14,382,940,060
IssuesEvent
2020-12-02 08:23:05
nipafx/nipafx.dev
https://api.github.com/repos/nipafx/nipafx.dev
opened
Merge <Accordion> and <PopOutAccordion>
aspect: development 👷 component: navigation type: performance 🏃‍♀️ type: refactoring :building_construction:
The navigation side menu uses these components to implement the expand/pop-out behavior (respectively). When creating the pop-out variant I decided it was too complex to add it to the already complex, expanding variant. While keeping each of the components simpler than a single one offering both behaviors, it has some downsides: * additional maintenance work * menu elements are duplicated to show up in both components * can't use HTML IDs for menu elements (wouldn't be unique) * adds additional DOM nodes (~80 on landing, >100 on other pages) * additional JS at run time to update two tocs on click/scroll Look into merging these two components into one.
1.0
Merge <Accordion> and <PopOutAccordion> - The navigation side menu uses these components to implement the expand/pop-out behavior (respectively). When creating the pop-out variant I decided it was too complex to add it to the already complex, expanding variant. While keeping each of the components simpler than a single one offering both behaviors, it has some downsides: * additional maintenance work * menu elements are duplicated to show up in both components * can't use HTML IDs for menu elements (wouldn't be unique) * adds additional DOM nodes (~80 on landing, >100 on other pages) * additional JS at run time to update two tocs on click/scroll Look into merging these two components into one.
build
merge and the navigation side menu uses these components to implement the expand pop out behavior respectively when creating the pop out variant i decided it was too complex to add it to the already complex expanding variant while keeping each of the components simpler than a single one offering both behaviors it has some downsides additional maintenance work menu elements are duplicated to show up in both components can t use html ids for menu elements wouldn t be unique adds additional dom nodes on landing on other pages additional js at run time to update two tocs on click scroll look into merging these two components into one
1
128,983
5,081,257,550
IssuesEvent
2016-12-29 09:23:04
jesparza/peepdf
https://api.github.com/repos/jesparza/peepdf
closed
using peepdf in python programming
Priority-Medium question Type-Other
@jesparza As this is a python tool. Can you tell if I can use its commands in python programming by importing peepdf as a package. I have analysed my pdf and had a look at its all objects by executing console commands like object 1, object 2 etc. Now my goal is to replace the content of 24 numbered object. Is that possible with this? Please suggest.
1.0
using peepdf in python programming - @jesparza As this is a python tool. Can you tell if I can use its commands in python programming by importing peepdf as a package. I have analysed my pdf and had a look at its all objects by executing console commands like object 1, object 2 etc. Now my goal is to replace the content of 24 numbered object. Is that possible with this? Please suggest.
non_build
using peepdf in python programming jesparza as this is a python tool can you tell if i can use its commands in python programming by importing peepdf as a package i have analysed my pdf and had a look at its all objects by executing console commands like object object etc now my goal is to replace the content of numbered object is that possible with this please suggest
0
99,888
30,573,201,837
IssuesEvent
2023-07-21 01:24:59
microsoft/onnxruntime
https://api.github.com/repos/microsoft/onnxruntime
closed
[Build] Compilation on an Apple M1 Pro produces an illegal instruction
build
### Describe the issue Seems that #16082 has introduced a bug when compiling for an Apple M1 machine. The bug stems from a missing inlined-assembly [instruction](https://github.com/microsoft/onnxruntime/commit/b28e927ca4e#diff-2f1ef5fcaaa83277145acadea37a3d9da65e9605664f03efd4ca6208fa191e3bR452). Reverting the commit introduces a regression on the test suite. ### Urgency This is a presumed blocker for a release on an arm64 Apple. ### Target platform arm64-apple-darwin22.5.0 ### Build script `./build.sh --config Debug --build_shared_lib --parallel --compile_no_warning_as_error --skip_submodule_sync --cmake_extra_defines CMAKE_OSX_ARCHITECTURES=arm64` ### Error / output ``` ... 5: 2023-06-27 14:08:02.109 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.109664 [I:onnxruntime:Default, bfc_arena.cc:212 Extend] Allocated memory at 0x128a88000 to 0x128e88000 5: 2023-06-27 14:08:02.110 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.110300 [I:onnxruntime:, session_state_utils.cc:344 SaveInitializedTensors] Done saving initialized tensors 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111169 [I:onnxruntime:, inference_session.cc:1675 Initialize] Session successfully initialized. 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111225 [V:onnxruntime:, sequential_executor.cc:534 ExecuteThePlan] Number of streams: 1 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111319 [V:onnxruntime:, sequential_executor.cc:184 SessionScope] Begin execution 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111356 [I:onnxruntime:Default, bfc_arena.cc:347 AllocateRawInternal] Extending BFCArena for Cpu. 
bin_num:13 (requested) num_bytes: 3154176 (actual) rounded_bytes:3154176 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111374 [I:onnxruntime:Default, bfc_arena.cc:206 Extend] Extended allocation by 8388608 bytes. 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111385 [I:onnxruntime:Default, bfc_arena.cc:209 Extend] Total allocated bytes: 15728640 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111395 [I:onnxruntime:Default, bfc_arena.cc:212 Extend] Allocated memory at 0x120000000 to 0x120800000 5: 2023-06-27 14:08:02.112 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.112710 [V:onnxruntime:, sequential_executor.cc:518 ExecuteKernel] stream 0 launch kernel with idx 66 5: 2023-06-27 14:08:02.113 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.113577 [V:onnxruntime:, stream_execution_context.cc:171 RecycleNodeInputs] ort value 53 released 5: 2023-06-27 14:08:02.113 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.113592 [V:onnxruntime:, sequential_executor.cc:518 ExecuteKernel] stream 0 launch kernel with idx 2 5/8 Test #5: onnxruntime_global_thread_pools_test ....***Exception: Illegal 0.27 sec ... 
38% tests passed, 5 tests failed out of 8 Total Test time (real) = 5.27 sec The following tests FAILED: 1 - onnxruntime_test_all (ILLEGAL) 2 - onnx_test_pytorch_converted (ILLEGAL) 3 - onnx_test_pytorch_operator (ILLEGAL) 4 - onnxruntime_shared_lib_test (ILLEGAL) 5 - onnxruntime_global_thread_pools_test (ILLEGAL)``` ``` Running `lldb onnxruntime_mlas_test` provides the following output: ``` * thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_INSTRUCTION (code=1, subcode=0xd5380608) frame #0: 0x00000001001b5458 onnxruntime_mlas_test`MLAS_PLATFORM::MLAS_PLATFORM(this=0x000000010029cd48) at platform.cpp:456:5 453 HasDotProductInstructions = (IsProcessorFeaturePresent(PF_ARM_V82_DP_INSTRUCTIONS_AVAILABLE) != 0); 454 #else 455 uint64_t isar0_el1; -> 456 asm("mrs %[reg], ID_AA64ISAR0_EL1\n" : [reg] "=r"(isar0_el1) : :); 457 HasDotProductInstructions = ((isar0_el1 >> 44) & 0xfu) == 0x1u; 458 #endif 459 Target 0: (onnxruntime_mlas_test) stopped. ``` Reverting the offending commit introduces the following test cases to fail: ``` [ FAILED ] 11 tests, listed below: [ FAILED ] GraphTransformationTests.FuseConvBnAddMulFloat16 [ FAILED ] GraphTransformationTests.QuickGelu [ FAILED ] GraphTransformationTests.ConstantSharing_ShareFloatOrHalfTypedInitializer [ FAILED ] GraphTransformationTests.ConstantSharing_Share2DFloatOrHalfTypedInitializer [ FAILED ] GraphTransformationTests.ConstantSharing_ShareFloatAndHalfTypedInitializer [ FAILED ] GraphTransformationTests.ConstantSharing_Share2DFloatAndHalfTypedInitializer [ FAILED ] Float16_Tests.Mul_16_Test [ FAILED ] InverseContribOpTest.two_by_two_float16 [ FAILED ] CastOpTest.NonStringTypes [ FAILED ] CastOpTest.ToString [ FAILED ] IsNaNOpTest.IsNaNFloat16 ``` ### Visual Studio Version _No response_ ### GCC / Compiler Version clang version 11.1.0
1.0
[Build] Compilation on an Apple M1 Pro produces an illegal instruction - ### Describe the issue Seems that #16082 has introduced a bug when compiling for an Apple M1 machine. The bug stems from a missing inlined-assembly [instruction](https://github.com/microsoft/onnxruntime/commit/b28e927ca4e#diff-2f1ef5fcaaa83277145acadea37a3d9da65e9605664f03efd4ca6208fa191e3bR452). Reverting the commit introduces a regression on the test suite. ### Urgency This is a presumed blocker for a release on an arm64 Apple. ### Target platform arm64-apple-darwin22.5.0 ### Build script `./build.sh --config Debug --build_shared_lib --parallel --compile_no_warning_as_error --skip_submodule_sync --cmake_extra_defines CMAKE_OSX_ARCHITECTURES=arm64` ### Error / output ``` ... 5: 2023-06-27 14:08:02.109 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.109664 [I:onnxruntime:Default, bfc_arena.cc:212 Extend] Allocated memory at 0x128a88000 to 0x128e88000 5: 2023-06-27 14:08:02.110 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.110300 [I:onnxruntime:, session_state_utils.cc:344 SaveInitializedTensors] Done saving initialized tensors 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111169 [I:onnxruntime:, inference_session.cc:1675 Initialize] Session successfully initialized. 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111225 [V:onnxruntime:, sequential_executor.cc:534 ExecuteThePlan] Number of streams: 1 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111319 [V:onnxruntime:, sequential_executor.cc:184 SessionScope] Begin execution 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111356 [I:onnxruntime:Default, bfc_arena.cc:347 AllocateRawInternal] Extending BFCArena for Cpu. 
bin_num:13 (requested) num_bytes: 3154176 (actual) rounded_bytes:3154176 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111374 [I:onnxruntime:Default, bfc_arena.cc:206 Extend] Extended allocation by 8388608 bytes. 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111385 [I:onnxruntime:Default, bfc_arena.cc:209 Extend] Total allocated bytes: 15728640 5: 2023-06-27 14:08:02.111 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.111395 [I:onnxruntime:Default, bfc_arena.cc:212 Extend] Allocated memory at 0x120000000 to 0x120800000 5: 2023-06-27 14:08:02.112 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.112710 [V:onnxruntime:, sequential_executor.cc:518 ExecuteKernel] stream 0 launch kernel with idx 66 5: 2023-06-27 14:08:02.113 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.113577 [V:onnxruntime:, stream_execution_context.cc:171 RecycleNodeInputs] ort value 53 released 5: 2023-06-27 14:08:02.113 onnxruntime_global_thread_pools_test[44651:1311462] 2023-06-27 14:08:02.113592 [V:onnxruntime:, sequential_executor.cc:518 ExecuteKernel] stream 0 launch kernel with idx 2 5/8 Test #5: onnxruntime_global_thread_pools_test ....***Exception: Illegal 0.27 sec ... 
38% tests passed, 5 tests failed out of 8 Total Test time (real) = 5.27 sec The following tests FAILED: 1 - onnxruntime_test_all (ILLEGAL) 2 - onnx_test_pytorch_converted (ILLEGAL) 3 - onnx_test_pytorch_operator (ILLEGAL) 4 - onnxruntime_shared_lib_test (ILLEGAL) 5 - onnxruntime_global_thread_pools_test (ILLEGAL)``` ``` Running `lldb onnxruntime_mlas_test` provides the following output: ``` * thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_INSTRUCTION (code=1, subcode=0xd5380608) frame #0: 0x00000001001b5458 onnxruntime_mlas_test`MLAS_PLATFORM::MLAS_PLATFORM(this=0x000000010029cd48) at platform.cpp:456:5 453 HasDotProductInstructions = (IsProcessorFeaturePresent(PF_ARM_V82_DP_INSTRUCTIONS_AVAILABLE) != 0); 454 #else 455 uint64_t isar0_el1; -> 456 asm("mrs %[reg], ID_AA64ISAR0_EL1\n" : [reg] "=r"(isar0_el1) : :); 457 HasDotProductInstructions = ((isar0_el1 >> 44) & 0xfu) == 0x1u; 458 #endif 459 Target 0: (onnxruntime_mlas_test) stopped. ``` Reverting the offending commit introduces the following test cases to fail: ``` [ FAILED ] 11 tests, listed below: [ FAILED ] GraphTransformationTests.FuseConvBnAddMulFloat16 [ FAILED ] GraphTransformationTests.QuickGelu [ FAILED ] GraphTransformationTests.ConstantSharing_ShareFloatOrHalfTypedInitializer [ FAILED ] GraphTransformationTests.ConstantSharing_Share2DFloatOrHalfTypedInitializer [ FAILED ] GraphTransformationTests.ConstantSharing_ShareFloatAndHalfTypedInitializer [ FAILED ] GraphTransformationTests.ConstantSharing_Share2DFloatAndHalfTypedInitializer [ FAILED ] Float16_Tests.Mul_16_Test [ FAILED ] InverseContribOpTest.two_by_two_float16 [ FAILED ] CastOpTest.NonStringTypes [ FAILED ] CastOpTest.ToString [ FAILED ] IsNaNOpTest.IsNaNFloat16 ``` ### Visual Studio Version _No response_ ### GCC / Compiler Version clang version 11.1.0
build
compilation on an apple pro produces an illegal instruction describe the issue seems that has introduced a bug when compiling for an apple machine the bug stems from a missing inlined assembly reverting the commit introduces a regression on the test suite urgency this is a presumed blocker for a release on an apple target platform apple build script build sh config debug build shared lib parallel compile no warning as error skip submodule sync cmake extra defines cmake osx architectures error output onnxruntime global thread pools test allocated memory at to onnxruntime global thread pools test done saving initialized tensors onnxruntime global thread pools test session successfully initialized onnxruntime global thread pools test number of streams onnxruntime global thread pools test begin execution onnxruntime global thread pools test extending bfcarena for cpu bin num requested num bytes actual rounded bytes onnxruntime global thread pools test extended allocation by bytes onnxruntime global thread pools test total allocated bytes onnxruntime global thread pools test allocated memory at to onnxruntime global thread pools test stream launch kernel with idx onnxruntime global thread pools test ort value released onnxruntime global thread pools test stream launch kernel with idx test onnxruntime global thread pools test exception illegal sec tests passed tests failed out of total test time real sec the following tests failed onnxruntime test all illegal onnx test pytorch converted illegal onnx test pytorch operator illegal onnxruntime shared lib test illegal onnxruntime global thread pools test illegal running lldb onnxruntime mlas test provides the following output thread queue com apple main thread stop reason exc bad instruction code subcode frame onnxruntime mlas test mlas platform mlas platform this at platform cpp hasdotproductinstructions isprocessorfeaturepresent pf arm dp instructions available else t asm mrs id n r hasdotproductinstructions endif target 
onnxruntime mlas test stopped reverting the offending commit introduces the following test cases to fail tests listed below graphtransformationtests graphtransformationtests quickgelu graphtransformationtests constantsharing sharefloatorhalftypedinitializer graphtransformationtests constantsharing graphtransformationtests constantsharing sharefloatandhalftypedinitializer graphtransformationtests constantsharing tests mul test inversecontriboptest two by two castoptest nonstringtypes castoptest tostring isnanoptest visual studio version no response gcc compiler version clang version
1
187,540
14,428,231,991
IssuesEvent
2020-12-06 08:45:40
kalexmills/github-vet-tests-dec2020
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
closed
coreroller/coreroller: backend/vendor/src/github.com/pmylund/go-cache/sharded_test.go; 8 LoC
fresh test tiny
Found a possible issue in [coreroller/coreroller](https://www.github.com/coreroller/coreroller) at [backend/vendor/src/github.com/pmylund/go-cache/sharded_test.go](https://github.com/coreroller/coreroller/blob/1a1f412434584d0569f408bffa059ead21e28d9d/backend/vendor/src/github.com/pmylund/go-cache/sharded_test.go#L58-L65) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > range-loop variable v used in defer or goroutine at line 61 [Click here to see the code in its original context.](https://github.com/coreroller/coreroller/blob/1a1f412434584d0569f408bffa059ead21e28d9d/backend/vendor/src/github.com/pmylund/go-cache/sharded_test.go#L58-L65) <details> <summary>Click here to show the 8 line(s) of Go which triggered the analyzer.</summary> ```go for _, v := range keys { go func() { for j := 0; j < each; j++ { tsc.Get(v) } wg.Done() }() } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 1a1f412434584d0569f408bffa059ead21e28d9d
1.0
coreroller/coreroller: backend/vendor/src/github.com/pmylund/go-cache/sharded_test.go; 8 LoC - Found a possible issue in [coreroller/coreroller](https://www.github.com/coreroller/coreroller) at [backend/vendor/src/github.com/pmylund/go-cache/sharded_test.go](https://github.com/coreroller/coreroller/blob/1a1f412434584d0569f408bffa059ead21e28d9d/backend/vendor/src/github.com/pmylund/go-cache/sharded_test.go#L58-L65) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > range-loop variable v used in defer or goroutine at line 61 [Click here to see the code in its original context.](https://github.com/coreroller/coreroller/blob/1a1f412434584d0569f408bffa059ead21e28d9d/backend/vendor/src/github.com/pmylund/go-cache/sharded_test.go#L58-L65) <details> <summary>Click here to show the 8 line(s) of Go which triggered the analyzer.</summary> ```go for _, v := range keys { go func() { for j := 0; j < each; j++ { tsc.Get(v) } wg.Done() }() } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 1a1f412434584d0569f408bffa059ead21e28d9d
non_build
coreroller coreroller backend vendor src github com pmylund go cache sharded test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable v used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for v range keys go func for j j each j tsc get v wg done leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
0
118,369
11,967,570,293
IssuesEvent
2020-04-06 06:59:19
RedHatInsights/insights-results-aggregator
https://api.github.com/repos/RedHatInsights/insights-results-aggregator
opened
Metrics endpoint is not described in OpenAPI specification
bug documentation
Metrics endpoint is not described in OpenAPI specification
1.0
Metrics endpoint is not described in OpenAPI specification - Metrics endpoint is not described in OpenAPI specification
non_build
metrics endpoint is not described in openapi specification metrics endpoint is not described in openapi specification
0
356,773
25,176,257,994
IssuesEvent
2022-11-11 09:31:34
JordanChua/pe
https://api.github.com/repos/JordanChua/pe
opened
Explanation for predicting a Student's grade too complex
severity.Low type.DocumentationBug
The whole calculation for the prediction of a student's grade is very complex, and is visually hard for the user to imagine as well since there are several terms here: "learning rating" , "difficulty bonus" "normalised scores", which are all concepts It would be great if there was a solid example to show the actual calculations for a certain student, along with pictures of previous assessments grades and show the whole process of how the predicted grade is calculated so the user can understand the whole flow. ![image.png](https://raw.githubusercontent.com/JordanChua/pe/main/files/733b3af3-e6f5-4cd1-821c-0cc7ac2017da.png) <!--session: 1668153175057-c258eb52-23bd-4e38-9ba0-7445348ca312--> <!--Version: Web v3.4.4-->
1.0
Explanation for predicting a Student's grade too complex - The whole calculation for the prediction of a student's grade is very complex, and is visually hard for the user to imagine as well since there are several terms here: "learning rating" , "difficulty bonus" "normalised scores", which are all concepts It would be great if there was a solid example to show the actual calculations for a certain student, along with pictures of previous assessments grades and show the whole process of how the predicted grade is calculated so the user can understand the whole flow. ![image.png](https://raw.githubusercontent.com/JordanChua/pe/main/files/733b3af3-e6f5-4cd1-821c-0cc7ac2017da.png) <!--session: 1668153175057-c258eb52-23bd-4e38-9ba0-7445348ca312--> <!--Version: Web v3.4.4-->
non_build
explanation for predicting a student s grade too complex the whole calculation for the prediction of a student s grade is very complex and is visually hard for the user to imagine as well since there are several terms here learning rating difficulty bonus normalised scores which are all concepts it would be great if there was a solid example to show the actual calculations for a certain student along with pictures of previous assessments grades and show the whole process of how the predicted grade is calculated so the user can understand the whole flow
0
15,756
10,345,674,933
IssuesEvent
2019-09-04 13:55:18
Azure/azure-sdk-for-net
https://api.github.com/repos/Azure/azure-sdk-for-net
opened
[Track One] Tests for Service Fabric Processor error handling are unstable for CI and nightly runs
Client Event Hubs Service Attention
# Summary A recent change introduced some new timing elements to help the exception tests for the `ServiceFabricProcessor` to be more stable and provide more consistent results during nightly runs. Since those changes were introduced, there the tests have proven to be non-deterministic during CI and nightly test runs, failing intermittently but frequently. It is not the intention to imply that the changes were the root cause of instability, but only to help narrow down the timing; these failures may be unrelated. An example failure can be found in the [nightly results fromSeptember 3](https://dev.azure.com/azure-sdk/internal/_build/results?buildId=101902&view=ms.vss-test-web.build-test-results-tab&runId=4070760&resultId=101118&paneView=debug). The failure looks like: ![image](https://user-images.githubusercontent.com/913445/64260878-a01dd900-cef9-11e9-9770-3da54da94dd2.png) Another example failure can be found in [PPR #7467](https://github.com/Azure/azure-sdk-for-net/pull/7467/checks?check_run_id=211231449). The failure looks like: ![image](https://user-images.githubusercontent.com/913445/64261011-d52a2b80-cef9-11e9-8a93-76b0ec71a8e6.png) # Scope of Work - Analyze the tests to understand the point of non-determinism and decide on an approach to stabilize for CI and nightly runs. - Refactor the tests as needed to ensure that the results are deterministic and consistently pass. # Out of Scope - Changes or refactoring in other areas. # Success Criteria - The suite of tests for the `ServiceFabricProcessor` no longer causes intermittent, but frequent, failures during nightly runs. - Any changes did not cause regressions elsewhere; CI and nightly test runs pass consistently and reliably with deterministic results.
1.0
[Track One] Tests for Service Fabric Processor error handling are unstable for CI and nightly runs - # Summary A recent change introduced some new timing elements to help the exception tests for the `ServiceFabricProcessor` to be more stable and provide more consistent results during nightly runs. Since those changes were introduced, there the tests have proven to be non-deterministic during CI and nightly test runs, failing intermittently but frequently. It is not the intention to imply that the changes were the root cause of instability, but only to help narrow down the timing; these failures may be unrelated. An example failure can be found in the [nightly results fromSeptember 3](https://dev.azure.com/azure-sdk/internal/_build/results?buildId=101902&view=ms.vss-test-web.build-test-results-tab&runId=4070760&resultId=101118&paneView=debug). The failure looks like: ![image](https://user-images.githubusercontent.com/913445/64260878-a01dd900-cef9-11e9-9770-3da54da94dd2.png) Another example failure can be found in [PPR #7467](https://github.com/Azure/azure-sdk-for-net/pull/7467/checks?check_run_id=211231449). The failure looks like: ![image](https://user-images.githubusercontent.com/913445/64261011-d52a2b80-cef9-11e9-8a93-76b0ec71a8e6.png) # Scope of Work - Analyze the tests to understand the point of non-determinism and decide on an approach to stabilize for CI and nightly runs. - Refactor the tests as needed to ensure that the results are deterministic and consistently pass. # Out of Scope - Changes or refactoring in other areas. # Success Criteria - The suite of tests for the `ServiceFabricProcessor` no longer causes intermittent, but frequent, failures during nightly runs. - Any changes did not cause regressions elsewhere; CI and nightly test runs pass consistently and reliably with deterministic results.
non_build
tests for service fabric processor error handling are unstable for ci and nightly runs summary a recent change introduced some new timing elements to help the exception tests for the servicefabricprocessor to be more stable and provide more consistent results during nightly runs since those changes were introduced there the tests have proven to be non deterministic during ci and nightly test runs failing intermittently but frequently it is not the intention to imply that the changes were the root cause of instability but only to help narrow down the timing these failures may be unrelated an example failure can be found in the the failure looks like another example failure can be found in the failure looks like scope of work analyze the tests to understand the point of non determinism and decide on an approach to stabilize for ci and nightly runs refactor the tests as needed to ensure that the results are deterministic and consistently pass out of scope changes or refactoring in other areas success criteria the suite of tests for the servicefabricprocessor no longer causes intermittent but frequent failures during nightly runs any changes did not cause regressions elsewhere ci and nightly test runs pass consistently and reliably with deterministic results
0
95,764
27,612,639,738
IssuesEvent
2023-03-09 16:59:12
golang/go
https://api.github.com/repos/golang/go
closed
x/build: aix-ppc64 builder was missing
Builders NeedsInvestigation
From https://farmer.golang.org/#pools: ``` host-aix-ppc64-osuosl: 0/0 (1 missing) ``` /cc @trex58 Are you able to look into this? Thank you. /cc @toothrot @cagedmantis @andybons
1.0
x/build: aix-ppc64 builder was missing - From https://farmer.golang.org/#pools: ``` host-aix-ppc64-osuosl: 0/0 (1 missing) ``` /cc @trex58 Are you able to look into this? Thank you. /cc @toothrot @cagedmantis @andybons
build
x build aix builder was missing from host aix osuosl missing cc are you able to look into this thank you cc toothrot cagedmantis andybons
1
102,314
31,886,169,737
IssuesEvent
2023-09-17 00:38:40
habitat-sh/builder
https://api.github.com/repos/habitat-sh/builder
closed
[builder-worker] Clean up `job.root()/dockerd` directory on teardown.
Focus:Builder Type:Chore Platform:Linux Stale
@fnichol commented on [Tue Oct 03 2017](https://github.com/habitat-sh/habitat/issues/3484) Currently the Docker Engine's data directories aren't cleaned up when the job is torn down. It shouldn't be big at the moment as the exporter cleans up its image layers. --- @fnichol commented on [Wed Oct 04 2017](https://github.com/habitat-sh/habitat/issues/3484#issuecomment-334201547) This is non-critical and only applies to hosted Builder. We'll fix this shortly, but doesn't impact exporting Docker images.
1.0
[builder-worker] Clean up `job.root()/dockerd` directory on teardown. - @fnichol commented on [Tue Oct 03 2017](https://github.com/habitat-sh/habitat/issues/3484) Currently the Docker Engine's data directories aren't cleaned up when the job is torn down. It shouldn't be big at the moment as the exporter cleans up its image layers. --- @fnichol commented on [Wed Oct 04 2017](https://github.com/habitat-sh/habitat/issues/3484#issuecomment-334201547) This is non-critical and only applies to hosted Builder. We'll fix this shortly, but doesn't impact exporting Docker images.
build
clean up job root dockerd directory on teardown fnichol commented on currently the docker engine s data directories aren t cleaned up when the job is torn down it shouldn t be big at the moment as the exporter cleans up its image layers fnichol commented on this is non critical and only applies to hosted builder we ll fix this shortly but doesn t impact exporting docker images
1
76,205
21,220,628,553
IssuesEvent
2022-04-11 11:35:17
appsmithorg/appsmith
https://api.github.com/repos/appsmithorg/appsmith
closed
[Bug]: Select widget shows blank spaces
Bug Select Widget App Viewers Pod High UI Building Pod regression Needs Triaging
### Is there an existing issue for this? - [X] I have searched the existing issues ### Description Select widget shows blank spaces when more options are added. https://loom.com/share/f7b8333ca2da4fae84fed6574dfea635 ![image](https://user-images.githubusercontent.com/92293815/162132717-635ea4cb-d5ae-4bda-96f8-71e587e42f53.png) ### Steps To Reproduce 1. Drag and drop select widget 2. Add more values at the end of the default values given ```[ { "label": "Blue", "value": "BLUE" }, { "label": "Green", "value": "GREEN" }, { "label": "Red", "value": "RED" }, { "label": "Orange", "value": "ORAN" }, { "label": "Oe", "value": "ON" } ]``` 3. Click on the drop down and select the last option ### Public Sample App _No response_ ### Version cloud
1.0
[Bug]: Select widget shows blank spaces - ### Is there an existing issue for this? - [X] I have searched the existing issues ### Description Select widget shows blank spaces when more options are added. https://loom.com/share/f7b8333ca2da4fae84fed6574dfea635 ![image](https://user-images.githubusercontent.com/92293815/162132717-635ea4cb-d5ae-4bda-96f8-71e587e42f53.png) ### Steps To Reproduce 1. Drag and drop select widget 2. Add more values at the end of the default values given ```[ { "label": "Blue", "value": "BLUE" }, { "label": "Green", "value": "GREEN" }, { "label": "Red", "value": "RED" }, { "label": "Orange", "value": "ORAN" }, { "label": "Oe", "value": "ON" } ]``` 3. Click on the drop down and select the last option ### Public Sample App _No response_ ### Version cloud
build
select widget shows blank spaces is there an existing issue for this i have searched the existing issues description select widget shows blank spaces when more options are added steps to reproduce drag and drop select widget add more values at the end of the default values given label blue value blue label green value green label red value red label orange value oran label oe value on click on the drop down and select the last option public sample app no response version cloud
1
586
2,533,407,399
IssuesEvent
2015-01-23 23:06:31
biocore/emperor
https://api.github.com/repos/biocore/emperor
closed
Remove chrome warning from website
documentation easy-fix
Chrome for the mac is now 64-bit, see: https://code.google.com/p/chromium/issues/detail?id=18323 ---- This is an easy one if someone wants to take it.
1.0
Remove chrome warning from website - Chrome for the mac is now 64-bit, see: https://code.google.com/p/chromium/issues/detail?id=18323 ---- This is an easy one if someone wants to take it.
non_build
remove chrome warning from website chrome for the mac is now bit see this is an easy one if someone wants to take it
0
12,857
21,016,880,185
IssuesEvent
2022-03-30 11:52:51
renovatebot/renovate
https://api.github.com/repos/renovatebot/renovate
closed
Problem updating private Docker registry implicit library/ image
priority-4-low manager:dockerfile status:requirements
### Which Renovate are you using? Renovate docker image 19.156.0 ### Which platform are you using? GitLab self-hosted ### Have you checked the logs? Don't forget to include them if relevant ``` WARN: Error getting docker image tags (repository=backend/myproject) "registry": "https://artifactoryDomain", "dockerRepository": "backend-docker/openjdk", "err": { "name": "HTTPError", "host": "artifactoryDomain", "hostname": "artifactoryDomain", "method": "GET", "path": "/v2/backend-docker/openjdk/tags/list?n=10000", "protocol": "https:", "url": "https://artifactoryDomain/v2/backend-docker/openjdk/tags/list?n=10000", "gotOptions": { "path": "/v2/backend-docker/openjdk/tags/list?n=10000", "protocol": "https:", "slashes": true, "auth": null, "host": "artifactoryDomain", "port": null, "hostname": "artifactoryDomain", "hash": null, "search": "?n=10000", "pathname": "/v2/backend-docker/openjdk/tags/list", "href": "https://artifactoryDomain/v2/backend-docker/openjdk/tags/list?n=10000", "headers": { "user-agent": "https://github.com/renovatebot/renovate", "authorization": "** redacted **", "accept": "application/json", "accept-encoding": "gzip, deflate" }, "hooks": { "beforeError": [], "init": [], "beforeRequest": [], "beforeRedirect": [], "beforeRetry": [], "afterResponse": [] }, "retry": {"methods": {}, "statusCodes": {}, "errorCodes": {}}, "decompress": true, "throwHttpErrors": true, "followRedirect": true, "stream": false, "form": false, "json": true, "cache": false, "useElectronNet": false, "method": "GET", "gotTimeout": {"request": 60000} }, "statusCode": 404, "statusMessage": "Not Found", "headers": { "server": "nginx/1.17.7", "date": "Mon, 30 Mar 2020 13:22:58 GMT", "content-type": "application/json", "transfer-encoding": "chunked", "connection": "close", "vary": "Accept-Encoding", "x-artifactory-id": "qwertz", "docker-distribution-api-version": "registry/2.0", "strict-transport-security": "max-age=15724800; includeSubDomains", "content-encoding": "gzip" }, "body": { "errors": [ { "code": "NAME_UNKNOWN", "message": "Repository name not known to registry.", "detail": {"name": "openjdk"} } ] }, "message": "Response code 404 (Not Found)", "stack": "HTTPError: Response code 404 (Not Found)\n at EventEmitter.emitter.on (/usr/local/lib/node_modules/renovate/node_modules/got/source/as-promise.js:74:19)\n at <anonymous>\n at process._tickCallback (internal/process/next_tick.js:189:7)" } ``` ### What would you like to do? The gitlab project `myproject` in gitlab group backend contains a Dockerfile. This file is being scanned, but the tags cannot be resolved since the URL is not built properly. It should read https://artifactoryDomain/v2/backend-docker/library/openjdk/tags/list?n=10000 instead of https://artifactoryDomain/v2/backend-docker/openjdk/tags/list?n=10000, since our artifactory has grouped repositories. Is there any configuration option to fix this? I already tried ``` "hostRules": [ { "hostType": "docker", "baseUrl": "https://artifactoryDomain/backend-docker" } ], ``` but got the same result. For me it seems like https://github.com/renovatebot/renovate/blob/master/lib/datasource/docker/index.ts#L372 is a poor condition in this case, because the repository reads `backend-docker/openjdk`. The Dockerfile starts with `FROM artifactoryDomain/backend-docker/openjdk:12-alpine `
1.0
Problem updating private Docker registry implicit library/ image - ### Which Renovate are you using? Renovate docker image 19.156.0 ### Which platform are you using? GitLab self-hosted ### Have you checked the logs? Don't forget to include them if relevant ``` WARN: Error getting docker image tags (repository=backend/myproject) "registry": "https://artifactoryDomain", "dockerRepository": "backend-docker/openjdk", "err": { "name": "HTTPError", "host": "artifactoryDomain", "hostname": "artifactoryDomain", "method": "GET", "path": "/v2/backend-docker/openjdk/tags/list?n=10000", "protocol": "https:", "url": "https://artifactoryDomain/v2/backend-docker/openjdk/tags/list?n=10000", "gotOptions": { "path": "/v2/backend-docker/openjdk/tags/list?n=10000", "protocol": "https:", "slashes": true, "auth": null, "host": "artifactoryDomain", "port": null, "hostname": "artifactoryDomain", "hash": null, "search": "?n=10000", "pathname": "/v2/backend-docker/openjdk/tags/list", "href": "https://artifactoryDomain/v2/backend-docker/openjdk/tags/list?n=10000", "headers": { "user-agent": "https://github.com/renovatebot/renovate", "authorization": "** redacted **", "accept": "application/json", "accept-encoding": "gzip, deflate" }, "hooks": { "beforeError": [], "init": [], "beforeRequest": [], "beforeRedirect": [], "beforeRetry": [], "afterResponse": [] }, "retry": {"methods": {}, "statusCodes": {}, "errorCodes": {}}, "decompress": true, "throwHttpErrors": true, "followRedirect": true, "stream": false, "form": false, "json": true, "cache": false, "useElectronNet": false, "method": "GET", "gotTimeout": {"request": 60000} }, "statusCode": 404, "statusMessage": "Not Found", "headers": { "server": "nginx/1.17.7", "date": "Mon, 30 Mar 2020 13:22:58 GMT", "content-type": "application/json", "transfer-encoding": "chunked", "connection": "close", "vary": "Accept-Encoding", "x-artifactory-id": "qwertz", "docker-distribution-api-version": "registry/2.0", "strict-transport-security": "max-age=15724800; includeSubDomains", "content-encoding": "gzip" }, "body": { "errors": [ { "code": "NAME_UNKNOWN", "message": "Repository name not known to registry.", "detail": {"name": "openjdk"} } ] }, "message": "Response code 404 (Not Found)", "stack": "HTTPError: Response code 404 (Not Found)\n at EventEmitter.emitter.on (/usr/local/lib/node_modules/renovate/node_modules/got/source/as-promise.js:74:19)\n at <anonymous>\n at process._tickCallback (internal/process/next_tick.js:189:7)" } ``` ### What would you like to do? The gitlab project `myproject` in gitlab group backend contains a Dockerfile. This file is being scanned, but the tags cannot be resolved since the URL is not built properly. It should read https://artifactoryDomain/v2/backend-docker/library/openjdk/tags/list?n=10000 instead of https://artifactoryDomain/v2/backend-docker/openjdk/tags/list?n=10000, since our artifactory has grouped repositories. Is there any configuration option to fix this? I already tried ``` "hostRules": [ { "hostType": "docker", "baseUrl": "https://artifactoryDomain/backend-docker" } ], ``` but got the same result. For me it seems like https://github.com/renovatebot/renovate/blob/master/lib/datasource/docker/index.ts#L372 is a poor condition in this case, because the repository reads `backend-docker/openjdk`. The Dockerfile starts with `FROM artifactoryDomain/backend-docker/openjdk:12-alpine `
non_build
problem updating private docker registry implicit library image which renovate are you using renovate docker image which platform are you using gitlab self hosted have you checked the logs don t forget to include them if relevant warn error getting docker image tags repository backend myproject registry dockerrepository backend docker openjdk err name httperror host artifactorydomain hostname artifactorydomain method get path backend docker openjdk tags list n protocol https url gotoptions path backend docker openjdk tags list n protocol https slashes true auth null host artifactorydomain port null hostname artifactorydomain hash null search n pathname backend docker openjdk tags list href headers user agent authorization redacted accept application json accept encoding gzip deflate hooks beforeerror init beforerequest beforeredirect beforeretry afterresponse retry methods statuscodes errorcodes decompress true throwhttperrors true followredirect true stream false form false json true cache false useelectronnet false method get gottimeout request statuscode statusmessage not found headers server nginx date mon mar gmt content type application json transfer encoding chunked connection close vary accept encoding x artifactory id qwertz docker distribution api version registry strict transport security max age includesubdomains content encoding gzip body errors code name unknown message repository name not known to registry detail name openjdk message response code not found stack httperror response code not found n at eventemitter emitter on usr local lib node modules renovate node modules got source as promise js n at n at process tickcallback internal process next tick js what would you like to do the gitlab project myproject in gitlab group backend contains a dockerfile this file is being scanned but the tags cannot be resolved since the url is not built properly it should read instead of since our artifactory has grouped repositories is there any configuration option to fix this i already tried hostrules hosttype docker baseurl but got the same result for me it seems like is a poor condition in this case because the repository reads backend docker openjdk the dockerfile starts with from artifactorydomain backend docker openjdk alpine
0
104,174
11,394,886,375
IssuesEvent
2020-01-30 10:16:04
wso2/product-apim
https://api.github.com/repos/wso2/product-apim
closed
[APIM300]No doc can be found for APIM300 Deployment patterns/clustering
4.0.0 Priority/Highest Severity/Blocker To-do documentation-required
**Description:** Please provide a document for APIM300 deployment patterns/clustering. **Suggested Labels:** APIM300 **Affected Product Version:** APIM300
1.0
[APIM300]No doc can be found for APIM300 Deployment patterns/clustering - **Description:** Please provide a document for APIM300 deployment patterns/clustering. **Suggested Labels:** APIM300 **Affected Product Version:** APIM300
non_build
no doc can be found for deployment patterns clustering description please provide a document for deployment patterns clustering suggested labels affected product version
0
215,348
16,666,906,692
IssuesEvent
2021-06-07 05:57:43
AtlasOfLivingAustralia/la-pipelines
https://api.github.com/repos/AtlasOfLivingAustralia/la-pipelines
closed
AVH testing: wildcard search on collector not working
bug testing-findings
Compare: https://avh.ala.org.au/occurrences/search?q=collector_text%3AKlazenga and https://avh-test.ala.org.au/occurrences/search?q=collector_text%3AKlazenga. This is accessed in AVH through the 'Advanced search'.
1.0
AVH testing: wildcard search on collector not working - Compare: https://avh.ala.org.au/occurrences/search?q=collector_text%3AKlazenga and https://avh-test.ala.org.au/occurrences/search?q=collector_text%3AKlazenga. This is accessed in AVH through the 'Advanced search'.
non_build
avh testing wildcard search on collector not working compare and this is accessed in avh through the advanced search
0
31,981
8,784,569,272
IssuesEvent
2018-12-20 10:13:58
conan-io/docs
https://api.github.com/repos/conan-io/docs
closed
CMAKE_BUILD_TYPE is not set by Conan on Windows/Visual
complex: low component: build priority: low stage: review type: feature
Coming from conan-io/conan#2488 Update the docs to mention that `CMAKE_BUILD_TYPE` is not set by Conan on Windows/Visual Studio. CMake helper is not setting `CMAKE_BUILD_TYPE` automatically in the configure step. However, it is doing so in the build step with the `--config` flag. This is because of CMake helper can be used together with `cmake_multi` generator too to allow having both release a debug build types.
1.0
CMAKE_BUILD_TYPE is not set by Conan on Windows/Visual - Coming from conan-io/conan#2488 Update the docs to mention that `CMAKE_BUILD_TYPE` is not set by Conan on Windows/Visual Studio. CMake helper is not setting `CMAKE_BUILD_TYPE` automatically in the configure step. However, it is doing so in the build step with the `--config` flag. This is because of CMake helper can be used together with `cmake_multi` generator too to allow having both release a debug build types.
build
cmake build type is not set by conan on windows visual coming from conan io conan update the docs to mention that cmake build type is not set by conan on windows visual studio cmake helper is not setting cmake build type automatically in the configure step however it is doing so in the build step with the config flag this is because of cmake helper can be used together with cmake multi generator too to allow having both release a debug build types
1
81,668
23,527,679,593
IssuesEvent
2022-08-19 12:32:14
camunda/zeebe
https://api.github.com/repos/camunda/zeebe
closed
Can riscv architecture be supported ?
kind/bug kind/question area/build
**Under the riscv architecture, after compiling zeebe with maven and making it into a docker image, an error is reported when running, Ask Daniel to help analyze,thanks** **dockerfile** ```` ARG APP_ENV=prod # Building builder image FROM riscv64/ubuntu:20.04 as builder ARG DISTBALL ENV TMP_ARCHIVE=/tmp/zeebe.tar.gz \ TMP_DIR=/tmp/zeebe \ TINI_VERSION=v0.19.0 COPY ${DISTBALL} ${TMP_ARCHIVE} RUN mkdir -p ${TMP_DIR} && \ tar xfvz ${TMP_ARCHIVE} --strip 1 -C ${TMP_DIR} && \ # already create volume dir to later have correct rights mkdir ${TMP_DIR}/data RUN apt-get update && \ apt-get install tini && \ cp /usr/bin/tini ${TMP_DIR}/bin/tini #ADD https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini ${TMP_DIR}/bin/tini COPY docker/utils/startup.sh ${TMP_DIR}/bin/startup.sh RUN chmod +x -R ${TMP_DIR}/bin/ RUN chmod 0775 ${TMP_DIR} ${TMP_DIR}/data #RUN apt-get update && \ # apt-get install tini && \ # cp /usr/bin/tini ${TMP_DIR}/bin/tini # Building prod image FROM riscv64/openjdk:17 as prod # Building dev image FROM riscv64/openjdk:17 as dev RUN echo "running DEV pre-install commands" RUN apt-get update RUN apt install curl -y #RUN curl -sSL https://github.com/jvm-profiling-tools/async-profiler/releases/download/v1.7.1/async-profiler-1.7.1-linux-x64.tar.gz | tar xzv # Building application image FROM ${APP_ENV} as app ENV ZB_HOME=/usr/local/zeebe \ ZEEBE_BROKER_GATEWAY_NETWORK_HOST=0.0.0.0 \ ZEEBE_STANDALONE_GATEWAY=false ENV PATH "${ZB_HOME}/bin:${PATH}" WORKDIR ${ZB_HOME} EXPOSE 26500 26501 26502 VOLUME ${ZB_HOME}/data RUN groupadd -g 1000 zeebe && \ adduser -u 1000 zeebe --system --ingroup zeebe && \ chmod g=u /etc/passwd && \ chown 1000:0 ${ZB_HOME} && \ chmod 0775 ${ZB_HOME} COPY --from=builder --chown=1000:0 /tmp/zeebe/bin/startup.sh /usr/local/bin/startup.sh COPY --from=builder --chown=1000:0 /tmp/zeebe ${ZB_HOME} ENTRYPOINT ["tini", "--", "/usr/local/bin/startup.sh"] ``` ` ### the error is as follows: `Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled. 2022-08-05 06:26:39.008 [] [main] ERROR org.springframework.boot.SpringApplication - Application run failed java.lang.IllegalStateException: Failed to execute CommandLineRunner at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:770) [spring-boot-2.6.1.jar:2.6.1] at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:751) [spring-boot-2.6.1.jar:2.6.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:309) [spring-boot-2.6.1.jar:2.6.1] at io.camunda.zeebe.broker.StandaloneBroker.main(StandaloneBroker.java:79) [camunda-cloud-zeebe-1.3.5.jar:1.3.5] Caused by: io.camunda.zeebe.util.exception.UncheckedExecutionException: Failed to start broker at io.camunda.zeebe.broker.Broker.internalStart(Broker.java:110) ~[zeebe-broker-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.LogUtil.doWithMDC(LogUtil.java:23) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.Broker.start(Broker.java:83) ~[zeebe-broker-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.StandaloneBroker.run(StandaloneBroker.java:93) ~[camunda-cloud-zeebe-1.3.5.jar:1.3.5] at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:767) ~[spring-boot-2.6.1.jar:2.6.1] ... 3 more Caused by: java.util.concurrent.ExecutionException: Startup failed in the following steps: [Partition Manager]. See suppressed exceptions for details. at io.camunda.zeebe.util.sched.future.CompletableActorFuture.get(CompletableActorFuture.java:141) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.future.CompletableActorFuture.get(CompletableActorFuture.java:108) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.FutureUtil.join(FutureUtil.java:21) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.future.CompletableActorFuture.join(CompletableActorFuture.java:196) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.Broker.internalStart(Broker.java:101) ~[zeebe-broker-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.LogUtil.doWithMDC(LogUtil.java:23) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.Broker.start(Broker.java:83) ~[zeebe-broker-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.StandaloneBroker.run(StandaloneBroker.java:93) ~[camunda-cloud-zeebe-1.3.5.jar:1.3.5] at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:767) ~[spring-boot-2.6.1.jar:2.6.1] ... 3 more Caused by: io.camunda.zeebe.util.startup.StartupProcessException: Startup failed in the following steps: [Partition Manager]. See suppressed exceptions for details. at io.camunda.zeebe.util.startup.StartupProcess.aggregateExceptionsSynchronized(StartupProcess.java:282) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.startup.StartupProcess.completeStartupFutureExceptionallySynchronized(StartupProcess.java:183) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.startup.StartupProcess.lambda$proceedWithStartupSynchronized$3(StartupProcess.java:167) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.future.FutureContinuationRunnable.run(FutureContinuationRunnable.java:33) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorJob.invoke(ActorJob.java:82) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorJob.execute(ActorJob.java:44) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorTask.execute(ActorTask.java:122) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.executeCurrentTask(ActorThread.java:97) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.doWork(ActorThread.java:80) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.run(ActorThread.java:189) ~[zeebe-util-1.3.5.jar:1.3.5] Suppressed: io.camunda.zeebe.util.startup.StartupProcessStepException: Bootstrap step Partition Manager failed at io.camunda.zeebe.util.startup.StartupProcess.completeStartupFutureExceptionallySynchronized(StartupProcess.java:185) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.startup.StartupProcess.lambda$proceedWithStartupSynchronized$3(StartupProcess.java:167) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.future.FutureContinuationRunnable.run(FutureContinuationRunnable.java:33) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorJob.invoke(ActorJob.java:82) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorJob.execute(ActorJob.java:44) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorTask.execute(ActorTask.java:122) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.executeCurrentTask(ActorThread.java:97) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.doWork(ActorThread.java:80) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.run(ActorThread.java:189) ~[zeebe-util-1.3.5.jar:1.3.5] Caused by: java.util.concurrent.CompletionException: java.lang.UnsatisfiedLinkError: /tmp/librocksdbjni12637028452020772129.so: /tmp/librocksdbjni12637028452020772129.so: cannot open shared object file: No such file or directory (Possible cause: can't load AMD 64 .so on a RISC-V platform) at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315) ~[?:?] at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320) ~[?:?] at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:649) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.utils.concurrent.AtomixFuture.lambda$wrap$0(AtomixFuture.java:50) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.utils.concurrent.AtomixFuture.lambda$wrap$0(AtomixFuture.java:50) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.utils.concurrent.AtomixFuture.lambda$wrap$0(AtomixFuture.java:50) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.utils.concurrent.AtomixFuture.lambda$wrap$0(AtomixFuture.java:50) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.raft.impl.DefaultRaftServer.lambda$start$5(DefaultRaftServer.java:211) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at io.atomix.raft.impl.RaftContext.awaitState(RaftContext.java:338) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at io.atomix.raft.impl.DefaultRaftServer.lambda$start$6(DefaultRaftServer.java:207) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.raft.cluster.impl.RaftClusterContext.completeBootstrapFuture(RaftClusterContext.java:275) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at io.atomix.raft.cluster.impl.RaftClusterContext.lambda$bootstrap$0(RaftClusterContext.java:135) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at io.atomix.utils.concurrent.SingleThreadContext$WrappedRunnable.run(SingleThreadContext.java:171) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) ~[?:?] at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?] at java.lang.Thread.run(Thread.java:833) ~[?:?] `
1.0
Can riscv architecture be supported ? - **Under the riscv architecture, after compiling zeebe with maven and making it into a docker image, an error is reported when running, Ask Daniel to help analyze,thanks** **dockerfile** ```` ARG APP_ENV=prod # Building builder image FROM riscv64/ubuntu:20.04 as builder ARG DISTBALL ENV TMP_ARCHIVE=/tmp/zeebe.tar.gz \ TMP_DIR=/tmp/zeebe \ TINI_VERSION=v0.19.0 COPY ${DISTBALL} ${TMP_ARCHIVE} RUN mkdir -p ${TMP_DIR} && \ tar xfvz ${TMP_ARCHIVE} --strip 1 -C ${TMP_DIR} && \ # already create volume dir to later have correct rights mkdir ${TMP_DIR}/data RUN apt-get update && \ apt-get install tini && \ cp /usr/bin/tini ${TMP_DIR}/bin/tini #ADD https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini ${TMP_DIR}/bin/tini COPY docker/utils/startup.sh ${TMP_DIR}/bin/startup.sh RUN chmod +x -R ${TMP_DIR}/bin/ RUN chmod 0775 ${TMP_DIR} ${TMP_DIR}/data #RUN apt-get update && \ # apt-get install tini && \ # cp /usr/bin/tini ${TMP_DIR}/bin/tini # Building prod image FROM riscv64/openjdk:17 as prod # Building dev image FROM riscv64/openjdk:17 as dev RUN echo "running DEV pre-install commands" RUN apt-get update RUN apt install curl -y #RUN curl -sSL https://github.com/jvm-profiling-tools/async-profiler/releases/download/v1.7.1/async-profiler-1.7.1-linux-x64.tar.gz | tar xzv # Building application image FROM ${APP_ENV} as app ENV ZB_HOME=/usr/local/zeebe \ ZEEBE_BROKER_GATEWAY_NETWORK_HOST=0.0.0.0 \ ZEEBE_STANDALONE_GATEWAY=false ENV PATH "${ZB_HOME}/bin:${PATH}" WORKDIR ${ZB_HOME} EXPOSE 26500 26501 26502 VOLUME ${ZB_HOME}/data RUN groupadd -g 1000 zeebe && \ adduser -u 1000 zeebe --system --ingroup zeebe && \ chmod g=u /etc/passwd && \ chown 1000:0 ${ZB_HOME} && \ chmod 0775 ${ZB_HOME} COPY --from=builder --chown=1000:0 /tmp/zeebe/bin/startup.sh /usr/local/bin/startup.sh COPY --from=builder --chown=1000:0 /tmp/zeebe ${ZB_HOME} ENTRYPOINT ["tini", "--", "/usr/local/bin/startup.sh"] ``` ` ### the error is as follows: `Error 
starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled. 2022-08-05 06:26:39.008 [] [main] ERROR org.springframework.boot.SpringApplication - Application run failed java.lang.IllegalStateException: Failed to execute CommandLineRunner at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:770) [spring-boot-2.6.1.jar:2.6.1] at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:751) [spring-boot-2.6.1.jar:2.6.1] at org.springframework.boot.SpringApplication.run(SpringApplication.java:309) [spring-boot-2.6.1.jar:2.6.1] at io.camunda.zeebe.broker.StandaloneBroker.main(StandaloneBroker.java:79) [camunda-cloud-zeebe-1.3.5.jar:1.3.5] Caused by: io.camunda.zeebe.util.exception.UncheckedExecutionException: Failed to start broker at io.camunda.zeebe.broker.Broker.internalStart(Broker.java:110) ~[zeebe-broker-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.LogUtil.doWithMDC(LogUtil.java:23) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.Broker.start(Broker.java:83) ~[zeebe-broker-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.StandaloneBroker.run(StandaloneBroker.java:93) ~[camunda-cloud-zeebe-1.3.5.jar:1.3.5] at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:767) ~[spring-boot-2.6.1.jar:2.6.1] ... 3 more Caused by: java.util.concurrent.ExecutionException: Startup failed in the following steps: [Partition Manager]. See suppressed exceptions for details. 
at io.camunda.zeebe.util.sched.future.CompletableActorFuture.get(CompletableActorFuture.java:141) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.future.CompletableActorFuture.get(CompletableActorFuture.java:108) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.FutureUtil.join(FutureUtil.java:21) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.future.CompletableActorFuture.join(CompletableActorFuture.java:196) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.Broker.internalStart(Broker.java:101) ~[zeebe-broker-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.LogUtil.doWithMDC(LogUtil.java:23) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.Broker.start(Broker.java:83) ~[zeebe-broker-1.3.5.jar:1.3.5] at io.camunda.zeebe.broker.StandaloneBroker.run(StandaloneBroker.java:93) ~[camunda-cloud-zeebe-1.3.5.jar:1.3.5] at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:767) ~[spring-boot-2.6.1.jar:2.6.1] ... 3 more Caused by: io.camunda.zeebe.util.startup.StartupProcessException: Startup failed in the following steps: [Partition Manager]. See suppressed exceptions for details. 
at io.camunda.zeebe.util.startup.StartupProcess.aggregateExceptionsSynchronized(StartupProcess.java:282) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.startup.StartupProcess.completeStartupFutureExceptionallySynchronized(StartupProcess.java:183) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.startup.StartupProcess.lambda$proceedWithStartupSynchronized$3(StartupProcess.java:167) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.future.FutureContinuationRunnable.run(FutureContinuationRunnable.java:33) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorJob.invoke(ActorJob.java:82) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorJob.execute(ActorJob.java:44) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorTask.execute(ActorTask.java:122) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.executeCurrentTask(ActorThread.java:97) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.doWork(ActorThread.java:80) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.run(ActorThread.java:189) ~[zeebe-util-1.3.5.jar:1.3.5] Suppressed: io.camunda.zeebe.util.startup.StartupProcessStepException: Bootstrap step Partition Manager failed at io.camunda.zeebe.util.startup.StartupProcess.completeStartupFutureExceptionallySynchronized(StartupProcess.java:185) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.startup.StartupProcess.lambda$proceedWithStartupSynchronized$3(StartupProcess.java:167) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.future.FutureContinuationRunnable.run(FutureContinuationRunnable.java:33) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorJob.invoke(ActorJob.java:82) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorJob.execute(ActorJob.java:44) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorTask.execute(ActorTask.java:122) ~[zeebe-util-1.3.5.jar:1.3.5] at 
io.camunda.zeebe.util.sched.ActorThread.executeCurrentTask(ActorThread.java:97) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.doWork(ActorThread.java:80) ~[zeebe-util-1.3.5.jar:1.3.5] at io.camunda.zeebe.util.sched.ActorThread.run(ActorThread.java:189) ~[zeebe-util-1.3.5.jar:1.3.5] Caused by: java.util.concurrent.CompletionException: java.lang.UnsatisfiedLinkError: /tmp/librocksdbjni12637028452020772129.so: /tmp/librocksdbjni12637028452020772129.so: cannot open shared object file: No such file or directory (Possible cause: can't load AMD 64 .so on a RISC-V platform) at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315) ~[?:?] at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320) ~[?:?] at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:649) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.utils.concurrent.AtomixFuture.lambda$wrap$0(AtomixFuture.java:50) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.utils.concurrent.AtomixFuture.lambda$wrap$0(AtomixFuture.java:50) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] 
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.utils.concurrent.AtomixFuture.lambda$wrap$0(AtomixFuture.java:50) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.utils.concurrent.AtomixFuture.lambda$wrap$0(AtomixFuture.java:50) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] at io.atomix.raft.impl.DefaultRaftServer.lambda$start$5(DefaultRaftServer.java:211) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at io.atomix.raft.impl.RaftContext.awaitState(RaftContext.java:338) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at io.atomix.raft.impl.DefaultRaftServer.lambda$start$6(DefaultRaftServer.java:207) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) ~[?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) ~[?:?] at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510) ~[?:?] at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147) ~[?:?] 
at io.atomix.raft.cluster.impl.RaftClusterContext.completeBootstrapFuture(RaftClusterContext.java:275) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at io.atomix.raft.cluster.impl.RaftClusterContext.lambda$bootstrap$0(RaftClusterContext.java:135) ~[zeebe-atomix-cluster-1.3.5.jar:1.3.5] at io.atomix.utils.concurrent.SingleThreadContext$WrappedRunnable.run(SingleThreadContext.java:171) ~[zeebe-atomix-utils-1.3.5.jar:1.3.5] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) ~[?:?] at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) ~[?:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[?:?] at java.lang.Thread.run(Thread.java:833) ~[?:?] `
build
can riscv architecture be supported ? under the riscv architecture after compiling zeebe with maven and making it into a docker image an error is reported when running ask daniel to help analyze,thanks dockerfile arg app env prod building builder image from ubuntu as builder arg distball env tmp archive tmp zeebe tar gz tmp dir tmp zeebe tini version copy distball tmp archive run mkdir p tmp dir tar xfvz tmp archive strip c tmp dir already create volume dir to later have correct rights mkdir tmp dir data run apt get update apt get install tini cp usr bin tini tmp dir bin tini add tmp dir bin tini copy docker utils startup sh tmp dir bin startup sh run chmod x r tmp dir bin run chmod tmp dir tmp dir data run apt get update apt get install tini cp usr bin tini tmp dir bin tini building prod image from openjdk as prod building dev image from openjdk as dev run echo running dev pre install commands run apt get update run apt install curl y run curl ssl tar xzv building application image from app env as app env zb home usr local zeebe zeebe broker gateway network host zeebe standalone gateway false env path zb home bin path workdir zb home expose volume zb home data run groupadd g zeebe adduser u zeebe system ingroup zeebe chmod g u etc passwd chown zb home chmod zb home copy from builder chown tmp zeebe bin startup sh usr local bin startup sh copy from builder chown tmp zeebe zb home entrypoint the error is as follows error starting applicationcontext to display the conditions report re run your application with debug enabled error org springframework boot springapplication application run failed java lang illegalstateexception failed to execute commandlinerunner at org springframework boot springapplication callrunner springapplication java at org springframework boot springapplication callrunners springapplication java at org springframework boot springapplication run springapplication java at io camunda zeebe broker standalonebroker main standalonebroker java caused 
by io camunda zeebe util exception uncheckedexecutionexception failed to start broker at io camunda zeebe broker broker internalstart broker java at io camunda zeebe util logutil dowithmdc logutil java at io camunda zeebe broker broker start broker java at io camunda zeebe broker standalonebroker run standalonebroker java at org springframework boot springapplication callrunner springapplication java more caused by java util concurrent executionexception startup failed in the following steps see suppressed exceptions for details at io camunda zeebe util sched future completableactorfuture get completableactorfuture java at io camunda zeebe util sched future completableactorfuture get completableactorfuture java at io camunda zeebe util sched futureutil join futureutil java at io camunda zeebe util sched future completableactorfuture join completableactorfuture java at io camunda zeebe broker broker internalstart broker java at io camunda zeebe util logutil dowithmdc logutil java at io camunda zeebe broker broker start broker java at io camunda zeebe broker standalonebroker run standalonebroker java at org springframework boot springapplication callrunner springapplication java more caused by io camunda zeebe util startup startupprocessexception startup failed in the following steps see suppressed exceptions for details at io camunda zeebe util startup startupprocess aggregateexceptionssynchronized startupprocess java at io camunda zeebe util startup startupprocess completestartupfutureexceptionallysynchronized startupprocess java at io camunda zeebe util startup startupprocess lambda proceedwithstartupsynchronized startupprocess java at io camunda zeebe util sched future futurecontinuationrunnable run futurecontinuationrunnable java at io camunda zeebe util sched actorjob invoke actorjob java at io camunda zeebe util sched actorjob execute actorjob java at io camunda zeebe util sched actortask execute actortask java at io camunda zeebe util sched actorthread 
executecurrenttask actorthread java at io camunda zeebe util sched actorthread dowork actorthread java at io camunda zeebe util sched actorthread run actorthread java suppressed io camunda zeebe util startup startupprocessstepexception bootstrap step partition manager failed at io camunda zeebe util startup startupprocess completestartupfutureexceptionallysynchronized startupprocess java at io camunda zeebe util startup startupprocess lambda proceedwithstartupsynchronized startupprocess java at io camunda zeebe util sched future futurecontinuationrunnable run futurecontinuationrunnable java at io camunda zeebe util sched actorjob invoke actorjob java at io camunda zeebe util sched actorjob execute actorjob java at io camunda zeebe util sched actortask execute actortask java at io camunda zeebe util sched actorthread executecurrenttask actorthread java at io camunda zeebe util sched actorthread dowork actorthread java at io camunda zeebe util sched actorthread run actorthread java caused by java util concurrent completionexception java lang unsatisfiedlinkerror tmp so tmp so cannot open shared object file no such file or directory possible cause can t load amd so on a risc v platform at java util concurrent completablefuture encodethrowable completablefuture java at java util concurrent completablefuture completethrowable completablefuture java at java util concurrent completablefuture uniapply tryfire completablefuture java at java util concurrent completablefuture postcomplete completablefuture java at java util concurrent completablefuture complete completablefuture java at io atomix utils concurrent atomixfuture lambda wrap atomixfuture java at java util concurrent completablefuture uniwhencomplete completablefuture java at java util concurrent completablefuture uniwhencomplete tryfire completablefuture java at java util concurrent completablefuture postcomplete completablefuture java at java util concurrent completablefuture complete completablefuture java at 
io atomix utils concurrent atomixfuture lambda wrap atomixfuture java at java util concurrent completablefuture uniwhencomplete completablefuture java at java util concurrent completablefuture uniwhencomplete tryfire completablefuture java at java util concurrent completablefuture postcomplete completablefuture java at java util concurrent completablefuture complete completablefuture java at io atomix utils concurrent atomixfuture lambda wrap atomixfuture java at java util concurrent completablefuture uniwhencomplete completablefuture java at java util concurrent completablefuture uniwhencomplete tryfire completablefuture java at java util concurrent completablefuture postcomplete completablefuture java at java util concurrent completablefuture complete completablefuture java at io atomix utils concurrent atomixfuture lambda wrap atomixfuture java at java util concurrent completablefuture uniwhencomplete completablefuture java at java util concurrent completablefuture uniwhencomplete tryfire completablefuture java at java util concurrent completablefuture postcomplete completablefuture java at java util concurrent completablefuture complete completablefuture java at io atomix raft impl defaultraftserver lambda start defaultraftserver java at io atomix raft impl raftcontext awaitstate raftcontext java at io atomix raft impl defaultraftserver lambda start defaultraftserver java at java util concurrent completablefuture uniwhencomplete completablefuture java at java util concurrent completablefuture uniwhencomplete tryfire completablefuture java at java util concurrent completablefuture postcomplete completablefuture java at java util concurrent completablefuture complete completablefuture java at io atomix raft cluster impl raftclustercontext completebootstrapfuture raftclustercontext java at io atomix raft cluster impl raftclustercontext lambda bootstrap raftclustercontext java at io atomix utils concurrent singlethreadcontext wrappedrunnable run singlethreadcontext 
java at java util concurrent executors runnableadapter call executors java at java util concurrent futuretask run futuretask java at java util concurrent scheduledthreadpoolexecutor scheduledfuturetask run scheduledthreadpoolexecutor java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java
1
24,745
7,551,368,913
IssuesEvent
2018-04-18 19:54:30
dart-lang/build
https://api.github.com/repos/dart-lang/build
closed
AssetWriter.write* – return WriteAction (or something)
package: build type: enhancement type: question
The fact that folks aren't supposed to await writeAsBytes/writeAsString is really annoying – since it's good to await futures. Could these methods return an instance of ```dart class WriteAction { Future get complete; } ``` So folks who need to wait can, but the lints to get to them?
1.0
AssetWriter.write* – return WriteAction (or something) - The fact that folks aren't supposed to await writeAsBytes/writeAsString is really annoying – since it's good to await futures. Could these methods return an instance of ```dart class WriteAction { Future get complete; } ``` So folks who need to wait can, but the lints to get to them?
build
assetwriter write – return writeaction or something the fact that folks aren t supposed to await writeasbytes writeasstring is really annoying – since it s good to await futures could these methods return an instance of dart class writeaction future get complete so folks who need to wait can but the lints to get to them
1
101,260
30,958,699,422
IssuesEvent
2023-08-08 00:48:54
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
Arm32: System.Threading.Tasks.Dataflow.Tests failing with NRE
bug arch-arm32 area-System.Threading.Tasks blocking-clean-ci Known Build Error
## Build Information Build: https://dev.azure.com/dnceng-public/cbb18261-c48f-4abb-8651-8cdcb5474649/_build/results?buildId=140623 Build error leg or test failing: System.Threading.Tasks.Dataflow.Tests.WorkItemExecution Pull request: https://github.com/dotnet/runtime/pull/80323 ## Error Message Fill the error message using [known issues guidance](https://github.com/dotnet/arcade/blob/main/Documentation/Projects/Build%20Analysis/KnownIssues.md#how-to-fill-out-a-known-issue-error-message-section). ```json { "ErrorMessage": "System.Threading.Tasks.ConcurrentExclusiveSchedulerPair.ProcessConcurrentTasks", "BuildRetry": false } ``` <!--Known issue error report start --> ### Report #### Summary |24-Hour Hit Count|7-Day Hit Count|1-Month Count| |---|---|---| |0|0|0| <!--Known issue error report end -->
1.0
Arm32: System.Threading.Tasks.Dataflow.Tests failing with NRE - ## Build Information Build: https://dev.azure.com/dnceng-public/cbb18261-c48f-4abb-8651-8cdcb5474649/_build/results?buildId=140623 Build error leg or test failing: System.Threading.Tasks.Dataflow.Tests.WorkItemExecution Pull request: https://github.com/dotnet/runtime/pull/80323 ## Error Message Fill the error message using [known issues guidance](https://github.com/dotnet/arcade/blob/main/Documentation/Projects/Build%20Analysis/KnownIssues.md#how-to-fill-out-a-known-issue-error-message-section). ```json { "ErrorMessage": "System.Threading.Tasks.ConcurrentExclusiveSchedulerPair.ProcessConcurrentTasks", "BuildRetry": false } ``` <!--Known issue error report start --> ### Report #### Summary |24-Hour Hit Count|7-Day Hit Count|1-Month Count| |---|---|---| |0|0|0| <!--Known issue error report end -->
build
system threading tasks dataflow tests failing with nre build information build build error leg or test failing system threading tasks dataflow tests workitemexecution pull request error message fill the error message using json errormessage system threading tasks concurrentexclusiveschedulerpair processconcurrenttasks buildretry false report summary hour hit count day hit count month count
1
126,655
26,890,781,506
IssuesEvent
2023-02-06 08:45:31
samq-ghdemo/snyk-goof
https://api.github.com/repos/samq-ghdemo/snyk-goof
opened
Code Security Report: 1 high severity findings, 8 total findings
code security findings
# Code Security Report **Latest Scan:** 2023-02-06 08:44am **Total Findings:** 8 **Tested Project Files:** 6 **Detected Programming Languages:** 1 <!-- SAST-MANUAL-SCAN-START --> - [ ] Check this box to manually trigger a scan <!-- SAST-MANUAL-SCAN-END --> ## Language: JavaScript / Node.js | Severity | CWE | Vulnerability Type | Count | |-|-|-|-| |<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-78](https://cwe.mitre.org/data/definitions/78.html)|Command Injection|1| |<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-338](https://cwe.mitre.org/data/definitions/338.html)|Weak Pseudo-Random|2| |<img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low|[CWE-117](https://cwe.mitre.org/data/definitions/117.html)|Log Forging|5| ### Details > The below list presents the 1 high vulnerability findings that need your attention. To view information on these findings, navigate to the [Mend SAST Application](https://saas.whitesourcesoftware.com/sast/#/scans/642eebe1-e6e7-4a68-8641-5959bb5fdf9c/details). <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Command Injection (CWE-78) : 1</summary> #### Findings <details> <summary>routes/index.js:86</summary> https://github.com/samq-ghdemo/snyk-goof/blob/77eae86d878e3f5b3d008c20fb02ac8f24ee2393/routes/index.js#L81-L86 <details> <summary> Trace </summary> https://github.com/samq-ghdemo/snyk-goof/blob/77eae86d878e3f5b3d008c20fb02ac8f24ee2393/routes/index.js#L80 https://github.com/samq-ghdemo/snyk-goof/blob/77eae86d878e3f5b3d008c20fb02ac8f24ee2393/routes/index.js#L83 https://github.com/samq-ghdemo/snyk-goof/blob/77eae86d878e3f5b3d008c20fb02ac8f24ee2393/routes/index.js#L86 </details> </details> </details>
1.0
Code Security Report: 1 high severity findings, 8 total findings - # Code Security Report **Latest Scan:** 2023-02-06 08:44am **Total Findings:** 8 **Tested Project Files:** 6 **Detected Programming Languages:** 1 <!-- SAST-MANUAL-SCAN-START --> - [ ] Check this box to manually trigger a scan <!-- SAST-MANUAL-SCAN-END --> ## Language: JavaScript / Node.js | Severity | CWE | Vulnerability Type | Count | |-|-|-|-| |<img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High|[CWE-78](https://cwe.mitre.org/data/definitions/78.html)|Command Injection|1| |<img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium|[CWE-338](https://cwe.mitre.org/data/definitions/338.html)|Weak Pseudo-Random|2| |<img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low|[CWE-117](https://cwe.mitre.org/data/definitions/117.html)|Log Forging|5| ### Details > The below list presents the 1 high vulnerability findings that need your attention. To view information on these findings, navigate to the [Mend SAST Application](https://saas.whitesourcesoftware.com/sast/#/scans/642eebe1-e6e7-4a68-8641-5959bb5fdf9c/details). <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20>Command Injection (CWE-78) : 1</summary> #### Findings <details> <summary>routes/index.js:86</summary> https://github.com/samq-ghdemo/snyk-goof/blob/77eae86d878e3f5b3d008c20fb02ac8f24ee2393/routes/index.js#L81-L86 <details> <summary> Trace </summary> https://github.com/samq-ghdemo/snyk-goof/blob/77eae86d878e3f5b3d008c20fb02ac8f24ee2393/routes/index.js#L80 https://github.com/samq-ghdemo/snyk-goof/blob/77eae86d878e3f5b3d008c20fb02ac8f24ee2393/routes/index.js#L83 https://github.com/samq-ghdemo/snyk-goof/blob/77eae86d878e3f5b3d008c20fb02ac8f24ee2393/routes/index.js#L86 </details> </details> </details>
non_build
code security report high severity findings total findings code security report latest scan total findings tested project files detected programming languages check this box to manually trigger a scan language javascript node js severity cwe vulnerability type count high injection medium pseudo random low forging details the below list presents the high vulnerability findings that need your attention to view information on these findings navigate to the command injection cwe findings routes index js trace
0
39,669
10,372,024,471
IssuesEvent
2019-09-09 00:43:49
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
closed
bazel should support building multiple platforms
area/build-release kind/feature lifecycle/rotten sig/release sig/testing
#73930 added the support of cross-compilation for bazel, but there is currently no way to build for multiple platforms. For example, to build release tarballs for both windows and linux, one would have to run the following commands. ``` bazel build --config=cross:linux_amd64 //build/release-tars # May need to copy the artifacts out before running the next command due to bazel's symlink logic bazel build --config=cross:windows_amd64 //build/release-tars ``` It'd be good to support multi-platform building, and then we can extend kubetest to enable this for test jobs that include multiple platforms. /cc @ixdy /cc @mtaufen @pjh
1.0
bazel should support building multiple platforms - #73930 added the support of cross-compilation for bazel, but there is currently no way to build for multiple platforms. For example, to build release tarballs for both windows and linux, one would have to run the following commands. ``` bazel build --config=cross:linux_amd64 //build/release-tars # May need to copy the artifacts out before running the next command due to bazel's symlink logic bazel build --config=cross:windows_amd64 //build/release-tars ``` It'd be good to support multi-platform building, and then we can extend kubetest to enable this for test jobs that include multiple platforms. /cc @ixdy /cc @mtaufen @pjh
build
bazel should support building multiple platforms added the support of cross compilation for bazel but there is currently no way to build for multiple platforms for example to build release tarballs for both windows and linux one would have to run the following commands bazel build config cross linux build release tars may need to copy the artifacts out before running the next command due to bazel s symlink logic bazel build config cross windows build release tars it d be good to support multi platform building and then we can extend kubetest to enable this for test jobs that include multiple platforms cc ixdy cc mtaufen pjh
1
69,641
7,156,542,803
IssuesEvent
2018-01-26 16:37:36
poanetwork/poa-popa
https://api.github.com/repos/poanetwork/poa-popa
opened
(Chore) Refactor prepareConTx
category-tests
Simplify code in order to make it more readable and testable by mainly extracting and refactoring functions. For example, wallet validation and assignation
1.0
(Chore) Refactor prepareConTx - Simplify code in order to make it more readable and testable by mainly extracting and refactoring functions. For example, wallet validation and assignation
non_build
chore refactor preparecontx simplify code in order to make it more readable and testable by mainly extracting and refactoring functions for example wallet validation and assignation
0
126,161
10,410,598,524
IssuesEvent
2019-09-13 11:51:31
flutter/flutter
https://api.github.com/repos/flutter/flutter
closed
UI Tests: If test is waiting for an expect(...emits) test a strange exception is thrown
a: tests
## Steps to Reproduce clone https://github.com/escamoteur/rx_widgets/tree/UI_Test_Exception and run the test in rx_widgets/example/test/homepage_test.dart you will get this output ``` Running "flutter packages get" in example... Received: null Called Execute Received: [Instance of 'WeatherEntry'] ══╡ EXCEPTION CAUGHT BY FLUTTER TEST FRAMEWORK ╞════════════════════════════════════════════════════ The following assertion was thrown running a test: A Timer is still pending even after the widget tree was disposed. 'package:flutter_test/src/binding.dart': Failed assertion: line 672 pos 7: '_fakeAsync.nonPeriodicTimerCount == 0' Either the assertion indicates an error in the framework itself, or we should provide substantially more information in this error message to help you determine and fix the underlying cause. In either case, please report this assertion by filing a bug on GitHub: https://github.com/flutter/flutter/issues/new When the exception was thrown, this was the stack: #2 AutomatedTestWidgetsFlutterBinding._verifyInvariants (package:flutter_test/src/binding.dart) #3 TestWidgetsFlutterBinding._runTestBody (package:flutter_test/src/binding.dart:487:7) <asynchronous suspension> #6 TestWidgetsFlutterBinding._runTest (package:flutter_test/src/binding.dart:464:14) #7 AutomatedTestWidgetsFlutterBinding.runTest.<anonymous closure> (package:flutter_test/src/binding.dart:646:24) #13 AutomatedTestWidgetsFlutterBinding.runTest (package:flutter_test/src/binding.dart:644:16) #14 testWidgets.<anonymous closure>.<anonymous closure> (package:flutter_test/src/widget_tester.dart:62:24) #15 Declarer.test.<anonymous closure>.<anonymous closure>.<anonymous closure> (package:test/src/backend/declarer.dart:161:27) <asynchronous suspension> #16 Invoker.waitForOutstandingCallbacks.<anonymous closure> (package:test/src/backend/invoker.dart:249:15) <asynchronous suspension> #20 Invoker.waitForOutstandingCallbacks (package:test/src/backend/invoker.dart:246:5) #21 
Declarer.test.<anonymous closure>.<anonymous closure> (package:test/src/backend/declarer.dart:159:33) #25 Declarer.test.<anonymous closure> (package:test/src/backend/declarer.dart:158:13) <asynchronous suspension> #26 Invoker._onRun.<anonymous closure>.<anonymous closure>.<anonymous closure>.<anonymous closure> (package:test/src/backend/invoker.dart:403:25) <asynchronous suspension> #40 _Timer._runTimers (dart:isolate/runtime/libtimer_impl.dart:382:19) #41 _Timer._handleMessage (dart:isolate/runtime/libtimer_impl.dart:416:5) #42 _RawReceivePortImpl._handleMessage (dart:isolate/runtime/libisolate_patch.dart:165:12) (elided 28 frames from class _AssertionError, class _FakeAsync, package dart:async, and package stack_trace) The test description was: Tapping update button updates the weather ════════════════════════════════════════════════════════════════════════════════════════════════════ Test failed. See exception logs above. The test description was: Tapping update button updates the weather ✖ HomePage - Tapping update button updates the weather Some tests failed. ``` This is the Test: ```Dart testWidgets('Tapping update button updates the weather', (tester) async { final model = new MockModel(); final command = new MockCommand<String,List<WeatherEntry>>(); final widget = new ModelProvider( model: model, child: new MaterialApp(home: new HomePage()), ); when(model.updateWeatherCommand).thenReturn(command); // Expectation doesn't match output an exception is thwon inside Flutter. 
// Change `Londo` to `London` and it will run again command.queueResultsForNextExecuteCall([CommandResult<List<WeatherEntry>>( [WeatherEntry("Londo", 10.0, 30.0, "sunny", 12)],null, false)]); expect(command, emitsInOrder([ crm(null, false, false), crm([WeatherEntry("London", 10.0, 30.0, "sunny", 12)], false, false) ])); command.listen((data)=> print("Received: " + data.data.toString())); await tester.pumpWidget(widget); // Build initial State await tester.pump(); // Build after Stream delivers value await tester.tap(find.byKey(AppKeys.updateButtonEnabled)); }); ``` Because the expected event never is received the expects would normally timeout with an timeout exception together with Flutter it is an exception deep in flutter. Change `Londo` to `London` again and it runs through ## Flutter Doctor ``` PS C:\Entwicklung\FlutterApps\packages\rx_widgets\example> flutter doctor -v [√] Flutter (Channel dev, v0.3.0, on Microsoft Windows [Version 10.0.16299.371], locale de-DE) • Flutter version 0.3.0 at C:\Entwicklung\Flutter • Framework revision c73b8a7cf6 (7 days ago), 2018-04-12 16:17:26 -0700 • Engine revision 8a6e64a8ef • Dart version 2.0.0-dev.47.0.flutter-4126459025 [√] Android toolchain - develop for Android devices (Android SDK 27.0.3) • Android SDK at C:\Users\escam\AppData\Local\Android\sdk • Android NDK location not configured (optional; useful for native profiling support) • Platform android-27, build-tools 27.0.3 • Java binary at: C:\Program Files\Android\Android Studio\jre\bin\java • Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1024-b02) • All Android licenses accepted. 
[√] Android Studio (version 3.1) • Android Studio at C:\Program Files\Android\Android Studio • Flutter plugin version 23.2.2 • Dart plugin version 173.4700 • Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1024-b02) [√] VS Code, 32-bit edition (version 1.22.2) • VS Code at C:\Program Files (x86)\Microsoft VS Code • Dart Code extension version 2.12.0-dev.windows-paths-experimental [√] Connected devices (1 available) • Android SDK built for x86 • emulator-5554 • android-x86 • Android 6.0 (API 23) (emulator) • No issues found! ```
1.0
UI Tests: If test is waiting for an expect(...emits) test a strange exception is thrown - ## Steps to Reproduce clone https://github.com/escamoteur/rx_widgets/tree/UI_Test_Exception and run the test in rx_widgets/example/test/homepage_test.dart you will get this output ``` Running "flutter packages get" in example... Received: null Called Execute Received: [Instance of 'WeatherEntry'] ══╡ EXCEPTION CAUGHT BY FLUTTER TEST FRAMEWORK ╞════════════════════════════════════════════════════ The following assertion was thrown running a test: A Timer is still pending even after the widget tree was disposed. 'package:flutter_test/src/binding.dart': Failed assertion: line 672 pos 7: '_fakeAsync.nonPeriodicTimerCount == 0' Either the assertion indicates an error in the framework itself, or we should provide substantially more information in this error message to help you determine and fix the underlying cause. In either case, please report this assertion by filing a bug on GitHub: https://github.com/flutter/flutter/issues/new When the exception was thrown, this was the stack: #2 AutomatedTestWidgetsFlutterBinding._verifyInvariants (package:flutter_test/src/binding.dart) #3 TestWidgetsFlutterBinding._runTestBody (package:flutter_test/src/binding.dart:487:7) <asynchronous suspension> #6 TestWidgetsFlutterBinding._runTest (package:flutter_test/src/binding.dart:464:14) #7 AutomatedTestWidgetsFlutterBinding.runTest.<anonymous closure> (package:flutter_test/src/binding.dart:646:24) #13 AutomatedTestWidgetsFlutterBinding.runTest (package:flutter_test/src/binding.dart:644:16) #14 testWidgets.<anonymous closure>.<anonymous closure> (package:flutter_test/src/widget_tester.dart:62:24) #15 Declarer.test.<anonymous closure>.<anonymous closure>.<anonymous closure> (package:test/src/backend/declarer.dart:161:27) <asynchronous suspension> #16 Invoker.waitForOutstandingCallbacks.<anonymous closure> (package:test/src/backend/invoker.dart:249:15) <asynchronous suspension> #20 
Invoker.waitForOutstandingCallbacks (package:test/src/backend/invoker.dart:246:5) #21 Declarer.test.<anonymous closure>.<anonymous closure> (package:test/src/backend/declarer.dart:159:33) #25 Declarer.test.<anonymous closure> (package:test/src/backend/declarer.dart:158:13) <asynchronous suspension> #26 Invoker._onRun.<anonymous closure>.<anonymous closure>.<anonymous closure>.<anonymous closure> (package:test/src/backend/invoker.dart:403:25) <asynchronous suspension> #40 _Timer._runTimers (dart:isolate/runtime/libtimer_impl.dart:382:19) #41 _Timer._handleMessage (dart:isolate/runtime/libtimer_impl.dart:416:5) #42 _RawReceivePortImpl._handleMessage (dart:isolate/runtime/libisolate_patch.dart:165:12) (elided 28 frames from class _AssertionError, class _FakeAsync, package dart:async, and package stack_trace) The test description was: Tapping update button updates the weather ════════════════════════════════════════════════════════════════════════════════════════════════════ Test failed. See exception logs above. The test description was: Tapping update button updates the weather ✖ HomePage - Tapping update button updates the weather Some tests failed. ``` This is the Test: ```Dart testWidgets('Tapping update button updates the weather', (tester) async { final model = new MockModel(); final command = new MockCommand<String,List<WeatherEntry>>(); final widget = new ModelProvider( model: model, child: new MaterialApp(home: new HomePage()), ); when(model.updateWeatherCommand).thenReturn(command); // Expectation doesn't match output an exception is thwon inside Flutter. 
// Change `Londo` to `London` and it will run again command.queueResultsForNextExecuteCall([CommandResult<List<WeatherEntry>>( [WeatherEntry("Londo", 10.0, 30.0, "sunny", 12)],null, false)]); expect(command, emitsInOrder([ crm(null, false, false), crm([WeatherEntry("London", 10.0, 30.0, "sunny", 12)], false, false) ])); command.listen((data)=> print("Received: " + data.data.toString())); await tester.pumpWidget(widget); // Build initial State await tester.pump(); // Build after Stream delivers value await tester.tap(find.byKey(AppKeys.updateButtonEnabled)); }); ``` Because the expected event never is received the expects would normally timeout with an timeout exception together with Flutter it is an exception deep in flutter. Change `Londo` to `London` again and it runs through ## Flutter Doctor ``` PS C:\Entwicklung\FlutterApps\packages\rx_widgets\example> flutter doctor -v [√] Flutter (Channel dev, v0.3.0, on Microsoft Windows [Version 10.0.16299.371], locale de-DE) • Flutter version 0.3.0 at C:\Entwicklung\Flutter • Framework revision c73b8a7cf6 (7 days ago), 2018-04-12 16:17:26 -0700 • Engine revision 8a6e64a8ef • Dart version 2.0.0-dev.47.0.flutter-4126459025 [√] Android toolchain - develop for Android devices (Android SDK 27.0.3) • Android SDK at C:\Users\escam\AppData\Local\Android\sdk • Android NDK location not configured (optional; useful for native profiling support) • Platform android-27, build-tools 27.0.3 • Java binary at: C:\Program Files\Android\Android Studio\jre\bin\java • Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1024-b02) • All Android licenses accepted. 
[√] Android Studio (version 3.1) • Android Studio at C:\Program Files\Android\Android Studio • Flutter plugin version 23.2.2 • Dart plugin version 173.4700 • Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1024-b02) [√] VS Code, 32-bit edition (version 1.22.2) • VS Code at C:\Program Files (x86)\Microsoft VS Code • Dart Code extension version 2.12.0-dev.windows-paths-experimental [√] Connected devices (1 available) • Android SDK built for x86 • emulator-5554 • android-x86 • Android 6.0 (API 23) (emulator) • No issues found! ```
non_build
ui tests if test is waiting for an expect emits test a strange exception is thrown steps to reproduce clone and run the test in rx widgets example test homepage test dart you will get this output running flutter packages get in example received null called execute received ══╡ exception caught by flutter test framework ╞════════════════════════════════════════════════════ the following assertion was thrown running a test a timer is still pending even after the widget tree was disposed package flutter test src binding dart failed assertion line pos fakeasync nonperiodictimercount either the assertion indicates an error in the framework itself or we should provide substantially more information in this error message to help you determine and fix the underlying cause in either case please report this assertion by filing a bug on github when the exception was thrown this was the stack automatedtestwidgetsflutterbinding verifyinvariants package flutter test src binding dart testwidgetsflutterbinding runtestbody package flutter test src binding dart testwidgetsflutterbinding runtest package flutter test src binding dart automatedtestwidgetsflutterbinding runtest package flutter test src binding dart automatedtestwidgetsflutterbinding runtest package flutter test src binding dart testwidgets package flutter test src widget tester dart declarer test package test src backend declarer dart invoker waitforoutstandingcallbacks package test src backend invoker dart invoker waitforoutstandingcallbacks package test src backend invoker dart declarer test package test src backend declarer dart declarer test package test src backend declarer dart invoker onrun package test src backend invoker dart timer runtimers dart isolate runtime libtimer impl dart timer handlemessage dart isolate runtime libtimer impl dart rawreceiveportimpl handlemessage dart isolate runtime libisolate patch dart elided frames from class assertionerror class fakeasync package dart async and package stack trace 
the test description was tapping update button updates the weather ════════════════════════════════════════════════════════════════════════════════════════════════════ test failed see exception logs above the test description was tapping update button updates the weather ✖ homepage tapping update button updates the weather some tests failed this is the test dart testwidgets tapping update button updates the weather tester async final model new mockmodel final command new mockcommand final widget new modelprovider model model child new materialapp home new homepage when model updateweathercommand thenreturn command expectation doesn t match output an exception is thwon inside flutter change londo to london and it will run again command queueresultsfornextexecutecall commandresult null false expect command emitsinorder false false command listen data print received data data tostring await tester pumpwidget widget build initial state await tester pump build after stream delivers value await tester tap find bykey appkeys updatebuttonenabled because the expected event never is received the expects would normally timeout with an timeout exception together with flutter it is an exception deep in flutter change londo to london again and it runs through flutter doctor ps c entwicklung flutterapps packages rx widgets example flutter doctor v flutter channel dev on microsoft windows locale de de • flutter version at c entwicklung flutter • framework revision days ago • engine revision • dart version dev flutter android toolchain develop for android devices android sdk • android sdk at c users escam appdata local android sdk • android ndk location not configured optional useful for native profiling support • platform android build tools • java binary at c program files android android studio jre bin java • java version openjdk runtime environment build release • all android licenses accepted android studio version • android studio at c program files android android studio • 
flutter plugin version • dart plugin version • java version openjdk runtime environment build release vs code bit edition version • vs code at c program files microsoft vs code • dart code extension version dev windows paths experimental connected devices available • android sdk built for • emulator • android • android api emulator • no issues found
0
47,296
11,996,842,385
IssuesEvent
2020-04-08 17:28:20
dotnet/machinelearning
https://api.github.com/repos/dotnet/machinelearning
closed
libomp 8.0.0 version has dependencies that do not exist on macOS build machines
Build P0 bug
It seems brew installing latest libomp has dependencies that do not exist on build machines. Investigate what those dependencies are by taking a trace and install them. Related to #3694
1.0
libomp 8.0.0 version has dependencies that do not exist on macOS build machines - It seems brew installing latest libomp has dependencies that do not exist on build machines. Investigate what those dependencies are by taking a trace and install them. Related to #3694
build
libomp version has dependencies that do no exist on macos build machines it seems brew installing latest libomp has dependencies that do not exist on build machines investigate what those dependencies are by taking a trace and install them related to
1
87,301
25,081,613,622
IssuesEvent
2022-11-07 19:51:06
irods/irods_rule_engine_plugin_audit_amqp
https://api.github.com/repos/irods/irods_rule_engine_plugin_audit_amqp
closed
Replace use of deprecated functionality
build
Compiling the main branch against what will be iRODS 4.3 produces the following: ```bash /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:126:17: warning: 'pn_messenger' is deprecated [-Wdeprecated-declarations] messenger = pn_messenger(NULL); ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:193:1: note: 'pn_messenger' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ^ /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:127:5: warning: 'pn_messenger_start' is deprecated [-Wdeprecated-declarations] pn_messenger_start(messenger); ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:476:1: note: 'pn_messenger_start' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ^ /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:128:5: warning: 'pn_messenger_set_blocking' is deprecated [-Wdeprecated-declarations] pn_messenger_set_blocking(messenger, false); // do not block ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:333:1: note: 'pn_messenger_set_blocking' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) 
/home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:126:17: warning: 'pn_messenger' is deprecated [-Wdeprecated-declarations] messenger = pn_messenger(NULL); ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:193:1: note: 'pn_messenger' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ^ /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:127:5: warning: 'pn_messenger_start' is deprecated [-Wdeprecated-declarations] pn_messenger_start(messenger); ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:476:1: note: 'pn_messenger_start' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ^ /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:128:5: warning: 'pn_messenger_set_blocking' is deprecated [-Wdeprecated-declarations] pn_messenger_set_blocking(messenger, false); // do not block ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:333:1: note: 'pn_messenger_set_blocking' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ```
1.0
Replace use of deprecated functionality - Compiling the main branch against what will be iRODS 4.3 produces the following: ```bash /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:126:17: warning: 'pn_messenger' is deprecated [-Wdeprecated-declarations] messenger = pn_messenger(NULL); ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:193:1: note: 'pn_messenger' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ^ /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:127:5: warning: 'pn_messenger_start' is deprecated [-Wdeprecated-declarations] pn_messenger_start(messenger); ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:476:1: note: 'pn_messenger_start' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ^ /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:128:5: warning: 'pn_messenger_set_blocking' is deprecated [-Wdeprecated-declarations] pn_messenger_set_blocking(messenger, false); // do not block ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:333:1: note: 'pn_messenger_set_blocking' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) 
/home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:126:17: warning: 'pn_messenger' is deprecated [-Wdeprecated-declarations] messenger = pn_messenger(NULL); ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:193:1: note: 'pn_messenger' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ^ /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:127:5: warning: 'pn_messenger_start' is deprecated [-Wdeprecated-declarations] pn_messenger_start(messenger); ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:476:1: note: 'pn_messenger_start' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ^ /home/kory/dev/prog/cpp/irods_rule_engine_plugin_audit_amqp/libirods_rule_engine_plugin-audit_amqp.cpp:128:5: warning: 'pn_messenger_set_blocking' is deprecated [-Wdeprecated-declarations] pn_messenger_set_blocking(messenger, false); // do not block ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/messenger.h:333:1: note: 'pn_messenger_set_blocking' has been explicitly marked deprecated here PN_DEPRECATED("Use the Proactor API or Qpid Proton C++") ^ /opt/irods-externals/qpid-proton0.36.0-0/include/proton/import_export.h:75:53: note: expanded from macro 'PN_DEPRECATED' # define PN_DEPRECATED(message) __attribute__((deprecated)) ```
build
replace use of deprecated functionality compiling the main branch against what will be irods produces the following bash home kory dev prog cpp irods rule engine plugin audit amqp libirods rule engine plugin audit amqp cpp warning pn messenger is deprecated messenger pn messenger null opt irods externals qpid include proton messenger h note pn messenger has been explicitly marked deprecated here pn deprecated use the proactor api or qpid proton c opt irods externals qpid include proton import export h note expanded from macro pn deprecated define pn deprecated message attribute deprecated home kory dev prog cpp irods rule engine plugin audit amqp libirods rule engine plugin audit amqp cpp warning pn messenger start is deprecated pn messenger start messenger opt irods externals qpid include proton messenger h note pn messenger start has been explicitly marked deprecated here pn deprecated use the proactor api or qpid proton c opt irods externals qpid include proton import export h note expanded from macro pn deprecated define pn deprecated message attribute deprecated home kory dev prog cpp irods rule engine plugin audit amqp libirods rule engine plugin audit amqp cpp warning pn messenger set blocking is deprecated pn messenger set blocking messenger false do not block opt irods externals qpid include proton messenger h note pn messenger set blocking has been explicitly marked deprecated here pn deprecated use the proactor api or qpid proton c opt irods externals qpid include proton import export h note expanded from macro pn deprecated define pn deprecated message attribute deprecated home kory dev prog cpp irods rule engine plugin audit amqp libirods rule engine plugin audit amqp cpp warning pn messenger is deprecated messenger pn messenger null opt irods externals qpid include proton messenger h note pn messenger has been explicitly marked deprecated here pn deprecated use the proactor api or qpid proton c opt irods externals qpid include proton import export h 
note expanded from macro pn deprecated define pn deprecated message attribute deprecated home kory dev prog cpp irods rule engine plugin audit amqp libirods rule engine plugin audit amqp cpp warning pn messenger start is deprecated pn messenger start messenger opt irods externals qpid include proton messenger h note pn messenger start has been explicitly marked deprecated here pn deprecated use the proactor api or qpid proton c opt irods externals qpid include proton import export h note expanded from macro pn deprecated define pn deprecated message attribute deprecated home kory dev prog cpp irods rule engine plugin audit amqp libirods rule engine plugin audit amqp cpp warning pn messenger set blocking is deprecated pn messenger set blocking messenger false do not block opt irods externals qpid include proton messenger h note pn messenger set blocking has been explicitly marked deprecated here pn deprecated use the proactor api or qpid proton c opt irods externals qpid include proton import export h note expanded from macro pn deprecated define pn deprecated message attribute deprecated
1
42,747
11,053,275,965
IssuesEvent
2019-12-10 11:01:45
Kavawuvi/invader
https://api.github.com/repos/Kavawuvi/invader
reopened
[invader-build] Tags extracted from combustion maps can cause earrape (tool.exe is unaffected)
invader-build
In order to reproduce this, you need tags that have been touched by the Combustion Retail>CE cache file conversion tool. To reproduce, first convert stock retail bloodgulch to CE format with Combustion, allowing it to optimise against the CE resource maps. Second, extract that map with invader-extract. Some tags will not extract, just extract them from the original retail bloodgulch map so it builds again. The issue is in the sound tags that do extract. Third, rebuild the map with invader-build (git master). Sounds will be glitched. If you build these same tags with tool.exe the map works fine.
1.0
[invader-build] Tags extracted from combustion maps can cause earrape (tool.exe is unaffected) - In order to reproduce this, you need tags that have been touched by the Combustion Retail>CE cache file conversion tool. To reproduce, first convert stock retail bloodgulch to CE format with Combustion, allowing it to optimise against the CE resource maps. Second, extract that map with invader-extract. Some tags will not extract, just extract them from the original retail bloodgulch map so it builds again. The issue is in the sound tags that do extract. Third, rebuild the map with invader-build (git master). Sounds will be glitched. If you build these same tags with tool.exe the map works fine.
build
tags extracted from combustion maps can cause earrape tool exe is uneffected in order to reproduce this you need tags that have been touched by the combustion retail ce cache file conversion tool to reproduce first convert stock retail bloodgulch to ce format with combustion allowing it to optimise against the ce resource maps second extract that map with invader extract some tags will not extract just extract them from the original retail bloodgulch map so it builds again the issue is in the sound tags that do extract third rebuild the map with invader build git master sounds will be glitched if you build these same tags with tool exe the map works fine
1
110,786
13,939,704,818
IssuesEvent
2020-10-22 16:51:26
flutter/flutter
https://api.github.com/repos/flutter/flutter
closed
Scrollbar widget renders in the middle when scroll direction is horizontal
P4 f: material design f: scrolling found in release: 1.22 framework has reproducible steps platform-ios
Hello! I was implementing a `SingleChildScrollView (horizontal direction)` with a `Row` as a `child`. I tried to use the `Scrollbar` widget as a parent. ```dart Scrollbar( isAlwaysShown: true, controller: _scrollController, child: SingleChildScrollView( scrollDirection: Axis.horizontal, controller: _scrollController, child: Padding( padding: const EdgeInsets.symmetric(horizontal: 8), child: Row( children: <Widget>[ // my widgets ] ), )), ) ``` Here is how it renders: ![Captura de pantalla 2020-08-14 a las 13 26 54](https://user-images.githubusercontent.com/20211760/90245123-59876d00-de32-11ea-9c3d-495863337c77.png) I expected I could tell the scrollbar to render on one side of the scroll view, below for example. ```bash $ flutter doctor Doctor summary (to see all details, run flutter doctor -v): [✓] Flutter (Channel stable, 1.20.1, on Mac OS X 10.15.4 19E287, locale es-ES) [✓] Android toolchain - develop for Android devices (Android SDK version 28.0.3) [✓] Xcode - develop for iOS and macOS (Xcode 11.4) [!] Android Studio (not installed) [✓] VS Code (version 1.47.0) [✓] Connected device (1 available) ! Doctor found issues in 1 category. ```
1.0
Scrollbar widget renders in the middle when scroll direction is horizontal - Hello! I was implementing a `SingleChildScrollView (horizontal direction)` with a `Row` as a `child`. I tried to use the `Scrollbar` widget as a parent. ```dart Scrollbar( isAlwaysShown: true, controller: _scrollController, child: SingleChildScrollView( scrollDirection: Axis.horizontal, controller: _scrollController, child: Padding( padding: const EdgeInsets.symmetric(horizontal: 8), child: Row( children: <Widget>[ // my widgets ] ), )), ) ``` Here is how it renders: ![Captura de pantalla 2020-08-14 a las 13 26 54](https://user-images.githubusercontent.com/20211760/90245123-59876d00-de32-11ea-9c3d-495863337c77.png) I expected I could tell the scrollbar to render on one side of the scroll view, below for example. ```bash $ flutter doctor Doctor summary (to see all details, run flutter doctor -v): [✓] Flutter (Channel stable, 1.20.1, on Mac OS X 10.15.4 19E287, locale es-ES) [✓] Android toolchain - develop for Android devices (Android SDK version 28.0.3) [✓] Xcode - develop for iOS and macOS (Xcode 11.4) [!] Android Studio (not installed) [✓] VS Code (version 1.47.0) [✓] Connected device (1 available) ! Doctor found issues in 1 category. ```
non_build
scrollbar widget renders in the middle when scroll direction is horizontal hello i was implementing a singlechildscrollview horizontal direction with a row as a child i tried to use the scrollbar widget as a parent dart scrollbar isalwaysshown true controller scrollcontroller child singlechildscrollview scrolldirection axis horizontal controller scrollcontroller child padding padding const edgeinsets symmetric horizontal child row children my widgets here is how it renders i expected i could tell the scrollbar to render on one side of the scroll view below for example bash flutter doctor doctor summary to see all details run flutter doctor v flutter channel stable on mac os x locale es es android toolchain develop for android devices android sdk version xcode develop for ios and macos xcode android studio not installed vs code version connected device available doctor found issues in category
0
8,699
4,310,299,092
IssuesEvent
2016-07-21 18:44:59
Linuxbrew/homebrew-core
https://api.github.com/repos/Linuxbrew/homebrew-core
closed
openssl: error: target already defined
build-error
Please follow the general troubleshooting steps first: + [x] Ran brew update and retried your prior step? + [x] Ran brew doctor, fixed as many issues as possible and retried your prior step? + [x] If you're seeing permission errors tried running sudo chown -R $(whoami) $(brew --prefix)? Bug reports: Having trouble installing cmake and qt. perl installed fine. Using Archlinux, and other information needed can be provided. Anyone have any insight? I tried sudo chown -R $(whoami) $(brew --prefix) and i just got the cowardly error. (as expected, so i reversed it back to root ownership.) ``` > sudo brew install cmake > ==> Installing dependencies for cmake: openssl, bzip2, python, curl, libi > ==> Installing cmake dependency: openssl > ==> Downloading https://www.openssl.org/source/openssl-1.0.2h.tar.gz > Already downloaded: /root/.cache/Homebrew/openssl-1.0.2h.tar.gz > ==> perl ./Configure --prefix=/opt/linuxbrew/Cellar/openssl/1.0.2h --openssldir= > Last 15 lines from /root/.cache/Homebrew/Logs/openssl/01.perl: > 2016-05-25 15:13:17 -0400 > > perl > ./Configure > --prefix=/opt/linuxbrew/Cellar/openssl/1.0.2h > --openssldir=/usr/local/etc/openssl > no-ssl2 > zlib-dynamic > shared > enable-cms > -Os -w -pipe -march=native -Wl,--dynamic-linker=/usr/local/lib/ld.so -Wl,-rpath,/usr/local/lib > linux-x86_64 > > target already defined - -Os -w -pipe -march=native -Wl,--dynamic-linker=/usr/local/lib/ld.so -Wl,-rpath,/usr/local/lib (offending arg: linux-x86_64) > > READ THIS: https://github.com/Linuxbrew/brew/blob/master/share/doc/homebrew/Troubleshooting.md#troubleshooting > If reporting this issue please do so at (not Homebrew/brew): > https://github.com/Linuxbrew/homebrew-core/issues ```
1.0
openssl: error: target already defined - Please follow the general troubleshooting steps first: + [x] Ran brew update and retried your prior step? + [x] Ran brew doctor, fixed as many issues as possible and retried your prior step? + [x] If you're seeing permission errors tried running sudo chown -R $(whoami) $(brew --prefix)? Bug reports: Having trouble installing cmake and qt. perl installed fine. Using Archlinux, and other information needed can be provided. Anyone have any insight? I tried sudo chown -R $(whoami) $(brew --prefix) and i just got the cowardly error. (as expected, so i reversed it back to root ownership.) ``` > sudo brew install cmake > ==> Installing dependencies for cmake: openssl, bzip2, python, curl, libi > ==> Installing cmake dependency: openssl > ==> Downloading https://www.openssl.org/source/openssl-1.0.2h.tar.gz > Already downloaded: /root/.cache/Homebrew/openssl-1.0.2h.tar.gz > ==> perl ./Configure --prefix=/opt/linuxbrew/Cellar/openssl/1.0.2h --openssldir= > Last 15 lines from /root/.cache/Homebrew/Logs/openssl/01.perl: > 2016-05-25 15:13:17 -0400 > > perl > ./Configure > --prefix=/opt/linuxbrew/Cellar/openssl/1.0.2h > --openssldir=/usr/local/etc/openssl > no-ssl2 > zlib-dynamic > shared > enable-cms > -Os -w -pipe -march=native -Wl,--dynamic-linker=/usr/local/lib/ld.so -Wl,-rpath,/usr/local/lib > linux-x86_64 > > target already defined - -Os -w -pipe -march=native -Wl,--dynamic-linker=/usr/local/lib/ld.so -Wl,-rpath,/usr/local/lib (offending arg: linux-x86_64) > > READ THIS: https://github.com/Linuxbrew/brew/blob/master/share/doc/homebrew/Troubleshooting.md#troubleshooting > If reporting this issue please do so at (not Homebrew/brew): > https://github.com/Linuxbrew/homebrew-core/issues ```
build
openssl error target already defined please follow the general troubleshooting steps first ran brew update and retried your prior step ran brew doctor fixed as many issues as possible and retried your prior step if you re seeing permission errors tried running sudo chown r whoami brew prefix bug reports having trouble installing cmake and qt perl installed fine using archlinux and other information needed can be provided anyone have any insight i tried sudo chown r whoami brew prefix and i just got the cowardly error as expected so i reversed it back to root ownership sudo brew install cmake installing dependencies for cmake openssl python curl libi installing cmake dependency openssl downloading already downloaded root cache homebrew openssl tar gz perl configure prefix opt linuxbrew cellar openssl openssldir last lines from root cache homebrew logs openssl perl perl configure prefix opt linuxbrew cellar openssl openssldir usr local etc openssl no zlib dynamic shared enable cms os w pipe march native wl dynamic linker usr local lib ld so wl rpath usr local lib linux target already defined os w pipe march native wl dynamic linker usr local lib ld so wl rpath usr local lib offending arg linux read this if reporting this issue please do so at not homebrew brew
1
521,603
15,112,291,195
IssuesEvent
2021-02-08 21:38:12
nlpsandbox/nlpsandbox
https://api.github.com/repos/nlpsandbox/nlpsandbox
closed
Develop and submit MCW annotators
Priority: Medium
@gkowalski Is it possible for you to create one or more annotators (date, person name and/or physical address annotators), associate them to an "MCW" team on Synapse and submit them for evaluation so that they appear in the leaderboard tables when we launch the NLP Sandbox?
1.0
Develop and submit MCW annotators - @gkowalski Is it possible for you to create one or more annotators (date, person name and/or physical address annotators), associate them to an "MCW" team on Synapse and submit them for evaluation so that they appear in the leaderboard tables when we launch the NLP Sandbox?
non_build
develop and submit mcw annotators gkowalski is it possible for you to create one or more annotators date person name and or physical address annotators associate them to an mcw team on synapse and submit them for evaluation so that they appear in the leaderboard tables when we launch the nlp sandbox
0
71,002
18,388,862,699
IssuesEvent
2021-10-12 00:59:46
Xarlyu/devops
https://api.github.com/repos/Xarlyu/devops
closed
build(nodejs): add Dockerfile with yarn
build
We already have a `Dockerfile `with `npm`, would be great have an example with `yarn`
1.0
build(nodejs): add Dockerfile with yarn - We already have a `Dockerfile `with `npm`, would be great have an example with `yarn`
build
build nodejs add dockerfile with yarn we already have a dockerfile with npm would be great have an example with yarn
1
27,489
13,258,096,697
IssuesEvent
2020-08-20 14:56:44
Azure/azure-sdk-for-java
https://api.github.com/repos/Azure/azure-sdk-for-java
opened
Azure SDKs Performance work Iron Semester
tenet-performance
- [ ] Add Performance Tests for Eventhubs SDK #14307 - [ ] Add Performance Tests for Search SDK #14308 - [ ] Add Performance Tests for Text Analytics SDK #14309 - [ ] Add Performance Tests for Form Recognizer SDK #14310 - [ ] Add Performance Tests for Identity SDK #14313 - [ ] Add Performance Tests for Service Bus SDK #14314 - [ ] Expand Performance Test suite for App Config SDK #14311 - [ ] Add Latency Support to Perf Test framework #14315
True
Azure SDKs Performance work Iron Semester - - [ ] Add Performance Tests for Eventhubs SDK #14307 - [ ] Add Performance Tests for Search SDK #14308 - [ ] Add Performance Tests for Text Analytics SDK #14309 - [ ] Add Performance Tests for Form Recognizer SDK #14310 - [ ] Add Performance Tests for Identity SDK #14313 - [ ] Add Performance Tests for Service Bus SDK #14314 - [ ] Expand Performance Test suite for App Config SDK #14311 - [ ] Add Latency Support to Perf Test framework #14315
non_build
azure sdks performance work iron semester add performance tests for eventhubs sdk add performance tests for search sdk add performance tests for text analytics sdk add performance tests for form recognizer sdk add performance tests for identity sdk add performance tests for service bus sdk expand performance test suite for app config sdk add latency support to perf test framework
0
22,086
3,593,662,366
IssuesEvent
2016-02-01 20:35:09
firemodels/fds-smv
https://api.github.com/repos/firemodels/fds-smv
closed
Unclear error occurs when running an FDS file with more than ~18 species
Priority-Medium Type-Defect
``` Please complete the following lines... FDS Version: 6.0.0 SVN Revision Number: 17279 Compile Date: Sun, 03 Nov 2013 Operating System: Windows 8 Describe details of the issue below: It appears a number of unclear errors can occur if you try to run an FDS file with an abnormally large number of SPECies defined. It appears that the magic number you are allowed to use is roughly 18, however this is not made clear by FDS. Instead, nonsense warnings such as: ERROR: RAMP INERT not found ERROR: SPEC AIR, sub species 1 not found. Attached are example files of 3 different cases. All have an additional 'implicit' SPEC line generated through the REAC line as well. Case 1 works, with 19 SPEC lines (1 declares LUMPED_COMPONENT_ONLY=.TRUE.) Case 2 fails with "RAMP INERT..." error, same as case 1, except the SOOT line no longer has LUMPED_COMPONENT_ONLY specified. Case 3 fails with "SPEC AIR..." error. This case has all predefined SPEC lines presented in Table 11.1 of the User's Manual. ``` Original issue reported on code.google.com by `albrecht@thunderheadeng.com` on 2013-11-22 17:59:37 <hr> * *Attachment: [case1.fds](https://storage.googleapis.com/google-code-attachments/fds-smv/issue-2034/comment-0/case1.fds)* * *Attachment: [case2.fds](https://storage.googleapis.com/google-code-attachments/fds-smv/issue-2034/comment-0/case2.fds)* * *Attachment: [case3.fds](https://storage.googleapis.com/google-code-attachments/fds-smv/issue-2034/comment-0/case3.fds)*
1.0
Unclear error occurs when running an FDS file with more than ~18 species - ``` Please complete the following lines... FDS Version: 6.0.0 SVN Revision Number: 17279 Compile Date: Sun, 03 Nov 2013 Operating System: Windows 8 Describe details of the issue below: It appears a number of unclear errors can occur if you try to run an FDS file with an abnormally large number of SPECies defined. It appears that the magic number you are allowed to use is roughly 18, however this is not made clear by FDS. Instead, nonsense warnings such as: ERROR: RAMP INERT not found ERROR: SPEC AIR, sub species 1 not found. Attached are example files of 3 different cases. All have an additional 'implicit' SPEC line generated through the REAC line as well. Case 1 works, with 19 SPEC lines (1 declares LUMPED_COMPONENT_ONLY=.TRUE.) Case 2 fails with "RAMP INERT..." error, same as case 1, except the SOOT line no longer has LUMPED_COMPONENT_ONLY specified. Case 3 fails with "SPEC AIR..." error. This case has all predefined SPEC lines presented in Table 11.1 of the User's Manual. ``` Original issue reported on code.google.com by `albrecht@thunderheadeng.com` on 2013-11-22 17:59:37 <hr> * *Attachment: [case1.fds](https://storage.googleapis.com/google-code-attachments/fds-smv/issue-2034/comment-0/case1.fds)* * *Attachment: [case2.fds](https://storage.googleapis.com/google-code-attachments/fds-smv/issue-2034/comment-0/case2.fds)* * *Attachment: [case3.fds](https://storage.googleapis.com/google-code-attachments/fds-smv/issue-2034/comment-0/case3.fds)*
non_build
unclear error occurs when running an fds file with more than species please complete the following lines fds version svn revision number compile date sun nov operating system windows describe details of the issue below it appears a number of unclear errors can occur if you try to run an fds file with an abnormally large number of species defined it appears that the magic number you are allowed to use is roughly however this is not made clear by fds instead nonsense warnings such as error ramp inert not found error spec air sub species not found attached are example files of different cases all have an additional implicit spec line generated through the reac line as well case works with spec lines declares lumped component only true case fails with ramp inert error same as case except the soot line no longer has lumped component only specified case fails with spec air error this case has all predefined spec lines presented in table of the user s manual original issue reported on code google com by albrecht thunderheadeng com on attachment attachment attachment
0
416,593
28,090,603,593
IssuesEvent
2023-03-30 12:50:20
scikit-learn/scikit-learn
https://api.github.com/repos/scikit-learn/scikit-learn
closed
Incorrect documentation for SplineTransformer.include_bias - the opposite is true
Documentation Needs Triage
### Describe the issue linked to the documentation Documentation says: > include_bias: bool, default=True > **If True (default), then the last spline element inside the data range of a feature is dropped**. As B-splines sum to one over the spline basis functions for each data point, they implicitly include a bias term, i.e. a column of ones. It acts as an intercept term in a linear models. https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.SplineTransformer.html But it seems it's exactly the opposite, with `include_bias=True` I get 3 columns, with `include_bias=False` I get 2 columns: ```python import numpy as np from sklearn.preprocessing import SplineTransformer X = np.arange(4).reshape(4, 1) spline = SplineTransformer(degree=2, n_knots=2, include_bias=True) spline.fit_transform(X) # array([[0.5 , 0.5 , 0. ], # [0.22222222, 0.72222222, 0.05555556], # [0.05555556, 0.72222222, 0.22222222], # [0. , 0.5 , 0.5 ]]) spline = SplineTransformer(degree=2, n_knots=2, include_bias=False) spline.fit_transform(X) # array([[0.5 , 0.5 ], # [0.22222222, 0.72222222], # [0.05555556, 0.72222222], # [0. , 0.5 ]]) ``` ### Suggest a potential alternative/fix _No response_
1.0
Incorrect documentation for SplineTransformer.include_bias - the opposite is true - ### Describe the issue linked to the documentation Documentation says: > include_bias: bool, default=True > **If True (default), then the last spline element inside the data range of a feature is dropped**. As B-splines sum to one over the spline basis functions for each data point, they implicitly include a bias term, i.e. a column of ones. It acts as an intercept term in a linear models. https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.SplineTransformer.html But it seems it's exactly the opposite, with `include_bias=True` I get 3 columns, with `include_bias=False` I get 2 columns: ```python import numpy as np from sklearn.preprocessing import SplineTransformer X = np.arange(4).reshape(4, 1) spline = SplineTransformer(degree=2, n_knots=2, include_bias=True) spline.fit_transform(X) # array([[0.5 , 0.5 , 0. ], # [0.22222222, 0.72222222, 0.05555556], # [0.05555556, 0.72222222, 0.22222222], # [0. , 0.5 , 0.5 ]]) spline = SplineTransformer(degree=2, n_knots=2, include_bias=False) spline.fit_transform(X) # array([[0.5 , 0.5 ], # [0.22222222, 0.72222222], # [0.05555556, 0.72222222], # [0. , 0.5 ]]) ``` ### Suggest a potential alternative/fix _No response_
non_build
incorrect documentation for splinetransformer include bias the opposite is true describe the issue linked to the documentation documentation says include bias bool default true if true default then the last spline element inside the data range of a feature is dropped as b splines sum to one over the spline basis functions for each data point they implicitly include a bias term i e a column of ones it acts as an intercept term in a linear models but it seems it s exactly the opposite with include bias true i get columns with include bias false i get columns python import numpy as np from sklearn preprocessing import splinetransformer x np arange reshape spline splinetransformer degree n knots include bias true spline fit transform x array spline splinetransformer degree n knots include bias false spline fit transform x array suggest a potential alternative fix no response
0
79,372
22,742,720,531
IssuesEvent
2022-07-07 06:11:39
google/mediapipe
https://api.github.com/repos/google/mediapipe
closed
Bitmap Converter
type:build/install platform:android stat:awaiting response solution:hair segmentation stalled
Hello. I use hair segmentation. I want to buy a mask (black & white) for hair coloring and to make changes. I looked at other issues and couldn't find the answer. OR I want to see the image I see in Surface View as a bitmap ``` processor.addPacketCallback("output_video"){ packet-> } ``` How can I convert the data I listen as a packet to a bitmap?
1.0
Bitmap Converter - Hello. I use hair segmentation. I want to buy a mask (black & white) for hair coloring and to make changes. I looked at other issues and couldn't find the answer. OR I want to see the image I see in Surface View as a bitmap ``` processor.addPacketCallback("output_video"){ packet-> } ``` How can I convert the data I listen as a packet to a bitmap?
build
bitmap converter hello i use hair segmentation i want to buy a mask black white for hair coloring and to make changes i looked at other issues and couldn t find the answer or i want to see the image i see in surface view as a bitmap processor addpacketcallback output video packet how can i convert the data i listen as a packet to a bitmap
1
12,853
15,103,986,267
IssuesEvent
2021-02-08 11:01:41
jenkinsci/configuration-as-code-plugin
https://api.github.com/repos/jenkinsci/configuration-as-code-plugin
closed
Support "Preventive Node Monitoring"
feature plugin-compatibility
### Feature Request This configuration is available from UI at `<base url>/computer/configure` ![image](https://user-images.githubusercontent.com/4091725/106894185-4b3a4f80-66f7-11eb-97b1-4b477f446485.png) It would be nice to have that configuration supported by JCasC
True
Support "Preventive Node Monitoring" - ### Feature Request This configuration is available from UI at `<base url>/computer/configure` ![image](https://user-images.githubusercontent.com/4091725/106894185-4b3a4f80-66f7-11eb-97b1-4b477f446485.png) It would be nice to have that configuration supported by JCasC
non_build
support preventive node monitoring feature request this configuration is available from ui at computer configure it would be nice to have that configuration supported by jcasc
0
181,526
21,664,369,760
IssuesEvent
2022-05-07 01:08:35
Baneeishaque/Ethiopian_Construction_Directory
https://api.github.com/repos/Baneeishaque/Ethiopian_Construction_Directory
closed
WS-2017-0268 (Low) detected in angular-v1.5.3 - autoclosed
security vulnerability
## WS-2017-0268 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>angular-v1.5.3</b></p></summary> <p></p> <p> Dependency Hierarchy: - angular-ui-router-0.2.13 (Root Library) - :x: **angular-v1.5.3** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Baneeishaque/Ethiopian_Construction_Directory/commit/62ade0b45d607e488ee01a4d9442f9d282ab0443">62ade0b45d607e488ee01a4d9442f9d282ab0443</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Both Firefox and Safari are vulnerable to XSS if we use an inert document created via `document.implementation.createHTMLDocument()`. <p>Publish Date: 2017-05-25 <p>URL: <a href=https://github.com/angular/angular.js/commit/8f31f1ff43b673a24f84422d5c13d6312b2c4d94>WS-2017-0268</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>3.4</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Change files</p> <p>Origin: <a href="https://github.com/angular/angular.js/commit/8f31f1ff43b673a24f84422d5c13d6312b2c4d94">https://github.com/angular/angular.js/commit/8f31f1ff43b673a24f84422d5c13d6312b2c4d94</a></p> <p>Release Date: 2017-06-05</p> <p>Fix Resolution: Replace or update the following files: sanitize.js, sanitizeSpec.js</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2017-0268 (Low) detected in angular-v1.5.3 - autoclosed - ## WS-2017-0268 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>angular-v1.5.3</b></p></summary> <p></p> <p> Dependency Hierarchy: - angular-ui-router-0.2.13 (Root Library) - :x: **angular-v1.5.3** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Baneeishaque/Ethiopian_Construction_Directory/commit/62ade0b45d607e488ee01a4d9442f9d282ab0443">62ade0b45d607e488ee01a4d9442f9d282ab0443</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Both Firefox and Safari are vulnerable to XSS if we use an inert document created via `document.implementation.createHTMLDocument()`. <p>Publish Date: 2017-05-25 <p>URL: <a href=https://github.com/angular/angular.js/commit/8f31f1ff43b673a24f84422d5c13d6312b2c4d94>WS-2017-0268</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>3.4</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Change files</p> <p>Origin: <a href="https://github.com/angular/angular.js/commit/8f31f1ff43b673a24f84422d5c13d6312b2c4d94">https://github.com/angular/angular.js/commit/8f31f1ff43b673a24f84422d5c13d6312b2c4d94</a></p> <p>Release Date: 2017-06-05</p> <p>Fix Resolution: Replace or update the following files: sanitize.js, sanitizeSpec.js</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_build
ws low detected in angular autoclosed ws low severity vulnerability vulnerable library angular dependency hierarchy angular ui router root library x angular vulnerable library found in head commit a href vulnerability details both firefox and safari are vulnerable to xss if we use an inert document created via document implementation createhtmldocument publish date url a href cvss score details base score metrics not available suggested fix type change files origin a href release date fix resolution replace or update the following files sanitize js sanitizespec js step up your open source security game with whitesource
0
60,222
14,737,661,643
IssuesEvent
2021-01-07 02:28:32
rust-lang/cargo
https://api.github.com/repos/rust-lang/cargo
closed
RLS and cargo check run sometimes unnecesary build.rs steps
A-build-scripts C-feature-request
When running an IDE using RLS or just running `cargo check`, some large projects using crates that have big static builds of non Rust based native libraries, the `build.rs` of those dependencies run a complete build of the non Rust code. One example is `openssl`, when using it in vendored mode, takes a lot of time to get feedback on an IDE because it will build openssl entirely, something it isn't needed for getting the information RLS or `cargo check` needs, they don't need a complete build of those crates. Maybe standardizing some kind of environment variable passed to the `build.rs` program could tell these dependencies that the build is being run for RLS or cargo check, and these scripts could be updated to be smarter, avoiding the native build in these circumstances. This will require cooperation of `build.rs` users but it is better than than the current status where an IDE takes too much time to give some feedback.
1.0
RLS and cargo check run sometimes unnecesary build.rs steps - When running an IDE using RLS or just running `cargo check`, some large projects using crates that have big static builds of non Rust based native libraries, the `build.rs` of those dependencies run a complete build of the non Rust code. One example is `openssl`, when using it in vendored mode, takes a lot of time to get feedback on an IDE because it will build openssl entirely, something it isn't needed for getting the information RLS or `cargo check` needs, they don't need a complete build of those crates. Maybe standardizing some kind of environment variable passed to the `build.rs` program could tell these dependencies that the build is being run for RLS or cargo check, and these scripts could be updated to be smarter, avoiding the native build in these circumstances. This will require cooperation of `build.rs` users but it is better than than the current status where an IDE takes too much time to give some feedback.
build
rls and cargo check run sometimes unnecesary build rs steps when running an ide using rls or just running cargo check some large projects using crates that have big static builds of non rust based native libraries the build rs of those dependencies run a complete build of the non rust code one example is openssl when using it in vendored mode takes a lot of time to get feedback on an ide because it will build openssl entirely something it isn t needed for getting the information rls or cargo check needs they don t need a complete build of those crates maybe standardizing some kind of environment variable passed to the build rs program could tell these dependencies that the build is being run for rls or cargo check and these scripts could be updated to be smarter avoiding the native build in these circumstances this will require cooperation of build rs users but it is better than than the current status where an ide takes too much time to give some feedback
1
91,042
26,256,224,185
IssuesEvent
2023-01-06 01:07:20
sonic-net/sonic-buildimage
https://api.github.com/repos/sonic-net/sonic-buildimage
opened
[Build] Failed to build isc dhcp
Build :hammer:
<!-- If you are reporting a new issue, make sure that we do not have any duplicates already open. You can ensure this by searching the issue list for this repository. If there is a duplicate, please close your issue and add a comment to the existing issue instead. If you suspect your issue is a bug, please edit your issue description to include the BUG REPORT INFORMATION shown below. If you fail to provide this information within 7 days, we cannot debug your issue and will close it. We will, however, reopen it if you later provide the information. For more information about reporting issues, see https://github.com/Azure/SONiC/wiki#report-issues --------------------------------------------------- GENERAL SUPPORT INFORMATION --------------------------------------------------- The GitHub issue tracker is for bug reports and feature requests. General support can be found at the following locations: - SONiC Support Forums - https://groups.google.com/forum/#!forum/sonicproject --------------------------------------------------- BUG REPORT INFORMATION --------------------------------------------------- Use the commands below to provide key information from your environment: You do NOT have to include this information if this is a FEATURE REQUEST --> #### Description Build failed due to ``` dget: curl isc-dhcp_4.4.1-2.3.dsc http://deb.debian.org/debian/pool/main/i/isc-dhcp/isc-dhcp_4.4.1-2.3.dsc failed ``` The root cause is that isc-dhcp_4.4.1-2.3.dsc is no longer available. #### Steps to reproduce the issue: Build sonic #### Describe the results you received: ``` dget: curl isc-dhcp_4.4.1-2.3.dsc http://deb.debian.org/debian/pool/main/i/isc-dhcp/isc-dhcp_4.4.1-2.3.dsc failed ``` #### Describe the results you expected: No error #### Output of `show version`: ``` (paste your output here) ``` #### Output of `show techsupport`: ``` (paste your output here or download and attach the file here ) ``` #### Additional information you deem important (e.g. issue happens only occasionally): <!-- Also attach debug file produced by `sudo generate_dump` -->
1.0
[Build] Failed to build isc dhcp - <!-- If you are reporting a new issue, make sure that we do not have any duplicates already open. You can ensure this by searching the issue list for this repository. If there is a duplicate, please close your issue and add a comment to the existing issue instead. If you suspect your issue is a bug, please edit your issue description to include the BUG REPORT INFORMATION shown below. If you fail to provide this information within 7 days, we cannot debug your issue and will close it. We will, however, reopen it if you later provide the information. For more information about reporting issues, see https://github.com/Azure/SONiC/wiki#report-issues --------------------------------------------------- GENERAL SUPPORT INFORMATION --------------------------------------------------- The GitHub issue tracker is for bug reports and feature requests. General support can be found at the following locations: - SONiC Support Forums - https://groups.google.com/forum/#!forum/sonicproject --------------------------------------------------- BUG REPORT INFORMATION --------------------------------------------------- Use the commands below to provide key information from your environment: You do NOT have to include this information if this is a FEATURE REQUEST --> #### Description Build failed due to ``` dget: curl isc-dhcp_4.4.1-2.3.dsc http://deb.debian.org/debian/pool/main/i/isc-dhcp/isc-dhcp_4.4.1-2.3.dsc failed ``` The root cause is that isc-dhcp_4.4.1-2.3.dsc is no longer available. #### Steps to reproduce the issue: Build sonic #### Describe the results you received: ``` dget: curl isc-dhcp_4.4.1-2.3.dsc http://deb.debian.org/debian/pool/main/i/isc-dhcp/isc-dhcp_4.4.1-2.3.dsc failed ``` #### Describe the results you expected: No error #### Output of `show version`: ``` (paste your output here) ``` #### Output of `show techsupport`: ``` (paste your output here or download and attach the file here ) ``` #### Additional information you deem important (e.g. issue happens only occasionally): <!-- Also attach debug file produced by `sudo generate_dump` -->
build
failed to build isc dhcp if you are reporting a new issue make sure that we do not have any duplicates already open you can ensure this by searching the issue list for this repository if there is a duplicate please close your issue and add a comment to the existing issue instead if you suspect your issue is a bug please edit your issue description to include the bug report information shown below if you fail to provide this information within days we cannot debug your issue and will close it we will however reopen it if you later provide the information for more information about reporting issues see general support information the github issue tracker is for bug reports and feature requests general support can be found at the following locations sonic support forums bug report information use the commands below to provide key information from your environment you do not have to include this information if this is a feature request description build failed due to dget curl isc dhcp dsc failed the root cause is that isc dhcp dsc is no longer available steps to reproduce the issue build sonic describe the results you received dget curl isc dhcp dsc failed describe the results you expected no error output of show version paste your output here output of show techsupport paste your output here or download and attach the file here additional information you deem important e g issue happens only occasionally also attach debug file produced by sudo generate dump
1
268,814
20,361,985,508
IssuesEvent
2022-02-20 20:21:54
dagster-io/dagster
https://api.github.com/repos/dagster-io/dagster
closed
[Content Gap] docs include volume mounting information and more information about code updates in DockerRunLauncher docs and/or deploy_docker example
documentation platform
### Dagster Documentation Gap here: <@U018K0G2Y85> docs include volume mounting information and more information about code updates in DockerRunLauncher docs and/or deploy_docker example This issue was generated from the slack conversation at: https://dagster.slack.com/archives/C01U954MEER/p1625086523270400?thread_ts=1625086523.270400&cid=C01U954MEER Conversation excerpt: ``` U026QQT48DQ: Hello, a question for you - I have dagster running locally in docker, using postgres for log storage etc. In the dagster GUI I am able to see the event logs, but if I click the "View stdout / stderr" link on a "Logs captured" event I see a screen with "No log file available" - is this normal? Or do I have an issue with my configuration? UH3RM70A2: to get that working with docker - you will have to configure your `ComputeLogManager` to put the files somewhere that is accessible from outside the container. This can be done with volume mounts or by uploading them externally to something like s3 <https://docs.dagster.io/deployment/dagster-instance#compute-log-storage> U026QQT48DQ: Thanks <@UH3RM70A2> - this is exactly what I was looking for! Another docker question you might be able to help with. Is there a recommended way to develop in docker? The example I followed uses the gprc image for run containers and doesn't seem to support volume mounting - if I am working on a new pipeline and want to test my work is the recommendation to rebuild that container each time? Ideally I can mount my pipeline code in the docker image and have changes sync through the volume UH3RM70A2: cc <@U016C4E5CP8> U016C4E5CP8: Hi Jake - volume mounts should work, but they also need to be included in the config for the run launcher if you're using the DockerRunLauncher U016C4E5CP8: is there a place you're not seeing it work? U026QQT48DQ: A couple of things seem not to work with pipeline code mounted in a volume. The dagster GUI isn't picking up the changes without restarting the grpc container, and the volume doesn't seem to be available in the run container. I'm away from my computer right now, but would definitely appreciate some help with this tomorrow! U016C4E5CP8: Got it - the first thing is expected (a restart of the container is required when code, a rebuild is not). The second thing can be addressed in one of two ways: a) Use the DefaultRunLauncher (which launches the run in the grpc container) instead of the DockerRunLauncher (which launches a run in a container using the same image, but won't have your mounted volumes by default) b) Configure the DockerRunLauncher in your dagster.yaml to include the mounted volume as well in the launched container. Happy to talk more about this tomorrow! This question comes up a fair amount so I'll file a task to spruce up the docs here: <@U018K0G2Y85> docs include volume mounting information and more information about code updates in DockerRunLauncher docs and/or deploy_docker example ``` --- #### Message from the maintainers: Are you looking for the same documentation content? Give it a :thumbsup:. We factor engagement into prioritization.
1.0
[Content Gap] docs include volume mounting information and more information about code updates in DockerRunLauncher docs and/or deploy_docker example - ### Dagster Documentation Gap here: <@U018K0G2Y85> docs include volume mounting information and more information about code updates in DockerRunLauncher docs and/or deploy_docker example This issue was generated from the slack conversation at: https://dagster.slack.com/archives/C01U954MEER/p1625086523270400?thread_ts=1625086523.270400&cid=C01U954MEER Conversation excerpt: ``` U026QQT48DQ: Hello, a question for you - I have dagster running locally in docker, using postgres for log storage etc. In the dagster GUI I am able to see the event logs, but if I click the "View stdout / stderr" link on a "Logs captured" event I see a screen with "No log file available" - is this normal? Or do I have an issue with my configuration? UH3RM70A2: to get that working with docker - you will have to configure your `ComputeLogManager` to put the files somewhere that is accessible from outside the container. This can be done with volume mounts or by uploading them externally to something like s3 <https://docs.dagster.io/deployment/dagster-instance#compute-log-storage> U026QQT48DQ: Thanks <@UH3RM70A2> - this is exactly what I was looking for! Another docker question you might be able to help with. Is there a recommended way to develop in docker? The example I followed uses the gprc image for run containers and doesn't seem to support volume mounting - if I am working on a new pipeline and want to test my work is the recommendation to rebuild that container each time? Ideally I can mount my pipeline code in the docker image and have changes sync through the volume UH3RM70A2: cc <@U016C4E5CP8> U016C4E5CP8: Hi Jake - volume mounts should work, but they also need to be included in the config for the run launcher if you're using the DockerRunLauncher U016C4E5CP8: is there a place you're not seeing it work? 
U026QQT48DQ: A couple of things seem not to work with pipeline code mounted in a volume. The dagster GUI isn't picking up the changes without restarting the grpc container, and the volume doesn't seem to be available in the run container. I'm away from my computer right now, but would definitely appreciate some help with this tomorrow! U016C4E5CP8: Got it - the first thing is expected (a restart of the container is required when code, a rebuild is not). The second thing can be addressed in one of two ways: a) Use the DefaultRunLauncher (which launches the run in the grpc container) instead of the DockerRunLauncher (which launches a run in a container using the same image, but won't have your mounted volumes by default) b) Configure the DockerRunLauncher in your dagster.yaml to include the mounted volume as well in the launched container. Happy to talk more about this tomorrow! This question comes up a fair amount so I'll file a task to spruce up the docs here: <@U018K0G2Y85> docs include volume mounting information and more information about code updates in DockerRunLauncher docs and/or deploy_docker example ``` --- #### Message from the maintainers: Are you looking for the same documentation content? Give it a :thumbsup:. We factor engagement into prioritization.
non_build
docs include volume mounting information and more information about code updates in dockerrunlauncher docs and or deploy docker example dagster documentation gap here docs include volume mounting information and more information about code updates in dockerrunlauncher docs and or deploy docker example this issue was generated from the slack conversation at conversation excerpt hello a question for you i have dagster running locally in docker using postgres for log storage etc in the dagster gui i am able to see the event logs but if i click the view stdout stderr link on a logs captured event i see a screen with no log file available is this normal or do i have an issue with my configuration to get that working with docker you will have to configure your computelogmanager to put the files somewhere that is accessible from outside the container this can be done with volume mounts or by uploading them externally to something like thanks this is exactly what i was looking for another docker question you might be able to help with is there a recommended way to develop in docker the example i followed uses the gprc image for run containers and doesn t seem to support volume mounting if i am working on a new pipeline and want to test my work is the recommendation to rebuild that container each time ideally i can mount my pipeline code in the docker image and have changes sync through the volume cc hi jake volume mounts should work but they also need to be included in the config for the run launcher if you re using the dockerrunlauncher is there a place you re not seeing it work a couple of things seem not to work with pipeline code mounted in a volume the dagster gui isn t picking up the changes without restarting the grpc container and the volume doesn t seem to be available in the run container i m away from my computer right now but would definitely appreciate some help with this tomorrow got it the first thing is expected a restart of the container is required when 
code a rebuild is not the second thing can be addressed in one of two ways a use the defaultrunlauncher which launches the run in the grpc container instead of the dockerrunlauncher which launches a run in a container using the same image but won t have your mounted volumes by default b configure the dockerrunlauncher in your dagster yaml to include the mounted volume as well in the launched container happy to talk more about this tomorrow this question comes up a fair amount so i ll file a task to spruce up the docs here docs include volume mounting information and more information about code updates in dockerrunlauncher docs and or deploy docker example message from the maintainers are you looking for the same documentation content give it a thumbsup we factor engagement into prioritization
0
131,979
10,726,413,713
IssuesEvent
2019-10-28 09:21:18
DivanteLtd/shopware-pwa
https://api.github.com/repos/DivanteLtd/shopware-pwa
closed
Use Shopware-6-Client in "Test eCommerce Site"
Test eCommerce SIte
Context **Acceptance criteria** * Link client in Test eCommerce Site project. * It is possible to use client methods in Test eCommerce Site. * It is possible to develop both sites (Test eCommerce Site and Shopware-6-Client) in dev environment without manual rebuilds.
1.0
Use Shopware-6-Client in "Test eCommerce Site" - Context **Acceptance criteria** * Link client in Test eCommerce Site project. * It is possible to use client methods in Test eCommerce Site. * It is possible to develop both sites (Test eCommerce Site and Shopware-6-Client) in dev environment without manual rebuilds.
non_build
use shopware client in test ecommerce site context acceptance criteria link client in test ecommerce site project it is possible to use client methods in test ecommerce site it is possible to develop both sites test ecommerce site and shopware client in dev environment without manual rebuilds
0
12,619
5,220,331,380
IssuesEvent
2017-01-26 21:34:19
cf-tm-bot/mega
https://api.github.com/repos/cf-tm-bot/mega
closed
build-docker-images/build-bbl-destroy-docker-image has failed - Story Id: 138415027
accepted broken build chore
--- Mirrors: [story 138415027](https://www.pivotaltracker.com/story/show/138415027) submitted on Jan 26, 2017 UTC - **Requester**: CF Gitbot - **Owners**: David Sabeti - **Estimate**: 0.0
1.0
build-docker-images/build-bbl-destroy-docker-image has failed - Story Id: 138415027 - --- Mirrors: [story 138415027](https://www.pivotaltracker.com/story/show/138415027) submitted on Jan 26, 2017 UTC - **Requester**: CF Gitbot - **Owners**: David Sabeti - **Estimate**: 0.0
build
build docker images build bbl destroy docker image has failed story id mirrors submitted on jan utc requester cf gitbot owners david sabeti estimate
1
19,621
5,908,937,693
IssuesEvent
2017-05-19 21:54:49
joomla/joomla-cms
https://api.github.com/repos/joomla/joomla-cms
closed
Update message when on staging
No Code Attached Yet
I am on staging and my update channel is set to default but I am being offered an update to 3.7.1 - is there an xml on the update server that hasnt been updated? @zero-24 @rdeutz <img width="785" alt="screenshotr13-56-36" src="https://cloud.githubusercontent.com/assets/1296369/26248645/1f28557a-3c9b-11e7-9286-cfd871ddcf57.png">
1.0
Update message when on staging - I am on staging and my update channel is set to default but I am being offered an update to 3.7.1 - is there an xml on the update server that hasnt been updated? @zero-24 @rdeutz <img width="785" alt="screenshotr13-56-36" src="https://cloud.githubusercontent.com/assets/1296369/26248645/1f28557a-3c9b-11e7-9286-cfd871ddcf57.png">
non_build
update message when on staging i am on staging and my update channel is set to default but i am being offered an update to is there an xml on the update server that hasnt been updated zero rdeutz img width alt src
0
67,118
16,821,917,579
IssuesEvent
2021-06-17 14:02:18
microsoft/appcenter
https://api.github.com/repos/microsoft/appcenter
closed
iOS/Android version build in distribution properties
build feature request
**Describe the solution you'd like** In the distribution build properties, there is no way of automatically identifying a build and what version of iOS/Android it was built on. It would be handy when identifying versions that may be the same but build properties were different. **Describe alternatives you've considered** Updating the build text manually, but this is not ideal.
1.0
iOS/Android version build in distribution properties - **Describe the solution you'd like** In the distribution build properties, there is no way of automatically identifying a build and what version of iOS/Android it was built on. It would be handy when identifying versions that may be the same but build properties were different. **Describe alternatives you've considered** Updating the build text manually, but this is not ideal.
build
ios android version build in distribution properties describe the solution you d like in the distribution build properties there is no way of automatically identifying a build and what version of ios android it was built on it would be handy when identifying versions that may be the same but build properties were different describe alternatives you ve considered updating the build text manually but this is not ideal
1
67,491
16,989,400,009
IssuesEvent
2021-06-30 18:20:04
hashicorp/packer-plugin-amazon
https://api.github.com/repos/hashicorp/packer-plugin-amazon
opened
AMI Tag Sharing
builder/amazon enhancement question
_This issue was originally opened by @bantl23 as hashicorp/packer#11120. It was migrated here as a result of the [Packer plugin split](https://github.com/hashicorp/packer/issues/8610#issuecomment-770034737). The original body of the issue is below._ <hr> Since ami tag sharing is not implemented in packer, what is a good way to share tags using packer? I've searched and only came up with a plugin. Is there no built in way even if the solution is running packer again?
1.0
AMI Tag Sharing - _This issue was originally opened by @bantl23 as hashicorp/packer#11120. It was migrated here as a result of the [Packer plugin split](https://github.com/hashicorp/packer/issues/8610#issuecomment-770034737). The original body of the issue is below._ <hr> Since ami tag sharing is not implemented in packer, what is a good way to share tags using packer? I've searched and only came up with a plugin. Is there no built in way even if the solution is running packer again?
build
ami tag sharing this issue was originally opened by as hashicorp packer it was migrated here as a result of the the original body of the issue is below since ami tag sharing is not implemented in packer what is a good way to share tags using packer i ve searched and only came up with a plugin is there no built in way even if the solution is running packer again
1
82,374
23,755,111,003
IssuesEvent
2022-09-01 01:48:05
tensorflow/tensorflow
https://api.github.com/repos/tensorflow/tensorflow
reopened
Bazel CI Failing for Tensorflow with Bazel@HEAD
stat:awaiting tensorflower type:build/install subtype:bazel
Tensorflow is impacted by an incompatible Bazel change: https://github.com/bazelbuild/continuous-integration/issues/1404 Tensorflow will need to migrate usages of `@bazel_tools//platforms` to `@platforms`, as described in the above bug. Example breakage: https://buildkite.com/bazel/bazel-at-head-plus-downstream/builds/2590#0182aec8-7365-4995-81b0-a0494cafc3ea
1.0
Bazel CI Failing for Tensorflow with Bazel@HEAD - Tensorflow is impacted by an incompatible Bazel change: https://github.com/bazelbuild/continuous-integration/issues/1404 Tensorflow will need to migrate usages of `@bazel_tools//platforms` to `@platforms`, as described in the above bug. Example breakage: https://buildkite.com/bazel/bazel-at-head-plus-downstream/builds/2590#0182aec8-7365-4995-81b0-a0494cafc3ea
build
bazel ci failing for tensorflow with bazel head tensorflow is impacted by an incompatible bazel change tensorflow will need to migrate usages of bazel tools platforms to platforms as described in the above bug example breakage
1
10,311
4,750,858,487
IssuesEvent
2016-10-22 15:30:15
fdiogo/bityn
https://api.github.com/repos/fdiogo/bityn
closed
macOS Build Pipeline
build feature hacktoberfest
Setup the build pipeline for macOS so it can easily be used when a new release is pushed. Docs: http://electron.atom.io/docs/development/build-instructions-osx/
1.0
macOS Build Pipeline - Setup the build pipeline for macOS so it can easily be used when a new release is pushed. Docs: http://electron.atom.io/docs/development/build-instructions-osx/
build
macos build pipeline setup the build pipeline for macos so it can easily be used when a new release is pushed docs
1
25,745
19,094,897,355
IssuesEvent
2021-11-29 15:44:25
RasaHQ/rasa
https://api.github.com/repos/RasaHQ/rasa
closed
Performance assessment of slot mappings feature
area:rasa-oss :ferris_wheel: area:rasa-oss/infrastructure :bullettrain_front: priority:high feature:3.0/slot-mappings 3.0.0rc-QA-issue
**Definition of Done**: - [ ] benchmark performance effects of running the default action `action_extract_slots` after every user utterance to check if this adds any extra latency
1.0
Performance assessment of slot mappings feature - **Definition of Done**: - [ ] benchmark performance effects of running the default action `action_extract_slots` after every user utterance to check if this adds any extra latency
non_build
performance assessment of slot mappings feature definition of done benchmark performance effects of running the default action action extract slots after every user utterance to check if this adds any extra latency
0
44,703
11,492,679,940
IssuesEvent
2020-02-11 21:28:45
teserakt-io/libe4
https://api.github.com/repos/teserakt-io/libe4
closed
Compile-time configurable storage sizes
A-Storage B-build C-Enhancement P-MED
Currently, we hard-code maximum storage quotas inside the code, for example, in e4c_store_file.c we have: #define E4_TOPICS_MAX 100 We should probably build the code in such a way that this can be configured according to various memory profiles.
1.0
Compile-time configurable storage sizes - Currently, we hard-code maximum storage quotas inside the code, for example, in e4c_store_file.c we have: #define E4_TOPICS_MAX 100 We should probably build the code in such a way that this can be configured according to various memory profiles.
build
compile time configurable storage sizes currently we hard code maximum storage quotas inside the code for example in store file c we have define topics max we should probably build the code in such a way that this can be configured according to various memory profiles
1
14,061
3,373,164,425
IssuesEvent
2015-11-24 05:08:44
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
Whitespace in front of CA bundle
area/balancer-ssl kind/bug status/reopened status/to-test
Rancher v0.43.1 Cattle v0.106.0 User Interface v0.63.0 Rancher Compose v0.5.0 Tried to use a Comodo DV cert with a HTTPS load balancer. I couldn't figure out why a cerificate from our chain was missing, but then I checked the HAproxy container. The intermediate was prefixed with a space so the entire first intermediate was discarded. API shows no whitespace in the bundle. Bundle https://u.pomf.io/weuwpa.txt Cert https://u.pomf.io/bpyswm.txt
1.0
Whitespace in front of CA bundle - Rancher v0.43.1 Cattle v0.106.0 User Interface v0.63.0 Rancher Compose v0.5.0 Tried to use a Comodo DV cert with a HTTPS load balancer. I couldn't figure out why a cerificate from our chain was missing, but then I checked the HAproxy container. The intermediate was prefixed with a space so the entire first intermediate was discarded. API shows no whitespace in the bundle. Bundle https://u.pomf.io/weuwpa.txt Cert https://u.pomf.io/bpyswm.txt
non_build
whitespace in front of ca bundle rancher cattle user interface rancher compose tried to use a comodo dv cert with a https load balancer i couldn t figure out why a cerificate from our chain was missing but then i checked the haproxy container the intermediate was prefixed with a space so the entire first intermediate was discarded api shows no whitespace in the bundle bundle cert
0
92,161
26,599,248,489
IssuesEvent
2023-01-23 14:41:36
spack/spack
https://api.github.com/repos/spack/spack
opened
axom@0.7.0 +cuda %gcc@11.1.0 fails: RAJA/policy/cuda/reduce.hpp:453:5: there are no arguments to '__syncthreads' that depend on a template parameter
build-error cuda e4s
### Steps to reproduce the issue `axom@0.7.0 +cuda cuda_arch=80` fails to build using: * Ubuntu 20.04, GCC 11.1.0, X86_64 * `spack@develop` (f593309b4e from `Sat Jan 21 10:17:53 2023 -0500`) * CUDA 11.7.0, Spack-installed * BLT 0.5.2 * RAJA 2022.03.0 This issue is affecting builds on Perlmutter with GCC 11.2.0 as well. Spack environment: [spack.yaml.txt](https://github.com/spack/spack/files/10480624/spack.yaml.txt) <details><summary>Concretization</summary><pre> - oila7ki axom@0.7.0%gcc@11.1.0+cpp14+cuda~devtools+examples+fortran+hdf5~ipo+lua~mfem+mpi+openmp~python+raja~rocm~scr+shared+tools+umpire build_system=cmake build_type=RelWithDebInfo cuda_arch=80 arch=linux-ubuntu20.04-x86_64 [+] gjnzmcb ^blt@0.5.2%gcc@11.1.0 build_system=generic arch=linux-ubuntu20.04-x86_64 [+] brghmpd ^cmake@3.25.1%gcc@11.1.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-ubuntu20.04-x86_64 [+] uvosrzt ^ncurses@6.4%gcc@11.1.0~symlinks+termlib abi=none build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] 53uimxu ^openssl@1.1.1s%gcc@11.1.0~docs~shared build_system=generic certs=mozilla arch=linux-ubuntu20.04-x86_64 [+] w233qs7 ^ca-certificates-mozilla@2023-01-10%gcc@11.1.0 build_system=generic arch=linux-ubuntu20.04-x86_64 [+] vjctwmd ^perl@5.36.0%gcc@11.1.0+cpanm+open+shared+threads build_system=generic arch=linux-ubuntu20.04-x86_64 [+] alkzfee ^berkeley-db@18.1.40%gcc@11.1.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=linux-ubuntu20.04-x86_64 [+] 4w6pcqh ^bzip2@1.0.8%gcc@11.1.0~debug~pic+shared build_system=generic arch=linux-ubuntu20.04-x86_64 [+] qdviltn ^gdbm@1.23%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] jfxuokb ^conduit@0.8.6%gcc@11.1.0~adios+blt_find_mpi~caliper~doc~doxygen+examples+fortran+hdf5+hdf5_compat~ipo+mpi+parmetis~python+shared~silo+test+utilities~zfp build_system=cmake build_type=RelWithDebInfo arch=linux-ubuntu20.04-x86_64 [+] to42dw5 ^metis@5.1.0%gcc@11.1.0~gdb~int64~ipo~real64+shared 
build_system=cmake build_type=RelWithDebInfo patches=4991da9,93a7903,b1225da arch=linux-ubuntu20.04-x86_64 [+] pg4zgow ^parmetis@4.0.3%gcc@11.1.0~gdb~int64~ipo+shared build_system=cmake build_type=RelWithDebInfo patches=4f89253,50ed208,704b84f arch=linux-ubuntu20.04-x86_64 [+] uudof2f ^cuda@11.7.0%gcc@11.1.0~allow-unsupported-compilers~dev build_system=generic arch=linux-ubuntu20.04-x86_64 [+] ndptfbi ^libxml2@2.10.3%gcc@11.1.0~python build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] 5f66p4a ^libiconv@1.17%gcc@11.1.0 build_system=autotools libs=shared,static arch=linux-ubuntu20.04-x86_64 [+] 4glpvrf ^xz@5.2.7%gcc@11.1.0+pic build_system=autotools libs=shared,static arch=linux-ubuntu20.04-x86_64 [+] ngjfkdb ^hdf5@1.8.21%gcc@11.1.0~cxx+fortran+hl~ipo+mpi+shared~szip~threadsafe+tools api=default build_system=cmake build_type=RelWithDebInfo patches=0e20187,b61e2f0 arch=linux-ubuntu20.04-x86_64 [+] uorrjkf ^pkgconf@1.8.0%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] 55gmmg7 ^zlib@1.2.13%gcc@11.1.0+optimize+pic+shared build_system=makefile arch=linux-ubuntu20.04-x86_64 [+] ox62a6d ^lua@5.4.4%gcc@11.1.0~pcfile+shared build_system=makefile fetcher=curl arch=linux-ubuntu20.04-x86_64 [+] dxygujy ^curl@7.85.0%gcc@11.1.0~gssapi~ldap~libidn2~librtmp~libssh~libssh2~nghttp2 build_system=autotools libs=shared,static tls=openssl arch=linux-ubuntu20.04-x86_64 [+] vxrl4pn ^readline@8.2%gcc@11.1.0 build_system=autotools patches=bbf97f1 arch=linux-ubuntu20.04-x86_64 [+] dkpfbr2 ^unzip@6.0%gcc@11.1.0 build_system=makefile arch=linux-ubuntu20.04-x86_64 [+] qfrpz5h ^mpich@4.0.2%gcc@11.1.0~argobots~cuda+fortran+hwloc+hydra+libxml2+pci~rocm+romio~slurm~two_level_namespace~vci~verbs~wrapperrpath build_system=autotools datatype-engine=auto device=ch4 netmod=ofi patches=d4c0e99 pmi=pmi arch=linux-ubuntu20.04-x86_64 [+] bqre6so ^findutils@4.9.0%gcc@11.1.0 build_system=autotools patches=440b954 arch=linux-ubuntu20.04-x86_64 [+] g4tnevt 
^hwloc@2.9.0%gcc@11.1.0~cairo~cuda~gl~libudev+libxml2~netloc~nvml~oneapi-level-zero~opencl+pci~rocm build_system=autotools libs=shared,static arch=linux-ubuntu20.04-x86_64 [+] ieh7e7v ^libfabric@1.16.1%gcc@11.1.0~debug~kdreg build_system=autotools fabrics=rxm,sockets,tcp,udp arch=linux-ubuntu20.04-x86_64 [+] srwvvk5 ^libpciaccess@0.16%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] bntep2v ^libtool@2.4.7%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] i5p6x5b ^util-macros@1.19.3%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] gjvbvvm ^yaksa@0.2%gcc@11.1.0~cuda~rocm build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] rri3iyz ^autoconf@2.69%gcc@11.1.0 build_system=autotools patches=35c4492,7793209,a49dd5b arch=linux-ubuntu20.04-x86_64 [+] n7fisjn ^automake@1.16.5%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] qdxsvxc ^m4@1.4.19%gcc@11.1.0+sigsegv build_system=autotools patches=9dc5fbd,bfdffa7 arch=linux-ubuntu20.04-x86_64 [+] zbe45vu ^diffutils@3.8%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] lypoofr ^libsigsegv@2.13%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] dycqarv ^python@3.8.13%gcc@11.1.0+bz2+crypt+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tkinter+uuid+zlib build_system=generic patches=0d98e93,4c24573,f2fd060 arch=linux-ubuntu20.04-x86_64 [+] 4qorfjg ^expat@2.5.0%gcc@11.1.0+libbsd build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] bstssvl ^libbsd@0.11.7%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] cb26zci ^libmd@1.0.4%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] ynfqffs ^gettext@0.21.1%gcc@11.1.0+bzip2+curses+git~libunistring+libxml2+tar+xz build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] grgygjr ^tar@1.34%gcc@11.1.0 build_system=autotools zip=pigz arch=linux-ubuntu20.04-x86_64 [+] 6fhsko7 ^pigz@2.7%gcc@11.1.0 
build_system=makefile arch=linux-ubuntu20.04-x86_64 [+] nwwllvp ^zstd@1.5.2%gcc@11.1.0+programs build_system=makefile compression=none libs=shared,static arch=linux-ubuntu20.04-x86_64 [+] mgim53z ^libffi@3.4.3%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] jky7lis ^libxcrypt@4.4.33%gcc@11.1.0~obsolete_api build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] nuy4oha ^sqlite@3.40.1%gcc@11.1.0+column_metadata+dynamic_extensions+fts~functions+rtree build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] i5b6dq6 ^util-linux-uuid@2.38.1%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] ias7lnj ^raja@2022.03.0%gcc@11.1.0+cuda+examples+exercises~ipo+openmp~rocm+shared~tests build_system=cmake build_type=RelWithDebInfo cuda_arch=80 arch=linux-ubuntu20.04-x86_64 [+] xk7o37x ^camp@2022.03.2%gcc@11.1.0+cuda~ipo+openmp~rocm~tests build_system=cmake build_type=RelWithDebInfo cuda_arch=80 arch=linux-ubuntu20.04-x86_64 [+] mvcn2op ^cub@1.16.0%gcc@11.1.0 build_system=generic arch=linux-ubuntu20.04-x86_64 [+] u5ufv43 ^umpire@2022.03.1%gcc@11.1.0+c+cuda+device_alloc~deviceconst+examples~fortran~ipo~numa+openmp~rocm~shared build_system=cmake build_type=RelWithDebInfo cuda_arch=80 tests=none arch=linux-ubuntu20.04-x86_64 </pre></details> Install error: ``` $> spack -e . install ... ==> Installing axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime ==> No binary for axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime found: installing from source ==> Using cached archive: /spack/var/spack/cache/_source-cache/git//LLNL/axom.git/v0.7.0.tar.gz ==> Warning: Fetching from mirror without a checksum! This package is normally checked out from a version control system, but it has been archived on a spack mirror. This means we cannot know a checksum for the tarball in advance. Be sure that your connection to this mirror is secure! ==> Ran patch() for axom ==> axom: Executing phase: 'initconfig' ==> axom: Executing phase: 'cmake' ==> axom: Executing phase: 'build' ... 
>> 39028 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:453:5: error: there are no arguments to '__syncthreads' that depend on a template parameter, so a declaration of '__syncthreads' must be available [-fpermissive] 39029 453 | __syncthreads(); 39030 | ^~~~~~~~~~~~~ 39031 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:453:5: note: (if you use '-fpermissive', G++ will accept your code, but allowing the use of an undeclared name is deprecated) >> 39032 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:458:28: error: 'RAJA::policy::cuda' has not been declared 39033 458 | if (warpId * policy::cuda::WARP_SIZE < numThreads) { 39034 | ^~~~ >> 39035 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:459:20: error: request for member 'get' in 'sd->', which is of non-class type 'int' 39036 459 | temp = sd->get(warpId); 39037 | ^~~ >> 39038 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:464:35: error: 'RAJA::policy::cuda' has not been declared 39039 464 | for (int i = 1; i < policy::cuda::MAX_WARPS; i *= 2) { ...
>> 39101 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:509:15: error: there are no arguments to '__syncthreads_or' that depend on a template parameter, so a declaration of '__syncthreads_or' must be available [-fpermissive] 39102 509 | lastBlock = __syncthreads_or(lastBlock); 39103 | ^~~~~~~~~~~~~~~~ 39104 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp: In function 'bool RAJA::cuda::impl::grid_reduce_atomic(T&, T, T*, unsigned int*)': >> 39105 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:539:19: error: 'gridDim' was not declared in this scope 39106 539 | int numBlocks = gridDim.x * gridDim.y * gridDim.z; 39107 | ^~~~~~~ >> 39108 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:542:18: error: 'threadIdx' was not declared in this scope; did you mean 'threadId'? 39109 542 | int threadId = threadIdx.x + blockDim.x * threadIdx.y + 39110 | ^~~~~~~~~ 39111 | threadId 39112 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ >> 39113 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:542:32: error: 'blockDim' was not declared in this scope 39114 542 | int threadId = threadIdx.x + blockDim.x * threadIdx.y + 39115 | ^~~~~~~~ >> 39116 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:547:30: error: '::atomicCAS' has not been declared; did you mean 'RAJA::atomicCAS'?
39117 547 | unsigned int old_val = ::atomicCAS(device_count, 0u, 1u); 39118 | ^~~~~~~~~ 39119 | RAJA::atomicCAS 39120 In file included from /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/atomic.hpp:32, 39121 from /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:47, 39122 from /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/tensor/arch/cuda/cuda_warp.hpp:27, ... >> 39439 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:1172:20: error: 'cuda_reduce_base' was not declared in this scope 39440 1172 | class ReduceMinLoc<cuda_reduce_base<maybe_atomic>, T, IndexType> 39441 | ^~~~~~~~~~~~~~~~ >> 39442 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:1172:49: error: wrong number of template arguments (1, should be at least 2) 39443 1172 | class ReduceMinLoc<cuda_reduce_base<maybe_atomic>, T, IndexType> ...
>> 39541 make[1]: *** [CMakeFiles/Makefile2:2648: axom/klee/CMakeFiles/klee.dir/all] Error 2 39542 [ 27%] Building CUDA object axom/mint/CMakeFiles/mint.dir/utils/su2_utils.cpp.o 39543 cd /tmp/root/spack-stage/spack-stage-axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime/spack-build-oila7ki/axom/mint && /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/cuda-11.7.0-uudof2fpqnxpwwyj3pbftsiibhcmqqgp/bin/nvcc -forward-unknown-to-host-compiler -ccbin=/usr/bin/g++ -DAXOM_DEBUG -DCAMP_HAVE_CUDA --options-file CMakeFiles/mint.dir/includes_CUDA.rsp -restrict --expt-extended-lambda -arch sm_80 -std=c++14 -O2 -g -DNDEBUG --generate-code=arch=compute_80,code=[compute_80,sm_80] -Xcompiler=-fPIC -Xcompiler=-fopenmp -fallow-argument-mismatch -std=c++14 -MD -MT axom/mint/CMakeFiles/mint.dir/utils/su2_utils.cpp.o -MF CMakeFiles/mint.dir/utils/su2_utils.cpp.o.d -x cu -rdc=true -c /tmp/root/spack-stage/spack-stage-axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime/spack-src/src/axom/mint/utils/su2_utils.cpp -o CMakeFiles/mint.dir/utils/su2_utils.cpp.o 39544 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39545 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39546 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39547 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ ...
39694 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39695 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39696 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39697 make[2]: Leaving directory '/tmp/root/spack-stage/spack-stage-axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime/spack-build-oila7ki' 39698 [ 35%] Built target quest 39699 make[1]: Leaving directory '/tmp/root/spack-stage/spack-stage-axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime/spack-build-oila7ki' >> 39700 make: *** [Makefile:139: all] Error 2 ``` ### Error message <details><summary>Error message</summary><pre> ... see above </pre></details> ### Information on your system * **Spack:** 0.20.0.dev0 (6fef543a6f3aaa4155082c54762a6852d732b999) * **Python:** 3.8.10 * **Platform:** linux-ubuntu20.04-cascadelake * **Concretizer:** clingo ### Additional information [spack-build-out.txt](https://github.com/spack/spack/files/10480681/spack-build-out.txt) [spack-build-env.txt](https://github.com/spack/spack/files/10480683/spack-build-env.txt) [CMakeError.log](https://github.com/spack/spack/files/10480685/CMakeError.log) [CMakeOutput.log](https://github.com/spack/spack/files/10480686/CMakeOutput.log) @white238 @wspear ### General information - [X] I have run `spack debug report` and reported the version of Spack/Python/Platform - [X] I have run `spack maintainers <name-of-the-package>` and **@mentioned** any maintainers - [X] I have uploaded the build log and environment files - [X] I have searched the issues of this repo and believe this is not a duplicate
1.0
axom@0.7.0 +cuda %gcc@11.1.0 fails: RAJA/policy/cuda/reduce.hpp:453:5: there are no arguments to '__syncthreads' that depend on a template parameter - ### Steps to reproduce the issue `axom@0.7.0 +cuda cuda_arch=80` fails to build using: * Ubuntu 20.04, GCC 11.1.0, X86_64 * `spack@develop` (f593309b4e from `Sat Jan 21 10:17:53 2023 -0500`) * CUDA 11.7.0, Spack-installed * BLT 0.5.2 * RAJA 2022.03.0 This issue is affecting builds on Perlmutter with GCC 11.2.0 as well. Spack environment: [spack.yaml.txt](https://github.com/spack/spack/files/10480624/spack.yaml.txt) <details><summary>Concretization</summary><pre> - oila7ki axom@0.7.0%gcc@11.1.0+cpp14+cuda~devtools+examples+fortran+hdf5~ipo+lua~mfem+mpi+openmp~python+raja~rocm~scr+shared+tools+umpire build_system=cmake build_type=RelWithDebInfo cuda_arch=80 arch=linux-ubuntu20.04-x86_64 [+] gjnzmcb ^blt@0.5.2%gcc@11.1.0 build_system=generic arch=linux-ubuntu20.04-x86_64 [+] brghmpd ^cmake@3.25.1%gcc@11.1.0~doc+ncurses+ownlibs~qt build_system=generic build_type=Release arch=linux-ubuntu20.04-x86_64 [+] uvosrzt ^ncurses@6.4%gcc@11.1.0~symlinks+termlib abi=none build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] 53uimxu ^openssl@1.1.1s%gcc@11.1.0~docs~shared build_system=generic certs=mozilla arch=linux-ubuntu20.04-x86_64 [+] w233qs7 ^ca-certificates-mozilla@2023-01-10%gcc@11.1.0 build_system=generic arch=linux-ubuntu20.04-x86_64 [+] vjctwmd ^perl@5.36.0%gcc@11.1.0+cpanm+open+shared+threads build_system=generic arch=linux-ubuntu20.04-x86_64 [+] alkzfee ^berkeley-db@18.1.40%gcc@11.1.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=linux-ubuntu20.04-x86_64 [+] 4w6pcqh ^bzip2@1.0.8%gcc@11.1.0~debug~pic+shared build_system=generic arch=linux-ubuntu20.04-x86_64 [+] qdviltn ^gdbm@1.23%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] jfxuokb 
^conduit@0.8.6%gcc@11.1.0~adios+blt_find_mpi~caliper~doc~doxygen+examples+fortran+hdf5+hdf5_compat~ipo+mpi+parmetis~python+shared~silo+test+utilities~zfp build_system=cmake build_type=RelWithDebInfo arch=linux-ubuntu20.04-x86_64 [+] to42dw5 ^metis@5.1.0%gcc@11.1.0~gdb~int64~ipo~real64+shared build_system=cmake build_type=RelWithDebInfo patches=4991da9,93a7903,b1225da arch=linux-ubuntu20.04-x86_64 [+] pg4zgow ^parmetis@4.0.3%gcc@11.1.0~gdb~int64~ipo+shared build_system=cmake build_type=RelWithDebInfo patches=4f89253,50ed208,704b84f arch=linux-ubuntu20.04-x86_64 [+] uudof2f ^cuda@11.7.0%gcc@11.1.0~allow-unsupported-compilers~dev build_system=generic arch=linux-ubuntu20.04-x86_64 [+] ndptfbi ^libxml2@2.10.3%gcc@11.1.0~python build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] 5f66p4a ^libiconv@1.17%gcc@11.1.0 build_system=autotools libs=shared,static arch=linux-ubuntu20.04-x86_64 [+] 4glpvrf ^xz@5.2.7%gcc@11.1.0+pic build_system=autotools libs=shared,static arch=linux-ubuntu20.04-x86_64 [+] ngjfkdb ^hdf5@1.8.21%gcc@11.1.0~cxx+fortran+hl~ipo+mpi+shared~szip~threadsafe+tools api=default build_system=cmake build_type=RelWithDebInfo patches=0e20187,b61e2f0 arch=linux-ubuntu20.04-x86_64 [+] uorrjkf ^pkgconf@1.8.0%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] 55gmmg7 ^zlib@1.2.13%gcc@11.1.0+optimize+pic+shared build_system=makefile arch=linux-ubuntu20.04-x86_64 [+] ox62a6d ^lua@5.4.4%gcc@11.1.0~pcfile+shared build_system=makefile fetcher=curl arch=linux-ubuntu20.04-x86_64 [+] dxygujy ^curl@7.85.0%gcc@11.1.0~gssapi~ldap~libidn2~librtmp~libssh~libssh2~nghttp2 build_system=autotools libs=shared,static tls=openssl arch=linux-ubuntu20.04-x86_64 [+] vxrl4pn ^readline@8.2%gcc@11.1.0 build_system=autotools patches=bbf97f1 arch=linux-ubuntu20.04-x86_64 [+] dkpfbr2 ^unzip@6.0%gcc@11.1.0 build_system=makefile arch=linux-ubuntu20.04-x86_64 [+] qfrpz5h 
^mpich@4.0.2%gcc@11.1.0~argobots~cuda+fortran+hwloc+hydra+libxml2+pci~rocm+romio~slurm~two_level_namespace~vci~verbs~wrapperrpath build_system=autotools datatype-engine=auto device=ch4 netmod=ofi patches=d4c0e99 pmi=pmi arch=linux-ubuntu20.04-x86_64 [+] bqre6so ^findutils@4.9.0%gcc@11.1.0 build_system=autotools patches=440b954 arch=linux-ubuntu20.04-x86_64 [+] g4tnevt ^hwloc@2.9.0%gcc@11.1.0~cairo~cuda~gl~libudev+libxml2~netloc~nvml~oneapi-level-zero~opencl+pci~rocm build_system=autotools libs=shared,static arch=linux-ubuntu20.04-x86_64 [+] ieh7e7v ^libfabric@1.16.1%gcc@11.1.0~debug~kdreg build_system=autotools fabrics=rxm,sockets,tcp,udp arch=linux-ubuntu20.04-x86_64 [+] srwvvk5 ^libpciaccess@0.16%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] bntep2v ^libtool@2.4.7%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] i5p6x5b ^util-macros@1.19.3%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] gjvbvvm ^yaksa@0.2%gcc@11.1.0~cuda~rocm build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] rri3iyz ^autoconf@2.69%gcc@11.1.0 build_system=autotools patches=35c4492,7793209,a49dd5b arch=linux-ubuntu20.04-x86_64 [+] n7fisjn ^automake@1.16.5%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] qdxsvxc ^m4@1.4.19%gcc@11.1.0+sigsegv build_system=autotools patches=9dc5fbd,bfdffa7 arch=linux-ubuntu20.04-x86_64 [+] zbe45vu ^diffutils@3.8%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] lypoofr ^libsigsegv@2.13%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] dycqarv ^python@3.8.13%gcc@11.1.0+bz2+crypt+ctypes+dbm~debug+libxml2+lzma~nis~optimizations+pic+pyexpat+pythoncmd+readline+shared+sqlite3+ssl~tkinter+uuid+zlib build_system=generic patches=0d98e93,4c24573,f2fd060 arch=linux-ubuntu20.04-x86_64 [+] 4qorfjg ^expat@2.5.0%gcc@11.1.0+libbsd build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] bstssvl ^libbsd@0.11.7%gcc@11.1.0 build_system=autotools 
arch=linux-ubuntu20.04-x86_64 [+] cb26zci ^libmd@1.0.4%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] ynfqffs ^gettext@0.21.1%gcc@11.1.0+bzip2+curses+git~libunistring+libxml2+tar+xz build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] grgygjr ^tar@1.34%gcc@11.1.0 build_system=autotools zip=pigz arch=linux-ubuntu20.04-x86_64 [+] 6fhsko7 ^pigz@2.7%gcc@11.1.0 build_system=makefile arch=linux-ubuntu20.04-x86_64 [+] nwwllvp ^zstd@1.5.2%gcc@11.1.0+programs build_system=makefile compression=none libs=shared,static arch=linux-ubuntu20.04-x86_64 [+] mgim53z ^libffi@3.4.3%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] jky7lis ^libxcrypt@4.4.33%gcc@11.1.0~obsolete_api build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] nuy4oha ^sqlite@3.40.1%gcc@11.1.0+column_metadata+dynamic_extensions+fts~functions+rtree build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] i5b6dq6 ^util-linux-uuid@2.38.1%gcc@11.1.0 build_system=autotools arch=linux-ubuntu20.04-x86_64 [+] ias7lnj ^raja@2022.03.0%gcc@11.1.0+cuda+examples+exercises~ipo+openmp~rocm+shared~tests build_system=cmake build_type=RelWithDebInfo cuda_arch=80 arch=linux-ubuntu20.04-x86_64 [+] xk7o37x ^camp@2022.03.2%gcc@11.1.0+cuda~ipo+openmp~rocm~tests build_system=cmake build_type=RelWithDebInfo cuda_arch=80 arch=linux-ubuntu20.04-x86_64 [+] mvcn2op ^cub@1.16.0%gcc@11.1.0 build_system=generic arch=linux-ubuntu20.04-x86_64 [+] u5ufv43 ^umpire@2022.03.1%gcc@11.1.0+c+cuda+device_alloc~deviceconst+examples~fortran~ipo~numa+openmp~rocm~shared build_system=cmake build_type=RelWithDebInfo cuda_arch=80 tests=none arch=linux-ubuntu20.04-x86_64 </pre></details> Install error: ``` $> spack -e . install ... 
==> Installing axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime ==> No binary for axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime found: installing from source ==> Using cached archive: /spack/var/spack/cache/_source-cache/git//LLNL/axom.git/v0.7.0.tar.gz ==> Warning: Fetching from mirror without a checksum! This package is normally checked out from a version control system, but it has been archived on a spack mirror. This means we cannot know a checksum for the tarball in advance. Be sure that your connection to this mirror is secure! ==> Ran patch() for axom ==> axom: Executing phase: 'initconfig' ==> axom: Executing phase: 'cmake' ==> axom: Executing phase: 'build' ... >> 39028 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:453:5: error: there are no arguments to '__syncthreads' that depend on a template parameter, so a declaration of '__syncthreads' must be available [-fpermissive] 39029 453 | __syncthreads(); 39030 | ^~~~~~~~~~~~~ 39031 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:453:5: note: (if you use '-fpermissive', G++ will accept your code, but allowing the use of an undeclared name is deprecated) >> 39032 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:458:28: error: 'RAJA::policy::cuda' has not been declared 39033 458 | if (warpId * policy::cuda::WARP_SIZE < numThreads) { 39034 | ^~~~ >> 39035 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:459:20: error: request for member 'get' in 'sd->', which is of non-class type 'int' 39036 459 | temp = sd->get(warpId); 39037 | ^~~ >> 39038 
/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:464:35: error: 'RAJA::policy::cuda' has not been declared 39039 464 | for (int i = 1; i < policy::cuda::MAX_WARPS; i *= 2) { ... >> 39101 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:509:15: error: there are no arguments to '__syncthreads_or' that depend on a template parameter, so a declaration of '__syncthreads_or' must be available [-fpermissive] 39102 509 | lastBlock = __syncthreads_or(lastBlock); 39103 | ^~~~~~~~~~~~~~~~ 39104 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp: In function 'bool RAJA::cuda::impl::grid_reduce_atomic(T&, T, T*, unsigned int*)': >> 39105 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:539:19: error: 'gridDim' was not declared in this scope 39106 539 | int numBlocks = gridDim.x * gridDim.y * gridDim.z; 39107 | ^~~~~~~ >> 39108 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:542:18: error: 'threadIdx' was not declared in this scope; did you mean 'threadId'? 
39109 542 | int threadId = threadIdx.x + blockDim.x * threadIdx.y + 39110 | ^~~~~~~~~ 39111 | threadId 39112 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ >> 39113 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:542:32: error: 'blockDim' was not declared in this scope 39114 542 | int threadId = threadIdx.x + blockDim.x * threadIdx.y + 39115 | ^~~~~~~~ >> 39116 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:547:30: error: '::atomicCAS' has not been declared; did you mean 'RAJA::atomicCAS'? 39117 547 | unsigned int old_val = ::atomicCAS(device_count, 0u, 1u); 39118 | ^~~~~~~~~ 39119 | RAJA::atomicCAS 39120 In file included from /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/atomic.hpp:32, 39121 from /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:47, 39122 from /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/tensor/arch/cuda/cuda_warp.hpp:27, ... >> 39439 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:1172:20: error: 'cuda_reduce_base' was not declared in this scope 39440 1172 | class ReduceMinLoc<cuda_reduce_base<maybe_atomic>, T, IndexType> 39441 | ^~~~~~~~~~~~~~~~ >> 39442 /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/raja-2022.03.0-ias7lnjxaqqivv5liysppluhxqjxrs3d/include/RAJA/policy/cuda/reduce.hpp:1172:49: error: wrong number of template arguments (1, should be at least 2) 39443 1172 | class ReduceMinLoc<cuda_reduce_base<maybe_atomic>, T, IndexType> ... 
>> 39541 make[1]: *** [CMakeFiles/Makefile2:2648: axom/klee/CMakeFiles/klee.dir/all] Error 2 39542 [ 27%] Building CUDA object axom/mint/CMakeFiles/mint.dir/utils/su2_utils.cpp.o 39543 cd /tmp/root/spack-stage/spack-stage-axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime/spack-build-oila7ki/axom/mint && /spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-11.1.0/cuda-11.7.0-uudof2fpqnxpwwyj3pbftsiibhcmqqgp/bin/nvcc -forward-unknown-to-host-compiler -ccbin=/usr/bin/g++ -DAXOM_DEBUG -DCAMP_HAVE_CUDA --options-file CMakeFiles/mint.dir/includes_CUDA.rsp -restrict --expt-extended-lambda -arch sm_80 -std=c++14 -O2 -g -DNDEBUG --generate-code=arch=compute_80,code=[compute_80,sm_80] -Xcompiler=-fPIC -Xcompiler=-fopenmp -fallow-argument-mismatch -std=c++14 -MD -MT axom/mint/CMakeFiles/mint.dir/utils/su2_utils.cpp.o -MF CMakeFiles/mint.dir/utils/su2_utils.cpp.o.d -x cu -rdc=true -c /tmp/root/spack-stage/spack-stage-axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime/spack-src/src/axom/mint/utils/su2_utils.cpp -o CMakeFiles/mint.dir/utils/su2_utils.cpp.o 39544 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39545 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39546 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39547 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ ... 
39694 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39695 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39696 cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++ 39697 make[2]: Leaving directory '/tmp/root/spack-stage/spack-stage-axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime/spack-build-oila7ki' 39698 [ 35%] Built target quest 39699 make[1]: Leaving directory '/tmp/root/spack-stage/spack-stage-axom-0.7.0-oila7kiyzj2yrvtu23ztx3raqsnfcime/spack-build-oila7ki' >> 39700 make: *** [Makefile:139: all] Error 2 ``` ### Error message <details><summary>Error message</summary><pre> ... see above </pre></details> ### Information on your system * **Spack:** 0.20.0.dev0 (6fef543a6f3aaa4155082c54762a6852d732b999) * **Python:** 3.8.10 * **Platform:** linux-ubuntu20.04-cascadelake * **Concretizer:** clingo ### Additional information [spack-build-out.txt](https://github.com/spack/spack/files/10480681/spack-build-out.txt) [spack-build-env.txt](https://github.com/spack/spack/files/10480683/spack-build-env.txt) [CMakeError.log](https://github.com/spack/spack/files/10480685/CMakeError.log) [CMakeOutput.log](https://github.com/spack/spack/files/10480686/CMakeOutput.log) @white238 @wspear ### General information - [X] I have run `spack debug report` and reported the version of Spack/Python/Platform - [X] I have run `spack maintainers <name-of-the-package>` and **@mentioned** any maintainers - [X] I have uploaded the build log and environment files - [X] I have searched the issues of this repo and believe this is not a duplicate
build
axom cuda gcc fails raja policy cuda reduce hpp there are no arguments to syncthreads that depend on a template parameter steps to reproduce the issue axom cuda cuda arch fails to build using ubuntu gcc spack develop from sat jan cuda spack installed blt raja this issue is affecting builds on perlmutter with gcc as well spack environment concretization axom gcc cuda devtools examples fortran ipo lua mfem mpi openmp python raja rocm scr shared tools umpire build system cmake build type relwithdebinfo cuda arch arch linux gjnzmcb blt gcc build system generic arch linux brghmpd cmake gcc doc ncurses ownlibs qt build system generic build type release arch linux uvosrzt ncurses gcc symlinks termlib abi none build system autotools arch linux openssl gcc docs shared build system generic certs mozilla arch linux ca certificates mozilla gcc build system generic arch linux vjctwmd perl gcc cpanm open shared threads build system generic arch linux alkzfee berkeley db gcc cxx docs stl build system autotools patches arch linux gcc debug pic shared build system generic arch linux qdviltn gdbm gcc build system autotools arch linux jfxuokb conduit gcc adios blt find mpi caliper doc doxygen examples fortran compat ipo mpi parmetis python shared silo test utilities zfp build system cmake build type relwithdebinfo arch linux metis gcc gdb ipo shared build system cmake build type relwithdebinfo patches arch linux parmetis gcc gdb ipo shared build system cmake build type relwithdebinfo patches arch linux cuda gcc allow unsupported compilers dev build system generic arch linux ndptfbi gcc python build system autotools arch linux libiconv gcc build system autotools libs shared static arch linux xz gcc pic build system autotools libs shared static arch linux ngjfkdb gcc cxx fortran hl ipo mpi shared szip threadsafe tools api default build system cmake build type relwithdebinfo patches arch linux uorrjkf pkgconf gcc build system autotools arch linux zlib gcc optimize pic shared build 
system makefile arch linux lua gcc pcfile shared build system makefile fetcher curl arch linux dxygujy curl gcc gssapi ldap librtmp libssh build system autotools libs shared static tls openssl arch linux readline gcc build system autotools patches arch linux unzip gcc build system makefile arch linux mpich gcc argobots cuda fortran hwloc hydra pci rocm romio slurm two level namespace vci verbs wrapperrpath build system autotools datatype engine auto device netmod ofi patches pmi pmi arch linux findutils gcc build system autotools patches arch linux hwloc gcc cairo cuda gl libudev netloc nvml oneapi level zero opencl pci rocm build system autotools libs shared static arch linux libfabric gcc debug kdreg build system autotools fabrics rxm sockets tcp udp arch linux libpciaccess gcc build system autotools arch linux libtool gcc build system autotools arch linux util macros gcc build system autotools arch linux gjvbvvm yaksa gcc cuda rocm build system autotools arch linux autoconf gcc build system autotools patches arch linux automake gcc build system autotools arch linux qdxsvxc gcc sigsegv build system autotools patches arch linux diffutils gcc build system autotools arch linux lypoofr libsigsegv gcc build system autotools arch linux dycqarv python gcc crypt ctypes dbm debug lzma nis optimizations pic pyexpat pythoncmd readline shared ssl tkinter uuid zlib build system generic patches arch linux expat gcc libbsd build system autotools arch linux bstssvl libbsd gcc build system autotools arch linux libmd gcc build system autotools arch linux ynfqffs gettext gcc curses git libunistring tar xz build system autotools arch linux grgygjr tar gcc build system autotools zip pigz arch linux pigz gcc build system makefile arch linux nwwllvp zstd gcc programs build system makefile compression none libs shared static arch linux libffi gcc build system autotools arch linux libxcrypt gcc obsolete api build system autotools arch linux sqlite gcc column metadata dynamic extensions 
fts functions rtree build system autotools arch linux util linux uuid gcc build system autotools arch linux raja gcc cuda examples exercises ipo openmp rocm shared tests build system cmake build type relwithdebinfo cuda arch arch linux camp gcc cuda ipo openmp rocm tests build system cmake build type relwithdebinfo cuda arch arch linux cub gcc build system generic arch linux umpire gcc c cuda device alloc deviceconst examples fortran ipo numa openmp rocm shared build system cmake build type relwithdebinfo cuda arch tests none arch linux install error spack e install installing axom no binary for axom found installing from source using cached archive spack var spack cache source cache git llnl axom git tar gz warning fetching from mirror without a checksum this package is normally checked out from a version control system but it has been archived on a spack mirror this means we cannot know a checksum for the tarball in advance be sure that your connection to this mirror is secure ran patch for axom axom executing phase initconfig axom executing phase cmake axom executing phase build spack opt spack linux gcc raja include raja policy cuda reduce hpp error there are no arguments to sy ncthreads that depend on a template parameter so a declaration of syncthreads must be available syncthreads spack opt spack linux gcc raja include raja policy cuda reduce hpp note if you use fpermissive g will accept your code but allowing the use of an undeclared name is deprecated spack opt spack linux gcc raja include raja policy cuda reduce hpp error raja policy cuda has not b een declared if warpid policy cuda warp size numthreads spack opt spack linux gcc raja include raja policy cuda reduce hpp error request for member get in s d which is of non class type int temp sd get warpid spack opt spack linux gcc raja include raja policy cuda reduce hpp error raja policy cuda has not b een declared for int i i policy cuda max warps i spack opt spack linux gcc raja include raja policy cuda 
reduce hpp error there are no arguments to s yncthreads or that depend on a template parameter so a declaration of syncthreads or must be available lastblock syncthreads or lastblock spack opt spack linux gcc raja include raja policy cuda reduce hpp in function bool raja cuda impl grid red uce atomic t t t unsigned int spack opt spack linux gcc raja include raja policy cuda reduce hpp error griddim was not declared in this scope int numblocks griddim x griddim y griddim z spack opt spack linux gcc raja include raja policy cuda reduce hpp error threadidx was not declared i n this scope did you mean threadid int threadid threadidx x blockdim x threadidx y threadid warning command line option fallow argument mismatch is valid for fortran but not for c spack opt spack linux gcc raja include raja policy cuda reduce hpp error blockdim was not declared in this scope int threadid threadidx x blockdim x threadidx y spack opt spack linux gcc raja include raja policy cuda reduce hpp error atomiccas has not been dec lared did you mean raja atomiccas unsigned int old val atomiccas device count raja atomiccas in file included from spack opt spack linux gcc raja include raja policy cuda atomic hpp from spack opt spack linux gcc raja include raja policy cuda reduce hpp from spack opt spack linux gcc raja include raja policy tensor arch cuda cuda warp hpp spack opt spack linux gcc raja include raja policy cuda reduce hpp error cuda reduce base was not de clared in this scope class reduceminloc t indextype spack opt spack linux gcc raja include raja policy cuda reduce hpp error wrong number of template argu ments should be at least class reduceminloc t indextype make error building cuda object axom mint cmakefiles mint dir utils utils cpp o cd tmp root spack stage spack stage axom spack build axom mint spack opt spack linux gcc cuda uud bin nvcc forward unknown to host compiler ccbin usr bin g daxom debug dcamp have cuda options file cmakefiles mint dir includes cuda rsp restr ict 
expt extended lambda arch sm std c g dndebug generate code arch compute code xcompiler fpic xcompiler fopenmp fallow argument misma tch std c md mt axom mint cmakefiles mint dir utils utils cpp o mf cmakefiles mint dir utils utils cpp o d x cu rdc true c tmp root spack stage spack stage axom spack src src axom mint utils utils cpp o cmakefiles mint dir utils utils cpp o warning command line option fallow argument mismatch is valid for fortran but not for c warning command line option fallow argument mismatch is valid for fortran but not for c warning command line option fallow argument mismatch is valid for fortran but not for c warning command line option fallow argument mismatch is valid for fortran but not for c warning command line option fallow argument mismatch is valid for fortran but not for c warning command line option fallow argument mismatch is valid for fortran but not for c warning command line option fallow argument mismatch is valid for fortran but not for c make leaving directory tmp root spack stage spack stage axom spack build built target quest make leaving directory tmp root spack stage spack stage axom spack build make error error message error message see above information on your system spack python platform linux cascadelake concretizer clingo additional information wspear general information i have run spack debug report and reported the version of spack python platform i have run spack maintainers and mentioned any maintainers i have uploaded the build log and environment files i have searched the issues of this repo and believe this is not a duplicate
1
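The axom record above is dominated by repeated `cc1plus: warning: command-line option '-fallow-argument-mismatch' is valid for Fortran but not for C++` lines: a Fortran-only flag leaked into the C++ compile lines. As a hedged, hypothetical illustration only (this is not Spack's or Axom's actual flag-handling code), the cleanup amounts to filtering language-specific flags before building a C++ command line:

```python
# Hypothetical sketch: strip flags that only apply to Fortran before they
# reach a C++ compiler invocation. The flag set and function name are
# illustrative, not taken from Spack or Axom.
FORTRAN_ONLY_FLAGS = {"-fallow-argument-mismatch"}

def scrub_cxx_flags(flags):
    """Return flags with Fortran-only options removed."""
    return [f for f in flags if f not in FORTRAN_ONLY_FLAGS]
```

This would silence the warnings without changing Fortran compile lines, since those keep their own flag list.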
65,685
16,453,374,642
IssuesEvent
2021-05-21 09:08:09
spack/spack
https://api.github.com/repos/spack/spack
closed
Installation issue: petsc tests error
build-error
<!-- Thanks for taking the time to report this build failure. To proceed with the report please: 1. Title the issue "Installation issue: <name-of-the-package>". 2. Provide the information required below. We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively! --> ### Steps to reproduce the issue <!-- Fill in the exact spec you are trying to build and the relevant part of the error message --> ```console $ spack install --test all petsc [...] # build and install OK ==> Error: FileNotFoundError: [Errno 2] No such file or directory: 'src/ksp/ksp/examples/tutorials' /home/f377482/spack/var/spack/repos/builtin/packages/petsc/package.py:469, in install: 466 467 # solve Poisson equation in 2D to make sure nothing is broken: 468 if ('mpi' in spec) and self.run_tests: >> 469 with working_dir('src/ksp/ksp/examples/tutorials'): 470 env['PETSC_DIR'] = self.prefix 471 cc = Executable(spec['mpi'].mpicc) 472 cc('ex50.c', '-I%s' % prefix.include, '-L%s' % prefix.lib, ``` Since version 3.13, the path is src/ksp/ksp/tutorials instead of src/ksp/ksp/examples/tutorials A separate issue is that when using intel-mpi, mpirun is not found in `join_path(spec['mpi'].prefix.bin` , and I cannot therefore quickly test a fix for this issue ### Additional information <!-- Some packages have maintainers who have volunteered to debug build failures. Run `spack maintainers <name-of-the-package>` and @mention them here if they exist. --> @balay @BarrySmith @jedbrown ### General information <!-- These boxes can be checked by replacing [ ] with [x] or by clicking them after submitting the issue. 
--> - [x] I have run `spack debug report` and reported the version of Spack/Python/Platform * **Spack:** 0.16.0-929-facfa893b9 * **Python:** 3.8.3 * **Platform:** linux-ubuntu20.10-sandybridge * **Concretizer:** original - [x] I have run `spack maintainers <name-of-the-package>` and @mentioned any maintainers - [ ] I have uploaded the build log and environment files - [x] I have searched the issues of this repo and believe this is not a duplicate
1.0
Installation issue: petsc tests error - <!-- Thanks for taking the time to report this build failure. To proceed with the report please: 1. Title the issue "Installation issue: <name-of-the-package>". 2. Provide the information required below. We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively! --> ### Steps to reproduce the issue <!-- Fill in the exact spec you are trying to build and the relevant part of the error message --> ```console $ spack install --test all petsc [...] # build and install OK ==> Error: FileNotFoundError: [Errno 2] No such file or directory: 'src/ksp/ksp/examples/tutorials' /home/f377482/spack/var/spack/repos/builtin/packages/petsc/package.py:469, in install: 466 467 # solve Poisson equation in 2D to make sure nothing is broken: 468 if ('mpi' in spec) and self.run_tests: >> 469 with working_dir('src/ksp/ksp/examples/tutorials'): 470 env['PETSC_DIR'] = self.prefix 471 cc = Executable(spec['mpi'].mpicc) 472 cc('ex50.c', '-I%s' % prefix.include, '-L%s' % prefix.lib, ``` Since version 3.13, the path is src/ksp/ksp/tutorials instead of src/ksp/ksp/examples/tutorials A separate issue is that when using intel-mpi, mpirun is not found in `join_path(spec['mpi'].prefix.bin` , and I cannot therefore quickly test a fix for this issue ### Additional information <!-- Some packages have maintainers who have volunteered to debug build failures. Run `spack maintainers <name-of-the-package>` and @mention them here if they exist. --> @balay @BarrySmith @jedbrown ### General information <!-- These boxes can be checked by replacing [ ] with [x] or by clicking them after submitting the issue. 
--> - [x] I have run `spack debug report` and reported the version of Spack/Python/Platform * **Spack:** 0.16.0-929-facfa893b9 * **Python:** 3.8.3 * **Platform:** linux-ubuntu20.10-sandybridge * **Concretizer:** original - [x] I have run `spack maintainers <name-of-the-package>` and @mentioned any maintainers - [ ] I have uploaded the build log and environment files - [x] I have searched the issues of this repo and believe this is not a duplicate
build
installation issue petsc tests error thanks for taking the time to report this build failure to proceed with the report please title the issue installation issue provide the information required below we encourage you to try as much as possible to reduce your problem to the minimal example that still reproduces the issue that would help us a lot in fixing it quickly and effectively steps to reproduce the issue console spack install test all petsc build and install ok error filenotfounderror no such file or directory src ksp ksp examples tutorials home spack var spack repos builtin packages petsc package py in install solve poisson equation in to make sure nothing is broken if mpi in spec and self run tests with working dir src ksp ksp examples tutorials env self prefix cc executable spec mpicc cc c i s prefix include l s prefix lib since version the path is src ksp ksp tutorials instead of src ksp ksp examples tutorials a separate issue is that when using intel mpi mpirun is not found in join path spec prefix bin and i cannot therefore quickly test a fix for this issue additional information and mention them here if they exist balay barrysmith jedbrown general information i have run spack debug report and reported the version of spack python platform spack python platform linux sandybridge concretizer original i have run spack maintainers and mentioned any maintainers i have uploaded the build log and environment files i have searched the issues of this repo and believe this is not a duplicate
1
30,815
8,600,705,173
IssuesEvent
2018-11-16 08:38:43
scikit-learn/scikit-learn
https://api.github.com/repos/scikit-learn/scikit-learn
closed
KNeighborsClassifier test failures with PyPy with loky backend
Bug Build / CI pypy
Here is a minimal case to reproduce the cause of the failure (to reproduce with pypy3): ```python import numpy as np from sklearn.neighbors import KNeighborsClassifier from sklearn.utils import parallel_backend import cloudpickle import pickle parallel_backend('loky') X = np.random.randn(3, 1) y = X[:, 0] > 0 knn = KNeighborsClassifier(algorithm="ball_tree").fit(X, y) pickle.loads(cloudpickle.dumps(knn._tree.query)) ``` which results in: ```python-traceback --------------------------------------------------------------------------- TypeError Traceback (most recent call last) <ipython-input-36-9fc0a20fa010> in <module> ----> 1 pickle.loads(cloudpickle.dumps(knn._tree.query)) ~/pypy3-v6.0.0-linux64/lib-python/3/pickle.py in _loads(s, fix_imports, encoding, errors) 1585 file = io.BytesIO(s) 1586 return _Unpickler(file, fix_imports=fix_imports, -> 1587 encoding=encoding, errors=errors).load() 1588 1589 # Use the faster _pickle if possible ~/pypy3-v6.0.0-linux64/lib-python/3/pickle.py in load(self) 1061 raise EOFError 1062 assert isinstance(key, bytes_types) -> 1063 dispatch[key[0]](self) 1064 except _Stop as stopinst: 1065 return stopinst.value ~/pypy3-v6.0.0-linux64/lib-python/3/pickle.py in load_newobj(self) 1345 args = self.stack.pop() 1346 cls = self.stack.pop() -> 1347 obj = cls.__new__(cls, *args) 1348 self.append(obj) 1349 dispatch[NEWOBJ[0]] = load_newobj TypeError: object.__new__(getset_descriptor) is not safe, use getset_descriptor.__new__() ``` This is probably a bug in cloudpickle but I don't have time to investigate further tonight. We could skip the failing tests under PyPy in the meantime if they are too noisy.
1.0
KNeighborsClassifier test failures with PyPy with loky backend - Here is a minimal case to reproduce the cause of the failure (to reproduce with pypy3): ```python import numpy as np from sklearn.neighbors import KNeighborsClassifier from sklearn.utils import parallel_backend import cloudpickle import pickle parallel_backend('loky') X = np.random.randn(3, 1) y = X[:, 0] > 0 knn = KNeighborsClassifier(algorithm="ball_tree").fit(X, y) pickle.loads(cloudpickle.dumps(knn._tree.query)) ``` which results in: ```python-traceback --------------------------------------------------------------------------- TypeError Traceback (most recent call last) <ipython-input-36-9fc0a20fa010> in <module> ----> 1 pickle.loads(cloudpickle.dumps(knn._tree.query)) ~/pypy3-v6.0.0-linux64/lib-python/3/pickle.py in _loads(s, fix_imports, encoding, errors) 1585 file = io.BytesIO(s) 1586 return _Unpickler(file, fix_imports=fix_imports, -> 1587 encoding=encoding, errors=errors).load() 1588 1589 # Use the faster _pickle if possible ~/pypy3-v6.0.0-linux64/lib-python/3/pickle.py in load(self) 1061 raise EOFError 1062 assert isinstance(key, bytes_types) -> 1063 dispatch[key[0]](self) 1064 except _Stop as stopinst: 1065 return stopinst.value ~/pypy3-v6.0.0-linux64/lib-python/3/pickle.py in load_newobj(self) 1345 args = self.stack.pop() 1346 cls = self.stack.pop() -> 1347 obj = cls.__new__(cls, *args) 1348 self.append(obj) 1349 dispatch[NEWOBJ[0]] = load_newobj TypeError: object.__new__(getset_descriptor) is not safe, use getset_descriptor.__new__() ``` This is probably a bug in cloudpickle but I don't have time to investigate further tonight. We could skip the failing tests under PyPy in the meantime if they are too noisy.
build
kneighborsclassifier test failures with pypy with loky backend here is a minimal case to reproduce the cause of the failure to reproduce with python import numpy as np from sklearn neighbors import kneighborsclassifier from sklearn utils import parallel backend import cloudpickle import pickle parallel backend loky x np random randn y x knn kneighborsclassifier algorithm ball tree fit x y pickle loads cloudpickle dumps knn tree query which results in python traceback typeerror traceback most recent call last in pickle loads cloudpickle dumps knn tree query lib python pickle py in loads s fix imports encoding errors file io bytesio s return unpickler file fix imports fix imports encoding encoding errors errors load use the faster pickle if possible lib python pickle py in load self raise eoferror assert isinstance key bytes types dispatch self except stop as stopinst return stopinst value lib python pickle py in load newobj self args self stack pop cls self stack pop obj cls new cls args self append obj dispatch load newobj typeerror object new getset descriptor is not safe use getset descriptor new this is probably a bug in cloudpickle but i don t have time to investigate further tonight we could skip the failing tests under pypy in the mean time if they are to noisy
1
70,945
18,346,883,351
IssuesEvent
2021-10-08 07:39:32
appsmithorg/appsmith
https://api.github.com/repos/appsmithorg/appsmith
closed
[Bug] Checkbox Widget: Unable to type labels continuously in the property pane
Bug Property Pane High UI Building Pod UI Builders Pod
## Description When I am trying to type the checkbox labels, it keeps moving the cursor outside and I am unable to type the text continuously/in one go. ### Steps to reproduce the behaviour: https://user-images.githubusercontent.com/83642140/136385661-59febbc0-49ac-4514-96ff-b681ec882677.mov 1. Drag and drop checkbox widget 2. Try adding text in the label property ### Important Details - Version: [Cloud] - OS: [MacOSX] - Browser [Chrome] - Environment [production]
2.0
[Bug] Checkbox Widget: Unable to type labels continuously in the property pane - ## Description When I am trying to type the checkbox labels, it keeps moving the cursor outside and I am unable to type the text continuously/in one go. ### Steps to reproduce the behaviour: https://user-images.githubusercontent.com/83642140/136385661-59febbc0-49ac-4514-96ff-b681ec882677.mov 1. Drag and drop checkbox widget 2. Try adding text in the label property ### Important Details - Version: [Cloud] - OS: [MacOSX] - Browser [Chrome] - Environment [production]
build
checkbox widget unable to type labels continuously in the property pane description when i am trying to type the checkbox labels it keeps moving the cursor outside and i am unable to type the text continuously in one go steps to reproduce the behaviour drag and drop checkbox widget try adding text in the label property important details version os browser environment
1
38,503
10,211,717,773
IssuesEvent
2019-08-14 17:38:11
orbeon/orbeon-forms
https://api.github.com/repos/orbeon/orbeon-forms
opened
Clarify and/or fix Calculated Value formats
Area: XBL Components Module: Form Builder Module: Form Runner
Use case: form-level settings for separators, etc. apply to "Number" fields but not "Calculated Value" fields. Workaround: use a readonly "Number" field. But there should be a way to apply settings to "Calculated Values" fields when they have a decimal or an integer type, at the control level as well as the form level. Could there also be an option to use the same settings as the "Number" field in that case? See also: - Number/currency controls: improve formatting and validation #2118 - Ability to set date format the way we do for number format #3927 - FR: Number in summary page is not formatted #1369 - PDF template: must be able to control formats, incl. dates (see DMV-14) #148 [+1 from customer](https://3.basecamp.com/3600924/buckets/2192969/messages/1989930092)
1.0
Clarify and/or fix Calculated Value formats - Use case: form-level settings for separators, etc. apply to "Number" fields but not "Calculated Value" fields. Workaround: use a readonly "Number" field. But there should be a way to apply settings to "Calculated Values" fields when they have a decimal or an integer type, at the control level as well as the form level. Could there also be an option to use the same settings as the "Number" field in that case? See also: - Number/currency controls: improve formatting and validation #2118 - Ability to set date format the way we do for number format #3927 - FR: Number in summary page is not formatted #1369 - PDF template: must be able to control formats, incl. dates (see DMV-14) #148 [+1 from customer](https://3.basecamp.com/3600924/buckets/2192969/messages/1989930092)
build
clarify and or fix calculated value formats use case form level settings for separators etc apply to number fields but not calculated value fields workaround use a readonly number field but there should be a way to apply settings to calculated values fields when they have a decimal or an integer type at the control level as well as the form level could there also be an option to use the same settings as the number field in that case see also number currency controls improve formatting and validation ability to set date format the way we do for number format fr number in summary page is not formatted pdf template must be able to control formats incl dates see dmv
1
690,622
23,666,204,588
IssuesEvent
2022-08-26 21:16:12
QuiltMC/quiltflower
https://api.github.com/repos/QuiltMC/quiltflower
closed
invalid constant type Error
bug Priority: High Subsystem: Variables
Hi, your product has performed very well among the available options, but there are some problems when using it, which I request you to check and fix. For example: `INFO: Decompiling class com/adventnet/client/components/table/pdf/PropertySheetRenderer WARN: Method getDollarReqParams (Ljava/lang/String;Lcom/adventnet/client/view/web/ViewContext;)Ljava/lang/String; in class com/adventnet/client/components/table/template/FillTable couldn't be written. java.lang.RuntimeException: invalid constant type: Ljava/lang/Object; at org.jetbrains.java.decompiler.modules.decompiler.exps.ConstExprent.toJava(ConstExprent.java:347) at org.jetbrains.java.decompiler.modules.decompiler.exps.AssignmentExprent.toJava(AssignmentExprent.java:146) at org.jetbrains.java.decompiler.modules.decompiler.ExprProcessor.listToJava(ExprProcessor.java:858) at org.jetbrains.java.decompiler.modules.decompiler.stats.BasicBlockStatement.toJava(BasicBlockStatement.java:65) at org.jetbrains.java.decompiler.modules.decompiler.stats.IfStatement.toJava(IfStatement.java:199) at org.jetbrains.java.decompiler.modules.decompiler.ExprProcessor.jmpWrapper(ExprProcessor.java:796) at org.jetbrains.java.decompiler.modules.decompiler.stats.SequenceStatement.toJava(SequenceStatement.java:111) at org.jetbrains.java.decompiler.modules.decompiler.ExprProcessor.jmpWrapper(ExprProcessor.java:796) at org.jetbrains.java.decompiler.modules.decompiler.stats.CatchStatement.toJava(CatchStatement.java:178) at org.jetbrains.java.decompiler.modules.decompiler.stats.RootStatement.toJava(RootStatement.java:37) at org.jetbrains.java.decompiler.main.ClassWriter.methodToJava(ClassWriter.java:1084) at org.jetbrains.java.decompiler.main.ClassWriter.classToJava(ClassWriter.java:334) at org.jetbrains.java.decompiler.main.ClassesProcessor.writeClass(ClassesProcessor.java:480) at org.jetbrains.java.decompiler.main.Fernflower.getClassContent(Fernflower.java:156) at 
org.jetbrains.java.decompiler.struct.ContextUnit.lambda$save$0(ContextUnit.java:179) at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) at java.util.concurrent.FutureTask.run(Unknown Source) at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) at java.lang.Thread.run(Unknown Source) `
1.0
invalid constant type Error - Hi, your product has performed very well among the available options, but there are some problems when using it, which I request you to check and fix. For example: `INFO: Decompiling class com/adventnet/client/components/table/pdf/PropertySheetRenderer WARN: Method getDollarReqParams (Ljava/lang/String;Lcom/adventnet/client/view/web/ViewContext;)Ljava/lang/String; in class com/adventnet/client/components/table/template/FillTable couldn't be written. java.lang.RuntimeException: invalid constant type: Ljava/lang/Object; at org.jetbrains.java.decompiler.modules.decompiler.exps.ConstExprent.toJava(ConstExprent.java:347) at org.jetbrains.java.decompiler.modules.decompiler.exps.AssignmentExprent.toJava(AssignmentExprent.java:146) at org.jetbrains.java.decompiler.modules.decompiler.ExprProcessor.listToJava(ExprProcessor.java:858) at org.jetbrains.java.decompiler.modules.decompiler.stats.BasicBlockStatement.toJava(BasicBlockStatement.java:65) at org.jetbrains.java.decompiler.modules.decompiler.stats.IfStatement.toJava(IfStatement.java:199) at org.jetbrains.java.decompiler.modules.decompiler.ExprProcessor.jmpWrapper(ExprProcessor.java:796) at org.jetbrains.java.decompiler.modules.decompiler.stats.SequenceStatement.toJava(SequenceStatement.java:111) at org.jetbrains.java.decompiler.modules.decompiler.ExprProcessor.jmpWrapper(ExprProcessor.java:796) at org.jetbrains.java.decompiler.modules.decompiler.stats.CatchStatement.toJava(CatchStatement.java:178) at org.jetbrains.java.decompiler.modules.decompiler.stats.RootStatement.toJava(RootStatement.java:37) at org.jetbrains.java.decompiler.main.ClassWriter.methodToJava(ClassWriter.java:1084) at org.jetbrains.java.decompiler.main.ClassWriter.classToJava(ClassWriter.java:334) at org.jetbrains.java.decompiler.main.ClassesProcessor.writeClass(ClassesProcessor.java:480) at org.jetbrains.java.decompiler.main.Fernflower.getClassContent(Fernflower.java:156) at 
org.jetbrains.java.decompiler.struct.ContextUnit.lambda$save$0(ContextUnit.java:179) at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) at java.util.concurrent.FutureTask.run(Unknown Source) at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) at java.lang.Thread.run(Unknown Source) `
non_build
invalid constant type error hi your product has performed very well among the available options but there are some problems when using it which i request you to check and fix like the info decompiling class com adventnet client components table pdf propertysheetrenderer warn method getdollarreqparams ljava lang string lcom adventnet client view web viewcontext ljava lang string in class com adventnet client components table template filltable couldn t be written java lang runtimeexception invalid constant type ljava lang object at org jetbrains java decompiler modules decompiler exps constexprent tojava constexprent java at org jetbrains java decompiler modules decompiler exps assignmentexprent tojava assignmentexprent java at org jetbrains java decompiler modules decompiler exprprocessor listtojava exprprocessor java at org jetbrains java decompiler modules decompiler stats basicblockstatement tojava basicblockstatement java at org jetbrains java decompiler modules decompiler stats ifstatement tojava ifstatement java at org jetbrains java decompiler modules decompiler exprprocessor jmpwrapper exprprocessor java at org jetbrains java decompiler modules decompiler stats sequencestatement tojava sequencestatement java at org jetbrains java decompiler modules decompiler exprprocessor jmpwrapper exprprocessor java at org jetbrains java decompiler modules decompiler stats catchstatement tojava catchstatement java at org jetbrains java decompiler modules decompiler stats rootstatement tojava rootstatement java at org jetbrains java decompiler main classwriter methodtojava classwriter java at org jetbrains java decompiler main classwriter classtojava classwriter java at org jetbrains java decompiler main classesprocessor writeclass classesprocessor java at org jetbrains java decompiler main fernflower getclasscontent fernflower java at org jetbrains java decompiler struct contextunit lambda save contextunit java at java util concurrent executors runnableadapter call 
unknown source at java util concurrent futuretask run unknown source at java util concurrent threadpoolexecutor runworker unknown source at java util concurrent threadpoolexecutor worker run unknown source at java lang thread run unknown source
0