2025-04-01T04:34:54.851411
2024-08-29T19:35:15
2495404739
{ "authors": [ "khaneliman", "refaelsh" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9087", "repo": "nix-community/nixvim", "url": "https://github.com/nix-community/nixvim/pull/2115" }
gharchive/pull-request
plugins/nvim-orgmode: init
I want to add this plugin to Nixvim: https://github.com/nvim-orgmode/orgmode. It is already in Nixpkgs here: https://search.nixos.org/packages?channel=24.05&show=vimPlugins.orgmode&from=0&size=50&sort=relevance&type=packages&query=orgmode. I followed these docs: https://github.com/nix-community/nixvim/blob/main/CONTRIBUTING.md. I hope I did everything right. I especially have doubts about the tests part of the docs; I am not good enough at Nix to understand what is said there. Thanks in advance for reviewing this PR.
Also, it looks like you need to make sure the files get imported in the relevant locations so they get picked up.
I don't understand. Please elaborate.
It needs to get added to plugins/default.nix, because right now it doesn't exist, so we see:
Done. But where does it say so in the docs? If you concur that it's not there, would it be OK if I open a PR to fix it?
https://github.com/nix-community/nixvim/blob/main/CONTRIBUTING.md, if you wanna review it a little. It gives a small overview.
I performed all the requested changes. I hope I did not miss anything :-)
Looks like the org_agenda_files option is actually either a string or a list of strings upstream, and the defaults are different.
It took me a while, but I understand now. You are correct; I will go and make the necessary changes. P.S. In my defense: from their README, I would never have guessed that it could be a list of strings. I would never have thought to look at their source code to understand whether it's a string or a list of strings.
@mergifyio queue
Thank you everyone! I had lots of fun :-)
2025-04-01T04:34:54.863424
2016-09-16T05:37:31
177348983
{ "authors": [ "arcnmx", "fiveop", "homu", "philippkeller" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9088", "repo": "nix-rust/nix", "url": "https://github.com/nix-rust/nix/pull/428" }
gharchive/pull-request
Add mkstemp (fixed for Rust 1.2)
I fixed @antifuchs's addition of mkstemp in https://github.com/nix-rust/nix/pull/365 by making it compile on Rust 1.2 and adding documentation. A few remarks: making it work with Rust 1.2 required to_bytes_with_nul(). I think the implementation is memory safe, but would like a pair of eyes to check that. I replaced Path with PathBuf so it's more in line with getcwd. I didn't move it into another module; if that is still the consensus, I would like to do it in a separate PR (probably moving all stdio methods out). It's true that unistd doesn't need mkstemp, since the tempdir and tempfile crates exist, but I'd love to see this here for completeness.
r? @kamalmarhubi
@arcnmx Strings are involved, that's your expertise.
Ping to @kamalmarhubi.
@arcnmx: could you check if that implementation is safe memory-wise? I guess the question is whether OsStr::from_bytes(CStr::from_ptr(p).to_bytes()) copies the bytes away from the pointer p into an owned string.
Yeah, no: that string is dropped by the time you use PathBuf::from; probably best to just move path_copy out of the closure instead.
@fiveop I guess this is now ready to be merged; the code is now safe, shorter, and easier to understand thanks to the input of @arcnmx.
Looks good to me. Thank you and @antifuchs.
@homu r+
:pushpin: Commit b47120d has been approved by fiveop
:zap: Test exempted - status
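The memory-safety question in this thread can be sketched in a few lines: CStr::from_ptr(p).to_bytes() only borrows the bytes behind the raw pointer, and it is PathBuf::from that copies them into an owned allocation, so the conversion must happen while the C buffer is still alive. The sketch below is hypothetical, not the nix crate's actual code; the function name owned_path and the example path are made up, and a CString stands in for the template buffer mkstemp would have filled in.

```rust
use std::ffi::{CStr, CString, OsStr};
use std::os::raw::c_char;
use std::os::unix::ffi::OsStrExt;
use std::path::PathBuf;

// Copy a NUL-terminated C path into an owned PathBuf.
// `to_bytes()` only borrows the C buffer; `PathBuf::from` then copies
// the bytes into a fresh allocation, so the returned path stays valid
// after the original buffer is freed.
unsafe fn owned_path(p: *const c_char) -> PathBuf {
    PathBuf::from(OsStr::from_bytes(CStr::from_ptr(p).to_bytes()))
}

fn main() {
    // Stand-in for the buffer a mkstemp-style call would have filled in.
    let c_buf = CString::new("/tmp/foo123456").unwrap();
    let path = unsafe { owned_path(c_buf.as_ptr()) };
    drop(c_buf); // the C buffer is gone...
    assert_eq!(path, PathBuf::from("/tmp/foo123456")); // ...but the copy survives
    println!("{}", path.display());
}
```

The bug discussed above is the inverse ordering: if the buffer is dropped before the conversion runs, the borrowed slice dangles, which is why moving the buffer out of the closure fixed it.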
2025-04-01T04:34:54.894375
2017-10-06T17:03:48
263511754
{ "authors": [ "fadenb", "samueldr" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9089", "repo": "nixos-users/wiki", "url": "https://github.com/nixos-users/wiki/issues/11" }
gharchive/issue
Re-integrate github login link properly with the theme
See #7. The github login link comes from an extension that has almost no facilities to customize the behaviour. There will be a bit of integration to do with the mediawiki extensions system and the chosen theme.
Current state: The login link is a static link added to the nav bar.
What it needs to do:
- Redirect the user back to the right page when logged in.
- Not be present when logged in.
- The navbar should not overflow at any width.
Furthermore, both login elements (github and user/pw) should always be present when one is present.
This is now complete. https://gitlab.com/samueldr/nixos-wiki-proposition/commit/07c0dfde9aa8a22297c8272fa0dd0e222e115222
Changes in the wiki special pages: MediaWiki:Tweeki-navbar-right TOOLBOX,NIXOS-LOGIN-PROFILE,SEARCH
Additional proposed changes, with this fix: https://gitlab.com/samueldr/nixos-wiki-proposition/commit/a36b00c80302f97d26a5d197eb2caac6e38f6b66 — adding the edit link and the profile/login item to the footer: MediaWiki:Tweeki-footer FOOTER,EDIT,NIXOS-LOGIN-PROFILE
Applied changes
2025-04-01T04:34:54.898633
2018-11-19T14:36:50
382242990
{ "authors": [ "iamlockelightning", "sunzequn" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9090", "repo": "nju-websoft/BootEA", "url": "https://github.com/nju-websoft/BootEA/issues/1" }
gharchive/issue
Question about the baseline IPTransE results in the paper
Hi, Zequn. I tried to use https://github.com/thunlp/IEAJKE (IPTransE) to reproduce the results of IPTransE on DBP15Kzh-en, but my results are much worse than what is reported in your BootEA paper (my results are Hit@1: 0.1941, Hit@10: 0.4685, MRR: 0.2833). I only changed the data format of DBP15Kzh-en to what IPTransE needs; no parameters were modified. Would you share some advice on it? (Or would you kindly share the code/input dataset that you used? My e-mail address is<EMAIL_ADDRESS>) Thanks!
The results of IPTransE on DBP15K were produced using my own implementation, because there was no official code at that time. I would suggest you try more parameter settings, such as the similarity threshold.
Thank you for your reply!! And I saw the e-mail just now. Yes, I found that the iteration predicted many false pairs, but it was hard to control the threshold dynamically. I will keep trying. Thanks again!
2025-04-01T04:34:54.899683
2024-06-24T18:05:12
2370811716
{ "authors": [ "JoonHyung-Park", "pku-lyk" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9091", "repo": "njucckevin/SeeClick", "url": "https://github.com/njucckevin/SeeClick/issues/37" }
gharchive/issue
ValueError: Can't find 'adapter_config.json' at 'cckevinn/SeeClick'
Solved.
How did you solve this problem, please?
2025-04-01T04:34:54.901274
2021-01-07T14:39:02
781360480
{ "authors": [ "nkantar", "tyrelsouza" ], "license": "Unlicense", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9092", "repo": "nkantar/TheMixNeverDies.com", "url": "https://github.com/nkantar/TheMixNeverDies.com/issues/2" }
gharchive/issue
GitHub should have a link to the website and a description. Just an idea. Also a README :)
JK, I'm dumb, they moved it to the right bar.
Yeah, README is still outstanding. 😬
2025-04-01T04:34:55.007835
2024-02-13T03:04:49
2131377688
{ "authors": [ "nkuik" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9093", "repo": "nkuik/its-whats-for-dinner", "url": "https://github.com/nkuik/its-whats-for-dinner/pull/567" }
gharchive/pull-request
chore(deps): update dependency @typescript-eslint/parser to v7 This PR contains the following updates: Package Type Update Change @typescript-eslint/parser (source) devDependencies major 5.62.0 -> 7.0.1 Release Notes typescript-eslint/typescript-eslint (@​typescript-eslint/parser) v7.0.1 Compare Source This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. v7.0.0 Compare Source 🚀 Features ⚠️ bump ESLint, NodeJS, and TS minimum version requirements add support for flat configs ⚠️ Breaking Changes ⚠️ bump ESLint, NodeJS, and TS minimum version requirements ❤️ Thank You Brad Zacher Kirk Waiblinger StyleShit YeonJuan You can read about our versioning strategy and releases on our website. 6.21.0 (2024-02-05) 🚀 Features allow parserOptions.project: false ❤️ Thank You auvred Brad Zacher Kirk Waiblinger Pete Gonzalez YeonJuan You can read about our versioning strategy and releases on our website. 6.20.0 (2024-01-29) This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. 6.19.1 (2024-01-22) This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. 6.19.0 (2024-01-15) This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. 6.18.1 (2024-01-08) This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. 6.18.0 (2024-01-06) This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. 
v6.21.0 Compare Source 🚀 Features allow parserOptions.project: false ❤️ Thank You auvred Brad Zacher Kirk Waiblinger Pete Gonzalez YeonJuan You can read about our versioning strategy and releases on our website. v6.20.0 Compare Source This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. v6.19.1 Compare Source This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. v6.19.0 Compare Source This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. v6.18.1 Compare Source This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. v6.18.0 Compare Source This was a version bump only for parser to align it with other projects, there were no code changes. You can read about our versioning strategy and releases on our website. v6.17.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.16.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.15.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.14.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 6.13.2 (2023-12-04) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 
6.13.1 (2023-11-28) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.13.2 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.13.1 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.13.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.12.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.11.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.10.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 6.9.1 (2023-10-30) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.9.1 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.9.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.8.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 6.7.5 (2023-10-09) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 
6.7.4 (2023-10-02) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 6.7.3 (2023-09-25) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 6.7.2 (2023-09-18) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 6.7.1 (2023-09-18) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.7.5 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.7.4 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.7.3 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.7.2 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.7.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.6.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.5.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 6.4.1 (2023-08-21) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 
v6.4.1 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.4.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.3.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. 6.2.1 (2023-07-31) Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.2.1 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.2.0 Compare Source Note: Version bump only for package @​typescript-eslint/parser You can read about our versioning strategy and releases on our website. v6.1.0 Compare Source Features typescript-estree: add EXPERIMENTAL_useProjectService option to use TypeScript project service (#​6754) (6d3d162) You can read about our versioning strategy and releases on our website. 
v6.0.0 Compare Source Bug Fixes update exports field in package.json files (#​6550) (53776c2) chore drop support for node v14.17, v17 (#​5971) (cc62015) Features add new package rule-tester (#​6777) (2ce1c1d) add package.json exports for public packages (#​6458) (d676683) drop support for ESLint v6 (#​5972) (bda806d) drop support for node v12 (#​5918) (7e3fe9a) drop support for node v14 and test against node v20 (#​7022) (e6235bf) remove partial type-information program (#​6066) (7fc062a) scope-manager: ignore ECMA version (#​5889) (f2330f7), closes #​5834 #​5882 #​5864 #​5883 typescript-estree: added allowInvalidAST option to not throw on invalid tokens (#​6247) (a3b177d) typescript-estree: allow providing code as a ts.SourceFile (#​5892) (af41b7f) typescript-estree: deprecate createDefaultProgram (#​5890) (426d6b6) typescript-estree: remove optionality from AST boolean properties (#​6274) (df131e2) BREAKING CHANGES drop support for ESLint v6 drops support for node v17 drops support for node v12 You can read about our versioning strategy and releases on our website. Configuration 📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined). 🚦 Automerge: Enabled. ♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox. 🔕 Ignore: Close this PR and you won't be reminded about this update again. [ ] If you want to rebase/retry this PR, check this box This PR has been generated by Renovate Bot. ⚠ Artifact update problem Renovate failed to update an artifact related to this branch. You probably do not want to merge this PR as-is. ♻ Renovate will retry this branch, including artifacts, only when one of the following happens: any of the package files in this branch needs updating, or the branch becomes conflicted, or you click the rebase/retry checkbox if found above, or you rename this PR's title to start with "rebase!" 
to trigger it manually The artifact failure details are included below: File name: package-lock.json npm ERR! code ERESOLVE npm ERR! ERESOLVE could not resolve npm ERR! npm ERR! While resolving<EMAIL_ADDRESS>npm ERR! Found<EMAIL_ADDRESS>npm ERR! node_modules/@typescript-eslint/parser npm ERR! dev<EMAIL_ADDRESS>from the root project npm ERR! npm ERR! Could not resolve dependency: npm ERR! peer<EMAIL_ADDRESS>from<EMAIL_ADDRESS>npm ERR! node_modules/@typescript-eslint/eslint-plugin npm ERR! dev<EMAIL_ADDRESS>from the root project npm ERR! peerOptional<EMAIL_ADDRESS>|| ^6.0.0" from<EMAIL_ADDRESS>npm ERR! node_modules/eslint-plugin-jest npm ERR! dev<EMAIL_ADDRESS>from the root project npm ERR! npm ERR! Conflicting peer dependency<EMAIL_ADDRESS>npm ERR! node_modules/@typescript-eslint/parser npm ERR! peer<EMAIL_ADDRESS>from<EMAIL_ADDRESS>npm ERR! node_modules/@typescript-eslint/eslint-plugin npm ERR! dev<EMAIL_ADDRESS>from the root project npm ERR! peerOptional<EMAIL_ADDRESS>|| ^6.0.0" from<EMAIL_ADDRESS>npm ERR! node_modules/eslint-plugin-jest npm ERR! dev<EMAIL_ADDRESS>from the root project npm ERR! npm ERR! Fix the upstream dependency conflict, or retry npm ERR! this command with --force, or --legacy-peer-deps npm ERR! to accept an incorrect (and potentially broken) dependency resolution. npm ERR! npm ERR! See /tmp/renovate/cache/others/npm/eresolve-report.txt for a full report. npm ERR! A complete log of this run can be found in: npm ERR! /tmp/renovate/cache/others/npm/_logs/2024-02-13T03_04_45_757Z-debug-0.log
2025-04-01T04:34:55.009632
2015-08-13T10:35:17
100734911
{ "authors": [ "are-are-", "nkzawa" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9094", "repo": "nkzawa/socket.io-stream", "url": "https://github.com/nkzawa/socket.io-stream/pull/55" }
gharchive/pull-request
added option to enable FileReaderSync (with backwards compatibility)
This little change is pretty self-explanatory. FileReader and FileReaderSync have the same interface.
thx :smile:
OK, I found that it won't work, because FileReaderSync has no events. Already solved; sending a new/updated pull request.
2025-04-01T04:34:55.026622
2021-08-17T21:40:31
973086808
{ "authors": [ "Patrock2503" ], "license": "CC0-1.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9095", "repo": "nlef/moonraker-telegram-bot", "url": "https://github.com/nlef/moonraker-telegram-bot/issues/41" }
gharchive/issue
./install.sh doesn't work
My problem is that I deleted parts of the code by mistake. I tried to delete everything and install it again, but all I get is the following message:
./install.sh Stopping moonraker-telegram-bot instance ... Failed to stop moonraker-telegram-bot.service: Unit moonraker-telegram-bot.service not loaded.
I have no idea what to do. I hope you can help me solve the problem, because I love your bot. By the way, the logfile has no entries.
So it looks like it will work with the new release 😊 Fine; as I have a second printer, I will see when the update is available. Thank you very much!!
2025-04-01T04:34:55.028143
2021-10-29T18:36:12
1039832413
{ "authors": [ "aka13-404", "nlef" ], "license": "CC0-1.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9096", "repo": "nlef/moonraker-telegram-bot", "url": "https://github.com/nlef/moonraker-telegram-bot/issues/49" }
gharchive/issue
Enable gcode, macro and command sending from bot + add an option to read console output
Sending from bot:
[x] gcode
[x] macro
[x] command
Add an option to read console output.
These features are implemented as of v1.3.
2025-04-01T04:34:55.050534
2016-02-20T14:27:06
135080653
{ "authors": [ "abc100m", "cjh1", "henryiii", "nlohmann" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9097", "repo": "nlohmann/json", "url": "https://github.com/nlohmann/json/pull/212" }
gharchive/pull-request
support gcc 4.8.1
For issue #211: support GCC 4.8.1, since it already fully supports C++11 (though its STL library does not yet).
Hi @abc100m, thanks for the PR. Could you please edit the file json.hpp.re2c, because json.hpp is generated from this file. Does the PR also cope with user-defined string literals?
Hi @abc100m, once you have edited both files, could you also run the tests to make sure every feature works.
Hi, I have edited json.hpp.re2c and tested it on my computer this way:
> cd src
> rm json.hpp
> touch json.hpp
> re2c json.hpp.re2c > json.hpp
> cd ..
> make
> ./json_unit
> ./json_unit "*"
It works and all the test cases passed. As for bug 57824 of GCC 4.8, I think it may be a compiler issue, and I have no idea how to fix it. I'm just going to not use the "user-defined string" feature. Thank you for your powerful but simple JSON library, it's really great!
Thanks for the update. I haven't found the time to check it though :-( Thanks for your patience!
I tried to compile the code, but got several errors: $ g++-4.8 --version g++-4.8 (Homebrew gcc48 4.8.5) 4.8.5 Copyright (C) 2015 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
$ make CXX=g++-4.8 g++-4.8 -std=c++11 -Wall -Wextra -pedantic -Weffc++ -Wcast-align -Wcast-qual -Wctor-dtor-privacy -Wdisabled-optimization -Wformat=2 -Winit-self -Wmissing-declarations -Wmissing-include-dirs -Wold-style-cast -Woverloaded-virtual -Wredundant-decls -Wshadow -Wsign-conversion -Wsign-promo -Wstrict-overflow=5 -Wswitch -Wundef -Wno-unused -Wnon-virtual-dtor -Wreorder -Wdeprecated -Wfloat-equal -I src -I test test/unit.cpp -o json_unit test/unit.cpp:11899:32: error: unterminated raw string CHECK_NOTHROW(json(R"( ^ test/unit.cpp:11914:14: warning: missing terminating " character [enabled by default] )")); ^ test/unit.cpp:11918:32: error: unterminated raw string CHECK_NOTHROW(json(R"( ^ test/unit.cpp:11940:15: warning: missing terminating " character [enabled by default] ])")); ^ test/unit.cpp:12105:30: error: unterminated raw string const auto content = R"( ^ test/unit.cpp:12110:11: warning: missing terminating " character [enabled by default] })"; ^ test/unit.cpp:12349:0: error: unterminated argument list invoking macro "CHECK_NOTHROW" ^ In file included from test/unit.cpp:26:0: src/json.hpp: In instantiation of 'class nlohmann::basic_json<>::const_iterator': src/json.hpp:6834:11: required from 'class nlohmann::basic_json<>::iterator' test/unit.cpp:953:24: required from here src/json.hpp:6290:11: warning: base class 'struct std::iterator<std::random_access_iterator_tag, const nlohmann::basic_json<>, long int, const nlohmann::basic_json<>*, const nlohmann::basic_json<>&>' has a non-virtual destructor [-Weffc++] class const_iterator : public std::iterator<std::random_access_iterator_tag, const basic_json> ^ src/json.hpp: In instantiation of 'class nlohmann::basic_json<>::iterator': test/unit.cpp:953:24: required from here src/json.hpp:6834:11: warning: base class 'class nlohmann::basic_json<>::const_iterator' has a non-virtual destructor [-Weffc++] class iterator : public const_iterator ^ src/json.hpp: In instantiation of 'class 
nlohmann::basic_json<>::json_reverse_iterator<nlohmann::basic_json<>::iterator>': test/unit.cpp:4987:40: required from here src/json.hpp:6975:11: warning: base class 'class std::reverse_iterator<nlohmann::basic_json<>::iterator>' has a non-virtual destructor [-Weffc++] class json_reverse_iterator : public std::reverse_iterator<Base> ^ src/json.hpp: In instantiation of 'class nlohmann::basic_json<>::json_reverse_iterator<nlohmann::basic_json<>::const_iterator>': test/unit.cpp:5012:46: required from here src/json.hpp:6975:11: warning: base class 'class std::reverse_iterator<nlohmann::basic_json<>::const_iterator>' has a non-virtual destructor [-Weffc++] test/unit.cpp: In function 'void ____C_A_T_C_H____T_E_S_T____11880()': test/unit.cpp:11899:13: error: 'CHECK_NOTHROW' was not declared in this scope CHECK_NOTHROW(json(R"( ^ test/unit.cpp:11899:13: error: expected ';' at end of input test/unit.cpp:11899:13: error: expected '}' at end of input test/unit.cpp:11899:13: error: expected '}' at end of input test/unit.cpp:11899:13: error: expected '}' at end of input make: *** [json_unit] Error 1 well, bug 57824 of gcc 4.8.1 maybe can't be fixed. this PR only fix bug of STL library , and if we don't use C++11 raw strings literals, that will be fine. Sorry, but this is not helping. There is no way GCC 4.8 will ever support the library, and such workarounds only make the library harder to maintain. I shall add a link to this PR to the README file to document a solution for the STL issue. Thanks for the effort anyway! Since people are manually applying this patch, wouldn't it make sense to add this to master for now? It's not an awful maintenance burden from the look of it. I'm working with CUDA on CentOS 7, and that's all built on top of CentOS 7's GCC 4.8 stack. CentOS/RHEL 7 is the newest version available and is built on GCC 4.8, FYI. At least until Red Hat releases a successor to RHEL 7. 
This is what I'm thinking:
@@ -123,7 +123,7 @@ using json = basic_json<>;
     #error "unsupported Clang version - see https://github.com/nlohmann/json#supported-compilers"
   #endif
 #elif defined(__GNUC__) && !(defined(__ICC) || defined(__INTEL_COMPILER))
-  #if (__GNUC__ * 10000 + __GNUC_MINOR__ * 100 + __GNUC_PATCHLEVEL__) < 40900
+  #if (__GNUC__ * 10000 + __GNUC_MINOR__ * 100 + __GNUC_PATCHLEVEL__) < 40800
     #error "unsupported GCC version - see https://github.com/nlohmann/json#supported-compilers"
   #endif
 #endif
@@ -10435,6 +10435,22 @@ class basic_json
         return object.release();
     }
+    /// helper for insertion of an iterator (supports GCC 4.8+)
+    template<typename... Args>
+    iterator insert_iterator(const_iterator pos, Args&& ... args)
+    {
+        iterator result(this);
+        assert(m_value.array != nullptr);
+        #if defined(__GNUC__) && __GNUC__ <= 4 && __GNUC_MINOR__ <= 8
+        auto insert_pos = std::distance(m_value.array->begin(), pos.m_it.array_iterator);
+        m_value.array->insert(pos.m_it.array_iterator, std::forward<Args>(args)...);
+        result.m_it.array_iterator = m_value.array->begin() + insert_pos;
+        #else
+        result.m_it.array_iterator = m_value.array->insert(pos.m_it.array_iterator, std::forward<Args>(args)...);
+        #endif
+        return result;
+    }
     ////////////////////////
     // JSON value storage //
     ////////////////////////
@@ -14579,9 +14595,7 @@ class basic_json
         }
         // insert to array and return iterator
-        iterator result(this);
-        result.m_it.array_iterator = m_value.array->insert(pos.m_it.array_iterator, val);
-        return result;
+        return insert_iterator(pos, val);
     }
     JSON_THROW(type_error::create(309, "cannot use insert() with " + std::string(type_name())));
@@ -14632,9 +14646,7 @@ class basic_json
         }
         // insert to array and return iterator
-        iterator result(this);
-        result.m_it.array_iterator = m_value.array->insert(pos.m_it.array_iterator, cnt, val);
-        return result;
+        return insert_iterator(pos, cnt, val);
     }
     JSON_THROW(type_error::create(309, "cannot use insert() with " + std::string(type_name())));
@@ -14696,12 +14708,10 @@ class basic_json
         }
         // insert to array and return iterator
-        iterator result(this);
-        result.m_it.array_iterator = m_value.array->insert(
-            pos.m_it.array_iterator,
-            first.m_it.array_iterator,
-            last.m_it.array_iterator);
-        return result;
+        return insert_iterator(
+            pos,
+            first.m_it.array_iterator,
+            last.m_it.array_iterator);
     }
     /*!
@@ -14743,9 +14753,7 @@ class basic_json
         }
         // insert to array and return iterator
-        iterator result(this);
-        result.m_it.array_iterator = m_value.array->insert(pos.m_it.array_iterator, ilist.begin(), ilist.end());
-        return result;
+        return insert_iterator(pos, ilist.begin(), ilist.end());
     }
     /*!
That would reduce the maintenance burden by factoring out similar code into one place, and would also support GCC 4.8 partially. It could also be done without the #if, but this makes it easier to update when GCC 4.8 dies out (after CentOS/RHEL gets an update, for example). (I've left in the old assert, but it seems to not be in the current .hpp, so you could remove it. It's just in one place now instead of three.)
By the way, by "partial support", I just mean a user can't use raw strings in macros. That's already a requirement for any user using GCC 4.8 and is not a fault or failure of this library. My solution is actually fewer lines of code than the current version. ;)
I understand your efforts, but GCC 4.8 will never fully support this library, and such fixes seem to only enable a few use cases.
@henryiii Do you by any chance have your patch ported to the latest version?
No, I haven't updated it yet.
That's great. Would your patch be accepted as a PR, @nlohmann? This doesn't add much complexity for maintenance (and is currently having to be maintained separately by all the people running CentOS/RHEL 7).
@nlohmann I've added a PR with a GCC 4.8 Travis test and all tests fully pass. No #if's added anywhere.
For those watching this thread who have their own patches: develop now supports GCC 4.8, and it looks like 3.2.1 should have it! @henryiii Thanks for getting this in!
2025-04-01T04:34:55.054292
2021-08-12T20:00:19
969473028
{ "authors": [ "thomasyu888", "tschaffter" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9098", "repo": "nlpsandbox/nlpsandbox-controller", "url": "https://github.com/nlpsandbox/nlpsandbox-controller/issues/146" }
gharchive/issue
Remove the property Note.note_name from the note sent to a PHI annotator Is your proposal related to a problem? Not really an issue, but found while searching why the Spark NLP tool succeeds on the date annotation task but fails on the other tasks. Describe the solution you'd like The endpoint used by the controller to annotate a clinical note takes as input a Note object. However, MCW logs show that this object includes an extra property note_name that is not part of the Note schemas. While this does not appear to affect the Spark NLP tool, the property Note.note_name is not expected by tools and may lead to errors, e.g. if they cast the JSON payload to a Class object and this cast fails because of the unknown property. The solution is to remove the property Note.note_name from the Note object sent to annotators. Describe alternatives you've considered Additional context Tagging @gkowalski who helped me troubleshoot the Spark NLP issue. @tschaffter this is a duplicate of https://github.com/nlpsandbox/nlpsandbox-client/issues/132. Please review this original ticket and comment.
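As a sketch of the failure mode described in the issue (the key list below is a hypothetical stand-in, not the actual Note schema), a strict validator that rejects unknown properties would choke on note_name:

```javascript
// Hypothetical strict Note validator: the allowed keys are illustrative
// stand-ins, not the real Note schema from the NLP Sandbox spec.
const allowedNoteKeys = new Set(['text', 'type', 'patient_id']);

function validateNote(payload) {
  const unknown = Object.keys(payload).filter((k) => !allowedNoteKeys.has(k));
  if (unknown.length > 0) {
    // A tool that strictly casts the JSON payload to a class object
    // would fail here because of the unexpected property.
    throw new Error(`unknown properties: ${unknown.join(', ')}`);
  }
  return payload;
}
```

With this, a payload limited to schema keys passes while one carrying note_name throws, which is why dropping note_name before sending the note sidesteps the problem.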
2025-04-01T04:34:55.056829
2019-10-29T09:43:04
513799394
{ "authors": [ "AyeshaSarwar", "RafaelWO", "ntnshrav", "simonha94" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9099", "repo": "nlpyang/BertSum", "url": "https://github.com/nlpyang/BertSum/issues/82" }
gharchive/issue
"Step 4. Format to Simpler Json Files" not working Hi, I tried to run the code with the same data, but when executing the suggested line in Step 4 python preprocess.py -mode format_to_lines -raw_path RAW_PATH -save_path JSON_PATH -map_path MAP_PATH -lower nothing happened. Somehow the json files are not created. Can someone help me out? Or do I have to recreate the required format on my own? Somehow I was doing something wrong, the code is working fine, sorry! Can you please let me know how you solved it, because it is giving me the same response. Thanks in advance same for me, thanks in advance! I am also facing the same issue. Please let us know how it was solved for you. Sorry, I don't remember how I "solved" this for me. Maybe the path needs to be specified in a different way? Do all needed directories exist? You can always debug the code and find the issue (I did this as well to understand what exactly happens in each step).
2025-04-01T04:34:55.099185
2021-01-03T17:24:11
777681027
{ "authors": [ "Ekleog" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9100", "repo": "nmattia/naersk", "url": "https://github.com/nmattia/naersk/pull/136" }
gharchive/pull-request
Expose copySource for dealing with path-based dependencies This is an alternative to https://github.com/nmattia/naersk/pull/135; that one doesn't try to implement path dependency parsing, as:
* it'd need to be done recursively, which requires more work
* it'd have to make sure that workspaces with path dependencies don't end up breaking most if not all the benefits of two-derivation caching
* in most cases manually filling in copySources should not be too bad a thing to do (e.g. to compile cargo I just had to set copySources = ["crates"]), and anyway getting more of the benefit of two-derivation caching there would be very hard (impossible?)
What do you think about this PR? I've just pushed the missing comments asked in reviews :)
2025-04-01T04:34:55.111635
2020-09-14T02:34:44
700713487
{ "authors": [ "nmkataoka" ], "license": "BSD-3-Clause", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9101", "repo": "nmkataoka/adv-life", "url": "https://github.com/nmkataoka/adv-life/issues/100" }
gharchive/issue
Refactor createMerchant to use events Description Let's refactor the functions in createMerchant to use events. This is the first instance of possible nested events, so let's figure out how to implement something like redux-thunk to dispatch subevents from an event creator kind of function. Acceptance Criteria Tasks Use this section to keep track of work QA Notes Environments Steps Not sure if this is actually a better pattern, but it's definitely more complicated so I'm going to close this as invalid. Also, the event system has been significantly changed since this issue was opened. If in the future it makes sense to do a refactor like this, I'm sure I'll be able to open an issue then.
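The redux-thunk idea mentioned in the issue can be sketched in a few lines (createMerchant and the event type names here are hypothetical, not the project's actual code): if a dispatched "event" is a function, call it with dispatch so it can emit sub-events.

```javascript
// Thunk-style dispatcher sketch: plain events go to the handler,
// function events are invoked with dispatch so they can dispatch
// sub-events themselves.
function createDispatcher(handle) {
  const dispatch = (event) =>
    typeof event === 'function' ? event(dispatch) : handle(event);
  return dispatch;
}

const log = [];
const dispatch = createDispatcher((e) => log.push(e.type));

// Hypothetical event creator that emits two sub-events in order.
const createMerchant = () => (dispatch) => {
  dispatch({ type: 'CREATE_INVENTORY' });
  dispatch({ type: 'CREATE_MERCHANT' });
};

dispatch(createMerchant());
// log now holds ['CREATE_INVENTORY', 'CREATE_MERCHANT']
```

The trade-off noted in the close comment applies: this indirection adds complexity, which is why the simpler direct-call style was kept.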
2025-04-01T04:34:55.116764
2020-08-14T12:33:00
679126696
{ "authors": [ "vinnitu", "yurymalkov" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9102", "repo": "nmslib/hnswlib", "url": "https://github.com/nmslib/hnswlib/pull/241" }
gharchive/pull-request
Update ALGO_PARAMS.md is it correct? @vinnitu Thanks for the PR! I am not sure that calling dimensionality d instead of dim is better. Probably, it makes sense to directly define the parameter, not the same as I understand Makes sense. Thanks!
2025-04-01T04:34:55.121824
2021-05-18T23:30:35
894859809
{ "authors": [ "cathay4t" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9103", "repo": "nmstate/nmstate", "url": "https://github.com/nmstate/nmstate/pull/1592" }
gharchive/pull-request
ethtool: Introduce pause support Example:
ethtool:
  pause:
    autoneg: false
    rx: true
    tx: true
Unit test cases included. The integration test requires the netdevsim kernel driver, which does not exist in the CI environment. The test passes in Fedora 34 with NetworkManager-1.31.4-28344.copr.0ce59b5dc4.fc34.x86_64 CI passed but GitHub shows it as hung. Force merging.
2025-04-01T04:34:55.145234
2018-02-23T21:06:45
299850266
{ "authors": [ "no23reason" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9104", "repo": "no23reason/react-qr-svg", "url": "https://github.com/no23reason/react-qr-svg/pull/113" }
gharchive/pull-request
docs: migrate to simpler build and docs process The docs are now generated using catalog and the webpack-dev-server is replaced by it as well. Other packages have been updated as well. :tada: This PR is included in version 2.2.0 :tada: The release is available on: npm package (@latest dist-tag) GitHub release Your semantic-release bot :package::rocket:
2025-04-01T04:34:55.155821
2019-04-09T16:51:58
431073655
{ "authors": [ "davedelong", "noahsark769" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9105", "repo": "noahsark769/cifilter.io", "url": "https://github.com/noahsark769/cifilter.io/issues/76" }
gharchive/issue
Search terms should match parameter names and descriptions I'm searching for filters that take certain kinds of parameters (e.g. aspectRatio) and it'd be nice if searching would also look through the parameter names. Great idea! I'm going to close #39 in favor of this, since this one is more descriptive. I think this UI of putting a small tag next to the search result should probably do it! Fix will be merged soon 👍 @davedelong this is now live, thanks again for the suggestion!
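A minimal sketch of the requested behavior, with a made-up data shape (not the app's real filter model): match the search term against parameter names in addition to the filter name.

```javascript
// Illustrative filter data; the real app's model will differ.
const filters = [
  { name: 'CICrop', parameters: ['inputImage', 'inputRectangle'] },
  { name: 'CILanczosScaleTransform', parameters: ['inputImage', 'inputScale', 'inputAspectRatio'] },
];

// Case-insensitive match on either the filter name or any parameter name.
function search(term) {
  const t = term.toLowerCase();
  return filters.filter(
    (f) =>
      f.name.toLowerCase().includes(t) ||
      f.parameters.some((p) => p.toLowerCase().includes(t))
  );
}
```

Searching for "aspectRatio" then returns CILanczosScaleTransform via its inputAspectRatio parameter, even though the term appears nowhere in the filter's name.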
2025-04-01T04:34:55.177158
2024-12-23T17:17:41
2756453222
{ "authors": [ "nobShinjo" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9106", "repo": "nobShinjo/AnalyzeGitRepoLoc", "url": "https://github.com/nobShinjo/AnalyzeGitRepoLoc/pull/16" }
gharchive/pull-request
docs(changelog): correct order of fix entries Swapped the order of two recent fix entries in the changelog to accurately reflect their chronological sequence according to commit hashes. This change ensures clarity and maintains consistency with the project's version history documentation. This pull request includes a small change to the CHANGELOG.md file. The change corrects the order of the entries under the "Fixed" section to accurately reflect the corresponding commit messages. Changes in CHANGELOG.md: Swapped the order of entries to ensure paths and parameters are correctly quoted in analyze_git_repo_loc.ps1 and to convert date arguments to datetime objects.
2025-04-01T04:34:55.198223
2024-11-01T02:58:15
2628200342
{ "authors": [ "ScottTodd", "monorimet", "renxida" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9108", "repo": "nod-ai/SHARK-Platform", "url": "https://github.com/nod-ai/SHARK-Platform/issues/407" }
gharchive/issue
SD shortfin CI fails when apt is locked Should probably first check if cmake & ninja are installed & of the proper version before attempting to install. I don't think checking requires acquiring the apt lock. Workflow history: https://github.com/nod-ai/SHARK-Platform/actions/workflows/ci-sdxl.yaml Sample logs: https://github.com/nod-ai/SHARK-Platform/actions/runs/11621536945/job/32366319347#step:2:13 Text: Run sudo apt update -y WARNING: apt does not have a stable CLI interface. Use with caution in scripts. Reading package lists... E: Could not get lock /var/lib/apt/lists/lock. It is held by process 2271375 (apt) E: Unable to lock directory /var/lib/apt/lists/ (screenshots aren't searchable or copy/paste-able ;P) good point ty scott! Added a fix to https://github.com/nod-ai/SHARK-Platform/pull/411, but this will still throw an error if we don't have those installed and cannot acquire the lock. May not be an issue for a while. output logs: https://github.com/nod-ai/SHARK-Platform/actions/runs/11637357380/job/32410398564?pr=411#step:2:1 Added a fix to #411 Please keep changes small and send them as individual PRs. We shouldn't need to wait for a larger patch to be reviewed and landed to fix a broken CI build. Something is suspicious here about the runner environment. Basic dependency setup / installation shouldn't be a sticky source of failures. Might make sense to run these jobs in containers or limit what other code (if any) is running on these machines. Agree, will send up separate PR.
Containers are probably a bit heavy for this, but as long as we have env/machine setup flexibility validated within reason, it might be the most robust solution. I'm specifically wondering if apt update within a container is free from the hosting system's /var/lib/apt/lists/lock. We wouldn't need to use a container for packaging deps (we could)... just for isolation from other users on the runner machine.
2025-04-01T04:34:55.200736
2022-10-13T16:28:17
1408104126
{ "authors": [ "monorimet", "pashu123" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9109", "repo": "nod-ai/SHARK", "url": "https://github.com/nod-ai/SHARK/pull/401" }
gharchive/pull-request
Cleanup tank directory and move instructions to tank/README.md TFLite content is contained in the tank/tflite directory until it can be integrated and the directory removed. Any examples pertaining to specific shark_tank models outside of the pytest framework are moved to their model's directory in tank/examples. We want to have tank/ as the go-to place for trying out models with SHARK hence the separation from shark/examples. pytorch/ and tf/ directories have been removed. Testing + Benchmarking + Supported and Validated models sections from main README.md have been moved to tank/README.md. Amazing @monorimet. I will take a detailed look tomorrow morning.
2025-04-01T04:34:55.201949
2023-02-06T16:22:12
1572877452
{ "authors": [ "PhaneeshB", "Shukla-Gaurav", "powderluv" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9110", "repo": "nod-ai/SHARK", "url": "https://github.com/nod-ai/SHARK/pull/958" }
gharchive/pull-request
rand seed generation and check This is a temporary fix to get the correct seed info while saving the images. Ideally this should be part of the pipeline and will be integrated in upcoming PRs. Let's try to fix it the right way; no need to rush it. This has been fixed, closing it.
2025-04-01T04:34:55.231913
2024-12-17T17:22:27
2745556492
{ "authors": [ "chase9", "scottleestrange" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9111", "repo": "node-alarm-dot-com/homebridge-node-alarm-dot-com", "url": "https://github.com/node-alarm-dot-com/homebridge-node-alarm-dot-com/issues/139" }
gharchive/issue
Issue with 10.2 - ECONNRESET Connection issues with 10.2. I reverted back to 10.1 and problem is not occurring See below [17/12/2024, 10:03:33] [Security System] There was an error retrieving account settings. Please check that your credentials are correct and restart the plugin. [17/12/2024, 10:03:33] [Security System] GET https://www.alarm.com/web/api/identities failed: GET https://www.alarm.com/web/api/identities failed: Invalid response body while trying to fetch https://www.alarm.com/web/api/identities: read ECONNRESET [17/12/2024, 10:04:13] [Security System] refreshDevices Error: GET https://www.alarm.com/web/api/devices/sensors?ids[]=98182855-13&ids[]=98182855-8&ids[]=98182855-1&ids[]=98182855-2&ids[]=98182855-7&ids[]=98182855-10&ids[]=98182855-11&ids[]=98182855-14&ids[]=98182855-12&ids[]=98182855-6&ids[]=98182855-9&ids[]=98182855-5&ids[]=98182855-3&ids[]=98182855-4&ids[]=98182855-15 failed: request to https://www.alarm.com/web/api/devices/sensors?ids[]=98182855-13&ids[]=98182855-8&ids[]=98182855-1&ids[]=98182855-2&ids[]=98182855-7&ids[]=98182855-10&ids[]=98182855-11&ids[]=98182855-14&ids[]=98182855-12&ids[]=98182855-6&ids[]=98182855-9&ids[]=98182855-5&ids[]=98182855-3&ids[]=98182855-4&ids[]=98182855-15 failed, reason: read ETIMEDOUT [17/12/2024, 10:04:13] [Security System] Refreshing session authentication. [17/12/2024, 10:08:28] [Security System] There was an error retrieving account settings. Please check that your credentials are correct and restart the plugin. [17/12/2024, 10:08:28] [Security System] GET https://www.alarm.com/web/api/identities failed: GET https://www.alarm.com/web/api/identities failed: request to https://www.alarm.com/web/api/identities failed, reason: read ECONNRESET [17/12/2024, 10:11:59] [Security System] There was an error retrieving account settings. Please check that your credentials are correct and restart the plugin. 
[17/12/2024, 10:11:59] [Security System] GET https://www.alarm.com/web/api/identities failed: GET https://www.alarm.com/web/api/identities failed: request to https://www.alarm.com/web/api/identities failed, reason: read ECONNRESET [17/12/2024, 10:13:19] [Security System] There was an error retrieving account settings. Please check that your credentials are correct and restart the plugin. [17/12/2024, 10:13:19] [Security System] GET https://www.alarm.com/web/api/identities failed: GET https://www.alarm.com/web/api/identities failed: request to https://www.alarm.com/web/api/identities failed, reason: read ECONNRESET [17/12/2024, 10:14:55] [homebridge-network-presence] [Scott's iPhone] - disconnected from the network (mac: 14:1b:a0:ca:68:6f | ip:<IP_ADDRESS> | hostname:Scotts-iPhone.localdomain) Very strange since we didn't change any of the identity or partition code between 1.10.1 and 1.10.2. This almost looks like an issue with the timeout of the auth cookies. If you go back to the latest version now and reboot, does it work? - Chase Yes, it does work with the previous version. I reverted back and rebooted, all perfect. This morning I did an upgrade again, so far no errors. I will let you know.
2025-04-01T04:34:55.246478
2016-02-04T22:33:13
131493400
{ "authors": [ "gibbitz", "owiber", "sethen" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9112", "repo": "node-nock/nock", "url": "https://github.com/node-nock/nock/issues/466" }
gharchive/issue
No match for GET request I have been searching up and down for documentation to point me in the right direction of what the issue may be. I am getting a no match error when the URL I am mocking with nock is exactly the same. Here is the code: it('should be able to get all accounts', (done) => { nock(URL.API) .get('twitter_accounts/1/user_lookup') .query({ twitter_ids: [ 2 ] }) .reply(200, {}); const assertion = (action, expectedAction) => { return assert.deepEqual(action, expectedAction); }; const expectedActions = [ { accounts: { 2: 1 }, type: ACCOUNTS.RECEIVE_TWITTER_ACCOUNTS }, { accounts: { 1: { id: 1, twitter_id: 2 } }, type: ACCOUNTS.RECEIVE_ALL_ACCOUNTS }, { id: 1, type: ACCOUNTS.SET_CURRENT_ACCOUNT_ID }, { type: ACCOUNTS.REQUEST_TWITTER_DATA } ]; const store = mockStore({ assertion, done, expectedActions, state: { userReducer: { organizations: [ { twitter_accounts: [ { id: 1, twitter_id: 2 } ] } ] } } }); store.dispatch(AccountsServices.getAllAccounts()); }); Assume the URL.API is https://api/v1/... And when running my tests I receive and check the request, I get: { [Error: Nock: No match for request get https://api/v1/twitter_accounts/1/user_lookup?twitter_ids=2 ] status: 404, statusCode: 404 } So, what's the issue here? Is this a bug of some sort? I am not sure why this isn't working as intended. Since you specified an array for the value of twitter_ids, it's expecting the query parameter to be twitter_ids[]=. Also not sure if it's valid to have a path in the scope. It maybe should just be protocol + host. Even if I changed it to be query({ twitter_ids: '2,3,4,5' }), I get the same result. I am going to close this... After messing with the test a little longer the issue seemed to resolve itself. I will chalk this one up to a malformed request on my end. For googlers with this issue. 
I noticed that the service protocol + host (passed in the constructor) ended with a trailing slash and the path (passed in get() in my case) began with a slash when the requests matched. If they are written to join without a double slash, it won't match. When the path is deconstructed this is not obvious to those of us who are assuming how it may work under the hood (incorrectly in my case I guess). Hope this helps someone...
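The resolution above hinges on how the base URL and path join. A small sketch of the pitfall using plain strings (no nock required): a trailing-slash base joined with a leading-slash path yields a double slash, and whether an interceptor then matches depends on which form the code under test actually requests.

```javascript
// Base and path are illustrative; the point is the join behavior.
const base = 'https://api/v1/';
const path = '/twitter_accounts/1/user_lookup';

// Naive concatenation produces a double slash between base and path.
const naiveJoin = base + path;

// Stripping the trailing slash before joining keeps the path clean.
const cleanJoin = base.replace(/\/+$/, '') + path;
```

If the mock is registered against one form and the request goes out in the other, nock reports "No match for request" even though the URLs look identical at a glance.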
2025-04-01T04:34:55.261374
2022-05-01T15:50:13
1222197359
{ "authors": [ "Steve-Mcl", "coveralls" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9113", "repo": "node-red/node-red", "url": "https://github.com/node-red/node-red/pull/3567" }
gharchive/pull-request
Add "search for" buttons to notifications Types of changes [ ] Bugfix (non-breaking change which fixes an issue) [x] New feature (non-breaking change which adds functionality) Proposed changes As discussed here: https://discourse.nodered.org/t/flows-stopped-due-to-missing-node-types-a-feature-suggestion-to-help/61995 Adds buttons...
* "Search unused config nodes"
* "Search for unknown nodes"
* "Search for invalid nodes"
Before deploy notifications... 1. deploying bad import 2. deploying misconfigured nodes After deploy notifications... 3. stopped due to missing/unknown node types 4. successful deploy but unused config nodes remain Other stuff Look at the last two screenshots. One has no primary accent (white) on the close button (this is the original condition); the other has a primary accent (red) on the close button (this is new - this notification never had a close button). Which is preferable? When clicking "Search for xxxxx", should the notification close or remain open? Checklist [x] I have read the contribution guidelines [x] For non-bugfix PRs, I have discussed this change on the forum/slack team. [x] I have run grunt to verify the unit tests pass [ ] I have added suitable unit tests to cover the new/changed functionality Coverage increased (+0.006%) to 68.54% when pulling 8a972ee5435a13c1b5568d5d34a433ab031b6a07 on Steve-Mcl:notification-buttons into 4fb829261879e0d75adb1605a4882bf689ce889b on node-red:dev. All "Search for ..." buttons now close the parent dialog. Missing var added (the `let notification` so that `notification.close()` can be called). Spaces added to search terms.
2025-04-01T04:34:55.268270
2017-09-24T02:23:28
260051972
{ "authors": [ "cinderblock", "ivanseidel" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9114", "repo": "node-serialport/node-serialport", "url": "https://github.com/node-serialport/node-serialport/issues/1344" }
gharchive/issue
Serial port polling is limited to vsync speed SerialPort Version: 5.0.0 NodeJS Version: 8.x Operating System and Hardware Platform: OSx/Unix and Windows Summary of Problem Writing to SerialPort is fast, and not a problem. However, reading data from SerialPort is limited to the vsync of the monitor because uv_poll internally uses that to poll for changes. That means that a device can only send and receive up to 60 packets serially one after the other, although communication can go up to 500Kb/s without any problem. It seems to me that it should be using some sort of socket piping when reading from the Serial, but that seems unclear as no one has done that before. Looking through libuv, only this thread stated some sort of problem like that, but it is not really related: https://github.com/joyent/libuv/issues/1246 Steps and Code to Reproduce the Issue Make a loopback on any hardware, and log the time taken to write and wait for the data to come back from the SerialPort. Increase what's being written, and you will see a clear histogram in times taken to receive the data: ~16.66ms/~33.33ms/~50ms/... That is caused because reading is synchronized with the vsync from the monitor, and it shouldn't be, as SerialPort is unrelated to the screen rendering. Any thoughts on that, or is that behavior really wanted? It's really limiting for applications that could be much faster (@joaopedrovbs) You can confirm it by simply logging Date.now() - lastDate in lib/bindings/poller.js in the poll(eventFlag) function. That is the method called to check if data has been received... Polling is done in order to verify changes, while it would be best to use interrupts to trigger readings. But I'm not sure how that could be done. I tested in both Windows and Mac, and both gave me an exact 16ms period, and data always snaps to that while reading... Are there any tricks that could be used to reduce the latency from when the bytes come in on the wire to when they're available as port.on('data', ...)?
I have ~500Hz data coming in to the machine from a sensor. Most of the time it works fine, but for 1~5% of messages, they all arrive basically at once. A crazy thought: maybe it would be possible to integrate the readline parser and notify the JavaScript as soon as a \n byte is received. Or maybe not, and it's all the OS that is off doing something else and there would be nothing we can do in userland. This is happening on Windows and Linux. I could go to a realtime system, but it would be nice to stay in the node world.
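The measurement described in the reproduction steps can be sketched as a simple bucketing of round-trip times (the samples below are made up, standing in for Date.now() deltas taken in port.on('data') against the time of the preceding write): spikes at multiples of ~16.6 ms would confirm that reads snap to the poll interval.

```javascript
// Bucket observed round-trip times to the nearest multiple of the
// suspected poll interval (~16.6 ms, i.e. 60 Hz vsync).
function histogram(samplesMs, bucketMs = 16.6) {
  const counts = new Map();
  for (const ms of samplesMs) {
    const bucket = Math.round(ms / bucketMs) * bucketMs;
    counts.set(bucket, (counts.get(bucket) || 0) + 1);
  }
  return counts;
}

// Made-up samples; real ones would come from a loopback test.
const samples = [16.2, 16.9, 33.1, 33.5, 16.4, 50.2];
const buckets = histogram(samples);
```

If the buckets cluster tightly at 16.6, 33.2, 49.8, ... with almost nothing in between, that is the quantization the issue describes.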
2025-04-01T04:34:55.314071
2019-05-01T08:09:11
439096967
{ "authors": [ "jaulz", "puzrin" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9115", "repo": "nodeca/js-yaml", "url": "https://github.com/nodeca/js-yaml/pull/492" }
gharchive/pull-request
feat: allow type to be dumped as a different type With this change it will be possible to dump a type to a different type based on the chosen style. The predicate function can decide based on its style whether it can represent it or not. This feature looks strange. Could you explain why it may be useful for all to hack types on the fly? @puzrin for example "!square 4" could have two styles: "original" and "calculated". If you dump it with style "original" it will dump as "!square 4" again, but if you dump it with style "calculated" it should dump as "16" only. As far as I understand, the loader should be able to load back the dumped result. If you dump 16, that will be a number after load, not a custom type. It seems you're trying to use yaml as a "programming language"; that's attractive, but not a good idea. @puzrin actually, I would like to use it as a preprocessor to generate a final yaml file that only needs built-in types and no custom types. Any suggestion for any other option? IMHO it's better to keep things separate and prepare data to its final state before dump. Mh, okay, so there's no chance to merge this? If yes, do you know other tools I could use for preprocessing? I tend to reject as "out of project scope". Too bad... thanks anyway for the input!
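The maintainer's suggestion - prepare data to its final state before dump - can be sketched like this (Square is a stand-in custom type, not js-yaml API): resolve computed values into plain built-in types first, then hand the result to any YAML dumper.

```javascript
// Stand-in custom type: "!square 4" resolves to 16.
class Square {
  constructor(n) { this.n = n; }
  resolve() { return this.n * this.n; }
}

// Walk a plain JS tree and replace custom-type instances with their
// computed built-in values, leaving everything else untouched.
function resolveTree(value) {
  if (value instanceof Square) return value.resolve();
  if (Array.isArray(value)) return value.map(resolveTree);
  if (value && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [k, resolveTree(v)])
    );
  }
  return value;
}

const doc = { width: new Square(4), tags: [new Square(2), 'plain'] };
const resolved = resolveTree(doc);
```

After this pass, resolved contains only numbers, strings, arrays, and plain objects, so the dumper never needs to know about custom types - the separation the maintainer argued for.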
2025-04-01T04:34:55.356985
2016-01-10T22:21:25
125849539
{ "authors": [ "bengl", "chrisdickinson", "distracteddev", "rvagg" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9116", "repo": "nodejs/docs", "url": "https://github.com/nodejs/docs/pull/54" }
gharchive/pull-request
Update membership section of GOVERNANCE While running through the responses to the call for membership, I realized that the existing GOVERNANCE document was a bit lax on details for membership. Specifically, it did not answer: Who should be members? What are members expected to do? How should individuals go about becoming a member? Under what circumstances does a member leave the WG? In addition, the CONTRIBUTING doc was a bit off from where we're at right now: it's expected, at least in the near term future, that we'll be operating primarily in the nodejs/node repo. This PR attempts to address the questions above and is meant as jumping off point for discussion! I'm going to cc the folks who have publicly raised their hand to help out on this issue too, to make sure they're OK with all of this. In addition, I will reach out privately to all Node collaborators who have touched the docs in the last 6 months who are not part of the Docs WG. This does not address process, goal-setting, or tooling, but that should follow in subsequent PRs to be discussed by the WG and CTC. /cc @nodejs/documentation /cc @mikeal — could you make sure this looks okay w/r/t WG setup? /cc @rvagg — You've expressed interest in the operation of the Docs WG in other issues, and I'd love your thoughts on this. /cc @kosamari, @a0viedo — you've both raised your hands to help out (thank you!), are there any questions you have that the membership section doesn't answer? Are the responsibilities and access outlined here in line with what you'd expect, or is it too much? If we've moved to Slack, which I definitely think we should, then can we remove the gitter badge from the readme and officially close down the gitter channel if there's anyone left in it? Everything looks like it was addressed to LGTM. Merging this now. Thanks all! apologies for not finding the time to review this after being pulled in, it sounds like it's going well though so +1!
2025-04-01T04:34:55.618565
2021-06-15T12:20:56
921337457
{ "authors": [ "nodemationqa" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9117", "repo": "nodemationqa/nodeQA", "url": "https://github.com/nodemationqa/nodeQA/issues/215" }
gharchive/issue
Test Issue created at Tue, 15 Jun 2021 12:20:56 GMT Test issue body Tue, 15 Jun 2021 12:20:56 GMT Comment on issue at Tue, 15 Jun 2021 12:20:57 GMT
2025-04-01T04:34:55.619390
2021-12-16T02:23:11
1081699938
{ "authors": [ "nodemationqa" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9118", "repo": "nodemationqa/nodeQA", "url": "https://github.com/nodemationqa/nodeQA/issues/394" }
gharchive/issue
Test Issue created at Thu, 16 Dec 2021 02:23:10 GMT Test issue body Thu, 16 Dec 2021 02:23:10 GMT Comment on issue at Thu, 16 Dec 2021 02:23:12 GMT
2025-04-01T04:34:55.621371
2023-10-12T14:14:35
1940095637
{ "authors": [ "toshify" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9119", "repo": "nodepa/seedling", "url": "https://github.com/nodepa/seedling/issues/420" }
gharchive/issue
Feat C.3 – Practicing read-along 1. 马丽 taps the “read out loud” icon and words are highlighted as they are spoken by audio playing along for the entire text. 2. 马丽 taps the “read out loud” icon again, and the audio pauses while the highlight remains. 3. 马丽 taps the “read out loud” icon again, and the audio continues. 4. The audio completes, and the lesson’s next stage is automatically displayed. The comprehension exercise type is detailed in #277 And implemented by #368
2025-04-01T04:34:55.663397
2023-12-07T16:44:13
2031137096
{ "authors": [ "TomAFrench", "swapilar" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9120", "repo": "noir-lang/noir", "url": "https://github.com/noir-lang/noir/pull/3715" }
gharchive/pull-request
Swapilar patch 3 Description These improvements are part of the Cryptoversidad grant proposal. Hey, I'm going to close this PR as it's just rewording an existing sentence.
2025-04-01T04:34:55.668031
2016-03-09T16:03:34
139619413
{ "authors": [ "theamundell" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9121", "repo": "nolimits4web/Swiper", "url": "https://github.com/nolimits4web/Swiper/issues/1638" }
gharchive/issue
AutoHeight feature not updating on slide change I've attempted using this awesome new feature. However, I can't get it working all the way. The active slide is giving the wrapper and all slides the same height but it's not updating when I go to the next slide. It seems this was a css issue, sorry about that, closing now.
2025-04-01T04:34:55.670191
2017-06-12T19:48:42
235345502
{ "authors": [ "eduardopopego", "nolimits4web", "tripboxer" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9122", "repo": "nolimits4web/Swiper", "url": "https://github.com/nolimits4web/Swiper/issues/2128" }
gharchive/issue
Dynamically add Slider to website (after ajax call) I'm trying to build a website with multiple swiper sliders which are loaded dynamically with ajax. The pagination and the controls seem to be shown correctly for each loaded slider, but none of the sliders is working properly. Is there any way to dynamically add new sliders to the website (not slides to an existing slider!)? Hi! I could be wrong, but I think you should create a new instance of the Swiper object for each slider that you want to create, each of the latter with a unique class name or #ID. var mySwiper1 = new Swiper('.swiper-container-unique-class-name', { loop: true, pagination: '.swiper-pagination', }); var mySwiper2 = new Swiper('.swiper-container-another-unique-class-name', { loop: false, pagination: '.another-swiper-pagination' }); So basically you should create a new mySwiper# variable and initialize Swiper for every new slider you wish to add. You could set this upon success of the ajax call or whatever you feel is best. Haven't tested it, but seems reasonable. Closing as looks like resolved or due to inactivity
2025-04-01T04:34:55.671505
2013-08-01T18:59:01
17524626
{ "authors": [ "ianespanto", "mattdemps" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9123", "repo": "nolimits4web/Swiper", "url": "https://github.com/nolimits4web/Swiper/issues/291" }
gharchive/issue
Changing focus does not update translate If focus changes through tabbing or through javascript and shifts the swiper, the translate value is not accurate causing the swiper to be offset by however much the focus shifted the content. Same here. Translate does not seem to update after I resize the window with another tab.
2025-04-01T04:34:55.673215
2015-01-28T17:06:38
55784659
{ "authors": [ "nolimits4web", "uavn" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9124", "repo": "nolimits4web/Swiper", "url": "https://github.com/nolimits4web/Swiper/pull/1073" }
gharchive/pull-request
Update idangerous.swiper.js https://github.com/nolimits4web/Swiper/issues/973 Almost ok, but this should be document.activeElement.nodeName.toLowerCase() === 'input' and document.activeElement.nodeName.toLowerCase() === 'textarea' because some browsers may lower/uppercase it Closing, as the new Swiper 3 has been released with a different API and a rewritten core
2025-04-01T04:34:55.677651
2021-09-03T16:48:54
987947692
{ "authors": [ "nba94", "nolimits4web" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9125", "repo": "nolimits4web/swiper", "url": "https://github.com/nolimits4web/swiper/issues/4916" }
gharchive/issue
Version 7 issue with css (it seems) Hi All, having an issue since upgrade from version 6.8.4 to version 7.x.x (having issues since 7) Apologies, it is my first time posting an issue - hence I may have missed out on some important information - please shout and I will add what's required. Problem overview: Previously, the slider would appear as follows: However after version 7 has been released, the same page started appearing as follows: I've tried keeping version 6.8.4 on just the CSS files, and it seems like most of the design stays as it should be, the only issue is pagination is missing! Please see the screenshot: I tried understanding whether any substantial CSS changes have been made from version 6.8.4 to version 7, but couldn't work it out. I've decided to stay on version 6.8.4 for the time being, but wondering if anyone else had any issues with CSS since the update? https://swiperjs.com/migration-guide#html-layout
2025-04-01T04:34:55.680812
2022-09-05T12:35:14
1361879349
{ "authors": [ "abvas", "nolimits4web" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9126", "repo": "nolimits4web/swiper", "url": "https://github.com/nolimits4web/swiper/issues/6035" }
gharchive/issue
Dynamic change of pagination parameters. How to change pagination parameters dynamically? For example enable/disable dynamicBullets on button click. I'm trying to change the value through: mySwiper.params.pagination.dynamicBullets = .... And then I do: mySwiper.update(); mySwiper.pagination.update() but it doesn't work. Here is a small demo: https://codepen.io/abvas/pen/VwxYWae press the "ON/OFF" button at the top. Please tell me what I am doing wrong. Thank you. Do you want to ask a question? Are you looking for support? Stack Overflow or Swiper Discussions are the best places for getting support Please, don't use GitHub issues for questions
2025-04-01T04:34:55.695418
2022-07-16T19:56:12
1306890242
{ "authors": [ "Nozemi", "nomis" ], "license": "CC0-1.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9127", "repo": "nomis/logitech-z906", "url": "https://github.com/nomis/logitech-z906/pull/3" }
gharchive/pull-request
Pin Out Reference Image/Colors Pin out reference image, with colors to map the original console cable's wires. Added a color column to the pin out table. Moved the images into their own directory. Updated the image reference in the firmware.rst file so it references the new image location. Could you use SVG instead of JPEG for images/console-port-diagram.jpg? The pins in the diagram are from the perspective of the amplifier connector but the colours are for the console cable which has the pins in the mirrored order... The colours in here don't match those on https://www.reddit.com/r/hardwarehacking/comments/hnpprk/hacking_the_logitech_z906_speaker_system/. There's no reason why Logitech would have to use the same colour wires on every batch so it's possible they're not always the same. There are two variants of the console - with and without an exposed programming port - so these could have different wire colours too. I noticed that the wiring colors aren't matching, not sure what's up with that. Haven't really had time to look into it either, so I haven't done anything more with this pull request. I have both of the consoles at hand, so I suppose I could check to see if there is a difference between them. However, the wiring diagram on the Reddit link seems to work, at least for the cable I cut off the connectors of. Closing this PR, will open a new one if I can find time to produce something definitively correct, and useful. However, the wiring diagram on the Reddit link seems to work, at least for the cable I cut off the connectors of. I don't recommend cutting the cables because they use a standard connector... I suggest you get a VGA extension cable next time and cut that in half. The colours of the wires are only useful if you're cutting cables otherwise the pin numbering is sufficient.
2025-04-01T04:34:55.732708
2020-06-22T20:16:08
643333641
{ "authors": [ "vintprox" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9128", "repo": "nonstory/nonstory", "url": "https://github.com/nonstory/nonstory/issues/4" }
gharchive/issue
Management (arbiter) permissions I'd like to introduce a proof-of-concept NsMgmt table. There is a lot to come up with for permissions, but it will be good to have something simple in the end. So, NS Manager is the title a user bears when they get all permissions to manage NonStory settings for a guild or for separate channels. NS Management can also solve disputes regarding questionnaires with override commands given to them, force a lesser countdown, and apply disciplinary means to players. NSM is bound to the NsUser model, which makes migration of rights easier if they suddenly decide to switch accounts (described in #3). That also requires sending a notice to the server's admins before they can do so, to prevent possible security holes - some kind of DM poll for/against keeping permissions after an identity switch.

ns_mgmt Indexes: id, PRIMARY; user_id + guild_id, UNIQUE; user_id + channel_id, UNIQUE

| user_id | guild_id | channel_id | (comment) |
| --- | --- | --- | --- |
| 1 | 6 | - | OK. User #1 is managing guild #6 now |
| 1 | 6 | - | ERROR! Record with user #1 and guild #6 cannot repeat |
| 2 | 6 | - | OK. User #2 is managing guild #6 now |
| 1 | 7 | - | OK. User #1 is managing guild #7 now |
| 1 | - | 12 | OK. User #1 is managing channel #12 now |
| 1 | - | 12 | ERROR! Record with user #1 and channel #12 cannot repeat |
| 2 | - | 12 | OK. User #2 is managing channel #12 now |
| - | 6 | 17 | OK. Channel #17 in guild #6 is marked for NS Management |
| - | 6 | 12 | ERROR! Cannot mark two channels for NS Management per one guild (two UNIQUE indexes constraint) |

As per #9, there will be a few changes in the models. They include renaming the "channel" abstraction to "room" and "guild" to "server", allowing broader chat client coverage.

ns_mgmt Indexes: id, PRIMARY; user_id + server_id, UNIQUE; user_id + room_id, UNIQUE

| user_id | server_id | room_id | id | (comment) |
| --- | --- | --- | --- | --- |
| 1 | 6 | - | 100 | OK. User 1 is managing server 6 now |
| 1 | 6 | - | . | ERROR! Record with user 1 and server 6 cannot repeat |
| 2 | 6 | - | 101 | OK. User 2 is managing server 6 now |
| 1 | 7 | - | 102 | OK. User 1 is managing server 7 now |
| 1 | - | 12 | 103 | OK. User 1 is managing room 12 now |
| 1 | - | 12 | . | ERROR! Record with user 1 and room 12 cannot repeat |
| 2 | - | 12 | 104 | OK. User 2 is managing room 12 now |
| - | 6 | 17 | 105 | OK. Room 17 in server 6 is marked for NS Management |
| - | 6 | 12 | . | ERROR! Cannot mark two rooms for NS Management per one server |
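The two UNIQUE indexes described above can be exercised with a quick SQLite sketch (column names follow the renamed server/room model; note this is an illustration, not the project's actual code, and SQLite, like many engines, treats NULLs as distinct in unique indexes, so the NULL-user rows from the last table row would need an extra check):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE ns_mgmt (
        id        INTEGER PRIMARY KEY,
        user_id   INTEGER,
        server_id INTEGER,
        room_id   INTEGER,
        UNIQUE (user_id, server_id),  -- one management record per user per server
        UNIQUE (user_id, room_id)     -- one management record per user per room
    )
    """
)

def try_insert(user_id, server_id, room_id):
    """Return 'OK' or 'ERROR' like the comment column in the tables above."""
    try:
        conn.execute(
            "INSERT INTO ns_mgmt (user_id, server_id, room_id) VALUES (?, ?, ?)",
            (user_id, server_id, room_id),
        )
        return "OK"
    except sqlite3.IntegrityError:
        return "ERROR"

results = [
    try_insert(1, 6, None),   # OK: user 1 manages server 6
    try_insert(1, 6, None),   # ERROR: (user 1, server 6) cannot repeat
    try_insert(2, 6, None),   # OK: user 2 manages server 6
    try_insert(1, None, 12),  # OK: user 1 manages room 12
    try_insert(1, None, 12),  # ERROR: (user 1, room 12) cannot repeat
]
```

The composite unique constraints enforce the "cannot repeat" rows directly at the database level, without application-side checks.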
2025-04-01T04:34:55.740030
2017-05-24T14:21:39
231057163
{ "authors": [ "BUONJG", "mbroadst" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9129", "repo": "noodlefrenzy/node-amqp10", "url": "https://github.com/noodlefrenzy/node-amqp10/issues/327" }
gharchive/issue
Error management Hello, Could you please tell us how we can handle errors: Opening the connection: new AMQPClient(...).connect(...); On accepting, rejecting, releasing messages We tried to connect to an invalid amqp address; the connect method never returns nor throws an exception. When we remove the network connection, accept, reject and release return but never throw an exception either. Thank you in advance. @BUONJG there are examples in the usage section, and for the client itself you can listen to client:errorReceived, which is admittedly oddly named for legacy reasons, until we can update it for a major version increment
2025-04-01T04:34:55.779869
2023-07-31T04:03:25
1828361703
{ "authors": [ "VitorMendesco" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9130", "repo": "nordic-io/calixto", "url": "https://github.com/nordic-io/calixto/pull/11" }
gharchive/pull-request
✨ Feature/CU-8678dm8f8 - PLOP, Rollup, Typography and more PLOP, Rollup, Typography and more... Adding the PLOP template file generator with .hbs template files for components and providers. Also renamed wrappers to providers. The different text components in baseui were consolidated into a single component named Text, where the different sizes and weights can be accessed by the variant property! The list of available variants is typed, and the variants have the same names as the baseui typography components. The useStyles custom hook was also added; it can be used from anywhere in the lib, but is focused on providing an easier way to apply quick styles from other applications. Rollup was removed as the build tool, and its configuration was moved to Vite instead (which uses Rollup under the hood, by the way, but in an easier way). The packages in general were also updated to their latest LTS versions; the most important change is: storybook: 7.0.27 > 7.1.1 CHANGELOG feat(📦packages): updating packages to latest version fix(💄general): adjusting imports, linting and improving types feat(✨atoms): adding text component fix(🎨packages and storybook): removing unused packages and updating storybook configs feat(✨plop): adding plopfile and templates to components and providers feat(🎨StyleProvider): minor improvements and renaming StylesProvider to ThemeProvider test(✅useStyles): adding helper tests and modifying eslint import order feat(✨helpers): adding useStyles custom hook to generate css stylesheets feat(🔥styles wrapper): renaming with provider suffix feat(🔥button): renaming to pascal case and modifying overrides feat(✅tests and storybook): updating config and react compatibility feat(👷build): moving rollup config to vite config file and updating path aliases Task linked: CU-8678dm8f8 PLOP template manager
2025-04-01T04:34:55.801510
2019-03-04T17:15:02
416905271
{ "authors": [ "BcRikko", "omerimzali" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9131", "repo": "nostalgic-css/NES.css", "url": "https://github.com/nostalgic-css/NES.css/pull/320" }
gharchive/pull-request
is-disabled with disable cursor Description Compatibility Caveats Good idea 👍 Thanks 🙇 NES.css's cursors are made at 16x16 size. So could you make this cursor at 16x16 size? 🤔 And if using a black theme, I cannot see this cursor, because it's all black. So could you make the inside white with a black border? 🤔 Like this: I'll try to use this. @omerimzali Thanks for changing 👍 But it seems that the way I conveyed it was not good. Sorry... 😭 So could you make this cursor at 16x16 size? 🤔 So could you make the inside white with a black border? 🤔 I want a cursor as below. Image size is 32x32 Cursor drawn with 16x16 pixels, like this cursor-click.png White background color with black border The inside of the cursor is transparent It looks like this now. Is there anything I can do to merge this request? It's still marked as review required 😭 NES.css's cursors are made at 16x16 size. Cursor drawn with 16x16 pixels, like this cursor-click.png Could you create a NES-like pixel art cursor? Like this
2025-04-01T04:34:55.816032
2022-01-31T07:23:43
1119062528
{ "authors": [ "ogost" ], "license": "Unlicense", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9132", "repo": "not-jan/pass-ssh", "url": "https://github.com/not-jan/pass-ssh/pull/2" }
gharchive/pull-request
if ssh address is not present, make one from user and IP If you have passwords stored like this: <password> user: ogost IP: <IP_ADDRESS> then make ssh_address by concatenating user@IP. I hope this time I got it right.
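A minimal sketch of that fallback in Python (the function name and dict-shaped entry are illustrative only, not pass-ssh's actual internals):

```python
def ssh_address(entry: dict) -> str:
    """Prefer a stored ssh address; otherwise build one as user@IP."""
    stored = entry.get("ssh_address")
    if stored:
        return stored
    # Fallback: concatenate the 'user' and 'IP' fields from the pass entry.
    return f"{entry['user']}@{entry['IP']}"

# An entry shaped like the one in the PR description (IP is a placeholder):
print(ssh_address({"user": "ogost", "IP": "198.51.100.4"}))
```

The explicit ssh_address field, when present, always wins over the constructed one.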
2025-04-01T04:34:55.836011
2023-12-08T09:43:47
2032323782
{ "authors": [ "HuKeping", "NiazFK", "SteveLasker", "Two-Hearts", "justincormack", "priteshbandi", "toddysm", "vaninrao10", "yizha1" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9133", "repo": "notaryproject/.github", "url": "https://github.com/notaryproject/.github/issues/56" }
gharchive/issue
Nominate Shiwei Zhang (@shizhMSFT) as a Notary Project Org maintainer At the Notary Project community meeting held on December 4, 2023, the community discussed and agreed on nominating new Notary Project Org maintainers from subproject maintainers according to the governance guide. I would like to nominate Shiwei Zhang (@shizhMSFT) as a Notary Project Org maintainer. Shiwei Zhang (@shizhMSFT) has been a Notary Project subproject maintainer for these subprojects: specifications, notation, notation-go, notation-core-go, notation-action and notation-plugin-framework-go. He has made significant and unique contributions to the Notary Project since joining the community. From a routine work perspective, Shiwei Zhang (@shizhMSFT) has demonstrated a high meeting participation rate considering the time zone, and has contributed significantly to quality control by reviewing PRs and providing comments on issues. The following table shows his contributions to the Notary Project sub-projects in the last 12 months. He has made an amazing 681 PR comments, almost 2 PR comments per day.

| Name (GitHub ID) | PR | Issue | PR comments | Issue comments | Meetings |
| --- | --- | --- | --- | --- | --- |
| Shiwei Zhang (@shizhMSFT) | 8 | 20 | 681 | 168 | 40/78 |

It is not only the number of comments: if you take a deeper look at the comments that Shiwei Zhang (@shizhMSFT) made, they are insightful and deeply thought-out; see an example: https://github.com/notaryproject/notation/pull/601#discussion_r1150810148. For maintainers and contributors, I believe you all have the feeling that if your PRs are reviewed and approved by Shiwei Zhang (@shizhMSFT), it means high quality. Shiwei Zhang (@shizhMSFT) is a security expert. He led the design and implementation of the Notary Project COSE signature in the notation-core-go library and is the main contributor to the go-cose library, which is used by the notation-core-go library. He has also demonstrated his expertise in the recent security audit and has received credits for several security advisories.
Not to mention the latest notation threat model was created with his guidance. Last but not least, Shiwei Zhang (@shizhMSFT) participated in KubeCon China (due to the pandemic, this conference had not been held for several years) and promoted Notary Project solutions. He answered questions and did demos at the project booth for folks interested in the Notary Project. The following table shows more information about his activities in the last 12 months:

| Name (GitHub ID) | Activities |
| --- | --- |
| Shiwei Zhang (@shizhMSFT) | KubeCon China 2023 (project booth), Notation Security audit 2023 |

I believe that Shiwei Zhang (@shizhMSFT)'s experience and security expertise will be an asset to the Notary Project, and I am confident that he will continue to make valuable contributions to the community. The data shown in the above tables is copied from https://github.com/notaryproject/.github/issues/54 Please comment "+1" to vote for Shiwei Zhang (@shizhMSFT); according to the governance guide and the agreement at the community meeting, we need two votes out of three from the current org maintainers. Tag Org maintainers: @NiazFK @justincormack @SteveLasker /cc other maintainers: @cipherboy @OliverShang @FeynmanZhou @HuKeping @JeyJeyGao @duffney @gokarnm @mnm678 @priteshbandi @Two-Hearts @rgnote @iamsamirzon @toddysm @shizhMSFT @vaninrao10 @yizha1 @zr-msft +1 +1 +1 +1 (Hu Keping, via email reply) +1 +1 +1 +1 as per the discussion in the meeting. As per discussion in this morning's community meeting, and based on the votes for @shizhMSFT, we decided to move forward with adding him to the Org maintainers. I will go ahead and create the necessary PRs to update the relevant MAINTAINERS and CODEOWNERS files
2025-04-01T04:34:55.844734
2024-03-24T15:12:59
2204402892
{ "authors": [ "JeyJeyGao", "omkhard", "yizha1" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9134", "repo": "notaryproject/notation", "url": "https://github.com/notaryproject/notation/issues/910" }
gharchive/issue
Why does notation take only the root certificate in the truststore for verifying? What is not working as expected? We are providing a certificate chain, from the root certificate through an intermediate certificate to the leaf cert, and the leaf private key for signing; but during verification, only the root certificate in the truststore and the subject of the leaf certificate are enough. My question is: how and where is the integrity of the leaf key verified against the leaf cert? I saw that some (hash-algorithm) revocation check happens, and it creates a payload and verifies using it. May I get some more in-depth knowledge about the signer's public key integrity check? What did you expect to happen? Explanation How can we reproduce it? Chain Signing/Verifying Describe your environment wget, OS: Linux, shell: bash What is the version of your Notation CLI or Notation Library? 1.1.0 @omkhard I believe you asked the same question on the slack channel. As I commented in slack, would you mind checking this verification specification to see whether it helps answer your question. Thanks.
these are the steps I am trying:

1. Creating keys using openssl (the config file is for notation-specific X509 options, with CA:True):
   openssl req -x509 -newkey rsa:2048 -keyout root.key -out root.crt -days 365 -nodes -subj "/C=US/ST=WA/L=Seattle/O=Notary/CN=root" -config /root/.config/notation/localkeys/rootTmp.cnf
   openssl req -out inter1.csr -newkey rsa:2048 -keyout inter1.key -nodes -subj "/C=US/ST=WA/L=Seattle/O=Notary/CN=inter1" -config /root/.config/notation/localkeys/rootTmp.cnf
   openssl x509 -req -in inter1.csr -CAkey root.key -CA root.crt -days 365 -CAcreateserial -out inter1.crt
2. Creating the leaf cert using inter1.crt and inter1.key (the config file is for notation-specific leaf X509 options, with CA:False):
   openssl req -out inter2.csr -newkey rsa:2048 -keyout inter2.key -nodes -subj "/C=US/ST=WA/L=Seattle/O=Notary/CN=inter2" -config ~ubuntu/ctrSign/tmp1.cnf
   openssl x509 -req -in inter2.csr -CAkey inter1.key -CA inter1.crt -days 365 -CAcreateserial -out inter2.crt -extfile /root/.config/notation/localkeys/v3.ext
3. Copying all the ".key"s & ".crt"s into $XDG_CONFIG_HOME/notation/localkeys/ and making a concat.crt using:
   cat inter2.crt inter1.crt root.crt > concat.crt
4. Specifying concat.crt and inter2.key in $XDG_CONFIG_HOME/notation/signingkeys.json
5. Signing the image (docker login etc. all OK); it signed successfully.
6. During verification, if I use concat.crt in the truststore ($XDG_CONFIG_HOME/notation/truststore/x509/ca/sign/concat.crt), reference it in $XDG_CONFIG_HOME/notation/trustpolicy.json, and give the x509.subject of the leaf certificate, the verification fails. But if I put only root.crt in the truststore, the verification succeeds.

Q: Are we not taking the entire chain for verification of an image signed with the entire chain?

@omkhard The trust store is used to store trusted certificates. When you put root certificates in the trust store, that means you trust these roots.
To answer your question, notation requires the entire chain for validation. The chain is stored in the signature envelope, which is added by the signer.
2025-04-01T04:34:55.913944
2024-10-19T05:54:47
2598784979
{ "authors": [ "Gauravtb2253", "notsoocool" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9135", "repo": "notsoocool/codecache", "url": "https://github.com/notsoocool/codecache/issues/125" }
gharchive/issue
Improving text visibility Is your feature request related to a problem? Please describe. Currently, the labels of the input fields in the add snippets section are not clearly visible. Describe the solution you'd like I would like to change the color to fix it. Additional context The labels are not clearly visible; a white color would be better @Gauravtb2253 I already fixed this yesterday; you can see the changes in the develop branch.
2025-04-01T04:34:55.917839
2023-12-06T18:52:48
2029192836
{ "authors": [ "nouritsu" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9136", "repo": "nouritsu/rs-pseudocode", "url": "https://github.com/nouritsu/rs-pseudocode/issues/7" }
gharchive/issue
Testing: Add tests for the Expression Parser Tests should handle edge cases like:
- chained operators
- varying operator precedence
- grouping expressions
- invalid literals
- invalid variable names
... and more

Cargo's built-in test framework is perfect for this. There is no need to section tests off into a different folder, as this is a small project. Tests for variable expressions added Tests for unary expressions added The good first issue label has been removed because this requires understanding the structure of the parser in great detail.
2025-04-01T04:34:55.921660
2024-09-13T00:14:15
2523558401
{ "authors": [ "noraj", "nova1751" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9137", "repo": "nova1751/hexo-shiki-plugin", "url": "https://github.com/nova1751/hexo-shiki-plugin/issues/2" }
gharchive/issue
Question: themes Which Hexo themes is this plugin compatible with? You can install this plugin to try it out; it is not developed for a specific theme. Yeah, but theme developers have to support it, right? Do you know themes that use this plugin? As far as I understand, the plugin generates the HTML, but the theme is responsible for the CSS. Or am I wrong, and this plugin works with 99% of themes without them needing to support anything? The main job of this plugin is to hijack the rendering of code blocks during the rendering process, so the code block content is not affected by themes. However, you cannot get the theme's original styles for code blocks. That's why I added some extra code to beautify the code blocks, which you can use in other themes. Thanks, that's amazing; the minimal styling allows this plugin to work universally with every theme. I'd just suggest changing the font-size: 1em; to font-size: 100%; or even not setting it explicitly, so as not to override the chosen size. In my case it resulted in normal small-sized text with the code being a lot bigger. Thanks for your advice. However, as I mentioned above, the style for code blocks may vary from theme to theme, so it may not be a good idea to change the css file, as it may work fine in other themes. My suggestion is to customize your own css file; the css_cdn option of the plugin config is exposed for that. Close through #3 #4 .
2025-04-01T04:34:55.931368
2022-10-02T10:03:31
1393709142
{ "authors": [ "jessej-samuel" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9138", "repo": "novuhq/hacksquad-website", "url": "https://github.com/novuhq/hacksquad-website/issues/13" }
gharchive/issue
lint: eslint creates errors when adding new tailwind classes to pages https://github.com/novuhq/hacksquad-website/blob/6e15da800e2d7b45179baee7a85c3e14c311b030/.lintstagedrc#L3 When lint-staged checks for linting errors on git add, it concurrently runs git add twice, leading to a race condition. So, I suggest a hotfix for the same. Remove linting checks for .jsx files, as they don't have a lot of JavaScript to check for linting errors Oops! Apparently this error can be prevented by stopping the dev server before committing.
2025-04-01T04:34:55.935085
2015-11-20T03:53:42
117960468
{ "authors": [ "Fuzion24", "giantpune" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9139", "repo": "nowsecure/android-vts", "url": "https://github.com/nowsecure/android-vts/issues/66" }
gharchive/issue
Why test for CVE-2011-XXXX? The Play Store description says that we only support Android 4.0.3 and higher. We include a test for psneuter, which was fixed in Android 2.3.x; it would take a special kind of screwup somewhere for this test to ever come up as 'failed'. Any suggestions on the heuristics to use here? Only check CVEs within X years of the current build date, or something? If the app can be installed on Android devices below 4.0.3, then should it check for bugs that were fixed in Android 2.3? My logic here was: if an OEM (or a community ROM like CM) takes a device with an old kernel and ports a new Android version to it, it's very possible that that device still has said kernel vulnerability. I think we should be very careful about deprecating these kinds of checks for that reason. Maybe when the number of checks that we are running starts to become an issue (because of the time it takes to run, etc.), then we can consider doing something like this. I think a good stop-gap here is sorting by CVE date (currently implemented) so that the most relevant results are first.
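The heuristic floated above (sort by CVE date, optionally drop CVEs older than X years relative to the build date) could be sketched like this; the function, field names, and dates are illustrative, not code from android-vts:

```python
from datetime import date, timedelta

def order_checks(checks, build_date, max_age_years=None):
    """Sort checks newest CVE first; optionally drop checks whose CVE is
    more than max_age_years older than the device's build date."""
    if max_age_years is not None:
        cutoff = build_date - timedelta(days=365 * max_age_years)
        checks = [c for c in checks if c["cve_date"] >= cutoff]
    return sorted(checks, key=lambda c: c["cve_date"], reverse=True)

checks = [
    {"name": "psneuter", "cve_date": date(2011, 3, 15)},
    {"name": "stagefright", "cve_date": date(2015, 8, 5)},
]
# Sorting alone keeps old checks but shows recent ones first;
# a 3-year window relative to a 2016 build would drop psneuter entirely.
recent = order_checks(checks, build_date=date(2016, 1, 1), max_age_years=3)
```

The two modes mirror the thread: sorting (currently implemented) as the stop-gap, filtering as the more aggressive option that risks missing forward-ported old kernels.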
2025-04-01T04:34:55.942219
2018-03-26T00:52:11
308403230
{ "authors": [ "adamchenwei", "denis-sokolov", "tannerlinsley" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9140", "repo": "nozzle/react-static", "url": "https://github.com/nozzle/react-static/issues/519" }
gharchive/issue
what is the correct way to update an existing project built with an older version of react-static? I have a project built with an earlier version of react-static. What is the best approach to update it to the newest version? Do we have a dedicated section/tool for this purpose? Thanks. I tried just updating the version and redoing npm install or yarn install, but that failed me, as I wasn't even able to get the yarn stage to work, which is a new feature. Also, it seems the latest version of react-static no longer recognizes import in static.config.js... why is that? This seems to be a support request, not a bug report or functionality discussion. Consider other avenues for help, such as Stack Overflow, forums, and chats. The GitHub issue tracker is most often used specifically for changes to the project itself. So I guess older versions breaking so much is not something the react-static team cares about? Is that really a support question, or maybe this is a call for more backward-compatibility consideration, as not everyone who uses the library will keep it up to date version by version, especially since this library breaks very frequently. I upgraded just two or three months back, and now the newer version breaks again, and it's not as if I did anything extraordinary or hacky... Don't get me wrong, I appreciate all your efforts in maintaining this beautiful library, but it exhausts devs who use it with so many breaking changes and without a more transparent way to upgrade... That's really it... Minimizing the number of breaking changes is a valuable priority, and so is marking those changes in a clear way. The changelog of this project is as clear as I can imagine: it mentions all 3 breaking changes in the last major version and gives clear paths on how to upgrade. Beyond that, if you have more practical suggestions on what exactly could be done that would bring a lot of benefit, you are welcome to specify them.
Re: breaking changes: The changelog is sufficient, all breaking changes are made in accordance with semver, and each version is thoroughly tested to the best of our knowledge before release. Re: Getting more help: I highly suggest you join the Slack organization if you are experiencing a non-widespread issue. There are plenty of individuals there that can help. Re: import problems: Most likely this is due to a .babelrc misconfiguration and is not a known issue with the library. If you continue to have issues after consulting our forums, or believe you have found a bug with react-static, please submit a new issue. Thanks! k, thank you all for being patient, I will check out the slack group for sure!
2025-04-01T04:34:55.944612
2017-12-23T11:27:48
284301629
{ "authors": [ "denis-sokolov", "tannerlinsley" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9141", "repo": "nozzle/react-static", "url": "https://github.com/nozzle/react-static/pull/260" }
gharchive/pull-request
Remove package-lock.json, fixup 58610df Two lock files, package-lock.json and yarn.lock, run a high risk of different people and CI environments having different dependencies, defeating the purpose of the lock files. In addition, a single lock file clearly indicates the preferred tool in the project; two lock files make it confusing. The package.json scripts and the project history show that yarn has been the preferred tool since the beginning, so let us remove the npm lock file. Nice catch. Yarn all the way!
2025-04-01T04:34:55.948531
2017-03-23T08:49:20
216350736
{ "authors": [ "cdlm", "estebanlm", "jecisc" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9142", "repo": "npasserini/iceberg", "url": "https://github.com/npasserini/iceberg/issues/306" }
gharchive/issue
Question: is it needed to download a zip when adding a local repo? There is a high chance that this question will lead to nothing, but I wonder if it is necessary to download a zip each time we add a local repository already existing on disk? If the repo is big it can take some time, and I don't know if this is needed since we already give a git repo. When I add a new repository to any other Git GUI outside Pharo, it can fetch all the info needed without cloning/downloading sources. No, it is not. If you do: Metacello new repository: 'gitlocal://path/to/my/repo'; baseline: 'MyBaseline'; load. it will load your locally-based sources. Note: /path/to/my/repo/ needs to point to the filetree structure. For example, in this project: https://github.com/estebanlm/worklog, you will need to do this: Metacello new repository: 'gitlocal://path/to/worklog/mc'; baseline: 'Worklog'; load. What I do is download an image from Jenkins with the project inside. Then I install Iceberg and add my local repo (already present on disk). And after I add the repo, I have a progress bar indicating that Iceberg is extracting a zip. Is that normal? Tried the Metacello gitlocal:// thing, got a self halt in IceMetacelloRepositoryAdapter>>canUpgradeTo: Oops, someone left it there... I'm removing it right now.
2025-04-01T04:34:55.960555
2016-08-25T20:54:35
173307864
{ "authors": [ "mcowart123", "roji" ], "license": "PostgreSQL", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9143", "repo": "npgsql/Npgsql.EntityFrameworkCore.PostgreSQL", "url": "https://github.com/npgsql/Npgsql.EntityFrameworkCore.PostgreSQL/issues/93" }
gharchive/issue
Errors generating new DB from code first migration Using VS 2015, I am having some errors generating a new database from a code-first migration: Add-Migration does not add a reference to 'NpgsqlTypes' to the generated files; this is easy enough to work around though. Update-Database fails on the first table it tries to create because the associated serial sequence does not exist:
Npgsql.PostgresException: 42P01: relation "frontends_id_seq" does not exist
   at Npgsql.NpgsqlConnector.DoReadMessage(DataRowLoadingMode dataRowLoadingMode, Boolean isPrependedMessage)
   at Npgsql.NpgsqlConnector.ReadMessageWithPrepended(DataRowLoadingMode dataRowLoadingMode)
   at Npgsql.NpgsqlDataReader.NextResultInternal()
   at Npgsql.NpgsqlDataReader.NextResult()
   at Npgsql.NpgsqlCommand.Execute(CommandBehavior behavior)
   at Npgsql.NpgsqlCommand.ExecuteNonQueryInternal()
   at Microsoft.EntityFrameworkCore.Storage.Internal.RelationalCommand.Execute(IRelationalConnection connection, String executeMethod, IReadOnlyDictionary`2 parameterValues, Boolean openConnection, Boolean closeConnection)
   at Microsoft.EntityFrameworkCore.Storage.Internal.RelationalCommand.ExecuteNonQuery(IRelationalConnection connection, IReadOnlyDictionary`2 parameterValues, Boolean manageConnection)
   at Microsoft.EntityFrameworkCore.Migrations.Internal.MigrationCommandExecutor.ExecuteNonQuery(IEnumerable`1 migrationCommands, IRelationalConnection connection)
   at Microsoft.EntityFrameworkCore.Migrations.Internal.Migrator.Migrate(String targetMigration)
   at Microsoft.EntityFrameworkCore.Design.MigrationsOperations.UpdateDatabase(String targetMigration, String contextType)
   at Microsoft.EntityFrameworkCore.Tools.Cli.DatabaseUpdateCommand.<>c__DisplayClass0_0.<Configure>b__0()
   at Microsoft.Extensions.CommandLineUtils.CommandLineApplication.Execute(String[] args)
   at Microsoft.EntityFrameworkCore.Tools.Cli.Program.Main(String[] args)
42P01: relation "frontends_id_seq" does not exist
Migrations-Example.zip Above are the generated migration
files. Looking at the server, the new database is created, but is completely empty (the schemas aren't even created). I see there are 'EnsureSchema' lines being added, but is there something in the code-first declaration I may be missing to make sure they are created? I am not super-familiar with code-first design; I'm just using it because database-first in EF core does not singularize entity names properly. I created a database by hand and have queried against it successfully using my model. Can you please include the model from which you generated the problematic migration? If possible try to extract the minimal code which reproduces the issue. Models.zip Attached are the models. They're built into a DLL that I can share between my other projects. I have a dummy asp.net core project that references the DLL to build the migrations, with the following setup: public void ConfigureServices(IServiceCollection services) { services.AddDbContext(options => options.UseNpgsql(, b => b.MigrationsAssembly("Neighborhood.API.Migrations"))); } Any word on this? I provided the model code I'm using. Sorry, most of my time was spent elsewhere recently. I'll get around to this soon. Finally got some time to look at this. The source of the error is the configuration you're doing in the ConfigureDB method in Frontend.cs: entity.Property(e => e.Id) .HasColumnName("id") .HasDefaultValueSql("nextval('frontends_id_seq'::regclass)"); You're setting up a default for the id column, but you're missing the schema for frontends_id_seq, which is accounting. However, while it's possible to manually set up your model this way, it's not really needed - the Npgsql EFCore provider will do most of this work for you. It's enough to have a property called Id on an entity to have Npgsql create it as a PostgreSQL serial column, which is an int backed by a sequence that gets created implicitly. 
In other words, try removing all the HasSequence and HasDefaultValueSql directives and everything should be taken care of for you. Am closing this as there doesn't seem to be an issue, but feel free to ask more questions etc. I took that code directly from what was generated when I was using DB first in the beginning. Good to know it's not needed to explicitly include that piece.
2025-04-01T04:34:56.032206
2024-01-04T17:35:53
2066071637
{ "authors": [ "johndhancock", "thomaswilburn" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9144", "repo": "nprapps/pym.js", "url": "https://github.com/nprapps/pym.js/issues/200" }
gharchive/issue
Adding "allow" attribute to iframe tag Is it possible to add the allow attribute as one of the configuration options for the parent instance? I have an embed that attempts to use the user's location via the geolocation API in the browser to customize the content of the embed. In order to do this in the generated iframe, I need to add the allow attribute to the iframe tag with a value of 'geolocation'. You may want to look at Sidechain, which is Pym-compatible and supports allow attribute pass-through: https://github.com/nprapps/sidechain Thanks for the tip. Forgive me if I'm missing it, but is there documentation on how to pass through the attribute? There isn't, I never added documentation for it. But if you set <side-chain src="..." allow="..."></side-chain>, it should carry it through to the internal iframe.
2025-04-01T04:34:56.079690
2021-12-13T17:04:08
1078771301
{ "authors": [ "internetti", "ludydoo" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9145", "repo": "nrc-no/core", "url": "https://github.com/nrc-no/core/pull/319" }
gharchive/pull-request
COR-424 fix docker build Didn't find a way to use scratch as a base image yet. Seems that linking sqlite causes quite a lot of problems on that front. Reverting to golang base image for now Also, use env substitution for css files /cc internetti /cc neb42 /lgtm
2025-04-01T04:34:56.081223
2020-02-05T06:51:51
560175979
{ "authors": [ "Gei0r" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9146", "repo": "nrenner/brouter-web", "url": "https://github.com/nrenner/brouter-web/issues/276" }
gharchive/issue
By default, why is downhillcost=60 and uphillcost=0? Maybe this is just a confusion of terms, but why is there a cost in going downhill but none in going uphill by default in brouter-web? This has been answered here.
2025-04-01T04:34:56.131303
2021-01-19T16:10:15
789155175
{ "authors": [ "scala-steward" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9147", "repo": "nrinaudo/kantan.sbt", "url": "https://github.com/nrinaudo/kantan.sbt/pull/181" }
gharchive/pull-request
Update sbt-mdoc to 2.2.16 Updates org.scalameta:sbt-mdoc from 2.2.12 to 2.2.16. GitHub Release Notes - Version Diff I'll automatically update this PR to resolve conflicts as long as you don't change it yourself. If you'd like to skip this version, you can just close this PR. If you have any feedback, just mention me in the comments below. Configure Scala Steward for your repository with a .scala-steward.conf file. Have a fantastic day writing Scala! Ignore future updates Add this to your .scala-steward.conf file to ignore future updates of this dependency: updates.ignore = [ { groupId = "org.scalameta", artifactId = "sbt-mdoc" } ] labels: sbt-plugin-update, semver-patch Superseded by #184.
2025-04-01T04:34:56.138808
2022-12-14T17:00:11
1497042431
{ "authors": [ "Drena2020", "Tatskaari", "ZackDeRose", "bcabanes", "galuszkak", "juristr" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9148", "repo": "nrwl/monorepo.tools", "url": "https://github.com/nrwl/monorepo.tools/pull/87" }
gharchive/pull-request
Add please to the website Fixes: #67 thanks for submitting. I'll have a look later today/tomorrow. I think the build failed due to format issues. Can you run yarn nx format locally and then push again? Thanks! I've gotten an internal review from our side, so we're happy to land when you are. Hey! Hope you had a good holiday break. Any chance of a review here? Hi! Thank you for the PR, we are on a stretch right now but will attend to it. This one is delicate, since the design needs to be adjusted to show more and more tools. We'll come back to you! Hey @bcabanes Is there any progress on this? Hey guys! Just a friendly bump to see if there's any progress here. :) Hey @Tatskaari - I'm taking over monorepo.tools for now - I had a chance to look over please and I think it's a legitimate addition to the site. I think your submission is mostly fine. Can you: rebase your PR with the latest, and adjust the order for please so that it fits in alphabetically with the others? I'll look forward to adding your submission when these are done. Hey @Tatskaari - just sending a friendly poke! Looks like this PR was left out a bit in the past, but I'm actively managing the repo from now on! Let me know if you're still interested in including please!
2025-04-01T04:34:56.188803
2022-11-16T08:47:09
1451150000
{ "authors": [ "ikovalev1", "roman-khimov" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9149", "repo": "nspcc-dev/neo-go", "url": "https://github.com/nspcc-dev/neo-go/issues/2803" }
gharchive/issue
blockchain stops working after network issue even though the issue has been fixed If there is an internal network issue like a broadcast storm or an internal switch failure, the main chain stops working even though the network issue has been fixed. How to reproduce: Prepare K6 to generate traffic:
# ./scenarios/preset/preset_grpc.py --size 1024 --containers 1 --out grpc.json --endpoint n1:8080 --preload_obj 10 --policy 'REP 4 IN X CBF 1 SELECT 4 FROM * AS X'
Limit network traffic between all nodes:
iptables -A OUTPUT -d node1 -m limit --limit 3/sec --limit-burst 1 -j ACCEPT
iptables -A OUTPUT -d node2 -m limit --limit 3/sec --limit-burst 1 -j ACCEPT
iptables -A OUTPUT -d node3 -m limit --limit 3/sec --limit-burst 1 -j ACCEPT
iptables -A OUTPUT -d node4 -m limit --limit 3/sec --limit-burst 1 -j ACCEPT
iptables -A OUTPUT -d node1 -j DROP
iptables -A OUTPUT -d node2 -j DROP
iptables -A OUTPUT -d node3 -j DROP
iptables -A OUTPUT -d node4 -j DROP
Launch K6:
# ./k6 run -e DURATION=1800 -e WRITE_OBJ_SIZE=8192 -e READERS=10 -e WRITERS=10 -e DELETERS=10 -e DELETE_AGE=10 -e REGISTRY_FILE=registry.bolt -e GRPC_ENDPOINTS=n1:8080 -e PREGEN_JSON=../grpc.json scenarios/grpc.js
In 15 min, restore internal traffic:
# iptables -F
Try to create a container - error:
# neofs-cli container create -r node4:8080 -w /etc/neofs/storage/wallet.json --policy "REP 2 IN X CBF 3 SELECT 2 FROM * AS X" --await --basic-acl public-read-write
Enter password > container ID: 83FpYMtt4U1mjXFtFUKVB3tANHsXWmEghH4kmFxY5HN awaiting... timeout: container has not been persisted on sidechain
Look at the log - the main chain is not running:
# journalctl -fu neogo-morph-cn | grep 'persisted to disk'
Most likely your timeout waiter is just not patient enough. dBFT uses exponential backoff for timers (and it's important for the protocol to function correctly), so if you have (severe) problems for a long time you need to wait more for the algorithm to pick up. NOBUG to me.
Theoretically some extensible message retransmission can improve this, but I don't see any clear scheme for it that won't lead to excessive useless traffic.
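The exponential timer backoff mentioned above can be sketched as follows. This is an illustrative model of dBFT-style view-change timeouts, not neo-go's actual implementation; the base interval and cap are assumptions:

```python
def view_timeout(base_seconds: float, view_number: int, cap_seconds: float = 300.0) -> float:
    """Timeout before requesting a view change: doubles with each failed view.

    After a long outage the view number has grown large, so a node must
    wait correspondingly long before consensus can pick up again.
    """
    return min(base_seconds * (2 ** view_number), cap_seconds)

# Timeouts for the first few failed views with a 15 s base interval.
timeouts = [view_timeout(15.0, v) for v in range(5)]
```

This is why a short wait right after restoring the network is not enough: the timer has to expire at its current (large) view number before the nodes agree again.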
2025-04-01T04:34:56.190117
2021-09-27T14:47:55
1008245120
{ "authors": [ "AnnaShaleva", "fyrchik" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9150", "repo": "nspcc-dev/neo-go", "url": "https://github.com/nspcc-dev/neo-go/pull/2201" }
gharchive/pull-request
Faster state switch Close #2156 . Also, the discussion https://github.com/nspcc-dev/neo-go/pull/2201#discussion_r744694554 is still relevant.
2025-04-01T04:34:56.212967
2019-07-01T18:57:33
462861559
{ "authors": [ "evilpie", "marczellm" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9151", "repo": "nt1m/livemarks", "url": "https://github.com/nt1m/livemarks/issues/178" }
gharchive/issue
Feed problem: Keyboard Corner Feed URL: http://forums.musicplayer.com/cache/rss18.xml Add-on version: 1.21 Describe the bug: My livemark pointing to the above feed does not update. If I try to remove and re-add the livemark, I get an error message saying: "Error: no XML data" I viewed the request object in the debugger, and even though the response mimetype is set by the server as "application/xml", the responseXML member is null. The response and responseText members do contain the XML feed. Invalid XML: XML Parsing Error: not well-formed Location: http://forums.musicplayer.com/cache/rss18.xml Line Number 76, Column 40 What XML validator did you use? This is from Firefox trying to parse that feed. You can see the error message when opening the Browser Console. Thanks. It's a bug with the forum software that generated the RSS.
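The behaviour described above (responseXML is null because the document is not well-formed) can be reproduced outside the browser with any strict XML parser; here is a minimal sketch using Python's standard library, with made-up feed snippets rather than the actual broken feed:

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text: str) -> bool:
    """Return True if the text parses as well-formed XML, False otherwise."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

good = is_well_formed("<rss><channel><title>ok</title></channel></rss>")
# Mismatched tags, similar in spirit to what broke the feed above.
bad = is_well_formed("<rss><channel><title>broken</channel></rss>")
```

A strict parser rejects the second document entirely, which is exactly why Firefox leaves responseXML null even though responseText contains the bytes.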
2025-04-01T04:34:56.250769
2023-09-20T06:45:59
1904275739
{ "authors": [ "dagnelies", "wesdevpro", "zhaolinlau" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9152", "repo": "ntohq/buefy-next", "url": "https://github.com/ntohq/buefy-next/issues/110" }
gharchive/issue
Documentation(website)? Description A documentation website for buefy-next is needed Why Buefy needs this feature To make it easier for developers to read IMHO we may not have to worry about the docs for now. Ideally we could just lock the documentation for Buefy v2 at a subdomain and then add the latest changes to the official Buefy site This will be addressed very soon! Thank you for your response I think this is crucial for multiple reasons: it's more visible/official/real that way it's clearly documented, especially regarding the "getting started" part, which would differ from the Vue 2 Buefy it's a perfect proof of concept that the components actually work broken components are seen and reported much more quickly ...otherwise it kind of stays in the shadow as a "let's see if it's someday ready" fork. That said, I know it's kind of a gigantic task, and it's definitely a lot of work. I just wanted to say that despite this, it should have the highest priority IMHO, even more so than fixing the components and tests.
2025-04-01T04:34:56.398612
2023-01-17T19:06:34
1536898236
{ "authors": [ "nkoppisetty", "sarabala1979", "vigith" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9158", "repo": "numaproj/numaflow-sinks", "url": "https://github.com/numaproj/numaflow-sinks/pull/14" }
gharchive/pull-request
chore: Adding flexibility to exclude the label in prometheus pusher Signed-off-by: Saravanan Balasubramanian<EMAIL_ADDRESS> Basically, inference will copy all source labels and add them to the new anomaly metrics. But if the user wants to exclude any labels from the source, they can configure it here. No, we won't be adding all the src labels. We will only add what is required, as labels lead to high-cardinality data. We won't need any feature to exclude labels; just an add-labels feature will be useful. +1
2025-04-01T04:34:56.399896
2014-06-16T22:44:49
35843259
{ "authors": [ "hargup", "jayvius", "pitrou" ], "license": "bsd-2-clause", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9159", "repo": "numba/numba", "url": "https://github.com/numba/numba/issues/498" }
gharchive/issue
Implement support for various numpy reduce functions Implement support for various numpy reduce functions like sum, mean, median, etc. This was mostly done in PR #833. median is still unsupported. @pitrou Can I take up the work on implementing median? Yes, you can!
2025-04-01T04:34:56.402908
2022-09-27T10:46:09
1387537556
{ "authors": [ "gmarkall" ], "license": "bsd-2-clause", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9160", "repo": "numba/numba", "url": "https://github.com/numba/numba/pull/8467" }
gharchive/pull-request
Convert implementations using generated_jit to overload Mixing generated_jit() and overload() causes fallbacks to object mode - this should not be happening in Numba's internal implementations. This commit converts all uses of generated_jit() into overload() for function implementations in Numba. I originally wrote these changes as part of an attempt at removing pipeline fallback to see how it would go, where these tests passed. However, from the failures here it looks like the removal of the fallback harmonized overload and generated_jit somewhat, so some further change will be needed here to make the tests pass again (if it is possible to make this change work without removing fallback). I must have been testing badly locally. The additional commit should (:crossed_fingers:) resolve the errors in the original commit.
2025-04-01T04:34:56.405478
2021-03-26T21:36:20
842311574
{ "authors": [ "christianp", "java12man" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9161", "repo": "numbas/Numbas", "url": "https://github.com/numbas/Numbas/issues/813" }
gharchive/issue
Gap Fills Not Being Added or Displayed Correctly When attempting to add a gap to a gap-fill question, the button is unresponsive. Adding the gap through the source code directly does not show in the test run of the problem. Even with gaps added to the part, it is impossible to add the gap to the prompt for the student to enter a response. The gap seems to be working as intended, and it is simply a bug with the prompt. Thanks for reporting this, and sorry I let this slip in: I updated the TinyMCE editor, but forgot to check the various bits I'd customised still worked. It should be working now. Gap fills still have bugs, but only in the editor interface, and using source code allows them to be used with full functionality. Not a huge issue but it should be known. Specifics: Clicking the gap fill button is useless; clicking it does nothing. Gap fills "sometimes" preview but other times just show as [[index]], e.g. "[[1]]". Have you tried clearing your browser's cache? I'm so sorry for bothering you, that was it. Thanks for the help.
2025-04-01T04:34:56.411191
2024-04-14T18:12:21
2242294186
{ "authors": [ "gwhitney" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9162", "repo": "numberscope/backscope", "url": "https://github.com/numberscope/backscope/pull/120" }
gharchive/pull-request
fix: Support nonzero sequence shifts (positive and negative) Store the shift of a sequence in the database, and parse both negative and positive indices of entries. Unskip the positive-offset test and add a test for a sequence with a negative offset (and fewer terms than is requested, to make sure that works as well). Resolves #77. Resolves #50. The SequenceInterface provides the first and last indices of the sequence, so it is able to use that information if it wants to. (And indeed, it is good practice for a visualizer to respect that information, as not all sequences have a term at index 0 or at index 1 -- so a visualizer should not just blithely start using values at any specific index. It should always look at the sequence's first and last indices.) Feel free to merge when you are comfortable.
2025-04-01T04:34:56.544213
2015-06-25T01:42:59
90825234
{ "authors": [ "VCarrasco", "numtel", "sushitommy" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9163", "repo": "numtel/meteor-mysql", "url": "https://github.com/numtel/meteor-mysql/issues/23" }
gharchive/issue
.change() method does not get new rows added after Meteor started If a game (new row) has been added to a player after Meteor started, and I use .change(player_id) to get the current games for this player, then I'm not getting the last row (the newest one). Can you provide any example code? I'm able to reproduce this. Here's the function: liveDB = new LiveMysql( { // removed code }); Meteor.publish('mData', function(id, filter) { if (this.userId == null) return; if (typeof id == "undefined") return; if (typeof liveDB == "undefined") { } this.onStop(function() { if (liveDB.db.threadId != null) { liveDB.end(); } }); var query; if (filter.timespan == undefined) { query = "" // query 1 } else { query = "" // query 2 } var cursor = liveDB.select(query, [ { table: table } ]); return cursor; }); If I call 'change' on the mysql subscription and query 2 is used instead of query 1, the mysql subscription stays empty.
2025-04-01T04:34:56.546882
2020-12-10T19:51:08
761561623
{ "authors": [ "maxvonhippel" ], "license": "bsd-2-clause", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9164", "repo": "nunchaku-inria/nunchaku", "url": "https://github.com/nunchaku-inria/nunchaku/issues/36" }
gharchive/issue
README references missing files The README references the file docs/examples/first_order.nun, which is not actually present in the repository. I think either the missing file should be added, or, the README should be updated to correctly reference the (I assume identical?) file here. Thanks!
2025-04-01T04:34:56.552190
2023-11-16T03:24:48
1995979400
{ "authors": [ "nuniz", "tpoindex" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9165", "repo": "nuniz/TorchSpectralGating", "url": "https://github.com/nuniz/TorchSpectralGating/issues/1" }
gharchive/issue
torchgating command unable to run without CUDA even when --cpu specified When trying to run 'torchgating' on an x86 macOS machine, an assertion is thrown with or without the --cpu flag. torchgating sample.wav --graph ... File "/usr/local/lib/python3.11/site-packages/torch/cuda/__init__.py", line 289, in _lazy_init raise AssertionError("Torch not compiled with CUDA enabled") with --cpu: torchgating sample.wav --graph --cpu ... File "/usr/local/lib/python3.11/site-packages/torchgating/run.py", line 168, in main assert not opt.cpu or torch.cuda.is_available() The problem is at line 168 in torchgating/run.py: assert not opt.cpu or torch.cuda.is_available() The line should be: assert opt.cpu or torch.cuda.is_available() Thank you; I've resolved this. Can you please check if this works for you? Yes, works. Thanks.
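The boolean fix above can be checked in isolation; `cpu_requested` and `cuda_available` below are hypothetical stand-ins for `opt.cpu` and `torch.cuda.is_available()`:

```python
def buggy_check(cpu_requested: bool, cuda_available: bool) -> bool:
    # Original line 168: rejects a --cpu run whenever CUDA is unavailable.
    return (not cpu_requested) or cuda_available

def fixed_check(cpu_requested: bool, cuda_available: bool) -> bool:
    # Corrected: a CPU run is always fine; a GPU run requires CUDA.
    return cpu_requested or cuda_available

# The case from this report: --cpu on a machine without CUDA.
buggy = buggy_check(True, False)   # False -> would raise AssertionError
fixed = fixed_check(True, False)   # True  -> runs on CPU as intended
```

Walking the truth table this way makes it clear the original assertion had the polarity of the `--cpu` flag inverted.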
2025-04-01T04:34:56.555984
2017-08-29T10:28:34
253604795
{ "authors": [ "elrumordelaluz", "nunofgs" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9166", "repo": "nunofgs/clean-deep", "url": "https://github.com/nunofgs/clean-deep/pull/25" }
gharchive/pull-request
Replace lodash.isarray with Array.isArray() Since Array.isArray() is already used in line 34, and also to avoid the npm deprecation warning. Perfect. Thank you.
2025-04-01T04:34:56.589493
2024-04-26T08:43:52
2265276283
{ "authors": [ "W-WJ-Ryan", "pinkhydroflask", "xinnyyy" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9167", "repo": "nus-cs2030/2324-s2", "url": "https://github.com/nus-cs2030/2324-s2/issues/770" }
gharchive/issue
Updating of wait time Hello! Does anyone have any advice on how to check why the wait time is not updating when getWaitTime is called in my simulator class? In my implementation, the wait time is only calculated in my serve event, where I take the current timeStamp - customer arrival time, while the rest of the event classes have an empty implementation. Here is my current method call in the simulator: However, in my output the sequence of events seems to be correct, but the waitTime remains 0, resulting in the final avgWaitTime also being 0.000: hi can i ask what does your getWaitingTime(waitTime) method do? Does it pass the waitTime into the event and then get the event to perform the update on it? hello! for ur serve event, i think u need to do waitingTime = waitingTime + timeStamp - this.customer.findArrivaltime(); instead of just current timeStamp - customer arrival time, because the waiting time needs to be passed down from arrival event all the way to done event i see okay! thanks for your help it works now :)
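The accumulation pattern suggested above can be sketched in a language-neutral way; the event functions and fields here are hypothetical illustrations, not the course's actual classes:

```python
# Each event carries the waiting time accumulated so far; only the serve
# event adds to it, but every event must pass the running total along
# (overwriting it with a fresh value would lose earlier waits).
def arrival(time_stamp, waiting_time):
    return waiting_time          # nothing to add yet, just pass it on

def serve(time_stamp, arrival_time, waiting_time):
    # Accumulate: waiting_time + (serve time - arrival time).
    return waiting_time + (time_stamp - arrival_time)

total = arrival(1.0, 0.0)
total = serve(3.5, 1.0, total)   # customer arrived at t=1.0, served at t=3.5
```

If `serve` instead computed only `time_stamp - arrival_time` without the running total, any wait recorded before that event would be discarded, which matches the all-zeros symptom described above.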
2025-04-01T04:34:56.590597
2017-01-31T14:34:59
204315365
{ "authors": [ "BrandonTJS", "okkhoy" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9168", "repo": "nus-cs2103-AY1617S2/addressbook-level2", "url": "https://github.com/nus-cs2103-AY1617S2/addressbook-level2/pull/121" }
gharchive/pull-request
[T3A3][T09-A3] Tan Jian Sin Ready for review @BrandonTJS some comments added. please check. please close the PR once you have noted the suggestions.
2025-04-01T04:34:56.591819
2017-07-07T05:32:28
241161834
{ "authors": [ "akd13", "nishantbudhdev" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9169", "repo": "nus-cs2103-AY1617S4/addressbook-level3", "url": "https://github.com/nus-cs2103-AY1617S4/addressbook-level3/pull/24" }
gharchive/pull-request
[T5A1][T2] Akankshita Dash Files edited due to Javadoc and Checkstyle indentation/line length rules @akd13 Are you checking isMutating before storing to file? Some comments added. Please close the PR after reading them.
2025-04-01T04:34:56.596862
2018-03-07T04:29:14
302961115
{ "authors": [ "amad-person", "okkhoy" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9170", "repo": "nus-cs2103-AY1718S2/addressbook-level4", "url": "https://github.com/nus-cs2103-AY1718S2/addressbook-level4/pull/31" }
gharchive/pull-request
[v1.2][W13-B1] ContactSails @SuxianAlicia @AJZ1995 @A0143487X v1.2 @amad-person @SuxianAlicia @AJZ1995 @A0143487X Some comments added. Please take a look and update the document as necessary. Don't close this pull request; we will use it for further reviews later in the semester. @amad-person @AJZ1995 @SuxianAlicia @A0143487X I don't see user guide updates to most of the things on this list: +* Manage customer contacts: Add, edit and delete contacts. +* Categorize customers: Group customers according to their preferences, social media platforms, or any other category you want to define. +* Manage orders: Add, edit, and delete customer orders. +* Send promotions: Apply differentiated marketing strategies by sending promotions to relevant customers. +* Keep track of deadlines: Add tasks and reminders and view your schedules in the integrated calendar. +* Know your customer demographic: See what the top selling items and who your most frequent customers are. For those you want to implement as a part of later milestones, you were supposed to think through and write the command format and usage examples now! The idea is to get you to think through the product from an end user perspective, and develop it accordingly. In addition, I would want you to mention the following in your Developer Guide: For each team member, what are the features (major and minor) you are proposing? Within the scope of the project, how does it fit in (just a one- or two-line summary)? Please do this by end of day Monday 19 March. @okkhoy In the 2103 website, the TODO only mentioned marking the coming-in-v2.0 features. We added a section in the UserGuide that does this. Also, in the Developer Guide, we put the release versions in the User Stories table. Can this please be considered as passing the milestone? Thanks! @amad-person @SuxianAlicia @AJZ1995 @A0143487X Some comments added to the DG. Please take a look and update as necessary.
2025-04-01T04:34:56.634519
2018-09-08T09:48:30
358280991
{ "authors": [ "QzSG", "luhan02" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9171", "repo": "nusCS2113-AY1819S1/addressbook-level2", "url": "https://github.com/nusCS2113-AY1819S1/addressbook-level2/pull/25" }
gharchive/pull-request
[W5.11][T13-1] Adrian Tan [LO-2KLoC] Add a feature to AddressBook Adds sort command to Addressbook-2 along with relevant unit tests to fulfill W5.11 🏆 Closing PR Great enhancement.
2025-04-01T04:34:56.701937
2024-08-15T21:45:08
2469031258
{ "authors": [ "dkoshkin", "jimmidyson" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9172", "repo": "nutanix-cloud-native/cluster-api-runtime-extensions-nutanix", "url": "https://github.com/nutanix-cloud-native/cluster-api-runtime-extensions-nutanix/pull/867" }
gharchive/pull-request
fix: CRS generated CA Deployment has extra quotes What problem does this PR solve?: Fixes an issue with the CRS-generated CA Deployment not working because of extra quotes.

I0815 20:12:58.376871 1 reflector.go:332] Listing and watching cluster.x-k8s.io/v1beta1, Resource=machines from k8s.io/client-go/dynamic/dynamicinformer/informer.go:108
W0815 20:12:58.379140 1 reflector.go:547] k8s.io/client-go/dynamic/dynamicinformer/informer.go:108: failed to list cluster.x-k8s.io/v1beta1, Resource=machines: machines.cluster.x-k8s.io is forbidden: User "system:serviceaccount:default:cluster-autoscaler-0191579a-2104-7ace-a5a2-ceae4590d7fe" cannot list resource "machines" in API group "cluster.x-k8s.io" in the namespace "'default'"
E0815 20:12:58.379170 1 reflector.go:150] k8s.io/client-go/dynamic/dynamicinformer/informer.go:108: Failed to watch cluster.x-k8s.io/v1beta1, Resource=machines: failed to list cluster.x-k8s.io/v1beta1, Resource=machines: machines.cluster.x-k8s.io is forbidden: User "system:serviceaccount:default:cluster-autoscaler-0191579a-2104-7ace-a5a2-ceae4590d7fe" cannot list resource "machines" in API group "cluster.x-k8s.io" in the namespace "'default'"

The extra quotes are only an issue in --node-group-auto-discovery=, but the same script replaces all occurrences. It should be safe to drop the single quotes everywhere, though, because the name and namespace will be strings and won't be interpreted as numbers by YAML.
Which issue(s) this PR fixes: Fixes # How Has This Been Tested?: Special notes for your reviewer: ~I think we can improve the e2e tests by checking that all Deployments are Ready on the self-managed clusters, but I did not do that as part of this PR.~ The tests already do that https://github.com/nutanix-cloud-native/cluster-api-runtime-extensions-nutanix/blob/f632224257cb159b04abc2c6c6eb6874c503bb1c/test/e2e/clusterautoscaler_helpers.go#L115 Instead, we can also wait for the status ConfigMap to report Running. This is what the data will contain for a working Deployment:

data:
  status: |+
    time: 2024-05-22 19:33:34.074058252 +0000 UTC
    autoscalerStatus: Running

And for a non-working one:

data:
  status: |+
    time: 2024-08-15 20:07:37.204175116 +0000 UTC
    autoscalerStatus: Initializing

@dkoshkin So with this error the deployment shows as ready but it's not working properly, so we should check the status ConfigMap too? Yep! An attempt at the check is here: https://github.com/nutanix-cloud-native/cluster-api-runtime-extensions-nutanix/pull/867/commits/b447ba4bea5211afc649ea0de48fa96d8ce4df79. The new ConfigMap test failed as expected without this fix: https://github.com/nutanix-cloud-native/cluster-api-runtime-extensions-nutanix/pull/868#issuecomment-2292675203
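The doubled-quotes failure mode described in this PR — a template that already wraps a placeholder in single quotes, whose substituted value is then quoted again by YAML — can be illustrated with a toy substitution (the template strings below are hypothetical, not the actual chart contents):

```typescript
// Hypothetical flag templates illustrating the quoting bug: when the
// template single-quotes the placeholder, YAML later adds its own
// quotes around the whole value, yielding a namespace of "'default'".
const quotedTemplate =
  "--node-group-auto-discovery=clusterapi:namespace='{{NAMESPACE}}'";
const unquotedTemplate =
  "--node-group-auto-discovery=clusterapi:namespace={{NAMESPACE}}";

function substitute(template: string, namespace: string): string {
  return template.replace("{{NAMESPACE}}", namespace);
}

// Dropping the single quotes is safe here because the substituted
// name/namespace is always a string; YAML will not reinterpret it
// as a number or boolean in this position.
```

With the quoted template, substituting "default" leaves the literal single quotes inside the value, which is what surfaced as the namespace "'default'" in the autoscaler logs above.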
2025-04-01T04:34:56.743238
2023-07-21T18:18:40
1816206167
{ "authors": [ "codecov-commenter", "ricardogobbosouza" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9173", "repo": "nuxt-modules/google-fonts", "url": "https://github.com/nuxt-modules/google-fonts/pull/142" }
gharchive/pull-request
fix: insert noscript tag in body open Resolve #75 Codecov Report
Merging #142 (d069e91) into main (16e03a8) will decrease coverage by 0.39%. The diff coverage is 50.00%.

@@            Coverage Diff             @@
##             main     #142      +/-   ##
==========================================
- Coverage   87.75%   87.37%   -0.39%
==========================================
  Files           1        1
  Lines         196      198       +2
  Branches       16       16
==========================================
+ Hits          172      173       +1
- Misses         24       25       +1

Impacted Files | Coverage Δ
src/module.ts | 87.37% <50.00%> (-0.39%) :arrow_down:
2025-04-01T04:34:56.750267
2024-09-17T19:21:38
2531920768
{ "authors": [ "Ingramz", "UfukUstali", "sumomo015" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9174", "repo": "nuxt/eslint", "url": "https://github.com/nuxt/eslint/issues/499" }
gharchive/issue
Enabling typescript-eslint typed linting Describe the feature Currently, @nuxt/eslint only enables rules of non-typechecked configs (recommended / strict) from typescript-eslint. In addition to these, typescript-eslint also provides advanced rules that depend on TypeScript's type-checking APIs, which are available via recommended-type-checked and strict-type-checked configs (see https://typescript-eslint.io/getting-started/typed-linting and https://typescript-eslint.io/rules/?=typeInformation). There are some performance concerns when it comes to naively enabling these configs, namely that specifying project or projectService in parserOptions, a prerequisite, can slow down linting of a medium-sized project from a few seconds to several minutes. It is uncertain at the moment whether these slowdowns are unique to this project, to Vue, or caused by something else. I think the main question I have is whether @nuxt/eslint would be willing to adopt these configs by default, provided that the linting performance bottlenecks are identified. If they are deemed unsolvable to a satisfactory level, could optionally opting into them be made easier for cases such as a CI step, where we can typically afford to spend more time? Additional information
[ ] Would you be willing to help implement this feature?
[ ] Could this feature be implemented as a module?
Final checks
[X] Read the contribution guide.
[X] Check existing discussions and issues.
There are some performance concerns when it comes to naively enabling these configs While I do agree that, for performance reasons, this feature can/maybe should remain disabled by default, having the option to enable it is essential, especially when even something like this:

"@typescript-eslint/strict-boolean-expressions": [
  "warn",
  {
    allowString: false,
    allowNumber: false,
    allowNullableObject: false,
  },
]

requires it. I couldn't find a way to enable it currently, so if there is one, I would appreciate it.
I was able to manually enable typed linting with the following configuration:

// @ts-check
import eslint from '@eslint/js'
import pluginTs from '@typescript-eslint/eslint-plugin'
import parserTs from '@typescript-eslint/parser'
import vueParser from 'vue-eslint-parser'
import tseslint from 'typescript-eslint'
import withNuxt from './.nuxt/eslint.config.mjs'

export default withNuxt()
  // @ts-ignore
  .replace('nuxt/typescript/setup', {
    name: 'nuxt/typescript/setup',
    plugins: { '@typescript-eslint': pluginTs },
  })
  .replace('nuxt/typescript/rules', {
    name: 'nuxt/typescript/rules',
    files: ['**/*.ts', '**/*.mts', '**/*.cts', '**/*.vue'],
    languageOptions: {
      parser: vueParser,
      parserOptions: {
        parser: parserTs,
        sourceType: 'module',
        extraFileExtensions: ['.vue'],
        projectService: true,
        tsconfigRootDir: import.meta.dirname,
      },
    },
    rules: {
      ...tseslint.config(eslint.configs.recommended, tseslint.configs.strictTypeChecked, tseslint.configs.stylisticTypeChecked)
        .map(c => c.rules).reduce((a, b) => ({ ...a, ...b }), {}),
      // add your custom rules here
      'no-console': 'warn',
      '@typescript-eslint/consistent-type-imports': ['error', {
        disallowTypeAnnotations: false,
        prefer: 'type-imports',
      }],
      '@typescript-eslint/no-import-type-side-effects': 'error',
    },
  })

I hope this helps anyone looking to enable typed linting manually.
2025-04-01T04:34:56.771406
2017-08-16T08:54:38
250555712
{ "authors": [ "Atinux", "baocancode", "coderPreacher", "yeziZ", "yuchonghua" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9175", "repo": "nuxt/nuxt.js", "url": "https://github.com/nuxt/nuxt.js/issues/1394" }
gharchive/issue
Very much looking forward to a v1.0.0-rc5 Very much looking forward to v1.0.0-rc5, because the handling of ~assets vs. ~/assets is really troublesome. @pi0 Can you also look at #1378 for v1.0.0-rc5? It is trivial. Thanks. rc5 is out https://github.com/nuxt/nuxt.js/releases/tag/v1.0.0-rc5
2025-04-01T04:34:56.773075
2021-02-23T11:08:40
814359731
{ "authors": [ "mattiaerli97" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9176", "repo": "nuxt/nuxt.js", "url": "https://github.com/nuxt/nuxt.js/issues/8891" }
gharchive/issue
Yarn install fails: Couldn't find any versions for "@nuxt/utils" that matches "2.15.2" When I run the yarn install command, the install fails because of this error: Couldn't find any versions for "@nuxt/utils" that matches "2.15.2" I believe this is due to the recent deploy of the new Nuxt version. Can you please help me? Thanks Hi @danielroe. I found that the problem was related to the Nexus cache, our package manager. I invalidated the cache and the problem is solved. Thank you for the assistance!
2025-04-01T04:34:56.779398
2018-08-22T15:26:13
353005726
{ "authors": [ "clarkdo", "codecov-io" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9177", "repo": "nuxt/nuxt.js", "url": "https://github.com/nuxt/nuxt.js/pull/3788" }
gharchive/pull-request
refactor: functional filename should be called from webpack @Atinux Improvement for #3787 Codecov Report
Merging #3788 into dev will increase coverage by <.01%. The diff coverage is 100%.

@@            Coverage Diff             @@
##              dev    #3788      +/-   ##
==========================================
+ Coverage   97.84%   97.84%   +<.01%
==========================================
  Files          18       18
  Lines        1158     1159       +1
  Branches      312      312
==========================================
+ Hits         1133     1134       +1
  Misses         24       24
  Partials        1        1

Impacted Files | Coverage Δ
lib/builder/webpack/base.js | 98.36% <100%> (+0.02%) :arrow_up:

Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 3e02726...170911d. Read the comment docs.
2025-04-01T04:34:56.788851
2019-06-08T22:42:07
453842721
{ "authors": [ "719media", "codecov-io", "manniL" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9178", "repo": "nuxt/nuxt.js", "url": "https://github.com/nuxt/nuxt.js/pull/5897" }
gharchive/pull-request
Update README.md typo Types of changes
[ ] Bug fix (a non-breaking change which fixes an issue)
[ ] New feature (a non-breaking change which adds functionality)
[ ] Breaking change (fix or feature that would cause existing functionality to change)
Description Checklist:
[ ] My change requires a change to the documentation.
[ ] I have updated the documentation accordingly. (PR: #)
[ ] I have added tests to cover my changes (if not applicable, please state why)
[ ] All new and existing tests are passing.
Codecov Report
Merging #5897 into dev will not change coverage. The diff coverage is n/a.

@@           Coverage Diff           @@
##              dev    #5897   +/-   ##
=======================================
  Coverage   95.68%   95.68%
=======================================
  Files          82       82
  Lines        2689     2689
  Branches      690      690
=======================================
  Hits         2573     2573
  Misses         98       98
  Partials       18       18

Flag | Coverage Δ
#e2e | 100% <ø> (ø) :arrow_up:
#fixtures | 50.35% <ø> (ø) :arrow_up:
#unit | 92.67% <ø> (ø) :arrow_up:

Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b1797af...6f0eeb8. Read the comment docs.

Thanks :relaxed:
2025-04-01T04:34:56.795874
2023-04-26T04:39:19
1684256211
{ "authors": [ "fxzer", "jongmin4943" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9179", "repo": "nuxt/nuxt", "url": "https://github.com/nuxt/nuxt/issues/20515" }
gharchive/issue
upgrade nuxt v3.4.2 from v3.4.1 makes error - Unexpected token 'export' Environment Nuxi 3.4.2 RootDir: C:/workspace/nuxt3-practice Nuxt project info: Operating System: Windows_NT Node Version: v16.14.0 Nuxt Version: 3.4.2 Nitro Version: 2.3.3 Package Manager<EMAIL_ADDRESS>Builder: vite User Config: vite Runtime Modules: - Build Modules: - Reproduction v3.4.1 https://stackblitz.com/edit/github-musgpg?file=package.json v3.4.2 https://stackblitz.com/edit/github-htxvi2?file=package.json Describe the bug The CKEditor component was working fine until Nuxt v3.4.1. As you can see in the above two reproductions, Nuxt v3.4.2 produces an "Unexpected token 'export'" 500 error. I tried Nuxt transpile and adding the CKEditor modules, but that produces a "'createElement' is not defined" error. Additional context No response Logs
[nuxt] [request error] [unhandled] [500] Unexpected token 'export'
at Object.compileFunction (https://githubhtxvi2-bnrr.w-credentialless.staticblitz.com/blitz.bc9c2adb.js:35:399058)
at wrapSafe (https://githubhtxvi2-bnrr.w-credentialless.staticblitz.com/blitz.bc9c2adb.js:35:243344)
at Module._compile (https://githubhtxvi2-bnrr.w-credentialless.staticblitz.com/blitz.bc9c2adb.js:35:243712)
at Module._extensions..js (https://githubhtxvi2-bnrr.w-credentialless.staticblitz.com/blitz.bc9c2adb.js:35:244741)
at Module.load (https://githubhtxvi2-bnrr.w-credentialless.staticblitz.com/blitz.bc9c2adb.js:35:242766)
at Module._load (https://githubhtxvi2-bnrr.w-credentialless.staticblitz.com/blitz.bc9c2adb.js:35:240331)
I found the solution. It was my misunderstanding. The CKEditor component has to be a client-only component. Renaming the component to 'ckeditor.client.vue' fixes this problem. I am closing this issue since it is not a bug. I also ran into the same error with pnpm build; changing the type field in package.json to module solved it.
2025-04-01T04:34:56.802927
2024-07-16T13:05:28
2411093592
{ "authors": [ "FlorianPhilipp2007", "danielroe" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9180", "repo": "nuxt/nuxt", "url": "https://github.com/nuxt/nuxt/issues/28173" }
gharchive/issue
Problem with nuxt imports: Vitest Coverage stays at 100% for components regardless of the existence of any component test because of nuxt imports Environment Operating System: Darwin Node Version: v18.16.0 Nuxt Version: 3.11.2 CLI Version: 3.11.1 Nitro Version: 2.9.6 Package Manager<EMAIL_ADDRESS>Builder: - User Config: devtools, typescript, extends, components, ssr, runtimeConfig, app, css, build, modules, experimental, i18n, proxy, vite, public, nitro, hooks Runtime Modules<EMAIL_ADDRESS><EMAIL_ADDRESS><EMAIL_ADDRESS><EMAIL_ADDRESS><EMAIL_ADDRESS>Build Modules: - Reproduction https://stackblitz.com/edit/nuxt-starter-gvfsue?file=nuxt.config.ts,components%2Fnested%2FTestNestedUntested.vue vitest --coverage is not working in StackBlitz. Please download the project and try it locally. Describe the bug Description: When the components option in the Nuxt configuration is set to true (enabling automatic global registration of components), the test coverage report shows 100% coverage for all components, regardless of whether tests actually exist for them. Something is importing the untested TestNestedUntested. V8 is reporting coverage for it. Maybe Nuxt imports all those files. Actual Behavior: The coverage report incorrectly shows 100% coverage for all components, including those without any tests, like TestNestedUntested.vue in the components directory. Seems to be a problem caused by Nuxt importing all files. Expected Behavior: The coverage report should accurately reflect the coverage for each component. Only TestNested.vue should show 100% coverage, while TestNestedUntested.vue should show 0% coverage or appropriate coverage based on the absence of tests. Additional context No response Logs No response This is unlikely to be something to fix in Nuxt. It might possibly be a vitest bug or something to be resolved in nuxt/test-utils. Have you tried ensuring that .nuxt is added to your coverage exclude patterns?
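The suggested workaround — adding .nuxt to the coverage exclude patterns — can be sketched in a Vitest config like this (coverage.exclude is a standard Vitest option; the exact patterns are illustrative and should be adjusted to the project layout):

```typescript
// vitest.config.ts — keep Nuxt's generated files out of the coverage
// report so auto-imported components don't show phantom 100% coverage.
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    coverage: {
      exclude: ['.nuxt/**', 'node_modules/**', 'dist/**'],
    },
  },
})
```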
2025-04-01T04:34:56.813741
2024-07-19T16:42:28
2419370652
{ "authors": [ "appdcs", "danielroe" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9181", "repo": "nuxt/nuxt", "url": "https://github.com/nuxt/nuxt/issues/28226" }
gharchive/issue
Unable to customize server startup error page Environment Operating System: Darwin Node Version: v20.10.0 Nuxt Version: 3.12.3 CLI Version: 3.12.0 Nitro Version: 2.9.7 Package Manager<EMAIL_ADDRESS>Builder: - User Config: - Runtime Modules: - Build Modules: - Reproduction https://stackblitz.com/edit/github-bf8y4g?file=server%2Fmiddleware%2Fsample.js Describe the bug The error page that is displayed when there are server errors is not editable and doesn't get replaced by error.vue: Additional context The documentation about the Error Page has been followed, but it doesn't seem to apply to the 500 error described. Logs No response If there is an error rendering the error page then we fall back to a static page. Your middleware means that it throws an error even when rendering the error page. Check the page route/path first. Hi @danielroe, thank you for your comment. Could you please clarify what you mean by "Check the page route/path first"? I am well aware that the middleware is causing an error. My question is how I could customize the fallback error page templates that are provided by @nuxt/ui-templates (https://github.com/nuxt/nuxt/tree/main/packages/ui-templates)? Thank you Alright, so I have changed the title of this issue to better represent my current understanding of how these pages work. That 'fallback error page', which is not to be confused with the Error Page (i.e. error.vue), is handled by the configured Nitro error handler. As per https://vuejs.org/api/application.html#app-config-errorhandler, https://nuxt.com/docs/getting-started/error-handling#vue-errors and https://nitro.unjs.io/config#errorhandler, a new error handler can be provided in nuxt.config.js. I truly couldn't find an example online, so I took it upon myself to try it in this online code editor: https://stackblitz.com/edit/github-bf8y4g-7iqjdn?file=server%2Ferror.ts This is the result: Hopefully this helps someone else.
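For anyone landing here, a minimal sketch of the approach the StackBlitz demonstrates, assuming Nitro's documented errorHandler config option; the file path and handler body below are illustrative assumptions, not the exact code from the reproduction, and the precise handler signature should be checked against the Nitro docs linked above:

```typescript
// nuxt.config.ts — point Nitro at a custom error handler module,
// replacing the static fallback page shown when rendering the normal
// error page itself fails.
export default defineNuxtConfig({
  nitro: {
    errorHandler: '~/server/error',
  },
})

// server/error.ts — a bare-bones handler; `event` is the H3 event.
// export default function (error, event) {
//   event.node.res.setHeader('Content-Type', 'text/html')
//   event.node.res.statusCode = 500
//   event.node.res.end('<h1>Something went wrong</h1>')
// }
```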
2025-04-01T04:34:56.824241
2024-11-06T13:16:21
2638074224
{ "authors": [ "danielroe", "hafizjavaid" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9182", "repo": "nuxt/nuxt", "url": "https://github.com/nuxt/nuxt/issues/29821" }
gharchive/issue
Build failing while deploying to Cloudflare I'm facing an issue when deploying to Cloudflare. The local build runs fine, and I don't know what's wrong. Environment Operating System: Windows Nuxt: 3.13.2 Nitro: 2.10.2 Node: v20.18.0 Minimal reproduction It's my first time creating a minimal reproduction; please forgive me if I made a mistake. I have added all the packages I'm using and my nuxt.config.ts file. For reference, if I run the build command locally it builds fine. I even tried to downgrade dependencies, but there was no success https://codesandbox.io/p/devbox/festive-volhard-xf53h4 I even upgraded to the latest Nuxt (3.14), but the build is still failing Package.json

{
  "name": "nuxt-app",
  "private": true,
  "type": "module",
  "scripts": {
    "build": "nuxt build",
    "dev": "nuxt dev",
    "generate": "nuxt generate",
    "preview": "nuxt preview",
    "postinstall": "nuxt prepare"
  },
  "dependencies": {
    <EMAIL_ADDRESS>"^3.3.1",
    "@nuxt/content": "^2.13.1",
    "@nuxt/fonts": "^0.7.2",
    "@nuxtjs/sitemap": "^6.1.2",
    "@prisma/client": "^5.17.0",
    "@quasar/extras": "^1.16.12",
    "nuxt": "^3.13.2",
    "nuxt-auth-utils": "^0.5.1",
    "nuxt-quasar-ui": "^2.1.3",
    "prisma": "^5.17.0",
    "quasar": "^2.16.6",
    "sass": "^1.77.8",
    "vite": "^5.3.4",
    "vue": "latest",
    "vue-sonner": "^1.2.1",
    "zod": "^3.23.8"
  },
  "prisma": {
    "seed": "node prisma/seed.js"
  },
  "packageManager"<EMAIL_ADDRESS>
  "devDependencies": {
    "@iconify-json/devicon-plain": "^1.2.4",
    "@iconify-json/flowbite": "^1.2.2",
    "@iconify-json/heroicons": "^1.2.1",
    "@iconify-json/logos": "^1.2.3",
    "@iconify-json/mdi": "^1.2.1",
    "@iconify-json/pepicons-pop": "^1.2.1",
    "@iconify-json/tabler": "^1.2.5",
    "@iconify-json/uil": "^1.2.1",
    "@nuxt/icon": "^1.6.1"
  }
}

A tip: with a minimal repro like this, you can try removing one module after the other to see which one is causing it, and you can set the preset on the command line.
yarn nuxi build --preset cloudflare-pages
You would find that the module causing this is @nuxt/icon, likely by importing a Node version of consola.
2025-04-01T04:34:56.839698
2024-03-27T20:06:58
2211803480
{ "authors": [ "Alpha4", "NilsEvers", "josh1va", "moshetanzer", "webseven-de" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9183", "repo": "nuxt/ui", "url": "https://github.com/nuxt/ui/issues/1576" }
gharchive/issue
Documentation is broken in Chrome Environment Operating system: Darwin Browser: Chrome v123.0.6312.87 (Official Build) (x86_64) Version v2.15.0 Reproduction https://ui.nuxt.com/getting-started/installation or https://ui.nuxt.com/components/input Description The documentation is suddenly broken in Chrome for me: This also happens in incognito mode. Firefox works though. Additional context No response Logs No response Sure. I did several hard refreshes and also tried to open the page in incognito mode. I did another test, this time with Safari, and it is also broken. Same issue here using Chrome 123.0.6312.59. After updating Chrome to 123.0.6312.86, the error still persists. Same issue here on Firefox Developer Edition 125.0b5 (64-bit) on macOS Sonoma 14.4. I do not have the same problem in Safari Version 17.4 (196<IP_ADDRESS>.12). Anyone still encountering this issue? Seems to be fixed for me. Also, no errors in console anymore 👍 Thanks! Great! Going to close this issue for now 😄.
2025-04-01T04:34:56.848218
2024-12-17T21:42:18
2746145855
{ "authors": [ "MickL", "benjamincanac" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9184", "repo": "nuxt/ui", "url": "https://github.com/nuxt/ui/issues/2926" }
gharchive/issue
v3: Icon size gets overwritten by icon's width/height Version v3 latest commit Description The size-* class applied by a button gets overwritten by the icon itself. The left one is in my project, the right one is from the docs: I couldn't reproduce it in a fresh project, but my project is still pretty clean with no customizations. Might have something to do with the CSS order? What version of @nuxt/icon are you using? 1.10.2 Found the issue: it depends on the order in which modules are added in the Nuxt config. I have 'nuxt-icon' added to modules, even though I use Nuxt UI, because it is part of a Nuxt layer. With this order the styles are overwritten as shown above: modules: ['@nuxt/ui', '@nuxt/icon'], You shouldn't put @nuxt/icon in your modules; @nuxt/ui installs and configures it for you: https://ui3.nuxt.dev/getting-started/icons/nuxt#usage https://github.com/nuxt/ui/blob/v3/src/module.ts#L107 Yes, I know, but it is part of a Nuxt layer, together with its configuration, used in multiple apps (only one of them is using Nuxt UI). Why is @nuxt/ui in the layer if it is only used by one app then? 🤔 On the other apps, if @nuxt/ui is part of the layer it will be set up and will install @nuxt/icon anyway. Sorry for the misunderstanding. nuxt-icon is used by all apps and needs to be configured in the same way for all of them. Nuxt UI, on the other hand, is only used by one single app, which also needs the layer. I see! Did changing the order of the modules solve the issue then? Yes, but I can't change the order because apparently the Nuxt layer comes first. If there is no possibility on the Nuxt UI side, I have to remove it from the layer or make an additional layer for non-Nuxt-UI apps. I have the same thing with nuxt/fonts, btw: added by the layer and then added by Nuxt UI again.
If the module comes before and you didn't override the config, it shouldn't be a problem, because we check if the module is already installed 🤔 https://github.com/nuxt/ui/blob/v3/src/module.ts#L99-L105 It seems you're missing the { cssLayer: 'components' } option, maybe you can try adding it to your layer? I will try. Do you have a reference to this option? Can't find it. In the end I guess the issue is only that the Tailwind CSS seems to come before the icon CSS. That's why this option is set; no, it's not documented. https://github.com/nuxt/icon/blob/main/src/schema-types.ts#L67 Setting this in nuxt.config.ts did the trick:

icon: {
  cssLayer: 'components',
},
2025-04-01T04:34:56.854246
2023-10-04T17:05:19
1926641068
{ "authors": [ "benjamincanac", "stlbucket" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9185", "repo": "nuxt/ui", "url": "https://github.com/nuxt/ui/issues/780" }
gharchive/issue
variant and color on preset: table->default->sortButton causes warnings Environment Operating System: Darwin Node Version: v18.17.0 Nuxt Version: 3.7.4 CLI Version: 3.9.0 Nitro Version: 2.6.3 Package Manager<EMAIL_ADDRESS>Builder: - User Config: supabase, imports, modules, pinia, devtools, css, graphql-client, devServer, runtimeConfig, components, ignore, tailwindcss Runtime Modules<EMAIL_ADDRESS><EMAIL_ADDRESS><EMAIL_ADDRESS><EMAIL_ADDRESS><EMAIL_ADDRESS><EMAIL_ADDRESS><EMAIL_ADDRESS>Build Modules: - Version v2.9.0 Reproduction Add the table preset to app.config.ts as shown in the description below Description When including the preset in the app config, the following warnings get thrown: This just started happening after upgrading to 2.8.1. If I remove the two highlighted settings, there are no warnings. Additional context No response Logs No response You might need to add as const when defining those, since as of #692 color, variant and size are not strings anymore but enums.
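A minimal sketch of why the as const suggestion works (the Variant union below is illustrative, not Nuxt UI's actual generated type): without as const, object properties widen to string, which no longer satisfies a union of literal values.

```typescript
// Without `as const`, "ghost" widens to `string`; with it, the
// property keeps the literal type "ghost" and satisfies the union.
type Variant = "solid" | "outline" | "ghost";

function useVariant(v: Variant): string {
  return `variant-${v}`;
}

const widened = { variant: "ghost" };           // type: { variant: string }
const narrowed = { variant: "ghost" } as const; // type: { variant: "ghost" }

// useVariant(widened.variant); // ✗ compile error: string is not Variant
const cls = useVariant(narrowed.variant);       // ✓ compiles
```

The same widening is what trips the type check on preset values in app.config.ts unless the object is declared as const.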
2025-04-01T04:34:57.048685
2019-06-18T23:03:00
457732965
{ "authors": [ "MatFillion", "XiaotianNetlift", "jeromelaban" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9186", "repo": "nventive/Uno", "url": "https://github.com/nventive/Uno/pull/1093" }
gharchive/pull-request
bumped next-version to 1.46.0 PR Type What kind of change does this PR introduce? Build or CI related changes Documentation content changes What is the new behavior? bumped next-version to 1.46.0 from 1.45.0 PR Checklist Please check if your PR fulfills the following requirements: [x] Contains NO breaking changes [x] Updated the Release Notes [x] Associated with an issue (GitHub or internal) Other information Internal Issue (If applicable): https://nventive.visualstudio.com/Umbrella/_workitems/edit/156042 We shouldn't have updates to beta in master. Agreed. Don't update packages to the latest beta, latest dev is fine. @MatFillion @jeromelaban I fixed it. Can you review this again? /azp run
2025-04-01T04:34:57.050205
2019-05-22T20:03:22
447318171
{ "authors": [ "MaximeDionNventiveCom", "jeromelaban" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9187", "repo": "nventive/calculator", "url": "https://github.com/nventive/calculator/pull/23" }
gharchive/pull-request
Dev/madi/fix button themeing Fixes #. 154034 [Calculator] Themeing of the button is not correct Description of the changes: Colors of the press and hover state for the calculator button How changes were validated: Tested on windows. I changed the branch, we’re using Uno, not master
2025-04-01T04:34:57.061556
2023-04-02T09:00:36
1650867959
{ "authors": [ "HieuTranTH", "feoh", "tjdevries" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9192", "repo": "nvim-lua/kickstart.nvim", "url": "https://github.com/nvim-lua/kickstart.nvim/pull/246" }
gharchive/pull-request
nvim-treesitter.configs: Rename help to vimdoc To reflect change 93fa5df from 'nvim-treesitter/nvim-treesitter' Otherwise this error will be thrown: Installation not possible: ...vim/lazy/nvim-treesitter/lua/nvim-treesitter/install.lua:86: Parser not available for language "help" Hiya we have a duplicate PR here :) Mine's #248 and has a bunch of comments/references. Hopefully we can get someone to merge one of them :) Thanks! I already merged #248, but I appreciate the PR
2025-04-01T04:34:57.065606
2022-07-08T10:09:55
1298799632
{ "authors": [ "Malefaro", "akinsho" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9193", "repo": "nvim-neotest/neotest-go", "url": "https://github.com/nvim-neotest/neotest-go/issues/10" }
gharchive/issue
[Feature request] args optional setup argument Hi Can we get an optional setup argument args, like in neotest-python, to provide extra_args from setup? For example: require('neotest-go')({ args = {"-timeout=60s","-count=1"} }) Because providing them via neotest.run.run({extra_args={...}}) forces them to be specified in key bindings, which is not necessary when using neotest with other languages and may conflict. @Malefaro FYI 👉🏿 https://github.com/nvim-neotest/neotest-go#contributing https://github.com/nvim-neotest/neotest-go/pull/11 closed by #11
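For reference, the shape being requested would put the arguments in the adapter setup rather than in key bindings. A sketch, assuming the adapter accepts an args table at setup time as proposed above (merged in #11):

```lua
-- Sketch of the proposed configuration: assumes neotest-go takes an
-- `args` table at setup time, as requested in this issue.
require('neotest').setup {
  adapters = {
    require('neotest-go')({
      -- extra flags passed to `go test` on every run
      args = { '-timeout=60s', '-count=1' },
    }),
  },
}
```

This keeps language-specific flags out of the shared neotest key bindings, which was the motivation stated above.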
2025-04-01T04:34:57.068099
2023-01-11T22:20:51
1529789367
{ "authors": [ "rcarriga", "stevearc" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9194", "repo": "nvim-neotest/neotest", "url": "https://github.com/nvim-neotest/neotest/pull/193" }
gharchive/pull-request
fix: don't error if adapter has no root position I'm seeing an error when setting discovery.enabled = false. I have an adapter that does not have any valid tests discovered, and thus has no root position in the state. This causes the assert here to always fail, which means that I see an error on every BufAdd and BufWritePost. The repro is as follows:

1. Open a file in a different language from the test adapter
2. Open the summary pane: require("neotest").summary.toggle()
3. Open any new buffer: :enew
4. Observe the error

The fix I put in was to silently ignore the adapter if there is no root. I think this is fine (as that is a valid state), but let me know if that is not the case. Sorry this totally slipped by me! Yep this fix looks good thanks for the PR :grin:
2025-04-01T04:34:57.136574
2022-12-19T22:07:08
1503677005
{ "authors": [ "DominikPieper", "SimonCockx" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9195", "repo": "nxext/nx-extensions", "url": "https://github.com/nxext/nx-extensions/issues/870" }
gharchive/issue
Ionic-Angular: needs publishing to npm for Angular 15 support Since this commit from October 26, @nxext/ionic-angular supports Angular 15. However, the last publish of @nxext/ionic-angular on npm was 5 months ago, so anyone trying to add this package to an Angular 15 project runs into the following message:

npm ERR! Found<EMAIL_ADDRESS>
npm ERR! node_modules/@nrwl/angular
npm ERR! dev<EMAIL_ADDRESS>from the root project
npm ERR!
npm ERR! Could not resolve dependency:
npm ERR! peer<EMAIL_ADDRESS>from<EMAIL_ADDRESS>
npm ERR! node_modules/@nxext/ionic-angular
npm ERR! dev<EMAIL_ADDRESS>from the root project

@DominikPieper is there any kind of policy on releases? Any plans to release this soon? Great, keep up the good work! I wish you a stressless holiday ^^ Pushed the new version
2025-04-01T04:34:57.138293
2022-01-27T12:15:49
1116140251
{ "authors": [ "dannymcgee", "kunjee17" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:9196", "repo": "nxrs/cargo", "url": "https://github.com/nxrs/cargo/issues/6" }
gharchive/issue
Not an issue but big thanks. Hi, I was trying to do something similar when I found the .NET and Go support, but couldn't finish due to time constraints. Then you created it, and I was more than happy to delete my repo. Thanks to whoever is working on this repo; I'd love to take it for a spin soon. I have quite a big workspace for a client project. Thanks again, and you can close this issue if you like. :) You're very welcome, happy it's helpful! :)