repo (string) | org (string, nullable) | issue_id (int64) | issue_number (int64) | pull_request (dict) | events (list) | text_size (int64) | bot_issue (bool) | modified_by_bot (bool) | user_count (int64) | event_count (int64) | modified_usernames (bool) |
|---|---|---|---|---|---|---|---|---|---|---|---|
Haulmont/jmix-ui | Haulmont | 841,749,197 | 408 | null | [
{
"action": "opened",
"author": "DartVerder",
"comment_id": null,
"datetime": 1616750143000,
"masked_author": "username_0",
"text": "**Environment:**\r\nJmix version: 0.9.0\r\nIntelliJ version: IntelliJ IDEA 2020.3.3 (Community Edition)\r\n\r\n### Steps:\r\n1. Create new project\r\n2. Create new entity with 2 attributes (ex. grade, active)\r\n3. Create browser and editor screeens for entity\r\n4. In the browser screen descriptor, inside the filter field, add a groupFilter as:\r\n```\r\n<configurations>\r\n <configuration id=\"defaultConfiguration\" default=\"true\">\r\n <groupFilter operation=\"AND\">\r\n <propertyFilter property=\"active\" operation=\"EQUAL\"\r\n operationEditable=\"true\" defaultValue=\"false\" operationCaptionVisible=\"true\"/>\r\n <propertyFilter property=\"grade\" operation=\"EQUAL\"\r\n operationEditable=\"true\" defaultValue=\"30\"/>\r\n </groupFilter>\r\n </configuration>\r\n</configurations>\r\n```\r\n5. Run application\r\n6. Open the entity browser screen\r\n\r\n**ER:** caption of group filter will be determined depending on the value of the operation attribute.\r\n**AR:** caption of group fiter not set\r\nNote: for operation \"OR\" caption is determined",
"title": "Caption of the GroupFilter not determined for AND-operation",
"type": "issue"
},
{
"action": "closed",
"author": "DartVerder",
"comment_id": null,
"datetime": 1617103147000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1,085 | false | false | 1 | 2 | false |
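Each row's `events` cell holds entries shaped like the ones above (`action`, `masked_author`, `datetime` in epoch milliseconds, `type`, …). A minimal sketch of turning one such entry into a readable summary — the field names are taken from the rows, but the helper itself is hypothetical:

```python
from datetime import datetime, timezone

def summarize_event(event: dict) -> str:
    """Render one `events` entry (field names as in the dataset rows)
    as a one-line summary; `datetime` is epoch milliseconds."""
    ts = datetime.fromtimestamp(event["datetime"] / 1000, tz=timezone.utc)
    return f'{ts:%Y-%m-%d %H:%M} {event["masked_author"]}: {event["action"]} {event["type"]}'

# First event of the row above, reduced to the fields the helper reads.
event = {
    "action": "opened",
    "masked_author": "username_0",
    "datetime": 1616750143000,
    "type": "issue",
}
print(summarize_event(event))  # → 2021-03-26 09:15 username_0: opened issue
```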
bjerkio/oidc-react | bjerkio | 854,613,839 | 580 | {
"number": 580,
"repo": "oidc-react",
"user_login": "bjerkio"
} | [
{
"action": "opened",
"author": "pamapa",
"comment_id": null,
"datetime": 1617981054000,
"masked_author": "username_0",
"text": "- depends on #579 merge request (for effect depends `[userManager]`....)\r\n\r\nThis event is useful to listen to when the application calls userManager.removeUser.",
"title": "support event addUserLoaded",
"type": "issue"
},
{
"action": "created",
"author": "pamapa",
"comment_id": 818467193,
"datetime": 1618294301000,
"masked_author": "username_0",
"text": "when #588 is merged i will rebase this one",
"title": null,
"type": "comment"
}
] | 202 | false | true | 1 | 2 | false |
ADRE9/bunk-manager-mern | null | 828,087,058 | 30 | null | [
{
"action": "opened",
"author": "dig9074vijay",
"comment_id": null,
"datetime": 1615397405000,
"masked_author": "username_0",
"text": "Hey!\r\nI noticed that we still have the default **favicon** that comes with React boilerplate code. I would to like to change it to the one that fits our theme as a GSSOC`21 participant.",
"title": "Change default favicon",
"type": "issue"
},
{
"action": "created",
"author": "ADRE9",
"comment_id": 795797858,
"datetime": 1615397955000,
"masked_author": "username_1",
"text": "@username_0 ok go ahead.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dig9074vijay",
"comment_id": 796567983,
"datetime": 1615451976000,
"masked_author": "username_0",
"text": "Thanks @username_1 ! I'll add it shortly",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ADRE9",
"comment_id": null,
"datetime": 1615491977000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 246 | false | false | 2 | 4 | true |
RazuvaevDD/I2PSecChat | null | 808,535,896 | 25 | {
"number": 25,
"repo": "I2PSecChat",
"user_login": "RazuvaevDD"
} | [
{
"action": "opened",
"author": "AverinLV",
"comment_id": null,
"datetime": 1613395345000,
"masked_author": "username_0",
"text": "",
"title": "Added implementation of some functions in MainFrameLogic class",
"type": "issue"
},
{
"action": "created",
"author": "RazuvaevDD",
"comment_id": 779228496,
"datetime": 1613396123000,
"masked_author": "username_1",
"text": "Жду исправления поддержки обратной совместимости межмодульного взаимодействия. (Добавил поле, нужен конструктор? Создай новый, не изменяй старые и все места, где он юзается)",
"title": null,
"type": "comment"
}
] | 173 | false | false | 2 | 2 | false |
Penetrum-Security/Maltree-Issue-Repo | null | 622,801,447 | 159 | null | [
{
"action": "opened",
"author": "Penetrum-Security",
"comment_id": null,
"datetime": 1590096416000,
"masked_author": "username_0",
"text": "Python version `2.717`\nTraceback: \n```\nTraceback (most recent call):\n File \"/root/Maltree/entry/main.py\", line 137, in main\n config[\"cuckoo_server\"][\"uri\"]\n File \"/root/Maltree/api/cuckoo_api.py\", line 390, in dynamic_analysis\n return process_results(da, task_id, filename)\n File \"/root/Maltree/api/cuckoo_api.py\", line 247, in process_results\n pcap_data = base64.b64encode(open(process_pcap(malware_name, da_env, task_id).read()))\n'str' object has no attribute 'read'\n```\nRunning platform: `Linux-4.15.0-66-generic-x86_64-with-Ubuntu-18.04-bionic`",
"title": "Maltree Issue (1489a655aca14e8)",
"type": "issue"
},
{
"action": "closed",
"author": "Penetrum-Security",
"comment_id": null,
"datetime": 1590098445000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 560 | false | false | 1 | 2 | false |
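The traceback in the row above points at `open(process_pcap(...).read())`: `.read()` is applied to the path string returned by `process_pcap`, not to an opened file. A sketch of that misplaced-parenthesis bug and the likely fix — `process_pcap` here is a hypothetical stand-in for the Maltree internal, not its real implementation:

```python
import base64
import os
import tempfile

def process_pcap(malware_name, da_env, task_id):
    # Hypothetical stand-in: per the traceback, the real function
    # returns a file *path* (a str), not a file object.
    fd, path = tempfile.mkstemp(suffix=".pcap")
    os.write(fd, b"\xd4\xc3\xb2\xa1")  # dummy content (pcap magic bytes)
    os.close(fd)
    return path

path = process_pcap("sample", "dynamic", 1)

# Buggy shape from the traceback: .read() is called on the str path,
# so the AttributeError fires before open() even runs.
try:
    base64.b64encode(open(path.read()))
except AttributeError as e:
    print(e)  # → 'str' object has no attribute 'read'

# Likely intended shape: open the returned path, then read its bytes.
with open(path, "rb") as f:
    pcap_data = base64.b64encode(f.read())
os.remove(path)
```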
ocaml-ppx/ocamlformat | ocaml-ppx | 727,329,175 | 1,508 | null | [
{
"action": "opened",
"author": "emillon",
"comment_id": null,
"datetime": 1603368521000,
"masked_author": "username_0",
"text": "Even with the fixes linked in #1505, `dune runtest` doesn't pass.\r\nThis is not an exhaustive list, but for example:\r\n\r\nWarning syntax changed (can be polyfilled with a postprocessor):\r\n\r\n```diff\r\n-Warning 47: illegal payload for attribute 'ocamlformat'.\r\n+Warning 47 [attribute-payload]: illegal payload for attribute 'ocamlformat'.\r\n```\r\n\r\nSome comments move:\r\n\r\n```diff\r\n-let {a (*a*) : a} = e\r\n+let {a : a (*a*)} = e\r\n```\r\n\r\nFunctor sugar is not recognized:\r\n\r\n```diff\r\n- module Make : functor (TV : TYPEVIEW with type t = t) -> USERCODE(TV).F\r\n+ module Make (TV : TYPEVIEW with type t = t) : USERCODE(TV).F\r\n```",
"title": "Tests do not pass on 4.12",
"type": "issue"
},
{
"action": "created",
"author": "emillon",
"comment_id": 719638469,
"datetime": 1604073461000,
"masked_author": "username_0",
"text": "N.B. when we close this issue, let's not forget to update the opam metadata.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gpetiot",
"comment_id": 775167047,
"datetime": 1612792520000,
"masked_author": "username_1",
"text": "@username_0 do you still have non-passing tests?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "emillon",
"comment_id": 776745610,
"datetime": 1612967358000,
"masked_author": "username_0",
"text": "It's all good now!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "gpetiot",
"comment_id": null,
"datetime": 1612967661000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 756 | false | false | 2 | 5 | true |
procube-open/hive-builder | procube-open | 755,994,862 | 7 | {
"number": 7,
"repo": "hive-builder",
"user_login": "procube-open"
} | [
{
"action": "opened",
"author": "kkuwa",
"comment_id": null,
"datetime": 1606985431000,
"masked_author": "username_0",
"text": "iptables 版の DNAT 対応です。\r\n\r\nlabels に、HIVE_VIPとセットで\r\n```\r\n labels:\r\n HIVE_VIP: \"10.0.0.1\"\r\n HIVE_DNAT_PORTS: \"1812/udp,8080\"\r\n```\r\nのように記述することで、VIP操作時に合わせて iptables 操作を実行します。",
"title": "Add support HIVE_DNAT_PORTS label",
"type": "issue"
},
{
"action": "created",
"author": "kkuwa",
"comment_id": 739709227,
"datetime": 1607323755000,
"masked_author": "username_0",
"text": "は、239行目で、コンテナの起動が完了してまで待機する処理で使用しています。ただ、メンバ変数に設定するのは task の ID と初期ステータスだけでもよいです。",
"title": null,
"type": "comment"
}
] | 264 | false | false | 1 | 2 | false |
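The `HIVE_DNAT_PORTS` value in the row above mixes `port/proto` entries with bare ports. A minimal parsing sketch — the bare-port-defaults-to-tcp behaviour is an assumption about the label format, not confirmed by hive-builder:

```python
def parse_dnat_ports(spec: str) -> list[tuple[int, str]]:
    """Parse a comma-separated list like "1812/udp,8080" into
    (port, protocol) pairs; a bare port is assumed to mean tcp."""
    pairs = []
    for entry in spec.split(","):
        port, _, proto = entry.strip().partition("/")
        pairs.append((int(port), proto or "tcp"))
    return pairs

print(parse_dnat_ports("1812/udp,8080"))  # → [(1812, 'udp'), (8080, 'tcp')]
```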
HLTech/judge-d | HLTech | 665,140,329 | 130 | null | [
{
"action": "opened",
"author": "Felipe444",
"comment_id": null,
"datetime": 1595594289000,
"masked_author": "username_0",
"text": "**Is your feature request related to a problem? Please describe.**\r\nA clear and concise description of what the problem is. Ex. I'm always frustrated when [...]\r\n\r\n**Describe the solution you'd like**\r\nA clear and concise description of what you want to happen.\r\n\r\n**Describe alternatives you've considered**\r\nA clear and concise description of any alternative solutions or features you've considered.\r\n\r\n**Additional context**\r\nAdd any other context or screenshots about the feature request here.",
"title": "Remove deprecated endpoints",
"type": "issue"
},
{
"action": "closed",
"author": "Felipe444",
"comment_id": null,
"datetime": 1595596493000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 497 | false | false | 1 | 2 | false |
koaning/skedulord | null | 813,879,701 | 41 | null | [
{
"action": "opened",
"author": "koaning",
"comment_id": null,
"datetime": 1614029379000,
"masked_author": "username_0",
"text": "This syntax would be nice; \r\n\r\n```yaml\r\nuser: vincent\r\nschedule:\r\n - name: good-job\r\n command: python /Users/vincent/Development/skedulord/jobs/pyjob.py\r\n cron: \"*/2 * * * *\"\r\n arguments:\r\n org: rasahq\r\n repo: rasa\r\n output-dir: /home/vincent/Development/gh-dashb/issues\r\n```",
"title": "Allow CLI arguments to be configurable.",
"type": "issue"
}
] | 314 | false | false | 1 | 1 | false |
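The proposed `arguments` mapping in the row above would presumably be expanded into command-line flags. A sketch of that expansion — the `--key value` convention and the shortened command are assumptions about the proposal, not skedulord's actual behaviour:

```python
def render_command(job: dict) -> str:
    """Append each arguments key/value as a --key value flag
    (assumed convention for the proposed schema)."""
    parts = [job["command"]]
    for key, value in job.get("arguments", {}).items():
        parts.append(f"--{key} {value}")
    return " ".join(parts)

job = {
    "name": "good-job",
    "command": "python pyjob.py",  # path shortened for the sketch
    "arguments": {"org": "rasahq", "repo": "rasa"},
}
print(render_command(job))  # → python pyjob.py --org rasahq --repo rasa
```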
serilog/serilog-aspnetcore | serilog | 664,863,220 | 203 | null | [
{
"action": "opened",
"author": "nblumhardt",
"comment_id": null,
"datetime": 1595554932000,
"masked_author": "username_0",
"text": "Published CI builds are failing with:\r\n\r\n```\r\nDeploying using NuGet provider\r\nPublishing Serilog.AspNetCore.3.4.0-dev-00176.nupkg to https://www.nuget.org/api/v2/package...OK\r\nPublishing Serilog.AspNetCore.3.4.0-dev-00176.symbols.nupkg to https://nuget.smbsrc.net/api/v2/package...\r\nError publishing package. NuGet server returned 500: Internal Server Error\r\n```\r\n\r\nThis issue used to plague us some time back. I've disabled symbol publishing for now with `skip_symbols` in appveyor.yml, but it's likely we need to switch to the new snupkg format.",
"title": "Symbol publishing is failing from CI",
"type": "issue"
},
{
"action": "closed",
"author": "nblumhardt",
"comment_id": null,
"datetime": 1613289660000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 547 | false | false | 1 | 2 | false |
aws/jsii | aws | 669,632,466 | 1,840 | null | [
{
"action": "opened",
"author": "skorfmann",
"comment_id": null,
"datetime": 1596188937000,
"masked_author": "username_0",
"text": "## :rocket: Feature Request\r\n\r\n### Affected Languages\r\n<!--\r\nCheck the box (with an X) for any language runtime that you know is affected by\r\nthe reported bug. If you're uncertain whether a language is affected or not,\r\nplease leave the bux un-checked.\r\n-->\r\n- [ ] `TypeScript` or `Javascript`\r\n- [x] `Python`\r\n- [x] `Java` (haven't checked it)\r\n- [x] .NET (`C#`, `F#`, ...) (haven't checked it)\r\n\r\n### General Information\r\n* **JSII Version:** <!-- Output of `jsii --version` -->\r\n* **Platform:** <!-- `uname -a` (UNIX) / Version of Windows -->\r\n<!--\r\nCheck the box below (with an X) if you are able and willing to propose an\r\nimplementation for the requested feature. This does not imply a commitment from\r\nyou to actually do it!\r\n-->\r\n* [ ] I may be able to implement this feature request\r\n\r\n### Description\r\n\r\nWhen errors are occurring in one of the generated language targets, it's not possible by the stack trace to see where the error was originating from. 
See this stack trace for an AWS CDK app:\r\n\r\n```json\r\n \"/foo/foo/Resource\": [\r\n {\r\n \"type\": \"aws:cdk:logicalId\",\r\n \"data\": \"foo6445C170\",\r\n \"trace\": [\r\n \"new Bucket (/private/var/folders/zl/z8nc41qs7010qsfngfc_z1zc0000gn/T/jsii-kernel-ZbnPBD/node_modules/@aws-cdk/aws-s3/lib/bucket.js:387:26)\",\r\n \"/Users/sebastian/projects/cdk/tmp/foo/.env/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7915:49\",\r\n \"Kernel._wrapSandboxCode (/Users/sebastian/projects/cdk/tmp/foo/.env/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8388:20)\",\r\n \"Kernel._create (/Users/sebastian/projects/cdk/tmp/foo/.env/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7915:26)\",\r\n \"Kernel.create (/Users/sebastian/projects/cdk/tmp/foo/.env/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7659:21)\",\r\n \"KernelHost.processRequest (/Users/sebastian/projects/cdk/tmp/foo/.env/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7439:28)\",\r\n \"KernelHost.run (/Users/sebastian/projects/cdk/tmp/foo/.env/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7377:14)\",\r\n \"Immediate._onImmediate (/Users/sebastian/projects/cdk/tmp/foo/.env/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7380:37)\",\r\n \"processImmediate (internal/timers.js:456:21)\"\r\n ]\r\n }\r\n ]\r\n```\r\n\r\nIt looks similar for `cdktf` apps, see this issue for more context: https://github.com/hashicorp/terraform-cdk/issues/131\r\n\r\n### Proposed Solution\r\n<!--\r\nWhenever relevant, describe how you would like the feature to be implemented.\r\nInclude any documentation that can help understand your idea in very concrete\r\nways, such as code examples that leverage your feature, captures of design\r\ndiagrams, ...\r\n-->\r\n\r\nIt would be good to see a reference to the code in the respective language (e.g. Python), rather than having JS stack traces only.",
"title": "Cross Platform Error Stack Traces",
"type": "issue"
},
{
"action": "created",
"author": "RomainMuller",
"comment_id": 704845792,
"datetime": 1602066552000,
"masked_author": "username_1",
"text": "This is a duplicate of #47",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "RomainMuller",
"comment_id": null,
"datetime": 1602066553000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 3,058 | false | false | 2 | 3 | false |
codefluence-x/altair | codefluence-x | 642,505,338 | 15 | null | [
{
"action": "opened",
"author": "insomnius",
"comment_id": null,
"datetime": 1592721047000,
"masked_author": "username_0",
"text": "**Current Condition**\r\n\r\nGo lint in 45% score from go report\r\n\r\nhttps://goreportcard.com/report/github.com/codefluence-x/altair#golint\r\n\r\n**What we want**\r\n\r\nMake it atleast 90%\r\n\r\n**What we should do**\r\n- Add comment for documentation\r\n- Don't mind to ask me in my email or telegram (@username_0)",
"title": "[Go Report] [Go Lint] Make go lint atleast > 90%",
"type": "issue"
}
] | 296 | false | false | 1 | 1 | true |
bokeh/bokeh | bokeh | 776,045,500 | 10,803 | null | [
{
"action": "opened",
"author": "mattpap",
"comment_id": null,
"datetime": 1609269132000,
"masked_author": "username_0",
"text": "There multiple warnings like this in every CI run:\r\n```\r\nWarning: 'defaults' already in 'channels' list, moving to the top\r\n```\r\nPerhaps benign, but maybe affects resolution order. In any case, it would be good to clear this up. /cc @username_1",
"title": "`conda` warnings from bokeh-ci workflow",
"type": "issue"
},
{
"action": "created",
"author": "bryevdv",
"comment_id": 752212680,
"datetime": 1609269652000,
"masked_author": "username_1",
"text": "Weird, I will take a look soon.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bryevdv",
"comment_id": 752279648,
"datetime": 1609286059000,
"masked_author": "username_1",
"text": "@username_0 I think this is actually coming from `conda-incubator/setup-miniconda` and maybe the solution is just to remove `defaults` from the env files e.g. here:\r\n\r\nhttps://github.com/bokeh/bokeh/blob/branch-2.3/ci/environment-test-3.9.yml#L26\r\n\r\nBut we *do* want defaults first AFAIK and the warning mentions that its presence results in \"moving to the top\" so we might just have to live with it unless there is some other way to guarantee defaults is always first.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mattpap",
"comment_id": null,
"datetime": 1609417706000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 738 | false | false | 2 | 4 | true |
Setono/SyliusAnalyticsPlugin | Setono | 827,879,784 | 39 | {
"number": 39,
"repo": "SyliusAnalyticsPlugin",
"user_login": "Setono"
} | [
{
"action": "opened",
"author": "Roshyo",
"comment_id": null,
"datetime": 1615388694000,
"masked_author": "username_0",
"text": "Also sorting translation on alphabetical order",
"title": "Adding French translation",
"type": "issue"
},
{
"action": "created",
"author": "loevgaard",
"comment_id": 796526422,
"datetime": 1615447605000,
"masked_author": "username_1",
"text": "Thank you, Stephane 🎉",
"title": null,
"type": "comment"
}
] | 67 | false | false | 2 | 2 | false |
GoogleChrome/lighthouse | GoogleChrome | 431,368,547 | 8,128 | {
"number": 8128,
"repo": "lighthouse",
"user_login": "GoogleChrome"
} | [
{
"action": "opened",
"author": "muuvmuuv",
"comment_id": null,
"datetime": 1554884604000,
"masked_author": "username_0",
"text": "<!-- Thank you for submitting a pull request! -->\r\n\r\n**Summary**\r\n\r\nThis PR fixes #8110 where a `div` was missing and adds linter for HTML and CSS (as well as CSS in HTML)\r\n\r\nThis change was needed in favour of beautified exported reports where e.g. prettier printed an error because of that missing closing tag.\r\n\r\nThe linter where needed to prevent future errors like this and add consistence in styling CSS and HTML. We should now discuss how we want the linter to be configured, I added some defaults I _personally_ prefer the most and use in daily basis.\r\n\r\n**Notes**\r\n\r\nAfter I tested both linter's I took over the files and fixed those manually and parts automatically via `--fix`-flags. I don't think this should have broken something but it might be good someone else is looking over it and the tests, to see if everything is as before.\r\n\r\nI was forced to add some `.vscode`-settings because my auto formatter `prettier` tried to format some files to heavy.\r\n\r\nSomething I would add to the end we could consider:\r\n\r\n- add a formatter, like [prettier](https://github.com/prettier/prettier-vscode/), to make it easier while developing\r\n- add a spell-checker like [vscode-spell-checker](https://github.com/Jason-Rev/vscode-spell-checker), to prevent people from writing wrong typo (I have see some, see: lighthouse-core/report/html/report-styles.css:73:14)\r\n\r\n**Tests**\r\n\r\nI don't have much experience with tests and don't know if all have passed before. 
Here is my result:\r\n\r\n```bash\r\nTest Suites: 7 failed, 198 passed, 205 total\r\nTests: 52 failed, 5 skipped, 1348 passed, 1405 total\r\nSnapshots: 31 passed, 31 total\r\nTime: 25.598s, estimated 26s\r\n```\r\n\r\n**Added packages**\r\n\r\n- [html-linter](https://www.npmjs.com/package/html-linter) (this seems to be maintained)\r\n- [stylelint](https://github.com/stylelint/stylelint)\r\n- [stylelint-config-standard](https://github.com/stylelint/stylelint-config-standard) (which provides [Google's CSS Style Guide]())\r\n\r\n**Related Issues/PR's**\r\n\r\n#8110",
"title": "core: Added missing div in template and added html, css and css in html linters",
"type": "issue"
},
{
"action": "created",
"author": "patrickhulce",
"comment_id": 481732078,
"datetime": 1554909071000,
"masked_author": "username_1",
"text": "Thank you so much for the fast contribution @username_0! 👏 👏 💯 \r\n\r\nWould you mind splitting out the one `</div>` fix into a separate PR so we can get that in quickly while we try to sort out the linter pieces? :)\r\n\r\nIdeally, I think we could land the linter on warning or a small subset of the codebase and land the rest of the fixes incrementally, WDYT? We're going to be in the middle of updating our styles and HTML for v5 so a PR that's too big will likely be frustrating to try and rebase all the time.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Hoten",
"comment_id": 481886640,
"datetime": 1554934466000,
"masked_author": "username_2",
"text": "@paulirish and I are in the middle of some large report changes. We want to wait until after that before attempting to merge this in.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "brendankenny",
"comment_id": 481890709,
"datetime": 1554935525000,
"masked_author": "username_3",
"text": "that should be fine with the report work. Agreed with others that we should discuss in an issue before adding additional linters.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "muuvmuuv",
"comment_id": 481986791,
"datetime": 1554964904000,
"masked_author": "username_0",
"text": "@username_1 yeah that makes sense 🤦♂️, sorry. How to I roll-back without deleting the PR? Not very family-related with Git. Do I need to great a new branch on master of lighthouse?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "patrickhulce",
"comment_id": 482171211,
"datetime": 1554997775000,
"masked_author": "username_1",
"text": "Given that the real fix is so small, I think a new branch and PR is probably the easiest way to go :)\r\n\r\nAppreciate the effort here nonetheless @username_0 we'll want this to happen eventually! Just after we take care of some urgent report changes first 😃",
"title": null,
"type": "comment"
}
] | 3,213 | false | false | 4 | 6 | true |
aws/amazon-ecs-cli-v2 | aws | 587,839,665 | 782 | null | [
{
"action": "opened",
"author": "efekarakus",
"comment_id": null,
"datetime": 1585154973000,
"masked_author": "username_0",
"text": "## Ask\r\n\r\nBeing able to assign [tags](https://aws.amazon.com/answers/account-management/aws-tagging-strategies/) to all the application resources when manually deploying an application.\r\n\r\nRelated to https://github.com/aws/amazon-ecs-cli-v2/issues/696\r\n\r\n### Proposal\r\n\r\n```bash\r\n$ ecs-preview app deploy --help\r\n...\r\nFlags\r\n --tags stringToString Labels with a key and optional value separated with commas. \r\n Allows you to categorize resources in your project.\r\nExamples\r\n `$ ecs-preview app deploy --tags source/revision=15a4f06` \r\n```",
"title": "Allow tags while deploying an application",
"type": "issue"
}
] | 567 | false | true | 1 | 1 | false |
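The proposed `--tags` flag above takes comma-separated `key=value` pairs (`stringToString`), where a value may be omitted. A parsing sketch — the empty-string default for a bare key is an assumption, not confirmed behaviour of the CLI:

```python
def parse_tags(raw: str) -> dict[str, str]:
    """Parse "k1=v1,k2" style input; a key without '=' is assumed
    to map to an empty value."""
    tags = {}
    for pair in raw.split(","):
        key, _, value = pair.strip().partition("=")
        tags[key] = value
    return tags

print(parse_tags("source/revision=15a4f06"))  # → {'source/revision': '15a4f06'}
```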
snyk/snyk | snyk | 680,276,239 | 1,340 | null | [
{
"action": "opened",
"author": "mateuszowolf",
"comment_id": null,
"datetime": 1597674233000,
"masked_author": "username_0",
"text": "- `node -v`: NA\r\n- `npm -v`: NA\r\n- `snyk -v`: v1.374.0 (expected)\r\n- OS: (e.g. OSX, Linux, Windows, ...)\r\n- Command run: (e.g. `snyk test --all-projects`, `snyk protect`, ...)\r\n\r\n### Expected behaviour\r\nsnyk-cli is able to process its commands\r\n\r\n### Actual behaviour\r\nError messages about missing symbols (see below)\r\n\r\n### Steps to reproduce\r\n```\r\nexport IMAGE_NAME=python:3.8-alpine\r\ndocker pull $IMAGE_NAME\r\ndocker run --rm -it $IMAGE_NAME /bin/sh\r\n/ # cd tmp/\r\n/tmp # wget -O snyk-alpine https://github.com/snyk/snyk/releases/download/v1.374.0/snyk-alpine\r\n/tmp # ./snyk-alpine \r\n```\r\nThis produces error messages from Debug log below\r\nIf `libstdc++` is installed with `apk add --no-cache --update gcc` the error message is then limited to `Segmentation fault`\r\n\r\n\r\n### Debug log\r\n```\r\nError loading shared library libstdc++.so.6: No such file or directory (needed by ./snyk-alpine)\r\nError loading shared library libgcc_s.so.1: No such file or directory (needed by ./snyk-alpine)\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1118basic_stringstreamIcSt11char_traitsIcESaIcEEC1ESt13_Ios_Openmode: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE9push_backEc: symbol not found\r\nError relocating ./snyk-alpine: _ZStrsIcSt11char_traitsIcESaIcEERSt13basic_istreamIT_T0_ES7_RNSt7__cxx1112basic_stringIS4_S5_T1_EE: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE6resizeEmc: symbol not found\r\nError relocating ./snyk-alpine: _ZSt18_Rb_tree_incrementPKSt18_Rb_tree_node_base: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE8_M_eraseEmm: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE6substrEmm: symbol not found\r\nError relocating ./snyk-alpine: _ZnwmRKSt9nothrow_t: symbol not found\r\nError 
relocating ./snyk-alpine: _ZNKSt12__basic_fileIcE7is_openEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE5rfindEcm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1118basic_stringstreamIcSt11char_traitsIcESaIcEED1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt6localeC1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt9basic_iosIcSt11char_traitsIcEE4initEPSt15basic_streambufIcS1_E: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt8ios_baseC2Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt6localeD1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEE3strEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE4findEPKcmm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt13basic_filebufIcSt11char_traitsIcEE5closeEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo9_M_insertIdEERSoT_: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt9basic_iosIcSt11char_traitsIcEE5imbueERKSt6locale: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo5seekpElSt12_Ios_Seekdir: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo9_M_insertIlEERSoT_: symbol not found\r\nError relocating ./snyk-alpine: _ZNSolsEi: symbol not found\r\nError relocating ./snyk-alpine: _ZdaPvm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt13basic_filebufIcSt11char_traitsIcEEC1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE6appendEPKc: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE6assignEPKc: symbol not found\r\nError relocating ./snyk-alpine: _ZSt29_Rb_tree_insert_and_rebalancebPSt18_Rb_tree_node_baseS0_RS_: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt8ios_baseD2Ev: symbol not found\r\nError 
relocating ./snyk-alpine: _ZNSt18condition_variableD1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt5ctypeIcE13_M_widen_initEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt9basic_iosIcSt11char_traitsIcEE5rdbufEPSt15basic_streambufIcS1_E: symbol not found\r\nError relocating ./snyk-alpine: _ZStlsISt11char_traitsIcEERSt13basic_ostreamIcT_ES5_PKc: symbol not found\r\nError relocating ./snyk-alpine: _ZSt20__throw_system_errori: symbol not found\r\nError relocating ./snyk-alpine: _ZNSolsEs: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE9_M_createERmm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt8__detail15_List_node_base7_M_hookEPS0_: symbol not found\r\nError relocating ./snyk-alpine: _ZSt11_Hash_bytesPKvmm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo9_M_insertIbEERSoT_: symbol not found\r\nError relocating ./snyk-alpine: _ZSt18_Rb_tree_incrementPSt18_Rb_tree_node_base: symbol not found\r\nError relocating ./snyk-alpine: _ZSt16__ostream_insertIcSt11char_traitsIcEERSt13basic_ostreamIT_T0_ES6_PKS3_l: symbol not found\r\nError relocating ./snyk-alpine: _ZSt25__throw_bad_function_callv: symbol not found\r\nError relocating ./snyk-alpine: _ZdlPvm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE10_M_replaceEmmPKcm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEaSEOS4_: symbol not found\r\nError relocating ./snyk-alpine: _ZSt16__throw_bad_castv: symbol not found\r\nError relocating ./snyk-alpine: _ZSt28_Rb_tree_rebalance_for_erasePSt18_Rb_tree_node_baseRS_: symbol not found\r\nError relocating ./snyk-alpine: _ZSt4endlIcSt11char_traitsIcEERSt13basic_ostreamIT_T0_ES6_: symbol not found\r\nError relocating ./snyk-alpine: __popcountdi2: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE4copyEPcmm: 
symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE9_M_mutateEmmPKcm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE4swapERS4_: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo9_M_insertImEERSoT_: symbol not found\r\nError relocating ./snyk-alpine: _ZSt19__throw_logic_errorPKc: symbol not found\r\nError relocating ./snyk-alpine: _Znwm: symbol not found\r\nError relocating ./snyk-alpine: _ZSt20__throw_length_errorPKc: symbol not found\r\nError relocating ./snyk-alpine: _ZdlPv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo9_M_insertIPKvEERSoT_: symbol not found\r\nError relocating ./snyk-alpine: _ZSt20__throw_out_of_rangePKc: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE7replaceEmmPKcm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo3putEc: symbol not found\r\nError relocating ./snyk-alpine: __cxa_guard_acquire: symbol not found\r\nError relocating ./snyk-alpine: _ZSt24__throw_out_of_range_fmtPKcz: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt8__detail15_List_node_base9_M_unhookEv: symbol not found\r\nError relocating ./snyk-alpine: _ZSt7getlineIcSt11char_traitsIcESaIcEERSt13basic_istreamIT_T0_ES7_RNSt7__cxx1112basic_stringIS4_S5_T1_EES4_: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo9_M_insertIyEERSoT_: symbol not found\r\nError relocating ./snyk-alpine: _ZSt18_Rb_tree_decrementPSt18_Rb_tree_node_base: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE9_M_appendEPKcm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo9_M_insertIxEERSoT_: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt6localeC1EPKc: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE7compareEPKc: symbol not found\r\nError relocating ./snyk-alpine: 
_ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE7reserveEm: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE7compareERKS4_: symbol not found\r\nError relocating ./snyk-alpine: _ZSt18_Rb_tree_decrementPKSt18_Rb_tree_node_base: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt18condition_variable10notify_allEv: symbol not found\r\nError relocating ./snyk-alpine: _ZdlPvRKSt9nothrow_t: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt9basic_iosIcSt11char_traitsIcEE5clearESt12_Ios_Iostate: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo5writeEPKcl: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEC1ERKS4_mm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEE7_M_syncEPcmm: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt8__detail20_Prime_rehash_policy11_M_next_bktEm: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE4findEcm: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE9_M_assignERKS4_: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt13basic_filebufIcSt11char_traitsIcEE4openEPKcSt13_Ios_Openmode: symbol not found\r\nError relocating ./snyk-alpine: _Znam: symbol not found\r\nError relocating ./snyk-alpine: _ZSt24__throw_invalid_argumentPKc: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt8__detail15_List_node_base11_M_transferEPS0_S1_: symbol not found\r\nError relocating ./snyk-alpine: _ZnamRKSt9nothrow_t: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE12_M_constructEmc: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1119basic_ostringstreamIcSt11char_traitsIcESaIcEEC1ESt13_Ios_Openmode: symbol not found\r\nError relocating ./snyk-alpine: 
_ZNSt9basic_iosIcSt11char_traitsIcEE7copyfmtERKS2_: symbol not found\r\nError relocating ./snyk-alpine: _ZNSo5flushEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt18condition_variable4waitERSt11unique_lockISt5mutexE: symbol not found\r\nError relocating ./snyk-alpine: __dynamic_cast: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt8ios_base4InitC1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt8__detail20_Prime_rehash_policy14_M_need_rehashEmmm: symbol not found\r\nError relocating ./snyk-alpine: __cxa_guard_release: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE14_M_replace_auxEmmmc: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt18condition_variableC1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNKSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE7compareEmmPKc: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt12__basic_fileIcED1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZdaPv: symbol not found\r\nError relocating ./snyk-alpine: _ZSt17__throw_bad_allocv: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\n[last message repeated many times]\r\nError 
relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol 
not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: 
__cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError 
relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol 
not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: 
__cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: __cxa_pure_virtual: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE5imbueERKSt6locale: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE5imbueERKSt6locale: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE6setbufEPcl: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE6setbufEPcl: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE7seekoffElSt12_Ios_SeekdirSt13_Ios_Openmode: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE7seekoffElSt12_Ios_SeekdirSt13_Ios_Openmode: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE7seekposESt4fposI11__mbstate_tESt13_Ios_Openmode: symbol not found\r\nError 
relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE7seekposESt4fposI11__mbstate_tESt13_Ios_Openmode: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE9showmanycEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE9showmanycEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE6xsgetnEPcl: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE6xsgetnEPcl: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE9underflowEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE9underflowEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE5uflowEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE5uflowEv: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE9pbackfailEi: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE9pbackfailEi: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt15basic_streambufIcSt11char_traitsIcEE6xsputnEPKcl: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError 
relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: 
_ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: 
_ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: 
_ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: 
_ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: 
_ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\nError relocating ./snyk-alpine: 
_ZTVN10__cxxabiv120__si_class_type_infoE: symbol not found\r\n[previous line repeated many times]\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv117__class_type_infoE: symbol not found\r\n[previous line repeated many times]\r\nError relocating ./snyk-alpine: _ZTVN10__cxxabiv121__vmi_class_type_infoE: symbol not found\r\n[previous line repeated many times]\r\nError relocating ./snyk-alpine: _ZSt11__once_call: symbol not found\r\nError relocating ./snyk-alpine: _ZSt15__once_callable: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEED1Ev: symbol not found\r\nError relocating ./snyk-alpine: _ZNSt8ios_base4InitD1Ev: symbol not found\r\nError relocating ./snyk-alpine: __once_proxy: symbol not found\r\nError relocating ./snyk-alpine: 
_ZTTNSt7__cxx1119basic_istringstreamIcSt11char_traitsIcESaIcEEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTTSt14basic_ifstreamIcSt11char_traitsIcEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTTNSt7__cxx1119basic_ostringstreamIcSt11char_traitsIcESaIcEEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVSt14basic_ifstreamIcSt11char_traitsIcEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTTNSt7__cxx1118basic_stringstreamIcSt11char_traitsIcESaIcEEE: symbol not found\r\nError relocating ./snyk-alpine: _ZSt7nothrow: symbol not found\r\nError relocating ./snyk-alpine: _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVNSt7__cxx1119basic_ostringstreamIcSt11char_traitsIcESaIcEEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVSt13basic_filebufIcSt11char_traitsIcEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVSt15basic_streambufIcSt11char_traitsIcEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVNSt7__cxx1119basic_istringstreamIcSt11char_traitsIcESaIcEEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVNSt7__cxx1118basic_stringstreamIcSt11char_traitsIcESaIcEEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVSt14basic_ofstreamIcSt11char_traitsIcEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTTSt14basic_ofstreamIcSt11char_traitsIcEE: symbol not found\r\nError relocating ./snyk-alpine: _ZTVSt9basic_iosIcSt11char_traitsIcEE: symbol not found\r\nError relocating ./snyk-alpine: _ZSt4cout: symbol not found\r\nError relocating ./snyk-alpine: _ZSt4cerr: symbol not found\r\n```",
"title": "[🐛] Segmentation fault with Alpine binary since v1.374.0",
"type": "issue"
},
{
"action": "created",
"author": "JackuB",
"comment_id": 674979906,
"datetime": 1597681583000,
"masked_author": "username_1",
"text": "Hi, you are right, switching our Alpine build to use Node v12 caused this. Fix and tests to bump Node to v14 are in this PR, we'll try to merge them soon https://github.com/snyk/snyk/pull/1337",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JackuB",
"comment_id": 675061528,
"datetime": 1597691548000,
"masked_author": "username_1",
"text": "As of release https://github.com/snyk/snyk/releases/tag/v1.377.1 Alpine executable works again\r\n\r\n<img width=\"762\" alt=\"Screenshot 2020-08-17 at 21 10 50\" src=\"https://user-images.githubusercontent.com/1788727/90434766-455f9c00-e0ce-11ea-981a-f4cfa56cc5e3.png\">\r\n\r\nI see the new [smoke tests are passing on Alpine](https://github.com/snyk/snyk/actions/runs/212444856), so it shouldn't happen again",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JackuB",
"comment_id": 675063757,
"datetime": 1597691840000,
"masked_author": "username_1",
"text": "Also, the `apk add libgcc libstdc++` is still needed for `snyk-alpine`, but we are currently looking into ways to embed them in the executable",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "JackuB",
"comment_id": null,
"datetime": 1599828539000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 95,342 | false | false | 2 | 5 | false |
ros2/rclcpp | ros2 | 482,689,892 | 827 | null | [
{
"action": "opened",
"author": "hyunseok-yang",
"comment_id": null,
"datetime": 1566285928000,
"masked_author": "username_0",
"text": "<!--\r\nFor general questions, please ask on ROS answers: https://answers.ros.org, make sure to include at least the `ros2` tag and the rosdistro version you are running, e.g. `ardent`.\r\nFor general design discussions, please post on discourse: https://discourse.ros.org/c/ng-ros\r\nNot sure if this is the right repository? Open an issue on https://github.com/ros2/ros2/issues\r\nFor Bug report or feature requests, please fill out the relevant category below\r\n-->\r\n\r\n## Bug report\r\n\r\n**Required Info:**\r\n\r\n- Operating System:\r\n - Ubuntu 18.04\r\n- Installation type:\r\n - Binaries(Debian package)\r\n- Version or commit hash:\r\n - Dashing latest\r\n- DDS implementation:\r\n - Fast-RTPS\r\n- Client library (if applicable):\r\n - rclcpp\r\n\r\n#### Steps to reproduce issue\r\nIn my workspace, I usually run gazebo + driver(for simualtor) on **terminal 1**. And I launch the navigation2 package on terminal 2.\r\n\r\n\r\n<!-- Detailed instructions on how to reliably reproduce this issue http://sscce.org/\r\n``` code that can be copy-pasted is preferred ``` -->\r\n```\r\n\r\n```\r\n\r\n#### Expected behavior\r\nNo termination on **terminal 1**\r\n(I've never seen like this error message before with crystal version..)\r\n\r\nDo you guys think it's a DDS issue(for example Fast RTPS?)?\r\n\r\n#### Actual behavior\r\n\r\nNot always...... 
but I got this error message and process died occasionally on **terminal 1** when I shutdown the navigation2(on terminal 2) by Ctrl-C and re-launch the navigation2 (on terminal 2).\r\n\r\n[launch.sh-1] terminate called after throwing an instance of 'rclcpp::exceptions::RCLError'\r\n[launch.sh-1] what(): failed to publish message: cannot publish data, at /tmp/binarydeb/ros-dashing-rmw-fastrtps-shared-cpp-0.7.5/src/rmw_publish.cpp:52, at /tmp/binarydeb/ros-dashing-rcl-0.7.6/src/rcl/publisher.c:257\r\n\r\n[mcu_porter_sim-3] terminate called after throwing an instance of 'rclcpp::exceptions::RCLError'\r\n[mcu_porter_sim-3] what(): failed to publish message: cannot publish data, at /tmp/binarydeb/ros-dashing-rmw-fastrtps-shared-cpp-0.7.5/src/rmw_publish.cpp:52, at /tmp/binarydeb/ros-dashing-rcl-0.7.6/src/rcl/publisher.c:257\r\n\r\n\r\n#### Additional information\r\nI configured default NodeOptions like below.\r\n` : Node(\"micom_driver\", rclcpp::NodeOptions())`\r\n\r\nI configured QoS on publisher like below.\r\n```\r\n auto qos = rclcpp::QoS(rclcpp::QoSInitialization::from_rmw(rmw_qos_profile_sensor_data));\r\n pubOdometryMsg = this->create_publisher<nav_msgs::msg::Odometry>(\"/odom\", qos);\r\n```\r\n\r\n<!-- If you are reporting a bug delete everything below\r\n If you are requesting a feature deleted everything above this line -->\r\n----\r\n## Feature request\r\n\r\n#### Feature description\r\n<!-- Description in a few sentences what the feature consists of and what problem it will solve -->\r\n\r\n#### Implementation considerations\r\n<!-- Relevant information on how the feature could be implemented and pros and cons of the different solutions -->",
"title": "cannot publish data",
"type": "issue"
},
{
"action": "created",
"author": "ivanpauno",
"comment_id": 523024319,
"datetime": 1566309058000,
"masked_author": "username_1",
"text": "Similar to https://github.com/ros2/ros2/issues/714.\r\nIt seems to be an error in FastRTPS.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dirk-thomas",
"comment_id": 528966330,
"datetime": 1567794855000,
"masked_author": "username_2",
"text": "Closing as duplicate of ros2/ros2#714.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "dirk-thomas",
"comment_id": null,
"datetime": 1567794855000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3,044 | false | false | 3 | 4 | false |
swt2-intro-exercise/rails-exercise-20-frcroth | swt2-intro-exercise | 742,339,867 | 4 | null | [
{
"action": "opened",
"author": "swt2public",
"comment_id": null,
"datetime": 1605262884000,
"masked_author": "username_0",
"text": "*5/44 exercise tests have passed*",
"title": "Author #name should return the full name",
"type": "issue"
},
{
"action": "created",
"author": "swt2public",
"comment_id": 726681871,
"datetime": 1605262951000,
"masked_author": "username_0",
"text": "*If you have problems solving this task, please don't hesitate to contact the teaching team!*",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "frcroth",
"comment_id": null,
"datetime": 1605263488000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 126 | false | false | 2 | 3 | false |
gfx-rs/gfx | gfx-rs | 744,085,621 | 3,478 | null | [
{
"action": "opened",
"author": "Herschel",
"comment_id": null,
"datetime": 1605553539000,
"masked_author": "username_0",
"text": "Running the quad or colour-uniform example:\r\n`cargo run --bin quad --features dx12`\r\nresults in the following panic:\r\n```\r\nthread 'main' panicked at 'assertion failed: `(left == right)`\r\n left: `0`,\r\n right: `-2005270523`: error on command allocator creation: 887a0005', src\\backend\\dx12\\src\\pool.rs:29:17\r\n```\r\n\r\nStack trace and more spew:\r\n```AdapterInfo { name: \"NVIDIA GeForce RTX 2080 Ti\", vendor: 4318, device: 7687, device_type: DiscreteGpu }\r\nAdapterInfo { name: \"Microsoft Basic Render Driver\", vendor: 5140, device: 140, device_type: VirtualGpu }\r\nAdapterInfo { name: \"NVIDIA GeForce RTX 2080 Ti\", vendor: 4318, device: 7687, device_type: DiscreteGpu }\r\nMemory types: [MemoryType { properties: DEVICE_LOCAL, heap_index: 0 }, MemoryType { properties: CPU_VISIBLE | COHERENT, heap_index: 1 }, MemoryType { properties: CPU_VISIBLE | COHERENT | CPU_CACHED, heap_index: 1 }]\r\nformats: Some([Bgra8Srgb, Bgra8Unorm, Rgba8Srgb, Rgba8Unorm, A2b10g10r10Unorm, Rgba16Sfloat])\r\nSwapchainConfig { present_mode: FIFO, composite_alpha_mode: OPAQUE, format: Bgra8Srgb, extent: Extent2D { width: 1024, height: 768 }, image_count: 3, image_layers: 1, image_usage: COLOR_ATTACHMENT }\r\nLoaded 'C:\\Windows\\System32\\dxilconv.dll'.\r\nD3D12 WARNING: ID3D12CommandList::IASetVertexBuffers: Resource 0x00000209A84AE870:'Unnamed ID3D12Resource Object' and 1 other resources contain the GPU Virtual Address range [0x000000000a9fa000, 0x000000000a9fa05f] on a Heap (0x000002099FF82AD0:'Unnamed ID3D12Heap Object'). This may be OK as long as all of these resources are in the same state however developer probably did not intend to make use of this behavior. [ STATE_CREATION WARNING #926: HEAP_ADDRESS_RANGE_INTERSECTS_MULTIPLE_BUFFERS]\r\nD3D12 ERROR: ID3D12CommandList::ResourceBarrier: Before and after states must be different. 
[ RESOURCE_MANIPULATION ERROR #525: RESOURCE_BARRIER_MATCHING_STATES]\r\nException thrown at 0x00007FF8793E3E49 in quad.exe: Microsoft C++ exception: _com_error at memory location 0x000000505476B370.\r\nD3D12 ERROR: ID3D12CommandList::ResourceBarrier: Before and after states must be different. [ RESOURCE_MANIPULATION ERROR #525: RESOURCE_BARRIER_MATCHING_STATES]\r\nException thrown at 0x00007FF8793E3E49 in quad.exe: Microsoft C++ exception: _com_error at memory location 0x000000505476B7B0.\r\nD3D12 ERROR: ID3D12CommandQueue::ExecuteCommandLists: Command lists must be successfully closed before execution. [ EXECUTION ERROR #838: EXECUTECOMMANDLISTS_FAILEDCOMMANDLIST]\r\nException thrown at 0x00007FF8793E3E49 in quad.exe: Microsoft C++ exception: _com_error at memory location 0x000000505476B880.\r\nD3D12: Removing Device.\r\nD3D12 ERROR: ID3D12Device::RemoveDevice: Device removal has been triggered for the following reason (DXGI_ERROR_INVALID_CALL: There is strong evidence that the application has performed an illegal or undefined operation, and such a condition could not be returned to the application cleanly through a return code). 
[ EXECUTION ERROR #232: DEVICE_REMOVAL_PROCESS_AT_FAULT]\r\nException thrown at 0x00007FF8793E3E49 in quad.exe: Microsoft C++ exception: _com_error at memory location 0x0000005054763E08.\r\nException thrown at 0x00007FF8793E3E49 in quad.exe: Microsoft C++ exception: _com_error at memory location 0x0000005054763F50.\r\nthread 'main' panicked at 'assertion failed: `(left == right)`\r\n left: `0`,\r\n right: `-2005270523`: error on command allocator creation: 887a0005', src\\backend\\dx12\\src\\pool.rs:29:17\r\nstack backtrace:\r\nLoaded 'C:\\Windows\\System32\\dbghelp.dll'.\r\n 0: std::panicking::begin_panic_handler\r\n at /rustc/18bf6b4f01a6feaf7259ba7cdae58031af1b7b39\\/library\\std\\src\\panicking.rs:475\r\n 1: std::panicking::begin_panic_fmt\r\n at /rustc/18bf6b4f01a6feaf7259ba7cdae58031af1b7b39\\/library\\std\\src\\panicking.rs:429\r\n 2: gfx_backend_dx12::pool::PoolShared::acquire\r\n at .\\src\\backend\\dx12\\src\\pool.rs:29\r\n 3: gfx_backend_dx12::command::{{impl}}::begin\r\n at .\\src\\backend\\dx12\\src\\command.rs:1202\r\n 4: gfx_hal::command::CommandBuffer::begin_primary<gfx_backend_dx12::command::CommandBuffer,gfx_backend_dx12::Backend>\r\n at .\\src\\hal\\src\\command\\mod.rs:113\r\n 5: quad::Renderer<gfx_backend_dx12::Backend>::render<gfx_backend_dx12::Backend>\r\n at .\\examples\\quad\\main.rs:837\r\n 6: quad::main::{{closure}}\r\n at .\\examples\\quad\\main.rs:179\r\n 7: winit::platform_impl::platform::event_loop::{{impl}}::run_return::{{closure}}<tuple<>,closure-0>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop.rs:200\r\n 8: alloc::boxed::{{impl}}::call_mut<tuple<winit::event::Event<tuple<>>, mut winit::event_loop::ControlFlow*>,FnMut<tuple<winit::event::Event<tuple<>>, mut winit::event_loop::ControlFlow*>>>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\alloc\\src\\boxed.rs:1049\r\n 9: 
winit::platform_impl::platform::event_loop::runner::{{impl}}::call_event_handler::{{closure}}<tuple<>>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop\\runner.rs:239\r\n 10: std::panic::{{impl}}::call_once<tuple<>,closure-0>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\std\\src\\panic.rs:308\r\n 11: std::panicking::try::do_call<std::panic::AssertUnwindSafe<closure-0>,tuple<>>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\std\\src\\panicking.rs:373\r\n 12: winit::window::{{impl}}::clone\r\n 13: std::panicking::try<tuple<>,std::panic::AssertUnwindSafe<closure-0>>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\std\\src\\panicking.rs:337\r\n 14: std::panic::catch_unwind<std::panic::AssertUnwindSafe<closure-0>,tuple<>>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\std\\src\\panic.rs:379\r\n 15: winit::platform_impl::platform::event_loop::runner::EventLoopRunner<tuple<>>::catch_unwind<tuple<>,tuple<>,closure-0>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop\\runner.rs:150\r\n 16: winit::platform_impl::platform::event_loop::runner::EventLoopRunner<tuple<>>::call_event_handler<tuple<>>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop\\runner.rs:233\r\n 17: winit::platform_impl::platform::event_loop::runner::EventLoopRunner<tuple<>>::call_redraw_events_cleared<tuple<>>\r\n at C:\\Users\\Michael 
Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop\\runner.rs:371\r\n 18: winit::platform_impl::platform::event_loop::runner::EventLoopRunner<tuple<>>::move_state_to<tuple<>>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop\\runner.rs:328\r\n 19: winit::platform_impl::platform::event_loop::runner::EventLoopRunner<tuple<>>::redraw_events_cleared<tuple<>>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop\\runner.rs:229\r\n 20: winit::platform_impl::platform::event_loop::public_window_callback::{{closure}}<tuple<>>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop.rs:821\r\n 21: core::ops::function::FnOnce::call_once<closure-0,tuple<>>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\core\\src\\ops\\function.rs:227\r\n 22: std::panic::{{impl}}::call_once<isize,closure-0>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\std\\src\\panic.rs:308\r\n 23: std::panicking::try::do_call<std::panic::AssertUnwindSafe<closure-0>,isize>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\std\\src\\panicking.rs:373\r\n 24: std::panicking::try::do_catch<std::panic::AssertUnwindSafe<closure-0>,isize>\r\n 25: std::panicking::try<isize,std::panic::AssertUnwindSafe<closure-0>>\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\std\\src\\panicking.rs:337\r\n 26: std::panic::catch_unwind<std::panic::AssertUnwindSafe<closure-0>,isize>\r\n at C:\\Users\\Michael 
Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\std\\src\\panic.rs:379\r\n 27: winit::platform_impl::platform::event_loop::runner::EventLoopRunner<tuple<>>::catch_unwind<tuple<>,isize,closure-0>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop\\runner.rs:150\r\n 28: winit::platform_impl::platform::event_loop::public_window_callback<tuple<>>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop.rs:1893\r\n 29: DefSubclassProc\r\n 30: DefSubclassProc\r\n 31: CallWindowProcW\r\n 32: DispatchMessageW\r\n 33: SendMessageTimeoutW\r\n 34: KiUserCallbackDispatcher\r\n 35: NtUserDispatchMessage\r\n 36: DispatchMessageW\r\n 37: winit::platform_impl::platform::event_loop::EventLoop<tuple<>>::run_return<tuple<>,closure-0>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop.rs:215\r\n 38: winit::platform_impl::platform::event_loop::EventLoop<tuple<>>::run<tuple<>,closure-0>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\platform_impl\\windows\\event_loop.rs:185\r\n 39: winit::event_loop::EventLoop<tuple<>>::run<tuple<>,closure-0>\r\n at C:\\Users\\Michael Welsh\\.cargo\\registry\\src\\github.com-1ecc6299db9ec823\\winit-0.22.2\\src\\event_loop.rs:149\r\n 40: quad::main\r\n at .\\examples\\quad\\main.rs:152\r\n 41: core::ops::function::FnOnce::call_once<fn(),tuple<>>\r\nException thrown at 0x00007FF8793E3E49 in quad.exe: Microsoft C++ exception: ?? 
::st_panic at memory location 0x0000005054763E78.\r\n at C:\\Users\\Michael Welsh\\.rustup\\toolchains\\stable-x86_64-pc-windows-msvc\\lib\\rustlib\\src\\rust\\library\\core\\src\\ops\\function.rs:227\r\nnote: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.\r\nDROPPED!\r\nD3D12 WARNING: Live ID3D12Device at 0x000002099D5B97D8, Refcount: 6 [ STATE_CREATION WARNING #274: LIVE_DEVICE]\r\nD3D12 WARNING: \tLive ID3D12RootSignature at 0x00000209A01D5490, Refcount: 0, IntRef: 2 [ STATE_CREATION WARNING #577: LIVE_ROOTSIGNATURE]\r\nD3D12 WARNING: \tLive ID3D12PipelineState at 0x00000209A6A6FEA0, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #572: LIVE_PIPELINESTATE]\r\nD3D12 WARNING: \tLive ID3D12PipelineState at 0x00000209A6A77DA0, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #572: LIVE_PIPELINESTATE]\r\nD3D12 WARNING: \tLive ID3D12Resource at 0x00000209A6A90710, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #575: LIVE_RESOURCE]\r\nD3D12 WARNING: \tLive ID3D12Heap at 0x00000209A6A90460, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #579: LIVE_HEAP]\r\nD3D12 WARNING: \tLive ID3D12Fence at 0x00000209A6B31A00, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #580: LIVE_MONITOREDFENCE]\r\nD3D12 WARNING: \tLive ID3D12CommandQueue at 0x00000209A6B31C50, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #570: LIVE_COMMANDQUEUE]\r\nD3D12 WARNING: \tLive ID3D12Fence at 0x00000209A6BD1C50, Name: Internal D3D12 Debug Fence, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #580: LIVE_MONITOREDFENCE]\r\nD3D12 WARNING: \tLive ID3D12Fence at 0x00000209A6C56230, Name: Internal D3D12 Debug Fence, Refcount: 0, IntRef: 3 [ STATE_CREATION WARNING #580: LIVE_MONITOREDFENCE]\r\nD3D12 WARNING: \tLive ID3D12CommandAllocator at 0x00000209A85B9C40, Refcount: 1, IntRef: 4 [ STATE_CREATION WARNING #571: LIVE_COMMANDALLOCATOR]\r\nD3D12 WARNING: \tLive ID3D12GraphicsCommandList at 0x00000209A85BA150, Refcount: 1, IntRef: 0 [ STATE_CREATION WARNING #573: 
LIVE_COMMANDLIST12]\r\nD3D12 WARNING: \tLive ID3D12GraphicsCommandList at 0x00000209A84B8130, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #573: LIVE_COMMANDLIST12]\r\nD3D12 WARNING: \tLive ID3D12GraphicsCommandList at 0x00000209A85368B0, Refcount: 1, IntRef: 0 [ STATE_CREATION WARNING #573: LIVE_COMMANDLIST12]\r\nD3D12 WARNING: \tLive ID3D12GraphicsCommandList at 0x00000209A8569360, Refcount: 0, IntRef: 1 [ STATE_CREATION WARNING #573: LIVE_COMMANDLIST12]\r\nException thrown at 0x00007FF8793E3E49 in quad.exe: Microsoft C++ exception: ?? ::st_panic at memory location 0x0000005054769B98.\r\nUnloaded 'C:\\Windows\\System32\\d3d10warp.dll'.\r\nD3D12 WARNING: Process is terminating. Using simple reporting. Please call ReportLiveObjects() at runtime for standard reporting. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: Live Producer at 0x000002099D5B97B8, Refcount: 3. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A01D5490, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A6A6FEA0, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A6A77DA0, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A6A90710, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A6A90460, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A6B31A00, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A6B31C50, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A6BD1C50, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A6C56230, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A85B9C40, Refcount: 1. 
[ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A85BA150, Refcount: 1. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A84B8130, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A85368B0, Refcount: 1. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: \tLive Object at 0x00000209A8569360, Refcount: 0. [ STATE_CREATION WARNING #0: UNKNOWN]\r\nD3D12 WARNING: Live Object : 14 [ STATE_CREATION WARNING #0: UNKNOWN]\r\nDXGI WARNING: Live Producer at 0x000002099D593658, Refcount: 7. [ STATE_CREATION WARNING #0: ]\r\nDXGI WARNING: \tLive Object at 0x000002099D590580, Refcount: 1. [ STATE_CREATION WARNING #0: ]\r\nDXGI WARNING: \tLive Object at 0x00000209A6C04120, Refcount: 1. [ STATE_CREATION WARNING #0: ]\r\nDXGI WARNING: \tLive Object at 0x00000209A6C08A50, Refcount: 1. [ STATE_CREATION WARNING #0: ]\r\nDXGI WARNING: \tLive Object at 0x00000209A6C57550, Refcount: 1. [ STATE_CREATION WARNING #0: ]\r\nDXGI WARNING: \tLive Object at 0x00000209A6C579E0, Refcount: 1. [ STATE_CREATION WARNING #0: ]\r\nDXGI WARNING: \tLive Object at 0x00000209A81F06A0, Refcount: 1. [ STATE_CREATION WARNING #0: ]\r\nDXGI WARNING: Live Object : 6 [ STATE_CREATION WARNING #0: ]\r\nThe program '[12316] quad.exe' has exited with code 101 (0x65).\r\n```\r\n\r\nShort info header:\r\n- GFX version: https://github.com/gfx-rs/gfx/commit/87a24bf390ddd0ee6bc34393d879f2d823aee67f\r\n- OS: Windows 10 64-bit\r\n- GPU: Nvidia Geforce 2080 Ti",
"title": "dx12: Error on command allocator allocation",
"type": "issue"
},
{
"action": "created",
"author": "kvark",
"comment_id": 728451537,
"datetime": 1605571172000,
"masked_author": "username_1",
"text": "Fixed by 57ce75cab889bd35ad00ffc4ccdc0665737bccd5",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "kvark",
"comment_id": null,
"datetime": 1605571172000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "Herschel",
"comment_id": 728456250,
"datetime": 1605571347000,
"masked_author": "username_0",
"text": "Confirmed fixed, thanks!",
"title": null,
"type": "comment"
}
] | 15,914 | false | false | 2 | 4 | false |
JuliaRegistries/General | JuliaRegistries | 708,071,892 | 21,908 | {
"number": 21908,
"repo": "General",
"user_login": "JuliaRegistries"
} | [
{
"action": "opened",
"author": "JuliaRegistrator",
"comment_id": null,
"datetime": 1600945206000,
"masked_author": "username_0",
"text": "- Registering package: MLJLinearModels\n- Repository: https://github.com/alan-turing-institute/MLJLinearModels.jl\n- Created by: @tlienart\n- Version: v0.5.1\n- Commit: dd87888b9ba65705ac61a7b20a6bee49afa46cea\n- Reviewed by: @tlienart\n- Reference: https://github.com/alan-turing-institute/MLJLinearModels.jl/commit/dd87888b9ba65705ac61a7b20a6bee49afa46cea#commitcomment-42687468\n<!-- bf0c69308befbd3ccf2cc956ac8a46712550b79fc9bfb5e4edf8f833f05f4c18b06eddad8845b45beb9f45c2b8020dd6e5e70d4790982f2cbfc3c4512df3919b14b180a9cfcbd2cbe655836e83e4d96f3af1e760bce0deb365e7d4e010af7e206b1459636aa4dbd1b1bba871ddc0b55c8c523a80fea1aaa9ae79e8b6ae60ce574b60cb06d80883cbde8588d08691264af9c1cc1ecba55d574d203169ac422ba532e885209e9c2ecd9e54bbb1cd9e6da2e25218fc2f403ff604beeffe40a66c415e046e5203aa05c0773a5011d2f3e26bf17cda845569285ba7a9cf1a1f50cd250db65d104627effb95c4aba7b199d231 -->",
"title": "New version: MLJLinearModels v0.5.1",
"type": "issue"
}
] | 864 | false | true | 1 | 1 | false |
Arkq/bluez-alsa | null | 841,080,436 | 432 | {
"number": 432,
"repo": "bluez-alsa",
"user_login": "Arkq"
} | [
{
"action": "opened",
"author": "borine",
"comment_id": null,
"datetime": 1616688299000,
"masked_author": "username_0",
"text": "",
"title": "improve PCM plugin compatiblity with ALSA rate plugin",
"type": "issue"
},
{
"action": "created",
"author": "Arkq",
"comment_id": 820133482,
"datetime": 1618465768000,
"masked_author": "username_1",
"text": "Thanks for this PR, and sorry for delayed merge (I'm really, really busy lately :/)",
"title": null,
"type": "comment"
}
] | 83 | false | false | 2 | 2 | false |
Tesena-smart-testing/WatchUI | Tesena-smart-testing | 776,409,600 | 59 | null | [
{
"action": "opened",
"author": "marcel-veselka",
"comment_id": null,
"datetime": 1609327675000,
"masked_author": "username_0",
"text": "Readme file propagated from GitHub doesn't show properly some characters + images are not propagated as well.\r\n\r\nhttps://pypi.org/project/WatchUI/\r\n\r\nSee issues with characters:\r\n<img width=\"397\" alt=\"chars\" src=\"https://user-images.githubusercontent.com/9706465/103348381-1dc80980-4a9a-11eb-857b-f87142f49275.png\">\r\n\r\nSee issues with images:\r\n<img width=\"287\" alt=\"imgs\" src=\"https://user-images.githubusercontent.com/9706465/103348430-4223e600-4a9a-11eb-8475-382435f37f28.png\">",
"title": "README.md propagation to Pip is incorrect - problems with chars and images",
"type": "issue"
},
{
"action": "created",
"author": "janegermaier",
"comment_id": 754179114,
"datetime": 1609789663000,
"masked_author": "username_1",
"text": "Chars was repaired in new release (v1.0.10). I going to repairing image in next release (v1.0.11). Thanks for report :-)",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "janegermaier",
"comment_id": null,
"datetime": 1611259522000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "janegermaier",
"comment_id": 764903926,
"datetime": 1611259522000,
"masked_author": "username_1",
"text": "Hi,\r\ntoday i repaired bug :-) \r\n\r\nMore u´ll see [here.](https://pypi.org/project/WatchUI/)",
"title": null,
"type": "comment"
}
] | 689 | false | false | 2 | 4 | false |
openEuler-Networking/woff2 | openEuler-Networking | 788,751,331 | 21 | null | [
{
"action": "opened",
"author": "jzmexec",
"comment_id": null,
"datetime": 1611038359000,
"masked_author": "username_0",
"text": "Synchronized from google/woff2: https://github.com/google/woff2/issues/127\r\n\r\nI am stuck at compiling this on Gentoo with g++ 8.3.0\r\nAfter I typed\r\n\r\n`make clean all `\r\n\r\nThe following occoured:\r\n\r\n`rm -f src/font.o src/glyph.o src/normalize.o src/table_tags.o src/transform.o src/woff2_dec.o src/woff2_enc.o src/woff2_common.o src/woff2_out.o src/variable_length.o src/woff2_compress.o src/woff2_decompress.o src/woff2_info.o woff2_compress woff2_decompress woff2_info make -C brotli clean make[1]: Entering directory '/home/null/woff2/brotli' rm -rf bin libbrotli.a make[1]: Leaving directory '/home/null/woff2/brotli' c++ -I./brotli/c/include/ -I./src -I./include -c -o src/font.o src/font.cc c++: error trying to exec 'cc1plus': execvp: Not a directory make: *** [<builtin>: src/font.o] Error 1 `\r\n\r\nBut I have found cc1plus in path:/usr/libexec/gcc/x86_64-pc-linux-gnu/8.3.0/\r\nSo I tried to\r\n\r\n`export PATH=/usr/libexec/gcc/x86_64-pc-linux-gnu/8.3.0/:$PATH `\r\n\r\nand then\r\n\r\n`make clean all `\r\n\r\nbut the error happen again.\r\nWhat should I do to fix this?",
"title": "Error on Gentoo",
"type": "issue"
}
] | 1,058 | false | false | 1 | 1 | false |
twilio/twilio-video-app-react | twilio | 663,094,078 | 274 | null | [
{
"action": "opened",
"author": "mominnawaf",
"comment_id": null,
"datetime": 1595344301000,
"masked_author": "username_0",
"text": "when i run `twilio login`\r\nI get error as \r\n\r\nError: Cannot find module '@oclif/command'\r\n at Function.Module._resolveFilename (internal/modules/cjs/loader.js:636:15)\r\n at Function.Module._load (internal/modules/cjs/loader.js:562:25)\r\n at Module.require (internal/modules/cjs/loader.js:690:17)\r\n at require (internal/modules/cjs/helpers.js:25:18)\r\n at Object.<anonymous> (C:\\Users\\Nawaf Momin\\AppData\\Roaming\\npm\\node_modules\\twilio-cli\\bin\\run:4:17)\r\n at Module._compile (internal/modules/cjs/loader.js:776:30)\r\n at Object.Module._extensions..js (internal/modules/cjs/loader.js:787:10)\r\n at Module.load (internal/modules/cjs/loader.js:653:32)\r\n at tryModuleLoad (internal/modules/cjs/loader.js:593:12)\r\n at Function.Module._load (internal/modules/cjs/loader.js:585:3)`",
"title": "Cannot find module '@oclif/command'",
"type": "issue"
},
{
"action": "closed",
"author": "timmydoza",
"comment_id": null,
"datetime": 1596062054000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "timmydoza",
"comment_id": 665965164,
"datetime": 1596062054000,
"masked_author": "username_1",
"text": "Thanks for the question @username_0!\r\n\r\nUnfortunately, I cannot reproduce the issue myself. It's also difficult to know what the issue could be without more information. Can you please create a new issue and complete our issue template in full? The more we know about your environment, what commands you have run, what exact steps are required to reproduce the issue, etc., the better we will be able to help you. \r\n\r\nThanks!",
"title": null,
"type": "comment"
}
] | 1,224 | false | false | 2 | 3 | true |
OpenSRP/opensrp-client-giz-malawi | OpenSRP | 658,436,674 | 317 | null | [
{
"action": "opened",
"author": "Mstjamush",
"comment_id": null,
"datetime": 1594923268000,
"masked_author": "username_0",
"text": "Issues from the Contact activity\r\n1. There's a lag when loading the \"Current Pregnancy\" page questions\r\n2. There's a lag when loading the \"Physical Symptoms\" page questions\r\n3. There's a lag when loading the \"Height & weight\" & \"Blood pressure\" page questions\r\n4. There's a lag when loading the \"Test\" option questions - lags on pages take 2-30 secs and responding issue when providing input data to the questions\r\n5. There's a lag when loading the \"Counselling and Treatment\" option questions - lags on pages take 10secs - 2mins and responding issue when providing input data to the questions\r\n6. App is crashing on the Immunisations page",
"title": "Jembi issues(v0.2.34-beta) - ANC Contact",
"type": "issue"
},
{
"action": "created",
"author": "hmakandi",
"comment_id": 666230566,
"datetime": 1596098363000,
"masked_author": "username_1",
"text": "The record included the time taken by the user to enter the data and thus were not applicable for the system performance measures being sought. Therefore am closing this issue.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "hmakandi",
"comment_id": null,
"datetime": 1596098364000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 817 | false | false | 2 | 3 | false |
Showndarya/Hacktoberfest | null | 724,996,964 | 2,980 | {
"number": 2980,
"repo": "Hacktoberfest",
"user_login": "Showndarya"
} | [
{
"action": "opened",
"author": "worachatsun",
"comment_id": null,
"datetime": 1603143743000,
"masked_author": "username_0",
"text": "The Pull Request resolves Issue #[Issue-Number]\r\n\r\nDescription:\r\nA brief description about the addition or edits made.\r\n\r\nNotes: Delete all the text including this!\r\n* The `#[Issue-Number]` should be replaced with the specific Issue being referenced.",
"title": "Create Awesomesauce",
"type": "issue"
}
] | 250 | false | true | 1 | 1 | false |
HackYourFuture-CPH/fp-class12 | HackYourFuture-CPH | 646,794,359 | 182 | {
"number": 182,
"repo": "fp-class12",
"user_login": "HackYourFuture-CPH"
} | [
{
"action": "opened",
"author": "JohnYazji",
"comment_id": null,
"datetime": 1593302309000,
"masked_author": "username_0",
"text": "# Description\r\n\r\nPlease provide a short summary explaining what this PR is about.\r\n\r\nDeploy storybook stories of fp-class12 on Netlify\r\n\r\nFixes #178 \r\n\r\n# How to test?\r\nhttps://storybook-fp-class12.netlify.app/?path=/story/*\r\n\r\nPlease provide a short summary how your changes can be tested?\r\n\r\n# Checklist\r\n\r\n- [X] I have performed a self-review of my own code\r\n- [ ] I have commented my code, particularly in hard-to-understand areas\r\n- [X] I have made corresponding changes to the documentation\r\n- [X] I have merged the latest version of the source branch (typically develop) into this branch before pushing my changes\r\n- [ ] This PR is ready to be merged and not breaking any other functionality",
"title": "Storybook deployment on Netlify",
"type": "issue"
},
{
"action": "created",
"author": "zkwsk",
"comment_id": 650725598,
"datetime": 1593337208000,
"masked_author": "username_1",
"text": "I don't know what is wrong and don't quite have the time to debug this further, but I can reproduce the error locally if I build storybook and try to serve it out of the `/storybook-static` folder.\r\n\r\nTwo small things:\r\n- Could you change this to a draft PR?\r\n- I would prefer if you work directly on the main repo rather than on your own fork. To checkout your code I either need to checkout your fork separately or add it as a remote and that's a bit tedious with many contributors.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JohnYazji",
"comment_id": 650771232,
"datetime": 1593355151000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JohnYazji",
"comment_id": 651119772,
"datetime": 1593437204000,
"masked_author": "username_0",
"text": "I closed this PR because I have created a new PR #187",
"title": null,
"type": "comment"
}
] | 1,235 | false | false | 2 | 4 | false |
dogsheep/github-to-sqlite | dogsheep | 681,086,659 | 47 | null | [
{
"action": "opened",
"author": "simonw",
"comment_id": null,
"datetime": 1597760786000,
"masked_author": "username_0",
"text": "For fun - it can import https://api.github.com/emojis - maybe with an option to fetch the binary representations in addition to the URLs.",
"title": "emojis command",
"type": "issue"
},
{
"action": "created",
"author": "simonw",
"comment_id": 675523053,
"datetime": 1597761954000,
"masked_author": "username_0",
"text": "```\r\n% github-to-sqlite emojis emojis.db --fetch\r\n [########----------------------------] 397/1682 23% 00:03:43\r\n```",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "simonw",
"comment_id": null,
"datetime": 1597762333000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 258 | false | false | 1 | 3 | false |
digital-asset/daml | digital-asset | 705,542,116 | 7,449 | {
"number": 7449,
"repo": "daml",
"user_login": "digital-asset"
} | [
{
"action": "opened",
"author": "remyhaemmerle-da",
"comment_id": null,
"datetime": 1600690465000,
"masked_author": "username_0",
"text": "Since #5321, LF versions < 1.6 are deprecated. This PR drops some tests\r\nfor those deprecated versions.\r\n\r\nCHANGELOG_BEGIN\r\nCHANGELOG_END\r\n\r\n### Pull Request Checklist\r\n\r\n- [ ] Read and understand the [contribution guidelines](https://github.com/digital-asset/daml/blob/master/CONTRIBUTING.md)\r\n- [ ] Include appropriate tests\r\n- [ ] Set a descriptive title and thorough description\r\n- [ ] Add a reference to the [issue this PR will solve](https://github.com/digital-asset/daml/issues), if appropriate\r\n- [ ] Include changelog additions in one or more commit message bodies between the `CHANGELOG_BEGIN` and `CHANGELOG_END` tags\r\n- [ ] Normal production system change, include purpose of change in description\r\n\r\nNOTE: CI is not automatically run on non-members pull-requests for security\r\nreasons. The reviewer will have to comment with `/AzurePipelines run` to\r\ntrigger the build.",
"title": "LF: Drop some tests for deprecated LF versions",
"type": "issue"
},
{
"action": "created",
"author": "remyhaemmerle-da",
"comment_id": 696086979,
"datetime": 1600691952000,
"masked_author": "username_0",
"text": "Since #7415, Sandbox classic does not supported then any more.",
"title": null,
"type": "comment"
}
] | 944 | false | false | 1 | 2 | false |
SumoLogic/tailing-sidecar | SumoLogic | 831,881,675 | 79 | null | [
{
"action": "opened",
"author": "kkujawa-sumo",
"comment_id": null,
"datetime": 1615819289000,
"masked_author": "username_0",
"text": "Document and/or add warning for sidecar configs with the same name, related to https://github.com/SumoLogic/tailing-sidecar/pull/51#discussion_r591614576\r\nCurrently when 2 sidecar configs have the same name then the second config overwrites the first.\r\n\r\nE.g.\r\nTailingSidecar\r\n```\r\napiVersion: tailing-sidecar.sumologic.com/v1\r\nkind: TailingSidecar\r\nmetadata:\r\n name: tailingsidecar-sample\r\nspec:\r\n configs:\r\n sidecarconfig:\r\n volume: varlogconfig\r\n file: /varconfig/log/example1.log\r\n sidecarconfig:\r\n volume: varlogconfig\r\n file: /varconfig/log/example2.log\r\n```\r\n\r\nPod:\r\n```\r\napiVersion: v1\r\nkind: Pod\r\nmetadata:\r\n name: pod-with-annotations\r\n namespace: tailing-sidecar-system\r\n annotations:\r\n tailing-sidecar: sidecarconfig\r\n labels:\r\n app: pod-with-annotations\r\nspec:\r\n containers:\r\n - name: count\r\n image: busybox\r\n args:\r\n - /bin/sh\r\n - -c\r\n - >\r\n i=0;\r\n while true;\r\n do\r\n echo \"example0: $i $(date)\" >> /var/log/example0.log;\r\n echo \"example1: $i $(date)\" >> /var/log/example1.log;\r\n echo \"example2: $i $(date)\" >> /varconfig/log/example2.log;\r\n i=$((i+1));\r\n sleep 1;\r\n done\r\n volumeMounts:\r\n - name: varlog\r\n mountPath: /var/log\r\n - name: varlogconfig\r\n mountPath: /varconfig/log\r\n volumes:\r\n - name: varlog\r\n emptyDir: {}\r\n - name: varlogconfig\r\n emptyDir: {}\r\n```\r\n\r\nCreated Pod by operator:\r\n```\r\n$ kubectl describe pod pod-with-annotations -n tailing-sidecar-system \r\nName: pod-with-annotations\r\nNamespace: tailing-sidecar-system\r\nPriority: 0\r\nNode: sumologic-tailing-sidecar/10.0.2.15\r\nStart Time: Mon, 15 Mar 2021 14:32:31 +0000\r\nLabels: app=pod-with-annotations\r\nAnnotations: cni.projectcalico.org/podIP: 10.1.37.83/32\r\n cni.projectcalico.org/podIPs: 10.1.37.83/32\r\n tailing-sidecar: sidecarconfig\r\nStatus: Running\r\nIP: 10.1.37.83\r\nIPs:\r\n IP: 10.1.37.83\r\nContainers:\r\n count:\r\n Container ID: 
containerd://1b56d9f4198a8f0ab365a2cda8454598301e1422b9a8c102edbc14bba6ba512e\r\n Image: busybox\r\n Image ID: docker.io/library/busybox@sha256:ce2360d5189a033012fbad1635e037be86f23b65cfd676b436d0931af390a2ac\r\n Port: <none>\r\n Host Port: <none>\r\n Args:\r\n /bin/sh\r\n -c\r\n i=0; while true; do\r\n echo \"example0: $i $(date)\" >> /var/log/example0.log;\r\n echo \"example1: $i $(date)\" >> /var/log/example1.log;\r\n echo \"example2: $i $(date)\" >> /varconfig/log/example2.log;\r\n i=$((i+1));\r\n sleep 1;\r\n done\r\n \r\n State: Running\r\n Started: Mon, 15 Mar 2021 14:32:33 +0000\r\n Ready: True\r\n Restart Count: 0\r\n Environment: <none>\r\n Mounts:\r\n /var/log from varlog (rw)\r\n /var/run/secrets/kubernetes.io/serviceaccount from default-token-smcv9 (ro)\r\n /varconfig/log from varlogconfig (rw)\r\n tailing-sidecar0:\r\n Container ID: containerd://6ab75471db047be85bf86353a292ca51a2093fe654f1e6dd92aa6bcce7acad3f\r\n Image: localhost:32000/sumologic/tailing-sidecar:demo\r\n Image ID: localhost:32000/sumologic/tailing-sidecar@sha256:0ed3d789b7e9a23a61cba26d394af02505fc2d25a96fdd85f03cd4bf2b3ca53a\r\n Port: <none>\r\n Host Port: <none>\r\n State: Running\r\n Started: Mon, 15 Mar 2021 14:32:34 +0000\r\n Ready: True\r\n Restart Count: 0\r\n Environment:\r\n PATH_TO_TAIL: /varconfig/log/example2.log\r\n TAILING_SIDECAR: true\r\n Mounts:\r\n /tailing-sidecar/var from volume-sidecar0 (rw)\r\n /var/run/secrets/kubernetes.io/serviceaccount from default-token-smcv9 (ro)\r\n /varconfig/log from varlogconfig (rw)\r\nConditions:\r\n Type Status\r\n Initialized True \r\n Ready True \r\n ContainersReady True \r\n PodScheduled True \r\nVolumes:\r\n varlog:\r\n Type: EmptyDir (a temporary directory that shares a pod's lifetime)\r\n Medium: \r\n SizeLimit: <unset>\r\n varlogconfig:\r\n Type: EmptyDir (a temporary directory that shares a pod's lifetime)\r\n Medium: \r\n SizeLimit: <unset>\r\n default-token-smcv9:\r\n Type: Secret (a volume populated by a Secret)\r\n 
SecretName: default-token-smcv9\r\n Optional: false\r\n volume-sidecar0:\r\n Type: HostPath (bare host directory volume)\r\n Path: /var/log/tailing-sidecar-fluentbit/tailing-sidecar-system/pod-with-annotations/tailing-sidecar0\r\n HostPathType: DirectoryOrCreate\r\nQoS Class: BestEffort\r\nNode-Selectors: <none>\r\nTolerations: node.kubernetes.io/not-ready:NoExecute op=Exists for 300s\r\n node.kubernetes.io/unreachable:NoExecute op=Exists for 300s\r\nEvents:\r\n Type Reason Age From Message\r\n ---- ------ ---- ---- -------\r\n Normal Scheduled 7m15s default-scheduler Successfully assigned tailing-sidecar-system/pod-with-annotations to sumologic-tailing-sidecar\r\n Normal Pulling 7m14s kubelet Pulling image \"busybox\"\r\n Normal Pulled 7m13s kubelet Successfully pulled image \"busybox\" in 1.373490106s\r\n Normal Created 7m13s kubelet Created container count\r\n Normal Started 7m13s kubelet Started container count\r\n Normal Pulled 7m13s kubelet Container image \"localhost:32000/sumologic/tailing-sidecar:demo\" already present on machine\r\n Normal Created 7m13s kubelet Created container tailing-sidecar0\r\n Normal Started 7m12s kubelet Started container tailing-sidecar0\r\n```",
"title": "Document and/or add warning for sidecar configs with the same name",
"type": "issue"
}
] | 5,677 | false | false | 1 | 1 | false |
kitian616/jekyll-TeXt-theme | null | 712,540,611 | 299 | null | [
{
"action": "opened",
"author": "dopewind",
"comment_id": null,
"datetime": 1601534349000,
"masked_author": "username_0",
"text": "<!-- Prefer English -->\r\n\r\n## Description\r\n[Description of the feature]\r\n\r\nBasically what the title says, a toggle for switching between two preselected themes would be great\r\n\r\nAnd the pinned post helps us to keep things organized\r\n\r\nPlease work in this is you have some time\r\n\r\nCheers✌",
"title": "Feature request: Pinned posts and a theme toggle",
"type": "issue"
},
{
"action": "closed",
"author": "zwxi",
"comment_id": null,
"datetime": 1644081002000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 287 | false | false | 2 | 2 | false |
navikt/foreldrepengeoversikt | navikt | 630,578,788 | 212 | null | [
{
"action": "opened",
"author": "janolaveide",
"comment_id": null,
"datetime": 1591255589000,
"masked_author": "username_0",
"text": "Kommenter med <b>/promote 20200604092252-40a761f cluster</b>, hvor <b>cluster</b> er et gyldig clusternavn\n<table>\n<tr><th>Cluster</th></tr>\n<tr><td>dev-sbs</td>\n<tr><td>prod-sbs</tr>\n</table>",
"title": "Bygg av 20200604092252-40a761f",
"type": "issue"
},
{
"action": "created",
"author": "janolaveide",
"comment_id": 638658750,
"datetime": 1591255590000,
"masked_author": "username_0",
"text": "/promote 20200604092252-40a761f dev-sbs",
"title": null,
"type": "comment"
}
] | 231 | false | true | 1 | 2 | false |
Technus/TecTech | null | 795,047,821 | 50 | {
"number": 50,
"repo": "TecTech",
"user_login": "Technus"
} | [
{
"action": "opened",
"author": "impbk2002",
"comment_id": null,
"datetime": 1611749541000,
"masked_author": "username_0",
"text": "OpenComputers unlocalized name fixed",
"title": "Update GT_MetaTileEntity_Hatch_Rack.java",
"type": "issue"
},
{
"action": "created",
"author": "Dream-Master",
"comment_id": 768273639,
"datetime": 1611752889000,
"masked_author": "username_1",
"text": "@username_0 didn't this work before ?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "impbk2002",
"comment_id": 768297797,
"datetime": 1611755405000,
"masked_author": "username_0",
"text": "yep, never happened. cause OpenComputers unlocalized name is not OpenComputers:items:(metadata)",
"title": null,
"type": "comment"
}
] | 168 | false | false | 2 | 3 | true |
Azgaar/Fantasy-Map-Generator | null | 699,513,477 | 521 | null | [
{
"action": "opened",
"author": "MisterySteel",
"comment_id": null,
"datetime": 1599841934000,
"masked_author": "username_0",
"text": "<!-- PLEASE FILL OUT THE FOLLOWING INFORMATION -->\r\nI would like to thank you for creating a tool so useful and complete. With the issue fixed or not i don't intend to leave the username_2 Map Generator any time soon...\r\n\r\n### Description\r\n<!-- Describe the problem / suggestion in a few words -->\r\nWhen loading a saved map from Local Disk in a .map format the issue occurs as it loads a version where the hud and editing tools are glitched.\r\n\r\n\r\n### Steps to reproduce\r\n<!-- What should I do to reproduce the issue? -->\r\nI believe that having the file simply loading it in should do the trick.\r\n\r\n### Expected behaviour\r\n<!-- Tell me what do you expect to happen -->\r\nIn the next day I can continue the previous work.\r\n \r\n### Actual behaviour\r\n<!-- Tell me what actually happens -->\r\nI'm neither able to navigate nor edit the map.\r\n\r\n\r\n\r\n#### .map file\r\n<!-- Attach archived .map file (zip archive) so I can see what is the problem -->\r\n[Paulinais 2020-09-10-01-51.zip](https://github.com/username_2/Fantasy-Map-Generator/files/5210280/Paulinais.2020-09-10-01-51.zip)\r\n\r\n### Generator version\r\n<!-- Version is displayed in the page title -->\r\nUsing at v1.4\r\nI already tried the local file version and the web version both give me the same issue\r\n### Browser version\r\n\r\n<!-- Populate with browser name and its version -->\r\n Google Chrome 85.0.4183.102 64 bits",
"title": "Loading Error",
"type": "issue"
},
{
"action": "created",
"author": "evolvedexperiment",
"comment_id": 692002610,
"datetime": 1600084294000,
"masked_author": "username_1",
"text": "The map was missing custom labels, burg labels, burg icons, the scalebar and the debug sections in the SVG. Did you perhaps delete custom labels or edit the SVG directly in the browser? I can't think of anything that could cause this, so it would be great to know what happened.\r\n\r\nI've tried to correct it, but it's hard to be sure it's 100% correct. I had to re-create the state labels, burg labels, icons, and the scale-bar. The corrected map is attached, but you may want to check that everything is as you expect it to be.\r\n\r\n[Paulinais 2020-09-14-13-44.zip](https://github.com/username_2/Fantasy-Map-Generator/files/5218163/Paulinais.2020-09-14-13-44.zip)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MisterySteel",
"comment_id": 695331979,
"datetime": 1600534722000,
"masked_author": "username_0",
"text": "Everything that I checked seems to be in perfectly in place. You saved me so many hours of work.\r\nNo need to worry about the markers it won't be a problem, I had not used the military tools yet but I'm very excited to use such a interesting system.\r\n\r\nThanks for the support, attention and effort.\r\nI will be soon in contact about my experience as I'm getting more familiar with username_2 Fantasy Map generator.\r\n\r\nBest regards,\r\nMistery Steel, A Pleased user",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Azgaar",
"comment_id": 739242266,
"datetime": 1607169858000,
"masked_author": "username_2",
"text": "Tried to preproduce, but failed. I will close it as for now, let's see how it will behave",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Azgaar",
"comment_id": null,
"datetime": 1607169858000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 2,668 | false | false | 3 | 5 | true |
btcontract/lnwallet | null | 436,989,583 | 136 | null | [
{
"action": "opened",
"author": "mattnook",
"comment_id": null,
"datetime": 1556161531000,
"masked_author": "username_0",
"text": "Hi, I am trying to open a channel to my Bitcoin Lightning Wallet. I am wondering how to get my FULL node ID. Right now the \"Share Node ID\" only gives me the public key, which I can't find on the network from searching 1ML.com\r\n\r\nWhat is the full address of my LN Node ID?",
"title": "how to find full LN node ID?",
"type": "issue"
},
{
"action": "created",
"author": "btcontract",
"comment_id": 486533128,
"datetime": 1556172745000,
"masked_author": "username_1",
"text": "There is no such thing as full/short node id, the one from \"Share Node ID\" menu is as full as it gets.\r\n\r\nThe reason your phone node is not seen on 1ML is because mobile wallets do not advertize themselves and only open private channels, so they are not present on a public graph, by design.\r\n\r\nIt's still possible to open an incoming channel into mobile wallet by doing the following:\r\n\r\n1. Establish a connection to your desktop LN node from BLW by either selecting your node in \"Open channel\" page, or by scanning a node QR from wallet.\r\n\r\n2. Once connected you'll see a special channel opening page and a funding popup in BLW as it tries to open an outgoing channel, either dismiss it or do nothing, but do stay on that page to maintain a connection.\r\n\r\n3. While connected, issue a channel opening command from your desktop LN node using wallet ID obtained from menu, the trick is BLW should be connected when a command is issued. Also offered incoming channel must be private, BLW will ignore public offerings.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "btcontract",
"comment_id": null,
"datetime": 1557731423000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 1,286 | false | false | 2 | 3 | false |
home-assistant/home-assistant | home-assistant | 520,659,400 | 28,683 | null | [
{
"action": "opened",
"author": "Molodax",
"comment_id": null,
"datetime": 1573420904000,
"masked_author": "username_0",
"text": "<!-- READ THIS FIRST:\r\n- If you need additional help with this template please refer to https://www.home-assistant.io/help/reporting_issues/\r\n- Make sure you are running the latest version of Home Assistant before reporting an issue: https://github.com/home-assistant/home-assistant/releases\r\n- Frontend issues should be submitted to the home-assistant-polymer repository: https://github.com/home-assistant/home-assistant-polymer/issues\r\n- iOS issues should be submitted to the home-assistant-iOS repository: https://github.com/home-assistant/home-assistant-iOS/issues\r\n- Do not report issues for integrations if you are using a custom integration: files in <config-dir>/custom_components\r\n- This is for bugs only. Feature and enhancement requests should go in our community forum: https://community.home-assistant.io/c/feature-requests\r\n- Provide as many details as possible. Paste logs, configuration sample and code into the backticks. Do not delete any text from this template!\r\n-->\r\n\r\n**Home Assistant release with the issue:**\r\n<!--\r\n- Frontend -> Developer tools -> Info\r\n- Or use this command: hass --version\r\n-->\r\n0.101.x(?)\r\n\r\n**Last working Home Assistant release (if known):**\r\n0.100.0\r\n\r\n**Operating environment (Hass.io/Docker/Windows/etc.):**\r\n<!--\r\nPlease provide details about your environment.\r\n-->\r\nHass.io\r\n**Integration:**\r\n<!--\r\nPlease add the link to the documentation at https://www.home-assistant.io/integrations/ of the integration in question.\r\n-->\r\nhttps://www.home-assistant.io/integrations/logi_circle/\r\n\r\n**Description of problem:**\r\nI used the integration with Logi without problem but after one of the restarts, I don't have the integration anymore.\r\n\r\n\r\n**Problem-relevant `configuration.yaml` entries and (fill out even if it seems unimportant):**\r\n```yaml\r\n\r\n```\r\n\r\n**Traceback (if applicable):**\r\n```\r\n2019-11-10 21:31:54 ERROR (MainThread) [homeassistant.core] Error 
doing job: Task exception was never retrieved\r\nTraceback (most recent call last):\r\n File \"/usr/src/homeassistant/homeassistant/data_entry_flow.py\", line 73, in async_init\r\n return await self._async_handle_step(flow, flow.init_step, data)\r\n File \"/usr/src/homeassistant/homeassistant/data_entry_flow.py\", line 132, in _async_handle_step\r\n result: Dict = await getattr(flow, method)(user_input)\r\n File \"/usr/src/homeassistant/homeassistant/components/logi_circle/config_flow.py\", line 147, in async_step_code\r\n return await self._async_create_session(code)\r\n File \"/usr/src/homeassistant/homeassistant/components/logi_circle/config_flow.py\", line 166, in _async_create_session\r\n cache_file=self.hass.config.path(DEFAULT_CACHEDB),\r\n File \"/usr/local/lib/python3.7/site-packages/logi_circle/__init__.py\", line 40, in __init__\r\n logi_base=self)\r\n File \"/usr/local/lib/python3.7/site-packages/logi_circle/auth.py\", line 26, in __init__\r\n self.tokens = self._read_token()\r\n File \"/usr/local/lib/python3.7/site-packages/logi_circle/auth.py\", line 138, in _read_token\r\n data = pickle.load(open(filename, 'rb'))\r\nUnicodeDecodeError: 'utf-8' codec can't decode byte 0xc8 in position 336: invalid continuation byte\r\n```\r\n\r\n**Additional information:**",
"title": "Cannot integrate Logi Circle anymore",
"type": "issue"
},
{
"action": "created",
"author": "evanjd",
"comment_id": 552366197,
"datetime": 1573465095000,
"masked_author": "username_1",
"text": "Do you mind trying the following?\r\n\r\n1. Delete `.logi_cache.pickle` from your configuration folder.\r\n2. Remove the integration from Configuration -> Integration.\r\n3. Re-add the integration.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Molodax",
"comment_id": 562960246,
"datetime": 1575819085000,
"masked_author": "username_0",
"text": "Thanks, @username_1 !\r\nThe issue seemed to be resolved.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Molodax",
"comment_id": null,
"datetime": 1575819095000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 3,411 | false | true | 2 | 4 | true |
kingrpaul/percy-marks_the-plastic-age | null | 829,798,298 | 2 | null | [
{
"action": "opened",
"author": "robinwhittleton",
"comment_id": null,
"datetime": 1615530150000,
"masked_author": "username_0",
"text": "1. Replace `This novel` with `<i>The Plastic Age</i>`.\r\n2. Can we expand the first paragraph with another sentence or two? A long description should be “as detailed as possible without giving away plot spoilers” https://standardebooks.org/manual/1.4.0/single-page#9.6.2.1",
"title": "Revise the long description",
"type": "issue"
},
{
"action": "closed",
"author": "kingrpaul",
"comment_id": null,
"datetime": 1615931994000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "kingrpaul",
"comment_id": 800640905,
"datetime": 1615931994000,
"masked_author": "username_1",
"text": "Rewritten at https://github.com/standardebooks/percy-marks_the-plastic-age",
"title": null,
"type": "comment"
}
] | 345 | false | false | 2 | 3 | false |
google/web-stories-wp | google | 841,529,909 | 6,913 | null | [
{
"action": "opened",
"author": "barklund",
"comment_id": null,
"datetime": 1616725300000,
"masked_author": "username_0",
"text": "## Feature Description\r\n\r\nAdd relevant integration tests for canvas zoom.\r\n\r\n---\r\n\r\n_Do not alter or remove anything below. The following sections will be managed by moderators only._\r\n\r\n## Acceptance Criteria\r\n\r\n<!-- One or more bullet points for acceptance criteria. -->\r\n\r\n## Implementation Brief\r\n\r\n<!-- One or more bullet points for how to technically implement the feature. -->",
"title": "Add tests for canvas zoom",
"type": "issue"
},
{
"action": "created",
"author": "barklund",
"comment_id": 814933429,
"datetime": 1617803608000,
"masked_author": "username_0",
"text": "Closing as it doesn't have QA nor UAT",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "barklund",
"comment_id": null,
"datetime": 1617803609000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 420 | false | false | 1 | 3 | false |
simp/rubygem-simp-cli | simp | 571,574,515 | 143 | {
"number": 143,
"repo": "rubygem-simp-cli",
"user_login": "simp"
} | [
{
"action": "opened",
"author": "trevor-vaughan",
"comment_id": null,
"datetime": 1582741797000,
"masked_author": "username_0",
"text": "Needs:\n * https://github.com/simp/pupmod-simp-simplib/pull/214\n\nSIMP-7332 #close",
"title": "(SIMP-7332) Renamespace libkv to simpkv",
"type": "issue"
},
{
"action": "created",
"author": "lnemsick-simp",
"comment_id": 594021190,
"datetime": 1583250678000,
"masked_author": "username_1",
"text": "Unit tests are heavily mocked. Will push to a local gitlab for thorough testing.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "lnemsick-simp",
"comment_id": 594071184,
"datetime": 1583256543000,
"masked_author": "username_1",
"text": "Enough permutations succeeded on internal gitlab that I believe this is good to go",
"title": null,
"type": "comment"
}
] | 244 | false | false | 2 | 3 | false |
BlockchainCommons/did-method-onion | BlockchainCommons | 843,579,161 | 11 | null | [
{
"action": "opened",
"author": "clehner",
"comment_id": null,
"datetime": 1617037495000,
"masked_author": "username_0",
"text": "Some sections and examples are missing an `id` property. The `id` property would enable linking to them more easily.\r\nFor example, in the `did:web` specification, we can link to sections, or examples, like this:\r\nhttps://w3c-ccg.github.io/did-method-web/#read-resolve\r\nhttps://w3c-ccg.github.io/did-method-web/#example-3-creating-the-did",
"title": "Section/heading ids",
"type": "issue"
}
] | 337 | false | false | 1 | 1 | false |
sardana-org/sardana | sardana-org | 790,942,904 | 1,478 | {
"number": 1478,
"repo": "sardana",
"user_login": "sardana-org"
} | [
{
"action": "opened",
"author": "jordiandreu",
"comment_id": null,
"datetime": 1611226678000,
"masked_author": "username_0",
"text": "Implements a mechanism to register a custom-user generic data recorder. The mechanism enables the possibility to register one or more custom data recorders. The recorders get configured by setting the sardana environment variable DataRecorder with one or more recorder class names.\r\n\r\nDocumentation about the implementation and configuration of a custom data recorder still missing.",
"title": "[WIP] Generic data recorder",
"type": "issue"
},
{
"action": "created",
"author": "reszelaz",
"comment_id": 768365974,
"datetime": 1611761466000,
"masked_author": "username_1",
"text": "Thanks @username_0 and @username_2 for working on this PR!\r\nJust a comment on the documentation, there is https://github.com/sardana-org/sardana/pull/1457 which moves around some documentation related to the recorders. It would be nice to get that one integrated first so you work already on the reorganized document.\r\n\r\nAlso, there is the environment variables catalgue which would need to be updated with the new environment variable.\r\n\r\nRegarding the implementation, do you foresee to parametrize somehow the initialization of the recorder class? Like for example the file recorders they receive the path to the file?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "tiagocoutinho",
"comment_id": 769891140,
"datetime": 1611936010000,
"masked_author": "username_2",
"text": "Yes, the idea was that each recorder would be responsible to figure out which parameters it needs to configure itself.\r\nSee example [here](https://github.com/ALBA-Synchrotron/sardana-streams/blob/f9c917f14bf475896c64f65a9ddc09e14390af76/sardana_streams/recorder/recorder.py#L7)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "reszelaz",
"comment_id": 770041882,
"datetime": 1611953317000,
"masked_author": "username_1",
"text": "Perfect! Thanks for the clarification!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "reszelaz",
"comment_id": 785203399,
"datetime": 1614184416000,
"masked_author": "username_1",
"text": "When reviewing issues I just found again #1274, which I think would benefit from this PR. @aureocarneiro, @username_3 how were you configuring the KafkaRecorder mentioned in https://github.com/sardana-org/sardana/issues/1274#issuecomment-579141745?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "13bscsaamjad",
"comment_id": 785783471,
"datetime": 1614248269000,
"masked_author": "username_3",
"text": "we have found that if you put multiple ScanFile and ScanRecorder, sardana uses them all sequentially. the catch is to have same number of ScanFile as ScanRecorder. we only tested with two though. embedding the example config.\r\n```\r\nsenv ScanFile \"['test1', 'test2']\" \r\nsenv ScanRecorder \"['Scicatrecorder', 'H5Recorder']\"\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "teresanunez",
"comment_id": 785789688,
"datetime": 1614248884000,
"masked_author": "username_4",
"text": "Sorry if I have not understood what you mean, we use at DESY very very oft several recorders and the scanfile is the same\r\nfor all of them, only by the termination we know the format. For us it has to continue working like this.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "reszelaz",
"comment_id": 787893643,
"datetime": 1614599913000,
"masked_author": "username_1",
"text": "@username_4 there are no plans to change/remove activation of file recorders based on the file extension.\r\nThe `ScanRecorder` variable was added in the past to explicitelly activate a recorder regardless of the one that Sardana would assign based on the file extension. This PR is to propose a mechanism for activating data recorders which do not write to a file.\r\n\r\nThanks for sharing your configuration @username_3.\r\nIf I understands well the ScanFile value \"test1\" corresponding to the ScanRecorer value \"Scicatrecorder\" does not correspond to any real file - this recorder does not write to a file, right? So you used it as a placeholder so there is a match of elements in both lists?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "13bscsaamjad",
"comment_id": 787952352,
"datetime": 1614605652000,
"masked_author": "username_3",
"text": "@username_1 yes exactly.",
"title": null,
"type": "comment"
}
] | 2,838 | false | false | 5 | 9 | true |
ManimCommunity/manim | ManimCommunity | 742,816,852 | 705 | null | [
{
"action": "opened",
"author": "fjp",
"comment_id": null,
"datetime": 1605305033000,
"masked_author": "username_0",
"text": "**Describe the bug**\r\n<!-- A clear and concise description of what the bug is. -->\r\n\r\nOn Ubuntu 20.04 (python 3.8) I followed the documentation on the required dependencies for Ubuntu/Linux and created a new virtual environment with `python -m venv venv`. When I try to install `manimce` with `pip install manimce` I get the following error:\r\n\r\n```console\r\npip install manimce \r\nCollecting manimce\r\n Using cached manimce-0.1.0-py3-none-any.whl (233 kB)\r\nCollecting tqdm\r\n Using cached tqdm-4.51.0-py2.py3-none-any.whl (70 kB)\r\nCollecting progressbar\r\n Using cached progressbar-2.5.tar.gz (10 kB)\r\nCollecting watchdog\r\n Using cached watchdog-0.10.3.tar.gz (94 kB)\r\nCollecting pydub\r\n Using cached pydub-0.24.1-py2.py3-none-any.whl (30 kB)\r\nCollecting Pillow\r\n Using cached Pillow-8.0.1-cp38-cp38-manylinux1_x86_64.whl (2.2 MB)\r\nCollecting numpy\r\n Using cached numpy-1.19.4-cp38-cp38-manylinux2010_x86_64.whl (14.5 MB)\r\nCollecting colour\r\n Using cached colour-0.1.5-py2.py3-none-any.whl (23 kB)\r\nCollecting grpcio-tools\r\n Using cached grpcio_tools-1.33.2-cp38-cp38-manylinux2014_x86_64.whl (2.5 MB)\r\nCollecting rich>=4.2.1\r\n Using cached rich-9.2.0-py3-none-any.whl (164 kB)\r\nCollecting scipy\r\n Using cached scipy-1.5.4-cp38-cp38-manylinux1_x86_64.whl (25.8 MB)\r\nCollecting grpcio\r\n Using cached grpcio-1.33.2-cp38-cp38-manylinux2014_x86_64.whl (3.8 MB)\r\nProcessing /home/username_0/.cache/pip/wheels/7c/6f/b3/3181c06d04ed8f428f86797384f5b4c9932692b9526b6aba21/pycairo-1.20.0-cp38-cp38-linux_x86_64.whl\r\nCollecting pangocairocffi<0.4.0,>=0.3.0\r\n Using cached pangocairocffi-0.3.2.tar.gz (21 kB)\r\nCollecting pangocffi<0.7.0,>=0.6.0\r\n Using cached pangocffi-0.6.0.tar.gz (37 kB)\r\nCollecting pygments\r\n Using cached Pygments-2.7.2-py3-none-any.whl (948 kB)\r\nCollecting cairocffi<2.0.0,>=1.1.0\r\n Using cached cairocffi-1.2.0.tar.gz (70 kB)\r\nCollecting pathtools>=0.1.1\r\n Using cached pathtools-0.1.2.tar.gz (11 
kB)\r\nCollecting protobuf<4.0dev,>=3.5.0.post1\r\n Using cached protobuf-3.13.0-cp38-cp38-manylinux1_x86_64.whl (1.3 MB)\r\nCollecting colorama<0.5.0,>=0.4.0\r\n Using cached colorama-0.4.4-py2.py3-none-any.whl (16 kB)\r\nCollecting typing-extensions<4.0.0,>=3.7.4\r\n Using cached typing_extensions-3.7.4.3-py3-none-any.whl (22 kB)\r\nCollecting commonmark<0.10.0,>=0.9.0\r\n Using cached commonmark-0.9.1-py2.py3-none-any.whl (51 kB)\r\nCollecting six>=1.5.2\r\n Using cached six-1.15.0-py2.py3-none-any.whl (10 kB)\r\nCollecting cffi>=1.1.0\r\n Using cached cffi-1.14.3-cp38-cp38-manylinux1_x86_64.whl (410 kB)\r\nRequirement already satisfied: setuptools in ./venv/lib/python3.8/site-packages (from protobuf<4.0dev,>=3.5.0.post1->grpcio-tools->manimce) (44.0.0)\r\nCollecting pycparser\r\n Using cached pycparser-2.20-py2.py3-none-any.whl (112 kB)\r\nBuilding wheels for collected packages: progressbar, watchdog, pangocairocffi, pangocffi, cairocffi, pathtools\r\n Building wheel for progressbar (setup.py) ... 
error\r\n ERROR: Command errored out with exit status 1:\r\n command: /home/username_0/git/ros_ws/src/diffbot/docs/plots/venv/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '\"'\"'/tmp/pip-install-qmht0nm3/progressbar/setup.py'\"'\"'; __file__='\"'\"'/tmp/pip-install-qmht0nm3/progressbar/setup.py'\"'\"';f=getattr(tokenize, '\"'\"'open'\"'\"', open)(__file__);code=f.read().replace('\"'\"'\\r\\n'\"'\"', '\"'\"'\\n'\"'\"');f.close();exec(compile(code, __file__, '\"'\"'exec'\"'\"'))' bdist_wheel -d /tmp/pip-wheel-jd74wypd\r\n cwd: /tmp/pip-install-qmht0nm3/progressbar/\r\n Complete output (6 lines):\r\n usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]\r\n or: setup.py --help [cmd1 cmd2 ...]\r\n or: setup.py --help-commands\r\n or: setup.py cmd --help\r\n \r\n error: invalid command 'bdist_wheel'\r\n ----------------------------------------\r\n ERROR: Failed building wheel for progressbar\r\n Running setup.py clean for progressbar\r\n Building wheel for watchdog (setup.py) ... 
error\r\n ERROR: Command errored out with exit status 1:\r\n command: /home/username_0/git/ros_ws/src/diffbot/docs/plots/venv/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '\"'\"'/tmp/pip-install-qmht0nm3/watchdog/setup.py'\"'\"'; __file__='\"'\"'/tmp/pip-install-qmht0nm3/watchdog/setup.py'\"'\"';f=getattr(tokenize, '\"'\"'open'\"'\"', open)(__file__);code=f.read().replace('\"'\"'\\r\\n'\"'\"', '\"'\"'\\n'\"'\"');f.close();exec(compile(code, __file__, '\"'\"'exec'\"'\"'))' bdist_wheel -d /tmp/pip-wheel-z50sa6df\r\n cwd: /tmp/pip-install-qmht0nm3/watchdog/\r\n Complete output (6 lines):\r\n usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]\r\n or: setup.py --help [cmd1 cmd2 ...]\r\n or: setup.py --help-commands\r\n or: setup.py cmd --help\r\n \r\n error: invalid command 'bdist_wheel'\r\n ----------------------------------------\r\n ERROR: Failed building wheel for watchdog\r\n Running setup.py clean for watchdog\r\n...\r\n```\r\n\r\nTo solve this, I had to install `wheel` with `pip install wheel` and before that I also updated `pip` using `pip install -U pip`.",
"title": " [BUG-General] ",
"type": "issue"
},
{
"action": "created",
"author": "behackl",
"comment_id": 727066147,
"datetime": 1605306382000,
"masked_author": "username_1",
"text": "Thank you for your report, as well as the corresponding solution!\r\n\r\nJust for good measure: @username_2, do you think we can just add `wheel` as a dependency?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "naveen521kk",
"comment_id": null,
"datetime": 1605330699000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "naveen521kk",
"comment_id": 727146963,
"datetime": 1605330699000,
"masked_author": "username_2",
"text": "That won't be woking.\r\nIf I remember correctly, creating a venv will actually install `wheel` and `setuptools` along with `pip`. I don't know why it doesn't here. Also, if `wheel` is not available it should use setup.py without any error. I don't know why it doesn't here. Maybe this should be a venv bug?",
"title": null,
"type": "comment"
},
{
"action": "reopened",
"author": "behackl",
"comment_id": null,
"datetime": 1605343619000,
"masked_author": "username_1",
"text": "**Describe the bug**\r\n<!-- A clear and concise description of what the bug is. -->\r\n\r\nOn Ubuntu 20.04 (python 3.8) I followed the documentation on the required dependencies for Ubuntu/Linux and created a new virtual environment with `python -m venv venv`. When I try to install `manimce` with `pip install manimce` I get the following error:\r\n\r\n```console\r\npip install manimce \r\nCollecting manimce\r\n Using cached manimce-0.1.0-py3-none-any.whl (233 kB)\r\nCollecting tqdm\r\n Using cached tqdm-4.51.0-py2.py3-none-any.whl (70 kB)\r\nCollecting progressbar\r\n Using cached progressbar-2.5.tar.gz (10 kB)\r\nCollecting watchdog\r\n Using cached watchdog-0.10.3.tar.gz (94 kB)\r\nCollecting pydub\r\n Using cached pydub-0.24.1-py2.py3-none-any.whl (30 kB)\r\nCollecting Pillow\r\n Using cached Pillow-8.0.1-cp38-cp38-manylinux1_x86_64.whl (2.2 MB)\r\nCollecting numpy\r\n Using cached numpy-1.19.4-cp38-cp38-manylinux2010_x86_64.whl (14.5 MB)\r\nCollecting colour\r\n Using cached colour-0.1.5-py2.py3-none-any.whl (23 kB)\r\nCollecting grpcio-tools\r\n Using cached grpcio_tools-1.33.2-cp38-cp38-manylinux2014_x86_64.whl (2.5 MB)\r\nCollecting rich>=4.2.1\r\n Using cached rich-9.2.0-py3-none-any.whl (164 kB)\r\nCollecting scipy\r\n Using cached scipy-1.5.4-cp38-cp38-manylinux1_x86_64.whl (25.8 MB)\r\nCollecting grpcio\r\n Using cached grpcio-1.33.2-cp38-cp38-manylinux2014_x86_64.whl (3.8 MB)\r\nProcessing /home/username_0/.cache/pip/wheels/7c/6f/b3/3181c06d04ed8f428f86797384f5b4c9932692b9526b6aba21/pycairo-1.20.0-cp38-cp38-linux_x86_64.whl\r\nCollecting pangocairocffi<0.4.0,>=0.3.0\r\n Using cached pangocairocffi-0.3.2.tar.gz (21 kB)\r\nCollecting pangocffi<0.7.0,>=0.6.0\r\n Using cached pangocffi-0.6.0.tar.gz (37 kB)\r\nCollecting pygments\r\n Using cached Pygments-2.7.2-py3-none-any.whl (948 kB)\r\nCollecting cairocffi<2.0.0,>=1.1.0\r\n Using cached cairocffi-1.2.0.tar.gz (70 kB)\r\nCollecting pathtools>=0.1.1\r\n Using cached pathtools-0.1.2.tar.gz (11 
kB)\r\nCollecting protobuf<4.0dev,>=3.5.0.post1\r\n Using cached protobuf-3.13.0-cp38-cp38-manylinux1_x86_64.whl (1.3 MB)\r\nCollecting colorama<0.5.0,>=0.4.0\r\n Using cached colorama-0.4.4-py2.py3-none-any.whl (16 kB)\r\nCollecting typing-extensions<4.0.0,>=3.7.4\r\n Using cached typing_extensions-3.7.4.3-py3-none-any.whl (22 kB)\r\nCollecting commonmark<0.10.0,>=0.9.0\r\n Using cached commonmark-0.9.1-py2.py3-none-any.whl (51 kB)\r\nCollecting six>=1.5.2\r\n Using cached six-1.15.0-py2.py3-none-any.whl (10 kB)\r\nCollecting cffi>=1.1.0\r\n Using cached cffi-1.14.3-cp38-cp38-manylinux1_x86_64.whl (410 kB)\r\nRequirement already satisfied: setuptools in ./venv/lib/python3.8/site-packages (from protobuf<4.0dev,>=3.5.0.post1->grpcio-tools->manimce) (44.0.0)\r\nCollecting pycparser\r\n Using cached pycparser-2.20-py2.py3-none-any.whl (112 kB)\r\nBuilding wheels for collected packages: progressbar, watchdog, pangocairocffi, pangocffi, cairocffi, pathtools\r\n Building wheel for progressbar (setup.py) ... 
error\r\n ERROR: Command errored out with exit status 1:\r\n command: /home/username_0/git/ros_ws/src/diffbot/docs/plots/venv/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '\"'\"'/tmp/pip-install-qmht0nm3/progressbar/setup.py'\"'\"'; __file__='\"'\"'/tmp/pip-install-qmht0nm3/progressbar/setup.py'\"'\"';f=getattr(tokenize, '\"'\"'open'\"'\"', open)(__file__);code=f.read().replace('\"'\"'\\r\\n'\"'\"', '\"'\"'\\n'\"'\"');f.close();exec(compile(code, __file__, '\"'\"'exec'\"'\"'))' bdist_wheel -d /tmp/pip-wheel-jd74wypd\r\n cwd: /tmp/pip-install-qmht0nm3/progressbar/\r\n Complete output (6 lines):\r\n usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]\r\n or: setup.py --help [cmd1 cmd2 ...]\r\n or: setup.py --help-commands\r\n or: setup.py cmd --help\r\n \r\n error: invalid command 'bdist_wheel'\r\n ----------------------------------------\r\n ERROR: Failed building wheel for progressbar\r\n Running setup.py clean for progressbar\r\n Building wheel for watchdog (setup.py) ... 
error\r\n ERROR: Command errored out with exit status 1:\r\n command: /home/username_0/git/ros_ws/src/diffbot/docs/plots/venv/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '\"'\"'/tmp/pip-install-qmht0nm3/watchdog/setup.py'\"'\"'; __file__='\"'\"'/tmp/pip-install-qmht0nm3/watchdog/setup.py'\"'\"';f=getattr(tokenize, '\"'\"'open'\"'\"', open)(__file__);code=f.read().replace('\"'\"'\\r\\n'\"'\"', '\"'\"'\\n'\"'\"');f.close();exec(compile(code, __file__, '\"'\"'exec'\"'\"'))' bdist_wheel -d /tmp/pip-wheel-z50sa6df\r\n cwd: /tmp/pip-install-qmht0nm3/watchdog/\r\n Complete output (6 lines):\r\n usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...]\r\n or: setup.py --help [cmd1 cmd2 ...]\r\n or: setup.py --help-commands\r\n or: setup.py cmd --help\r\n \r\n error: invalid command 'bdist_wheel'\r\n ----------------------------------------\r\n ERROR: Failed building wheel for watchdog\r\n Running setup.py clean for watchdog\r\n...\r\n```\r\n\r\nTo solve this, I had to install `wheel` with `pip install wheel` and before that I also updated `pip` using `pip install -U pip`.",
"title": "[BUG-General] error: invalid command 'bdist_wheel' (with solution)",
"type": "issue"
},
{
"action": "created",
"author": "behackl",
"comment_id": 727170316,
"datetime": 1605343993000,
"masked_author": "username_1",
"text": "@username_0 from the log you have posted, the usual behavior (if the wheel package is not available -- which seems to be a common problem with venv according to a quick google search -- then setup.py is used to build the package) can be observed, and actually the build did not fail within the lines you have posted.\n\nDid you abort the build at some point, or was there another error down the line that actually caused the build to fail?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fjp",
"comment_id": 727175594,
"datetime": 1605346982000,
"masked_author": "username_0",
"text": "➜ plots git:(master) ✗ which python\r\n/usr/bin/python\r\n```\r\n\r\nAnother note here: I have [ROS Noetic](http://wiki.ros.org/noetic) installed in `/opt/ros/noetic` which seems to put some ROS packages in a fresh venv. However there is no `wheel`:\r\n\r\n<details>\r\n<summary>pip list</summary>\r\n```console\r\n➜ manimce python -m venv venv\r\n➜ manimce source venv/bin/activate\r\n(venv) ➜ manimce pip list\r\nPackage Version\r\n----------------------------- -------\r\nactionlib 1.13.2 \r\nangles 1.9.13 \r\nbase-local-planner 1.17.1 \r\nbondpy 1.8.6 \r\ncamera-calibration 1.15.2 \r\ncamera-calibration-parsers 1.12.0 \r\ncatkin 0.8.8 \r\ncontroller-manager 0.19.2 \r\ncontroller-manager-msgs 0.19.2 \r\ncv-bridge 1.15.0 \r\ndiagnostic-analysis 1.10.2 \r\ndiagnostic-common-diagnostics 1.10.2 \r\ndiagnostic-updater 1.10.2 \r\ndynamic-reconfigure 1.7.1 \r\ngazebo-plugins 2.9.1 \r\ngazebo-ros 2.9.1 \r\ngencpp 0.6.5 \r\ngeneus 3.0.0 \r\ngenlisp 0.4.18 \r\ngenmsg 0.5.16 \r\ngennodejs 2.0.2 \r\ngenpy 0.6.14 \r\nimage-geometry 1.15.0 \r\ninteractive-markers 1.12.0 \r\njoint-state-publisher 1.15.0 \r\njoint-state-publisher-gui 1.15.0 \r\nlaser-geometry 1.6.5 \r\nmessage-filters 1.15.8 \r\nmir-driver 1.1.0 \r\npip 20.0.2 \r\npkg-resources 0.0.0 \r\npython-qt-binding 0.4.3 \r\nqt-dotgraph 0.4.2 \r\nqt-gui 0.4.2 \r\nqt-gui-cpp 0.4.2 \r\nqt-gui-py-common 0.4.2 \r\nresource-retriever 1.12.6 \r\nrosbag 1.15.8 \r\nrosboost-cfg 1.15.7 \r\nrosclean 1.15.7 \r\nroscreate 1.15.7 \r\nrosdoc-lite 0.2.10 \r\nrosgraph 1.15.8 \r\nroslaunch 1.15.8 \r\nroslib 1.15.7 \r\nroslint 0.12.0 \r\nroslz4 1.15.8 \r\nrosmake 1.15.7 \r\nrosmaster 1.15.8 \r\nrosmsg 1.15.8 \r\nrosnode 1.15.8 \r\nrosparam 1.15.8 \r\nrospy 1.15.8 \r\nrospy-message-converter 0.5.3 \r\nrosserial-client 0.9.1 \r\nrosserial-python 0.9.1 \r\nrosservice 1.15.8 \r\nrostest 1.15.8 \r\nrostopic 1.15.8 \r\nrosunit 1.15.7 \r\nroswtf 1.15.8 \r\nrqt-action 0.4.9 \r\nrqt-bag 0.4.15 \r\nrqt-bag-plugins 0.4.15 \r\nrqt-console 0.4.11 
\r\nrqt-controller-manager 0.19.2 \r\nrqt-dep 0.4.10 \r\nrqt-graph 0.4.14 \r\nrqt-gui 0.5.2 \r\nrqt-gui-py 0.5.2 \r\nrqt-image-view 0.4.16 \r\nrqt-launch 0.4.8 \r\nrqt-logger-level 0.4.11 \r\nrqt-moveit 0.5.9 \r\nrqt-msg 0.4.9 \r\nrqt-nav-view 0.5.7 \r\nrqt-plot 0.4.12 \r\nrqt-pose-view 0.5.10 \r\nrqt-publisher 0.4.9 \r\nrqt-py-common 0.5.2 \r\nrqt-py-console 0.4.9 \r\nrqt-reconfigure 0.5.3 \r\nrqt-robot-dashboard 0.5.8 \r\nrqt-robot-monitor 0.5.13 \r\nrqt-robot-steering 0.5.11 \r\nrqt-runtime-monitor 0.5.8 \r\nrqt-rviz 0.6.1 \r\nrqt-service-caller 0.4.9 \r\nrqt-shell 0.4.10 \r\nrqt-srv 0.4.8 \r\nrqt-tf-tree 0.6.2 \r\nrqt-top 0.4.9 \r\nrqt-topic 0.4.12 \r\nrqt-web 0.4.9 \r\nrviz 1.14.1 \r\nsensor-msgs 1.13.0 \r\nsetuptools 44.0.0 \r\nsmach 2.5.0 \r\nsmach-ros 2.5.0 \r\nsmclib 1.8.6 \r\ntf 1.13.2 \r\ntf-conversions 1.13.2 \r\ntf2-geometry-msgs 0.7.5 \r\ntf2-kdl 0.7.5 \r\ntf2-py 0.7.5 \r\ntf2-ros 0.7.5 \r\ntopic-tools 1.15.8 \r\nxacro 1.14.4\r\n``` \r\n<details>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "behackl",
"comment_id": 727181813,
"datetime": 1605349040000,
"masked_author": "username_1",
"text": "Thank you for the more detailed information! I agree that the output of pip is a little frightening, but judging from\r\n\r\n```\r\nSuccessfully installed [...] manimce-0.1.0 [...]\r\n```\r\n\r\nI'd say that the process actually did not fail and that manim was installed successfully. Everything else here concerns how `venv` and/or `pip` work, and is outside of our scope -- which is why I am closing this again. Let us know if you run into further issues!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "behackl",
"comment_id": null,
"datetime": 1605349064000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "naveen521kk",
"comment_id": 727182328,
"datetime": 1605349212000,
"masked_author": "username_2",
"text": "What was your pip version?\r\nMaybe related to https://github.com/pypa/pip/issues/8178",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fjp",
"comment_id": 727200973,
"datetime": 1605357099000,
"masked_author": "username_0",
"text": "Inside a new venv it is:\r\n\r\n```console\r\n$ pip list | grep pip\r\npip 20.0.2\r\n```\r\n\r\nThis version seems to be not up to date and the main problem. When I create a new venv `nvenv` and update pip, then install `manimce` everyting works without \"errors\":\r\n\r\n```console\r\n➜ manimce python -m venv nvenv \r\n➜ manimce source nvenv/bin/activate\r\n(nvenv) ➜ manimce pip install -U pip\r\nCollecting pip\r\n Using cached pip-20.2.4-py2.py3-none-any.whl (1.5 MB)\r\nInstalling collected packages: pip\r\n Attempting uninstall: pip\r\n Found existing installation: pip 20.0.2\r\n Uninstalling pip-20.0.2:\r\n Successfully uninstalled pip-20.0.2\r\nSuccessfully installed pip-20.2.4\r\n(nvenv) ➜ manimce pip install manimce\r\nCollecting manimce\r\n Using cached manimce-0.1.0-py3-none-any.whl (233 kB)\r\nCollecting pangocffi<0.7.0,>=0.6.0\r\n Using cached pangocffi-0.6.0.tar.gz (37 kB)\r\nCollecting progressbar\r\n Using cached progressbar-2.5.tar.gz (10 kB)\r\nCollecting rich>=4.2.1\r\n Using cached rich-9.2.0-py3-none-any.whl (164 kB)\r\nCollecting watchdog\r\n Using cached watchdog-0.10.3.tar.gz (94 kB)\r\nCollecting pygments\r\n Using cached Pygments-2.7.2-py3-none-any.whl (948 kB)\r\nCollecting cairocffi<2.0.0,>=1.1.0\r\n Using cached cairocffi-1.2.0.tar.gz (70 kB)\r\nCollecting colour\r\n Using cached colour-0.1.5-py2.py3-none-any.whl (23 kB)\r\nCollecting grpcio\r\n Using cached grpcio-1.33.2-cp38-cp38-manylinux2014_x86_64.whl (3.8 MB)\r\nCollecting tqdm\r\n Using cached tqdm-4.51.0-py2.py3-none-any.whl (70 kB)\r\nCollecting scipy\r\n Using cached scipy-1.5.4-cp38-cp38-manylinux1_x86_64.whl (25.8 MB)\r\nCollecting grpcio-tools\r\n Using cached grpcio_tools-1.33.2-cp38-cp38-manylinux2014_x86_64.whl (2.5 MB)\r\nCollecting Pillow\r\n Using cached Pillow-8.0.1-cp38-cp38-manylinux1_x86_64.whl (2.2 MB)\r\nCollecting pangocairocffi<0.4.0,>=0.3.0\r\n Using cached pangocairocffi-0.3.2.tar.gz (21 kB)\r\nCollecting pydub\r\n Using cached 
pydub-0.24.1-py2.py3-none-any.whl (30 kB)\r\nProcessing /home/username_0/.cache/pip/wheels/7c/6f/b3/3181c06d04ed8f428f86797384f5b4c9932692b9526b6aba21/pycairo-1.20.0-cp38-cp38-linux_x86_64.whl\r\nCollecting numpy\r\n Using cached numpy-1.19.4-cp38-cp38-manylinux2010_x86_64.whl (14.5 MB)\r\nCollecting cffi>=1.1.0\r\n Using cached cffi-1.14.3-cp38-cp38-manylinux1_x86_64.whl (410 kB)\r\nCollecting colorama<0.5.0,>=0.4.0\r\n Using cached colorama-0.4.4-py2.py3-none-any.whl (16 kB)\r\nCollecting commonmark<0.10.0,>=0.9.0\r\n Using cached commonmark-0.9.1-py2.py3-none-any.whl (51 kB)\r\nCollecting typing-extensions<4.0.0,>=3.7.4\r\n Using cached typing_extensions-3.7.4.3-py3-none-any.whl (22 kB)\r\nCollecting pathtools>=0.1.1\r\n Using cached pathtools-0.1.2.tar.gz (11 kB)\r\nCollecting six>=1.5.2\r\n Using cached six-1.15.0-py2.py3-none-any.whl (10 kB)\r\nCollecting protobuf<4.0dev,>=3.5.0.post1\r\n Using cached protobuf-3.14.0-cp38-cp38-manylinux1_x86_64.whl (1.0 MB)\r\nCollecting pycparser\r\n Using cached pycparser-2.20-py2.py3-none-any.whl (112 kB)\r\nUsing legacy 'setup.py install' for pangocffi, since package 'wheel' is not installed.\r\nUsing legacy 'setup.py install' for progressbar, since package 'wheel' is not installed.\r\nUsing legacy 'setup.py install' for watchdog, since package 'wheel' is not installed.\r\nUsing legacy 'setup.py install' for cairocffi, since package 'wheel' is not installed.\r\nUsing legacy 'setup.py install' for pangocairocffi, since package 'wheel' is not installed.\r\nUsing legacy 'setup.py install' for pathtools, since package 'wheel' is not installed.\r\nInstalling collected packages: pycparser, cffi, pangocffi, progressbar, colorama, commonmark, pygments, typing-extensions, rich, pathtools, watchdog, cairocffi, colour, six, grpcio, tqdm, numpy, scipy, protobuf, grpcio-tools, Pillow, pangocairocffi, pydub, pycairo, manimce\r\n Running setup.py install for pangocffi ... done\r\n Running setup.py install for progressbar ... 
done\r\n Running setup.py install for pathtools ... done\r\n Running setup.py install for watchdog ... done\r\n Running setup.py install for cairocffi ... done\r\n Running setup.py install for pangocairocffi ... done\r\nSuccessfully installed Pillow-8.0.1 cairocffi-1.2.0 cffi-1.14.3 colorama-0.4.4 colour-0.1.5 commonmark-0.9.1 grpcio-1.33.2 grpcio-tools-1.33.2 manimce-0.1.0 numpy-1.19.4 pangocairocffi-0.3.2 pangocffi-0.6.0 pathtools-0.1.2 progressbar-2.5 protobuf-3.14.0 pycairo-1.20.0 pycparser-2.20 pydub-0.24.1 pygments-2.7.2 rich-9.2.0 scipy-1.5.4 six-1.15.0 tqdm-4.51.0 typing-extensions-3.7.4.3 watchdog-0.10.3\r\n```\r\n\r\nSo wheel doesn't seem to be required as it is also not installed after the update.\r\n\r\n```console\r\n(nvenv) ➜ manimce pip list | grep wheel\r\n(nvenv) ➜ manimce\r\n```",
"title": null,
"type": "comment"
}
] | 20,737 | false | false | 3 | 11 | true |
alibaba/ice | alibaba | 777,929,061 | 3,989 | null | [
{
"action": "opened",
"author": "rootsli",
"comment_id": null,
"datetime": 1609749790000,
"masked_author": "username_0",
"text": "ice.js 1.12.0\r\ninfo Building the project for offline use; network resources are downloaded automatically, please be patient \r\ninfo Using Fusion component theme package: @alifd/theme-design-pro\r\ninfo Custom Fusion component theme variables: { nextPrefix: 'next-icestark-' }\r\n 95% emitting(node:10088) UnhandledPromiseRejectionWarning: CssSyntaxError: /Users/lichbin/Desktop/bss-project/web-gpx-ice-framework/pdfjs2/UiWeb/css/bootstrap-3.3.7-dist/css/bootstrap.min.css:5558:1: At-rule without name\r\n(node:10088) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 2)\r\n(node:10088) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.\r\nlichbin@roots web-gpx-ice-framework % \r\n\r\n`",
"title": "When building with the ice framework, CSS in the public folder is also compiled, causing errors",
"type": "issue"
},
{
"action": "created",
"author": "rootsli",
"comment_id": 753844621,
"datetime": 1609750171000,
"masked_author": "username_0",
"text": "The error's source location is as follows:\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rootsli",
"comment_id": 753852013,
"datetime": 1609751084000,
"masked_author": "username_0",
"text": "The location of the offending file in the project:\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ClarkXia",
"comment_id": 753853151,
"datetime": 1609751227000,
"masked_author": "username_1",
"text": "Did you import the css file? The proper way is to add it in index.html via a link tag",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rootsli",
"comment_id": 753855035,
"datetime": 1609751454000,
"masked_author": "username_0",
"text": "This is just static content; nothing in src references it. It is effectively served like a CDN",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rootsli",
"comment_id": 753855358,
"datetime": 1609751491000,
"masked_author": "username_0",
"text": "The build should run compilation first and only copy files at the end, right?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rootsli",
"comment_id": 753855683,
"datetime": 1609751528000,
"masked_author": "username_0",
"text": "build.json has the configuration: \"compileDependencies\": [\"\"],\r\nso in principle the public content should not be compiled",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ClarkXia",
"comment_id": 754470200,
"datetime": 1609833295000,
"masked_author": "username_1",
"text": "Please provide a demo that reproduces the problem; content not imported by the source code is not compiled",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ClarkXia",
"comment_id": null,
"datetime": 1610936362000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "ClarkXia",
"comment_id": 761936510,
"datetime": 1610936362000,
"masked_author": "username_1",
"text": "Files under the public directory that are not referenced by source code are not compiled; if you can confirm the problem, provide a demo and reopen the issue",
"title": null,
"type": "comment"
}
] | 1,573 | false | false | 2 | 10 | false |
MicrosoftDocs/dataexplorer-docs | MicrosoftDocs | 758,062,624 | 490 | {
"number": 490,
"repo": "dataexplorer-docs",
"user_login": "MicrosoftDocs"
} | [
{
"action": "opened",
"author": "cmcclister",
"comment_id": null,
"datetime": 1607300488000,
"masked_author": "username_0",
"text": "The pull request is created from master637428972840879587 to master to fix git push error for protected CLA branch",
"title": "Repo sync for protected CLA branch",
"type": "issue"
},
{
"action": "created",
"author": "PRMerger10",
"comment_id": 739592889,
"datetime": 1607300509000,
"masked_author": "username_1",
"text": "@username_0 : Thanks for your contribution! The author(s) have been notified to review your proposed change.",
"title": null,
"type": "comment"
}
] | 222 | false | false | 2 | 2 | true |
DFE-Digital/get-teacher-training-adviser-service | DFE-Digital | 831,646,284 | 573 | {
"number": 573,
"repo": "get-teacher-training-adviser-service",
"user_login": "DFE-Digital"
} | [
{
"action": "opened",
"author": "ethax-ross",
"comment_id": null,
"datetime": 1615802704000,
"masked_author": "username_0",
"text": "### Trello card\r\n\r\n[Trello-1412](https://trello.com/c/65tZmUeb/1412-add-correlation-ids-to-rails-apps-and-api)\r\n\r\n### Context\r\n\r\nWe want to be able to track requests from the app right through to the API. In order to do this we need to set a correlation id header in the requests going to the API. Rails has a request id we can tap into, but getting it into the API client for all requests will be tricky/verbose.\r\n\r\nTo make life easier this commit exposes the Rails request id to the current thread, so that we can have custom Faraday middleware read the id and set a header in the API client. Whilst having thread globals isn't ideal, we are using `ActiveSupport::CurrentAttributes` to ensure that they get cleaned up after each request and I think its a relatively sensible/pragmatic use case here.\r\n\r\n### Changes proposed in this pull request\r\n\r\n- Expose request uuid to current thread\r\n\r\n### Guidance to review",
"title": "Expose request uuid to current thread",
"type": "issue"
}
] | 915 | false | true | 1 | 1 | false |
dotnet/orleans | dotnet | 767,322,829 | 6,857 | null | [
{
"action": "opened",
"author": "lpyvvvvvv",
"comment_id": null,
"datetime": 1608021214000,
"masked_author": "username_0",
"text": "there are two silos, siloA and siloB. \r\ngrainA on siloA.\r\nthen shutdown siloA, sometimes grainA OnActivate on siloB before OnDeactivate on slioA.\r\nIs there any way to solve this issue?",
"title": "problem when shutdown silo",
"type": "issue"
},
{
"action": "created",
"author": "Cloud33",
"comment_id": 745747944,
"datetime": 1608091077000,
"masked_author": "username_1",
"text": "When you shut down silos, `OnDeactivateAsync` is not guaranteed to trigger. This is best-effort behavior.\r\n\r\n\r\nYou may have to change your approach, such as calling `WriteStateAsync` synchronously whenever the data changes",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sergeybykov",
"comment_id": 745792546,
"datetime": 1608099609000,
"masked_author": "username_2",
"text": "+1\r\nThe general rule is to persist any state that you don't want to lose as part of the operation that updates it.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "freever",
"comment_id": 745793721,
"datetime": 1608099774000,
"masked_author": "username_3",
"text": "Is there a recommended way to persist state asynchronously, so that the grain can release its caller while persisting? Or is that just generally a bad idea?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "lpyvvvvvv",
"comment_id": 745810340,
"datetime": 1608102185000,
"masked_author": "username_0",
"text": "+1",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rallets",
"comment_id": 746078600,
"datetime": 1608115493000,
"masked_author": "username_4",
"text": "@username_3 @username_0 it depends on how \"important\" the grain state is. \r\n1) If it's not ok for your business to lose \"some\" state in case of a crash, then you need to persist the grain state as soon as it's changed (basically before leaving the grain method), awaiting the WriteStateAsync call to the storage to be able to handle exceptions, as then they are not swallowed. \r\n2) Instead, if you can accept the risk of losing the grain state in case of errors (silo crash, network failures, storage provider errors, etc), you can use a timer and persist the state only every x seconds.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sergeybykov",
"comment_id": 747259726,
"datetime": 1608189305000,
"masked_author": "username_2",
"text": "Nicely stated, @username_4.",
"title": null,
"type": "comment"
}
] | 1,275 | false | true | 5 | 7 | true |
platformio/platformio-home | platformio | 742,970,420 | 1,701 | null | [
{
"action": "opened",
"author": "boubouexpress",
"comment_id": null,
"datetime": 1605347174000,
"masked_author": "username_0",
"text": "PIO Core Call Error: \"Error: Unknown development platform 'PackageItem <path=C:\\\\Users\\\\fboug\\\\.platformio\\\\platforms\\\\ststm32@8.1.0 metadata=PackageMetaData <type=platform name=ststm32 version=8.1.0 spec={'owner': 'platformio', 'id': 8020, 'name': 'ststm32', 'requirements': None, 'url': None}'\"",
"title": "Could not load boards list",
"type": "issue"
}
] | 296 | false | true | 1 | 1 | false |
darwinia-network/apps | darwinia-network | 780,149,189 | 162 | null | [
{
"action": "opened",
"author": "HackFisher",
"comment_id": null,
"datetime": 1609916777000,
"masked_author": "username_0",
"text": "\r\n\r\n\r\n@wi1dcard thinks that this is caused by large rpc call from apps.darwinia.network",
"title": "Connections issue to polkadot.js app",
"type": "issue"
},
{
"action": "closed",
"author": "AurevoirXavier",
"comment_id": null,
"datetime": 1621220575000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 229 | false | false | 2 | 2 | false |
firebase/firebase-ios-sdk | firebase | 822,118,820 | 7,654 | null | [
{
"action": "opened",
"author": "ir-fuel",
"comment_id": null,
"datetime": 1614863434000,
"masked_author": "username_0",
"text": "<!-- DO NOT DELETE\r\nvalidate_template=true\r\ntemplate_path=.github/ISSUE_TEMPLATE/bug_report.md\r\n-->\r\n### [REQUIRED] Step 1: Describe your environment\r\n\r\n * Xcode version: 12.4\r\n * Firebase SDK version: 7.6.0\r\n * Installation method: ` Swift Package Manager`\r\n * Firebase Component: Auth\r\n\r\n### [REQUIRED] Step 2: Describe the problem\r\n\r\nWhen starting an app that hasn't been started in a while (several days), and where the user has been deleted server side (either by deleting on the server or just having killed the local simulator), the `addStateDidChangeListener` callback is called multiple times at app startup. The first time it contains the user object that was previously valid but which has been removed server-side since, and the second time it contains `nil` as to show there is no active user.\r\n\r\nAs there is no way to see on the `user` object if it is a \"good\" one I don't see how I can know if we are actually logged in and good to go, or just temporarily get something back and will receive a `nil` very soon after. This is problematic for building the interface of the app as the screens depend on if we have a user or not.\r\n\r\n#### Steps to reproduce:\r\n\r\nCreate a Firebase user in your app. \r\nKill your app.\r\nDelete the created user from Firebase (or just kill the simulator if running locally)\r\nWait a few days\r\nStart the app again\r\n\r\n#### Relevant Code:\r\n\r\n```\r\n listenerHandle = Auth.auth().addStateDidChangeListener { [weak self] _, user in\r\n if let self = self {\r\n \r\n if user != nil {\r\n debugPrint(\"Has authenticated user\")\r\n } else {\r\n debugPrint(\"Does not have authenticated user\")\r\n }\r\n \r\n self.user = user\r\n \r\n self.hasAuthenticatedUser = user != nil\r\n }\r\n }\r\n\r\n```\r\n\r\n\r\nSo the question is: is this behavior normal, and if it is, how can I be sure that the user object I get is an authenticated user?",
"title": "addStateDidChangeListener called multiple times when user does no longer exist at app startup",
"type": "issue"
},
{
"action": "created",
"author": "morganchen12",
"comment_id": 790848737,
"datetime": 1614884121000,
"masked_author": "username_1",
"text": "From what I understand this is correct behavior from the SDK. If you want to ensure your user is valid, you can try to [reload](https://firebase.google.com/docs/reference/ios/firebaseauth/api/reference/Classes/FIRUser#/c:objc(cs)FIRUser(im)reloadWithCompletion:) the user before performing any operations.\r\n\r\n@username_2 please correct me if I am wrong.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ir-fuel",
"comment_id": 795377695,
"datetime": 1615381134000,
"masked_author": "username_0",
"text": "I did some more testing. This is my test code\r\n\r\n```\r\n listenerHandle = Auth.auth().addStateDidChangeListener { _, user in\r\n if user != nil {\r\n debugPrint(\"Received a user, let's see if it is still valid\")\r\n user!.reload { error in\r\n if error != nil {\r\n user!.getIDToken { _, error in\r\n if error != nil {\r\n user!.logout()\r\n debugPrint(\"User no longer valid\")\r\n Self.userPublisher.value = nil\r\n } else {\r\n debugPrint(\"Has authenticated user after get ID token\")\r\n Self.userPublisher.value = user\r\n }\r\n }\r\n } else {\r\n debugPrint(\"Has authenticated user\")\r\n Self.userPublisher.value = user\r\n }\r\n }\r\n } else {\r\n debugPrint(\"Received nil user\")\r\n Self.userPublisher.value = nil\r\n }\r\n }\r\n```\r\n\r\nI ran this code after killing the local Firebase emulator, so the user no longer exists. this is what it logged:\r\n```\r\n\"Received a user, let\\'s see if it is still valid\"\r\n\"Received nil user\"\r\n\"Has authenticated user after get ID token\"\r\n```\r\n\r\nSo what happens is there is a user object passed, I call `reload` on that user to see if it is still valid, there is no error after the reload so I call `getIDToken`, in the mean time the callback is called again with an empty user, and the `getIDToken` returns me a valid token, which in the mean time has become invalid.\r\n\r\nSo again: no way for me to be sure I have a valid user.\r\n\r\nIs this normal?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ir-fuel",
"comment_id": 797287745,
"datetime": 1615533262000,
"masked_author": "username_0",
"text": "Just tried it again.:\r\n\r\n- start emulator\r\n- create account in app\r\n- kill emulator\r\n- restart emulator\r\n- restart app\r\n\r\nresult:\r\n\r\n\"Received a user, let\\'s see if it is still valid\"\r\n\r\n [Firebase/Auth][I-AUT000016] Invalid user token detected, user is automatically signed out.\r\n\r\n\"Received nil user\"\r\n\r\n\"Has authenticated user after get ID token\"\r\n\r\n\r\nSo it did it again.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "morganchen12",
"comment_id": 797796331,
"datetime": 1615588789000,
"masked_author": "username_1",
"text": "@username_2 is this a bug?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rosalyntan",
"comment_id": 799635360,
"datetime": 1615831843000,
"masked_author": "username_2",
"text": "Hi @username_0 -- this looks WAI to me. The docs state that the [`addStateDidChangeListener` will be triggered immediately when the block is registered as a listener](https://firebase.google.com/docs/reference/swift/firebaseauth/api/reference/Classes/Auth#addstatedidchangelistener_:), so it triggers with the cached user at app startup.\r\n\r\nAs @username_1 said, you can check whether the user is valid by calling `reload`. You mention that `reload` doesn't return an error, but from your logged debug statements, it looks like it does (since you call `getIDToken` only if `reload` returns an error).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "rosalyntan",
"comment_id": null,
"datetime": 1615831843000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "ir-fuel",
"comment_id": 799693479,
"datetime": 1615836573000,
"masked_author": "username_0",
"text": "Sorry @username_2 but this is not correct. I would like to reopen this :) \r\nAs you can see in the logs, reload does not return an error, nor does `getIDToken`. The listener is called a second time when the async callbacks fired during the first call are still running.\r\nWhat happens is:\r\n\r\n- Listener callback is called\r\n- I get user\r\n- I call `reload` on that user\r\n- Listener callback is called AGAIN with nil user (but the `reload` callback on the first user I got is still pending)\r\n- The first user I received reloads without errors and I call `getIDToken`.\r\n- `getIDToken` returns without an error making me think I have a valid user.\r\n\r\nThis is weird behaviour and does not allow me to detect if a user is still valid at startup",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ir-fuel",
"comment_id": 799694348,
"datetime": 1615836647000,
"masked_author": "username_0",
"text": "Ah yes, it's a typo. Oh boy ...",
"title": null,
"type": "comment"
}
] | 5,982 | false | false | 3 | 9 | true |
ChurchCRM/CRM | ChurchCRM | 744,034,570 | 5,541 | null | [
{
"action": "opened",
"author": "ChurchCRMBugReport",
"comment_id": null,
"datetime": 1605548646000,
"masked_author": "username_0",
"text": "sasdfasfasdfdd asfdasdfasdf asdfasdfasdf asdf asdf asdf",
"title": "Testing via new Node Bridge",
"type": "issue"
},
{
"action": "closed",
"author": "DawoudIO",
"comment_id": null,
"datetime": 1605549032000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 55 | false | false | 2 | 2 | false |
subvisual/subvisual.com | subvisual | 665,294,490 | 365 | {
"number": 365,
"repo": "subvisual.com",
"user_login": "subvisual"
} | [
{
"action": "opened",
"author": "gabrielpoca",
"comment_id": null,
"datetime": 1595609746000,
"masked_author": "username_0",
"text": "",
"title": "Blog Post: The Worst Decision Of My Career",
"type": "issue"
},
{
"action": "created",
"author": "gabrielpoca",
"comment_id": 664385223,
"datetime": 1595855288000,
"masked_author": "username_0",
"text": "@username_1 I'm honestly out of ideas :/ I already spent so many hours on this, if you think it's fine, let's merge it as it is",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pfac",
"comment_id": 664405984,
"datetime": 1595857531000,
"masked_author": "username_1",
"text": "Sure thing, feel free to merge it when you're ready.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zamith",
"comment_id": 664919382,
"datetime": 1595928503000,
"masked_author": "username_2",
"text": "Let's start by saying that I don't completely agree with the main point of the article, in the particular case of utrust (which is never mentioned, but is abundantly clear for people in the know), I think that event sourcing for the payment flow was one of the best decisions we made. Of course, as with any new technology, we overdid it at first, using it in places where we shouldn't have (such as user management) and had to learn from that.\r\n\r\nIn terms of the article itself, I feel it's very opinionated, but that's not necessarily bad, I've written opinionated articles for this blog before. The main issue I have with it is that it is opinionated in a negative way. It not only gives your opinion, but it goes at length (probably too much) to try and \"destroy\" another point of view (that is still shared by a lot of people in the Sub universe). \r\n\r\nIn sum, I think it's ok to write an article about your experience with these and even that early on you should shy away from complexity. However, I don't think we should publish an article that (to me, at least) comes across as so negative not to only to yourself (or us), but also other people we know and in general. I think you can make your point in a more positive way.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gabrielpoca",
"comment_id": 664957030,
"datetime": 1595931386000,
"masked_author": "username_0",
"text": "@username_2 I understand your point. I made a couple of changes to try and address what you mentioned. Would you mind looking at it again? I know the article will come across as negative, but that's how I want it to be. I know a lot of people in the universe will disagree, but a lot more will see themselves in my words and my pain. I'm ok with a little sanitization to my ideas, but to be honest, I would rather publish this in my personal blog than not being able to convey this message.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zamith",
"comment_id": 664960055,
"datetime": 1595931810000,
"masked_author": "username_2",
"text": "Not sure it's my call to make, but I don't think this aligns with the tone in our blog, nor with the message we want to put across as part of our OKRs.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gabrielpoca",
"comment_id": 664960755,
"datetime": 1595931906000,
"masked_author": "username_0",
"text": "@username_2 not even with my latest changes?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gabrielpoca",
"comment_id": 666560573,
"datetime": 1596131365000,
"masked_author": "username_0",
"text": "Closing for now. Will rewrite some sections soon.",
"title": null,
"type": "comment"
}
] | 2,130 | false | false | 3 | 8 | true |
scala/scala-collection-compat | scala | 771,424,154 | 403 | {
"number": 403,
"repo": "scala-collection-compat",
"user_login": "scala"
} | [
{
"action": "opened",
"author": "cquiroz",
"comment_id": null,
"datetime": 1608400168000,
"masked_author": "username_0",
"text": "The build seems hard to adapt for 2 milestone versions. I propose closing my previous PR and just updating the current Milestone version",
"title": "Switch to M3",
"type": "issue"
},
{
"action": "created",
"author": "SethTisue",
"comment_id": 748524214,
"datetime": 1608411669000,
"masked_author": "username_1",
"text": "Yeah, the complexity of the build in this repo is an ongoing annoyance: #395 \r\n\r\nNew ticket on actually publishing for M3: #404",
"title": null,
"type": "comment"
}
] | 263 | false | false | 2 | 2 | false |
Snakemake-Profiles/lsf | Snakemake-Profiles | 740,364,627 | 35 | {
"number": 35,
"repo": "lsf",
"user_login": "Snakemake-Profiles"
} | [
{
"action": "opened",
"author": "haizi-zh",
"comment_id": null,
"datetime": 1605056380000,
"masked_author": "username_0",
"text": "In a legacy LSF system such as v8.3, the command `bjobs` does not support certain new arguments such as `--noheader` or `-o`. Furthermore, most of the time it is not realistic for snakemake users to upgrade LSF versions, which are tightly controlled by cluster admins. Therefore, it is essential to make this project compatible with legacy LSF systems. We can achieve this by simply using an alternative way to monitor job status.\r\n\r\nThis pull request includes a simple patch, which uses a plain `bjobs` command for job status checking.\r\n\r\nNote: I have tested the code under the following environment:\r\n\r\nIBM Platform LSF 8.3.0.196409, May 10 2012\r\n\r\nHowever, I haven't tested it for other LSF versions.",
"title": "🐛 Fix bug for legacy LSF system (v8.3)",
"type": "issue"
},
{
"action": "created",
"author": "mbhall88",
"comment_id": 728571329,
"datetime": 1605574598000,
"masked_author": "username_1",
"text": "Hi @username_0 , thanks for raising this issue and putting in a PR.\r\n\r\nI am a little bit hesitant about trying to add support for legacy systems this old. It looks like these features `-o` and `-noheader` were introduced in v9.1.1 (released 2013), which itself is [becoming legacy](https://www.ibm.com/support/pages/end-life-end-support-announcement-all-editions-ibm-platform-lsf-v9-family-ibm-platform-analytics-ibm-spectrum-lsf-suite-hpc-101x-ibm-spectrum-lsf-suite-workgroups-101x). This seems akin to supporting python2 in some respects (however v8.3 seems to be end-of-support for longer than python2).\r\n\r\nI was able to dig up some [old documentation](https://www14.software.ibm.com/support/customercare/sas/f/plcomp/platformlsf.html) for v8.0 (not v8.3 though), but again, it becomes very hard to debug issues on such an old system. If we were going to change the command to check the job status to the method you have outlined in this PR it will require a lot more work. We would need to update all tests and also add in some error handling for the case where the status cannot be obtained from the plain `bjobs <jobID>` command.\r\n\r\nI appreciate your hands are tied and you can't just upgrade.\r\n\r\nI might ask @username_2 and @johanneskoester for their thoughts on this also as I don't think it is appropriate for me to make such a decision on my own.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "haizi-zh",
"comment_id": 728700520,
"datetime": 1605591910000,
"masked_author": "username_0",
"text": "Hi @username_1 , totally understood. If I were you I would definitely feel the same hesitation. Unfortunately, my employer's HPC cluster is fairly old with lots of legacy code running on it, so upgrading to a newer LSF is not feasible. You don't have to merge the PR. I submitted it just in case someone else may need it. Wish you a great day! 😃",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "leoisl",
"comment_id": 1081688621,
"datetime": 1648548774000,
"masked_author": "username_2",
"text": "I think it is reasonable to not support legacy LSF versions. I am wondering if this PR should remain permanently open for legacy users to see it, or if we should close this PR, and add a section in `README.md` that points to this PR or to https://github.com/username_0/lsf for users looking to run this profile on legacy LSF versions.",
"title": null,
"type": "comment"
}
] | 2,743 | false | false | 3 | 4 | true |
microsoft/vscode-azureappservice | microsoft | 810,462,471 | 1,968 | null | [
{
"action": "opened",
"author": "jayeshnathani14IG",
"comment_id": null,
"datetime": 1613589350000,
"masked_author": "username_0",
"text": "<!-- IMPORTANT: Please be sure to remove any private information before submitting. -->\r\n\r\nDoes this occur consistently? <!-- TODO: Type Yes or No -->\r\nRepro steps:\r\n<!-- TODO: Share the steps needed to reliably reproduce the problem. Please include actual and expected results. -->\r\n\r\n1.\r\n2.\r\n\r\nAction: appService.Deploy\r\nError type: 500\r\nError Message: An error has occurred.\r\n\r\nVersion: 0.20.0\r\nOS: win32\r\nOS Release: 10.0.19042\r\nProduct: Visual Studio Code\r\nProduct Version: 1.53.2\r\nLanguage: en\r\n\r\n<details>\r\n<summary>Call Stack</summary>\r\n\r\n```\r\nnew RestError extension.bundle.js:46:66146\r\nextension.bundle.js:46:496611extension.bundle.js:46:496611\r\nprocessTicksAndRejections task_queues.js:97:5\r\n```\r\n\r\n</details>",
"title": "Error while publishing the app",
"type": "issue"
},
{
"action": "created",
"author": "nturinski",
"comment_id": 780811689,
"datetime": 1613591655000,
"masked_author": "username_1",
"text": "Please provide repro steps.",
"title": null,
"type": "comment"
}
] | 747 | false | true | 2 | 2 | false |
vinteo/hass-opensprinkler | null | 720,962,031 | 138 | null | [
{
"action": "opened",
"author": "JurajNyiri",
"comment_id": null,
"datetime": 1602630336000,
"masked_author": "username_0",
"text": "Opensprinkler sometimes does not initiate properly.\r\n\r\nWhen this is happening, home assistant seems to be taking a long time to load (reason currently unknown).\r\n\r\nI did some digging and I think that slow start of HASS causes timeouts and/or race conditions inside pyopensprinkler and this component.\r\n\r\n**Component should be able to recover from this and try to set up again and not give up.**\r\n\r\n```\r\nLogger: custom_components.opensprinkler\r\nSource: helpers/update_coordinator.py:147\r\nIntegration: OpenSprinkler (documentation)\r\nFirst occurred: October 13, 2020, 11:42:59 PM (1 occurrences)\r\nLast logged: October 13, 2020, 11:42:59 PM\r\n\r\nTimeout fetching OpenSprinkler resource status data\r\n```\r\n\r\n```\r\nError while setting up opensprinkler platform for binary_sensor\r\nTraceback (most recent call last):\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 201, in _async_setup_platform\r\n await asyncio.gather(*pending)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 310, in async_add_entities\r\n await asyncio.gather(*tasks)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 371, in _async_add_entity\r\n device_info = entity.device_info\r\n File \"/config/custom_components/opensprinkler/__init__.py\", line 134, in device_info\r\n model = controller.hardware_version_name or \"Unknown\"\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 471, in hardware_version_name\r\n if self.hardware_version == HARDWARE_VERSION_OSPI:\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 466, in hardware_version\r\n return self._get_option(\"hwv\")\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 274, in _get_option\r\n return self._get_options()[option]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 280, in 
_get_options\r\n return self._retrieve_state()[\"options\"]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 268, in _retrieve_state\r\n raise OpenSprinklerNoStateError(\"No state. Please refresh\")\r\npyopensprinkler.OpenSprinklerNoStateError: No state. Please refresh\r\n```\r\n\r\n```\r\nError while setting up opensprinkler platform for sensor\r\nTraceback (most recent call last):\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 201, in _async_setup_platform\r\n await asyncio.gather(*pending)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 310, in async_add_entities\r\n await asyncio.gather(*tasks)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 371, in _async_add_entity\r\n device_info = entity.device_info\r\n File \"/config/custom_components/opensprinkler/__init__.py\", line 134, in device_info\r\n model = controller.hardware_version_name or \"Unknown\"\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 471, in hardware_version_name\r\n if self.hardware_version == HARDWARE_VERSION_OSPI:\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 466, in hardware_version\r\n return self._get_option(\"hwv\")\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 274, in _get_option\r\n return self._get_options()[option]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 280, in _get_options\r\n return self._retrieve_state()[\"options\"]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 268, in _retrieve_state\r\n raise OpenSprinklerNoStateError(\"No state. Please refresh\")\r\npyopensprinkler.OpenSprinklerNoStateError: No state. 
Please refresh\r\n```\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 201, in _async_setup_platform\r\n await asyncio.gather(*pending)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 310, in async_add_entities\r\n await asyncio.gather(*tasks)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 371, in _async_add_entity\r\n device_info = entity.device_info\r\n File \"/config/custom_components/opensprinkler/__init__.py\", line 134, in device_info\r\n model = controller.hardware_version_name or \"Unknown\"\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 471, in hardware_version_name\r\n if self.hardware_version == HARDWARE_VERSION_OSPI:\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 466, in hardware_version\r\n return self._get_option(\"hwv\")\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 274, in _get_option\r\n return self._get_options()[option]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 280, in _get_options\r\n return self._retrieve_state()[\"options\"]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 268, in _retrieve_state\r\n raise OpenSprinklerNoStateError(\"No state. Please refresh\")\r\npyopensprinkler.OpenSprinklerNoStateError: No state. 
Please refresh\r\n```\r\n\r\nI think this is caused internally by some race conditions:\r\n- _retrieve_state in pyopensprinkler executes before update is finished, throws exception\r\n- as there was no retry logic, component setup fails\r\n\r\nIdeally, issue should be resolved by eliminating the race condition and / or implementing component retry setup logic after some time.\r\n\r\nI wrote a (very ugly workaround)[https://github.com/username_0/hass-opensprinkler/blob/fix_timeout_on_start/custom_components/opensprinkler/__init__.py#L61], which works and now boots 100% of the time. \r\nIn my testing, it failed the 1st time, went into exception, recursion and then was successful and continued setup.\r\n\r\nI tried restarting multiple times on both rpis I have and it just didn't want to initiate correctly so I finally had some time to debug and find the cause. \r\nI have encountered this issue before and I remember it also being discussed in Community without a resolution.\r\n\r\nI do not know how to replicate this as I do not know the cause. Sometimes it just starts happening and recovers the next day when I restart.\r\nMaybe try adding some sleeps to simulate this and then try to fix?",
"title": "Component fails to start",
"type": "issue"
},
{
"action": "created",
"author": "vinteo",
"comment_id": 708086434,
"datetime": 1602636288000,
"masked_author": "username_1",
"text": "It would be great if you could open a PR for either pyopensprinkler (to fix the race condition) or with this workaround. Hacktoberfest is still ongoing :)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JurajNyiri",
"comment_id": 708275314,
"datetime": 1602667152000,
"masked_author": "username_0",
"text": "Thank you for the suggestion. \r\nHowever, currently I don't have time to fix it properly and I am not proud about that workaround, it's really ugly and should be fixed properly.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "vinteo",
"comment_id": null,
"datetime": 1602890833000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "vinteo",
"comment_id": 710695947,
"datetime": 1602890989000,
"masked_author": "username_1",
"text": "Please try the latest release v1.1.3",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JurajNyiri",
"comment_id": 712148778,
"datetime": 1603113354000,
"masked_author": "username_0",
"text": "Thank you, I have now encountered a similar scenario, it threw an error \r\n```\r\nLogger: custom_components.opensprinkler\r\nSource: helpers/update_coordinator.py:147\r\nIntegration: OpenSprinkler (documentation)\r\nFirst occurred: 3:11:59 PM (1 occurrences)\r\nLast logged: 3:11:59 PM\r\n\r\nTimeout fetching OpenSprinkler resource status data\r\n```\r\n\r\nbut then recovered. 👍",
"title": null,
"type": "comment"
}
] | 7,114 | false | false | 2 | 6 | true |
avadev/AvaTax-Calc-SOAP-PHP | avadev | 787,040,242 | 20 | null | [
{
"action": "opened",
"author": "aanderson-msi",
"comment_id": null,
"datetime": 1610729585000,
"masked_author": "username_0",
"text": "Line 32 of ATConfig.php:\r\n` if($n == '_ivars') { return parent::__get($n); }`\r\n\r\nHowever, the class is declared without a parent:\r\n```\r\n/**\r\n * Contains various service configuration parameters as class static variables.\r\n *\r\n * {@link AddressServiceSoap} and {@link TaxServiceSoap} read this file during initialization.\r\n *\r\n * @author Avalara\r\n * @copyright © 2004 - 2011 Avalara, Inc. All rights reserved.\r\n * @package Base\r\n */\r\nnamespace AvaTax;\r\nclass ATConfig\r\n{\r\n\r\n```\r\nWhen using this class with PHP 7.4, it results in an error. \r\n```\r\nDeprecated Functionality: Cannot use \"parent\" when current class scope has no parent in <project>/vendor/avalara/avatax/AvaTax/ATConfig.php on line 32\r\n``` \r\nRecommend returning \"null\" instead of the call to the parent's __get() function.",
"title": "PHP 7.4 deprecates parent for classes without parent",
"type": "issue"
},
{
"action": "created",
"author": "vipinroy",
"comment_id": 783437412,
"datetime": 1614006222000,
"masked_author": "username_1",
"text": "Any update on this issue?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jonathanribas",
"comment_id": 879812843,
"datetime": 1626261983000,
"masked_author": "username_2",
"text": "We can't move forward on PHP upgrades because of this issue.\r\n\r\nPHP 7.3 will be deprecated at the of this year, can you please update your code?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gusdemayo",
"comment_id": 880068838,
"datetime": 1626283000000,
"masked_author": "username_3",
"text": "I opened a pull request that should fix this issue\r\n\r\nBy removing the line\r\n\r\n```\r\nif($n == '_ivars') { return parent::__get($n); }\r\n```\r\n\r\nthe case of `$n === 'ivars'` should be caught by the else statement",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jonathanribas",
"comment_id": 880093098,
"datetime": 1626285238000,
"masked_author": "username_2",
"text": "Thank you @username_3! I was creating a PR with the exact same change!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gusdemayo",
"comment_id": 890001912,
"datetime": 1627662128000,
"masked_author": "username_3",
"text": "Is this project still being maintained?",
"title": null,
"type": "comment"
}
] | 1,281 | false | false | 4 | 6 | true |
adiwajshing/Baileys | null | 775,793,101 | 297 | null | [
{
"action": "opened",
"author": "omerbeylife",
"comment_id": null,
"datetime": 1609234068000,
"masked_author": "username_0",
"text": "Hello, audio files, google voices do not work on iphone users",
"title": "İphone Users",
"type": "issue"
},
{
"action": "created",
"author": "adiwajshing",
"comment_id": 753284599,
"datetime": 1609488825000,
"masked_author": "username_1",
"text": "Need more info than this.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "omerbeylife",
"comment_id": 753287041,
"datetime": 1609490534000,
"masked_author": "username_0",
"text": "There is such a plug-in but iPhone users cannot play bot sounds. The problem is caused by bailley. \r\nhttps://www.github.com/username_2/WhatsAsena/tree/master/plugins/scrapers.js",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Quiec",
"comment_id": 753310144,
"datetime": 1609503587000,
"masked_author": "username_2",
"text": "The sounds sent by Baileys are not listen on the iPhone.\r\n\r\nAny suggestions?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "edgardmessias",
"comment_id": 753963763,
"datetime": 1609765535000,
"masked_author": "username_3",
"text": "Try this: `ffmpeg -i audio.mp3 -vn -ar 44100 -ac 2 -b:a 192k audio-fixed.mp3 -y`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Quiec",
"comment_id": 754178767,
"datetime": 1609789619000,
"masked_author": "username_2",
"text": "Thanks!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Romerio",
"comment_id": 756744248,
"datetime": 1610110931000,
"masked_author": "username_4",
"text": "For me it only worked after adding \"-f ipod\"\r\n`ffmpeg -i audio.mp3 -vn -ab 128k -ar 44100 -f ipod audio-fixed.mp3 -y`",
"title": null,
"type": "comment"
}
] | 538 | false | true | 5 | 7 | true |
SeldonIO/alibi-detect | SeldonIO | 857,931,356 | 221 | null | [
{
"action": "opened",
"author": "RitikaKulshresth",
"comment_id": null,
"datetime": 1618408938000,
"masked_author": "username_0",
"text": "I have tried to run the the Kolmogorov-Smirnov (K-S) tests for numerical columns using the Categorical and mixed type data drift detection on income prediction dataset, for numerical features for ex ( Age, Capital Gain, Capital Loss, Hours per week) feature I am getting K-S test value and P value as NAN but on applying the Chi-Square test on categorical features( Workclass, Education, Marital Status, Occupation, Relationship, Race, Sex, Country) I am getting the Chi2 and P value as some float values .\r\nPlease let me know shall I try with the downgraded version of Alibi Detect for running the Categorical and mixed type data drift detection on income prediction file .\r\nLooking for the resolution asap",
"title": "What to do as I am getting the P values as NAN on applying Kolmogorov-Smirnov (K-S) tests for numerical columns",
"type": "issue"
},
{
"action": "created",
"author": "arnaudvl",
"comment_id": 819555302,
"datetime": 1618409953000,
"masked_author": "username_1",
"text": "Hi @username_0 , could you provide a reproducible example as well as the alibi-detect version you are using? Because I just ran the ChiSquareDrift and TabularDrift detectors using the latest release and they seem to run fine.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RitikaKulshresth",
"comment_id": 819725477,
"datetime": 1618423634000,
"masked_author": "username_0",
"text": "Hey @username_1 , I am running the example mentioned here: https://docs.seldon.io/projects/alibi-detect/en/stable/examples/cd_chi2ks_adult.html.\r\nI am using the following versions:\r\nalibi - 0.5.7\r\nalibi_detect - 0.5.1\r\nPython - 3.8.8\r\n\r\nGetting the following output with continuous numerical features as NAN\r\n`Age -- Drift? No! -- K-S nan -- p-value nan\r\nWorkclass -- Drift? Yes! -- Chi2 18.114 -- p-value 0.020\r\nEducation -- Drift? No! -- Chi2 9.845 -- p-value 0.131\r\nMarital Status -- Drift? No! -- Chi2 6.583 -- p-value 0.086\r\nOccupation -- Drift? Yes! -- Chi2 17.900 -- p-value 0.022\r\nRelationship -- Drift? No! -- Chi2 0.964 -- p-value 0.965\r\nRace -- Drift? No! -- Chi2 1.168 -- p-value 0.883\r\nSex -- Drift? No! -- Chi2 0.463 -- p-value 0.496\r\nCapital Gain -- Drift? No! -- K-S nan -- p-value nan\r\nCapital Loss -- Drift? No! -- K-S nan -- p-value nan\r\nHours per week -- Drift? No! -- K-S nan -- p-value nan\r\nCountry -- Drift? Yes! -- Chi2 20.595 -- p-value 0.024`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "arnaudvl",
"comment_id": 819762583,
"datetime": 1618427205000,
"masked_author": "username_1",
"text": "Thanks, could you upgrade to the latest alibi-detect (v0.6.0)? That should resolve the issue.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RitikaKulshresth",
"comment_id": 820139026,
"datetime": 1618466516000,
"masked_author": "username_0",
"text": "That issue worked fine. Thanks for your quick help. You can close this issue\r\n\r\nAlso I am trying to use the same ChiSquareDrift and TabularDrift detectors on Iris Dataset where we have only Continuous numerical features for that I am raising another new issue.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "arnaudvl",
"comment_id": null,
"datetime": 1618480781000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2,262 | false | false | 2 | 6 | true |
edgexfoundry/edgex-go | edgexfoundry | 550,751,787 | 2,300 | null | [
{
"action": "opened",
"author": "michaelestrin",
"comment_id": null,
"datetime": 1579174664000,
"masked_author": "username_0",
"text": "Write an in-memory implementation of the `DBClient` interface for the file mentioned in this issue's title. Include unit tests that cover the implementation. \r\n\r\nThis in-memory persistence implementation will be used for Golang-based acceptance tests.\r\n\r\nRelated to https://github.com/edgexfoundry/edgex-go/issues/2277 and https://github.com/edgexfoundry/edgex-go/issues/2273.",
"title": "Implement db/memory/valuedescriptors.go behavior",
"type": "issue"
},
{
"action": "created",
"author": "michaelestrin",
"comment_id": 576753265,
"datetime": 1579622845000,
"masked_author": "username_0",
"text": "Added hold. This issue was created to facilitate Golang-based acceptance testing of APIv2 endpoints. \r\n Unsure at this point if APIv2 will reuse the DBClient abstraction or create its own.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "michaelestrin",
"comment_id": 582898963,
"datetime": 1580994698000,
"masked_author": "username_0",
"text": "Decision made for v2 to have its own persistence implementation. Closing this issue; it is no longer relevant.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "michaelestrin",
"comment_id": null,
"datetime": 1580994699000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 678 | false | false | 1 | 4 | false |
PrismJS/prism | PrismJS | 655,763,631 | 2,474 | {
"number": 2474,
"repo": "prism",
"user_login": "PrismJS"
} | [
{
"action": "opened",
"author": "osipxd",
"comment_id": null,
"datetime": 1594638533000,
"masked_author": "username_0",
"text": "Added file extensions `kt` and `kts` as aliases for Kotlin.",
"title": "Add aliases for Kotlin",
"type": "issue"
},
{
"action": "created",
"author": "osipxd",
"comment_id": 657505028,
"datetime": 1594639842000,
"masked_author": "username_0",
"text": "Ah, missed it. Fixed now.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RunDevelopment",
"comment_id": 657510031,
"datetime": 1594640443000,
"masked_author": "username_1",
"text": "Thank you for contributing!",
"title": null,
"type": "comment"
}
] | 111 | false | false | 2 | 3 | false |
vitejs/vite | vitejs | 890,849,296 | 3,399 | null | [
{
"action": "opened",
"author": "zhangyuang",
"comment_id": null,
"datetime": 1620894948000,
"masked_author": "username_0",
"text": "### Describe the bug\r\n\r\nwhen the thirdparty module is esm format, esbuild manage it without add `__esModule` property.\r\n\r\nfor examle, the source code is\r\n\r\n```js\r\n// swier.esm.js\r\nexport { default as Swiper, default } from './esm/components/core/core-class';\r\nexport { default as Virtual } from './esm/components/virtual/virtual';\r\n//...\r\n```\r\n\r\nAfter the build ,will output\r\n\r\n```js\r\nimport {default as default2, default as default3} from \"./esm/components/core/core-class\";\r\nimport {default as default4} from \"./esm/components/virtual/virtual\";\r\nexport {\r\n default14 as A11y,\r\n default3 as default\r\n};\r\n\r\n```\r\n\r\nwhich it not have `__esModule` property.\r\n\r\nit will be cause an error. if a module build by webpack, the source code as below\r\n\r\n```js\r\n// react-id-swiper sourceCode\r\nimport Swiper from 'swiper';\r\nnew Swiper()\r\n```\r\nafter webpack build , output code will add helper function `__importDefault`\r\n```js\r\n// react-id-swiper\r\nvar __importDefault = (this && this.__importDefault) || function (mod) {\r\n return (mod && mod.__esModule) ? 
mod : { \"default\": mod };\r\n};\r\nvar swiper_1 = __importDefault(require(\"swiper\"));\r\nnew swiper_1.default(swiperNodeRef.current, object_assign_1.default({}, props))\r\n```\r\n\r\nbut because of `__esModule` is undefined, `__importDefault` function will add default property once again cause the return value `__importDefault(require(\"swiper\")).default` is not the current object will be error\r\n\r\n\r\n\r\n \r\n\r\n\r\n### Reproduction\r\n\r\nhttps://github.com/username_0/vite-react-swiper-error\r\n\r\n### System Info\r\n\r\nvite/2.3.2 darwin-x64 node-v12.18.3\r\n\r\n### additional\r\n\r\nwhen i add `__esModule` property in source manually, it can be executed succeed like\r\n\r\n```js\r\nexport { default as Swiper, default } from './esm/components/core/core-class';\r\nexport { default as Virtual } from './esm/components/virtual/virtual';\r\nexport const __esModule = true \r\n```\r\n\r\nIn this case,maybe vite can add `__esModule` in esbuild plugin . for example\r\n\r\n```js\r\nbuild.onLoad({ filter: /.*/, namespace: 'swiper' }, ({ path: id }) => {\r\n return {\r\n loader: 'js',\r\n resolveDir: root,\r\n contents: `export * from \"filepath\"\r\n export {default} from \"filePath\"\r\n export const __esModule = true \r\n `\r\n }\r\n })\r\n```",
"title": "without __esModule property when format esm module cause load module error",
"type": "issue"
},
{
"action": "created",
"author": "Sociosarbis",
"comment_id": 840953898,
"datetime": 1620958189000,
"masked_author": "username_1",
"text": "temporary workaround for your case.\r\n\r\nadd a `resolve.alias` in `vite.cofnig.js` as follows:\r\n```js\r\n{\r\n resolve: {\r\n alias: {\r\n swiper: 'swiper/swiper.cjs'\r\n }\r\n }\r\n}\r\n```\r\n\r\nBTW,is possible to achieve this specifc functionality you proposed somehow like add a extra file and use `resolve.alias` to redirect the request to that. 🤔",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangyuang",
"comment_id": 841019332,
"datetime": 1620970528000,
"masked_author": "username_0",
"text": "yeah, but it's a temporary solution。the problem may be appear frequently",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Sociosarbis",
"comment_id": 841131608,
"datetime": 1620984816000,
"masked_author": "username_1",
"text": "```js\r\nexport default defineConfig({\r\n plugins: [reactRefresh()],\r\n resolve: {\r\n alias: {\r\n swiper: join(__dirname, 'src/swiper'),\r\n '@swiper': join(__dirname, 'node_modules/swiper/swiper.esm')\r\n }\r\n }\r\n})\r\n```\r\n```js\r\n// src/swiper.js\r\nexport * from '@swiper'\r\nexport { default } from '@swiper'\r\nexport const __esModule = true\r\n```\r\nclean the vite prebundle cache and rebuild. This would work. I think it's a issue of esbuild rather than vite's.😄",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangyuang",
"comment_id": 841133781,
"datetime": 1620985084000,
"masked_author": "username_0",
"text": "yeah, i also think that. in the final analysis,the problem is es module and commonjs module are not compatible. `export default` not have corresponding declare in commonjs",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "BasixKOR",
"comment_id": 842874337,
"datetime": 1621318242000,
"masked_author": "username_2",
"text": "This causes latest `evergreen-ui` to break, caused by `hyphenate-style-name`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "frabarz",
"comment_id": 844490263,
"datetime": 1621460056000,
"masked_author": "username_3",
"text": "I'm also having this issue with [`resize-observer-polyfill`](https://www.npmjs.com/package/resize-observer-polyfill). The package offers a es module, but the content doesn't have the `__esModule = true` export, and the app fails because the function is wrapped in an object under a `default` key:\r\n\r\n\r\nThis is clearly something they need to fix, but the thing is, this was working correctly 2 weeks ago. Does anyone know since which version this changed?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangyuang",
"comment_id": 849546936,
"datetime": 1622114008000,
"masked_author": "username_0",
"text": "@username_5 @underfin @username_4 @patak-js please see see, the issue occur many times in different project",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Shinigami92",
"comment_id": 849554327,
"datetime": 1622114789000,
"masked_author": "username_4",
"text": "Does this also happen with Vite `v2.2.4`? If not, I assume it also has to do with the `esbuild` update.\r\nBut if `v2.2.4` works for you, you are at least able to use Vite for now",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Shinigami92",
"comment_id": 849557399,
"datetime": 1622115119000,
"masked_author": "username_4",
"text": "Ok so please downgrade and try it out. Then write me feedback.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangyuang",
"comment_id": 849559920,
"datetime": 1622115399000,
"masked_author": "username_0",
"text": "@username_4 vite@2.2.4 will not occur the problem. but we want to use the newest vite.maybe you can push esbuild fix the problem",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "antfu",
"comment_id": null,
"datetime": 1623174607000,
"masked_author": "username_5",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "softboy99",
"comment_id": 858205947,
"datetime": 1623287724000,
"masked_author": "username_6",
"text": "Hi,\r\ndoes this has been released?",
"title": null,
"type": "comment"
}
] | 4,823 | false | true | 7 | 13 | true |
tevora-threat/SharpView | tevora-threat | 582,326,735 | 3 | null | [
{
"action": "opened",
"author": "andrewchiles",
"comment_id": null,
"datetime": 1584367628000,
"masked_author": "username_0",
"text": "Running `Get-DomainUser` with the following syntax fails to properly filter on the specified username and instead returns all users in the domain. The result was the same despite specifying `-name domadmin`, `-identity domadmin`, or simply `Get-DomainUser domadmin`\r\n\r\n```SharpView_4.5.exe Get-DomainUser -name \"**domadmin**\"\r\n[*] Tasked beacon to run .NET program: SharpView_4.5.exe Get-DomainUser -name \"domadmin\"\r\n[+] host called home, sent: 840809 bytes\r\n[+] received output:\r\nget-domain\r\n[Get-DomainSearcher] search base: LDAP://DC01.lab.local/DC=lab,DC=local\r\n[Get-DomainUser] filter string: (&(samAccountType=805306368))\r\nobjectsid : {S-1-5-21-.....-500}\r\nsamaccounttype : USER_OBJECT\r\nobjectguid : 019324d8-f17b-45c3-b9a9-adc7e0d3b9b3\r\nuseraccountcontrol : NORMAL_ACCOUNT\r\naccountexpires : 12/31/1600 7:00:00 PM\r\nlastlogon : 11/21/2014 6:42:49 AM\r\nlastlogontimestamp : 3/13/2020 10:40:02 AM\r\npwdlastset : 8/15/2019 10:30:55 AM\r\nlastlogoff : 12/31/1600 7:00:00 PM\r\nbadPasswordTime : 12/31/1600 7:00:00 PM\r\nname : Administrator\r\ndistinguishedname : CN=Administrator,CN=Users,DC=lab,DC=local\r\nwhencreated : 8/15/2019 2:32:06 PM\r\nwhenchanged : 3/13/2020 2:40:02 PM\r\nsamaccountname : Administrator\r\nmemberof : {CN=Group Policy Creator Owners,CN=Users,DC=lab,DC=local, CN=Domain Admins,CN=Users,DC=lab,DC=local, CN=Enterprise Admins,CN=Users,DC=lab,DC=local, CN=Schema Admins,CN=Users,DC=lab,DC=local, CN=Administrators,CN=Builtin,DC=lab,DC=local}\r\ncn : {Administrator}\r\nobjectclass : {top, person, organizationalPerson, user}\r\nlogoncount : 3\r\ncodepage : 0\r\nobjectcategory : CN=Person,CN=Schema,CN=Configuration,DC=lab,DC=local\r\ndescription : Built-in account for administering the computer/domain\r\nusnchanged : 22265\r\ninstancetype : 4\r\nbadpwdcount : 0\r\nusncreated : 8196\r\ncountrycode : 0\r\nprimarygroupid : 513\r\ndscorepropagationdata : {8/15/2019 2:47:54 PM, 8/15/2019 2:47:54 PM, 8/15/2019 2:32:44 PM, 
1/1/1601 6:12:16 PM}\r\nlogonhours : {255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255}\r\nadmincount : 1\r\niscriticalsystemobject : True\r\n\r\nobjectsid : {S-1-5-21-....-502}\r\nsamaccounttype : USER_OBJECT\r\nobjectguid : af1a1a57-681f-4d3c-8775-3b922ba9613d\r\nuseraccountcontrol : ACCOUNTDISABLE, NORMAL_ACCOUNT\r\naccountexpires : NEVER\r\nlastlogon : 12/31/1600 7:00:00 PM\r\npwdlastset : 8/15/2019 10:32:44 AM\r\nlastlogoff : 12/31/1600 7:00:00 PM\r\nbadPasswordTime : 12/31/1600 7:00:00 PM\r\n**name : krbtgt**\r\ndistinguishedname : CN=krbtgt,CN=Users,DC=lab,DC=local\r\nwhencreated : 8/15/2019 2:32:44 PM\r\nwhenchanged : 8/15/2019 2:47:54 PM\r\nsamaccountname : krbtgt\r\nmemberof : {CN=Denied RODC Password Replication Group,CN=Users,DC=lab,DC=local}\r\ncn : {krbtgt}\r\nobjectclass : {top, person, organizationalPerson, user}\r\nServicePrincipalName : kadmin/changepw\r\nlogoncount : 0\r\ncodepage : 0\r\nobjectcategory : CN=Person,CN=Schema,CN=Configuration,DC=lab,DC=local\r\ndescription : Key Distribution Center Service Account\r\nusnchanged : 12731\r\ninstancetype : 4\r\nshowinadvancedviewonly : True\r\nbadpwdcount : 0\r\nusncreated : 12324\r\ncountrycode : 0\r\nprimarygroupid : 513\r\ndscorepropagationdata : {8/15/2019 2:47:54 PM, 8/15/2019 2:32:44 PM, 1/1/1601 12:04:16 AM}\r\nmsds-supportedencryptiontypes : 0\r\nadmincount : 1\r\niscriticalsystemobject : True\r\n\r\n<snip - Remaining domain users were displayed>\r\n```",
"title": "Get-DomainUser Not Filtering on \"Name\" Argument",
"type": "issue"
},
{
"action": "created",
"author": "coffeegist",
"comment_id": 662681747,
"datetime": 1595450021000,
"masked_author": "username_1",
"text": "@username_0 the arguments in this version are case sensitive unfortunately :)",
"title": null,
"type": "comment"
}
] | 4,556 | false | false | 2 | 2 | true |
canammex-tech/deepstream-services-library | canammex-tech | 694,942,611 | 352 | null | [
{
"action": "opened",
"author": "jlerasmus",
"comment_id": null,
"datetime": 1599473373000,
"masked_author": "username_0",
"text": "Running sample `ode_occurrence_rtsp_start_record_tap_action.py` logs error and throws segmentation fault when pressing 'q' to exit application.\r\n\r\nError:\r\n```\r\n0:00:08.434763531 3639 0x12566460 ERROR DSL src/DslTapBintr.cpp:114:UnlinkFromSource: : TapBintr 'src-0-record-tap' failed to unlink from Decode Source Tee\r\n```\r\n\r\nDebugger breaks at line 209 of `DslNodetr.h` when the segmentation fault occurs:\r\n```\r\nreturn GST_ELEMENT(m_pGstObj);\r\n```\r\n\r\nIf the record-tap is not added with `dsl_source_rtsp_tap_add`, the error log and segementation fault does not occur. The following is logged to the console though:\r\n```\r\nnvbuf_utils: dmabuf_fd 1306 mapped entry NOT found\r\n```",
"title": "Error and segmentation fault on quitting application when rtsp-tap is added",
"type": "issue"
},
{
"action": "created",
"author": "rjhowell44",
"comment_id": 688395951,
"datetime": 1599492766000,
"masked_author": "username_1",
"text": "Thanks @username_0 I will look into this. \r\n\r\nRE: *nvbuf_utils: dmabuf_fd 1306 mapped entry NOT found* \r\n\r\nThis I'm aware of and have reported such to nvidia as this showed up with the 5.0 release. They of course want me to reproduce using their app which I have not had time to follow up on as of yet.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlerasmus",
"comment_id": 695998957,
"datetime": 1600679946000,
"masked_author": "username_0",
"text": "@username_1 I am still getting the segfault on the latest v0.08.alpha branch\r\n\r\n```\r\n0:00:07.297015085 11595 0xe3ee8f0 INFO DSL src/DslPipelineBintr.cpp:747:HandleXWindowEvents: : Key released = 'q'\r\nkey released = q\r\n0:00:07.297340143 11595 0xe3ee8f0 INFO DSL src/DslServices.h:664:GetMainLoopHandle: : Returning Handle to MainLoop\r\n0:00:07.298293964 11595 0xe06ff60 INFO DSL src/DslBintr.h:223:SetState: : Changing state to 'NULL' for Bintr 'pipeline'\r\n0:00:07.298432665 11595 0xe06ff60 INFO DSL src/DslBintr.h:229:SetState: : State change completed synchronously for Bintr'pipeline'\r\n0:00:07.298548709 11595 0xe06ff60 INFO DSL src/DslPipelineSourcesBintr.cpp:230:UnlinkAll: : Unlinking stream_muxer from src-0\r\n0:00:07.298599960 11595 0xe06ff60 INFO DSL src/DslSourceBintr.cpp:144:UnlinkFromSink: : Unlinking and releasing request Sink Pad for StreamMux stream_muxer\r\n0:00:07.299076063 11595 0xe06ff60 WARN DSL src/DslSourceBintr.cpp:946:UnlinkAll: : *********************************source\r\n0:00:07.299201846 11595 0xe06ff60 INFO DSL src/DslTapBintr.cpp:109:UnlinkFromSource: : Unlinking and releasing requested Source Pad for Decode Source Tee src-0-record-tap\r\n0:00:07.299283671 11595 0xe06ff60 ERROR DSL src/DslTapBintr.cpp:114:UnlinkFromSource: : TapBintr 'src-0-record-tap' failed to unlink from Decode Source Tee\r\nSegmentation fault (core dumped)\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlerasmus",
"comment_id": 696057186,
"datetime": 1600687830000,
"masked_author": "username_0",
"text": "@username_1, I also got a different error when quitting from terminal using Ctrl+C, but this occurs randomly (sorry, only had INFO logging enabled)\r\n```\r\n^C0:00:29.381972166 1106 0x2edb7210 WARN DSL src/DslSourceBintr.cpp:946:UnlinkAll: : *********************************source\r\n0:00:29.382190347 1106 0x2edb7210 ERROR DSL src/DslTapBintr.cpp:114:UnlinkFromSource: : TapBintr 'src-1-record-tap' failed to unlink from Decode Source Tee\r\ng_mutex_clear() called on uninitialised or locked mutex\r\nAborted (core dumped)\r\n```\r\n\r\nHere is the full Debug log from when 'q' is pressed:\r\n```\r\n0:00:06.554164941 2037 0x5475cf0 INFO DSL src/DslPipelineBintr.cpp:747:HandleXWindowEvents: : Key released = 'q'\r\nkey released = q\r\n0:00:06.554435415 2037 0x5475cf0 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Services::GetMainLoopHandle()\r\n0:00:06.554482655 2037 0x5475cf0 INFO DSL src/DslServices.h:664:GetMainLoopHandle: : Returning Handle to MainLoop\r\n0:00:06.554503801 2037 0x5475cf0 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Services::GetMainLoopHandle()\r\n0:00:06.554596459 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Services::PipelineDeleteAll()\r\n0:00:06.554667085 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::RemoveAllChildren()\r\n0:00:06.554755368 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.554791879 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.554831880 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.554877975 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.554911100 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'iou-tracker' from Parent GST BIn'pipeline'\r\n0:00:06.554955997 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555050009 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555083551 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555101416 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555230533 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555253763 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555278034 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555295899 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555314649 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'on-screen-display' from Parent GST BIn'pipeline'\r\n0:00:06.555340639 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555358973 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555383244 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555400692 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555507725 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555530382 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555553455 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555571112 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555590123 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'primary-gie' from Parent GST BIn'pipeline'\r\n0:00:06.555614134 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555631842 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555654186 2037 
0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555670749 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555751063 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555769761 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555792574 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555810127 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555827887 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'sinks-bin' from Parent GST BIn'pipeline'\r\n0:00:06.555852054 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555869763 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555892055 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555909087 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555992526 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556010547 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556032891 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556050079 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556069715 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'sources-bin' from Parent GST BIn'pipeline'\r\n0:00:06.556093934 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556111382 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556133466 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556150289 
2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556438367 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556459878 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556482743 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556500503 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556518212 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'tiler' from Parent GST BIn'pipeline'\r\n0:00:06.556542587 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556560817 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556583161 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556600453 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556655194 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::RemoveAllChildren()\r\n0:00:06.556683684 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556700194 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556722174 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556738841 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556756185 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'iou-tracker' from Parent 'pipeline'\r\n0:00:06.556779050 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.556795977 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.556821290 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556838322 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556860093 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556876552 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556893531 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'on-screen-display' from Parent 'pipeline'\r\n0:00:06.556915719 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.556932126 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.556956501 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556973168 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556995043 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557011710 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557028794 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'primary-gie' from Parent 'pipeline'\r\n0:00:06.557050930 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557067753 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557092285 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557109056 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557131035 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557147911 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557166817 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'sinks-bin' from Parent 'pipeline'\r\n0:00:06.557189005 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557205255 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557229110 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557245360 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557266871 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557283174 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557300101 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'sources-bin' from Parent 'pipeline'\r\n0:00:06.557322028 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557338695 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557363331 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557379998 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557401717 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557418488 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557435364 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'tiler' from Parent 'pipeline'\r\n0:00:06.557457135 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557473958 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557500938 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::RemoveAllChildren()\r\n0:00:06.557518646 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::RemoveAllChildren()\r\n0:00:06.557548699 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::PipelineBintr::~PipelineBintr()\r\n0:00:06.557571356 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::PipelineBintr::Stop()\r\n0:00:06.557594481 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:69:LogFunc: DSL::Bintr::SetState()\r\n0:00:06.557621148 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557637971 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557655055 2037 0x50f5f60 INFO DSL src/DslBintr.h:223:SetState: : Changing state to 'NULL' for Bintr 'pipeline'\r\n0:00:06.557676462 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.557693493 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.557810631 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557832767 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557851517 2037 0x50f5f60 INFO DSL src/DslBintr.h:229:SetState: : State change completed synchronously for Bintr'pipeline'\r\n0:00:06.557869226 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Bintr::SetState()\r\n0:00:06.557893549 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Bintr::IsLinked()\r\n0:00:06.557912247 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Bintr::IsLinked()\r\n0:00:06.557935216 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::BranchBintr::UnlinkAll()\r\n0:00:06.557963029 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.557981675 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558004957 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.558027978 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558045270 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558067458 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558085844 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Nodetr::GetGstElement()\r\n0:00:06.558107251 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558124543 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558146835 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558164075 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558186315 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558203711 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558235586 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.558268816 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558287254 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.558311265 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558328140 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.558345901 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'sources-bin' from Sink 'primary-gie'\r\n0:00:06.558368036 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.558386058 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.558410641 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::PipelineSourcesBintr::UnlinkAll()\r\n0:00:06.558441632 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558459913 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.558483403 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558500331 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Base::GetName()\r\n0:00:06.558517622 2037 0x50f5f60 INFO DSL src/DslPipelineSourcesBintr.cpp:230:UnlinkAll: : Unlinking stream_muxer from src-0\r\n0:00:06.558542519 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::SourceBintr::UnlinkFromSink()\r\n0:00:06.558565488 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558583092 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558618510 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558637052 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.558654344 2037 0x50f5f60 INFO DSL src/DslSourceBintr.cpp:144:UnlinkFromSink: : Unlinking and releasing request Sink Pad for StreamMux stream_muxer\r\n0:00:06.558758200 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: std::shared_ptr<DSL::Nodetr> DSL::Nodetr::GetSink()\r\n0:00:06.558788617 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: std::shared_ptr<DSL::Nodetr> DSL::Nodetr::GetSink()\r\n0:00:06.558820857 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558840024 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559300866 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.559348627 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.559368315 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.559394617 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.559413159 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.559430712 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'src-0' from Sink 'stream_muxer'\r\n0:00:06.559451493 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.559469358 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::SourceBintr::UnlinkFromSink()\r\n0:00:06.559493421 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RtspSourceBintr::UnlinkAll()\r\n0:00:06.559516494 2037 0x50f5f60 WARN DSL src/DslSourceBintr.cpp:946:UnlinkAll: : *********************************source\r\n0:00:06.559543526 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.559568266 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.559586964 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.559612069 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559631236 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559654466 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559672539 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559694570 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559712227 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559734936 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559752124 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559815823 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.559849677 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.559869105 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.559893793 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.559939366 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Base::GetName()\r\n0:00:06.559974732 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'pre-decode-queue' from Sink 'decode-bin'\r\n0:00:06.560012180 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.560047754 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.560094109 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RtspSourceBintr::HasTapBintr()\r\n0:00:06.560133016 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::RtspSourceBintr::HasTapBintr()\r\n0:00:06.560184006 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::TapBintr::UnlinkFromSource()\r\n0:00:06.560231611 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSource()\r\n0:00:06.560266768 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSource()\r\n0:00:06.560317863 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.560350780 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.560385624 2037 0x50f5f60 INFO DSL src/DslTapBintr.cpp:109:UnlinkFromSource: : Unlinking and releasing requested Source Pad for Decode Source Tee src-0-record-tap\r\n0:00:06.560591618 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.560648337 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.560683025 2037 0x50f5f60 ERROR DSL src/DslTapBintr.cpp:114:UnlinkFromSource: : TapBintr 'src-0-record-tap' failed to unlink from Decode Source Tee\r\n0:00:06.560716932 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::TapBintr::UnlinkFromSource()\r\n0:00:06.560757714 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RecordTapBintr::UnlinkAll()\r\n0:00:06.560802715 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.560842455 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.560886258 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.560923134 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561007250 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::RemoveChild()\r\n0:00:06.561054282 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::IsChild()\r\n0:00:06.561096054 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.561130013 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.561171940 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::IsChild()\r\n0:00:06.561216472 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561250588 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561289651 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561332412 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561486529 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::RemoveChild()\r\n0:00:06.561521009 2037 0x5475400 WARN matroskamux matroska-mux.c:3468:gst_matroska_mux_finish:0x5470a10 unable to get final track duration\r\n0:00:06.561631219 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::IsChild()\r\n0:00:06.561679762 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.561714190 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.561752159 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::IsChild()\r\n0:00:06.561803202 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.561840494 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Base::ClearParentName()\r\n0:00:06.561877734 2037 0x50f5f60 DEBUG DSL src/DslBase.h:192:RemoveChild: : Child 'record-bin' removed from Parent 'src-0-record-tap'\r\n0:00:06.561910495 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::RemoveChild()\r\n0:00:06.561942423 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::RemoveChild()\r\n0:00:06.561983621 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::~Nodetr()\r\n0:00:06.562019820 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:75:~Nodetr: : Nodetr 'record-bin' deleted\r\n0:00:06.562051383 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::~Nodetr()\r\n0:00:06.562087373 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::~Base()\r\n0:00:06.562118103 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::~Base()\r\n0:00:06.562160812 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RecordMgr::DestroyContext()\r\n0:00:06.562204407 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RecordMgr::IsOn()\r\n0:00:06.562235397 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::RecordMgr::IsOn()\r\n0:00:06.562271804 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::RecordMgr::DestroyContext()\r\n0:00:06.562319304 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::RecordTapBintr::UnlinkAll()\r\n0:00:06.562358107 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSource()\r\n0:00:06.562396129 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSource()\r\n0:00:06.562430504 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSource()\r\n0:00:06.562466807 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562499308 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562542902 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: 
DSL::Nodetr::GetGstElement()\r\n0:00:06.562577903 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562613008 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562643633 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562678425 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562715614 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562789834 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSource()\r\n0:00:06.562834574 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.562866345 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.562903065 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.562935878 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.563008952 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:148:UnlinkFromSource: : Unlinking self 'pre-decode-queue' as a Sink from 'pre-decode-tee' Source\r\n0:00:06.563056245 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSource()\r\n0:00:06.563086766 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSource()\r\n0:00:06.563123694 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.563160361 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.563192080 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.563227914 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563262290 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563301041 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563343229 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563380000 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563410730 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563444324 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563477242 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563546045 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.563590681 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.563631203 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.563667818 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.563699433 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.563730423 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'src-parse' from Sink 'pre-decode-tee'\r\n0:00:06.563764122 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.563795112 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.563834123 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.563871259 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.563903291 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.563940896 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563976678 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564016262 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564048241 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564083138 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564113868 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564149337 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564181421 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564251630 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.564301267 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.564334653 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.564373143 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.564405591 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.564437623 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'src-depay' from Sink 'src-parse'\r\n0:00:06.564471530 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.564502676 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.564544344 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\nSegmentation fault (core dumped)\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rjhowell44",
"comment_id": 697152334,
"datetime": 1600840948000,
"masked_author": "username_1",
"text": "@username_0 just fyi, I was finally able to get my live setup running again... should be able to fix the remaining issues tomorrow",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rjhowell44",
"comment_id": 698498251,
"datetime": 1600970407000,
"masked_author": "username_1",
"text": "@username_0 the segmentation fault issue has been resolved.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlerasmus",
"comment_id": 698749086,
"datetime": 1601015516000,
"masked_author": "username_0",
"text": "@username_1 thanks, I will test and confirm",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlerasmus",
"comment_id": 699791319,
"datetime": 1601272978000,
"masked_author": "username_0",
"text": "@username_1, I can confirm that the segmentation fault is fixed. I believe you can close the issue.\r\n\r\nAs a final note regarding the `nvbuf_utils` message, herewith my debug log:\r\n```\r\n 0:00:09.260938622 15019 0x2ddd4e30 INFO DSL src/DslPipelineBintr.cpp:747:HandleXWindowEvents: : Key released = 'q'\r\n key released = q\r\n 0:00:09.261218835 15019 0x2ddd4e30 INFO DSL src/DslServices.h:664:GetMainLoopHandle: : Returning Handle to MainLoop\r\n 0:00:09.261435298 15019 0x2da56760 INFO DSL src/DslServices.cpp:7717:ReturnValueToString: : Result = 0 = DSL_RESULT_SUCCESS\r\n DSL_RESULT_SUCCESS\r\n 0:00:09.262414691 15019 0x2da56760 INFO DSL src/DslBintr.h:223:SetState: : Changing state to 'NULL' for Bintr 'pipeline'\r\n 0:00:09.262549798 15019 0x2da56760 INFO DSL src/DslBintr.h:229:SetState: : State change completed synchronously for Bintr'pipeline'\r\n 0:00:09.262663602 15019 0x2da56760 INFO DSL src/DslPipelineSourcesBintr.cpp:238:UnlinkAll: : Unlinking stream_muxer from src-0\r\n 0:00:09.262707666 15019 0x2da56760 INFO DSL src/DslNodetr.h:607:UnlinkFromSinkMuxer: : Unlinking and releasing requested Sink Pad '0x2ddb3160' for Bintr 'src-0'\r\n 0:00:09.272987496 15019 0x2ddd4d90 WARN rtspsrc gstrtspsrc.c:5999:gst_rtspsrc_try_send:<src-0> send interrupted\r\n 0:00:09.273032340 15019 0x2ddd4d90 WARN rtspsrc gstrtspsrc.c:7673:gst_rtspsrc_close:<src-0> TEARDOWN interrupted\r\n 0:00:09.275547492 15019 0x2da56760 INFO DSL src/DslNodetr.h:626:UnlinkFromSinkMuxer: : Bintr 'src-0' changed state to NULL successfully\r\n 0:00:09.276242974 15019 0x2da56760 INFO DSL src/DslSourceBintr.cpp:881:UnlinkAll: : Stream management disabled for RTSP Source 'src-0'\r\n 0:00:09.276436832 15019 0x2da56760 INFO DSL src/DslNodetr.h:547:UnlinkFromSourceTee: : Unlinking and releasing requested Src Pad '0x2ddb8d00' for Bintr 'pre-decode-queue'\r\n 0:00:09.276853506 15019 0x2da56760 INFO DSL src/DslTapBintr.cpp:71:UnlinkFromSourceTee: : Unlinking and releasing requested Source Pad for 
TapBintr src-0-record-tap\r\n 0:00:09.276958560 15019 0x2da56760 INFO DSL src/DslNodetr.h:547:UnlinkFromSourceTee: : Unlinking and releasing requested Src Pad '0x2ddb8aa0' for Bintr 'src-0-record-tap'\r\n 0:00:09.278355931 15019 0x2da56760 INFO DSL src/DslMultiComponentsBintr.cpp:193:UnlinkAll: : Unlinking sink_bin_tee from window-sink\r\n 0:00:09.278428692 15019 0x2da56760 INFO DSL src/DslNodetr.h:547:UnlinkFromSourceTee: : Unlinking and releasing requested Src Pad '0x2ddb8f60' for Bintr 'window-sink'\r\n nvbuf_utils: dmabuf_fd 1306 mapped entry NOT found\r\n 0:00:09.340796749 15019 0x2da56760 INFO DSL src/DslServices.cpp:6652:ComponentDeleteAll: : All Components deleted successfully\r\n 0:00:09.340959564 15019 0x2da56760 INFO DSL src/DslServices.cpp:3275:PphDeleteAll: : All Pad Probe Handlers deleted successfully\r\n 0:00:09.341097067 15019 0x2da56760 INFO DSL src/DslServices.cpp:2920:OdeTriggerDeleteAll: : All ODE Triggers deleted successfully\r\n 0:00:09.341151755 15019 0x2da56760 INFO DSL src/DslServices.cpp:1994:OdeAreaDeleteAll: : All ODE Areas deleted successfully\r\n 0:00:09.341264257 15019 0x2da56760 INFO DSL src/DslServices.cpp:1864:OdeActionDeleteAll: : All ODE Actions deleted successfully\r\n 0:00:09.341403739 15019 0x2da56760 INFO DSL src/DslServices.cpp:921:DisplayTypeDeleteAll: : All Display Types deleted successfully\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rjhowell44",
"comment_id": 700255013,
"datetime": 1601323808000,
"masked_author": "username_1",
"text": "@username_0 Thanks for the log. I will try and chase this down with Nvidia over the coming weeks",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "rjhowell44",
"comment_id": null,
"datetime": 1601323809000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 44,833 | false | false | 2 | 10 | true |
bkchr/impl-trait-for-tuples | null | 700,604,366 | 2 | {
"number": 2,
"repo": "impl-trait-for-tuples",
"user_login": "bkchr"
} | [
{
"action": "opened",
"author": "blefevre",
"comment_id": null,
"datetime": 1600015326000,
"masked_author": "username_0",
"text": "Custom trait bounds can be added with the `#[tuple_types_custom_trait_bound(NewBound)` attribute.\r\nThis will use `NewBound` as the type for each Tuple component instead of the Trait being implemented.",
"title": "Add support for custom trait bounds",
"type": "issue"
},
{
"action": "created",
"author": "blefevre",
"comment_id": 692129383,
"datetime": 1600096916000,
"masked_author": "username_0",
"text": "Supporting advanced trait bounds was simpler than expected so that's been added.",
"title": null,
"type": "comment"
}
] | 280 | false | false | 1 | 2 | false |
uselagoon/lagoon-charts | uselagoon | 875,053,728 | 252 | {
"number": 252,
"repo": "lagoon-charts",
"user_login": "uselagoon"
} | [
{
"action": "opened",
"author": "smlx",
"comment_id": null,
"datetime": 1620096073000,
"masked_author": "username_0",
"text": "This PR makes the docker-host PVC storage class configurable through the helm chart values. By default we keep the current behaviour of not defining a storage class so as to fall back to cluster default.\r\n\r\nCloses #251",
"title": "Docker host storageclass",
"type": "issue"
},
{
"action": "created",
"author": "smlx",
"comment_id": 832014430,
"datetime": 1620140695000,
"masked_author": "username_0",
"text": "this passed here: https://lagoon-ci.amazeeio.cloud/blue/organizations/jenkins/lagoon/detail/dockerhost-sc/1/pipeline/50\r\n\r\nwill fix CI here https://github.com/uselagoon/lagoon-charts/issues/253",
"title": null,
"type": "comment"
}
] | 411 | false | false | 1 | 2 | false |
microsoft/reverse-proxy | microsoft | 863,191,818 | 926 | null | [
{
"action": "opened",
"author": "thombrink",
"comment_id": null,
"datetime": 1618949010000,
"masked_author": "username_0",
"text": "### Describe the bug\r\nWhen sending a websocket request through the proxy, it fails at `StreamCopier.CopyAsync` in the `CopyResponseBodyAsync` method of the `HttpProxy` class.\r\n\r\n### To Reproduce\r\n1. Start `ReverseProxy.Direct.Sample` and `SampleServer`.\r\n2. Request a websocket connection on the `ReverseProxy.Direct.Sample` and redirect it to the `SampleServer` `/api/websockets` endpoint.\r\n\r\nThe output will show: `Microsoft.AspNetCore.Server.Kestrel.Transport.Sockets: Debug: Connection id \"<connection_id>\" received FIN.` and `Microsoft.AspNetCore.Server.Kestrel.Transport.Sockets: Debug: Connection id \"<connection_id>\" sending FIN because: \"The client closed the connection.\"`.\r\n\r\nWhen `SampleServer` is called directly, it all works like expected.\r\n\r\n### Further technical details\r\n\r\n- Tested on the current main branch\r\n- The used platform: Windows",
"title": "Proxy websocket request exception",
"type": "issue"
},
{
"action": "created",
"author": "Tratcher",
"comment_id": 823638734,
"datetime": 1618957593000,
"masked_author": "username_1",
"text": "Is that error reported from the proxy process or the SampleServer? It claims the client disconnected.\r\n\r\nHere are some diagnostics you can collect and/or share to help narrow it down:\r\n- Full debug logs from the proxy process\r\n- Wireshark/network traces between the proxy and client\r\n- Any client logs you can collect",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Tratcher",
"comment_id": 823670990,
"datetime": 1618962272000,
"masked_author": "username_1",
"text": "I ran this with the TestClient, ReverseProxy.Direct.Sample, and SampleServer and it seems to behave as expected:\r\n\r\nI see these logs:\r\n```\r\ninfo: Microsoft.AspNetCore.Hosting.Diagnostics[1]\r\n Request starting HTTP/1.1 GET https://localhost:5001/api/websockets - -\r\ndbug: Microsoft.AspNetCore.HostFiltering.HostFilteringMiddleware[0]\r\n Wildcard detected, all requests with hosts will be allowed.\r\ndbug: Microsoft.AspNetCore.Routing.Matching.DfaMatcher[1001]\r\n 1 candidate(s) found for the request path '/api/websockets'\r\ndbug: Microsoft.AspNetCore.Routing.Matching.DfaMatcher[1005]\r\n Endpoint '/{**catch-all}' with route pattern '/{**catch-all}' is valid for the request path '/api/websockets'\r\ndbug: Microsoft.AspNetCore.Routing.EndpointRoutingMiddleware[1]\r\n Request matched endpoint '/{**catch-all}'\r\ninfo: Microsoft.AspNetCore.Routing.EndpointMiddleware[0]\r\n Executing endpoint '/{**catch-all}'\r\ninfo: Yarp.ReverseProxy.Service.Proxy.HttpProxy[9]\r\n Proxying to http://localhost:5002/api/websockets?area=xx2\r\ndbug: Microsoft.AspNetCore.Server.Kestrel.Transport.Sockets[6]\r\n Connection id \"0HM84AMCB38HE\" received FIN.\r\ndbug: Microsoft.AspNetCore.Server.Kestrel.Transport.Sockets[7]\r\n Connection id \"0HM84AMCB38HE\" sending FIN because: \"The client closed the connection.\"\r\ndbug: Microsoft.AspNetCore.Server.Kestrel[10]\r\n Connection id \"0HM84AMCB38HE\" disconnecting.\r\ninfo: Microsoft.AspNetCore.Routing.EndpointMiddleware[1]\r\n Executed endpoint '/{**catch-all}'\r\ninfo: Microsoft.AspNetCore.Hosting.Diagnostics[2]\r\n Request finished HTTP/1.1 GET https://localhost:5001/api/websockets - - - 101 - - 1583.0792ms\r\ndbug: Microsoft.AspNetCore.Server.Kestrel[2]\r\n Connection id \"0HM84AMCB38HE\" stopped.\r\n```\r\n\r\nReceiving a FIN is a normal part of the WebSocket close process, not an error condition.\r\n\r\nI don't see any errors reported in StreamCopier, what was the error?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Tratcher",
"comment_id": null,
"datetime": 1620320295000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "Tratcher",
"comment_id": 833686824,
"datetime": 1620320295000,
"masked_author": "username_1",
"text": "Closing. Let us know if you find additional information.",
"title": null,
"type": "comment"
}
] | 3,158 | false | false | 2 | 5 | false |
radix-ui/primitives | radix-ui | 772,318,291 | 353 | {
"number": 353,
"repo": "primitives",
"user_login": "radix-ui"
} | [
{
"action": "opened",
"author": "chaance",
"comment_id": null,
"datetime": 1608569889000,
"masked_author": "username_0",
"text": "- [x] Use a meaningful title for the pull request. Include the name of the package modified.\r\n- [x] Test the change in your own code.\r\n- [ ] Add or edit tests to reflect the change (run `yarn test`).\r\n- [ ] Add or edit Storybook examples to reflect the change (run `yarn dev`).\r\n- [ ] Add documentation to support any new features.\r\n\r\nThis pull request:\r\n\r\n- [ ] Fixes a bug in an existing package\r\n- [ ] Adds additional features/functionality to an existing package\r\n- [ ] Updates documentation or example code\r\n- [x] Other",
"title": "Speed up pre-commit hooks with lint-staged",
"type": "issue"
},
{
"action": "created",
"author": "benoitgrelard",
"comment_id": 749460965,
"datetime": 1608631638000,
"masked_author": "username_1",
"text": "@username_0 So the main speed-up is from the fact that we're only going to re-lint / typecheck the files that are being staged rather than the whole codebase?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chaance",
"comment_id": 749550034,
"datetime": 1608645120000,
"masked_author": "username_0",
"text": "@username_1 See the comment I left in the config file; in order for `tsc` to work properly with `lint-staged` (and for our scripts to run cross platform) one of those keys we have to use the function syntax in a js config file. I kind of hate it too but I couldn't get it to work in package.json without running bash in the command, which wouldn't work on Windows and would exclude contributors. 😕",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chaance",
"comment_id": 749550569,
"datetime": 1608645194000,
"masked_author": "username_0",
"text": "Even better IMO!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "benoitgrelard",
"comment_id": 749555437,
"datetime": 1608645841000,
"masked_author": "username_1",
"text": "We can't really move this to pre-push hook right? otherwise how would the `--fix` work?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chaance",
"comment_id": 749562508,
"datetime": 1608646810000,
"masked_author": "username_0",
"text": "@username_1 I'm thinking we could keep `prettier` and `eslint` in pre-commit. Those actually run pretty fast anyway when only executed on staged changes. `tsc` is significantly slower and I think we can move that to pre-push.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chaance",
"comment_id": 749567970,
"datetime": 1608647475000,
"masked_author": "username_0",
"text": "@username_1 @username_2 Updated a bit. Per my comment above I kept `prretty-quick` and `eslint` in the pre-commit hook, runs only on staged files and is way faster now. `tsc` goes in pre-push because it was much slower, and that let me delete the config file and keep everything in `package.json`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jjenzz",
"comment_id": 749597012,
"datetime": 1608650723000,
"masked_author": "username_2",
"text": "🎉 :+1:",
"title": null,
"type": "comment"
}
] | 1,712 | false | false | 3 | 8 | true |
expo/expo | expo | 755,553,565 | 11,199 | {
"number": 11199,
"repo": "expo",
"user_login": "expo"
} | [
{
"action": "opened",
"author": "brentvatne",
"comment_id": null,
"datetime": 1606937583000,
"masked_author": "username_0",
"text": "# Why\r\n\r\nWhen this was first introduced we had just rolled out `expo install`. This has existed for years at this point, and I doubt anyone still uses a version of expo-cli without it. If they do, they have bigger problems.\r\n\r\n# How\r\n\r\nJust delete the logs.\r\n\r\n# Test Plan\r\n\r\nImport a deprecated module or removed module and notice that the warning is better.",
"title": "[expo] Remove warning about install command missing, everybody has this now",
"type": "issue"
}
] | 359 | false | true | 1 | 1 | false |
Princeton-CDH/startwords | Princeton-CDH | 599,079,433 | 59 | null | [
{
"action": "opened",
"author": "thatbudakguy",
"comment_id": null,
"datetime": 1586805438000,
"masked_author": "username_0",
"text": "[figma link](https://www.figma.com/file/MEt3gfKX4N3WTLPbLGBbGC/Startwords?node-id=225%3A2217)\nthis issue tracks the implementation of the homepage design.",
"title": "As a reader, I want to see featured content on the site's homepage.",
"type": "issue"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 613661374,
"datetime": 1586895709000,
"masked_author": "username_1",
"text": "@username_0 I can't view the \"homepage link\" – gives me \"internal error\" \nAlso, if this is about the homepage, could you change the figma link to show the \"homepage\" page on Figma? – I don't think the \"Tablet\" page on Figma is the appropriate one to look at",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 613667268,
"datetime": 1586896461000,
"masked_author": "username_0",
"text": "updated the links, thanks!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 613680723,
"datetime": 1586898271000,
"masked_author": "username_1",
"text": "@username_0 \nbelow are my notes:\nOn all devices: \n1. The \"read more\" below the issue abstract is not matching the design, it needs to be left-aligned (aligned with the issue abstract) and slightly closer to the last line of the abstract than it is now. \n2. The \"Read more about Startwords.\" needs to be replaced with a red arrow to match the design. \n3. The doi is missing, which needs to be below the authors. \n4. The length of the article previews need to be similar, this will be a problem that we need to fix. \n5. It will be better to have shape for the bottom of the preview text, however I don't think it's a deal breaker now, but would be nice to fix it for the next issue. \n6. I like the interaction of the red arrows! Thank you for doing this!\n\nOn mobile:\n1. The \"Data Beyond Vision\" title is crossing the boundaries, which is causing the arrow to be misplaced as well as an increased space between the title and the authors – \"Vision\" should be wrapped. \n\nOn tablet:\n1. The position and width of the 1. issue number 2. issue abstract, and 3. theme do not match the designs and what was agreed. The three pieces mentioned need to match the x position of the opening paragraph on the left and regarding the width, it will need to match where the opening paragraph on the right ends, i.e. aligning with the x positions of rest of the metadata on this page.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614179921,
"datetime": 1586972353000,
"masked_author": "username_0",
"text": "ok, I _think_ I was able to address everything here. regarding the placement of the arrows when the titles wrap, it's unfortunately pretty hard to control...I was able to get it to not wrap the arrow on its own, but more control than that might be hard to accomplish.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614232358,
"datetime": 1586978493000,
"masked_author": "username_1",
"text": "@username_0 thank you! This looks great, mostly everything is fixed!\n\nA couple of points:\nregarding my earlier comment:\n4. The length of the article previews need to be similar, this will be a problem that we need to fix. – Can we do something about this? \n\nOn mobile:\n1. The \"Data Beyond Vision\" title is crossing the boundaries, which is causing the arrow to be misplaced as well as an increased space between the title and the authors – \"Vision\" should be wrapped. \n- I know you have written above that it's difficult, but it looks like it's fixed! (Although I refreshed once and it was back to everything in one line, crossing the boundary, and refreshed again and it looks like it's fixed :D – not sure why this is happening) – But I think it actually matters, and visually changes the design, would be happy to discuss alternatives. (see the image)\n- Same thing is happening with the DOI crossing the boundary, the two DOIs are too close to each other. (see the image)\n- Also, sorry Nick, I just updated the DOI style here and everywhere else to match what we agreed on for the article page. Could you add \"doi: ....\" to be as part of the link? – You can see this in the design now.\n\n\nDesktop:\n1. Is there a way to prevent the featured content from increasing width on desktop and to look the same as it does on tablet?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614250352,
"datetime": 1586980737000,
"masked_author": "username_0",
"text": "@username_1 can you explain what you mean by \"crossing the boundary\"? in the design the preview text of the two articles are very close to each other; there's just a few pixels in between - the same as with the DOIs here.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614256639,
"datetime": 1586981589000,
"masked_author": "username_1",
"text": "@username_0 that makes sense – I think in development it looks visually even closer. I'm trying on Figma exactly what you currently have on the site, and what's on the site is so much closer, which makes a huge difference. I think one way to solve this issue is by aligning this with the design, where the left opening paragraph starts higher on the page. On the site, left and right start exactly on the same y axis.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614266508,
"datetime": 1586982931000,
"masked_author": "username_0",
"text": "ok, I've:\r\n- added tighter margins for desktop\r\n- pushed the second article preview down a bit to create more visual space\r\n- added `doi:` at the start of the doi links",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614275725,
"datetime": 1586984179000,
"masked_author": "username_1",
"text": "@username_0 Thank you! This looks so much better now!\n\nJust this question:\nregarding my earlier comment:\n4. The length of the article previews need to be similar, this will be a problem that we need to fix. – Can we do something about this? \n\n(Also, just a general question that now I'm wondering about, do you know if this is the fault of Figma that there seems to be more space between the DOIs for instance, than there is on the site? – because the font size is correct) \n\nOtherwise I think I'm done reviewing this issue.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 614296120,
"datetime": 1586987128000,
"masked_author": "username_2",
"text": "I agree with Gissoo that the length of the article previews should be similar if possible, but this is more of an authorial and editorial issue than a design issue. The opening lines that have been selected for both articles right now are really good. The fact that the article on the left has more authors makes that entire block similar in size to the article block on the right with a longer preview text. At least that's a consolation, so it doesn't seem (at least to my eye) like they're mismatched now. But it might be something we want to consider for the editorial style guide, recommending a particular word limit for the opening line(s) article preview.\n\nAlso: there’s something too jagged about the article preview shapes — they don’t quite have the clean angle lines we’ve been going for. Can we experiment with word break properties to make the words conform more to the shape we’re hoping for?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614679790,
"datetime": 1587046463000,
"masked_author": "username_0",
"text": "@username_2 how do you feel about hyphenation to aid in keeping the shape? I turned on automatic hyphenation and increased the slope on the text-shapes, which yielded this (on tablet):\r\n\r\n<img width=\"309\" alt=\"Screen Shot 2020-04-16 at 10 13 09 AM\" src=\"https://user-images.githubusercontent.com/4924494/79466692-eaf0ea80-7fca-11ea-9021-22244f35b109.png\">\r\n\r\nI think we might need to dynamically adjust the slope based on the screen size, because this slope becomes too steep for mobile.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 614685722,
"datetime": 1587047044000,
"masked_author": "username_2",
"text": "@username_0 much cleaner lines! Would you mind taking screenshots of a few different options, showing the entire home page? Auto-hyphenation on and off, how both appear on mobile and tablet, etc. It would be good to get a sense of what our options are.\r\n\r\nNot being able to shape the bottom of the lines (i.e. narrowing back down) seems to change the overall feel we were going for (@username_1 may have thoughts on this). The original design was intersecting polygons, now it's abutting triangles. So it would be good to see the range of options we're looking at right now.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614822837,
"datetime": 1587061797000,
"masked_author": "username_0",
"text": "here's some screenshot using different values for the slope of the content on mobile and tablet, with hyphenation turned on:\r\n# mobile\r\n<img src=\"https://user-images.githubusercontent.com/4924494/79492280-c2c6b300-7fed-11ea-8aa5-3f9be422399c.png\" width=\"50%\" />\r\n<img src=\"https://user-images.githubusercontent.com/4924494/79492286-c4907680-7fed-11ea-92ac-63bd11b24e6c.png\" width=\"50%\" />\r\n<img src=\"https://user-images.githubusercontent.com/4924494/79492292-c65a3a00-7fed-11ea-85d4-945279e51a1c.png\" width=\"50%\" />\r\n\r\n# tablet\r\n\r\n\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 614880445,
"datetime": 1587069273000,
"masked_author": "username_2",
"text": "I think I like the second option best for both mobile and tablet you screenshoot above. It's super content-dependent though — if the author or editor changes a single word in these opening lines, the shape of the polygons and the word breaks will look entirely different. But I say let's go with option two for now!\r\n\r\nAre we waiting on anything else for this issue, or should I close?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614882091,
"datetime": 1587069487000,
"masked_author": "username_0",
"text": "@username_1 mentioned that it would be good to set an absolute limit on article preview text length, but so far I haven't been able to figure out how to do it...it looks like you can set `summaryLength` in the hugo settings, but it didn't have any effect for me.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614896643,
"datetime": 1587071252000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614899882,
"datetime": 1587071664000,
"masked_author": "username_1",
"text": "@username_0 thank you so much for providing these variations!!! I really like the first variation on mobile – but I do understand that the second one looks better on both mobile and tablet in this specific case. (v1 might work better in the long run – not sure). I'm not sure how I feel about the hyphenation here yet. (they do happen to be necessary evils in typography :D)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614900605,
"datetime": 1587071755000,
"masked_author": "username_0",
"text": "just wanted to mention - it's very possible to do the first one on mobile and the second one on tablet. also as you pointed out i think i have the margins wrong on tablet, so these screenshots aren't as applicable as they should be...",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614902853,
"datetime": 1587072076000,
"masked_author": "username_1",
"text": "that is great news @username_0 Thank you so much!!!!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 638451605,
"datetime": 1591217076000,
"masked_author": "username_0",
"text": "OK, with a little more experimenting and some...creative usage of CSS paths, I was able to create the rounded shape. Here are some screenshots for reference, showing with hyphenation turned off/turned on for mobile and tablet. The blue lines of the shaping polygon are shown for reference. We can control the height of the polygon that shapes the text, but it has to be an absolute value - unfortunately it can't scale to the size of the text itself. Currently I have 300px set, which seems to balance OK between the lengths of the two previews.\n\n## tablet\nhypens ON (left), hyphens OFF (right)\n\n<img src=\"https://images.zenhubusercontent.com/5aa82c4b4b5806bc2bcab4fa/c21d2f9f-6ccc-47ce-896e-1502cdfb999a\" width=\"200\">\n<img src=\"https://images.zenhubusercontent.com/5aa82c4b4b5806bc2bcab4fa/1a6b937e-c7cf-4f51-8bfa-0a427f06f3a8\" width=\"200\">\n\n## mobile\nhypens ON (left), hyphens OFF (right)\n\n<img src=\"https://images.zenhubusercontent.com/5aa82c4b4b5806bc2bcab4fa/4ec97248-ae51-4089-aefb-dd7c22811df2\" width=\"200\">\n<img src=\"https://images.zenhubusercontent.com/5aa82c4b4b5806bc2bcab4fa/0b6776ba-1ef5-4d75-9dab-582b56a759e0\" width=\"200\">",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 638880747,
"datetime": 1591280494000,
"masked_author": "username_1",
"text": "@username_0 Thank you so much for these!!! I love how these look so much more than how they currently are. I love them when there is **no hyphen**, mainly because I'm noticing that in the versions without the hyphens the words are also **not** breaking at the end of a line? Is that true? It's much more easier to read in the version without the hyphen, because an entire word is wrapped to the next line. The shape of the entire paragraph also looks much better visually in the versions without the hyphens. They especially look better on tablet than mobile. But mobile is fine too.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 639001545,
"datetime": 1591292276000,
"masked_author": "username_2",
"text": "This is so cool, brilliant solution @username_0! I also agree with @username_1 that the **hyphens off** version is best. There are just so many broken words on mobile that it messes with the readability. And the shapes look just as good without word breaks.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 639090147,
"datetime": 1591301442000,
"masked_author": "username_0",
"text": "The shaping is still quite restrictive, and some content (including the intro for rebecca's essay) just doesn't naturally fit into the shape at all. Right now, the way I can see to implement this is:\n\n- measure the number of characters that can reasonably fit into the shape via some trials \n- allow writers to choose the intro to their piece using `<!--more-->`, but truncate it if it has too many characters and add an ellipsis\n- set an exact height for both the preview text and its outside shape so that the layout can't move around or break\n\nthis approach means that some (maybe most) previews will end up getting truncated, and it will likely require some trial and error on the part of the editor to find the right preview text. if the preview is too long it will be truncated, and if it's too short it will not fill up the shape and leave a bunch of empty space. however, the layout will be stable across various screens.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 641551747,
"datetime": 1591734551000,
"masked_author": "username_2",
"text": "@username_0 FWIW, I don't think it's a bad idea to have that much of the editor's and author's attention on the opening lines to get them to fit perfectly. It's an interesting feature, a constraint, to have editor and author collaborate on working within that format.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 641587327,
"datetime": 1591737559000,
"masked_author": "username_0",
"text": "@username_2 thanks, this input is helpful. based on you and @rlskoeser's comments I think I can make one or two more revisions and then we can close this out.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 644928397,
"datetime": 1592331415000,
"masked_author": "username_0",
"text": "OK, I've deployed the latest version of this and updated the testing notes at the top. Would like design testing from @username_1 and a final test from @username_2 on fitting the article preview quotes into the new shapes.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 645543511,
"datetime": 1592418321000,
"masked_author": "username_1",
"text": "@username_0 This looks amazing! Thank you for working on it!\n\nThe notes below apply to both the homepage and the single issue page.\n\nOn tablet and mobile: \n- The spacing between the article opening paragraph and the article title does not match the design (should be slightly less) (It looks very close to the design on desktop)\n\nOn all devices:\n- The spacing between the article opening paragraph and the article title should be the same for both articles – currently the spacing is much more on the left side. \n- The spacing between the article title and the first author does not match the designs\n\n\n\nOn mobile: \n- The article title on the left is getting cut off:\n![Screen Shot 2020-06-17 at 2.08.28 PM.png]",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 645556276,
"datetime": 1592419796000,
"masked_author": "username_0",
"text": "I can't get this to happen when emulating an iPhone SE. Not sure how to test a solution...",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 645600383,
"datetime": 1592424889000,
"masked_author": "username_1",
"text": "- yes, me either.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 645660315,
"datetime": 1592433110000,
"masked_author": "username_2",
"text": "This looks fantastic, @username_0 — thank you so much for all your hard work. This is clearly the trickiest page of the entire project! Just two additional thoughts:\r\n\r\nThe \"read more\" link doesn't work after the preview of the issue abstract text\r\n\r\nIs there any way at all we can make the article opening line text shapes accommodate just a few more words? I understand you've gone over this like a hundred times but I've been playing with some edits to the text locally and RM's literally perfect line is just 35 words but the last two words cut off on mobile, reading: `When I was growing up I liked to read about dying children. I’m not talking about the Victorian orphan kind of dying either, not the dying of storybooks, but children who were` I ask because 33 words are pretty restrictive when it comes to writing punchy opening lines. Allowing for 40ish words max would probably be enough. Or even simply finding a way to fit those last two words of RM's opening lines in, and having that limit determine the length of future contributors' opening lines. We could call it the RM rule.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 646255903,
"datetime": 1592507719000,
"masked_author": "username_0",
"text": "This is really complicated; if there are still problems we might need to zoom. Let me know how it looks to you with the latest changes.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 646261116,
"datetime": 1592508354000,
"masked_author": "username_2",
"text": "@username_0 that fully fit RM's opening lines into the homepage preview on mobile for me. Nice work.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 647538180,
"datetime": 1592834418000,
"masked_author": "username_1",
"text": "@username_0 This looks great!! Thank you so much for all of your work!!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 647551837,
"datetime": 1592835824000,
"masked_author": "username_1",
"text": "- @username_0 I think this is happening on my phone because of the \"The article title on the left is getting cut off\" screenshot I had attached earlier, above, _I think_. It's fine on desktop and tablet.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "gissoo",
"comment_id": null,
"datetime": 1592835844000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 15,483 | false | false | 3 | 36 | true |
graceshopper-codename/graceshopper | graceshopper-codename | 552,053,362 | 117 | {
"number": 117,
"repo": "graceshopper",
"user_login": "graceshopper-codename"
} | [
{
"action": "opened",
"author": "nurpny",
"comment_id": null,
"datetime": 1579492847000,
"masked_author": "username_0",
"text": "Updated cart post api request to be called via react redux store instead of directly from the react components\r\n\r\n### Assignee Tasks\r\n\r\n* [ ] added unit tests (or none needed)\r\n* [ ] written relevant docs (or none needed)\r\n* [ ] referenced any relevant issues (or none exist)\r\n\r\n### Guidelines\r\n\r\nPlease add a description of this Pull Request's motivation, scope, outstanding issues or potential alternatives, reasoning behind the current solution, and any other relevant information for posterity.\r\n\r\n---\r\n\r\n_Your PR Notes Here_",
"title": "Refactor: #116: ",
"type": "issue"
},
{
"action": "created",
"author": "nurpny",
"comment_id": 576127101,
"datetime": 1579501905000,
"masked_author": "username_0",
"text": "Combining with another update",
"title": null,
"type": "comment"
}
] | 558 | false | false | 1 | 2 | false |
mdn/browser-compat-data | mdn | 762,571,245 | 7,661 | {
"number": 7661,
"repo": "browser-compat-data",
"user_login": "mdn"
} | [
{
"action": "opened",
"author": "andybons",
"comment_id": null,
"datetime": 1607703812000,
"masked_author": "username_0",
"text": "This change updates the availability of the Storage Access API in Chrome\r\n\r\n+ [Chrome bug](https://bugs.chromium.org/p/chromium/issues/detail?id=989663)\r\n+ [Chrome Platform Status](https://www.chromestatus.com/feature/5612590694662144)\r\n\r\nThe [first commit](https://chromium.googlesource.com/chromium/src.git/+/da12cf8726dc8414f0d61624afebff46b83c8f1a) was in [version 78](https://chromium.googlesource.com/chromium/src/+/da12cf8726dc8414f0d61624afebff46b83c8f1a/chrome/VERSION), not 85, and the flag remains off by default.\r\n\r\nI’m not sure how MS Edge’s versions work and when it was turned on by default, so that version hasn’t been altered.\r\n\r\nFixes #7510",
"title": "Update Storage Access API availability on Chrome",
"type": "issue"
},
{
"action": "created",
"author": "andybons",
"comment_id": 748355339,
"datetime": 1608331522000,
"masked_author": "username_0",
"text": "Sorry for delay, here. Holidays in U.S. are slowing me down 🎄 \r\n\r\nWill follow up with those changes!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "andybons",
"comment_id": 748556467,
"datetime": 1608433777000,
"masked_author": "username_0",
"text": "Done",
"title": null,
"type": "comment"
}
] | 762 | false | false | 1 | 3 | false |
MicrosoftDocs/iis-docs | MicrosoftDocs | 798,833,242 | 543 | {
"number": 543,
"repo": "iis-docs",
"user_login": "MicrosoftDocs"
} | [
{
"action": "opened",
"author": "gewarren",
"comment_id": null,
"datetime": 1612227724000,
"masked_author": "username_0",
"text": "- Need to find a spot for these in the TOC...please help!\r\n- Okay to have \"unknown\" as API location in metadata?",
"title": "Port articles from previous-versions for APIScan",
"type": "issue"
},
{
"action": "created",
"author": "opbld31",
"comment_id": 771271242,
"datetime": 1612227838000,
"masked_author": "username_1",
"text": "Docs Build status updates of commit _[f038b48](https://github.com/username_0/iis-docs/commits/f038b4850f60dea4eb17867c4281ce42b39a2ee2)_: \n\n### :white_check_mark: Validation status: passed\r\n\r\n\r\nFile | Status | Preview URL | Details\r\n---- | ------ | ----------- | -------\r\n[iis/extensions/entry-point-functions/getextensionversion.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/getextensionversion.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/getextensionversion?branch=pr-en-us-543) | [Details](#user-content-af1e9225-6955aecf)\r\n[iis/extensions/entry-point-functions/httpextensionproc.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/httpextensionproc.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/httpextensionproc?branch=pr-en-us-543) | [Details](#user-content-375b9f63-6955aecf)\r\n[iis/extensions/entry-point-functions/terminateextension.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/terminateextension.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/terminateextension?branch=pr-en-us-543) | [Details](#user-content-999e24f1-6955aecf)\r\n\r\n<a id=\"af1e9225-6955aecf\"></a>\r\n### [iis/extensions/entry-point-functions/getextensionversion.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/getextensionversion.md)\r\n - **Line 2, Column 1**: **[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\n<a id=\"375b9f63-6955aecf\"></a>\r\n### [iis/extensions/entry-point-functions/httpextensionproc.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/httpextensionproc.md)\r\n - **Line 2, Column 1**: 
**[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n - **Line 34, Column 1**: **[Suggestion-table-syntax-invalid]** `````Table syntax is invalid. Ensure your table includes a header and is surrounded by empty lines. NOTE: This Suggestion will become a Warning on 1/29/21.`````\r\n\r\n<a id=\"999e24f1-6955aecf\"></a>\r\n### [iis/extensions/entry-point-functions/terminateextension.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/terminateextension.md)\r\n - **Line 2, Column 1**: **[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\nFor more details, please refer to the [build report](https://opbuildstorageprod.blob.core.windows.net/report/2021%5C2%5C2%5C90f071cf-c002-b629-5e11-5bc968bb67ca%5CPullRequest%5C202102020102087563-543%5Cworkflow_report.html?sv=2016-05-31&sr=b&sig=D9HxQ%2FURQzykb88xuWiqRyizJ8Wljm4lbMO1cpaEFeQ%3D&st=2021-02-02T00%3A58%3A57Z&se=2021-03-05T01%3A03%3A57Z&sp=r).\r\n\r\n**Note:** Broken links written as relative paths are included in the above build report. For broken links written as absolute paths or external URLs, see the [broken link report](https://ops.microsoft.com/#/repos/90f071cf-c002-b629-5e11-5bc968bb67ca?tabName=brokenlinks).\r\n\r\nFor any questions, please:<ul><li>Try searching in the <a href=\"https://review.docs.microsoft.com/en-us/search/?search=&category=All&scope=help-docs&category=All&branch=master\">Docs contributor and Admin Guide</a></li><li>See the <a href=\"https://review.docs.microsoft.com/en-us/help/onboard/faq?branch=master\">frequently asked questions</a></li><li>Post your question in the <a href=\"https://teams.microsoft.com/l/channel/19%3a7ecffca1166a4a3986fed528cf0870ee%40thread.skype/General?groupId=de9ddba4-2574-4830-87ed-41668c07a1ca&tenantId=72f98bf-86f1-41af-91ab-2d7cd011db47\">Docs support channel</a></li></ul>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "opbld31",
"comment_id": 771273146,
"datetime": 1612228169000,
"masked_author": "username_1",
"text": "Docs Build status updates of commit _[8c3f2db](https://github.com/username_0/iis-docs/commits/8c3f2db45464d58a71ae68d863e96a3152b50723)_: \n\n### :white_check_mark: Validation status: passed\r\n\r\n\r\nFile | Status | Preview URL | Details\r\n---- | ------ | ----------- | -------\r\n[iis/extensions/entry-point-functions/getextensionversion.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/getextensionversion.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/getextensionversion?branch=pr-en-us-543) | [Details](#user-content-af1e9225-a5b51bc0)\r\n[iis/extensions/entry-point-functions/httpextensionproc.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/httpextensionproc.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/httpextensionproc?branch=pr-en-us-543) | [Details](#user-content-375b9f63-a5b51bc0)\r\n[iis/extensions/entry-point-functions/terminateextension.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/terminateextension.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/terminateextension?branch=pr-en-us-543) | [Details](#user-content-999e24f1-a5b51bc0)\r\n\r\n<a id=\"af1e9225-a5b51bc0\"></a>\r\n### [iis/extensions/entry-point-functions/getextensionversion.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/getextensionversion.md)\r\n - **Line 2, Column 1**: **[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\n<a id=\"375b9f63-a5b51bc0\"></a>\r\n### [iis/extensions/entry-point-functions/httpextensionproc.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/httpextensionproc.md)\r\n - **Line 2, Column 1**: 
**[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\n<a id=\"999e24f1-a5b51bc0\"></a>\r\n### [iis/extensions/entry-point-functions/terminateextension.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/terminateextension.md)\r\n - **Line 2, Column 1**: **[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\nFor more details, please refer to the [build report](https://opbuildstorageprod.blob.core.windows.net/report/2021%5C2%5C2%5C90f071cf-c002-b629-5e11-5bc968bb67ca%5CPullRequest%5C202102020108135682-543%5Cworkflow_report.html?sv=2016-05-31&sr=b&sig=5PLDERG4SLwE0YZxALzx%2F53ey7Jq37zoaTUudGdFKdQ%3D&st=2021-02-02T01%3A04%3A28Z&se=2021-03-05T01%3A09%3A28Z&sp=r).\r\n\r\n**Note:** Broken links written as relative paths are included in the above build report. For broken links written as absolute paths or external URLs, see the [broken link report](https://ops.microsoft.com/#/repos/90f071cf-c002-b629-5e11-5bc968bb67ca?tabName=brokenlinks).\r\n\r\nFor any questions, please:<ul><li>Try searching in the <a href=\"https://review.docs.microsoft.com/en-us/search/?search=&category=All&scope=help-docs&category=All&branch=master\">Docs contributor and Admin Guide</a></li><li>See the <a href=\"https://review.docs.microsoft.com/en-us/help/onboard/faq?branch=master\">frequently asked questions</a></li><li>Post your question in the <a href=\"https://teams.microsoft.com/l/channel/19%3a7ecffca1166a4a3986fed528cf0870ee%40thread.skype/General?groupId=de9ddba4-2574-4830-87ed-41668c07a1ca&tenantId=72f98bf-86f1-41af-91ab-2d7cd011db47\">Docs support channel</a></li></ul>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Rick-Anderson",
"comment_id": 772892171,
"datetime": 1612394099000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gewarren",
"comment_id": 776899917,
"datetime": 1612979943000,
"masked_author": "username_0",
"text": "Closing this PR as the metadata was added to the previous-versions repo instead. cc @LBynum77.",
"title": null,
"type": "comment"
}
] | 7,694 | false | false | 3 | 5 | true |
XCPCIO/uptime | XCPCIO | 792,567,253 | 19 | null | [
{
"action": "opened",
"author": "Dup4",
"comment_id": null,
"datetime": 1611411020000,
"masked_author": "username_0",
"text": "In [`39bd27d`](https://github.com/XCPCIO/uptime/commit/39bd27d86abce95aa014a4c3082a629f4e55aa93\n), XCPCIO (https://xcpcio.com) was **down**:\n- HTTP code: 0\n- Response time: 0 ms",
"title": "🛑 XCPCIO is down",
"type": "issue"
},
{
"action": "created",
"author": "Dup4",
"comment_id": 766091510,
"datetime": 1611413700000,
"masked_author": "username_0",
"text": "**Resolved:** XCPCIO is back up in [`1a6dcce`](https://github.com/XCPCIO/uptime/commit/1a6dcce833046cf15b1f46dc64949879922d6344\n).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Dup4",
"comment_id": null,
"datetime": 1611413701000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 307 | false | false | 1 | 3 | false |
dimagi/commcare-android | dimagi | 883,975,823 | 2,469 | {
"number": 2469,
"repo": "commcare-android",
"user_login": "dimagi"
} | [
{
"action": "opened",
"author": "ShivamPokhriyal",
"comment_id": null,
"datetime": 1620644599000,
"masked_author": "username_0",
"text": "cross-request: https://github.com/dimagi/commcare-core/pull/988",
"title": "Dummy commit for triggering test",
"type": "issue"
},
{
"action": "created",
"author": "ShivamPokhriyal",
"comment_id": 836597729,
"datetime": 1620647248000,
"masked_author": "username_0",
"text": "<!--\n 0 Errors\n 8 Warnings: This PR does not contain any J...\n 0 Messages\n 0 Markdowns\n-->\n<table>\n <thead>\n <tr>\n <th width=\"50\"></th>\n <th width=\"100%\" data-danger-table=\"true\" data-kind=\"Warning\">\n 8 Warnings\n </th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\">This PR does not contain any JIRA issue keys in the PR title or commit messages (e.g. KEY-123)</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L13\">app/AndroidManifest.xml#L13</a> - This namespace declaration is redundant</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L16\">app/AndroidManifest.xml#L16</a> - This namespace declaration is redundant</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L19\">app/AndroidManifest.xml#L19</a> - This namespace declaration is redundant</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L81\">app/AndroidManifest.xml#L81</a> - Attribute <code>usesCleartextTraffic</code> is only used in API level 23 and higher (current min is 16)</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L357\">app/AndroidManifest.xml#L357</a> - Exported receiver does not require permission</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a 
href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L364\">app/AndroidManifest.xml#L364</a> - Exported receiver does not require permission</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L371\">app/AndroidManifest.xml#L371</a> - Exported receiver does not require permission</td>\n </tr>\n </tbody>\n</table>\n\n<p align=\"right\" data-meta=\"generated_by_danger\">\n Generated by :no_entry_sign: <a href=\"https://danger.systems/\">Danger</a>\n</p>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dimagimon",
"comment_id": 836615246,
"datetime": 1620648358000,
"masked_author": "username_1",
"text": "[](https://jenkins.dimagi.com/job/commcare-android-pr-job-builder/462390/)",
"title": null,
"type": "comment"
}
] | 2,908 | false | false | 2 | 3 | false |
alibaba/ice | alibaba | 634,360,820 | 3,266 | null | [
{
"action": "opened",
"author": "imsobear",
"comment_id": null,
"datetime": 1591604956000,
"masked_author": "username_0",
"text": "- 优化 `@ice/spec` 规则 https://github.com/ice-lab/spec/issues/12\r\n- 模板里的 eslint 参数改为 true",
"title": "eslint 规则优化",
"type": "issue"
},
{
"action": "created",
"author": "chenbin92",
"comment_id": 644773733,
"datetime": 1592314908000,
"masked_author": "username_1",
"text": "结论:作为框架中台的插件沉淀,包含 rax 和 ice 的两部分。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chenbin92",
"comment_id": 650711467,
"datetime": 1593329666000,
"masked_author": "username_1",
"text": "进度同步:eslint 规则 @chenhebing 基于 @ice/spec 提供完整的 RFC 进行讨论 https://github.com/ice-lab/spec/issues/created_by/chenhebing",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "imsobear",
"comment_id": null,
"datetime": 1604376745000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 234 | false | false | 2 | 4 | false |
serverless-nextjs/serverless-next.js | serverless-nextjs | 826,340,660 | 938 | null | [
{
"action": "opened",
"author": "sebastianmacarescu",
"comment_id": null,
"datetime": 1615311212000,
"masked_author": "username_0",
"text": "### Describe the bug\r\n\r\nWe use serverless-nextjs-plugin@2.5.1 with the following routes\r\n`\r\n- src: index\r\n path: /{slug}\r\n cors: true\r\n request:\r\n parameters:\r\n paths:\r\n slug: true\r\n- src: index\r\n path: /{slug}/{page}\r\n cors: true\r\n request:\r\n parameters:\r\n paths:\r\n slug: true\r\n page: true\r\n`\r\n\r\nWhen trying to deploy we get:\r\n`A sibling ({slug}) of this resource already has a variable path part -- only one is allowed (Service: AmazonApiGateway; Status Code: 400; Error Code: BadRequestException; Request ID: fecdce7a-751a-4c27-b16d-0ebc521b563a; Proxy: null)`\r\n\r\nThis happens because the next.js creates a custom page _error.js to handle all errors and the serverless-nextjs-plugin will create a catch-all proxy resource.\r\n\r\n### Actual behavior\r\n\r\nGot error: `A sibling ({slug}) of this resource already has a variable path part -- only one is allowed (Service: AmazonApiGateway; Status Code: 400; Error Code: BadRequestException; Request ID: fecdce7a-751a-4c27-b16d-0ebc521b563a; Proxy: null)`\r\n\r\nIf I deploy without above routes, manually delete the proxy resources from AWS console then deploy again it works.\r\n\r\n### Expected behavior\r\n\r\nWe should have an option to not create that catch all proxy resource for the 404 pages.\r\n\r\n\r\n### Steps to reproduce\r\n\r\nCreate an empty serverless project using serverless-nextjs-plugin@2.5.1. \r\nConfigure the above routes and try to deploy\r\n\r\n### Screenshots/Code/Logs\r\n<!-- If applicable, add screenshots or a minimal repro (e.g code or configuration snippet or repository) to help explain your problem. If you have a runtime issue from Lambda/CloudFront, please check CloudWatch logs (note that Lambda@Edge logs are in a region closest to where you access CloudFront - NOT necessarily in `us-east-1` where the original Lambda is created) and post any logs or stacktraces if possible. 
See here for how to check logs: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/lambda-edge-testing-debugging.html#lambda-edge-identifying-function-errors. If you have a build or deploy issue, please run with serverless --debug and post the logs. Please also post your serverless.yml. -->\r\n\r\n### Versions\r\nserverless-nextjs-plugin@2.5.1\r\nserverless -v\r\nFramework Core: 1.83.2\r\nPlugin: 3.8.4\r\nSDK: 2.3.2\r\nComponents: 2.34.9",
"title": "serverless-nextjs-plugin ApiGatewayResourceProxyVar resource already has a variable path part",
"type": "issue"
}
] | 2,392 | false | false | 1 | 1 | false |
ohmyzsh/ohmyzsh | ohmyzsh | 799,371,764 | 9,631 | null | [
{
"action": "opened",
"author": "rustiever",
"comment_id": null,
"datetime": 1612278883000,
"masked_author": "username_0",
"text": "<!--\r\nFill this out before posting. You can delete irrelevant sections, but\r\nan issue where no sections have been filled will be deleted without comment.\r\n-->\r\n\r\n**Describe the bug**\r\nWhen I issue `vi .../` I can't do tab-completion\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior, for example:\r\n1. type this `vi .../` \r\n2. press <tab> key\r\n3. there will be visual bell (in my case)\r\n\r\n**Expected behavior**\r\nIt should give the tab-completion\r\n\r\n\r\n\r\n\r\n - OS / Distro: [macOS]\r\n - Latest ohmyzsh update?: [Yes]\r\n - ZSH Version: [e.g. 5.8]\r\n - Terminal emulator: [Alacritty]",
"title": "relative path doesn't get tab-completion",
"type": "issue"
},
{
"action": "created",
"author": "mcornella",
"comment_id": 772051128,
"datetime": 1612304762000,
"masked_author": "username_1",
"text": "You can use the globalias plugin to automatically expand the dots to their value. After they are expanded to `../..` the path completion will work correctly.\r\n\r\nThere's also [this plugin](https://github.com/knu/zsh-manydots-magic/blob/master/manydots-magic) that does the same thing only for dots, and my [less tested alternative](https://github.com/username_1/dotfiles/blob/main/ohmyzsh-custom/magic-dot.zsh) based off [this SO answer](https://stackoverflow.com/questions/23456873/multi-dot-paths-in-zsh-like-cd/23471915#23471915).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mcornella",
"comment_id": null,
"datetime": 1612304762000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "rustiever",
"comment_id": 772304225,
"datetime": 1612338029000,
"masked_author": "username_0",
"text": "Thanks, but I couldn't set the plugin(many dots-magic) can you please help me",
"title": null,
"type": "comment"
}
] | 1,185 | false | false | 2 | 4 | true |
Nienormalny/twitch-enhancer | null | 708,490,019 | 14 | null | [
{
"action": "opened",
"author": "Nienormalny",
"comment_id": null,
"datetime": 1600983377000,
"masked_author": "username_0",
"text": "## Step 1:\r\n\r\nAdd a image file to the directory: `src -> assets -> images -> emotes`.\r\nIt can be **.gif** or **.png**. (NOT JPG!)\r\n\r\n## Step 2:\r\n\r\nAdd new **case** to the col inside: `src -> utils -> emotes.js`",
"title": "Add FeelsOkayMan icon for live chat",
"type": "issue"
},
{
"action": "closed",
"author": "Nienormalny",
"comment_id": null,
"datetime": 1601568229000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 210 | false | false | 1 | 2 | false |
openmainframeproject/cobol-programming-course | openmainframeproject | 637,134,564 | 138 | {
"number": 138,
"repo": "cobol-programming-course",
"user_login": "openmainframeproject"
} | [
{
"action": "opened",
"author": "jellypuno",
"comment_id": null,
"datetime": 1591890688000,
"masked_author": "username_0",
"text": "",
"title": "Adding Source Modules for COBOL Challenges: Debugging",
"type": "issue"
},
{
"action": "created",
"author": "zeibura",
"comment_id": 642765739,
"datetime": 1591890982000,
"masked_author": "username_1",
"text": "Looks fine, have created #139 for the instructions and pics.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zeibura",
"comment_id": 642769633,
"datetime": 1591891149000,
"masked_author": "username_1",
"text": "@username_0 what's the difference between the two programs, CBL0106 and CBL0106C? The instructions I provided only refer to the first one.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jellypuno",
"comment_id": 642774067,
"datetime": 1591891390000,
"masked_author": "username_0",
"text": "@username_1 CBL0106 is the program that needs debugging. CBL0106C is the fixed version in case the students are stuck and they need help.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MikeBauerCA",
"comment_id": 643463948,
"datetime": 1591992759000,
"masked_author": "username_2",
"text": "I requested a small change on #139 but see no harm in merging this PR first.",
"title": null,
"type": "comment"
}
] | 407 | false | false | 3 | 5 | true |
async-rs/async-std | async-rs | 822,927,832 | 963 | null | [
{
"action": "opened",
"author": "svmk",
"comment_id": null,
"datetime": 1614937879000,
"masked_author": "username_0",
"text": "Hello! To reproduce this bug you may use this project: \r\nhttps://github.com/username_0/test-async-std-bug\r\n\r\nDownload it and exeucte file `run-test.sh`\r\n\r\nSteps to reproduce:\r\n1. Download remote url `https://www.sec.gov/Archives/edgar/full-index/1994/QTR1/company.idx` by using `reqwest` library\r\n2. Save it to file while downloading by using `async_std::fs::File`\r\n3. Download remote url `https://www.sec.gov/Archives/edgar/full-index/1994/QTR1/company.idx` using `wget`\r\n4. Compare files. It's differ!\r\n\r\nExpected result:\r\nFiles must be equal.\r\n\r\nCode to reproduce this bug:\r\n```rust\r\nuse futures::stream::StreamExt;\r\nuse async_std::fs::File;\r\nuse futures::AsyncWriteExt;\r\nuse std::{fs::File as SyncFile, io::Write};\r\n\r\n#[tokio::main]\r\nasync fn main() -> Result<(), Box<dyn std::error::Error>> {\r\n let url = \"https://www.sec.gov/Archives/edgar/full-index/1994/QTR1/company.idx\";\r\n let client = reqwest::Client::new();\r\n let response = client.get(url).send().await?;\r\n let mut response_body = response.bytes_stream();\r\n let mut async_file = File::create(\"company.async.idx\").await?;\r\n let mut sync_file = SyncFile::create(\"company.sync.idx\")?;\r\n while let Some(buffer) = response_body.next().await {\r\n let buffer = buffer?;\r\n let _ = async_file.write(&buffer).await?;\r\n async_file.sync_all().await?;\r\n sync_file.write(&buffer)?;\r\n }\r\n return Ok(());\r\n}\r\n```",
"title": "File was corrupted by async_std::fs::File",
"type": "issue"
},
{
"action": "created",
"author": "jbr",
"comment_id": 791784610,
"datetime": 1614986450000,
"masked_author": "username_1",
"text": "I believe the issue here is the same as https://github.com/async-rs/async-std/issues/939#issuecomment-762092739 — you're doing `async_file.write(&buffer).await` ([Write::write](https://docs.rs/async-std/1.9.0/async_std/io/trait.Write.html#method.write)) instead of `async_file.write_all(&buffer)`([Write::write_all](https://docs.rs/async-std/1.9.0/async_std/io/trait.Write.html#method.write_all) — write doesn't necessarily write all of the contents. If you log the return value from write and compare it to the buffer.len(), there will be occasional discrepancies",
"title": null,
"type": "comment"
}
] | 1,971 | false | false | 2 | 2 | true |
solo-io/service-mesh-hub | solo-io | 722,471,964 | 1,018 | null | [
{
"action": "opened",
"author": "ilackarms",
"comment_id": null,
"datetime": 1602778100000,
"masked_author": "username_0",
"text": "Currently SMH is opinionated in how it derives mesh configuration (via translation) and provides no extensibility layer. It is currently impossible to extend SMH without modifying SMH code directly.\r\n\r\n**Describe the solution you'd like**\r\n It would be preferable for SMH users to have the ability to extend the core SMH implementation with hooks to modify the generated output resources produced by smh (i.e. istio CRDs).\r\n\r\nI propose a solution whereby SMH makes GRPC Client requests to a user-defined **Extensions** server, which has the ability to patch and append resources to the SMH output snapshot before it is written to storage. The extension server should also have the ability to notify SMH when a resync is desired (i.e. when a third-party resource changes).\r\n\r\n**Additional context**\r\nThis has relevance both to 3rd party extensions as well as Solo.io's own Enterprise extensions for SMH.",
"title": "SMH Extensions",
"type": "issue"
}
] | 902 | false | true | 1 | 1 | false |
netease-im/NIM_iOS_UIKit | netease-im | 769,459,490 | 301 | null | [
{
"action": "opened",
"author": "396514268",
"comment_id": null,
"datetime": 1608174877000,
"masked_author": "username_0",
"text": "### 问题检查清单\r\n\r\n感谢您向我们提出问题,在提交问题之前,请确认以下事宜:\r\n\r\n- [ ] 我已经阅读了组件文档: \r\n\r\n [《项目结构介绍》](./Documents/nim_arch.md)\r\n\r\n [《界面排版自定义》](./Documents/nim_custom_ui.md)\r\n\r\n [《新消息类型集成》](./Documents/nim_custom_message.md)\r\n\r\n [《用户信息自定义》](./Documents/nim_userinfo.md)\r\n\r\n [《机器人消息排版指南》](./Documents/nim_robot.md)\r\n\r\n\r\n- [ ] 我已经阅读了[常见问题](http://dev.netease.im/docs/product/%E9%80%9A%E7%94%A8/%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98?#iOS%E7%89%88SDK), 但是找不到相关解决方案。\r\n\r\n- [ ] 我已经阅读了[SDK 文档](http://dev.netease.im/docs/product/IM%E5%8D%B3%E6%97%B6%E9%80%9A%E8%AE%AF/SDK%E5%BC%80%E5%8F%91%E9%9B%86%E6%88%90/iOS%E5%BC%80%E5%8F%91%E9%9B%86%E6%88%90/%E6%A6%82%E8%A6%81%E4%BB%8B%E7%BB%8D), 但是找不到相关解决方案。\r\n \r\n- [ ] 我已经搜索了[当前问题](https://github.com/netease-im/NIM_iOS_UIKit/issues?utf8=✓&q=is%3Aissue), 但是找不到相关解决方案。\r\n\r\n\r\n### 问题描述\r\n\r\n#### 问题现象\r\n\r\n[描述具体问题表现]\r\n\r\n#### 是否重现\r\n\r\n[重现此问题的步骤]\r\n\r\n#### 其它备注\r\n\r\n[在这里添加任何其他相关细节]\r\n初始化网易云信会弹出允许使用本地网络的弹窗,注释掉就不会弹出,请问有同遇到这个问题的吗",
"title": "初始化网易云信会弹出允许使用本地网络的弹窗",
"type": "issue"
},
{
"action": "closed",
"author": "396514268",
"comment_id": null,
"datetime": 1608796917000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 944 | false | false | 1 | 2 | false |
Public-Health-Bioinformatics/DataHarmonizer | Public-Health-Bioinformatics | 643,287,942 | 76 | null | [
{
"action": "opened",
"author": "cmrn-rhi",
"comment_id": null,
"datetime": 1592851677000,
"masked_author": "username_0",
"text": "This may have been discussed in the past (my apologies if I touch on things that have already been addressed) but can he include a 'help'/'support' section that directs the users to:\r\n- The SOP\r\n- Where to make issue/term requests (either linking to GitHub Issues or to a curator email)\r\n\r\nThere is now a [published version of the SOP](https://docs.google.com/document/d/e/2PACX-1vR4UkqrLaj1-9jxmrNk9mZ4S4Siim8onPHqgdXKd9m1lOroXmekClfPsXlqgFDio1rWZW7lHArSAbOg/pub) that we can link too (it will update every time we edit the SOP gdoc), but perhaps we should also have a pdf copy that users can use if they are offline? That latter would certainly require more maintenance - we could include a note at the top that it may not be the latest version and to go to the published version if possible.",
"title": "'Help'/'Support' Section in Validator",
"type": "issue"
},
{
"action": "created",
"author": "cmrn-rhi",
"comment_id": 647733653,
"datetime": 1592855047000,
"masked_author": "username_0",
"text": "I see that there is a pdf version of the SOP - so ignore that recommendation (although perhaps we should still add the published link somewhere).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ivansg44",
"comment_id": 647795897,
"datetime": 1592864079000,
"masked_author": "username_1",
"text": "Another thing to add to this menu is a link to the reference guide.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cmrn-rhi",
"comment_id": 648986165,
"datetime": 1593022773000,
"masked_author": "username_0",
"text": "@username_1 Is there a separate reference guide outside of the SOP and in-validator field guidance?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ivansg44",
"comment_id": 648988452,
"datetime": 1593023036000,
"masked_author": "username_1",
"text": "[reference.html](https://github.com/Public-Health-Bioinformatics/DataHarmonizer/blob/master/reference.html) in the root directory courtesy of @username_2",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ddooley",
"comment_id": 649026780,
"datetime": 1593027571000,
"masked_author": "username_2",
"text": "Yes, the reference.html is automatically generated version of the label / definition / guidance and examples fields of the vocabulary. It differs from SOP but could be linked from there too.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ivansg44",
"comment_id": null,
"datetime": 1594052698000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 1,444 | false | false | 3 | 7 | true |
google/site-kit-wp | google | 779,409,613 | 2,581 | {
"number": 2581,
"repo": "site-kit-wp",
"user_login": "google"
} | [
{
"action": "opened",
"author": "felixarntz",
"comment_id": null,
"datetime": 1609871960000,
"masked_author": "username_0",
"text": "## Summary\r\n\r\n<!-- Please reference the issue this PR addresses. -->\r\nAddresses issue #2578\r\n\r\n## Relevant technical choices\r\n\r\n<!-- Please describe your changes. -->\r\n\r\n## Checklist\r\n\r\n- [x] My code is tested and passes existing unit tests.\r\n- [ ] My code has an appropriate set of unit tests which all pass.\r\n- [x] My code is backward-compatible with WordPress 4.7 and PHP 5.6.\r\n- [x] My code follows the [WordPress](https://make.wordpress.org/core/handbook/best-practices/coding-standards/) coding standards.\r\n- [x] My code has proper inline documentation.\r\n- [x] I have added a QA Brief on the issue linked above.\r\n- [x] I have signed the Contributor License Agreement (see <https://cla.developers.google.com/>).",
"title": "Make local PHPUnit setup standalone",
"type": "issue"
},
{
"action": "created",
"author": "aaemnnosttv",
"comment_id": 799654007,
"datetime": 1615833244000,
"masked_author": "username_1",
"text": "@googlebot I consent.",
"title": null,
"type": "comment"
}
] | 737 | false | true | 2 | 2 | false |
HephaestusProject/pytorch-ReCoSa | HephaestusProject | 723,674,509 | 20 | {
"number": 20,
"repo": "pytorch-ReCoSa",
"user_login": "HephaestusProject"
} | [
{
"action": "opened",
"author": "soeque1",
"comment_id": null,
"datetime": 1602912592000,
"masked_author": "username_0",
"text": "# Pull Request\r\n레파지토리에 기여해주셔서 감사드립니다.\r\n\r\n해당 PR을 제출하기 전에 아래 사항이 완료되었는지 확인 부탁드립니다:\r\n- [x] 작성한 코드가 어떤 에러나 경고없이 빌드가 되었나요?\r\n- [x] 충분한 테스트를 수행하셨나요?\r\n\r\n## 1. 해당 PR은 어떤 내용인가요?\r\n<!-- 해당 PR이 어떠한 내용인지 상세하게 명시 부탁드리겠습니다. 상세한 명시는 1). 문제정의, 2). 해결방법, 3). 해당 PR로 인해 발생할 수 있는 예상문제와 같은 형태로 작성하시면 됩니다.-->\r\n\r\nLR scheduler Update and Model v0.2.3\r\n\r\n## 2. PR과 관련된 이슈가 있나요?\r\n<!-- PR이 참고하고 있는 이슈가 있다면 관련 자료를 태깅해주세요. 만약에 이슈가 같은 레파지토리의 이슈라면 이슈번호를 태그해주시고, 외부자료라면 URL로 표기해주세요.-->",
"title": "Bugfix/#19 lr scheduler",
"type": "issue"
},
{
"action": "created",
"author": "soeque1",
"comment_id": 710756362,
"datetime": 1602913960000,
"masked_author": "username_0",
"text": "학습 로그를 봤는데 lr-scheduler가 step 단위로 적용되어야하는데, epochs 단위로 적용되었습니다 ㅠ",
"title": null,
"type": "comment"
}
] | 516 | false | false | 1 | 2 | false |
microsoft/vscode-hexeditor | microsoft | 831,004,467 | 223 | null | [
{
"action": "opened",
"author": "Tyriar",
"comment_id": null,
"datetime": 1615672199000,
"masked_author": "username_0",
"text": "Reasons:\r\n\r\n- It hides my explorer or whatever side bar I have displayed at the time, this is not only distracting but I like it to be there to give me context on where I am within the project\r\n- There is _so much_ room for this on the right of the editor\r\n- There is also a lot of wasted space on the bottom of the data inspector\r\n- I can't seem to be able to bring it into my explorer which might be alright, provided I doesn't auto show the explorer when I focus the editor\r\n\r\n",
"title": "I really don't like the data inspector side bar",
"type": "issue"
},
{
"action": "created",
"author": "lramos15",
"comment_id": 799439911,
"datetime": 1615816579000,
"masked_author": "username_1",
"text": "As you remember, it was on the right side of the editor before and moved into the side panel to act as an information panel of sorts as it felt weird living next to editable content. I assume we could eventually add more stuff to the data panel, but I'm not sure what. We can discuss this at the UI/UX sync, but I'm not sure what a good place to put it is unless we throw it under a setting and let the user decide?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Tyriar",
"comment_id": 799614510,
"datetime": 1615830114000,
"masked_author": "username_0",
"text": "I never had an issue with it on the right side, I know others brought it up though. I'd prefer the editor to be more self contained and not mess with how I do things with the side bar.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Tyriar",
"comment_id": 801291321,
"datetime": 1616003830000,
"masked_author": "username_0",
"text": "After doing https://github.com/microsoft/vscode-hexeditor/pull/252 I figured out how to pull it into another one, I think I just didn't hold the mouse over an activity icon for long enough.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "lramos15",
"comment_id": null,
"datetime": 1616006113000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 1,378 | false | false | 2 | 5 | false |
ArrobeFr/jquery-calendar-bs4 | ArrobeFr | 788,448,881 | 17 | null | [
{
"action": "opened",
"author": "DLX23",
"comment_id": null,
"datetime": 1610991839000,
"masked_author": "username_0",
"text": "Hi,\r\n\r\ni got following code:\r\n```\r\nvar calendar = $('#calendar').Calendar({\r\n locale: 'de',\r\n weekday: {\r\n timeline: {\r\n fromHour: 9,\r\n toHour: 20,\r\n intervalMinutes: 30\r\n }\r\n },\r\n events: events\r\n }).init();\r\n```\r\n\r\nbut time starts at 0:00 not 9:00",
"title": "fromHour Parameter don't work",
"type": "issue"
}
] | 333 | false | false | 1 | 1 | false |
Homebrew/homebrew-core | Homebrew | 771,663,635 | 67,315 | {
"number": 67315,
"repo": "homebrew-core",
"user_login": "Homebrew"
} | [
{
"action": "opened",
"author": "carlocab",
"comment_id": null,
"datetime": 1608490098000,
"masked_author": "username_0",
"text": "Created with `brew bump-formula-pr`.",
"title": "nickle 2.90",
"type": "issue"
},
{
"action": "created",
"author": "carlocab",
"comment_id": 748646066,
"datetime": 1608490309000,
"masked_author": "username_0",
"text": "License:\r\n```\r\nCopyright © 1988-2004 Keith Packard and Bart Massey. All Rights Reserved.\r\n\r\nPermission is hereby granted, free of charge, to any person\r\nobtaining a copy of this software and associated\r\ndocumentation files (the \"Software\"), to deal in the\r\nSoftware without restriction, including without limitation\r\nthe rights to use, copy, modify, merge, publish, distribute,\r\nsublicense, and/or sell copies of the Software, and to\r\npermit persons to whom the Software is furnished to do so,\r\nsubject to the following conditions:\r\n\r\nThe above copyright notice and this permission notice shall\r\nbe included in all copies or substantial portions of the\r\nSoftware.\r\n\r\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\r\nKIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE\r\nWARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR\r\nPURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS\r\nBE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER\r\nIN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\r\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR\r\nOTHER DEALINGS IN THE SOFTWARE.\r\n\r\nExcept as contained in this notice, the names of the authors\r\nor their institutions shall not be used in advertising or\r\notherwise to promote the sale, use or other dealings in this\r\nSoftware without prior written authorization from the\r\nauthors.\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "carlocab",
"comment_id": 748646453,
"datetime": 1608490557000,
"masked_author": "username_0",
"text": "Looks like a `BSD-1-Clause`?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Rylan12",
"comment_id": 748651884,
"datetime": 1608493299000,
"masked_author": "username_1",
"text": "According to https://keithp.com/cgit/nickle.git/tree/debian/copyright and https://metadata.ftp-master.debian.org/changelogs//main/n/nickle/nickle_2.90_copyright:\r\n\r\n```ruby\r\nlicense \"MIT\"\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "carlocab",
"comment_id": 748659690,
"datetime": 1608494226000,
"masked_author": "username_0",
"text": "`MIT` I think is less restrictive than the license that actually appears in the `COPYING` file, so I'm reluctant to add it. I'll wait a bit to see if the author replies to my email.",
"title": null,
"type": "comment"
}
] | 1,805 | false | true | 2 | 5 | false |
oltpbenchmark/oltpbench | oltpbenchmark | 759,040,586 | 349 | null | [
{
"action": "opened",
"author": "malingaperera",
"comment_id": null,
"datetime": 1607399973000,
"masked_author": "username_0",
"text": "In the TATP documentation, I saw TATP uses a non-uniform data distribution by default. Is that true for OLTP bench? Is there a parameter to switch between uniform or non-uniform data distribution? Is there a way to change the skewness of data? Thanks.",
"title": "Does TATP benchmark uses a non-uniform data distribution?",
"type": "issue"
},
{
"action": "created",
"author": "apavlo",
"comment_id": 781554021,
"datetime": 1613673536000,
"masked_author": "username_1",
"text": "There is no way to switch this easily.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "malingaperera",
"comment_id": 781653563,
"datetime": 1613684566000,
"masked_author": "username_0",
"text": "Thanks.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "apavlo",
"comment_id": null,
"datetime": 1618012495000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 296 | false | false | 2 | 4 | false |
xorum-io/codeforces_watcher | xorum-io | 815,411,812 | 262 | null | [
{
"action": "opened",
"author": "MatyashDen",
"comment_id": null,
"datetime": 1614168034000,
"masked_author": "username_0",
"text": "",
"title": "Add missing localizable strings",
"type": "issue"
},
{
"action": "closed",
"author": "yev-kanivets",
"comment_id": null,
"datetime": 1617292432000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 0 | false | false | 2 | 2 | false |
microsoft/vcpkg | microsoft | 608,044,520 | 11,065 | {
"number": 11065,
"repo": "vcpkg",
"user_login": "microsoft"
} | [
{
"action": "opened",
"author": "NancyLi1013",
"comment_id": null,
"datetime": 1588053103000,
"masked_author": "username_0",
"text": "llibxml2 failed on osx pipeline in the PR #10770 \r\n\r\n```\r\nUndefined symbols for architecture x86_64:\r\n \"_iconv\", referenced from:\r\n _xmlIconvWrapper in libxml2.a(encoding.c.o)\r\n \"_iconv_close\", referenced from:\r\n _xmlFindCharEncodingHandler in libxml2.a(encoding.c.o)\r\n _xmlCharEncCloseFunc in libxml2.a(encoding.c.o)\r\n \"_iconv_open\", referenced from:\r\n _xmlFindCharEncodingHandler in libxml2.a(encoding.c.o)\r\nld: symbol(s) not found for architecture x86_64\r\nclang: error: linker command failed with exit code 1 (use -v to see invocation)\r\n```\r\nI checked the CMakeLists.txt and found that iconv has been linked. And I also cannot reproduce this on my local. \r\n\r\nRelated issue #11001.",
"title": "[test] Rerun libxml2",
"type": "issue"
},
{
"action": "created",
"author": "Neumann-A",
"comment_id": 620409842,
"datetime": 1588055677000,
"masked_author": "username_1",
"text": "Not libxml2 is failing. The port cmake is failing as a depent of libxml2 since it does not know that libxml2 has been linked with iconv. The vcpkg-cmake-wrapper needs to be adjusted to reflect that libxml2 has been build with iconv. This is already done for Windows but not for osx. Just take a look into libxml2 vcpkg-cmake-wrapper and you should directly see it",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "NancyLi1013",
"comment_id": 620421995,
"datetime": 1588057528000,
"masked_author": "username_0",
"text": "Thanks for your update. \r\nI understood it now. It is about the usage of libxml2. Will add the dependency later.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "NancyLi1013",
"comment_id": 620489206,
"datetime": 1588065971000,
"masked_author": "username_0",
"text": "After further investigation about this, it seems that this is not a bug for libxml2. So I close this PR now.",
"title": null,
"type": "comment"
}
] | 1,288 | false | false | 2 | 4 | false |
ligato/vpp-probe | ligato | 801,436,021 | 10 | {
"number": 10,
"repo": "vpp-probe",
"user_login": "ligato"
} | [
{
"action": "opened",
"author": "davetejas",
"comment_id": null,
"datetime": 1612455703000,
"masked_author": "username_0",
"text": "",
"title": "Topo display",
"type": "issue"
},
{
"action": "created",
"author": "ondrej-fabry",
"comment_id": 816128812,
"datetime": 1617911967000,
"masked_author": "username_1",
"text": "Superseded by #24",
"title": null,
"type": "comment"
}
] | 17 | false | false | 2 | 2 | false |
google-research/text-to-text-transfer-transformer | google-research | 696,908,835 | 393 | null | [
{
"action": "opened",
"author": "chrisrytting",
"comment_id": null,
"datetime": 1599664949000,
"masked_author": "username_0",
"text": "I'm trying to train the 3B parameter model from scratch. When I try to do so by executing the following command\r\n\r\n```\r\nexport CUDA_VISIBLE_DEVICES=3\r\nexport MODEL_DIR=models-t5/3Bscratch\r\nexport DATA_DIR=data/c/tsvs\r\nt5_mesh_transformer --model_dir=\"${MODEL_DIR}\" --t5_tfds_data_dir=\"${DATA_DIR}\" \\\r\n--gin_file=\"dataset.gin\" --gin_param=\"utils.run.mesh_shape = 'model:1,batch:1'\" \\\r\n--gin_param=\"utils.run.mesh_devices = ['gpu:0']\" --gin_file=\"models/bi_v1_3B.gin\" \\\r\n--gin_param=\"MIXTURE_NAME = 'atr'\" --module_import='t5_task_def' \r\n```\r\n\r\nI get an OOM error\r\n\r\n```\r\n (0) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,512] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n [[node encoder/block_018/layer_000/SelfAttention/einsum_8/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n [[scalar_add/parallel_0/add/_5677]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n (1) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,512] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n [[node encoder/block_018/layer_000/SelfAttention/einsum_8/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n```\r\n\r\nI add `--gin_param=\"tokens_per_batch=1\"` to the above command, thinking I need to reduce my batch size to fit on the GPU, (which in the first place this error is weird because I'm using a Nvidia Tesla V100-SXM3-32GB), and I still get the OOM error. Is there something wrong with the command I'm running?",
"title": "OOM error when training even when batch",
"type": "issue"
},
{
"action": "created",
"author": "craffel",
"comment_id": 689641428,
"datetime": 1599665620000,
"masked_author": "username_1",
"text": "I'm not sure the 3B model will fit in memory on a single GPU (even a v100), though it is possible to train the 3B model on a v2-8 TPU which has 64 GB of RAM. I'm also not sure `tokens_per_bath=1` will work if your input sequences have length 512 (which it appears they do based on the error you're seeing). You could try `tokens_per_bath=512` which would correspond to a single sequence in each batch, and/or you could try decreasing the input/output sequence lengths depending on the sequence lengths of the task you are trying.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "craffel",
"comment_id": null,
"datetime": 1599665621000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689644398,
"datetime": 1599665908000,
"masked_author": "username_0",
"text": "@username_1 thanks for the quick response. However, I have fine-tuned without a problem on these GPUs. Does training from scratch require more memory for some reason? I understand the pre-training is different than fine-tuning on specific tasks, but I basically want to fine-tune without any pre-training.\r\n\r\nAlso, if I'm getting OOM error with `tokens_per_batch=1`, there's no way I could go bigger, right?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "craffel",
"comment_id": 689646613,
"datetime": 1599666122000,
"masked_author": "username_1",
"text": "If your input sequence has 512 tokens and you set tokens_per_batch=1, I'm not sure what the code will do. You should set tokens_per_batch to be an integer multiple of your input sequence length, so the smallest you could use is 512.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "adarob",
"comment_id": 689648475,
"datetime": 1599666313000,
"masked_author": "username_2",
"text": "Can you try installing mesh-tf and t5 both from master on github?\r\n\r\nI think this will be fixed by the combination of:\r\nhttps://github.com/tensorflow/mesh/pull/175\r\nhttps://github.com/google-research/text-to-text-transfer-transformer/pull/386\r\n\r\nIf we can confirm that solves your problem, I will update the packages.",
"title": null,
"type": "comment"
},
{
"action": "reopened",
"author": "adarob",
"comment_id": null,
"datetime": 1599666314000,
"masked_author": "username_2",
"text": "I'm trying to train the 3B parameter model from scratch. When I try to do so by executing the following command\r\n\r\n```\r\nexport CUDA_VISIBLE_DEVICES=3\r\nexport MODEL_DIR=models-t5/3Bscratch\r\nexport DATA_DIR=data/c/tsvs\r\nt5_mesh_transformer --model_dir=\"${MODEL_DIR}\" --t5_tfds_data_dir=\"${DATA_DIR}\" \\\r\n--gin_file=\"dataset.gin\" --gin_param=\"utils.run.mesh_shape = 'model:1,batch:1'\" \\\r\n--gin_param=\"utils.run.mesh_devices = ['gpu:0']\" --gin_file=\"models/bi_v1_3B.gin\" \\\r\n--gin_param=\"MIXTURE_NAME = 'atr'\" --module_import='t5_task_def' \r\n```\r\n\r\nI get an OOM error\r\n\r\n```\r\n (0) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,512] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n [[node encoder/block_018/layer_000/SelfAttention/einsum_8/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n [[scalar_add/parallel_0/add/_5677]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n (1) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,512] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n [[node encoder/block_018/layer_000/SelfAttention/einsum_8/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n```\r\n\r\nI add `--gin_param=\"tokens_per_batch=1\"` to the above command, thinking I need to reduce my batch size to fit on the GPU, (which in the first place this error is weird because I'm using a Nvidia Tesla V100-SXM3-32GB), and I still get the OOM error. Is there something wrong with the command I'm running?",
"title": "OOM error when training even when batch",
"type": "issue"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689657943,
"datetime": 1599667220000,
"masked_author": "username_0",
"text": "So I'm working out of a docker container, and I hope that doesn't make things too tricky. But basically before I had just pip installed t5 and so I think the place it had the packages `t5` and `mesh_tensorflow` was in `/usr/local/lib/python3.6/dist-packages`. So I just renamed the ones installed by pip to `t5package` and `mesh_tensorflowpackage` and then cloned https://github.com/tensorflow/mesh into `mesh_tensorflow` and https://github.com/google-research/text-to-text-transfer-transformer into `t5` to replace the pip-installed packages with the master branches from github.\r\n\r\nBut then when I run the same command, I get this error\r\n\r\n```\r\n File \"/usr/local/bin/t5_mesh_transformer\", line 5, in <module>\r\n from t5.models.mesh_transformer_main import console_entry_point\r\nModuleNotFoundError: No module named 't5.models'\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "adarob",
"comment_id": 689660549,
"datetime": 1599667466000,
"masked_author": "username_2",
"text": "You should be able to do `pip install git+\nhttps://github.com/tensorflow/mesh`, etc.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689671068,
"datetime": 1599668528000,
"masked_author": "username_0",
"text": "Sorry I deleted that last comment I thought there was a detail wrong and panicked but there wasn't. So I pip uninstalled t5 and mesh_tensorflow at that point, and pip installed both with git+.\r\n\r\nI get nearly the same errors, it just looks like some of the dimensions are changed:\r\n\r\n```\r\ntensorflow.python.framework.errors_impl.ResourceExhaustedError: 2 root error(s) found.\r\n (0) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,128] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n\t [[node encoder/block_018/layer_000/SelfAttention/einsum_7/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n\t [[scalar_add/parallel_0/add/_5679]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n (1) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,128] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n\t [[node encoder/block_018/layer_000/SelfAttention/einsum_7/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689697675,
"datetime": 1599671150000,
"masked_author": "username_0",
"text": "So I tried parallelizing across two of the v100s, and I still got those errors above. When I parallelized across 4 (3 wouldn't work because the model doesn't split evenly across 3), it finally worked. \r\n\r\nStill, it seems weird to me that I could fine-tune with a sequence length of 250 on 1 GPU but couldn't train with the same sequence length across 2 GPUs. Aren't training and fine-tuning the same thing, just at different stages of training? Or am I misunderstanding the difference between them? Like when you guys train on MRPC from scratch in the readme, there's no reason that would be more computationally intensive than if you had fine-tuned on MRPC using one of the pre-trained models, right?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "craffel",
"comment_id": null,
"datetime": 1599671389000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "craffel",
"comment_id": 689699690,
"datetime": 1599671389000,
"masked_author": "username_1",
"text": "Yes, they are exactly the same, the only difference is that before fine-tuning we load variables from a checkpoint.\r\n\r\nYou can diff the operative config gin files that got written out to your model directories to triple check that there are no other differences (other than sequence length) between your fine-tuning and pre-training runs.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689701487,
"datetime": 1599671591000,
"masked_author": "username_0",
"text": "Oh sorry that finally made sense about the quadratic scaling. Thanks for your help, guys!",
"title": null,
"type": "comment"
}
] | 9,182 | false | false | 3 | 14 | true |
taichi-dev/taichi | taichi-dev | 684,773,916 | 1,769 | null | [
{
"action": "opened",
"author": "squarefk",
"comment_id": null,
"datetime": 1598283902000,
"masked_author": "username_0",
"text": "",
"title": "[Doc] Update doc on ti.Matrix (like zeros, ones, identity)",
"type": "issue"
},
{
"action": "created",
"author": "archibate",
"comment_id": 686287298,
"datetime": 1599115022000,
"masked_author": "username_1",
"text": "Great! Would you take this on your own?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "k-ye",
"comment_id": null,
"datetime": 1628753379000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "k-ye",
"comment_id": 897413128,
"datetime": 1628753379000,
"masked_author": "username_2",
"text": "Closed by https://github.com/taichi-dev/taichi/issues/2579",
"title": null,
"type": "comment"
}
] | 97 | false | false | 3 | 4 | false |
moralmunky/Home-Assistant-Mail-And-Packages | null | 748,138,407 | 258 | {
"number": 258,
"repo": "Home-Assistant-Mail-And-Packages",
"user_login": "moralmunky"
} | [
{
"action": "opened",
"author": "firstof9",
"comment_id": null,
"datetime": 1606008490000,
"masked_author": "username_0",
"text": "## Proposed change\r\n<!-- \r\n Describe the big picture of your changes here to communicate to the\r\n maintainers why we should accept this pull request. If it fixes a bug\r\n or resolves a feature request, be sure to link to that issue in the\r\n additional information section.\r\n-->\r\nSimplify tracking number regex\r\n\r\n## Type of change\r\n<!--\r\n What type of change does your PR introduce?\r\n NOTE: Please, check only 1! box! \r\n If your PR requires multiple boxes to be checked, you'll most likely need to\r\n split it into multiple PRs. This makes things easier and faster to code review.\r\n-->\r\n\r\n- [ ] Dependency upgrade\r\n- [ ] Bugfix (non-breaking change which fixes an issue)\r\n- [ ] New feature (which adds functionality)\r\n- [ ] Breaking change (fix/feature causing existing functionality to break)\r\n- [x] Code quality improvements to existing code or addition of tests\r\n\r\n## Additional information\r\n<!--\r\n Details are important, and help maintainers processing your PR.\r\n Please be sure to fill out additional details, if applicable.\r\n-->\r\n\r\n- This PR is related to issue: n/a",
"title": "Simplify tracking number regex",
"type": "issue"
}
] | 1,080 | false | true | 1 | 1 | false |
splunk/qbec | splunk | 696,204,523 | 155 | {
"number": 155,
"repo": "qbec",
"user_login": "splunk"
} | [
{
"action": "opened",
"author": "gotwarlost",
"comment_id": null,
"datetime": 1599602315000,
"masked_author": "username_0",
"text": "",
"title": "add windows CI build, thanks to @harsimranmaan",
"type": "issue"
},
{
"action": "created",
"author": "sbarzowski",
"comment_id": 689542971,
"datetime": 1599656000000,
"masked_author": "username_1",
"text": "Wow, good to know it's possible to set up Github actions to run on Windows. We may try setting it up upstream, too.",
"title": null,
"type": "comment"
}
] | 115 | false | true | 2 | 2 | false |
nange/gospider | null | 548,379,303 | 74 | null | [
{
"action": "opened",
"author": "lintan",
"comment_id": null,
"datetime": 1578716931000,
"masked_author": "username_0",
"text": "packr安装不成功",
"title": "go get: upgrading github.com/coreos/go-systemd@v0.0.0-20190321100706-95778dfbb74e: unexpected status (https://goproxy.cn/github.com/coreos/go-systemd/@v/list): 404 Not Found",
"type": "issue"
},
{
"action": "created",
"author": "nange",
"comment_id": 574117654,
"datetime": 1578999203000,
"masked_author": "username_1",
"text": "你是执行```go get -u github.com/gobuffalo/packr/packr```出的问题吗? 我没能重现你的问题。 \r\n\r\n另外如果你不是想开发前端页面,只是想体验功能,并不需要安装packr。直接进入```_example```执行```go build```,然后运行就可以了。 \r\n@username_0",
"title": null,
"type": "comment"
}
] | 173 | false | false | 2 | 2 | true |
scikit-hep/pyhf | scikit-hep | 650,275,733 | 918 | null | [
{
"action": "opened",
"author": "lukasheinrich",
"comment_id": null,
"datetime": 1593737990000,
"masked_author": "username_0",
"text": "# Description\r\n\r\nright now we trigger a binder build but i'm not sure, do we get a response / feedback iif the bulid fails? We could use repo2docker to do the binder-like build locally\r\n\r\nhttps://twitter.com/m_deff/status/1277064783662141441",
"title": "run repo2docker in CI",
"type": "issue"
},
{
"action": "created",
"author": "matthewfeickert",
"comment_id": 653307828,
"datetime": 1593744400000,
"masked_author": "username_1",
"text": "No. `trigger_binder.sh` just `curls` the endpoint and gives the return code\r\n\r\nhttps://github.com/scikit-hep/pyhf/blob/64dbce070156945f6bb524f79ee6e23d26b52f01/binder/trigger_binder.sh#L6-L16\r\n\r\nand as we're using the `postBuild` Binder config which is basically just installing how we would in CI\r\n\r\nhttps://github.com/scikit-hep/pyhf/blob/64dbce070156945f6bb524f79ee6e23d26b52f01/binder/postBuild#L1-L2\r\n\r\nthe `repo2docker` build on the Binder side shouldn't fail. That being said though, the idea of using `repo2docker` in CI (to test and prebuild the Docker image for Binder so that work was offloaded from them) were the reasons that I was looking at [`machine-learning-apps/repo2docker-action`](https://github.com/machine-learning-apps/repo2docker-action). The discussions with Tim and Chris on these Issues:\r\n\r\n- [`machine-learning-apps/repo2docker-action` Issue 26](https://github.com/machine-learning-apps/repo2docker-action/issues/26)\r\n- [`machine-learning-apps/repo2docker-action` Issue 25](https://github.com/machine-learning-apps/repo2docker-action/issues/25)\r\n\r\nmake it seem like it might not be a good fit for us as we want to use Binder config files in the top level `binder/` dir.\r\n\r\n@username_0 did you have additional ideas related to this?",
"title": null,
"type": "comment"
}
] | 1,503 | false | false | 2 | 2 | true |
HackYourFuture-CPH/fp-class12 | HackYourFuture-CPH | 637,308,459 | 101 | {
"number": 101,
"repo": "fp-class12",
"user_login": "HackYourFuture-CPH"
} | [
{
"action": "opened",
"author": "ghofranebenhmaid",
"comment_id": null,
"datetime": 1591908239000,
"masked_author": "username_0",
"text": "# Description\r\n\r\nadded qrcode.react 1.0.0 to package.json and package-lock.json\r\n[www.npmjs.com/package/qrcode.react](https://www.npmjs.com/package/qrcode.react\r\n)",
"title": "added qrcode.react 1.0.0 to package.json and package-lock.json",
"type": "issue"
},
{
"action": "created",
"author": "senner007",
"comment_id": 643355303,
"datetime": 1591978141000,
"masked_author": "username_1",
"text": "Hi. This has changes from your other branch. Try to revert them or close the pr",
"title": null,
"type": "comment"
}
] | 242 | false | false | 2 | 2 | false |