repo stringlengths 7 67 | org stringlengths 2 32 ⌀ | issue_id int64 780k 941M | issue_number int64 1 134k | pull_request dict | events list | user_count int64 1 77 | event_count int64 1 192 | text_size int64 0 329k | bot_issue bool 1 class | modified_by_bot bool 2 classes | text_size_no_bots int64 0 279k | modified_usernames bool 2 classes |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
AY2021S2-CS2103T-W12-3/tp | AY2021S2-CS2103T-W12-3 | 849,689,140 | 85 | null | [
{
"action": "closed",
"author": "KnitidCeladon23",
"comment_id": null,
"datetime": 1618012512000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "KnitidCeladon23",
"comment_id": 817032618,
"datetime": 1618012512000,
"masked_author": "username_0",
"text": "Resolved",
"title": null,
"type": "comment"
}
] | 2 | 3 | 513 | false | true | 8 | false |
djkampus/github-slideshow | null | 812,640,593 | 1 | null | [
{
"action": "closed",
"author": "djkampus",
"comment_id": null,
"datetime": 1613834639000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 6,799 | false | true | 0 | false |
agronholm/anyio | null | 814,551,632 | 208 | {
"number": 208,
"repo": "anyio",
"user_login": "agronholm"
} | [
{
"action": "opened",
"author": "makkus",
"comment_id": null,
"datetime": 1614092866000,
"masked_author": "username_0",
"text": "Maybe I'm missing something obvious, but it seems to me the spawn should be awaited, no?",
"title": "Add missing 'await'",
"type": "issue"
},
{
"action": "created",
"author": "smurfix",
"comment_id": 784271050,
"datetime": 1614093092000,
"masked_author": "username_1",
"text": "That's the master branch, so: no.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "makkus",
"comment_id": 784271371,
"datetime": 1614093121000,
"masked_author": "username_0",
"text": "Sorry, just realized this has changed in latest version. Apologies, readthedocs uses 'latest', which is why I was confused.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "agronholm",
"comment_id": 784359810,
"datetime": 1614100366000,
"masked_author": "username_2",
"text": "I'm confused. The docs on RTD default to \"stable\" which currently points to the v2.1.0 docs.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "makkus",
"comment_id": 784368538,
"datetime": 1614101153000,
"masked_author": "username_0",
"text": "Yes, that's correct. I arrived via a search engine, and that's probably why I somehow ended up on 'latest'. I know I didn't manually set the version, so I assume it was just some fluke that probably doesn't affect (m)any others (and nothing you could do about it anyway). Sorry again for the confusion.",
"title": null,
"type": "comment"
}
] | 3 | 5 | 638 | false | false | 638 | false |
pinterest/secor | pinterest | 655,445,286 | 1,452 | {
"number": 1452,
"repo": "secor",
"user_login": "pinterest"
} | [
{
"action": "opened",
"author": "pdambrauskas",
"comment_id": null,
"datetime": 1594576106000,
"masked_author": "username_0",
"text": "We are running Secor on K8s and monitoring our stack with Prometheus.\r\nAs Secor uses Micrometer, it is not that hard to add Prometheus support directly, and avoid using exporters for Secor metrics.\r\n\r\nI've changed the way `gauge` method works, because Prometheus endpoint was returning `NaN` values.\r\nMore on that https://stackoverflow.com/questions/50821924/micrometer-prometheus-gauge-displays-nan\r\n\r\nI've added `/prometheus` endpoint to Ostrich Server to avoid making things complicated with listening to some other port.\r\n\r\nAlso moved initialization of metric controller up to `ConsumerMain` class. I guess we don't need to have different instances for each Consumer. Otherwise I'd have to ensure, that there is only one instance of PrometheusMetricRegistry to avoid duplications and other problems on Prometheus http endpoint. This shouldnt be a problem, as Metric registries are delegating calls to static methods anyway.",
"title": "Add native support for Prometheus",
"type": "issue"
},
{
"action": "created",
"author": "HenryCaiHaiying",
"comment_id": 657714734,
"datetime": 1594664296000,
"masked_author": "username_1",
"text": "In that case use the ERROR level logging and set the cache size threshold a\nlittle bigger: 500",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "HenryCaiHaiying",
"comment_id": 658602879,
"datetime": 1594798918000,
"masked_author": "username_1",
"text": "Looks good, thanks for the contribution.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,063 | false | false | 1,063 | false |
mono/SkiaSharp | mono | 790,209,433 | 1,582 | null | [
{
"action": "opened",
"author": "sady4850",
"comment_id": null,
"datetime": 1611166727000,
"masked_author": "username_0",
"text": "``` \r\nclass Program {\r\n static void Main(string[] args)\r\n {\r\n var typeface = SKTypeface.FromFile(@\"metropolis.extra-light.otf\"); // metropolis extra light\r\n var fontWeight = typeface.FontWeight; // equals to 200 on windows (correct), equals 400 of macOS (incorrect)\r\n }\r\n}\r\n```\r\n\r\n@mattleibow could you please confirm that this is an issue ? if so is it something easy to fix ? Thanks!\r\n[ConsoleApp1.zip](https://github.com/mono/SkiaSharp/files/5844440/ConsoleApp1.zip)",
"title": "[BUG] SKTypeface.FontWeight is incorrect after loading from a file on macOS",
"type": "issue"
}
] | 1 | 1 | 495 | false | false | 495 | false |
JetBrains/resharper-unity | JetBrains | 473,489,483 | 1,260 | null | [
{
"action": "opened",
"author": "oxysoft",
"comment_id": null,
"datetime": 1564166912000,
"masked_author": "username_0",
"text": "Tracepoints are hugely useful, however the output is not in a very useful place for a developer working with Unity3D. In fact I remember a while back it took me quite some time to find where they went initially. It would be a huge win if we could set tracepoints and see their output directly in the Unity console. They could be prefixed with \"[TRACEPOINT]\" or something for clarity sake.\r\n\r\n(to be clear, when I say tracepoint I mean a breakpoint with _Evaluate and log_ set and _suspend_ off)",
"title": "Ability to redirect tracepoint output to Unity console",
"type": "issue"
},
{
"action": "created",
"author": "van800",
"comment_id": 515570447,
"datetime": 1564168914000,
"masked_author": "username_1",
"text": "Thank you for the idea. Couple of screenshots would help a lot. By \"Unity console\" do you mean in Rider or in Unity?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "oxysoft",
"comment_id": 515584637,
"datetime": 1564171995000,
"masked_author": "username_0",
"text": "I mean the console inside of Unity but I think both would be nice. Essentially, hitting a tracepoint would result in `Debug.Log` firing with the evaluated content of the tracepoint to print.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DmitriyYukhanov",
"comment_id": 563228653,
"datetime": 1575896611000,
"masked_author": "username_2",
"text": "Second this, that would be super helpful!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mfandreich",
"comment_id": 563229077,
"datetime": 1575896688000,
"masked_author": "username_3",
"text": "Must have )",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "5argon",
"comment_id": 565784458,
"datetime": 1576394308000,
"masked_author": "username_4",
"text": "This is a must have!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "van800",
"comment_id": 789610873,
"datetime": 1614767449000,
"masked_author": "username_1",
"text": "Workaround for this is to create a function somewhere in your project and call it in the `Evaluate and log`\r\n```\r\n [UsedImplicitly]\r\n public T Log<T>(T s)\r\n {\r\n Debug.Log(s);\r\n return s;\r\n }\r\n```\r\n",
"title": null,
"type": "comment"
}
] | 5 | 7 | 1,205 | false | false | 1,205 | false |
sveltejs/sapper | sveltejs | 606,470,104 | 1,169 | {
"number": 1169,
"repo": "sapper",
"user_login": "sveltejs"
} | [
{
"action": "opened",
"author": "stephane-vanraes",
"comment_id": null,
"datetime": 1587749205000,
"masked_author": "username_0",
"text": "This adds a _Who's using Sapper ?_ section to the website as proposed in https://github.com/sveltejs/sapper/issues/1113\r\n\r\nThe actual information is fetched from [svelte/community](https://github.com/sveltejs/community/tree/master/whos-using-sapper) which at the moment is empty, I think it would be better to add some sites there before merging this.",
"title": "Add \"Who is using Sapper\" section to the website.",
"type": "issue"
},
{
"action": "created",
"author": "benmccann",
"comment_id": 629360928,
"datetime": 1589560748000,
"masked_author": "username_1",
"text": "could we fetch this info via API instead?\r\n\r\notherwise I agree that a shell script seems cleaner than using `shelljs`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "stephane-vanraes",
"comment_id": 629624181,
"datetime": 1589624897000,
"masked_author": "username_0",
"text": "If we change this we shoudl probably change it for the svelte site itself as well",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "benmccann",
"comment_id": 806049618,
"datetime": 1616609663000,
"masked_author": "username_1",
"text": "Thanks for putting together this PR! Sorry it sat for awhile. There's probably not much point to collecting a list of Sapper users now that SvelteKit is out. We've improved a bit the \"Who is using Svelte?\" on the svelte.dev homepage. We haven't talked much about whether we'd want to try to do the same for SvelteKit or not. In any case, I'm going to go ahead and close this since I doubt we'll be doing it for Sapper",
"title": null,
"type": "comment"
}
] | 2 | 4 | 966 | false | false | 966 | false |
NCAR/ldcpy | NCAR | 754,740,620 | 200 | null | [
{
"action": "opened",
"author": "pinarda",
"comment_id": null,
"datetime": 1606858182000,
"masked_author": "username_0",
"text": "specify list of calculations and datasets, perform calculations on each dataset and print out answers\r\n\r\nlike compare state but more general - make compare stats a special case of this function",
"title": "more general function to compare calculations",
"type": "issue"
},
{
"action": "created",
"author": "allibco",
"comment_id": 783695062,
"datetime": 1614030260000,
"masked_author": "username_1",
"text": "This is for a later date :)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "allibco",
"comment_id": 819679660,
"datetime": 1618420229000,
"masked_author": "username_1",
"text": "It would be great if compare stats could take in multiple sets and have them all in the table",
"title": null,
"type": "comment"
}
] | 2 | 3 | 313 | false | false | 313 | false |
jGleitz/markdown-it-prism | null | 551,931,577 | 31 | {
"number": 31,
"repo": "markdown-it-prism",
"user_login": "jGleitz"
} | [
{
"action": "opened",
"author": "jGleitz",
"comment_id": null,
"datetime": 1579444581000,
"masked_author": "username_0",
"text": "",
"title": "Stop testing on Node.JS 6",
"type": "issue"
}
] | 2 | 2 | 362 | false | true | 0 | false |
malxau/yori | null | 793,199,356 | 84 | null | [
{
"action": "opened",
"author": "aleaksunder",
"comment_id": null,
"datetime": 1611566366000,
"masked_author": "username_0",
"text": "Hello! Just hit another issue with `MOVE`\r\nAssuming we have such a tree:\r\n```\r\nC:.\r\n└───temp\r\n ├───dir1\r\n │ ├───subdir1\r\n │ │ file1\r\n │ │\r\n │ └───subdir2\r\n │ file2\r\n │\r\n └───dirExist\r\n```\r\nthen I tried to move the contents of folder `dir1` to existen dir `dirExist`:\r\n```\r\nYORI> move \"c:\\temp\\dir1\\*\" \"c:\\temp\\dirExist\"\r\n```\r\nand all is fine here:\r\n```\r\nC:.\r\n└───temp\r\n ├───dir1\r\n └───dirExist\r\n ├───subdir1\r\n │ file1\r\n │\r\n └───subdir2\r\n file2\r\n```\r\n\r\nBut when revert all back and tried to move the contents of `dir1` to not existen directory:\r\n```\r\nmove \"c:\\temp\\dir1\\*\" \"c:\\temp\\dirNew\"\r\n```\r\nit fails with message: `Attempting to move multiple files over a single file (\\\\?\\C:\\temp\\dirNew)` and after that the tree became this:\r\n```\r\nC:.\r\n└───temp\r\n │\r\n ├───dir1\r\n │ └───subdir2\r\n │ file2\r\n │\r\n ├───dirExist\r\n └───dirNew\r\n file1\r\n```\r\n\r\nAnd all is fine with `COPY` in both cases... even `copy -s \"dir1\\*\" \"dirNew\"` creates new \"dirNew\" and copies all contents of \"dir1\"",
"title": "MOVE the content of directory to directory which is not exist will move the contents of subdirectories of original path to new one",
"type": "issue"
},
{
"action": "created",
"author": "malxau",
"comment_id": 766969695,
"datetime": 1611594885000,
"masked_author": "username_1",
"text": "`copy -s` creates the directory. `copy` without `-s` has the same behavior as `move` - if multiple files are in the source, it will error trying to copy the second file over the first. `move` is in a bit of a mind because there's no such thing as `move -s`, since it is always a non-recursive operation, even if it has hierarchical effects.\r\n\r\nAlthough it's always easy to pick an example that makes the behavior look illogical, see how we got here - `move src dest` is a valid thing to do to a single file, where `dest` is a newly created file; `move src\\* dest` might match a single file in which case these semantics make sense. It seems very scary to have different behavior based on the number of files that are found, since the behavior will be very hard to predict.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "aleaksunder",
"comment_id": null,
"datetime": 1611801385000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 1,894 | false | false | 1,894 | false |
headwirecom/peregrine-cms | headwirecom | 727,572,233 | 623 | null | [
{
"action": "opened",
"author": "cmrockwell",
"comment_id": null,
"datetime": 1603386632000,
"masked_author": "username_0",
"text": "**Please describe the problem.**\r\nCurrently the only way to use HTL in Peregrine is to deploy `pagerenderserver` module. While this can be extended, it is currently quite limited. New HTL sites cannot be created through the UI\r\n\r\n**Describe the solution you'd like**\r\nCreate a simple theme called 'Theme HTL' \r\n\r\nPage components\r\n- working RTE component\r\n- image component\r\n\r\nTemplate components\r\n- drop down nav\r\n- footer [where it belongs](https://www.freecodecamp.org/news/how-to-keep-your-footer-where-it-belongs-59c6aa05c59c/)\r\n\r\nDocuments\r\n\r\n1. how to create new htl components \r\n1. how to compile styling as .less \r\n1. how to concat and deliver theme js \r\n\r\n**Additional context**\r\nThe intent for this ticket is to create structure and begin a theme project based on htl",
"title": "Theme HTL",
"type": "issue"
},
{
"action": "created",
"author": "reusr1",
"comment_id": 714638537,
"datetime": 1603386997000,
"masked_author": "username_1",
"text": "@username_0 would be great to start this in https://github.com/peregrine-cms - maybe we can reuse the fragments from themeclen-flex [1] and update the hatch process at the same time. If we use that approach any site can switch from server to clientside rendering and it would be a great showcase to hatch to vue or htl. \r\n\r\n[1] https://github.com/headwirecom/themeclean-flex/tree/develop/fragments",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cmrockwell",
"comment_id": 717258333,
"datetime": 1603806730000,
"masked_author": "username_0",
"text": "Sure, the HTL theme could definitely live under the per-cms repo. Tooling for creating new components that allows both styles is an interesting idea, and probably deserves a separate ticket. In the scope of this ticket, I hope to model a few patterns that could inform both updates to the percli processes, and manual processes developers can use to create new components.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "cmrockwell",
"comment_id": null,
"datetime": 1608057691000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 1,547 | false | false | 1,547 | true |
pydata/conf_site | pydata | 684,898,775 | 657 | {
"number": 657,
"repo": "conf_site",
"user_login": "pydata"
} | [
{
"action": "opened",
"author": "martey",
"comment_id": null,
"datetime": 1598296117000,
"masked_author": "username_0",
"text": "",
"title": "Add israel2021 host information.",
"type": "issue"
}
] | 2 | 2 | 370 | false | true | 0 | false |
gakonst/ethers-rs | null | 822,723,531 | 221 | null | [
{
"action": "opened",
"author": "xJonathanLEI",
"comment_id": null,
"datetime": 1614917176000,
"masked_author": "username_0",
"text": "**Is your feature request related to a problem? Please describe.**\r\n\r\nI'm using `ethers-rs` in a daemon application that acts in a \"fire and forget\" fashion. It doesn't care about whether the transactions succeed or not, but only about broadcasting them.\r\n\r\nI need to log the transaction hash once a transaction is sent. However, the `tx_hash` field on struct `PendingTransaction` is made private, forcing me to await it and get the receipt instead.\r\n\r\n**Describe the solution you'd like**\r\n\r\nSimply make the `tx_hash` field on `PendingTransaction` public.\r\n\r\n**Note**\r\n\r\n_Though I've been using `ethers` in TypeScript for quite a while, I'm pretty new to `ethers-rs` and Rust in general. So please forgive me if I'm missing anything. Thanks!_",
"title": "Make tx_hash on PendingTransaction public",
"type": "issue"
},
{
"action": "created",
"author": "gakonst",
"comment_id": 791448459,
"datetime": 1614954028000,
"masked_author": "username_1",
"text": "You can get the `TxHash` inside a `PendingTransaction` by dereferencing it with a `*`, similarly to how it's done [here](https://github.com/username_1/ethers-rs/blob/5c1f8f532ac7da117edb86b0e1c734188646ff5d/ethers-contract/tests/contract.rs#L55)",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "gakonst",
"comment_id": null,
"datetime": 1614954028000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 985 | false | false | 985 | true |
kenkoooo/AtCoderProblems | null | 544,563,688 | 385 | {
"number": 385,
"repo": "AtCoderProblems",
"user_login": "kenkoooo"
} | [
{
"action": "opened",
"author": "kenkoooo",
"comment_id": null,
"datetime": 1577968870000,
"masked_author": "username_0",
"text": "",
"title": "fixed point of virtual contest",
"type": "issue"
}
] | 2 | 2 | 0 | false | true | 0 | false |
i-Naji/emoji_keyboard | null | 831,271,452 | 12 | null | [
{
"action": "opened",
"author": "yixiang",
"comment_id": null,
"datetime": 1615759194000,
"masked_author": "username_0",
"text": "Right now the CategoryIcons.color controls only the header background, but not the foreground color.\r\n\r\n[OPTION 1]\r\n\r\n```\r\nCategoryIcons {\r\n final Color iconColor;\r\n // Right now this field controls background color, but it's not clearly named so.\r\n // It'd nice to rename it to backgroundColor.\r\n final Color color;\r\n}\r\n```\r\n\r\n[OPTION 2] \r\nBetter yet. It will be nice to split the color customization from the icon customization.\r\n\r\n```\r\n// Remove CategoryIcons.color, and add the following two fields to EmojiKeyboard.\r\nEmojiKeyboard {\r\n final Color headerIconColor;\r\n final Color headerBackgroundColor;\r\n}\r\n```",
"title": "[FR] Split add header icon color customization.",
"type": "issue"
},
{
"action": "created",
"author": "yixiang",
"comment_id": 798988043,
"datetime": 1615759327000,
"masked_author": "username_0",
"text": "Looks like this is a dupe of #8, but for the sake of keeping the detailed suggestions here, I'm leaving this open.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 733 | false | false | 733 | false |
GSA/datagov-deploy | GSA | 815,988,144 | 2,890 | null | [
{
"action": "opened",
"author": "chris-macdermaid",
"comment_id": null,
"datetime": 1614217113000,
"masked_author": "username_0",
"text": "## User Story\r\n\r\nAs a person in the O&M role, I want a GitHub Action or Snyk integration to open issues for vulnerabilities.\r\n\r\n## Acceptance Criteria\r\n- [ ] Research Snyk GitHub integration\r\n- [ ] Research GitHub Actions\r\n- [ ] Test possible solutions with with inventory-app\r\n\r\n## Security Considerations ([required](https://nvd.nist.gov/800-53/Rev4/control/CM-4))\r\nNo security concerns. This a research ticket.\r\n\r\n## Sketch\r\nOne potential solution is the [Snyk GitHub Issue Creator](https://github.com/elastic/snyk-github-issue-creator)\r\n\r\nThe Snyk GitHub integration documentation is [here](https://support.snyk.io/hc/en-us/articles/360004032117-GitHub-integration)",
"title": "[research: 1d] Identify GH action or Snyk integration that Opens Issues",
"type": "issue"
},
{
"action": "created",
"author": "jbrown-xentity",
"comment_id": 933760255,
"datetime": 1633373501000,
"masked_author": "username_1",
"text": "We could consider making a github action that runs on dependabot/snyk pr's to run the make update-dependencies command on a PR so that all dependencies are synced and up to date.\r\nThis would involve deactivating all PR's for the full `requirements.txt` file, which is a simple UI fix.\r\nWill need to set some [configuration options](https://docs.github.com/en/actions/security-guides/automatic-token-authentication#permissions-for-the-github_token)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mogul",
"comment_id": 945336055,
"datetime": 1634528552000,
"masked_author": "username_2",
"text": "This looks like a handy alternative that explicitly calls out the things we've said are problems.\r\nhttps://github.com/apps/renovate",
"title": null,
"type": "comment"
}
] | 3 | 3 | 1,247 | false | false | 1,247 | false |
twitchdev/issues | twitchdev | 628,345,160 | 127 | null | [
{
"action": "opened",
"author": "TheHellcat",
"comment_id": null,
"datetime": 1591008946000,
"masked_author": "username_0",
"text": "**Brief description:**\r\nThis is for the Twitch account \"SpaceflowerDE\" and the company \"Spaceflower\", regarding the extension \"Live Games Collection\":\r\n\r\nWe created a new extension and then had our company created in the Twich dev backend.\r\nSince there is no option to move the already created extension to the company, to be able to assign permissions to the actual dev and we haven't done much on it so far, anyways, we wanted to simply delete and re-create it on the company account.\r\nHowever, hitting the \"delete\" button yield and \"not authorized\" message.\r\n\r\n**How to reproduce:**\r\nGo to \"manage extension\", click on \"delete extension\", in the next dialog again click \"delete\" to confirm the action.\r\n\r\n**Expected behavior:**\r\nThe extension beeing deleted.\r\n\r\n**Screenshots:**\r\nn/a\r\n\r\n**Additional context or questions**\r\nWe want to move the extension to the company account, so if it's possible to simple move it from the Twitch account based dev dashboard into the company based one, that'll be super fine.\r\n\r\nThis is for the Twitch account \"SpaceflowerDE\" and the company \"Spaceflower\", regarding the extension \"Live Games Collection\"\r\n\r\nThank you.",
"title": "Unable to delete Extension",
"type": "issue"
},
{
"action": "created",
"author": "BarryCarlyon",
"comment_id": 636786365,
"datetime": 1591009261000,
"masked_author": "username_1",
"text": "Duplicate of #46 \r\n\r\nRelated issue #112 \r\n\r\nYou can't delete an extension if it has no versions",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TheHellcat",
"comment_id": 636787256,
"datetime": 1591009369000,
"masked_author": "username_0",
"text": "I feel really silly now....\r\nThank you! That did the trick :-)",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "TheHellcat",
"comment_id": null,
"datetime": 1591009370000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "BarryCarlyon",
"comment_id": 636791883,
"datetime": 1591009878000,
"masked_author": "username_1",
"text": "Yeah this issue, _normally_ means you dumped out/aborted during the extension create wizard. Hence no version",
"title": null,
"type": "comment"
}
] | 2 | 5 | 1,422 | false | false | 1,422 | false |
hfreire/make-porto-win-european-best-destination-2017 | null | 673,003,365 | 818 | {
"number": 818,
"repo": "make-porto-win-european-best-destination-2017",
"user_login": "hfreire"
} | [
{
"action": "created",
"author": "hfreire",
"comment_id": 670314166,
"datetime": 1596774320000,
"masked_author": "username_0",
"text": ":tada: This PR is included in version 1.0.190 :tada:\n\nThe release is available on [GitHub release](https://github.com/username_0/make-porto-win-european-best-destination-2017/releases/tag/v1.0.190)\n\nYour **[semantic-release](https://github.com/semantic-release/semantic-release)** bot :package::rocket:",
"title": null,
"type": "comment"
}
] | 2 | 2 | 5,584 | false | true | 299 | true |
an-tao/drogon | null | 807,973,387 | 716 | null | [
{
"action": "opened",
"author": "ihmc3jn09hk",
"comment_id": null,
"datetime": 1613313585000,
"masked_author": "username_0",
"text": "Haven't been updated for a while. Unfortunately it is a bug report.\r\n\r\n**Describe the bug**\r\nUsing HttpClient as drogon::plugin. The application crashes on some particular urls.\r\n\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Create a plugin with the `drogon_ctl`, e.g. Pulgin\r\n2. Send requests by HttpClient in the `initAndStart` function, e.g.\r\n\r\n```c++\r\nvoid Pulgin::initAndStart(const Json::Value &config)\r\n{\r\n LOG_DEBUG << \"Start Plugin\";\r\n \tstatic auto func_resp = [](drogon::ReqResult res, const drogon::HttpResponsePtr &resp){\r\n\t\tif ( drogon::ReqResult::Ok != res ){\r\n\t\t\tLOG_WARN << \"GG : \" << static_cast<int>(res) << \" \";\r\n\t\t\treturn;\r\n\t\t}\r\n\t\tLOG_INFO << \"MKay\";\r\n\t};\r\n static auto func_req = [](const std::string &domain, const std::string &path){\r\n\t\tauto url{domain+path};\r\n\t\tauto clientPtr_ = \r\n\t\t\tdrogon::HttpClient::newHttpClient(\r\n\t\t\t\t\"https://\"+domain\r\n\t\t\t);\r\n\t\tclientPtr_->enableCookies();\r\n\t\tauto req_ = drogon::HttpRequest::newHttpRequest();\r\n\t\treq_->setMethod(drogon::Get);\r\n\t\treq_->setPath(path);\r\n\t\t\r\n\t\tclientPtr_->sendRequest(req_, func_resp, 10);\r\n\t};\r\n std::vector<std::string> targetUrls;\r\n //Crashing URLs\r\n targetUrls.emplace_back(\"medium.com/better-programming/understand-5-scopes-of-pytest-fixtures-1b607b5c19ed\");\r\n\ttargetUrls.emplace_back(\"tutorialedge.net/golang/improving-your-tests-with-testify-go\");\r\n\ttargetUrls.emplace_back(\"code.tutsplus.com/articles/rspec-testing-for-beginners-part-1--cms-26716\");\r\n\ttargetUrls.emplace_back(\"proandroiddev.com/testing-with-koin-ade8a46eb4d\");\r\n\ttargetUrls.emplace_back(\"ieftimov.com/post/testing-in-go-go-test\");\r\n \r\n/* E.g. 
Ok URLs\r\n\ttargetUrls.emplace_back(\"github.com/stretchr/testify\");\r\n\ttargetUrls.emplace_back(\"sysout.ru/testirovanie-kontrollerov-s-pomoshhyu-mockmvc\");\r\n\ttargetUrls.emplace_back(\"www.youtube.com/watch?v=8S8o46avgAw\");\r\n\ttargetUrls.emplace_back(\"changhsinlee.com/pytest-mock\");\r\n*/\r\n //Loop through all the urls\r\n\tfor ( size_t i=0; i<targetUrls.size(); ++i ){\r\n\t\tconst std::string &url = targetUrls[i];\r\n\t\tauto pos = url.find(\"/\");\r\n\t\tfunc_req(url.substr(0, pos),\r\n\t\t\t url.substr(pos));\r\n\t}\r\n}\r\n```\r\n\r\n3. Run the app\r\n\r\n```console\r\n20210214 14:26:12.445951 UTC 9509 DEBUG [initAndStart] Start Plugin - Pulgin.cc:36\r\n20210214 14:26:12.445978 UTC 9509 DEBUG [operator()] medium.com/better-programming/understand-5-scopes-of-pytest-fixtures-1b607b5c19ed - Pulgin.cc:48\r\n20210214 14:26:12.446506 UTC 9509 DEBUG [operator()] tutorialedge.net/golang/improving-your-tests-with-testify-go - Pulgin.cc:48\r\n20210214 14:26:12.446903 UTC 9509 DEBUG [operator()] code.tutsplus.com/articles/rspec-testing-for-beginners-part-1--cms-26716 - Pulgin.cc:48\r\n20210214 14:26:12.447242 UTC 9509 DEBUG [operator()] proandroiddev.com/testing-with-koin-ade8a46eb4d - Pulgin.cc:48\r\n20210214 14:26:12.447656 UTC 9509 DEBUG [operator()] ieftimov.com/post/testing-in-go-go-test - Pulgin.cc:48\r\nSegmentation fault\r\n\r\n```\r\n\r\n**Expected behavior**\r\n```console\r\n20210214 14:31:39.445951 UTC 9509 DEBUG [initAndStart] Start Plugin - Pulgin.cc:36\r\n20210214 14:31:39.464824 UTC 9596 DEBUG [operator()] github.com/stretchr/testify - Pulgin.cc:48\r\n20210214 14:31:39.465412 UTC 9596 DEBUG [operator()] sysout.ru/testirovanie-kontrollerov-s-pomoshhyu-mockmvc - Pulgin.cc:48\r\n20210214 14:31:39.465763 UTC 9596 DEBUG [operator()] www.youtube.com/watch?v=8S8o46avgAw - Pulgin.cc:48\r\n20210214 14:31:39.466133 UTC 9596 DEBUG [operator()] changhsinlee.com/pytest-mock - Pulgin.cc:48\r\n20210214 14:31:39.509395 UTC 9596 INFO MKay - 
Pulgin.cc:44\r\n20210214 14:31:39.834578 UTC 9596 INFO MKay - Pulgin.cc:44\r\n20210214 14:31:39.851334 UTC 9596 INFO MKay - Pulgin.cc:44\r\n20210214 14:31:45.271886 UTC 9596 INFO MKay - Pulgin.cc:44\r\n\r\n```\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: Linux\r\n\r\n**Additional context**\r\nHas no clue why is it related to the urls ? Those crashing urls are normal urls which are found from google randomly. They work just fine with Firefox and chrome.",
"title": "HttpClient crashes on some urls",
"type": "issue"
},
{
"action": "created",
"author": "an-tao",
"comment_id": 778799191,
"datetime": 1613318998000,
"masked_author": "username_1",
"text": "@username_0 The PR #717 fixed the crash of this issue, it's because the SSL handshake fails, but I have no idea of why some URLs can't be connected with SSL of Drogon.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ihmc3jn09hk",
"comment_id": 778800892,
"datetime": 1613319735000,
"masked_author": "username_0",
"text": "@username_1 Thank you for the quick reaction. I shall be fixing this.\r\nWill test it later.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "an-tao",
"comment_id": null,
"datetime": 1613351726000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "reopened",
"author": "an-tao",
"comment_id": null,
"datetime": 1613351739000,
"masked_author": "username_1",
"text": "Haven't been updated for a while. Unfortunately it is a bug report.\r\n\r\n**Describe the bug**\r\nUsing HttpClient as drogon::plugin. The application crashes on some particular urls.\r\n\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Create a plugin with the `drogon_ctl`, e.g. Pulgin\r\n2. Send requests by HttpClient in the `initAndStart` function, e.g.\r\n\r\n```c++\r\nvoid Pulgin::initAndStart(const Json::Value &config)\r\n{\r\n LOG_DEBUG << \"Start Plugin\";\r\n \tstatic auto func_resp = [](drogon::ReqResult res, const drogon::HttpResponsePtr &resp){\r\n\t\tif ( drogon::ReqResult::Ok != res ){\r\n\t\t\tLOG_WARN << \"GG : \" << static_cast<int>(res) << \" \";\r\n\t\t\treturn;\r\n\t\t}\r\n\t\tLOG_INFO << \"MKay\";\r\n\t};\r\n static auto func_req = [](const std::string &domain, const std::string &path){\r\n\t\tauto url{domain+path};\r\n\t\tauto clientPtr_ = \r\n\t\t\tdrogon::HttpClient::newHttpClient(\r\n\t\t\t\t\"https://\"+domain\r\n\t\t\t);\r\n\t\tclientPtr_->enableCookies();\r\n\t\tauto req_ = drogon::HttpRequest::newHttpRequest();\r\n\t\treq_->setMethod(drogon::Get);\r\n\t\treq_->setPath(path);\r\n\t\t\r\n\t\tclientPtr_->sendRequest(req_, func_resp, 10);\r\n\t};\r\n std::vector<std::string> targetUrls;\r\n //Crashing URLs\r\n targetUrls.emplace_back(\"medium.com/better-programming/understand-5-scopes-of-pytest-fixtures-1b607b5c19ed\");\r\n\ttargetUrls.emplace_back(\"tutorialedge.net/golang/improving-your-tests-with-testify-go\");\r\n\ttargetUrls.emplace_back(\"code.tutsplus.com/articles/rspec-testing-for-beginners-part-1--cms-26716\");\r\n\ttargetUrls.emplace_back(\"proandroiddev.com/testing-with-koin-ade8a46eb4d\");\r\n\ttargetUrls.emplace_back(\"ieftimov.com/post/testing-in-go-go-test\");\r\n \r\n/* E.g. 
Ok URLs\r\n\ttargetUrls.emplace_back(\"github.com/stretchr/testify\");\r\n\ttargetUrls.emplace_back(\"sysout.ru/testirovanie-kontrollerov-s-pomoshhyu-mockmvc\");\r\n\ttargetUrls.emplace_back(\"www.youtube.com/watch?v=8S8o46avgAw\");\r\n\ttargetUrls.emplace_back(\"changhsinlee.com/pytest-mock\");\r\n*/\r\n //Loop through all the urls\r\n\tfor ( size_t i=0; i<targetUrls.size(); ++i ){\r\n\t\tconst std::string &url = targetUrls[i];\r\n\t\tauto pos = url.find(\"/\");\r\n\t\tfunc_req(url.substr(0, pos),\r\n\t\t\t url.substr(pos));\r\n\t}\r\n}\r\n```\r\n\r\n3. Run the app\r\n\r\n```console\r\n20210214 14:26:12.445951 UTC 9509 DEBUG [initAndStart] Start Plugin - Pulgin.cc:36\r\n20210214 14:26:12.445978 UTC 9509 DEBUG [operator()] medium.com/better-programming/understand-5-scopes-of-pytest-fixtures-1b607b5c19ed - Pulgin.cc:48\r\n20210214 14:26:12.446506 UTC 9509 DEBUG [operator()] tutorialedge.net/golang/improving-your-tests-with-testify-go - Pulgin.cc:48\r\n20210214 14:26:12.446903 UTC 9509 DEBUG [operator()] code.tutsplus.com/articles/rspec-testing-for-beginners-part-1--cms-26716 - Pulgin.cc:48\r\n20210214 14:26:12.447242 UTC 9509 DEBUG [operator()] proandroiddev.com/testing-with-koin-ade8a46eb4d - Pulgin.cc:48\r\n20210214 14:26:12.447656 UTC 9509 DEBUG [operator()] ieftimov.com/post/testing-in-go-go-test - Pulgin.cc:48\r\nSegmentation fault\r\n\r\n```\r\n\r\n**Expected behavior**\r\n```console\r\n20210214 14:31:39.445951 UTC 9509 DEBUG [initAndStart] Start Plugin - Pulgin.cc:36\r\n20210214 14:31:39.464824 UTC 9596 DEBUG [operator()] github.com/stretchr/testify - Pulgin.cc:48\r\n20210214 14:31:39.465412 UTC 9596 DEBUG [operator()] sysout.ru/testirovanie-kontrollerov-s-pomoshhyu-mockmvc - Pulgin.cc:48\r\n20210214 14:31:39.465763 UTC 9596 DEBUG [operator()] www.youtube.com/watch?v=8S8o46avgAw - Pulgin.cc:48\r\n20210214 14:31:39.466133 UTC 9596 DEBUG [operator()] changhsinlee.com/pytest-mock - Pulgin.cc:48\r\n20210214 14:31:39.509395 UTC 9596 INFO MKay - 
Pulgin.cc:44\r\n20210214 14:31:39.834578 UTC 9596 INFO MKay - Pulgin.cc:44\r\n20210214 14:31:39.851334 UTC 9596 INFO MKay - Pulgin.cc:44\r\n20210214 14:31:45.271886 UTC 9596 INFO MKay - Pulgin.cc:44\r\n\r\n```\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: Linux\r\n\r\n**Additional context**\r\nHas no clue why is it related to the urls ? Those crashing urls are normal urls which are found from google randomly. They work just fine with Firefox and chrome.",
"title": "HttpClient crashes on some urls",
"type": "issue"
},
{
"action": "created",
"author": "ihmc3jn09hk",
"comment_id": 779278722,
"datetime": 1613401452000,
"masked_author": "username_0",
"text": "@username_1 #717 prevents crashes. But the handshake part is quite wierd. \r\nhttps://github.com/username_1/trantor/blob/e35fd046ac8a58bec3557c0c207a3e0b8ae57d38/trantor/net/inner/TcpConnectionImpl.cc#L1411",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "an-tao",
"comment_id": 779308056,
"datetime": 1613404241000,
"masked_author": "username_1",
"text": "Yes, I knew this and I'm trying to resolve it.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "marty1885",
"comment_id": 780996851,
"datetime": 1613615122000,
"masked_author": "username_2",
"text": "@username_0 What do you mean by weird?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "an-tao",
"comment_id": 782802135,
"datetime": 1613885094000,
"masked_author": "username_1",
"text": "@username_0 After some investigations, I thought the issue is related to SNI and made a PR #724 to resolve this, please check it out. thanks.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "an-tao",
"comment_id": null,
"datetime": 1613894758000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "ihmc3jn09hk",
"comment_id": 782838726,
"datetime": 1613905410000,
"masked_author": "username_0",
"text": "@username_2 I am sorry for missing your comment. That was related the header of the request. I compared the headers of requests from drogon and Chrome/Firefox, there is one not included, \"Hostname\". I thought was related to this and added this header in drogon but still failed since the actual problem is at the handshake part.\r\n\r\n@username_1 Thank you very much. Will test it out later.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ihmc3jn09hk",
"comment_id": 782906922,
"datetime": 1613933758000,
"masked_author": "username_0",
"text": "@username_1 All the urls posted can be accessed now. Thank you.",
"title": null,
"type": "comment"
}
] | 3 | 12 | 9,153 | false | false | 9,153 | true |
lgaticaq/chilexpress-cli | null | 844,811,870 | 294 | {
"number": 294,
"repo": "chilexpress-cli",
"user_login": "lgaticaq"
} | [
{
"action": "created",
"author": "lgaticaq",
"comment_id": 822726398,
"datetime": 1618860432000,
"masked_author": "username_0",
"text": ":tada: This PR is included in version 3.0.10 :tada:\n\nThe release is available on:\n- [npm package (@latest dist-tag)](https://www.npmjs.com/package/chilexpress-cli/v/3.0.10)\n- [GitHub release](https://github.com/username_0/chilexpress-cli/releases/tag/v3.0.10)\n\nYour **[semantic-release](https://github.com/semantic-release/semantic-release)** bot :package::rocket:",
"title": null,
"type": "comment"
}
] | 3 | 3 | 7,445 | false | true | 362 | true |
th2-net/th2-act-gui-demo | th2-net | 789,376,011 | 2 | {
"number": 2,
"repo": "th2-act-gui-demo",
"user_login": "th2-net"
} | [
{
"action": "opened",
"author": "Nikita-Smirnov-Exactpro",
"comment_id": null,
"datetime": 1611090888000,
"masked_author": "username_0",
"text": "",
"title": "[TH2-1261] Changed docker base image to java11",
"type": "issue"
},
{
"action": "created",
"author": "Vladimir-Panarin-Exactpro",
"comment_id": 763440617,
"datetime": 1611132691000,
"masked_author": "username_1",
"text": "@username_0 Please merge.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 38 | false | false | 38 | true |
JuliaSymbolics/Symbolics.jl | JuliaSymbolics | 831,844,332 | 124 | null | [
{
"action": "opened",
"author": "cadojo",
"comment_id": null,
"datetime": 1615817047000,
"masked_author": "username_0",
"text": "When building `CTarget`s, I'm getting an error when a matrix expression has elements that are simply numbers (not variables). \r\n\r\nThe following example shows the error I'm getting. \r\n\r\n```Julia\r\nusing Symbolics\r\n\r\n@variables x[1:3]\r\nskewsymmetric(ω) = [\r\n 0 -ω[3] ω[2]; \r\n ω[3] 0 -ω[1]; \r\n -ω[2] ω[1] 0;\r\n]\r\n\r\nbuild_function(skewsymmetric(x), x; target = Symbolics.JuliaTarget()); # This is fine!\r\nbuild_function(skewsymmetric(x), x; target = Symbolics.CTarget()); # Produces an error\r\n```\r\n\r\n```Julia\r\nERROR: type Int64 has no field args\r\nStacktrace:\r\n [1] getproperty(x::Int64, f::Symbol)\r\n @ Base ./Base.jl:33\r\n [2] coperators(expr::Int64)\r\n @ Symbolics ~/.julia/packages/Symbolics/L5FOZ/src/build_function.jl:380\r\n [3] |>(x::Int64, f::typeof(Symbolics.coperators))\r\n @ Base ./operators.jl:859\r\n [4] _build_function(target::Symbolics.CTarget, ex::Matrix{Num}, args::Vector{Num}; columnmajor::Bool, conv::Function, expression::Type, fname::Symbol, lhsname::Symbol, rhsnames::Vector{Symbol}, libpath::String, compiler::Symbol)\r\n @ Symbolics ~/.julia/packages/Symbolics/L5FOZ/src/build_function.jl:524\r\n [5] _build_function(target::Symbolics.CTarget, ex::Matrix{Num}, args::Vector{Num})\r\n @ Symbolics ~/.julia/packages/Symbolics/L5FOZ/src/build_function.jl:509\r\n [6] build_function(::Matrix{Num}, ::Vararg{Any, N} where N; target::Symbolics.CTarget, kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})\r\n @ Symbolics ~/.julia/packages/Symbolics/L5FOZ/src/build_function.jl:52\r\n [7] top-level scope\r\n @ REPL[22]:1\r\n```\r\n\r\nThe stack trace shows the error is found at the `for e in expr.args` line below:\r\n```Julia \r\n# build_function.jl:377-381\r\n\r\n# Replace certain multiplication and power expressions so they form valid C code\r\n# Extra factors of 1 are hopefully eliminated by the C compiler\r\nfunction coperators(expr)\r\n for e in expr.args\r\n if e isa Expr\r\n 
...\r\n```\r\n\r\nHappy to submit a PR for this, I think it's a quick fix, but I'm not sure how y'all want the fix to be designed. __Should we restrict `coperators` to types contain `.args` fields? Should we otherwise patch that line in `coperators`?__",
"title": "ERROR: type <Int64 or Float64, etc.> has no field args",
"type": "issue"
},
{
"action": "created",
"author": "ChrisRackauckas",
"comment_id": 799449534,
"datetime": 1615817318000,
"masked_author": "username_1",
"text": "Is that a Julia Expr? Symbolics doesn't assume `.args` exists, and instead supposes you should do `arguments(ex)`. So that could be one thing.\r\n\r\nThe other is that arguments aren't defined if it's just a symbol. `:(0)` has no args, while `:(0 + 0)` does. In the generated expressions we might need to special case single symbols.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ChrisRackauckas",
"comment_id": null,
"datetime": 1615834115000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 2,524 | false | false | 2,524 | false |
NixOS/nixpkgs | NixOS | 836,771,096 | 117,018 | {
"number": 117018,
"repo": "nixpkgs",
"user_login": "NixOS"
} | [
{
"action": "opened",
"author": "cyplo",
"comment_id": null,
"datetime": 1616234875000,
"masked_author": "username_0",
"text": "<!--\r\nTo help with the large amounts of pull requests, we would appreciate your\r\nreviews of other pull requests, especially simple package updates. Just leave a\r\ncomment describing what you have tested in the relevant package/service.\r\nReviewing helps to reduce the average time-to-merge for everyone.\r\nThanks a lot if you do!\r\nList of open PRs: https://github.com/NixOS/nixpkgs/pulls\r\nReviewing guidelines: https://nixos.org/manual/nixpkgs/unstable/#chap-reviewing-contributions\r\n-->\r\n\r\n###### Motivation for this change\r\n\r\n\r\n###### Things done\r\n\r\n<!-- Please check what applies. Note that these are not hard requirements but merely serve as information for reviewers. -->\r\n\r\n- [X] Tested using sandboxing ([nix.useSandbox](https://nixos.org/nixos/manual/options.html#opt-nix.useSandbox) on NixOS, or option `sandbox` in [`nix.conf`](https://nixos.org/nix/manual/#sec-conf-file) on non-NixOS linux)\r\n- Built on platform(s)\r\n - [X] NixOS\r\n - [ ] macOS\r\n - [ ] other Linux distributions\r\n- [ ] Tested via one or more NixOS test(s) if existing and applicable for the change (look inside [nixos/tests](https://github.com/NixOS/nixpkgs/blob/master/nixos/tests))\r\n- [x] Tested compilation of all pkgs that depend on this change using `nix-shell -p nixpkgs-review --run \"nixpkgs-review wip\"`\r\n- [X] Tested execution of all binary files (usually in `./result/bin/`)\r\n- [ ] Determined the impact on package closure size (by running `nix path-info -S` before and after)\r\n- [ ] Ensured that relevant documentation is up to date\r\n- [X] Fits [CONTRIBUTING.md](https://github.com/NixOS/nixpkgs/blob/master/.github/CONTRIBUTING.md).",
"title": "genpass: 0.4.9 -> 0.4.12",
"type": "issue"
},
{
"action": "created",
"author": "r-rmcgibbo",
"comment_id": 803290822,
"datetime": 1616238010000,
"masked_author": "username_1",
"text": "Result of `nixpkgs-review pr 117018` at 4872f7a3 run on aarch64-linux [1](https://github.com/Mic92/nixpkgs-review)\n<details>\n <summary>1 package built successfully:</summary>\n <ul>\n <li><a href=\"https://gist.github.com/c428a76eaa4ee0cf8c93d00d5be468d3\">genpass</a></li></li>\n </ul>\n</details>\n<details>\n <summary>1 suggestion:</summary>\n <ul>\n <li>warning: <a href=\"https://github.com/jtojnar/nixpkgs-hammering/blob/master/explanations/unclear-gpl.md\">unclear-gpl</a>\n\n`agpl3` is a deprecated license, check if project uses `agpl3Plus` or `agpl3Only` and change `meta.license` accordingly.\n\nNear pkgs/tools/security/genpass/default.nix:25:5:\n```\n |\n25 | license = licenses.agpl3;\n | ^\n```\n</li>\n </ul>\n</details>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cyplo",
"comment_id": 803343134,
"datetime": 1616248816000,
"masked_author": "username_0",
"text": "thanks ! should be done now @fabaff :)",
"title": null,
"type": "comment"
}
] | 2 | 3 | 2,398 | false | false | 2,398 | false |
danielkrupinski/Osiris | null | 548,412,551 | 969 | null | [
{
"action": "created",
"author": "Straightplus",
"comment_id": 573310200,
"datetime": 1578743677000,
"masked_author": "username_0",
"text": "its not perfect",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rosesware",
"comment_id": 573315016,
"datetime": 1578747701000,
"masked_author": "username_1",
"text": "Edgejump isn't working on edges. It's jumping before the edge properly.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "danielkrupinski",
"comment_id": null,
"datetime": 1578771552000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "danielkrupinski",
"comment_id": 573347897,
"datetime": 1578771552000,
"masked_author": "username_2",
"text": "It'd be better with prediction which we don't have right now.",
"title": null,
"type": "comment"
}
] | 4 | 5 | 147 | false | true | 147 | false |
rbind/support | rbind | 751,747,089 | 719 | null | [
{
"action": "opened",
"author": "trianglegirl",
"comment_id": null,
"datetime": 1606411731000,
"masked_author": "username_0",
"text": "<!--\r\nPlease use this template for new rbind.io subdomain requests.\r\n\r\nA volunteer will help you create the subdomain later. We don't really have enough human resources here, so please be serious about your website. We hope to see you really make use of your website in the future, instead of simply getting a free subdomain and letting it collect dust in a corner. Thank you!\r\n-->\r\n\r\n## Netlify website address\r\n\r\nusername_0.netlify.app\r\n\r\n## Preferred rbind.io subdomain\r\n\r\nrhian.rbind.io\r\n\r\n### Agreement\r\n\r\n- [x] By submitting this request, I promise I will at least write one blog post or create one web page on my website after I get the rbind.io subdomain.\r\n\r\n\r\nHello, \r\n\r\nI do currently have an rbind.io subdomain username_0.rbind.io.\r\nHowever, I'd really like to be able to have a more professional subdomain. \r\n\r\nI believe rhian.rbind.io is available. If I'm able to have this new one, I obviously won't need the old one \"username_0\" and you can release it.\r\n\r\nMany thanks for your efforts, \r\nRhian",
"title": "subdomain for my personal website",
"type": "issue"
},
{
"action": "created",
"author": "nanxstats",
"comment_id": 735007235,
"datetime": 1606518022000,
"masked_author": "username_1",
"text": "@username_0 We have configured the rbind subdomain you requested. Please:\r\n\r\n1. Add the rbind subdomain in your Netlify account as the custom domain to your site, as is shown below. Simply use your `subdomain.rbind.io` as the record, no `www` needed.\r\n\r\n\r\n\r\n2. There might be messages saying \"DNS verification failed\" or \"check DNS configuration\" on Netlify after adding the rbind subdomain -- they can be safely ignored as long as the site renders properly.\r\n\r\n3. You may also find the last two sections of [this post](https://username_3.name/en/2017/11/301-redirect/) helpful for redirecting HTTP traffic to HTTPS automatically.\r\n\r\nThanks!\r\n-Nan",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "nanxstats",
"comment_id": 735007497,
"datetime": 1606518149000,
"masked_author": "username_1",
"text": "@username_3 -following Rhian's request, could you please help remove the record `username_0.rbind.io`? Thanks.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "StatsRhian",
"comment_id": 735822528,
"datetime": 1606746868000,
"masked_author": "username_2",
"text": "Have done - thanks @username_1",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "yihui",
"comment_id": 735913212,
"datetime": 1606755683000,
"masked_author": "username_3",
"text": "@username_1 Done.",
"title": null,
"type": "comment"
}
] | 4 | 5 | 1,913 | false | false | 1,913 | true |
microsoft/winget-pkgs | microsoft | 875,528,922 | 11,987 | {
"number": 11987,
"repo": "winget-pkgs",
"user_login": "microsoft"
} | [
{
"action": "opened",
"author": "jedieaston",
"comment_id": null,
"datetime": 1620138978000,
"masked_author": "username_0",
"text": "- [x] Have you signed the [Contributor License Agreement](https://cla.opensource.microsoft.com/microsoft/winget-pkgs)?\r\n- [x] Have you checked that there aren't other open [pull requests](https://github.com/microsoft/winget-pkgs/pulls) for the same manifest update/change?\r\n- [x] Have you validated your manifest locally with `winget validate --manifest <path>`? \r\n- [x] Have you tested your manifest locally with `winget install --manifest <path>`?\r\n- [x] Does your manifest conform to the [1.0 schema](https://github.com/microsoft/winget-cli/blob/master/doc/ManifestSpecv1.0.md)?\r\n\r\nNote: `<path>` is the name of the directory containing the manifest you're submitting.\r\n\r\n-----\r\n\r\n\n\n###### Microsoft Reviewers: [Open in CodeFlow](https://portal.fabricbot.ms/api/codeflow?pullrequest=https://github.com/microsoft/winget-pkgs/pull/11987)",
"title": "Added KDE Connect version 1.4.578.",
"type": "issue"
}
] | 4 | 5 | 1,482 | false | true | 838 | false |
meggart/Zarr.jl | null | 742,023,589 | 46 | {
"number": 46,
"repo": "Zarr.jl",
"user_login": "meggart"
} | [
{
"action": "created",
"author": "meggart",
"comment_id": 726666277,
"datetime": 1605261217000,
"masked_author": "username_0",
"text": "There seems to be something wrong on windows with the new HTTP version. The tests seem to complain about a non-existing libmbedtls. @username_1 you used to have access to windows machines, would you mind to test if you can reproduce?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "visr",
"comment_id": 726679293,
"datetime": 1605262692000,
"masked_author": "username_1",
"text": "I tried and got a different error on nightly:\r\n```\r\nERROR: Unable to automatically install 'MozillaCACerts' from 'C:\\bin\\julia-1.6\\share\\julia\\stdlib\\v1.6\\MozillaCACerts_jll\\Artifacts.toml'\r\n```\r\n\r\nThis seems to be only an issue on julia nightly, which currently is seeing a lot of changes on Pkg and JLL packages. I'd say let's merge this, and once 1.6 gets to the stabilization phase, we can make sure it is working again. It may need something like https://github.com/JuliaLang/julia/pull/38347.",
"title": null,
"type": "comment"
}
] | 3 | 3 | 1,039 | false | true | 725 | true |
sct/overseerr | null | 777,405,614 | 558 | {
"number": 558,
"repo": "overseerr",
"user_login": "sct"
} | [
{
"action": "opened",
"author": "samwiseg0",
"comment_id": null,
"datetime": 1609563490000,
"masked_author": "username_0",
"text": "#### Description\r\nUpdate docs for ubuntu. Add ubuntu arm install instructions. \r\n\r\n#### Issues Fixed or Closed by this PR\r\n\r\nN/A",
"title": "docs: Update Ubuntu install instructions [skip ci]",
"type": "issue"
}
] | 2 | 2 | 400 | false | true | 128 | false |
manfredsteyer/angular-oauth2-oidc | null | 632,603,657 | 855 | {
"number": 855,
"repo": "angular-oauth2-oidc",
"user_login": "manfredsteyer"
} | [
{
"action": "opened",
"author": "JohannesHuster",
"comment_id": null,
"datetime": 1591464022000,
"masked_author": "username_0",
"text": "Hi again,\r\n\r\nlike #853 this PR also solves issue #773. After researching a bit more I found a different solution. Because each solution might have different tradeoffs for you, I will leave both PRs open for now. \r\n\r\nI removed the `emitDecoratorMetadata` option, which is only needed for JIT support, if I am not mistaken: https://github.com/angular/angular/issues/30586#issuecomment-494898597. So with Angular 9 it should not be needed.\r\n\r\nThe actual issue seems to be a TypeScript limitation, but I opened issue https://github.com/angular/angular/issues/37472 with the angular team, to check whether this is a regression to a workaround they have implemented. From that issue you can find more information from the Angular team in linked issues that describe the original problem very well.\r\n\r\nThank you!",
"title": "Fix SSR ReferenceError with decorator metadata",
"type": "issue"
},
{
"action": "created",
"author": "manfredsteyer",
"comment_id": 651874723,
"datetime": 1593531536000,
"masked_author": "username_1",
"text": "Let's go with your other PR.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 833 | false | false | 833 | false |
googlemaps/js-jest-mocks | googlemaps | 926,588,904 | 2 | {
"number": 2,
"repo": "js-jest-mocks",
"user_login": "googlemaps"
} | [
{
"action": "opened",
"author": "jpoehnelt",
"comment_id": null,
"datetime": 1624308929000,
"masked_author": "username_0",
"text": "",
"title": "fix: use cjs output",
"type": "issue"
}
] | 3 | 3 | 304 | false | true | 0 | false |
openshift/certman-operator | openshift | 891,388,986 | 192 | {
"number": 192,
"repo": "certman-operator",
"user_login": "openshift"
} | [
{
"action": "opened",
"author": "2uasimojo",
"comment_id": null,
"datetime": 1620941026000,
"masked_author": "username_0",
"text": "To make the OLM template easier to understand, replace `${REPO_DIGEST}` with `${REGISTRY_IMG}@${IMAGE_DIGEST}`.\r\n\r\n`IMAGE_DIGEST` is supported as of APPSRE-3265.",
"title": "Use IMAGE_DIGEST in OLM template",
"type": "issue"
},
{
"action": "created",
"author": "yithian",
"comment_id": 842374541,
"datetime": 1621261928000,
"masked_author": "username_1",
"text": "/lgtm",
"title": null,
"type": "comment"
}
] | 4 | 5 | 2,930 | false | true | 166 | false |
ppy/osu | ppy | 894,398,463 | 12,851 | {
"number": 12851,
"repo": "osu",
"user_login": "ppy"
} | [
{
"action": "opened",
"author": "vininew921",
"comment_id": null,
"datetime": 1621345307000,
"masked_author": "username_0",
"text": "Fixes #12823\r\n\r\nMoved the applause sound object from the AccuracyCircle to the ResultScreen. Timed the delay so it matches the old one.",
"title": "Fixed applause sound stopping after switching scores",
"type": "issue"
},
{
"action": "created",
"author": "bdach",
"comment_id": 843215741,
"datetime": 1621347801000,
"masked_author": "username_1",
"text": "@username_0 it is not required to close pull requests after checks fail. you can reopen and push to the same branch you PRed this from (on your fork) to fix them.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "peppy",
"comment_id": 843380180,
"datetime": 1621358731000,
"masked_author": "username_2",
"text": "Which is fine, but all numbers should be drawn from constants, and match with `master`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bdach",
"comment_id": 843382958,
"datetime": 1621358986000,
"masked_author": "username_1",
"text": "I'm not sure I follow. The adjusted value I suggested in my review comment *is* the same value, temporally. The number changes because the delay starts in a different component, with a different lifetime start, so delays are naturally affected. And the original constant is one that comes out of thin air.\r\n\r\nI can make videos to show that it's the same, but at the end of the day, from the context I have the best that can be done is `-1440 + resize_duration + top_layer_expand_delay`. I'm not sure how to derive `-1440` originally.\r\n\r\nAm I missing some context again?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "peppy",
"comment_id": 843389752,
"datetime": 1621359450000,
"masked_author": "username_2",
"text": "I want to be able to review this by only reading the code. each portion of the duration which needs to be offset should be from a shared constant so it's clear why the value is what it is. And the original `1440` should be visible.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bdach",
"comment_id": 843394379,
"datetime": 1621359652000,
"masked_author": "username_1",
"text": "Sure, okay. So I guess I factor out `-1440` to a public constant on `AccuracyCircle`, expose the other two ones I need from `ScorePanel`, and combine those in `ResultsScreen`? Is that right?\r\n\r\nAlthough even then it wouldn't be immediately visible why `resize_duration + top_layer_expand_delay` is a thing from the diff. I'd probably need to expose a sum of those and use that in both places where it needs to go (one at existing site in `ScorePanel`, one in `ResultsScreen`).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "peppy",
"comment_id": 843395717,
"datetime": 1621359714000,
"masked_author": "username_2",
"text": "If this is decided as the correct direction for fixing this issue, yep, that sounds fine.\r\n\r\nI'd hope we could come up with something more contained, but I haven't looked into it yet.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bdach",
"comment_id": 843400114,
"datetime": 1621360141000,
"masked_author": "username_1",
"text": "I can't say I'm seeing an alternative as of right now, at least one that isn't a variant on `AlwaysPresent`. And it does make more sense to have an applause sound effect at screen-level to me than down in the accuracy circle.\r\n\r\nI guess I'll just go make changes as proposed above and if you can figure out something better, it can be opened as a separate PR superseding this one.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vininew921",
"comment_id": 843453847,
"datetime": 1621364408000,
"masked_author": "username_0",
"text": "Not sure why the CI check failed, I ran the same tests on my end and everything worked fine. I'm not that familiar with CI, so I don't know if it was just a fluke or a real error.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "peppy",
"comment_id": 845778517,
"datetime": 1621585749000,
"masked_author": "username_2",
"text": "I've added back test coverage in `TestSceneResultsScreen`. Let's go with this for now.",
"title": null,
"type": "comment"
}
] | 3 | 10 | 2,488 | false | false | 2,488 | true |
home-assistant/core | home-assistant | 866,817,275 | 49,634 | {
"number": 49634,
"repo": "core",
"user_login": "home-assistant"
} | [
{
"action": "opened",
"author": "janiversen",
"comment_id": null,
"datetime": 1619291498000,
"masked_author": "username_0",
"text": "<!--\r\n You are amazing! Thanks for contributing to our project!\r\n Please, DO NOT DELETE ANY TEXT from this template! (unless instructed).\r\n-->\r\n## Breaking change\r\n<!--\r\n If your PR contains a breaking change for existing users, it is important\r\n to tell them what breaks, how to make it work again and why we did this.\r\n This piece of text is published with the release notes, so it helps if you\r\n write it towards our users, not us.\r\n Note: Remove this section if this PR is NOT a breaking change.\r\n-->\r\n\r\n\r\n## Proposed change\r\n<!--\r\n Describe the big picture of your changes here to communicate to the\r\n maintainers why we should accept this pull request. If it fixes a bug\r\n or resolves a feature request, be sure to link to that issue in the\r\n additional information section.\r\n-->\r\nSmall cleanup in modbus sensor.py of non-reachable/unclear code detected while making 100% test coverage.\r\n\r\nSet value = last known state (which can be None)\r\nRemove non reachable code from struct handling.\r\n\r\n## Type of change\r\n<!--\r\n What type of change does your PR introduce to Home Assistant?\r\n NOTE: Please, check only 1! box!\r\n If your PR requires multiple boxes to be checked, you'll most likely need to\r\n split it into multiple PRs. This makes things easier and faster to code review.\r\n-->\r\n\r\n- [ ] Dependency upgrade\r\n- [ ] Bugfix (non-breaking change which fixes an issue)\r\n- [ ] New integration (thank you!)\r\n- [ ] New feature (which adds functionality to an existing integration)\r\n- [ ] Breaking change (fix/feature causing existing functionality to break)\r\n- [x] Code quality improvements to existing code or addition of tests\r\n\r\n## Example entry for `configuration.yaml`:\r\n<!--\r\n Supplying a configuration snippet, makes it easier for a maintainer to test\r\n your PR. 
Furthermore, for new integrations, it gives an impression of how\r\n the configuration would look like.\r\n Note: Remove this section if this PR does not have an example entry.\r\n-->\r\n\r\n```yaml\r\n# Example configuration.yaml\r\n\r\n```\r\n\r\n## Additional information\r\n<!--\r\n Details are important, and help maintainers processing your PR.\r\n Please be sure to fill out additional details, if applicable.\r\n-->\r\n\r\n- This PR fixes or closes issue: fixes #\r\n- This PR is related to issue: \r\n- Link to documentation pull request: \r\n\r\n## Checklist\r\n<!--\r\n Put an `x` in the boxes that apply. You can also fill these out after\r\n creating the PR. If you're unsure about any of them, don't hesitate to ask.\r\n We're here to help! This is simply a reminder of what we are going to look\r\n for before merging your code.\r\n-->\r\n\r\n- [x] The code change is tested and works locally.\r\n- [x] Local tests pass. **Your PR cannot be merged unless tests pass**\r\n- [x] There is no commented out code in this PR.\r\n- [x] I have followed the [development checklist][dev-checklist]\r\n- [x] The code has been formatted using Black (`black --fast homeassistant tests`)\r\n- [ ] Tests have been added to verify that the new code works.\r\n\r\nIf user exposed functionality or configuration variables are added/changed:\r\n\r\n- [ ] Documentation added/updated for [www.home-assistant.io][docs-repository]\r\n\r\nIf the code communicates with devices, web services, or third-party tools:\r\n\r\n- [ ] The [manifest file][manifest-docs] has all fields filled out correctly. \r\n Updated and included derived files by running: `python3 -m script.hassfest`.\r\n- [ ] New or updated dependencies have been added to `requirements_all.txt`. 
\r\n Updated by running `python3 -m script.gen_requirements_all`.\r\n- [ ] Untested files have been added to `.coveragerc`.\r\n\r\nThe integration reached or maintains the following [Integration Quality Scale][quality-scale]:\r\n<!--\r\n The Integration Quality Scale scores an integration on the code quality\r\n and user experience. Each level of the quality scale consists of a list\r\n of requirements. We highly recommend getting your integration scored!\r\n-->\r\n\r\n- [x] No score or internal\r\n- [ ] 🥈 Silver\r\n- [ ] 🥇 Gold\r\n- [ ] 🏆 Platinum\r\n\r\n<!--\r\n This project is very active and we have a high turnover of pull requests.\r\n\r\n Unfortunately, the number of incoming pull requests is higher than what our\r\n reviewers can review and merge so there is a long backlog of pull requests\r\n waiting for review. You can help here!\r\n \r\n By reviewing another pull request, you will help raise the code quality of\r\n that pull request and the final review will be faster. This way the general\r\n pace of pull request reviews will go up and your wait time will go down.\r\n \r\n When picking a pull request to review, try to choose one that hasn't yet\r\n been reviewed.\r\n\r\n Thanks for helping out!\r\n-->\r\n\r\nTo help with the load of incoming pull requests:\r\n\r\n- [ ] I have reviewed two other [open pull requests][prs] in this repository.\r\n\r\n[prs]: https://github.com/home-assistant/core/pulls?q=is%3Aopen+is%3Apr+-author%3A%40me+-draft%3Atrue+-label%3Awaiting-for-upstream+sort%3Acreated-desc+review%3Anone\r\n\r\n<!--\r\n Thank you for contributing <3\r\n\r\n Below, some useful links you could explore:\r\n-->\r\n[dev-checklist]: https://developers.home-assistant.io/docs/en/development_checklist.html\r\n[manifest-docs]: https://developers.home-assistant.io/docs/en/creating_integration_manifest.html\r\n[quality-scale]: https://developers.home-assistant.io/docs/en/next/integration_quality_scale_index.html\r\n[docs-repository]: 
https://github.com/home-assistant/home-assistant.io",
"title": "Remove dead code in sensor.py and 100% test coverage.",
"type": "issue"
}
] | 2 | 2 | 5,710 | false | true | 5,419 | false |
Azure/azure-functions-dotnet-worker | Azure | 841,534,132 | 363 | null | [
{
"action": "opened",
"author": "ssa3512",
"comment_id": null,
"datetime": 1616725947000,
"masked_author": "username_0",
"text": "I was playing around with middleware and storage queue triggered functions (this will be so nice to have) but the way the data was stored in the function execution context did not line up with my expectations when the function is taking a strongly typed object coming in as JSON.\r\n\r\nGiven the class:\r\n```cs\r\npublic class MessageInfo\r\n{\r\n public int Id { get; set; }\r\n public string Name { get; set; }\r\n}\r\n```\r\nand the queue message `{\"Name\":\"Steve\",\"Id\":2}`\r\n\r\nI get the following binding data\r\n\r\n\r\nIt seems odd that the individual property values are in the binding data dictionary as opposed to the deserialized MessageInfo object.\r\n\r\nCan you provide some clarification on this? It feels even weirder if I use a nested data structure like this:\r\n```cs\r\npublic class MessageInfo\r\n{\r\n public int Id { get; set; }\r\n public Person Person { get; set; }\r\n}\r\n\r\npublic class Person\r\n{\r\n public string FirstName { get; set; }\r\n public string LastName { get; set; }\r\n}`\r\n```\r\nThe \"Id\" field is in the data dictionary as the raw value, and the \"Person\" value is there as JSON. The entire serialized value is there as QueueTrigger.\r\n\r\n\r\n\r\nIs there any way to access the fully deserialized object from middleware, or will I need to manually deserialize to access the data in a strongly typed manner? What would happen if the incoming object used in my binding had a top level property that conflicted with one of the defaults on the binding type (`DateTime ExpirationTime` for example) - which value would show in the dictionary?",
"title": "BindingData values are confusing and don't line up with input object structure",
"type": "issue"
},
{
"action": "created",
"author": "fabiocav",
"comment_id": 808443497,
"datetime": 1616784713000,
"masked_author": "username_1",
"text": "The data exposed in biding data is the raw information we receive from the actual extension running in WebJobs. This is pre-processing/binding in the worker (those are the values ultimately used by the converters to perform model binding).\r\n\r\nWhen binding to POCOs/Complex types, most extensions (queues included) will default to JSON serialization. This is ultimately controlled by the extension itself, and you'll see changes depending on how things are configured (including enhancements in the future, like the ability to pass an object reference only). \r\n\r\nBased on what you're describing, it sounds like you'd better served by working directly against the model binding feature (which is still undergoing reviews, so not part of the public API), which performs all the conversions and will expose the APIs you'd need to get the binding payload as a specific type. \r\n\r\nFor the binding rules (for default values, for example), it's hard to answer that generically since ultimately, the extension author is in control (i.e. the storage extension defines how that information is passed to, and received from, the worker)\r\n\r\nRight now, with the current state of what is currently exposed, handling the deserialization manually would be the most straight forward way, but this is an example of an area that will be seeing a lot of enhancements to make scenarios like what you're describing simpler.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ssa3512",
"comment_id": 808444941,
"datetime": 1616784884000,
"masked_author": "username_0",
"text": "Thanks - it sounds like this is exactly what I am looking for. Do you have an separate public issue/discussion about this API I can follow?",
"title": null,
"type": "comment"
}
] | 2 | 3 | 3,293 | false | false | 3,293 | false |
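The thread above closes with the suggestion that, until model binding is exposed, manually deserializing the queue payload is the most straightforward option. As a language-neutral illustration (Python here, not the .NET worker API; the field names simply mirror the `MessageInfo`/`Person` example from the issue), manual deserialization of the nested JSON might look like:

```python
import json
from dataclasses import dataclass


@dataclass
class Person:
    first_name: str
    last_name: str


@dataclass
class MessageInfo:
    id: int
    person: Person


def parse_message(raw: str) -> MessageInfo:
    # The queue payload arrives as a raw JSON string; nested objects
    # must be deserialized explicitly into their typed wrappers.
    data = json.loads(raw)
    person = data["Person"]
    return MessageInfo(
        id=data["Id"],
        person=Person(
            first_name=person["FirstName"],
            last_name=person["LastName"],
        ),
    )


msg = parse_message('{"Id": 2, "Person": {"FirstName": "Ada", "LastName": "Lovelace"}}')
```

The same shape applies in any language: parse the raw `QueueTrigger` string once, then work against the typed object instead of the flattened binding-data dictionary.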
FNNDSC/pl-fetal-brain-assessment | FNNDSC | 859,516,035 | 4 | null | [
{
"action": "opened",
"author": "jennydaman",
"comment_id": null,
"datetime": 1618555730000,
"masked_author": "username_0",
"text": "https://github.com/FNNDSC/pl-fetal-brain-assessment/blob/04adb7d08ced030bcd670e05ddb7ca4a60dc69ff/fetal_brain_assessment/volume.py#L48-L49\r\n\r\nInput is required to have very specific dimensions. Not sure about how dimensions of the image affect model evaluation.",
"title": "Hard-coded image dimensions are not generalized",
"type": "issue"
}
] | 1 | 1 | 261 | false | false | 261 | false |
linagora/james-project | linagora | 754,935,890 | 4,102 | {
"number": 4102,
"repo": "james-project",
"user_login": "linagora"
} | [
{
"action": "opened",
"author": "LanKhuat",
"comment_id": null,
"datetime": 1606884403000,
"masked_author": "username_0",
"text": "",
"title": "WIP: JAMES-3460 MailboxChangeListener implementation",
"type": "issue"
},
{
"action": "created",
"author": "chibenwa",
"comment_id": 737596894,
"datetime": 1606958452000,
"masked_author": "username_1",
"text": "```\r\n[607469e5041658d555a5f7f24b6aab07f1b53453] [ERROR] MemorySpamAssassinContractTest.spamShouldBeDeliveredInSpamMailboxOrInboxWhenMultipleRecipientsConfigurations » ConditionTimeout\r\n```\r\n\r\ntest this please",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chibenwa",
"comment_id": 739137843,
"datetime": 1607152309000,
"masked_author": "username_1",
"text": "Merged",
"title": null,
"type": "comment"
}
] | 2 | 3 | 216 | false | false | 216 | false |
navikt/fp-frontend | navikt | 813,328,331 | 756 | {
"number": 756,
"repo": "fp-frontend",
"user_login": "navikt"
} | [
{
"action": "opened",
"author": "pekern",
"comment_id": null,
"datetime": 1613987058000,
"masked_author": "username_0",
"text": "",
"title": "FIX: Fikser warnings i konsollen i fakta om beregning panel",
"type": "issue"
}
] | 2 | 2 | 3,394 | false | true | 0 | false |
CareBoo/Serially | CareBoo | 679,721,753 | 58 | {
"number": 58,
"repo": "Serially",
"user_login": "CareBoo"
} | [
{
"action": "opened",
"author": "jasonboukheir",
"comment_id": null,
"datetime": 1597569939000,
"masked_author": "username_0",
"text": "Adding TypeFilterAttribute support to the ShowSerializeReference fields.\n\nFixes #52",
"title": "jasonboukheir/issue52",
"type": "issue"
}
] | 3 | 3 | 3,520 | false | true | 83 | false |
dotnet/performance | dotnet | 563,448,929 | 1,179 | null | [
{
"action": "opened",
"author": "cshung",
"comment_id": null,
"datetime": 1581451907000,
"masked_author": "username_0",
"text": "Recently I have [fixed](https://github.com/dotnet/runtime/pull/2474) a [performance issue](https://github.com/dotnet/runtime/issues/2274) related to finalizing `WeakReference<T>`. We should have some performance tests in place to test the performance of it.",
"title": "Test WeakReference<T> finalization performance.",
"type": "issue"
}
] | 1 | 1 | 257 | false | false | 257 | false |
Azagwen/ATBYW | null | 771,206,699 | 2 | null | [
{
"action": "opened",
"author": "Tetrajak",
"comment_id": null,
"datetime": 1608330688000,
"masked_author": "username_0",
"text": "MC Version: 1.16.4\r\nFabric Version: 0.10.8\r\nJava Version: 1.8.0_271, 64-bit\r\nLauncher: MultiMC 5\r\n\r\nAs per the title, with the mod installed the game fails to launch, but without it the instance launches just fine.\r\n\r\nCrash log: https://pastebin.com/pMW1paWh",
"title": "Crash on Startup",
"type": "issue"
},
{
"action": "created",
"author": "Azagwen",
"comment_id": 748357874,
"datetime": 1608332035000,
"masked_author": "username_1",
"text": "It seems like \"charm\" interferes with my mod, because it also modifies how the Enchanting table works.\r\nI don't really know how to deal with this kind of interference issue yet but be sure I'll gather info on how to do so !",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Tetrajak",
"comment_id": 748376890,
"datetime": 1608335445000,
"masked_author": "username_0",
"text": "I can confirm the game launches if Charm is removed.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Azagwen",
"comment_id": 748391991,
"datetime": 1608339139000,
"masked_author": "username_1",
"text": "Temporary fix I can suggest is to dig into my mod's jar and edit \"atbyw.mixins.json\" to remove \"EnchantingTableBlockMixin\" fron \"client\" mixins.\r\nIf it still doesn't launch after that edit feel free to come back to me with the new log.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RDKRACZ",
"comment_id": 748464769,
"datetime": 1608379156000,
"masked_author": "username_2",
"text": "I removed \"EnchantingTableBlockMixin\" and the game can now launch with both mods.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "SylisMC",
"comment_id": 751317590,
"datetime": 1608957708000,
"masked_author": "username_3",
"text": "I found a better fix.\r\nCharm and ATBYW both add bookshelves, exact same blocks, exact same recipes, However, ATBYW also adds Redstone bookshelves. also, charm has a diverse config and ATBYW doesnt have a config file yet. so.\r\n\r\n1) Remove \"EnchantingTableBlockMixin\" From charms mod file\r\n2) Set `\"VariantBookshelves Enabled\" = false` in the charm config",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Azagwen",
"comment_id": null,
"datetime": 1609273135000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 4 | 7 | 1,202 | false | false | 1,202 | false |
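The workaround traded in the comments above is to open the mod jar and drop `EnchantingTableBlockMixin` from the `client` list in `atbyw.mixins.json`. A small sketch of that edit (the config layout shown is a typical Fabric mixin file and is assumed, not copied from the mod; the second mixin name is a placeholder):

```python
import json


def remove_client_mixin(config_text: str, mixin: str) -> str:
    # Fabric mixin configs keep per-environment mixin lists under
    # keys such as "client"; drop the conflicting entry if present.
    config = json.loads(config_text)
    client = config.get("client", [])
    if mixin in client:
        client.remove(mixin)
    return json.dumps(config, indent=2)


# Hypothetical file content, for illustration only.
example = '{"package": "example.mixin", "client": ["EnchantingTableBlockMixin", "OtherMixin"]}'
patched = remove_client_mixin(example, "EnchantingTableBlockMixin")
```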
azimjohn/jprq | null | 811,081,971 | 25 | null | [
{
"action": "opened",
"author": "MP3Martin",
"comment_id": null,
"datetime": 1613653541000,
"masked_author": "username_0",
"text": "Can i use this to open TCP and not HTTP?",
"title": "TCP",
"type": "issue"
},
{
"action": "created",
"author": "azimjohn",
"comment_id": 781333981,
"datetime": 1613653962000,
"masked_author": "username_1",
"text": "Currently, no.\n\nI am working on adding a feature for exposing any TCP server: including ssh and http (with websocket).\n\nHope it will be ready for using in early April this year.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MP3Martin",
"comment_id": 782185855,
"datetime": 1613752235000,
"masked_author": "username_0",
"text": "Will the tpc be like this⬇ or like Ngrok?:\r\nNgrok generates something like 0.ngrok.com:56842 (same adress, random port)\r\nJprq (same port, different adress)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "azimjohn",
"comment_id": 782188327,
"datetime": 1613752465000,
"masked_author": "username_1",
"text": "it will provide something like tcp.jprq.io:56842, same host & different port, like ngrok.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MP3Martin",
"comment_id": 782188877,
"datetime": 1613752518000,
"masked_author": "username_0",
"text": "Will there be any way to edit the port? bc i need 25565",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "azimjohn",
"comment_id": 782190387,
"datetime": 1613752665000,
"masked_author": "username_1",
"text": "Sure, I can design it the way that you can request a specific port. It will grant it if the port isn't busy.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MP3Martin",
"comment_id": 782590459,
"datetime": 1613811389000,
"masked_author": "username_0",
"text": "thanks and nice work 👍",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "MP3Martin",
"comment_id": null,
"datetime": 1613811409000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "Raparivo",
"comment_id": 1073963154,
"datetime": 1647872664000,
"masked_author": "username_2",
"text": "Hi,\r\nTCP is now added and working. What about the ability to requesting a specific port?\r\nThank you.",
"title": null,
"type": "comment"
}
] | 3 | 9 | 746 | false | false | 746 | false |
Kethku/neovide | null | 927,236,094 | 740 | null | [
{
"action": "opened",
"author": "SenseiRalph",
"comment_id": null,
"datetime": 1624368529000,
"masked_author": "username_0",
"text": "The concurrent key press issue has been fixed by [reverting back to winit](https://github.com/username_2/neovide/issues/244#issuecomment-865750061), but this introduced somewhat laggy cursor animations. They look like they have a lower framerate to them, like 30 fps instead of 60.\r\n\r\nI built Neovide today (2021-05-06) on Windows 21H1. My previous build was from 2021-05-06 and had smooth cursor animations but had [this](https://github.com/username_2/neovide/issues/244) issue (now fixed). Both builds were made following the instructions on the repo main page.",
"title": "Lower framerate cursor animations with winit",
"type": "issue"
},
{
"action": "created",
"author": "shaunsingh",
"comment_id": 866854337,
"datetime": 1624456295000,
"masked_author": "username_1",
"text": "Hi, could you please provide a screenrecording if possible?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Kethku",
"comment_id": 868726040,
"datetime": 1624642553000,
"masked_author": "username_2",
"text": "I think I've also noticed this problem. I believe it has to do with the way that the animation loop works. Currently it uses the WaitUntil https://docs.rs/winit/0.25.0/winit/event_loop/enum.ControlFlow.html#variant.WaitUntil to run the animation loop. Either this should be swapped to RequestAnimationFrame style looping or we should swap to Poll and try to sleep at appropriate time to make sure we don't just spin the cpu.\r\n\r\nI took a stab at this earlier this week, but gave up with my initial approach using Wait and RequestAnimationFrame because I was trying too hard to hack it in.\r\n\r\nIf anybody has any thoughts in this regard I'm all ears as I think this should probably block the next release.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fredizzimo",
"comment_id": 1007705890,
"datetime": 1641586283000,
"masked_author": "username_3",
"text": "I suggest that you try to decouple the rendering loop from the event loop. Yes I know that's not an easy task, but that would give you much more control, you can for example enable vsync, and get very precise timing.\r\n\r\nBut even with the current loop, or with a separate render thread, but vsync disabled, you should be able to do much better if you fix your timestep to the display refresh rate, or a divisor of it. That can be done using an accumulator like Glenn Fiedler describes in this excellent article https://gafferongames.com/post/fix_your_timestep/. But forget the \"The final touchup section\" with interpolation, that's not applicable here, since you can decide your timestep, even on the fly.\r\n\r\nYou also need to make sure that you try to render the frame at least at this rate, preferably twice the rate, because it does not matter that you render the same frame twice in a vblank interval, it's still only going to be displayed once and look smooth. Otherwise it's very easy to miss a frame. \r\n\r\nWith vsync that's easy to do, but without vsync, you need to simulate it. So instead of letting the time drift, like you do with `previous_frame_start + frame_duration`, you do `expected_frame_length_seconds * ceil(current_time / expected_frame_length_seconds)`, which should wait until the next time exactly divisible by the frame length. `current_time` should be read just before calling `WaitUntil`, since the most important thing is to come out of the wait at an as consistent rate as possible.\r\n\r\nIf the `expected_frame_length_seconds`, corresponds exactly to the monitor refresh rate, then the above technique might jitter quite a bit if you are unlucky, since it's possible that the expected frame starts are very close to the vsync times, and then it can constantly jump between the previous and next frames. That's why twice the rate is better. 
Alternatively you could try using 59 Hz instead of 60 Hz for example (for the rendering, the update stays at 60Hz), then you would only see a skipped frame once per second, which is much better.\r\n\r\nI hope this at least helps you get some ideas, since improvements can likely be seen even when just implementing parts of this. I also can't guarantee that it will work in your case, since I'm taking the general ideas from the top of my head , but with some experience after working almost 20 years in the games industry.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Kethku",
"comment_id": 1007726445,
"datetime": 1641588053000,
"masked_author": "username_2",
"text": "Super valuable insight. Another part of it that may be valuable is to make the frame drops visible in some way. Lots of systems display graphs of frame times via some sort of spark line. I can't help but think having an option which overlays information like that would make addressing this issue easier.\r\n\r\nCurrently all we have to go on is \"it feels slower\" which is hard to turn into actionable changes, and even if we make a change, its not obvious how to verify that it worked.\r\n\r\nI figure something as simple as just putting the frame length in a rolling buffer and then displaying that on the screen would be valuable, but I'm curious if you have any thoughts or insight to give for verifying responsiveness.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fredizzimo",
"comment_id": 1007740193,
"datetime": 1641589537000,
"masked_author": "username_3",
"text": "I just tested it. The Game Bar built into Windows can show graphs and FPS https://www.howtogeek.com/706162/how-to-see-fps-in-any-windows-10-game-without-extra-software/. I'm getting between 30-40 FPS on a Ryzen 5 5600 with RX3060... \r\n\r\nI have never used it, since we use custom tools, but this profiler looks very cool https://github.com/wolfpld/tracy, for really digging into performance problems. I found it when I was checking out this library https://github.com/ocornut/imgui, which is often used for making ingame graphs and counters like that, but probably not usable for you since it's for c++.\r\n\r\nAnd yes rolling buffers, are usually used for that. With them you can can also track other statistics like, min, max, median 90 percentile and so on. But at least for this project it's best to stay simple, since implementing those can take quite some time. And hopefully the Game Bar is enough.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Kethku",
"comment_id": 1007741240,
"datetime": 1641589647000,
"masked_author": "username_2",
"text": "Hadn't thought to use the game bar. Thats a fantastic idea. And 30-40 fps is absolutely not sufficient. Again, thanks so much for the thoughts. An extra set of experienced eyes on the system makes a world of difference",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Kethku",
"comment_id": 1030974355,
"datetime": 1644196439000,
"masked_author": "username_2",
"text": "https://github.com/neovide/neovide/pull/1201 addresses this issue by restructuring our event loop to poll by default and fall back to waiting when not animating. On my machine, I regularly hit close to my monitors refresh rate (modulo some occasional frame drops).\r\n\r\nThis doesn't implement rendering on a background thread though as that would involve a much larger change which I don't fully understand yet. The system also does not implement keeping track of a rolling frame offset, but instead just changes the current system to render at 1.1 the requested framerate 😅 This isn't correct... but is good enough for now and we can improve it later.\r\n\r\n@username_3 if you have the time, could you take a look at the changes? Its not what you suggested, but related and I figure you may have ideas to improve it further.",
"title": null,
"type": "comment"
}
] | 4 | 8 | 6,356 | false | false | 6,356 | true |
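The drift-free wake-up formula quoted in the comments above (`expected_frame_length_seconds * ceil(current_time / expected_frame_length_seconds)`) and the fixed-timestep accumulator from Fiedler's article can be sketched as follows. This is illustrative Python, not Neovide's actual Rust event loop; `frame_len` stands for the expected frame length in seconds:

```python
import math


def next_frame_deadline(current_time: float, frame_len: float) -> float:
    # Wake at the next instant exactly divisible by the frame length,
    # instead of letting `previous_frame_start + frame_duration` drift.
    return frame_len * math.ceil(current_time / frame_len)


def step_animation(accumulator: float, dt: float, frame_len: float):
    # Fixed-timestep accumulator: consume the real elapsed time `dt`
    # in whole simulation steps of `frame_len` seconds each, carrying
    # the remainder forward to the next frame.
    accumulator += dt
    steps = 0
    while accumulator >= frame_len:
        accumulator -= frame_len
        steps += 1
    return steps, accumulator
```

Reading the clock immediately before waiting, as the comment recommends, keeps the wake-ups at an even cadence; rendering at a slightly higher rate than the update rate then hides the occasional missed vblank.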
antfu/icones | null | 690,984,982 | 12 | null | [
{
"action": "opened",
"author": "egoist",
"comment_id": null,
"datetime": 1599050203000,
"masked_author": "username_0",
"text": "It should convert kebab-case like `stroke-width` to camelCase 🤔",
"title": "Output valid React JSX",
"type": "issue"
},
{
"action": "created",
"author": "antfu",
"comment_id": 685723485,
"datetime": 1599052031000,
"masked_author": "username_1",
"text": "Which icon set you are using when facing this issue? I didn't found an icon with this attr.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "egoist",
"comment_id": 686507199,
"datetime": 1599141137000,
"masked_author": "username_0",
"text": "for example this is the JSX output of `heroicons-outline:search` from heroicons:\r\n\r\n```jsx\r\n<svg focusable=\"false\" width=\"1em\" height=\"1em\" viewBox=\"0 0 24 24\">\r\n<g fill=\"none\">\r\n<path \r\nd=\"M21 21l-6-6m2-5a7 7 0 1 1-14 0a7 7 0 0 1 14 0z\" \r\nstroke=\"currentColor\" \r\nstroke-width=\"2\" \r\nstroke-linecap=\"round\" \r\nstroke-linejoin=\"round\"></path>\r\n</g>\r\n</svg>\r\n```",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "antfu",
"comment_id": null,
"datetime": 1599760471000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "antfu",
"comment_id": 690581563,
"datetime": 1599760494000,
"masked_author": "username_1",
"text": "Fixed. Thanks",
"title": null,
"type": "comment"
}
] | 2 | 5 | 525 | false | false | 525 | false |
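The fix discussed above — converting kebab-case SVG attributes such as `stroke-width` into the camelCase names valid JSX expects — can be sketched like this (an illustrative Python sketch, not the actual icones implementation):

```python
import re


def svg_attr_to_jsx(name: str) -> str:
    # JSX rejects hyphenated attribute names; convert kebab-case SVG
    # attributes (stroke-width, stroke-linecap) to camelCase.
    return re.sub(r"-([a-z])", lambda m: m.group(1).upper(), name)


def jsxify(svg: str) -> str:
    # Rewrite every kebab-case name that appears directly before an
    # `="` attribute assignment; already-camelCase names like viewBox
    # are left untouched.
    return re.sub(
        r'([a-z]+(?:-[a-z]+)+)(?==")',
        lambda m: svg_attr_to_jsx(m.group(1)),
        svg,
    )
```

Applied to the snippet in the issue, `stroke-width="2"` becomes `strokeWidth="2"` and `stroke-linejoin="round"` becomes `strokeLinejoin="round"`.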
microsoft/vscode-mssql | microsoft | 765,952,091 | 9,154 | null | [
{
"action": "opened",
"author": "13140780341",
"comment_id": null,
"datetime": 1607925658000,
"masked_author": "username_0",
"text": "【本溪明山区找妹子美女包夜服务】▋╋薇:781乄372乄524▋【本溪明山区妹子多少钱一晚】【本溪明山区上门服务▋╋薇:781乄372乄524▋本溪明山区▋╋薇:781乄372乄524▋本溪明山区找小姐小妹兼职按摩莞式一条龙服务▋╋薇:781乄372乄524▋《本溪明山区找小姐远离服务》《红灯区》▋╋薇:781乄372乄524▋《本溪明山区找小姐》《保健按摩》《一夜情》▋╋薇:781乄372乄524▋《本溪明山区找小姐保健按摩》《一条龙服务》▋╋薇:781乄372乄524▋月日到底是什么神仙日子啊五月天开演唱会竟然是半空高中时期播放列表的节奏《知足》、《倔强》、《突然好想你》等金曲现场大放送周杰伦、王力宏的出现让演唱会甚至一度变成了天王的集会。这都不是重点重点是五月天的主唱阿信竟然跳舞跳上了微博热搜还酸到了一大波网友其实是蔡依林作为嘉宾惊艳亮相五月天演唱会带来的神仙操作。康康阿信和这张对视的美好偶像剧画面瞬间觉得“哇这对太甜了吧。”虽然两人只是多年的好友也完全不影响“五月花”合体之后的高糖瞬间嗑到上头了有木有~通过小编在微博看演唱会得出的信息两人在现场合唱了《玫瑰少年》《今天你要嫁给我》两首金曲还上演了一出精彩的跳舞表演“旋转、跳跃我闭着眼”的直接引发全场欢呼这下腰的功力不愧是有“舞娘”身份的天后蔡依林啊不过蔡依林虽然惊喜现身五月天演唱会也带来了两首金曲一段舞蹈但对于粉丝来说“这才哪跟哪根本不够好吗”别着急要不为啥开篇就说了一句“月日”是神仙日子呢除了蔡依林和阿信共舞甜哭众人之外娱乐圈还有一件大事——苏宁易购再次官宣“双十一嗨爆夜”的全明星阵容蔡依林凭借姓氏字母的优势位居榜首我等没看够的观众又有福利了——苏宁易购官微深夜在微博偷偷爆料全场的华语天后蔡依林将在“狮晚”展现出与众不同的金曲特别舞台。别说了光是听蔡依林的名字其实我们就能自动脑补“大艺术家”、“舞娘”、“怪美的”等炸裂舞台但还是忍不住搓着小手手期待“苏宁易购嗨爆夜”当晚的与众不同。其实不止从苏宁易购官宣的全明星阵容来看根本不能用“强大”这样的词汇来形容只有“啊啊啊啊”的尖叫声才能代表我们内心的激动与兴奋。比如同样承包了超多金曲的另一大天后容祖儿也将通过精心改编带来别开生面的“金曲新唱”周笔畅、张靓颖、尚雯婕等超女重聚首携手用歌曲带来回忆杀说完女爱豆再来说说参加“狮晚”的男明星们凭借“不要你觉得我要我觉得”等明言明语大火的黄晓明似乎又要在嗨爆夜语出惊人在《做家务的男人》中相遇的话痨汪苏泷、魏大勋基本可以预料他们在嗨爆夜的超强综艺感刚刚开完《陈情令》音乐演唱会的王一博、肖战似乎又要合体了想提前预定“博君一笑”甜甜的糖尉逃侨九勤渡戏沿肪睦屹竞律腿缚https://github.com/microsoft/vscode-mssql/issues/8258 <br />https://github.com/microsoft/vscode-mssql/issues/6878 <br />https://github.com/microsoft/vscode-mssql/issues/2280?09307 <br />https://github.com/microsoft/vscode-mssql/issues/4348 <br />https://github.com/microsoft/vscode-mssql/issues/2968?o028A <br />",
"title": "本溪明山区哪有特殊服务的洗浴_腾讯新闻今日要闻",
"type": "issue"
}
] | 1 | 1 | 1,432 | false | false | 1,432 | false |
ebi-uniprot/franklin-sites | ebi-uniprot | 801,478,156 | 162 | {
"number": 162,
"repo": "franklin-sites",
"user_login": "ebi-uniprot"
} | [
{
"action": "opened",
"author": "aurel-l",
"comment_id": null,
"datetime": 1612458729000,
"masked_author": "username_0",
"text": "## Purpose\r\nAdd a third state to a column header where it can be visually marked as sortable even though it is not sorted.\r\n\r\n## Approach\r\nVisual change, opacity 0.7 on a sortable column\r\n\r\n## Testing\r\nUpdated snapshots, checked story\r\n\r\n## Stories\r\ndata table story\r\n\r\n## Checklist\r\n- [x] My PR is scoped properly, and “does one thing only”\r\n- [x] I have reviewed my own code\r\n- [x] I have checked that linting checks pass and type safety is respected\r\n- [x] I have checked that tests pass and coverage has at least improved, and if not explained the reasons why\r\n- [x] Were all the edited/created files in TypeScript? If not, please list the files you left in JavaScript and the reason to not update them.\r\n- [ ] For the stories you created/updated, are the props modifiables through knobs?",
"title": "add a visual cue that a column is sortable even though it is not sorted",
"type": "issue"
},
{
"action": "created",
"author": "aurel-l",
"comment_id": 773906648,
"datetime": 1612516901000,
"masked_author": "username_0",
"text": "Be the trigger of what? And do you mean the whoooole table header (as in, the whole set of column headers?)?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "xwatkins",
"comment_id": 773909106,
"datetime": 1612517168000,
"masked_author": "username_1",
"text": "Actually it is already, it's just that I think `cursor: pointer;` should be on `&--sortable` not just the `:after`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "aurel-l",
"comment_id": 773911782,
"datetime": 1612517441000,
"masked_author": "username_0",
"text": "Ah, I see. I wasn't sure what you were refering to as the behaviour was there already. I have now modified the cursor style to cover the whole column header.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 1,171 | false | false | 1,171 | false |
amzn/selling-partner-api-docs | amzn | 778,975,333 | 266 | null | [
{
"action": "opened",
"author": "RJEspera",
"comment_id": null,
"datetime": 1609850596000,
"masked_author": "username_0",
"text": "I get \"Access to requested resource is denied\" while testing sandbox endpoint. (Unauthorized) error. I followed the instructions from https://github.com/amzn/selling-partner-api-docs/blob/main/guides/developer-guide/SellingPartnerApiDeveloperGuide.md#the-selling-partner-api-sandbox. I used Self Authorization and was able to get a refresh token, then an access token using the refresh token. I was also able to create and sign the request successfully as well.",
"title": "Sandbox - getting 'Unauthorized' error with Self-Authorized authentication flow.",
"type": "issue"
},
{
"action": "created",
"author": "RJEspera",
"comment_id": 754771155,
"datetime": 1609866729000,
"masked_author": "username_0",
"text": "I found the solution to it here: https://github.com/amzn/selling-partner-api-docs/issues/24#issuecomment-738901323.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "RJEspera",
"comment_id": null,
"datetime": 1609866729000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "weiliguo15634145",
"comment_id": 787855189,
"datetime": 1614596051000,
"masked_author": "username_1",
"text": "Have you found the answer , I had the same problem , And I need your help.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 651 | false | false | 651 | false |
adobe/aio-cli-plugin-jwt-auth | adobe | 467,111,556 | 37 | {
"number": 37,
"repo": "aio-cli-plugin-jwt-auth",
"user_login": "adobe"
} | [
{
"action": "created",
"author": "shazron",
"comment_id": 511634439,
"datetime": 1563242489000,
"masked_author": "username_0",
"text": "Because eslint-config-standard has changed (breaking change) a lot of our code do not pass the linter. Punt this for a release post-v2.0.\r\n\r\nChanges: https://standardjs.com/changelog.html#1300---2019-07-10",
"title": null,
"type": "comment"
}
] | 3 | 6 | 5,386 | false | true | 205 | false |
MicrosoftDocs/azure-docs | MicrosoftDocs | 864,215,666 | 74,152 | null | [
{
"action": "opened",
"author": "VipinAgarwal",
"comment_id": null,
"datetime": 1619034375000,
"masked_author": "username_0",
"text": "[Enter feedback here]\r\nCurrently we are helping customer to setup their azure resources. They already has another Customer Partner filled in for another bunch of services. So wanted to see how can we take the credit from Microsoft for the work we are helping on our end. \r\n\r\n---\r\n#### Document Details\r\n\r\n⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*\r\n\r\n* ID: 5f901056-c225-b086-cdaa-c8bd978db64c\r\n* Version Independent ID: 9c067eb4-54cb-e141-d82f-e136ff3c671e\r\n* Content: [Link an Azure account to a partner ID](https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/link-partner-id)\r\n* Content Source: [articles/cost-management-billing/manage/link-partner-id.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/cost-management-billing/manage/link-partner-id.md)\r\n* Service: **cost-management-billing**\r\n* Sub-service: **billing**\r\n* GitHub Login: @username_3\r\n* Microsoft Alias: **banders**",
"title": "How can we add more than one partner?",
"type": "issue"
},
{
"action": "created",
"author": "BharathNimmala-MSFT",
"comment_id": 824488586,
"datetime": 1619057744000,
"masked_author": "username_1",
"text": "@username_0 Thank you for your query , our team will review it and get back to you at the earliest.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "KrishnaG-MSFT",
"comment_id": 824494176,
"datetime": 1619058786000,
"masked_author": "username_2",
"text": "Tagging @username_3 @bandersmsft for visibility, review and to help provide insights.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dhirajgandhi",
"comment_id": 824835627,
"datetime": 1619097624000,
"masked_author": "username_3",
"text": "It is possible to have multiple partners will Partner Admin Link. The user who has RBAC access to customer's Azure resources need to tag the MPN ID.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "KrishnaG-MSFT",
"comment_id": 825491381,
"datetime": 1619166388000,
"masked_author": "username_2",
"text": "@username_3 Thanks for the super quick response.\r\n@username_0 Let us know if you have any further questions. If not, we would like to go ahead and close this issue.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "KrishnaG-MSFT",
"comment_id": null,
"datetime": 1619422615000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 4 | 6 | 1,476 | false | false | 1,476 | true |
BenFradet/spark-kafka-writer | null | 896,081,535 | 177 | {
"number": 177,
"repo": "spark-kafka-writer",
"user_login": "BenFradet"
} | [
{
"action": "opened",
"author": "scala-steward",
"comment_id": null,
"datetime": 1621469527000,
"masked_author": "username_0",
"text": "Updates [org.scoverage:sbt-scoverage](https://github.com/scoverage/sbt-scoverage) from 1.6.1 to 1.8.1.\n[GitHub Release Notes](https://github.com/scoverage/sbt-scoverage/releases/tag/v1.8.1) - [Version Diff](https://github.com/scoverage/sbt-scoverage/compare/v1.6.1...v1.8.1)\n\nI'll automatically update this PR to resolve conflicts as long as you don't change it yourself.\n\nIf you'd like to skip this version, you can just close this PR. If you have any feedback, just mention me in the comments below.\n\nConfigure Scala Steward for your repository with a [`.username_0.conf`](https://github.com/username_0-org/username_0/blob/08e2048c5bb4e1b19888e462a51ef7555a43c621/docs/repo-specific-configuration.md) file.\n\nHave a fantastic day writing Scala!\n\n<details>\n<summary>Ignore future updates</summary>\n\nAdd this to your `.username_0.conf` file to ignore future updates of this dependency:\n```\nupdates.ignore = [ { groupId = \"org.scoverage\", artifactId = \"sbt-scoverage\" } ]\n```\n</details>\n\nlabels: sbt-plugin-update, semver-minor",
"title": "Update sbt-scoverage to 1.8.1",
"type": "issue"
}
] | 2 | 2 | 1,037 | false | true | 1,037 | true |
ZupIT/ritchie-cli | ZupIT | 828,024,907 | 884 | {
"number": 884,
"repo": "ritchie-cli",
"user_login": "ZupIT"
} | [
{
"action": "opened",
"author": "andressaabreuzup",
"comment_id": null,
"datetime": 1615394695000,
"masked_author": "username_0",
"text": "### Description\r\nFormulas with nested conditional inputs were returning an unnecessary error.\r\nWhen there were two flows of nested conditionals, for example, consequently one of them was not satisfied. But an error was returned when the variables of the alternative flow conditionals were tested. The error said that such variables did not exist, however, they existed. They just weren't being filled by belonging to that other flow that didn't happen.\r\n\r\n### How to verify it\r\nTest nested conditional inputs with a alternative flow. (Not a error)\r\nTest conditional inputs that not exists. (A error)\r\n\r\n### Changelog\r\nChanging conditional to only return an error when the conditional input variable does not exist in the config.json variable list",
"title": "Changing conditional to only return an error when the conditional input variable does not exist in the config.json variable list",
"type": "issue"
},
{
"action": "created",
"author": "andressaabreuzup",
"comment_id": 796950996,
"datetime": 1615487489000,
"masked_author": "username_0",
"text": "It was a spacing problem, I fixed it, now it's 100% :)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "henriquemoraeszup",
"comment_id": 797071017,
"datetime": 1615499171000,
"masked_author": "username_1",
"text": "/merge qa",
"title": null,
"type": "comment"
}
] | 3 | 4 | 879 | false | true | 811 | false |
KhronosGroup/Vulkan-Samples-Assets | KhronosGroup | 707,632,880 | 10 | {
"number": 10,
"repo": "Vulkan-Samples-Assets",
"user_login": "KhronosGroup"
} | [
{
"action": "opened",
"author": "JoseEmilio-ARM",
"comment_id": null,
"datetime": 1600890268000,
"masked_author": "username_0",
"text": "Generated ASTC compressed normal maps using astcenc 2.0 with the following parameters\r\n\r\n`-cl 10x10 -thorough -normal`\r\n\r\nThe -normal option improves quality but stores data as a two channel X+Y(RGB=X, A=Y). The Z component must be recovered programatically in shader code:\r\n\r\n```\r\nvec3 nml;\r\nnml.xy = texture(...).ga; // Load normals (range 0 to 1)\r\nnml.xy = nml.xy * 2.0f - 1.0f; // Unpack normals (range -1 to +1)\r\nnml.z = sqrt(1 - dot(nml.xy, nml.xy)); // Compute Z, given unit length\r\n```\r\n\r\nGenerated ASTC compressed roughness-metallic maps using astcenc 2.0 with the following parameters\r\n\r\n`-cl 10x10 -thorough`\r\n\r\nThe new settings improve quality and use a more appropriate color space for these assets.",
"title": "Bonza KTX textures update",
"type": "issue"
},
{
"action": "created",
"author": "wasimabbas-arm",
"comment_id": 698276102,
"datetime": 1600945666000,
"masked_author": "username_1",
"text": "These are ASTC textures without mip-maps we should add mip-maps too.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 807 | false | false | 807 | false |
alibaba-fusion/next | alibaba-fusion | 927,016,108 | 3,130 | null | [
{
"action": "opened",
"author": "zmy1235",
"comment_id": null,
"datetime": 1624353317000,
"masked_author": "username_0",
"text": "### Reproduction link \r\n[https://fusion.design/sites/new?spm=fusion-design.design-design-fusion.0.0.611d1b1dzCSlvW](https://fusion.design/sites/new?spm=fusion-design.design-design-fusion.0.0.611d1b1dzCSlvW)\r\n\r\n### Steps to reproduce\r\nSite creation fails, and it would be nice to at least get a reason. It just says it failed without any hint at all. What exactly caused the failure? So much for user experience.\r\n\r\n<!-- generated by alibaba-fusion-issue-helper. DO NOT REMOVE -->\r\n<!-- platform: main -->",
"title": "Hello, why can't I create a new site",
"type": "issue"
},
{
"action": "closed",
"author": "zmy1235",
"comment_id": null,
"datetime": 1624406668000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "youluna",
"comment_id": 866845229,
"datetime": 1624455587000,
"masked_author": "username_1",
"text": "Sorry about that, this has been fixed as of tonight, please try again",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zmy1235",
"comment_id": 866887339,
"datetime": 1624458494000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "comment"
}
] | 2 | 4 | 387 | false | false | 387 | false |
internap/fake-switches | internap | 899,845,589 | 164 | {
"number": 164,
"repo": "fake-switches",
"user_login": "internap"
} | [
{
"action": "opened",
"author": "ansiblejunky",
"comment_id": null,
"datetime": 1621876059000,
"masked_author": "username_0",
"text": "Simple fix for the time being, until the repo gets revamped for Python3. \r\nOnly added the requirement for `crytopgraphy` to version 3.0 since this prevents the existing error \"invalid private key\" that occurs when running the docker image. This should also fix any other issues with running fake-switches with python2 for now.",
"title": "Fix invalid private key error",
"type": "issue"
},
{
"action": "created",
"author": "fbouliane",
"comment_id": 848289802,
"datetime": 1621979487000,
"masked_author": "username_1",
"text": "Hello, welcome to the project & thanks for the patch. \r\nI see we have broken the 1.3.10 version on dockerhub and this should help.\r\nthe 1.3.8 tag should help in the meanwhile if you don't need the new features.\r\nhttps://hub.docker.com/r/internap/fake-switches/tags?page=1&ordering=last_updated\r\n\r\nGood practice for me would be to use a pip constraints, and release fake-switches 1.3.11 that \r\n - Excludes cryptography > 3\r\n - Use contraints file to build the docker image.\r\n\r\nThis was really useful to dig why our internal ci was broken this morning ! Thanks for your support.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ansiblejunky",
"comment_id": 848302756,
"datetime": 1621980560000,
"masked_author": "username_0",
"text": "Hi @username_1 and thanks for the welcome msg. Your app helped me a lot recently with a tech requirement I had. I was hoping to get this repo moving forward with some enhancements to support other devices, as I have struggled with spinning up or getting various network devices to support my customers. I work with Ansible and want to use your project as a way to emulate various devices. \r\n\r\nLet me know if you can merge this PR or if you need something else.\r\n\r\nCheers,\r\nJohn",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,378 | false | false | 1,378 | true |
dolanor/caldav-go | null | 812,311,933 | 3 | null | [
{
"action": "opened",
"author": "raoel",
"comment_id": null,
"datetime": 1613765909000,
"masked_author": "username_0",
"text": "It looks like it cannot hydrate the following property, when I run it in a debugger that is the property being handled.\r\n\r\n`CATEGORIES:http://schemas.google.com/g/2005#event`\r\n\r\nI just started with Golang, I already tried to fix it but I can't :-/ I guess it's the hash \"#\" ? \r\n\r\nstack:\r\n\r\n```\r\npanic: reflect: call of reflect.Append on ptr Value\r\n\r\ngoroutine 1 [running]:\r\nreflect.flag.mustBe(0x196, 0x17)\r\n\t/usr/lib/golang/src/reflect/value.go:208 +0xde\r\nreflect.Append(0x8121e0, 0xc0001303c0, 0x196, 0xc00019ea68, 0x1, 0x1, 0x0, 0x0, 0x0)\r\n\t/usr/lib/golang/src/reflect/value.go:2032 +0x60\r\ngithub.com/username_1/caldav-go/icalendar.hydrateProperty(0x8121e0, 0xc0001303c0, 0x196, 0xc0000cc280, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/icalendar/unmarshal.go:200 +0x5df\r\ngithub.com/username_1/caldav-go/icalendar.hydrateProperties(0x82ba80, 0xc0001302c0, 0x16, 0xc0004ad5e0, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/icalendar/unmarshal.go:270 +0x4e0\r\ngithub.com/username_1/caldav-go/icalendar.hydrateComponent(0x82ba80, 0xc0001302c0, 0x16, 0xc0004ad5e0, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/icalendar/unmarshal.go:302 +0x412\r\ngithub.com/username_1/caldav-go/icalendar.hydrateNestedComponent(0x7f2580, 0xc0006ce4e8, 0x197, 0xc0004ad5e0, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/icalendar/unmarshal.go:224 +0x120\r\ngithub.com/username_1/caldav-go/icalendar.hydrateProperties(0x821e20, 0xc0006ce480, 0x16, 0xc0004ad440, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/icalendar/unmarshal.go:284 +0xbd2\r\ngithub.com/username_1/caldav-go/icalendar.hydrateComponent(0x821e20, 0xc0006ce480, 0x16, 0xc0004ad440, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/icalendar/unmarshal.go:302 
+0x412\r\ngithub.com/username_1/caldav-go/icalendar.hydrateValue(0x821e20, 0xc0006ce480, 0x16, 0xc0004ad3c0, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/icalendar/unmarshal.go:353 +0xf1b\r\ngithub.com/username_1/caldav-go/icalendar.Unmarshal(0xc0003a4480, 0x231, 0x821e20, 0xc0006ce480, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/icalendar/unmarshal.go:363 +0x1e1\r\ngithub.com/username_1/caldav-go/caldav.(*Response).Decode(0xc000162120, 0x821e20, 0xc0006ce480, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/caldav/response.go:29 +0x2f3\r\ngithub.com/username_1/caldav-go/caldav.(*Client).GetEvents(0xc00000e380, 0xc00036c0c0, 0x51, 0x0, 0x0, 0x0, 0x0, 0x0)\r\n\t/home/roomen/go/pkg/mod/github.com/username_1/caldav-go@v0.2.0/caldav/client.go:130 +0x732\r\n```\r\n\r\n\r\nfor completeness the whole Event that I'm fetching:\r\n\r\n```\r\nBEGIN:VCALENDAR\r\nCALSCALE:GREGORIAN\r\nVERSION:2.0\r\nPRODID:-//SabreDAV//SabreDAV//EN\r\nBEGIN:VEVENT\r\nDTSTART;VALUE=DATE:20120114\r\nDTEND;VALUE=DATE:20120115\r\nDTSTAMP:20150209T114140Z\r\nUID:REDACTED_IDENTIFIER@google.com\r\nATTENDEE;CUTYPE=INDIVIDUAL;ROLE=REQ-PARTICIPANT;PARTSTAT=ACCEPTED;X-NUM-GUE\r\n STS=0:mailto:REDACTED@REDACTED.COM\r\nCREATED:20111214T180041Z\r\nDESCRIPTION:\r\nLAST-MODIFIED:20111214T180041Z\r\nLOCATION:Thuis\r\nSEQUENCE:1\r\nSTATUS:TENTATIVE\r\nSUMMARY:REDACTED BECAUSE PRIVACY\r\nTRANSP:OPAQUE\r\nCATEGORIES:http://schemas.google.com/g/2005#event\r\nEND:VEVENT\r\nEND:VCALENDAR\r\n```",
"title": "\"panic: reflect: call of reflect.Append on ptr Value\" during hydrateProperty",
"type": "issue"
},
{
"action": "created",
"author": "raoel",
"comment_id": 782650849,
"datetime": 1613827261000,
"masked_author": "username_0",
"text": "I've tried to get a limited set of properties returned (as described here: https://tools.ietf.org/html/rfc4791#section-7.8.1 ) but I have not been able to get it to work (ofcourse it could be that my nextcloud-server is not honoring such a query because it is not implemented?)\r\n\r\nCould you give an example? This didn't work (maybe my knowledge of Go is just lacking)\r\n\r\n\r\n```\r\n\tquery, _ := cent.NewEventRangeQuery(lastmonth, start)\r\n\r\n\t// calendardata -> component -> property.name = \"VEVENT\"\r\n\tpropertyEvent := new(cent.PropertyName)\r\n\tpropertyEvent.Name = \"VEVENT\"\r\n\tquery.Prop.CalendarData.Component = new(cent.Component)\r\n\teventProperties := []*(cent.PropertyName){propertyEvent}\r\n\tquery.Prop.CalendarData.Component.Properties = eventProperties\r\n\r\n\t// calendardata -> component -> property -> component -> property = \"DTSTART\"\r\n\tmyprop := new(cent.PropertyName)\r\n\tmyprop.Name = \"DTSTART\"\r\n\tmyproperties := []*(cent.PropertyName){myprop}\r\n\tmycomponent := new(cent.Component)\r\n\tmycomponent.Properties = myproperties\r\n\tmycomponents := []*(cent.Component){mycomponent}\r\n\tquery.Prop.CalendarData.Component.Components = mycomponents\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dolanor",
"comment_id": 851271399,
"datetime": 1622446515000,
"masked_author": "username_1",
"text": "Thanks for you patience, I totally forgot your issue.\r\nI'll try to reproduce and get back to you this week.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dolanor",
"comment_id": 867947019,
"datetime": 1624568281000,
"masked_author": "username_1",
"text": "Sorry it took so long, been quite busy. I found out that you're right, the problem is with the `CATEGORIES`, but not because of the `#`. It's the whole underlying type which doesn't hydrate correctly. the `CONTACT` is of the same type (`type.CSV`) and fails the same.\r\nI'm gonna try and fix it and update this issue when it's done.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 4,871 | false | false | 4,871 | true |
Kylmakalle/Telescopy | null | 822,465,794 | 4 | null | [
{
"action": "opened",
"author": "saminn122",
"comment_id": null,
"datetime": 1614888395000,
"masked_author": "username_0",
"text": "Hi can you fix telescopy plzzzz its broken",
"title": "Bug",
"type": "issue"
},
{
"action": "closed",
"author": "Kylmakalle",
"comment_id": null,
"datetime": 1615121138000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 42 | false | false | 42 | false |
alixaxel/chrome-aws-lambda | null | 739,147,682 | 173 | null | [
{
"action": "opened",
"author": "nguyensonghao94",
"comment_id": null,
"datetime": 1604937292000,
"masked_author": "username_0",
"text": "chrome-aws-lambda is working very good but when we run on some website we has an error \r\n\r\n\" Unhandled Promise Rejection {\"errorType\":\"Runtime.UnhandledPromiseRejection\",\"errorMessage\":\"TypeError: Cannot read property 'errorText' of null\",\"reason\":{\"errorType\":\"TypeError\",\"errorMessage\":\"Cannot read property 'errorText' of null\",\"stack\":[\"TypeError: Cannot read property 'errorText' of null\",\" at Page.<anonymous> (/var/task/index.js:69:40)\",\" at Page.emit (events.js:315:20)\",\" at NetworkManager.<anonymous> (/var/task/node_modules/puppeteer-core/lib/Page.js:120:89)\",\" at NetworkManager.emit (events.js:315:20)\",\" at NetworkManager._onLoadingFailed (/var/task/node_modules/puppeteer-core/lib/NetworkManager.js:249:14)\",\" at CDPSession.emit (events.js:315:20)\",\" at CDPSession._onMessage (/var/task/node_modules/puppeteer-core/lib/Connection.js:166:18)\",\" at Connection._onMessage (/var/task/node_modules/puppeteer-core/lib/Connection.js:83:25)\",\" at WebSocket.<anonymous> (/var/task/node_modules/puppeteer-core/lib/WebSocketTransport.js:25:32)\",\" at WebSocket.onMessage (/var/task/node_modules/ws/lib/event-target.js:132:16)\"]},\"promise\":{},\"stack\":[\"Runtime.UnhandledPromiseRejection: TypeError: Cannot read property 'errorText' of null\",\" at process.<anonymous> (/var/runtime/index.js:35:15)\",\" at process.emit (events.js:315:20)\",\" at processPromiseRejections (internal/process/promises.js:209:33)\",\" at processTicksAndRejections (internal/process/task_queues.js:98:32)\"]}\"\r\n\r\nAnd when we use puppeteer it still working",
"title": "[BUG] Unhandled Promise Rejection when run on Docker",
"type": "issue"
},
{
"action": "created",
"author": "alixaxel",
"comment_id": 731646353,
"datetime": 1605997797000,
"masked_author": "username_1",
"text": "@username_0 Can you clarify if you're getting this error on a specific URL?\r\n\r\nFrom your report it's not clear whether this is happening only on Docker or only on a specific URL.\r\n\r\nPlease share more details, I'll be glad to help.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 1,807 | false | false | 1,807 | true |
pytorch/pytorch | pytorch | 616,559,806 | 38,314 | null | [
{
"action": "opened",
"author": "mfuntowicz",
"comment_id": null,
"datetime": 1589280800000,
"masked_author": "username_0",
"text": "Warning: ATen was a removed experimental ops. In the future, we may directly reject this operator. Please update your model as soon as possible.\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 1, in <module>\r\n File \"D:\\Workspace\\Miniconda3\\envs\\huggingface\\lib\\site-packages\\onnxruntime\\capi\\session.py\", line 25, in __init__\r\n self._load_model(providers)\r\n File \"D:\\Workspace\\Miniconda3\\envs\\huggingface\\lib\\site-packages\\onnxruntime\\capi\\session.py\", line 43, in _load_model\r\n self._sess.load_model(providers)\r\nonnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Fatal error: ATen is not a registered function/op\r\n```\r\n\r\n\r\n\r\nPlatform: Windows 10 (_but tested also on Ubuntu 18.10_)",
"title": "[ONNX] CumSum exported as Aten.CumSum instead of onnx.CumSum",
"type": "issue"
},
{
"action": "created",
"author": "spandantiwari",
"comment_id": 627480200,
"datetime": 1589304039000,
"masked_author": "username_1",
"text": "@username_0 - can you share the repro. Specifically, which exact model are you exporting and your export code.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mfuntowicz",
"comment_id": 627495713,
"datetime": 1589305801000,
"masked_author": "username_0",
"text": "Sure, I'm observing this behavior on a PR I'm currently working on https://github.com/huggingface/transformers/pull/4253\r\n\r\nThe file involved: \r\nhttps://github.com/huggingface/transformers/blob/onnx-export/src/transformers/convert_graph_to_onnx.py\r\n\r\nI tried with different opsets (11, 12) but results were pretty similar, at least for `roberta-base` which is essentially `bert-base` without the `token type ids`.\r\n\r\nIf you want to reproduce easily and you can git clone the specific **transformers** onnx-export branch, then: \r\n\r\n`python convert_graph_to_onnx.py --model roberta-base --framework pt onnx/roberta-base.onnx`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "neginraoof",
"comment_id": 627529223,
"datetime": 1589309743000,
"masked_author": "username_2",
"text": "Hi @username_0 \r\nCould you please verify the opset version for export one more time?\r\n\r\nI have pytorch 1.5 installed, and I'm running the code from onnx-export branch with the following command:\r\npython convert_graph_to_onnx.py --model roberta-base --framework pt onnx/roberta-base.onnx --check-loading\r\n\r\nThis command seems to succeed to export and run model with ORT, with following output:\r\n\r\nSetting ONNX opset version to: 12\r\nLoading pipeline (model: roberta-base, tokenizer: roberta-base)\r\nDownloading: 100%|███████ 230/230 [00:00<00:00, 281kB/s]\r\nPyTorch: 1.5.0\r\nChecking ONNX model loading from: pytorch-transformers/src/transformers/onnx/roberta-base.onnx\r\nModel correctly loaded\r\n\r\nI also do get the same results with opset 11.\r\n(Using opset 9 throws the same error you mentioned.)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mfuntowicz",
"comment_id": 627557644,
"datetime": 1589313151000,
"masked_author": "username_0",
"text": "Hum interesting ... \r\n\r\nHere is the output I have after reinstalling my dev environment: \r\n\r\n```\r\nroot@ee14c90078d2:/dev/transformers/src/transformers# python3 convert_graph_to_onnx.py --model roberta-base --framework pt onnx/roberta-base.onnx --check-loading\r\nSetting ONNX opset version to: 12\r\nLoading pipeline (model: roberta-base, tokenizer: roberta-base)\r\nDownloading: 100%|##########################################################################################| 230/230 [00:00<00:00, 298kB/s]\r\nCreating folder /dev/transformers/src/transformers/onnx\r\nPyTorch: 1.5.0+cu101\r\nChecking ONNX model loading from: /dev/transformers/src/transformers/onnx/roberta-base.onnx\r\nError while converting the model: [ONNXRuntimeError] : 10 : INVALID_GRAPH : This is an invalid model. Error in Node:Shape_0 : No Op registered for Shape with domain_version of 12\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mfuntowicz",
"comment_id": 627557888,
"datetime": 1589313179000,
"masked_author": "username_0",
"text": "I'm installing from pypi through: \r\n\r\n`pip install torch==1.5.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "neginraoof",
"comment_id": 627575473,
"datetime": 1589315283000,
"masked_author": "username_2",
"text": "Thanks, this error is from onnxruntime. \r\nIf you specify --opset 11, you should be able to run the model successfully with onnxruntime 1.2.\r\n\r\nFor opset 12, onnxruntime has recently added support for these ops in master. So, it could be that the onnxruntime version you have is not supporting all opset 12 ops.\r\nI'm running this model with latest ort-nigthly build (from pip install -i https://test.pypi.org/simple/ ort-nightly) with no issues.",
"title": null,
"type": "comment"
}
] | 3 | 7 | 3,792 | false | false | 3,792 | true |
dockstore/dockstore | dockstore | 928,414,337 | 4,325 | null | [
{
"action": "opened",
"author": "garyluu",
"comment_id": null,
"datetime": 1624463891000,
"masked_author": "username_0",
"text": "**Describe the bug**\r\nShould not be able to see the option to set default version in a public page such as https://dev.dockstore.net/containers/registry.hub.docker.com/lynnlangit/blastn/blastn:11?tab=versions even if it doesn't work\r\n\r\nBackend seems to be handling it ok though (returning error for random users)\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Go to something like https://dev.dockstore.net/containers/registry.hub.docker.com/lynnlangit/blastn/blastn:11?tab=versions\r\n2. Click on actions\r\n3. See option \"Set as Default Version\"\r\n\r\n**Expected behavior**\r\nThat option should not be visible at all on a public page (not admins, not curator, not owner of tool)\r\n\r\n**Additional context**\r\n[Webservice](https://github.com/dockstore/dockstore/commits/eea257d) - eea257d\r\n\r\n[UI](https://github.com/dockstore/dockstore-ui2/commits/3b7a7164) - 2.8.0-rc.0-20-g3b7a7164\r\n\r\n[Compose Setup](https://github.com/dockstore/compose_setup/commits/7ad0764) - 7ad0764\r\n\r\n[Deploy](https://github.com/dockstore/dockstore-deploy/commits/94dee6d) - 1.11.0-18-g94dee6d",
"title": "Setting default version in public page",
"type": "issue"
}
] | 2 | 2 | 1,177 | false | true | 1,069 | false |
microsoft/Cream | microsoft | 784,779,189 | 14 | null | [
{
"action": "opened",
"author": "lvxjl",
"comment_id": null,
"datetime": 1610512442000,
"masked_author": "username_0",
"text": "which simulates the updated student weights via **gradient ascent** in order to take part in the calculation of teacher weights.\r\n\r\nIt turns out that updating student weights in \"simulate_sgd_update\" should be in accordance with updating student weights via SGD as in line 23. But in practice gradient **ascent** is used in one case while gradient **descent** is used in the other.\r\n\r\nP.S. It also a little bit confusing to simulate the updated student weights right before you actually update them. Why not just take the updated weights from the student model? That's the **real** updated weights afterall.",
"title": "?Error in simulate_sgd_update for MetaMatchingNetwork",
"type": "issue"
},
{
"action": "created",
"author": "Z7zuqer",
"comment_id": 759218862,
"datetime": 1610516469000,
"masked_author": "username_1",
"text": "Hi,\r\n\r\nThanks for your interest in our project!\r\n\r\n* We define function *simulate_sgd_update* to simulate gradients update of student weights. [Real SGD updates](https://pytorch.org/docs/stable/_modules/torch/optim/sgd.html#SGD) utilize momentum and weight_decay to update student weights. In the updates of MetaMN, we only focus on the updates of the Meta Matchnig Networks, ignoring the updates of student weights. So we only need an approximation of updated student weights to see whether updated student weights are better than before to evaluate the performance of MetaMN. Both gradient ascent and descent are all allowed here.\r\n\r\n```\r\n# SGD update in Pytorch Official Implemention\r\nif weight_decay != 0:\r\n d_p = d_p.add(p, alpha=weight_decay)\r\nif momentum != 0:\r\n param_state = self.state[p]\r\n if 'momentum_buffer' not in param_state:\r\n buf = param_state['momentum_buffer'] = torch.clone(d_p).detach()\r\n else:\r\n buf = param_state['momentum_buffer']\r\n buf.mul_(momentum).add_(d_p, alpha=1 - dampening)\r\n if nesterov:\r\n d_p = d_p.add(buf, alpha=momentum)\r\n else:\r\n d_p = buf\r\n\r\n```\r\n* In the procedure of updating MetaMN, only weights of meta matching networks should be updated and student weights should be fixed. To make sure student weights are well stored in the process of updating MetaMN, we don't use optimizer.step() to directly update the student weights. This is to prevent `momentum_buffer` in SGD optimizer to be affected(If 'momentum_buffer' is affected, student weights may not correctly reach the global minimum). So here we define the function *simulate_sgd_update* to simulate the updates of student weights.\r\n\r\nBest,\r\nHao.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "lvxjl",
"comment_id": 760222414,
"datetime": 1610633653000,
"masked_author": "username_0",
"text": "It turns out to be quite counterintuitive to see that using gradient ascent is OK.\r\nAnd I am not sure if I fully understand how the performance of MetaMN depends on the weights of student network.\r\nIt would be great if you could provide more details, thank you.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "penghouwen",
"comment_id": null,
"datetime": 1625133522000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 4 | 2,566 | false | false | 2,566 | false |
onivim/oni2 | onivim | 606,639,708 | 1,667 | null | [
{
"action": "opened",
"author": "ddmendes",
"comment_id": null,
"datetime": 1587771867000,
"masked_author": "username_0",
"text": "I just gone through [How to build and run from source](https://onivim.github.io/docs/for-developers/building) today and when I executed the `./scripts/osx/create-symlink.sh` then command oni2 was not working.\r\n\r\nIn order to fix it I edited the _release/run.sh, on the last line. It was\r\n```sh\r\n\"$DIR\"/../MacOS/Oni2 --working-directory \"$CWD\" \"$@\"\r\n```\r\nand I changed it to\r\n```sh\r\n\"$DIR\"/Onivim2.app/Contents/MacOS/Oni2 --working-directory \"$CWD\" \"$@\"\r\n```\r\nand now the command oni2 works.\r\n\r\nI would open a PR but I'm without time to understand how all of this works.\r\nIf someone could mentor me on this fix I can do it myself.\r\n\r\nReally thank you for this project! I'm really loving it!",
"title": "_release/run.sh doesn't work for MacOS",
"type": "issue"
},
{
"action": "created",
"author": "CrossR",
"comment_id": 619364319,
"datetime": 1587813882000,
"masked_author": "username_1",
"text": "I think this is the instructions and the code becoming a bit out of sync (I didn't realise there was a \"Make symlink\" script when updating the code, as I just use `esy run` to debug locally).\r\n\r\nThe current `run.sh` script is made to be used with the release build of Oni2. That is, `esy create-release`, install the application as normal by dragging to your applications, start Oni2, and from inside Oni2 `Add to system PATH` (Cmd-Shift-P). That should then add the `oni2` executable to your path to use in the shell.\r\n\r\nAs such, that path is setup for the way the in application symlink works. I think changing it would break that workflow. We might just want to add a new duplicated script with your fix, or improve the logic of working out which environment we are in",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ddmendes",
"comment_id": 619439171,
"datetime": 1587847954000,
"masked_author": "username_0",
"text": "Hi @username_1 ! Great, so it's just doc out of sync.\n\nI'll try to install as you said and then I'll update the doc for you. Is it ok?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "CrossR",
"comment_id": 619442576,
"datetime": 1587849807000,
"masked_author": "username_1",
"text": "Sounds good!\r\n\r\nI'm not sure if anyone uses the symlink script in development, so if its not used we could just drop mentioning it at all.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ddmendes",
"comment_id": 619538795,
"datetime": 1587902651000,
"masked_author": "username_0",
"text": "PR #1669",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ddmendes",
"comment_id": null,
"datetime": 1588248960000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 1,735 | false | false | 1,735 | true |
Azure/azure-sdk-for-js | Azure | 794,519,652 | 13,404 | {
"number": 13404,
"repo": "azure-sdk-for-js",
"user_login": "Azure"
} | [
{
"action": "opened",
"author": "HarshaNalluru",
"comment_id": null,
"datetime": 1611691143000,
"masked_author": "username_0",
"text": "Just updating the version and fixing related compile errors",
"title": "[Service Bus] Stress tests - Migrate to 7.0.0",
"type": "issue"
}
] | 2 | 2 | 556 | false | true | 60 | false |
carbon-design-system/carbon | carbon-design-system | 789,807,209 | 7,605 | null | [
{
"action": "opened",
"author": "gjunming",
"comment_id": null,
"datetime": 1611135328000,
"masked_author": "username_0",
"text": "",
"title": "copy button tooltip overlay when multi copy button together",
"type": "issue"
},
{
"action": "created",
"author": "gjunming",
"comment_id": 763472356,
"datetime": 1611135474000,
"masked_author": "username_0",
"text": "this commit raised this problem https://github.com/carbon-design-system/carbon/commit/5028d80e07a20d617e3f53d462897c33ee694b29",
"title": null,
"type": "comment"
}
] | 2 | 3 | 234 | false | true | 234 | false |
firebase/firebase-ios-sdk | firebase | 707,600,767 | 6,541 | {
"number": 6541,
"repo": "firebase-ios-sdk",
"user_login": "firebase"
} | [
{
"action": "opened",
"author": "paulb777",
"comment_id": null,
"datetime": 1600886949000,
"masked_author": "username_0",
"text": "IID is also ported to SwiftPM but not exposed externally because of its upcoming deprecation.",
"title": "SwiftPM for FCM",
"type": "issue"
},
{
"action": "created",
"author": "davdroman",
"comment_id": 700644360,
"datetime": 1601379316000,
"masked_author": "username_1",
"text": "Just to confirm, this will only be available in Firebase 7? I guess there's some technical limitation and can't be on version 6?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ryanwilson",
"comment_id": 700707310,
"datetime": 1601386485000,
"masked_author": "username_2",
"text": "Confirmed that this will only be available in Firebase 7. The big technical issue here is Messaging currently depends on Protobuf which is not available via Swift Package Manager, but in Firebase 7 Messaging removed the dependency on Protobuf. There are potential workarounds in terms of hosting a binary version of Protobuf and depending on that, but there's technical complexity and many issues we've seen with binary frameworks so far that it's not worth the investment considering we're able to add it easily as a source based product in Firebase 7.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "davdroman",
"comment_id": 700739324,
"datetime": 1601389375000,
"masked_author": "username_1",
"text": "@username_2 thank you for responding so quickly. It's good to know and I understand. Good to hear a few dependencies are being dropped in the next major version. Given the long list of issues left for the 7 milestone, I presume it'll take a while for it to come out. In the meantime, would you deem it sensible to consume `firebase7-main` from SPM if we depend on frameworks like Messaging?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paulb777",
"comment_id": 700748056,
"datetime": 1601390089000,
"masked_author": "username_0",
"text": "@username_1 While we don't plan to break anything on `firebase7-main`, it is undergoing substantial churn with a bunch of technical debt being addressed and I wouldn't recommend usage in apps beyond development testing.\r\n\r\nIf despite those risks, you still want to use, we'll be happy to work with you on any issues you find.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "davdroman",
"comment_id": 700750816,
"datetime": 1601390326000,
"masked_author": "username_1",
"text": "@username_0 okay, makes sense. Thank you Paul and Ryan for your efforts towards SPM support 👏",
"title": null,
"type": "comment"
}
] | 4 | 7 | 2,199 | false | true | 1,580 | true |
adobe/aio-lib-cloudmanager | adobe | 872,938,622 | 120 | {
"number": 120,
"repo": "aio-lib-cloudmanager",
"user_login": "adobe"
} | [
{
"action": "opened",
"author": "justinedelson",
"comment_id": null,
"datetime": 1619804024000,
"masked_author": "username_0",
"text": "<!--- Provide a general summary of your changes in the Title above -->\r\n\r\n## Description\r\n\r\nAdd an eslint rule that validates that errors are handling property\r\n\r\n## Related Issue\r\n\r\n#114 \r\n#116 \r\n#117\r\n\r\n## Motivation and Context\r\n\r\nThese errors should be caught earlier.\r\n\r\n## How Has This Been Tested?\r\n\r\nTested WIP on those PRs... waited to submit until open issues were fixed\r\n\r\n## Screenshots (if appropriate):\r\n\r\n## Types of changes\r\n\r\n<!--- What types of changes does your code introduce? Put an `x` in all the boxes that apply: -->\r\n\r\n- [ ] Bug fix (non-breaking change which fixes an issue)\r\n- [ ] New feature (non-breaking change which adds functionality)\r\n- [ ] Breaking change (fix or feature that would cause existing functionality to change)\r\n\r\n## Checklist:\r\n\r\n<!--- Go over all the following points, and put an `x` in all the boxes that apply. -->\r\n<!--- If you're unsure about any of these, don't hesitate to ask. We're here to help! -->\r\n\r\n- [X] I have signed the [Adobe Open Source CLA](http://opensource.adobe.com/cla.html).\r\n- [X] My code follows the code style of this project.\r\n- [ ] My change requires a change to the documentation.\r\n- [ ] I have updated the documentation accordingly.\r\n- [ ] I have read the **CONTRIBUTING** document.\r\n- [ ] I have added tests to cover my changes.\r\n- [X] All new and existing tests passed.",
"title": "chore(lint): add custom rule to check error code usage",
"type": "issue"
}
] | 3 | 3 | 1,721 | false | true | 1,349 | false |
vintlabs/fauxmoESP | vintlabs | 757,749,744 | 129 | null | [
{
"action": "opened",
"author": "societyofrobots",
"comment_id": null,
"datetime": 1607191598000,
"masked_author": "username_0",
"text": "Lets say I have a lock set up.\r\n\r\nHow would I prevent any user from using Alexa to discover the device and then unlocking it? Alexa has the ability to require a pin number for locks... how would I go about adding this ability with fauxmoESP?",
"title": "connection security, for example with locks",
"type": "issue"
},
{
"action": "created",
"author": "pvint",
"comment_id": 742107042,
"datetime": 1607553382000,
"masked_author": "username_1",
"text": "I've been thinking about this for a few days now, and I cannot see any way we can do this. We are locked in (for the time being, anyway), to emulating Philips Hue bulbs, and with those there's no provision for anything like a PIN.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "pvint",
"comment_id": null,
"datetime": 1610742103000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 471 | false | false | 471 | false |
camgrodgers/senior-fps | null | 848,463,658 | 172 | null | [
{
"action": "opened",
"author": "camgrodgers",
"comment_id": null,
"datetime": 1617283251000,
"masked_author": "username_0",
"text": "The player could have a stamina bar that depletes only when they sprint. Or, there could be a Dark Souls style stamina bar that depletes when performing other actions as well, such as jumping or raising a gun and shooting.",
"title": "Stamina bar",
"type": "issue"
}
] | 1 | 1 | 222 | false | false | 222 | false |
Azure/azure-sdk-for-net | Azure | 802,881,288 | 18,515 | {
"number": 18515,
"repo": "azure-sdk-for-net",
"user_login": "Azure"
} | [
{
"action": "opened",
"author": "idear1203",
"comment_id": null,
"datetime": 1612680906000,
"masked_author": "username_0",
"text": "# Description\r\n\r\nRecently we got some user feedback that some models are not visible due to internal access level. This PR is to fix the issue by introducing some customizations.\r\n\r\n# All SDK Contribution checklist:\r\n\r\nThis checklist is used to make sure that common guidelines for a pull request are followed.\r\n- [x] **Please open PR in `Draft` mode if it is:**\r\n\t- Work in progress or not intended to be merged.\r\n\t- Encountering multiple pipeline failures and working on fixes.\r\n- [ ] If an SDK is being regenerated based on a new swagger spec, a link to the pull request containing these swagger spec changes has been included above.\r\n- [x] **I have read the [contribution guidelines](https://github.com/Azure/azure-sdk-for-net/blob/master/CONTRIBUTING.md).**\r\n- [x] **The pull request does not introduce [breaking changes](https://github.com/dotnet/corefx/blob/master/Documentation/coding-guidelines/breaking-change-rules.md).**\r\n\r\n### [General Guidelines and Best Practices](https://github.com/Azure/azure-sdk-for-net/blob/master/CONTRIBUTING.md#general-guidelines)\r\n- [x] Title of the pull request is clear and informative.\r\n- [x] There are a small number of commits, each of which have an informative message. This means that previously merged commits do not appear in the history of the PR. For more information on cleaning up the commits in your PR, [see this page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).\r\n\r\n### [Testing Guidelines](https://github.com/Azure/azure-sdk-for-net/blob/master/CONTRIBUTING.md#testing-guidelines)\r\n- [] Pull request includes test coverage for the included changes.\r\n\r\n### [SDK Generation Guidelines](https://github.com/Azure/azure-sdk-for-net/blob/master/CONTRIBUTING.md#sdk-generation-guidelines)\r\n- [ ] The generate.cmd file for the SDK has been updated with the version of AutoRest, as well as the commitid of your swagger spec or link to the swagger spec, used to generate the code. (Track 2 only)\r\n- [ ] The `*.csproj` and `AssemblyInfo.cs` files have been updated with the new version of the SDK. Please double check nuget.org current release version.\r\n\r\n## Additional management plane SDK specific contribution checklist: \r\nNote: Only applies to `Microsoft.Azure.Management.[RP]` or `Azure.ResourceManager.[RP]`\r\n \r\n- [ ] Include updated [management metadata](https://github.com/Azure/azure-sdk-for-net/tree/master/eng/mgmt/mgmtmetadata).\r\n- [ ] Update AzureRP.props to add/remove version info to maintain up to date API versions.\r\n\r\n### Management plane SDK Troubleshooting\r\n- If this is very first SDK for a services and you are adding new service folders directly under /SDK, please add `new service` label and/or contact assigned reviewer.\r\n- If the check fails at the `Verify Code Generation` step, please ensure:\r\n\t- Do not modify any code in generated folders.\r\n\t- Do not selectively include/remove generated files in the PR.\r\n\t- Do use `generate.ps1/cmd` to generate this PR instead of calling `autorest` directly.\r\n\tPlease pay attention to the @microsoft.csharp version output after running generate.ps1. If it is lower than current released version (2.3.82), please run it again as it should pull down the latest version,\r\n\r\n### Old outstanding PR cleanup\r\n Please note:\r\n\tIf PRs (including draft) has been out for more than 60 days and there are no responses from our query or followups, they will be closed to maintain a concise list for our reviewers.",
"title": "[Synapse][Artifacts] - Make some models public",
"type": "issue"
},
{
"action": "created",
"author": "idear1203",
"comment_id": 775634630,
"datetime": 1612841510000,
"masked_author": "username_0",
"text": "@chamons Correct, I just copy those code from generated without real changes except changing \"internal\" to “public”",
"title": null,
"type": "comment"
}
] | 1 | 2 | 3,594 | false | false | 3,594 | false |
gii-is-psg2/PSG2-2021-G1-15 | gii-is-psg2 | 852,559,328 | 100 | null | [
{
"action": "opened",
"author": "manmogvil",
"comment_id": null,
"datetime": 1617811265000,
"masked_author": "username_0",
"text": "Crear un informe técnico titulado “Análisis del Código Fuente y Métricas Asociadas” sobre los resultados de analizar su proyecto con SonarCloud (según los lanzamientos generados en S2 y S3, analizando el código en las ramas maestras para las confirmaciones correspondientes). Dicho informe debe describir los valores de las métricas del código fuente calculadas, los tipos de problemas encontrados en el análisis y sus causas. Este documento debe contener al menos los siguientes elementos:\r\n\r\n - Una captura de pantalla del panel de Sonar Cloud para el análisis de su proyecto y una descripción de las métricas proporcionadas en el panel y sus valores.\r\n - Descripción y análisis de los posibles fallos encontrados en el repositorio (en el apartado de fiabilidad, sobre medidas).\r\n - Descripción y análisis de los diferentes tipos de olores de código encontrados en los análisis (en la sección de mantenibilidad, sobre medidas). Para cada tipo de olor de código, el informe debe describirlo.\r\n - El nombre y la descripción del olor del código.\r\n - Las diferentes causas del olor en su código base\r\n - Una evaluación justificada de la gravedad del olor del código.\r\n - Una breve descripción de cómo solucionarlo en función de las causas.\r\n - Conclusiones sobre los resultados de los análisis",
"title": "S3.4 - Technical Report (2)",
"type": "issue"
},
{
"action": "closed",
"author": "carnuare",
"comment_id": null,
"datetime": 1620057867000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 1,303 | false | false | 1,303 | false |
acumenlabs/status-page | acumenlabs | 807,665,612 | 451 | null | [
{
"action": "opened",
"author": "max-acumen",
"comment_id": null,
"datetime": 1613187194000,
"masked_author": "username_0",
"text": "In [`c7da2d9`](https://github.com/acumenlabs/status-page/commit/c7da2d9462622622361e9d912a0d356142a418d4\n), Processors ($STATUS_URL) was **down**:\n- HTTP code: 200\n- Response time: 60 ms",
"title": "🛑 Processors is down",
"type": "issue"
},
{
"action": "created",
"author": "max-acumen",
"comment_id": 778557268,
"datetime": 1613188149000,
"masked_author": "username_0",
"text": "**Resolved:** Processors is back up in [`f045332`](https://github.com/acumenlabs/status-page/commit/f045332662e8034029849147bc1487881fc2825d\n).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "max-acumen",
"comment_id": null,
"datetime": 1613188150000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 329 | false | false | 329 | false |
MicrosoftDocs/windows-uwp | MicrosoftDocs | 692,942,801 | 2,686 | null | [
{
"action": "opened",
"author": "pjmlp",
"comment_id": null,
"datetime": 1599210692000,
"masked_author": "username_0",
"text": "Page tells that C++/WinRT is the way to go for C++, yet the examples still make use of C++/CX.\r\n\r\n\r\n---\r\n#### Document Details\r\n\r\n⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*\r\n\r\n* ID: 8d75cb81-9ff3-f9ad-8ad4-818e45da1d62\r\n* Version Independent ID: f21e558d-b41f-bcfd-ca8a-9daabf7fa85a\r\n* Content: [Call Windows Runtime APIs in desktop apps](https://docs.microsoft.com/en-us/windows/apps/desktop/modernize/desktop-to-uwp-enhance)\r\n* Content Source: [hub/apps/desktop/modernize/desktop-to-uwp-enhance.md](https://github.com/MicrosoftDocs/windows-uwp/blob/docs/hub/apps/desktop/modernize/desktop-to-uwp-enhance.md)\r\n* Product: **uwp**\r\n* Technology: **apps**\r\n* GitHub Login: @username_1\r\n* Microsoft Alias: **mcleans**",
"title": "Samples still make use of C++/CX despite advice for C++/WinRT adoption.",
"type": "issue"
},
{
"action": "created",
"author": "mcleanbyron",
"comment_id": 687470769,
"datetime": 1599262682000,
"masked_author": "username_1",
"text": "Thanks for submitting your feedback, and good catch. I added a C++/WinRT version to the notification code example in this article.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mcleanbyron",
"comment_id": null,
"datetime": 1599262682000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 894 | false | false | 894 | true |
slide-rs/hibitset | slide-rs | 473,620,232 | 52 | {
"number": 52,
"repo": "hibitset",
"user_login": "slide-rs"
} | [
{
"action": "opened",
"author": "lcnr",
"comment_id": null,
"datetime": 1564219829000,
"masked_author": "username_0",
"text": "As kings go and empires fall, `Clone` is eternal.\r\n\r\n```rust\r\nimpl Clone for specs::JoinIter\r\n```\r\n\r\nForgot that `BitSetAnd` is needed when joining tuples.",
"title": "BitSetAnd",
"type": "issue"
},
{
"action": "created",
"author": "lcnr",
"comment_id": 529532223,
"datetime": 1568042845000,
"masked_author": "username_0",
"text": "@torkleyy can you release this new version on crates.io?",
"title": null,
"type": "comment"
}
] | 2 | 3 | 379 | false | true | 211 | false |
cilium/cilium | cilium | 870,485,162 | 15,926 | {
"number": 15926,
"repo": "cilium",
"user_login": "cilium"
} | [
{
"action": "opened",
"author": "aanm",
"comment_id": null,
"datetime": 1619653076000,
"masked_author": "username_0",
"text": "* #15803 -- handle IP addresses modification in running nodes and CEPs (@username_0)\r\n* #15077 -- pkg/k8s: add DeepEqual code generation for Service (@username_0)\r\n\r\nPR #15803 depends on #15077, so both had to be backported\r\n\r\nOnce this PR is merged, you can update the PR labels via:\r\n```upstream-prs\r\n$ for pr in 15077 15803; do contrib/backporting/set-labels.py $pr done 1.9; done\r\n```",
"title": "v1.9 backports 2021-04-29",
"type": "issue"
},
{
"action": "created",
"author": "aanm",
"comment_id": 828855890,
"datetime": 1619654605000,
"masked_author": "username_0",
"text": "test-backport-1.9",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "aanm",
"comment_id": 829132785,
"datetime": 1619693692000,
"masked_author": "username_0",
"text": "test-1.12-netnext",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pchaigno",
"comment_id": 830238896,
"datetime": 1619802863000,
"masked_author": "username_1",
"text": "test-1.12-netnext",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "aanm",
"comment_id": 830279835,
"datetime": 1619807156000,
"masked_author": "username_0",
"text": "@username_1 I think it's a real failure since it happened here as well https://github.com/cilium/cilium/pull/15927",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "aanm",
"comment_id": 830706148,
"datetime": 1619909964000,
"masked_author": "username_0",
"text": "test-backport-1.9",
"title": null,
"type": "comment"
}
] | 2 | 6 | 556 | false | false | 556 | true |
leonweber/nlprolog | null | 899,980,744 | 2 | null | [
{
"action": "opened",
"author": "eliotpbrenner",
"comment_id": null,
"datetime": 1621888036000,
"masked_author": "username_0",
"text": "Would it be possible to attach a license to this, preferably a MIT or BSD? Thanks!",
"title": "license",
"type": "issue"
},
{
"action": "created",
"author": "leonweber",
"comment_id": 847577705,
"datetime": 1621923952000,
"masked_author": "username_1",
"text": "Closed via 87d93fb",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "leonweber",
"comment_id": null,
"datetime": 1621923952000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 101 | false | false | 101 | false |
tensorflow/tensorflow | tensorflow | 852,103,233 | 48,353 | {
"number": 48353,
"repo": "tensorflow",
"user_login": "tensorflow"
} | [
{
"action": "opened",
"author": "lchangwo",
"comment_id": null,
"datetime": 1617780540000,
"masked_author": "username_0",
"text": "Update on 9 Jan 2019",
"title": "Merge pull request #1 from tensorflow/master",
"type": "issue"
},
{
"action": "created",
"author": "gbaned",
"comment_id": 815000449,
"datetime": 1617808698000,
"masked_author": "username_1",
"text": "Closing this PR since no changes in files. Thanks!\r\ncc @mihaimaruseac",
"title": null,
"type": "comment"
}
] | 3 | 3 | 1,908 | false | true | 89 | false |
psmiraglia/spid-compliant-certificates | null | 811,570,794 | 4 | null | [
{
"action": "opened",
"author": "peppelinux",
"comment_id": null,
"datetime": 1613693298000,
"masked_author": "username_0",
"text": "Dear Paolo,\r\nConsidering the quality of this repository in relation to the promulgation objectives of the tools currently useful to simplify the SPID accreditation process, and all its intermediate steps, I would like to ask you to migrate this repository to developers italia.\r\n\r\nWe could address any requests for integration of CI models, if you agree, once the repository has been migrated. You will remain the one and only maintainer.\r\n\r\nSure of your kind reply,\r\nthank you for what has been done to date",
"title": "Migration to Developers Italia",
"type": "issue"
},
{
"action": "closed",
"author": "psmiraglia",
"comment_id": null,
"datetime": 1613747462000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "psmiraglia",
"comment_id": 782134907,
"datetime": 1613747462000,
"masked_author": "username_1",
"text": "It's fine! I'll give you all the required permissions to migrate the repo.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 582 | false | false | 582 | false |
google/filament | google | 863,278,368 | 3,829 | {
"number": 3829,
"repo": "filament",
"user_login": "google"
} | [
{
"action": "opened",
"author": "bejado",
"comment_id": null,
"datetime": 1618956127000,
"masked_author": "username_0",
"text": "#3786 added default sampler precision specifiers, but these specifiers are only valid in ES.\r\n\r\nFixes #3823",
"title": "Specify sampler precision only on GLES",
"type": "issue"
},
{
"action": "created",
"author": "ScatteredRay",
"comment_id": 823746412,
"datetime": 1618974812000,
"masked_author": "username_1",
"text": "Confirmed that this fixes the samples on my computer.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 160 | false | false | 160 | false |
FrederickMih/Bubble-sort | null | 789,628,313 | 1 | {
"number": 1,
"repo": "Bubble-sort",
"user_login": "FrederickMih"
} | [
{
"action": "opened",
"author": "FrederickMih",
"comment_id": null,
"datetime": 1611118171000,
"masked_author": "username_0",
"text": "## Bubble-sort and bubble_sort_by methods\r\n\r\nIn these methods,\r\n\r\n-I implemented the bubble_sort method that returns a sorted array of integers in ascending order.\r\n-The bubble_sort_by method returns sorted strings of block words in ascending order.",
"title": "Ruby: Bubble sort",
"type": "issue"
},
{
"action": "created",
"author": "FrederickMih",
"comment_id": 763511822,
"datetime": 1611139220000,
"masked_author": "username_0",
"text": "## Ruby:\r\n# Bubble Sorting of integers and words of an array in ascending order.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 329 | false | false | 329 | false |
binance-exchange/binance-java-api | binance-exchange | 755,282,480 | 295 | null | [
{
"action": "opened",
"author": "guilllet1",
"comment_id": null,
"datetime": 1606916348000,
"masked_author": "username_0",
"text": "Hello, i can't find those attributes in the API\r\nisMarginTradingAllowed\r\nisSpotTradingAllowed\r\nCould you tell me ?\r\n\r\n\r\n`{\r\n \"timezone\": \"UTC\",\r\n \"serverTime\": 1565246363776,\r\n \"rateLimits\": [\r\n {\r\n //These are defined in the `ENUM definitions` section under `Rate Limiters (rateLimitType)`.\r\n //All limits are optional\r\n }\r\n ],\r\n \"exchangeFilters\": [\r\n //These are the defined filters in the `Filters` section.\r\n //All filters are optional.\r\n ],\r\n \"symbols\": [\r\n {\r\n \"symbol\": \"ETHBTC\",\r\n \"status\": \"TRADING\",\r\n \"baseAsset\": \"ETH\",\r\n \"baseAssetPrecision\": 8,\r\n \"quoteAsset\": \"BTC\",\r\n \"quotePrecision\": 8,\r\n \"baseCommissionPrecision\": 8,\r\n \"quoteCommissionPrecision\": 8,\r\n \"orderTypes\": [\r\n \"LIMIT\",\r\n \"LIMIT_MAKER\",\r\n \"MARKET\",\r\n \"STOP_LOSS\",\r\n \"STOP_LOSS_LIMIT\",\r\n \"TAKE_PROFIT\",\r\n \"TAKE_PROFIT_LIMIT\"\r\n ],\r\n \"icebergAllowed\": true,\r\n \"ocoAllowed\": true,\r\n \"quoteOrderQtyMarketAllowed\": true,\r\n \"isSpotTradingAllowed\": true,\r\n \"isMarginTradingAllowed\": true,\r\n \"filters\": [\r\n //These are defined in the Filters section.\r\n //All filters are optional\r\n ],\r\n \"permissions\": [\r\n \"SPOT\",\r\n \"MARGIN\"\r\n ]\r\n }\r\n ]\r\n}`",
"title": "Can't find isMarginTradingAllowed",
"type": "issue"
}
] | 1 | 1 | 1,319 | false | false | 1,319 | false |
inexorgame/site | inexorgame | 609,408,979 | 63 | null | [
{
"action": "opened",
"author": "MartinMuzatko",
"comment_id": null,
"datetime": 1588200063000,
"masked_author": "username_0",
"text": "The team page is currently automatically rendered using github information.\r\nI would like to replace it with a manual list with respect to the field of work everyone has contributed so far.\r\n\r\nThis is a manual process so the list has to be maintained. \r\n\r\n@inexorgame/members PLEASE let me know what on the website you would like to be listed as.\r\nLike what kind of work you have contributed, and what your interests are - a tiny bio ~1-2 sentences.\r\n\r\nI will also create a section for our previous members that have contributed their ideas, code and content.\r\nIf you want to be listed, please also let me know. These are the ones that I know of:\r\n\r\n@Fohlen \r\n@username_1 \r\n@Croydon",
"title": "Rework Team page",
"type": "issue"
},
{
"action": "created",
"author": "terencode",
"comment_id": 624797421,
"datetime": 1588787663000,
"masked_author": "username_1",
"text": "Thanks, I'd appreciate as well, probably with something along \"website design contribution\".",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "MartinMuzatko",
"comment_id": null,
"datetime": 1646262187000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 774 | false | false | 774 | true |
pmndrs/use-cannon | pmndrs | 858,308,349 | 180 | {
"number": 180,
"repo": "use-cannon",
"user_login": "pmndrs"
} | [
{
"action": "opened",
"author": "brunnolou",
"comment_id": null,
"datetime": 1618438872000,
"masked_author": "username_0",
"text": "# What?\r\n\r\nThis PR adds the following setter methods to `Spring`:\r\n - `stiffness`\r\n - `restLength`\r\n - `damping`\r\n\r\n# Why?\r\n\r\nSince there's no support for prismatic constraint, with the right spring settings, we could simulate a linear actuator, or a muscle contraction using spring.\r\n\r\n# How?\r\n\r\n- Adds 3 new methods to the hook and worker.\r\n- Updates the demo to dynamically change the spring length on pointer down:\r\n \r\nhttps://user-images.githubusercontent.com/2729225/114787089-17bf1580-9d77-11eb-9aac-b95214203aa2.mp4",
"title": "Add Spring setters",
"type": "issue"
},
{
"action": "created",
"author": "stockHuman",
"comment_id": 819931390,
"datetime": 1618445956000,
"masked_author": "username_1",
"text": "Lovely addition, thank you.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 565 | false | false | 565 | false |
Homebrew/homebrew-cask | Homebrew | 831,648,327 | 101,133 | {
"number": 101133,
"repo": "homebrew-cask",
"user_login": "Homebrew"
} | [
{
"action": "opened",
"author": "bingtsingw",
"comment_id": null,
"datetime": 1615802834000,
"masked_author": "username_0",
"text": "**Important:** *Do not tick a checkbox if you haven’t performed its action.* Honesty is indispensable for a smooth review process.\r\n\r\nAfter making all changes to a cask, verify:\r\n\r\n- [x] The submission is for [a stable version](https://github.com/Homebrew/homebrew-cask/blob/master/doc/development/adding_a_cask.md#stable-versions) or [documented exception](https://github.com/Homebrew/homebrew-cask/blob/master/doc/development/adding_a_cask.md#but-there-is-no-stable-version).\r\n- [x] `brew audit --cask {{cask_file}}` is error-free.\r\n- [x] `brew style --fix {{cask_file}}` reports no offenses.\r\n\r\nAdditionally, **if adding a new cask**:\r\n\r\n- [x] Named the cask according to the [token reference](https://github.com/Homebrew/homebrew-cask/blob/master/doc/cask_language_reference/token_reference.md).\r\n- [x] Checked the cask was not [already refused](https://github.com/Homebrew/homebrew-cask/search?q=is%3Aclosed&type=Issues).\r\n- [x] Checked the cask is submitted to [the correct repo](https://github.com/Homebrew/homebrew-cask/blob/master/doc/development/adding_a_cask.md#finding-a-home-for-your-cask).\r\n- [x] `brew audit --new-cask {{cask_file}}` worked successfully.\r\n- [x] `brew install --cask {{cask_file}}` worked successfully.\r\n- [x] `brew uninstall --cask {{cask_file}}` worked successfully.",
"title": "Add RemNote.app v1.2.9",
"type": "issue"
},
{
"action": "created",
"author": "bingtsingw",
"comment_id": 799295753,
"datetime": 1615803141000,
"masked_author": "username_0",
"text": "Hi, could you help me with the `livecheck` part?\r\n`curl https://www.remnote.io/desktop/mac` is a 302 redirect response:\r\n`Found. Redirecting to https://download.remnote.io/RemNote-1.2.9.dmg`\r\n\r\nI have no idea how to add this type of live check.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 1,543 | false | false | 1,543 | false |
ropensci/FedData | ropensci | 772,973,446 | 74 | null | [
{
"action": "opened",
"author": "jsta",
"comment_id": null,
"datetime": 1608646044000,
"masked_author": "username_0",
"text": "Looks like the `pal_nlcd` function was dropped in 634adad0dd3b7d431e1df8227d6cb39df3adb0aa. Was this intentional?",
"title": "The pal_nlcd function has been dropped",
"type": "issue"
},
{
"action": "created",
"author": "bocinsky",
"comment_id": 851786072,
"datetime": 1622519773000,
"masked_author": "username_1",
"text": "Hey there. Sorry I took so long to respond to this. It was definitely intentional, but negligent. ;-) I'll add it back in ASAP.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "bocinsky",
"comment_id": null,
"datetime": 1631038887000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "bocinsky",
"comment_id": 914523039,
"datetime": 1631038887000,
"masked_author": "username_1",
"text": "It's back!",
"title": null,
"type": "comment"
}
] | 2 | 4 | 250 | false | false | 250 | false |
PrismarineJS/prismarine-web-client | PrismarineJS | 819,613,771 | 37 | null | [
{
"action": "opened",
"author": "u9g",
"comment_id": null,
"datetime": 1614660801000,
"masked_author": "username_0",
"text": "I shouldnt be able to select any text lol, the blue looks terrible",
"title": "make all text unselectable",
"type": "issue"
},
{
"action": "created",
"author": "ShrimpyStuff",
"comment_id": 789394911,
"datetime": 1614741257000,
"masked_author": "username_1",
"text": "Should be fixed in #51",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Karang",
"comment_id": null,
"datetime": 1614770677000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 3 | 88 | false | false | 88 | false |
informalsystems/tendermint-rs | informalsystems | 929,252,416 | 919 | null | [
{
"action": "opened",
"author": "romac",
"comment_id": null,
"datetime": 1624542659000,
"masked_author": "username_0",
"text": "On order to implement runtime configuration reload in Hermes, we need to be able to compare two configurations for equality.\r\nWe are currently not able to do that, because our configuration struct contains a few fields of type `tendermint_rpc::Url`, which does not implement the `Eq` trait.\r\n\r\nIt would be great if we could derive this trait for `tendermint_rpc::Url`, and add other standard trait bounds while we're at it. The full list I am thinking of is: `PartialEq, Eq, PartialOrd, Ord, Hash`.\r\n\r\n**What's the definition of \"done\" for this issue?**\r\n\r\n`tendermint_rpc::Url` implements `PartialEq, Eq, PartialOrd, Ord, Hash`.",
"title": "Add `PartialEq, Eq, PartialOrd, Ord, Hash` bounds to `tendermint_rpc::Url`",
"type": "issue"
},
{
"action": "closed",
"author": "romac",
"comment_id": null,
"datetime": 1624795213000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 629 | false | false | 629 | false |
ballerina-platform/nballerina | ballerina-platform | 800,510,091 | 62 | null | [
{
"action": "opened",
"author": "ruvi-d",
"comment_id": null,
"datetime": 1612371028000,
"masked_author": "username_0",
"text": "**Description:**\r\nAdd the LIT tests (integration tests) to the Github PR check workflow \r\n\r\n**Suggested Labels:**\r\nPriority/High \r\n\r\n**Suggested Assignees:**\r\nusername_0",
"title": "Add LIT tests to PR check",
"type": "issue"
},
{
"action": "created",
"author": "ruvi-d",
"comment_id": 776495559,
"datetime": 1612940830000,
"masked_author": "username_0",
"text": "Need a link to the Ballerina pack for this\r\nDiscussion for that is at : https://ballerina-platform.slack.com/archives/C01LNSAHK18/p1612789622005500",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ruvi-d",
"comment_id": 777340141,
"datetime": 1613038807000,
"masked_author": "username_0",
"text": "## Status\r\n\r\n### Sub tasks\r\n- [ ] Get download URL for Ballerina pack : **Blocked**\r\n- [ ] Setup dependencies on Github PR check instance\r\n- [ ] Execute checks (Lit tests) on Github PR check instance",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Kishanthan",
"comment_id": null,
"datetime": 1614225465000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 511 | false | false | 511 | true |
dotnet/AspNetCore.Docs | dotnet | 613,591,968 | 18,163 | null | [
{
"action": "opened",
"author": "CeciAc",
"comment_id": null,
"datetime": 1588796847000,
"masked_author": "username_0",
"text": "@Koulbrise1030 commented on [Wed May 06 2020](https://github.com/dotnet/AspNetCore.Docs.fr-fr/issues/246)\n\nI get an error adding the [Required] attribute in the \"Rating\" class.\nYou must fill in the \"Rating\" data to avoid having the error\n\n---\n#### Détails du document\n\n⚠ *Ne pas modifier cette section. C’est obligatoire pour docs.microsoft.com ➟ Liaison des problèmes GitHub.*\n\n* ID: 443a7c53-cc16-7ed5-5950-46a6f195f19c\n* Version Independent ID: 12fe9329-2cef-4056-4986-18ed7ae5c397\n* Content: [Ajouter une validation à une application ASP.NET Core MVC](https://docs.microsoft.com/fr-fr/aspnet/core/tutorials/first-mvc-app/validation?view=aspnetcore-3.1#feedback)\n* Content Source: [aspnetcore/tutorials/first-mvc-app/validation.md](https://github.com/dotnet/AspNetCore.Docs.fr-fr/blob/live/aspnetcore/tutorials/first-mvc-app/validation.md)\n* Product: **aspnet-core**\n* Technology: **aspnetcore-tutorials**\n* GitHub Login: @username_1\n* Microsoft Alias: **riande**\n\n---\n\n@srvbpigh commented on [Wed May 06 2020](https://github.com/dotnet/AspNetCore.Docs.fr-fr/issues/246#issuecomment-624756751)\n\nHello, @Koulbrise1030\n \nThank you for your feedback.\n \nWe are actively reviewing your comments and will get back to you soon.\n\n\nKind regards,\nMicrosoft DOCS International Team",
"title": "Error because \"Rating\" empty",
"type": "issue"
},
{
"action": "closed",
"author": "Rick-Anderson",
"comment_id": null,
"datetime": 1588797038000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "Rick-Anderson",
"comment_id": 624874097,
"datetime": 1588797038000,
"masked_author": "username_1",
"text": "The tutorial works. I'm not sure what step you missed. You might try asking on StackOverflow,.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,370 | false | false | 1,370 | true |
blackflux/lambda-serverless-api | blackflux | 833,427,639 | 2,193 | {
"number": 2193,
"repo": "lambda-serverless-api",
"user_login": "blackflux"
} | [
{
"action": "opened",
"author": "MrsFlux",
"comment_id": null,
"datetime": 1615960372000,
"masked_author": "username_0",
"text": "Automatically created by Git-Ally",
"title": "[Gally]: master <- dev",
"type": "issue"
},
{
"action": "created",
"author": "MrsFlux",
"comment_id": 801194105,
"datetime": 1615996200000,
"masked_author": "username_0",
"text": ":tada: This PR is included in version 8.0.5 :tada:\n\nThe release is available on:\n- [npm package (@latest dist-tag)](https://www.npmjs.com/package/lambda-serverless-api/v/8.0.5)\n- [GitHub release](https://github.com/blackflux/lambda-serverless-api/releases/tag/v8.0.5)\n\nYour **[semantic-release](https://github.com/semantic-release/semantic-release)** bot :package::rocket:",
"title": null,
"type": "comment"
}
] | 1 | 2 | 405 | false | false | 405 | false |
shane3606/release-based-workflow | null | 593,897,208 | 4 | {
"number": 4,
"repo": "release-based-workflow",
"user_login": "shane3606"
} | [
{
"action": "opened",
"author": "shane3606",
"comment_id": null,
"datetime": 1586011028000,
"masked_author": "username_0",
"text": "",
"title": "Create pull_request_template.md",
"type": "issue"
}
] | 2 | 2 | 388 | false | true | 0 | false |
nevermined-io/contracts | nevermined-io | 819,929,762 | 44 | null | [
{
"action": "opened",
"author": "aaitor",
"comment_id": null,
"datetime": 1614684663000,
"masked_author": "username_0",
"text": "NFT Sales (NFTSalesTemplate) will require to execute the following conditions:\r\n- [ ] Owner lock the NFTs to sell in an escrow contract\r\n- [ ] Consumer lock payment via the existing Lock Reward condition\r\n- [ ] Owner or Provider Transfer the NFTs via the new condition\r\n- [ ] Whoever can distribute rewards to the owner (and royalties to the original creator)",
"title": "Create new NFT Sales template",
"type": "issue"
},
{
"action": "closed",
"author": "aaitor",
"comment_id": null,
"datetime": 1616070899000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 359 | false | false | 359 | false |
zephyrproject-rtos/esp-idf | zephyrproject-rtos | 756,443,386 | 6 | {
"number": 6,
"repo": "esp-idf",
"user_login": "zephyrproject-rtos"
} | [
{
"action": "opened",
"author": "sylvioalves",
"comment_id": null,
"datetime": 1607020002000,
"masked_author": "username_0",
"text": "This PR is the baseline for the upcoming commits related to ESP32 Development ([ESP Development](https://github.com/zephyrproject-rtos/zephyr/issues/29394))\r\n\r\nAs from [here](https://github.com/zephyrproject-rtos/esp-idf/issues/4), we keep current esp-idf structure and not creating a new repo/branch.",
"title": "esp32: add wifi hal interface and update to esp-idf v4.2 ",
"type": "issue"
},
{
"action": "created",
"author": "sylvioalves",
"comment_id": 738726794,
"datetime": 1607080477000,
"masked_author": "username_0",
"text": "@nashif, once approved, we need to sync with [this](https://github.com/zephyrproject-rtos/zephyr/pull/30424).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "carlescufi",
"comment_id": 738891642,
"datetime": 1607100711000,
"masked_author": "username_1",
"text": "@username_0 thanks, could you please rebase this properly so that only the commits relevant to this PR are shown?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sylvioalves",
"comment_id": 740666148,
"datetime": 1607439079000,
"masked_author": "username_0",
"text": "Closing this PR as we will move to `hal_espressif`.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 575 | false | false | 575 | true |
johnno1962/Refactorator | null | 861,472,627 | 9 | null | [
{
"action": "opened",
"author": "adamchenwei",
"comment_id": null,
"datetime": 1618848093000,
"masked_author": "username_0",
"text": "",
"title": "is this same as the Lyft's refactorator? or its just same name?",
"type": "issue"
},
{
"action": "created",
"author": "johnno1962",
"comment_id": 822589283,
"datetime": 1618848433000,
"masked_author": "username_1",
"text": "I don't think so. What is the link to Lyft's refactorator?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "adamchenwei",
"comment_id": 822590998,
"datetime": 1618848571000,
"masked_author": "username_0",
"text": "I can't find one that's reason I popup here :D haha thanks for the quick verification!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "johnno1962",
"comment_id": null,
"datetime": 1619243351000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 144 | false | false | 144 | false |
diamondburned/arikawa | null | 897,631,753 | 197 | {
"number": 197,
"repo": "arikawa",
"user_login": "diamondburned"
} | [
{
"action": "opened",
"author": "twoscott",
"comment_id": null,
"datetime": 1621566078000,
"masked_author": "username_0",
"text": "Just some methods I thought would be convenient. I haven't used most of these structs yet but I figured the CreatedAt methods should be on any object that has a snowflake directly identifying it (not sure if I missed any).",
"title": "discord: Helper methods & doc fixes",
"type": "issue"
},
{
"action": "created",
"author": "diamondburned",
"comment_id": 845617473,
"datetime": 1621566170000,
"masked_author": "username_1",
"text": "Is there a reason why `CreatedAt` returns `discord.Timestamp` and not `time.Time`?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "twoscott",
"comment_id": 845618929,
"datetime": 1621566454000,
"masked_author": "username_0",
"text": "I honestly just wasn't sure which to use, figured you might prefer to use `discord.Timestamp` to keep consistency but I could see `time.Time` working just as well if not better",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "diamondburned",
"comment_id": 845653329,
"datetime": 1621573093000,
"masked_author": "username_1",
"text": "`discord.Timestamp` would indeed be more consistent, but since these are methods, it doesn't really matter. The type was made just so JSON can be marshaled properly, so it doesn't matter as far as usage goes.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 688 | false | false | 688 | false |
hingston/7-billion-humans-solutions | null | 787,375,897 | 266 | {
"number": 266,
"repo": "7-billion-humans-solutions",
"user_login": "hingston"
} | [
{
"action": "opened",
"author": "danrudmin",
"comment_id": null,
"datetime": 1610785007000,
"masked_author": "username_0",
"text": "1 fewer steps (this is DMR)",
"title": "Update Year 62 - The Sorting Floor (size).txt",
"type": "issue"
},
{
"action": "created",
"author": "hingston",
"comment_id": 846567897,
"datetime": 1621778439000,
"masked_author": "username_1",
"text": "Sorry about the slow response!",
"title": null,
"type": "comment"
}
] | 2 | 2 | 57 | false | false | 57 | false |
maxbanton/cwh | null | 479,584,895 | 66 | null | [
{
"action": "opened",
"author": "omacesc",
"comment_id": null,
"datetime": 1565607645000,
"masked_author": "username_0",
"text": "* **I'm submitting a ...**\r\n - [ ] bug report\r\n - [ ] feature request\r\n - [x] support request\r\n\r\n* **Do you want to request a *feature* or report a *bug*?**\r\nReport a bug\r\n\r\n* **What is the current behavior?**\r\nWhen create a new log group, no put the correct retention\r\n\r\n* **If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem**\r\nI use laravel 5.5 monolog\r\n\r\n* **What is the expected behavior?**\r\nPut the correct number in retention\r\n\r\n* **What is the motivation / use case for changing the behavior?**\r\n\r\n* **Please tell about your environment:**\r\n \r\n - PHP Version: 7.1\r\n - Operating system (distro): debian\r\n - Application mode (web app / cli app / daemon cli app): web app\r\n\r\n* **Other information** (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links to have context, eg. stackoverflow, etc)\r\nLaravel 5.5\r\n\r\n`$cwClient = App::make('aws')->createClient('CloudWatchLogs');\r\n $cwGroupName = config('laravel-monolog-ext.drivers.cloudwatch.group');\r\n $cwStreamNameApp = 'laravel-' . now()->toDateString() . '.log';\r\n $cwRetentionDays = config('laravel-monolog-ext.drivers.cloudwatch.retention');\r\n $cwLevel = config('laravel-monolog-ext.drivers.cloudwatch.level');\r\n $this->cwHandlerApp = new CloudWatch($cwClient, $cwGroupName, $cwStreamNameApp, $cwRetentionDays, 10000, [], $cwLevel);\r\n$monolog->pushHandler($this->cwHandlerApp);`\r\n\r\nThe user AWS have full access in CloudWatch Logs",
"title": "No apply retention days",
"type": "issue"
},
{
"action": "closed",
"author": "maxbanton",
"comment_id": null,
"datetime": 1605628937000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 1,528 | false | false | 1,528 | false |
Raulcmm/Portafolio | null | 824,204,837 | 8 | {
"number": 8,
"repo": "Portafolio",
"user_login": "Raulcmm"
} | [
{
"action": "opened",
"author": "Raulcmm",
"comment_id": null,
"datetime": 1615183359000,
"masked_author": "username_0",
"text": "<h3>Snyk has created this PR to fix one or more vulnerable packages in the `npm` dependencies of this project.</h3>\n\n\n\n\n#### Changes included in this PR\n\n- Changes to the following files to upgrade the vulnerable dependencies to a fixed version:\n - package.json\n - package-lock.json\n\n\n\n#### Vulnerabilities that will be fixed\n##### With an upgrade:\nSeverity | Priority Score (*) | Issue | Breaking Change | Exploit Maturity\n:-------------------------:|-------------------------|:-------------------------|:-------------------------|:-------------------------\n | **479/1000** <br/> **Why?** Has a fix available, CVSS 5.3 | Regular Expression Denial of Service (ReDoS) <br/>[SNYK-JS-GLOBPARENT-1016905](https://snyk.io/vuln/SNYK-JS-GLOBPARENT-1016905) | No | No Known Exploit \n\n(*) Note that the real score may have changed since the PR was raised.\n\n\n\n\n\n<details>\n <summary><b>Commit messages</b></summary>\n </br>\n <details>\n <summary>Package name: <b>gatsby-plugin-sharp</b></summary>\n The new version differs by 53 commits.</br>\n <ul>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/83cd408c557ac3318902b17e16bca23a7d711006\">83cd408</a> chore(release): Publish</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/d6f03185030cefff11e2f3b573d6e24ee9ead36e\">d6f0318</a> chore: use packlist for cleanup-package-dir (#26657)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/aa300f47e1b8ae69af5d60bda61b3b3d35ff00dc\">aa300f4</a> chore(docs):fixed file names and links in query-execution (#26680)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/11ab72a91c911551fe35e7bcd285bf9807f2d238\">11ab72a</a> chore(docs): fixed some links in query-execution (#26555)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/fed26198626d96780d9c04f954a1f2d26fe29063\">fed2619</a> fix(docs): query filters -> update dictionary, code fences, fix code, 
brand name (#26408)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/7de5f182dca00bdba088742c09a3674e2a97c1b3\">7de5f18</a> add code fences (#26409)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/823e473449e940c583db24728ff0015fa7f9e33f\">823e473</a> fix(docs): schema -> fix 404, remove deleted page from sidebar, apply redirects (#26461)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/21b94dfb43154ccb984af91b26cd05b2d96efba3\">21b94df</a> Docs - Remove not inclusive words (#26294)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/652af043b22dfe9dcdd6d97275576901e46da0a7\">652af04</a> fix(docs): schema -> code fences, code fix (#26462)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/6b9697258950e334850dce7c5eee97bdea9e87bd\">6b96972</a> chore(docs): Update GraphQL spelling in README.md (#26693)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/c2aededc50bc9e2946e3fd8be143c31cdfed3259\">c2aeded</a> fix(gatsby): properly unlock processes onExit (#26670)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/93fdc0993d530b68cd317fb3c58b322684de68b3\">93fdc09</a> fix(gatsby): only enable debugger when argument is given (#26669)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/7e83aced6b5c9fc421fd8f3b67b77269a6b59f09\">7e83ace</a> chore(docs): fix typos (#26682)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/c40434a27adac38f5d7963eebdb3f050976babe7\">c40434a</a> chore(docs): Fix a typo (#26665)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/18f6b4d1eae5cf7a05e1da2a390e8cf0333979b1\">18f6b4d</a> chore(docs): Fix typos (#26663)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/dedd37f79467058e0bd17d9f5db176faace272e7\">dedd37f</a> chore(gatsby-plugin-sharp, 
gatsby-transformer-sharp): update dependencies (#26259)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/7975b91b53ff9b38fd85d0d58863bf34bd6d98b1\">7975b91</a> chore(gatsby-recipes): Add a contributing.md to recipes (#26583)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/ac72bfb7d1ff2c06ef6652f12c1e861fd7cc21f3\">ac72bfb</a> chore(release): Publish</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/703678e31f98223dc70eaa4be5de54309e0de622\">703678e</a> Admin/recipes gui (#26243)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/04c75bb3476230d4d8f602d01265d5371a2374db\">04c75bb</a> fix(gatsby): fix error from ts conversion (#26681)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/25e3a636dca48a0241e50cd95b4a31c6062be293\">25e3a63</a> fix(gatsby): fix materialization edge case with nullish values (#26677)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/19020c200d059fff6ee5eff8d860cffb82363ab5\">19020c2</a> chore(benchmarks): set semver to match any patch/minor for most deps (#26679)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/608f40c1d8658e93a27732159c1a58c7cdd2bb45\">608f40c</a> chore: cherrypick Renovate updates (#26582)</li>\n <li><a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/commit/6ba68f8e34a18fabc16aedebbdc3b79707c2d322\">6ba68f8</a> feat(gatsby): Support React 17's new JSX Transform (#26652)</li>\n </ul>\n\n <a href=\"https://snyk.io/redirect/github/gatsbyjs/gatsby/compare/56990bdd49c01e4f4a0c659df542e35abea2954e...83cd408c557ac3318902b17e16bca23a7d711006\">See the full diff</a>\n </details>\n</details>\n\n\n\n\n\n\nCheck the changes in this PR to ensure they won't cause issues with your project.\n\n\n\n------------\n\n\n\n**Note:** *You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.*\n\nFor 
more information: <img src=\"https://api.segment.io/v1/pixel/track?data=eyJ3cml0ZUtleSI6InJyWmxZcEdHY2RyTHZsb0lYd0dUcVg4WkFRTnNCOUEwIiwiYW5vbnltb3VzSWQiOiIzY2IyMTJiOC04ZmUzLTRkNzMtOTk3NC1mM2NlMmRlODRjMjciLCJldmVudCI6IlBSIHZpZXdlZCIsInByb3BlcnRpZXMiOnsicHJJZCI6IjNjYjIxMmI4LThmZTMtNGQ3My05OTc0LWYzY2UyZGU4NGMyNyJ9fQ==\" width=\"0\" height=\"0\"/>\n🧐 [View latest project report](https://app.snyk.io/org/appswicho/project/f46260a3-a995-498d-8dc7-21342470ae95)\n\n🛠 [Adjust project settings](https://app.snyk.io/org/appswicho/project/f46260a3-a995-498d-8dc7-21342470ae95/settings)\n\n📚 [Read more about Snyk's upgrade and patch logic](https://support.snyk.io/hc/en-us/articles/360003891078-Snyk-patches-to-fix-vulnerabilities)\n\n[//]: # (snyk:metadata:{\"prId\":\"3cb212b8-8fe3-4d73-9974-f3ce2de84c27\",\"dependencies\":[{\"name\":\"gatsby-plugin-sharp\",\"from\":\"2.6.28\",\"to\":\"2.6.31\"}],\"packageManager\":\"npm\",\"projectPublicId\":\"f46260a3-a995-498d-8dc7-21342470ae95\",\"projectUrl\":\"https://app.snyk.io/org/appswicho/project/f46260a3-a995-498d-8dc7-21342470ae95?utm_source=github&utm_medium=fix-pr\",\"type\":\"auto\",\"patch\":[],\"vulns\":[\"SNYK-JS-GLOBPARENT-1016905\"],\"upgrade\":[\"SNYK-JS-GLOBPARENT-1016905\"],\"isBreakingChange\":false,\"env\":\"prod\",\"prType\":\"fix\",\"templateVariants\":[\"updated-fix-title\",\"priorityScore\",\"merge-advice-badge-shown\"],\"priorityScoreList\":[479]})",
"title": "[Snyk] Security upgrade gatsby-plugin-sharp from 2.6.28 to 2.6.31",
"type": "issue"
}
] | 2 | 2 | 8,136 | false | true | 7,756 | false |
google/objax | google | 797,225,193 | 197 | null | [
{
"action": "opened",
"author": "AlexeyKurakin",
"comment_id": null,
"datetime": 1611956535000,
"masked_author": "username_0",
"text": "Typically user has to specify list of variables used by a function:\r\n\r\n```python\r\nmodel = WideResNet(nin=3, nclass=10, depth=wrn_depth, width=wrn_width)\r\n\r\n@objax.Function.with_vars(model.vars())\r\ndef loss_fn(x, label):\r\n logit = model(x, training=True)\r\n return objax.functional.loss.cross_entropy_logits_sparse(\r\n logit, label).mean()\r\n```\r\n\r\nThe goal of the tracing is to automatically detect used variables, so the code could look like following:\r\n```python\r\nmodel = WideResNet(nin=3, nclass=10, depth=wrn_depth, width=wrn_width)\r\n\r\n@objax.Function.auto_vars\r\ndef loss_fn(x, label):\r\n logit = model(x, training=True)\r\n return objax.functional.loss.cross_entropy_logits_sparse(\r\n logit, label).mean()\r\n```",
"title": "Prototype automatic tracing of variables used by a function",
"type": "issue"
},
{
"action": "closed",
"author": "AlexeyKurakin",
"comment_id": null,
"datetime": 1615586614000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 722 | false | false | 722 | false |
serenity-rs/serenity | serenity-rs | 824,017,710 | 1,255 | {
"number": 1255,
"repo": "serenity",
"user_login": "serenity-rs"
} | [
{
"action": "opened",
"author": "FelixMcFelix",
"comment_id": null,
"datetime": 1615150586000,
"masked_author": "username_0",
"text": "## Description\r\n\r\nThis removes a `let _ = ...` which was silently causing some resume failures to become permanent, causing the shard to crash the next time it attempted to use its (closed) WS client. This is replaced with code to trigger a further resume and/or reidentification attempt.\r\n\r\nIn the worst case, this will attempt one further resume before \"promoting\" the resume into a full reidentify. The `ShardQueuer` is then already responsible for repeated full reidentify attempts from that point on.\r\n\r\n## Type of Change\r\n\r\nThis fixes a (rare) bug in the shard runner triggered when TLS or socket initialisation spuriously fails.\r\n\r\n## How Has This Been Tested?\r\n\r\nThis has been tested by (locally) adding in extra `ShardRunnerMessage` types to test and access the new code paths individually. The underlying bug has not been observed (using the new `warn!`/`debug!` invocations); I haven't seen this bug reoccur in over two weeks, in spite of it being a nightly occurrence around that time (most likely due to my hosting provider).",
"title": "Reconnect in event of failed ShardActions",
"type": "issue"
},
{
"action": "created",
"author": "FelixMcFelix",
"comment_id": 800215530,
"datetime": 1615897525000,
"masked_author": "username_0",
"text": "This seems to now catch and correctly act on serial failure states as required. I observed this in my logs this morning:\r\n\r\n```\r\nMar 16 07:09:58.139 DEBUG run: serenity::client::bridge::gateway::shard_runner: [ShardRunner [0, 1]] Reconnecting due to error performing Reconnect(Resume): Tungstenite(Io(Custom { kind: Other, error: Ssl(Error { code: ErrorCode(5), cause: None }, X509VerifyResult { code: 0, error: \"ok\" }) }))\r\nMar 16 07:09:58.145 DEBUG run:resume: serenity::gateway::shard: [Shard [0, 1]] Attempting to resume\r\nMar 16 07:09:58.145 DEBUG run:resume:initialize: serenity::gateway::shard: [Shard [0, 1]] Initializing.\r\nMar 16 07:10:32.111 WARN run: serenity::client::bridge::gateway::shard_runner: [ShardRunner [0, 1]] Resume failed, reidentifying: Tungstenite(Io(Os { code: 32, kind: BrokenPipe, message: \"Broken pipe\" }))\r\nMar 16 07:10:32.115 DEBUG run:request_restart: serenity::client::bridge::gateway::shard_runner: [ShardRunner [0, 1]] Requesting restart\r\nMar 16 07:10:32.119 DEBUG serenity::client::bridge::gateway::shard_queuer: [ShardRunner [0, 1]] Stopping\r\nMar 16 07:10:32.134 INFO start:start_connection{shard_data=[0, 0, 1]}:run:restart{shard_id=ShardId(0)}: serenity::client::bridge::gateway::shard_manager: Restarting shard 0\r\nMar 16 07:10:32.138 INFO start:start_connection{shard_data=[0, 0, 1]}:run:restart{shard_id=ShardId(0)}:shutdown{shard_id=ShardId(0) code=4000}: serenity::client::bridge::gateway::shard_manager: Shutting down shard 0\r\nMar 16 07:10:32.138 DEBUG run: serenity::client::bridge::gateway::shard_queuer: [Shard Queuer] Received to shutdown shard 0 with 4000.\r\nMar 16 07:10:32.139 INFO run:shutdown{shard_id=ShardId(0) code=4000}: serenity::client::bridge::gateway::shard_queuer: Shutting down shard 0\r\nMar 16 07:10:32.139 WARN run:shutdown{shard_id=ShardId(0) code=4000}: serenity::client::bridge::gateway::shard_queuer: Failed to cleanly shutdown shard 0 when sending message to shard runner: TrySendError { 
kind: Disconnected }\r\nMar 16 07:10:37.139 WARN start:start_connection{shard_data=[0, 0, 1]}:run:restart{shard_id=ShardId(0)}:shutdown{shard_id=ShardId(0) code=4000}: serenity::client::bridge::gateway::shard_manager: Failed to cleanly shutdown shard 0, reached timeout: Elapsed(())\r\nMar 16 07:10:37.142 INFO start:start_connection{shard_data=[0, 0, 1]}:run:restart{shard_id=ShardId(0)}:boot{shard_info=[ShardId(0), ShardId(1)]}: serenity::client::bridge::gateway::shard_manager: Telling shard queuer to start shard 0\r\nMar 16 07:10:37.142 DEBUG run: serenity::client::bridge::gateway::shard_queuer: [Shard Queuer] Received to start shard 0 of 1. Mar 16 07:10:37.142 DEBUG run:checked_start{id=0 total=1}: serenity::client::bridge::gateway::shard_queuer: [Shard Queuer] Checked start for shard 0 out of 1\r\nMar 16 07:10:37.518 DEBUG run:checked_start{id=0 total=1}:start{shard_id=0 shard_total=1}:create_native_tls_client{url=Url { scheme: \"wss\", username: \"\", password: None, host: Some(Domain(\"gateway.discord.gg\")), port: None, path: \"/\", query: Some(\"v=8\"), fragment: None }}: tungstenite::handshake::client: Client handshake done.\r\nMar 16 07:10:37.519 INFO run: serenity::client::bridge::gateway::shard_runner: [ShardRunner [0, 1]] Running\r\n```\r\n\r\nThis PR doesn't deal with the transfer of any unhandled `Intermessage`s between `ShardRunner`s after a full restart, which affects songbird somewhat. I think the best method to handle this is to put a (configurable) timeout on those connection attempts for now, as that requires less engineering work.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 4,634 | false | false | 4,634 | false |
rahuldkjain/github-profile-readme-generator | null | 718,537,305 | 258 | null | [
{
"action": "opened",
"author": "BRAVO68WEB",
"comment_id": null,
"datetime": 1602307650000,
"masked_author": "username_0",
"text": "**What can be done?**\r\nAdding Hexo Framework as a skill under \"Static Site Generators\"\r\n\r\n**Hexo**\r\n\r\n\r\n**Solution**\r\nI can add it, can you assign me!",
"title": "Adding HEXO framework as a skill",
"type": "issue"
},
{
"action": "created",
"author": "BRAVO68WEB",
"comment_id": 706491213,
"datetime": 1602307682000,
"masked_author": "username_0",
"text": "@rahuldkjain Can you assign me?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "BRAVO68WEB",
"comment_id": 706493494,
"datetime": 1602309221000,
"masked_author": "username_0",
"text": "Thx",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "BRAVO68WEB",
"comment_id": 706705689,
"datetime": 1602423249000,
"masked_author": "username_0",
"text": "Done",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "BRAVO68WEB",
"comment_id": null,
"datetime": 1602423251000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 5 | 247 | false | false | 247 | false |
fga-eps-mds/2020.2-Projeto-Kokama-Wiki | fga-eps-mds | 874,677,526 | 182 | null | [
{
"action": "opened",
"author": "WelisonR",
"comment_id": null,
"datetime": 1620055854000,
"masked_author": "username_0",
"text": "## Descrição\r\n\r\nA presente issue busca automatizar a recuperação das releases dos repositórios do projeto a fim de disponibilizá-las em um local único.\r\n\r\n## Tarefas\r\n\r\n- [ ] Criar script para automatizar o processo de recuperação dos arquivos do sonarcloud dos outros repositórios;\r\n- [ ] Criar rotina (job) com GitHub Actions que execute semanalmente o script que recupera os arquivos do sonarcloud;\r\n- [ ] Disponibilizar, através da rotina, arquivos de métricas em uma branch específica no github (_metrics_).\r\n\r\n## Critérios de Aceitação\r\n\r\n- Últimos arquivos de métricas disponibilizados na branch _metrics_, separados por pastas de acordo com os microsserviços.\r\n\r\n## Observação\r\n\r\n- Usar a biblioteca `act` para testar as github actions localmente.",
"title": "Automatizar recuperação de arquivos sonarcloud com Github Actions",
"type": "issue"
},
{
"action": "created",
"author": "WelisonR",
"comment_id": 841701238,
"datetime": 1621101432000,
"masked_author": "username_0",
"text": "Decidido pela equipe que não será necessário desenvolver essa atividade, pois o custo seria maior que o benefício. Portanto, a extração das releases será feita de forma manual.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "WelisonR",
"comment_id": null,
"datetime": 1621101432000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 931 | false | false | 931 | false |
flutter/flutter | flutter | 855,133,262 | 80,202 | null | [
{
"action": "opened",
"author": "shifenis",
"comment_id": null,
"datetime": 1618080844000,
"masked_author": "username_0",
"text": "Like the title, it would be awesome if `TextStyle`, allows handling Inner Shadow and not just for the background.",
"title": "Inner shadow on Text",
"type": "issue"
},
{
"action": "created",
"author": "pedromassangocode",
"comment_id": 817563811,
"datetime": 1618212816000,
"masked_author": "username_1",
"text": "Hi @username_0 \r\nCan you please elaborate more?\r\nDoes the code bellow works for you?\r\n\r\n\r\n\r\n\r\n\r\n<details>\r\n<summary>code sample</summary>\r\n\r\n```dart\r\nimport 'package:flutter/cupertino.dart';\r\nimport 'package:flutter/material.dart';\r\n\r\nvoid main() => runApp(MyApp());\r\n\r\nclass MyApp extends StatelessWidget {\r\n @override\r\n Widget build(BuildContext context) {\r\n return MaterialApp(\r\n title: 'Flutter Demo',\r\n home: Scaffold(\r\n body: Center(\r\n child: Text(\r\n \"Forgot Password?\",\r\n style: TextStyle(\r\n shadows: [\r\n Shadow(\r\n color: Colors.green,\r\n offset: Offset(0, -2))\r\n ],\r\n ),\r\n ),\r\n ),\r\n ),\r\n );\r\n }\r\n}\r\n\r\n```\r\n\r\n</details>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "shifenis",
"comment_id": 817596473,
"datetime": 1618215519000,
"masked_author": "username_0",
"text": "Hi @username_1, the code you provide is not really an inner shadow, indeed, it's a background shadow.\r\n\r\n\r\n\r\nThis is the result I would like to achieve.\r\nAt the moment I don't think Flutter can allow this, at least for the `Text` widget.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zanderso",
"comment_id": 915615993,
"datetime": 1631140073000,
"masked_author": "username_2",
"text": "I think we should probably put a little more effort in to determining whether an effect like this is already possible given the primitives that exist today.",
"title": null,
"type": "comment"
}
] | 3 | 4 | 1,555 | false | false | 1,555 | true |