repo (string, length 7–67) | org (string, length 2–32, nullable ⌀) | issue_id (int64, 780k–941M) | issue_number (int64, 1–134k) | pull_request (dict) | events (list) | user_count (int64, 1–77) | event_count (int64, 1–192) | text_size (int64, 0–329k) | bot_issue (bool, 1 class) | modified_by_bot (bool, 2 classes) | text_size_no_bots (int64, 0–279k) | modified_usernames (bool, 2 classes) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
21cmfast/21CMMC | 21cmfast | 488,165,069 | 10 | null | [
{
"action": "opened",
"author": "NGillet",
"comment_id": null,
"datetime": 1567427184000,
"masked_author": "username_0",
"text": "In each of the different likelihood modules already written, the verification of the provided linked core is different. \r\nSome of them are done in the setup, in others, the check is done directly in the compute_likelihood, or even in the reduce data. Also, because of this variety of methods, I do not understand how the required_core works. In most of them, I think it is not used.\r\n\r\nIt would be better to have one fix method to check for the needed cores and stick to it.",
"title": "[Feature Req.] Clarification of CoreModule verification in LikelihoodModules",
"type": "issue"
},
{
"action": "created",
"author": "steven-murray",
"comment_id": 527529689,
"datetime": 1567527206000,
"masked_author": "username_1",
"text": "Absolutely agree. I think I can fix this pretty easily -- hopefully tomorrow!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "steven-murray",
"comment_id": 533619182,
"datetime": 1568996162000,
"masked_author": "username_1",
"text": "OK, so it's been more than one day. I checked through, and I think the following is happening:\r\n\r\nAll `Core`s and `Likelihood`s are checking their `required_cores` during `setup()`. None of the `Core`s do anything else about checking cores. \r\n\r\nHowever some `Likelihood`s don't actually set their `required_cores` (so the check always works), but then raise errors elsewhere by, for example, checking the contents of `_cores`. The reason for this is that in those cases, the requirement is that the core by _either_ one or the other, which the default mechanism for checking cores doesn't recognize. So that is just a workaround. \r\n\r\nI guess it would be better to make the checking mechanism inherently handle this case, rather than using a workaround. I'll try to do that.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "steven-murray",
"comment_id": null,
"datetime": 1572382991000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 1,324 | false | false | 1,324 | false |
Stairdeck/scpOnlineDiscordBot | null | 602,546,207 | 5 | null | [
{
"action": "opened",
"author": "Oyplap",
"comment_id": null,
"datetime": 1587242158000,
"masked_author": "username_0",
"text": "Northwood released a new lobbylist api, could you maby update this?\r\n\r\nThanks in advance",
"title": "Doesn't work anymore",
"type": "issue"
},
{
"action": "created",
"author": "Stairdeck",
"comment_id": 616182236,
"datetime": 1587315445000,
"masked_author": "username_1",
"text": "Updated.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Stairdeck",
"comment_id": null,
"datetime": 1587315446000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 96 | false | false | 96 | false |
raisedadead/portfolio | null | 700,792,365 | 318 | {
"number": 318,
"repo": "portfolio",
"user_login": "raisedadead"
} | [
{
"action": "created",
"author": "raisedadead",
"comment_id": 691859834,
"datetime": 1600066903000,
"masked_author": "username_0",
"text": "@dependabot rebase",
"title": null,
"type": "comment"
}
] | 2 | 2 | 6,197 | false | true | 18 | false |
canonical/bundle-kubeflow | canonical | 746,112,865 | 268 | null | [
{
"action": "opened",
"author": "jhobbs",
"comment_id": null,
"datetime": 1605741097000,
"masked_author": "username_0",
"text": "This should work with rbac on without any extra role config.",
"title": "rbac has to be turned off, or the istio-ingressgateway-operator role has to be manually modified",
"type": "issue"
},
{
"action": "created",
"author": "Deadleg",
"comment_id": 837705980,
"datetime": 1620702557000,
"masked_author": "username_1",
"text": "Adding a bit more detail; the operator logs this error:\r\n\r\n```\r\nReason: Forbidden\r\nHTTP response headers: HTTPHeaderDict({'Audit-Id': '12770c51-a979-41d1-b453-1bda3f9013aa', 'Cache-Control': 'no-cache, private', 'Content-Type': 'application/json', 'X-Content-Type-Options': 'nosniff', 'Date': 'Tue, 11 May 2021 02:27:20 GMT', 'Content-Length': '374'})\r\nHTTP response body: {\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"configmaps \\\"istio-ca-root-cert\\\" is forbidden: User \\\"system:serviceaccount:kubeflow:istio-ingressgateway-operator\\\" cannot get resource \\\"configmaps\\\" in API group \\\"\\\" in the namespace \\\"kubeflow\\\"\",\"reason\":\"Forbidden\",\"details\":{\"name\":\"istio-ca-root-cert\",\"kind\":\"configmaps\"},\"code\":403}\r\n```\r\n\r\nWhich requires the `istio-ingressgateway-operator` role to be updated with `get` permissions for `configmaps`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DomFleischmann",
"comment_id": 837946631,
"datetime": 1620716087000,
"masked_author": "username_2",
"text": "On which version are you testing this?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DomFleischmann",
"comment_id": 839557633,
"datetime": 1620806720000,
"masked_author": "username_2",
"text": "@username_1 okay so, the issue that @username_0 filed should be resolved by now, I think your issue might be specific to EKS as we don't actively test against EKS. If you have the option of deploying on some other kubernetes like AKS, Charmed Kubernetes or Microk8s you will probably get better results.\r\n\r\nEitherway a new issue for this bug on EKS should probably be opened.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DomFleischmann",
"comment_id": 868392483,
"datetime": 1624615890000,
"masked_author": "username_2",
"text": "Currently with RBAC the following manual step is required:\r\n`kubectl patch role -n kubeflow istio-ingressgateway-operator -p '{\"apiVersion\":\"rbac.authorization.k8s.io/v1\",\"kind\":\"Role\",\"metadata\":{\"name\":\"istio-ingressgateway-operator\"},\"rules\":[{\"apiGroups\":[\"*\"],\"resources\":[\"*\"],\"verbs\":[\"*\"]}]}'`\r\n\r\nThis is documented in our [official documentation](https://charmed-kubeflow.io/docs/install)",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "DomFleischmann",
"comment_id": null,
"datetime": 1633959523000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 6 | 1,728 | false | false | 1,728 | true |
gitahead/gitahead | gitahead | 490,743,625 | 265 | {
"number": 265,
"repo": "gitahead",
"user_login": "gitahead"
} | [
{
"action": "opened",
"author": "Gr3q",
"comment_id": null,
"datetime": 1567942638000,
"masked_author": "username_0",
"text": "Now, this PR is to get feedback and help, because I kinda suck working with C++ and never used Qt before. Comes from my feature request https://github.com/gitahead/gitahead/issues/261\r\n\r\nThe Issues:\r\n* Flag to trigger this mode is not linked to the added option (I need advice on how to do this)\r\n* I want to use the git like relative time string (26 minutes ago) as timestamp, and I don't know if I can pull that from git or I have to build it myself.\r\n* I only want to display the date if it is different than the previous string, but I don't know how/where to store the previous value to make this work.\r\n* Badge needs serious refactoring. I split the whole function into 2 calls, based on alignment, but it's a lot of duplicate code, and I am unsure how to refactor it nicely because it is a static function.\r\n\r\nI excluded the author name on purpose, because it does not provide any reference or context (only if you have to look out for that One person's commits in your team...) and takes up way too much space.\r\nI could include the Initials, but I'm already facing alignment issues on the right side so I don't really know where could I put it.\r\n\r\nWith Initials:\r\n\r\n\r\n\r\nCurrent State:\r\n",
"title": "Add Compact Commit List as an option",
"type": "issue"
},
{
"action": "created",
"author": "dgalli1",
"comment_id": 529195047,
"datetime": 1567943157000,
"masked_author": "username_1",
"text": "Would it be possible to do something like gitkraken for the authors? \r\nThey generate a image for every user based on their git profile.\r\n\r\n\r\nThis doesn't take up much space. But helps a lot with looking up commits from a certain person. And it looks way slicker then initals. I don't think it has to necessary be implementet at the history as gitkraken is doing i think next to the timestamp would be a improvment aswell",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 530428237,
"datetime": 1568214989000,
"masked_author": "username_0",
"text": "I believe it is possible to do this, although it is a design choice what's not for me to decide.\r\n\r\nAt the moment I'll try to focus to fix the issues with the existing features.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "CFlix",
"comment_id": 540173645,
"datetime": 1570652277000,
"masked_author": "username_2",
"text": "Thank you so much @username_0! This looks great and is an absoloutely necessary feature. Hope the PR gets merged soon to end up in the next release!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 551805426,
"datetime": 1573220226000,
"masked_author": "username_0",
"text": "This PR is actually ready for merge, all the extra features can be implemented later (They turned out to be quite challenging and I do not have too much time to spend on this feature anymore)\r\n\r\nIf there is anything I should change/fix/additional testing please let me know.\r\n\r\nManual tests made in Linux:\r\nSwitching between Compact and Normal mode for an opened repo\r\nSwitching between Repos With different modes set\r\nOpening new repo when an opened one was set to compact mode.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kas-luthor",
"comment_id": 552981851,
"datetime": 1573577461000,
"masked_author": "username_3",
"text": "The width of the short ref IDs doesn't seem to fit on Windows\r\nMaybe a broader character should be used for measurements?\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 553580520,
"datetime": 1573675801000,
"masked_author": "username_0",
"text": "I wanted the commit reference width to be uniform (like a column), so the rest to its left are aligned correctly.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 554934039,
"datetime": 1574069773000,
"masked_author": "username_0",
"text": "@username_3 Does the problem still persists? I will try to build it on Windows _(again, had problems with openssl before)_ as well today to check if the commit reference displays correctly.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kas-luthor",
"comment_id": 555010851,
"datetime": 1574083071000,
"masked_author": "username_3",
"text": "The width issue doesn't appear anymore",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hackhaslam",
"comment_id": 555298008,
"datetime": 1574129192000,
"masked_author": "username_4",
"text": "It still doesn't quite compute the right text width for the id, at least on macOS:\r\n<img width=\"276\" alt=\"Screen Shot 2019-11-18 at 7 03 50 PM\" src=\"https://user-images.githubusercontent.com/6586470/69109959-76734900-0a36-11ea-97db-906207b6040f.png\">",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 555439689,
"datetime": 1574159137000,
"masked_author": "username_0",
"text": "It seems I can't get away with simple hacks... Fine, I'll do it properly this time.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kas-luthor",
"comment_id": 555465295,
"datetime": 1574162844000,
"masked_author": "username_3",
"text": "Also: I think it would be nice to have an option to set compact mode as a default\r\nShould I implement this or would you rather do it yourself?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 555470734,
"datetime": 1574163750000,
"masked_author": "username_0",
"text": "@username_3 That's something that was on my list to do, you can do it if you have the time. Or if you tell me which section and group is the best place for this setting in Options I can do it too.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kas-luthor",
"comment_id": 555471610,
"datetime": 1574163895000,
"masked_author": "username_3",
"text": "I'd put it in the \"Window\" tab\r\nI'll be able to implement it in ~6h when I get home",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 555473221,
"datetime": 1574164174000,
"masked_author": "username_0",
"text": "Ok, thank you.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 555479736,
"datetime": 1574165384000,
"masked_author": "username_0",
"text": "Commit ID width calculation should be fixed now (it covers most cases). Widest characters are either 0,4,6,9,d or b, so I'm checking for those.\r\n\r\nOptionally, I can use monospace font just for the commit ID to eliminate this problem, like this:\r\n\r\n\r\n\r\nAlthough It can look weird if people use wildly different monospace font compared to the default. For example for me the font baseline is different.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hackhaslam",
"comment_id": 556114379,
"datetime": 1574269696000,
"masked_author": "username_4",
"text": "I'm reviewing this now.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kas-luthor",
"comment_id": 556158688,
"datetime": 1574271786000,
"masked_author": "username_3",
"text": "@username_4 \r\nMy PR username_0/gitahead#2 is still open",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hackhaslam",
"comment_id": 556196956,
"datetime": 1574273710000,
"masked_author": "username_4",
"text": "@username_3 I will look at it.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 556197494,
"datetime": 1574273738000,
"masked_author": "username_0",
"text": "@username_3 thanks its merged now, I don't know why I don't get notifications about my fork.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 556297800,
"datetime": 1574278890000,
"masked_author": "username_0",
"text": "@username_4 feel free to pick it apart, I think there are a few things what I implemented at the wrong places/incorrectly (probably in CommitList.cpp) what I'm happy to fix if you point them out.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hackhaslam",
"comment_id": 556541403,
"datetime": 1574292105000,
"masked_author": "username_4",
"text": "I tried it without duplicating the date in the way that I think you were trying to do. It looks weird to me. It would probably look better if the space were left blank instead of allowing the message to grow into that space. However, the width of the date string is variable, so I wouldn't really know how much space to leave. I'm going to merge it without this for now.\r\n\r\n<img width=\"419\" alt=\"Screen Shot 2019-11-20 at 4 15 23 PM\" src=\"https://user-images.githubusercontent.com/6586470/69286673-11902e00-0bb1-11ea-9bee-ebedfd9912be.png\">",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hackhaslam",
"comment_id": 556555514,
"datetime": 1574294320000,
"masked_author": "username_4",
"text": "I merged this manually with a couple of additional commits. I also didn't use the default setting commit from @username_3. Instead, I just changed the existing setting to an application global setting instead of a repo-specific one. I don't think there needs to be a separate repo-specific setting and global default setting, but I can revisit that if you guys think it's important.\r\n\r\n@username_0 @username_3 I'm going to start including contributor names for specific changes in the change log. Can you let me know your preferred name (real name or handle) if you want it appear?\r\n\r\nAlso, @username_3 I'm trying to review some of your pull requests now. I can implement the macOS-specific portions of the ones that need them.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Gr3q",
"comment_id": 556557443,
"datetime": 1574294729000,
"masked_author": "username_0",
"text": "I'm happy with the changes. My name is Attila Greguss, you can use it.\n\nThanks",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kas-luthor",
"comment_id": 556944135,
"datetime": 1574318580000,
"masked_author": "username_3",
"text": "@username_4 \r\nJust \"Kas\" will be fine, thanks",
"title": null,
"type": "comment"
}
] | 5 | 25 | 6,368 | false | false | 6,368 | true |
Jerry2048/github-slideshow | null | 663,989,318 | 3 | {
"number": 3,
"repo": "github-slideshow",
"user_login": "Jerry2048"
} | [
{
"action": "opened",
"author": "Jerry2048",
"comment_id": null,
"datetime": 1595445214000,
"masked_author": "username_0",
"text": "Added a slide that links back to the first slide",
"title": "Add second slide",
"type": "issue"
}
] | 2 | 2 | 1,813 | false | true | 48 | false |
BW-secret-family-recipes-3/front-end | BW-secret-family-recipes-3 | 708,296,533 | 44 | {
"number": 44,
"repo": "front-end",
"user_login": "BW-secret-family-recipes-3"
} | [
{
"action": "opened",
"author": "suigenerous",
"comment_id": null,
"datetime": 1600963574000,
"masked_author": "username_0",
"text": "",
"title": "lets be careful with this one",
"type": "issue"
}
] | 2 | 2 | 595 | false | true | 0 | false |
SwissDataScienceCenter/renku-graph | SwissDataScienceCenter | 702,739,894 | 407 | null | [
{
"action": "opened",
"author": "jachro",
"comment_id": null,
"datetime": 1600260788000,
"masked_author": "username_0",
"text": "It looks like during the workflow changes on the CLI, we dropped the idea of storing Dataset's `sameAs` as a string value when a dataset was imported from Zenodo or Dataverse. KG still expects that in such cases we'd have strings. With the current approach, all the `sameAs` properties are URIs.",
"title": "Datasets imported from external providers have sameAs with URIs not String values",
"type": "issue"
},
{
"action": "closed",
"author": "jachro",
"comment_id": null,
"datetime": 1600264171000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 295 | false | false | 295 | false |
apache/poi | apache | 346,778,240 | 119 | {
"number": 119,
"repo": "poi",
"user_login": "apache"
} | [
{
"action": "opened",
"author": "drmacro",
"comment_id": null,
"datetime": 1533159329000,
"masked_author": "username_0",
"text": "This adds getters and setters for additional Run formatting properties. My goal was to implement the properties likely to be used in generating DOCX from non-trivial HTML documents. The only property I couldn't figure out how to set was underline color, which I can come back to.\r\n\r\nI also fixed a bug in isHighlight() that returned a false positive if highlight was not set at all (the test case was flawed and didn't catch this case).\r\n\r\nI used the built-in OOXML enums so that I could set values using the same strings as specified in the OO XML spec rather than having to translate them to e.g., uppercase keywords as used with the custom enumerations (this is why I implemented setVerticalAlignment() as a replacement for setSubscript() [which was a bad method name anyway.]\r\n\r\nI did not implement versions of the methods that involve enumerations to also allow direct specification of the enumeration values themselves--I can if there's a need for it, but for most uses it seems more likely that string values will be the input, e.g., from parsing of XML or HTML or whatever.",
"title": "Implemented a number of new set and get methods for useful formatting…",
"type": "issue"
},
{
"action": "created",
"author": "asfgit",
"comment_id": 409732536,
"datetime": 1533159483000,
"masked_author": "username_1",
"text": "Can one of the admins verify this patch?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "drmacro",
"comment_id": 409931663,
"datetime": 1533217584000,
"masked_author": "username_0",
"text": "I will attempt to implement Mark's idea about run property access.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pjfanning",
"comment_id": 410051326,
"datetime": 1533240364000,
"masked_author": "username_2",
"text": "@username_0 could you rebase due to PR 120 being merged?",
"title": null,
"type": "comment"
}
] | 3 | 4 | 1,240 | false | false | 1,240 | true |
ponylang/ponylang.github.io | ponylang | 300,055,264 | 252 | null | [
{
"action": "opened",
"author": "mfelsche",
"comment_id": null,
"datetime": 1519588368000,
"masked_author": "username_0",
"text": "",
"title": "Last Week in Pony - March 04, 2018",
"type": "issue"
},
{
"action": "created",
"author": "SeanTAllen",
"comment_id": 369705009,
"datetime": 1519932909000,
"masked_author": "username_1",
"text": "Audio from the February 28th Pony development sync:\r\n\r\nhttps://pony.groups.io/g/dev/files/Pony%20Sync/2018_02_28",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "SeanTAllen",
"comment_id": 369991619,
"datetime": 1520011701000,
"masked_author": "username_1",
"text": "MQTT client in Pony => https://github.com/epiceric/pony-mqtt",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mfelsche",
"comment_id": 369999871,
"datetime": 1520013425000,
"masked_author": "username_0",
"text": "Has already been announced here https://www.ponylang.org/blog/2018/02/last-week-in-pony---february-11-2018/",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mfelsche",
"comment_id": null,
"datetime": 1520195180000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 5 | 280 | false | false | 280 | false |
flowscripter/ts-example-cli | flowscripter | 640,591,986 | 148 | {
"number": 148,
"repo": "ts-example-cli",
"user_login": "flowscripter"
} | [
{
"action": "created",
"author": "vectronic",
"comment_id": 649186840,
"datetime": 1593054417000,
"masked_author": "username_0",
"text": ":tada: This PR is included in version 1.1.11 :tada:\n\nThe release is available on:\n- [npm package (@latest dist-tag)](https://www.npmjs.com/package/@flowscripter/ts-example-cli/v/1.1.11)\n- [GitHub release](https://github.com/flowscripter/ts-example-cli/releases/tag/v1.1.11)\n\nYour **[semantic-release](https://github.com/semantic-release/semantic-release)** bot :package::rocket:",
"title": null,
"type": "comment"
}
] | 2 | 2 | 19,589 | false | true | 378 | false |
apache/hudi | apache | 681,128,292 | 1,979 | null | [
{
"action": "opened",
"author": "hughfdjackson",
"comment_id": null,
"datetime": 1597764103000,
"masked_author": "username_0",
"text": "**Describe the problem you faced**\r\n\r\nMy team are interested in writing to Hudi tables using a repeated batch process that often upserts data that's identical to what's already there. For instance, we may be: \r\n\r\n- recalculating # of times a particular set of event has occurred\r\n- re-running a query over the last week of data, to include potentially late arriving data. \r\n\r\nWe also have some consumers that want to consume these tables incrementally (to ingest the latest results into local databases, or monitor the changes). Ideally, these consumers would only see the 1% of records that have changed, rather than all records involved in the upsert. \r\n\r\nHowever, in our testing, it seems like the incremental query returns _all_ records that were involved in the upsert, even if they were overwriting identical data. \r\n\r\n(As far as I can tell, this happens here: https://github.com/apache/hudi/blob/release-0.5.3/hudi-client/src/main/java/org/apache/hudi/io/HoodieMergeHandle.java#L238-L244, no matter which `PAYLOAD_CLASS_OPT_KEY` class is used).\r\n\r\n**To Reproduce**\r\n\r\nSteps to reproduce the behavior:\r\n\r\n1. clone hudi git repo, checkout `release-0.5.3-rc2` and run `mvn clean package -DskipTests -DskipITs`\r\n2. Copy `packaging/hudi-spark-bundle/target/hudi-spark-bundle_2.11-0.5.3-rc2.jar` to EMR master node\r\n3. Run the following spark shell on master, with the command: `spark-shell --conf \"spark.serializer=org.apache.spark.serializer.KryoSerializer\" --conf \"spark.sql.hive.convertMetastoreParquet=false\" --jars hudi-spark-bundle_2.11-0.5.3-rc2.jar,/usr/lib/spark/external/lib/spark-avro.jar -i spark-shell-script`\r\n\r\nwhere `spark-shell-script` contents is:\r\n```scala\r\nimport org.apache.hudi.QuickstartUtils._\r\nimport scala.collection.JavaConversions._\r\nimport org.apache.spark.sql.SaveMode._\r\nimport org.apache.spark.sql.SaveMode\r\nimport org.apache.hudi.DataSourceReadOptions._\r\nimport org.apache.hudi.DataSourceWriteOptions._\r\nimport org.apache.hudi.config.HoodieWriteConfig._\r\nimport org.apache.spark.sql.DataFrame\r\nimport org.apache.hudi.common.table.HoodieTableMetaClient\r\nimport org.apache.hudi.table.HoodieTable\r\nimport org.apache.hudi.config.HoodieWriteConfig\r\n \r\n// Helper functions\r\nval basePath = \"s3://{s3BucketNameAndPrefixPath}\"\r\nval tableName = \"hudi_incremental_read_test\"\r\ndef write(df: DataFrame, saveMode: SaveMode = Append) = df.write.format(\"hudi\")\r\n .option(PRECOMBINE_FIELD_OPT_KEY, \"ts\")\r\n .option(RECORDKEY_FIELD_OPT_KEY, \"uuid\")\r\n .option(PARTITIONPATH_FIELD_OPT_KEY, \"partitionpath\")\r\n .option(\"hoodie.consistency.check.enabled\", \"true\")\r\n .option(TABLE_NAME, tableName)\r\n .mode(saveMode)\r\n .save(basePath)\r\ndef incrementalRead(beginInstant: String) = { \r\n println(s\"READING FROM $beginInstant\") \r\n spark.read\r\n .format(\"hudi\")\r\n .option(QUERY_TYPE_OPT_KEY, QUERY_TYPE_INCREMENTAL_OPT_VAL)\r\n .option(BEGIN_INSTANTTIME_OPT_KEY, beginInstant)\r\n .load(basePath)\r\n} \r\ndef latestCommitInstant() = { \r\n val metaClient = new HoodieTableMetaClient(spark.sparkContext.hadoopConfiguration, basePath, true)\r\n val hoodieTable = HoodieTable.getHoodieTable(metaClient, HoodieWriteConfig.newBuilder().withPath(basePath).build(), spark.sparkContext)\r\n \r\n hoodieTable.getMetaClient.getCommitTimeline.filterCompletedInstants().lastInstant.get.getTimestamp\r\n}\r\n\r\ndef justBefore(commitTime: String) = (commitTime.toLong - 1).toString\r\nval dataGen = new DataGenerator\r\nval inserts = convertToStringList(dataGen.generateInserts(10))\r\nval df = spark.read.json(spark.sparkContext.parallelize(inserts, 2))\r\n\r\nwrite(df, saveMode=Overwrite)\r\n\r\nprintln(\"\"\"\r\n----------- INCREMENTAL READ -------\r\n\"\"\")\r\nprintln(\"The whole table is new, so I'm expecting all 10 rows to be returned on incremental read\")\r\nincrementalRead(justBefore(latestCommitInstant)).show()\r\n\r\n// generate an update for a single row\r\nval updates = convertToStringList(dataGen.generateUpdates(1))\r\nval updatesDF = spark.read.json(spark.sparkContext.parallelize(updates, 2))\r\n\r\nprintln(\"\"\"\r\n----------- INCREMENTAL READ -------\r\n\"\"\")\r\nprintln(\"Now we're updating a row, we expect to see the updated row only on incremental read, which we do\")\r\nwrite(updatesDF)\r\nincrementalRead(justBefore(latestCommitInstant)).show()\r\n\r\nprintln(\"\"\"\r\n----------- INCREMENTAL READ -------\r\n\"\"\")\r\nprintln(\"Re-upserting the same row twice causes it to be 'emitted' twice to the incremental reader, even though the contents of the second reading are identical from the first (metadata aside)\")\r\nwrite(updatesDF)\r\nincrementalRead(justBefore(latestCommitInstant)).show()\r\n```\r\n\r\nThat results in: \r\n\r\n```\r\n----------- INCREMENTAL READ -------\r\nThe whole table is new, so I'm expecting all 10 rows to be returned on incremental read\r\nREADING FROM 20200818091617\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+-------------------+-------------------+----------+-------------------+-------------------+------------------+--------------------+---------+---+--------------------+\r\n|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name| begin_lat| begin_lon| driver| end_lat| end_lon| fare| partitionpath| rider| ts| uuid|\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+-------------------+-------------------+----------+-------------------+-------------------+------------------+--------------------+---------+---+--------------------+\r\n| 20200818091618| 20200818091618_1_1|ecde6618-0cbc-4b6...| americas/united_s...|3e9b3e64-3895-46a...|0.21624150367601136|0.14285051259466197|driver-213| 0.5890949624813784| 0.0966823831927115| 93.56018115236618|americas/united_s...|rider-213|0.0|ecde6618-0cbc-4b6...|\r\n| 20200818091618| 20200818091618_1_2|c9a45eda-fe53-480...| americas/united_s...|3e9b3e64-3895-46a...| 0.8742041526408587| 0.7528268153249502|driver-213| 0.9197827128888302| 0.362464770874404|19.179139106643607|americas/united_s...|rider-213|0.0|c9a45eda-fe53-480...|\r\n| 20200818091618| 20200818091618_1_3|35808b31-2d1e-474...| americas/united_s...|3e9b3e64-3895-46a...| 0.5731835407930634| 0.4923479652912024|driver-213|0.08988581780930216|0.42520899698713666| 64.27696295884016|americas/united_s...|rider-213|0.0|35808b31-2d1e-474...|\r\n| 20200818091618| 20200818091618_1_4|67e1c9d5-a3c0-4f7...| americas/united_s...|3e9b3e64-3895-46a...|0.11488393157088261| 0.6273212202489661|driver-213| 0.7454678537511295| 0.3954939864908973| 27.79478688582596|americas/united_s...|rider-213|0.0|67e1c9d5-a3c0-4f7...|\r\n| 20200818091618| 20200818091618_1_5|8fdf91c8-b0ca-46c...| americas/united_s...|3e9b3e64-3895-46a...| 0.1856488085068272| 0.9694586417848392|driver-213|0.38186367037201974|0.25252652214479043| 33.92216483948643|americas/united_s...|rider-213|0.0|8fdf91c8-b0ca-46c...|\r\n| 20200818091618| 20200818091618_0_1|2efbfbf1-aa1f-40f...| americas/brazil/s...|a71d09b8-7cc8-408...| 0.4726905879569653|0.46157858450465483|driver-213| 0.754803407008858| 0.9671159942018241|34.158284716382845|americas/brazil/s...|rider-213|0.0|2efbfbf1-aa1f-40f...|\r\n| 20200818091618| 20200818091618_0_2|2bbebad3-1a3c-4f1...| americas/brazil/s...|a71d09b8-7cc8-408...| 0.0750588760043035|0.03844104444445928|driver-213|0.04376353354538354| 0.6346040067610669| 66.62084366450246|americas/brazil/s...|rider-213|0.0|2bbebad3-1a3c-4f1...|\r\n| 20200818091618| 20200818091618_0_3|2c3d179c-899f-42f...| americas/brazil/s...|a71d09b8-7cc8-408...| 0.6100070562136587| 0.8779402295427752|driver-213| 0.3407870505929602| 0.5030798142293655| 43.4923811219014|americas/brazil/s...|rider-213|0.0|2c3d179c-899f-42f...|\r\n| 20200818091618| 20200818091618_2_1|3c9add87-8347-41d...| asia/india/chennai|df2d7f47-0d10-43b...| 0.651058505660742| 0.8192868687714224|driver-213|0.20714896002914462|0.06224031095826987| 41.06290929046368| asia/india/chennai|rider-213|0.0|3c9add87-8347-41d...|\r\n| 20200818091618| 20200818091618_2_2|8cd8ff41-791e-43a...| asia/india/chennai|df2d7f47-0d10-43b...| 0.40613510977307| 0.5644092139040959|driver-213| 0.798706304941517|0.02698359227182834|17.851135255091155| asia/india/chennai|rider-213|0.0|8cd8ff41-791e-43a...|\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+-------------------+-------------------+----------+-------------------+-------------------+------------------+--------------------+---------+---+--------------------+\r\n----------- INCREMENTAL READ -------\r\nNow we're updating a row, we expect to see the updated row only on incremental read, which we do\r\n20/08/18 09:17:36 WARN IncrementalTimelineSyncFileSystemView: Incremental Sync of timeline is turned off or deemed unsafe. Will revert to full syncing\r\nREADING FROM 20200818091705\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+------------------+----------+------------------+------------------+------------------+--------------------+---------+---+--------------------+\r\n|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name| begin_lat| begin_lon| driver| end_lat| end_lon| fare| partitionpath| rider| ts| uuid|\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+------------------+----------+------------------+------------------+------------------+--------------------+---------+---+--------------------+\r\n| 20200818091706| 20200818091706_0_3|35808b31-2d1e-474...| americas/united_s...|3e9b3e64-3895-46a...|0.7340133901254792|0.5142184937933181|driver-284|0.7814655558162802|0.6592596683641996|49.527694252432056|americas/united_s...|rider-284|0.0|35808b31-2d1e-474...|\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+------------------+----------+------------------+------------------+------------------+--------------------+---------+---+--------------------+\r\n----------- INCREMENTAL READ -------\r\nRe-upserting the same row twice causes it to be 'emitted' twice to the incremental reader, even though the contents of the second reading are identical from the first (metadata aside)\r\n20/08/18 09:18:04 WARN IncrementalTimelineSyncFileSystemView: Incremental Sync of timeline is turned off or deemed unsafe. Will revert to full syncing\r\nREADING FROM 20200818091736\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+------------------+----------+------------------+------------------+------------------+--------------------+---------+---+--------------------+\r\n|_hoodie_commit_time|_hoodie_commit_seqno| _hoodie_record_key|_hoodie_partition_path| _hoodie_file_name| begin_lat| begin_lon| driver| end_lat| end_lon| fare| partitionpath| rider| ts| uuid|\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+------------------+----------+------------------+------------------+------------------+--------------------+---------+---+--------------------+\r\n| 20200818091737| 20200818091737_0_4|35808b31-2d1e-474...| americas/united_s...|3e9b3e64-3895-46a...|0.7340133901254792|0.5142184937933181|driver-284|0.7814655558162802|0.6592596683641996|49.527694252432056|americas/united_s...|rider-284|0.0|35808b31-2d1e-474...|\r\n+-------------------+--------------------+--------------------+----------------------+--------------------+------------------+------------------+----------+------------------+------------------+------------------+--------------------+---------+---+--------------------+\r\n```\r\n\r\n**Expected behavior**\r\n\r\nIdeally (in our use case), upserting a row whose contents is identical doesn't cause an incremental reader to read the data again. \r\n\r\n**Environment Description**\r\n\r\n* Hudi version : 0.5.3-rc2, built from source\r\n\r\n* Spark version : 2.4.4 (Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, 1.8.0_252)\r\n\r\n* Hive version : 2.3.6\r\n\r\n* Hadoop version : 2.8.5-amzn-5\r\n\r\n* Storage (HDFS/S3/GCS..) : S3\r\n\r\n* Running on Docker? (yes/no) : no\r\n\r\n* EMR Version : emr-5.29.0",
"title": "[SUPPORT]: Incremental read returns all upserted rows, even if no material change has occurred.",
"type": "issue"
},
{
"action": "created",
"author": "bvaradar",
"comment_id": 676490788,
"datetime": 1597850308000,
"masked_author": "username_1",
"text": "One option to make this to work currently is to add columns that gets updated also as part of the composite record key. We can use key uniqueness constraint of Hudi to achieve the result. This way, you have an option to filter out duplicates first and then upsert rest of the records in the batch.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hughfdjackson",
"comment_id": 679017460,
"datetime": 1598261406000,
"masked_author": "username_0",
"text": "Hi @username_1 - thanks for the reply! And for the suggestion.\r\n\r\nIn our use case, we're interested in both incremental read of material changes, and in using the Hudi table with regular snapshot queries. I would expect 30-50% incremental reads, and 50-70% snapshot queries.\r\n\r\nIf I'm understanding correctly, your suggestion would essentially lead to an event log of all material changes to an entity. If you do a snapshot query against that data, you'd end up with lots of duplicates, so each query would need to include de-duplication to reproduce the a materialised view with the latest data for each entity.\r\n\r\nIs that right?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hughfdjackson",
"comment_id": 679026628,
"datetime": 1598262387000,
"masked_author": "username_0",
"text": "@username_1 - As a follow-up question, your reply confirms that what we're looking for (ideally) isn't a Hudi feature currently. Is it something you might be interested in supporting?\r\n\r\nIn many use cases, the behaviour would likely be identical to the current - for snapshot queries, or for incrementally reading tables where the writer ensures only material changes* are written (e.g. some stream processing, or insert-only batch processes). In the remaining use-cases like ours, it would cut back on a lot of noise + processing. \r\n\r\nIf so, I can talk to my team about contributing towards the project, since it would be valuable to us. \r\n\r\n----\r\n\r\n\\* I'm using 'material changes' here to describe an upsert that impacts on the non-`_hoodie` columns. Either a deletion, or a change in value to one of those columns.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bvaradar",
"comment_id": 679522323,
"datetime": 1598326376000,
"masked_author": "username_1",
"text": "@username_0 : In general getting incremental read to discard duplicates is not possible for MOR table types as we defer the merging of records to compaction.\r\n\r\nI was thinking about alternate ways to achieve your use-case for COW table by using an application level boolean flag. Let me know if this makes sense:\r\n\r\n1. Introduce additional boolean column \"changed\". Default Value is false.\r\n2. Have your own implementation of HoodieRecordPayload plugged-in.\r\n3a In HoodieRecordPayload.getInsertValue(), return an avro record with changed = true. This function is called first time when the new record is inserted.\r\n3(b) In HoodieRecordPayload.combineAndGetUpdateValue(), if you determine, there is no material change, set changed = false else set it to true.\r\n\r\nIn your incremental query, add the filter changed = true to filter out those without material changes ?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hughfdjackson",
"comment_id": 680685136,
"datetime": 1598423182000,
"masked_author": "username_0",
"text": "That does make sense, although I think a boolean column may lead to missing changes if the incremental read spans two or more commits to the same row. I'm spiking a variation on that suggesting with my team, wherein: \r\n\r\n1. Introduce a 'last_updated_timestamp', default to null (i.e. the update was in this commit)\r\n2. Have your own implementation of HoodieRecordPayload plugged-in.\r\n3. a. In HoodieRecordPayload.getInsertValue(), return an avro record with last_updated_timestamp = null.*\r\n3. b. In HoodieRecordPayload.combineAndGetUpdateValue(), if you determine, there is no material change, set last_updated_timestamp to that of the old record (if it exists) _or_ to the old record's commit_time. \r\n\r\nIn the incremental query, we're filtering for `null` (which indicates that one of the commits within the timeline last updated the record) or for `last_updated_timestamp` within the beginInstant and endInstant bounds. \r\n\r\nWe've not tested it extensively, but it looks like a promising workaround so far. \r\n\r\n---\r\n\r\n\\* It'd be 'cleaner' to set this equal to the commit time of the write, but in our HoodieRecordPayload class, that's not available unfortunately. The 'null means insert' + special case handling in HoodieRecordPayload.combineAndGetUpdateValue() is a work-around for that.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bvaradar",
"comment_id": 680998998,
"datetime": 1598460638000,
"masked_author": "username_1",
"text": "@username_0 : Good point about incrementally reading multiple commits. The variation you suggested seems to make sense.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bvaradar",
"comment_id": 682061419,
"datetime": 1598546364000,
"masked_author": "username_1",
"text": "Will close the ticket for now. Please reopen if we need to discuss more on this topic.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "bvaradar",
"comment_id": null,
"datetime": 1598546365000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "hughfdjackson",
"comment_id": 688168867,
"datetime": 1599468207000,
"masked_author": "username_0",
"text": "@username_1 - thanks for your help. I think we're going to try the above approach, but it's something we might return to later. \r\n\r\nClosing the issue for now sounds like a good idea.",
"title": null,
"type": "comment"
}
] | 2 | 10 | 16,846 | false | false | 16,846 | true |
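The last-updated-timestamp workaround agreed on in the thread above can be sketched outside of Hudi. The Java below is illustrative only — it is not the real `HoodieRecordPayload` interface, and every name in it (`MaterialChangeMerge`, `materiallyChanged`, `combine`, the map-based records) is hypothetical. It only models the merge rule: compare the non-`_hoodie` columns of the old and new record, and carry the old record's timestamp forward when nothing material changed.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

// Hypothetical sketch of the "material change" merge rule discussed above.
// Records are plain column->value maps; this is NOT the Hudi HoodieRecordPayload API.
class MaterialChangeMerge {
    // Columns Hudi adds itself; they never count as a material change.
    private static boolean isMeta(String col) {
        return col.startsWith("_hoodie") || col.equals("last_updated_timestamp");
    }

    static boolean materiallyChanged(Map<String, Object> oldRec, Map<String, Object> newRec) {
        for (Map.Entry<String, Object> e : newRec.entrySet()) {
            if (isMeta(e.getKey())) continue;
            if (!Objects.equals(e.getValue(), oldRec.get(e.getKey()))) return true;
        }
        return false;
    }

    // Mirrors the combineAndGetUpdateValue idea: keep the new payload, but if
    // nothing material changed, inherit the old record's last_updated_timestamp
    // (falling back to its commit time) instead of null.
    static Map<String, Object> combine(Map<String, Object> oldRec, Map<String, Object> newRec) {
        Map<String, Object> out = new HashMap<>(newRec);
        if (!materiallyChanged(oldRec, newRec)) {
            Object ts = oldRec.get("last_updated_timestamp");
            out.put("last_updated_timestamp",
                    ts != null ? ts : oldRec.get("_hoodie_commit_time"));
        } else {
            out.put("last_updated_timestamp", null); // null = updated in this commit
        }
        return out;
    }
}
```

An incremental reader would then keep rows where `last_updated_timestamp` is null or falls within the `beginInstant`/`endInstant` bounds, matching the filter described in the thread.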
ThowV/hydrogen-hub | null | 747,591,434 | 27 | null | [
{
"action": "opened",
"author": "MelchiorKokernoot",
"comment_id": null,
"datetime": 1605887156000,
"masked_author": "username_0",
"text": "\r\n\r\nMaybe it'd be nice if this text could be on one line, without breaks.",
"title": "Front-End change suggestion`",
"type": "issue"
},
{
"action": "created",
"author": "ThowV",
"comment_id": 731284192,
"datetime": 1605891405000,
"masked_author": "username_1",
"text": "Also, maybe instead of only having Groningen Seaports at the left banner, we could have something like:\r\nHydrogen Hub (bigger)\r\nPowered by Groningen Seaports (smaller)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "martijnjongman",
"comment_id": 731583913,
"datetime": 1605967390000,
"masked_author": "username_2",
"text": "Okay \"retrieve password\" shouldn't break at all... I will have a look at it.\r\n\r\nSecond, yes good idea Thomas. That way, the name of the platform will be announced.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ThowV",
"comment_id": 732057811,
"datetime": 1606125943000,
"masked_author": "username_1",
"text": "@username_2 When creating a merge request please link this issue so we know you implemented these suggestions.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ThowV",
"comment_id": null,
"datetime": 1606170063000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 5 | 627 | false | false | 627 | true |
code-for-hamamatsu/covid19 | code-for-hamamatsu | 590,687,315 | 32 | null | [
{
"action": "opened",
"author": "airsogi",
"comment_id": null,
"datetime": 1585613772000,
"masked_author": "username_0",
"text": "## 起こっている問題 / The Problem\r\n- 印刷表示時に右上に表示されるQRコードとURLがが本家のまま\r\n\r\n## スクリーンショット / Screenshot\r\n \r\n\r\n\r\n\r\n## 期待する見せ方・挙動 / Expected Behavior\r\n- 記載されているURLは浜松市版のものに変える\r\n- QRコードは消してもいいかも\r\n\r\n## 起こっている問題の再現手段 / Steps to Reproduce\r\n1. トップ画面より右上の「印刷」をクリック",
"title": "印刷表示時のQRコードが本家のまま",
"type": "issue"
},
{
"action": "created",
"author": "poba1031",
"comment_id": 606357536,
"datetime": 1585621089000,
"masked_author": "username_1",
"text": "修正したQRをアップしました",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "poba1031",
"comment_id": 606385160,
"datetime": 1585627188000,
"masked_author": "username_1",
"text": "差し替えデータをアップしました。",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "jacoyutorius",
"comment_id": null,
"datetime": 1585650002000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 4 | 424 | false | false | 424 | false |
Altinn/altinn-studio | Altinn | 646,140,673 | 4,443 | null | [
{
"action": "opened",
"author": "jeevananthank",
"comment_id": null,
"datetime": 1593164336000,
"masked_author": "username_0",
"text": "## Describe the bug\r\nScreen reader issues with attachment component in mobile\r\n 1. 'Ferdig lastet' is not read.\r\n 2. Row and column numbers are read\r\n 3. Strangely placed header for 'slett vedlegg' column and it is read out (this is not read out in a desktop browser)\r\n\r\n## To Reproduce\r\nSteps to reproduce the behavior:\r\n1. Login to SBL in a mobile browser and start an app instance that has attachment component\r\n2. Upload an attachment and start a screen reader in the mobile device.\r\n3. The issues mentioned in description are seen.\r\n\r\n## Expected behavior\r\n@username_1 can you check this and comment are the relevant issues to be fixed/expected results.\r\n\r\n## Additional info\r\nEnv: TT02\r\nApp. ttd/apps-test\r\nBrowser: Chrome for iOS 83 and VoiceOver for iOS",
"title": "Screen reader issues with attachment component in mobile",
"type": "issue"
},
{
"action": "created",
"author": "Febakke",
"comment_id": 702030417,
"datetime": 1601546670000,
"masked_author": "username_1",
"text": "Retested with VoiceOver on Iphone 8.\r\n1. The text \"ferdig lastet\" is removed when scaling down to a mobile view. But the check mark icon do have an aria label \"Uploaded\". This text should be read by the screen reader but instead I got \"Umulig å uttale\" This should be fixed!\r\n\r\n2. I do agree that the way it works in desktop seems better, but I am sceptical to do something about it. I think it is safest to keep it as a simple table for now. You can reach every element with VoiceOver so I dont think there is an issue her. \r\n\r\n3. Im no expert, but I think there is a difference in how screen reader on desktop and mobile works. When you tab trough a form on a desktop you will only reach elements with functionality, but on a mobile you scroll trough everything. The delete button is in a column. Either we have to remove the delete button from the table or use aria hidden to hide from the screen reader. I dont think this is a big problem, the name of the column is not confusing.\r\n\r\nNumber one must be fixed, the others I dont think break the wcag guidelines. \r\n@username_2 Any input on this?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "lorang92",
"comment_id": 702033251,
"datetime": 1601547003000,
"masked_author": "username_2",
"text": "@username_1 think that's a good summary. Issue 1. should be fixed. I see that the checked icon should have the same aria-label as the text that is presented in the desktop view. Why this is not being read is not clear to me, but should be looked into.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "acn-sbuad",
"comment_id": 739874032,
"datetime": 1607342415000,
"masked_author": "username_3",
"text": "Point 1 prioritized into milestone 50/53. Extract into seperate issue (up for grabs, estimate: 1)",
"title": null,
"type": "comment"
}
] | 4 | 4 | 2,202 | false | false | 2,202 | true |
KhronosGroup/SPIRV-Cross | KhronosGroup | 730,313,381 | 1,507 | null | [
{
"action": "opened",
"author": "caosdoar",
"comment_id": null,
"datetime": 1603794149000,
"masked_author": "username_0",
"text": "Hello, \r\n\r\nI am using SPIRV-Cross to translate SPIRV code generated from OpenCL (with clspv) to Metal.\r\n\r\nI found one of my compute shaders generating wrong data and I narrowed the problem to SPIRV-Cross generating wrong code for one of the loops. I am attaching a simplified version of the OpenCL shader, the SPIRV binary and disassembled files, and the Metal output generated by SPIRV-Cross.\r\n\r\n[spirv-cross-bug-for.zip](https://github.com/KhronosGroup/SPIRV-Cross/files/5444227/spirv-cross-bug-for.zip)\r\n\r\nLet me explain the problem. This is a piece of code of the simplified shader attached, only contains the loop that SPIRV-Cross is failing to generate:\r\n\r\n```OpenCL\r\nfloat maxh = data[0];\r\nfor (int i = 1; i < nbins; ++i)\r\n{\r\n if (maxh < data[i]) maxh = data[i];\r\n}\r\n```\r\n\r\nI am omitting here the SPIRV output generated by CLSPV, but the code seems correct for this simplified version, and indeed works fine on the original/full shader. \r\n\r\nHere is the Metal code generated by SPIRV-Cross for that loop:\r\n\r\n```metal\r\nfloat _75;\r\nfor (;;)\r\n{\r\n _75 = (_70 < _15._m0[_71]) ? _15._m0[_71] : _70;\r\n if (!(_71 < 35u))\r\n {\r\n break;\r\n }\r\n else\r\n {\r\n _70 = _75;\r\n _71++;\r\n break; // Problem!\r\n }\r\n}\r\n```\r\n\r\nIt does not match perfectly, as some of the SPIRV instructions are executed as part of the previous loop, that you can find on the full shader. But I think is enough to understand the problem. The \"break\" on the else block should not be there, or should be a \"continue\".\r\n\r\nAfter debugging the SPIRV-Cross execution I found for the conditional on that loop [CompilerGLSL::branch](https://github.com/KhronosGroup/SPIRV-Cross/blob/1a95017d114e408657d27301323f20525aca4108/spirv_glsl.cpp#L13241) enters the false block with the same BlockID for the \"from\" and \"to\" parameters. 
But it enters the second condition as \"is_break(to)\" returns true, so the later test \"to_is_continue || from == to\" is never checked, and it outputs a \"break\". I could fix the problem by just switching the order of these two conditions, but I don't know if the order is relevant for other cases.\r\n\r\nPlease, let me know if I can provide any more information or help in solving this issue.\r\n\r\nRegards,\r\nOscar.",
"title": "For branch bug: both branches break the loop",
"type": "issue"
},
{
"action": "created",
"author": "HansKristian-Work",
"comment_id": 717178169,
"datetime": 1603798122000,
"masked_author": "username_1",
"text": "Wow, now that's one hell of an edge case, a block being a continue block, loop header AND break target at the same time :v Fortunately the fix is simple.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "caosdoar",
"comment_id": 717181780,
"datetime": 1603798553000,
"masked_author": "username_0",
"text": "Wow. That was a quick response and a quick fix.\r\nThank you!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "HansKristian-Work",
"comment_id": null,
"datetime": 1603897455000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 2,449 | false | false | 2,449 | false |
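To make the miscompilation in the SPIRV-Cross record above concrete, here is the same max-reduction written by hand in the rotated `for(;;)`-with-branches shape the generated Metal code uses. This is an illustration only, not SPIRV-Cross output — the actual fix was inside `CompilerGLSL::branch` — but it shows why the non-exit branch must continue the loop rather than break.

```java
class MaxReduction {
    // Hand-written analogue of the generated Metal loop. In the buggy output
    // the `continue` below was emitted as `break`, so both branches exited
    // the loop and the body effectively ran only once.
    static float maxOf(float[] data, int nbins) {
        float maxh = data[0];
        int i = 1;
        for (;;) {
            if (!(i < nbins)) {
                break;        // loop-exit branch: `break` is correct here
            } else {
                maxh = (maxh < data[i]) ? data[i] : maxh;
                i++;
                continue;     // continue branch: must re-enter the loop
            }
        }
        return maxh;
    }
}
```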
aliyun/darabonba-java-generator | aliyun | 696,628,747 | 34 | {
"number": 34,
"repo": "darabonba-java-generator",
"user_login": "aliyun"
} | [
{
"action": "opened",
"author": "atptro",
"comment_id": null,
"datetime": 1599642487000,
"masked_author": "username_0",
"text": "",
"title": "improve name",
"type": "issue"
}
] | 2 | 2 | 0 | false | true | 0 | false |
rpotter12/whatsapp-play | null | 606,820,028 | 318 | {
"number": 318,
"repo": "whatsapp-play",
"user_login": "rpotter12"
} | [
{
"action": "opened",
"author": "Ritacheta",
"comment_id": null,
"datetime": 1587836626000,
"masked_author": "username_0",
"text": "## Issue that this pull request solves\r\n\r\nCloses: # 145\r\n\r\n## Proposed changes\r\n\r\nBrief description of what is fixed or changed\r\n\r\n## Types of changes\r\n\r\n_Put an `x` in the boxes that apply_\r\n\r\n- [ ] Bugfix (non-breaking change which fixes an issue)\r\n- [ ] New feature (non-breaking change which adds functionality)\r\n- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)\r\n- [ ] Documentation update (Documentation content changed)\r\n- [ ] Other (please describe): \r\n\r\n## Checklist\r\n\r\n_Put an `x` in the boxes that apply_\r\n\r\n- [ ] My code follows the style guidelines of this project\r\n- [ ] I have performed a self-review of my own code\r\n- [ ] I have commented my code, particularly in hard-to-understand areas\r\n- [ ] I have made corresponding changes to the documentation\r\n- [ ] My changes generate no new warnings\r\n\r\n## Screenshots\r\n\r\n\r\n\r\n\r\n\r\n\r\n## Other information\r\n\r\nAny other information that is important to this pull request",
"title": "screenshots upload",
"type": "issue"
},
{
"action": "created",
"author": "Ritacheta",
"comment_id": 619438190,
"datetime": 1587847415000,
"masked_author": "username_0",
"text": "is my contribution successful??",
"title": null,
"type": "comment"
}
] | 1 | 2 | 1,247 | false | false | 1,247 | false |
dotnet/aspnetcore | dotnet | 632,134,060 | 22,611 | {
"number": 22611,
"repo": "aspnetcore",
"user_login": "dotnet"
} | [
{
"action": "opened",
"author": "dougbu",
"comment_id": null,
"datetime": 1591409011000,
"masked_author": "username_0",
"text": "- #22556\r\n- make `-BuildNative` primarily useful when you want _only_ native assets\r\n- remove `-ForceCoreMsbuild` option\r\n\r\nnit: extra `Remove-Item`s caused `MSBuild` function to do redundant work",
"title": "Build native assets by default",
"type": "issue"
},
{
"action": "created",
"author": "dougbu",
"comment_id": 640112897,
"datetime": 1591474598000,
"masked_author": "username_0",
"text": "https://dev.azure.com/dnceng/internal/_build/results?buildId=674531 demonstrated the success of this change in addressing #22556. The Windows build worked fine though the `dotnet test`s failed miserably (perhaps they need `--no-build`❔).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dougbu",
"comment_id": 640271923,
"datetime": 1591560180000,
"masked_author": "username_0",
"text": "After my update:\r\n- [20200607.1](https://dev.azure.com/dnceng/internal/_build/results?buildId=675255&view=results) run of aspnetcore-blazor-daily-tests failed about the same way; build was fine `dotnet test` not so much\r\n- [20200607.1](https://dev.azure.com/dnceng/public/_build/results?buildId=675252&view=results) succeeded and demonstrated more of our build scripts are working fine these days.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jkotalik",
"comment_id": 640752375,
"datetime": 1591635671000,
"masked_author": "username_1",
"text": "Can you clarify this statement a bit more? So now calling `build.cmd -BuildNative` will only build native bits?\r\n\r\nI think this breaks the build.cmd script in the IIS folder: https://github.com/dotnet/aspnetcore/blob/master/src/Servers/IIS/build.cmd. Could you adjust that accordingly?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dougbu",
"comment_id": 640780535,
"datetime": 1591638908000,
"masked_author": "username_0",
"text": "@username_1 good catch. That command previously worked because desktop `msbuild` would handle any `*.vcxproj` dependencies of the managed projects listed. The `$(BuildNative)` setting made sure those dependencies weren't ignored. Now, the script will attempt to build the managed projects using desktop `msbuild` and fail unless the system has VS 16.6.0 (actually preview 2 IIRC) or later.\r\n\r\nFix is simply to remove `-buildNative` from the command line. Will do before I merge.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dougbu",
"comment_id": 640783194,
"datetime": 1591639222000,
"masked_author": "username_0",
"text": "That command has always done that. The differences are\r\n1. `.\\build.cmd -projects {some projects}` will build the projects using `dotnet msbuild` while `.\\build.cmd -buildNative {some projects}` will build them using desktop `msbuild`\r\n2. `.\\build.cmd` will build both managed and native projects because native projects are built _except_ if you specify `-noBuildNative`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jkotalik",
"comment_id": 640849556,
"datetime": 1591645675000,
"masked_author": "username_1",
"text": "Does that mean you need to have the VS toolsets necessary to build native for build.cmd to succeed? What happens if I call build.cmd without having it installed?\r\n\r\nWe made a very conscious decision to make it so build.cmd wouldn't build native dlls, as it isn't easy to have everything installed and slows down the build process.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dougbu",
"comment_id": 640854198,
"datetime": 1591646287000,
"masked_author": "username_0",
"text": "@username_1 the answer depends on the specific build.cmd script you run. The problem without the fixes we've made over the week or so (starting with 14d6b6e28661) is builds using desktop `msbuild` failed miserably with VS 2019 16.5 or earlier failed because that `msbuild` didn't understand `net5.0`.\r\n\r\nWe're not in the perfect place with this. But, it's better than those failures or (worse) having to run `.\\build.cmd` multiple times to pick up the native assets needed in the shared framework i.e. to get the CI builds working.\r\n\r\nIn any case, those who don't care about building the shared framework properly or aren't testing with IIS can execute `./build.cmd -noBuildNative`. The native build takes about 3 minutes for the rest of us.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jkotalik",
"comment_id": 640864814,
"datetime": 1591647681000,
"masked_author": "username_1",
"text": "From my experience, it takes significantly longer than 3 minutes on CI to build native. I was seeing ~15 minutes when I was testing ways to speed up build times for our CI.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dougbu",
"comment_id": 640874840,
"datetime": 1591648858000,
"masked_author": "username_0",
"text": "I'm not so sure. We also have issue #6304 about having the root build.cmd script building everything.\r\n\r\nIf we do something here, it would be separate from `-buildNative`, because that has the side effect of changing how `-projects` is interpreted. Maybe `-includeNativePhase`❔ One concern would be the need to change a bunch of the YAML and build.cmd|ps1 files to get the CI working again ☹️",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jkotalik",
"comment_id": 640881506,
"datetime": 1591649705000,
"masked_author": "username_1",
"text": "If that's the direction for the root build.cmd, I think that's fine. I think I'm most concerned about the subdirectory build.cmd calls and how they are affected. For example, what happens when calling src/Servers/Kestrel/build.cmd when someone doesn't have the native dependencies installed? Does the build fail? That's my main concern.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dougbu",
"comment_id": 640949079,
"datetime": 1591660821000,
"masked_author": "username_0",
"text": "build.ps1 isn't smart enough to notice the user lacks the needed `msbuild` components to build native projects. So, it'll run desktop `msbuild` with `/p:BuildNative=true` that global property means the [override here](https://github.com/dotnet/aspnetcore/blob/master/eng/Common.props#L14) doesn't work and the build will fail.\r\n\r\nThis brings to mind another solution option if we decide the experience is significantly degraded: We could figure out a way to skip the native phase when the needed `msbuild` components are missing, a bit like what we do when NodeJS isn't found. The other option is to fail up front, a lot like we do when Java isn't found. (Notice the inconsistency there ☹️)\r\n\r\n@username_1 should we move this to a new issue on the DoI board❔ I suspect we want to change something somewhere and am not surprised given how quickly we've pivoted to using `dotnet msbuild` by default then cleaning up.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jkotalik",
"comment_id": 641588486,
"datetime": 1591737709000,
"masked_author": "username_1",
"text": "I'd be okay with that option. We could briefly bring this up at DoI as well.",
"title": null,
"type": "comment"
}
] | 2 | 13 | 4,915 | false | false | 4,915 | true |
sigp/lighthouse | sigp | 668,266,230 | 1,419 | null | [
{
"action": "opened",
"author": "AgeManning",
"comment_id": null,
"datetime": 1596070703000,
"masked_author": "username_0",
"text": "## Description\r\n\r\nThe graffiti CLI option should be duplicated to the VC such that VC's decide what the graffiti will be on published blocks. \r\nThis aligns more with out other clients and users expectations. \r\n\r\nSee #1410 for example. \r\n\r\nThe VC graffiti should be prioritised over the BN. If the VC graffiti is set, it modifies the blocks graffiti, if its not set it leaves it as is from the BN. \r\n\r\n## Present Behaviour\r\n\r\nGraffiti tag is currently only on the BN.",
"title": "Shift the graffiti CLI option into the VC",
"type": "issue"
},
{
"action": "closed",
"author": "hermanjunge",
"comment_id": null,
"datetime": 1599694406000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 466 | false | false | 466 | false |
zxqfl/TabNine | null | 478,996,134 | 126 | null | [
{
"action": "opened",
"author": "kantuin",
"comment_id": null,
"datetime": 1565358437000,
"masked_author": "username_0",
"text": "\r\nThis may be difficult problem to solve...",
"title": "Conflict with auto-pairs plugins in Vim",
"type": "issue"
},
{
"action": "created",
"author": "gary-ruizhang",
"comment_id": 520194787,
"datetime": 1565490542000,
"masked_author": "username_1",
"text": "What auto-pairs does is something like: inoremap ( ()<left>, so there would be a ) after your cursor. And autocompletion only depends on what you enter before the cursor.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Shougo",
"comment_id": 520214804,
"datetime": 1565516946000,
"masked_author": "username_2",
"text": "I have tested TabNine's output.\r\nTabNine seems ignore `)` for `old_suffix`.\r\n\r\nI think TabNine should set old_suffix as `)`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Shougo",
"comment_id": 520215011,
"datetime": 1565517156000,
"masked_author": "username_2",
"text": "Current TabNine checks the after of the cursor.\r\nSo it can be implemented by TabNine.",
"title": null,
"type": "comment"
}
] | 3 | 4 | 547 | false | false | 547 | false |
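The `old_suffix` field discussed in the TabNine record above names the text just after the cursor that a completion should replace. A minimal sketch of applying such a completion to a one-line buffer follows — the helper and its signature are hypothetical, not TabNine's actual API.

```java
class CompletionApply {
    // Sketch of applying a TabNine-style completion to a single line.
    // `oldSuffix` is the text immediately after the cursor that the
    // completion replaces -- e.g. the `)` inserted by an auto-pairs plugin.
    static String apply(String before, String after,
                        String newPrefix, String oldSuffix, String newSuffix) {
        // Drop oldSuffix from the text after the cursor so it is not duplicated.
        String rest = after.startsWith(oldSuffix)
                ? after.substring(oldSuffix.length())
                : after;
        return before + newPrefix + newSuffix + rest;
    }
}
```

With an auto-pairs plugin the buffer is `max(` + cursor + `)`; a completion that sets `oldSuffix = ")"` consumes the auto-inserted paren instead of duplicating it, which is the behaviour the thread asks for.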
ArkEcosystem/core | ArkEcosystem | 535,620,284 | 3,334 | {
"number": 3334,
"repo": "core",
"user_login": "ArkEcosystem"
} | [
{
"action": "opened",
"author": "Nigui",
"comment_id": null,
"datetime": 1575972128000,
"masked_author": "username_0",
"text": "<!--\r\nThanks for your interest in the project. Bugs filed and PRs submitted are appreciated!\r\n\r\nPlease make sure you're familiar with and follow the instructions in the [contributing guidelines](https://docs.ark.io/guidebook/contribution-guidelines/contributing.html).\r\n\r\nPlease fill out the information below to expedite the review and (hopefully) merge of your pull request!\r\n-->\r\n\r\n## Summary\r\n\r\n### Fix (fb44d13)\r\n\r\nCalling `network:generate` command then using new network config does not compile due to invalid genesis block type. This fix generates valid genesis block, writing string properties for `transaction.fee`, `block.totalFee` and `block.reward`.\r\n\r\n### Feat (5b0fb68)\r\n\r\nWhen I generate a new network config through `network:generate` command, my pre-mined tokens are locked into a wallet because generated passphrase is not shared by script. This improvement writes genesis wallet info into a file (`genesis-wallet.json` at core config path). \r\n\r\n\r\n## Checklist\r\n\r\n<!-- Have you done all of these things? -->\r\n\r\n- [ ] Documentation _(if necessary)_\r\n- [ ] Tests _(if necessary)_\r\n- [x] Ready to be merged\r\n\r\n<!-- Feel free to add additional comments. -->",
"title": "Fix/core network generate command",
"type": "issue"
}
] | 2 | 4 | 1,813 | false | true | 1,175 | false |
glushchenko/fsnotes | null | 615,456,009 | 884 | null | [
{
"action": "opened",
"author": "sloria",
"comment_id": null,
"datetime": 1589137818000,
"masked_author": "username_0",
"text": "<!-- NOTE: ignoring this template will lead to your issue being dealt with more slowly -->\r\n\r\n**Describe your feature request**\r\n\r\nI would like to sync my preferences across multiple machines. Currently, I have to manually configure both my work and personal laptops to match.\r\n\r\n**Additional context**\r\n\r\nI'm currently using a Dropbox folder as my storage location. It would be convenient if the preferences could live in the same folder, so everything's in one place. nvalt, for example, saves a `Notes & Settings` file in the storage folder.",
"title": "Preferences syncing",
"type": "issue"
},
{
"action": "created",
"author": "gingerbeardman",
"comment_id": 626591984,
"datetime": 1589189968000,
"masked_author": "username_1",
"text": "How about storing your pref in Dropbox and then symlink to it from the pref folder of each macOS.\r\n\r\nI've not tried this, but it should work.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sloria",
"comment_id": 626691670,
"datetime": 1589202599000,
"masked_author": "username_0",
"text": "That should work but it requires manual action outside of FSNotes, which isn't ideal. Also, non-technical users would not likely be comfortable doing that. It'd be great if this were a part of the application.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sloria",
"comment_id": 626733573,
"datetime": 1589207071000,
"masked_author": "username_0",
"text": "Unfortunately, symlinking didn't work. I ended up with a weird state with two storage locations after opening FSNotes on my 2nd machine:\r\n\r\n\r\n\r\n\r\nI had to remove the symlink to revert back to the normal state.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gingerbeardman",
"comment_id": 626777250,
"datetime": 1589211081000,
"masked_author": "username_1",
"text": "Well, if only one person is asking for a feature then a simple workaround outside of the app is a good place to start. \r\n\r\nHow did you do the symlink on each Mac?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sloria",
"comment_id": 626784764,
"datetime": 1589211867000,
"masked_author": "username_0",
"text": "```sh\r\n#/usr/bin/env bash\r\n\r\n\r\nSETTINGS_PATH=\"$HOME/Library/Containers/co.fluder.FSNotes/Data/Library/Preferences/co.fluder.FSNotes.plist\"\r\n\r\nif [ ! -L \"$SETTINGS_PATH\" ]; then\r\n echo \"==> Backing up preferences...\"\r\n mv \"$SETTINGS_PATH\" \"$SETTINGS_PATH.bak\"\r\nfi\r\n\r\nln -sf \"$PWD/co.fluder.FSNotes.plist\" \"$SETTINGS_PATH\"\r\necho \"==> Done.\"\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gingerbeardman",
"comment_id": 626785714,
"datetime": 1589211966000,
"masked_author": "username_1",
"text": "OK, do not us `-s` for symbolic link. I did that and saw it not working as you describe.\r\n\r\nThis worked for me: \r\n`ln ~/Dropbox/co.fluder.FSNotes.plist ~/Library/Containers/co.fluder.FSNotes/Data/Library/Preferencesco.fluder.FSNotes.plist`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sloria",
"comment_id": 626795741,
"datetime": 1589213004000,
"masked_author": "username_0",
"text": "I'll try that later today and report back",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sloria",
"comment_id": 627051167,
"datetime": 1589246298000,
"masked_author": "username_0",
"text": "Unfortunately hardlinking with `ln DROPBOX_PATH PREFERENCES_PATH` didn't work either.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gingerbeardman",
"comment_id": 627224216,
"datetime": 1589275635000,
"masked_author": "username_1",
"text": "Can't you expand on what exactly went wrong?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sloria",
"comment_id": 627320534,
"datetime": 1589287629000,
"masked_author": "username_0",
"text": "Yes, those are the steps I followed. I used \r\n\r\n```sh\r\nln \"$PWD/co.fluder.FSNotes.plist\" \"$HOME/Library/Containers/co.fluder.FSNotes/Data/Library/Preferences/co.fluder.FSNotes.plist\"\r\n``` \r\n\r\nto make the link. When I made the link on my second machine, my settings were not updated.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sloria",
"comment_id": 627444464,
"datetime": 1589300112000,
"masked_author": "username_0",
"text": "FWIW, if https://github.com/glushchenko/fsnotes/pull/890 and https://github.com/glushchenko/fsnotes/pull/889 are merged and a \"Reset to Defaults\" button is added as [suggested here](https://github.com/glushchenko/fsnotes/issues/887#issuecomment-627093795), this issue will be less important to me since I'll mostly just use the defaults. We could even close this if there's no interest in implementing it.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gingerbeardman",
"comment_id": 627905223,
"datetime": 1589367010000,
"masked_author": "username_1",
"text": "this makes no sense to me, as this is outside of FSNotes control",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sloria",
"comment_id": 628013825,
"datetime": 1589378902000,
"masked_author": "username_0",
"text": "I'm puzzled, too. Are _all_ settings stored in `co.fluder.FSNotes.plist`? In particular, I noticed that my line spacing and line width settings were not getting updated.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gingerbeardman",
"comment_id": 633037799,
"datetime": 1590235341000,
"masked_author": "username_1",
"text": "So another app has tracked down the issue: `.atomic` writing\r\n\r\nhttps://github.com/knurling/ServiceStation/issues/34\r\n\r\nso, perhaps we can remove that in FSN @glushchenko ?\r\n\r\nthen hard links (`ln`) would work for syncing prefs.",
"title": null,
"type": "comment"
}
] | 2 | 15 | 3,302 | false | false | 3,302 | false |
Azure/azure-sdk-for-net | Azure | 584,706,289 | 10,724 | null | [
{
"action": "opened",
"author": "JoshLove-msft",
"comment_id": null,
"datetime": 1584652755000,
"masked_author": "username_0",
"text": "We currently return a ServiceBusReceiver whether you call client.GetReceiver or client.GetSessionReceiver. The sessionManager holds the session related functionality. This was part of our initial design before we switched to using a top level client, as it avoided forcing users to know that they needed something like a SessionClient to do anything when working with sessions. \r\n\r\nWith the current design, a user will always start by creating a ServiceBusClient and then calling GetReceiver/GetSessionReceiver depending on whether they are using sessions. As such, we may as well just have GetSessionReceiver return a different type that derives from ServiceBusReceiver, and do away with the concept of a SessionManager entirely.",
"title": "Consider introducing a SessionReceiver instead of a SessionManager",
"type": "issue"
},
{
"action": "created",
"author": "hemanttanwar",
"comment_id": 601489710,
"datetime": 1584667826000,
"masked_author": "username_1",
"text": "Although this willreduce one type `sessionManager ` but if you `return a different type that derives from ServiceBusReceiver` , this new type will also have `RenewMessageLockAsync` function . You can not reduce the visibility of this function in new SessionReceiver .",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JoshLove-msft",
"comment_id": 601491843,
"datetime": 1584668586000,
"masked_author": "username_0",
"text": "You are right that we still would need to throw an exception if a user attempts to RenewMessage lock from the session receiver. But this isn't a new issue as we already have to do this.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JoshLove-msft",
"comment_id": 601503811,
"datetime": 1584672749000,
"masked_author": "username_0",
"text": "So one thing that I didn't think of initially.. The callback for the processor takes an EventArgs which has a Receiver property. If we do this change users would need to downcast this property to a session receiver to access the session methods.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "richardpark-msft",
"comment_id": 601805019,
"datetime": 1584723265000,
"masked_author": "username_2",
"text": "@username_0 - some of these difficulties are \"gone\" when the settlement methods end up on the Message object itself, rather than on the client.\r\n\r\n1. Messages can be settled directly w/o a receiver. Callback gets simpler.\r\n2. Renewing a message lock goes on the Message object, so no need to worry about differences on the client.\r\n\r\nFWIW, JavaScript is doing both of these - settling methods on the message and two receivers (Receiver, and SessionReceiver (derived from Receiver)). This lets me avoid having a .Session or any other weirdness altogether.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JoshLove-msft",
"comment_id": 601809022,
"datetime": 1584723727000,
"masked_author": "username_0",
"text": "@username_2 how do you do session operations from the callback?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "richardpark-msft",
"comment_id": 601814368,
"datetime": 1584724376000,
"masked_author": "username_2",
"text": "@username_0 - Those are on the receiver.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JoshLove-msft",
"comment_id": 601815600,
"datetime": 1584724520000,
"masked_author": "username_0",
"text": "For concurrent receiving I believe the session operations would need to be available within the callback as the outer SessionReceiver would not have the appropriate receive links for each received message.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "richardpark-msft",
"comment_id": 601817217,
"datetime": 1584724718000,
"masked_author": "username_2",
"text": "I think you're right - the session manager aspect still needs to be designed.\r\n\r\nIn JS I think it's going to end up being a separate class altogether so we don't have to have the same exact interface on the typical Receiver interface and the \"round-robin-receive\" interface.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JoshLove-msft",
"comment_id": 601817883,
"datetime": 1584724806000,
"masked_author": "username_0",
"text": "Yeah another idea that @KrzysztofCwalina floated was having any actions that a user would be reasonably expected to take in the processor just be methods directly on the EventArgs, so we wouldn't have to include the entire Receiver.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "JoshLove-msft",
"comment_id": null,
"datetime": 1585159100000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 11 | 2,814 | false | false | 2,814 | true |
Chasesc/discord-classify-bot | null | 606,850,231 | 5 | {
"number": 5,
"repo": "discord-classify-bot",
"user_login": "Chasesc"
} | [
{
"action": "opened",
"author": "aramlaka",
"comment_id": null,
"datetime": 1587846922000,
"masked_author": "username_0",
"text": "",
"title": "space",
"type": "issue"
},
{
"action": "created",
"author": "Chasesc",
"comment_id": 619444595,
"datetime": 1587850928000,
"masked_author": "username_1",
"text": "This is a breaking change. `config.get` **MUST** line up with `IMG_SAVE_PATH` otherwise everything will break & all order is lost. This would be disastrous.\r\n\r\nThe newline at the end is good though.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "PhillipHunter",
"comment_id": 619479092,
"datetime": 1587874746000,
"masked_author": "username_2",
"text": "What business value does this provide to the project?",
"title": null,
"type": "comment"
}
] | 3 | 3 | 251 | false | false | 251 | false |
microsoft/vscode-python | microsoft | 615,619,112 | 11,728 | null | [
{
"action": "opened",
"author": "verdimrc",
"comment_id": null,
"datetime": 1589177831000,
"masked_author": "username_0",
"text": "## Environment data\r\n\r\n- VS Code version: 1.4.60-insider (commit: d487078dc7fc1c276657cadb61b4f63833a8df55)\r\n- Extension version (available under the Extensions sidebar): 2020.6.78209-dev\r\n- OS and version: macOS Mojave 10.14.6\r\n- Python version (& distribution if applicable, e.g. Anaconda): python-3.7.6 from conda-forge\r\n- Type of virtual environment used (N/A | venv | virtualenv | conda | ...): conda\r\n- Relevant/affected Python packages and their versions: pydocstyle-5.0.2\r\n- Relevant/affected Python-related VS Code extensions and their versions: pydocstyle functionalities\r\n- Jedi or Language Server? (i.e. what is `\"python.jediEnabled\"` set to; more info #3977): `false`\r\n- Value of the `python.languageServer` setting: `Microsoft`\r\n\r\n## Expected behaviour\r\nConfigured pydocstyle to ignore `setup.py` in `tox.ini`:\r\n```ini\r\n[pydocstyle]\r\nmatch = (?!setup).*\\.py\r\n```\r\n\r\nRunning `pydocstyle` from cli correctly skips checks on `${workspaceDir}/setup.py`.\r\n\r\nI would've expected vscode to also not reporting pydocstyle violations.\r\n\r\n## Actual behaviour\r\n\r\nvscode still reports pydocstyle violations on `setup.py`. And as a matter of fact, also tried to put pydocstyle setting in `.pydocstyle`, to no effect.\r\n\r\n## Steps to reproduce:\r\n\r\n1. settings\r\n\r\n```json\r\n// settings.json\r\n{\r\n \"python.jediEnabled\": false,\r\n \"python.langageServer\": \"Microsoft\",\r\n \"python.linting.enabled\": true,\r\n \"python.linting.flake8Enabled\": true,\r\n \"python.linting.mypy\": true,\r\n \"python.linting.pydocstyleEnabled\": true,\r\n \"python.linting.pytestEnabled\": true,\r\n \"python.linting.unittestEnabled\": true,\r\n \"python.linting.noseteststEnabled\": true,\r\n```\r\n\r\n```ini\r\n# tox.ini\r\n[pydocstyle]\r\nmatch = (?!setup).*\\.py\r\n```\r\n\r\n2. Open `setup.py`, and vscode will reports pydocstyle violation (under Problem tab, and visual indications on the offending line.\r\n\r\n</details>",
"title": "Workspace does not respect pydocstyle's match setting.",
"type": "issue"
},
{
"action": "created",
"author": "karthiknadig",
"comment_id": 630460233,
"datetime": 1589839697000,
"masked_author": "username_1",
"text": "@username_0 Can you share a screenshot of where the problem is shown? Also, can you share the contents of Output > Python pannel?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "verdimrc",
"comment_id": 631852644,
"datetime": 1590030635000,
"masked_author": "username_0",
"text": "#### Screenshot-1: vscode shows pydocstyle errors despite `setup.py` is blacklisted in `tox.ini`.\r\n<img width=\"628\" alt=\"Screen Shot 2020-05-21 at 11 02 35 AM\" src=\"https://user-images.githubusercontent.com/2340781/82518972-1e3b1200-9b53-11ea-8c4f-f0542de3a113.png\">\r\n\r\n#### Screenshot-2: pydocstyle cli rightly skips `setup.py`\r\n<img width=\"626\" alt=\"Screen Shot 2020-05-21 at 11 03 00 AM\" src=\"https://user-images.githubusercontent.com/2340781/82519168-a02b3b00-9b53-11ea-8394-4f500a6e0b5d.png\">\r\n\r\n#### Screenshot-3: pydocstyle cli rightly checks `setup.py` if this file is not blacklisted.\r\n<img width=\"634\" alt=\"Screen Shot 2020-05-21 at 11 03 19 AM\" src=\"https://user-images.githubusercontent.com/2340781/82519106-796d0480-9b53-11ea-9793-3106c8b0b72b.png\">",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "verdimrc",
"comment_id": 634132887,
"datetime": 1590510452000,
"masked_author": "username_0",
"text": "cwd: ~/src/github/smepu\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "verdimrc",
"comment_id": 981100956,
"datetime": 1638111954000,
"masked_author": "username_0",
"text": "Hi, no longer an issue with latest insiders (vscode & python extension).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "brettcannon",
"comment_id": null,
"datetime": 1644001755000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 6 | 2,886 | false | false | 2,886 | true |
denoland/deno | denoland | 718,220,691 | 7,903 | {
"number": 7903,
"repo": "deno",
"user_login": "denoland"
} | [
{
"action": "opened",
"author": "ry",
"comment_id": null,
"datetime": 1602255895000,
"masked_author": "username_0",
"text": "<!--\r\nBefore submitting a PR, please read\r\nhttps://github.com/denoland/deno/blob/master/docs/contributing.md\r\n\r\n1. Give the PR a descriptive title.\r\n\r\n Examples of good title:\r\n - fix(std/http): Fix race condition in server\r\n - docs(console): Update docstrings\r\n - feat(doc): Handle nested reexports\r\n\r\n Examples of bad title:\r\n - fix #7123\r\n - update docs\r\n - fix bugs\r\n\r\n2. Ensure there is a related issue and it is referenced in the PR text.\r\n3. Ensure there are tests that cover the changes.\r\n4. Ensure `cargo test` passes.\r\n5. Ensure `./tools/format.py` passes without changing files.\r\n6. Ensure `./tools/lint.py` passes.\r\n-->",
"title": "How long do builds without cache take?",
"type": "issue"
},
{
"action": "created",
"author": "ry",
"comment_id": 706252380,
"datetime": 1602257808000,
"masked_author": "username_0",
"text": "Actually it seems the cache is not helping that much...\r\n\r\n| target | with cache (0736420) | without cache (this pr) |\r\n|--|--|--|\r\n| test_release macos-10.15 | 17m 57s | 21m 34s | \r\n| test_release windows-2019 | 23m 19s | 19m 17s |\r\n| test_release ubuntu-latest-xl | 10m 45s | 9m 11s |\r\n| test_debug ubuntu-latest-xl | 5m 33s | 6m 40s |\r\n| bench ubuntu-latest-xl | 11m 54s | 14m 46s |\r\n| lint ubuntu-latest-xl | 1m 30s | 3m 19s",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ry",
"comment_id": 706266636,
"datetime": 1602259423000,
"masked_author": "username_0",
"text": "I'm going to land @piscisaureus's fix to the CI #7898 instead, since it's a couple minutes faster. (Table updated)\r\n\r\nHowever, if we continue to encounter cache problems it's good to know we can simply remove it without a big hit.",
"title": null,
"type": "comment"
}
] | 1 | 3 | 1,309 | false | false | 1,309 | false |
BitcoinUnlimited/BitcoinUnlimited | BitcoinUnlimited | 615,923,827 | 2,200 | {
"number": 2200,
"repo": "BitcoinUnlimited",
"user_login": "BitcoinUnlimited"
} | [
{
"action": "opened",
"author": "ptschip",
"comment_id": null,
"datetime": 1589206898000,
"masked_author": "username_0",
"text": "While we have tweaks for these values they don't currently do anything. This PR adds that functionality so we can dynamically increase or decrease the number of threads running.",
"title": "Add the ability to launch or kill txAdmissionThreads and msgHandlerThreads",
"type": "issue"
},
{
"action": "created",
"author": "gandrewstone",
"comment_id": 637015551,
"datetime": 1591034257000,
"masked_author": "username_1",
"text": "I'd like to see the 2 places that execute the add/drop thread logic call a single function, instead of copy-paste. This function will also help anyone creating a new pool of threads...",
"title": null,
"type": "comment"
}
] | 2 | 2 | 362 | false | false | 362 | false |
aacotroneo/laravel-saml2 | null | 675,077,274 | 231 | null | [
{
"action": "opened",
"author": "thshiro",
"comment_id": null,
"datetime": 1596812770000,
"masked_author": "username_0",
"text": "```php\r\nPHPUnit\\Framework\\Exception: Fatal error: Uncaught Error: Class 'Route' not found in vendor/aacotroneo/laravel-saml2/src/routes.php on line 3\r\n```\r\n\r\nI'm running tests with phpunit for another features, and always shows this error. Its fixed when i put `Use Illuminate\\Support\\Facades\\Route;` in routes.php file.\r\n\r\nYes, i have the Route alias in `config/app.php`",
"title": "PHPUnit - Uncaught Error: Class 'Route' not found",
"type": "issue"
}
] | 1 | 1 | 371 | false | false | 371 | false |
github/issue-labeler | github | 619,320,802 | 12 | null | [
{
"action": "opened",
"author": "fmigneault",
"comment_id": null,
"datetime": 1589584594000,
"masked_author": "username_0",
"text": "I really like the ability to automatically apply labels if a specific word is found, but having the same labels removed because the word is missing can be problematic. \r\n\r\nFor instance, I label `bug` to any issue that as that word, but the label was manually tagged as a bug without using the word in the body, it just gets removed at some point and we lose track of the bug!\r\n\r\nMaybe something like follows could be considered?\r\n\r\n```\r\nbug: \r\n add: false\r\n remove: false\r\n regex: \r\n - bug\r\n - ....\r\n```",
"title": "Option to not remove label?",
"type": "issue"
},
{
"action": "created",
"author": "stephanmiehe",
"comment_id": 637598655,
"datetime": 1591109940000,
"masked_author": "username_1",
"text": "@username_0 this sounds good to me. Feel free to put a PR up and I'll review it otherwise I'll try to get to it in the near future",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jsosulska",
"comment_id": 693618476,
"datetime": 1600284500000,
"masked_author": "username_2",
"text": "I am seeing that labels are being removed as part of every run of my action, even though the output agrees the label is added. Can this be turned off?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hawkeye116477",
"comment_id": 699664326,
"datetime": 1601228081000,
"masked_author": "username_3",
"text": "Could it be possible to add condition when label will be removed? So for example, I check checkbox label added, I uncheck, then label removed.",
"title": null,
"type": "comment"
}
] | 4 | 4 | 939 | false | false | 939 | true |
ratson/cordova-plugin-admob-free | null | 610,843,253 | 426 | null | [
{
"action": "opened",
"author": "antoniocorreia",
"comment_id": null,
"datetime": 1588350867000,
"masked_author": "username_0",
"text": "I'm having a problem after configured the Interstitial with this plugin.\r\n\r\nFirst, I was able to display the Interstitial with the following config\r\n\r\n```\r\nlet interConfig: AdMobFreeInterstitialConfig = {\r\n isTesting: false,\r\n autoShow: true,\r\n id: this.adconfig.idIntersticial,\r\n };\r\n\r\nthis.admob.interstitial.config(interConfig);\r\n\r\nthis.admob.interstitial\r\n .prepare()\r\n .then(() => {\r\n console.log('insterstitial displayed');\r\n })\r\n .catch((e) => console.log(e));\r\n```\r\nI'm calling this code twice in my Home.ts, and for each call I setup a listener to execute a specific action when the user closes the interstitial\r\n\r\n```\r\n this.admob.on(this.admob.events.INTERSTITIAL_CLOSE).subscribe(() => {\r\n this.function1();\r\n });\r\n```\r\nand in another method\r\n\r\n```\r\n this.admob.on(this.admob.events.INTERSTITIAL_CLOSE).subscribe(() => {\r\n this.function2();\r\n });\r\n```\r\nMy problem is... the first time I close the interstitial everything works nicely, let's say I closed and the `this.function1()` method was called for the first scenario, but from the second call on the listeners are appended to the `INTERSTITIAL_CLOSE`, for example, if I execute the second scenario to close and execute `this.function2()` the `this.function1() `is called again after executing `this.function2()`.\r\n\r\nWhat am I doing wrong? And how can I reset the listeners?",
"title": "How to reset event listeners for INTERSTITIAL_CLOSE after closing an Ionic 3/Admob Free interstitial?",
"type": "issue"
}
] | 1 | 1 | 1,425 | false | false | 1,425 | false |
swsoyee/2019-ncov-japan | null | 669,612,310 | 1,906 | null | [
{
"action": "opened",
"author": "Lix900",
"comment_id": null,
"datetime": 1596187599000,
"masked_author": "username_0",
"text": "\r\n",
"title": "403/404 error",
"type": "issue"
},
{
"action": "created",
"author": "swsoyee",
"comment_id": 667028898,
"datetime": 1596187732000,
"masked_author": "username_1",
"text": "おそらくちょうどデータ更新中です。最近アクセスがまた増えてきて、たまに重い時があります。\r\n再度アクセスしてみてください。それでも見れなければ、お手数ですがキャッシュをクリアして試してください。",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Lix900",
"comment_id": null,
"datetime": 1596187910000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "Lix900",
"comment_id": 667030103,
"datetime": 1596187910000,
"masked_author": "username_0",
"text": "HAとかやってないですね。いつもありがとうございます。",
"title": null,
"type": "comment"
}
] | 2 | 4 | 346 | false | false | 346 | false |
aquariuslt/blog | null | 574,799,188 | 56 | {
"number": 56,
"repo": "blog",
"user_login": "aquariuslt"
} | [
{
"action": "opened",
"author": "Snakeflute",
"comment_id": null,
"datetime": 1583253879000,
"masked_author": "username_0",
"text": "",
"title": "add link.",
"type": "issue"
},
{
"action": "created",
"author": "aquariuslt",
"comment_id": 594712247,
"datetime": 1583345012000,
"masked_author": "username_1",
"text": ":tada: This PR is included in version 6.22.0 :tada:\n\nThe release is available on [GitHub release](https://github.com/username_1/blog/releases/tag/v6.22.0)\n\nYour **[semantic-release](https://github.com/semantic-release/semantic-release)** bot :package::rocket:",
"title": null,
"type": "comment"
}
] | 2 | 2 | 259 | false | false | 259 | true |
ckeditor/ckeditor5 | ckeditor | 715,639,669 | 8,217 | null | [
{
"action": "opened",
"author": "mlewand",
"comment_id": null,
"datetime": 1601988303000,
"masked_author": "username_0",
"text": "## 📝 Provide a description of the new feature\r\n\r\nAFAIR we have already a logic that detects new plugins for the `npm run docs:content-styles` command.\r\n\r\nIt would be great to list the new plugins during the release so that we can pass on new plugin list to be added to the added to the online builder.\r\n\r\n---\r\n\r\nIf you'd like to see this feature implemented, add a 👍 reaction to this post.",
"title": "List new plugins during the release",
"type": "issue"
},
{
"action": "created",
"author": "pomek",
"comment_id": 711904595,
"datetime": 1603099520000,
"masked_author": "username_1",
"text": "A list of plugins is available here: https://github.com/ckeditor/ckeditor5/blob/608baa9be5a1c8ae5600e8df9627c4f5b2cecef7/scripts/docs/build-content-styles.js#L249-L252\r\n\r\nWe need to save an array of those objects and compare them with the existing database file. After displaying the list, we should update the database and commit it.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mlewand",
"comment_id": 717737727,
"datetime": 1603867833000,
"masked_author": "username_0",
"text": "Fixed in #8338",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mlewand",
"comment_id": null,
"datetime": 1603867833000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 737 | false | false | 737 | false |
jr-marchand/caviar | null | 706,517,763 | 5 | null | [
{
"action": "opened",
"author": "jr-marchand",
"comment_id": null,
"datetime": 1600791224000,
"masked_author": "username_0",
"text": "If a PDB is truncated and does not have the element column (column 77-78 of the PDB format), the function \"find_protein_points_variablevolumn\" in cavitydetect.py glitches.\r\n\r\nIt doesn't find any atom of the protein and prints out \"CAVIAR does not find a cavity\".\r\n\r\nThe selection tool in this function is based [on element](http://prody.csb.pitt.edu/tutorials/prody_tutorial/selection.html#selection-operations), there may be a work around with a selection based on atom \"name\".",
"title": "Crash if PDB is not conform with the PDB format (missing element column)",
"type": "issue"
}
] | 1 | 1 | 478 | false | false | 478 | false |
carbon-design-system/gatsby-theme-carbon | carbon-design-system | 521,099,380 | 513 | null | [
{
"action": "opened",
"author": "jendowns",
"comment_id": null,
"datetime": 1573494434000,
"masked_author": "username_0",
"text": "## Detailed description\r\n\r\nCheck out the `ImageCard` example page: https://gatsby-theme-carbon.now.sh/components/ImageCard\r\n\r\nContent is vertically misaligned, and it does not look intentional.\r\n\r\nThe `margin-top` makes it so `ImageCard` components are not locked up.\r\n\r\n\r\n\r\n\r\nCulprit appears to be:\r\n\r\nhttps://github.com/carbon-design-system/gatsby-theme-carbon/blob/7ad684f52441975d4452d33d96cfd3093f7733b6/packages/gatsby-theme-carbon/src/components/Main/Main.module.scss#L11-L14\r\n\r\nPossibly related to work done for #500",
"title": "Margin top applied to descendents of `main` causes misaligned content",
"type": "issue"
},
{
"action": "created",
"author": "jnm2377",
"comment_id": 554502841,
"datetime": 1573847304000,
"masked_author": "username_1",
"text": "Thanks for opening this issue! I'm currently looking into all the spacing issues that resulted from the latest spacing update. 👍",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "vpicone",
"comment_id": null,
"datetime": 1574358842000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 3 | 793 | false | false | 793 | false |
diamantidis/scala-screenshots | null | 714,441,748 | 41 | {
"number": 41,
"repo": "scala-screenshots",
"user_login": "diamantidis"
} | [
{
"action": "opened",
"author": "scala-steward",
"comment_id": null,
"datetime": 1601859322000,
"masked_author": "username_0",
"text": "Updates [org.scala-sbt:sbt](https://github.com/sbt/sbt) from 1.3.13 to 1.4.0.\n\n\nI'll automatically update this PR to resolve conflicts as long as you don't change it yourself.\n\nIf you'd like to skip this version, you can just close this PR. If you have any feedback, just mention me in the comments below.\n\nConfigure Scala Steward for your repository with a [`.username_0.conf`](https://github.com/fthomas/username_0/blob/a8ccd686231bfa50aed806afe54fed5a5889719c/docs/repo-specific-configuration.md) file.\n\nHave a fantastic day writing Scala!\n\n<details>\n<summary>Ignore future updates</summary>\n\nAdd this to your `.username_0.conf` file to ignore future updates of this dependency:\n```\nupdates.ignore = [ { groupId = \"org.scala-sbt\", artifactId = \"sbt\" } ]\n```\n</details>\n\nlabels: library-update, semver-minor",
"title": "Update sbt to 1.4.0",
"type": "issue"
}
] | 2 | 2 | 818 | false | true | 818 | true |
Azure/azure-sdk-for-c | Azure | 676,322,690 | 1,027 | null | [
{
"action": "opened",
"author": "ahsonkhan",
"comment_id": null,
"datetime": 1597083254000,
"masked_author": "username_0",
"text": "From https://github.com/Azure/azure-sdk-for-c/pull/1023#discussion_r468060598\r\n\r\nConsider removing the `number_of_buffers` parameter and changing:\r\n```C\r\nAZ_NODISCARD az_result az_json_reader_chunked_init(\r\n az_json_reader* json_reader,\r\n az_span json_buffers[],\r\n int32_t number_of_buffers,\r\n az_json_reader_options const* options);\r\n```\r\nTo:\r\n```C\r\nAZ_NODISCARD az_result az_json_reader_chunked_init(\r\n az_json_reader* json_reader,\r\n az_span json_buffers[],\r\n az_json_reader_options const* options);\r\n```",
"title": "In json reader, should the span array be terminated with AZ_SPAN_NULL instead of explicit length parameter?",
"type": "issue"
},
{
"action": "created",
"author": "RickWinter",
"comment_id": 671623766,
"datetime": 1597098972000,
"masked_author": "username_1",
"text": "@username_0 @JeffreyRichter If we are going to change this api we need to do it before GA.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 616 | false | false | 616 | true |
microsoft/vscode | microsoft | 556,209,258 | 89,462 | null | [
{
"action": "opened",
"author": "alexr00",
"comment_id": null,
"datetime": 1580218070000,
"masked_author": "username_0",
"text": "Testing https://github.com/microsoft/vscode/issues/89358\r\n\r\n- Reset all machines completely\r\n- Turn on sync (all checkboxes) on Machine 1. Machine 1 has some disabled extensions on it.\r\n- Turn on sync on Machine 2. Machine 2 has some of the same extensions as Machine 1, but on Machine 2 they are enabled.\r\nExpected: Disabled state from Machine 1 is preserved on Machine 1 since it synced first. I'd also expect those same extensions to be disabled now on Machine 2.",
"title": "Syncing a second machine resulted in extension disablement state being lost on first machine",
"type": "issue"
},
{
"action": "closed",
"author": "sandy081",
"comment_id": null,
"datetime": 1580307412000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "alexr00",
"comment_id": 580247149,
"datetime": 1580390136000,
"masked_author": "username_0",
"text": "I see a new behavior that I still find unexpected:\r\n- Reset all machines completely\r\n- Turn on sync (all checkboxes) on Machine 1. Machine 1 has the CodeMetrics extension installed but disabled.\r\n- On Machine 2, make sure that the CodeMetrics extension from Machine 1 is NOT installed.\r\n- Turn on sync on Machine 2. \r\nExpected: Machine 2 now has CodeMetrics installed and disabled\r\nActual: CodeMetrics gets uninstalled on Machine 1.",
"title": null,
"type": "comment"
},
{
"action": "reopened",
"author": "alexr00",
"comment_id": null,
"datetime": 1580390145000,
"masked_author": "username_0",
"text": "Testing https://github.com/microsoft/vscode/issues/89358\r\n\r\n- Reset all machines completely\r\n- Turn on sync (all checkboxes) on Machine 1. Machine 1 has some disabled extensions on it.\r\n- Turn on sync on Machine 2. Machine 2 has some of the same extensions as Machine 1, but on Machine 2 they are enabled.\r\nExpected: Disabled state from Machine 1 is preserved on Machine 1 since it synced first. I'd also expect those same extensions to be disabled now on Machine 2.",
"title": "Syncing a second machine resulted in extension disablement state being lost on first machine",
"type": "issue"
},
{
"action": "created",
"author": "alexr00",
"comment_id": 580248833,
"datetime": 1580390392000,
"masked_author": "username_0",
"text": "And now I can't reproduce it.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "alexr00",
"comment_id": null,
"datetime": 1580390392000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 1,393 | false | false | 1,393 | false |
google/go-querystring | google | 630,943,002 | 37 | {
"number": 37,
"repo": "go-querystring",
"user_login": "google"
} | [
{
"action": "opened",
"author": "juliolustosa",
"comment_id": null,
"datetime": 1591285399000,
"masked_author": "username_0",
"text": "Add new option",
"title": "add dot-delimited",
"type": "issue"
},
{
"action": "created",
"author": "willnorris",
"comment_id": 716921790,
"datetime": 1603762848000,
"masked_author": "username_1",
"text": "Is this actually used somewhere? I don't think I've ever seen dot-delimited values like this.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "willnorris",
"comment_id": 784649591,
"datetime": 1614128296000,
"masked_author": "username_1",
"text": "This can now be done with the custom delimiter option added in faa69f46eca0a09bb9bbcfcf2c4fceed21953c06:\r\n\r\n``` go\r\ntype options struct {\r\n DotDelimited []string `del:\".\"`\r\n}\r\n```",
"title": null,
"type": "comment"
}
] | 2 | 3 | 290 | false | false | 290 | false |
ldn-softdev/jtc | null | 630,452,059 | 25 | null | [
{
"action": "opened",
"author": "soniah",
"comment_id": null,
"datetime": 1591236152000,
"masked_author": "username_0",
"text": "url is `https://github.com/username_1/jtc/releases/download/LatestBuild/jtc-linux-64.latest`, which causes a `.latest` file to be downloaded, rather than a `.zip`",
"title": "url for precompiled binaries is wrong",
"type": "issue"
},
{
"action": "created",
"author": "ldn-softdev",
"comment_id": 638710820,
"datetime": 1591260597000,
"masked_author": "username_1",
"text": "Am I missing something?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "soniah",
"comment_id": 639183420,
"datetime": 1591315766000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ldn-softdev",
"comment_id": 639481705,
"datetime": 1591363875000,
"masked_author": "username_1",
"text": "I see where confusion is. Files `.latest` are the binaries. You just need to rename it and give execution permissions, like this:\r\n```\r\nmv jtc-linux-64.latest jtc\r\nchmod 754 jtc\r\n```\r\n\r\nProbably it would be more descriptive, if I give them `.bin` extensions?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ldn-softdev",
"comment_id": 640120877,
"datetime": 1591479152000,
"masked_author": "username_1",
"text": "I cannot name all 3 files (for MacOs, Linux 64bit, and 32bit) just `jtc` and keep all 3 in the same location (likewise it's impossible to have 3 different files under the same name in one folder).\r\n\r\nInstead, I'll add instructions like I mentioned above (to rename the file and give exec permissions).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ldn-softdev",
"comment_id": null,
"datetime": 1591479153000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 869 | false | false | 869 | true |
hasura/graphql-engine | hasura | 635,931,987 | 5,039 | {
"number": 5039,
"repo": "graphql-engine",
"user_login": "hasura"
} | [
{
"action": "opened",
"author": "kolharsam",
"comment_id": null,
"datetime": 1591765707000,
"masked_author": "username_0",
"text": "### Description\r\n<!-- The title might not be enough to convey how this change affects the user. -->\r\n<!-- Describe the changes from a user's perspective -->\r\nA new set of flags are added to the `migrate` command on the CLI. The `up-sql` and `down-sql` flags can be used to create SQL migrations from the CLI. These flags are to be used in conjunction. Either being absent during execution results in an error.\r\n\r\n### Changelog\r\n\r\n- [x] `CHANGELOG.md` is updated with user-facing content relevant to this PR. If no changelog is required, then add the `no-changelog-required` label.\r\n\r\n### Affected components\r\n- [x] CLI\r\n\r\n### Related Issues\r\nresolves #5026 \r\n\r\n#### Breaking changes\r\n\r\n- [x] No Breaking changes",
"title": "cli: add new flags to generate sql migrations",
"type": "issue"
}
] | 3 | 10 | 2,240 | false | true | 711 | false |
tensorflow/tensorflow | tensorflow | 677,227,679 | 42,246 | null | [
{
"action": "opened",
"author": "DNXie",
"comment_id": null,
"datetime": 1597183456000,
"masked_author": "username_0",
"text": "<em>Please make sure that this is a bug. As per our\r\n[GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md),\r\nwe only address code/doc bugs, performance issues, feature requests and\r\nbuild/installation issues on GitHub. tag:bug_template</em>\r\n\r\n**System information**\r\n- Have I written custom code (as opposed to using a stock example script provided in TensorFlow): No\r\n- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 18.04\r\n- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: N/A\r\n- TensorFlow installed from (source or binary): binary\r\n- TensorFlow version (use command below):2.1.0\r\n- Python version:3.7.6\r\n- Bazel version (if compiling from source):N/A\r\n- GCC/Compiler version (if compiling from source):N/A\r\n- CUDA/cuDNN version:N/A\r\n- GPU model and memory:N/A\r\n\r\n\r\nYou can collect some of this information using our environment capture\r\n[script](https://github.com/tensorflow/tensorflow/tree/master/tools/tf_env_collect.sh)\r\nYou can also obtain the TensorFlow version with:\r\n1. TF 1.0: `python -c \"import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)\"`\r\n2. TF 2.0: `python -c \"import tensorflow as tf; print(tf.version.GIT_VERSION, tf.version.VERSION)\"`\r\n\r\n\r\n**Describe the current behavior**\r\n\r\n`tf.signal.inverse_stft` segfault when `frame_length` is a large value.\r\n\r\n**Describe the expected behavior**\r\nexpect no crashes\r\n\r\n**Standalone code to reproduce the issue**\r\nProvide a reproducible test case that is the bare minimum necessary to generate\r\nthe problem. 
If possible, please share a link to Colab/Jupyter/any notebook.\r\n~~~python\r\nimport tensorflow as tf\r\nstfts = tf.ones((1,1), dtype=tf.complex64)\r\ntf.signal.inverse_stft(stfts=stfts, frame_length=2700000000, frame_step=1)\r\n~~~\r\n\r\n\r\n**Other info / logs** Include any logs or source code that would be helpful to\r\ndiagnose the problem. If including tracebacks, please include the full\r\ntraceback. Large logs and files should be attached.\r\n~~~python\r\nSegmentation fault (core dumped)\r\n~~~",
"title": "tf.signal.inverse_stft segfault when frame_length is a large value",
"type": "issue"
},
{
"action": "created",
"author": "ravikyram",
"comment_id": 672643001,
"datetime": 1597214690000,
"masked_author": "username_1",
"text": "@username_0 \r\n\r\nIf we change frame_length=27000 i am not seeing any issue.Please, find the gist [here](https://colab.research.google.com/gist/username_1/a31bfaa703ddd34548862e7860baf762/untitled242.ipynb).Thanks!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DNXie",
"comment_id": 672859199,
"datetime": 1597237645000,
"masked_author": "username_0",
"text": "@username_1 I agree. But the function shouldn't crash with invalid input.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dthkao",
"comment_id": 695016925,
"datetime": 1600453265000,
"masked_author": "username_2",
"text": "How much memory does the host running the snippet have? I suspect [this pad](https://github.com/tensorflow/tensorflow/blob/9ea2adeb41131b2af704d8ceee96b7c771d220ad/tensorflow/python/ops/signal/spectral_ops.py#L264-L266) may just be running out of memory.\r\n\r\nSeparate from the seg fault, the frame_length of 2700000000 is just large enough to require a tf.int64 dtype, but windows are assumed to have lengths [representable by tf.int32](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/signal/window_ops.py).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DNXie",
"comment_id": 768616930,
"datetime": 1611786037000,
"masked_author": "username_0",
"text": "@username_2 A memory check instead of crash would be better! :)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sushreebarsa",
"comment_id": 850535107,
"datetime": 1622219480000,
"masked_author": "username_3",
"text": "Was able to replicate the issue in TF 2.6.0-dev20210528,please find the gist[ here ](https://colab.research.google.com/gist/username_3/fe623ab6a29572e7ccb9bb2ce208b2ca/untitled70.ipynb#scrollTo=aKxtUiBRx_Ic)..Thanks !",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jvishnuvardhan",
"comment_id": 855183506,
"datetime": 1622868849000,
"masked_author": "username_4",
"text": "@username_0 When i ran with recent `tf-nightly`, the code is not crashing. It is throwing an error as expected. Please check the [gist here](https://colab.research.google.com/gist/username_4/5723b47e5c66e33bc4c82df8d3e68fb9/untitled242.ipynb). Thanks!\r\n\r\nThe following is an error trace.\r\n```\r\n---------------------------------------------------------------------------\r\nInvalidArgumentError Traceback (most recent call last)\r\n<ipython-input-3-495cfd7046bc> in <module>()\r\n 1 import tensorflow as tf\r\n 2 stfts = tf.ones((1,1), dtype=tf.complex64)\r\n----> 3 tf.signal.inverse_stft(stfts=stfts, frame_length=2700000000, frame_step=1)\r\n\r\n5 frames\r\n/usr/local/lib/python3.7/dist-packages/six.py in raise_from(value, from_value)\r\n\r\nInvalidArgumentError: Obtained a FFT shape of 0 elements: [1,0] [Op:IRFFT]\r\n```",
"title": null,
"type": "comment"
}
] | 6 | 10 | 4,199 | false | true | 3,980 | true |
deepset-ai/FARM | deepset-ai | 746,351,834 | 637 | {
"number": 637,
"repo": "FARM",
"user_login": "deepset-ai"
} | [
{
"action": "opened",
"author": "ftesser",
"comment_id": null,
"datetime": 1605773800000,
"masked_author": "username_0",
"text": "This PR is related to #636.\r\nIt will be an incremental PR, since I see that after this first commit, some others changes are needed to successfully run all the tests on Windows.\r\n\r\nThis first commit remove the not used 'from torch.distributed import all_gather' from 'farm\\modeling\\prediction_head.py'. There is also a newline added here.",
"title": "Make tests working on Windws",
"type": "issue"
},
{
"action": "created",
"author": "ftesser",
"comment_id": 730211371,
"datetime": 1605774321000,
"masked_author": "username_0",
"text": "E UnicodeDecodeError: 'charmap' codec can't decode byte 0x8d in position 1012: character maps to <undefined>\r\n\r\nc:\\python\\lib\\encodings\\cp1252.py:23: UnicodeDecodeError\r\n```\r\n\r\nSo the second commit is need to fix that.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ftesser",
"comment_id": 730230203,
"datetime": 1605776554000,
"masked_author": "username_0",
"text": "After the first two commits, there is only one test that fail on Windows: test.test_dpr.test_dpr_modules\r\nThe reason is: \r\n```\r\nE AttributeError: module 'torch.distributed' has no attribute 'get_rank'\r\n\r\n..\\venv\\lib\\site-packages\\farm\\modeling\\prediction_head.py:1644: AttributeError\r\n```\r\nI see that this refer to `logits_to_loss` method on prediction_head.py :\r\n\r\n\r\n```python\r\n# Check if DDP is initialized\r\n try:\r\n rank = torch.distributed.get_rank()\r\n except AssertionError:\r\n rank = -1\r\n```\r\n\r\nAny suggestions to make this working also on windows test? \r\nIf DDP does not work on windows perhaps we can exclude this test when running on Windows?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ftesser",
"comment_id": 734222079,
"datetime": 1606387391000,
"masked_author": "username_0",
"text": "My current proposal is to exclude `test.test_dpr.test_dpr_modules` just when pytest is run on a Windows machine.\r\nIf you think that it is possible somehow to test `dpr_modules` without `DDP` it would probably be better, but for the moment this solution allows you to test almost everything even on windows.\r\n\r\nLet me know what do you think.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Timoeller",
"comment_id": 735883533,
"datetime": 1606752733000,
"masked_author": "username_1",
"text": "Hey @username_0 thanks for looking into this.\r\nGenerally I believe it doesnt make much sense having different code/tests for different distributions. It will be tedious to keep track of all the individual changes.\r\n\r\nThe DDP module is quite important for making DPR trainable so excluding it would not be preferred.\r\n\r\nLooking into Pytorch support for DDP in windows I found this PR: https://github.com/pytorch/pytorch/pull/45335 which was merged on Sep 25th. So this code is only in the newest pytorch 1.7.0 release (we are currently using pytorch 1.6.0).\r\n\r\nSo actionable insights: We want to do a FARM release later this week, after this we can increment the pytorch version and see if this fixes the DDP issues. Would that be a solution for you? Of course you can try updating pytorch beforehand yourself, we would appreciate your insights.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ftesser",
"comment_id": 736336959,
"datetime": 1606814412000,
"masked_author": "username_0",
"text": "Thanks @username_1 for your feedback.\r\nOf course if with pytorch 1.7.0 DDP will works on Windows this is the best solution, so I am ok to wait for that. I will try to updating pytorch beforehand myself.\r\n\r\nJust a note about this PR, commit https://github.com/deepset-ai/FARM/pull/637/commits/0aedd051e4f80dcefd9948cfba7e31ae37e0384c solve a UnicodeDecodeError, this should be applied in all cases.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Timoeller",
"comment_id": 750068832,
"datetime": 1608717817000,
"masked_author": "username_1",
"text": "Sorry for the delay @username_0\r\nWe actually made huge changes in #649 that caused a lot of testing.\r\n\r\nWe also quickly tested updating to pytorch in #660 which produced failing onnx conversion tests. Unfortunately we will be able to look into this with the start of the new year.\r\n\r\nI will merge you PR now to include this nice patch in the coming release.",
"title": null,
"type": "comment"
}
] | 2 | 7 | 3,191 | false | false | 3,191 | true |
onnx/optimizer | onnx | 699,589,849 | 6 | null | [
{
"action": "opened",
"author": "JonTriebenbach",
"comment_id": null,
"datetime": 1599846969000,
"masked_author": "username_0",
"text": "E AssertionError: \r\nE Arrays are not almost equal to 7 decimals\r\nE \r\nE Mismatched elements: 3 / 3 (100%)\r\nE Max absolute difference: 2.7824624e+14\r\nE Max relative difference: 1.0692023e+32\r\nE x: array([-1.3619891, 2.4206262, 2.2576501], dtype=float32)\r\nE y: array([-4.7447069e-14, -2.2639551e-32, 2.7824624e+14], dtype=float32)\r\n\r\noptimizer_test.py:1509: AssertionError\r\n\r\n```\r\n\r\nThe `fuse_bn_into_conv.h` code does not account for potentially running on a big-endian architecture machine when running the optimization routines.\r\n\r\nAttached is a patch file with updated source files with a proposed fix for this optimization issue.\r\n\r\n[onnx-opt.patch.zip](https://github.com/onnx/onnx/files/5157259/onnx-opt.patch.zip)\r\n\r\nA pull request can be submitted if needed.",
"title": "fuse_bn_into_conv optimization fails on big-endian architecture",
"type": "issue"
}
] | 1 | 1 | 849 | false | false | 849 | false |
rizenback000/TogetterSlyr | null | 387,932,090 | 7 | null | [
{
"action": "opened",
"author": "rizenback000",
"comment_id": null,
"datetime": 1544041739000,
"masked_author": "username_0",
"text": "# 症状\r\n反応ツイートが150件以上存在する時、反応ツイートの読み込みが150件から先を読み込めない\r\n\r\n# 原因\r\nseamlessReactStatusでcompleteをするタイミングが反応ツイート精査時だったので、精査中に次の公式ツイートを見つけたらその時点でまだ読み込んでないツイートが残っていても読み込みを停止していた\r\n\r\n# 対応\r\nseamlessReactStatusでcompleteをするタイミングを反応ツイート抽出時に行うように変更",
"title": "反応ツイートが150件以上存在する時、反応ツイートの読み込みが150件から先を読み込めない",
"type": "issue"
},
{
"action": "closed",
"author": "rizenback000",
"comment_id": null,
"datetime": 1544050465000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 231 | false | false | 231 | false |
onnx/onnx | onnx | 657,583,536 | 2,904 | null | [
{
"action": "opened",
"author": "chinhuang007",
"comment_id": null,
"datetime": 1594839870000,
"masked_author": "username_0",
"text": "The following are release automation enhancements to be worked on.\r\n\r\n1. move building Windows wheel files from ONNX repo (Actions) to wheel-builder repo, https://github.com/onnx/wheel-builder\r\n2. in wheel-builder repo, replace non-functional AppVeyor integration with Azure Pipelines for Windows builds\r\n3. in wheel-builder repo, update the supported environment and software versions\r\n4. in onnx-feedstock repo, https://github.com/conda-forge/onnx-feedstock, enable Windows packaging for ONNX conda install\r\n\r\nThis issue is to bring awareness to community and encourage developers to contribute to these enhancements.",
"title": "Release automation enhancements",
"type": "issue"
},
{
"action": "created",
"author": "snnn",
"comment_id": 660341876,
"datetime": 1595020623000,
"masked_author": "username_1",
"text": "I suggest ONNX should prefer to use Github Actions than Azure Pipelines when possible. The tech underneath of these two are the same. But, I think Github Actions is the future. And, by reducing one extra component, it will have less chance to fail. Based on my experience, the comunication channel between Azure DevOps and github isn't that smooth, it can often lose notifications. \r\nAlso, please note that for public projects like this, Azure DevOps only allow you have 10 concurrent running build jobs, you can't even buy more if you hit the limit.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chinhuang007",
"comment_id": 660396019,
"datetime": 1595033435000,
"masked_author": "username_0",
"text": "@username_1 Are you suggesting we continue to use the current way to create Windows wheel files in Github Actions and then manually upload them to TestPypi and Pypi? That could certainly be an option.\r\n\r\nHow about the Linux and OSX builds? Do you think that should also be handled in Github Actions? If so, do you suggest we totally stop the wheel-builder project and do all release builds in Github?\r\n\r\nThanks for your input!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "snnn",
"comment_id": 660729952,
"datetime": 1595202910000,
"masked_author": "username_1",
"text": "Github Action support all these 3 platform, so it won't be an issue. I don't know why there is a separated \"wheel-builder\" repo, I haven't done any ONNX release, so my knowledge is limited. Sorry.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chinhuang007",
"comment_id": 662544214,
"datetime": 1595434257000,
"masked_author": "username_0",
"text": "@username_1 The wheel-builder project is to automate the builds for various python versions and operating systems, and also upload wheel files to TestPypi and Pypi. Please take a look, https://github.com/onnx/wheel-builder.\r\n\r\nBased on your comments, we should be able to do all builds in Github Action. Then the release manager might need to upload the wheel files manually.\r\n\r\nThe next release working group meeting is 10-10:30 am Pacific Time 7/29/2020, https://lists.lfai.foundation/g/onnx-wg-release/viewevent?eventid=866498&calstart=2020-07-29. It will be very helpful if you can attend so we could discuss and identify the ONNX builds option for 1.8.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "askhade",
"comment_id": 854755017,
"datetime": 1622815600000,
"masked_author": "username_2",
"text": "Closing this issue. We have now moved all build pipelines to github actions in onnx repo and updated the relevant library versions.\nWindows builds for conda are also fixed and we have started releasing those packages as well.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "askhade",
"comment_id": null,
"datetime": 1622815624000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 7 | 2,665 | false | false | 2,665 | true |
GafferHQ/gaffer | GafferHQ | 605,440,230 | 3,731 | {
"number": 3731,
"repo": "gaffer",
"user_login": "GafferHQ"
} | [
{
"action": "opened",
"author": "themissingcow",
"comment_id": null,
"datetime": 1587639621000,
"masked_author": "username_0",
"text": "This was inadvertently broken when we refactored the visualisation mechanism.",
"title": "IECoreGLPreview : Handle rendering locations with zero scale transforms",
"type": "issue"
},
{
"action": "created",
"author": "johnhaddon",
"comment_id": 618379613,
"datetime": 1587645672000,
"masked_author": "username_1",
"text": "Thanks Tom, LGTM, but I think you've missed a bit :) This fixes the rendering, but there appears to be an almost identical error still being thrown when the ScaleTool is active - I suspect there's another spot inside the ScaleTool that needs a similar fix.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 333 | false | false | 333 | false |
AY1920S2-CS2103T-F10-1/main | AY1920S2-CS2103T-F10-1 | 582,724,805 | 85 | null | [
{
"action": "opened",
"author": "wardetu",
"comment_id": null,
"datetime": 1584414932000,
"masked_author": "username_0",
"text": "For instance, if user keys in the correct command words, they should see suggestion for the correct format in the result box.",
"title": "Add more handling for erroneous inputs",
"type": "issue"
},
{
"action": "closed",
"author": "duongphammmm",
"comment_id": null,
"datetime": 1586591640000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 125 | false | false | 125 | false |
thaopham1816/note-app | null | 740,148,339 | 24 | null | [
{
"action": "opened",
"author": "thaopham1816",
"comment_id": null,
"datetime": 1605033229000,
"masked_author": "username_0",
"text": "Create a context diagram and level 0 diagram showing the flow of information and processes of the system. Paste the diagrams screenshots in Project Document.",
"title": "Create a context diagram and level 0 diagram",
"type": "issue"
},
{
"action": "closed",
"author": "Wallidortiz",
"comment_id": null,
"datetime": 1605202464000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 157 | false | false | 157 | false |
raiden-network/light-client | raiden-network | 644,462,420 | 1,784 | {
"number": 1784,
"repo": "light-client",
"user_login": "raiden-network"
} | [
{
"action": "opened",
"author": "taleldayekh",
"comment_id": null,
"datetime": 1592990822000,
"masked_author": "username_0",
"text": "",
"title": "Updating READMEs for Alderaan",
"type": "issue"
},
{
"action": "created",
"author": "taleldayekh",
"comment_id": 648708091,
"datetime": 1592990953000,
"masked_author": "username_0",
"text": "@username_1 We have the section `Try Out the Raiden Demo dApp` with instructions on how to use the dApp on testnets. It is short but to the point and works well in the readme. I think we should keep it as it is and complement with a similar one but for mainnet? What do you think?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "taleldayekh",
"comment_id": 648710218,
"datetime": 1592991187000,
"masked_author": "username_0",
"text": "Also, since we are mentioning the dApp and SDK in the main README it makes sense to add a section about the CLI/API also in the main README?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "christianbrb",
"comment_id": 648735312,
"datetime": 1592994212000,
"masked_author": "username_1",
"text": "I would keep this section. Topics for updating the readmes for mainnet can be put here #1785",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "christianbrb",
"comment_id": 648735445,
"datetime": 1592994230000,
"masked_author": "username_1",
"text": "Yes, please :)",
"title": null,
"type": "comment"
}
] | 3 | 6 | 528 | false | true | 528 | true |
yunionio/onecloud | yunionio | 643,075,718 | 6,911 | {
"number": 6911,
"repo": "onecloud",
"user_login": "yunionio"
} | [
{
"action": "opened",
"author": "wanyaoqi",
"comment_id": null,
"datetime": 1592832773000,
"masked_author": "username_0",
"text": "Cherry pick of #6906 on release/2.13.\n\n#6906: disable dhclient",
"title": "Automated cherry pick of #6906: disable dhclient",
"type": "issue"
},
{
"action": "created",
"author": "yousong",
"comment_id": 647600530,
"datetime": 1592840139000,
"masked_author": "username_1",
"text": "/lgtm\r\n/approve",
"title": null,
"type": "comment"
}
] | 4 | 4 | 871 | false | true | 77 | false |
nhsconnect/gpconnect-provider-testing | nhsconnect | 420,545,129 | 194 | null | [
{
"action": "opened",
"author": "rbutterf",
"comment_id": null,
"datetime": 1552488842000,
"masked_author": "username_0",
"text": "change default date range banner message wording",
"title": "Default Date Range Banner Message Wording",
"type": "issue"
},
{
"action": "created",
"author": "Sado1234",
"comment_id": 478002720,
"datetime": 1553867104000,
"masked_author": "username_1",
"text": "Approach plan: \r\nMake text changes in htmlSteps.cs file as per spec to:\r\n<div class=\"date-banner\">\r\n\t<p>All relevant items</p>\r\n</div>\r\n\r\nAdd a test to Html Feature 'should contain the default banner' to check null values in StartDateTime and EndDateTime, existing tests will cover no date values provided.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Sado1234",
"comment_id": 500771124,
"datetime": 1560246670000,
"masked_author": "username_1",
"text": "Change implemented Ticket Closed",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Sado1234",
"comment_id": null,
"datetime": 1560246670000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 386 | false | false | 386 | false |
Jermolene/TiddlyWiki5 | null | 680,467,549 | 4,825 | null | [
{
"action": "opened",
"author": "pagdot",
"comment_id": null,
"datetime": 1597691776000,
"masked_author": "username_0",
"text": "**Describe the bug**\r\nThe markdown highlight plugin for codemirror shows an error and highlighting inside the editor doesn't work\r\n\r\n**To Reproduce**\r\nStatic copy of broken Tiddlywiki: https://gist.github.com/username_0/ce766e48b9da4c97fa4ec85a61a47309\r\n\r\n**Expected behavior**\r\nNo error, \r\n\r\n**Screenshots**\r\n\r\n\r\n**node.js:**\r\n - Browser Chromium, Firefox\r\n - Tiddlywiki Version 5.1.22\r\n\r\n\r\n**Additional context**\r\n\r\ntiddlywiki.info of of broken Tiddlywiki: https://gist.github.com/username_0/ce766e48b9da4c97fa4ec85a61a47309",
"title": "[BUG]",
"type": "issue"
},
{
"action": "created",
"author": "adithya-badidey",
"comment_id": 675599731,
"datetime": 1597770194000,
"masked_author": "username_1",
"text": "Seems like a similar issue to #4821",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "linonetwo",
"comment_id": 683281511,
"datetime": 1598702692000,
"masked_author": "username_2",
"text": "Did you install that xml syntax plugin?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pagdot",
"comment_id": 686284132,
"datetime": 1599114552000,
"masked_author": "username_0",
"text": "Yeah, that fixed it. Maybe it is possible to provide a better error message in the future",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "linonetwo",
"comment_id": 686294428,
"datetime": 1599116115000,
"masked_author": "username_2",
"text": "I agree, you can close this issue and open another issue about this.\r\n\r\nMaybe auto install related plugin is another option.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "pagdot",
"comment_id": null,
"datetime": 1599116997000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 6 | 914 | false | false | 914 | true |
swapagarwal/geeksay | null | 511,364,970 | 196 | {
"number": 196,
"repo": "geeksay",
"user_login": "swapagarwal"
} | [
{
"action": "opened",
"author": "Miguelerja",
"comment_id": null,
"datetime": 1571841129000,
"masked_author": "username_0",
"text": "This PR adds a .gitignore file to the project. So far it only avoids node_modules to go through.\r\n\r\nPD: I am new in development and in Open Source. If there is another way set in place to control these type of files ignore the PR",
"title": "Add .gitignore",
"type": "issue"
},
{
"action": "created",
"author": "swapagarwal",
"comment_id": 545477694,
"datetime": 1571841745000,
"masked_author": "username_1",
"text": "Thanks @username_0! 👍",
"title": null,
"type": "comment"
}
] | 2 | 2 | 250 | false | false | 250 | true |
eitrtechnologies/idem_provider_azurerm | eitrtechnologies | 607,018,398 | 55 | null | [
{
"action": "opened",
"author": "nicholasmhughes",
"comment_id": null,
"datetime": 1587909393000,
"masked_author": "username_0",
"text": "sphinx + readthedocs?",
"title": "Automated docs",
"type": "issue"
},
{
"action": "closed",
"author": "Ajnbro",
"comment_id": null,
"datetime": 1589756037000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 21 | false | false | 21 | false |
heartexlabs/label-studio-addon-pdf | heartexlabs | 742,996,531 | 2 | null | [
{
"action": "opened",
"author": "kskyten",
"comment_id": null,
"datetime": 1605357472000,
"masked_author": "username_0",
"text": "I found this [project](https://github.com/paperai/pdfanno) for annotating pdf files. Hopefully it would be a good starting point for pdf annotation in label studio.",
"title": "PDFanno: an MIT licensed browser-based pdf annotator ",
"type": "issue"
},
{
"action": "created",
"author": "kskyten",
"comment_id": 727201905,
"datetime": 1605357654000,
"masked_author": "username_0",
"text": "It seems like the UI is in a separate repo: https://github.com/paperai/anno-ui.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "smohan123",
"comment_id": 845276844,
"datetime": 1621528741000,
"masked_author": "username_1",
"text": "Do we know if there will be any efforts made on this plugin? I would be happy to contribute, but I am new to LS, so I would defer leadership and decision making to perhaps someone from heartexlabs.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "makseq",
"comment_id": 846467251,
"datetime": 1621719926000,
"masked_author": "username_2",
"text": "@username_0 @username_1 We can't use this plugin inside of LS, because it breaks our internal conceptions and LSF engine. However, this is a good example of what this might look like.\r\n\r\nRight now you can convert pdf => html and do HTML labeling using LS.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "smohan123",
"comment_id": 846600421,
"datetime": 1621792315000,
"masked_author": "username_1",
"text": "Cool, thanks. I think this plugin could work for my use case. Thanks again for creating a great product in LS!",
"title": null,
"type": "comment"
}
] | 3 | 5 | 801 | false | false | 801 | true |
ant-design/ant-design-mobile | ant-design | 545,433,766 | 3,512 | null | [
{
"action": "opened",
"author": "ITZengWei",
"comment_id": null,
"datetime": 1578239291000,
"masked_author": "username_0",
"text": "When using post2rem, the pullToRefresh pull-up load-more stops working on some Android phones; asking for help. It works in the Chrome emulator.",
"title": "pullToRefresh pull-up load-more fails on some Android phones when using post2rem; asking for help",
"type": "issue"
},
{
"action": "created",
"author": "xiaohuoni",
"comment_id": 573533692,
"datetime": 1578899221000,
"masked_author": "username_1",
"text": "Some Android phones? Which ones do you mean? Please provide a reproduction demo.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "liangzhuang327",
"comment_id": 575503901,
"datetime": 1579245092000,
"masked_author": "username_2",
"text": "[See whether my issue applies to you](https://github.com/ant-design/ant-design-mobile/issues/3521)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Web-Delevan",
"comment_id": 731901689,
"datetime": 1606100136000,
"masked_author": "username_3",
"text": "The official pullToRefresh up demo cannot be pulled up in the WeChat browser on Android 9 and 10; Android 8 works, and iOS has no problem.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Web-Delevan",
"comment_id": 731902731,
"datetime": 1606100480000,
"masked_author": "username_3",
"text": "The official pullToRefresh up demo cannot be pulled up in the WeChat browser on Android 9 and 10; Android 8 works, and iOS has no problem.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Web-Delevan",
"comment_id": 731906695,
"datetime": 1606101708000,
"masked_author": "username_3",
"text": "It was caused by the non-X5 WebView kernel; after enabling the X5 kernel there is no problem....",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "awmleer",
"comment_id": 925557957,
"datetime": 1632381119000,
"masked_author": "username_4",
"text": "v2 is no longer maintained; we recommend upgrading to v5.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "awmleer",
"comment_id": null,
"datetime": 1632381120000,
"masked_author": "username_4",
"text": "",
"title": null,
"type": "issue"
}
] | 6 | 9 | 432 | false | true | 326 | false |
angular-ui/bootstrap | angular-ui | 134,004,572 | 5,486 | null | [
{
"action": "opened",
"author": "RobJacobs",
"comment_id": null,
"datetime": 1455635129000,
"masked_author": "username_0",
"text": "@username_1 I'm seeing a behavior I was not expecting as a result of change #5425 and wanted to check if it was intended. The tab content no longer works if not implementing the 'active' state on the uib-tabset and 'index' on the uib-tab directives. Previously, tracking the active tab was not required. Here is a [plunk](http://plnkr.co/edit/lPwmPpFK0sLVGjsJsUge) demoing the issue.",
"title": "Tab: active state",
"type": "issue"
},
{
"action": "created",
"author": "wesleycho",
"comment_id": 184726076,
"datetime": 1455636007000,
"masked_author": "username_1",
"text": "The API changed as a result of 5ec19b643b2271c81e83a7032ccb0ab97ddc6612",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "RobJacobs",
"comment_id": null,
"datetime": 1455639256000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 457 | false | false | 457 | true |
prestosql/presto | prestosql | 457,243,546 | 1,021 | {
"number": 1021,
"repo": "presto",
"user_login": "prestosql"
} | [
{
"action": "opened",
"author": "elonazoulay",
"comment_id": null,
"datetime": 1560828916000,
"masked_author": "username_0",
"text": "This is to avoid an issue where the jdbc client attempts\r\nto insert into unsupported columns which are not present\r\nin the presto jdbc tmp table but are present in the database.",
"title": "Only include selected columns in jdbc insert statement",
"type": "issue"
},
{
"action": "created",
"author": "elonazoulay",
"comment_id": 502933486,
"datetime": 1560829313000,
"masked_author": "username_0",
"text": "Some more context: we had an issue where some column types in our postgres database are not supported and we tried to run an insert from presto. Instead of doing \"SELECT * FROM <tmp_table>\" the jdbc client will select only the columns in the temp table.\r\n\r\nNote: If the unsupported columns have NOT NULL constraints, the insert will fail.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "electrum",
"comment_id": 503244003,
"datetime": 1560880519000,
"masked_author": "username_1",
"text": "Can you add a test for this in `TestPostgreSqlIntegrationSmokeTest`? See `testTableWithNoSupportedColumns` for an example of creating a table with unsupported columns.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "findepi",
"comment_id": 650420079,
"datetime": 1593208394000,
"masked_author": "username_2",
"text": "This seems superseded by https://github.com/prestosql/presto/commit/d36b1053d733cfd6d1c2bf043c7ca321eae93da2.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "martint",
"comment_id": 668841699,
"datetime": 1596577716000,
"masked_author": "username_3",
"text": "This was implemented as part of another PR",
"title": null,
"type": "comment"
}
] | 5 | 6 | 1,367 | false | true | 828 | false |
artipie/docker-adapter | artipie | 639,660,829 | 220 | {
"number": 220,
"repo": "docker-adapter",
"user_login": "artipie"
} | [
{
"action": "opened",
"author": "olegmoz",
"comment_id": null,
"datetime": 1592313742000,
"masked_author": "username_0",
"text": "Part of #170 \r\nReading DigestHeader from headers. Will be used to extract digest of manifest from remote registry response.",
"title": "#170 - Reading DigestHeader from headers",
"type": "issue"
}
] | 2 | 2 | 123 | false | true | 123 | false |
department-of-veterans-affairs/vets-api | department-of-veterans-affairs | 537,059,950 | 3,636 | {
"number": 3636,
"repo": "vets-api",
"user_login": "department-of-veterans-affairs"
} | [
{
"action": "opened",
"author": "annaswims",
"comment_id": null,
"datetime": 1576165682000,
"masked_author": "username_0",
"text": "## Description of change\r\n\r\n\r\n## Testing done\r\n<!-- Please describe testing done to verify the changes. -->\r\n\r\n## Testing planned\r\n<!-- Please describe testing planned. -->",
"title": "upgrade rack-attack gem",
"type": "issue"
},
{
"action": "created",
"author": "annaswims",
"comment_id": 565064069,
"datetime": 1576165754000,
"masked_author": "username_0",
"text": "# Changelog\r\n\r\nAll notable changes to this project will be documented in this file.\r\n\r\n## [6.2.1] - 2019-10-30\r\n\r\n### Fixed\r\n\r\n- Remove unintended side-effects on Rails app initialization order. It was potentially affecting the order of `config/initializers/*` in respect to gems initializers (#457)\r\n\r\n## [6.2.0] - 2019-10-12\r\n\r\n### Added\r\n\r\n- Failsafe on Redis error replies in RedisCacheStoreProxy (#421) (@cristiangreco)\r\n- Rack::Attack middleware is now auto added for Rails 5.1+ apps to simplify gem setup (#431) (@fatkodima)\r\n- You can disable Rack::Attack with `Rack::Attack.enabled = false` (#431) (@fatkodima)\r\n\r\n## [6.1.0] - 2019-07-11\r\n\r\n### Added\r\n\r\n- Provide throttle discriminator in the env `throttle_data`\r\n\r\n## [6.0.0] - 2019-04-17\r\n\r\n### Added\r\n\r\n- `#blocklist` and `#safelist` name argument (the first one) is now optional.\r\n- Added support to subscribe only to specific event types via `ActiveSupport::Notifications`, e.g. subscribe to the\r\n `throttle.rack_attack` or the `blocklist.rack_attack` event.\r\n\r\n### Changed\r\n\r\n- Changed `ActiveSupport::Notifications` event naming to comply with the recommended format.\r\n- Changed `ActiveSupport::Notifications` event so that the 5th yielded argument to the `#subscribe` method is now a\r\n `Hash` instead of a `Rack::Attack::Request`, to comply with `ActiveSupport`s spec. The original request object is\r\n still accessible, being the value of the hash's `:request` key.\r\n\r\n### Deprecated\r\n\r\n- Subscriptions via `ActiveSupport::Notifications` to the `\"rack.attack\"` event will continue to work (receive event\r\n notifications), but it is going to be removed in a future version. Replace the event name with `/rack_attack/` to\r\n continue to be subscribed to all events, or `\"throttle.rack_attack\"` e.g. for specific type of events only.\r\n\r\n### Removed\r\n\r\n- Removed support for ruby 2.2.\r\n- Removed support for obsolete memcache-client as a cache store.\r\n- Removed deprecated methods `#blacklist` and `#whitelist` (use `#blocklist` and `#safelist` instead).\r\n\r\n## [5.4.2] - 2018-10-30\r\n\r\n### Fixed\r\n\r\n- Fix unexpected error when using `redis` 3 and any store which is not proxied\r\n\r\n### Changed\r\n\r\n- Provide better information in `MisconfiguredStoreError` exception message to aid end-user debugging\r\n\r\n## [5.4.1] - 2018-09-29\r\n\r\n### Fixed\r\n\r\n- Make [`ActiveSupport::Cache::MemCacheStore`](http://api.rubyonrails.org/classes/ActiveSupport/Cache/MemCacheStore.html) also work as excepted when initialized with pool options (e.g. `pool_size`). Thank you @jdelStrother.\r\n\r\n## [5.4.0] - 2018-07-02\r\n\r\n### Added\r\n\r\n- Support \"plain\" `Redis` as a cache store backend ([#280](https://github.com/kickstarter/rack-attack/pull/280)). Thanks @bfad and @ryandv.\r\n- When overwriting `Rack::Attack.throttled_response` you can now access the exact epoch integer that was used for caching\r\nso your custom code is less prone to race conditions ([#282](https://github.com/kickstarter/rack-attack/pull/282)). Thanks @doliveirakn.\r\n\r\n### Dependency changes\r\n\r\n- Explictly declare ancient `rack 0.x` series as incompatible in gemspec\r\n\r\n## [5.3.2] - 2018-06-25\r\n\r\n### Fixed\r\n\r\n- Don't raise exception `The Redis cache store requires the redis gem` when using [`ActiveSupport::Cache::MemoryStore`](http://api.rubyonrails.org/classes/ActiveSupport/Cache/MemoryStore.html) as a cache store backend\r\n\r\n## [5.3.1] - 2018-06-20\r\n\r\n### Fixed\r\n\r\n- Make [`ActiveSupport::Cache::RedisCacheStore`](http://api.rubyonrails.org/classes/ActiveSupport/Cache/RedisCacheStore.html) also work as excepted when initialized with pool options (e.g. `pool_size`)\r\n\r\n## [5.3.0] - 2018-06-19\r\n\r\n### Added\r\n\r\n- Add support for [`ActiveSupport::Cache::RedisCacheStore`](http://api.rubyonrails.org/classes/ActiveSupport/Cache/RedisCacheStore.html) as a store backend ([#340](https://github.com/kickstarter/rack-attack/pull/340) and [#350](https://github.com/kickstarter/rack-attack/pull/350))\r\n\r\n## [5.2.0] - 2018-03-29\r\n\r\n### Added\r\n\r\n- Shorthand for blocking an IP address `Rack::Attack.blocklist_ip(\"1.2.3.4\")` ([#320](https://github.com/kickstarter/rack-attack/pull/320))\r\n- Shorthand for blocking an IP subnet `Rack::Attack.blocklist_ip(\"1.2.0.0/16\")` ([#320](https://github.com/kickstarter/rack-attack/pull/320))\r\n- Shorthand for safelisting an IP address `Rack::Attack.safelist_ip(\"5.6.7.8\")` ([#320](https://github.com/kickstarter/rack-attack/pull/320))\r\n- Shorthand for safelisting an IP subnet `Rack::Attack.safelist_ip(\"5.6.0.0/16\")` ([#320](https://github.com/kickstarter/rack-attack/pull/320))\r\n- Throw helpful error message when using `allow2ban` but cache store is misconfigured ([#315](https://github.com/kickstarter/rack-attack/issues/315))\r\n- Throw helpful error message when using `fail2ban` but cache store is misconfigured ([#315](https://github.com/kickstarter/rack-attack/issues/315))\r\n\r\n## [5.1.0] - 2018-03-10\r\n\r\n - Fixes edge case bug when using ruby 2.5.0 and redis [#253](https://github.com/kickstarter/rack-attack/issues/253) ([#271](https://github.com/kickstarter/rack-attack/issues/271))\r\n - Throws errors with better semantics when missing or misconfigured store caches to aid in developers debugging their configs ([#274](https://github.com/kickstarter/rack-attack/issues/274))\r\n - Removed legacy code that was originally intended for Rails 3 apps ([#264](https://github.com/kickstarter/rack-attack/issues/264))\r\n\r\n## [5.0.1] - 2016-08-11\r\n\r\n - Fixes arguments passed to deprecated internal methods. ([#198](https://github.com/kickstarter/rack-attack/issues/198))",
"title": null,
"type": "comment"
}
] | 1 | 2 | 5,688 | false | false | 5,688 | false |
MicrosoftDocs/azure-docs | MicrosoftDocs | 648,109,206 | 58,133 | {
"number": 58133,
"repo": "azure-docs",
"user_login": "MicrosoftDocs"
} | [
{
"action": "opened",
"author": "axelgMS",
"comment_id": null,
"datetime": 1593517813000,
"masked_author": "username_0",
"text": "CRI 194235863 - sysprep needs bitlocker to be turned off before it can proceed to generalize the OS.\r\n\r\n\r\n# for hex 0x80310039 / decimal -2144272327\r\n FVE_E_NOT_DECRYPTED winerror.h\r\n# The drive must be fully decrypted to complete this operation.\r\n\r\n\r\nSysprep has an explicitly written method in bdesysprep.dll to precheck whether bitlocker is turned on and halt the sysprep if encryption is enabled.",
"title": "Bitlocker encryption must be disabled before sysprep",
"type": "issue"
},
{
"action": "created",
"author": "PRMerger18",
"comment_id": 651743457,
"datetime": 1593517830000,
"masked_author": "username_1",
"text": "@username_0 : Thanks for your contribution! The author(s) have been notified to review your proposed change.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cynthn",
"comment_id": 652128289,
"datetime": 1593565839000,
"masked_author": "username_2",
"text": "Thanks @username_0!\r\n\r\n#sign-off",
"title": null,
"type": "comment"
}
] | 3 | 3 | 571 | false | false | 571 | true |
CS2113-AY1819S1-W13-2/main | CS2113-AY1819S1-W13-2 | 379,543,856 | 145 | {
"number": 145,
"repo": "main",
"user_login": "CS2113-AY1819S1-W13-2"
} | [
{
"action": "opened",
"author": "TTTaus",
"comment_id": null,
"datetime": 1541957294000,
"masked_author": "username_0",
"text": "Fixed some eng and tags",
"title": "Update UG",
"type": "issue"
},
{
"action": "created",
"author": "shuanang",
"comment_id": 437690679,
"datetime": 1541959061000,
"masked_author": "username_1",
"text": "Uh oh, conflict. Will merge and resolve",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TTTaus",
"comment_id": 437690718,
"datetime": 1541959088000,
"masked_author": "username_0",
"text": "Dont merge first",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "shuanang",
"comment_id": 437690884,
"datetime": 1541959249000,
"masked_author": "username_1",
"text": "Oops! Open a new PR and let me know when you're merging. Need to resolve conflicts and add new sections..",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TTTaus",
"comment_id": 437690923,
"datetime": 1541959289000,
"masked_author": "username_0",
"text": "Wokay. Cuz I broke smth and just fixed",
"title": null,
"type": "comment"
}
] | 2 | 5 | 221 | false | false | 221 | false |
react-native-mapbox-gl/maps | react-native-mapbox-gl | 627,898,014 | 898 | null | [
{
"action": "opened",
"author": "mahipalsingh7",
"comment_id": null,
"datetime": 1590903887000,
"masked_author": "username_0",
"text": "I'm getting the error **Property 'MarkerView' does not exist on type 'typeof MapboxGL'.ts(2339)** in my react-native project; I'm using TypeScript.\r\nI have also tried `npm install @types/react-native-mapbox-gl/maps` but I think the lib does not support TypeScript. Please help, I'm stuck because of MarkerView\r\n\r\nreact-native: 0.62.2, \r\n\"@react-native-mapbox-gl/maps\": \"^8.1.0-beta.1\",
"title": "MapboxGL.MarkerView is not available in Typescript : Property 'MarkerView' does not exist on type 'typeof MapboxGL'.ts(2339)",
"type": "issue"
},
{
"action": "created",
"author": "kamiranoff",
"comment_id": 636783413,
"datetime": 1591008925000,
"masked_author": "username_1",
"text": "you can still use MarkerView. Just the types are missing.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mfazekas",
"comment_id": 638355368,
"datetime": 1591206450000,
"masked_author": "username_2",
"text": "Should be fixed by #849",
"title": null,
"type": "comment"
}
] | 3 | 3 | 449 | false | false | 449 | false |
nuxt-community/axios-module | nuxt-community | 690,735,444 | 415 | null | [
{
"action": "opened",
"author": "webcoderkz",
"comment_id": null,
"datetime": 1599025571000,
"masked_author": "username_0",
"text": "Doc: https://axios.nuxtjs.org/extend/\r\n\r\n```js\r\nimport https from 'https'\r\n// import { NuxtAxiosInstance } from '@nuxtjs/axios'\r\n\r\nexport default function ({ $axios }: any) {\r\n $axios.defaults.httpsAgent = new https.Agent({ rejectUnauthorized: false })\r\n}\r\n\r\n```\r\n\r\nI've tried to use `NuxtAxiosInstance` but it won't work.",
"title": "How to type $axios when extending it?",
"type": "issue"
},
{
"action": "created",
"author": "pi0",
"comment_id": 685529521,
"datetime": 1599040487000,
"masked_author": "username_1",
"text": "Hi @username_0 would you please create a reproduction repository to check?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "webcoderkz",
"comment_id": 685532241,
"datetime": 1599040607000,
"masked_author": "username_0",
"text": "It's just an axios.ts file inside \"@/plugins/axios.ts\":\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pi0",
"comment_id": 685551072,
"datetime": 1599041397000,
"masked_author": "username_1",
"text": "I see, but in order to check, without having a repository I have to create one myself to ensure the answer is correct, and that takes a lot of time per day if I have to do it for each issue. So, as a blind answer, I think we can use the `Context` type:\r\n\r\n```ts\r\nimport https from 'https'\r\nimport { Context } from '@nuxt/types'\r\n\r\nexport default function ({ $axios }: Context) {\r\n $axios.defaults.httpsAgent = new https.Agent({ rejectUnauthorized: false })\r\n}\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "webcoderkz",
"comment_id": 685557363,
"datetime": 1599041657000,
"masked_author": "username_0",
"text": "Context works, thanks! Ok, i see now, sorry for inconvenience.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "webcoderkz",
"comment_id": null,
"datetime": 1599041657000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "Mini-ghost",
"comment_id": 720983076,
"datetime": 1604393353000,
"masked_author": "username_2",
"text": "I have the same problem when I use `context.$axios` in plugins. I import the `Plugin` type from `@nuxt/types`, but that does not help\r\n\r\n",
"title": null,
"type": "comment"
}
] | 3 | 7 | 1,315 | false | false | 1,315 | true |
spotify/backstage | spotify | 602,001,233 | 580 | {
"number": 580,
"repo": "backstage",
"user_login": "spotify"
} | [
{
"action": "opened",
"author": "katz95",
"comment_id": null,
"datetime": 1587132330000,
"masked_author": "username_0",
"text": "## Hey, I just made a Pull Request!\r\n\r\nAdding a header image for [Contributing Doc.](https://github.com/spotify/backstage/blob/master/CONTRIBUTING.md) \r\n\r\n<!-- Please describe what you added, and add a screenshot if possible.\r\n That makes it easier to understand the change so we can :shipit: faster. -->\r\n\r\n#### :heavy_check_mark: Checklist\r\n<!--- Put an `x` in all the boxes that apply: -->\r\n- [ ] All tests are passing `yarn test`\r\n- [ ] Screenshots attached (for UI changes)\r\n- [ ] Relevant documentation updated\r\n- [ ] Prettier run on changed files\r\n- [ ] Tests added for new functionality\r\n- [ ] Regression tests added for bug fixes",
"title": "Contributor Header",
"type": "issue"
},
{
"action": "created",
"author": "stefanalund",
"comment_id": 615268482,
"datetime": 1587132892000,
"masked_author": "username_1",
"text": "I was thinking something more \"warm and welcoming\", what do you think?",
"title": null,
"type": "comment"
}
] | 2 | 2 | 712 | false | false | 712 | false |
NosWings/bug-reports | NosWings | 711,552,392 | 588 | null | [
{
"action": "opened",
"author": "genkai7",
"comment_id": null,
"datetime": 1601426357000,
"masked_author": "username_0",
"text": "## **Title of bug**\r\nPartner will force the skillbar to reset.\r\n## **To Reproduce**\r\n**Where**: Miniland\r\n**When**: Upon pressing Stay/Send back if your partner is already in your party, Company if your partner was in the miniland.\r\n**Description**: Partner will force the skillbar to reset in or out of SP. The solution is relogging. You get all your skills in the order you had them after relogging. (Doesn't happen with pet, only partner)\r\n**Actions**: Clicking/selecting your partner\r\n**1.** Pressing Stay/Send back if your partner is already in your party\r\n**2.** Pressing Company if your partner was already in the miniland\r\n**3.**\r\n\r\n**Expected behavior**\r\nThe skillbar should stay intact after interacting with your partner.\r\n\r\n## GIF\r\nhttps://gyazo.com/16ddaa6f9862fd70559f383b2f294533",
"title": "[BUG] Taking/leaving partner from/in miniland resets everything in skillbar [Miniland]",
"type": "issue"
},
{
"action": "closed",
"author": "imquarry",
"comment_id": null,
"datetime": 1601567396000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 794 | false | false | 794 | false |
HospitalRun/hospitalrun-frontend | HospitalRun | 729,022,031 | 2,448 | {
"number": 2448,
"repo": "hospitalrun-frontend",
"user_login": "HospitalRun"
} | [
{
"action": "opened",
"author": "dastgirp",
"comment_id": null,
"datetime": 1603632817000,
"masked_author": "username_0",
"text": "Fixes #2443\r\n\r\n**Changes proposed in this pull request:**\r\n\r\n- Using usePatientAllergies to refresh allergy data as new allergy is added.\r\n\r\n**Newly added dependencies with [Bundlephobia](https://bundlephobia.com/) links:**\r\nNone\r\n\r\n\r\n_Note: pull requests without proper descriptions may simply be closed without further discussion. We appreciate your contributions, but need to know what you are offering in clearly described format. Provide tests for all code that you add/modify. If you add/modify any components update the storybook. Thanks! (you can delete this text)_",
"title": "fix(allergy): allergy data now gets refreshed",
"type": "issue"
}
] | 3 | 5 | 1,886 | false | true | 573 | false |
AbdulRahmanAlHamali/flutter_typeahead | null | 438,208,156 | 93 | {
"number": 93,
"repo": "flutter_typeahead",
"user_login": "AbdulRahmanAlHamali"
} | [
{
"action": "opened",
"author": "MhmdAlmz",
"comment_id": null,
"datetime": 1556527464000,
"masked_author": "username_0",
"text": "There is no option to change the text shown when no item is found.",
"title": "Change text of not found item",
"type": "issue"
},
{
"action": "created",
"author": "KaYBlitZ",
"comment_id": 487719209,
"datetime": 1556567530000,
"masked_author": "username_1",
"text": "Thank you for the PR, but I cannot merge this. There are too many changes (specifically formatting) that do not pertain to the actual change in question. Can you just make the changes necessary without changing all the formatting?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "KaYBlitZ",
"comment_id": 487721491,
"datetime": 1556567950000,
"masked_author": "username_1",
"text": "Actually, I think the preferred way would be to create your own widget using the `noItemsFoundBuilder` property. @AbdulRahmanAlHamali @sjmcdowall What do you guys think?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "KaYBlitZ",
"comment_id": 493786450,
"datetime": 1558295560000,
"masked_author": "username_1",
"text": "Closing since `noItemsFoundBuilder` seems like the official way of modifying the no items result.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 555 | false | false | 555 | false |
ChenyuHeidiZhang/step | null | 625,362,939 | 2 | {
"number": 2,
"repo": "step",
"user_login": "ChenyuHeidiZhang"
} | [
{
"action": "opened",
"author": "ChenyuHeidiZhang",
"comment_id": null,
"datetime": 1590552991000,
"masked_author": "username_0",
"text": "This is, I think, an almost complete portfolio website : ) I had another version from last week on the portfolio-week1 branch, but I re-designed the layout today and here it is. I enjoyed making this thing lol. Attached are some screenshots of the website running.\r\n\r\n\r\n\r\n\r\n\r\nSeems like there are a lot of changed files, but really 35 out of 41 are just images.",
"title": "First version of portfolio with three page",
"type": "issue"
},
{
"action": "created",
"author": "hayleejane3",
"comment_id": 637301165,
"datetime": 1591078293000,
"masked_author": "username_1",
"text": "Did you still want me to look at the changes on this pull request?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ChenyuHeidiZhang",
"comment_id": 637680647,
"datetime": 1591117055000,
"masked_author": "username_0",
"text": "I'm not entirely sure how it works, but it's fine if you just approve it : )",
"title": null,
"type": "comment"
}
] | 2 | 3 | 962 | false | false | 962 | false |
dotnet/runtime | dotnet | 717,047,020 | 43,159 | {
"number": 43159,
"repo": "runtime",
"user_login": "dotnet"
} | [
{
"action": "created",
"author": "Anipik",
"comment_id": 706310889,
"datetime": 1602264750000,
"masked_author": "username_0",
"text": "The failures are known and we have a tracking issue for them",
"title": null,
"type": "comment"
}
] | 3 | 3 | 289 | false | true | 60 | false |
osy86/HaC-Mini | null | 732,986,747 | 353 | null | [
{
"action": "opened",
"author": "helmehraz",
"comment_id": null,
"datetime": 1604047136000,
"masked_author": "username_0",
"text": "**As mentioned in the title, this is not a bug report but a kind of discussion thread.**\r\n\r\nThere are some similarities between the following models [nuc8i7hvk & nuc9i9qnx]! Has anyone ever tried to implement the work described here on a nuc9i9qnx NUC? Do you think this can be done with some minor tweaks?\r\n\r\nThanks",
"title": "[Discussion & NOT Bug Report] Using nuc9i9qnx instead of nuc8i7hvk - for HaC Mini",
"type": "issue"
},
{
"action": "created",
"author": "RogerChern",
"comment_id": 719455928,
"datetime": 1604051639000,
"masked_author": "username_1",
"text": "I don't see why this is a BUG? \r\nIf you think some minor tweaks could work, you may have a try yourself.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "helmehraz",
"comment_id": 719469964,
"datetime": 1604053398000,
"masked_author": "username_0",
"text": "@username_1 My bad, the label was a mistake! I would like to change it to \"discussion\" but I do not have the necessary rights to do it!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ConfolidelBiins",
"comment_id": 719923392,
"datetime": 1604145065000,
"masked_author": "username_2",
"text": "do they use frtgfryghjvfr56? wtf is that shit",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jarryji",
"comment_id": 732536440,
"datetime": 1606183041000,
"masked_author": "username_3",
"text": "I managed to install Catalina 10.15.7 on my NUC9i7QNX with a minor modification of one guy's EFI made for the nuc9vxqnx, thanks to him, even though it's not free.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "helmehraz",
"comment_id": 733113901,
"datetime": 1606237611000,
"masked_author": "username_0",
"text": "@username_3 Good to know :) and everything is working even the two front usb ports and hotplug thunderbolt ?\r\n\r\non my side I also managed to install Catalina, it works quite well! I tested a number of things and others have not yet test them. **here is my current status :** \r\n\r\n**Working Hardware:** \r\n- [ ] Ethernet (both ports [I219LM & I210AT])\r\n- [ ] HDMI/DP audio\r\n- [ ] USB A ports (blue one) [Rear]\r\n- [ ] SD card slot\r\n- [ ] NVMe/SATA SSD\r\n- [ ] Sleep/Resume\r\n- [ ] Sleep/Resume\r\n- [ ] BT (with the built-in card)\r\n\r\n**Hardware Not yet tested :** \r\n- [ ] Analog Audio\r\n- [ ] Microphone\r\n- [ ] USB C ports\r\n- [ ] Thunderbolt 3\r\n\r\n**Hardware Not yet tested :** \r\n- [ ] USB A ports (black one) [front]\r\n- [ ] Microphone\r\n- [ ] Wifi 6 (using built-in card [AX200])\r\n- [ ] Thunderbolt 3\r\n\r\nI'm also planning to add this module [BCM94360NG] to make these functionalities working [Handoff, Continuity, Universal Clipboard, Apple Watch unlock (using Apple Wifi card)]",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jarryji",
"comment_id": 733805497,
"datetime": 1606320980000,
"masked_author": "username_3",
"text": "USB type A both front and back works fine, but type-C doesn't work (quick charging works). \r\nThunderbolt not tested, probably not working.\r\nFront earphone jack works when an earphone is plugged in.\r\nMicrophone doesn't work.\r\nBT works.\r\nUHD 630 3D game acceleration frame rate is only 20 fps while playing CSGO.\r\n\r\nBTW: for the Intel AX200 there is a third-party driver available. I haven't tested it yet.\r\nI have attached my EFI, see if it can help you.\r\n[Uploading EFI.zip…]",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "helmehraz",
"comment_id": 734824452,
"datetime": 1606482051000,
"masked_author": "username_0",
"text": "@username_3 thank u very much dude ! tonight i will have a look on your EFI",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "helmehraz",
"comment_id": 737497762,
"datetime": 1606943478000,
"masked_author": "username_0",
"text": "@username_3 I gave your EFI a try, but I couldn't get it to work ... I couldn't even boot :(\r\n\r\nI haven't spent a lot of time setting up my HackMini, but just enough to get all the USB ports working using this [guide](https://dortania.github.io/OpenCore-Post-Install/usb/intel-mapping/intel.html).\r\n\r\nI also took a look at osy86's articles to understand how to patch the thunderbolt firmware ... but I still have a few questions about that.\r\n\r\nOnce my HackMini is finished I plan to share my EFI with the community :)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "helmehraz",
"comment_id": 737509602,
"datetime": 1606944842000,
"masked_author": "username_0",
"text": "Hi @username_4, I'm sorry to bother you with this question, but I wanted to know if I understood everything correctly after reading this [page](https://github.com/username_4/ThunderboltReset/blob/master/PatchingNVM.md)! Indeed, my question revolves around thunderbolt firmware patching.\r\n\r\nI tried to do the exercise using the attached binary files, but unfortunately I found more differences than you did when comparing the two binary files ... and I wonder where I could have gone wrong.\r\n\r\nPlease notice that the Apple firmware v33 was found inside a USBCUpdater coming from a 10.13.x version instead of 10.14.x\r\n\r\n- [Intel_Hades_AR_HR_4C_C0_rev33_W_TI_20180109_SEC1_sign.bin.zip](https://github.com/username_4/HaC-Mini/files/5632103/Intel_Hades_AR_HR_4C_C0_rev33_W_TI_20180109_SEC1_sign.bin.zip)\r\n\r\n- [Mac-BE088AF8C5EB4FA2-C0_5.56.0-C0_33.1.bin.zip](https://github.com/username_4/HaC-Mini/files/5632104/Mac-BE088AF8C5EB4FA2-C0_5.56.0-C0_33.1.bin.zip)\r\n\r\nHere is the list of diffs:\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "osy86",
"comment_id": 737531334,
"datetime": 1606947718000,
"masked_author": "username_4",
"text": "It's been a while so I don't completely recall. But I think I started with all the changes then one by one I reverted it and tested to see if anything changed (everything still works) then ended up with a minimal set of changes that actually affected functionality.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "helmehraz",
"comment_id": 737544795,
"datetime": 1606949620000,
"masked_author": "username_0",
"text": "Thanks @username_4 👍🏻",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jarryji",
"comment_id": 737795886,
"datetime": 1606988743000,
"masked_author": "username_3",
"text": "I am sure that this EFI works with my nuc9i7qnx and I have confirmed that USB type C works well on my NUC9",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "helmehraz",
"comment_id": 738040544,
"datetime": 1607006809000,
"masked_author": "username_0",
"text": "I have no doubts about it ... I'm just trying to understand why your EFI as is, is not working in my case! and I think (perhaps) I found it by reading your answer twice ;)\r\n\r\nyour nuc model is i7 processor (nuc9i7qnx) and mine is an i9 (nuc9i7qnx) !\r\n\r\nLuck u :) My thunderbolt are not working yet :(",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "helmehraz",
"comment_id": 749559831,
"datetime": 1608646439000,
"masked_author": "username_0",
"text": "Hi @username_3 please could you try this EFI and tell me if it solve your microphone, builtin wifi and CSGO hardware acceleration issues ?\r\n\r\nThanks.\r\n\r\n[Intel_Quartz_Canyon_EFI.zip](https://github.com/username_5/HaC-Mini/files/5730256/Intel_Quartz_Canyon_EFI.zip)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jarryji",
"comment_id": 750347161,
"datetime": 1608736931000,
"masked_author": "username_3",
"text": "No, it's not work on my nuc9i7qnx.\r\nthank you anyways.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "osy",
"comment_id": null,
"datetime": 1617256950000,
"masked_author": "username_5",
"text": "",
"title": null,
"type": "issue"
}
] | 6 | 17 | 4,816 | false | false | 4,816 | true |
okfn/website | okfn | 711,881,611 | 447 | {
"number": 447,
"repo": "website",
"user_login": "okfn"
} | [
{
"action": "opened",
"author": "pranjalshahi",
"comment_id": null,
"datetime": 1601466883000,
"masked_author": "username_0",
"text": "",
"title": "Improved_Docs",
"type": "issue"
}
] | 2 | 2 | 288 | false | true | 0 | false |
codex-team/editor.js | codex-team | 714,017,181 | 1,340 | {
"number": 1340,
"repo": "editor.js",
"user_login": "codex-team"
} | [
{
"action": "opened",
"author": "hata6502",
"comment_id": null,
"datetime": 1601705982000,
"masked_author": "username_0",
"text": "https://github.com/codex-team/editor.js/issues/1339",
"title": "Bugfix/fix modification observer disable",
"type": "issue"
},
{
"action": "created",
"author": "hata6502",
"comment_id": 703946281,
"datetime": 1601941295000,
"masked_author": "username_0",
"text": "This is a critical fix when Editor.js saves the block data in onChange callback. \r\nWould you merge it in v2.19.0? 🙏\r\n\r\n```\r\nonChange: async () => {\r\n // Block data saving is unstable because save method may throws an exception. \r\n const data = await editorJS.save();\r\n\r\n localStorage.setItem('block-data', data);\r\n}\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "neSpecc",
"comment_id": 704412574,
"datetime": 1602003310000,
"masked_author": "username_1",
"text": "@username_0 please, resolve conflicts",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "neSpecc",
"comment_id": 704469576,
"datetime": 1602009171000,
"masked_author": "username_1",
"text": "Can you open pushes for us?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hata6502",
"comment_id": 704520419,
"datetime": 1602014406000,
"masked_author": "username_0",
"text": "Conflict resolved. \r\n\r\nI use GitHub Organization, so I can't allow edits from maintainers. \r\nhttps://qiita.com/ryysud/items/4d31dd86efab34a7fbaf#github-%E3%82%B5%E3%83%9D%E3%83%BC%E3%83%88%E3%81%8B%E3%82%89%E3%81%AE%E5%9B%9E%E7%AD%94",
"title": null,
"type": "comment"
}
] | 2 | 5 | 669 | false | false | 669 | true |
gemmamckay/github-slideshow | null | 705,605,297 | 3 | {
"number": 3,
"repo": "github-slideshow",
"user_login": "gemmamckay"
} | [
{
"action": "opened",
"author": "gemmamckay",
"comment_id": null,
"datetime": 1600695398000,
"masked_author": "username_0",
"text": "Created branch, created a file, made a commit and opened a pull request",
"title": "Add gemmamckay's file",
"type": "issue"
}
] | 2 | 2 | 1,839 | false | true | 71 | false |
webmachinelearning/webnn-polyfill | webmachinelearning | 704,060,733 | 11 | null | [
{
"action": "opened",
"author": "huningxin",
"comment_id": null,
"datetime": 1600400947000,
"masked_author": "username_0",
"text": "Spec: https://webmachinelearning.github.io/webnn/#api-neuralnetworkcontext-unary",
"title": "Implement unary ops",
"type": "issue"
},
{
"action": "created",
"author": "huningxin",
"comment_id": 720866198,
"datetime": 1604369047000,
"masked_author": "username_0",
"text": "#25 implemented `sigmoid`, `tanh` `sqrt` and `exp`. \r\n\r\nThere are missing `abs`, `ceil`, `cos`, `floor`, `log`, `neg`, `sin` and `tan`.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "huningxin",
"comment_id": null,
"datetime": 1631599824000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 215 | false | false | 215 | false |
aerni/statamic-imagekit | null | 684,223,035 | 1 | {
"number": 1,
"repo": "statamic-imagekit",
"user_login": "aerni"
} | [
{
"action": "opened",
"author": "pryley",
"comment_id": null,
"datetime": 1598203580000,
"masked_author": "username_0",
"text": "Was this a breaking change in Statamic 3 final?",
"title": "Fix called property name ($this->params)",
"type": "issue"
},
{
"action": "created",
"author": "pryley",
"comment_id": 678801819,
"datetime": 1598203820000,
"masked_author": "username_0",
"text": "see: https://github.com/statamic/cms/pull/2167",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "aerni",
"comment_id": 678964612,
"datetime": 1598254914000,
"masked_author": "username_1",
"text": "Thanks for this!",
"title": null,
"type": "comment"
}
] | 2 | 3 | 109 | false | false | 109 | false |
EuroPython/epcon | EuroPython | 445,705,740 | 968 | {
"number": 968,
"repo": "epcon",
"user_login": "EuroPython"
} | [
{
"action": "opened",
"author": "patrick91",
"comment_id": null,
"datetime": 1558178053000,
"masked_author": "username_0",
"text": "TODO:\r\n\r\n- [ ] Mobile list view\r\n- [x] Header to select the day\r\n- [x] Filter by day\r\n- [ ] Restore ICS views\r\n- [ ] My Schedule view\r\n- [ ] Restore list view\r\n- [ ] Voting/Booking\r\n\r\nThis is based on what we did for PyCon Italia :)",
"title": "Add new schedule layout",
"type": "issue"
},
{
"action": "created",
"author": "umgelurgel",
"comment_id": 504508623,
"datetime": 1561138485000,
"masked_author": "username_1",
"text": "I'll update the url and handle moving the files to `conference` before merging.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 311 | false | false | 311 | false |
Azure/azure-quickstart-templates | Azure | 562,769,408 | 7,169 | {
"number": 7169,
"repo": "azure-quickstart-templates",
"user_login": "Azure"
} | [
{
"action": "opened",
"author": "ssripadham",
"comment_id": null,
"datetime": 1581362075000,
"masked_author": "username_0",
"text": "### Best Practice Checklist\r\nCheck these items before submitting a PR... See the Contribution Guide for the full detail: https://github.com/Azure/azure-quickstart-templates/blob/master/1-CONTRIBUTION-GUIDE/README.md \r\n\r\n1. uri's compatible with all clouds (Stack, China, Government)\r\n1. Staged artifacts use _artifactsLocation & _artifactsLocationSasToken\r\n1. Use a parameter for resource locations with the defaultValue set to resourceGroup().location\r\n1. Folder names for artifacts (nestedtemplates, scripts, DSC)\r\n1. Use literal values for apiVersion (no variables)\r\n1. Parameter files (GEN-UNIQUE for value generation and no \"changemeplease\" values)\r\n1. $schema and other uris use https\r\n1. Use uniqueString() whenever possible to generate names for resources. While this is not required, it's one of the most common failure points in a deployment. \r\n1. Update the metadata.json with the current date\r\n\r\nFor details: https://github.com/Azure/azure-quickstart-templates/blob/master/1-CONTRIBUTION-GUIDE/best-practices.md\r\n\r\n- [ ] - Please check this box once you've submitted the PR if you've read through the Contribution Guide and best practices checklist.\r\n\r\n### Changelog\r\n\r\n*\r\n*\r\n*",
"title": "101-create-azurefirewall-with-ipgroups",
"type": "issue"
},
{
"action": "created",
"author": "ssripadham",
"comment_id": 584804773,
"datetime": 1581448700000,
"masked_author": "username_0",
"text": "hi @username_1 , Can you help me merge this PR.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bmoore-msft",
"comment_id": 584877968,
"datetime": 1581458646000,
"masked_author": "username_1",
"text": "Why are we removing zones?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ssripadham",
"comment_id": 585362013,
"datetime": 1581533693000,
"masked_author": "username_0",
"text": "Zones are not available in all regions. So the automated template deployment tests were failing. There is another template exclusive for Firewall with zones (with location hard coded to a known supported region). BTW, can you help with why template deployment failed? For Public, \"Previous request in-progress\". But how? For USGov, this resource is not yet available.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 1,632 | false | false | 1,632 | true |
dotnet/runtime | dotnet | 707,615,956 | 42,637 | null | [
{
"action": "opened",
"author": "crummel",
"comment_id": null,
"datetime": 1600888507000,
"masked_author": "username_0",
"text": "Runtime uses the dotnet/command-line-api repo, e.g. [here](https://github.com/dotnet/runtime/blob/57fde8309798a53c4bf6eb2b579683d9f3e71bf6/src/coreclr/src/tools/aot/crossgen2/crossgen2.csproj#L38), but this is not included in eng/Version.Details.xml, so Arcade/Maestro package updates and uberclone will not know about the dependency. This should be added at an appropriate version and hash.",
"title": "Missing command-line-api repo reference in eng/Version.Details.xml",
"type": "issue"
},
{
"action": "created",
"author": "mangod9",
"comment_id": 699019928,
"datetime": 1601050515000,
"masked_author": "username_1",
"text": "@username_3 is looking at updating the command-line dependency.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "trylek",
"comment_id": 774757549,
"datetime": 1612730744000,
"masked_author": "username_2",
"text": "@username_3, is this item ready to be closed based on your fix?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "trylek",
"comment_id": 817409673,
"datetime": 1618188543000,
"masked_author": "username_2",
"text": "OK, so it looks like this issue now effectively tracks removing the \"new slow\" System.CommandLine from ILVerify and R2RDump. While I believe that R2RDump is not considered a shipping product now, I'm not sure about ILVerify. If ILVerify is a shipping product, we should migrate it to the same parser as Crossgen2.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mangod9",
"comment_id": 877516786,
"datetime": 1625875543000,
"masked_author": "username_1",
"text": "Moving to ILVerification area since its now tracking ILVerify's usage of Commandline.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "AntonLapounov",
"comment_id": 877518811,
"datetime": 1625876059000,
"masked_author": "username_3",
"text": "The dotnet/source-build#1886 issue that motivated this one has been closed. I don't think it is necessary to do any changes for 6.0.",
"title": null,
"type": "comment"
}
] | 4 | 6 | 1,055 | false | false | 1,055 | true |
google/web-stories-wp | google | 680,043,276 | 4,010 | {
"number": 4010,
"repo": "web-stories-wp",
"user_login": "google"
} | [
{
"action": "opened",
"author": "diegovar",
"comment_id": null,
"datetime": 1597652455000,
"masked_author": "username_0",
"text": "## Summary\r\n\r\nFixes a series of flex issues with the library tab collapsing when there's litte to no media. These changes make it take 100% and so the attribution always displays at the bottom.\r\n\r\n\r\n\r\n",
"title": "Fix flex issues",
"type": "issue"
}
] | 3 | 3 | 5,044 | false | true | 479 | false |
cliffordwolf/picorv32 | null | 610,675,707 | 161 | null | [
{
"action": "opened",
"author": "bilkarpooja",
"comment_id": null,
"datetime": 1588327540000,
"masked_author": "username_0",
"text": "hello,\r\nplz can any one suggest me,how to check the addition of 2 numbers using picorv32.i mean now succesfully i hv compiled and able to run the whole picorv32 respository now i want to check by, by giving simple add intruction ,wat file i need to edit for this purpose.",
"title": "testing",
"type": "issue"
},
{
"action": "created",
"author": "Fifiy",
"comment_id": 655522457,
"datetime": 1594215161000,
"masked_author": "username_1",
"text": "@username_0 I'm searching about the same thing but can't know frome where to begin ?! did you find any idea about that ?\r\nThanks.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 401 | false | false | 401 | true |
HandyOrg/HandyControl | HandyOrg | 658,813,944 | 441 | null | [
{
"action": "opened",
"author": "dayhi",
"comment_id": null,
"datetime": 1594957296000,
"masked_author": "username_0",
"text": "**Describe the bug**\r\nA clear and concise description of what the bug is.\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Go to '...'\r\n2. Click on '....'\r\n3. Scroll down to '....'\r\n4. See error\r\n\r\n**Expected behavior**\r\nA clear and concise description of what you expected to happen.\r\n\r\n**Screenshots**\r\nIf applicable, add screenshots to help explain your problem.\r\n\r\n**Environment (please complete the following information):**\r\n - .net: [e.g. 4.6.2]\r\n - IDE [e.g. vs2017]\r\n - Version [e.g. 1.0.0]\r\n\r\n**Additional context**\r\nAdd any other context about the problem here.",
"title": "RunningBlock控件当Content刷新时,内容宽度未自适应,导致的显示不全的问题",
"type": "issue"
},
{
"action": "closed",
"author": "NaBian",
"comment_id": null,
"datetime": 1598892613000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 581 | false | false | 581 | false |
yasirkula/UnityNativeCamera | null | 619,870,030 | 34 | null | [
{
"action": "opened",
"author": "favoyang",
"comment_id": null,
"datetime": 1589770196000,
"masked_author": "username_0",
"text": "package.json\r\n```\r\n\"version\": \"1.2.1\"\r\n```",
"title": "Mismatched Version of 1.2.2 Release",
"type": "issue"
},
{
"action": "created",
"author": "yasirkula",
"comment_id": 630033904,
"datetime": 1589791127000,
"masked_author": "username_1",
"text": "You're right, thanks!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "yasirkula",
"comment_id": null,
"datetime": 1589791127000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 63 | false | false | 63 | false |
TransbankDevelopers/transbank-plugin-prestashop-webpay | TransbankDevelopers | 602,130,800 | 55 | null | [
{
"action": "opened",
"author": "msaustral",
"comment_id": null,
"datetime": 1587145007000,
"masked_author": "username_0",
"text": "Hola dado el requerimiento de un cliente hemos realizado modificaciones al modulo que seria interesantes agregarlos para la próxima versión.\r\n \r\nEl cliente requería ver en cada pedido las cantidades de cuotas, tipo de tarjeta (Débito o Crédito), código de autorización de Transbank y número de transacción de Transbank para poder llevar su control al cruzarlo con los reportes de Transbank.\r\n\r\nModificaciones:\r\n\r\n1- Creamos en la BD tabla ps_orders_payment 6 campos varchar-> byorder, authorizationcode, paymenttype, tipo_cuotas, sharesnumber, responsecode\r\n\r\n2- En el módulo de Webpay->controllers->front->validate.php agregamos luego de la línea 222\r\n\r\n $payment[0]->byorder = $result->buyOrder;\r\n $payment[0]->authorizationcode = $result->detailOutput->authorizationCode;\r\n $payment[0]->paymenttype = $paymentType;\r\n $payment[0]->tipo_cuotas = $tipo_cuotas;\r\n $payment[0]->sharesnumber = $result->detailOutput->sharesNumber;\r\n $payment[0]->responsecode = $result->detailOutput->responseCode;\r\n\r\n3- Instalación de PS->clasess->order->OrderPayment.php agregamos luego de la linea 38\r\n\r\n public $byorder;\r\n public $authorizationcode;\r\n public $paymenttype;\r\n public $tipo_cuotas;\r\n public $sharesnumber;\r\n public $responsecode;\r\n\r\n4- Instalación de PS->clasess->order->OrderPayment.php agregamos luego de la linea 63\r\n\r\n 'byorder' => array('type' => self::TYPE_STRING, 'validate' => 'isAnything', 'size' => 254),\r\n 'authorizationcode' => array('type' => self::TYPE_STRING, 'validate' => 'isAnything', 'size' => 254),\r\n 'paymenttype' => array('type' => self::TYPE_STRING, 'validate' => 'isAnything', 'size' => 254),\r\n 'tipo_cuotas' => array('type' => self::TYPE_STRING, 'validate' => 'isAnything', 'size' => 254),\r\n 'sharesnumber' => array('type' => self::TYPE_STRING, 'validate' => 'isAnything', 'size' => 254),\r\n 'responsecode' => array('type' => self::TYPE_STRING, 'validate' => 'isAnything', 'size' => 254),\r\n\r\n5- 
Instalación de PS->admin\\themes\\default\\template\\controllers\\orders\\helpers\\view->view.tpl agregamos luego de la linea 489\r\n\r\n <p>\r\n <b>{l s='Número de transacción' d='Admin.Orderscustomers.Feature'}</b> \r\n {if $payment->byorder}\r\n {$payment->byorder}\r\n {else}\r\n <i>{l s='Not defined' d='Admin.Orderscustomers.Feature'}</i>\r\n {/if}\r\n </p>\r\n <p>\r\n <b>{l s='Código de Autorización' d='Admin.Orderscustomers.Feature'}</b> \r\n {if $payment->authorizationcode}\r\n {$payment->authorizationcode}\r\n {else}\r\n <i>{l s='Not defined' d='Admin.Orderscustomers.Feature'}</i>\r\n {/if}\r\n </p>\r\n <p>\r\n <b>{l s='Tipo de Tarjeta' d='Admin.Orderscustomers.Feature'}</b> \r\n {if $payment->paymenttype}\r\n {$payment->paymenttype}\r\n {else}\r\n <i>{l s='Not defined' d='Admin.Orderscustomers.Feature'}</i>\r\n {/if}\r\n </p>\r\n <p>\r\n <b>{l s='Cantidad de Cuotas' d='Admin.Orderscustomers.Feature'}</b> \r\n {if $payment->tipo_cuotas}\r\n {$payment->tipo_cuotas}\r\n {else}\r\n <i>{l s='Not defined' d='Admin.Orderscustomers.Feature'}</i>\r\n {/if}\r\n </p>\r\n\r\nTodos estos cambios los realizamos manualmente, al estar modificando el CORE, es posible que en alguna actualización de la plataforma esto se re-escriba y deje de funcionar.\r\n\r\nPor otro lado no sabemos como hacer para que todos estos cambios se realicen en automático al instalar el módulo.\r\n\r\nEspero haber podido aportar una mejora al modulo, saludos.",
"title": "Guardar en BD y mostrar los datos en el bank-end cuotas, tipo de tarjeta, autorización y transacción",
"type": "issue"
},
{
"action": "created",
"author": "gonzunigad",
"comment_id": 615377788,
"datetime": 1587145425000,
"masked_author": "username_1",
"text": "Hola @username_0 . Muchas gracias por el apoyo con esta mejora. \r\nEn este caso te recomendaría crear un Pull Request con estos cambios. Así podemos ver mejor los cambios y quedarían como contribuidores en el repositorio. \r\n\r\nSi no saben muy bien cómo hacerlo me avisas y los apoyamos :) \r\n\r\n\r\nGracias nuevamente 🎉",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "msaustral",
"comment_id": 615399113,
"datetime": 1587148305000,
"masked_author": "username_0",
"text": "Hola, please ayúdame en como hacer el Pull Request, saludos.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gonzunigad",
"comment_id": 615429876,
"datetime": 1587152367000,
"masked_author": "username_1",
"text": "Perfecto, para crear un Pull Request debes eguir este procedimiento:\r\n\r\n1. - [Hacer un fork](https://help.github.com/es/github/getting-started-with-github/fork-a-repo) de este repositorio. Esto creará una copia en tu propia cuenta de github. \r\n2. - Descargar/clonar este nuevo repositorio en tu computador\r\n3. - Crear un nuevo branch (algo como `fix/database_prefix`)\r\n4. - Hacer los cambios necesarios en ese branch, hacer commit y subir el branch a tu repositorio (`git push ...`)\r\n5. - Si ya tienes un branch creado en tu fork, ya puedes hacer un Pull Request en este repositorio siguiendo [estas instrucciones](https://help.github.com/es/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request-from-a-fork)\r\n\r\nSi tienes dudas, puedes hablarme por el slack de la comunidad de TransbankDevelopers. Puedes unirte acá: https://join-transbankdevelopers-slack.herokuapp.com/",
"title": null,
"type": "comment"
}
] | 2 | 4 | 5,411 | false | false | 5,411 | true |
jmprathab/MyHome | null | 614,752,708 | 10 | null | [
{
"action": "opened",
"author": "jmprathab",
"comment_id": null,
"datetime": 1588945521000,
"masked_author": "username_0",
"text": "# Implement API for listing all communities\r\n\r\n* GET - /communities/\r\n\r\nThis API should list all available communities as a response.\r\n\r\nskip and limit parameters will be added later on to enable pagination",
"title": "API for listing all communities",
"type": "issue"
},
{
"action": "closed",
"author": "jmprathab",
"comment_id": null,
"datetime": 1588998781000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 206 | false | false | 206 | false |
grpc/grpc-web | grpc | 674,701,163 | 920 | null | [
{
"action": "opened",
"author": "mei2015",
"comment_id": null,
"datetime": 1596765394000,
"masked_author": "username_0",
"text": "为什么grpc-web 一开启stream连接什么都不做 浏览器内存就开始慢慢增加直至崩溃\r\nWhy does grPC-Web start the Stream connection and do nothing but slowly increase the browser memory until it crashes",
"title": "Why does grPC-Web start the Stream connection and do nothing but slowly increase the browser memory until it crashes",
"type": "issue"
},
{
"action": "closed",
"author": "stanley-cheung",
"comment_id": null,
"datetime": 1597872219000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "stanley-cheung",
"comment_id": 676717536,
"datetime": 1597872219000,
"masked_author": "username_1",
"text": "Dupe of #714.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 176 | false | false | 176 | false |
Cecilapp/Cecil | Cecilapp | 509,813,174 | 449 | {
"number": 449,
"repo": "Cecil",
"user_login": "Cecilapp"
} | [
{
"action": "created",
"author": "Narno",
"comment_id": 544511300,
"datetime": 1571664043000,
"masked_author": "username_0",
"text": "@dependabot merge",
"title": null,
"type": "comment"
}
] | 2 | 2 | 4,577 | false | true | 17 | false |
ocaml/opam-repository | ocaml | 726,810,332 | 17,459 | {
"number": 17459,
"repo": "opam-repository",
"user_login": "ocaml"
} | [
{
"action": "opened",
"author": "hannesm",
"comment_id": null,
"datetime": 1603309191000,
"masked_author": "username_0",
"text": "Simple public-key cryptography for the modern age\n\n- Project page: <a href=\"https://github.com/mirage/mirage-crypto\">https://github.com/mirage/mirage-crypto</a>\n- Documentation: <a href=\"https://mirage.github.io/mirage-crypto/doc\">https://mirage.github.io/mirage-crypto/doc</a>\n\n##### CHANGES:\n\n* Detect CPU architecture from C compiler, allowing cross-compiling to Android\n and iOS (mirage/mirage-crypto#84 by @EduardoRFC)\n* Upgrade to dune2, use a Makefile for building freestanding libraries, drop\n mirage-xen-posix support (solo5-based PVH exists now) mirage/mirage-crypto#86 by @username_0",
"title": "[new release] mirage-crypto-pk, mirage-crypto, mirage-crypto-rng and mirage-crypto-rng-mirage (0.8.6)",
"type": "issue"
},
{
"action": "created",
"author": "camelus",
"comment_id": 713905196,
"datetime": 1603318134000,
"masked_author": "username_1",
"text": "Commit: ea513dcc8d215720a667e32e9c0e20172aea85d5\n\nA pull request by opam-seasoned @username_0.\n\n##### :sunny: All lint checks passed <small>ea513dcc8d215720a667e32e9c0e20172aea85d5</small>\n\n* These packages passed lint tests: `mirage-crypto-pk.0.8.6`, `mirage-crypto-rng-mirage.0.8.6`, `mirage-crypto-rng.0.8.6`, `mirage-crypto.0.8.6`\n\n\n---\n\n\n##### :sunny: Installability check (+4)\n\n* new installable packages (4): `mirage-crypto.0.8.6` `mirage-crypto-pk.0.8.6` `mirage-crypto-rng.0.8.6` `mirage-crypto-rng-mirage.0.8.6`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mseri",
"comment_id": 714351405,
"datetime": 1603357870000,
"masked_author": "username_2",
"text": "Looking good, thanks",
"title": null,
"type": "comment"
}
] | 3 | 3 | 1,131 | false | false | 1,131 | true |
Lora-net/LoRaMac-node | Lora-net | 644,268,989 | 916 | null | [
{
"action": "opened",
"author": "ZaneL",
"comment_id": null,
"datetime": 1592965715000,
"masked_author": "username_0",
"text": "Hello,\r\n\r\nI'm using v4.4.4 and I'm trying to port this code over to the Arduino platform for the SX1262. It seems to be going okay, but I'm a little confused about the RTC board-specific functions. Really all I need is the Class A functionality -- but it seems that in LoRaMAC.c, class B and class A code are all tightly coupled so it requires you to implement the RTC functionality, even when it isn't necessary for class A.\r\n\r\nI've determined through trial and error that these are the functions that I must implement. What do you recommend I do for the RTC functions if I have no RTC on-board? Can I just leave the functions empty and it will work okay for Class A functionality? Or is there some easy way to implement these RTC functions using the built-in Arduino millis() function? \r\n\r\nEven if I were to connect an RTC and I were to use a library like Time/TimeLib.h, I'm at a complete loss as to how I would use the functions from that TimeLib library to implement these Loramac-node RTC functions. It seems overly complex (when looking at rtc-board.c as an example).\r\n\r\n[https://pastebin.com/YB8RiMZp](https://pastebin.com/YB8RiMZp)",
"title": "Confused about porting this to an Arduino library",
"type": "issue"
},
{
"action": "created",
"author": "ZaneL",
"comment_id": 648564582,
"datetime": 1592970150000,
"masked_author": "username_0",
"text": "Maybe something like this? Can I just use milliseconds as ticks?\r\n\r\n...and then in the RtcSetAlarm function I just setup a timer that will call TimerIrqHandler( );\r\n\r\nDoes that sound correct?\r\n\r\n[https://pastebin.com/gju2tLBu](https://pastebin.com/gju2tLBu)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "altishchenko",
"comment_id": 648668660,
"datetime": 1592986307000,
"masked_author": "username_1",
"text": "Sure, there is no strict requirement for the proper real time. Monotonic time counter is quite enough. Requirements begin with Class B and other things which can be safely ignored in Class A. Be warned though, this library is pretty heavy when compiled and might not fit in your board.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ZaneL",
"comment_id": 649618634,
"datetime": 1593098524000,
"masked_author": "username_0",
"text": "Sounds good, thank you!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ZaneL",
"comment_id": null,
"datetime": 1593098534000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 5 | 1,705 | false | false | 1,705 | false |
helm/charts | helm | 700,309,433 | 23,758 | null | [
{
"action": "opened",
"author": "eugene-chernyshenko",
"comment_id": null,
"datetime": 1599931086000,
"masked_author": "username_0",
"text": "<!-- Thanks for filing an issue! Before hitting the button, please answer these questions. It's helpful to search the existing GitHub issues first. It's likely that another user has already reported the issue you're facing, or it's a known issue that we're already aware of.\r\n\r\nDescribe *in detail* the feature/behavior/change you'd like to see.\r\n\r\nBe ready for followup questions, and please respond in a timely manner. If we can't reproduce a bug or think a feature already exists, we might close your issue. If we're wrong, PLEASE feel free to reopen it and explain why.\r\n-->\r\n\r\n**Is your feature request related to a problem? Please describe.**\r\nPlease make service type configurable\r\n\r\n**Additional context**\r\nSome cloud k8s cluster providers use NodePort type for ingress controller (GKE for example)\r\nThere is not way to configure this option",
"title": "[incubator/schema-registry] Make service type configurable",
"type": "issue"
}
] | 2 | 4 | 1,168 | false | true | 850 | false |
Helveg/patch | null | 642,543,384 | 19 | {
"number": 19,
"repo": "patch",
"user_login": "Helveg"
} | [
{
"action": "opened",
"author": "Helveg",
"comment_id": null,
"datetime": 1592738238000,
"masked_author": "username_0",
"text": "",
"title": "Fixed NetCon spike detection",
"type": "issue"
}
] | 2 | 2 | 0 | false | true | 0 | false |
bitmaelum/bitmaelum-suite | bitmaelum | 716,759,677 | 37 | {
"number": 37,
"repo": "bitmaelum-suite",
"user_login": "bitmaelum"
} | [
{
"action": "opened",
"author": "jaytaph",
"comment_id": null,
"datetime": 1602095497000,
"masked_author": "username_0",
"text": "Using mnemonic phrases to generate mailserver routing id and key",
"title": "Using mnemonic phrases to generate mailserver routing id and key",
"type": "issue"
}
] | 2 | 2 | 1,571 | false | true | 64 | false |
akka/akka | akka | 614,171,660 | 29,043 | {
"number": 29043,
"repo": "akka",
"user_login": "akka"
} | [
{
"action": "opened",
"author": "johanandren",
"comment_id": null,
"datetime": 1588867361000,
"masked_author": "username_0",
"text": "References #28961",
"title": "Coordinator ddata state store with persistent remember entities",
"type": "issue"
},
{
"action": "created",
"author": "johanandren",
"comment_id": 625819786,
"datetime": 1588945044000,
"masked_author": "username_0",
"text": "Multi jvm test added for ddata store mode and event sourced remember entities store, updated tests a bit since this requires that a journal is set up if either persistent state store or event sourced remember entitites.\r\n\r\nI think I'll stop here for this week.",
"title": null,
"type": "comment"
}
] | 2 | 5 | 313 | false | true | 277 | false |
LiskHQ/lisk-core | LiskHQ | 544,912,041 | 184 | null | [
{
"action": "opened",
"author": "ManuGowda",
"comment_id": null,
"datetime": 1578043182000,
"masked_author": "username_0",
"text": "### Expected behavior\r\nUpdate the lodash version to https://github.com/lodash/lodash/pull/4336\r\n### Actual behavior\r\nhttps://github.com/lodash/lodash/pull/4336\r\n### Steps to reproduce\r\n`npm audit` against `release/3.0.0`\r\n### Which version(s) does this affect? (Environment, OS, etc...)\r\n3.0.0",
"title": "Fix lodash vulnerability",
"type": "issue"
},
{
"action": "closed",
"author": "ManuGowda",
"comment_id": null,
"datetime": 1578409686000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 293 | false | false | 293 | false |
offirgolan/ember-light-table | null | 434,664,012 | 684 | null | [
{
"action": "opened",
"author": "kmichek",
"comment_id": null,
"datetime": 1555578106000,
"masked_author": "username_0",
"text": "Hi, I use a form component to edit the line selected via body.expanded-row. Is there a way to close the expanded row from within the form component, please? Thank you.",
"title": "Close body.expanded-row programmatically",
"type": "issue"
},
{
"action": "created",
"author": "kmichek",
"comment_id": 505124570,
"datetime": 1561400677000,
"masked_author": "username_0",
"text": "Hi, solved this, not sure if correctly, but it works:\r\nconst self = this;\r\nlater((function() {\r\n self.parentView.row.set('expanded', false);\r\n}), 0);",
"title": null,
"type": "comment"
}
] | 1 | 2 | 318 | false | false | 318 | false |
kubeflow/kfp-tekton | kubeflow | 709,379,250 | 316 | {
"number": 316,
"repo": "kfp-tekton",
"user_login": "kubeflow"
} | [
{
"action": "opened",
"author": "Tomcli",
"comment_id": null,
"datetime": 1601081350000,
"masked_author": "username_0",
"text": "**Which issue is resolved by this Pull Request:** \r\nResolves #\r\n\r\n**Description of your changes:**\r\nUpdate the lightweight_component.ipynb to align with the KubeflowDojo. Also extract the pipeline code from notebook into .py files for easier development and demo.\r\n\r\n**Environment tested:**\r\n\r\n* Python Version (use `python --version`):\r\n* Tekton Version (use `tkn version`):\r\n* Kubernetes Version (use `kubectl version`):\r\n* OS (e.g. from `/etc/os-release`):\r\n\r\n**Checklist:**\r\n- [x] The title for your pull request (PR) should follow our title convention. [Learn more about the pull request title convention used in this repository](https://github.com/kubeflow/pipelines/blob/master/CONTRIBUTING.md#pull-request-title-convention). \r\n\r\n PR titles examples:\r\n * `fix(frontend): fixes empty page. Fixes #1234`\r\n Use `fix` to indicate that this PR fixes a bug.\r\n * `feat(backend): configurable service account. Fixes #1234, fixes #1235`\r\n Use `feat` to indicate that this PR adds a new feature. \r\n * `chore: set up changelog generation tools`\r\n Use `chore` to indicate that this PR makes some changes that users don't need to know.\r\n * `test: fix CI failure. Part of #1234`\r\n Use `part of` to indicate that a PR is working on an issue, but shouldn't close the issue when merged.\r\n\r\n- [ ] Do you want this pull request (PR) cherry-picked into the current release branch?\r\n \r\n If yes, use one of the following options:\r\n \r\n * **(Recommended.)** Ask the PR approver to add the `cherrypick-approved` label to this PR. The release manager adds this PR to the release branch in a batch update.\r\n * After this PR is merged, create a cherry-pick PR to add these changes to the release branch. (For more information about creating a cherry-pick PR, see the [Kubeflow Pipelines release guide](https://github.com/kubeflow/pipelines/blob/master/RELEASE.md#option--git-cherry-pick).)",
"title": "fix(samples): Add compilable python file for lightweight and e2e-mnist example",
"type": "issue"
},
{
"action": "created",
"author": "Tomcli",
"comment_id": 700352141,
"datetime": 1601339096000,
"masked_author": "username_0",
"text": "/cc @ckadner",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "animeshsingh",
"comment_id": 702393546,
"datetime": 1601585852000,
"masked_author": "username_1",
"text": "/lgtm",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "animeshsingh",
"comment_id": 707405605,
"datetime": 1602547343000,
"masked_author": "username_1",
"text": "Is this good to merge @ckadner ?",
"title": null,
"type": "comment"
}
] | 5 | 10 | 5,111 | false | true | 1,970 | false |
uyjulian/renpy-switch | null | 616,709,016 | 10 | null | [
{
"action": "opened",
"author": "perillamint",
"comment_id": null,
"datetime": 1589294209000,
"masked_author": "username_0",
"text": "Hello, I tried your Ren'Py port and successfully ported Kindred Spirits on the Roof to Switch. The problem is, it uses 800 x 600 resolution and Ren'Py switch displays it \"stretched\" horizontally. (does not keep aspect ratio)\r\n\r\nIt seems there is a problem exist in the resolution configuration routine or so. By guessing from switching fullscreen/windowed mode while docked fixes this problem temporary, I personally suspect gl2/gl2draw.pyx:353, which does not handle Nintendo Switch's case but I didn't experimented with it yet.",
"title": "It does not keep aspect ratio correctly in case of 4:3 ratio",
"type": "issue"
},
{
"action": "created",
"author": "uyjulian",
"comment_id": 627520614,
"datetime": 1589308711000,
"masked_author": "username_1",
"text": "switch-sdl2 recently added a feature where it will automatically change the resolution when the console is docked and undocked. However, the code does not execute until the console is docked or undocked. \r\nI'll see if I can fix this issue.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "uyjulian",
"comment_id": null,
"datetime": 1590335241000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 769 | false | false | 769 | false |
ros2/launch | ros2 | 605,233,558 | 406 | {
"number": 406,
"repo": "launch",
"user_login": "ros2"
} | [
{
"action": "opened",
"author": "cottsay",
"comment_id": null,
"datetime": 1587616893000,
"masked_author": "username_0",
"text": "If the test run is interrupted, `_LaunchDiedException` triggers a `FailResult` for the entire run. However, some cases may have already been cleaned up by the test run, and `FailResult` calls into `unittest.TestResult.addFailure()` with possibly cleaned-up tests, which are represented by `None`.\r\n\r\nThis leads to traces like this:\r\n```\r\nTraceback (most recent call last):\r\n File \"/opt/ros/master/bin/ros2\", line 11, in <module>\r\n load_entry_point('ros2cli', 'console_scripts', 'ros2')()\r\n File \"/opt/ros_src/master/build/ros2cli/ros2cli/cli.py\", line 67, in main\r\n rc = extension.main(parser=parser, args=args)\r\n File \"/opt/ros_src/master/build/ros2test/ros2test/command/test.py\", line 48, in main\r\n parser, args, test_runner_cls=launch_testing_ros.LaunchTestRunner\r\n File \"/opt/ros_src/master/build/launch_testing/launch_testing/launch_test.py\", line 111, in run\r\n results = runner.run()\r\n File \"/opt/ros_src/master/build/launch_testing/launch_testing/test_runner.py\", line 267, in run\r\n message='Launch stopped before the active tests finished.'\r\n File \"/opt/ros_src/master/build/launch_testing/launch_testing/test_result.py\", line 26, in __init__\r\n self.addFailure(case, (Exception, Exception(message), None))\r\n File \"/usr/lib64/python3.7/unittest/result.py\", line 17, in inner\r\n return method(self, *args, **kw)\r\n File \"/usr/lib64/python3.7/unittest/result.py\", line 122, in addFailure\r\n self.failures.append((test, self._exc_info_to_string(err, test)))\r\n File \"/usr/lib64/python3.7/unittest/result.py\", line 180, in _exc_info_to_string\r\n if exctype is test.failureException:\r\nAttributeError: 'NoneType' object has no attribute 'failureException'\r\n```\r\n\r\nThe solution presented here is to change the `_removeTestAtIndex` function to a no-op, since we aren't actually subclassing `unittest.TestSuite` as the documentation suggests.",
"title": "Disable cleanup of test cases once they have been run",
"type": "issue"
},
{
"action": "created",
"author": "cottsay",
"comment_id": 626894845,
"datetime": 1589223669000,
"masked_author": "username_0",
"text": "The error occurs because `FailResult` is assuming that all of the tests cases can be used when calling `addFailure` in the creation of a `TestResult` instance: https://github.com/ros2/launch/blob/d16628d8abfcdd7d44293c89198a1a64cfc54ee6/launch_testing/launch_testing/test_result.py#L23-L27\r\n\r\nIn the current implementation, test cases that have already completed have been cleaned up and can not be passed to `addFailure`.\r\n\r\nIf we just skip `addFailure` for the test cases that have already been completed, we'll lose them entirely, since the current structure of this code doesn't have the capability to report partial success if launch shuts down before all of the tests finish.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hidmic",
"comment_id": 642900735,
"datetime": 1591905927000,
"masked_author": "username_1",
"text": "@username_0 ping.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cottsay",
"comment_id": 643009445,
"datetime": 1591924137000,
"masked_author": "username_0",
"text": "At the point where we're currently calling `addFailure`, the case is already `None` because of the cleanup. Even after suppressing the cleanup, the `TestCase` doesn't store the result so we don't know which tests were successful, failed OR ignored - that is what the `TestResult` class is for. It might be possible to get access to the `TestResult` instance that has the accumulated successful and ignored cases, but I can't see a straightforward way to get access to it. It might not even be possible.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hidmic",
"comment_id": 665285511,
"datetime": 1595970695000,
"masked_author": "username_1",
"text": "Correct, you'd have to propagate it upwards, perhaps along with the `LaunchDiedException`. At any rate, it's a nice to have, if launch dies the test is broken anyways.",
"title": null,
"type": "comment"
}
] | 2 | 5 | 3,235 | false | false | 3,235 | true |