repo (string, lengths 7–67) | org (string, lengths 2–32, nullable) | issue_id (int64, 780k–941M) | issue_number (int64, 1–134k) | pull_request (dict) | events (list) | user_count (int64, 1–77) | event_count (int64, 1–192) | text_size (int64, 0–329k) | bot_issue (bool, 1 class) | modified_by_bot (bool, 2 classes) | text_size_no_bots (int64, 0–279k) | modified_usernames (bool, 2 classes) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
marrobHD/firetv-card | null | 854,688,671 | 9 | null | [
{
"action": "opened",
"author": "broyuken",
"comment_id": null,
"datetime": 1617986835000,
"masked_author": "username_0",
"text": "Why are there 2 power buttons? Can I control 2 devices with this card? IE Sound bar and TV?\r\n\r\nAlso, could you allow separate services for turn_on and turn_off like how universal media players do it?\r\n\r\n```\r\ncommands:\r\n turn_on:\r\n service: switch.turn_on\r\n data:\r\n entity_id: switch.living_room_tv_power\r\n turn_off:\r\n service: switch.turn_off\r\n data:\r\n entity_id: switch.living_room_tv_power\r\n```",
"title": "2 power buttons?",
"type": "issue"
}
] | 1 | 1 | 420 | false | false | 420 | false |
SabakiHQ/Sabaki | SabakiHQ | 899,173,223 | 789 | null | [
{
"action": "opened",
"author": "psygo",
"comment_id": null,
"datetime": 1621821140000,
"masked_author": "username_0",
"text": "I'm trying to create some [new themes of my own](https://github.com/FanaroEngineering/fanaro_sabaki_theme_collection) but have encountered 2 issues with trying to modify the following characteristics:\r\n\r\n- The grid color\r\n- Sabaki's background to the Go board — I've only managed to do it outside the `.asar` file, inside the preferences' theming tab\r\n\r\nAre there ways of modifying those characteristics and embedding them in the `.asar` file? If so, is there documentation about it? I couldn't find it either [here](https://github.com/SabakiHQ/Shudan/tree/master/docs#styling) or [here](https://github.com/SabakiHQ/Sabaki/blob/master/docs/guides/create-themes.md).\r\n\r\nAdditionally, are there ways of modifying more of Sabaki's styles through CSS?\r\n\r\nAnd does anyone know how to open the inspector on Linux? I've tried googling it and using the shortcuts I use on my browser and on Windows, but nothing really worked.",
"title": "Help with Some Styling Variables for Themes",
"type": "issue"
},
{
"action": "created",
"author": "ParmuzinAlexander",
"comment_id": 855891353,
"datetime": 1623069710000,
"masked_author": "username_1",
"text": "Grid color\r\n`.shudan-goban {--shudan-board-foreground-color: #000;}`\r\nBoard color\r\n```\r\n.shudan-goban {--shudan-board-background-color: #FFF;}\r\n.shudan-goban-image {background-image: none;}\r\n```\r\n\r\nYou may be interested in some of the tricks that I personally use.\r\nP.S. Disable bar very extreme change, use only background settings for fixing bug https://github.com/SabakiHQ/Sabaki/pull/542\r\n```\r\n/* Stones texture */\r\n.shudan-stone-image.shudan-sign_1 {background-image: url('0.png');}\r\n.shudan-stone-image.shudan-sign_-1 {background-image: url('1.png');}\r\n.shudan-stone-image.shudan-sign_-1.shudan-random_1 {background-image: url('2.png');}\r\n.shudan-stone-image.shudan-sign_-1.shudan-random_2 {background-image: url('3.png');}\r\n.shudan-stone-image.shudan-sign_-1.shudan-random_3 {background-image: url('4.png');}\r\n.shudan-stone-image.shudan-sign_-1.shudan-random_4 {background-image: url('5.png');}\r\n/* 100% stones size */\r\n.shudan-vertex .shudan-stone {top: 0; left: 0; width: 100%; height: 100%;}\r\n/* Black lines and disable borders */\r\n.shudan-goban {--shudan-board-border-width: 0; --shudan-board-foreground-color: #000;}\r\n.shudan-goban:not(.shudan-coordinates) {padding: 0;}\r\n/* Board texture */\r\n.shudan-goban-image {background-image: url('board1.png');}\r\n/* Disable stones shadow */\r\n.shudan-vertex:not(.shudan-sign_0) .shudan-shadow {background: none; box-shadow: none;}\r\n/* Disable bar */\r\n#bar {visibility: hidden;}\r\nmain {bottom: 0; background: #f0f0f0 url('background1.png') left top;}\r\n/* Disable goban shadow */\r\n#goban {box-shadow: none;}\r\n/* Disable last move marker */\r\n.shudan-vertex.shudan-marker_point.shudan-sign_1 .shudan-marker {background: none;}\r\n.shudan-vertex.shudan-marker_point.shudan-sign_-1 .shudan-marker {background: none;}\r\n/* Heat map */\r\n.shudan-vertex .shudan-heat {transition: opacity 0s, box-shadow 0s;}\r\n.shudan-vertex.shudan-heat_9 .shudan-heat {background: #009900; box-shadow: 0 0 0 .5em #009900; opacity: 1;}\r\n.shudan-vertex.shudan-heat_8 .shudan-heat {background: none; box-shadow: none; opacity: 1;}\r\n.shudan-vertex.shudan-heat_7 .shudan-heat {background: none; box-shadow: none; opacity: 1;}\r\n.shudan-vertex.shudan-heat_6 .shudan-heat {background: none; box-shadow: none; opacity: 1;}\r\n.shudan-vertex.shudan-heat_5 .shudan-heat {background: none; box-shadow: none; opacity: 1;}\r\n.shudan-vertex.shudan-heat_4 .shudan-heat {background: none; box-shadow: none; opacity: 1;}\r\n.shudan-vertex.shudan-heat_3 .shudan-heat {background: none; box-shadow: none; opacity: 1;}\r\n.shudan-vertex.shudan-heat_2 .shudan-heat {background: none; box-shadow: none; opacity: 1;}\r\n.shudan-vertex.shudan-heat_1 .shudan-heat {background: none; box-shadow: none; opacity: 1;}\r\n.shudan-vertex .shudan-heatlabel {color: white; font-size: .38em; line-height: 1; text-shadow: none; opacity: 1;}\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "psygo",
"comment_id": 859955271,
"datetime": 1623453962000,
"masked_author": "username_0",
"text": "```css\r\n.shudan-goban {\r\n --shudan-board-foreground-color: #FFF; \r\n}\r\n```\r\n\r\nsolved the color of the grid for me. Thanks a lot, @username_1. Before I close this issue, could you please leave a reference to where you ended up finding all these properties?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ParmuzinAlexander",
"comment_id": 860026076,
"datetime": 1623489239000,
"masked_author": "username_1",
"text": "@username_0 Source code like https://github.com/SabakiHQ/Shudan/blob/master/css/goban.css or inspect web version https://github.com/SabakiHQ/Sabaki/releases/download/v0.43.3/sabaki-v0.43.3-web.zip\r\nBut after adding settings for textures, I don't see the point in simple themes. Maybe things like custom Heat map in lizzie \\ katrain style will be more useful. Using themes as an addon or plugin, unfortunately only one...",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "psygo",
"comment_id": null,
"datetime": 1623505121000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "psygo",
"comment_id": 860054855,
"datetime": 1623505121000,
"masked_author": "username_0",
"text": "I wish there were more documentation for all this. Even the names are not quite great — how am I supposed to guess `--shudan-board-foreground-color` means the grid color? Anyway, I think this will do for now.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "yishn",
"comment_id": 920753041,
"datetime": 1631785666000,
"masked_author": "username_2",
"text": "For future reference: https://github.com/SabakiHQ/Shudan/blob/master/docs/README.md#styling\r\n\r\nForeground color also refers to marker/label/arrow/line color on non-stone vertices.",
"title": null,
"type": "comment"
}
] | 3 | 7 | 4,827 | false | false | 4,827 | true |
googleapis/google-cloud-cpp | googleapis | 779,342,077 | 5,676 | {
"number": 5676,
"repo": "google-cloud-cpp",
"user_login": "googleapis"
} | [
{
"action": "opened",
"author": "dopiera",
"comment_id": null,
"datetime": 1609868488000,
"masked_author": "username_0",
"text": "This fixes #5670.\r\n\r\nBefore this PR, `NotifyOnStateChange()` could be called on a\r\n`grpc::CompletionQueue` which was shut down via `Shutdown()`. This PR\r\nmakes `AsyncConnectionReadyFuture` use\r\n`CompletionQueueImpl::StartOperation` to make sure that the whole\r\noperation either fails with `StatusCode::kCancelled` or there is a\r\nguarantee that the `CompletionQueue` is not shut down.\n\n<!-- Reviewable:start -->\n---\nThis change is [<img src=\"https://reviewable.io/review_button.svg\" height=\"34\" align=\"absmiddle\" alt=\"Reviewable\"/>](https://reviewable.io/reviews/googleapis/google-cloud-cpp/5676)\n<!-- Reviewable:end -->",
"title": "fix: don't wait for state change on shut down CQ",
"type": "issue"
},
{
"action": "created",
"author": "coryan",
"comment_id": 760286368,
"datetime": 1610639688000,
"masked_author": "username_1",
"text": "I think #5701 addresses the same problem, closing for now.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 677 | false | false | 677 | false |
SimpleAppProjects/SimpleWeather-Windows | SimpleAppProjects | 646,867,473 | 363 | null | [
{
"action": "opened",
"author": "thewizrd",
"comment_id": null,
"datetime": 1593331710000,
"masked_author": "username_0",
"text": "### Version 3.3.0.0(3.3.0.0) ###\n\n\n### Stacktrace ###\n\n\n__Interop.ComCallHelpers.Call($__ComObject __this, RuntimeTypeHandle __typeHnd, Int32 __targetIndex) in Call at 15732480:0;__Interop\n\n__Interop.ForwardComStubs.Stub_19<System.__Canon>(Void* InstParam, $__ComObject __this, Int32 __targetIndex) in Stub_19 at 16707566:0;__Interop.ForwardComStubs\n\n\n\n### Reason ###\n\n\nSystem.Exception\n\n\n### Link to App Center ###\n\n\n* [https://appcenter.ms/users/username_0.dev/apps/SimpleWeather/crashes/errors/1183154435u](https://appcenter.ms/users/username_0.dev/apps/SimpleWeather/crashes/errors/1183154435u)",
"title": "Fix System.Exception in ComCallHelpers.Call ($__ComObject __this, RuntimeTypeHandle __typeHnd, Int32 __targetIndex)",
"type": "issue"
},
{
"action": "closed",
"author": "thewizrd",
"comment_id": null,
"datetime": 1594834563000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 594 | false | false | 594 | true |
HL7/fhir | HL7 | 812,676,811 | 1,104 | {
"number": 1104,
"repo": "fhir",
"user_login": "HL7"
} | [
{
"action": "opened",
"author": "yunwwang",
"comment_id": null,
"datetime": 1613844988000,
"masked_author": "username_0",
"text": "## HL7 FHIR Pull Request\r\n\r\n_Note: No pull requests will be accepted against `./source` unless logged in the_ [HL7 Jira issue tracker](https://jira.hl7.org/projects/FHIR/issues/).\r\n\r\nIf you made changes to any files within `./source` please indicate the Jira tracker number this pull request is associated with: ` `\r\n\r\n## Description\r\n\r\nFHIR-26831\r\nFHIR-29660\r\nFHIR-31075\r\nFHIR-21243",
"title": "Update Questionnaire and QuestionnaireResponse",
"type": "issue"
},
{
"action": "created",
"author": "yunwwang",
"comment_id": 785261645,
"datetime": 1614189415000,
"masked_author": "username_0",
"text": "Will reorganize tickets",
"title": null,
"type": "comment"
}
] | 1 | 2 | 408 | false | false | 408 | false |
apache/hudi | apache | 853,721,526 | 2,793 | {
"number": 2793,
"repo": "hudi",
"user_login": "apache"
} | [
{
"action": "opened",
"author": "TeRS-K",
"comment_id": null,
"datetime": 1617905412000,
"masked_author": "username_0",
"text": "## What is the purpose of the pull request\r\n\r\nThis pull request supports ORC storage in hudi.\r\n\r\n## Brief change log\r\n\r\nIn two separate commits:\r\n- Implemented HoodieOrcWriter\r\n - Added HoodieOrcConfigs\r\n - Added AvroOrcUtils that writes Avro record **to** VectorizedRowBatch\r\n - Used orc-core:no-hive module (`no-hive` is needed because spark-sql uses no-hive version of orc and it would become easier for spark integration)\r\n- Implemented HoodieOrcReader\r\n - Read Avro records **from** VectorizedRowBatch\r\n - Implemented OrcReaderIterator\r\n - Implemented ORC utility functions \r\n\r\n## Verify this pull request\r\n\r\n- Added unit tests for \r\n - reader/writer creation\r\n - AvroOrcUtils\r\n- (local) Wrote a small tool that reads from ORC/Parquet files and writes to ORC/Parquet files, verified that the records in the input/output files are identical using spark.read.orc/spark.read.parquet.\r\n- (local) Changed the HoodieTableConfig.DEFAULT_BASE_FILE_FORMAT to force the tests to run with ORC as the base format. Some changes need to be made, but I'm leaving it out of this PR to get some initial feedback on the reader/writer implementation first.\r\n For all tests to pass with ORC as the base file format:\r\n - Understand schema evolution in ORC (ref TestUpdateSchemaEvolution)\r\n - Add ORC support for places that have hardcoded ParquetReader or sqlContext.read().parquet() \r\n - Add ORC support for bootstrap op\r\n - Hive engine integration with ORC (implement HoodieOrcInputFormat, and more)\r\n - Spark engine integration with ORC (implement HoodieInternalRowOrcWriter, and more)\r\n - Add ORC support for HoodieSnapshotExporter\r\n - Implement HDFSOrcImporter\r\n - and possibly more.\r\n\r\n\r\n## Committer checklist\r\n\r\n - [ ] Has a corresponding JIRA in PR title & commit\r\n \r\n - [ ] Commit message is descriptive of the change\r\n \r\n - [ ] CI is green\r\n\r\n - [ ] Necessary doc changes done or have another open PR\r\n \r\n - [ ] For large changes, please consider breaking it into sub-tasks under an umbrella JIRA.",
"title": "[HUDI-57] Support ORC Storage",
"type": "issue"
},
{
"action": "created",
"author": "n3nash",
"comment_id": 816037894,
"datetime": 1617905982000,
"masked_author": "username_1",
"text": "@prashantwason Can you review this ?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TeRS-K",
"comment_id": 816163413,
"datetime": 1617913917000,
"masked_author": "username_0",
"text": "The build is currently failing with error `ERROR: toomanyrequests: You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limit`, it doesn't seem to be related to my change. How can I trigger a rebuild?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "yanghua",
"comment_id": 816495293,
"datetime": 1617955102000,
"masked_author": "username_2",
"text": "option 1: close and reopen the PR;\r\noption 2: push an empty commit via git command",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TeRS-K",
"comment_id": 819244201,
"datetime": 1618378961000,
"masked_author": "username_0",
"text": "ORC is well-integrated with hive, so hive already has OrcInputFormat, OrcOutputFormat etc. With my latest change to the HoodieInputFormatUtils class, I was able to sync hudi orc format table to hive metastore (tested with deltastreamer).\r\nHowever, we do still need to implement HoodieOrcInputFormat & HoodieRealtimeOrcInputFormat. I have done some work on that but it's not tested yet.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "n3nash",
"comment_id": 859069045,
"datetime": 1623359257000,
"masked_author": "username_1",
"text": "Closing this in favor of -> https://github.com/apache/hudi/pull/2999",
"title": null,
"type": "comment"
}
] | 5 | 8 | 15,598 | false | true | 2,870 | false |
pytorch/pytorch | pytorch | 860,157,571 | 56,295 | null | [
{
"action": "opened",
"author": "befelix",
"comment_id": null,
"datetime": 1618605822000,
"masked_author": "username_0",
"text": "```\r\nwhich has a hard-coded dimension that replaces the ellipsis according to the batch-dimension of the example tensor. In contrast, `jit.script` identifies the ellipsis and compiles the code to:\r\n```python\r\ndef select(x: Tensor) -> Tensor:\\n return torch.select(x, -1, 0)\\n'\r\n```\r\n\r\n## Motivation\r\n\r\nWhile `jit.script` has come a long way, tracing is often still the easiest to get to jit'ed code. Having indexing operations with ellipsis traced as hard-coded indeces makes it more difficult to use tracing with different batch sizes and can lead to subtle indexing errors as in the example above.",
"title": "[torch.jit.trace] Indexing with ellipsis fixes the batch dimension",
"type": "issue"
},
{
"action": "created",
"author": "gmagogsfm",
"comment_id": 822112447,
"datetime": 1618796518000,
"masked_author": "username_1",
"text": "@username_3 @username_2 Any thoughts? It feels like a fundamental limitation of `trace` to me",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "suo",
"comment_id": 823448238,
"datetime": 1618938145000,
"masked_author": "username_2",
"text": "The current implementation the tracer records operations at the `aten` level, so it only sees advanced indexing operations after they've been desugared into the actual `aten::select` call. So yeah, in that sense it's a fundamental limitation of the current approach.\r\n\r\nPossible options, listed in no particular order of practicality:\r\n1. We could create an aten op that captures the fact that advanced indexing was done, although that increases the complexity of the PT core to solve an implementation detail of the tracer.\r\n2. We could use FX to trace this, although I'm not sure whether it captures this kind of higher-level indexing either (cc @username_3)\r\n3. You can nest a scripted function within a traced one to \"preserve\" the indexing call (this might be ugly/hard to know where the scripted functions need to be inserted)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gmagogsfm",
"comment_id": 823455299,
"datetime": 1618938703000,
"masked_author": "username_1",
"text": "Hi @username_0,\r\n\r\nOption #3 that @username_2 proposed can be done quickly in your code; could you give that a try? Options #1 and #2 would take more time to develop than #3, so #3 is the best shot to get unblocked.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "gmagogsfm",
"comment_id": null,
"datetime": 1618938715000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "jamesr66a",
"comment_id": 823664588,
"datetime": 1618961211000,
"masked_author": "username_3",
"text": "@username_2 FX traces the surface-level `getitem` call so it's plausible:\r\n\r\n```\r\nimport torch\r\nimport torch.fx\r\n\r\ndef idx(x, z):\r\n return x[..., z]\r\n\r\ntraced = torch.fx.symbolic_trace(idx)\r\nprint(traced.graph)\r\n\"\"\"\r\ngraph(x, z):\r\n %getitem : [#users=1] = call_function[target=operator.getitem](args = (%x, (Ellipsis, %z)), kwargs = {})\r\n return getitem\r\n\"\"\"\r\n```",
"title": null,
"type": "comment"
}
] | 4 | 6 | 2,081 | false | false | 2,081 | true |
sentrysoftware/studioX-templates | sentrysoftware | 810,202,143 | 46 | {
"number": 46,
"repo": "studioX-templates",
"user_login": "sentrysoftware"
} | [
{
"action": "opened",
"author": "RazeemM",
"comment_id": null,
"datetime": 1613569835000,
"masked_author": "username_0",
"text": "",
"title": "Dell EMC Isilon OneFS REST API",
"type": "issue"
},
{
"action": "created",
"author": "MohammedSentry",
"comment_id": 782108342,
"datetime": 1613744823000,
"masked_author": "username_1",
"text": "@username_0 I think there is a conversion issue for the Used HDD Capacity of the StoragePool\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MohammedSentry",
"comment_id": 782122341,
"datetime": 1613746246000,
"masked_author": "username_1",
"text": "@username_0 can you display the Storagepool values in GB instead of MB ?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RazeemM",
"comment_id": 782148331,
"datetime": 1613748544000,
"masked_author": "username_0",
"text": "@username_1 Units modified to GB.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MohammedSentry",
"comment_id": 783421395,
"datetime": 1614004804000,
"masked_author": "username_1",
"text": "@username_0 no more USED HDD CAPACITY ?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bertysentry",
"comment_id": 785932241,
"datetime": 1614263088000,
"masked_author": "username_2",
"text": "@username_0 no more USED HDD CAPACITY?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RazeemM",
"comment_id": 785979199,
"datetime": 1614266327000,
"masked_author": "username_0",
"text": "It is there.\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RazeemM",
"comment_id": 786001895,
"datetime": 1614268238000,
"masked_author": "username_0",
"text": "There was no Used HDD Capacity for Nodepool. It is there for Storagepool.",
"title": null,
"type": "comment"
}
] | 3 | 8 | 577 | false | false | 577 | true |
primer/view_components | primer | 814,631,038 | 267 | {
"number": 267,
"repo": "view_components",
"user_login": "primer"
} | [
{
"action": "opened",
"author": "manuelpuyol",
"comment_id": null,
"datetime": 1614098212000,
"masked_author": "username_0",
"text": "",
"title": "Remove test deprecation warnings",
"type": "issue"
}
] | 2 | 2 | 388 | false | true | 0 | false |
jlippold/tweakCompatible | null | 751,057,139 | 134,453 | null | [
{
"action": "opened",
"author": "MEGSystem",
"comment_id": null,
"datetime": 1606331183000,
"masked_author": "username_0",
"text": "```\r\n{\r\n \"packageId\": \"com.rpgfarm.a-font\",\r\n \"action\": \"working\",\r\n \"userInfo\": {\r\n \"arch32\": false,\r\n \"packageId\": \"com.rpgfarm.a-font\",\r\n \"deviceId\": \"iPhone8,1\",\r\n \"url\": \"http://cydia.saurik.com/package/com.rpgfarm.a-font/\",\r\n \"iOSVersion\": \"14.2\",\r\n \"packageVersionIndexed\": true,\r\n \"packageName\": \"A-Font\",\r\n \"category\": \"Tweaks\",\r\n \"repository\": \"MERONA Repo\",\r\n \"name\": \"A-Font\",\r\n \"installed\": \"1.8.3\",\r\n \"packageIndexed\": true,\r\n \"packageStatusExplaination\": \"A matching version of this tweak for this iOS version could not be found. Please submit a review if you choose to install.\",\r\n \"id\": \"com.rpgfarm.a-font\",\r\n \"commercial\": false,\r\n \"packageInstalled\": true,\r\n \"tweakCompatVersion\": \"0.1.5\",\r\n \"shortDescription\": \"Change your font!\",\r\n \"latest\": \"1.8.3\",\r\n \"author\": \"Baw Appie\",\r\n \"packageStatus\": \"Unknown\"\r\n },\r\n \"base64\": \"eyJhcmNoMzIiOmZhbHNlLCJwYWNrYWdlSWQiOiJjb20ucnBnZmFybS5hLWZvbnQiLCJkZXZpY2VJZCI6ImlQaG9uZTgsMSIsInVybCI6Imh0dHA6XC9cL2N5ZGlhLnNhdXJpay5jb21cL3BhY2thZ2VcL2NvbS5ycGdmYXJtLmEtZm9udFwvIiwiaU9TVmVyc2lvbiI6IjE0LjIiLCJwYWNrYWdlVmVyc2lvbkluZGV4ZWQiOnRydWUsInBhY2thZ2VOYW1lIjoiQS1Gb250IiwiY2F0ZWdvcnkiOiJUd2Vha3MiLCJyZXBvc2l0b3J5IjoiTUVST05BIFJlcG8iLCJuYW1lIjoiQS1Gb250IiwiaW5zdGFsbGVkIjoiMS44LjMiLCJwYWNrYWdlSW5kZXhlZCI6dHJ1ZSwicGFja2FnZVN0YXR1c0V4cGxhaW5hdGlvbiI6IkEgbWF0Y2hpbmcgdmVyc2lvbiBvZiB0aGlzIHR3ZWFrIGZvciB0aGlzIGlPUyB2ZXJzaW9uIGNvdWxkIG5vdCBiZSBmb3VuZC4gUGxlYXNlIHN1Ym1pdCBhIHJldmlldyBpZiB5b3UgY2hvb3NlIHRvIGluc3RhbGwuIiwiaWQiOiJjb20ucnBnZmFybS5hLWZvbnQiLCJjb21tZXJjaWFsIjpmYWxzZSwicGFja2FnZUluc3RhbGxlZCI6dHJ1ZSwidHdlYWtDb21wYXRWZXJzaW9uIjoiMC4xLjUiLCJzaG9ydERlc2NyaXB0aW9uIjoiQ2hhbmdlIHlvdXIgZm9udCEiLCJsYXRlc3QiOiIxLjguMyIsImF1dGhvciI6IkJhdyBBcHBpZSIsInBhY2thZ2VTdGF0dXMiOiJVbmtub3duIn0=\",\r\n \"chosenStatus\": \"working\",\r\n \"notes\": \"\"\r\n}\r\n```",
"title": "`A-Font` working on iOS 14.2",
"type": "issue"
}
] | 1 | 1 | 1,861 | false | false | 1,861 | false |
PaddlePaddle/PaddleHelix | PaddlePaddle | 890,751,062 | 90 | {
"number": 90,
"repo": "PaddleHelix",
"user_login": "PaddlePaddle"
} | [
{
"action": "opened",
"author": "jinghu23",
"comment_id": null,
"datetime": 1620886623000,
"masked_author": "username_0",
"text": "",
"title": "DDs",
"type": "issue"
}
] | 3 | 3 | 1,022 | false | true | 0 | false |
MirahezeBots/MirahezeBots | MirahezeBots | 890,314,686 | 517 | {
"number": 517,
"repo": "MirahezeBots",
"user_login": "MirahezeBots"
} | [
{
"action": "opened",
"author": "RhinosF1",
"comment_id": null,
"datetime": 1620840029000,
"masked_author": "username_0",
"text": "Gate time!",
"title": "Run push/pr_target always; make tests a package",
"type": "issue"
}
] | 2 | 2 | 3,409 | false | true | 10 | false |
akarnokd/RxJavaFiberInterop | null | 671,787,098 | 30 | {
"number": 30,
"repo": "RxJavaFiberInterop",
"user_login": "akarnokd"
} | [
{
"action": "created",
"author": "akarnokd",
"comment_id": 667954054,
"datetime": 1596451806000,
"masked_author": "username_0",
"text": "@dependabot recreate",
"title": null,
"type": "comment"
}
] | 3 | 3 | 6,575 | false | true | 20 | false |
AhmedAmin90/ror-social-scaffold | null | 817,398,548 | 4 | null | [
{
"action": "opened",
"author": "mricanho",
"comment_id": null,
"datetime": 1614349463000,
"masked_author": "username_0",
"text": "Hello Team!\r\n\r\nYou did an excellent job in your first two milestones, just a few things to work on:\r\n\r\n- You need to refactor your friendship code, all of this is in preparation for the milestone.\r\n- Take out all the logic on your views, this is the best practice.\r\n\r\nHappy coding!",
"title": "Peer to peer code review",
"type": "issue"
}
] | 1 | 1 | 281 | false | false | 281 | false |
sider/goodcheck | sider | 659,731,957 | 126 | {
"number": 126,
"repo": "goodcheck",
"user_login": "sider"
} | [
{
"action": "created",
"author": "ybiquitous",
"comment_id": 660745059,
"datetime": 1595207528000,
"masked_author": "username_0",
"text": "@dependabot squash and merge",
"title": null,
"type": "comment"
}
] | 2 | 2 | 4,665 | false | true | 28 | false |
JuliaRegistries/General | JuliaRegistries | 776,796,751 | 27,130 | {
"number": 27130,
"repo": "General",
"user_login": "JuliaRegistries"
} | [
{
"action": "opened",
"author": "JuliaRegistrator",
"comment_id": null,
"datetime": 1609392325000,
"masked_author": "username_0",
"text": "- Registering package: ArrayInterface\n- Repository: https://github.com/SciML/ArrayInterface.jl\n- Created by: @chriselrod\n- Version: v2.14.11\n- Commit: 0256272ce42846753edf055aae66191116f1ff7e\n- Reviewed by: @chriselrod\n- Reference: https://github.com/SciML/ArrayInterface.jl/commit/0256272ce42846753edf055aae66191116f1ff7e#commitcomment-45540590\n<!-- bf0c69308befbd3ccf2cc956ac8a46712550b79fc9bfb5e4edf8f833f05f4c18b06eddad8845b45beb9f45c2b8020dd6052a85cb81bea51da7fec03da3249e2705da7c5893884fa78a1359a9fe123114c0495d8c2338f2d819d9640571c424920cc3378da8c50b036aa90aa540f08fbfbeeb3053ad2642eeeb1467dd964eb1ca067b1c80d36d503de955557ef34211fe5664b8c2d873927252fcf0aef20a15c4c19fa7801ea2aa39c57f553547234bb993cb9adc967564405d2544fc239149239db0738c3c9691dc9bc00dff1df035e202e806a49a7f1b9fdd9eaa502b02e685 -->",
"title": "New version: ArrayInterface v2.14.11",
"type": "issue"
}
] | 2 | 2 | 1,203 | false | true | 803 | false |
reapit/foundations | reapit | 795,166,998 | 3,357 | null | [
{
"action": "opened",
"author": "plittlewood-rpt",
"comment_id": null,
"datetime": 1611758939000,
"masked_author": "username_0",
"text": "The logging is a little light in the data marketplace app, particularly in interactions with the third party platforms. This makes it difficult to isolate exact points of failure in some scenarios. We should enhance the logging capabilities of this service",
"title": "Improve logging",
"type": "issue"
},
{
"action": "closed",
"author": "cbryanreapit",
"comment_id": null,
"datetime": 1611868392000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 256 | false | false | 256 | false |
OpenAPITools/openapi-generator | OpenAPITools | 848,584,179 | 9,152 | null | [
{
"action": "opened",
"author": "Kink77",
"comment_id": null,
"datetime": 1617292036000,
"masked_author": "username_0",
"text": "<!--\r\nPlease follow the issue template below for bug reports and feature requests.\r\nAlso please indicate in the issue title which language/library is concerned. Eg: [JAVA] Bug generating foo with bar \r\n-->\r\n\r\n##### Description\r\n\r\nGetting a stackoverflow exception when trying to generate a client for csharp-netcore\r\n\r\n##### openapi-generator version\r\n\r\nversion 5.1.0\r\n\r\n##### OpenAPI declaration file content or url\r\n\r\nhttps://gist.github.com/username_0/673cbfe68f6c94afdee6ff5c7f0fe9a8\r\n\r\n##### Command line used for generation\r\n\r\njava -jar openapi-generator-cli-5.1.0.jar generate -i ./openapi-spec.json -c ./config.json -g csharp-netcore\r\n\r\nYou can find openapi-spec.json here : \r\nhttps://gist.github.com/username_0/673cbfe68f6c94afdee6ff5c7f0fe9a8\r\nYou can find config.json here : \r\nhttps://gist.github.com/username_0/48efb0951e4d21f682948138bcd6eb52\r\n\r\n##### Steps to reproduce\r\n\r\ndownload both openapi-spec.json and config.json, put them in the same directory as the openapi-generator-cli-5.1.0.jar and run the command line java -jar openapi-generator-cli-5.1.0.jar generate -i ./openapi-spec.json -c ./config.json -g csharp-netcore\r\n\r\nYou will get the following error : \r\nException in thread \"main\" java.lang.StackOverflowError\r\n\tat com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:740)\r\n\r\nThe culprit is the schema JToken referencing itself which creates a cyclic reference\r\n\r\n##### Related issues/PRs\r\n\r\n##### Suggest a fix/enhancement\r\n\r\nWIth the swagger generator, I end up with a Dictionary<string, JToken> for that schema",
"title": "Openapi Generator stackoverflow exception for csharp-netcore",
"type": "issue"
},
{
"action": "created",
"author": "wing328",
"comment_id": 812823266,
"datetime": 1617432694000,
"masked_author": "username_1",
"text": "```\r\n \"components\": {\r\n \"schemas\": {\r\n \"JToken\": {\r\n \"type\": \"array\",\r\n \"items\": {\r\n \"$ref\": \"#/components/schemas/JToken\"\r\n }\r\n }\r\n }\r\n }\r\n```\r\n`JToken` is an array of itself, which is causing the issue.\r\n\r\nWhat does the JSON payload look like?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Kink77",
"comment_id": 813392745,
"datetime": 1617629293000,
"masked_author": "username_0",
"text": "It's basically a [JObject from Newtonsoft.JSON](https://www.newtonsoft.com/json/help/html/T_Newtonsoft_Json_Linq_JObject.htm)\r\n\r\nWhen I try to generate a client with swagger codegen, it gives me a dictionary of <string, JToken>. Open Api Generator seems to do an endless loop and cause a stackoverflow.\r\n\r\nI managed to generate a client by fiddling with the openapi spec. Here's the [fixed openapi-spec](https://gist.github.com/username_0/c4352f4b02102937f256fafe4eb552a0) I used to make it work. \r\n\r\nSo I'm not too sure how to go about having a property that is a JObject translated to an openapi spec and back to a JObject in the client.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JesperG",
"comment_id": 852802338,
"datetime": 1622618220000,
"masked_author": "username_2",
"text": "I also ran into this issue because we use JToken in an API. It reached the same endless loop until the stack overflowed.\r\nThank you for your insight, as it allowed me to manually modify the .json file (changing the JToken type to object instead of an array of itself) so that the generator no longer fails.\r\nI did not need the endpoints which are using the JToken, so I did not look further into how to use this, sorry.\r\nThis is to confirm that there is an issue.",
"title": null,
"type": "comment"
}
] | 3 | 4 | 2,962 | false | false | 2,962 | true |
PlaceOS/drivers | PlaceOS | 838,177,173 | 123 | null | [
{
"action": "opened",
"author": "jeremyw24",
"comment_id": null,
"datetime": 1616454885000,
"masked_author": "username_0",
"text": "API: https://developer.cisco.com/meraki/mv-sense/#!overview/example-use-cases",
"title": "Cisco MV Sense",
"type": "issue"
},
{
"action": "closed",
"author": "stakach",
"comment_id": null,
"datetime": 1628674175000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 77 | false | false | 77 | false |
aiidateam/aiida-quantumespresso | aiidateam | 827,688,852 | 661 | null | [
{
"action": "opened",
"author": "mbercx",
"comment_id": null,
"datetime": 1615380664000,
"masked_author": "username_0",
"text": "When running the `hp.x` calculation with the `HpCalculation` in the `aiida-quantumespresso-hp` package, you can provide the `parent_scf` input. However, here it is critical that the Hubbard atoms are provided first in the `ATOMIC_POSITIONS` list, or else the following error is raised by the `hp.x` code:\r\n\r\n```console\r\n WARNING! All Hubbard atoms must be listed first in the ATOMIC_POSITIONS card of PWscf\r\n Stopping...\r\n```\r\n\r\nIn order to avoid this, perhaps we can put the atoms that have Hubbard values assigned first in this list by default?",
"title": "`BasePwCpInputGenerator`: Place Hubbard first in `ATOMIC_POSITIONS` list",
"type": "issue"
},
{
"action": "created",
"author": "mbercx",
"comment_id": 809228635,
"datetime": 1617010293000,
"masked_author": "username_0",
"text": "Pinging @username_1 for comments. :)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MackeEric",
"comment_id": 811801146,
"datetime": 1617271408000,
"masked_author": "username_1",
"text": "Thinking of future extensions to `aiida-quantumespresso-hp` to support the calculation of Hubbard V parameters, I would also be in favour of providing at least an option that allows to sort the positions by Hubbard species. This is because the SCF before the first hp.x calculation (for Hubbard V) requires to set a input such as `hubbard_v(i,i,1) = 1.d-7` where `i` is the number of the Hubbard atom in the `ATOMIC_POSITIONS` list. If we could order the positions in a way that puts the Hubbard species first, the input generator could simply set all `hubbard_v(i,i,1)` from i=1 to i=N<sub>Hubbard<sub> to a finite value and we're done.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,238 | false | false | 1,238 | true |
jhk0530/aladin | null | 801,988,876 | 452 | null | [
{
"action": "opened",
"author": "jhk0530",
"comment_id": null,
"datetime": 1612516161000,
"masked_author": "username_0",
"text": "- 파이썬과 리액트를 활용한 주식 자동거래 시스템 구축\n- 카카오톡, 라인, 아이 메시지 & 페이스북 메신저와 함께하는 이모티콘으로 돈벌기\n- 컴퓨팅 사고력을 키우는 코딩\n- 팜 1 : 지하 농장\n- 하이퍼레저 패브릭 실전 프로젝트\n- 하이퍼레저 패브릭 철저 입문\n- 문과생, 데이터 사이언티스트 되다\n- 스마트한 생활을 위한 버전 2 : 엑셀 2010 활용\n- 디지털 포렌식\n- 2020 시나공 워드프로세서 실기\n- 월드 오브 사이버펑크 2077\n- 2020 시나공 컴퓨터활용능력 1급 실기\n- 2020 이기적 컴퓨터활용능력 1급 실기 기본서 : 무료 동영상 전강 & 채점 프로그램 제공\n- 큐비코가 이상해!\n- 일상을 아름답게 담아내는 사진촬영\n- 삐딱하게 바라본 4차 산업혁명\n- 두렵지 않은 코딩교육\n- 15초면 충분해, 틱톡!\n- 예제로 배우는 Visual C++ MFC 2017 윈도우 프로그래밍\n- 유닉스의 탄생\n- 3D프린팅 수업을 위한 틴커캐드 디자인 4\n- 소프트웨어 인사이더\n- Microsoft Power BI 기본 + 활용",
"title": "알라딘 잠실새내역점 새로 등록된 IT 도서 알림(2021년 02월 05일)",
"type": "issue"
},
{
"action": "closed",
"author": "jhk0530",
"comment_id": null,
"datetime": 1612531215000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 529 | false | false | 529 | false |
miguelgrinberg/oreilly-flask-apis-video | null | 611,202,459 | 15 | null | [
{
"action": "opened",
"author": "nataly-obr",
"comment_id": null,
"datetime": 1588429844000,
"masked_author": "username_0",
"text": "**Description**\r\nsession.commit() method produces IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint.\r\n\r\n**To Reproduce**\r\n``` python\r\nclass Currency(Base):\r\n __tablename__ = 'currency'\r\n\r\n currency_code = db.Column('currency_code', db.String(3), primary_key=True)\r\n currency_name = db.Column('currency_name', db.String(140))\r\n\r\nSession = sessionmaker(bind=self.engine)\r\nsession = Session()\r\ncurrency_list = [Currency(currency_code = 'USD'), Currency(currency_code = 'EUR')]\r\ncurrency_list = list(map(lambda x: session.merge(x), currency_list))\r\nsession.bulk_save_objects(currency_list, update_changed_only=False)\r\nsession.commit() \r\n```\r\n**Additional information**\r\n\r\n1. If I modify this code by adding session.commit() after session.merge(), everything works fine:\r\n``` python\r\n...\r\ncurrency_list = list(map(lambda x: session.merge(x), currency_list))\r\nsession.commit() \r\nsession.bulk_save_objects(currency_list, update_changed_only=False)\r\nsession.commit() \r\n```\r\n2. The table 'currency' is empty, so the currency_code that I'm trying to add doesn't exist in the database.\r\n3. Also no problem with adding this currency_code via \r\n``` python\r\nconnection = psycopg2.connect(...) \r\n... \r\nstatement = \"INSERT INTO {0} ({1}) VALUES ({2}); \".format(table, columns, values) \r\ncrsr.execute(statement) \r\nconnection.commit()\r\n```\r\n**Error**\r\n``` python\r\nsqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint \"currency_pkey\"\r\nDETAIL: Key (currency_code)=(EUR) already exists.\r\n```",
"title": "sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint",
"type": "issue"
},
{
"action": "closed",
"author": "nataly-obr",
"comment_id": null,
"datetime": 1588430052000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "reopened",
"author": "nataly-obr",
"comment_id": null,
"datetime": 1588430171000,
"masked_author": "username_0",
"text": "**Description**\r\nsession.commit() method produces IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint.\r\n\r\n**To Reproduce**\r\n``` python\r\nclass Currency(Base):\r\n __tablename__ = 'currency'\r\n\r\n currency_code = db.Column('currency_code', db.String(3), primary_key=True)\r\n currency_name = db.Column('currency_name', db.String(140))\r\n\r\nSession = sessionmaker(bind=self.engine)\r\nsession = Session()\r\ncurrency_list = [Currency(currency_code = 'USD'), Currency(currency_code = 'EUR')]\r\ncurrency_list = list(map(lambda x: session.merge(x), currency_list))\r\nsession.bulk_save_objects(currency_list, update_changed_only=False)\r\nsession.commit() \r\n```\r\n**Additional information**\r\n\r\n1. If I modify this code by adding session.commit() after session.merge(), everything works fine:\r\n``` python\r\n...\r\ncurrency_list = list(map(lambda x: session.merge(x), currency_list))\r\nsession.commit() \r\nsession.bulk_save_objects(currency_list, update_changed_only=False)\r\nsession.commit() \r\n```\r\n2. The table 'currency' is empty, so the currency_code that I'm trying to add doesn't exist in the database.\r\n3. Also no problem with adding this currency_code via \r\n``` python\r\nconnection = psycopg2.connect(...) \r\n... \r\nstatement = \"INSERT INTO {0} ({1}) VALUES ({2}); \".format(table, columns, values) \r\ncrsr.execute(statement) \r\nconnection.commit()\r\n```\r\n**Error**\r\n``` python\r\nsqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint \"currency_pkey\"\r\nDETAIL: Key (currency_code)=(EUR) already exists.\r\n```",
"title": "sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint",
"type": "issue"
},
{
"action": "closed",
"author": "nataly-obr",
"comment_id": null,
"datetime": 1588430325000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 4 | 3,174 | false | false | 3,174 | false |
siena-mh/github-slideshow | null | 606,555,458 | 3 | {
"number": 3,
"repo": "github-slideshow",
"user_login": "siena-mh"
} | [
{
"action": "opened",
"author": "siena-mh",
"comment_id": null,
"datetime": 1587759023000,
"masked_author": "username_0",
"text": "Created a branch and file.",
"title": "add slide",
"type": "issue"
}
] | 2 | 2 | 1,789 | false | true | 26 | false |
dotnet/docfx | dotnet | 758,838,460 | 6,866 | null | [
{
"action": "opened",
"author": "alexhelms",
"comment_id": null,
"datetime": 1607374260000,
"masked_author": "username_0",
"text": "**Operating System**: Windows\r\n\r\n**DocFX Version Used**: 2.56.5\r\n\r\n**Template used**: default\r\n\r\n**Steps to Reproduce**:\r\n\r\n1. Create a property like `public string MyString { get; init; }`\r\n2. Run docfx\r\n\r\n**Expected Behavior**: Documentation is generated.\r\n\r\n**Actual Behavior**: An exception is thrown, see below for stack trace.\r\n\r\nTo find this I had to run docfx from source and in debug mode and trace back from the exception. The hint I had was the name of the property in a function further up the call stack.\r\n\r\nI understand #6805 is for C# 9 support but I don't see any implementation or notes about `init` properties. At work we recently upgraded to .NET 5 and this is the first time I tried to use an `init` property and our CI failed during doc generation and took a considerable amount of time to find the root cause.\r\n\r\nThanks!\r\n\r\n```\r\nMicrosoft.DocAsCode.Exceptions.DocfxException: Unable to generate spec reference for !: ---> System.IO.InvalidDataException: Fail to parse id for symbol in namespace .\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.YamlModelGenerator.AddSpecReference(ISymbol symbol, IReadOnlyList`1 typeGenericParameters, IReadOnlyList`1 methodGenericParameters, Dictionary`2 references, SymbolVisitorAdapter adapter)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.SymbolVisitorAdapter.AddSpecReference(ISymbol symbol, IReadOnlyList`1 typeGenericParameters, IReadOnlyList`1 methodGenericParameters)\r\n --- End of inner exception stack trace ---\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.SymbolVisitorAdapter.AddSpecReference(ISymbol symbol, IReadOnlyList`1 typeGenericParameters, IReadOnlyList`1 methodGenericParameters)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.SymbolVisitorAdapter.AddMethodSyntax(IMethodSymbol symbol, MetadataItem result, IReadOnlyList`1 typeGenericParameters, IReadOnlyList`1 methodGenericParameters)\r\n at 
Microsoft.DocAsCode.Metadata.ManagedReference.SymbolVisitorAdapter.VisitMethod(IMethodSymbol symbol)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.SymbolVisitorAdapter.VisitNamedType(INamedTypeSymbol symbol)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.SymbolVisitorAdapter.VisitDescendants[T](IEnumerable`1 children, Func`2 getChildren, Func`2 filter)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.SymbolVisitorAdapter.VisitNamespace(INamespaceSymbol symbol)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.SymbolVisitorAdapter.VisitAssembly(IAssemblySymbol symbol)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.RoslynMetadataExtractor.Extract(ExtractMetadataOptions options)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.ExtractMetadataWorker.GetMetadataFromProjectLevelCache(IBuildController controller, IInputParameters key)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.ExtractMetadataWorker.<SaveAllMembersFromCacheAsync>d__13.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()\r\n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n at Microsoft.DocAsCode.Metadata.ManagedReference.ExtractMetadataWorker.<ExtractMetadataAsync>d__11.MoveNext()\r\n 0 Warning(s)\r\n 1 Error(s)\r\n```",
"title": "Support C# 9 init properties",
"type": "issue"
}
] | 1 | 1 | 3,303 | false | false | 3,303 | false |
HMS-Core/hms-flutter-plugin | HMS-Core | 801,838,845 | 62 | null | [
{
"action": "opened",
"author": "davidzou",
"comment_id": null,
"datetime": 1612500261000,
"masked_author": "username_0",
"text": "**Description**\r\nThe Splash Ad do not be showed.\r\n\r\n**Logs**\r\n```\r\n [+1012 ms] I/HiAdSDK.RealtimeAdMediator( 5144): doOnShowSloganEnd\r\n [ +1 ms] I/HiAdSDK.RealtimeAdMediator( 5144): Ad fails to display or loading timeout, ad dismiss\r\n [ ] I/HiAdSDK.AdMediator( 5144): ad failed:499\r\n [ ] I/HiAdSDK.AdMediator( 5144): ad is already failed\r\n [ ] I/HiAdSDK.AdMediator( 5144): notifyAdDismissed\r\n [ ] I/HiAdSDK.AdMediator( 5144): ad already dismissed\r\n```\r\n\r\n**Environment**\r\n - Platform: Flutter\r\n - Kit: Ads\r\n - Kit Version : 13.4.35+300\r\n - OS Version : any\r\n - Android Studio version (if applicable) [4.1]\r\n - Platform version (if applicable)\r\n - Node Version (if applicable)\r\n - Your Location/Region (if applicable) CN",
"title": "The Splash Ad do not be showed. Error code 499.",
"type": "issue"
},
{
"action": "created",
"author": "furkansarihan",
"comment_id": 773862414,
"datetime": 1612511664000,
"masked_author": "username_1",
"text": "Hello @username_0,\r\n\r\n1. Make sure HMS Core version is 4.0+ in your device.\r\n2. Don't set the country/Region to China if your phone is not ChinaROM.\r\n\r\nAlso you can test your apk in [cloud debugging](https://developer.huawei.com/consumer/en/console#/openCard/AppService/1045) to make sure there is no issue in your implementation.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Mike-mei",
"comment_id": null,
"datetime": 1616481795000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 3 | 1,089 | false | false | 1,089 | true |
flashlight/flashlight | flashlight | 860,701,960 | 547 | null | [
{
"action": "opened",
"author": "waynelapierre",
"comment_id": null,
"datetime": 1618762451000,
"masked_author": "username_0",
"text": "### Feature Description\r\nIt would be great if the flashlight library can be used in R. \r\n\r\n#### Use Case\r\nMany users prefer R over Python for data analytics. They can benefit from being able to use the flashlight library in R. \r\n\r\n#### Additional Context\r\nThe Rcpp package can be used to integrate R and C++.",
"title": "Using from R Language",
"type": "issue"
},
{
"action": "created",
"author": "jacobkahn",
"comment_id": 822868658,
"datetime": 1618877145000,
"masked_author": "username_1",
"text": "@username_0 — as with other bindings, we probably don't have the bandwidth or knowhow to create robust R bindings ourselves, but we'd welcome a PR and would absolutely be interested in discussing this further. Would you/others be able to support a first push on this?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "waynelapierre",
"comment_id": 822880486,
"datetime": 1618878400000,
"masked_author": "username_0",
"text": "This is beyond my capability. Since someone else at Facebook is doing a great job maintaining the R binding of the prophet project, they may be interested in this project as well. \r\nhttps://github.com/facebook/prophet",
"title": null,
"type": "comment"
}
] | 2 | 3 | 796 | false | false | 796 | true |
javve/list.js | null | 754,585,396 | 699 | null | [
{
"action": "opened",
"author": "machupichu123",
"comment_id": null,
"datetime": 1606843598000,
"masked_author": "username_0",
"text": "I found a problem caused by a different version.\r\n\r\nI can't search the symbols `#` and `-` when using a var 2.3.0.\r\n\r\nFor example, here is the code.\r\n[https://jsfiddle.net/veduozjf/](https://jsfiddle.net/veduozjf/)\r\n\r\nIn case of I use var 1.5.0 result shows 'Mark #Twain',\r\nBut var 2.3.0 shows nothing.",
"title": "function is not wark at item option at var 2.3.0 ",
"type": "issue"
},
{
"action": "created",
"author": "sheffieldnick",
"comment_id": 736749760,
"datetime": 1606848824000,
"masked_author": "username_1",
"text": "I think it is probably a bug in `setSearchString()` where it is escaping regular expression characters (like #, -, etc) but the new faster multiword search code in v2.3.0 doesn't use regexp, so those characters shouldn't be escaped?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "machupichu123",
"comment_id": 737759124,
"datetime": 1606985545000,
"masked_author": "username_0",
"text": "Thank you. I solved it by commenting out line 744 of 2.3.0.\r\n\r\n744 s = s.replace(/[-[\\]{}()*+?.,\\\\^$|#]/g, '\\\\$&'); // Escape regular expression characters\r\n↓\r\n744 // s = s.replace(/[-[\\]{}()*+?.,\\\\^$|#]/g, '\\\\$&'); // Escape regular expression characters",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "machupichu123",
"comment_id": 737770762,
"datetime": 1606986800000,
"masked_author": "username_0",
"text": "Thank you. I solved it by commenting out line 744 of 2.3.0.\r\n\r\n744 s = s.replace(/[-[]{}()+?.,\\^$|#]/g, '\\$&'); // Escape regular expression characters\r\n↓\r\n744 // s = s.replace(/[-[]{}()+?.,\\^$|#]/g, '\\$&'); // Escape regular expression characters",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "machupichu123",
"comment_id": 737771103,
"datetime": 1606986838000,
"masked_author": "username_0",
"text": "I am expecting great things of you!\r\nWhat do you think of this issue?\r\n[https://github.com/javve/list.js/issues/698](https://github.com/javve/list.js/issues/698)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sheffieldnick",
"comment_id": 737781458,
"datetime": 1606987953000,
"masked_author": "username_1",
"text": "I suggest renaming this issue to something specific, like \"Search on regexp characters broken by v2.3.0\"",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "machupichu123",
"comment_id": 737825839,
"datetime": 1606990216000,
"masked_author": "username_0",
"text": "I Fixed the title.\r\nWhy did you thumbs down?\r\nAre you angry with me, by any chance?\r\nI'm concerning about that if my English makes sense or not.",
"title": null,
"type": "comment"
}
] | 2 | 7 | 1,445 | false | false | 1,445 | false |
celo-org/celo-monorepo | celo-org | 832,088,241 | 7,439 | null | [
{
"action": "opened",
"author": "yorhodes",
"comment_id": null,
"datetime": 1615833540000,
"masked_author": "username_0",
"text": "### Expected Behavior\r\n\r\nCelo contracts verified on https://sourcify.dev/ \r\n\r\n### Current Behavior\r\n\r\nNo verified contracts on sourcify (or blockscout)\r\n\r\nhttps://docs.blockscout.com/for-projects/premium-features/contracts-verification-via-sourcify",
"title": "Verify released contracts using sourcify ",
"type": "issue"
},
{
"action": "created",
"author": "zviadm",
"comment_id": 812718493,
"datetime": 1617398057000,
"masked_author": "username_1",
"text": "Just a note that might be helpful. I recently integrated fetching verified contract metadata from sourcify in Celo Terminal. It was surprisingly easy to do and also verification side was very easy and smooth too directly with https://sourcify.dev.\r\n\r\nOne really nice thing that is also possible with `sourcify` is that once you verify contracts on alfajores or baklava, you don't even need to do anything to verify them on mainnet as long as contracts match exactly, since it will do automatic bytecode matching and find appropriate metadata + source. \r\n\r\nFor verification itself, using sourcify api directly can be easier than having to go through blockscout: https://github.com/ethereum/sourcify/blob/master/docs/api/server/verification1/verify.md\r\nhttps://github.com/ethereum/sourcify#api",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "yorhodes",
"comment_id": 820584906,
"datetime": 1618505956000,
"masked_author": "username_0",
"text": "@kevjue worked with this tool during ubeswap audit, could be a good candidate",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "yorhodes",
"comment_id": 843456835,
"datetime": 1621364653000,
"masked_author": "username_0",
"text": "blocked on https://github.com/celo-org/celo-monorepo/issues/7855",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "yorhodes",
"comment_id": 863399916,
"datetime": 1623948576000,
"masked_author": "username_0",
"text": "https://github.com/ethereum/sourcify/issues/468",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "yorhodes",
"comment_id": null,
"datetime": 1629909309000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 1,228 | false | false | 1,228 | false |
OPS-E2E-PPE/E2E_DocFxV3 | OPS-E2E-PPE | 770,369,213 | 12,297 | {
"number": 12297,
"repo": "E2E_DocFxV3",
"user_login": "OPS-E2E-PPE"
} | [
{
"action": "opened",
"author": "OPSTestPPE",
"comment_id": null,
"datetime": 1608239351000,
"masked_author": "username_0",
"text": "",
"title": "pronly-true-warning: changed includes file reported in parent file",
"type": "issue"
},
{
"action": "created",
"author": "e2ebd2",
"comment_id": 747703551,
"datetime": 1608239385000,
"masked_author": "username_1",
"text": "Docs Build status updates of commit _[544c9b0](https://github.com/OPS-E2E-PPE/E2E_DocFxV3/commits/544c9b066de6b679355bc3dca6b386e05d355064)_: \n\n### :white_check_mark: Validation status: passed\r\n\r\n\r\nFile | Status | Preview URL | Details\r\n---- | ------ | ----------- | -------\r\n[E2E_DocsBranch_Dynamic/pr-only/includes/skip-level.md](https://github.com/OPS-E2E-PPE/E2E_DocFxV3/blob/pronly-on-skipLevel-warning/E2E_DocsBranch_Dynamic/pr-only/includes/skip-level.md) | :white_check_mark:Succeeded | [View](https://ppe.docs.microsoft.com/en-us/E2E_DocFxV3/pr-only/skiplevel?branch=pr-en-us-12297) |\r\n\r\nFor more details, please refer to the [build report](https://opbuildstoragesandbox2.blob.core.windows.net/report/2020%5C12%5C17%5C8b165afe-a866-8c48-e5c3-ead94440f27e%5CPullRequest%5C202012172109142210-12297%5Cworkflow_report.html?sv=2016-05-31&sr=b&sig=z6ZgiDyItRqq36A8KcQVBm0l3TfnjmPwZxOAVgFzNoU%3D&st=2020-12-17T21%3A04%3A44Z&se=2021-01-17T21%3A09%3A44Z&sp=r).\r\n\r\n**Note:** Broken links written as relative paths are included in the above build report. For broken links written as absolute paths or external URLs, see the [broken link report](https://opportal-sandbox.azurewebsites.net/#/repos/8b165afe-a866-8c48-e5c3-ead94440f27e?tabName=brokenlinks).\r\n\r\nFor any questions, please:<ul><li>Try searching in the <a href=\"https://review.docs.microsoft.com/en-us/search/?search=&category=All&scope=help-docs&category=All&branch=master\">Docs contributor and Admin Guide</a></li><li>See the <a href=\"https://review.docs.microsoft.com/en-us/help/onboard/faq?branch=master\">frequently asked questions</a></li><li>Post your question in the <a href=\"https://teams.microsoft.com/l/channel/19%3a7ecffca1166a4a3986fed528cf0870ee%40thread.skype/General?groupId=de9ddba4-2574-4830-87ed-41668c07a1ca&tenantId=72f98bf-86f1-41af-91ab-2d7cd011db47\">Docs support channel</a></li></ul>",
"title": null,
"type": "comment"
}
] | 2 | 2 | 1,885 | false | false | 1,885 | false |
folio-org/mod-circulation | folio-org | 654,073,762 | 599 | {
"number": 599,
"repo": "mod-circulation",
"user_login": "folio-org"
} | [
{
"action": "opened",
"author": "bohdan-suprun",
"comment_id": null,
"datetime": 1594302352000,
"masked_author": "username_0",
"text": "Includes:\r\n* https://github.com/folio-org/mod-circulation/pull/588\r\n* https://github.com/folio-org/mod-circulation/pull/590\r\n* https://github.com/folio-org/mod-circulation/pull/584\r\n\r\n## Purpose\r\n<!--\r\n Why are you making this change? There is nothing more important\r\n to provide to the reviewer and to future readers than the cause\r\n that gave rise to this pull request. Be careful to avoid circular\r\n statements like \"the purpose is to update the schema.\" and\r\n instead provide an explanation like \"there is more data to be provided and stored for Purchase Orders \r\n which is currently missing in the schema\"\r\n\r\n The purpose may seem self-evident to you now, but the standard to\r\n hold yourself to should be \"can a developer parachuting into this\r\n project reconstruct the necessary context merely by reading this\r\n section.\"\r\n\r\n If you have a relevant JIRA issue, add a link directly to the issue URL here.\r\n Example: https://issues.folio.org/browse/MODORDERS-70\r\n -->\r\n\r\n## Approach\r\n<!--\r\n How does this change fulfill the purpose? It's best to talk\r\n high-level strategy and avoid code-splaining the commit history.\r\n\r\n The goal is not only to explain what you did, but help other\r\n developers *work* with your solution in the future.\r\n-->\r\n\r\n#### TODOS and Open Questions\r\n<!-- OPTIONAL\r\n- [ ] Use GitHub checklists. When solved, check the box and explain the answer.\r\n-->\r\n\r\n## Learning\r\n<!-- OPTIONAL\r\n Help out not only your reviewer, but also your fellow developer!\r\n Sometimes there are key pieces of information that you used to come up\r\n with your solution. Don't let all that hard work go to waste! A\r\n pull request is a *perfect opportunity to share the learning that\r\n you did. 
Add links to blog posts, patterns, libraries or addons used\r\n to solve this problem.\r\n-->\r\n\r\n## Pre-Merge Checklist:\r\nBefore merging this PR, please go through the following list and take appropriate actions.\r\n\r\n- Does this PR meet or exceed the expected quality standards?\r\n - [ ] Code coverage on new code is 80% or greater\r\n - [ ] Duplications on new code is 3% or less\r\n - [ ] There are no major code smells or security issues\r\n- Does this introduce breaking changes?\r\n - [ ] Were any API paths or methods changed, added or removed?\r\n - [ ] Were there any schema changes?\r\n - [ ] Did any of the interface versions change?\r\n - [ ] Were permissions changed, added, or removed?\r\n - [ ] Are there new interface dependencies?\r\n - [ ] There are no breaking changes in this PR.\r\n \r\nIf there are breaking changes, please **STOP** and consider the following:\r\n\r\n- What other modules will these changes impact?\r\n- Do JIRAs exist to update the impacted modules?\r\n - [ ] If not, please create them\r\n - [ ] Do they contain the appropriate level of detail? Which endpoints/schemas changed, etc.\r\n - [ ] Do they have all they appropriate links to blocked/related issues?\r\n- Are the JIRAs under active development? \r\n - [ ] If not, contact the project's PO and make sure they're aware of the urgency.\r\n- Do PRs exist for these changes?\r\n - [ ] If so, have they been approved?\r\n\r\nIdeally all of the PRs involved in breaking changes would be merged in the same day to avoid breaking the folio-testing environment. Communication is paramount if that is to be achieved, especially as the number of intermodule and inter-team dependencies increase. \r\n\r\nWhile it's helpful for reviewers to help identify potential problems, ensuring that it's safe to merge is ultimately the responsibility of the PR assignee.",
"title": "Bug/fix release 19.0.6",
"type": "issue"
}
] | 2 | 3 | 10,435 | false | true | 3,523 | false |
LeoChen98/BiliRaffle | null | 896,177,556 | 1,000 | null | [
{
"action": "closed",
"author": "LeoChen98",
"comment_id": null,
"datetime": 1622252523000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 1,759 | false | true | 0 | false |
keyonvafa/tbip | null | 700,662,921 | 3 | null | [
{
"action": "opened",
"author": "yoyoyyono",
"comment_id": null,
"datetime": 1600036864000,
"masked_author": "username_0",
"text": "Hi, I've been using your repository a few weeks ago to estimate social media user ideal points. However, I have noticed that when I try to run the model with more than 800 authors, the model does not converge. Specifically the ELBO returned nan values. Have you ever run the model with more than 800 authors? Also, do you know some article that discusses variational inference convergence problems? I think my problem may be due to the number of parameters but I am not sure.\nI would appreciate if you could guide me, Thanks!",
"title": "Max authors",
"type": "issue"
},
{
"action": "created",
"author": "keyonvafa",
"comment_id": 691737652,
"datetime": 1600038006000,
"masked_author": "username_1",
"text": "Hi, hmm. I haven't tried running with 800 authors but that shouldn't be the nan issue (each author is only adding one extra parameter to the model). Out of curiosity, what happens if you keep the dataset the same but change the author indices so that there are only 2 authors (i.e. incorrectly label the authors)? I assume the nans would still be there, but if they're not, that would confirm that the issue is with the number of authors.\r\n\r\nAre you using the TensorFlow or PyTorch implementation? And what is the vocabulary size and the number of documents you're using?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "yoyoyyono",
"comment_id": 730111825,
"datetime": 1605757881000,
"masked_author": "username_0",
"text": "Sorry for the lateness of my reply, I had to pause the project for a while. Changing the author indices leaving only 2 authors solved the problem. However, days later I was able to find the root of my problem and it was not the number of authors. The problem was generated because I had authors with 0 vocabulary words and the optimization placed a 0 in the rate parameter of the Poisson distribution, generating Nans in the log_prob.\r\nHowever, eliminating the authors with 0 words in the vocabulary, I have been able to estimate ideal points for datasets with 100,000 authors.\r\nThanks for the answer and for the excellent tutorial on Google Colab.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "keyonvafa",
"comment_id": null,
"datetime": 1605806581000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "keyonvafa",
"comment_id": 730521223,
"datetime": 1605806581000,
"masked_author": "username_1",
"text": "Great, I'm glad it's working now. And thank you!",
"title": null,
"type": "comment"
}
] | 2 | 5 | 1,792 | false | false | 1,792 | false |
statsmodels/statsmodels | statsmodels | 653,455,681 | 6,865 | {
"number": 6865,
"repo": "statsmodels",
"user_login": "statsmodels"
} | [
{
"action": "opened",
"author": "bashtage",
"comment_id": null,
"datetime": 1594227718000,
"masked_author": "username_0",
"text": "Mark as not implemented to stop use\r\n\r\ncloses #2347\r\n\r\n- [ ] closes #xxxx\r\n- [ ] tests added / passed. \r\n- [ ] code/documentation is well formatted. \r\n- [ ] properly formatted commit message. See \r\n [NumPy's guide](https://docs.scipy.org/doc/numpy-1.15.1/dev/gitwash/development_workflow.html#writing-the-commit-message). \r\n\r\n<details>\r\n\r\n\r\n**Notes**:\r\n\r\n* It is essential that you add a test when making code changes. Tests are not \r\n needed for doc changes.\r\n* When adding a new function, test values should usually be verified in another package (e.g., R/SAS/Stata).\r\n* When fixing a bug, you must add a test that would produce the bug in master and\r\n then show that it is fixed with the new code.\r\n* New code additions must be well formatted. Changes should pass flake8. If on Linux or OSX, you can\r\n verify you changes are well formatted by running \r\n ```\r\n git diff upstream/master -u -- \"*.py\" | flake8 --diff --isolated\r\n ```\r\n assuming `flake8` is installed. This command is also available on Windows \r\n using the Windows System for Linux once `flake8` is installed in the \r\n local Linux environment. While passing this test is not required, it is good practice and it help \r\n improve code quality in `statsmodels`.\r\n* Docstring additions must render correctly, including escapes and LaTeX.\r\n\r\n</details>",
"title": "MAINT: Mark VAR from_formula as NotImplemented",
"type": "issue"
}
] | 2 | 2 | 1,632 | false | true | 1,329 | false |
lihangleo2/ShadowLayout | null | 757,921,960 | 74 | null | [
{
"action": "opened",
"author": "dengyiqian",
"comment_id": null,
"datetime": 1607260055000,
"masked_author": "username_0",
"text": "如果有多个子view FrameLayout排版很不方便,继承ConstraintLayout布局就很灵活了",
"title": "多个子view排版不够灵活",
"type": "issue"
},
{
"action": "created",
"author": "lihangleo2",
"comment_id": 741395654,
"datetime": 1607477244000,
"masked_author": "username_1",
"text": "暂且不支持多个子view排布,ShadowLayout只支持一个子view,如果有多布局,你可以当前子view用LinearLayout或RelativeLayout有点类似ScrollView的使用",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "lihangleo2",
"comment_id": null,
"datetime": 1607477244000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 155 | false | false | 155 | false |
PuruVJ/puruvjdev2 | null | 773,483,083 | 10 | {
"number": 10,
"repo": "puruvjdev2",
"user_login": "PuruVJ"
} | [
{
"action": "opened",
"author": "PuruVJ",
"comment_id": null,
"datetime": 1608698945000,
"masked_author": "username_0",
"text": "",
"title": "Blog: Top Level Await",
"type": "issue"
},
{
"action": "created",
"author": "PuruVJ",
"comment_id": 750766134,
"datetime": 1608791121000,
"masked_author": "username_0",
"text": "To be merged on Monday",
"title": null,
"type": "comment"
}
] | 2 | 3 | 475 | false | true | 22 | false |
machty/ember-concurrency | null | 641,659,431 | 362 | {
"number": 362,
"repo": "ember-concurrency",
"user_login": "machty"
} | [
{
"action": "opened",
"author": "chancancode",
"comment_id": null,
"datetime": 1592533758000,
"masked_author": "username_0",
"text": "Had I known that this is a thing, I would probably have written the types a bit differently, but this should work and is non-breaking.\r\n\r\nI didn't make `EncapsulatedTaskInstance` public because I had this lingering suspicious that we'll need to add a second generic parameter to `TaskInstance` for the args. I don't want to lockdown that possibility, or deal with strangely mismatched ordering of parameters between `TaskInstance` vs `EncapsulatedTaskInstance`, so I kept that private.\r\n\r\nFor the time being, the public way to type it (as seen in the tests) is `TaskInstance<T> & { ... }`, which I think is plenty acceptable.\r\n\r\nGot to update e-c-async and e-c-ts to account for this, then update the Octane tests.\r\n\r\ncc @username_1",
"title": "Allow accessing encapsulated task state",
"type": "issue"
},
{
"action": "created",
"author": "andreyfel",
"comment_id": 646675487,
"datetime": 1592577738000,
"masked_author": "username_1",
"text": "Hi @username_0! Thank you for this PR! It seems that it is covering my use case! I've tested against that branch and it works!\r\n\r\nThe only thing I'm missing now is how to type `this` within the `perform` function of the encapsulated task. If I want to set a property on task instance inside perform what type does it have? EncapsulatedTaskInstance?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chancancode",
"comment_id": 646844755,
"datetime": 1592597713000,
"masked_author": "username_0",
"text": "I don't think we should block on making `EncapsulatedTaskInstance` public. It is very rare that you would have to type it (please show some examples – most cases I know of are either implicit or can be inferred, and I think it's rare that you would want to use this in a parameter or return position).\r\n\r\nI do think people will copy the type alias, and I think that's not ideal but not so bad. If we a public one, it will be compatible with and won't break these existing usages.\r\n\r\nI do think we should work towards making it public, we just have to address the other blocker first– whether we need to add the second generic to `TaskInstance` – and that's far from a hypothetical/theoretical issue. I don't think this PR is a good place to address that but we can also come to a decision on that fairy soon and fix both.\r\n\r\nIn the meantime, I think this adds enough values to those who need the feature and it's not obvious to me that they would be inconvenienced by this during normal usage.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chancancode",
"comment_id": 646846670,
"datetime": 1592598070000,
"masked_author": "username_0",
"text": "Why did you have to type it? There are tests that confirms the `this` inference works correctly: https://github.com/username_0/ember-concurrency/blob/encapsulated-task-v2/tests/types/ember-concurrency-test.ts#L2347-L2348\r\n\r\nIs that not working for you for some reason?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "andreyfel",
"comment_id": 647405720,
"datetime": 1592818819000,
"masked_author": "username_1",
"text": "@username_0, yes, it works in a basic case. But I have a little bit more complex case.\r\nThe idea is to have a task which exposes a property isRunningLong which should turn true in\r\ncase if the task takes longer than `loadingTimeout`. I'm using an encapsulated task with an inner task for it:\r\n\r\n```\r\nexport interface LongRunningTaskDescriptor<T, Args extends any[]> extends EncapsulatedTaskDescriptor<T, Args> {\r\n isRunningLong: boolean,\r\n timeoutTask: TaskProperty<void, []>,\r\n}\r\n\r\nexport function longRunningTask<T, Args extends any[]>(fn: TaskFunction<T, Args>, loadingTimeout = 300): LongRunningTaskDescriptor<T, Args> {\r\n return {\r\n isRunningLong: false,\r\n\r\n timeoutTask: task(function * (this: LongRunningTaskDescriptor<T, Args>) {\r\n yield timeout(loadingTimeout);\r\n set(this, 'isRunningLong', true);\r\n }).drop(),\r\n\r\n * perform(...args): TaskGenerator<T> {\r\n try {\r\n this.timeoutTask.perform();\r\n return yield * fn.apply(this.context, args);\r\n } finally {\r\n this.timeoutTask.cancelAll();\r\n set(this, 'isRunningLong', false);\r\n }\r\n },\r\n };\r\n}\r\n```\r\n\r\nIt is used like this with new e-c and e-c-d:\r\n```\r\n@task({ restartable: true })\r\nfetchSmth = longRunningTask(function * () {\r\n yield fetchSmth();\r\n}, 600)\r\n```\r\n\r\nSo, I have to specify type of `this` for the `timeoutTask`. And inside the `perform` function this.timeoutTask.perform doesn't work. And `taskFor` doesn't work here.\r\nAlso typescript says that this.context doesn't exist. I'm not sure if it is a type issue or context is not a public filed.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "maxfierke",
"comment_id": 647821849,
"datetime": 1592869046000,
"masked_author": "username_2",
"text": "`context` is sort of \"intimate\", undocumented API. It's useful for implementing extensions onto e-c, so perhaps its worth making public but its not something we'd expect people to use often (though with encapsulated tasks, it might make more sense.)\r\n\r\nI think a composition like that to create an encapsulated task is a bit of an advanced use-case, so if it's working in the basic case, it may be worth deferring for later unless addressing it is fairly straightforward.",
"title": null,
"type": "comment"
}
] | 3 | 6 | 4,398 | false | false | 4,398 | true |
openshift/release | openshift | 759,691,875 | 14,173 | {
"number": 14173,
"repo": "release",
"user_login": "openshift"
} | [
{
"action": "opened",
"author": "hongkailiu",
"comment_id": null,
"datetime": 1607453412000,
"masked_author": "username_0",
"text": "Missed in https://github.com/openshift/release/pull/14081\r\nThe deployment is not controlled by the `make` target.\r\nI have to manually edit it.\r\n\r\n/cc @stevekuznetsov @smarterclayton",
"title": "Apply soft-delete-release-tags to RC of origin",
"type": "issue"
}
] | 2 | 3 | 1,923 | false | true | 181 | false |
emilybache/GildedRose-Refactoring-Kata | null | 874,741,177 | 229 | {
"number": 229,
"repo": "GildedRose-Refactoring-Kata",
"user_login": "emilybache"
} | [
{
"action": "opened",
"author": "Pen-y-Fan",
"comment_id": null,
"datetime": 1620061087000,
"masked_author": "username_0",
"text": "- update README.md with latest PHP information\r\n- update composer.json to support PHP 7.3 to PHP8\r\n - active support for PHP 7.2 ended 6 Dec 2020\r\n - PHP8 was released 26-Nov-2020\r\n - update the dependencies\r\n- PHPUnit now version 9.5 and config file updated\r\n- ECS now version 9.3 and config file changed from `ecs.yaml` to `ecs.php`\r\n\r\nApprovalTest removed, in line with latest readme, all set for refactoring :)\r\n\r\nTested with PHP 7.3, 7.4 and 8.0 one failing \"fixme\" != \"foo\" test!",
"title": "Add PHP8",
"type": "issue"
},
{
"action": "created",
"author": "emilybache",
"comment_id": 831703936,
"datetime": 1620108449000,
"masked_author": "username_1",
"text": "Thankyou!",
"title": null,
"type": "comment"
}
] | 2 | 2 | 497 | false | false | 497 | false |
pytorch/pytorch | pytorch | 704,166,113 | 44,938 | null | [
{
"action": "opened",
"author": "nightlessbaron",
"comment_id": null,
"datetime": 1600415044000,
"masked_author": "username_0",
"text": "## 🚀 Feature\r\n<!-- A clear and concise description of the feature proposal -->\r\nAdding DataParallel method for CPU\r\n\r\n## Motivation\r\n\r\n<!-- Please outline the motivation for the proposal. Is your feature request related to a problem? e.g., I'm always frustrated when [...]. If this is related to another GitHub issue, please link here too -->\r\nRecently, I have been working on few-shot learning and meta-learning models which in particular need small datasets. Let's take MAML for example. We have scope for parallelizing the inner loop and data parallelism seemed to be the answer for that. While, using GPUs does the work, we can also make use of CPUs for the same, as the data is sufficiently small.\r\n\r\n## Pitch\r\n\r\n<!-- A clear and concise description of what you want to happen. -->\r\nMy question is, is it worth looking into adding an option for CPUs in DataParallel module along with GPUs?",
"title": "DataParallel on CPU",
"type": "issue"
},
{
"action": "created",
"author": "agolynski",
"comment_id": 694897420,
"datetime": 1600438857000,
"masked_author": "username_1",
"text": "@username_0 Would https://pytorch.org/docs/stable/multiprocessing.html work for your usecase or you are proposing PT to have data parallel wrapper for this?",
"title": null,
"type": "comment"
}
] | 2 | 2 | 1,054 | false | false | 1,054 | true |
rajatdayal01/github-slideshow | null | 823,579,223 | 1 | null | [
{
"action": "closed",
"author": "rajatdayal01",
"comment_id": null,
"datetime": 1615013892000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 6,807 | false | true | 0 | false |
icsm-au/DynAdjust | icsm-au | 860,801,499 | 93 | {
"number": 93,
"repo": "DynAdjust",
"user_login": "icsm-au"
} | [
{
"action": "opened",
"author": "rogerfraser",
"comment_id": null,
"datetime": 1618791057000,
"masked_author": "username_0",
"text": "",
"title": "Minor enhancements, bug fix, more test data and test scripts",
"type": "issue"
}
] | 2 | 2 | 0 | false | true | 0 | false |
ericoporto/ImGi | null | 812,987,372 | 22 | null | [
{
"action": "opened",
"author": "ericoporto",
"comment_id": null,
"datetime": 1613952180000,
"masked_author": "username_0",
"text": "I think the library is not fully featured yet but it's a bit big already. Find out what is possible to cut, maybe use some macros and overall try to reduce a bit the line count of the file.",
"title": "Reduce size of the .scm",
"type": "issue"
},
{
"action": "created",
"author": "ericoporto",
"comment_id": 782973006,
"datetime": 1613955610000,
"masked_author": "username_0",
"text": "Probably I may have left something unused in the renderer since it's a huge part of the code now.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 286 | false | false | 286 | false |
agrbin/svgtex | null | 852,104,057 | 20 | null | [
{
"action": "opened",
"author": "alusiani",
"comment_id": null,
"datetime": 1617780601000,
"masked_author": "username_0",
"text": "Thanks for this nice piece of software!\r\n\r\nAlberto",
"title": "failure to render a long math formula",
"type": "issue"
},
{
"action": "created",
"author": "agrbin",
"comment_id": 817331753,
"datetime": 1618157453000,
"masked_author": "username_1",
"text": "Thank you for the message and thanks for the nice words!\r\n\r\nIt's been a long time since I looked at this code :) I won't be able to help quickly.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 195 | false | false | 195 | false |
gdamore/tcell | null | 733,452,624 | 404 | null | [
{
"action": "opened",
"author": "walles",
"comment_id": null,
"datetime": 1604085633000,
"masked_author": "username_0",
"text": "Hi!\r\n\r\nIn 1.x, `tcell.Color(74)` used to return a value matching `tcell.Color74.`\r\n\r\nIn 2.0, those values are different.\r\n\r\nIs this something I should worry about?\r\n\r\n Regards /Johan\r\n\r\n# Repro\r\n```go\r\npackage main\r\n\r\nimport (\r\n\t\"fmt\"\r\n\t\"strconv\"\r\n\r\n\t\"github.com/username_1/tcell\"\r\n)\r\n\r\nfunc main() {\r\n\tcolorValue := tcell.Color(74)\r\n\r\n\tfmt.Printf(\"Created color value: %s\\n\", strconv.FormatInt(int64(colorValue), 16))\r\n\tfmt.Printf(\"Constant color value: %s\\n\", strconv.FormatInt(int64(tcell.Color74), 16))\r\n\tif colorValue == tcell.Color74 {\r\n\t\tfmt.Println(\"Equal, good.\")\r\n\t} else {\r\n\t\tfmt.Println(\"Not equal, bad!\")\r\n\t}\r\n}\r\n```",
"title": "In 2.0, color constants don't match created colors",
"type": "issue"
},
{
"action": "closed",
"author": "gdamore",
"comment_id": null,
"datetime": 1604087283000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "gdamore",
"comment_id": 719761660,
"datetime": 1604087283000,
"masked_author": "username_1",
"text": "You should not worry about it. The numeric value of the color includes a bit indicating that the color is \"valid\". The low order bits are still 74 (or whatever).\r\n\r\nThe reason for adding in an extra high order bit is so that we can treat 0 (normally black) as an uninitialized value.\r\n\r\nThe numeric values are \"private\" -- if you want the actual RGB value, you can ask for it. We don't give you the palette index as a value -- mostly because we didn't see a need for that, but if you want to have that I'm sure its an API we could provide.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gdamore",
"comment_id": 719761852,
"datetime": 1604087309000,
"masked_author": "username_1",
"text": "(These changes in the color handling were why tcell was bumped to v2 btw...)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "walles",
"comment_id": 719895462,
"datetime": 1604127978000,
"masked_author": "username_0",
"text": "So I have a function returning different colors depending on its input, and I want to test it.\r\n\r\nCurrently the test looks like this, and it works on 1.4 and fails on 2.0:\r\nhttps://github.com/username_0/moar/blob/a174d861801c4c206c116635638d430c08361c19/m/ansiTokenizer_test.go#L93-L99\r\n\r\n```go\r\nfunc TestConsumeCompositeColorHappy(t *testing.T) {\r\n\t// 8 bit color\r\n\t// Example from: https://github.com/username_0/moar/issues/14\r\n\tnewIndex, color, err := consumeCompositeColor([]string{\"38\", \"5\", \"74\"}, 0)\r\n\tassert.NilError(t, err)\r\n\tassert.Equal(t, newIndex, 3)\r\n\tassert.Equal(t, *color, tcell.Color74)\r\n```\r\n\r\nWhat would be the best way for me to phrase this test with tcell 2.0?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "walles",
"comment_id": 723466729,
"datetime": 1604767076000,
"masked_author": "username_0",
"text": "@username_1 should I turn my last comment here into a new ticket, do you want to re-open this one or something else?\r\n\r\nI would *like* to re-open this issue until it has been hashed out, but I don't have permissions to do so.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gdamore",
"comment_id": 723477256,
"datetime": 1604772965000,
"masked_author": "username_1",
"text": "Wait a minute -- you're trying to assert that this color (tcell.Color74) has a specific numeric value? That specifically is not part of our public API.\r\n\r\nHowever.... if you want to get the TrueColor value of tcell.Color74, you can do that via a call to the colors RGB() or Hex() methods. That will give you a value. Maybe that will resolve what you want?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gdamore",
"comment_id": 723477498,
"datetime": 1604773120000,
"masked_author": "username_1",
"text": "Specifically try \"tcell.Color74.Hex()\" to get the 24 bit value, or tcell.Color74.RGB() to return the red, green, blue components. Admittedly when using palette colors this is based on the published default palette of xterm, which may or may not match what folks have configured. (Generally palette entries above >= 16 are rarely if ever modified by themes, but the lower 16 entries are frequently changed via themes.)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gdamore",
"comment_id": 723477743,
"datetime": 1604773253000,
"masked_author": "username_1",
"text": "Btw, if you need to create a color a specific RGB value, you should use NewRGBColor(). It will create the precise value, or as close a match as we can using the palette if direct RGB colors are not supported by the terminal.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "walles",
"comment_id": 723540102,
"datetime": 1604819781000,
"masked_author": "username_0",
"text": "Well, verifying that my color has a specific numeric value is *exactly* what I want to do.\r\n\r\n# Background\r\nIn this case, [moar](https://github.com/username_0/moar) is a pager.\r\n\r\nAnd the test case outlined above is for verifying that it converts an input ANSI escape sequence requesting color 74 to `tcell.Color74`.\r\n\r\nSince this is a pager, it should display the same thing as `cat` would, but with paging.\r\n\r\nSo **I can't convert to RGB**, that would make `moar` display the wrong things, by not following the terminal theming.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "walles",
"comment_id": 730358716,
"datetime": 1605790701000,
"masked_author": "username_0",
"text": "@username_1 any opinions about this use case? ^",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gdamore",
"comment_id": 730402486,
"datetime": 1605795226000,
"masked_author": "username_1",
"text": "The colors starting at index 1 will have the same offset. So compare the difference to color1 if you need that verification. \n\nAlternatively just compare the RGB values which should be good enough for your self tests.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "walles",
"comment_id": 739562080,
"datetime": 1607287991000,
"masked_author": "username_0",
"text": "Given that I have a variable containing the number `74`, what do you think would be the best way for me to turn that into `tcell.Color74`?\r\n\r\nI have [a workaround](https://github.com/username_0/moar/blob/08db9e9cd6232ca0c13c7b434d69c6d16321fd5d/m/ansiTokenizer.go#L426-L432) for how to do that, but it doesn't feel great, so I'd like to know what you think would be the best way.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gdamore",
"comment_id": 740055057,
"datetime": 1607361169000,
"masked_author": "username_1",
"text": "Take 73 (74-1) and add it to Color1.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gdamore",
"comment_id": 740055707,
"datetime": 1607361239000,
"masked_author": "username_1",
"text": "Maybe I should add a method to create a color from a palette index.",
"title": null,
"type": "comment"
}
] | 2 | 15 | 4,410 | false | false | 4,410 | true |
ngageoint/scale-ui | ngageoint | 456,251,953 | 123 | null | [
{
"action": "opened",
"author": "cshamis",
"comment_id": null,
"datetime": 1560519192000,
"masked_author": "username_0",
"text": "Permission denied error when trying to view docs in new ui.",
"title": "Docs in 7.0 don't work",
"type": "issue"
},
{
"action": "created",
"author": "ericsvendsen-mil",
"comment_id": 502274976,
"datetime": 1560547876000,
"masked_author": "username_1",
"text": "This is most likely due to an incorrect value specified in the runtime config for the `documentation` property. As a result this can be fixed where it is currently deployed.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "emimaesmith",
"comment_id": 502658701,
"datetime": 1560773858000,
"masked_author": "username_2",
"text": "I also didn't build the docs when I deployed Scale the last few times because it was taking 10+ minutes to run the build. Taking out building the docs dropped the build time in half.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mikenholt",
"comment_id": null,
"datetime": 1563807126000,
"masked_author": "username_3",
"text": "",
"title": null,
"type": "issue"
}
] | 4 | 4 | 414 | false | false | 414 | false |
asvetliakov/vscode-neovim | null | 775,826,454 | 468 | null | [
{
"action": "opened",
"author": "sombra-yevstakhii-krul",
"comment_id": null,
"datetime": 1609238791000,
"masked_author": "username_0",
"text": "yesterday everything worked perfectly so I assume the latest update broke something. When I use commands like `h`/`j`/`k`/`l` or even `gg` or `0` cursor moves in neovim but vscode shows it in its previous position. If I enter insert mode and then move cursor it updates correctly, search also works fine. Tried empty vimrc, didn't help.\r\n\r\nUsing macOS Big Sur 11.0.1\r\n\r\nneovim version:\r\n\r\n",
"title": "Most motion commands stopped working after update",
"type": "issue"
},
{
"action": "created",
"author": "David-Else",
"comment_id": 752048893,
"datetime": 1609242699000,
"masked_author": "username_1",
"text": "I am a bit unclear about what you mean. You are saying NeoVim works fine with h/j/k/l but VS Code with the extension in normal mode does not? What does 'previous position' refer to? Are you running an actual instance of NeoVim as well as VS Code?\r\n\r\nIt works fine here with `NVIM v0.5.0-dev+975-ga58c5509d\r\n` from a day or two ago and the latest NeoVim 0.0.72 extension on Linux.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sombra-yevstakhii-krul",
"comment_id": 752052500,
"datetime": 1609243529000,
"masked_author": "username_0",
"text": "I meant that it seems like cursor position doesn't update in vscode when it updates in neovim, hopefully a video will help:\r\n\r\nhttps://user-images.githubusercontent.com/50657402/103282348-278c3700-49de-11eb-9347-a422c9c35f9b.mov\r\n\r\nIn the video I press `j`/`k` a bunch of times, then use insert mode, then press `x` in normal a few times. I'm using [quick-scope](https://github.com/unblevable/quick-scope) that highlights unique characters in each word on current line so that I can easily use `f`/`F` to jump there. As you can see quick scope (vim plugin) updates, but vscode doesn't until I go to insert mode and move cursor.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sombra-yevstakhii-krul",
"comment_id": 752054513,
"datetime": 1609243978000,
"masked_author": "username_0",
"text": "Update: tried installing different versions, 0.0.63 works, everything above it doesn't",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "asvetliakov",
"comment_id": 752079001,
"datetime": 1609249369000,
"masked_author": "username_2",
"text": "@username_0 can you enable debug logs in ext settings and upload it somewhere?\r\n\r\nAlso can you check if the issue still happens with default vscode settings and optionally with all extensions disabled?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sombra-yevstakhii-krul",
"comment_id": 752125537,
"datetime": 1609256203000,
"masked_author": "username_0",
"text": "Sorry for not replying for so long, updating neovim required also updating xcode and then it took me some time to realize I also need to update luajit (with `--HEAD` flag in brew)\r\n\r\nAnyway update solved the issue, thanks",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "sombra-yevstakhii-krul",
"comment_id": null,
"datetime": 1609256203000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 7 | 2,026 | false | false | 2,026 | true |
Molunerfinn/PicGo | null | 874,323,863 | 666 | null | [
{
"action": "opened",
"author": "vay1314",
"comment_id": null,
"datetime": 1620027979000,
"masked_author": "username_0",
"text": "<!--\r\nPicGo Issue 模板\r\n请依照该模板来提交,否则将会被关闭。\r\n**提问之前请注意你看过 FAQ、配置手册以及那些被关闭的 issues。否则同样的提问也会被关闭!**\r\n-->\r\n\r\n**声明:我已经仔细看过 [文档](https://picgo.github.io/PicGo-Doc/)、[FAQ](https://github.com/username_1/PicGo/blob/dev/FAQ.md)、和搜索过已经关闭的 [issues](https://github.com/username_1/PicGo/issues?q=is%3Aissue+sort%3Aupdated-desc+is%3Aclosed) 后依然没有找到答案,所以才发了一个新的 issue。**\r\n\r\n## 问题类型\r\n\r\nBug Report\r\n\r\n## PicGo 的相关信息\r\n\r\nPicGo 的版本:2.3.0-beta6\r\n所在平台:Windows\r\n\r\n## 问题重现\r\n\r\n<!-- 如果是 Bug Report 请填写本项 -->\r\n<!-- 请附上相关截图 -->\r\n<!-- 请附上 PicGo 的相关报错日志(用文本的形式)否则无法判断原因-->\r\n<!-- 报错日志可以在 PicGo 设置 -> 设置日志文件 -> 点击打开 后找到 -->\r\n在使用typora中粘贴图片后自动上传到图床,PicGo每次都提示有前序任务,但其实没有上传任务。\r\n使用的插件为ssh-scp-uploader\r\n截图和日志如下\r\n\r\n\r\n\r\n2021-05-03 15:32:48 [PicGo INFO] [PicGo Server] get the request \r\n2021-05-03 15:32:48 [PicGo INFO] [PicGo Server] upload files in list \r\n2021-05-03 15:32:48 [PicGo INFO] Before transform \r\n2021-05-03 15:32:48 [PicGo INFO] Transforming... Current transformer is [path] \r\n2021-05-03 15:32:48 [PicGo INFO] [PicGo Server] get the request \r\n2021-05-03 15:32:48 [PicGo INFO] [PicGo Server] upload files in list \r\n2021-05-03 15:32:48 [PicGo WARN] [PicGo Server] upload failed, see picgo.log for more detail ↑ \r\n2021-05-03 15:32:48 [PicGo WARN] [PicGo Server] upload failed, see picgo.log for more detail ↑ \r\n2021-05-03 15:32:48 [PicGo INFO] Before upload \r\n2021-05-03 15:32:48 [PicGo INFO] beforeUploadPlugins: renameFn running \r\n2021-05-03 15:32:48 [PicGo INFO] Uploading... Current uploader is [ssh-scp-uploader] \r\n2021-05-03 15:32:50 [PicGo SUCCESS] \r\nhttps://xxxxxxxxxxxxxxx/uploads/2021/05/image-20210503153248908.png \r\n## 功能请求\r\n\r\n<!-- 如果是 Feature Request 请填写本项 -->\r\n<!-- 详细描述你所预想的功能或者是现有功能的改进。 -->\r\n\r\n---\r\n\r\n<!-- \r\n 最后,喜欢 PicGo 的话不妨给它点个 star~\r\n 如果可以的话,请我喝杯咖啡?首页有赞助二维码,谢谢你的支持! \r\n -->",
"title": "PicGo提示前序任务问题",
"type": "issue"
},
{
"action": "created",
"author": "Molunerfinn",
"comment_id": 831097551,
"datetime": 1620029232000,
"masked_author": "username_1",
"text": "从日志中确实收到了两次上传请求,上一次还没结束,下一次就请求就过来了。,上一次还没上传完会出现这个提示是正常的。\n\n后续会优化一下",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vay1314",
"comment_id": 831137479,
"datetime": 1620033836000,
"masked_author": "username_0",
"text": "也就是PicGo同一时间自动执行了两遍上传是吧?还有个问题请教下,之前在typora粘贴本地图片后,PicGo自动上传成功后,会自动把typora中的本地地址替换成上传成功后的网络地址,出现这个前序任务后上传成功也不会替换了,这个是typora的还是PicGo的问题?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Molunerfinn",
"comment_id": 831143378,
"datetime": 1620034521000,
"masked_author": "username_1",
"text": "普通剪贴板粘贴图片的话typora不应该发两个请求过来。后续picgo这侧来处理吧",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rizaust",
"comment_id": 833174487,
"datetime": 1620267404000,
"masked_author": "username_2",
"text": "+1",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "liyunfu123",
"comment_id": 833434600,
"datetime": 1620298885000,
"masked_author": "username_3",
"text": "gitee 也出现了 前序任务,覆盖安装2.2正式版(不会出现前序) 然后到2.3 beta5(出现前序)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Molunerfinn",
"comment_id": 833438041,
"datetime": 1620299255000,
"masked_author": "username_1",
"text": "会在下个beta版本解决",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Molunerfinn",
"comment_id": 833685898,
"datetime": 1620320219000,
"masked_author": "username_1",
"text": "https://github.com/typora/typora-issues/issues/4379\r\n\r\n已经上报typora",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Molunerfinn",
"comment_id": 833691462,
"datetime": 1620320690000,
"masked_author": "username_1",
"text": "有问题的可以先降级到0.9.x版本的typora",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ytfsL",
"comment_id": 834987227,
"datetime": 1620441523000,
"masked_author": "username_4",
"text": "我也遇到这个问题了,就是typora的问题,用typora测试上传的时候会上传两张图片",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Molunerfinn",
"comment_id": null,
"datetime": 1620528746000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "Molunerfinn",
"comment_id": 835645135,
"datetime": 1620528906000,
"masked_author": "username_1",
"text": "下个版本会解决「前序任务」提示问题,现在每个上传任务都是独立的了,不会有前序任务问题。\r\n\r\n但是typora还是会同时发起两次上传同一张图片的请求,在某些图床下会发生问题。比如 GitHub,不支持上传同名文件,一旦第一张上传成功,第二张一样的图片就会失败。但是其他一些云服务商不会,比如腾讯云、阿里云等。这个需要等typora修复。如果影响使用,请降级typora至0.9.98。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Molunerfinn",
"comment_id": 835791560,
"datetime": 1620561411000,
"masked_author": "username_1",
"text": "更新,typora最新的0.10.9已经修复",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ysinx",
"comment_id": 868168327,
"datetime": 1624590351000,
"masked_author": "username_5",
"text": "我用 picgo core 也遇到这问题",
"title": null,
"type": "comment"
}
] | 6 | 14 | 2,549 | false | false | 2,549 | true |
AppsFlyerSDK/appsflyer-flutter-plugin | AppsFlyerSDK | 863,484,815 | 120 | null | [
{
"action": "opened",
"author": "oyzxchi",
"comment_id": null,
"datetime": 1618985456000,
"masked_author": "username_0",
"text": "I want to use appsflyer to get advertising information from Facebook, such as Facebook advertising campaign, ads set, Ads and other information. The information I see on af is \"Get af_dp in onConversionDataSuccess callback\". I want to know which interface method should be used To get the onConversionDataSuccess callback\r\n\r\nMy appsflyer_sdk version is 6.2.4+1-flutterv1\r\n\r\nflutter doctor:\r\nDoctor summary (to see all details, run flutter doctor -v):\r\n[✓] Flutter (Channel stable, 1.22.2)\r\n[✓] Android toolchain - develop for Android devices (Android SDK version 30.0.2)\r\n[✓] Xcode - develop for iOS and macOS (Xcode 12.3)\r\n[✓] Android Studio (version 4.0)\r\n[✓] VS Code (version 1.54.1)\r\n[✓] Connected device (1 available)",
"title": "How to get campaign name of FB ADS by AppsFlyer?",
"type": "issue"
},
{
"action": "closed",
"author": "GM-appsflyer",
"comment_id": null,
"datetime": 1619695333000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 722 | false | false | 722 | false |
aws/aws-cdk | aws | 856,681,935 | 14,130 | null | [
{
"action": "opened",
"author": "tom10271",
"comment_id": null,
"datetime": 1618299993000,
"masked_author": "username_0",
"text": "<!--\r\ndescription of the bug:\r\n-->\r\n\r\n### Reproduction Steps\r\nIt is hard for me to write everything down here or to prepare a minimum reproducible code. I don't time for that\r\n\r\nConsider I have a VPC with RDS and EC2, whatever working and then added these.\r\n\r\n```ts\r\nexport function storageSetup(scope: Construct) {\r\n STORAGE.EFS.testing = new FileSystem(scope, 'App testing server EFS', {\r\n vpc: NETWORKING.VPC,\r\n performanceMode: PerformanceMode.GENERAL_PURPOSE,\r\n enableAutomaticBackups: true,\r\n removalPolicy: RemovalPolicy.RETAIN,\r\n lifecyclePolicy: LifecyclePolicy.AFTER_90_DAYS,\r\n });\r\n}\r\n```\r\n\r\nAnd run `cdk diff`, the diff is to delete everything but create the EFS.\r\n\r\n<!--\r\nminimal amount of code that causes the bug (if possible) or a reference:\r\n-->\r\n\r\n### What did you expect to happen?\r\n\r\n<!--\r\nWhat were you trying to achieve by performing the steps above?\r\n-->\r\n\r\n### What actually happened?\r\n\r\n<!--\r\nWhat is the unexpected behavior you were seeing? If you got an error, paste it here.\r\n-->\r\n\r\n\r\n### Environment\r\n\r\n - **CDK CLI Version :** 1.95.1 (build ed2bbe6)\r\n - **Framework Version:** ???\r\n - **Node.js Version:** <!-- Version of Node.js (run the command `node -v`) --> v14.15.1\r\n - **OS :** macOS Catalina 10.15.2 (19C57)\r\n - **Language (Version):** <!-- [all | TypeScript (3.8.3) | Java (8)| Python (3.7.3) | etc... ] --> TS ~3.9.7\r\n\r\n### Other\r\n\r\n<!-- e.g. detailed explanation, stacktraces, related issues, suggestions on how to fix, links for us to have context, eg. associated pull-request, stackoverflow, slack, etc -->\r\n\r\n\r\n\r\n\r\n--- \r\n\r\nThis is :bug: Bug Report",
"title": "EFS: No clue but adding EFS into my existing stack result in deletion of everything",
"type": "issue"
},
{
"action": "created",
"author": "tom10271",
"comment_id": 818526575,
"datetime": 1618300388000,
"masked_author": "username_0",
"text": "And here is how CDK no longer delete everything but just to create the EFS is for me to create EFS with specifying which subnet to use\r\n\r\n```ts\r\n STORAGE.EFS.testing = new FileSystem(scope, 'App testing server EFS', {\r\n vpc: NETWORKING.VPC,\r\n performanceMode: PerformanceMode.GENERAL_PURPOSE,\r\n enableAutomaticBackups: true,\r\n removalPolicy: RemovalPolicy.RETAIN,\r\n lifecyclePolicy: LifecyclePolicy.AFTER_90_DAYS,\r\n vpcSubnets: {\r\n subnets: [NETWORKING.PUBLIC_SUBNET_2B]\r\n },\r\n });\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "iliapolo",
"comment_id": 820212874,
"datetime": 1618473810000,
"masked_author": "username_1",
"text": "@username_0 I am not able to reproduce this. \r\n\r\nUsing the following code:\r\n\r\n```ts\r\nconst vpc = new ec2.Vpc(this, 'Vpc');\r\n\r\n// Create the Database\r\nnew docdb.DatabaseCluster(this, 'todos-database', {\r\n masterUser: {\r\n username: 'epolon', // NOTE: 'admin' is reserved by DocumentDB\r\n },\r\n instanceType: ec2.InstanceType.of(ec2.InstanceClass.R5, ec2.InstanceSize.LARGE),\r\n vpcSubnets: {\r\n subnetType: ec2.SubnetType.PRIVATE\r\n },\r\n vpc,\r\n dbClusterName: \"todos-database-cluster\",\r\n removalPolicy: cdk.RemovalPolicy.DESTROY,\r\n});\r\n```\r\n\r\nIf I add an `efs.FileSystem`, I get the expected diff:\r\n\r\n```ts\r\nnew efs.FileSystem(this, 'FileSystem', {\r\n vpc,\r\n performanceMode: efs.PerformanceMode.GENERAL_PURPOSE,\r\n enableAutomaticBackups: true,\r\n removalPolicy: cdk.RemovalPolicy.RETAIN,\r\n lifecyclePolicy: efs.LifecyclePolicy.AFTER_90_DAYS,\r\n});\r\n```\r\n\r\n```console\r\n───┬────────────────────────────────────────┬─────┬────────────┬─────────────────┐\r\n│ │ Group │ Dir │ Protocol │ Peer │\r\n├───┼────────────────────────────────────────┼─────┼────────────┼─────────────────┤\r\n│ + │ ${FileSystem/EfsSecurityGroup.GroupId} │ Out │ Everything │ Everyone (IPv4) │\r\n└───┴────────────────────────────────────────┴─────┴────────────┴─────────────────┘\r\n(NOTE: There may be security-related changes not in this list. See https://github.com/aws/aws-cdk/issues/1299)\r\n\r\nResources\r\n[+] AWS::EFS::FileSystem FileSystem FileSystem8A8E25C0 \r\n[+] AWS::EC2::SecurityGroup FileSystem/EfsSecurityGroup FileSystemEfsSecurityGroup212D3ACB \r\n[+] AWS::EFS::MountTarget FileSystem/EfsMountTarget1 FileSystemEfsMountTarget1586453F0 \r\n[+] AWS::EFS::MountTarget FileSystem/EfsMountTarget2 FileSystemEfsMountTarget24B8EBB43 \r\n[+] AWS::EFS::MountTarget FileSystem/EfsMountTarget3 FileSystemEfsMountTarget37C2F9139 \r\n```\r\n\r\nAre you able to reproduce this in a clean installation? 
Is it possible something in your configuration is causing this? We need an isolated reproduction to keep investigating it.",
"title": null,
"type": "comment"
}
] | 3 | 5 | 4,384 | false | true | 4,235 | true |
Road-of-CODEr/we-hate-js | Road-of-CODEr | 845,651,923 | 50 | null | [
{
"action": "opened",
"author": "hayoung0Lee",
"comment_id": null,
"datetime": 1617156235000,
"masked_author": "username_0",
"text": "- [ ] 16장: 프로퍼티 어트리뷰트\r\n- [ ] 17장: 생성자 함수에 의한 객체 생성 \r\n- [ ] 18장: 함수와 일급 객체 \r\n- [ ] 19장: 프로토타입 \r\n- [ ] 20장: strict mode",
"title": "16~20장 스터디",
"type": "issue"
},
{
"action": "closed",
"author": "1ilsang",
"comment_id": null,
"datetime": 1626546560000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 117 | false | false | 117 | false |
conda-forge/aiobotocore-feedstock | conda-forge | 689,793,263 | 28 | {
"number": 28,
"repo": "aiobotocore-feedstock",
"user_login": "conda-forge"
} | [
{
"action": "created",
"author": "ocefpaf",
"comment_id": 684976346,
"datetime": 1598977292000,
"masked_author": "username_0",
"text": "All deps of deps that the right versions are specified on deps upstream. Adding them here may break the package.",
"title": null,
"type": "comment"
}
] | 4 | 4 | 3,315 | false | true | 112 | false |
microsoft/FluidFramework | microsoft | 878,170,895 | 6,065 | {
"number": 6065,
"repo": "FluidFramework",
"user_login": "microsoft"
} | [
{
"action": "opened",
"author": "ChumpChief",
"comment_id": null,
"datetime": 1620342663000,
"masked_author": "username_0",
"text": "Resolves #6015\r\n\r\nIf `createSubDirectory()` is called with a subdirectory name that already exists, we retrieve the already-existing subdirectory and return it rather than creating a new one. This PR documents that more explicitly, plus adds a couple tests that prove it more explicitly (it was tangentially tested in other cases, but this is more exhaustive).\r\n\r\n~~While looking at this, I noticed that we would still submit a `createSubDirectory` op in the case that it already existed -- this was benign since all collaborators would just ignore it basically, but we can eliminate this op traffic as unnecessary.~~\r\n\r\nActually this is revealing we have some eventual consistency bugs for simultaneous delete/create subdirectory in combination with key sets within that deleted/created subdirectory. Reverting the directory changes in favor of sorting those out before changing behavior -- so this is just a documentation and test PR now.",
"title": "Double createSubDirectory handling",
"type": "issue"
},
{
"action": "created",
"author": "ChumpChief",
"comment_id": 833972550,
"datetime": 1620347711000,
"masked_author": "username_0",
"text": "Filed #6069 for the eventual consistency issues discovered.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 1,001 | false | false | 1,001 | false |
EGI-Federation/documentation | EGI-Federation | 900,778,289 | 218 | {
"number": 218,
"repo": "documentation",
"user_login": "EGI-Federation"
} | [
{
"action": "opened",
"author": "mviljoen-egi",
"comment_id": null,
"datetime": 1621947618000,
"masked_author": "username_0",
"text": "improved top-level description of this section\r\n\r\n<!--\r\nA good PR should describe what benefit this brings to the repository.\r\nIdeally, there is an existing issue which the PR address.\r\n\r\nPlease check the [Contributing guide](https://docs.egi.eu/about/contributing/)\r\nfor style requirements and advice.\r\n-->\r\n\r\n# Summary\r\n\r\n<!-- Describe in plain English what this PR does -->\r\nimproved top-level description of this section\r\n---\r\n\r\n<!-- Add, if any, the related issue here, e.g. #6 -->\r\n\r\n**Related issue :**",
"title": "Update _index.md",
"type": "issue"
},
{
"action": "created",
"author": "gwarf",
"comment_id": 847897125,
"datetime": 1621951344000,
"masked_author": "username_1",
"text": "Thanks @username_0, it's still failing the tests due to spaces at end of lines: https://github.com/EGI-Federation/documentation/pull/218/checks?check_run_id=2665456529\r\n\r\nCan you please edit this PR to fix this?\r\nYou should be able to edit the file by going to the https://github.com/EGI-Federation/documentation/pull/218/files tabs and using `Edit file` from the `...` menu.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 886 | false | false | 886 | true |
davodesign84/react-native-mixpanel | null | 734,046,295 | 261 | null | [
{
"action": "opened",
"author": "AashJ",
"comment_id": null,
"datetime": 1604260098000,
"masked_author": "username_0",
"text": "What is the alternative for mixpanel.people.set as seen in the node.js documentation (https://developer.mixpanel.com/docs/nodejs)? \r\n\r\nMy use case is I already have user profiles stored in a database, and am now adding in a mixpanel integration. I want to create user profiles if they don't already exist, but identify otherwise.",
"title": "Mixpanel.people.set",
"type": "issue"
},
{
"action": "created",
"author": "eugenetraction",
"comment_id": 724392535,
"datetime": 1604972169000,
"masked_author": "username_1",
"text": "If I'm not mistaken this is the syntax: \r\n\r\n`// Set People properties (warning: if no mixpanel profile has been assigned to the current user when this method is called, it will automatically create a new mixpanel profile and the user will no longer be anonymous in Mixpanel)\r\nMixpanel.set({\"$email\": \"elvis@email.com\"});\r\n\r\n// Set People Properties Once (warning: if no mixpanel profile has been assigned to the current user when this method is called, it will automatically create a new mixpanel profile and the user will no longer be anonymous in Mixpanel)\r\nMixpanel.setOnce({\"$email\": \"elvis@email.com\", \"Created\": new Date().toISOString()});`",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "AashJ",
"comment_id": null,
"datetime": 1614021064000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 975 | false | false | 975 | false |
bukalapak/snowboard | bukalapak | 788,494,371 | 169 | null | [
{
"action": "opened",
"author": "X-Ray-Jin",
"comment_id": null,
"datetime": 1610997266000,
"masked_author": "username_0",
"text": "Anyone an idea how to fix that? I tried installing Protagonist and Node-Gyp in various versions manually but that doesn't help. Tried as normal and as admin user. Windows build tools are installed, too.",
"title": "Global Snowboard installation fails",
"type": "issue"
},
{
"action": "created",
"author": "u4989190",
"comment_id": 810703024,
"datetime": 1617156564000,
"masked_author": "username_1",
"text": "you may take a look at this issue [119](https://github.com/bukalapak/snowboard/issues/119)\r\nor you may run snowboard in docker instead installing it on your local machine.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "u4989190",
"comment_id": 812957425,
"datetime": 1617502178000,
"masked_author": "username_1",
"text": "@username_0 \r\nthank you for your recommendation. I managed to install snowboard at last(not on windows though, I did it on wsl2), but yet failed to parse the apib file. it seems that other team member, who is using Mac, edited the file with LF breaks and since windows is CRLF, it just don't work out. I just give up and asked him to generate a static html for me so I will not parse the apib files.\r\n\r\nI am using this API blueprint only because our client choose the tech stack. for those projects where I am the leader, I usually go on with OPEN API, and if you are building REST API endpoints, I strongly encourage you take a look at it. it goes well with a lot of tools which are well-developed, such as swagger, stoplight etc, and the community is thriving. anyway, good luck!",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,153 | false | false | 1,153 | true |
shibing624/pycorrector | null | 833,439,711 | 196 | null | [
{
"action": "opened",
"author": "cxy86121-sudo",
"comment_id": null,
"datetime": 1615961723000,
"masked_author": "username_0",
"text": "你是我的小宝呗 []\r\n\r\n\r\n使用纠错模块,试了几个例子后发现很多纠正不了,如上面的两个例子。",
"title": "关于pycorrector.correct模块的使用体验",
"type": "issue"
},
{
"action": "created",
"author": "shibing624",
"comment_id": 802499933,
"datetime": 1616122092000,
"masked_author": "username_1",
"text": "长尾漏召回case,建议自己加到混淆词典confusion dict处理。",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "shibing624",
"comment_id": null,
"datetime": 1620569185000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 85 | false | false | 85 | false |
dotnet/AspNetCore.Docs | dotnet | 795,479,495 | 21,335 | {
"number": 21335,
"repo": "AspNetCore.Docs",
"user_login": "dotnet"
} | [
{
"action": "opened",
"author": "Rick-Anderson",
"comment_id": null,
"datetime": 1611784893000,
"masked_author": "username_0",
"text": "Fixes #21315\r\n[Internal review URL](https://review.docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/?view=aspnetcore-5.0&branch=pr-en-us-21335#command-line)",
"title": "Kestrel config command line",
"type": "issue"
},
{
"action": "created",
"author": "Rick-Anderson",
"comment_id": 768837682,
"datetime": 1611816301000,
"masked_author": "username_0",
"text": "@natenho please review.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Rick-Anderson",
"comment_id": 773627161,
"datetime": 1612475515000,
"masked_author": "username_0",
"text": "@serpent5 please review my latest to see if I've addressed `@halter73` feedback. No hurry.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Rick-Anderson",
"comment_id": 778849804,
"datetime": 1613340601000,
"masked_author": "username_0",
"text": "@serpent5 much appreciated",
"title": null,
"type": "comment"
}
] | 1 | 4 | 313 | false | false | 313 | false |
TerriaJS/terriajs | TerriaJS | 753,439,424 | 5,042 | {
"number": 5042,
"repo": "terriajs",
"user_login": "TerriaJS"
} | [
{
"action": "opened",
"author": "nf-s",
"comment_id": null,
"datetime": 1606739447000,
"masked_author": "username_0",
"text": "### Revert magda config ID patch + add shareKey\r\n\r\nFixes https://github.com/TerriaJS/terriajs/issues/4978 https://github.com/TerriaJS/qld-digital-twin/issues/237 https://github.com/TerriaJS/terrace/issues/142\r\n\r\nPartially fixes (chill github don't close the issue when I merge this) https://github.com/TerriaJS/terriajs/issues/4774 \r\n\r\n### Checklist\r\n\r\n- [x] Mostly reverting things - no Tests needed\r\n- [x] I've updated CHANGES.md with what I changed.",
"title": "Revert magda config ID patch + add shareKey",
"type": "issue"
},
{
"action": "created",
"author": "nf-s",
"comment_id": 740432396,
"datetime": 1607411806000,
"masked_author": "username_0",
"text": "Closed due to decision made in https://github.com/TerriaJS/terriajs/pull/5056\r\nWe will be keeping the Magda Reference root group id as `\"/\"`",
"title": null,
"type": "comment"
}
] | 1 | 2 | 596 | false | false | 596 | false |
Unity-Technologies/datasetinsights | Unity-Technologies | 700,006,017 | 93 | null | [
{
"action": "opened",
"author": "adason",
"comment_id": null,
"datetime": 1599881209000,
"masked_author": "username_0",
"text": "**Why you need this feature:**\r\nThe current GCSDownloader tries to download file sequentially. \r\n\r\n**Describe the solution you'd like:**\r\n\r\nUse multi-threading to speedup the download process. \r\n\r\n**Anything else you would like to add:**\r\n[Miscellaneous information that will assist in solving the issue.]",
"title": "Improve GCSDownloader to enable multi-threading",
"type": "issue"
}
] | 1 | 1 | 305 | false | false | 305 | false |
cyrilou242/covid-19 | null | 615,246,275 | 51 | {
"number": 51,
"repo": "covid-19",
"user_login": "cyrilou242"
} | [
{
"action": "opened",
"author": "donok1",
"comment_id": null,
"datetime": 1589053134000,
"masked_author": "username_0",
"text": "Je me suis permis de remplacer les espaces avant les ponctuations détachées («»:!?)\r\n\r\nIl a aussi quelque \"\" et 2-3 arrangement de foot notes.\r\n\r\net les R < 1 de #50",
"title": "espace insécable + qq details",
"type": "issue"
},
{
"action": "created",
"author": "donok1",
"comment_id": 626225836,
"datetime": 1589053226000,
"masked_author": "username_0",
"text": "Je vais essayer de relire demain encore une fois le tout. Bonne soirée ;)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cyrilou242",
"comment_id": 626228400,
"datetime": 1589054598000,
"masked_author": "username_1",
"text": "super ! J'ai pull ta branche, rebase (j'avais fait qque modifs entre temps) et mergé donc je ferme celle ci.\r\n\r\nJe touche plus à rien jusqu'à demain soir !",
"title": null,
"type": "comment"
}
] | 2 | 3 | 393 | false | false | 393 | false |
styled-components/styled-components | styled-components | 634,041,801 | 3,166 | null | [
{
"action": "opened",
"author": "dimaqq",
"comment_id": null,
"datetime": 1591585484000,
"masked_author": "username_0",
"text": "I'm using:\r\n* styled-components@5.1.1 with `pure: true`\r\n* import them `from \"styled-components/macro\"`\r\n* in a create-react-app app\r\n* targeting only evergreen browsers\r\n\r\nBuilding the app, I find this in `2.<hash>.chunk.js`:\r\n\r\n```js\r\n var b = \"undefined\" !== typeof e && (Object({\r\n NODE_ENV: \"production\",\r\n PUBLIC_URL: \"\",\r\n WDS_SOCKET_HOST: void 0,\r\n WDS_SOCKET_PATH: void 0,\r\n WDS_SOCKET_PORT: void 0,\r\n REACT_APP_GIT_SHA: \"d317dc12351f14768f919735d4486a651224f448\"\r\n }).REACT_APP_SC_ATTR || Object({\r\n NODE_ENV: \"production\",\r\n PUBLIC_URL: \"\",\r\n WDS_SOCKET_HOST: void 0,\r\n WDS_SOCKET_PATH: void 0,\r\n WDS_SOCKET_PORT: void 0,\r\n REACT_APP_GIT_SHA: \"d317dc12351f14768f919735d4486a651224f448\"\r\n }).SC_ATTR) || \"data-styled\",\r\n w = \"undefined\" !== typeof window && \"HTMLElement\" in\r\n window,\r\n _ = \"boolean\" === typeof SC_DISABLE_SPEEDY &&\r\n SC_DISABLE_SPEEDY || \"undefined\" !== typeof e && (\r\n Object({\r\n NODE_ENV: \"production\",\r\n PUBLIC_URL: \"\",\r\n WDS_SOCKET_HOST: void 0,\r\n WDS_SOCKET_PATH: void 0,\r\n WDS_SOCKET_PORT: void 0,\r\n REACT_APP_GIT_SHA: \"d317dc12351f14768f919735d4486a651224f448\"\r\n }).REACT_APP_SC_DISABLE_SPEEDY || Object({\r\n NODE_ENV: \"production\",\r\n PUBLIC_URL: \"\",\r\n WDS_SOCKET_HOST: void 0,\r\n WDS_SOCKET_PATH: void 0,\r\n WDS_SOCKET_PORT: void 0,\r\n REACT_APP_GIT_SHA: \"d317dc12351f14768f919735d4486a651224f448\"\r\n }).SC_DISABLE_SPEEDY) || !1,\r\n```\r\n\r\nWould it be possible not to include the entire (?) environment in the build?\r\n\r\nI was trying to set up `REACT_APP_GIT_SHA` env var and use it **very carefully** so that most of the build artefacts are not changed. Here, `styled-components` thwarted my attempts.\r\n\r\nI think, if someone needs to inject a `REACT_APP_SECRET_VALUE` they'd have the same problem.",
"title": "Is it possible not to pull env vars into build?",
"type": "issue"
},
{
"action": "closed",
"author": "probablyup",
"comment_id": null,
"datetime": 1592233252000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "probablyup",
"comment_id": 644188569,
"datetime": 1592233252000,
"masked_author": "username_1",
"text": "Whatever that is isn't coming from our codebase",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dimaqq",
"comment_id": 644539520,
"datetime": 1592285350000,
"masked_author": "username_0",
"text": "Hmm I was rather certain that `SC_` variables were for styled-components.\r\nThere's even a test in this codebase for correct treatment of `REACT_APP_SC_DISABLE_SPEEDY`:\r\nhttps://github.com/styled-components/styled-components/blob/147b0e9a1f10786551b13fd27452fcd5c678d5e0/packages/styled-components/src/test/constants.test.js#L117-L122",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dimaqq",
"comment_id": 644541124,
"datetime": 1592285690000,
"masked_author": "username_0",
"text": "```\r\nnode_modules/styled-components/primitives/dist/styled-components-primitives.esm.js\r\n154:var DISABLE_SPEEDY = typeof SC_DISABLE_SPEEDY === 'boolean' && SC_DISABLE_SPEEDY || typeof process !== 'undefined' && (process.env.REACT_APP_SC_DISABLE_SPEEDY || process.env.SC_DISABLE_SPEEDY) || process.env.NODE_ENV !== 'production'; // Shared empty execution context when generating static styles\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dimaqq",
"comment_id": 647213287,
"datetime": 1592788639000,
"masked_author": "username_0",
"text": "My best guess is that merely referencing `process.env.REACT_APP_xxx` tricks CRA/webpack into including the entire \"visible\" part of env. 🤔",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dimaqq",
"comment_id": 664066167,
"datetime": 1595811278000,
"masked_author": "username_0",
"text": "So, ugh, is there perhaps some other way to let users control \"disable speedy\" than through env?",
"title": null,
"type": "comment"
}
] | 2 | 7 | 3,398 | false | false | 3,398 | false |
cryptolandtech/moonlet | cryptolandtech | 350,426,969 | 11 | null | [
{
"action": "opened",
"author": "i3uRi",
"comment_id": null,
"datetime": 1534253132000,
"masked_author": "username_0",
"text": "*As a user I want to be able to create a password for my Extension wallet so that I can have another layer of security*\n\n- [ ] Display dialog box\n- [ ] Use this \"Re-confirm Backup\" title\n- [ ] Use this \"Are you sure that you have saved your secret phrase in a secure location?\" message\n- [ ] When click on Yes, it should trigger next journey step page\n- [ ] When click on No, it should trigger previous step\n\nPlease [use this visual asset as a guideline](https://gallery.io/projects/MCHbtQVoQ2HCZTlak4tp1lmj/files/MCHtA7U1iMGr61SXmvJTUH2T22Kn7I1KHFQ) and [check this demo in order to map the user journey](https://gallery.io/projects/MCHbtQVoQ2HCZTlak4tp1lmj)",
"title": "Create Password Authentication",
"type": "issue"
},
{
"action": "closed",
"author": "krisboit",
"comment_id": null,
"datetime": 1540060745000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 659 | false | false | 659 | false |
pwa-builder/PWABuilder | pwa-builder | 899,883,990 | 1,733 | {
"number": 1733,
"repo": "PWABuilder",
"user_login": "pwa-builder"
} | [
{
"action": "opened",
"author": "jgw96",
"comment_id": null,
"datetime": 1621879600000,
"masked_author": "username_0",
"text": "Fixes #\r\n<!-- Link to relevant issue (for ex: #1234) which will automatically close the issue once the PR is merged -->\r\n\r\n## PR Type\r\n<!-- Please uncomment one ore more that apply to this PR -->\r\n\r\nBugfix\r\n<!-- - Feature -->\r\n<!-- - Code style update (formatting) -->\r\n<!-- - Refactoring (no functional changes, no api changes) -->\r\n<!-- - Build or CI related changes -->\r\n<!-- - Documentation content changes -->\r\n<!-- - Sample app changes -->\r\n<!-- - Other... Please describe: -->\r\n\r\n\r\n## Describe the current behavior?\r\n<!-- Please describe the current behavior that is being modified or link to a relevant issue. -->\r\nThis fixes a bug where even if the non-canonical URL is being used for testing (such as https://pinterest.com) we will update the URL being used for packaging with the canonical URL (from our backend manifest finder services) in the background. This adds no new latency for the user to init the tests, but also ensures that once it comes to packaging the correct URL is being used (this is especially important for Android).\r\n\r\n## Describe the new behavior? \r\nOnce we have the canonical URL from our manifest finder services I call a function to update the URL being worked on with this more correct URL. This all happens without any need for action from the user.\r\n\r\n## PR Checklist\r\n\r\n- [ x] Test: run `npm run test` and ensure that all tests pass\r\n- [x ] Target master branch (or an appropriate release branch if appropriate for a bug fix)\r\n- [x ] Ensure that your contribution follows [standard accessibility guidelines](https://docs.microsoft.com/en-us/microsoft-edge/accessibility/design). Use tools like https://webhint.io/ to validate your changes.\r\n\r\n\r\n## Additional Information",
"title": "feat(): update url being used for packaging with canonical URL in the background",
"type": "issue"
}
] | 3 | 4 | 2,157 | false | true | 1,710 | false |
MicrosoftDocs/azure-docs | MicrosoftDocs | 732,677,137 | 65,260 | {
"number": 65260,
"repo": "azure-docs",
"user_login": "MicrosoftDocs"
} | [
{
"action": "opened",
"author": "ChristopherMank",
"comment_id": null,
"datetime": 1604007012000,
"masked_author": "username_0",
"text": "Add detail that the Azure DevOps Admin role doesn't give permissions inside of Azure DevOps itself. That was not clear previously.",
"title": "Add detail about Azure DevOps permissions",
"type": "issue"
},
{
"action": "created",
"author": "PRMerger14",
"comment_id": 719035881,
"datetime": 1604007032000,
"masked_author": "username_1",
"text": "@username_0 : Thanks for your contribution! The author(s) have been notified to review your proposed change.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ChristopherMank",
"comment_id": 722462652,
"datetime": 1604591374000,
"masked_author": "username_0",
"text": "Thanks @username_2 ! Updated per your suggestion.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "curtand",
"comment_id": 722467475,
"datetime": 1604591847000,
"masked_author": "username_2",
"text": "#sign-off \r\n\r\n@username_0 Awesome, thanks!",
"title": null,
"type": "comment"
}
] | 3 | 4 | 336 | false | false | 336 | true |
hotosm/tasking-manager | hotosm | 779,232,731 | 4,050 | {
"number": 4050,
"repo": "tasking-manager",
"user_login": "hotosm"
} | [
{
"action": "opened",
"author": "dakotabenjamin",
"comment_id": null,
"datetime": 1609863041000,
"masked_author": "username_0",
"text": "In order to allow for non-hotosm urls (such as tasks.teachosm.org) in the cfn template, we add a condition for the url parameter to check if its a hotosm.org domain. We will not create the Route53 record in the case that it is an external url. \r\n\r\nIn this case currently it only works when we also control the ssl certificate in ACM, so it is not fully generalized. that should come with future development. \r\n\r\n@willemarcel I would like to test this with `staging-test` before merging.",
"title": "Add condition for custom URL in cfn template",
"type": "issue"
},
{
"action": "created",
"author": "dakotabenjamin",
"comment_id": 754796980,
"datetime": 1609869358000,
"masked_author": "username_0",
"text": "Everything works on staging using `staging-test` branch so this is ready for merge",
"title": null,
"type": "comment"
}
] | 1 | 2 | 568 | false | false | 568 | false |
igor-dv/me | null | 791,299,108 | 17 | {
"number": 17,
"repo": "me",
"user_login": "igor-dv"
} | [
{
"action": "created",
"author": "igor-dv",
"comment_id": 766352708,
"datetime": 1611496837000,
"masked_author": "username_0",
"text": "@dependabot merge",
"title": null,
"type": "comment"
}
] | 2 | 2 | 8,432 | false | true | 17 | false |
AdoptOpenJDK/openjdk-tests | AdoptOpenJDK | 827,925,652 | 2,357 | null | [
{
"action": "opened",
"author": "andrew-m-leonard",
"comment_id": null,
"datetime": 1615390592000,
"masked_author": "username_0",
"text": "https://ci.adoptopenjdk.net/job/Test_openjdk8_j9_extended.openjdk_x86-64_linux/25/console\r\nFails both Hotspot and OpenJ9:\r\n```\r\n12:30:22 openjdk version \"1.8.0_292\"\r\n12:30:22 OpenJDK Runtime Environment (build 1.8.0_292-202103091156-b05)\r\n12:30:22 Eclipse OpenJ9 VM (build openj9-0.26.0-m1, JRE 1.8.0 Linux amd64-64-Bit Compressed References 20210309_972 (JIT enabled, AOT enabled)\r\n12:30:22 OpenJ9 - b227feba2\r\n12:30:22 OMR - 4665e2f72\r\n12:30:22 JCL - 1780cbc92b based on jdk8u292-b05)\r\n```\r\n```\r\n13:29:39 java.lang.NullPointerException\r\n13:29:39 \tat sun.awt.FontConfiguration.getVersion(FontConfiguration.java:1264)\r\n13:29:39 \tat sun.awt.FontConfiguration.readFontConfigFile(FontConfiguration.java:219)\r\n13:29:39 \tat sun.awt.FontConfiguration.init(FontConfiguration.java:107)\r\n13:29:39 \tat sun.awt.X11FontManager.createFontConfiguration(X11FontManager.java:774)\r\n13:29:39 \tat sun.font.SunFontManager$2.run(SunFontManager.java:441)\r\n13:29:39 \tat java.security.AccessController.doPrivileged(AccessController.java:682)\r\n13:29:39 \tat sun.font.SunFontManager.<init>(SunFontManager.java:386)\r\n13:29:39 \tat sun.awt.FcFontManager.<init>(FcFontManager.java:35)\r\n13:29:39 \tat sun.awt.X11FontManager.<init>(X11FontManager.java:57)\r\n13:29:39 \tat java.lang.J9VMInternals.newInstanceImpl(Native Method)\r\n13:29:39 \tat java.lang.Class.newInstance(Class.java:2038)\r\n13:29:39 \tat sun.font.FontManagerFactory$1.run(FontManagerFactory.java:83)\r\n13:29:39 \tat java.security.AccessController.doPrivileged(AccessController.java:682)\r\n13:29:39 \tat sun.font.FontManagerFactory.getInstance(FontManagerFactory.java:74)\r\n13:29:39 \tat sun.font.SunFontManager.getInstance(SunFontManager.java:250)\r\n13:29:39 \tat sun.font.FontDesignMetrics.getMetrics(FontDesignMetrics.java:264)\r\n13:29:39 \tat sun.font.FontDesignMetrics.getMetrics(FontDesignMetrics.java:250)\r\n13:29:39 \tat 
sun.awt.X11.XComponentPeer.getFontMetrics(XComponentPeer.java:683)\r\n13:29:39 \tat sun.awt.X11.XLabelPeer.getFontMetrics(XLabelPeer.java:48)\r\n13:29:39 \tat sun.awt.X11.XLabelPeer.getMinimumSize(XLabelPeer.java:70)\r\n13:29:39 \tat sun.awt.X11.XComponentPeer.getPreferredSize(XComponentPeer.java:606)\r\n13:29:39 \tat java.awt.Component.preferredSize(Component.java:2643)\r\n13:29:39 \tat java.awt.Component.getPreferredSize(Component.java:2626)\r\n13:29:39 \tat java.awt.GridBagLayout.GetLayoutInfo(GridBagLayout.java:1115)\r\n13:29:39 \tat java.awt.GridBagLayout.getLayoutInfo(GridBagLayout.java:916)\r\n13:29:39 \tat java.awt.GridBagLayout.preferredLayoutSize(GridBagLayout.java:736)\r\n13:29:39 \tat java.awt.Container.preferredSize(Container.java:1799)\r\n13:29:39 \tat java.awt.Container.getPreferredSize(Container.java:1783)\r\n13:29:39 \tat java.awt.Window.pack(Window.java:809)\r\n13:29:39 \tat com.sun.javatest.regtest.agent.AppletWrapper$AppletRunnable.run(AppletWrapper.java:142)\r\n13:29:39 \tat java.lang.Thread.run(Thread.java:823)\r\n13:29:39 STATUS:Failed.Applet thread threw exception: java.lang.NullPointerException\r\n```\r\nPossibly a fontconfig setup issue? or an upstream problem?",
"title": "extended.openjdk failure: javax/imageio/AppletResourceTest.java : FontConfiguration NPE",
"type": "issue"
},
{
"action": "created",
"author": "aahlenst",
"comment_id": 795640622,
"datetime": 1615391540000,
"masked_author": "username_1",
"text": "@username_0 I suspect a machine issue. [See the list of libraries that need to be present](https://blog.adoptopenjdk.net/2021/01/prerequisites-for-font-support-in-adoptopenjdk/).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "lumpfish",
"comment_id": 817959416,
"datetime": 1618245636000,
"masked_author": "username_2",
"text": "I think this affects a few tests and is being seen when tests are running on docker images:\r\nFailing test run:https://ci.adoptopenjdk.net/job/Test_openjdk11_j9_extended.openjdk_x86-64_linux/16/consoleFull\r\nTo rerun: https://ci.adoptopenjdk.net/job/Grinder/parambuild/?JDK_VERSION=11&JDK_IMPL=openj9&JDK_VENDOR=adoptopenjdk&BUILD_LIST=openjdk&PLATFORM=x86-64_linux_mixed&TARGET=jdk_imageio_0\r\nMachine: test-docker-fedora33-x64-1\r\nFailing tests:\r\n```\r\njavax/imageio/plugins/shared/ImageWriterCompressionTest.java\r\njavax/imageio/AppletResourceTest.java\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sxa",
"comment_id": 817963154,
"datetime": 1618245986000,
"masked_author": "username_3",
"text": "Thanks - attaching to https://github.com/AdoptOpenJDK/openjdk-tests/issues/2138",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sxa",
"comment_id": 818966515,
"datetime": 1618338980000,
"masked_author": "username_3",
"text": "Haven't managed to properly look at this today other than trying l grinders on a real and dockerised Ubuntu of the same level. Will aim to continue tomorrow",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sxa",
"comment_id": 819637631,
"datetime": 1618416538000,
"masked_author": "username_3",
"text": "It's not specific to any particular docker image - seen on Ubuntu and Fedora ones",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sxa",
"comment_id": 819656055,
"datetime": 1618418072000,
"masked_author": "username_3",
"text": "As Andreas suggested, adding `fontconfig` has allowed it to pass on [Fedora](https://ci.adoptopenjdk.net/job/Grinder/75/) and [Ubuntu](https://ci.adoptopenjdk.net/job/Grinder/76/). I'll get that deployed to all the docker systems and added into the playbooks.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "sxa",
"comment_id": null,
"datetime": 1620908870000,
"masked_author": "username_3",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "lumpfish",
"comment_id": 858485034,
"datetime": 1623318993000,
"masked_author": "username_2",
"text": "I've seen another occurrence on machine `test-docker-fedora33-x64-1` (https://ci.adoptopenjdk.net/view/Test_openjdk/job/Test_openjdk8_j9_extended.openjdk_x86-64_linux_testList_1/2/console)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sxa",
"comment_id": 858503525,
"datetime": 1623320573000,
"masked_author": "username_3",
"text": "Hmmm `fontconfig` is installed on there so I'm not sure what's going on with that system ([replicated on grinder](https://ci.adoptopenjdk.net/view/Test_grinder/job/Grinder/789/console)) - I'll try on the other 2 Fedora33 machines - [790](https://ci.adoptopenjdk.net/view/Test_grinder/job/Grinder/790/console) [791](https://ci.adoptopenjdk.net/view/Test_grinder/job/Grinder/791/console)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sxa",
"comment_id": 858528710,
"datetime": 1623323178000,
"masked_author": "username_3",
"text": "Hmmm the last two are failing but with a different error\r\n```\r\n11:28:13 STDOUT:\r\n11:28:13 STDERR:\r\n11:28:13 java.lang.RuntimeException: Test failed. Did not get expected IIOException\r\n11:28:13 \tat MaxLengthKeywordReadTest.main(MaxLengthKeywordReadTest.java:63)\r\n11:28:13 \tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\r\n11:28:13 \tat java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\r\n11:28:13 \tat java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\r\n11:28:13 \tat java.base/java.lang.reflect.Method.invoke(Method.java:566)\r\n11:28:13 \tat com.sun.javatest.regtest.agent.MainWrapper$MainThread.run(MainWrapper.java:127)\r\n11:28:13 \tat java.base/java.lang.Thread.run(Thread.java:836)\r\n11:28:13 \r\n11:28:13 JavaTest Message: Test threw exception: java.lang.RuntimeException: Test failed. Did not get expected IIOException\r\n11:28:13 JavaTest Message: shutting down test\r\n11:28:13 \r\n11:28:13 STATUS:Failed.`main' threw exception: java.lang.RuntimeException: Test failed. Did not get expected IIOException\r\n11:28:13 rerun:\r\n11:28:13 cd /home/jenkins/workspace/Grinder/openjdk-tests/TKG/output_16233198211531/jdk_imageio_0/work/javax/imageio/plugins/png/MaxLengthKeywordReadTest && \\\r\n```",
"title": null,
"type": "comment"
}
] | 4 | 11 | 6,268 | false | false | 6,268 | true |
kiali/kiali | kiali | 858,633,753 | 3,886 | null | [
{
"action": "opened",
"author": "aljesusg",
"comment_id": null,
"datetime": 1618475316000,
"masked_author": "username_0",
"text": "Should we show a degraded status if the user scale down the app ?\r\n\r\n\r\n\r\n\r\nI think that Iddle status should have more priority despite the last X minutes the health was degraded or failure. \r\n\r\nWhat do you think?",
"title": "zero replicas vs Degraded status",
"type": "issue"
},
{
"action": "created",
"author": "lucasponce",
"comment_id": 820238292,
"datetime": 1618476028000,
"masked_author": "username_1",
"text": "I think that's due it's taking the 1m telemetry info just when the resource is scale down.\r\nOnce that workload stops receiving telemetry it will show the not ready.\r\n\r\nTo me it's fine to keep it as it's as it tells user (it has traffic in the past but now you won't get more traffic).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "aljesusg",
"comment_id": 820255603,
"datetime": 1618477342000,
"masked_author": "username_0",
"text": "yes I agree the traffic by default is the last minute but maybe this could be confuse from user . is aw this checking another issue and I wanted to open a discussion about this",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "aljesusg",
"comment_id": null,
"datetime": 1618477374000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 812 | false | false | 812 | false |
ceph/ceph-csi | ceph | 909,113,460 | 2,120 | null | [
{
"action": "opened",
"author": "Madhu-1",
"comment_id": null,
"datetime": 1622610927000,
"masked_author": "username_0",
"text": "https://github.com/ceph/ceph-csi/blob/7cbad9305f71b70040509ed854f1d65c7c624d4f/internal/rbd/mirror.go#L131-L161 Need to re-written using go-ceph",
"title": "RBD: convert getImageMirroingStatus from RBD CLI to go-ceph",
"type": "issue"
}
] | 2 | 2 | 144 | false | true | 144 | false |
leoasis/react-sound | null | 900,131,817 | 96 | null | [
{
"action": "opened",
"author": "ridhwaans",
"comment_id": null,
"datetime": 1621903089000,
"masked_author": "username_0",
"text": "React-sound audio playback works as expected with `react-router-dom`. The audio doesnt suspend or shutdown when using `<Link>` \r\n\r\nUsing `goBack()` and `goForward()` from the `useHistory` hook stops the audio playing for some reason, even though it does not reload/refresh the page. it behaves like `<Link>` to the previous browserrouter `<Route>` \r\nconsole log is \r\n```\r\nsoundmanager2.js:1307 sound0: Destruct\r\nsoundmanager2.js:1307 sound0: Removing event listeners\r\nsoundmanager2.js:1307 sound0: stop()\r\nsoundmanager2.js:1307 sound0: unload()\r\n```",
"title": "React-sound soundmanager doesnt work with useHistory hook",
"type": "issue"
}
] | 1 | 1 | 552 | false | false | 552 | false |
webgriffe/SyliusAkeneoPlugin | webgriffe | 836,868,078 | 64 | null | [
{
"action": "opened",
"author": "mmenozzi",
"comment_id": null,
"datetime": 1616261253000,
"masked_author": "username_0",
"text": "After #60 empty attributes values are properly passed to all value handlers during product import. So now, for attributes that are empty on Akeneo, the AttributeValueHandler always sets empty product attributes on Sylius. Instead it should remove them from products.",
"title": "AttributeValueHandler should remove attributes from product when there's no value from Akeneo",
"type": "issue"
},
{
"action": "closed",
"author": "mmenozzi",
"comment_id": null,
"datetime": 1616766033000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 266 | false | false | 266 | false |
sindresorhus/refined-github | null | 750,062,710 | 3,764 | null | [
{
"action": "opened",
"author": "ElijahLynn",
"comment_id": null,
"datetime": 1606251406000,
"masked_author": "username_0",
"text": "I have searched the issues and installed the extension but don't see functionality for hiding `TimelineItem`s. \r\n\r\nMy request is to be able to hide/toggle all Timelineitems from an issue or pull request view/page. \r\n\r\n\r\n\r\n\r\nGitLab has this built-in and I think GitHub should too, but next best thing would be for Refined GitHub to have this feature:\r\n",
"title": "Toggle/Hide TimelineItem",
"type": "issue"
},
{
"action": "created",
"author": "fregante",
"comment_id": 733258740,
"datetime": 1606255357000,
"masked_author": "username_1",
"text": "This would be an alternative solution to https://github.com/sindresorhus/refined-github/issues/615, but it doesn't feel like an action I would need to take often, if ever. Why do you need/want to hide events?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ElijahLynn",
"comment_id": 733278237,
"datetime": 1606258177000,
"masked_author": "username_0",
"text": "Thanks for the reply. I routinely have issues that have like 20 timeline events and it is really noisy. My use case is nearly every day I would want it on, and I would want it defaulted to my last selection, which will most of the time be turned off. And only if I need an important bit of info will I toggle it on temporarily. Here is a sample issue where it gets out of control: \r\n\r\n\r\n\r\n\r\nsource: https://github.com/department-of-veterans-affairs/va.gov-cms/issues/910",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ElijahLynn",
"comment_id": 733279337,
"datetime": 1606258377000,
"masked_author": "username_0",
"text": "Interesting to note that _some_ of the timeline events are only shown with the ZenHub extension enabled, which I just disabled and took this same screenshot:\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fregante",
"comment_id": 751500041,
"datetime": 1609093162000,
"masked_author": "username_1",
"text": "Maybe #3847",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ElijahLynn",
"comment_id": 775398011,
"datetime": 1612813618000,
"masked_author": "username_0",
"text": "Looks hopeful, subscribed, thanks!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "fregante",
"comment_id": null,
"datetime": 1614920912000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 7 | 1,673 | false | false | 1,673 | false |
sourcegraph/sourcegraph | sourcegraph | 808,787,819 | 18,302 | null | [
{
"action": "opened",
"author": "keegancsmith",
"comment_id": null,
"datetime": 1613419292000,
"masked_author": "username_0",
"text": "Currently we return a single count: the number of results. We plan on making it consistent what that means. However, we might as well return more information such as:\r\n\r\n- number of \"highlights\" (current).\r\n- number of lines with matches.\r\n- number of documents with matches.\r\n- number of commits with matches (in commit/diff search).\r\n- number of repositories with matches.\r\n\r\nPart of https://github.com/sourcegraph/sourcegraph/issues/18297",
"title": "streaming: detailed counts in response",
"type": "issue"
},
{
"action": "created",
"author": "rrhyne",
"comment_id": 780635174,
"datetime": 1613575790000,
"masked_author": "username_1",
"text": "Potential duplicate ticket: https://github.com/sourcegraph/sourcegraph/issues/17573. Happy to move Figma designs here if that's the case.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "keegancsmith",
"comment_id": 780779908,
"datetime": 1613588612000,
"masked_author": "username_0",
"text": "Lets keep this as the backend part of the task.",
"title": null,
"type": "comment"
}
] | 3 | 4 | 697 | false | true | 625 | false |
NVIDIA/spark-rapids | NVIDIA | 731,428,245 | 1,036 | {
"number": 1036,
"repo": "spark-rapids",
"user_login": "NVIDIA"
} | [
{
"action": "opened",
"author": "tgravescs",
"comment_id": null,
"datetime": 1603890669000,
"masked_author": "username_0",
"text": "Seems getting GPU nodes on our databricks cluster is taking a long time so just increase the number of times we retry before giving up.",
"title": "Increase number of retries when waiting for databricks cluster",
"type": "issue"
},
{
"action": "created",
"author": "tgravescs",
"comment_id": 717922470,
"datetime": 1603890681000,
"masked_author": "username_0",
"text": "build",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "tgravescs",
"comment_id": 718040869,
"datetime": 1603901474000,
"masked_author": "username_0",
"text": "we would have to investigate different regions as this was setup for us and we are just using it.",
"title": null,
"type": "comment"
}
] | 1 | 3 | 237 | false | false | 237 | false |
gopasspw/gopass | gopasspw | 614,602,621 | 1,340 | {
"number": 1340,
"repo": "gopass",
"user_login": "gopasspw"
} | [
{
"action": "opened",
"author": "dominikschulz",
"comment_id": null,
"datetime": 1588927520000,
"masked_author": "username_0",
"text": "Remove the limit on the tree for recipients and templates\r\n\r\nFixes #1334",
"title": "Print all recipients and templates",
"type": "issue"
}
] | 2 | 2 | 72 | false | true | 72 | false |
kivy/kivy-ios | kivy | 830,992,124 | 595 | null | [
{
"action": "opened",
"author": "Andrei-Pozolotin",
"comment_id": null,
"datetime": 1615668607000,
"masked_author": "username_0",
"text": "1. currently, `kivy-ios/recipes/python3/reduce_python` removes `wsgiref`\r\n\r\nhttps://github.com/kivy/kivy-ios/blob/2cb084c26b7e9b45b027961cded13521b8db5b6b/kivy_ios/recipes/python3/__init__.py\r\n```python\r\n # cleanup python libraries\r\n with cd(join(\r\n self.ctx.dist_dir, \"root\", \"python3\", \"lib\", \"python3.8\")):\r\n sh.rm(\"-rf\", \"wsgiref\", \"curses\", \"idlelib\", \"lib2to3\",\r\n \"ensurepip\", \"turtledemo\", \"lib-dynload\", \"venv\",\r\n \"pydoc_data\")\r\n```\r\n\r\n2. that breaks `google-auth-oauthlib`\r\n\r\nhttps://github.com/googleapis/google-auth-library-python-oauthlib\r\n\r\neven in cases when `wsgiref` is not actually used in active code path, but merely due to `import`\r\n\r\nhttps://github.com/googleapis/google-auth-library-python-oauthlib/blob/master/google_auth_oauthlib/flow.py\r\n```python\r\nimport wsgiref.simple_server\r\nimport wsgiref.util\r\n```",
"title": "recipes/python3/reduce_python breaks google-auth-library-python-oauthlib via wsgiref",
"type": "issue"
},
{
"action": "created",
"author": "misl6",
"comment_id": 808701469,
"datetime": 1616837229000,
"masked_author": "username_1",
"text": "@tito Do you remind why `wsgiref` was included in the cleanup? Just for some decluttering or something else?",
"title": null,
"type": "comment"
}
] | 2 | 2 | 1,010 | false | false | 1,010 | false |
tidwall/raft-wal | null | 576,503,288 | 1 | null | [
{
"action": "opened",
"author": "samanderson",
"comment_id": null,
"datetime": 1583439146000,
"masked_author": "username_0",
"text": "During raft.RecoverCluster, a snapshot is created with the last index being 1 more than the `lastIndex` in the log. https://github.com/hashicorp/raft/blob/b47bafab395164c561a61f4bd63d725d8cd5a665/raft.go#L959\r\n\r\nAfter this snapshot is created it attempts to compact the logs by deleting the logs from `firstIndex` to `lastIndex +1` This library throws an index out of bounds error https://github.com/tidwall/raft-wal/blob/ba7138bf9f8dee04aeb37d6eddc4ee564b45f265/store.go#L174. \r\n\r\nHave you considered ignoring deletion of items greater than lastIndex as other libraries do?",
"title": "Cannot RecoverCluster with this logStore",
"type": "issue"
}
] | 1 | 1 | 574 | false | false | 574 | false |
microsoft/TypeScript-Website | microsoft | 892,374,606 | 1,826 | null | [
{
"action": "opened",
"author": "eps1lon",
"comment_id": null,
"datetime": 1621061746000,
"masked_author": "username_0",
"text": "**Improvement Idea:** Cookie choices should be persisted\r\n\r\n**Why:** This is the only page I know of that constantly asks me about cookie choices. \r\nI don't think this is good UX. It seems like the page just does not want to accept that I deactive non-essential cookies. It wouldn't be a problem if \"reject all\" would be as streamlined as \"accept all\". But right now it's another implementation of \"we really don't want you to reject our cookies\".\r\n\r\n\r\nhttps://user-images.githubusercontent.com/12292047/118351252-2dcd1b00-b55b-11eb-95d0-497fd02bb61f.mp4\r\n\r\n\r\n\r\n**Alternative ideas:**\r\n\r\nDisplay a \"reject all\" next to \"accept all\". Though that would be a nice addition regardless of this issue.",
"title": "Idea: Persist cookie choices across pages",
"type": "issue"
},
{
"action": "created",
"author": "orta",
"comment_id": 841785041,
"datetime": 1621153009000,
"masked_author": "username_1",
"text": "This might be the default behavior for the MS cookie stuff which we have to use, maybe it's setup wrong somehow",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "eps1lon",
"comment_id": null,
"datetime": 1646482624000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "eps1lon",
"comment_id": 1059753281,
"datetime": 1646482624000,
"masked_author": "username_0",
"text": "There's no longer any cookie consent dialog so this is no longer relevant.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 881 | false | false | 881 | false |
Azure/azure-rest-api-specs | Azure | 784,843,914 | 12,407 | null | [
{
"action": "opened",
"author": "jsntcy",
"comment_id": null,
"datetime": 1610521036000,
"masked_author": "username_0",
"text": "In swagger, `retentionInDays` is defined with lowerCamelCase which is consistent with the real API response.\r\n\r\nhttps://github.com/Azure/azure-rest-api-specs/blob/master/specification/operationalinsights/resource-manager/Microsoft.OperationalInsights/stable/2020-08-01/Tables.json#L124\r\n\r\n```\r\n\"TableProperties\": {\r\n \"properties\": {\r\n \"retentionInDays\": {\r\n \"type\": \"integer\",\r\n \"format\": \"int32\",\r\n \"minimum\": 30,\r\n \"maximum\": 730,\r\n \"description\": \"The data table data retention in days, between 30 and 730. Setting this property to null will default to the workspace retention.\"\r\n }\r\n }\r\n```\r\n\r\nbut in real API response, it returns UpperCamelCase `RetentionInDays`.\r\n```\r\nurllib3.connectionpool : https://management.azure.com:443 \"PATCH /subscriptions/0b1f6471-1bf0-4dda-aec3-cb9272f09590/resourcegroups/azure-cli-test-rg/providers/Microsoft.OperationalInsights/workspaces/myws0230/tables/WVDManagement?api-version=2020-03-01-previe msrest.http_logger : Response status: 200\r\nmsrest.http_logger : Response content:\r\nmsrest.http_logger : {\"properties\":{\"RetentionInDays\":30},\"id\":\"/subscriptions/0b1f6471-1bf0-4dda-aec3-cb9272f09590/resourcegroups /azure-cli-test-rg/providers/microsoft.operationalinsights/workspaces/myws0230/tables/wvdmanagement\",\"name\":\"WVDManagement\"}\r\n```\r\nSo in the latest Python mgmt. SDK, the deserialization will be failed because the case between swagger and real API response are different.",
"title": "[Monitor] TableProperties definition is different between swagger and real API response",
"type": "issue"
}
] | 2 | 2 | 4,011 | false | true | 1,731 | false |
Koihik/LuaFormatter | null | 627,800,776 | 114 | {
"number": 114,
"repo": "LuaFormatter",
"user_login": "Koihik"
} | [
{
"action": "opened",
"author": "andreantunes",
"comment_id": null,
"datetime": 1590866408000,
"masked_author": "username_0",
"text": "I needed the latest commit to be compiled for windows, so I used Docker and mxe project to cross compile for windows.\r\nThis commit can be used as a base step to use github actions to compile automatically.\r\nThe only thing that did not work out of the box is the \"try_run\" (to test the c++17 compiler) CmakeList option, not sure why, so I simply added an option to disable.",
"title": "compile for win32 using docker and mxe",
"type": "issue"
},
{
"action": "created",
"author": "andreantunes",
"comment_id": 638387002,
"datetime": 1591209609000,
"masked_author": "username_0",
"text": "I was not sure if the current Dockerfile had any dependencies, since I don't fully understand this project. I will change the Dockerfile next time I have some free time.\r\n\r\n\"It's more reasonable to disable the test for known correct compilers. For instance:\r\nclang >= 10.0.0 and gcc >= 9.0.0.\"\r\n\r\nI use the mxe cross compiling project in which I can select the compiler version (in this case, i chose gcc10). The default CMakeLists.txt behavior is still to test for the c++17. I simply added the option to disable, but at the same time I know that it's going to compile with the right version, in which is kind what you suggested. Still think I should do something else regarding the CMakelists.txt?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "tammela",
"comment_id": 638699009,
"datetime": 1591259616000,
"masked_author": "username_1",
"text": "It's better to have CMake disable it \"automagically\" for newer compilers which are known to be correct.\r\nIt's just a matter of a `if()` statement regarding the compiler versions (clang, gcc, etc...).",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,270 | false | false | 1,270 | false |
OHIF/Viewers | OHIF | 614,051,096 | 1,715 | null | [
{
"action": "opened",
"author": "JamesAPetts",
"comment_id": null,
"datetime": 1588857497000,
"masked_author": "username_0",
"text": "When we have multiple SEGs/RTSTRUCTs we intend to load the most recent by default.\r\nHowever we start to load once the first set of metadata arrives. We need to wait until all series metadata is fetched.",
"title": "Race Condition when loading derived display sets.",
"type": "issue"
},
{
"action": "closed",
"author": "JamesAPetts",
"comment_id": null,
"datetime": 1589466025000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 202 | false | false | 202 | false |
chameleon-system/chameleon-system | chameleon-system | 907,640,796 | 707 | null | [
{
"action": "opened",
"author": "kzorluoglu",
"comment_id": null,
"datetime": 1622483969000,
"masked_author": "username_0",
"text": "**Describe the bug**\r\nDuring command `composer install` execution in a freshed cloned project, we get this error: ` fatal: cannot create directory at 'src/AmazonPaymentBundle/Resources/views/objectviews/pkgShop/views/db/TShopOrderStep/ChameleonSystemAmazonPaymentBundlepkgShopOrderStepsAmazonConfirmOrderStep': Filename too long `\r\n\r\n\r\n**Affected version(s)**\r\n7.1.x\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Clone to project on your local setup\r\n2. Select the branch '7.1.x'\r\n3. Run composer install\r\n4. See error\r\n\r\n\r\n**Screenshots**\r\n\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: Windows 10\r\n - Browser: Chrome\r\n - PHP : 7.4.6\r\n - Composer: 2.0.8\r\n\r\n\r\n**Technical details**\r\n```\r\n - Upgrading phpunit/phpunit (8.5.8 => 9.4.1): Extracting archive\r\n 0 [>---------------------------]\r\n \r\n [RuntimeException] \r\n Failed to execute git checkout \"b9071a8e9373767a09bc6738f648c530cf1f5597\" -- && git reset --hard \"b9071a8e9373767a09bc6738f648c530cf1f5597\" -- \r\n \r\n warning: unable to access 'src/AmazonPaymentBundle/Resources/views/objectviews/pkgShop/views/db/TShopOrderStep/ChameleonSystemAmazonPaymentBundlepkgShopOrderStepsAmazonAddressStep/.gitattributes': Filename too long \r\n fatal: cannot create directory at 'src/AmazonPaymentBundle/Resources/views/objectviews/pkgShop/views/db/TShopOrderStep/ChameleonSystemAmazonPaymentBundlepkgShopOrderStepsAmazonConfirmOrderStep': Filename too long \r\n \r\n\r\ninstall [--prefer-source] [--prefer-dist] [--dry-run] [--dev] [--no-suggest] [--no-dev] [--no-autoloader] [--no-scripts] [--no-progress] [--no-install] [-v|vv|vvv|--verbose] [-o|--optimize-autoloader] [-a|--classmap-authoritative] [\r\n--apcu-autoloader] [--apcu-autoloader-prefix APCU-AUTOLOADER-PREFIX] [--ignore-platform-req IGNORE-PLATFORM-REQ] [--ignore-platform-reqs] [--] [<packages>]...\r\n```",
"title": " Filename too long: ChameleonSystemAmazonPaymentBundlepkgShopOrderStepsAmazonConfirmOrderStep",
"type": "issue"
}
] | 1 | 1 | 2,894 | false | false | 2,894 | false |
flexera/policy_templates | flexera | 718,406,790 | 613 | {
"number": 613,
"repo": "policy_templates",
"user_login": "flexera"
} | [
{
"action": "opened",
"author": "MikaHarviala",
"comment_id": null,
"datetime": 1602275623000,
"masked_author": "username_0",
"text": "Added 404 ignore to handle workspace not found errors\r\n\r\n### Description\r\n\r\nMars is seeing errors as Azure returns 'Workspace not found'\r\n\r\n### Issues Resolved\r\n\r\nPOL-386\r\n\r\n### Contribution Check List\r\n\r\n- [ ] All tests pass.\r\n- [ ] New functionality includes testing.\r\n- [ ] New functionality has been documented in the README if applicable\r\n- [ ] New functionality has been documented in CHANGELOG.MD",
"title": "Added 404 ignore",
"type": "issue"
}
] | 2 | 2 | 949 | false | true | 403 | false |
Thomvh/thvmh-uptime | null | 765,096,448 | 97 | null | [
{
"action": "opened",
"author": "Thomvh",
"comment_id": null,
"datetime": 1607842768000,
"masked_author": "username_0",
"text": "In [`499d5b5`](https://github.com/username_0/thvmh-uptime/commit/499d5b55b84b0f6a80a8d74bc9f23653d810a983\n), x5i5 Plex2 ($X5I5_PLEX2) was **down**:\n- HTTP code: 404\n- Response time: 118 ms",
"title": "🛑 x5i5 Plex2 is down",
"type": "issue"
},
{
"action": "created",
"author": "Thomvh",
"comment_id": 743967879,
"datetime": 1607844930000,
"masked_author": "username_0",
"text": "**Resolved:** x5i5 Plex2 is back up in [`baaca0d`](https://github.com/username_0/thvmh-uptime/commit/baaca0d95972ad3cfb0d7d4a3241aaf374c642e3\n).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Thomvh",
"comment_id": null,
"datetime": 1607844931000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 324 | false | false | 324 | true |
pytorch/text | pytorch | 706,110,226 | 1,002 | null | [
{
"action": "opened",
"author": "JonathanSum",
"comment_id": null,
"datetime": 1600755657000,
"masked_author": "username_0",
"text": "## ❓ Questions and Help\r\n\r\n**Description**\r\n<!-- Please send questions or ask for help here. -->\r\nCan I have all the dataset download link from the torch text examples?\r\nI can see there are examples, but I can't understand them without the dataset. Where they are?",
"title": "Can I have all the dataset download link from the torch text examples?",
"type": "issue"
},
{
"action": "created",
"author": "zhangguanheng66",
"comment_id": 696710857,
"datetime": 1600780235000,
"masked_author": "username_1",
"text": "which example?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JonathanSum",
"comment_id": 696772508,
"datetime": 1600786234000,
"masked_author": "username_0",
"text": "For example, the pos/wsj one. Thx",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JonathanSum",
"comment_id": 696787381,
"datetime": 1600787659000,
"masked_author": "username_0",
"text": "All of them. For example, the post/wsj one. Thanks",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangguanheng66",
"comment_id": 696807581,
"datetime": 1600789651000,
"masked_author": "username_1",
"text": "oh, this is based on some legacy code (like `data.TabularDataset`) and we will remove them sooner. You can take a look at the examples in the `pytorch/text/examples` folder.",
"title": null,
"type": "comment"
}
] | 2 | 5 | 534 | false | false | 534 | false |
IdentityServer/IdentityServer4 | IdentityServer | 817,149,612 | 5,168 | null | [
{
"action": "opened",
"author": "BaHXeLiSiHg",
"comment_id": null,
"datetime": 1614328018000,
"masked_author": "username_0",
"text": "Have latest IdentityServer4 with configured refresh token flow for client:\r\n\r\n```\r\n...\r\nAllowOfflineAccess = true\r\nUpdateAccessTokenClaimsOnRefresh = true,\r\nAccessTokenLifetime = 120,\r\nRefreshTokenUsage = TokenUsage.OneTimeOnly,\r\nRefreshTokenExpiration = TokenExpiration.Absolute,\r\nAbsoluteRefreshTokenLifetime = 300\r\n...\r\n```\r\n\r\nWanted to test it in short period of time, to ensure that everything works. Refresh token requesting every 60 seconds (half of access token lifetime), but at 5-th refresh token request (300 seconds) getting error that refresh token expired.\r\n\r\nI expected that every new refresh token will have fresh expire time (300 seconds), but seems like even when i get new refresh tokens every time, they have inherited expiration time from first-issued refresh token.\r\n\r\nAm i doing something wrong?\r\n\r\n\r\n\r\n",
"title": "Fresh issued refresh token expired",
"type": "issue"
},
{
"action": "created",
"author": "BaHXeLiSiHg",
"comment_id": 786505848,
"datetime": 1614329730000,
"masked_author": "username_0",
"text": "Just encountered post [https://karatejb.blogspot.com/2019/09/aspnet-core-identity-server-4-refresh.html](url) that explained little bit detailed refresh token behavior.\r\n\r\nSo next configuration did the trick:\r\n\r\n```\r\n...\r\nAllowOfflineAccess = true\r\nUpdateAccessTokenClaimsOnRefresh = true,\r\nAccessTokenLifetime = 120,\r\nRefreshTokenUsage = TokenUsage.OneTimeOnly,\r\nRefreshTokenExpiration = TokenExpiration.Sliding,\r\nSlidingRefreshTokenLifetime = 300,\r\nAbsoluteRefreshTokenLifetime = 0\r\n...\r\n```",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "BaHXeLiSiHg",
"comment_id": null,
"datetime": 1614329731000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 1,800 | false | true | 1,652 | false |
TDAmeritrade/stumpy | TDAmeritrade | 830,235,926 | 359 | null | [
{
"action": "opened",
"author": "seanlaw",
"comment_id": null,
"datetime": 1615563596000,
"masked_author": "username_0",
"text": "We need a good data set that contains multiple motifs OR we can construct one ourselves",
"title": "Add Motif Discovery Tutorial",
"type": "issue"
},
{
"action": "created",
"author": "seanlaw",
"comment_id": 798807104,
"datetime": 1615681419000,
"masked_author": "username_0",
"text": "@username_1 On [slide 40](https://www.cs.ucr.edu/~eamonn/Matrix_Profile_Tutorial_Part1.pdf) they demonstrate three potential motifs in the [penguin dataset](https://www.cs.ucr.edu/~eamonn/MP_first_test_penguin_sample.mat). I think this would work. What do you think?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seanlaw",
"comment_id": 798961168,
"datetime": 1615748697000,
"masked_author": "username_0",
"text": "@username_1 I reached out to Eamonn Keogh and he provided another dataset for motif discovery. I've [deposited it on Zenodo](https://zenodo.org/record/4603813) and you can download it like:\r\n\r\n```\r\n%matplotlib inline\r\n\r\nimport pandas as pd\r\nimport matplotlib.pyplot as plt\r\nimport stumpy\r\n\r\nplt.rcParams[\"figure.figsize\"] = [20, 6] # width, height\r\nplt.rcParams['xtick.direction'] = 'out'\r\n\r\ndf = pd.read_csv(\"https://zenodo.org/record/4603813/files/common_nightingale.csv?download=1\")\r\n\r\nplt.plot(df[\"mfcc_2\"])\r\nplt.show()\r\n\r\nplt.plot(df[\"mfcc_4\"])\r\nplt.show()\r\n\r\nplt.plot(df[\"mfcc_1\"])\r\nplt.show()\r\n\r\nmp_mfcc2_150 = stumpy.stump(df[\"mfcc_2\"], m=150)\r\nmp_mfcc4_150 = stumpy.stump(df[\"mfcc_4\"], m=150)\r\nmp_mfcc1_350 = stumpy.stump(df[\"mfcc_1\"], m=350)\r\n```\r\n\r\nThis reproduces the plots/motifs shown in [this PPT provided by Eamonn](https://github.com/TDAmeritrade/stumpy/files/6137371/bird_for_sean.pptx)\r\n\r\nLet me know if this is helpful",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mexxexx",
"comment_id": 799490742,
"datetime": 1615820507000,
"masked_author": "username_1",
"text": "This is great, thanks! I will work with this data set and reproduce the motifs for one of the time series.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seanlaw",
"comment_id": 846147748,
"datetime": 1621621039000,
"masked_author": "username_0",
"text": "@username_1 Just to confirm, this one is for the \"motif discovery tutorial\" while #358 needs to be updated to leverage `motifs.py`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seanlaw",
"comment_id": 853466592,
"datetime": 1622679726000,
"masked_author": "username_0",
"text": "@username_1 I wanted to let you know that I am going to remove the `excl_zone` parameter in the `motifs` function as I am moving that parameter over into `config.py` so that it will be defined globally. This way a user only has to change/set the exclusion zone once and then it will be inherited in ALL functions.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seanlaw",
"comment_id": 860673847,
"datetime": 1623676374000,
"masked_author": "username_0",
"text": "@username_1 I just wanted to [follow up on this](https://github.com/TDAmeritrade/stumpy/issues/358) to ask if you've been able to make any progress. I'm sure that you've been busy so no worries if you need more time.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mexxexx",
"comment_id": 860730482,
"datetime": 1623680920000,
"masked_author": "username_1",
"text": "@username_0 I pushed something on #358 but I didn't manage to write anything for this issue yet unfortunately... Sorry for having promised it, I hope I can do this soon!",
"title": null,
"type": "comment"
}
] | 2 | 8 | 2,210 | false | false | 2,210 | true |
arlac77/remove-sensible-values | null | 758,758,883 | 217 | {
"number": 217,
"repo": "remove-sensible-values",
"user_login": "arlac77"
} | [
{
"action": "opened",
"author": "arlac77",
"comment_id": null,
"datetime": 1607367162000,
"masked_author": "username_0",
"text": "package.json\n---\n- chore(deps): add ^3.14.0 remove ^3.13.0 (devDependencies.ava)",
"title": "merge from arlac77/template-arlac77-github",
"type": "issue"
}
] | 2 | 2 | 455 | false | true | 80 | false |
DolphDev/pynationstates | null | 873,821,160 | 18 | {
"number": 18,
"repo": "pynationstates",
"user_login": "DolphDev"
} | [
{
"action": "opened",
"author": "DolphDev",
"comment_id": null,
"datetime": 1619923118000,
"masked_author": "username_0",
"text": "HTML Entities will not be processed as a back up.\r\n\r\n(Dev Note - will decided if to do this by default in a future release, due to potential backwards comp issues. we will return html entities as is on first attempt).",
"title": "Fixed Bug Related to HTML entities",
"type": "issue"
}
] | 2 | 2 | 488 | false | true | 217 | false |
mzntaka0/saasinit | null | 640,147,349 | 2 | null | [
{
"action": "opened",
"author": "themother-lab",
"comment_id": null,
"datetime": 1592370888000,
"masked_author": "username_0",
"text": "Hi,\r\n\r\nI'm trying to setup the environment, but running \"docker-compose up -d --build\" and opening the app via browser doesn't work. What's missing?\r\n\r\n<img width=\"367\" alt=\"Screenshot 2020-06-17 at 07 11 21\" src=\"https://user-images.githubusercontent.com/65431964/84857803-257a1080-b06a-11ea-9ac7-34bc5512d719.png\">\r\n<img width=\"728\" alt=\"Screenshot 2020-06-17 at 07 11 28\" src=\"https://user-images.githubusercontent.com/65431964/84857807-27dc6a80-b06a-11ea-9ea1-fe01fd3dddc6.png\">",
"title": "docker runs but app not reachable.",
"type": "issue"
}
] | 1 | 1 | 482 | false | false | 482 | false |
microsoft/jschema | microsoft | 762,650,563 | 136 | null | [
{
"action": "opened",
"author": "MDMS17",
"comment_id": null,
"datetime": 1607707192000,
"masked_author": "username_0",
"text": "tried to validate a json file against json schema, but result was unexpected, here is the code\r\n static bool IsValidMcpdFile(FileInfo fi)\r\n {\r\n string McpdSchemaFile = File.ReadAllText(\"JsonSchema\\\\mcpd.json\");\r\n var schema = SchemaReader.ReadSchema(McpdSchemaFile, \"JsonSchema\\\\mcpd.json\");\r\n string McpdJsonFile = File.ReadAllText(fi.FullName);\r\n Validator validator = new Validator(schema);\r\n Result[] errors = validator.Validate(McpdJsonFile, fi.FullName);\r\n bool result = true;\r\n if (errors.Any()) result = false;\r\n return result;\r\n }\r\ninside result array, there's one item, and there're 3 items in its arguments, header/schemaversion, 15, 1.3, this is stranger, schema version in both schema and instance are 1.5, both files are attached here, please review, thanks!\r\n[mcpd.zip](https://github.com/microsoft/jschema/files/5680607/mcpd.zip)",
"title": "Json.Schema.Validation, wrong result",
"type": "issue"
}
] | 1 | 1 | 954 | false | false | 954 | false |
bancolombia/secrets-manager | bancolombia | 824,623,504 | 22 | {
"number": 22,
"repo": "secrets-manager",
"user_login": "bancolombia"
} | [
{
"action": "opened",
"author": "juancgalvis",
"comment_id": null,
"datetime": 1615213929000,
"masked_author": "username_0",
"text": "- Resolves error from migration to maven central which publishes the module metadata",
"title": "Fix module metadata",
"type": "issue"
}
] | 2 | 2 | 3,432 | false | true | 84 | false |
betagouv/ecolab-data | betagouv | 620,213,478 | 4 | null | [
{
"action": "opened",
"author": "laem",
"comment_id": null,
"datetime": 1589808769000,
"masked_author": "username_0",
"text": "Électricité verte : je pense que c'est une erreur d'inclure les 12gCO2e, car ils sont virtuels. \r\nSi j'utilise une voiture thermique aux émissions égales à la moyenne du parc national, et que je finance à côté l'injection dans le parc de voitures moins émissives, je fais de la compensation carbone, et le but d'un calculateur d'empreinte personnel n'est pas de mesure la compensation. C'est ce qui se passe avec les fournisseurs d'énergie \"verte\".",
"title": "Electricité verte",
"type": "issue"
},
{
"action": "created",
"author": "laem",
"comment_id": 655423629,
"datetime": 1594202766000,
"masked_author": "username_0",
"text": "On pourrait aussi demander si une part de l'électricité est produite consommée localement indépendamment du réseau national. Est-ce courant ? A noter : il y a des chances pour que l'empreinte climat d'une électricité produite localement soit plus élevée que celle du réseau national, en prenant en compte l'ensemble du cycle de vie. A creuser.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fjg",
"comment_id": 688359027,
"datetime": 1599488683000,
"masked_author": "username_1",
"text": "voir retour donné aussi sur #200 sur le sujet de l'électricité \"verte\".\r\n\r\nSur le biogaz, ca mérite de creuser par contre, mais les offres sont très récentes à ma connaissance. A-t-on des retours/études sur les facteurs d'émissions du méthane issue de la méthanisation ?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "laem",
"comment_id": 720586285,
"datetime": 1604335238000,
"masked_author": "username_0",
"text": "Même retour sur le biogaz ici #98",
"title": null,
"type": "comment"
}
] | 2 | 4 | 1,094 | false | false | 1,094 | false |
MakeNowJust-Labo/lite | MakeNowJust-Labo | 908,214,799 | 16 | {
"number": 16,
"repo": "lite",
"user_login": "MakeNowJust-Labo"
} | [
{
"action": "opened",
"author": "MakeNowJust",
"comment_id": null,
"datetime": 1622548629000,
"masked_author": "username_0",
"text": "TODO\r\n\r\n- [x] Add implementation\r\n- [ ] Add tests",
"title": "Add lite-diff",
"type": "issue"
}
] | 2 | 2 | 49 | false | true | 49 | false |
istio/istio | istio | 747,245,491 | 29,074 | {
"number": 29074,
"repo": "istio",
"user_login": "istio"
} | [
{
"action": "opened",
"author": "SpecialYang",
"comment_id": null,
"datetime": 1605857189000,
"masked_author": "username_0",
"text": "Fixes https://github.com/istio/istio/pull/29049. The same issue is also in release-1.8, which needed be fixed. \r\n\r\n[ ] Configuration Infrastructure\r\n[ ] Docs\r\n[ ] Installation\r\n[ ] Networking\r\n[ ] Performance and Scalability\r\n[ ] Policies and Telemetry\r\n[ ] Security\r\n[ ] Test and Release\r\n[ ] User Experience\r\n[ ] Developer Infrastructure",
"title": "[release-1.8] Manual cherry-pick for \"Fix nil pointer dereference in initSidecarInjector of Pilot (#29049)\"",
"type": "issue"
},
{
"action": "created",
"author": "howardjohn",
"comment_id": 731279455,
"datetime": 1605890858000,
"masked_author": "username_1",
"text": "/ok-to-test",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bianpengyuan",
"comment_id": 731302774,
"datetime": 1605893518000,
"masked_author": "username_2",
"text": "/retest",
"title": null,
"type": "comment"
}
] | 4 | 6 | 3,433 | false | true | 357 | false |
SAP/ui5-webcomponents | SAP | 839,800,948 | 3,009 | {
"number": 3009,
"repo": "ui5-webcomponents",
"user_login": "SAP"
} | [
{
"action": "opened",
"author": "ilhan007",
"comment_id": null,
"datetime": 1616597189000,
"masked_author": "username_0",
"text": "**Background**\r\nThere are complains that if the user is at the very beginning of a step and scrolls a bit upwards, the selection switches to the previous step, although most of the viewport is taken by the first one.\r\nQuote\r\n\"5. scroll up a little bit of the cascading goal dialog, and you can see the fact that although the most of dialog is step two, but the wizard will highlight the step one button. this is an issue, because the current step is still step two\"\r\n\r\n**Solution**\r\nAdd private/protected API, called step-switch-threshold\r\nto give more control of the point the switch happens.\r\nIt support float values between 0.5 and 1, where 0.5 means the step will be switched after the user scrolls half of the current step, and 1 - when the current step is entirely scrolled out. The default value is 0.7, which is very close to the behaviour of the sap.m.Wizard.\r\n\r\n**Usage**\r\n```html\r\n<ui5-wizard step-switch-threshold=\"0.5\">\r\n<ui5-wizard step-switch-threshold=\"0.6\">\r\n<ui5-wizard step-switch-threshold=\"0.1\">\r\n```\r\n\r\nrelated to: https://github.com/SAP/ui5-webcomponents/issues/2883",
"title": "refactor(ui5-wizard): configure step change position",
"type": "issue"
},
{
"action": "created",
"author": "ilhan007",
"comment_id": 805883281,
"datetime": 1616597311000,
"masked_author": "username_0",
"text": "FYI @sfsf-xweb-sh",
"title": null,
"type": "comment"
}
] | 1 | 2 | 1,106 | false | false | 1,106 | false |
reactnativecn/react-native-website | reactnativecn | 591,689,987 | 101 | null | [
{
"action": "opened",
"author": "xyZangGit",
"comment_id": null,
"datetime": 1585726132000,
"masked_author": "username_0",
"text": "我原生嵌套RN,然后集成react-navigation3X 4X版本都有一个link的操作,但是原生集成RN,是没有继承reactActivity类的,然后 RN添加",
"title": "原生嵌套RN,如何集成 react-navigation?",
"type": "issue"
},
{
"action": "created",
"author": "sunnylqm",
"comment_id": 607099918,
"datetime": 1585728533000,
"masked_author": "username_1",
"text": "如果你不按标准流程走,那你就得自己研究这些第三方是如何链接编译并加到packagelist中了",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "xyZangGit",
"comment_id": 607100594,
"datetime": 1585728619000,
"masked_author": "username_0",
"text": "我是按照 原生嵌套RN的官方流程走的;目前已经找到问题所在了",
"title": null,
"type": "comment"
}
] | 2 | 3 | 162 | false | false | 162 | false |
move4mobile/geekbites | move4mobile | 837,292,662 | 175 | {
"number": 175,
"repo": "geekbites",
"user_login": "move4mobile"
} | [
{
"action": "created",
"author": "mschilling",
"comment_id": 920035398,
"datetime": 1631713715000,
"masked_author": "username_0",
"text": ":tada: This PR is included in version 2.11.0 :tada:\n\nThe release is available on [GitHub release](https://github.com/move4mobile/geekbites/releases/tag/v2.11.0)\n\nYour **[semantic-release](https://github.com/semantic-release/semantic-release)** bot :package::rocket:",
"title": null,
"type": "comment"
}
] | 2 | 2 | 2,626 | false | true | 265 | false |
comit-network/monero-rs | comit-network | 874,157,291 | 1 | {
"number": 1,
"repo": "monero-rs",
"user_login": "comit-network"
} | [
{
"action": "opened",
"author": "luckysori",
"comment_id": null,
"datetime": 1620003157000,
"masked_author": "username_0",
"text": "The implementation comes from\r\nhttps://github.com/dalek-cryptography/bulletproofs and has been\r\nmodified to conform to Monero's implementation of the same paper.",
"title": "Create new bulletproof submobule",
"type": "issue"
},
{
"action": "created",
"author": "luckysori",
"comment_id": 830997864,
"datetime": 1620011337000,
"masked_author": "username_0",
"text": "This is an interesting question. At first I thought that writing an integration test was not that important because our primary goal is to be able to generate a valid Monero bulletproof, rather than to be able to verify Monero bulletproofs in general. \r\n\r\nOf course, if we are to contribute this to upstream, the verify function must not be buggy. So I was curious and tried it against a mainnet bulletproof and it fails! \r\n\r\nAs it stands, the verify function works against the local prove function (and more importantly, the prove function works against Monero's verify function). I don't want to invest more time into debugging this (it may just be that I'm not deserializing things correctly), but we should record adding a passing \"integration\" test as a TODO. I'll add my failing test and mark it as `ignored`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thomaseizinger",
"comment_id": 831005523,
"datetime": 1620013884000,
"masked_author": "username_1",
"text": "That is okay, thank you!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "luckysori",
"comment_id": 831006603,
"datetime": 1620014321000,
"masked_author": "username_0",
"text": "Yep, all these `create_and_verify_n_m` tests create and verify a proof.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 1,071 | false | false | 1,071 | false |
google/ads-privacy | google | 826,222,764 | 36 | null | [
{
"action": "opened",
"author": "dmarti",
"comment_id": null,
"datetime": 1615307883000,
"masked_author": "username_0",
"text": "If a publisher chooses not to have a certain advertiser or piece of ad creative appear on their site, will they also be able to set an option similar to \"mute this ad\" to prevent it from appearing?",
"title": "\"Mute This Ad\" option for the publisher in DoveKey Auction?",
"type": "issue"
},
{
"action": "created",
"author": "gangwang-google",
"comment_id": 799851955,
"datetime": 1615854552000,
"masked_author": "username_1",
"text": "Thank you for your question. The Dovekey auction proposal supports publisher controls, e.g. advertiser or creative blocking, as follows.\r\n\r\nAs explained in Section “[Determine auction eligibility](https://github.com/google/ads-privacy/blob/master/proposals/dovekey/dovekey_auction_secure_2pc.md#determine-auction-eligibility)”, each conditional bid cached in the Dovekey server is associated with a cache lookup key derived from a limited number of signals, e.g. {`page URL`, `browser language`, `coarse location`}. Before the SSP asks the Dovekey server to cache a conditional bid, the SSP can apply publisher controls applicable to the page URL, browser language and coarse user location, based on the creative information and buyer information associated with the conditional bid.\r\n\r\nIf there are publisher controls more granular than what can be supported by {`page URL`, `browser language`, `coarse location`}, please let us know.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 1,133 | false | false | 1,133 | false |
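The rows above all share the schema described in the header: each record carries a `repo`, an `events` list of issue/comment objects, and derived columns such as `text_size`. As a minimal sketch of how such a record could be processed — the field names are taken from the rows shown here, but the exact on-disk format is an assumption — the `text_size`-style column can be recomputed from the events:

```python
# Hypothetical record mirroring the rows above; field names follow the
# dataset header (repo, events, text_size, ...), but the precise loading
# format is an assumption for illustration only.
record = {
    "repo": "DolphDev/pynationstates",
    "events": [
        {
            "author": "username_0",
            "text": "Fixed Bug Related to HTML entities",
            "type": "issue",
        },
        {
            "author": "username_0",
            "text": "Follow-up note.",
            "type": "comment",
        },
    ],
}


def total_text_size(rec):
    """Sum the length of every event's text body, skipping null/empty
    bodies - analogous to the dataset's text_size column."""
    return sum(len(ev.get("text") or "") for ev in rec.get("events", []))


print(total_text_size(record))
```

Columns like `text_size_no_bots` would follow the same pattern with an extra filter on the author field before summing.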