repo stringlengths 7 67 | org stringlengths 2 32 ⌀ | issue_id int64 780k 941M | issue_number int64 1 134k | pull_request dict | events list | user_count int64 1 77 | event_count int64 1 192 | text_size int64 0 329k | bot_issue bool 1 class | modified_by_bot bool 2 classes | text_size_no_bots int64 0 279k | modified_usernames bool 2 classes |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
Snakemake-Profiles/lsf | Snakemake-Profiles | 740,364,627 | 35 | {
"number": 35,
"repo": "lsf",
"user_login": "Snakemake-Profiles"
} | [
{
"action": "opened",
"author": "haizi-zh",
"comment_id": null,
"datetime": 1605056380000,
"masked_author": "username_0",
"text": "In a legacy LSF system such as v8.3, the command `bjobs` does not support certain new arguments such as `--noheader` or `-o`. Furthermore, most of the time it is not realistic for snakemake users to upgrade LSF versions, which is tightly controlled by cluster admins. Therefore, it is essential to make this project compatible with legacy LSF systems. We can achieve this by simply using an alternative way to monitor job status.\r\n\r\nThis pull request includes a simple patch, which uses a plain `bjobs` command and for job status checking.\r\n\r\nNote: I have tested the code under the following environment:\r\n\r\nIBM Platform LSF 8.3.0.196409, May 10 2012\r\n\r\nHowever, I haven't tested it for other LSF versions.",
"title": "🐛 Fix bug for legacy LSF system (v8.3)",
"type": "issue"
},
{
"action": "created",
"author": "mbhall88",
"comment_id": 728571329,
"datetime": 1605574598000,
"masked_author": "username_1",
"text": "Hi @username_0 , thanks for raising this issue and putting in a PR.\r\n\r\nI am a little bit hesitant about trying to add support for legacy systems this old. It looks like these features `-o` and `-noheader` were introduced in v9.1.1 (released 2013), which itself is [becoming legacy](https://www.ibm.com/support/pages/end-life-end-support-announcement-all-editions-ibm-platform-lsf-v9-family-ibm-platform-analytics-ibm-spectrum-lsf-suite-hpc-101x-ibm-spectrum-lsf-suite-workgroups-101x). This seems akin to supporting python2 in some respects (however v8.3 seems to be end-of-support for longer than python2).\r\n\r\nI was able to dig up some [old documentation](https://www14.software.ibm.com/support/customercare/sas/f/plcomp/platformlsf.html) for v8.0 (not v8.3 though), but again, it becomes very hard to debug issues on such an old system. If we were going to change the command to check the job status to the method you have outlined in this PR it will require a lot more work. We would need to update all tests and also add in some error handling for the case where the status cannot be obtained from the plain `bjobs <jobID>` command.\r\n\r\nI appreciate your hands are tied and you can't just upgrade.\r\n\r\nI might ask @username_2 and @johanneskoester for their thoughts on this also as I don't think it is appropriate for me to make such a decision on my own.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "haizi-zh",
"comment_id": 728700520,
"datetime": 1605591910000,
"masked_author": "username_0",
"text": "Hi @username_1 , totally understood. If I were you I would definitely feel the same hesitation. Unfortunately, my employer's HPC cluster is fairly old with lots of legacy codes running in it, thus upgrading to newer LSF is not something feasible. You don't have to merge the PR. I submitted it just in case some other guys may need it. Wish you a great day! 😃",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "leoisl",
"comment_id": 1081688621,
"datetime": 1648548774000,
"masked_author": "username_2",
"text": "I think is reasonable to not support legacy LSF versions. I am wondering if this PR should remain permanently open for legacy users to see it, or if we should close this PR, and add a section in `README.md` that points to this PR or to https://github.com/username_0/lsf for users looking to run this profile on legacy LSF versions.",
"title": null,
"type": "comment"
}
] | 3 | 4 | 2,743 | false | false | 2,743 | true |
microsoft/vscode-azureappservice | microsoft | 810,462,471 | 1,968 | null | [
{
"action": "opened",
"author": "jayeshnathani14IG",
"comment_id": null,
"datetime": 1613589350000,
"masked_author": "username_0",
"text": "<!-- IMPORTANT: Please be sure to remove any private information before submitting. -->\r\n\r\nDoes this occur consistently? <!-- TODO: Type Yes or No -->\r\nRepro steps:\r\n<!-- TODO: Share the steps needed to reliably reproduce the problem. Please include actual and expected results. -->\r\n\r\n1.\r\n2.\r\n\r\nAction: appService.Deploy\r\nError type: 500\r\nError Message: An error has occurred.\r\n\r\nVersion: 0.20.0\r\nOS: win32\r\nOS Release: 10.0.19042\r\nProduct: Visual Studio Code\r\nProduct Version: 1.53.2\r\nLanguage: en\r\n\r\n<details>\r\n<summary>Call Stack</summary>\r\n\r\n```\r\nnew RestError extension.bundle.js:46:66146\r\nextension.bundle.js:46:496611extension.bundle.js:46:496611\r\nprocessTicksAndRejections task_queues.js:97:5\r\n```\r\n\r\n</details>",
"title": "Error while publishing the app",
"type": "issue"
},
{
"action": "created",
"author": "nturinski",
"comment_id": 780811689,
"datetime": 1613591655000,
"masked_author": "username_1",
"text": "Please provide repro steps.",
"title": null,
"type": "comment"
}
] | 3 | 4 | 949 | false | true | 747 | false |
vinteo/hass-opensprinkler | null | 720,962,031 | 138 | null | [
{
"action": "opened",
"author": "JurajNyiri",
"comment_id": null,
"datetime": 1602630336000,
"masked_author": "username_0",
"text": "Opensprinkler sometimes does not initiate properly.\r\n\r\nWhen this is happening, home assistant seems to be taking a long time to load (reason currently unknown).\r\n\r\nI did some digging and I think that slow start of HASS causes timeouts and/or race conditions inside pyopensprinkler and this component.\r\n\r\n**Component should be able to recover from this and try to set up again and not give up.**\r\n\r\n```\r\nLogger: custom_components.opensprinkler\r\nSource: helpers/update_coordinator.py:147\r\nIntegration: OpenSprinkler (documentation)\r\nFirst occurred: October 13, 2020, 11:42:59 PM (1 occurrences)\r\nLast logged: October 13, 2020, 11:42:59 PM\r\n\r\nTimeout fetching OpenSprinkler resource status data\r\n```\r\n\r\n```\r\nError while setting up opensprinkler platform for binary_sensor\r\nTraceback (most recent call last):\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 201, in _async_setup_platform\r\n await asyncio.gather(*pending)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 310, in async_add_entities\r\n await asyncio.gather(*tasks)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 371, in _async_add_entity\r\n device_info = entity.device_info\r\n File \"/config/custom_components/opensprinkler/__init__.py\", line 134, in device_info\r\n model = controller.hardware_version_name or \"Unknown\"\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 471, in hardware_version_name\r\n if self.hardware_version == HARDWARE_VERSION_OSPI:\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 466, in hardware_version\r\n return self._get_option(\"hwv\")\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 274, in _get_option\r\n return self._get_options()[option]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 280, in _get_options\r\n return self._retrieve_state()[\"options\"]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 268, in _retrieve_state\r\n raise OpenSprinklerNoStateError(\"No state. Please refresh\")\r\npyopensprinkler.OpenSprinklerNoStateError: No state. Please refresh\r\n```\r\n\r\n```\r\nError while setting up opensprinkler platform for sensor\r\nTraceback (most recent call last):\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 201, in _async_setup_platform\r\n await asyncio.gather(*pending)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 310, in async_add_entities\r\n await asyncio.gather(*tasks)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 371, in _async_add_entity\r\n device_info = entity.device_info\r\n File \"/config/custom_components/opensprinkler/__init__.py\", line 134, in device_info\r\n model = controller.hardware_version_name or \"Unknown\"\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 471, in hardware_version_name\r\n if self.hardware_version == HARDWARE_VERSION_OSPI:\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 466, in hardware_version\r\n return self._get_option(\"hwv\")\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 274, in _get_option\r\n return self._get_options()[option]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 280, in _get_options\r\n return self._retrieve_state()[\"options\"]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 268, in _retrieve_state\r\n raise OpenSprinklerNoStateError(\"No state. Please refresh\")\r\npyopensprinkler.OpenSprinklerNoStateError: No state. Please refresh\r\n```\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 201, in _async_setup_platform\r\n await asyncio.gather(*pending)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 310, in async_add_entities\r\n await asyncio.gather(*tasks)\r\n File \"/usr/src/homeassistant/homeassistant/helpers/entity_platform.py\", line 371, in _async_add_entity\r\n device_info = entity.device_info\r\n File \"/config/custom_components/opensprinkler/__init__.py\", line 134, in device_info\r\n model = controller.hardware_version_name or \"Unknown\"\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 471, in hardware_version_name\r\n if self.hardware_version == HARDWARE_VERSION_OSPI:\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 466, in hardware_version\r\n return self._get_option(\"hwv\")\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 274, in _get_option\r\n return self._get_options()[option]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 280, in _get_options\r\n return self._retrieve_state()[\"options\"]\r\n File \"/usr/local/lib/python3.8/site-packages/pyopensprinkler/__init__.py\", line 268, in _retrieve_state\r\n raise OpenSprinklerNoStateError(\"No state. Please refresh\")\r\npyopensprinkler.OpenSprinklerNoStateError: No state. Please refresh\r\n```\r\n\r\nI think this is caused internally by some race conditions:\r\n- _retrieve_state in pyopensprinkler executes before update is finished, throws exception\r\n- as there was no retry logic, component setup fails\r\n\r\nIdeally, issue should be resolved by eliminating the race condition and / or implementing component retry setup logic after some time.\r\n\r\nI wrote a (very ugly workaround)[https://github.com/username_0/hass-opensprinkler/blob/fix_timeout_on_start/custom_components/opensprinkler/__init__.py#L61], which works and now boots 100% of the time. \r\nIn my testing, it failed the 1st time, went into exception, recursion and then was successful and continued setup.\r\n\r\nI tried restarting multiple times on both rpis I have and it just didn't want to initiate correctly so I finally had some time to debug and find the cause. \r\nI have encountered this issue before and I remember it also being discussed in Community without a resolution.\r\n\r\nI do not know how to replicate this as I do not know the cause. Sometimes it just starts happening and recovers the next day when I restart.\r\nMaybe try adding some sleeps to simulate this and then try to fix?",
"title": "Component fails to start",
"type": "issue"
},
{
"action": "created",
"author": "vinteo",
"comment_id": 708086434,
"datetime": 1602636288000,
"masked_author": "username_1",
"text": "It would be great if you could open a PR for either pyopensprinkler (to fix the race condition) or with this workaround. Hacktoberfest is still ongoing :)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JurajNyiri",
"comment_id": 708275314,
"datetime": 1602667152000,
"masked_author": "username_0",
"text": "Thank you for the suggestion. \r\nHowever, currently I don't have time to fix it properly and I am not proud about that workaround, it's really ugly and should be fixed properly.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "vinteo",
"comment_id": null,
"datetime": 1602890833000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "vinteo",
"comment_id": 710695947,
"datetime": 1602890989000,
"masked_author": "username_1",
"text": "Please try the latest release v1.1.3",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JurajNyiri",
"comment_id": 712148778,
"datetime": 1603113354000,
"masked_author": "username_0",
"text": "Thank you, I have now encountered a similar scenario, it threw an error \r\n```\r\nLogger: custom_components.opensprinkler\r\nSource: helpers/update_coordinator.py:147\r\nIntegration: OpenSprinkler (documentation)\r\nFirst occurred: 3:11:59 PM (1 occurrences)\r\nLast logged: 3:11:59 PM\r\n\r\nTimeout fetching OpenSprinkler resource status data\r\n```\r\n\r\nbut then recovered. 👍",
"title": null,
"type": "comment"
}
] | 2 | 6 | 7,114 | false | false | 7,114 | true |
oclif/cli-ux | oclif | 767,019,910 | 299 | {
"number": 299,
"repo": "cli-ux",
"user_login": "oclif"
} | [
{
"action": "created",
"author": "RasPhilCo",
"comment_id": 744885461,
"datetime": 1607992254000,
"masked_author": "username_0",
"text": "@dependabot squash and merge",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,668 | false | true | 28 | false |
avadev/AvaTax-Calc-SOAP-PHP | avadev | 787,040,242 | 20 | null | [
{
"action": "opened",
"author": "aanderson-msi",
"comment_id": null,
"datetime": 1610729585000,
"masked_author": "username_0",
"text": "Line 32 of ATConfig.php:\r\n` if($n == '_ivars') { return parent::__get($n); }`\r\n\r\nHowever, the class is declared without a parent:\r\n```\r\n/**\r\n * Contains various service configuration parameters as class static variables.\r\n *\r\n * {@link AddressServiceSoap} and {@link TaxServiceSoap} read this file during initialization.\r\n *\r\n * @author Avalara\r\n * @copyright © 2004 - 2011 Avalara, Inc. All rights reserved.\r\n * @package Base\r\n */\r\nnamespace AvaTax;\r\nclass ATConfig\r\n{\r\n\r\n```\r\nWhen using this class with PHP 7.4, it results in an error. \r\n```\r\nDeprecated Functionality: Cannot use \"parent\" when current class scope has no parent in <project>/vendor/avalara/avatax/AvaTax/ATConfig.php on line 32\r\n``` \r\nRecommend returning \"null\" instead of the call to the parent's __get() function.",
"title": "PHP 7.4 deprecates parent for classes without parent",
"type": "issue"
},
{
"action": "created",
"author": "vipinroy",
"comment_id": 783437412,
"datetime": 1614006222000,
"masked_author": "username_1",
"text": "Any update on this issue?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jonathanribas",
"comment_id": 879812843,
"datetime": 1626261983000,
"masked_author": "username_2",
"text": "We can't move forward on PHP upgrades because of this issue.\r\n\r\nPHP 7.3 will be deprecated at the of this year, can you please update your code?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gusdemayo",
"comment_id": 880068838,
"datetime": 1626283000000,
"masked_author": "username_3",
"text": "I opened a pull request that should fix this issue\r\n\r\nBy removing the line\r\n\r\n```\r\nif($n == '_ivars') { return parent::__get($n); }\r\n```\r\n\r\nthe case of `$n === 'ivars'` should be caught by the else statement",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jonathanribas",
"comment_id": 880093098,
"datetime": 1626285238000,
"masked_author": "username_2",
"text": "Thank you @username_3! I was creating a PR with the exact same change!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gusdemayo",
"comment_id": 890001912,
"datetime": 1627662128000,
"masked_author": "username_3",
"text": "Is this project still being maintained?",
"title": null,
"type": "comment"
}
] | 4 | 6 | 1,281 | false | false | 1,281 | true |
adiwajshing/Baileys | null | 775,793,101 | 297 | null | [
{
"action": "opened",
"author": "omerbeylife",
"comment_id": null,
"datetime": 1609234068000,
"masked_author": "username_0",
"text": "Hello, audio files, google voices do not work on iphone users",
"title": "İphone Users",
"type": "issue"
},
{
"action": "created",
"author": "adiwajshing",
"comment_id": 753284599,
"datetime": 1609488825000,
"masked_author": "username_1",
"text": "Need more info than this.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "omerbeylife",
"comment_id": 753287041,
"datetime": 1609490534000,
"masked_author": "username_0",
"text": "There is such a plug-in but iPhone users cannot play bot sounds. The problem is caused by bailley. \r\nhttps://www.github.com/username_2/WhatsAsena/tree/master/plugins/scrapers.js",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Quiec",
"comment_id": 753310144,
"datetime": 1609503587000,
"masked_author": "username_2",
"text": "The sounds sent by Baileys are not listen on the iPhone.\r\n\r\nAny suggestions?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "edgardmessias",
"comment_id": 753963763,
"datetime": 1609765535000,
"masked_author": "username_3",
"text": "Try this: `ffmpeg -i audio.mp3 -vn -ar 44100 -ac 2 -b:a 192k audio-fixed.mp3 -y`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Quiec",
"comment_id": 754178767,
"datetime": 1609789619000,
"masked_author": "username_2",
"text": "Thanks!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Romerio",
"comment_id": 756744248,
"datetime": 1610110931000,
"masked_author": "username_4",
"text": "For me it only worked after adding \"-f ipod\"\r\n`ffmpeg -i audio.mp3 -vn -ab 128k -ar 44100 -f ipod audio-fixed.mp3 -y`",
"title": null,
"type": "comment"
}
] | 6 | 9 | 674 | false | true | 538 | true |
custom-components/alexa_media_player | custom-components | 792,733,080 | 1,132 | {
"number": 1132,
"repo": "alexa_media_player",
"user_login": "custom-components"
} | [
{
"action": "opened",
"author": "alandtse",
"comment_id": null,
"datetime": 1611470258000,
"masked_author": "username_0",
"text": "",
"title": "2020-01-23",
"type": "issue"
}
] | 2 | 2 | 509 | false | true | 0 | false |
SeldonIO/alibi-detect | SeldonIO | 857,931,356 | 221 | null | [
{
"action": "opened",
"author": "RitikaKulshresth",
"comment_id": null,
"datetime": 1618408938000,
"masked_author": "username_0",
"text": "I have tried to run the the Kolmogorov-Smirnov (K-S) tests for numerical columns using the Categorical and mixed type data drift detection on income prediction dataset, for numerical features for ex ( Age, Capital Gain, Capital Loss, Hours per week) feature I am getting K-S test value and P value as NAN but on applying the Chi-Square test on categorical features( Workclass, Education, Marital Status, Occupation, Relationship, Race, Sex, Country) I am getting the Chi2 and P value as some float values .\r\nPlease let me know shall I try with the downgraded version of Alibi Detect for running the Categorical and mixed type data drift detection on income prediction file .\r\nLooking for the resolution asap",
"title": "What to do as I am getting the P values as NAN on applying Kolmogorov-Smirnov (K-S) tests for numerical columns",
"type": "issue"
},
{
"action": "created",
"author": "arnaudvl",
"comment_id": 819555302,
"datetime": 1618409953000,
"masked_author": "username_1",
"text": "Hi @username_0 , could you provide a reproducible example as well as the alibi-detect version you are using? Because I just ran the ChiSquareDrift and TabularDrift detectors using the latest release and they seem to run fine.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RitikaKulshresth",
"comment_id": 819725477,
"datetime": 1618423634000,
"masked_author": "username_0",
"text": "Hey @username_1 , I am running the example mentioned here: https://docs.seldon.io/projects/alibi-detect/en/stable/examples/cd_chi2ks_adult.html.\r\nI am using the following versions:\r\nalibi - 0.5.7\r\nalibi_detect - 0.5.1\r\nPython - 3.8.8\r\n\r\nGetting the following output with continuous numerical features as NAN\r\n`Age -- Drift? No! -- K-S nan -- p-value nan\r\nWorkclass -- Drift? Yes! -- Chi2 18.114 -- p-value 0.020\r\nEducation -- Drift? No! -- Chi2 9.845 -- p-value 0.131\r\nMarital Status -- Drift? No! -- Chi2 6.583 -- p-value 0.086\r\nOccupation -- Drift? Yes! -- Chi2 17.900 -- p-value 0.022\r\nRelationship -- Drift? No! -- Chi2 0.964 -- p-value 0.965\r\nRace -- Drift? No! -- Chi2 1.168 -- p-value 0.883\r\nSex -- Drift? No! -- Chi2 0.463 -- p-value 0.496\r\nCapital Gain -- Drift? No! -- K-S nan -- p-value nan\r\nCapital Loss -- Drift? No! -- K-S nan -- p-value nan\r\nHours per week -- Drift? No! -- K-S nan -- p-value nan\r\nCountry -- Drift? Yes! -- Chi2 20.595 -- p-value 0.024`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "arnaudvl",
"comment_id": 819762583,
"datetime": 1618427205000,
"masked_author": "username_1",
"text": "Thanks, could you upgrade to the latest alibi-detect (v0.6.0)? That should resolve the issue.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RitikaKulshresth",
"comment_id": 820139026,
"datetime": 1618466516000,
"masked_author": "username_0",
"text": "That issue worked fine. Thanks for your quick help. You can close this issue\r\n\r\nAlso I am trying to use the same ChiSquareDrift and TabularDrift detectors on Iris Dataset where we have only Continuous numerical features for that I am raising another new issue.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "arnaudvl",
"comment_id": null,
"datetime": 1618480781000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 2,262 | false | false | 2,262 | true |
edgexfoundry/edgex-go | edgexfoundry | 550,751,787 | 2,300 | null | [
{
"action": "opened",
"author": "michaelestrin",
"comment_id": null,
"datetime": 1579174664000,
"masked_author": "username_0",
"text": "Write an in-memory implementation of the `DBClient` interface for the file mentioned in this issue's title. Include unit tests that cover the implementation. \r\n\r\nThis in-memory persistence implementation will be used for Golang-based acceptance tests.\r\n\r\nRelated to https://github.com/edgexfoundry/edgex-go/issues/2277 and https://github.com/edgexfoundry/edgex-go/issues/2273.",
"title": "Implement db/memory/valuedescriptors.go behavior",
"type": "issue"
},
{
"action": "created",
"author": "michaelestrin",
"comment_id": 576753265,
"datetime": 1579622845000,
"masked_author": "username_0",
"text": "Added hold. This issue was created to facilitate Golang-based acceptance testing of APIv2 endpoints. \r\n Unsure at this point if APIv2 will reuse the DBClient abstraction or create its own.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "michaelestrin",
"comment_id": 582898963,
"datetime": 1580994698000,
"masked_author": "username_0",
"text": "Decision made for v2 to have its own persistence implementation. Closing this issue; it is no longer relevant.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "michaelestrin",
"comment_id": null,
"datetime": 1580994699000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 4 | 678 | false | false | 678 | false |
PrismJS/prism | PrismJS | 655,763,631 | 2,474 | {
"number": 2474,
"repo": "prism",
"user_login": "PrismJS"
} | [
{
"action": "opened",
"author": "osipxd",
"comment_id": null,
"datetime": 1594638533000,
"masked_author": "username_0",
"text": "Added file extensions `kt` and `kts` as aliases for Kotlin.",
"title": "Add aliases for Kotlin",
"type": "issue"
},
{
"action": "created",
"author": "osipxd",
"comment_id": 657505028,
"datetime": 1594639842000,
"masked_author": "username_0",
"text": "Ah, missed it. Fixed now.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RunDevelopment",
"comment_id": 657510031,
"datetime": 1594640443000,
"masked_author": "username_1",
"text": "Thank you for contributing!",
"title": null,
"type": "comment"
}
] | 2 | 3 | 111 | false | false | 111 | false |
vitejs/vite | vitejs | 890,849,296 | 3,399 | null | [
{
"action": "opened",
"author": "zhangyuang",
"comment_id": null,
"datetime": 1620894948000,
"masked_author": "username_0",
"text": "### Describe the bug\r\n\r\nwhen the thirdparty module is esm format, esbuild manage it without add `__esModule` property.\r\n\r\nfor examle, the source code is\r\n\r\n```js\r\n// swier.esm.js\r\nexport { default as Swiper, default } from './esm/components/core/core-class';\r\nexport { default as Virtual } from './esm/components/virtual/virtual';\r\n//...\r\n```\r\n\r\nAfter the build ,will output\r\n\r\n```js\r\nimport {default as default2, default as default3} from \"./esm/components/core/core-class\";\r\nimport {default as default4} from \"./esm/components/virtual/virtual\";\r\nexport {\r\n default14 as A11y,\r\n default3 as default\r\n};\r\n\r\n```\r\n\r\nwhich it not have `__esModule` property.\r\n\r\nit will be cause an error. if a module build by webpack, the source code as below\r\n\r\n```js\r\n// react-id-swiper sourceCode\r\nimport Swiper from 'swiper';\r\nnew Swiper()\r\n```\r\nafter webpack build , output code will add helper function `__importDefault`\r\n```js\r\n// react-id-swiper\r\nvar __importDefault = (this && this.__importDefault) || function (mod) {\r\n return (mod && mod.__esModule) ? mod : { \"default\": mod };\r\n};\r\nvar swiper_1 = __importDefault(require(\"swiper\"));\r\nnew swiper_1.default(swiperNodeRef.current, object_assign_1.default({}, props))\r\n```\r\n\r\nbut because of `__esModule` is undefined, `__importDefault` function will add default property once again cause the return value `__importDefault(require(\"swiper\")).default` is not the current object will be error\r\n\r\n\r\n\r\n \r\n\r\n\r\n### Reproduction\r\n\r\nhttps://github.com/username_0/vite-react-swiper-error\r\n\r\n### System Info\r\n\r\nvite/2.3.2 darwin-x64 node-v12.18.3\r\n\r\n### additional\r\n\r\nwhen i add `__esModule` property in source manually, it can be executed succeed like\r\n\r\n```js\r\nexport { default as Swiper, default } from './esm/components/core/core-class';\r\nexport { default as Virtual } from './esm/components/virtual/virtual';\r\nexport const __esModule = true \r\n```\r\n\r\nIn this case,maybe vite can add `__esModule` in esbuild plugin . for example\r\n\r\n```js\r\nbuild.onLoad({ filter: /.*/, namespace: 'swiper' }, ({ path: id }) => {\r\n return {\r\n loader: 'js',\r\n resolveDir: root,\r\n contents: `export * from \"filepath\"\r\n export {default} from \"filePath\"\r\n export const __esModule = true \r\n `\r\n }\r\n })\r\n```",
"title": "without __esModule property when format esm module cause load module error",
"type": "issue"
},
{
"action": "created",
"author": "Sociosarbis",
"comment_id": 840953898,
"datetime": 1620958189000,
"masked_author": "username_1",
"text": "temporary workaround for your case.\r\n\r\nadd a `resolve.alias` in `vite.cofnig.js` as follows:\r\n```js\r\n{\r\n resolve: {\r\n alias: {\r\n swiper: 'swiper/swiper.cjs'\r\n }\r\n }\r\n}\r\n```\r\n\r\nBTW,is possible to achieve this specifc functionality you proposed somehow like add a extra file and use `resolve.alias` to redirect the request to that. 🤔",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangyuang",
"comment_id": 841019332,
"datetime": 1620970528000,
"masked_author": "username_0",
"text": "yeah, but it's a temporary solution。the problem may be appear frequently",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Sociosarbis",
"comment_id": 841131608,
"datetime": 1620984816000,
"masked_author": "username_1",
"text": "```js\r\nexport default defineConfig({\r\n plugins: [reactRefresh()],\r\n resolve: {\r\n alias: {\r\n swiper: join(__dirname, 'src/swiper'),\r\n '@swiper': join(__dirname, 'node_modules/swiper/swiper.esm')\r\n }\r\n }\r\n})\r\n```\r\n```js\r\n// src/swiper.js\r\nexport * from '@swiper'\r\nexport { default } from '@swiper'\r\nexport const __esModule = true\r\n```\r\nclean the vite prebundle cache and rebuild. This would work. I think it's a issue of esbuild rather than vite's.😄",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangyuang",
"comment_id": 841133781,
"datetime": 1620985084000,
"masked_author": "username_0",
"text": "yeah, i also think that. in the final analysis,the problem is es module and commonjs module are not compatible. `export default` not have corresponding declare in commonjs",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "BasixKOR",
"comment_id": 842874337,
"datetime": 1621318242000,
"masked_author": "username_2",
"text": "This causes latest `evergreen-ui` to break, caused by `hyphenate-style-name`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "frabarz",
"comment_id": 844490263,
"datetime": 1621460056000,
"masked_author": "username_3",
"text": "I'm also having this issue with [`resize-observer-polyfill`](https://www.npmjs.com/package/resize-observer-polyfill). The package offers a es module, but the content doesn't have the `__esModule = true` export, and the app fails because the function is wrapped in an object under a `default` key:\r\n\r\n\r\nThis is clearly something they need to fix, but the thing is, this was working correctly 2 weeks ago. Does anyone know since which version this changed?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangyuang",
"comment_id": 849546936,
"datetime": 1622114008000,
"masked_author": "username_0",
"text": "@username_5 @underfin @username_4 @patak-js please see see, the issue occur many times in different project",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Shinigami92",
"comment_id": 849554327,
"datetime": 1622114789000,
"masked_author": "username_4",
"text": "Does this also happen with Vite `v2.2.4`? If not, I assume it also has to do with the `esbuild` update.\r\nBut if `v2.2.4` works for you, you are at least able to use Vite for now",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Shinigami92",
"comment_id": 849557399,
"datetime": 1622115119000,
"masked_author": "username_4",
"text": "Ok so please downgrade and try it out. Then write me feedback.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhangyuang",
"comment_id": 849559920,
"datetime": 1622115399000,
"masked_author": "username_0",
"text": "@username_4 vite@2.2.4 will not occur the problem. but we want to use the newest vite.maybe you can push esbuild fix the problem",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "antfu",
"comment_id": null,
"datetime": 1623174607000,
"masked_author": "username_5",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "softboy99",
"comment_id": 858205947,
"datetime": 1623287724000,
"masked_author": "username_6",
"text": "Hi,\r\ndoes this has been released?",
"title": null,
"type": "comment"
}
] | 8 | 14 | 5,264 | false | true | 4,823 | true |
tevora-threat/SharpView | tevora-threat | 582,326,735 | 3 | null | [
{
"action": "opened",
"author": "andrewchiles",
"comment_id": null,
"datetime": 1584367628000,
"masked_author": "username_0",
"text": "Running `Get-DomainUser` with the following syntax fails to properly filter on the specified username and instead returns all users in the domain. The result was the same despite specifying `-name domadmin`, `-identity domadmin`, or simply `Get-DomainUser domadmin`\r\n\r\n```SharpView_4.5.exe Get-DomainUser -name \"**domadmin**\"\r\n[*] Tasked beacon to run .NET program: SharpView_4.5.exe Get-DomainUser -name \"domadmin\"\r\n[+] host called home, sent: 840809 bytes\r\n[+] received output:\r\nget-domain\r\n[Get-DomainSearcher] search base: LDAP://DC01.lab.local/DC=lab,DC=local\r\n[Get-DomainUser] filter string: (&(samAccountType=805306368))\r\nobjectsid : {S-1-5-21-.....-500}\r\nsamaccounttype : USER_OBJECT\r\nobjectguid : 019324d8-f17b-45c3-b9a9-adc7e0d3b9b3\r\nuseraccountcontrol : NORMAL_ACCOUNT\r\naccountexpires : 12/31/1600 7:00:00 PM\r\nlastlogon : 11/21/2014 6:42:49 AM\r\nlastlogontimestamp : 3/13/2020 10:40:02 AM\r\npwdlastset : 8/15/2019 10:30:55 AM\r\nlastlogoff : 12/31/1600 7:00:00 PM\r\nbadPasswordTime : 12/31/1600 7:00:00 PM\r\nname : Administrator\r\ndistinguishedname : CN=Administrator,CN=Users,DC=lab,DC=local\r\nwhencreated : 8/15/2019 2:32:06 PM\r\nwhenchanged : 3/13/2020 2:40:02 PM\r\nsamaccountname : Administrator\r\nmemberof : {CN=Group Policy Creator Owners,CN=Users,DC=lab,DC=local, CN=Domain Admins,CN=Users,DC=lab,DC=local, CN=Enterprise Admins,CN=Users,DC=lab,DC=local, CN=Schema Admins,CN=Users,DC=lab,DC=local, CN=Administrators,CN=Builtin,DC=lab,DC=local}\r\ncn : {Administrator}\r\nobjectclass : {top, person, organizationalPerson, user}\r\nlogoncount : 3\r\ncodepage : 0\r\nobjectcategory : CN=Person,CN=Schema,CN=Configuration,DC=lab,DC=local\r\ndescription : Built-in account for administering the computer/domain\r\nusnchanged : 22265\r\ninstancetype : 4\r\nbadpwdcount : 0\r\nusncreated : 8196\r\ncountrycode : 0\r\nprimarygroupid : 513\r\ndscorepropagationdata : {8/15/2019 2:47:54 PM, 8/15/2019 2:47:54 PM, 8/15/2019 2:32:44 PM, 
1/1/1601 6:12:16 PM}\r\nlogonhours : {255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255}\r\nadmincount : 1\r\niscriticalsystemobject : True\r\n\r\nobjectsid : {S-1-5-21-....-502}\r\nsamaccounttype : USER_OBJECT\r\nobjectguid : af1a1a57-681f-4d3c-8775-3b922ba9613d\r\nuseraccountcontrol : ACCOUNTDISABLE, NORMAL_ACCOUNT\r\naccountexpires : NEVER\r\nlastlogon : 12/31/1600 7:00:00 PM\r\npwdlastset : 8/15/2019 10:32:44 AM\r\nlastlogoff : 12/31/1600 7:00:00 PM\r\nbadPasswordTime : 12/31/1600 7:00:00 PM\r\n**name : krbtgt**\r\ndistinguishedname : CN=krbtgt,CN=Users,DC=lab,DC=local\r\nwhencreated : 8/15/2019 2:32:44 PM\r\nwhenchanged : 8/15/2019 2:47:54 PM\r\nsamaccountname : krbtgt\r\nmemberof : {CN=Denied RODC Password Replication Group,CN=Users,DC=lab,DC=local}\r\ncn : {krbtgt}\r\nobjectclass : {top, person, organizationalPerson, user}\r\nServicePrincipalName : kadmin/changepw\r\nlogoncount : 0\r\ncodepage : 0\r\nobjectcategory : CN=Person,CN=Schema,CN=Configuration,DC=lab,DC=local\r\ndescription : Key Distribution Center Service Account\r\nusnchanged : 12731\r\ninstancetype : 4\r\nshowinadvancedviewonly : True\r\nbadpwdcount : 0\r\nusncreated : 12324\r\ncountrycode : 0\r\nprimarygroupid : 513\r\ndscorepropagationdata : {8/15/2019 2:47:54 PM, 8/15/2019 2:32:44 PM, 1/1/1601 12:04:16 AM}\r\nmsds-supportedencryptiontypes : 0\r\nadmincount : 1\r\niscriticalsystemobject : True\r\n\r\n<snip - Remaining domain users were displayed>\r\n```",
"title": "Get-DomainUser Not Filtering on \"Name\" Argument",
"type": "issue"
},
{
"action": "created",
"author": "coffeegist",
"comment_id": 662681747,
"datetime": 1595450021000,
"masked_author": "username_1",
"text": "@username_0 the arguments in this version are case sensitive unfortunately :)",
"title": null,
"type": "comment"
}
] | 2 | 2 | 4,556 | false | false | 4,556 | true |
canammex-tech/deepstream-services-library | canammex-tech | 694,942,611 | 352 | null | [
{
"action": "opened",
"author": "jlerasmus",
"comment_id": null,
"datetime": 1599473373000,
"masked_author": "username_0",
"text": "Running sample `ode_occurrence_rtsp_start_record_tap_action.py` logs error and throws segmentation fault when pressing 'q' to exit application.\r\n\r\nError:\r\n```\r\n0:00:08.434763531 3639 0x12566460 ERROR DSL src/DslTapBintr.cpp:114:UnlinkFromSource: : TapBintr 'src-0-record-tap' failed to unlink from Decode Source Tee\r\n```\r\n\r\nDebugger breaks at line 209 of `DslNodetr.h` when the segmentation fault occurs:\r\n```\r\nreturn GST_ELEMENT(m_pGstObj);\r\n```\r\n\r\nIf the record-tap is not added with `dsl_source_rtsp_tap_add`, the error log and segementation fault does not occur. The following is logged to the console though:\r\n```\r\nnvbuf_utils: dmabuf_fd 1306 mapped entry NOT found\r\n```",
"title": "Error and segmentation fault on quitting application when rtsp-tap is added",
"type": "issue"
},
{
"action": "created",
"author": "rjhowell44",
"comment_id": 688395951,
"datetime": 1599492766000,
"masked_author": "username_1",
"text": "Thanks @username_0 I will look into this. \r\n\r\nRE: *nvbuf_utils: dmabuf_fd 1306 mapped entry NOT found* \r\n\r\nThis I'm aware of and have reported such to nvidia as this showed up with the 5.0 release. They of course want me to reproduce using their app which I have not had time to follow up on as of yet.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlerasmus",
"comment_id": 695998957,
"datetime": 1600679946000,
"masked_author": "username_0",
"text": "@username_1 I am still getting the segfault on the latest v0.08.alpha branch\r\n\r\n```\r\n0:00:07.297015085 11595 0xe3ee8f0 INFO DSL src/DslPipelineBintr.cpp:747:HandleXWindowEvents: : Key released = 'q'\r\nkey released = q\r\n0:00:07.297340143 11595 0xe3ee8f0 INFO DSL src/DslServices.h:664:GetMainLoopHandle: : Returning Handle to MainLoop\r\n0:00:07.298293964 11595 0xe06ff60 INFO DSL src/DslBintr.h:223:SetState: : Changing state to 'NULL' for Bintr 'pipeline'\r\n0:00:07.298432665 11595 0xe06ff60 INFO DSL src/DslBintr.h:229:SetState: : State change completed synchronously for Bintr'pipeline'\r\n0:00:07.298548709 11595 0xe06ff60 INFO DSL src/DslPipelineSourcesBintr.cpp:230:UnlinkAll: : Unlinking stream_muxer from src-0\r\n0:00:07.298599960 11595 0xe06ff60 INFO DSL src/DslSourceBintr.cpp:144:UnlinkFromSink: : Unlinking and releasing request Sink Pad for StreamMux stream_muxer\r\n0:00:07.299076063 11595 0xe06ff60 WARN DSL src/DslSourceBintr.cpp:946:UnlinkAll: : *********************************source\r\n0:00:07.299201846 11595 0xe06ff60 INFO DSL src/DslTapBintr.cpp:109:UnlinkFromSource: : Unlinking and releasing requested Source Pad for Decode Source Tee src-0-record-tap\r\n0:00:07.299283671 11595 0xe06ff60 ERROR DSL src/DslTapBintr.cpp:114:UnlinkFromSource: : TapBintr 'src-0-record-tap' failed to unlink from Decode Source Tee\r\nSegmentation fault (core dumped)\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlerasmus",
"comment_id": 696057186,
"datetime": 1600687830000,
"masked_author": "username_0",
"text": "@username_1, I also got a different error when quitting from terminal using Ctrl+C, but this occurs randomly (sorry, only had INFO logging enabled)\r\n```\r\n^C0:00:29.381972166 1106 0x2edb7210 WARN DSL src/DslSourceBintr.cpp:946:UnlinkAll: : *********************************source\r\n0:00:29.382190347 1106 0x2edb7210 ERROR DSL src/DslTapBintr.cpp:114:UnlinkFromSource: : TapBintr 'src-1-record-tap' failed to unlink from Decode Source Tee\r\ng_mutex_clear() called on uninitialised or locked mutex\r\nAborted (core dumped)\r\n```\r\n\r\nHere is the full Debug log from when 'q' is pressed:\r\n```\r\n0:00:06.554164941 2037 0x5475cf0 INFO DSL src/DslPipelineBintr.cpp:747:HandleXWindowEvents: : Key released = 'q'\r\nkey released = q\r\n0:00:06.554435415 2037 0x5475cf0 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Services::GetMainLoopHandle()\r\n0:00:06.554482655 2037 0x5475cf0 INFO DSL src/DslServices.h:664:GetMainLoopHandle: : Returning Handle to MainLoop\r\n0:00:06.554503801 2037 0x5475cf0 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Services::GetMainLoopHandle()\r\n0:00:06.554596459 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Services::PipelineDeleteAll()\r\n0:00:06.554667085 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::RemoveAllChildren()\r\n0:00:06.554755368 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.554791879 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.554831880 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.554877975 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.554911100 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'iou-tracker' from Parent GST BIn'pipeline'\r\n0:00:06.554955997 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555050009 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555083551 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555101416 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555230533 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555253763 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555278034 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555295899 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555314649 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'on-screen-display' from Parent GST BIn'pipeline'\r\n0:00:06.555340639 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555358973 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555383244 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555400692 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555507725 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555530382 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555553455 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555571112 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555590123 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'primary-gie' from Parent GST BIn'pipeline'\r\n0:00:06.555614134 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555631842 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555654186 2037 
0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555670749 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555751063 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555769761 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555792574 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.555810127 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.555827887 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'sinks-bin' from Parent GST BIn'pipeline'\r\n0:00:06.555852054 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555869763 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555892055 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555909087 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.555992526 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556010547 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556032891 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556050079 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556069715 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'sources-bin' from Parent GST BIn'pipeline'\r\n0:00:06.556093934 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556111382 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556133466 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556150289 
2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556438367 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556459878 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556482743 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556500503 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556518212 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:379:RemoveAllChildren: : Removing Child GstNodetr'tiler' from Parent GST BIn'pipeline'\r\n0:00:06.556542587 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556560817 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556583161 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556600453 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.556655194 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::RemoveAllChildren()\r\n0:00:06.556683684 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556700194 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556722174 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556738841 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556756185 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'iou-tracker' from Parent 'pipeline'\r\n0:00:06.556779050 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.556795977 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.556821290 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556838322 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556860093 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556876552 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556893531 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'on-screen-display' from Parent 'pipeline'\r\n0:00:06.556915719 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.556932126 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.556956501 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.556973168 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.556995043 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557011710 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557028794 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'primary-gie' from Parent 'pipeline'\r\n0:00:06.557050930 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557067753 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557092285 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557109056 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557131035 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557147911 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557166817 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'sinks-bin' from Parent 'pipeline'\r\n0:00:06.557189005 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557205255 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557229110 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557245360 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557266871 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557283174 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557300101 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'sources-bin' from Parent 'pipeline'\r\n0:00:06.557322028 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557338695 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557363331 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557379998 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557401717 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557418488 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557435364 2037 0x50f5f60 DEBUG DSL src/DslBase.h:206:RemoveAllChildren: : Removing Child 'tiler' from Parent 'pipeline'\r\n0:00:06.557457135 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557473958 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.557500938 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::RemoveAllChildren()\r\n0:00:06.557518646 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::RemoveAllChildren()\r\n0:00:06.557548699 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::PipelineBintr::~PipelineBintr()\r\n0:00:06.557571356 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::PipelineBintr::Stop()\r\n0:00:06.557594481 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:69:LogFunc: DSL::Bintr::SetState()\r\n0:00:06.557621148 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557637971 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557655055 2037 0x50f5f60 INFO DSL src/DslBintr.h:223:SetState: : Changing state to 'NULL' for Bintr 'pipeline'\r\n0:00:06.557676462 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.557693493 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.557810631 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.557832767 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.557851517 2037 0x50f5f60 INFO DSL src/DslBintr.h:229:SetState: : State change completed synchronously for Bintr'pipeline'\r\n0:00:06.557869226 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Bintr::SetState()\r\n0:00:06.557893549 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Bintr::IsLinked()\r\n0:00:06.557912247 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Bintr::IsLinked()\r\n0:00:06.557935216 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::BranchBintr::UnlinkAll()\r\n0:00:06.557963029 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.557981675 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558004957 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.558027978 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558045270 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558067458 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558085844 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Nodetr::GetGstElement()\r\n0:00:06.558107251 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558124543 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558146835 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558164075 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558186315 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558203711 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558235586 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.558268816 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558287254 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.558311265 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558328140 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.558345901 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'sources-bin' from Sink 'primary-gie'\r\n0:00:06.558368036 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.558386058 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.558410641 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::PipelineSourcesBintr::UnlinkAll()\r\n0:00:06.558441632 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558459913 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.558483403 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558500331 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Base::GetName()\r\n0:00:06.558517622 2037 0x50f5f60 INFO DSL src/DslPipelineSourcesBintr.cpp:230:UnlinkAll: : Unlinking stream_muxer from src-0\r\n0:00:06.558542519 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::SourceBintr::UnlinkFromSink()\r\n0:00:06.558565488 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558583092 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.558618510 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.558637052 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.558654344 2037 0x50f5f60 INFO DSL src/DslSourceBintr.cpp:144:UnlinkFromSink: : Unlinking and releasing request Sink Pad for StreamMux stream_muxer\r\n0:00:06.558758200 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: std::shared_ptr<DSL::Nodetr> DSL::Nodetr::GetSink()\r\n0:00:06.558788617 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: std::shared_ptr<DSL::Nodetr> DSL::Nodetr::GetSink()\r\n0:00:06.558820857 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.558840024 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559300866 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.559348627 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.559368315 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.559394617 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.559413159 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.559430712 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'src-0' from Sink 'stream_muxer'\r\n0:00:06.559451493 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.559469358 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::SourceBintr::UnlinkFromSink()\r\n0:00:06.559493421 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RtspSourceBintr::UnlinkAll()\r\n0:00:06.559516494 2037 0x50f5f60 WARN DSL src/DslSourceBintr.cpp:946:UnlinkAll: : *********************************source\r\n0:00:06.559543526 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.559568266 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.559586964 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.559612069 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559631236 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559654466 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559672539 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559694570 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559712227 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559734936 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559752124 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.559815823 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.559849677 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.559869105 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.559893793 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.559939366 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Base::GetName()\r\n0:00:06.559974732 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'pre-decode-queue' from Sink 'decode-bin'\r\n0:00:06.560012180 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.560047754 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.560094109 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RtspSourceBintr::HasTapBintr()\r\n0:00:06.560133016 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::RtspSourceBintr::HasTapBintr()\r\n0:00:06.560184006 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::TapBintr::UnlinkFromSource()\r\n0:00:06.560231611 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSource()\r\n0:00:06.560266768 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSource()\r\n0:00:06.560317863 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.560350780 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.560385624 2037 0x50f5f60 INFO DSL src/DslTapBintr.cpp:109:UnlinkFromSource: : Unlinking and releasing requested Source Pad for Decode Source Tee src-0-record-tap\r\n0:00:06.560591618 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.560648337 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.560683025 2037 0x50f5f60 ERROR DSL src/DslTapBintr.cpp:114:UnlinkFromSource: : TapBintr 'src-0-record-tap' failed to unlink from Decode Source Tee\r\n0:00:06.560716932 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::TapBintr::UnlinkFromSource()\r\n0:00:06.560757714 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RecordTapBintr::UnlinkAll()\r\n0:00:06.560802715 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.560842455 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.560886258 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.560923134 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561007250 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::RemoveChild()\r\n0:00:06.561054282 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::IsChild()\r\n0:00:06.561096054 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.561130013 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.561171940 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::IsChild()\r\n0:00:06.561216472 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561250588 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561289651 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561332412 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.561486529 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::RemoveChild()\r\n0:00:06.561521009 2037 0x5475400 WARN matroskamux matroska-mux.c:3468:gst_matroska_mux_finish:0x5470a10 unable to get final track duration\r\n0:00:06.561631219 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::IsChild()\r\n0:00:06.561679762 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.561714190 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.561752159 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::IsChild()\r\n0:00:06.561803202 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::ClearParentName()\r\n0:00:06.561840494 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: 
DSL::Base::ClearParentName()\r\n0:00:06.561877734 2037 0x50f5f60 DEBUG DSL src/DslBase.h:192:RemoveChild: : Child 'record-bin' removed from Parent 'src-0-record-tap'\r\n0:00:06.561910495 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::RemoveChild()\r\n0:00:06.561942423 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::RemoveChild()\r\n0:00:06.561983621 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::~Nodetr()\r\n0:00:06.562019820 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:75:~Nodetr: : Nodetr 'record-bin' deleted\r\n0:00:06.562051383 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::~Nodetr()\r\n0:00:06.562087373 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::~Base()\r\n0:00:06.562118103 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::~Base()\r\n0:00:06.562160812 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RecordMgr::DestroyContext()\r\n0:00:06.562204407 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::RecordMgr::IsOn()\r\n0:00:06.562235397 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::RecordMgr::IsOn()\r\n0:00:06.562271804 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::RecordMgr::DestroyContext()\r\n0:00:06.562319304 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::RecordTapBintr::UnlinkAll()\r\n0:00:06.562358107 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSource()\r\n0:00:06.562396129 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSource()\r\n0:00:06.562430504 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSource()\r\n0:00:06.562466807 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562499308 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562542902 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: 
DSL::Nodetr::GetGstElement()\r\n0:00:06.562577903 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562613008 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562643633 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562678425 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562715614 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.562789834 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSource()\r\n0:00:06.562834574 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.562866345 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.562903065 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.562935878 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.563008952 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:148:UnlinkFromSource: : Unlinking self 'pre-decode-queue' as a Sink from 'pre-decode-tee' Source\r\n0:00:06.563056245 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSource()\r\n0:00:06.563086766 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSource()\r\n0:00:06.563123694 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.563160361 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.563192080 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.563227914 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563262290 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563301041 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563343229 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563380000 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563410730 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563444324 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563477242 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563546045 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.563590681 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.563631203 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.563667818 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.563699433 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.563730423 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'src-parse' from Sink 'pre-decode-tee'\r\n0:00:06.563764122 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.563795112 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.563834123 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.563871259 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.563903291 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::IsLinkedToSink()\r\n0:00:06.563940896 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.563976678 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564016262 2037 0x50f5f60 DEBUG DSL 
src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564048241 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564083138 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564113868 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564149337 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564181421 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::GetGstElement()\r\n0:00:06.564251630 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.564301267 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.564334653 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.564373143 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Base::GetName()\r\n0:00:06.564405591 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Base::GetName()\r\n0:00:06.564437623 2037 0x50f5f60 DEBUG DSL src/DslNodetr.h:111:UnlinkFromSink: : Unlinking Source 'src-depay' from Sink 'src-parse'\r\n0:00:06.564471530 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::Nodetr::UnlinkFromSink()\r\n0:00:06.564502676 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:75:~LogFunc: DSL::GstNodetr::UnlinkFromSink()\r\n0:00:06.564544344 2037 0x50f5f60 DEBUG DSL src/DslLogGst.h:69:LogFunc: DSL::Nodetr::GetGstElement()\r\nSegmentation fault (core dumped)\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rjhowell44",
"comment_id": 697152334,
"datetime": 1600840948000,
"masked_author": "username_1",
"text": "@username_0 just fyi, I was finally able to get my live setup running again... should be able to fix the remaining issues tomorrow",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rjhowell44",
"comment_id": 698498251,
"datetime": 1600970407000,
"masked_author": "username_1",
"text": "@username_0 the segmentation fault issue has been resolved..",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlerasmus",
"comment_id": 698749086,
"datetime": 1601015516000,
"masked_author": "username_0",
"text": "@username_1 thanks, I will test and confirm",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlerasmus",
"comment_id": 699791319,
"datetime": 1601272978000,
"masked_author": "username_0",
"text": "@username_1, I can confirm that the segmentation fault is fixed. I believe you can close the issue.\r\n\r\nAs a final note regarding the `nvbuf_utils` message, herewith my debug log:\r\n```\r\n 0:00:09.260938622 15019 0x2ddd4e30 INFO DSL src/DslPipelineBintr.cpp:747:HandleXWindowEvents: : Key released = 'q'\r\n key released = q\r\n 0:00:09.261218835 15019 0x2ddd4e30 INFO DSL src/DslServices.h:664:GetMainLoopHandle: : Returning Handle to MainLoop\r\n 0:00:09.261435298 15019 0x2da56760 INFO DSL src/DslServices.cpp:7717:ReturnValueToString: : Result = 0 = DSL_RESULT_SUCCESS\r\n DSL_RESULT_SUCCESS\r\n 0:00:09.262414691 15019 0x2da56760 INFO DSL src/DslBintr.h:223:SetState: : Changing state to 'NULL' for Bintr 'pipeline'\r\n 0:00:09.262549798 15019 0x2da56760 INFO DSL src/DslBintr.h:229:SetState: : State change completed synchronously for Bintr'pipeline'\r\n 0:00:09.262663602 15019 0x2da56760 INFO DSL src/DslPipelineSourcesBintr.cpp:238:UnlinkAll: : Unlinking stream_muxer from src-0\r\n 0:00:09.262707666 15019 0x2da56760 INFO DSL src/DslNodetr.h:607:UnlinkFromSinkMuxer: : Unlinking and releasing requested Sink Pad '0x2ddb3160' for Bintr 'src-0'\r\n 0:00:09.272987496 15019 0x2ddd4d90 WARN rtspsrc gstrtspsrc.c:5999:gst_rtspsrc_try_send:<src-0> send interrupted\r\n 0:00:09.273032340 15019 0x2ddd4d90 WARN rtspsrc gstrtspsrc.c:7673:gst_rtspsrc_close:<src-0> TEARDOWN interrupted\r\n 0:00:09.275547492 15019 0x2da56760 INFO DSL src/DslNodetr.h:626:UnlinkFromSinkMuxer: : Bintr 'src-0' changed state to NULL successfully\r\n 0:00:09.276242974 15019 0x2da56760 INFO DSL src/DslSourceBintr.cpp:881:UnlinkAll: : Stream management disabled for RTSP Source 'src-0'\r\n 0:00:09.276436832 15019 0x2da56760 INFO DSL src/DslNodetr.h:547:UnlinkFromSourceTee: : Unlinking and releasing requested Src Pad '0x2ddb8d00' for Bintr 'pre-decode-queue'\r\n 0:00:09.276853506 15019 0x2da56760 INFO DSL src/DslTapBintr.cpp:71:UnlinkFromSourceTee: : Unlinking and releasing requested Source Pad for 
TapBintr src-0-record-tap\r\n 0:00:09.276958560 15019 0x2da56760 INFO DSL src/DslNodetr.h:547:UnlinkFromSourceTee: : Unlinking and releasing requested Src Pad '0x2ddb8aa0' for Bintr 'src-0-record-tap'\r\n 0:00:09.278355931 15019 0x2da56760 INFO DSL src/DslMultiComponentsBintr.cpp:193:UnlinkAll: : Unlinking sink_bin_tee from window-sink\r\n 0:00:09.278428692 15019 0x2da56760 INFO DSL src/DslNodetr.h:547:UnlinkFromSourceTee: : Unlinking and releasing requested Src Pad '0x2ddb8f60' for Bintr 'window-sink'\r\n nvbuf_utils: dmabuf_fd 1306 mapped entry NOT found\r\n 0:00:09.340796749 15019 0x2da56760 INFO DSL src/DslServices.cpp:6652:ComponentDeleteAll: : All Components deleted successfully\r\n 0:00:09.340959564 15019 0x2da56760 INFO DSL src/DslServices.cpp:3275:PphDeleteAll: : All Pad Probe Handlers deleted successfully\r\n 0:00:09.341097067 15019 0x2da56760 INFO DSL src/DslServices.cpp:2920:OdeTriggerDeleteAll: : All ODE Triggers deleted successfully\r\n 0:00:09.341151755 15019 0x2da56760 INFO DSL src/DslServices.cpp:1994:OdeAreaDeleteAll: : All ODE Areas deleted successfully\r\n 0:00:09.341264257 15019 0x2da56760 INFO DSL src/DslServices.cpp:1864:OdeActionDeleteAll: : All ODE Actions deleted successfully\r\n 0:00:09.341403739 15019 0x2da56760 INFO DSL src/DslServices.cpp:921:DisplayTypeDeleteAll: : All Display Types deleted successfully\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rjhowell44",
"comment_id": 700255013,
"datetime": 1601323808000,
"masked_author": "username_1",
"text": "@username_0 Thanks for the log. I will try and chase this down with Nvidia over the coming weeks",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "rjhowell44",
"comment_id": null,
"datetime": 1601323809000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 10 | 44,833 | false | false | 44,833 | true |
ergebnis/classy | ergebnis | 767,220,928 | 314 | {
"number": 314,
"repo": "classy",
"user_login": "ergebnis"
} | [
{
"action": "created",
"author": "localheinz",
"comment_id": 752047423,
"datetime": 1609242346000,
"masked_author": "username_0",
"text": "@dependabot rebase",
"title": null,
"type": "comment"
}
] | 3 | 3 | 5,546 | false | true | 18 | false |
bkchr/impl-trait-for-tuples | null | 700,604,366 | 2 | {
"number": 2,
"repo": "impl-trait-for-tuples",
"user_login": "bkchr"
} | [
{
"action": "opened",
"author": "blefevre",
"comment_id": null,
"datetime": 1600015326000,
"masked_author": "username_0",
"text": "Custom trait bounds can be added with the `#[tuple_types_custom_trait_bound(NewBound)` attribute.\r\nThis will use `NewBound` as the type for each Tuple component instead of the Trait being implemented.",
"title": "Add support for custom trait bounds",
"type": "issue"
},
{
"action": "created",
"author": "blefevre",
"comment_id": 692129383,
"datetime": 1600096916000,
"masked_author": "username_0",
"text": "Supporting advanced trait bounds was simpler than expected so that's been added.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 280 | false | false | 280 | false |
uselagoon/lagoon-charts | uselagoon | 875,053,728 | 252 | {
"number": 252,
"repo": "lagoon-charts",
"user_login": "uselagoon"
} | [
{
"action": "opened",
"author": "smlx",
"comment_id": null,
"datetime": 1620096073000,
"masked_author": "username_0",
"text": "This PR makes the docker-host PVC storage class configurable through the helm chart values. By default we keep the current behaviour of not defining a storage class so as to fall back to cluster default.\r\n\r\nCloses #251",
"title": "Docker host storageclass",
"type": "issue"
},
{
"action": "created",
"author": "smlx",
"comment_id": 832014430,
"datetime": 1620140695000,
"masked_author": "username_0",
"text": "this passed here: https://lagoon-ci.amazeeio.cloud/blue/organizations/jenkins/lagoon/detail/dockerhost-sc/1/pipeline/50\r\n\r\nwill fix CI here https://github.com/uselagoon/lagoon-charts/issues/253",
"title": null,
"type": "comment"
}
] | 1 | 2 | 411 | false | false | 411 | false |
microsoft/reverse-proxy | microsoft | 863,191,818 | 926 | null | [
{
"action": "opened",
"author": "thombrink",
"comment_id": null,
"datetime": 1618949010000,
"masked_author": "username_0",
"text": "### Describe the bug\r\nWhen sending a websocket request through the proxy, it fails at `StreamCopier.CopyAsync` in the `CopyResponseBodyAsync` method of the `HttpProxy` class.\r\n\r\n### To Reproduce\r\n1. Start `ReverseProxy.Direct.Sample` and `SampleServer`.\r\n2. Request a websocket connection on the `ReverseProxy.Direct.Sample` and redirect it to the `SampleServer` `/api/websockets` endpoint.\r\n\r\nThe output will show: `Microsoft.AspNetCore.Server.Kestrel.Transport.Sockets: Debug: Connection id \"<connection_id>\" received FIN.` and `Microsoft.AspNetCore.Server.Kestrel.Transport.Sockets: Debug: Connection id \"<connection_id>\" sending FIN because: \"The client closed the connection.\"`.\r\n\r\nWhen `SampleServer` is called directly, it all works like expected.\r\n\r\n### Further technical details\r\n\r\n- Tested on the current main branch\r\n- The used platform: Windows",
"title": "Proxy websocket request exception",
"type": "issue"
},
{
"action": "created",
"author": "Tratcher",
"comment_id": 823638734,
"datetime": 1618957593000,
"masked_author": "username_1",
"text": "Is that error reported from the proxy process or the SampleServer? It claims the client disconnected.\r\n\r\nHere are some diagnostics you can collect and/or share to help narrow it down:\r\n- Full debug logs from the proxy process\r\n- Wireshark/network traces between the proxy and client\r\n- Any client logs you can collect",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Tratcher",
"comment_id": 823670990,
"datetime": 1618962272000,
"masked_author": "username_1",
"text": "I ran this with the TestClient, ReverseProxy.Direct.Sample, and SampleServer and it seems to behave as expected:\r\n\r\nI see these logs:\r\n```\r\ninfo: Microsoft.AspNetCore.Hosting.Diagnostics[1]\r\n Request starting HTTP/1.1 GET https://localhost:5001/api/websockets - -\r\ndbug: Microsoft.AspNetCore.HostFiltering.HostFilteringMiddleware[0]\r\n Wildcard detected, all requests with hosts will be allowed.\r\ndbug: Microsoft.AspNetCore.Routing.Matching.DfaMatcher[1001]\r\n 1 candidate(s) found for the request path '/api/websockets'\r\ndbug: Microsoft.AspNetCore.Routing.Matching.DfaMatcher[1005]\r\n Endpoint '/{**catch-all}' with route pattern '/{**catch-all}' is valid for the request path '/api/websockets'\r\ndbug: Microsoft.AspNetCore.Routing.EndpointRoutingMiddleware[1]\r\n Request matched endpoint '/{**catch-all}'\r\ninfo: Microsoft.AspNetCore.Routing.EndpointMiddleware[0]\r\n Executing endpoint '/{**catch-all}'\r\ninfo: Yarp.ReverseProxy.Service.Proxy.HttpProxy[9]\r\n Proxying to http://localhost:5002/api/websockets?area=xx2\r\ndbug: Microsoft.AspNetCore.Server.Kestrel.Transport.Sockets[6]\r\n Connection id \"0HM84AMCB38HE\" received FIN.\r\ndbug: Microsoft.AspNetCore.Server.Kestrel.Transport.Sockets[7]\r\n Connection id \"0HM84AMCB38HE\" sending FIN because: \"The client closed the connection.\"\r\ndbug: Microsoft.AspNetCore.Server.Kestrel[10]\r\n Connection id \"0HM84AMCB38HE\" disconnecting.\r\ninfo: Microsoft.AspNetCore.Routing.EndpointMiddleware[1]\r\n Executed endpoint '/{**catch-all}'\r\ninfo: Microsoft.AspNetCore.Hosting.Diagnostics[2]\r\n Request finished HTTP/1.1 GET https://localhost:5001/api/websockets - - - 101 - - 1583.0792ms\r\ndbug: Microsoft.AspNetCore.Server.Kestrel[2]\r\n Connection id \"0HM84AMCB38HE\" stopped.\r\n```\r\n\r\nReceiving a FIN is a normal part of the WebSocket close process, not an error condition.\r\n\r\nI don't see any errors reported in StreamCopier, what was the error?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Tratcher",
"comment_id": null,
"datetime": 1620320295000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "Tratcher",
"comment_id": 833686824,
"datetime": 1620320295000,
"masked_author": "username_1",
"text": "Closing. Let us know if you find additional information.",
"title": null,
"type": "comment"
}
] | 2 | 5 | 3,158 | false | false | 3,158 | false |
radix-ui/primitives | radix-ui | 772,318,291 | 353 | {
"number": 353,
"repo": "primitives",
"user_login": "radix-ui"
} | [
{
"action": "opened",
"author": "chaance",
"comment_id": null,
"datetime": 1608569889000,
"masked_author": "username_0",
"text": "- [x] Use a meaningful title for the pull request. Include the name of the package modified.\r\n- [x] Test the change in your own code.\r\n- [ ] Add or edit tests to reflect the change (run `yarn test`).\r\n- [ ] Add or edit Storybook examples to reflect the change (run `yarn dev`).\r\n- [ ] Add documentation to support any new features.\r\n\r\nThis pull request:\r\n\r\n- [ ] Fixes a bug in an existing package\r\n- [ ] Adds additional features/functionality to an existing package\r\n- [ ] Updates documentation or example code\r\n- [x] Other",
"title": "Speed up pre-commit hooks with lint-staged",
"type": "issue"
},
{
"action": "created",
"author": "benoitgrelard",
"comment_id": 749460965,
"datetime": 1608631638000,
"masked_author": "username_1",
"text": "@username_0 So the main speed-up is from the fact that we're only going to re-lint / typecheck the files that are being staged rather than the whole codebase?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chaance",
"comment_id": 749550034,
"datetime": 1608645120000,
"masked_author": "username_0",
"text": "@username_1 See the comment I left in the config file; in order for `tsc` to work properly with `lint-staged` (and for our scripts to run cross platform) one of those keys we have to use the function syntax in a js config file. I kind of hate it too but I couldn't get it to work in package.json without running bash in the command, which wouldn't work on Windows and would exclude contributors. 😕",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chaance",
"comment_id": 749550569,
"datetime": 1608645194000,
"masked_author": "username_0",
"text": "Even better IMO!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "benoitgrelard",
"comment_id": 749555437,
"datetime": 1608645841000,
"masked_author": "username_1",
"text": "We can't really move this to pre-push hook right? otherwise how would the `--fix` work?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chaance",
"comment_id": 749562508,
"datetime": 1608646810000,
"masked_author": "username_0",
"text": "@username_1 I'm thinking we could keep `prettier` and `eslint` in pre-commit. Those actually run pretty fast anyway when only executed on staged changes. `tsc` is significantly slower and I think we can move that to pre-push.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chaance",
"comment_id": 749567970,
"datetime": 1608647475000,
"masked_author": "username_0",
"text": "@username_1 @username_2 Updated a bit. Per my comment above I kept `prretty-quick` and `eslint` in the pre-commit hook, runs only on staged files and is way faster now. `tsc` goes in pre-push because it was much slower, and that let me delete the config file and keep everything in `package.json`.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jjenzz",
"comment_id": 749597012,
"datetime": 1608650723000,
"masked_author": "username_2",
"text": "🎉 :+1:",
"title": null,
"type": "comment"
}
] | 3 | 8 | 1,712 | false | false | 1,712 | true |
expo/expo | expo | 755,553,565 | 11,199 | {
"number": 11199,
"repo": "expo",
"user_login": "expo"
} | [
{
"action": "opened",
"author": "brentvatne",
"comment_id": null,
"datetime": 1606937583000,
"masked_author": "username_0",
"text": "# Why\r\n\r\nWhen this was first introduced we had just rolled out `expo install`. This has existed for years at this point, and I doubt anyone still uses a version of expo-cli without it. If they do, they have bigger problems.\r\n\r\n# How\r\n\r\nJust delete the logs.\r\n\r\n# Test Plan\r\n\r\nImport a deprecated module or removed module and notice that the warning is better.",
"title": "[expo] Remove warning about install command missing, everybody has this now",
"type": "issue"
}
] | 2 | 2 | 689 | false | true | 359 | false |
Princeton-CDH/startwords | Princeton-CDH | 599,079,433 | 59 | null | [
{
"action": "opened",
"author": "thatbudakguy",
"comment_id": null,
"datetime": 1586805438000,
"masked_author": "username_0",
"text": "[figma link](https://www.figma.com/file/MEt3gfKX4N3WTLPbLGBbGC/Startwords?node-id=225%3A2217)\nthis issue tracks the implementation of the homepage design.",
"title": "As a reader, I want to see featured content on the site's homepage.",
"type": "issue"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 613661374,
"datetime": 1586895709000,
"masked_author": "username_1",
"text": "@username_0 I can't view the \"homepage link\" – gives me \"internal error\" \nAlso, if this is about the homepage, could you change the figma link to show the \"homepage\" page on Figma? – I don't think the \"Tablet\" page on Figma is the appropriate one to look at",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 613667268,
"datetime": 1586896461000,
"masked_author": "username_0",
"text": "updated the links, thanks!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 613680723,
"datetime": 1586898271000,
"masked_author": "username_1",
"text": "@username_0 \nbelow are my notes:\nOn all devices: \n1. The \"read more\" below the issue abstract is not matching the design, it needs to be left-aligned (aligned with the issue abstract) and slightly closer to the last line of the abstract than it is now. \n2. The \"Read more about Startwords.\" needs to be replaced with a red arrow to match the design. \n3. The doi is missing, which needs to be below the authors. \n4. The length of the article previews need to be similar, this will be a problem that we need to fix. \n5. It will be better to have shape for the bottom of the preview text, however I don't think it's a deal breaker now, but would be nice to fix it for the next issue. \n6. I like the interaction of the red arrows! Thank you for doing this!\n\nOn mobile:\n1. The \"Data Beyond Vision\" title is crossing the boundaries, which is causing the arrow to be misplaced as well as an increased space between the title and the authors – \"Vision\" should be wrapped. \n\nOn tablet:\n1. The position and width of the 1. issue number 2. issue abstract, and 3. theme do not match the designs and what was agreed. The three pieces mentioned need to match the x position of the opening paragraph on the left and regarding the width, it will need to match where the opening paragraph on the right ends, i.e. aligning with the x positions of rest of the metadata on this page.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614179921,
"datetime": 1586972353000,
"masked_author": "username_0",
"text": "ok, I _think_ I was able to address everything here. regarding the placement of the arrows when the titles wrap, it's unfortunately pretty hard to control...I was able to get it to not wrap the arrow on its own, but more control than that might be hard to accomplish.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614232358,
"datetime": 1586978493000,
"masked_author": "username_1",
"text": "@username_0 thank you! This looks great, mostly everything is fixed!\n\nA couple of points:\nregarding my earlier comment:\n4. The length of the article previews need to be similar, this will be a problem that we need to fix. – Can we do something about this? \n\nOn mobile:\n1. The \"Data Beyond Vision\" title is crossing the boundaries, which is causing the arrow to be misplaced as well as an increased space between the title and the authors – \"Vision\" should be wrapped. \n- I know you have written above that it's difficult, but it looks like it's fixed! (Although I refreshed once and it was back to everything in one line, crossing the boundary, and refreshed again and it looks like it's fixed :D – not sure why this is happening) – But I think it actually matters, and visually changes the design, would be happy to discuss alternatives. (see the image)\n- Same thing is happening with the DOI crossing the boundary, the two DOIs are too close to each other. (see the image)\n- Also, sorry Nick, I just updated the DOI style here and everywhere else to match what we agreed on for the article page. Could you add \"doi: ....\" to be as part of the link? – You can see this in the design now.\n\n\nDesktop:\n1. Is there a way to prevent the featured content from increasing width on desktop and to look the same as it does on tablet?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614250352,
"datetime": 1586980737000,
"masked_author": "username_0",
"text": "@username_1 can you explain what you mean by \"crossing the boundary\"? in the design the preview text of the two articles are very close to each other; there's just a few pixels in between - the same as with the DOIs here.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614256639,
"datetime": 1586981589000,
"masked_author": "username_1",
"text": "@username_0 that makes sense – I think in development it looks visually even closer. I'm trying on Figma exactly what you currently have on the site, and what's on the site is so much closer, which makes a huge difference. I think one way to solve this issue is by aligning this with the design, where the left opening paragraph starts higher on the page. On the site, left and right start exactly on the same y axis.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614266508,
"datetime": 1586982931000,
"masked_author": "username_0",
"text": "ok, I've:\r\n- added tighter margins for desktop\r\n- pushed the second article preview down a bit to create more visual space\r\n- added `doi:` at the start of the doi links",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614275725,
"datetime": 1586984179000,
"masked_author": "username_1",
"text": "@username_0 Thank you! This looks so much better now!\n\nJust this question:\nregarding my earlier comment:\n4. The length of the article previews need to be similar, this will be a problem that we need to fix. – Can we do something about this? \n\n(Also, just a general question that now I'm wondering about, do you know if this is the fault of Figma that there seems to be more space between the DOIs for instance, than there is on the site? – because the font size is correct) \n\nOtherwise I think I'm done reviewing this issue.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 614296120,
"datetime": 1586987128000,
"masked_author": "username_2",
"text": "I agree with Gissoo that the length of the article previews should be similar if possible, but this is more of an authorial and editorial issue than a design issue. The opening lines that have been selected for both articles right now are really good. The fact that the article on the left has more authors makes that entire block similar in size to the article block on the right with a longer preview text. At least that's a consolation, so it doesn't seem (at least to my eye) like they're mismatched now. But it might be something we want to consider for the editorial style guide, recommending a particular word limit for the opening line(s) article preview.\n\nAlso: there’s something too jagged about the article preview shapes — they don’t quite have the clean angle lines we’ve been going for. Can we experiment with word break properties to make the words conform more to the shape we’re hoping for?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614679790,
"datetime": 1587046463000,
"masked_author": "username_0",
"text": "@username_2 how do you feel about hyphenation to aid in keeping the shape? I turned on automatic hyphenation and increased the slope on the text-shapes, which yielded this (on tablet):\r\n\r\n<img width=\"309\" alt=\"Screen Shot 2020-04-16 at 10 13 09 AM\" src=\"https://user-images.githubusercontent.com/4924494/79466692-eaf0ea80-7fca-11ea-9021-22244f35b109.png\">\r\n\r\nI think we might need to dynamically adjust the slope based on the screen size, because this slope becomes too steep for mobile.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 614685722,
"datetime": 1587047044000,
"masked_author": "username_2",
"text": "@username_0 much cleaner lines! Would you mind taking screenshots of a few different options, showing the entire home page? Auto-hyphenation on and off, how both appear on mobile and tablet, etc. It would be good to get a sense of what our options are.\r\n\r\nNot being able to shape the bottom of the lines (i.e. narrowing back down) seems to change the overall feel we were going for (@username_1 may have thoughts on this). The original design was intersecting polygons, now it's abutting triangles. So it would be good to see the range of options we're looking at right now.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614822837,
"datetime": 1587061797000,
"masked_author": "username_0",
"text": "here's some screenshot using different values for the slope of the content on mobile and tablet, with hyphenation turned on:\r\n# mobile\r\n<img src=\"https://user-images.githubusercontent.com/4924494/79492280-c2c6b300-7fed-11ea-8aa5-3f9be422399c.png\" width=\"50%\" />\r\n<img src=\"https://user-images.githubusercontent.com/4924494/79492286-c4907680-7fed-11ea-92ac-63bd11b24e6c.png\" width=\"50%\" />\r\n<img src=\"https://user-images.githubusercontent.com/4924494/79492292-c65a3a00-7fed-11ea-85d4-945279e51a1c.png\" width=\"50%\" />\r\n\r\n# tablet\r\n\r\n\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 614880445,
"datetime": 1587069273000,
"masked_author": "username_2",
"text": "I think I like the second option best for both mobile and tablet you screenshoot above. It's super content-dependent though — if the author or editor changes a single word in these opening lines, the shape of the polygons and the word breaks will look entirely different. But I say let's go with option two for now!\r\n\r\nAre we waiting on anything else for this issue, or should I close?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614882091,
"datetime": 1587069487000,
"masked_author": "username_0",
"text": "@username_1 mentioned that it would be good to set an absolute limit on article preview text length, but so far I haven't been able to figure out how to do it...it looks like you can set `summaryLength` in the hugo settings, but it didn't have any effect for me.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614896643,
"datetime": 1587071252000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614899882,
"datetime": 1587071664000,
"masked_author": "username_1",
"text": "@username_0 thank you so much for providing these variations!!! I really like the first variation on mobile – but I do understand that the second one looks better on both mobile and tablet in this specific case. (v1 might work better in the long run – not sure). I'm not sure how I feel about the hyphenation here yet. (they do happen to be necessary evils in typography :D)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 614900605,
"datetime": 1587071755000,
"masked_author": "username_0",
"text": "just wanted to mention - it's very possible to do the first one on mobile and the second one on tablet. also as you pointed out i think i have the margins wrong on tablet, so these screenshots aren't as applicable as they should be...",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 614902853,
"datetime": 1587072076000,
"masked_author": "username_1",
"text": "that is great news @username_0 Thank you so much!!!!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 638451605,
"datetime": 1591217076000,
"masked_author": "username_0",
"text": "OK, with a little more experimenting and some...creative usage of CSS paths, I was able to create the rounded shape. Here are some screenshots for reference, showing with hyphenation turned off/turned on for mobile and tablet. The blue lines of the shaping polygon are shown for reference. We can control the height of the polygon that shapes the text, but it has to be an absolute value - unfortunately it can't scale to the size of the text itself. Currently I have 300px set, which seems to balance OK between the lengths of the two previews.\n\n## tablet\nhypens ON (left), hyphens OFF (right)\n\n<img src=\"https://images.zenhubusercontent.com/5aa82c4b4b5806bc2bcab4fa/c21d2f9f-6ccc-47ce-896e-1502cdfb999a\" width=\"200\">\n<img src=\"https://images.zenhubusercontent.com/5aa82c4b4b5806bc2bcab4fa/1a6b937e-c7cf-4f51-8bfa-0a427f06f3a8\" width=\"200\">\n\n## mobile\nhypens ON (left), hyphens OFF (right)\n\n<img src=\"https://images.zenhubusercontent.com/5aa82c4b4b5806bc2bcab4fa/4ec97248-ae51-4089-aefb-dd7c22811df2\" width=\"200\">\n<img src=\"https://images.zenhubusercontent.com/5aa82c4b4b5806bc2bcab4fa/0b6776ba-1ef5-4d75-9dab-582b56a759e0\" width=\"200\">",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 638880747,
"datetime": 1591280494000,
"masked_author": "username_1",
"text": "@username_0 Thank you so much for these!!! I love how these look so much more than how they currently are. I love them when there is **no hyphen**, mainly because I'm noticing that in the versions without the hyphens the words are also **not** breaking at the end of a line? Is that true? It's much more easier to read in the version without the hyphen, because an entire word is wrapped to the next line. The shape of the entire paragraph also looks much better visually in the versions without the hyphens. They especially look better on tablet than mobile. But mobile is fine too.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 639001545,
"datetime": 1591292276000,
"masked_author": "username_2",
"text": "This is so cool, brilliant solution @username_0! I also agree with @username_1 that the **hyphens off** version is best. There are just so many broken words on mobile that it messes with the readability. And the shapes look just as good without word breaks.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 639090147,
"datetime": 1591301442000,
"masked_author": "username_0",
"text": "The shaping is still quite restrictive, and some content (including the intro for rebecca's essay) just doesn't naturally fit into the shape at all. Right now, the way I can see to implement this is:\n\n- measure the number of characters that can reasonably fit into the shape via some trials \n- allow writers to choose the intro to their piece using `<!--more-->`, but truncate it if it has too many characters and add an ellipsis\n- set an exact height for both the preview text and its outside shape so that the layout can't move around or break\n\nthis approach means that some (maybe most) previews will end up getting truncated, and it will likely require some trial and error on the part of the editor to find the right preview text. if the preview is too long it will be truncated, and if it's too short it will not fill up the shape and leave a bunch of empty space. however, the layout will be stable across various screens.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 641551747,
"datetime": 1591734551000,
"masked_author": "username_2",
"text": "@username_0 FWIW, I don't think it's a bad idea to have that much of the editor's and author's attention on the opening lines to get them to fit perfectly. It's an interesting feature, a constraint, to have editor and author collaborate on working within that format.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 641587327,
"datetime": 1591737559000,
"masked_author": "username_0",
"text": "@username_2 thanks, this input is helpful. based on you and @rlskoeser's comments I think I can make one or two more revisions and then we can close this out.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 644928397,
"datetime": 1592331415000,
"masked_author": "username_0",
"text": "OK, I've deployed the latest version of this and updated the testing notes at the top. Would like design testing from @username_1 and a final test from @username_2 on fitting the article preview quotes into the new shapes.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 645543511,
"datetime": 1592418321000,
"masked_author": "username_1",
"text": "@username_0 This looks amazing! Thank you for working on it!\n\nThe notes below apply to both the homepage and the single issue page.\n\nOn tablet and mobile: \n- The spacing between the article opening paragraph and the article title does not match the design (should be slightly less) (It looks very close to the design on desktop)\n\nOn all devices:\n- The spacing between the article opening paragraph and the article title should be the same for both articles – currently the spacing is much more on the left side. \n- The spacing between the article title and the first author does not match the designs\n\n\n\nOn mobile: \n- The article title on the left is getting cut off:\n![Screen Shot 2020-06-17 at 2.08.28 PM.png]",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 645556276,
"datetime": 1592419796000,
"masked_author": "username_0",
"text": "I can't get this to happen when emulating an iPhone SE. Not sure how to test a solution...",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 645600383,
"datetime": 1592424889000,
"masked_author": "username_1",
"text": "- yes, me either.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 645660315,
"datetime": 1592433110000,
"masked_author": "username_2",
"text": "This looks fantastic, @username_0 — thank you so much for all your hard work. This is clearly the trickiest page of the entire project! Just two additional thoughts:\r\n\r\nThe \"read more\" link doesn't work after the preview of the issue abstract text\r\n\r\nIs there any way at all we can make the article opening line text shapes accommodate just a few more words? I understand you've gone over this like a hundred times but I've been playing with some edits to the text locally and RM's literally perfect line is just 35 words but the last two words cut off on mobile, reading: `When I was growing up I liked to read about dying children. I’m not talking about the Victorian orphan kind of dying either, not the dying of storybooks, but children who were` I ask because 33 words are pretty restrictive when it comes to writing punchy opening lines. Allowing for 40ish words max would probably be enough. Or even simply finding a way to fit those last two words of RM's opening lines in, and having that limit determine the length of future contributors' opening lines. We could call it the RM rule.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "thatbudakguy",
"comment_id": 646255903,
"datetime": 1592507719000,
"masked_author": "username_0",
"text": "This is really complicated; if there are still problems we might need to zoom. Let me know how it looks to you with the latest changes.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gwijthoff",
"comment_id": 646261116,
"datetime": 1592508354000,
"masked_author": "username_2",
"text": "@username_0 that fully fit RM's opening lines into the homepage preview on mobile for me. Nice work.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 647538180,
"datetime": 1592834418000,
"masked_author": "username_1",
"text": "@username_0 This looks great!! Thank you so much for all of your work!!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gissoo",
"comment_id": 647551837,
"datetime": 1592835824000,
"masked_author": "username_1",
"text": "- @username_0 I think this is happening on my phone because of the \"The article title on the left is getting cut off\" screenshot I had attached earlier, above, _I think_. It's fine on desktop and tablet.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "gissoo",
"comment_id": null,
"datetime": 1592835844000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 36 | 15,483 | false | false | 15,483 | true |
graceshopper-codename/graceshopper | graceshopper-codename | 552,053,362 | 117 | {
"number": 117,
"repo": "graceshopper",
"user_login": "graceshopper-codename"
} | [
{
"action": "opened",
"author": "nurpny",
"comment_id": null,
"datetime": 1579492847000,
"masked_author": "username_0",
"text": "Updated cart post api request to be called via react redux store instead of directly from the react components\r\n\r\n### Assignee Tasks\r\n\r\n* [ ] added unit tests (or none needed)\r\n* [ ] written relevant docs (or none needed)\r\n* [ ] referenced any relevant issues (or none exist)\r\n\r\n### Guidelines\r\n\r\nPlease add a description of this Pull Request's motivation, scope, outstanding issues or potential alternatives, reasoning behind the current solution, and any other relevant information for posterity.\r\n\r\n---\r\n\r\n_Your PR Notes Here_",
"title": "Refactor: #116: ",
"type": "issue"
},
{
"action": "created",
"author": "nurpny",
"comment_id": 576127101,
"datetime": 1579501905000,
"masked_author": "username_0",
"text": "Combining with another update",
"title": null,
"type": "comment"
}
] | 1 | 2 | 558 | false | false | 558 | false |
mdn/browser-compat-data | mdn | 762,571,245 | 7,661 | {
"number": 7661,
"repo": "browser-compat-data",
"user_login": "mdn"
} | [
{
"action": "opened",
"author": "andybons",
"comment_id": null,
"datetime": 1607703812000,
"masked_author": "username_0",
"text": "This change updates the availability of the Storage Access API in Chrome\r\n\r\n+ [Chrome bug](https://bugs.chromium.org/p/chromium/issues/detail?id=989663)\r\n+ [Chrome Platform Status](https://www.chromestatus.com/feature/5612590694662144)\r\n\r\nThe [first commit](https://chromium.googlesource.com/chromium/src.git/+/da12cf8726dc8414f0d61624afebff46b83c8f1a) was in [version 78](https://chromium.googlesource.com/chromium/src/+/da12cf8726dc8414f0d61624afebff46b83c8f1a/chrome/VERSION), not 85, and the flag remains off by default.\r\n\r\nI’m not sure how MS Edge’s versions work and when it was turned on by default, so that version hasn’t been altered.\r\n\r\nFixes #7510",
"title": "Update Storage Access API availability on Chrome",
"type": "issue"
},
{
"action": "created",
"author": "andybons",
"comment_id": 748355339,
"datetime": 1608331522000,
"masked_author": "username_0",
"text": "Sorry for delay, here. Holidays in U.S. are slowing me down 🎄 \r\n\r\nWill follow up with those changes!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "andybons",
"comment_id": 748556467,
"datetime": 1608433777000,
"masked_author": "username_0",
"text": "Done",
"title": null,
"type": "comment"
}
] | 1 | 3 | 762 | false | false | 762 | false |
MicrosoftDocs/iis-docs | MicrosoftDocs | 798,833,242 | 543 | {
"number": 543,
"repo": "iis-docs",
"user_login": "MicrosoftDocs"
} | [
{
"action": "opened",
"author": "gewarren",
"comment_id": null,
"datetime": 1612227724000,
"masked_author": "username_0",
"text": "- Need to find a spot for these in the TOC...please help!\r\n- Okay to have \"unknown\" as API location in metadata?",
"title": "Port articles from previous-versions for APIScan",
"type": "issue"
},
{
"action": "created",
"author": "opbld31",
"comment_id": 771271242,
"datetime": 1612227838000,
"masked_author": "username_1",
"text": "Docs Build status updates of commit _[f038b48](https://github.com/username_0/iis-docs/commits/f038b4850f60dea4eb17867c4281ce42b39a2ee2)_: \n\n### :white_check_mark: Validation status: passed\r\n\r\n\r\nFile | Status | Preview URL | Details\r\n---- | ------ | ----------- | -------\r\n[iis/extensions/entry-point-functions/getextensionversion.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/getextensionversion.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/getextensionversion?branch=pr-en-us-543) | [Details](#user-content-af1e9225-6955aecf)\r\n[iis/extensions/entry-point-functions/httpextensionproc.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/httpextensionproc.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/httpextensionproc?branch=pr-en-us-543) | [Details](#user-content-375b9f63-6955aecf)\r\n[iis/extensions/entry-point-functions/terminateextension.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/terminateextension.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/terminateextension?branch=pr-en-us-543) | [Details](#user-content-999e24f1-6955aecf)\r\n\r\n<a id=\"af1e9225-6955aecf\"></a>\r\n### [iis/extensions/entry-point-functions/getextensionversion.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/getextensionversion.md)\r\n - **Line 2, Column 1**: **[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\n<a id=\"375b9f63-6955aecf\"></a>\r\n### [iis/extensions/entry-point-functions/httpextensionproc.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/httpextensionproc.md)\r\n - **Line 2, Column 1**: 
**[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n - **Line 34, Column 1**: **[Suggestion-table-syntax-invalid]** `````Table syntax is invalid. Ensure your table includes a header and is surrounded by empty lines. NOTE: This Suggestion will become a Warning on 1/29/21.`````\r\n\r\n<a id=\"999e24f1-6955aecf\"></a>\r\n### [iis/extensions/entry-point-functions/terminateextension.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/terminateextension.md)\r\n - **Line 2, Column 1**: **[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\nFor more details, please refer to the [build report](https://opbuildstorageprod.blob.core.windows.net/report/2021%5C2%5C2%5C90f071cf-c002-b629-5e11-5bc968bb67ca%5CPullRequest%5C202102020102087563-543%5Cworkflow_report.html?sv=2016-05-31&sr=b&sig=D9HxQ%2FURQzykb88xuWiqRyizJ8Wljm4lbMO1cpaEFeQ%3D&st=2021-02-02T00%3A58%3A57Z&se=2021-03-05T01%3A03%3A57Z&sp=r).\r\n\r\n**Note:** Broken links written as relative paths are included in the above build report. For broken links written as absolute paths or external URLs, see the [broken link report](https://ops.microsoft.com/#/repos/90f071cf-c002-b629-5e11-5bc968bb67ca?tabName=brokenlinks).\r\n\r\nFor any questions, please:<ul><li>Try searching in the <a href=\"https://review.docs.microsoft.com/en-us/search/?search=&category=All&scope=help-docs&category=All&branch=master\">Docs contributor and Admin Guide</a></li><li>See the <a href=\"https://review.docs.microsoft.com/en-us/help/onboard/faq?branch=master\">frequently asked questions</a></li><li>Post your question in the <a href=\"https://teams.microsoft.com/l/channel/19%3a7ecffca1166a4a3986fed528cf0870ee%40thread.skype/General?groupId=de9ddba4-2574-4830-87ed-41668c07a1ca&tenantId=72f98bf-86f1-41af-91ab-2d7cd011db47\">Docs support channel</a></li></ul>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "opbld31",
"comment_id": 771273146,
"datetime": 1612228169000,
"masked_author": "username_1",
"text": "Docs Build status updates of commit _[8c3f2db](https://github.com/username_0/iis-docs/commits/8c3f2db45464d58a71ae68d863e96a3152b50723)_: \n\n### :white_check_mark: Validation status: passed\r\n\r\n\r\nFile | Status | Preview URL | Details\r\n---- | ------ | ----------- | -------\r\n[iis/extensions/entry-point-functions/getextensionversion.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/getextensionversion.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/getextensionversion?branch=pr-en-us-543) | [Details](#user-content-af1e9225-a5b51bc0)\r\n[iis/extensions/entry-point-functions/httpextensionproc.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/httpextensionproc.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/httpextensionproc?branch=pr-en-us-543) | [Details](#user-content-375b9f63-a5b51bc0)\r\n[iis/extensions/entry-point-functions/terminateextension.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/terminateextension.md) | :bulb:Suggestion | [View](https://review.docs.microsoft.com/en-us/IIS/extensions/entry-point-functions/terminateextension?branch=pr-en-us-543) | [Details](#user-content-999e24f1-a5b51bc0)\r\n\r\n<a id=\"af1e9225-a5b51bc0\"></a>\r\n### [iis/extensions/entry-point-functions/getextensionversion.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/getextensionversion.md)\r\n - **Line 2, Column 1**: **[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\n<a id=\"375b9f63-a5b51bc0\"></a>\r\n### [iis/extensions/entry-point-functions/httpextensionproc.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/httpextensionproc.md)\r\n - **Line 2, Column 1**: 
**[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\n<a id=\"999e24f1-a5b51bc0\"></a>\r\n### [iis/extensions/entry-point-functions/terminateextension.md](https://github.com/username_0/iis-docs/blob/apiscan-0201/iis/extensions/entry-point-functions/terminateextension.md)\r\n - **Line 2, Column 1**: **[Suggestion-description-missing]** `````Missing required attribute: 'description'.`````\r\n\r\nFor more details, please refer to the [build report](https://opbuildstorageprod.blob.core.windows.net/report/2021%5C2%5C2%5C90f071cf-c002-b629-5e11-5bc968bb67ca%5CPullRequest%5C202102020108135682-543%5Cworkflow_report.html?sv=2016-05-31&sr=b&sig=5PLDERG4SLwE0YZxALzx%2F53ey7Jq37zoaTUudGdFKdQ%3D&st=2021-02-02T01%3A04%3A28Z&se=2021-03-05T01%3A09%3A28Z&sp=r).\r\n\r\n**Note:** Broken links written as relative paths are included in the above build report. For broken links written as absolute paths or external URLs, see the [broken link report](https://ops.microsoft.com/#/repos/90f071cf-c002-b629-5e11-5bc968bb67ca?tabName=brokenlinks).\r\n\r\nFor any questions, please:<ul><li>Try searching in the <a href=\"https://review.docs.microsoft.com/en-us/search/?search=&category=All&scope=help-docs&category=All&branch=master\">Docs contributor and Admin Guide</a></li><li>See the <a href=\"https://review.docs.microsoft.com/en-us/help/onboard/faq?branch=master\">frequently asked questions</a></li><li>Post your question in the <a href=\"https://teams.microsoft.com/l/channel/19%3a7ecffca1166a4a3986fed528cf0870ee%40thread.skype/General?groupId=de9ddba4-2574-4830-87ed-41668c07a1ca&tenantId=72f98bf-86f1-41af-91ab-2d7cd011db47\">Docs support channel</a></li></ul>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Rick-Anderson",
"comment_id": 772892171,
"datetime": 1612394099000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "gewarren",
"comment_id": 776899917,
"datetime": 1612979943000,
"masked_author": "username_0",
"text": "Closing this PR as the metadata was added to the previous-versions repo instead. cc @LBynum77.",
"title": null,
"type": "comment"
}
] | 3 | 5 | 7,694 | false | false | 7,694 | true |
XCPCIO/uptime | XCPCIO | 792,567,253 | 19 | null | [
{
"action": "opened",
"author": "Dup4",
"comment_id": null,
"datetime": 1611411020000,
"masked_author": "username_0",
"text": "In [`39bd27d`](https://github.com/XCPCIO/uptime/commit/39bd27d86abce95aa014a4c3082a629f4e55aa93\n), XCPCIO (https://xcpcio.com) was **down**:\n- HTTP code: 0\n- Response time: 0 ms",
"title": "🛑 XCPCIO is down",
"type": "issue"
},
{
"action": "created",
"author": "Dup4",
"comment_id": 766091510,
"datetime": 1611413700000,
"masked_author": "username_0",
"text": "**Resolved:** XCPCIO is back up in [`1a6dcce`](https://github.com/XCPCIO/uptime/commit/1a6dcce833046cf15b1f46dc64949879922d6344\n).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Dup4",
"comment_id": null,
"datetime": 1611413701000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 307 | false | false | 307 | false |
XanaduAI/strawberryfields | XanaduAI | 579,465,044 | 315 | {
"number": 315,
"repo": "strawberryfields",
"user_login": "XanaduAI"
} | [
{
"action": "opened",
"author": "nquesada",
"comment_id": null,
"datetime": 1583951605000,
"masked_author": "username_0",
"text": "Adds missing parenthesis.",
"title": "Update gates.rst",
"type": "issue"
}
] | 2 | 2 | 25 | false | true | 25 | false |
dimagi/commcare-android | dimagi | 883,975,823 | 2,469 | {
"number": 2469,
"repo": "commcare-android",
"user_login": "dimagi"
} | [
{
"action": "opened",
"author": "ShivamPokhriyal",
"comment_id": null,
"datetime": 1620644599000,
"masked_author": "username_0",
"text": "cross-request: https://github.com/dimagi/commcare-core/pull/988",
"title": "Dummy commit for triggering test",
"type": "issue"
},
{
"action": "created",
"author": "ShivamPokhriyal",
"comment_id": 836597729,
"datetime": 1620647248000,
"masked_author": "username_0",
"text": "<!--\n 0 Errors\n 8 Warnings: This PR does not contain any J...\n 0 Messages\n 0 Markdowns\n-->\n<table>\n <thead>\n <tr>\n <th width=\"50\"></th>\n <th width=\"100%\" data-danger-table=\"true\" data-kind=\"Warning\">\n 8 Warnings\n </th>\n </tr>\n </thead>\n <tbody>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\">This PR does not contain any JIRA issue keys in the PR title or commit messages (e.g. KEY-123)</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L13\">app/AndroidManifest.xml#L13</a> - This namespace declaration is redundant</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L16\">app/AndroidManifest.xml#L16</a> - This namespace declaration is redundant</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L19\">app/AndroidManifest.xml#L19</a> - This namespace declaration is redundant</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L81\">app/AndroidManifest.xml#L81</a> - Attribute <code>usesCleartextTraffic</code> is only used in API level 23 and higher (current min is 16)</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L357\">app/AndroidManifest.xml#L357</a> - Exported receiver does not require permission</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a 
href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L364\">app/AndroidManifest.xml#L364</a> - Exported receiver does not require permission</td>\n </tr>\n <tr>\n <td>:warning:</td>\n <td data-sticky=\"false\"><a href=\"https://github.com/dimagi/commcare-android/blob/6aa865dbc1d52ef1241c4df238c229c0d05922f0/app/AndroidManifest.xml#L371\">app/AndroidManifest.xml#L371</a> - Exported receiver does not require permission</td>\n </tr>\n </tbody>\n</table>\n\n<p align=\"right\" data-meta=\"generated_by_danger\">\n Generated by :no_entry_sign: <a href=\"https://danger.systems/\">Danger</a>\n</p>",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dimagimon",
"comment_id": 836615246,
"datetime": 1620648358000,
"masked_author": "username_1",
"text": "[](https://jenkins.dimagi.com/job/commcare-android-pr-job-builder/462390/)",
"title": null,
"type": "comment"
}
] | 2 | 3 | 2,908 | false | false | 2,908 | false |
alibaba/ice | alibaba | 634,360,820 | 3,266 | null | [
{
"action": "opened",
"author": "imsobear",
"comment_id": null,
"datetime": 1591604956000,
"masked_author": "username_0",
"text": "- 优化 `@ice/spec` 规则 https://github.com/ice-lab/spec/issues/12\r\n- 模板里的 eslint 参数改为 true",
"title": "eslint 规则优化",
"type": "issue"
},
{
"action": "created",
"author": "chenbin92",
"comment_id": 644773733,
"datetime": 1592314908000,
"masked_author": "username_1",
"text": "结论:作为框架中台的插件沉淀,包含 rax 和 ice 的两部分。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chenbin92",
"comment_id": 650711467,
"datetime": 1593329666000,
"masked_author": "username_1",
"text": "进度同步:eslint 规则 @chenhebing 基于 @ice/spec 提供完整的 RFC 进行讨论 https://github.com/ice-lab/spec/issues/created_by/chenhebing",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "imsobear",
"comment_id": null,
"datetime": 1604376745000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 234 | false | false | 234 | false |
serverless-nextjs/serverless-next.js | serverless-nextjs | 826,340,660 | 938 | null | [
{
"action": "opened",
"author": "sebastianmacarescu",
"comment_id": null,
"datetime": 1615311212000,
"masked_author": "username_0",
"text": "### Describe the bug\r\n\r\nWe use serverless-nextjs-plugin@2.5.1 with the following routes\r\n`\r\n- src: index\r\n path: /{slug}\r\n cors: true\r\n request:\r\n parameters:\r\n paths:\r\n slug: true\r\n- src: index\r\n path: /{slug}/{page}\r\n cors: true\r\n request:\r\n parameters:\r\n paths:\r\n slug: true\r\n page: true\r\n`\r\n\r\nWhen trying to deploy we get:\r\n`A sibling ({slug}) of this resource already has a variable path part -- only one is allowed (Service: AmazonApiGateway; Status Code: 400; Error Code: BadRequestException; Request ID: fecdce7a-751a-4c27-b16d-0ebc521b563a; Proxy: null)`\r\n\r\nThis happens because the next.js creates a custom page _error.js to handle all errors and the serverless-nextjs-plugin will create a catch-all proxy resource.\r\n\r\n### Actual behavior\r\n\r\nGot error: `A sibling ({slug}) of this resource already has a variable path part -- only one is allowed (Service: AmazonApiGateway; Status Code: 400; Error Code: BadRequestException; Request ID: fecdce7a-751a-4c27-b16d-0ebc521b563a; Proxy: null)`\r\n\r\nIf I deploy without above routes, manually delete the proxy resources from AWS console then deploy again it works.\r\n\r\n### Expected behavior\r\n\r\nWe should have an option to not create that catch all proxy resource for the 404 pages.\r\n\r\n\r\n### Steps to reproduce\r\n\r\nCreate an empty serverless project using serverless-nextjs-plugin@2.5.1. \r\nConfigure the above routes and try to deploy\r\n\r\n### Screenshots/Code/Logs\r\n<!-- If applicable, add screenshots or a minimal repro (e.g code or configuration snippet or repository) to help explain your problem. If you have a runtime issue from Lambda/CloudFront, please check CloudWatch logs (note that Lambda@Edge logs are in a region closest to where you access CloudFront - NOT necessarily in `us-east-1` where the original Lambda is created) and post any logs or stacktraces if possible. 
See here for how to check logs: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/lambda-edge-testing-debugging.html#lambda-edge-identifying-function-errors. If you have a build or deploy issue, please run with serverless --debug and post the logs. Please also post your serverless.yml. -->\r\n\r\n### Versions\r\nserverless-nextjs-plugin@2.5.1\r\nserverless -v\r\nFramework Core: 1.83.2\r\nPlugin: 3.8.4\r\nSDK: 2.3.2\r\nComponents: 2.34.9",
"title": "serverless-nextjs-plugin ApiGatewayResourceProxyVar resource already has a variable path part",
"type": "issue"
}
] | 1 | 1 | 2,392 | false | false | 2,392 | false |
antvis/G2Plot | antvis | 680,762,408 | 1,457 | {
"number": 1457,
"repo": "G2Plot",
"user_login": "antvis"
} | [
{
"action": "opened",
"author": "Me-Momo",
"comment_id": null,
"datetime": 1597735532000,
"masked_author": "username_0",
"text": "- [x] register default theme",
"title": "feat: add theme config",
"type": "issue"
}
] | 2 | 2 | 349 | false | true | 28 | false |
ohmyzsh/ohmyzsh | ohmyzsh | 799,371,764 | 9,631 | null | [
{
"action": "opened",
"author": "rustiever",
"comment_id": null,
"datetime": 1612278883000,
"masked_author": "username_0",
"text": "<!--\r\nFill this out before posting. You can delete irrelevant sections, but\r\nan issue where no sections have been filled will be deleted without comment.\r\n-->\r\n\r\n**Describe the bug**\r\nWhen I issue `vi .../` I can't do tab-completion\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior, for example:\r\n1. type this `vi .../` \r\n2. press <tab> key\r\n3. there will be visual bell (in my case)\r\n\r\n**Expected behavior**\r\nIt should give the tab-completion\r\n\r\n\r\n\r\n\r\n - OS / Distro: [macOS]\r\n - Latest ohmyzsh update?: [Yes]\r\n - ZSH Version: [e.g. 5.8]\r\n - Terminal emulator: [Alacritty]",
"title": "relative path doesn't get tab-completion",
"type": "issue"
},
{
"action": "created",
"author": "mcornella",
"comment_id": 772051128,
"datetime": 1612304762000,
"masked_author": "username_1",
"text": "You can use the globalias plugin to automatically expand the dots to their value. After they are expanded to `../..` the path completion will work correctly.\r\n\r\nThere's also [this plugin](https://github.com/knu/zsh-manydots-magic/blob/master/manydots-magic) that does the same thing only for dots, and my [less tested alternative](https://github.com/username_1/dotfiles/blob/main/ohmyzsh-custom/magic-dot.zsh) based off [this SO answer](https://stackoverflow.com/questions/23456873/multi-dot-paths-in-zsh-like-cd/23471915#23471915).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mcornella",
"comment_id": null,
"datetime": 1612304762000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "rustiever",
"comment_id": 772304225,
"datetime": 1612338029000,
"masked_author": "username_0",
"text": "Thanks, but I couldn't set the plugin(many dots-magic) can you please help me",
"title": null,
"type": "comment"
}
] | 2 | 4 | 1,185 | false | false | 1,185 | true |
Nienormalny/twitch-enhancer | null | 708,490,019 | 14 | null | [
{
"action": "opened",
"author": "Nienormalny",
"comment_id": null,
"datetime": 1600983377000,
"masked_author": "username_0",
"text": "## Step 1:\r\n\r\nAdd a image file to the directory: `src -> assets -> images -> emotes`.\r\nIt can be **.gif** or **.png**. (NOT JPG!)\r\n\r\n## Step 2:\r\n\r\nAdd new **case** to the col inside: `src -> utils -> emotes.js`",
"title": "Add FeelsOkayMan icon for live chat",
"type": "issue"
},
{
"action": "closed",
"author": "Nienormalny",
"comment_id": null,
"datetime": 1601568229000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 210 | false | false | 210 | false |
openmainframeproject/cobol-programming-course | openmainframeproject | 637,134,564 | 138 | {
"number": 138,
"repo": "cobol-programming-course",
"user_login": "openmainframeproject"
} | [
{
"action": "opened",
"author": "jellypuno",
"comment_id": null,
"datetime": 1591890688000,
"masked_author": "username_0",
"text": "",
"title": "Adding Source Modules for COBOL Challenges: Debugging",
"type": "issue"
},
{
"action": "created",
"author": "zeibura",
"comment_id": 642765739,
"datetime": 1591890982000,
"masked_author": "username_1",
"text": "Looks fine, have created #139 for the instructions and pics.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zeibura",
"comment_id": 642769633,
"datetime": 1591891149000,
"masked_author": "username_1",
"text": "@username_0 what's the difference between the two programs, CBL0106 and CBL0106C? The instructions I provided only refer to the first one.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jellypuno",
"comment_id": 642774067,
"datetime": 1591891390000,
"masked_author": "username_0",
"text": "@username_1 CBL0106 is the program that needs debugging. CBL0106C is the fixed version in case the students are stuck and they need help.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MikeBauerCA",
"comment_id": 643463948,
"datetime": 1591992759000,
"masked_author": "username_2",
"text": "I requested a small change on #139 but see no harm in merging this PR first.",
"title": null,
"type": "comment"
}
] | 3 | 5 | 407 | false | false | 407 | true |
async-rs/async-std | async-rs | 822,927,832 | 963 | null | [
{
"action": "opened",
"author": "svmk",
"comment_id": null,
"datetime": 1614937879000,
"masked_author": "username_0",
"text": "Hello! To reproduce this bug you may use this project: \r\nhttps://github.com/username_0/test-async-std-bug\r\n\r\nDownload it and exeucte file `run-test.sh`\r\n\r\nSteps to reproduce:\r\n1. Download remote url `https://www.sec.gov/Archives/edgar/full-index/1994/QTR1/company.idx` by using `reqwest` library\r\n2. Save it to file while downloading by using `async_std::fs::File`\r\n3. Download remote url `https://www.sec.gov/Archives/edgar/full-index/1994/QTR1/company.idx` using `wget`\r\n4. Compare files. It's differ!\r\n\r\nExpected result:\r\nFiles must be equal.\r\n\r\nCode to reproduce this bug:\r\n```rust\r\nuse futures::stream::StreamExt;\r\nuse async_std::fs::File;\r\nuse futures::AsyncWriteExt;\r\nuse std::{fs::File as SyncFile, io::Write};\r\n\r\n#[tokio::main]\r\nasync fn main() -> Result<(), Box<dyn std::error::Error>> {\r\n let url = \"https://www.sec.gov/Archives/edgar/full-index/1994/QTR1/company.idx\";\r\n let client = reqwest::Client::new();\r\n let response = client.get(url).send().await?;\r\n let mut response_body = response.bytes_stream();\r\n let mut async_file = File::create(\"company.async.idx\").await?;\r\n let mut sync_file = SyncFile::create(\"company.sync.idx\")?;\r\n while let Some(buffer) = response_body.next().await {\r\n let buffer = buffer?;\r\n let _ = async_file.write(&buffer).await?;\r\n async_file.sync_all().await?;\r\n sync_file.write(&buffer)?;\r\n }\r\n return Ok(());\r\n}\r\n```",
"title": "File was corrupted by async_std::fs::File",
"type": "issue"
},
{
"action": "created",
"author": "jbr",
"comment_id": 791784610,
"datetime": 1614986450000,
"masked_author": "username_1",
"text": "I believe the issue here is the same as https://github.com/async-rs/async-std/issues/939#issuecomment-762092739 — you're doing `async_file.write(&buffer).await` ([Write::write](https://docs.rs/async-std/1.9.0/async_std/io/trait.Write.html#method.write)) instead of `async_file.write_all(&buffer)`([Write::write_all](https://docs.rs/async-std/1.9.0/async_std/io/trait.Write.html#method.write_all) — write doesn't necessarily write all of the contents. If you log the return value from write and compare it to the buffer.len(), there will be occasional discrepancies",
"title": null,
"type": "comment"
}
] | 2 | 2 | 1,971 | false | false | 1,971 | true |
solo-io/service-mesh-hub | solo-io | 722,471,964 | 1,018 | null | [
{
"action": "opened",
"author": "ilackarms",
"comment_id": null,
"datetime": 1602778100000,
"masked_author": "username_0",
"text": "Currently SMH is opinionated in how it derives mesh configuration (via translation) and provides no extensibility layer. It is currently impossible to extend SMH without modifying SMH code directly.\r\n\r\n**Describe the solution you'd like**\r\n It would be preferable for SMH users to have the ability to extend the core SMH implementation with hooks to modify the generated output resources produced by smh (i.e. istio CRDs).\r\n\r\nI propose a solution whereby SMH makes GRPC Client requests to a user-defined **Extensions** server, which has the ability to patch and append resources to the SMH output snapshot before it is written to storage. The extension server should also have the ability to notify SMH when a resync is desired (i.e. when a third-party resource changes).\r\n\r\n**Additional context**\r\nThis has relevance both to 3rd party extensions as well as Solo.io's own Enterprise extensions for SMH.",
"title": "SMH Extensions",
"type": "issue"
}
] | 2 | 2 | 902 | false | true | 902 | false |
netease-im/NIM_iOS_UIKit | netease-im | 769,459,490 | 301 | null | [
{
"action": "opened",
"author": "396514268",
"comment_id": null,
"datetime": 1608174877000,
"masked_author": "username_0",
"text": "### 问题检查清单\r\n\r\n感谢您向我们提出问题,在提交问题之前,请确认以下事宜:\r\n\r\n- [ ] 我已经阅读了组件文档: \r\n\r\n [《项目结构介绍》](./Documents/nim_arch.md)\r\n\r\n [《界面排版自定义》](./Documents/nim_custom_ui.md)\r\n\r\n [《新消息类型集成》](./Documents/nim_custom_message.md)\r\n\r\n [《用户信息自定义》](./Documents/nim_userinfo.md)\r\n\r\n [《机器人消息排版指南》](./Documents/nim_robot.md)\r\n\r\n\r\n- [ ] 我已经阅读了[常见问题](http://dev.netease.im/docs/product/%E9%80%9A%E7%94%A8/%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98?#iOS%E7%89%88SDK), 但是找不到相关解决方案。\r\n\r\n- [ ] 我已经阅读了[SDK 文档](http://dev.netease.im/docs/product/IM%E5%8D%B3%E6%97%B6%E9%80%9A%E8%AE%AF/SDK%E5%BC%80%E5%8F%91%E9%9B%86%E6%88%90/iOS%E5%BC%80%E5%8F%91%E9%9B%86%E6%88%90/%E6%A6%82%E8%A6%81%E4%BB%8B%E7%BB%8D), 但是找不到相关解决方案。\r\n \r\n- [ ] 我已经搜索了[当前问题](https://github.com/netease-im/NIM_iOS_UIKit/issues?utf8=✓&q=is%3Aissue), 但是找不到相关解决方案。\r\n\r\n\r\n### 问题描述\r\n\r\n#### 问题现象\r\n\r\n[描述具体问题表现]\r\n\r\n#### 是否重现\r\n\r\n[重现此问题的步骤]\r\n\r\n#### 其它备注\r\n\r\n[在这里添加任何其他相关细节]\r\n初始化网易云信会弹出允许使用本地网络的弹窗,注释掉就不会弹出,请问有同遇到这个问题的吗",
"title": "初始化网易云信会弹出允许使用本地网络的弹窗",
"type": "issue"
},
{
"action": "closed",
"author": "396514268",
"comment_id": null,
"datetime": 1608796917000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 944 | false | false | 944 | false |
Public-Health-Bioinformatics/DataHarmonizer | Public-Health-Bioinformatics | 643,287,942 | 76 | null | [
{
"action": "opened",
"author": "cmrn-rhi",
"comment_id": null,
"datetime": 1592851677000,
"masked_author": "username_0",
"text": "This may have been discussed in the past (my apologies if I touch on things that have already been addressed) but can he include a 'help'/'support' section that directs the users to:\r\n- The SOP\r\n- Where to make issue/term requests (either linking to GitHub Issues or to a curator email)\r\n\r\nThere is now a [published version of the SOP](https://docs.google.com/document/d/e/2PACX-1vR4UkqrLaj1-9jxmrNk9mZ4S4Siim8onPHqgdXKd9m1lOroXmekClfPsXlqgFDio1rWZW7lHArSAbOg/pub) that we can link too (it will update every time we edit the SOP gdoc), but perhaps we should also have a pdf copy that users can use if they are offline? That latter would certainly require more maintenance - we could include a note at the top that it may not be the latest version and to go to the published version if possible.",
"title": "'Help'/'Support' Section in Validator",
"type": "issue"
},
{
"action": "created",
"author": "cmrn-rhi",
"comment_id": 647733653,
"datetime": 1592855047000,
"masked_author": "username_0",
"text": "I see that there is a pdf version of the SOP - so ignore that recommendation (although perhaps we should still add the published link somewhere).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ivansg44",
"comment_id": 647795897,
"datetime": 1592864079000,
"masked_author": "username_1",
"text": "Another thing to add to this menu is a link to the reference guide.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cmrn-rhi",
"comment_id": 648986165,
"datetime": 1593022773000,
"masked_author": "username_0",
"text": "@username_1 Is there a separate reference guide outside of the SOP and in-validator field guidance?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ivansg44",
"comment_id": 648988452,
"datetime": 1593023036000,
"masked_author": "username_1",
"text": "[reference.html](https://github.com/Public-Health-Bioinformatics/DataHarmonizer/blob/master/reference.html) in the root directory courtesy of @username_2",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ddooley",
"comment_id": 649026780,
"datetime": 1593027571000,
"masked_author": "username_2",
"text": "Yes, the reference.html is automatically generated version of the label / definition / guidance and examples fields of the vocabulary. It differs from SOP but could be linked from there too.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ivansg44",
"comment_id": null,
"datetime": 1594052698000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 7 | 1,444 | false | false | 1,444 | true |
google/site-kit-wp | google | 779,409,613 | 2,581 | {
"number": 2581,
"repo": "site-kit-wp",
"user_login": "google"
} | [
{
"action": "opened",
"author": "felixarntz",
"comment_id": null,
"datetime": 1609871960000,
"masked_author": "username_0",
"text": "## Summary\r\n\r\n<!-- Please reference the issue this PR addresses. -->\r\nAddresses issue #2578\r\n\r\n## Relevant technical choices\r\n\r\n<!-- Please describe your changes. -->\r\n\r\n## Checklist\r\n\r\n- [x] My code is tested and passes existing unit tests.\r\n- [ ] My code has an appropriate set of unit tests which all pass.\r\n- [x] My code is backward-compatible with WordPress 4.7 and PHP 5.6.\r\n- [x] My code follows the [WordPress](https://make.wordpress.org/core/handbook/best-practices/coding-standards/) coding standards.\r\n- [x] My code has proper inline documentation.\r\n- [x] I have added a QA Brief on the issue linked above.\r\n- [x] I have signed the Contributor License Agreement (see <https://cla.developers.google.com/>).",
"title": "Make local PHPUnit setup standalone",
"type": "issue"
},
{
"action": "created",
"author": "aaemnnosttv",
"comment_id": 799654007,
"datetime": 1615833244000,
"masked_author": "username_1",
"text": "@googlebot I consent.",
"title": null,
"type": "comment"
}
] | 3 | 3 | 1,579 | false | true | 737 | false |
HephaestusProject/pytorch-ReCoSa | HephaestusProject | 723,674,509 | 20 | {
"number": 20,
"repo": "pytorch-ReCoSa",
"user_login": "HephaestusProject"
} | [
{
"action": "opened",
"author": "soeque1",
"comment_id": null,
"datetime": 1602912592000,
"masked_author": "username_0",
"text": "# Pull Request\r\n레파지토리에 기여해주셔서 감사드립니다.\r\n\r\n해당 PR을 제출하기 전에 아래 사항이 완료되었는지 확인 부탁드립니다:\r\n- [x] 작성한 코드가 어떤 에러나 경고없이 빌드가 되었나요?\r\n- [x] 충분한 테스트를 수행하셨나요?\r\n\r\n## 1. 해당 PR은 어떤 내용인가요?\r\n<!-- 해당 PR이 어떠한 내용인지 상세하게 명시 부탁드리겠습니다. 상세한 명시는 1). 문제정의, 2). 해결방법, 3). 해당 PR로 인해 발생할 수 있는 예상문제와 같은 형태로 작성하시면 됩니다.-->\r\n\r\nLR scheduler Update and Model v0.2.3\r\n\r\n## 2. PR과 관련된 이슈가 있나요?\r\n<!-- PR이 참고하고 있는 이슈가 있다면 관련 자료를 태깅해주세요. 만약에 이슈가 같은 레파지토리의 이슈라면 이슈번호를 태그해주시고, 외부자료라면 URL로 표기해주세요.-->",
"title": "Bugfix/#19 lr scheduler",
"type": "issue"
},
{
"action": "created",
"author": "soeque1",
"comment_id": 710756362,
"datetime": 1602913960000,
"masked_author": "username_0",
"text": "학습 로그를 봤는데 lr-scheduler가 step 단위로 적용되어야하는데, epochs 단위로 적용되었습니다 ㅠ",
"title": null,
"type": "comment"
}
] | 1 | 2 | 516 | false | false | 516 | false |
microsoft/vscode-hexeditor | microsoft | 831,004,467 | 223 | null | [
{
"action": "opened",
"author": "Tyriar",
"comment_id": null,
"datetime": 1615672199000,
"masked_author": "username_0",
"text": "Reasons:\r\n\r\n- It hides my explorer or whatever side bar I have displayed at the time, this is not only distracting but I like it to be there to give me context on where I am within the project\r\n- There is _so much_ room for this on the right of the editor\r\n- There is also a lot of wasted space on the bottom of the data inspector\r\n- I can't seem to be able to bring it into my explorer which might be alright, provided I doesn't auto show the explorer when I focus the editor\r\n\r\n",
"title": "I really don't like the data inspector side bar",
"type": "issue"
},
{
"action": "created",
"author": "lramos15",
"comment_id": 799439911,
"datetime": 1615816579000,
"masked_author": "username_1",
"text": "As you remember, it was on the right side of the editor before and moved into the side panel to act as an information panel of sorts as it felt weird living next to editable content. I assume we could eventually add more stuff to the data panel, but I'm not sure what. We can discuss this at the UI/UX sync, but I'm not sure what a good place to put it is unless we throw it under a setting and let the user decide?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Tyriar",
"comment_id": 799614510,
"datetime": 1615830114000,
"masked_author": "username_0",
"text": "I never had an issue with it on the right side, I know others brought it up though. I'd prefer the editor to be more self contained and not mess with how I do things with the side bar.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Tyriar",
"comment_id": 801291321,
"datetime": 1616003830000,
"masked_author": "username_0",
"text": "After doing https://github.com/microsoft/vscode-hexeditor/pull/252 I figured out how to pull it into another one, I think I just didn't hold the mouse over an activity icon for long enough.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "lramos15",
"comment_id": null,
"datetime": 1616006113000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 5 | 1,378 | false | false | 1,378 | false |
ArrobeFr/jquery-calendar-bs4 | ArrobeFr | 788,448,881 | 17 | null | [
{
"action": "opened",
"author": "DLX23",
"comment_id": null,
"datetime": 1610991839000,
"masked_author": "username_0",
"text": "Hi,\r\n\r\ni got following code:\r\n```\r\nvar calendar = $('#calendar').Calendar({\r\n locale: 'de',\r\n weekday: {\r\n timeline: {\r\n fromHour: 9,\r\n toHour: 20,\r\n intervalMinutes: 30\r\n }\r\n },\r\n events: events\r\n }).init();\r\n```\r\n\r\nbut time starts at 0:00 not 9:00",
"title": "fromHour Parameter don't work",
"type": "issue"
}
] | 1 | 1 | 333 | false | false | 333 | false |
Homebrew/homebrew-core | Homebrew | 771,663,635 | 67,315 | {
"number": 67315,
"repo": "homebrew-core",
"user_login": "Homebrew"
} | [
{
"action": "opened",
"author": "carlocab",
"comment_id": null,
"datetime": 1608490098000,
"masked_author": "username_0",
"text": "Created with `brew bump-formula-pr`.",
"title": "nickle 2.90",
"type": "issue"
},
{
"action": "created",
"author": "carlocab",
"comment_id": 748646066,
"datetime": 1608490309000,
"masked_author": "username_0",
"text": "License:\r\n```\r\nCopyright © 1988-2004 Keith Packard and Bart Massey. All Rights Reserved.\r\n\r\nPermission is hereby granted, free of charge, to any person\r\nobtaining a copy of this software and associated\r\ndocumentation files (the \"Software\"), to deal in the\r\nSoftware without restriction, including without limitation\r\nthe rights to use, copy, modify, merge, publish, distribute,\r\nsublicense, and/or sell copies of the Software, and to\r\npermit persons to whom the Software is furnished to do so,\r\nsubject to the following conditions:\r\n\r\nThe above copyright notice and this permission notice shall\r\nbe included in all copies or substantial portions of the\r\nSoftware.\r\n\r\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY\r\nKIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE\r\nWARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR\r\nPURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS\r\nBE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER\r\nIN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\r\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR\r\nOTHER DEALINGS IN THE SOFTWARE.\r\n\r\nExcept as contained in this notice, the names of the authors\r\nor their institutions shall not be used in advertising or\r\notherwise to promote the sale, use or other dealings in this\r\nSoftware without prior written authorization from the\r\nauthors.\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "carlocab",
"comment_id": 748646453,
"datetime": 1608490557000,
"masked_author": "username_0",
"text": "Looks like a `BSD-1-Clause`?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Rylan12",
"comment_id": 748651884,
"datetime": 1608493299000,
"masked_author": "username_1",
"text": "According to https://keithp.com/cgit/nickle.git/tree/debian/copyright and https://metadata.ftp-master.debian.org/changelogs//main/n/nickle/nickle_2.90_copyright:\r\n\r\n```ruby\r\nlicense \"MIT\"\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "carlocab",
"comment_id": 748659690,
"datetime": 1608494226000,
"masked_author": "username_0",
"text": "`MIT` I think is less restrictive than the license that actually appears in the `COPYING` file, so I'm reluctant to add it. I'll wait a bit to see if the author replies to my email.",
"title": null,
"type": "comment"
}
] | 3 | 6 | 1,920 | false | true | 1,805 | false |
oltpbenchmark/oltpbench | oltpbenchmark | 759,040,586 | 349 | null | [
{
"action": "opened",
"author": "malingaperera",
"comment_id": null,
"datetime": 1607399973000,
"masked_author": "username_0",
"text": "In the TATP documentation, I saw TATP uses a non-uniform data distribution by default. Is that true for OLTP bench? Is there a parameter to switch between uniform or non-uniform data distribution? Is there a way to change the skewness of data? Thanks.",
"title": "Does TATP benchmark uses a non-uniform data distribution?",
"type": "issue"
},
{
"action": "created",
"author": "apavlo",
"comment_id": 781554021,
"datetime": 1613673536000,
"masked_author": "username_1",
"text": "There is no way to switch this easily.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "malingaperera",
"comment_id": 781653563,
"datetime": 1613684566000,
"masked_author": "username_0",
"text": "Thanks.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "apavlo",
"comment_id": null,
"datetime": 1618012495000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 296 | false | false | 296 | false |
xorum-io/codeforces_watcher | xorum-io | 815,411,812 | 262 | null | [
{
"action": "opened",
"author": "MatyashDen",
"comment_id": null,
"datetime": 1614168034000,
"masked_author": "username_0",
"text": "",
"title": "Add missing localizable strings",
"type": "issue"
},
{
"action": "closed",
"author": "yev-kanivets",
"comment_id": null,
"datetime": 1617292432000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 0 | false | false | 0 | false |
microsoft/vcpkg | microsoft | 608,044,520 | 11,065 | {
"number": 11065,
"repo": "vcpkg",
"user_login": "microsoft"
} | [
{
"action": "opened",
"author": "NancyLi1013",
"comment_id": null,
"datetime": 1588053103000,
"masked_author": "username_0",
"text": "llibxml2 failed on osx pipeline in the PR #10770 \r\n\r\n```\r\nUndefined symbols for architecture x86_64:\r\n \"_iconv\", referenced from:\r\n _xmlIconvWrapper in libxml2.a(encoding.c.o)\r\n \"_iconv_close\", referenced from:\r\n _xmlFindCharEncodingHandler in libxml2.a(encoding.c.o)\r\n _xmlCharEncCloseFunc in libxml2.a(encoding.c.o)\r\n \"_iconv_open\", referenced from:\r\n _xmlFindCharEncodingHandler in libxml2.a(encoding.c.o)\r\nld: symbol(s) not found for architecture x86_64\r\nclang: error: linker command failed with exit code 1 (use -v to see invocation)\r\n```\r\nI checked the CMakeLists.txt and found that iconv has been linked. And I also cannot reproduce this on my local. \r\n\r\nRelated issue #11001.",
"title": "[test] Rerun libxml2",
"type": "issue"
},
{
"action": "created",
"author": "Neumann-A",
"comment_id": 620409842,
"datetime": 1588055677000,
"masked_author": "username_1",
"text": "Not libxml2 is failing. The port cmake is failing as a depent of libxml2 since it does not know that libxml2 has been linked with iconv. The vcpkg-cmake-wrapper needs to be adjusted to reflect that libxml2 has been build with iconv. This is already done for Windows but not for osx. Just take a look into libxml2 vcpkg-cmake-wrapper and you should directly see it",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "NancyLi1013",
"comment_id": 620421995,
"datetime": 1588057528000,
"masked_author": "username_0",
"text": "Thanks for your update. \r\nI understood it now. It is about the usage of libxml2. Will add the dependency later.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "NancyLi1013",
"comment_id": 620489206,
"datetime": 1588065971000,
"masked_author": "username_0",
"text": "After further investigation about this, it seems that this is not a bug for libxml2. So I close this PR now.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 1,288 | false | false | 1,288 | false |
ligato/vpp-probe | ligato | 801,436,021 | 10 | {
"number": 10,
"repo": "vpp-probe",
"user_login": "ligato"
} | [
{
"action": "opened",
"author": "davetejas",
"comment_id": null,
"datetime": 1612455703000,
"masked_author": "username_0",
"text": "",
"title": "Topo display",
"type": "issue"
},
{
"action": "created",
"author": "ondrej-fabry",
"comment_id": 816128812,
"datetime": 1617911967000,
"masked_author": "username_1",
"text": "Superseded by #24",
"title": null,
"type": "comment"
}
] | 2 | 2 | 17 | false | false | 17 | false |
microsoft/botbuilder-js | microsoft | 746,170,702 | 3,077 | {
"number": 3077,
"repo": "botbuilder-js",
"user_login": "microsoft"
} | [
{
"action": "opened",
"author": "Danieladu",
"comment_id": null,
"datetime": 1605749531000,
"masked_author": "username_0",
"text": "Fixes #3076",
"title": "Fix CRUD issue raised by Composer",
"type": "issue"
}
] | 2 | 2 | 962 | false | true | 11 | false |
google-research/text-to-text-transfer-transformer | google-research | 696,908,835 | 393 | null | [
{
"action": "opened",
"author": "chrisrytting",
"comment_id": null,
"datetime": 1599664949000,
"masked_author": "username_0",
"text": "I'm trying to train the 3B parameter model from scratch. When I try to do so by executing the following command\r\n\r\n```\r\nexport CUDA_VISIBLE_DEVICES=3\r\nexport MODEL_DIR=models-t5/3Bscratch\r\nexport DATA_DIR=data/c/tsvs\r\nt5_mesh_transformer --model_dir=\"${MODEL_DIR}\" --t5_tfds_data_dir=\"${DATA_DIR}\" \\\r\n--gin_file=\"dataset.gin\" --gin_param=\"utils.run.mesh_shape = 'model:1,batch:1'\" \\\r\n--gin_param=\"utils.run.mesh_devices = ['gpu:0']\" --gin_file=\"models/bi_v1_3B.gin\" \\\r\n--gin_param=\"MIXTURE_NAME = 'atr'\" --module_import='t5_task_def' \r\n```\r\n\r\nI get an OOM error\r\n\r\n```\r\n (0) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,512] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n [[node encoder/block_018/layer_000/SelfAttention/einsum_8/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n [[scalar_add/parallel_0/add/_5677]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n (1) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,512] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n [[node encoder/block_018/layer_000/SelfAttention/einsum_8/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n```\r\n\r\nI add `--gin_param=\"tokens_per_batch=1\"` to the above command, thinking I need to reduce my batch size to fit on the GPU, (which in the first place this error is weird because I'm using a 
Nvidia Tesla V100-SXM3-32GB), and I still get the OOM error. Is there something wrong with the command I'm running?",
"title": "OOM error when training even when batch",
"type": "issue"
},
{
"action": "created",
"author": "craffel",
"comment_id": 689641428,
"datetime": 1599665620000,
"masked_author": "username_1",
"text": "I'm not sure the 3B model will fit in memory on a single GPU (even a v100), though it is possible to train the 3B model on a v2-8 TPU which has 64 GB of RAM. I'm also not sure `tokens_per_bath=1` will work if your input sequences have length 512 (which it appears they do based on the error you're seeing). You could try `tokens_per_bath=512` which would correspond to a single sequence in each batch, and/or you could try decreasing the input/output sequence lengths depending on the sequence lengths of the task you are trying.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "craffel",
"comment_id": null,
"datetime": 1599665621000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689644398,
"datetime": 1599665908000,
"masked_author": "username_0",
"text": "@username_1 thanks for the quick response. However, I have fine-tuned without a problem on these GPUs. Does training from scratch require more memory for some reason? I understand the pre-training is different than fine-tuning on specific tasks, but I basically want to fine-tune without any pre-training.\r\n\r\nAlso, if I'm getting OOM error with `tokens_per_batch=1`, there's no way I could go bigger, right?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "craffel",
"comment_id": 689646613,
"datetime": 1599666122000,
"masked_author": "username_1",
"text": "If your input sequence has 512 tokens and you set tokens_per_batch=1, I'm not sure what the code will do. You should set tokens_per_batch to be an integer multiple of your input sequence length, so the smallest you could use is 512.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "adarob",
"comment_id": 689648475,
"datetime": 1599666313000,
"masked_author": "username_2",
"text": "Can you try installing mesh-tf and t5 both from master on github?\r\n\r\nI think this will be fixed by the combination of:\r\nhttps://github.com/tensorflow/mesh/pull/175\r\nhttps://github.com/google-research/text-to-text-transfer-transformer/pull/386\r\n\r\nIf we can confirm that solves your problem, I will update the packages.",
"title": null,
"type": "comment"
},
{
"action": "reopened",
"author": "adarob",
"comment_id": null,
"datetime": 1599666314000,
"masked_author": "username_2",
"text": "I'm trying to train the 3B parameter model from scratch. When I try to do so by executing the following command\r\n\r\n```\r\nexport CUDA_VISIBLE_DEVICES=3\r\nexport MODEL_DIR=models-t5/3Bscratch\r\nexport DATA_DIR=data/c/tsvs\r\nt5_mesh_transformer --model_dir=\"${MODEL_DIR}\" --t5_tfds_data_dir=\"${DATA_DIR}\" \\\r\n--gin_file=\"dataset.gin\" --gin_param=\"utils.run.mesh_shape = 'model:1,batch:1'\" \\\r\n--gin_param=\"utils.run.mesh_devices = ['gpu:0']\" --gin_file=\"models/bi_v1_3B.gin\" \\\r\n--gin_param=\"MIXTURE_NAME = 'atr'\" --module_import='t5_task_def' \r\n```\r\n\r\nI get an OOM error\r\n\r\n```\r\n (0) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,512] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n [[node encoder/block_018/layer_000/SelfAttention/einsum_8/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n [[scalar_add/parallel_0/add/_5677]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n (1) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,512] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n [[node encoder/block_018/layer_000/SelfAttention/einsum_8/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n```\r\n\r\nI add `--gin_param=\"tokens_per_batch=1\"` to the above command, thinking I need to reduce my batch size to fit on the GPU, (which in the first place this error is weird because I'm using a 
Nvidia Tesla V100-SXM3-32GB), and I still get the OOM error. Is there something wrong with the command I'm running?",
"title": "OOM error when training even when batch",
"type": "issue"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689657943,
"datetime": 1599667220000,
"masked_author": "username_0",
"text": "So I'm working out of a docker container, and I hope that doesn't make things too tricky. But basically before I had just pip installed t5 and so I think the place it had the packages `t5` and `mesh_tensorflow` was in `/usr/local/lib/python3.6/dist-packages`. So I just renamed the ones installed by pip to `t5package` and `mesh_tensorflowpackage` and then cloned https://github.com/tensorflow/mesh into `mesh_tensorflow` and https://github.com/google-research/text-to-text-transfer-transformer into `t5` to replace the pip-installed packages with the master branches from github.\r\n\r\nBut then when I run the same command, I get this error\r\n\r\n```\r\n File \"/usr/local/bin/t5_mesh_transformer\", line 5, in <module>\r\n from t5.models.mesh_transformer_main import console_entry_point\r\nModuleNotFoundError: No module named 't5.models'\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "adarob",
"comment_id": 689660549,
"datetime": 1599667466000,
"masked_author": "username_2",
"text": "You should be able to do `pip install git+\nhttps://github.com/tensorflow/mesh`, etc.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689671068,
"datetime": 1599668528000,
"masked_author": "username_0",
"text": "Sorry I deleted that last comment I thought there was a detail wrong and panicked but there wasn't. So I pip uninstalled t5 and mesh_tensorflow at that point, and pip installed both with git+.\r\n\r\nI get nearly the same errors, it just looks like some of the dimensions are changed:\r\n\r\n```\r\ntensorflow.python.framework.errors_impl.ResourceExhaustedError: 2 root error(s) found.\r\n (0) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,128] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n\t [[node encoder/block_018/layer_000/SelfAttention/einsum_7/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n\t [[scalar_add/parallel_0/add/_5679]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n\r\n (1) Resource exhausted: OOM when allocating tensor with shape[1,4,32,512,128] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc\r\n\t [[node encoder/block_018/layer_000/SelfAttention/einsum_7/parallel_0/einsum/Einsum (defined at /lib/python3.6/dist-packages/mesh_tensorflow/ops.py:1336) ]]\r\nHint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689697675,
"datetime": 1599671150000,
"masked_author": "username_0",
"text": "So I tried parallelizing across two of the v100s, and I still got those errors above. When I parallelized across 4 (3 wouldn't work because the model doesn't split evenly across 3), it finally worked. \r\n\r\nStill, it seems weird to me that I could fine-tune with a sequence length of 250 on 1 GPU but couldn't train with the same sequence length across 2 GPUs. Aren't training and fine-tuning the same thing, just at different stages of training? Or am I misunderstanding the difference between them? Like when you guys train on MRPC from scratch in the readme, there's no reason that would be more computationally intensive than if you had fine-tuned on MRPC using one of the pre-trained models, right?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "craffel",
"comment_id": null,
"datetime": 1599671389000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "craffel",
"comment_id": 689699690,
"datetime": 1599671389000,
"masked_author": "username_1",
"text": "Yes, they are exactly the same, the only difference is that before fine-tuning we load variables from a checkpoint.\r\n\r\nYou can diff the operative config gin files that got written out to your model directories to triple check that there are no other differences (other than sequence length) between your fine-tuning and pre-training runs.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chrisrytting",
"comment_id": 689701487,
"datetime": 1599671591000,
"masked_author": "username_0",
"text": "Oh sorry that finally made sense about the quadratic scaling. Thanks for your help, guys!",
"title": null,
"type": "comment"
}
] | 3 | 14 | 9,182 | false | false | 9,182 | true |
taichi-dev/taichi | taichi-dev | 684,773,916 | 1,769 | null | [
{
"action": "opened",
"author": "squarefk",
"comment_id": null,
"datetime": 1598283902000,
"masked_author": "username_0",
"text": "",
"title": "[Doc] Update doc on ti.Matrix (like zeros, ones, identity)",
"type": "issue"
},
{
"action": "created",
"author": "archibate",
"comment_id": 686287298,
"datetime": 1599115022000,
"masked_author": "username_1",
"text": "Great! Would you take this on your own?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "k-ye",
"comment_id": null,
"datetime": 1628753379000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "k-ye",
"comment_id": 897413128,
"datetime": 1628753379000,
"masked_author": "username_2",
"text": "Closed by https://github.com/taichi-dev/taichi/issues/2579",
"title": null,
"type": "comment"
}
] | 3 | 4 | 97 | false | false | 97 | false |
moralmunky/Home-Assistant-Mail-And-Packages | null | 748,138,407 | 258 | {
"number": 258,
"repo": "Home-Assistant-Mail-And-Packages",
"user_login": "moralmunky"
} | [
{
"action": "opened",
"author": "firstof9",
"comment_id": null,
"datetime": 1606008490000,
"masked_author": "username_0",
"text": "## Proposed change\r\n<!-- \r\n Describe the big picture of your changes here to communicate to the\r\n maintainers why we should accept this pull request. If it fixes a bug\r\n or resolves a feature request, be sure to link to that issue in the\r\n additional information section.\r\n-->\r\nSimplify tracking number regex\r\n\r\n## Type of change\r\n<!--\r\n What type of change does your PR introduce?\r\n NOTE: Please, check only 1! box! \r\n If your PR requires multiple boxes to be checked, you'll most likely need to\r\n split it into multiple PRs. This makes things easier and faster to code review.\r\n-->\r\n\r\n- [ ] Dependency upgrade\r\n- [ ] Bugfix (non-breaking change which fixes an issue)\r\n- [ ] New feature (which adds functionality)\r\n- [ ] Breaking change (fix/feature causing existing functionality to break)\r\n- [x] Code quality improvements to existing code or addition of tests\r\n\r\n## Additional information\r\n<!--\r\n Details are important, and help maintainers processing your PR.\r\n Please be sure to fill out additional details, if applicable.\r\n-->\r\n\r\n- This PR is related to issue: n/a",
"title": "Simplify tracking number regex",
"type": "issue"
}
] | 2 | 2 | 2,704 | false | true | 1,080 | false |
wix/ricos | wix | 861,350,302 | 2,386 | {
"number": 2386,
"repo": "ricos",
"user_login": "wix"
} | [
{
"action": "opened",
"author": "maimonav",
"comment_id": null,
"datetime": 1618841302000,
"masked_author": "username_0",
"text": "Remove unsupported plugins from pasted content.",
"title": "feat - clear unsupported plugins from pasted content",
"type": "issue"
}
] | 2 | 3 | 507 | false | true | 47 | false |
splunk/qbec | splunk | 696,204,523 | 155 | {
"number": 155,
"repo": "qbec",
"user_login": "splunk"
} | [
{
"action": "opened",
"author": "gotwarlost",
"comment_id": null,
"datetime": 1599602315000,
"masked_author": "username_0",
"text": "",
"title": "add windows CI build, thanks to @harsimranmaan",
"type": "issue"
},
{
"action": "created",
"author": "sbarzowski",
"comment_id": 689542971,
"datetime": 1599656000000,
"masked_author": "username_1",
"text": "Wow, good to know it's possible to set up Github actions to run on Windows. We may try setting it up upstream, too.",
"title": null,
"type": "comment"
}
] | 3 | 3 | 115 | false | true | 115 | false |
nange/gospider | null | 548,379,303 | 74 | null | [
{
"action": "opened",
"author": "lintan",
"comment_id": null,
"datetime": 1578716931000,
"masked_author": "username_0",
"text": "packr安装不成功",
"title": "go get: upgrading github.com/coreos/go-systemd@v0.0.0-20190321100706-95778dfbb74e: unexpected status (https://goproxy.cn/github.com/coreos/go-systemd/@v/list): 404 Not Found",
"type": "issue"
},
{
"action": "created",
"author": "nange",
"comment_id": 574117654,
"datetime": 1578999203000,
"masked_author": "username_1",
"text": "你是执行```go get -u github.com/gobuffalo/packr/packr```出的问题吗? 我没能重现你的问题。 \r\n\r\n另外如果你不是想开发前端页面,只是想体验功能,并不需要安装packr。直接进入```_example```执行```go build```,然后运行就可以了。 \r\n@username_0",
"title": null,
"type": "comment"
}
] | 2 | 2 | 173 | false | false | 173 | true |
scikit-hep/pyhf | scikit-hep | 650,275,733 | 918 | null | [
{
"action": "opened",
"author": "lukasheinrich",
"comment_id": null,
"datetime": 1593737990000,
"masked_author": "username_0",
"text": "# Description\r\n\r\nright now we trigger a binder build but i'm not sure, do we get a response / feedback iif the bulid fails? We could use repo2docker to do the binder-like build locally\r\n\r\nhttps://twitter.com/m_deff/status/1277064783662141441",
"title": "run repo2docker in CI",
"type": "issue"
},
{
"action": "created",
"author": "matthewfeickert",
"comment_id": 653307828,
"datetime": 1593744400000,
"masked_author": "username_1",
"text": "No. `trigger_binder.sh` just `curls` the endpoint and gives the return code\r\n\r\nhttps://github.com/scikit-hep/pyhf/blob/64dbce070156945f6bb524f79ee6e23d26b52f01/binder/trigger_binder.sh#L6-L16\r\n\r\nand as we're using the `postBuild` Binder config which is basically just installing how we would in CI\r\n\r\nhttps://github.com/scikit-hep/pyhf/blob/64dbce070156945f6bb524f79ee6e23d26b52f01/binder/postBuild#L1-L2\r\n\r\nthe `repo2docker` build on the Binder side shouldn't fail. That being said though, the idea of using `repo2docker` in CI (to test and prebuild the Docker image for Binder so that work was offloaded from them) were the reasons that I was looking at [`machine-learning-apps/repo2docker-action`](https://github.com/machine-learning-apps/repo2docker-action). The discussions with Tim and Chris on these Issues:\r\n\r\n- [`machine-learning-apps/repo2docker-action` Issue 26](https://github.com/machine-learning-apps/repo2docker-action/issues/26)\r\n- [`machine-learning-apps/repo2docker-action` Issue 25](https://github.com/machine-learning-apps/repo2docker-action/issues/25)\r\n\r\nmake it seem like it might not be a good fit for us as we want to use Binder config files in the top level `binder/` dir.\r\n\r\n@username_0 did you have additional ideas related to this?",
"title": null,
"type": "comment"
}
] | 2 | 2 | 1,503 | false | false | 1,503 | true |
HackYourFuture-CPH/fp-class12 | HackYourFuture-CPH | 637,308,459 | 101 | {
"number": 101,
"repo": "fp-class12",
"user_login": "HackYourFuture-CPH"
} | [
{
"action": "opened",
"author": "ghofranebenhmaid",
"comment_id": null,
"datetime": 1591908239000,
"masked_author": "username_0",
"text": "# Description\r\n\r\nadded qrcode.react 1.0.0 to package.json and package-lock.json\r\n[www.npmjs.com/package/qrcode.react](https://www.npmjs.com/package/qrcode.react\r\n)",
"title": "added qrcode.react 1.0.0 to package.json and package-lock.json",
"type": "issue"
},
{
"action": "created",
"author": "senner007",
"comment_id": 643355303,
"datetime": 1591978141000,
"masked_author": "username_1",
"text": "Hi. This has changes from your other branch. Try to revert them or close the pr",
"title": null,
"type": "comment"
}
] | 2 | 2 | 242 | false | false | 242 | false |
tl-its-umich-edu/my-learning-analytics | tl-its-umich-edu | 560,533,556 | 875 | {
"number": 875,
"repo": "my-learning-analytics",
"user_login": "tl-its-umich-edu"
} | [
{
"action": "opened",
"author": "ssciolla",
"comment_id": null,
"datetime": 1580925184000,
"masked_author": "username_0",
"text": "Responding to changes from PR #845, this PR adjusts `globals.js` to use an empty object if it is unable to find the`myla_globals` JSON element in the DOM. Along with an extra condition checking whether `mylaGlobals.primary_color` is `undefined` allows tests to import and consume `siteTheme` without an error. I also updated the hex values in the snapshot for `createHistogram.test.js` to reflect the new secondary color from PR #853. This PR aims to resolve issue #874.",
"title": "Add fallback empty object for mylaGlobals and update createHistogram test (#874)",
"type": "issue"
},
{
"action": "created",
"author": "pushyamig",
"comment_id": 582539539,
"datetime": 1580926342000,
"masked_author": "username_1",
"text": "All the Unit test are passing",
"title": null,
"type": "comment"
}
] | 2 | 2 | 499 | false | false | 499 | false |
ozubov/hello-github-actions | null | 778,021,661 | 2 | {
"number": 2,
"repo": "hello-github-actions",
"user_login": "ozubov"
} | [
{
"action": "opened",
"author": "ozubov",
"comment_id": null,
"datetime": 1609759007000,
"masked_author": "username_0",
"text": "",
"title": "Create Dockerfile",
"type": "issue"
}
] | 2 | 7 | 6,849 | false | true | 0 | false |
kubernetes-sigs/kustomize | kubernetes-sigs | 804,795,887 | 3,565 | {
"number": 3565,
"repo": "kustomize",
"user_login": "kubernetes-sigs"
} | [
{
"action": "opened",
"author": "monopole",
"comment_id": null,
"datetime": 1612894678000,
"masked_author": "username_0",
"text": "ALLOW_MODULE_SPAN",
"title": "Pin to cmd/config v0.9.1",
"type": "issue"
}
] | 2 | 2 | 821 | false | true | 17 | false |
BoboTiG/ebook-reader-dict | null | 782,588,354 | 533 | {
"number": 533,
"repo": "ebook-reader-dict",
"user_login": "BoboTiG"
} | [
{
"action": "opened",
"author": "BoboTiG",
"comment_id": null,
"datetime": 1610191854000,
"masked_author": "username_0",
"text": "That's better, thanks @username_1 ;)",
"title": "fix #501: Improve aliases retrieval",
"type": "issue"
},
{
"action": "created",
"author": "lasconic",
"comment_id": 757139102,
"datetime": 1610193614000,
"masked_author": "username_1",
"text": "We could probably use the same trick in other scripts.",
"title": null,
"type": "comment"
}
] | 3 | 3 | 1,887 | false | true | 88 | true |
SilenceLove/HXPhotoPicker | null | 775,714,708 | 433 | null | [
{
"action": "opened",
"author": "qunwang6",
"comment_id": null,
"datetime": 1609221532000,
"masked_author": "username_0",
"text": "好像只能设置预览视频时是否自动播放,有没有预览视频一次就停止播放的设置。",
"title": "预览视频次数",
"type": "issue"
},
{
"action": "created",
"author": "SilenceLove",
"comment_id": 752048252,
"datetime": 1609242539000,
"masked_author": "username_1",
"text": "目前没有这种配置",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cike534222598",
"comment_id": 918934363,
"datetime": 1631608343000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "comment"
}
] | 3 | 3 | 44 | false | false | 44 | false |
enso-org/enso | enso-org | 824,795,362 | 1,557 | null | [
{
"action": "opened",
"author": "mwu-tow",
"comment_id": null,
"datetime": 1615225255000,
"masked_author": "username_0",
"text": "### General Summary\r\nIf the visualization sets preprocessor code that yields an error, this error is not reported in any way to IDE.\r\n\r\nThis is really troublesome, as developing or debugging visualizations when there is something wrong in the preprocessor code becomes guesswork.\r\n\r\n### Steps to Reproduce\r\nCreate a visualization that sets nonsensical preprocessor expression. Wait for updates.\r\n\r\nSee the example request IDE makes under such condition and the reply:\r\n\r\n```json\r\n{\r\n \"jsonrpc\": \"2.0\",\r\n \"id\": 18,\r\n \"method\": \"executionContext/attachVisualisation\",\r\n \"params\": {\r\n \"expressionId\": \"1f9ca20f-5143-4010-819d-7b4a4768e477\",\r\n \"visualisationConfig\": {\r\n \"executionContextId\": \"38358e93-60ef-400f-bb25-c084e51af57d\",\r\n \"expression\": \"hey -> this is pure nonsense\",\r\n \"visualisationModule\": \"Unnamed.Main\"\r\n },\r\n \"visualisationId\": \"a1cced79-1e4b-47f6-a85d-73f766325372\"\r\n }\r\n}\r\n\r\n{\r\n \"jsonrpc\": \"2.0\",\r\n \"id\": 18,\r\n \"result\": null\r\n}\r\n```\r\n\r\n\r\n### Expected Result\r\n\r\n1) if expression can feasibly be proven to be incorrect, attaching should fail;\r\n2) if expression's validity depends on runtime conditions, the error should be reported through appropriate channels (binary connection?).\r\n\r\nBasically, IDE needs to observe some results of visualization, be it correct values or errors on whatever stage.\r\n\r\n### Actual Result\r\nAttaching succeeds but visualization never receives any data. No error nor diagnostics are provided to explain this silence.\r\n\r\nIn the logs one can find line \r\n```\r\n[warn] [2021-03-08T17:28:42.928Z] [org.enso.languageserver.protocol.json.JsonConnectionController] Received unknown message: VisualisationEvaluationFailed(38358e93-60ef-400f-bb25-c084e51af57d,Compile error: Variable `is` is not defined.)\r\n``` \r\nbut it is not easily accessible.\r\n\r\n### Enso Version\r\n0.2.6",
"title": "Visualization Errors Are Not Reported",
"type": "issue"
},
{
"action": "closed",
"author": "4e6",
"comment_id": null,
"datetime": 1619011943000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 1,891 | false | false | 1,891 | false |
rapidsai/cuml | rapidsai | 704,487,124 | 2,843 | null | [
{
"action": "opened",
"author": "cjnolet",
"comment_id": null,
"datetime": 1600444609000,
"masked_author": "username_0",
"text": "The increase in the number of Python warnings in our CI logs from our pytests have made it increasingly challenging to hunt through the logs searching for potential CI failures. \r\n\r\nIn [this CI log] (https://gpuci.gpuopenanalytics.com/blue/rest/organizations/jenkins/pipelines/rapidsai/pipelines/gpuci/pipelines/cuml/pipelines/prb/pipelines/cuml-gpu-build/runs/14624/log/?start=0), almost half of the log is filled up with warnings that aren't relevant at all to the tests. We should probably ignore many of these warnings.",
"title": "[BUG] Ignore excessive warnings in pytests / CI logs",
"type": "issue"
},
{
"action": "created",
"author": "wphicks",
"comment_id": 698461025,
"datetime": 1600965966000,
"masked_author": "username_1",
"text": "Just to confirm, you're talking about the huge number of warnings like the following, yes?\r\n\r\n```\r\n /opt/conda/envs/rapids/lib/python3.8/site-packages/sklearn/utils/validation.py:68: FutureWarning: Pass eps=0.9109496478186582 as keyword args. From version 0.25 passing these as positional arguments will result in an error\r\n warnings.warn(\"Pass {} as keyword args. From version 0.25 \"\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JohnZed",
"comment_id": 728209573,
"datetime": 1605547676000,
"masked_author": "username_2",
"text": "Possible staging / elements:\r\n\r\n- [ ] Hide scikit-learn warnings by default\r\n- [ ] Add python wrapper that allows us to fail on any warnings (with ability for expected warnings)\r\n- [ ] Reroute C++ logging through python logger to allow it to participate in python fail options\r\n\r\nSee also: adding log_once / warn_once to reduce spew consistently",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JohnZed",
"comment_id": 728211618,
"datetime": 1605547873000,
"masked_author": "username_2",
"text": "See also:\r\n - [ ] Adding custom SummaryWriter that will make it easier to find and view error",
"title": null,
"type": "comment"
}
] | 5 | 7 | 2,019 | false | true | 1,354 | false |
ppc64le-cloud/builds | ppc64le-cloud | 845,798,393 | 1,868 | {
"number": 1868,
"repo": "builds",
"user_login": "ppc64le-cloud"
} | [
{
"action": "opened",
"author": "ltccci",
"comment_id": null,
"datetime": 1617163175000,
"masked_author": "username_0",
"text": "This is an automated PR via build-bot",
"title": "Update golang/master build",
"type": "issue"
},
{
"action": "created",
"author": "ltccci",
"comment_id": 810741012,
"datetime": 1617163180000,
"masked_author": "username_0",
"text": "[APPROVALNOTIFIER] This PR is **NOT APPROVED**\n\nThis pull-request has been approved by: *<a href=\"https://github.com/ppc64le-cloud/builds/pull/1868#\" title=\"Author self-approved\">username_0</a>*\nTo complete the [pull request process](https://git.k8s.io/community/contributors/guide/owners.md#the-code-review-process), please assign after the PR has been reviewed.\nYou can assign the PR to them by writing `/assign ` in a comment when ready.\n\nThe full list of commands accepted by this bot can be found [here](https://go.k8s.io/bot-commands?repo=ppc64le-cloud%2Fbuilds).\n\n<details open>\nNeeds approval from an approver in each of these files:\n\n- **[OWNERS](https://github.com/ppc64le-cloud/builds/blob/master/OWNERS)**\n\nApprovers can indicate their approval by writing `/approve` in a comment\nApprovers can cancel approval by writing `/approve cancel` in a comment\n</details>\n<!-- META={\"approvers\":[]} -->",
"title": null,
"type": "comment"
}
] | 1 | 2 | 939 | false | false | 939 | true |
hyperledger/fabric | hyperledger | 748,322,933 | 2,156 | {
"number": 2156,
"repo": "fabric",
"user_login": "hyperledger"
} | [
{
"action": "opened",
"author": "psingh-io",
"comment_id": null,
"datetime": 1606072285000,
"masked_author": "username_0",
"text": "#### Type of change\r\n- Improvement to cryptogen tool\r\n\r\n#### Description\r\nThis is an improvement on cryptogen tool to allow for custom expiry times for CA, Identity and TLS certificates for each organization. The expiration times can be customized using following configuration for each organization:\r\n CACertExpiry: 131400h\r\n IdentityCertExpiry: 131400h\r\n TLSCertExpiry: 131400h\r\n\r\nThe default expiry time remains 15 years",
"title": "Added support to specify CA, Identity and TLS cert expiry times in cryptogen",
"type": "issue"
},
{
"action": "created",
"author": "sykesm",
"comment_id": 732253926,
"datetime": 1606147342000,
"masked_author": "username_1",
"text": "Re-running PR checks. Some failures related to ubuntu mirror issues and units failed with FAB-16233.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sykesm",
"comment_id": 733774789,
"datetime": 1606317993000,
"masked_author": "username_1",
"text": "@username_0 I plan to review these changes shortly but I wanted to point out that this PR was opened against release-2.3 instead of master. Our general process is to make the changes that introduce new features in master and to pull them back to other branches as appropriate. This helps ensure changes don't unintentionally dropped along the way.\r\n\r\nCan you please retarget this PR to master?\r\n\r\nThanks.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "psingh-io",
"comment_id": 733837883,
"datetime": 1606324413000,
"masked_author": "username_0",
"text": "Sure, I will do this later today.\n\nThabks",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "caod123",
"comment_id": 754045450,
"datetime": 1609774547000,
"masked_author": "username_2",
"text": "@username_0 it seems after you changed the base branch to master, a couple of unnecessary commits from the rebased `release-2.3` branch made it into this PR. Can you clean up the commit log so only your commit is on this PR against `master`? Also if possible can you address @username_1's comment regarding the missing integration tests for verifying this behavior.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "psingh-io",
"comment_id": 754291537,
"datetime": 1609803836000,
"masked_author": "username_0",
"text": "I will do the cleanup by tomorrow.\n\nThere is a separate ticket for integration test and I will pick that after\nthis. As it was pointed out by @username_1 <https://github.com/username_1>'s, the\nintegration tests did not exist for cryptogen before these changes.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sykesm",
"comment_id": 774051136,
"datetime": 1612533768000,
"masked_author": "username_1",
"text": "The goal of that comment was to highlight that we're missing integration tests for this tool and that I believe we need them before we add additional capabilities.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "denyeart",
"comment_id": 831531816,
"datetime": 1620075738000,
"masked_author": "username_3",
"text": "@username_0 Do you intend to address the comments?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "denyeart",
"comment_id": 832621123,
"datetime": 1620214865000,
"masked_author": "username_3",
"text": "No response from @username_0 in many months so let's close, can be re-opened later.",
"title": null,
"type": "comment"
}
] | 4 | 9 | 1,882 | false | false | 1,882 | true |
linz/geospatial-data-lake | linz | 892,766,769 | 710 | null | [
{
"action": "opened",
"author": "billgeo",
"comment_id": null,
"datetime": 1621201705000,
"masked_author": "username_0",
"text": "### Enabler\n\n<!-- A description of the enabler that covers what needs to be done why it needs to be done. It should be understandable by all members of the team -->\n\nSo that we can POC a STAC brower and/or API, we want to research the options for implementing STAC browsing functionality\n\n#### Acceptance Criteria\n\n<!-- Requirements to accept this enabler as completed -->\n\n- [ ] All available options have been investigated for pros and cons, including technology fit, user experience, maturity and effort for the product team to incorporate\n- [ ] Options presented to the team with recommendations\n- [ ] Team decision on best option to use in the POC\n\n#### Tasks\n\n<!-- Tasks needed to complete this enabler -->\n\n\n- [ ] Incorporate @jacobus POC on a bespoke index using AWS services like Elasticsearch etc\n- [ ] Investigated options in this server list here: https://stacindex.org/ecosystem?category=Server\n- [ ] Investigated options in this GUI list here: https://stacindex.org/ecosystem?category=Visualization\n- [ ] Could also consider https://github.com/linz/linz-data-importer\n- [ ] Looked in other places?\n- [ ] Results written up in confluence\n- [ ] Present results to the team\n- [ ] Decision made by the team \n\n#### Additional context\n\n\n\n#### Definition of Ready\n\n- [ ] This story is **ready** to work on, according to the\n [team's definition](https://confluence.linz.govt.nz/pages/viewpage.action?pageId=87930423)\n\n#### Definition of Done\n\n- [ ] This story is **done**, according to the\n [team's definition](https://confluence.linz.govt.nz/pages/viewpage.action?pageId=87930423)\n\n<!-- Please add one or more of these labels: 'spike', 'refactor', 'architecture', 'infrastructure', 'compliance' -->",
"title": "Spike: research STAC browsers",
"type": "issue"
},
{
"action": "closed",
"author": "MitchellPaff",
"comment_id": null,
"datetime": 1623382775000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 1,715 | false | false | 1,715 | false |
harshith1611/hello-github-actions | null | 721,696,419 | 2 | {
"number": 2,
"repo": "hello-github-actions",
"user_login": "harshith1611"
} | [
{
"action": "opened",
"author": "harshith1611",
"comment_id": null,
"datetime": 1602701226000,
"masked_author": "username_0",
"text": "",
"title": "Create Dockerfile",
"type": "issue"
}
] | 2 | 7 | 6,964 | false | true | 0 | false |
microsoft/vscode | microsoft | 666,109,970 | 103,400 | null | [
{
"action": "opened",
"author": "gleborgne",
"comment_id": null,
"datetime": 1595840083000,
"masked_author": "username_0",
"text": "<!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ -->\r\n<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->\r\n<!-- Please search existing issues to avoid creating duplicates. -->\r\n<!-- Also please test using the latest insiders build to make sure your issue has not already been fixed: https://code.visualstudio.com/insiders/ -->\r\n\r\n<!-- Use Help > Report Issue to prefill these. -->\r\n- VSCode Version: 1.47.2\r\n- OS Version: Windows 10 (update 2004)\r\n\r\nSteps to Reproduce:\r\n\r\n1. Start node.js debugging on a project with Typescript\r\n\r\n<!-- Launch with `code --disable-extensions` to check. -->\r\nDoes this issue occur when all extensions are disabled?: Yes\r\n\r\n\r\nSince last vscode version update, when starting to debug I have the error bellow. tslib is at version 2.0.0. I tried downgrading it to 1.X but still have the same error. I saw other bugs that looks related and if I add \"debug.javascript.usePreview\": false to vscode's settings, everything is working fine.\r\n\r\nTypeError: \r\n at c:\\Users\\X\\AppData\\Local\\Programs\\Microsoft VS Code\\resources\\app\\extensions\\ms-vscode.js-debug\\src\\bootloader.bundle.js:16:7581\r\n at Object.<anonymous> (c:\\Users\\X\\AppData\\Local\\Programs\\Microsoft VS Code\\resources\\app\\extensions\\ms-vscode.js-debug\\src\\bootloader.bundle.js:16:7609)\r\n at Object.__decorate (c:\\projects\\ABCD\\master\\front\\node_modules\\tslib\\tslib.js:95:96)",
"title": "Error while debugging node.js on windows",
"type": "issue"
},
{
"action": "created",
"author": "bdunn313",
"comment_id": 664545815,
"datetime": 1595872386000,
"masked_author": "username_1",
"text": "I can confirm the exact same behavior, and setting the `\"debug.javascript.usePreview\": false` does prevent the issue from happening.\r\n\r\nI observe that there _may_ be a memory leak in the preview implementation - when starting the debugger, vscode rapidly grows from ~700mb memory to as much as it can grab (I've seen north of 5GB mem taken up).\r\n\r\nVSCode Version: 1.47.2\r\nOS Version: Windows 10 v2004 Build 19041.388",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "connor4312",
"comment_id": 666559959,
"datetime": 1596131309000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "prbans",
"comment_id": 668829661,
"datetime": 1596575823000,
"masked_author": "username_3",
"text": "@username_2 pinged you offline, I can show you repro and share logs in private",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "connor4312",
"comment_id": 671580185,
"datetime": 1597092227000,
"masked_author": "username_2",
"text": "Closing due to lack of info",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "connor4312",
"comment_id": null,
"datetime": 1597092228000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 4 | 6 | 1,931 | false | false | 1,931 | true |
axe-api/axe-api | axe-api | 910,790,582 | 39 | null | [
{
"action": "opened",
"author": "ozziest",
"comment_id": null,
"datetime": 1622748518000,
"masked_author": "username_0",
"text": "We don't have a filename standard in the project. We should add the eslint plugin to manage that;\r\n\r\n[filename-case](https://github.com/sindresorhus/eslint-plugin-unicorn/blob/main/docs/rules/filename-case.md)",
"title": "Filename standards",
"type": "issue"
},
{
"action": "closed",
"author": "ozziest",
"comment_id": null,
"datetime": 1622923850000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 209 | false | false | 209 | false |
numpy/numpy | numpy | 802,842,346 | 18,355 | null | [
{
"action": "opened",
"author": "MahendraCherukupalli",
"comment_id": null,
"datetime": 1612666820000,
"masked_author": "username_0",
"text": "<!-- Please describe the issue in detail here, and fill in the fields below -->\r\n\r\n### Reproducing code example:\r\n\r\n<!-- A short code example that reproduces the problem/missing feature. It should be\r\nself-contained, i.e., possible to run as-is via 'python myproblem.py' -->\r\n\r\n```python\r\nimport numpy as np\r\n<< your code here >>\r\nimport numpy as np\r\nimport pandas as pd\r\ndf=pd.read_csv('link')\r\ndf.info() and df.describe() gives error as \"TypeError: Cannot interpret '<attribute 'dtype' of 'numpy.generic' objects>' as a data type\" and also plotting(df.plot()) gives same error.\r\n```\r\n\r\n### Error message:\r\n\r\n<!-- If you are reporting a segfault please include a GDB traceback, which you\r\ncan generate by following\r\nhttps://github.com/numpy/numpy/blob/master/doc/source/dev/development_environment.rst#debugging -->\r\n\r\n<!-- Full error message, if any (starting from line Traceback: ...) -->\r\n\r\n### NumPy/Python version information:\r\n\r\n<!-- Output from 'import sys, numpy; print(numpy.__version__, sys.version)' -->",
"title": "TypeError: Cannot interpret '<attribute 'dtype' of 'numpy.generic' objects>' as a data type",
"type": "issue"
},
{
"action": "created",
"author": "seberg",
"comment_id": 774610226,
"datetime": 1612678238000,
"masked_author": "username_1",
"text": "There is a unfortuate incompatibility with old pandas and 1.20. Updating pandas to a newer version should fix it, see also https://github.com/pandas-dev/pandas/issues/39520#issuecomment-772630011\r\n\r\nUpdating to `pandas>=1.0.5` should solve it. Supposedly also `pandas==0.25.3` works. If you are stuck with pandas 0.24.x you may not be able to usse numpy 1.20.x though, unfortunaly.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jessdwitch",
"comment_id": 775414881,
"datetime": 1612815243000,
"masked_author": "username_2",
"text": "@username_1 Any idea why someone might be suddenly seeing this error when calling `df.to_csv` with `pandas==0.24.2` and `numpy==1.16.5`? Based on what you're saying above, it seems like those should be in the \"OK\" range.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seberg",
"comment_id": 775422571,
"datetime": 1612815692000,
"masked_author": "username_1",
"text": "@username_2 I am not sure if there are other (likely) ways to get this error, and I doubt it is possible with the above example (or a similar one, such as creating an empty dataframe IIRC).\r\n\r\nCan you double check `np.__version__` and `pd.__version__` in your shell to make sure you are running the expected versions? Sometimes you might get multiple versions or so, which can quickly become very confusing.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jessdwitch",
"comment_id": 775454519,
"datetime": 1612817676000,
"masked_author": "username_2",
"text": "Thank you for the quick reply! I did check those already, since there _are_ multiple versions installed (`numpy==1.20.0` and `pandas==0.25.3` when conda is deactivated, and the versions noted above when the conda environment the script is running in is activated). I double checked the logs, and while I don't have the specific versions printing to the log, I do see the env activating and confirmed `0.24.2` and `1.16.5` were the ones installed to that env.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seberg",
"comment_id": 775464887,
"datetime": 1612818725000,
"masked_author": "username_1",
"text": "Hmm, can you post an example and the full traceback? I am wondering if I mixed up the issues and this one was not related to 1.20.x at all.\r\n\r\nTo be honest, if you are not running 1.20.x, then searching/asking on pandas is more likely to be successfull.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jessdwitch",
"comment_id": 775470350,
"datetime": 1612819283000,
"masked_author": "username_2",
"text": "Gotcha. Tbh asking around pandas was my first thought, but this is literally the only google result with this error (well, this and an SO thread Mahendra created). Here's the traceback starting with the `to_csv` call. If that doesn't spark anything, I'll leave you be and pop in over there, but thank you regardless for taking a look.\r\n\r\n```\r\n File \"/root/miniconda/lib/python3.7/site-packages/annotation/summarize_annotation.py\", line 159, in create_cluster_table\r\n pd.DataFrame(cluster_dict).to_csv(os.path.join(directory, 'report.csv'), index=False)\r\n File \"/root/miniconda/lib/python3.7/site-packages/pandas/core/frame.py\", line 411, in __init__\r\n mgr = init_dict(data, index, columns, dtype=dtype)\r\n File \"/root/miniconda/lib/python3.7/site-packages/pandas/core/internals/construction.py\", line 257, in init_dict\r\n return arrays_to_mgr(arrays, data_names, index, columns, dtype=dtype)\r\n File \"/root/miniconda/lib/python3.7/site-packages/pandas/core/internals/construction.py\", line 82, in arrays_to_mgr\r\n arrays = _homogenize(arrays, index, dtype)\r\n File \"/root/miniconda/lib/python3.7/site-packages/pandas/core/internals/construction.py\", line 323, in _homogenize\r\n val, index, dtype=dtype, copy=False, raise_cast_failure=False\r\n File \"/root/miniconda/lib/python3.7/site-packages/pandas/core/internals/construction.py\", line 712, in sanitize_array\r\n subarr = construct_1d_arraylike_from_scalar(value, len(index), dtype)\r\n File \"/root/miniconda/lib/python3.7/site-packages/pandas/core/dtypes/cast.py\", line 1233, in construct_1d_arraylike_from_scalar\r\n subarr = np.empty(length, dtype=dtype)\r\nTypeError: Cannot interpret '<attribute 'dtype' of 'numpy.generic' objects>' as a data type\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seberg",
"comment_id": 775521753,
"datetime": 1612825405000,
"masked_author": "username_1",
"text": "Sorry, do you have a minimal reproducer, preferably printing out the versions? It does look like this issue, but in that case the pandas code in `contruct_1d_arraylike_from_scalar` would include `isinstance(dtype, (np.dtype, type(np.dtype)))` or so (which is incorrect, because `type(np.dtype)` changed and was always weird).\r\n\r\nLong story short, it looks a lot like this issue but I if you are really not on NumPy 1.20 `type(np.dtype) is type` and I am not aware of what might be wrong.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jessdwitch",
"comment_id": 776890968,
"datetime": 1612979066000,
"masked_author": "username_2",
"text": "Hey sorry for disappearing, but you were totally right. For some reason the Conda environment was using the pandas from within the env, but the numpy from outside, causing the conflict. We ended up just downgrading the numpy from outside the env to match the one within the env and everything is happy now. Thank you so much!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mattip",
"comment_id": 777250395,
"datetime": 1613028449000,
"masked_author": "username_3",
"text": "Closing. Please reopen if needed.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mattip",
"comment_id": null,
"datetime": 1613028450000,
"masked_author": "username_3",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "aarthim123",
"comment_id": 786297555,
"datetime": 1614295232000,
"masked_author": "username_4",
"text": "Hi I tried updating pandas to 1.0.5 and I still get the same error message. TypeError: Cannot interpret '<attribute 'dtype' of 'numpy.generic' objects>' as a data type . I have a numpy version of 1.20",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mattip",
"comment_id": 786438326,
"datetime": 1614320209000,
"masked_author": "username_3",
"text": "@username_4 this issue is specifically about the error in the title. Please open an error on an appropriate issue tracker for problems with installing pandas on conda please use a more appropriate forum. You might have better luck searching for similar error messages. If you do open an issue, you probably should report the output of `conda list` in order to untangle what you have installed and what you might need.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MahendraCherukupalli",
"comment_id": 786444521,
"datetime": 1614321249000,
"masked_author": "username_0",
"text": "@username_4 use ..pip install numpy==1.16.5",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RichardScottOZ",
"comment_id": 820806412,
"datetime": 1618530296000,
"masked_author": "username_5",
"text": "I just saw this:-\r\n\r\n<class 'numpy.ndarray'> <class 'numpy.ndarray'>\r\n---------------------------------------------------------------------------\r\nTypeError Traceback (most recent call last)\r\n<ipython-input-15-14380fd16e5f> in <module>\r\n 49 print(type(v_list[i]), type(face_list[i]) )\r\n 50 #dataset = pd.DataFrame({'Group':'Neoprot-Ordovician','Surface': s, 'X': v_list[i][:, 0], 'Y': v_list[i][:, 1], 'Z': v_list[i][:, 2], 'SG': -1 })\r\n---> 51 dataset = pd.DataFrame({'Group':'Neoprot-Ordovician','Surface': s, 'X': v_list[i][:, 0].astype(float) })\r\n 52 dataset[\"Name\"] = metadata_list[i][\"NAME\"]\r\n 53 dataset[\"CRS\"] = str(metadata_list[i][\"CRS\"])\r\n\r\n~\\AppData\\Local\\Continuum\\anaconda3\\envs\\gemgis\\lib\\site-packages\\pandas\\core\\frame.py in __init__(self, data, index, columns, dtype, copy)\r\n 433 )\r\n 434 elif isinstance(data, dict):\r\n--> 435 mgr = init_dict(data, index, columns, dtype=dtype)\r\n 436 elif isinstance(data, ma.MaskedArray):\r\n 437 import numpy.ma.mrecords as mrecords\r\n\r\n~\\AppData\\Local\\Continuum\\anaconda3\\envs\\gemgis\\lib\\site-packages\\pandas\\core\\internals\\construction.py in init_dict(data, index, columns, dtype)\r\n 252 arr if not is_datetime64tz_dtype(arr) else arr.copy() for arr in arrays\r\n 253 ]\r\n--> 254 return arrays_to_mgr(arrays, data_names, index, columns, dtype=dtype)\r\n 255 \r\n 256 \r\n\r\n~\\AppData\\Local\\Continuum\\anaconda3\\envs\\gemgis\\lib\\site-packages\\pandas\\core\\internals\\construction.py in arrays_to_mgr(arrays, arr_names, index, columns, dtype)\r\n 67 \r\n 68 # don't force copy because getting jammed in an ndarray anyway\r\n---> 69 arrays = _homogenize(arrays, index, dtype)\r\n 70 \r\n 71 # from BlockManager perspective\r\n\r\n~\\AppData\\Local\\Continuum\\anaconda3\\envs\\gemgis\\lib\\site-packages\\pandas\\core\\internals\\construction.py in _homogenize(data, index, dtype)\r\n 320 val = dict(val)\r\n 321 val = lib.fast_multiget(val, oindex.values, 
default=np.nan)\r\n--> 322 val = sanitize_array(\r\n 323 val, index, dtype=dtype, copy=False, raise_cast_failure=False\r\n 324 )\r\n\r\n~\\AppData\\Local\\Continuum\\anaconda3\\envs\\gemgis\\lib\\site-packages\\pandas\\core\\construction.py in sanitize_array(data, index, dtype, copy, raise_cast_failure)\r\n 463 value = maybe_cast_to_datetime(value, dtype)\r\n 464 \r\n--> 465 subarr = construct_1d_arraylike_from_scalar(value, len(index), dtype)\r\n 466 \r\n 467 else:\r\n\r\n~\\AppData\\Local\\Continuum\\anaconda3\\envs\\gemgis\\lib\\site-packages\\pandas\\core\\dtypes\\cast.py in construct_1d_arraylike_from_scalar(value, length, dtype)\r\n 1459 value = ensure_str(value)\r\n 1460 \r\n-> 1461 subarr = np.empty(length, dtype=dtype)\r\n 1462 subarr.fill(value)\r\n 1463 \r\n\r\nTypeError: Cannot interpret '<attribute 'dtype' of 'numpy.generic' objects>' as a data type",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RichardScottOZ",
"comment_id": 820807222,
"datetime": 1618530420000,
"masked_author": "username_5",
"text": "pandas 1.0.1 py38he350917_0 conda-forge\r\nnumpy 1.20.2 py38h09042cb_0 conda-forge",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seberg",
"comment_id": 820823017,
"datetime": 1618532800000,
"masked_author": "username_1",
"text": "Please just update your pandas. If you want to stay on `1.0.x` that is fine. `1.0.5` is just a bug-fix release that was reported to also include a fix for this. If 1.0.5 is not sufficient to fix it, let us know so others can find a solution easier.\r\nThe pandas version you list are clearly mentioned as _not_ compatible in the pandas issue.\r\n\r\nIf you don't want to upgrade pandas for whatever reason, you may have to stick to NumPy 1.19.x as well.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "RichardScottOZ",
"comment_id": 820827512,
"datetime": 1618533624000,
"masked_author": "username_5",
"text": "For reference if someone else comes across it - it happened due to installing something else downgrading the pandas version.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "MigueL71994",
"comment_id": 1029684903,
"datetime": 1643954447000,
"masked_author": "username_6",
"text": "pip install --upgrade numpy\r\npip install --upgrade pandas",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dinhtrang24",
"comment_id": 1035990690,
"datetime": 1644569470000,
"masked_author": "username_7",
"text": "Hi, I've already upgrade both pandas (0.24.2) and numpy (1.21.5). When I tried data.info(), it still doesn't work. Any thoughts?\r\n\r\n`---------------------------------------------------------------------------\r\nTypeError Traceback (most recent call last)\r\n<ipython-input-12-6208d269f320> in <module>\r\n----> 1 data.info()\r\n\r\n/opt/apps/apps/binapps/anaconda3/2019.07/lib/python3.7/site-packages/pandas/core/frame.py in info(self, verbose, buf, max_cols, memory_usage, null_counts)\r\n 2503 self.index._is_memory_usage_qualified()):\r\n 2504 size_qualifier = '+'\r\n-> 2505 mem_usage = self.memory_usage(index=True, deep=deep).sum()\r\n 2506 lines.append(\"memory usage: {mem}\\n\".format(\r\n 2507 mem=_sizeof_fmt(mem_usage, size_qualifier)))\r\n\r\n/opt/apps/apps/binapps/anaconda3/2019.07/lib/python3.7/site-packages/pandas/core/frame.py in memory_usage(self, index, deep)\r\n 2597 if index:\r\n 2598 result = Series(self.index.memory_usage(deep=deep),\r\n-> 2599 index=['Index']).append(result)\r\n 2600 return result\r\n 2601 \r\n\r\n/opt/apps/apps/binapps/anaconda3/2019.07/lib/python3.7/site-packages/pandas/core/series.py in __init__(self, data, index, dtype, name, copy, fastpath)\r\n 260 else:\r\n 261 data = sanitize_array(data, index, dtype, copy,\r\n--> 262 raise_cast_failure=True)\r\n 263 \r\n 264 data = SingleBlockManager(data, index, fastpath=True)\r\n\r\n/opt/apps/apps/binapps/anaconda3/2019.07/lib/python3.7/site-packages/pandas/core/internals/construction.py in sanitize_array(data, index, dtype, copy, raise_cast_failure)\r\n 640 \r\n 641 subarr = construct_1d_arraylike_from_scalar(\r\n--> 642 value, len(index), dtype)\r\n 643 \r\n 644 else:\r\n\r\n/opt/apps/apps/binapps/anaconda3/2019.07/lib/python3.7/site-packages/pandas/core/dtypes/cast.py in construct_1d_arraylike_from_scalar(value, length, dtype)\r\n 1185 value = to_str(value)\r\n 1186 \r\n-> 1187 subarr = np.empty(length, dtype=dtype)\r\n 1188 subarr.fill(value)\r\n 1189 \r\n\r\nTypeError: 
Cannot interpret '<attribute 'dtype' of 'numpy.generic' objects>' as a data type\r\n`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jtlz2",
"comment_id": 1039957257,
"datetime": 1644911375000,
"masked_author": "username_8",
"text": "Similarly - this is still an issue for me",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seberg",
"comment_id": 1040603696,
"datetime": 1644948172000,
"masked_author": "username_1",
"text": "The last time I saw someone who updated their pandas version and then still had the problem, had an environment issue that made them pick up the wrong pandas version after all. Please double check `pandas.__version__` and `numpy.__version__` in whatever you are running (e.g. the script) itself?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "levalencia",
"comment_id": 1041664383,
"datetime": 1645025216000,
"masked_author": "username_9",
"text": "I am having the same issue.\r\nOn a new compute instance in Azure ML, I am using Python 3.8 Kernel.\r\n\r\nI checked the versions:\r\npandas 0.25.3\r\nnumpy 1.22.2\r\n\r\n\r\n`.describe `gives the same error;\r\n`TypeError: Cannot interpret '<attribute 'dtype' of 'numpy.generic' objects>' as a data type\r\n`\r\nI also tried:\r\n`scaled_df.select_dtypes(include=[np.number])`\r\n\r\nsame error:\r\n`TypeError: Cannot interpret '<attribute 'dtype' of 'numpy.generic' objects>' as a data type`\r\n\r\n\r\nWhat should I do?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dinhtrang24",
"comment_id": 1063952359,
"datetime": 1646911436000,
"masked_author": "username_7",
"text": "Yes. I have checked both script and terminal again. \r\n\r\npandas == 0.24.2 \r\nnumpy == 1.21.5\r\n\r\nThe error doesn't not disappear",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "seberg",
"comment_id": 1064117711,
"datetime": 1646922232000,
"masked_author": "username_1",
"text": "@username_7 that pandas version is known to be too old. I had asked Luis, because I though 0.24.3 may be new enough (I am not quite sure). You have to either upgrade pandas, since it is an old version, or if you are stuck with such an old pandas version downgrade NumPy.\r\n(Or I suppose apply a patch to pandas, but unless you have very concrete reasons for using that pandas version, you should upgrade.)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dinhtrang24",
"comment_id": 1066656363,
"datetime": 1647256357000,
"masked_author": "username_7",
"text": "@username_1 Hi, thanks for the reply. I figured out the problem is that I have installed different versions of pandas and numpy using pip and pip3. After upgrading and matching their versions, the problem was solved.",
"title": null,
"type": "comment"
}
] | 10 | 26 | 13,723 | false | false | 13,723 | true |
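The recurring failure mode in the numpy/pandas record above is a version mismatch, often caused by picking up libraries from two different environments. A stdlib-only sketch of the compatibility check the commenters apply by hand; the version cutoffs (pandas >= 0.25.3, or >= 1.0.5 on the 1.0.x line, for numpy >= 1.20) are taken from the comments in this record, not from an official support matrix:

```python
def parse_version(v):
    """Parse a 'major.minor.patch' string into a comparable tuple."""
    return tuple(int(part) for part in v.split(".")[:3])

def compatible(numpy_version, pandas_version):
    """Rough check for the dtype-metaclass incompatibility discussed
    in this thread. numpy >= 1.20 changed type(np.dtype), breaking
    pandas releases older than 0.25.3 (and the 1.0.x line before
    1.0.5). Cutoffs come from the thread comments, not IBM/official
    documentation, and may be incomplete."""
    np_v = parse_version(numpy_version)
    pd_v = parse_version(pandas_version)
    if np_v < (1, 20, 0):
        return True  # old numpy: the metaclass change is absent
    if pd_v >= (1, 0, 5):
        return True  # bug-fix release with the workaround
    return (0, 25, 3) <= pd_v < (1, 0, 0)

# The broken conda-forge combination reported above:
print(compatible("1.20.2", "1.0.1"))   # pandas 1.0.1 is too old -> False
print(compatible("1.16.5", "0.24.2"))  # old numpy, old pandas -> True
```

In the thread, the actual fix was usually not the version pair itself but verifying `numpy.__version__` and `pandas.__version__` inside the running interpreter, since mixed pip/pip3/conda installs can load a different version than the one listed.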
GetStream/react-activity-feed | GetStream | 809,383,723 | 144 | {
"number": 144,
"repo": "react-activity-feed",
"user_login": "GetStream"
} | [
{
"action": "opened",
"author": "mahboubii",
"comment_id": null,
"datetime": 1613487440000,
"masked_author": "username_0",
"text": "",
"title": "TS config",
"type": "issue"
}
] | 2 | 2 | 444 | false | true | 0 | false |
delphidabbler/delphidabbler.github.io | null | 639,425,684 | 55 | null | [
{
"action": "opened",
"author": "delphidabbler",
"comment_id": null,
"datetime": 1592291312000,
"masked_author": "username_0",
"text": "Getting many 404s got the old tips pages of form `/tips/99`\r\n\r\nShort term fix would to redirect to Delphi-tips github page using HTML preview. Use temporary redirect code.\r\n\r\nLonger term it would be better to restore the tips, preferably with a public API on the server and a web app on the site.",
"title": "Do redirect for Delphi Tips",
"type": "issue"
},
{
"action": "closed",
"author": "delphidabbler",
"comment_id": null,
"datetime": 1592317912000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 296 | false | false | 296 | false |
Azure-Samples/cognitive-services-speech-sdk | Azure-Samples | 602,931,519 | 592 | null | [
{
"action": "opened",
"author": "viju2008",
"comment_id": null,
"datetime": 1587357220000,
"masked_author": "username_0",
"text": "Dear Sir\r\n\r\nHow can we disable partial results to the Wesocket endpoint. I do not want partial results as I cannot work on the partial results in my domain as every word is important. The whole audio should be processed in one shot and as a whole. I have a high speed network between the SDK and Speech recognition endpoint. And telling the Speech endpoint that whole speech has to be processed I beleive will reduce the processing time and also improce throughput as it can save the compute time spent of making partial results and sending over network.\r\n\r\n\r\nAlso i saw that the audio is being sent in chunks over the network to the WebSocket endpoint at Azure end and not in one single go. Is it possible to specify the chunk size and increase it.\r\n\r\nRegards,",
"title": "Disable Partial results and increase Chunk size of audio being sent over network",
"type": "issue"
},
{
"action": "created",
"author": "oscholz",
"comment_id": 619318112,
"datetime": 1587788642000,
"masked_author": "username_1",
"text": "@username_0 Thank you for getting in touch! I am checking with the feature team on this, and we'll get back to you soon.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "viju2008",
"comment_id": 620089819,
"datetime": 1588004526000,
"masked_author": "username_0",
"text": "Any Updates.\r\n\r\nRegards,\r\n\r\nVijay",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "amitkumarshukla",
"comment_id": 620291018,
"datetime": 1588030664000,
"masked_author": "username_2",
"text": "@username_0 Are you looking for disabling hypothesis ?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "amitkumarshukla",
"comment_id": 620295670,
"datetime": 1588031641000,
"masked_author": "username_2",
"text": "@username_0 From here https://docs.microsoft.com/en-us/dotnet/api/microsoft.cognitiveservices.speech.propertyid?view=azure-dotnet\r\ntry setting this SpeechServiceResponse_StablePartialResultThreshold\r\nlike \r\nconfig.SetProperty(PropertyId.SpeechServiceResponse_StablePartialResultThreshold, \"100\");\r\n\r\nPlay with different numbers and see what fits your requirement.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "viju2008",
"comment_id": 620518426,
"datetime": 1588069455000,
"masked_author": "username_0",
"text": "I tried setting is as told config.setProperty(PropertyId.SpeechServiceResponse_StablePartialResultThreshold, \"200\");\r\n\r\n\r\nBut it is stilling recognizing partially. Below is the output\r\n\r\nRECOGNIZING: Text=I need\r\nRECOGNIZING: Text=I need to\r\nRECOGNIZING: Text=I need to transfer\r\nRECOGNIZING: Text=I need to transfer seventh\r\nRECOGNIZED: Text=I need to transfer 7000 rupees to sham.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "amitkumarshukla",
"comment_id": 620875403,
"datetime": 1588110848000,
"masked_author": "username_2",
"text": "You have option not to subscribe to recognizing event and in that case the sdk wont callback the app for every intermediate. Do let me know if that solves your problem",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "viju2008",
"comment_id": 620972785,
"datetime": 1588131508000,
"masked_author": "username_0",
"text": "Will disabling the event listening for partial results actually stop the speech endpoint or containers from producing partial results and sending them across the network Or will the sdk simply not display it. I wish that the partial results should not travel over the network also",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "amitkumarshukla",
"comment_id": 621348251,
"datetime": 1588180652000,
"masked_author": "username_2",
"text": "@username_0 Even though you disable the event but the intermediate result will travel over the network. Please keep in mind the data which comes from service is a json payload which is text data. The amount of data received is much more lower than the amount of data sent to the service. So I can guarantee you that your data usage will never be a bottleneck for the incoming traffic.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "viju2008",
"comment_id": 621398418,
"datetime": 1588186471000,
"masked_author": "username_0",
"text": "But the purpose of disabling partial results is not served. Its not only about the network overheads also the container need not waste time in making partial results and can in one shot take the hold file and process the same. \r\n\r\nThe Main reason for the post was so that the response time is made better from container by disabling partial results over the network as well as the compute time involved for the container in partial results. As the partial results are not of any use in my case. Hence if the chunk size could be increased and the whole audio is taken and processed in one go the response time of the container will be better in my case",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "amitkumarshukla",
"comment_id": 621469251,
"datetime": 1588195046000,
"masked_author": "username_2",
"text": "@username_0 That is a great input to us. Please put a feature request at https://cognitive.uservoice.com/forums/912208-speech-service\r\n\r\nOur team will look at this and will work on it based on our priority.\r\nThanks a lot.\r\nClosing it for the time being.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "amitkumarshukla",
"comment_id": null,
"datetime": 1588195046000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "viju2008",
"comment_id": 626482073,
"datetime": 1589175724000,
"masked_author": "username_0",
"text": "Hello @username_2 \r\n\r\nI have put a feature request. Please provide update on progress",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jzhw0130",
"comment_id": 644533993,
"datetime": 1592284134000,
"masked_author": "username_3",
"text": "waiting too",
"title": null,
"type": "comment"
}
] | 4 | 14 | 3,555 | false | false | 3,555 | true |
matrix-org/matrix-react-sdk | matrix-org | 805,181,321 | 5,634 | {
"number": 5634,
"repo": "matrix-react-sdk",
"user_login": "matrix-org"
} | [
{
"action": "opened",
"author": "jaiwanth-v",
"comment_id": null,
"datetime": 1612934599000,
"masked_author": "username_0",
"text": "Fixes vector-im/element-web#16273 \r\n\r\n",
"title": "Added loading and disabled the button while searching for server",
"type": "issue"
},
{
"action": "created",
"author": "t3chguy",
"comment_id": 777409052,
"datetime": 1613045618000,
"masked_author": "username_1",
"text": "Thanks for your contribution!",
"title": null,
"type": "comment"
}
] | 2 | 2 | 192 | false | false | 192 | false |
BryanWilhite/songhay-web-components | null | 638,538,667 | 17 | null | [
{
"action": "opened",
"author": "BryanWilhite",
"comment_id": null,
"datetime": 1592196352000,
"masked_author": "username_0",
"text": "the list of thumbs used in my [Angular work](https://github.com/username_0/songhay-ng-workspace#songhayplayer-video-you-tube-project) needs to made more flexible:\r\n\r\n\r\n\r\nit needs to work in Angular _and_ on Web standards (the shadow DOM)\r\n\r\n:book: https://www.grapecity.com/blogs/using-web-components-in-angular",
"title": "proposed component: responsive list of thumbnails",
"type": "issue"
}
] | 1 | 1 | 422 | false | false | 422 | true |
zeebe-io/zeebe | zeebe-io | 816,471,118 | 6,448 | {
"number": 6448,
"repo": "zeebe",
"user_login": "zeebe-io"
} | [
{
"action": "opened",
"author": "npepinpe",
"comment_id": null,
"datetime": 1614262140000,
"masked_author": "username_0",
"text": "## Description\r\n\r\nAs part of #6175, we will need a way to walk the scope hierarchy of variable scopes in order to properly merge variable documents, and this will be done outside of the `VariableState`. The simplest solution is simply to expose `getParentScopeKey` which returns either the parent scope key, or `NO_PARENT` if there is none.\r\n\r\nIf we end up walking this hierarchy a lot, then we could always introduce a more complex abstraction, but this is fine for now.\r\n\r\n## Related issues\r\n\r\n<!-- Which issues are closed by this PR or are related -->\r\n\r\nrelated to #6175 \r\nblocked by #6446\r\n\r\n## Definition of Done\r\n\r\n_Not all items need to be done depending on the issue and the pull request._\r\n\r\nCode changes:\r\n* [x] The changes are backwards compatibility with previous versions\r\n* [ ] If it fixes a bug then PRs are created to [backport](https://github.com/zeebe-io/zeebe/compare/stable/0.24...develop?expand=1&template=backport_template.md&title=[Backport%200.24]) the fix to the last two minor versions. You can trigger a backport by assigning labels (e.g. `backport stable/0.25`) to the PR, in case that fails you need to create backports manually.\r\n\r\nTesting:\r\n* [x] There are unit/integration tests that verify all acceptance criterias of the issue\r\n* [x] New tests are written to ensure backwards compatibility with further versions\r\n* [x] The behavior is tested manually\r\n* [ ] The change has been verified by a QA run\r\n* [ ] The impact of the changes is verified by a benchmark \r\n\r\nDocumentation: \r\n* [ ] The documentation is updated (e.g. BPMN reference, configuration, examples, get-started guides, etc.)\r\n* [ ] New content is added to the [release announcement](https://drive.google.com/drive/u/0/folders/1DTIeswnEEq-NggJ25rm2BsDjcCQpDape)",
"title": "Provide a way to get the parent scope key of a variable scope",
"type": "issue"
},
{
"action": "created",
"author": "npepinpe",
"comment_id": 785921945,
"datetime": 1614262175000,
"masked_author": "username_0",
"text": "This PR currently includes commits from #6446 as it builds on top of it - it'll probably be easier to review once that one is merged.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "npepinpe",
"comment_id": 786560505,
"datetime": 1614335485000,
"masked_author": "username_0",
"text": "bors r+",
"title": null,
"type": "comment"
}
] | 2 | 4 | 2,047 | false | true | 1,898 | false |
thepirat000/Audit.NET | null | 773,514,783 | 350 | null | [
{
"action": "opened",
"author": "Tsukasa42",
"comment_id": null,
"datetime": 1608703940000,
"masked_author": "username_0",
"text": "**Describe the bug**\r\nAssigning your own value to Environment.Exception is overwritten when the event ends.\r\n\r\n**To Reproduce**\r\nscope.Event.Environment.Exception = $\"{ex.GetType().Name}: {ex.Message}\";\r\n\r\n**Expected behavior**\r\nWe should be allowed to set a custom exception message. While testing out the library on MacOS, the exception data is returned as a COMException with no actual details on the error.\r\n\r\n**Libraries (specify the Audit.NET extensions being used including version):**\r\nFor example:\r\n - Audit.Net 16.2.0\r\n\r\n**Target .NET framework:**\r\nFor example:\r\n - .NET Core 3.0",
"title": "Allow Custom Exception Message",
"type": "issue"
},
{
"action": "created",
"author": "thepirat000",
"comment_id": 750788872,
"datetime": 1608795242000,
"masked_author": "username_1",
"text": "Note you can have extra fields on the `AuditEvent` or the `AuditEventEnvironment` by using the `CustomFields` property, for example:\r\n\r\n```c#\r\nscope.Event.Environment.CustomFields[\"ExceptionMessage\"] = $\"{ex.GetType().Name}: {ex.Message}\";\r\n```\r\n\r\nIf you use JSON serialization, that will be serialized as another property of the Environment object",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "thepirat000",
"comment_id": null,
"datetime": 1609648152000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 937 | false | false | 937 | false |
kubernetes/kube-state-metrics | kubernetes | 823,132,945 | 1,407 | {
"number": 1407,
"repo": "kube-state-metrics",
"user_login": "kubernetes"
} | [
{
"action": "opened",
"author": "lilic",
"comment_id": null,
"datetime": 1614953831000,
"masked_author": "username_0",
"text": "This PR brings in the release-2.0 branch into master, to sync master with it.\r\n\r\ncc @username_1 @tariq1890 @mrueg please have a look. I had a lot of conflicts to resolve due to branches diverging so having a closer look might be good!",
"title": "Merge release-2.0 branch into master",
"type": "issue"
},
{
"action": "created",
"author": "lilic",
"comment_id": 791449555,
"datetime": 1614954152000,
"masked_author": "username_0",
"text": "cc @kakkoyun",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "brancz",
"comment_id": 791453482,
"datetime": 1614954535000,
"masked_author": "username_1",
"text": "Awesome!\r\n\r\n/lgtm\r\n/approve",
"title": null,
"type": "comment"
}
] | 3 | 5 | 2,019 | false | true | 269 | true |
dpk/et-book | null | 820,237,622 | 16 | null | [
{
"action": "opened",
"author": "dpk",
"comment_id": null,
"datetime": 1614706716000,
"masked_author": "username_0",
"text": "Need:\r\n\r\n* A version of f with a shorter hook for use in German (and probably Dutch, etc). E.g. in the word Auflage, because of the morpheme boundary (Auf|lage), it is technically wrong to use an fl ligature, and an f with a shorter hook should come before the l instead. Also for combinations like få, fä, fü, the umlaut can be uncomfortably close to the hook of the f (or just collide with it) and a ligature for those combinations is probably the wrong thing to do. Then we’ll also need a version of ff ending in the short hooked version (Stoff|farbe, Schiff|bau), but no fff, thankfully, as there’s always a morpheme boundary there.\r\n* fh, fb (halfback), etc with the crossbar connecting to the following character like in fl … then also ffh (offhand), ffb (offbeat).\r\n* An fj ligature would be nice to have, for better support of languages where that’s a common combination (and loanwords in English like fjord).\r\n\r\nBut I’ll need to get comfortable with glyph editing to add these.\r\n\r\nEB Garamond has a comprehensive set, including feature files that handle language specific rules in some detail.",
"title": "More and better ligatures",
"type": "issue"
},
{
"action": "created",
"author": "dpk",
"comment_id": 789121383,
"datetime": 1614710028000,
"masked_author": "username_0",
"text": "As an example for what the f with shorter hook might look like:\r\n\r\n```xml\r\n<?xml version=\"1.0\"?>\r\n<glyph name=\"f.deu\" format=\"1\">\r\n <advance width=\"321\"/>\r\n <unicode hex=\"0066\"/>\r\n <outline>\r\n <contour>\r\n <point x=\"324\" y=\"680\" type=\"line\" smooth=\"yes\"/>\r\n <point x=\"320.754\" y=\"660.562\"/>\r\n <point x=\"321\" y=\"650\"/>\r\n <point x=\"305\" y=\"649\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"286.011\" y=\"647.813\"/>\r\n <point x=\"278\" y=\"670\"/>\r\n <point x=\"263\" y=\"670\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"226.654\" y=\"670\"/>\r\n <point x=\"204\" y=\"627\"/>\r\n <point x=\"191\" y=\"605\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"180.762\" y=\"587.674\"/>\r\n <point x=\"175.333\" y=\"550.005\"/>\r\n <point x=\"176\" y=\"504\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"177\" y=\"435\" type=\"line\" smooth=\"yes\"/>\r\n <point x=\"177\" y=\"424\"/>\r\n <point x=\"178\" y=\"415\"/>\r\n <point x=\"192\" y=\"415\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"266\" y=\"415\" type=\"line\" smooth=\"yes\"/>\r\n <point x=\"271\" y=\"415\"/>\r\n <point x=\"279\" y=\"411\"/>\r\n <point x=\"280\" y=\"406\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"281\" y=\"405\"/>\r\n <point x=\"287\" y=\"361\"/>\r\n <point x=\"287\" y=\"361\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"286\" y=\"356\"/>\r\n <point x=\"283\" y=\"353\"/>\r\n <point x=\"275\" y=\"353\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"190\" y=\"354\" type=\"line\" smooth=\"yes\"/>\r\n <point x=\"182\" y=\"354\"/>\r\n <point x=\"177\" y=\"345\"/>\r\n <point x=\"177\" y=\"341\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"177\" y=\"128\" type=\"line\" smooth=\"yes\"/>\r\n <point x=\"177\" y=\"33\"/>\r\n <point x=\"181\" y=\"33\"/>\r\n <point x=\"246\" y=\"34\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"258\" y=\"34\"/>\r\n <point x=\"270\" y=\"35\"/>\r\n <point x=\"280\" y=\"34\" type=\"curve\" 
smooth=\"yes\"/>\r\n <point x=\"291\" y=\"33\"/>\r\n <point x=\"292\" y=\"24\"/>\r\n <point x=\"293\" y=\"13\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"293\" y=\"5\"/>\r\n <point x=\"289\" y=\"-3\"/>\r\n <point x=\"284\" y=\"-3\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"220\" y=\"-1\"/>\r\n <point x=\"94\" y=\"0\"/>\r\n <point x=\"32\" y=\"-3\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"26\" y=\"-3\"/>\r\n <point x=\"24\" y=\"6\"/>\r\n <point x=\"24\" y=\"13\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"24\" y=\"16\"/>\r\n <point x=\"24\" y=\"21\"/>\r\n <point x=\"26\" y=\"24\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"29\" y=\"32\"/>\r\n <point x=\"37\" y=\"31\"/>\r\n <point x=\"44\" y=\"31\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"93\" y=\"31\"/>\r\n <point x=\"97\" y=\"51\"/>\r\n <point x=\"97\" y=\"132\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"97\" y=\"340\" type=\"line\" smooth=\"yes\"/>\r\n <point x=\"95\" y=\"351\"/>\r\n <point x=\"91\" y=\"355\"/>\r\n <point x=\"85\" y=\"356\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"22\" y=\"356\" type=\"line\" smooth=\"yes\"/>\r\n <point x=\"15\" y=\"356\"/>\r\n <point x=\"16\" y=\"382\"/>\r\n <point x=\"22\" y=\"386\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"53\" y=\"393\"/>\r\n <point x=\"76\" y=\"401\"/>\r\n <point x=\"97\" y=\"413\" type=\"curve\"/>\r\n <point x=\"97\" y=\"436\"/>\r\n <point x=\"97.685\" y=\"442.015\"/>\r\n <point x=\"99\" y=\"467\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"103\" y=\"543\"/>\r\n <point x=\"118.798\" y=\"596.267\"/>\r\n <point x=\"173\" y=\"656\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"222\" y=\"710\"/>\r\n <point x=\"267\" y=\"726\"/>\r\n <point x=\"295\" y=\"726\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"298\" y=\"726\"/>\r\n <point x=\"318.667\" y=\"725.222\"/>\r\n <point x=\"321\" y=\"719\" type=\"curve\" smooth=\"yes\"/>\r\n <point x=\"324\" y=\"711\"/>\r\n </contour>\r\n </outline>\r\n</glyph>\r\n```\r\n\r\nBut 
this is just a draft glyph drawn up by someone who knows very little about font editing. But it doesn’t collide with l (Auflage) or b (Laufband), for instance",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dpk",
"comment_id": 791665874,
"datetime": 1614975787000,
"masked_author": "username_0",
"text": "fj will probably make it into 2.0 — I just designed it (and dcroat) in all weights and styles 🎉",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dpk",
"comment_id": 792726403,
"datetime": 1615206838000,
"masked_author": "username_0",
"text": "[The specimen from the SB Berlin](https://tw.staatsbibliothek-berlin.de/ma10568) appears to include a shorter-hooked f to use as a basis for the design.",
"title": null,
"type": "comment"
}
] | 1 | 4 | 5,239 | false | false | 5,239 | false |
vitech-team/SDLC | vitech-team | 850,991,814 | 115 | null | [
{
"action": "opened",
"author": "serhiykrupka",
"comment_id": null,
"datetime": 1617687349000,
"masked_author": "username_0",
"text": "Extend JX dashboard with resources which been produced by pipeline.\r\n\r\nDifferent pipelines can generate different reports like junit, owasp, large tests, etc. so users should be able to navigate from JX dashboard/pipeline to report in a user-friendly way.",
"title": "Enhance JX dashboard ",
"type": "issue"
}
] | 1 | 1 | 255 | false | false | 255 | false |
kubernetes-sigs/cluster-api | kubernetes-sigs | 652,405,013 | 3,296 | null | [
{
"action": "opened",
"author": "randomvariable",
"comment_id": null,
"datetime": 1594134753000,
"masked_author": "username_0",
"text": "<!-- NOTE: ⚠️ For larger proposals, we follow the CAEP process as outlined in https://sigs.k8s.io/cluster-api/CONTRIBUTING.md. -->\r\n\r\n**User Story**\r\n\r\nAs a operator, I deploy a cluster in my infrastructure provider. The infrastructure has a CIDR range of 192.168.0.0/24, and I set a pod and service CIDR range of 192.168.0.0/16.\r\n\r\nI see intermittent communication failures, but Cluster API hasn't told me what is wrong.\r\n\r\n**Detailed Description**\r\n\r\nCluster API should inform a user that there is overlap between their infrastructure networking and that for the pods and service CIDRs, such that it could cause communication issues. This ideally would be a condition on the Cluster resource, that infrastructure providers could inform.\r\n\r\n**Anything else you would like to add:**\r\n\r\n[Miscellaneous information that will assist in solving the issue.]\r\n\r\n/kind feature",
"title": "Cluster condition, consumable by Infrastructure Providers for machine IP / pod & service CIDR overlap",
"type": "issue"
},
{
"action": "created",
"author": "vincepri",
"comment_id": 654966027,
"datetime": 1594138007000,
"masked_author": "username_1",
"text": "/milestone v0.3.x\r\n/help",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vincepri",
"comment_id": 668175962,
"datetime": 1596479464000,
"masked_author": "username_1",
"text": "/milestone v0.4.0",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vincepri",
"comment_id": 668176144,
"datetime": 1596479488000,
"masked_author": "username_1",
"text": "/help\r\n/priority important-longterm",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "iamemilio",
"comment_id": 704429728,
"datetime": 1602005119000,
"masked_author": "username_2",
"text": "/assign\r\nI would like to take a look at this",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fabriziopandini",
"comment_id": 704439244,
"datetime": 1602006177000,
"masked_author": "username_3",
"text": "What about having a validation webhook that prevents misconfiguration instead of reporting errors after the cluster is created?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "randomvariable",
"comment_id": 704483359,
"datetime": 1602010373000,
"masked_author": "username_0",
"text": "@username_3 Could do, but would have to be infrastructure specific then, which is fine as well.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "randomvariable",
"comment_id": 704483850,
"datetime": 1602010417000,
"masked_author": "username_0",
"text": "@username_3 would need validation webhooks performed by each infraprovider, since the cluster object has no idea of the infrastructure networks being used.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "randomvariable",
"comment_id": 704484045,
"datetime": 1602010437000,
"masked_author": "username_0",
"text": "Additionally in DHCP environments, even the infra provider doesn't know what the networks are.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "iamemilio",
"comment_id": 704971201,
"datetime": 1602080582000,
"masked_author": "username_2",
"text": "Excuse my Newbieness, but do we need to dynamically check the network CIDR range? For this validation check, why should we not just assume that the CIDR ranges the users entered [here](https://github.com/kubernetes-sigs/cluster-api/blob/7fdecfe013260f6ca4dd5afa1ae72a37766f7506/api/v1alpha3/cluster_types.go#L67) are valid, and just check if they overlap?\r\n\r\nIf we want to validate the CIDR ranges themselves, I think that should be a separate step, and should be handled by the infra providers",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "randomvariable",
"comment_id": 705120414,
"datetime": 1602095770000,
"masked_author": "username_0",
"text": "We don't know the network range of the infrastructure from that information, which is the main problem that pops up. That struct only tells us the CNI networking.\r\n\r\nShould probably be along the lines of comparing Machine.Status.Addresses vs. the CIDRs in the struct above, and add a condition representing a clash.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "iamemilio",
"comment_id": 705581470,
"datetime": 1602164956000,
"masked_author": "username_2",
"text": "Thanks, that makes more sense. My next question would be why a validation webhook? Do we do a simlar things for other validation steps? Do we have a process for validating configs?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "iamemilio",
"comment_id": 705832390,
"datetime": 1602192352000,
"masked_author": "username_2",
"text": "@username_0 for clarification, by infraprovider do you mean the underlying infra, or cluster-api-provider-foo?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "iamemilio",
"comment_id": 705833532,
"datetime": 1602192511000,
"masked_author": "username_2",
"text": "Can we define an interface version of our [crd validation webhooks](https://github.com/kubernetes-sigs/cluster-api/blob/9cbf8be8d643203d99aca6714825abdabcbca0b1/api/v1alpha3/cluster_webhook.go#L64) then have the cluster-api-providers implement it downstream?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "randomvariable",
"comment_id": 708353694,
"datetime": 1602676714000,
"masked_author": "username_0",
"text": "@username_2 There's no guarantee an infra provider knows what the infrastructure network will be ahead of time, so it can't happen via a webhook.\r\n\r\nIf we take vSphere or bare metal as an example, you may very well be putting a machine in an L2 network with some external DHCP. There's nothing on the machine specification that mentions an IP address at that time.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "iamemilio",
"comment_id": 709494404,
"datetime": 1602784749000,
"masked_author": "username_2",
"text": "Given those restrictions, the only way this can really work on all platforms the same way is to check the ip assigned to each machine against the service and pods network CIDRS and throw and error or warning in the machine-controller logs if they overlap. Is that an acceptable solution?\r\n\r\nAnother option to consider is how things are handled in OpenShift. We require admins to do a little legwork upfront and tell us the CIDR of the network our nodes will get their IPs from. This makes the validation easy and upfront. In the CAPI case, this would mean that we would add another optional param to the ClusterNetwork object: `Machines *NetworkRanges`. If this is provided, we can quickly check for overlap with a webhook. The big question is if this is worth adding a parameter to the cluster API.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ncdc",
"comment_id": 709514957,
"datetime": 1602786880000,
"masked_author": "username_4",
"text": "Logs are not user-facing. The only available user-facing status indicator is the conditions array in status. The original description above suggests making this a condition in Cluster status.\r\n\r\nI'll defer to @username_5 @CecileRobertMichon @username_0 etc. for thoughts on the suggested new API field.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "detiber",
"comment_id": 709591038,
"datetime": 1602796082000,
"masked_author": "username_5",
"text": "I'm not necessarily sure I'm a fan of adding a field for this unless we have a comprehensive proposal around it, how it would impact providers, and how it would impact users.\r\n\r\nI want to make sure we aren't creating a situation where it's really easy for a user to shoot themselves in the foot, and that we aren't necessarily imposing requirements on providers that are unreasonable.\r\n\r\nFor example, with AWS, I could easily see us being able to report the network range in use by the VPC (either cluster-api managed or user-provided) in a way that the Cluster resource could verify things (likely not at admission time, but could still short circuit the feedback loop on conditions before things get too far along). I'm not sure how feasible that approach would be for vSphere, metal3, packet, or other types of bare metal'ish providers.\r\n\r\nAlso depending on the use case we are talking about (such as a managed service provider model), it may not be reasonable to expect the end user to know what network address block will be used in advance.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "randomvariable",
"comment_id": 710026184,
"datetime": 1602852720000,
"masked_author": "username_0",
"text": "Agree with @username_5 .\r\n\r\nJust to clarify the use case for having a condition on the cluster object:\r\n\r\nHaving a tool like the at-the-glance status in https://github.com/kubernetes-sigs/cluster-api/issues/3802 could show the problem across infrastructure providers.\r\n\r\nConcretely, just thinking of a defined `ConditionType` constant for the condition and leaving the rest to infra providers.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fabriziopandini",
"comment_id": 760508293,
"datetime": 1610662289000,
"masked_author": "username_3",
"text": "/remove-lifecycle stale",
"title": null,
"type": "comment"
}
] | 8 | 27 | 7,791 | false | true | 5,755 | true |
mgoltzsche/helm-kustomize-plugin | null | 647,316,106 | 1 | null | [
{
"action": "opened",
"author": "daurnimator",
"comment_id": null,
"datetime": 1593432373000,
"masked_author": "username_0",
"text": "To ensure that a chart doesn't change out from under me, I'd like to include checksums in my generator (the same checksum you'd see in requirements.lock)",
"title": "Support specifying checksum(s)",
"type": "issue"
},
{
"action": "created",
"author": "mgoltzsche",
"comment_id": 651249649,
"datetime": 1593450721000,
"masked_author": "username_1",
"text": "Makes sense!\r\nWould you like to create a PR?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "scjudd",
"comment_id": 657723137,
"datetime": 1594665344000,
"masked_author": "username_2",
"text": "Could most of [`helm.LoadChart`](https://github.com/username_1/helm-kustomize-plugin/blob/master/pkg/helm/helm.go#L235-L262) be replaced with a call to [`man.Build()`](https://github.com/username_1/helm-kustomize-plugin/blob/master/vendor/k8s.io/helm/pkg/downloader/manager.go#L62-L103)? From looking over this a bit today, it seems that `LoadChart` is doing a subset of the work of Helm's download manager's `Build` method.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mgoltzsche",
"comment_id": 657739996,
"datetime": 1594667484000,
"masked_author": "username_1",
"text": "`helm.LoadChart` is also downloading dependencies while `man.Build()` isn't if I am not mistaken but you're right: this can probably be solved more elegantly or rather without copying code that helm provides anyway. I think I simply copied it from [ContainerSolutions/helm-convert](https://github.com/ContainerSolutions/helm-convert/blob/v0.5.1/pkg/helm/helm.go).\r\n\r\nAlso this repo has a [git_chart branch](https://github.com/username_1/helm-kustomize-plugin/tree/git_chart) which provides an additional but hacky feature: Referencing helm charts using go-getters and excluding particular resources like in [this example](https://github.com/username_1/kustomizations/blob/master/linkerd/base/linkerd-chart.yaml) where I fetched linkerd's helm chart from its git repo because they don't publish it but ship their own CLI that uses it. However I did not merge the feature into master because this is something you usually don't want to do.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "scjudd",
"comment_id": 657749613,
"datetime": 1594668739000,
"masked_author": "username_2",
"text": "Right, I didn't consider the initial download/unpack logic.\n\nWell, I think I'll take a stab at this. :)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mgoltzsche",
"comment_id": 657818729,
"datetime": 1594678258000,
"masked_author": "username_1",
"text": "@username_2 great!\r\nTo avoid potential conflicts and since I was using the go getter and resource filter features already I merged my very long-lived feature branch into master now.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mgoltzsche",
"comment_id": 699110283,
"datetime": 1601061714000,
"masked_author": "username_1",
"text": "Solved by PR #5",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mgoltzsche",
"comment_id": null,
"datetime": 1601061714000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "reopened",
"author": "mgoltzsche",
"comment_id": null,
"datetime": 1601576578000,
"masked_author": "username_1",
"text": "To ensure that a chart doesn't change out from under me, I'd like to include checksums in my generator (the same checksum you'd see in requirements.lock)",
"title": "Support specifying checksum(s)",
"type": "issue"
},
{
"action": "created",
"author": "mgoltzsche",
"comment_id": 702330679,
"datetime": 1601578285000,
"masked_author": "username_1",
"text": "I have to reopen this to clarify some requirements because I have to change some of the logic when upgrading to helm 3. The features for helm 2 done in PR #5 can be tested in release v0.9.2.\r\n\r\nPeople (including me) were confused about the actual feature that should be implemented:\r\n* Supporting `requirements.lock`: As I understand this is to lock to particular chart versions (in case the requirements.yaml specifies version ranges). In addition PR #5 introduced the field `lockFile` in the generator config so that you could point it to an existing `requirements.lock` - here it was unclear how the user could easily update it and what its purpose is. Do you think this is needed and what would be your use case? Do you need it because you want to refer to a chart by version range within the generator config?\r\n* Verifying an actual chart: this can be done using the generator config fields `verify` (bool) and `keyring` - should work but I didn't try it yet.\r\n\r\nThe changes introduced in PR #5 unfortunately don't work this way with the helm 3 code anymore (since e.g. `HashReq()` is inside a helm-internal package, see WIP PR #8). Therefore before I remove or reimplement the feature I need to ask if you really need to specify a lock file for a remote chart without specifying a local umbrella chart or rather if you need the `lockFile` field within the generator config? 
(if it is just about version ranges you could as well specify that single version and don't need another `requirements.lock` for it)\r\ncc @username_0 @username_2 @james-callahan\r\n\r\n_If you need the `lockFile` field I'd generate a temporary chart and place the chart reference from the generator config file into the `requirements.yaml` file of that temporary chart and nest values correspondingly to make helm deal with the `requirements.lock` and copy it back and forth to the project directory so that the user doesn't need other tools and knowledge in order to create that lock file initially - other opinions are welcome._",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "daurnimator",
"comment_id": 725230234,
"datetime": 1605076247000,
"masked_author": "username_0",
"text": "@username_1 what I want to do is lock to a particular version of a chart, so that what I audit and test locally (e.g. cert-manager 1.0.1) is the exact same thing that gets deployed onto my cluster.\r\n\r\n`verify` and `keyring` only check that it was signed by a given author; but I don't trust the author to not push out a malicious update in future overwriting their previous chart.\r\n\r\nI was under the impression that the checksums in `requirements.lock` would protect against this issue, but perhaps not?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mgoltzsche",
"comment_id": 725687147,
"datetime": 1605132325000,
"masked_author": "username_1",
"text": "That's not what the digest within the `requirements.lock` is for. As I understand it is to make the user rebuild her chart dependencies after she changed `requirements.yaml` (no idea why helm doesn't do it for us). You can see [here](https://github.com/helm/helm/blob/d481bc6cf32adf8a4684d58ae97002a9d7e70223/internal/resolver/resolver.go#L144) that the digest is just the hash of `requirements.yaml`.\r\nI don't think helm supports the feature you're looking for.\r\nHowever I think preventing redeployment would be very bad for desaster recovery.\r\n\r\nInstead I recommend to use [kpt](https://github.com/GoogleContainerTools/kpt) to render a helm chart into a git repo using a kpt function/sink and commit and push the result (see [here](https://opensource.googleblog.com/2020/03/kpt-packaging-up-your-kubernetes.html)). Your CD pipeline (`kpt live apply .`) can always deploy the plain rendered manifests from the repo and you can reliably audit what gets deployed. Also you CD pipeline doesn't depend on rendering technology like helm or kustomize and availability of their repositories (kpt can also manage other (helm/kustomize/...) repositories within your repository).\r\nThere is also an official helm-template kpt function.\r\n\r\nSoon I ll ship this project as kpt function/sink container image as well since it supports a few more features. As I understand, since they originate from kustomize, kpt functions will soon also be available within kustomize directly and eventually replace kustomize' plugins as they are today - the latter is just me guessing.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mgoltzsche",
"comment_id": 744085228,
"datetime": 1607901153000,
"masked_author": "username_1",
"text": "I am closing this since `requirements.lock` is supported for local charts but does not guarantee that a chart does not change on the server.\r\n\r\n@username_0 you can guarantee that manifests don't change after you have audited them using the [kpt functionality](https://github.com/username_1/khelm#kpt-function) for which I've also prepared an [example](https://github.com/username_1/khelm/tree/master/example/kpt/cert-manager) - though in practice you would also commit `example/kpt/cert-manager/static/generated-manifest.yaml` with the repository to be sure it doesn't change after you audited it and avoid dependencies within the CD pipeline (I just didn't do that to avoid polluting the repo for the sake of an example).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mgoltzsche",
"comment_id": null,
"datetime": 1607901153000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "mgoltzsche",
"comment_id": 792324543,
"datetime": 1615140036000,
"masked_author": "username_1",
"text": "fyi: there is now an [example](https://github.com/username_1/khelm/tree/master/example/kpt/linkerd) that shows how to pull in charts from other git repositories as kpt dependencies and render them with khelm.",
"title": null,
"type": "comment"
}
] | 3 | 15 | 6,999 | false | false | 6,999 | true |
henriquehbr/svelte-typewriter | null | 762,737,045 | 31 | null | [
{
"action": "opened",
"author": "evdama",
"comment_id": null,
"datetime": 1607711682000,
"masked_author": "username_0",
"text": "If you put a `&` in your text for example then you get a jumping left/right effect which is bad\r\n\r\n```js\r\n<script>\r\n\timport Typewriter from 'svelte-typewriter'\r\n</script>\r\n\r\n<Typewriter loop interval={ 100 }>\r\n <div>normal text doesn't jump</div>\r\n <div>text with ambersand & then some more</div> // <--- jumping effect when processing &\r\n</Typewriter>\r\n```\r\n\r\n\r\nRepl to reproduce https://svelte.dev/repl/5698397692c54c2bb5fd03f5ea6ca8cc?version=3.31.0",
"title": "text with ambersand 'jumps'",
"type": "issue"
},
{
"action": "created",
"author": "henriquehbr",
"comment_id": 743445420,
"datetime": 1607723463000,
"masked_author": "username_1",
"text": "Indeed, just checked the repl, at the moment i'm solving conflicts caused by breaking changes on the dependencies, but i'll look into it as soon as possible\r\n\r\nMany thanks for reporting this!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "henriquehbr",
"comment_id": null,
"datetime": 1608039888000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "henriquehbr",
"comment_id": 745297785,
"datetime": 1608040184000,
"masked_author": "username_1",
"text": "A fix for this bug has been released on [`2.4.4`](https://github.com/username_1/svelte-typewriter/blob/v2.4.4/CHANGELOG.md#v244)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "evdama",
"comment_id": 745410769,
"datetime": 1608050219000,
"masked_author": "username_0",
"text": "Excellent, I just checked and yes, it's fixed!\r\nThanks a lot for your time and effort... it's much appreciated for this awesome package 👍",
"title": null,
"type": "comment"
}
] | 2 | 5 | 919 | false | false | 919 | true |
EasyCorp/EasyAdminBundle | EasyCorp | 708,540,561 | 3,808 | {
"number": 3808,
"repo": "EasyAdminBundle",
"user_login": "EasyCorp"
} | [
{
"action": "opened",
"author": "horlyk",
"comment_id": null,
"datetime": 1600991267000,
"masked_author": "username_0",
"text": "There is a problem for the nested collections when the **Add** button stops working.\r\n\r\nMy case: I have a main entity **A**. This entity should contain a collection of entities **B**. Entity **B** should contain a collection of entities **C**. **Add** button works well for adding new entities **B**, but for **C** it stops working.\r\n\r\nI've adjusted JS to work with any level of nesting collections.\r\n\r\nPS: there is one more bug with `prototype_name` option. If you leave it as a default `__name__`, the script will work not correctly. For example: `entity[__name__]colletionItem[__name__]field`, will be replaced for both, even though it should be replaced only for the first or the second.",
"title": "Fixed form-type-collection script for nested collections",
"type": "issue"
},
{
"action": "created",
"author": "horlyk",
"comment_id": 698883217,
"datetime": 1601034244000,
"masked_author": "username_0",
"text": "There is an issue on edit form. Will try to fix.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "horlyk",
"comment_id": 698918719,
"datetime": 1601039365000,
"masked_author": "username_0",
"text": "Fixed in a quick way. Actually, to achieve better results, add button and a collection itself should have some relation. Current code is not flexible enough :(",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "javiereguiluz",
"comment_id": 699493114,
"datetime": 1601125473000,
"masked_author": "username_1",
"text": "Thanks for fixing this bug! Sadly the form collections and the JS code is a bit weak, but thanks to your improvements and future changes we might do, this will improve. Thanks!",
"title": null,
"type": "comment"
}
] | 2 | 4 | 1,074 | false | false | 1,074 | false |
hasadna/anyway | hasadna | 659,272,207 | 1,390 | null | [
{
"action": "opened",
"author": "Mano3",
"comment_id": null,
"datetime": 1594993725000,
"masked_author": "username_0",
"text": "Since @username_1 incredible refactor, a lot of files became deprecated, there are also some functions that are deprecated in files that are being used.\r\nI would appreciate if only code that is usable in production (via the current scraping flow) would remain in the codebase, any legacy code has got to go.\r\nFor example, as far as I understand news_flash folder is deprecated and everything is handled via location_extraction.py and news_flash_*.py files in the parsers folder. This is just an example, there are more legacy files in this folder regarding news flash parsing.\r\nMaybe they all should be moved to a newly created news_flash for organization of files.",
"title": "Remove legacy code from news flash parser folder",
"type": "issue"
},
{
"action": "created",
"author": "elazarg",
"comment_id": 662955394,
"datetime": 1595503874000,
"masked_author": "username_1",
"text": "Oh. I thought I have removed them already.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Mano3",
"comment_id": null,
"datetime": 1595605025000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 704 | false | false | 704 | true |
RedHatInsights/insights-results-smart-proxy | RedHatInsights | 723,157,003 | 314 | {
"number": 314,
"repo": "insights-results-smart-proxy",
"user_login": "RedHatInsights"
} | [
{
"action": "opened",
"author": "Sergey1011010",
"comment_id": null,
"datetime": 1602849633000,
"masked_author": "username_0",
"text": "## Type of change\r\n\r\nPlease delete options that are not relevant.\r\n\r\n- Bug fix (non-breaking change which fixes an issue)\r\n- Refactor (refactoring code, removing useless files)\r\n\r\n## Testing steps\r\n\r\n`make before_commit`",
"title": "Some refactoring",
"type": "issue"
}
] | 2 | 2 | 220 | false | true | 220 | false |
harryjubb/quacker | null | 845,433,453 | 15 | null | [
{
"action": "opened",
"author": "harryjubb",
"comment_id": null,
"datetime": 1617145738000,
"masked_author": "username_0",
"text": "**Is your feature request related to a problem? Please describe.**\r\nA group of shortcuts may be related, e.g. a set of shortcuts for window resizing. It would be useful to be able to organise these and collapse them independently\r\n\r\n**Describe the solution you'd like**\r\nOne-level folder-type hierarchy to organise shortcut cards within.\r\n\r\n**Describe alternatives you've considered**\r\nCould do this implicitly by adding re-ordering, but not as nice.\r\n\r\n**Additional context**\r\nN/A",
"title": "Ability to group related shortcuts",
"type": "issue"
}
] | 1 | 1 | 481 | false | false | 481 | false |
iterative/dvc.org | iterative | 798,025,440 | 2,130 | {
"number": 2130,
"repo": "dvc.org",
"user_login": "iterative"
} | [
{
"action": "opened",
"author": "jorgeorpinel",
"comment_id": null,
"datetime": 1612161959000,
"masked_author": "username_0",
"text": "- [ ] UPDATE: After this, (make 1+ PRs) from #2132",
"title": "remote: terminology and sample standardization",
"type": "issue"
},
{
"action": "created",
"author": "jorgeorpinel",
"comment_id": 771855030,
"datetime": 1612289437000,
"masked_author": "username_0",
"text": "These old changes don't carry > 2.0 code (from master). I apply them to `v1` first because otherwise in the process of merging to master (if there are conflicts) the branch could get \"contaminated\" with >2.0 code, and it would be harder to backport.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 299 | false | false | 299 | false |
rancher/rancher | rancher | 734,760,493 | 29,906 | null | [
{
"action": "opened",
"author": "rmweir",
"comment_id": null,
"datetime": 1604345050000,
"masked_author": "username_0",
"text": "<!--\r\nPlease search for existing issues first, then read https://rancher.com/docs/rancher/v2.x/en/contributing/#bugs-issues-or-questions to see what we expect in an issue\r\nFor security issues, please email security@rancher.com instead of posting a public issue in GitHub. You may (but are not required to) use the GPG key located on Keybase.\r\n-->\r\n\r\n**What kind of request is this (question/bug/enhancement/feature request):**\r\nenhancement\r\n\r\n**Steps to reproduce (least amount of steps as possible):**\r\n1. Create or register private EKS cluster (a cluster that only has Private Access enabled).\r\n\r\n**Result:**\r\nError: \"Failed to communicate with cluster.\r\nEven if a cluster is created as Public/Private then has public disabled, the agent will connect but health checks will error.\r\n\r\n**Desire:**\r\nRegistering a private EKS cluster should work and rancher should be able to communicate with it.\r\nCreating a private EKS cluster should work. This flow will still require the user to use a command to deploy the agent, similar to regular imported clusters, described here: https://github.com/rancher/rancher/issues/28356.\r\n\r\n**Notes:**\r\nIt is possible to import a private cluster through regular import. This is because imported clusters tunnel their kubernetes requests to the control plane node and communicate with the kubernetes service IP instead of the public endpoint. The solution for this should probably use this same approach for EKS clusters. Requests should be tunneled and instead of using the public Kube API endpoint, the kubernetes service ip should be used.\r\n\r\nThis is blocking: https://github.com/rancher/rancher/issues/28356",
"title": "[EKSv2]: Rancher should be able to fully connect to created/register private EKS clusters",
"type": "issue"
},
{
"action": "closed",
"author": "rmweir",
"comment_id": null,
"datetime": 1604347668000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 1,643 | false | false | 1,643 | false |
openshift/cluster-network-operator | openshift | 723,966,444 | 840 | {
"number": 840,
"repo": "cluster-network-operator",
"user_login": "openshift"
} | [
{
"action": "opened",
"author": "rcarrillocruz",
"comment_id": null,
"datetime": 1603013298000,
"masked_author": "username_0",
"text": "",
"title": "Bump dependencies of k8s to 0.19",
"type": "issue"
},
{
"action": "created",
"author": "rcarrillocruz",
"comment_id": 711803715,
"datetime": 1603094889000,
"masked_author": "username_0",
"text": "We need to land https://github.com/openshift/release/pull/12907 , unit tests are running on golang 1.13 thus the crypto failures as those symbols naming changed.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rcarrillocruz",
"comment_id": 712339018,
"datetime": 1603130170000,
"masked_author": "username_0",
"text": "/retest",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rcarrillocruz",
"comment_id": 712347213,
"datetime": 1603130696000,
"masked_author": "username_0",
"text": "/assign @username_1",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "danwinship",
"comment_id": 712356758,
"datetime": 1603131539000,
"masked_author": "username_1",
"text": "/lgtm\r\nif it passes enough tests to merge...",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "juanluisvaladas",
"comment_id": 712783984,
"datetime": 1603193289000,
"masked_author": "username_2",
"text": "/lgtm",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rcarrillocruz",
"comment_id": 712884203,
"datetime": 1603203284000,
"masked_author": "username_0",
"text": "/override ci/prow/e2e-aws-sdn-multi",
"title": null,
"type": "comment"
}
] | 6 | 153 | 329,077 | false | true | 272 | true |
DemocracyClub/UK-Polling-Stations | DemocracyClub | 574,789,516 | 2,638 | null | [
{
"action": "opened",
"author": "polling-bot-4000",
"comment_id": null,
"datetime": 1583252991000,
"masked_author": "username_0",
"text": "EMS: Xpress DC\nFiles:\n- `E06000046/2020-03-03T16:29:11.988826/iow.gov.uk-1583251766000-.tsv`",
"title": "Import E06000046-Isle of Wight Council",
"type": "issue"
},
{
"action": "closed",
"author": "chris48s",
"comment_id": null,
"datetime": 1585562701000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 92 | false | false | 92 | false |
openlawteam/molochv3-ui | openlawteam | 780,508,350 | 20 | null | [
{
"action": "opened",
"author": "jtrein",
"comment_id": null,
"datetime": 1609935072000,
"masked_author": "username_0",
"text": "_Jotting some initial notes down._\r\n\r\nWe need to make standard the way we submit to Moloch and Snapshot as many actions involve both happening in \"parallel\".\r\n\r\n- [ ] Submitting proposals should be handled by a service which encapsulates other smaller services for submitting to Moloch and Snapshot. Hook?\r\n- [ ] Retry Snapshot (`HTTP`) if failed. Use a hook (search for a good one we can adapt from the interwebz)?\r\n- [ ] Handle case where internet cuts out (etc.) and one succeeds and the other doesn't/can't. How to nicely pick up where user left off?\r\n- [ ] Unit tests\r\n\r\ncc: @jdville03",
"title": "Standard for Moloch + Snapshot proposal submission",
"type": "issue"
},
{
"action": "created",
"author": "jtrein",
"comment_id": 790710651,
"datetime": 1614872561000,
"masked_author": "username_0",
"text": "Mostly handled as per the initial proposal flow's with onboarding and transfer. Will open new issues for improvements related to redundancies.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "jtrein",
"comment_id": null,
"datetime": 1614872561000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 732 | false | false | 732 | false |
trailofbits/blight | trailofbits | 765,825,588 | 15,424 | null | [
{
"action": "opened",
"author": "13140772047",
"comment_id": null,
"datetime": 1607915961000,
"masked_author": "username_0",
"text": "兴海找妹子包夜服务美女【+V:10771909】靓妹兴海妹子多少钱一晚上门服务(十微IO77I9O9)由芒果、银河酷娱联合出品,华晨美创承制的甜宠爱情偶像剧《奈何又如何》于近日在长沙隆重开机。导演吴强,主演宣璐、赵志伟、金雯昕、刘胤君、刘宥畅、王倩等齐齐亮相,各出品方、承制方代表也出席了开机仪式。“甜宠爆款制造机”导演打造超强热门,准备好你的少女心!曾执导《克拉恋人》、《双世宠妃》、《奈何要娶我》、《人间至味是清欢》的导演吴强,操刀多部爆款偶像剧,作品收视与播放量都表现出众。吴导擅长从细节之处出发,将演员的个人特质用观众喜爱的方式表达,打造出广受好评的剧集,足见导演对于女性情感的细腻感知和精准把握。当热门遭遇“甜宠爆款制造”团队,一定会击中你的少女心!开机现场,导演吴强表达了对新剧的强烈期待,面对长沙突降的冬雨,导演不仅用激动人心的话鼓舞了开机士气,还暖心地提醒大家长沙的冬天是“魔法攻击”,需多注意防寒保暖。“叮嘱式发言”提前将全体剧组人员带入了大家庭的温馨氛围。宣璐赵志伟首次合作,演绎超强职场《陈情令》中温柔端庄的“师姐”宣璐摇身一变,成为职场白骨精聂星辰;《我只喜欢你》中的暖心哥哥赵志伟把头发梳成大人的样子,化身极致挑剔强迫症霸总严景致。两位主演一改以往的风格,演绎史上最强与秘书组合,职场上惺惺相惜,情场上棋逢对手,玩转高智商职场爱恋。超强的超甜爱恋之下,涌动着旧日情感纠葛的秘密。初次见面,聂星辰就对“难搞”严景致的日常习惯了如指掌,一次出击住总裁办全场。究竟是什么让初次见面的二人似乎相识多年?蜜恋开始之后,霸总屡出,聂星辰如何见招拆招,出奇制胜,再获霸总心?另类多元视角呈现爱情新样本除了有强迫症霸总和完美特助,剧中还将打造多组:刘胤君饰演的花花公子赵远方和王倩饰演的漂亮女明星甄念从相互看不顺眼到最终相爱,两人学会了在爱情里面对最真实的自己。金雯昕饰演的元气追星女孩儿童欣苦恋帅气医生,一对颇具反差萌的情侣,在带给大家搞笑逗趣的同时,让人体验别样的甜蜜与宠爱。在一波多折的寻爱之路上,恋爱中的男女们共同成长蜕变。三对风格迥异的,浓缩出了当代都市爱恋现状的经典模式,全方位满足了观众们对爱情的不同想象。热门、超强编剧,加上“最懂少女心”导演的加持,搭配契合度的制片班底。全剧以轻松幽默的风格将职场、甜宠、失忆等流行元素融合,力图打造又一部浪漫甜蜜、少女心爆棚的偶像剧。甜宠之外,本剧还聚焦于都市青年的职场与爱情,展现当代青年人面对困难时努力克服的勇气与决心。声明:中华娱乐网刊载此文出于传递更多信息之目的,并非意味着赞同其观点或证实其描述。版权归作者所有,更多同类文章敬请浏览:综合资讯防较殉煌椅幽毡倘戮笛问陀炕孕倘https://github.com/trailofbits/blight/issues/8148?cbnid <br />https://github.com/trailofbits/blight/issues/5501?dunnj <br />https://github.com/trailofbits/blight/issues/2830?24314 <br />https://github.com/trailofbits/blight/issues/2254?32305 <br />https://github.com/trailofbits/blight/issues/13204?38532 <br />wlyqhgiabkjnmvgfqkviwfxpy",
"title": "兴海哪有特殊服务的洗浴-济南生活圈",
"type": "issue"
}
] | 1 | 1 | 1,458 | false | false | 1,458 | false |
shiftleft-staging/shiftleft-java-demo | null | 858,023,770 | 1 | {
"number": 1,
"repo": "shiftleft-java-demo",
"user_login": "shiftleft-staging"
} | [
{
"action": "opened",
"author": "shiftleft-staging",
"comment_id": null,
"datetime": 1618414902000,
"masked_author": "username_0",
"text": "\n\nThis pull request adds a GitHub Action workflow file that executes ShiftLeft NextGen SAST (NG SAST) on this PR. Once merged, it will also execute NG SAST on all future PRs opened in this repo.\n\n### Visit [shiftleft.io](https://www.stg.shiftleft.io/findingsSummary/shiftleft-java-demo?apps=shiftleft-java-demo&isApp=1) to see the security findings for this repository.\n\n## We've done a few things on your behalf\n\n- Forked this demo application and opened a pull request\n- Generated a unique secret `SHIFTLEFT_ACCESS_TOKEN` to allow GitHub Actions in this repository to communicate with the ShiftLeft API\n- Created a [GitHub Action](https://github.com/username_0/shiftleft-java-demo/blob/demo-branch-1618414897/.github/workflows/shiftleft.yml) that will send this pull request to ShiftLeft for analysis\n- Added a status check that displays the result of the GitHub Action\n\nQuestions? Comments? Want to learn more? [Get in touch with us](https://www.shiftleft.io/contact/) or check out [our documentation](https://docs.shiftleft.io).",
"title": "Add GitHub Action: ShiftLeft NextGen Static Analysis",
"type": "issue"
}
] | 2 | 2 | 1,672 | false | true | 1,101 | true |
rht-labs/ubiquitous-journey | rht-labs | 719,996,267 | 194 | {
"number": 194,
"repo": "ubiquitous-journey",
"user_login": "rht-labs"
} | [
{
"action": "opened",
"author": "mvmaestri",
"comment_id": null,
"datetime": 1602577474000,
"masked_author": "username_0",
"text": "An app of apps for ready-to-use pipelines examples. The chart includes an example of a pipeline developed in Tekton, the peaceful cat 🐈. It contains the main steps of a continuous software delivery process. It enforces a strict semantic version validation strategy, managing tag increments for you. Develop, features, releases, patches and hotfixes flows are supported.\r\n\r\n",
"title": "UJ-pipelines with tekton-demo",
"type": "issue"
},
{
"action": "created",
"author": "springdo",
"comment_id": 981880987,
"datetime": 1638209025000,
"masked_author": "username_1",
"text": "I think given this tekton starter is already in the helm-charts repo in CoP land where it will get more exposure and this repo is supposed to be more focused on just gitOps as a method for CD, we should close it out",
"title": null,
"type": "comment"
}
] | 2 | 2 | 700 | false | false | 700 | false |
stealthcopter/AndroidNetworkTools | null | 730,199,810 | 78 | null | [
{
"action": "opened",
"author": "babajasoos",
"comment_id": null,
"datetime": 1603784320000,
"masked_author": "username_0",
"text": "",
"title": "gethostname should return name instead of ip address again",
"type": "issue"
},
{
"action": "created",
"author": "ForionsAcc",
"comment_id": 718886736,
"datetime": 1603990581000,
"masked_author": "username_1",
"text": "Same here, public addresses works fine but not local.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ForionsAcc",
"comment_id": 720977866,
"datetime": 1604392661000,
"masked_author": "username_1",
"text": "Hi, this has to do with the DHCP and DNS server on your local router, is there any other way to get the device name, so its not depended on the router configs.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 212 | false | false | 212 | false |
kubernetes-sigs/cluster-api | kubernetes-sigs | 709,123,123 | 3,696 | null | [
{
"action": "opened",
"author": "sedefsavas",
"comment_id": null,
"datetime": 1601054009000,
"masked_author": "username_0",
"text": "Currently, we are using Kind v0.7.1 for CAPD and e2e tests.\r\nKind v0.9.0 came out recently, we may want to wait until v0.9.1.",
"title": "Update Kind to v0.9.x for CAPD and e2e tests",
"type": "issue"
},
{
"action": "created",
"author": "vincepri",
"comment_id": 699048828,
"datetime": 1601054051000,
"masked_author": "username_1",
"text": "/milestone v0.4.0",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sedefsavas",
"comment_id": 699057018,
"datetime": 1601055052000,
"masked_author": "username_0",
"text": "/good-first-issue\r\n/help",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "evalsocket",
"comment_id": 699663708,
"datetime": 1601227755000,
"masked_author": "username_2",
"text": "/assign @username_2",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "tcordeu",
"comment_id": 702234240,
"datetime": 1601567988000,
"masked_author": "username_3",
"text": "Hey @username_2 can I work on these one? If so, would you mind helping me?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "evalsocket",
"comment_id": 702236020,
"datetime": 1601568167000,
"masked_author": "username_2",
"text": "/unassign @username_2 \n/assign @username_3",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "tcordeu",
"comment_id": 703132064,
"datetime": 1601743899000,
"masked_author": "username_3",
"text": "As Kind v0.9.0 depends on apimachinery v0.18.8 we should wait on https://github.com/kubernetes-sigs/cluster-api/pull/3735 to be merged (or to at least be stable), right?\r\ncc @username_1 \r\n\r\nPS: Sorry for the questions, this is my first issue.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "fabriziopandini",
"comment_id": 707784297,
"datetime": 1602599901000,
"masked_author": "username_4",
"text": "@username_3 #3735 is merged now. Are you still up to work on this issue\r\n?",
"title": null,
"type": "comment"
}
] | 6 | 10 | 966 | false | true | 609 | true |
NCAR/VAPOR | NCAR | 836,204,623 | 2,656 | null | [
{
"action": "opened",
"author": "clyne",
"comment_id": null,
"datetime": 1616173722000,
"masked_author": "username_0",
"text": "There is a setting on the Preferences menu to set (and lock) the window size. This is particularly useful for people who want to capture a number of outputs images, often over numerous session, and want the images to be a consistent size. However, the size specification applies to the overall application window, not the renderer window, limiting the usability of the feature.",
"title": "Window size feature should apply to visualization window, not entire application",
"type": "issue"
},
{
"action": "created",
"author": "StasJ",
"comment_id": 803039573,
"datetime": 1616179643000,
"masked_author": "username_1",
"text": "@username_0 That is why we have the \"Custom Output Size\" option under Viewpoint. I don't know what the point of the original option you are referring to is for, we could just remove it.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "clyne",
"comment_id": 803150835,
"datetime": 1616190218000,
"masked_author": "username_0",
"text": "We should probably discuss whether it makes sense to have both of these? The FrameBuffer control is clearly much more useful, but not sure we should do away with the viewpoint preference. \r\n\r\ncc: @username_3 , @username_2",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "shaomeng",
"comment_id": 803216297,
"datetime": 1616203546000,
"masked_author": "username_2",
"text": "yeah, it seems redundant to me. I'd vote to remove the options under global preference menu, and keep the one that's under viewpoint.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sgpearse",
"comment_id": 804302667,
"datetime": 1616438480000,
"masked_author": "username_3",
"text": "Removing it is fine with me.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "clyne",
"comment_id": null,
"datetime": 1616551257000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 4 | 6 | 935 | false | false | 935 | true |
campsych/concerto-platform | campsych | 663,285,020 | 294 | null | [
{
"action": "opened",
"author": "csancineto",
"comment_id": null,
"datetime": 1595363012000,
"masked_author": "username_0",
"text": "### Concerto Platform version\r\n5.0.13\r\n\r\n### Expected behavior\r\nText using unicode to be shown correctly \r\n\r\n### Actual behavior\r\nAccented texts are displayed incorrectly. The templates are in UTF-8 but they are shown in a different codification.\r\n\r\n### Steps to reproduce the issue\r\nI'm using Concerto installed in my University's server. When I use the internal container databases, the test is presented correctly. When I use external sql databases (to improve the performance), all the texts are presented incorrectly when I run the tests. When the templates are opened internally, all the texts are in UTF-8; when presented, the same templates are corrupt. \r\n\r\nA simple test that shows the problem is in the following link:\r\nhttps://concerto.sites.ufsc.br/test/teste\r\n\r\n<img width=\"1182\" alt=\"Captura de Tela 2020-07-21 às 17 18 51\" src=\"https://user-images.githubusercontent.com/68610931/88102821-908d9a00-cb76-11ea-9296-b917374ec0c1.png\">\r\n\r\n<img width=\"1182\" alt=\"Captura de Tela 2020-07-21 às 17 18 39\" src=\"https://user-images.githubusercontent.com/68610931/88102795-87043200-cb76-11ea-9193-d37e46caaae5.png\">",
"title": "UTF8 issues when using external sql databases",
"type": "issue"
},
{
"action": "created",
"author": "bkielczewski",
"comment_id": 662441035,
"datetime": 1595423128000,
"masked_author": "username_1",
"text": "Hi. \r\n\r\nIs your Concerto installation on University's server a bare-metal installation, not using the container we provide? \r\n\r\nIf so, this would look like the operating system doesn't have `en_US.UTF8` locale enabled. R interpreter needs this to handle UTF-8 properly in the strings. It is used when you run the test and isn't involved when you just use the admin panel. This would explain why you see the text correctly there but not when running the test. \r\n\r\nTo address this on the server Concerto is running on add this line to `/etc/locale.gen`:\r\n\r\n en_US.UTF-8 UTF-8\r\n\r\nThen execute:\r\n\r\n locale-gen \"en_US.UTF-8\"\r\n\r\nBest,\r\nb.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "csancineto",
"comment_id": 662639799,
"datetime": 1595445195000,
"masked_author": "username_0",
"text": "Thank you very much. I forwarded your answer to the professional responsible for installing the Concert. As soon as I get a return from him, I'll indicate if it worked.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "josenorberto",
"comment_id": 666565447,
"datetime": 1596131890000,
"masked_author": "username_2",
"text": "Hi!\r\n\r\nI was in contact with @username_0 to address this issue.\r\nI made a pull request (https://github.com/campsych/concerto-platform/pull/295) with the solution to the encoding problem we were facing with our University's MySQL server.\r\n\r\nBest regards,\r\nJosé Norberto Guiz Fernandes Corrêa",
"title": null,
"type": "comment"
}
] | 3 | 4 | 2,215 | false | false | 2,215 | true |
gilbarbara/logos | null | 768,110,938 | 398 | null | [
{
"action": "opened",
"author": "bromso",
"comment_id": null,
"datetime": 1608061902000,
"masked_author": "username_0",
"text": "Would be awesome if the current LinkedIn logo could be updated.\r\n\r\nAnd if you could please add like just the 'in' logo so to speak.\r\nDownload link from the official LinkedIn brand guideline site: https://content.linkedin.com/content/dam/me/brand/en-us/brand-home/downloads/LinkedIn-Logos.zip\r\n\r\nhttps://brand.linkedin.com/en-us\r\n\r\nThe 'in' logo I'm refering to: https://content.linkedin.com/content/dam/me/business/en-us/amp/brand-site/v2/bg/LI-Bug.svg.original.svg\r\n\r\nThank in advance!",
"title": "Add/Update LinkedIn logo",
"type": "issue"
},
{
"action": "created",
"author": "gilbarbara",
"comment_id": 751870106,
"datetime": 1609190962000,
"masked_author": "username_1",
"text": "Updated in a02fbe40d0c2e0658ac614783e452fca9d0f17f1\r\nThanks",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "gilbarbara",
"comment_id": null,
"datetime": 1609190962000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 545 | false | false | 545 | false |
lukeautry/tsoa | null | 732,898,872 | 834 | null | [
{
"action": "opened",
"author": "asdf123101",
"comment_id": null,
"datetime": 1604036463000,
"masked_author": "username_0",
"text": "<!--- Provide a general summary of the issue in the Title above -->\r\n\r\n## Sorting\r\n\r\n- **I'm submitting a ...**\r\n\r\n - [x] bug report\r\n - [ ] feature request\r\n - [ ] support request\r\n\r\n- I confirm that I\r\n - [x] used the [search](https://github.com/lukeautry/tsoa/search?type=Issues) to make sure that a similar issue hasn't already been submit\r\n\r\n## Expected Behavior\r\n`@Example()` should read variable value imported from another file.\r\n\r\nWith \r\n```ts\r\n// file a.ts\r\nexport const example = {someKey: 'someValue'}\r\n// file b.ts\r\nimport example from a\r\n...\r\n@Example(a)\r\n...\r\n```\r\ntsoa should generate the correct response example from the imported variable.\r\n\r\n## Current Behavior\r\nIt now creates an empty object `{}` if the variable is imported. It works if the example value is directly supplied to the decorator or defined in the same file.\r\n\r\n## Possible Solution\r\nFeel like a TS module resolution issue.\r\n\r\n## Steps to Reproduce\r\nSee expected behavior section.\r\n\r\n## Context (Environment)\r\n\r\nVersion of the library: 3.4.0\r\nVersion of NodeJS: 12.4.0\r\n\r\n- Confirm you were using yarn not npm: [x]",
"title": "@Example doesn't read value from an imported variable",
"type": "issue"
},
{
"action": "created",
"author": "DigohD",
"comment_id": 743219916,
"datetime": 1607696504000,
"masked_author": "username_1",
"text": "This also seems to only be an issue when specVersion is 3. When specVersion is set to 2 it works as expected.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "nikosgpet",
"comment_id": 919539371,
"datetime": 1631656410000,
"masked_author": "username_2",
"text": "Wondering if there is any update, because I just had the same issue.",
"title": null,
"type": "comment"
}
] | 4 | 5 | 1,413 | false | true | 1,280 | false |
mekanism/Mekanism | mekanism | 556,376,650 | 5,823 | null | [
{
"action": "opened",
"author": "Zoratan",
"comment_id": null,
"datetime": 1580234601000,
"masked_author": "username_0",
"text": "Issue description:\r\n\r\nRotary Condensentrator does not accept water, singleplayer and dedicated Server.\r\nLooks like there is no Water Vapor (replaced by Steam) gas tank in creative.\r\n\r\nEven tried to set the water pipe to push. \r\nIt connects but the condensentrator does not take anything in.\r\nCondensentrating steam produces Liquid steam...\r\n\r\nVersion (make sure you are on the latest version before reporting):\r\n\r\nForge: forge-1.15.2-31.0.1\r\nMekanism: 9.9.5",
"title": "[1.15.2] Rotary Condensentrator not accepting water",
"type": "issue"
},
{
"action": "created",
"author": "pupnewfster",
"comment_id": 579383502,
"datetime": 1580235253000,
"masked_author": "username_1",
"text": "Opened up a 1.12 instance and seems that I hadn't actually noticed mekanism had something called \"water vapor\" (maybe that is just the localized name it was using), I will try to look into this at some point and figure out what functionality I accidentally killed off.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "term2112",
"comment_id": 579384813,
"datetime": 1580235451000,
"masked_author": "username_2",
"text": "I can confirm in is in 9.9.6\r\n\r\n\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "winsrp",
"comment_id": 579849420,
"datetime": 1580316333000,
"masked_author": "username_3",
"text": "I'm just getting to know this mod, and I can confirm this on 1.5.2, What broke is ore processing for tier 4 (5X)\r\n\r\n[https://wiki.aidancbrady.com/w/images/aidancbrady/a/a2/Mekanism-flowchart-v7-simplified.png](url)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "winsrp",
"comment_id": 579979114,
"datetime": 1580335045000,
"masked_author": "username_3",
"text": "Steam seems to be the most common form of this gas (I mean... water vapor is steam XD), so might as well just make it steam which would also make it compatible with other mods,, but also I think you have other machines that make steam right? So would that break this loop, or would it actually add an alternative for it?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Zoratan",
"comment_id": 579993997,
"datetime": 1580337340000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pupnewfster",
"comment_id": 580012885,
"datetime": 1580340691000,
"masked_author": "username_1",
"text": "The reason I for now just added water vapor additionally instead of removing one of the types of steam. Is I don't want to have to go through/figure out currently the implications of being able to get steam via just the condensentrator instead of having to get it via other means. Eventually we will figure out how we want to handle the production/usage of steam and water vapor.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "winsrp",
"comment_id": 580046611,
"datetime": 1580348850000,
"masked_author": "username_3",
"text": "looking around in the wiki it seems that the other thing that produces steam is the Thermoelectric Boiler, which is far more expensive in terms of materials, and the output is much greater, that said, I don't think nothing would break, the Rotary condensentrator works as a small supplier of steam for the ore setup, which is just what you need, and the Thermoelectric Boiler is something bigger for the turbine, which could eventually replace the condensentrator as the steam provider for the setup, but it still has lots more uses so I think from a user perspective that its fine to change it to making just steam instead of steam with another name.\r\n\r\nAlso checking the wiki:\r\n\r\nWater Vapor\r\nMade by:\r\n\r\nRotary Condensentrator using Water\r\nUsed by:\r\n\r\nChemical Infuser to make Sulfuric Acid -> [the one on this setup for ore production]\r\nChemical Injection Chamber with Dirt to make Clay -> [the only other use ¯\\_(ツ)_/¯ ]\r\n\r\nAnd that's about it... so I think that for the actual uses it has... steam would work just fine.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pupnewfster",
"comment_id": 580047803,
"datetime": 1580349145000,
"masked_author": "username_1",
"text": "Closing this for now as it is \"fixed\" in 9.9.7 which is released and on curseforge. I will end up looking back at this when I eventually look more into how things will actually be handled.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "pupnewfster",
"comment_id": null,
"datetime": 1580349146000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "closeshot",
"comment_id": 596086585,
"datetime": 1583586469000,
"masked_author": "username_4",
"text": "I found that it accepts water now version 1.15.2 9.9.14.406 however it is not producing water vapor",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pupnewfster",
"comment_id": 596092211,
"datetime": 1583590122000,
"masked_author": "username_1",
"text": "#5940",
"title": null,
"type": "comment"
}
] | 5 | 12 | 3,211 | false | false | 3,211 | false |