repo stringlengths 7 67 | org stringlengths 2 32 ⌀ | issue_id int64 780k 941M | issue_number int64 1 134k | pull_request dict | events list | user_count int64 1 77 | event_count int64 1 192 | text_size int64 0 329k | bot_issue bool 1 class | modified_by_bot bool 2 classes | text_size_no_bots int64 0 279k | modified_usernames bool 2 classes |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
zeit/next.js | zeit | 553,125,564 | 10,198 | null | [
{
"action": "opened",
"author": "zluo01",
"comment_id": null,
"datetime": 1579640320000,
"masked_author": "username_0",
"text": "# Question about Next.js\r\nI have following code in my custom _app.js\r\n```\r\n <Provider store={store}>\r\n <Component {...pageProps} />\r\n <Player/>\r\n </Provider>\r\n```.\r\nI want the Player will keep playing without reloading when changing paths.\r\nIf I use express as a custom server ,the Player component will not reload every time when I change pages, however, if I use dynamic routing, then when I change page, the Player component will reload. I wonder if it is because dynamic routing will trigger SSR every time when changing pages?\r\n\r\nThank you",
"title": "Dynamic Routing trigger SSR everytime ?",
"type": "issue"
},
{
"action": "closed",
"author": "timneutkens",
"comment_id": null,
"datetime": 1579640352000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "timneutkens",
"comment_id": 576878705,
"datetime": 1579640352000,
"masked_author": "username_1",
"text": "Please follow the issue template.\r\n\r\nhttps://github.com/zeit/next.js/issues/new/choose\r\n\r\nhttps://github.com/zeit/next.js/issues/new?template=8.Question_about_next.md\r\n\r\n```\r\n# Question about Next.js\r\n\r\nGitHub Issues are reserved for Bug reports and Feature requests. The best place to get your question answered is to post it on https://spectrum.chat/next-js.\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "balazsorban44",
"comment_id": 1025162485,
"datetime": 1643555135000,
"masked_author": "username_2",
"text": "This issue has been automatically locked due to no recent activity. If you are running into a similar issue, please create a new issue with the steps to reproduce. Thank you.",
"title": null,
"type": "comment"
}
] | 3 | 4 | 1,135 | false | false | 1,135 | false |
red6/pdfcompare | red6 | 532,623,725 | 60 | null | [
{
"action": "opened",
"author": "prashantpahwa",
"comment_id": null,
"datetime": 1575458837000,
"masked_author": "username_0",
"text": "On comparing a PDF with size 1800 pages, It chokes up the CPU and 12 GB of RAM for 15 minutes. The result varies with 75-150 pages instead of 1800 pages.",
"title": "Utility fails to create complete result for large size pdf files.",
"type": "issue"
},
{
"action": "created",
"author": "finsterwalder",
"comment_id": 563453222,
"datetime": 1575927730000,
"masked_author": "username_1",
"text": "In this general form it is really hard to tackle this problem.\r\nWould it be possible that you supply one of your documents? Or a generated document that shows the problem?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "finsterwalder",
"comment_id": null,
"datetime": 1579599836000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "finsterwalder",
"comment_id": 576599485,
"datetime": 1579599836000,
"masked_author": "username_1",
"text": "Since there was no reaction, I close this ticket for now.\r\nFeel free to contact me for further investigation...",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Lonzak",
"comment_id": 576782183,
"datetime": 1579626535000,
"masked_author": "username_2",
"text": "@username_0 Can you supply a test document? Provide some details...",
"title": null,
"type": "comment"
}
] | 3 | 5 | 505 | false | false | 505 | true |
jboss-openshift/cct_module | jboss-openshift | 301,424,050 | 205 | {
"number": 205,
"repo": "cct_module",
"user_login": "jboss-openshift"
} | [
{
"action": "opened",
"author": "iankko",
"comment_id": null,
"datetime": 1519914469000,
"masked_author": "username_0",
"text": "Cherrypick changes from https://github.com/jboss-openshift/cct_module/pull/198 also into the ```sprint-14``` branch.\r\n\r\nSigned-off-by: rcernich <rcernich@redhat.com>\r\nSigned-off-by: Jan Lieskovsky <jlieskov@redhat.com>\r\n\r\nThanks for submitting your Pull Request!\r\n\r\nPlease make sure your PR meets following requirements:\r\n\r\n- [ ] Pull Request title is properly formatted: `[CLOUD-XYA] Subject`\r\n- [ ] Pull Request contains link to the JIRA issue\r\n- [ ] Pull Request contains description of the issue\r\n- [ ] Pull request does not include fixes for other issues than the main ticket\r\n- [ ] Attached commits represent unit of work and are properly formatted\r\n- [ ] You have read and agreed to the Developer Certificate of Origin (DCO) (see `CONTRIBUTING.md`)\r\n- [ ] Every commit contains `` - use `git commit -s`",
"title": "[CLOUD-2316] only use JGroups ENCRYPT protocol with JDG 6.5",
"type": "issue"
},
{
"action": "created",
"author": "iankko",
"comment_id": 369608500,
"datetime": 1519914499000,
"masked_author": "username_0",
"text": "@goldmann @rcernich PTAL\r\n\r\nThank you, Jan",
"title": null,
"type": "comment"
}
] | 1 | 2 | 851 | false | false | 851 | false |
gcash/bchd | gcash | 392,399,075 | 170 | {
"number": 170,
"repo": "bchd",
"user_login": "gcash"
} | [
{
"action": "opened",
"author": "tyler-smith",
"comment_id": null,
"datetime": 1545181413000,
"masked_author": "username_0",
"text": "Currently we show a warning but don't don't actually alter our behavior in any way. This changes that by returning and exiting the process with an error code.",
"title": "BUGFIX: Don't attempt deep reorg unless using --force flag.",
"type": "issue"
},
{
"action": "created",
"author": "cpacia",
"comment_id": 448431012,
"datetime": 1545181528000,
"masked_author": "username_1",
"text": "Thanks. I actually fixed this as part of the fast sync PR which I just merged.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 236 | false | false | 236 | false |
apache/incubator-dubbo | apache | 339,192,639 | 2,045 | null | [
{
"action": "opened",
"author": "brucelwl",
"comment_id": null,
"datetime": 1531019560000,
"masked_author": "username_0",
"text": "例如:如果是xml配置可以这样使用,\r\n<dubbo:reference interface=\"com.xxx.XxxService\">\r\n <dubbo:method name=\"findXxx\" timeout=\"3000\" retries=\"2\" />\r\n</dubbo:reference>\r\n但是如果是 \r\n@Reference\r\nprivate com.xxx.XxxService xxxService;\r\n如何针对XxxService类中的findXxx 方法配置进行设置",
"title": "采用dubbo的@Reference注解如何使用方法级别配置",
"type": "issue"
},
{
"action": "created",
"author": "kimmking",
"comment_id": 403260913,
"datetime": 1531023116000,
"masked_author": "username_1",
"text": "Add reference annotation to methods directly",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chutian52",
"comment_id": 403330201,
"datetime": 1531096611000,
"masked_author": "username_2",
"text": "目前情况下好像是不可以的,你可以尝试下在parameters这个参数中设置下看看。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zhengjieyuan",
"comment_id": 407257792,
"datetime": 1532398429000,
"masked_author": "username_3",
"text": "同问 关注下",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dslchd",
"comment_id": 409434225,
"datetime": 1533092648000,
"masked_author": "username_4",
"text": "目前好像不能,我之前一直想在provider方的@Service注解,配置方法级别的服务provider配置,没有这些选项,只能用xml配置。但是既然用了注解,就肯定不再单独用xml配置了。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kute",
"comment_id": 409562564,
"datetime": 1533127661000,
"masked_author": "username_5",
"text": "看看这个 https://my.oschina.net/roccn/blog/871032",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "chickenlj",
"comment_id": 469108510,
"datetime": 1551671041000,
"masked_author": "username_6",
"text": "@username_7 please apply this change https://github.com/apache/incubator-dubbo/pull/2603to the master branch.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cvictory",
"comment_id": 469161903,
"datetime": 1551688158000,
"masked_author": "username_7",
"text": "https://github.com/username_7/incubator-dubbo/tree/issue_2045",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cvictory",
"comment_id": 469162065,
"datetime": 1551688188000,
"masked_author": "username_7",
"text": "I still need to do some test .",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "chickenlj",
"comment_id": null,
"datetime": 1552552057000,
"masked_author": "username_6",
"text": "",
"title": null,
"type": "issue"
}
] | 8 | 10 | 674 | false | false | 674 | true |
lazd/coronadatascraper | null | 581,313,463 | 17 | null | [
{
"action": "opened",
"author": "raysalem",
"comment_id": null,
"datetime": 1584209738000,
"masked_author": "username_0",
"text": "Website data is below. note this a maitrix, need sum all three columns and to be bias towards positive also sum presumptive--> \r\n\r\n | San Diego County1 | Federal Quarantine2 | Non-San Diego County Residents3\r\n-- | -- | -- | --\r\nPositive (confirmed cases) | 0 | 2 | 0\r\nPresumptive Positive | 8 | 1 | 0\r\nPending Results | 38 | 6 | 4\r\nNegative | 99 | 11 | 8\r\nTotal Tested | 145 | 20 | 12\r\n\r\n**URL**\r\nhttps://www.sandiegocounty.gov/content/sdc/hhsa/programs/phs/community_epidemiology/dc/2019-nCoV/status.html\r\n\r\n**Scraper code -->**\r\n{\r\n county: 'San Diego County',\r\n state: 'CA',\r\n country: 'USA',\r\n url: 'https://www.sandiegocounty.gov/content/sdc/hhsa/programs/phs/community_epidemiology/dc/2019-nCoV/status.html',\r\n scraper: async function() {\r\n let $ = await fetch.page(this.url);\r\n\r\n let cases = parse.number($('td:contains(\"Positive (confirmed cases)\")').next('td').text()) + parse.number($('td:contains(\"Presumptive Positive\")').next('td').text());\r\n return {\r\n cases: cases,\r\n tested: parse.number($('td:contains(\"Total Tested\")').next('td').text())\r\n };\r\n }\r\n\r\n\r\nI would fix this,b tut dont know Java Scriping",
"title": "San diego data is wrong",
"type": "issue"
},
{
"action": "closed",
"author": "lazd",
"comment_id": null,
"datetime": 1584213329000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "lazd",
"comment_id": 599123524,
"datetime": 1584213365000,
"masked_author": "username_1",
"text": "Thanks for the report, fixed!\r\n\r\nAnd hey, this is a great excuse to [learn JavaScript](https://javascript.info/).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "raysalem",
"comment_id": 600730098,
"datetime": 1584548747000,
"masked_author": "username_0",
"text": "the data is still wrong for san diego, and we might want something like this\r\n\r\nwhat is the page https://www.sandiegocounty.gov/content/sdc/hhsa/programs/phs/community_epidemiology/dc/2019-nCoV/status.html -->\r\n\r\nPositive Cases in San Diego County Since February 14, 2020Coronavirus Disease 2019 (COVID-19)Updated March 17, 2020\r\n--\r\nCOVID-19 Case Summary | San Diego County Residents | Federal Quarantine | Non-San Diego County Residents | Total\r\nTotal Positives | 51 | 5 | 4 | 60\r\nAge Groups | | | | \r\n0-17 years | 0 | 0 | 0 | 0\r\n18-64 years | 43 | 1 | 3 | 47\r\n65+ years | 8 | 4 | 1 | 13\r\nAge Unknown | 0 | 0 | 0 | 0\r\nGender | | | | \r\nFemale | 17 | 2 | 2 | 21\r\nMale | 34 | 3 | 2 | 39\r\nUnknown | 0 | 0 | 0 | 0\r\nHospitalized | 8 | 1 | 1 | 10\r\nDeaths | 0 | 0 | 0 | 0\r\n\r\nright now reporting zeros, since the data has changed\r\npython solution is -->\r\n\r\nimport pandas as pd\r\nimport re\r\nimport requests\r\nfrom bs4 import BeautifulSoup\r\n\r\nURL = 'https://www.sandiegocounty.gov/content/sdc/hhsa/programs/phs/community_epidemiology/dc/2019-nCoV/status.html'\r\npage = requests.get(URL)\r\nsoup = BeautifulSoup(page.content, 'html.parser')\r\n\r\n\r\n\r\n\r\ntable = soup.find(\"div\",{\"class\":\"table parbase section\"})\r\nrows = table.find_all('tr')\r\n\r\n# handle header\r\nheader = [row.text for row in rows[1].find_all('td')]\r\nheader = [re.sub('[ \\t\\n]+', ' ',h) for h in header]\r\n\r\ntbl ={}\r\nfor row in rows[2:]: #skip the first row\r\n data = [r.text for r in row.find_all('td')] \r\n if data[1] =='\\xa0':continue \r\n tbl[data[0]]=[int(d) for d in data[1:]]\r\ndf = pd.DataFrame(tbl, index=header[1:])\r\ndisplay(HTML(df.to_html()))\r\nupdateDateTime = rows[0].find('td').text.split('\\n')[-1].replace(\"Updated\",\"\")\r\nprint(\"updateDateTime %s\" %updateDateTime )\r\n\r\nwill generate this -->\r\n\r\n | Total Positives | 0-17 years | 18-64 years | 65+ years | Age Unknown | Female | Male | Unknown | Hospitalized | 
Deaths\r\n-- | -- | -- | -- | -- | -- | -- | -- | -- | -- | --\r\n51 | 0 | 43 | 8 | 0 | 17 | 34 | 0 | 8 | 0\r\n5 | 0 | 1 | 4 | 0 | 2 | 3 | 0 | 1 | 0\r\n4 | 0 | 3 | 1 | 0 | 2 | 2 | 0 | 1 | 0\r\n60 | 0 | 47 | 13 | 0 | 21 | 39 | 0 | 10 | 0\r\n\r\nupdateDateTime = March 17, 2020",
"title": null,
"type": "comment"
}
] | 2 | 4 | 3,441 | false | false | 3,441 | false |
liferay/clay | liferay | 569,777,112 | 2,948 | {
"number": 2948,
"repo": "clay",
"user_login": "liferay"
} | [
{
"action": "opened",
"author": "wincent",
"comment_id": null,
"datetime": 1582541571000,
"masked_author": "username_0",
"text": "Looks like e393c07f3f9b21bb7f146037d7372e1b updated some but not all of the snapshots.",
"title": "chore: update stale snapshots",
"type": "issue"
},
{
"action": "created",
"author": "wincent",
"comment_id": 590264956,
"datetime": 1582541667000,
"masked_author": "username_0",
"text": "@username_1: I am just assuming that the snapshot is stale and there's no actual bug here, but I could be wrong — I trust you'll know either way! 😂",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "bryceosterhaus",
"comment_id": 590435895,
"datetime": 1582563570000,
"masked_author": "username_1",
"text": "Thanks for sending this. I realized I had these updated on a different branch but forgot to push before I merged the the branch.",
"title": null,
"type": "comment"
}
] | 3 | 4 | 922 | false | true | 365 | true |
vim-airline/vim-airline | vim-airline | 447,404,798 | 1,923 | null | [
{
"action": "opened",
"author": "pjrt",
"comment_id": null,
"datetime": 1558573589000,
"masked_author": "username_0",
"text": "I noticed an odd flickering in the `hunks` section of airline a couple of days ago ([see this gif](https://media.giphy.com/media/YOjughYk2wkXxlW69n/giphy.gif)). I've managed to track down the issue (via git bisect) to this commit: 9112675ad8c069838f9584003fd3450226ad9085 (so this must be old, I just didn't upgrade until a couple of days ago).\r\n\r\nThe issue doesn't appear to happen in good-old vim though, just neovim.\r\n\r\n#### environment\r\n\r\n- vim: NVIM v0.3.5\r\n- vim-airline: 9112675ad8c069838f9584003fd3450226ad9085 or after\r\n- OS: Linux 5.1.2-arch1-1-ARCH\r\n- Have you reproduced with a minimal vimrc:\r\n With a simple vimrc of just vim-airline, git-gutter and fugitive, I was not able to reproduce it\r\n- What is your airline configuration:\r\n My only airline related configuration is:\r\n```\r\nlet g:airline#extensions#whitespace#enabled = 0\r\n```\r\nI also have this set up, which may or may not be related:\r\n```\r\nset ttimeoutlen=10\r\naugroup FastEscape\r\n autocmd!\r\n au InsertEnter * set timeoutlen=0\r\n au InsertLeave * set timeoutlen=1000\r\naugroup END\r\n```\r\nMy vimrc can be found here: https://github.com/username_0/dotfiles/blob/74b5546/vimrc\r\n- terminal: urxvtc\r\n- $TERM variable: rxvt-unicode\r\n- color configuration (:set t_Co?): NO\r\n- if you are using Neovim, does it happen in Vim: NO\r\n\r\n#### actual behavior\r\n\r\nOn save, the `hunks` section flickers on and off. Sometimes once, sometimes multiple times.\r\n\r\n#### expected behavior\r\n\r\nThe `hunks` section should be solid and stable.",
"title": "`hunks` sections flickers when saving a file in neovim",
"type": "issue"
},
{
"action": "created",
"author": "chrisbra",
"comment_id": 495077422,
"datetime": 1558590767000,
"masked_author": "username_1",
"text": "Well, I don't use neovim. If you know how to fix it, I am open for a PR. Else you might simply want to disable the dirty feature.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "chrisbra",
"comment_id": null,
"datetime": 1558590767000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "reopened",
"author": "chrisbra",
"comment_id": null,
"datetime": 1558590841000,
"masked_author": "username_1",
"text": "I noticed an odd flickering in the `hunks` section of airline a couple of days ago ([see this gif](https://media.giphy.com/media/YOjughYk2wkXxlW69n/giphy.gif)). I've managed to track down the issue (via git bisect) to this commit: 9112675ad8c069838f9584003fd3450226ad9085 (so this must be old, I just didn't upgrade until a couple of days ago).\r\n\r\nThe issue doesn't appear to happen in good-old vim though, just neovim.\r\n\r\n#### environment\r\n\r\n- vim: NVIM v0.3.5\r\n- vim-airline: 9112675ad8c069838f9584003fd3450226ad9085 or after\r\n- OS: Linux 5.1.2-arch1-1-ARCH\r\n- Have you reproduced with a minimal vimrc:\r\n With a simple vimrc of just vim-airline, git-gutter and fugitive, I was not able to reproduce it\r\n- What is your airline configuration:\r\n My only airline related configuration is:\r\n```\r\nlet g:airline#extensions#whitespace#enabled = 0\r\n```\r\nI also have this set up, which may or may not be related:\r\n```\r\nset ttimeoutlen=10\r\naugroup FastEscape\r\n autocmd!\r\n au InsertEnter * set timeoutlen=0\r\n au InsertLeave * set timeoutlen=1000\r\naugroup END\r\n```\r\nMy vimrc can be found here: https://github.com/username_0/dotfiles/blob/74b5546/vimrc\r\n- terminal: urxvtc\r\n- $TERM variable: rxvt-unicode\r\n- color configuration (:set t_Co?): NO\r\n- if you are using Neovim, does it happen in Vim: NO\r\n\r\n#### actual behavior\r\n\r\nOn save, the `hunks` section flickers on and off. Sometimes once, sometimes multiple times.\r\n\r\n#### expected behavior\r\n\r\nThe `hunks` section should be solid and stable.",
"title": "`hunks` sections flickers when saving a file in neovim",
"type": "issue"
},
{
"action": "created",
"author": "chrisbra",
"comment_id": 495077934,
"datetime": 1558590920000,
"masked_author": "username_1",
"text": "Please try to come up with a minimal vimrc reproducing the problem, then I might have a look trying with neovim. BTW: I found the gif very hard to see the problem, due to the bad quality.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pjrt",
"comment_id": 495407928,
"datetime": 1558650603000,
"masked_author": "username_0",
"text": "Given that I can't reproduce this with neovim and a simpler vimrc, I am going to close this (was hoping something would pop up). \r\n\r\n@username_1 how do I go about disabling that? Would that disable hunks in general?",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "pjrt",
"comment_id": null,
"datetime": 1558650603000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "chrisbra",
"comment_id": 495582314,
"datetime": 1558697042000,
"masked_author": "username_1",
"text": "Have a look at https://github.com/vim-airline/vim-airline/blob/master/doc/airline.txt#L532",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "pjrt",
"comment_id": 495621291,
"datetime": 1558703534000,
"masked_author": "username_0",
"text": "That did the trick, thanks!",
"title": null,
"type": "comment"
}
] | 2 | 9 | 3,608 | false | false | 3,608 | true |
telstra/open-kilda | telstra | 534,797,551 | 3,011 | null | [
{
"action": "opened",
"author": "surabujin",
"comment_id": null,
"datetime": 1575883628000,
"masked_author": "username_0",
"text": "```\r\n/* The VLAN id is 12-bits, so we can use the entire 16 bits to indicate\r\n* special conditions.\r\n*/\r\nenum ofp_vlan_id {\r\nOFPVID_PRESENT = 0x1000, /* Bit that indicate that a VLAN id is set */\r\nOFPVID_NONE = 0x0000, /* No VLAN id was set. */\r\n};\r\n```\r\n\r\nWe should check is the switch handle this bit and produce the incorrect response for flow-stats or switch completely ignore this bit and we must not set it for this group of switches.\r\n\r\nRelated to https://github.com/telstra/open-kilda/pull/3005",
"title": "Issue with \"set-field\" action for VLAN-VID field",
"type": "issue"
},
{
"action": "created",
"author": "surabujin",
"comment_id": 619831673,
"datetime": 1587977428000,
"masked_author": "username_0",
"text": "The fix has been merged.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "surabujin",
"comment_id": null,
"datetime": 1587977428000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 526 | false | false | 526 | false |
Ramiro115/github-slideshow | null | 532,314,164 | 2 | null | [
{
"action": "closed",
"author": "Ramiro115",
"comment_id": null,
"datetime": 1575413551000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "reopened",
"author": "Ramiro115",
"comment_id": null,
"datetime": 1575413557000,
"masked_author": "username_0",
"text": "### Introduction to GitHub flow\n\nNow that you're familiar with issues, let's use this issue to track your path to your first contribution.\n\nPeople use different workflows to contribute to software projects, but the simplest and most effective way to contribute on GitHub is the GitHub flow.\n\n:tv: [Video: Understanding the GitHub flow](https://www.youtube.com/watch?v=PBI2Rz-ZOxU)\n\n<hr>\n<h3 align=\"center\">Read below for next steps</h3>",
"title": "Your first contribution",
"type": "issue"
},
{
"action": "closed",
"author": "Ramiro115",
"comment_id": null,
"datetime": 1575413562000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "reopened",
"author": "Ramiro115",
"comment_id": null,
"datetime": 1575581687000,
"masked_author": "username_0",
"text": "### Introduction to GitHub flow\n\nNow that you're familiar with issues, let's use this issue to track your path to your first contribution.\n\nPeople use different workflows to contribute to software projects, but the simplest and most effective way to contribute on GitHub is the GitHub flow.\n\n:tv: [Video: Understanding the GitHub flow](https://www.youtube.com/watch?v=PBI2Rz-ZOxU)\n\n<hr>\n<h3 align=\"center\">Read below for next steps</h3>",
"title": "Your first contribution",
"type": "issue"
},
{
"action": "reopened",
"author": "Ramiro115",
"comment_id": null,
"datetime": 1575583994000,
"masked_author": "username_0",
"text": "### Introduction to GitHub flow\n\nNow that you're familiar with issues, let's use this issue to track your path to your first contribution.\n\nPeople use different workflows to contribute to software projects, but the simplest and most effective way to contribute on GitHub is the GitHub flow.\n\n:tv: [Video: Understanding the GitHub flow](https://www.youtube.com/watch?v=PBI2Rz-ZOxU)\n\n<hr>\n<h3 align=\"center\">Read below for next steps</h3>",
"title": "Your first contribution",
"type": "issue"
}
] | 2 | 11 | 7,412 | false | true | 1,308 | false |
zcoinofficial/zcoin | zcoinofficial | 541,666,909 | 806 | {
"number": 806,
"repo": "zcoin",
"user_login": "zcoinofficial"
} | [
{
"action": "opened",
"author": "a-bezrukov",
"comment_id": null,
"datetime": 1577093618000,
"masked_author": "username_0",
"text": "## PR intention\r\nSet Ubuntu 18.04 \"Bionic\" as the build engine for the Gitian builder\r\n\r\n## Code changes brief\r\nMost of the update is straight-forward as it was ported from the upstream repository.",
"title": "Upstream depends",
"type": "issue"
},
{
"action": "created",
"author": "reubenyap",
"comment_id": 569610310,
"datetime": 1577693183000,
"masked_author": "username_1",
"text": "To merge only after 13.8.10 release.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 233 | false | false | 233 | false |
spiral/framework | spiral | 622,489,512 | 265 | {
"number": 265,
"repo": "framework",
"user_login": "spiral"
} | [
{
"action": "opened",
"author": "48d90782",
"comment_id": null,
"datetime": 1590066701000,
"masked_author": "username_0",
"text": "update CHANGELOG",
"title": "release 2.4.18",
"type": "issue"
}
] | 2 | 2 | 16 | false | true | 16 | false |
ElemeFE/element | ElemeFE | 309,988,933 | 10,476 | null | [
{
"action": "opened",
"author": "xuyanming",
"comment_id": null,
"datetime": 1522392029000,
"masked_author": "username_0",
"text": "<!-- generated by https://eleme-issue.surge.sh DO NOT REMOVE -->\r\n\r\n### Element UI version\r\n2.3.2\r\n\r\n### OS/Browsers version\r\nFirefox\r\n\r\n### Vue version\r\n2.5.17-beta.0\r\n\r\n### Reproduction Link\r\n[https://jsfiddle.net/700ah11e/4/](https://jsfiddle.net/700ah11e/4/)\r\n\r\n### Steps to reproduce\r\nDrag to the outside of the node\r\n\r\n### What is Expected?\r\nDrag to the outside of the node\r\n\r\n### What is actually happening?\r\nDrag to the outside of the node\r\n\r\n<!-- generated by https://eleme-issue.surge.sh DO NOT REMOVE -->",
"title": "[Bug Report] Tree draggable to open a new page",
"type": "issue"
},
{
"action": "closed",
"author": "Leopoldthecoder",
"comment_id": null,
"datetime": 1522477750000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 515 | false | false | 515 | false |
dynatrace-oss/barista | dynatrace-oss | 570,390,351 | 637 | null | [
{
"action": "opened",
"author": "myieye",
"comment_id": null,
"datetime": 1582618754000,
"masked_author": "username_0",
"text": "# Feature Request\r\n\r\n## Summary\r\n\r\n`dt-simple-number-column` has a sort direction of `desc`. I think that's a good default, but that it should be settable.\r\n\r\n## Feature Description\r\n\r\nSorting number in descending order often makes sense, but there are certainly cases when sorting them in ascending order is more desirable. For example, we sort browser monitor availability in ascending order so that the more problematic monitors are at the top.\r\n\r\nOr course I could make my own column, but I just want to set one parameter and I think that `asc` is a valid initial sort order for numbers. So, I think that the sort order of `dt-simple-number-column` should be configurable.\r\n\r\nCurrently I can use `dtSortActive=\"availability\" dtSortDirection=\"asc\"` so that the column is initially ordered with `asc`, but then if I change the sorted column and switch it back to availability again, then it is initially sorted with `desc`.",
"title": "Add sort direction Input to simple-number-column",
"type": "issue"
},
{
"action": "created",
"author": "tomheller",
"comment_id": 590769761,
"datetime": 1582622991000,
"masked_author": "username_1",
"text": "This was initially requested by UX and POs, that there is a consistent sorting behavior on number columns. The decision to not make this configurable on simple-columns was a conscious one.\r\n\r\nMy opinion would be, to not make this configurable. @username_2 @Marike-Sorgdrager, what would be your take on this one?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ffriedl89",
"comment_id": 590774475,
"datetime": 1582623625000,
"masked_author": "username_2",
"text": "@username_0 Thanks for the feature request. I have to agree with @username_1 here. This was a lengthy discussion in the past and it was a conscious decision to have this set internally. If you have an edge case to sort numbers differently you can always create your own simple-column. But as part of the library it makes sense to keep it as is.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "myieye",
"comment_id": null,
"datetime": 1582640609000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 4 | 1,575 | false | false | 1,575 | true |
flowscripter/cli | flowscripter | 598,577,264 | 189 | {
"number": 189,
"repo": "cli",
"user_login": "flowscripter"
} | [
{
"action": "created",
"author": "vectronic",
"comment_id": 629812917,
"datetime": 1589728031000,
"masked_author": "username_0",
"text": ":tada: This PR is included in version 1.1.0 :tada:\n\nThe release is available on:\n- [npm package (@latest dist-tag)](https://www.npmjs.com/package/@flowscripter/cli/v/1.1.0)\n- [GitHub release](https://github.com/flowscripter/cli/releases/tag/v1.1.0)\n\nYour **[semantic-release](https://github.com/semantic-release/semantic-release)** bot :package::rocket:",
"title": null,
"type": "comment"
}
] | 3 | 3 | 4,781 | false | true | 353 | false |
ENCODE-DCC/encoded | ENCODE-DCC | 444,548,964 | 2,764 | {
"number": 2764,
"repo": "encoded",
"user_login": "ENCODE-DCC"
} | [
{
"action": "opened",
"author": "p-sud",
"comment_id": null,
"datetime": 1557939967000,
"masked_author": "username_0",
"text": "… certification audit",
"title": "ENCD-4609 exclude fccs institutional certification",
"type": "issue"
},
{
"action": "created",
"author": "p-sud",
"comment_id": 492816671,
"datetime": 1557953081000,
"masked_author": "username_0",
"text": "My idea was that we could potentially use \"component\" in the future to indicate \"mapping\", \"data analysis\", etc to indicate other types of awards we need to do stuff with. I did consider a boolean but it's not extensible.",
"title": null,
"type": "comment"
}
] | 1 | 2 | 242 | false | false | 242 | false |
smartcitiesdata/react_discovery_ui | smartcitiesdata | 510,195,615 | 67 | null | [
{
"action": "opened",
"author": "dunson062786",
"comment_id": null,
"datetime": 1571683145000,
"masked_author": "username_0",
"text": "Datasets take forever to load and map forever to render on really large datasets:\r\n\r\nhttps://discovery.prod.internal.smartcolumbusos.com/dataset/ogrip/f9581d10_f8f9_4efa_8cdf_1c0c6b208687\r\n\r\nPursue possible alternatives that might take less time",
"title": "Datasets take forever to load and map forever to render on really large datasets",
"type": "issue"
}
] | 1 | 1 | 245 | false | false | 245 | false |
ionic-team/ionic | ionic-team | 393,764,977 | 16,870 | null | [
{
"action": "opened",
"author": "JCKodel",
"comment_id": null,
"datetime": 1545584012000,
"masked_author": "username_0",
"text": "<!-- Before submitting an issue, please consult our docs (https://beta.ionicframework.com/docs/) and API reference (https://beta.ionicframework.com/docs/api/) -->\r\n\r\n<!-- Please make sure you are posting an issue pertaining to the Ionic Framework. If you are having an issue with the Ionic Appflow services (Ionic View, Ionic Deploy, etc.) please consult the Ionic Appflow support portal (https://ionic.zendesk.com/hc/en-us) -->\r\n\r\n<!-- Please do not submit support requests or \"How to\" questions here. Instead, please use one of these channels: https://forum.ionicframework.com/ or http://ionicworldwide.herokuapp.com/ -->\r\n\r\n<!-- ISSUES MISSING IMPORTANT INFORMATION MAY BE CLOSED WITHOUT INVESTIGATION. -->\r\n\r\n# Bug Report\r\n\r\n**Ionic version:**\r\n<!-- (For Ionic 1.x issues, please use https://github.com/ionic-team/ionic-v1) -->\r\n<!-- (For Ionic 2.x & 3.x issues, please use https://github.com/ionic-team/ionic-v3) -->\r\n[x] **4.x** 4.0.0-rc.0\r\n\r\n**Current behavior:**\r\n<!-- Describe how the bug manifests. -->\r\nI'm trying to implement a navigation guard, cancelling the navigation for a specific route (so I can popup a login page, for example).\r\n\r\nThe event is cancelled, but the router still navigates.\r\n\r\n**Expected behavior:**\r\n<!-- Describe what the behavior would be without the bug. -->\r\nWhen I cancel the `ionRouteWillChange` event, the navigation should be cancelled.\r\n\r\n**Steps to reproduce:**\r\n<!-- Please explain the steps required to duplicate the issue, especially if you are able to provide a sample application. -->\r\nImplement a `onIonRouteWillChange` and call `event.preventDefault()`.\r\n\r\n**Related code:**\r\n\r\n<!-- If you are able to illustrate the bug or feature request with an example, please provide a sample application via one of the following means:\r\n\r\nA sample application via GitHub\r\n\r\nStackBlitz (https://stackblitz.com)\r\n\r\nPlunker (http://plnkr.co/edit/cpeRJs?p=preview)\r\n\r\n-->\r\n```\r\n<ion-app>\r\n <ion-router useHash={true} onIonRouteWillChange={this.onRouteWillChange}>\r\n <ion-route url=\"/menu/:menuHash\" component=\"page-menu\" />\r\n <ion-route url=\"/test\" component=\"page-test\" />\r\n </ion-router>\r\n\r\n <ion-nav main />\r\n</ion-app>\r\n```\r\n\r\n```\r\nprivate onRouteWillChange(e: CustomEvent<RouterEventDetail>)\r\n{\r\n console.warn(\"RouteWillChange\");\r\n console.dir(e);\r\n\r\n if(e.detail.to.startsWith(\"/menu\"))\r\n {\r\n console.warn(\"CANCELLING\");\r\n e.preventDefault();\r\n e.stopImmediatePropagation();\r\n e.stopPropagation();\r\n return false;\r\n }\r\n}\r\n```\r\n\r\nConsole output:\r\n```\r\n/!\\ RouteWillChange (for /)\r\n/!\\ RouteWillChange (for /menu/xxxx)\r\n/!\\ CANCELLING\r\n```\r\n\r\n**Other information:**\r\n<!-- List any other information that is relevant to your issue. Stack traces, related issues, suggestions on how to fix, Stack Overflow links, forum links, etc. -->\r\n\r\n**Ionic info:** \r\n<!-- (run `ionic info` from a terminal/cmd prompt and paste output below): -->\r\n\r\n```\r\n[WARN] You are not in an Ionic project directory. Project context may be missing.\r\n\r\nIonic:\r\n\r\n ionic (Ionic CLI) : 4.5.0\r\n\r\nSystem:\r\n\r\n NodeJS : v10.13.0\r\n npm : 6.4.1\r\n OS : Windows 10\r\n```\r\nNOTE: It's a Ionic Core/StencilJS project",
"title": "[4.0.0-rc.0] onIonRouteWillChange should be cancellable",
"type": "issue"
},
{
"action": "created",
"author": "Silvest89",
"comment_id": 449702534,
"datetime": 1545640223000,
"masked_author": "username_1",
"text": "What about using the Ionic nav guards, ionViewCanEnter and ionViewCanLeave",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Silvest89",
"comment_id": 449702655,
"datetime": 1545640314000,
"masked_author": "username_1",
"text": "What about using ionViewCanEnter at the component level?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JCKodel",
"comment_id": 449745314,
"datetime": 1545664572000,
"masked_author": "username_0",
"text": "There is no such thing in Ionic Core.\r\n\r\nAnd that's not the issue. The issue is a cancellable event not cancelling anything (the fix is either cancel the navigation or mark the event as non cancellable).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "manucorporat",
"comment_id": null,
"datetime": 1546026131000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "manucorporat",
"comment_id": 450416174,
"datetime": 1546026131000,
"masked_author": "username_2",
"text": "You can't cancel a URL change, but what you can do is to implement guards in the way of redirects to protect parts of your application.\r\n\r\n```tsx\r\n<ion-app>\r\n <ion-router useHash={true}>\r\n { this.isLoggedIn && <ion-route-redirect from=\"/menu*\" to=\"/test\" /> }\r\n <ion-route url=\"/menu/:menuHash\" component=\"page-menu\" />\r\n <ion-route url=\"/test\" component=\"page-test\" />\r\n </ion-router>\r\n\r\n <ion-nav main />\r\n</ion-app>\r\n```",
"title": null,
"type": "comment"
}
] | 4 | 7 | 4,234 | false | true | 3,990 | false |
qiuxiang/react-native-amap-geolocation | null | 391,355,416 | 41 | null | [
{
"action": "opened",
"author": "cuiken",
"comment_id": null,
"datetime": 1544861083000,
"masked_author": "username_0",
"text": "",
"title": "IOS",
"type": "issue"
},
{
"action": "closed",
"author": "qiuxiang",
"comment_id": null,
"datetime": 1557716191000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 0 | false | false | 0 | false |
ably/ably-cocoa | ably | 576,430,860 | 1,000 | {
"number": 1000,
"repo": "ably-cocoa",
"user_login": "ably"
} | [
{
"action": "opened",
"author": "tcard",
"comment_id": null,
"datetime": 1583430870000,
"masked_author": "username_0",
"text": "This is a corner case, but users may encounter it.",
"title": "Avoid leak from user incorrectly holding to authCallback's callback.",
"type": "issue"
},
{
"action": "created",
"author": "QuintinWillison",
"comment_id": 596413744,
"datetime": 1583745290000,
"masked_author": "username_1",
"text": "I'm having a look locally and may just end up pushing a couple of tweaks to save time.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "QuintinWillison",
"comment_id": 605286195,
"datetime": 1585338557000,
"masked_author": "username_1",
"text": "@username_0 is there any chance my responses satisfied you and you're happy to resolve the two conversations above so I can get this merged?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "tcard",
"comment_id": 606678373,
"datetime": 1585666481000,
"masked_author": "username_0",
"text": "@username_1 Sorry, I got drowned in other things. LGTM",
"title": null,
"type": "comment"
}
] | 2 | 4 | 330 | false | false | 330 | true |
kareniel/awesome-ctf-challenge-design | null | 525,087,046 | 1 | null | [
{
"action": "opened",
"author": "kareniel",
"comment_id": null,
"datetime": 1574177774000,
"masked_author": "username_0",
"text": "I'm using this using to jot down ideas about how to structure the main readme file.\r\n\r\n- Designing puzzles\r\n- Design by subtraction\r\n- Game Design\r\n - Core Mechanic\r\n - Implicit Learning\r\n - Complexity Escalation\r\n - Reprisals\r\n - Subversions\r\n- Lateral Thinking",
"title": "Notes",
"type": "issue"
}
] | 1 | 1 | 267 | false | false | 267 | false |
pingcap/grpc-rs | pingcap | 468,966,086 | 346 | null | [
{
"action": "opened",
"author": "hunterlxt",
"comment_id": null,
"datetime": 1563333834000,
"masked_author": "username_0",
"text": "**Describe the bug**\r\nBuild error when generate bindings to gRPC in CentOS\r\n```\r\nHOST = Some(\"x86_64-unknown-linux-gnu\")\r\nCXX_x86_64-unknown-linux-gnu = None\r\nCXX_x86_64_unknown_linux_gnu = None\r\nHOST_CXX = None\r\nCXX = None\r\nCXXFLAGS_x86_64-unknown-linux-gnu = None\r\nCXXFLAGS_x86_64_unknown_linux_gnu = None\r\nHOST_CXXFLAGS = None\r\nCXXFLAGS = None\r\nCRATE_CC_NO_DEFAULTS = None\r\nDEBUG = Some(\"true\")\r\nCARGO_CFG_TARGET_FEATURE = Some(\"fxsr,sse,sse2\")\r\nrunning: \"c++\" \"-O0\" \"-ffunction-sections\" \"-fdata-sections\" \"-fPIC\" \"-g\" \"-fno-omit-frame-pointer\" \"-m64\" \"-I\" \"grpc/include\" \"-Wall\" \"-Wextra\" \"-std=c++11\" \"-DGRPC_SYS_SECURE\" \"-Werror\" \"-o\" \"/home/lxt/Projects/grpc-rs/target/debug/build/grpcio-sys-3e86be41c3f6dd46/out/grpc_wrap.o\" \"-c\" \"grpc_wrap.cc\"\r\nexit code: 0\r\nAR_x86_64-unknown-linux-gnu = None\r\nAR_x86_64_unknown_linux_gnu = None\r\nHOST_AR = None\r\nAR = None\r\nrunning: \"ar\" \"crs\" \"/home/lxt/Projects/grpc-rs/target/debug/build/grpcio-sys-3e86be41c3f6dd46/out/libgrpc_wrap.a\" \"/home/lxt/Projects/grpc-rs/target/debug/build/grpcio-sys-3e86be41c3f6dd46/out/grpc_wrap.o\"\r\nexit code: 0\r\ncargo:rustc-link-lib=static=grpc_wrap\r\ncargo:rustc-link-search=native=/home/lxt/Projects/grpc-rs/target/debug/build/grpcio-sys-3e86be41c3f6dd46/out\r\nCXXSTDLIB_x86_64-unknown-linux-gnu = None\r\nCXXSTDLIB_x86_64_unknown_linux_gnu = None\r\nHOST_CXXSTDLIB = None\r\nCXXSTDLIB = None\r\ncargo:rustc-link-lib=stdc++\r\n\r\n--- stderr\r\n./grpc/include/grpc/impl/codegen/slice.h:24:10: fatal error: 'stddef.h' file not found\r\n./grpc/include/grpc/impl/codegen/slice.h:24:10: fatal error: 'stddef.h' file not found, err: true\r\nthread 'main' panicked at 'Unable to generate grpc bindings: ()', src/libcore/result.rs:999:5\r\n```\r\n\r\nMethod that has been tried\r\n1. scl enable llvm-toolset-7\r\n2. update cmake \r\n3. update gcc/g++\r\n4. 
use a new OS to build in strict accordance with the documentation",
"title": "Build error when generate bindings to gRPC in CentOS",
"type": "issue"
},
{
"action": "created",
"author": "BusyJay",
"comment_id": 512123654,
"datetime": 1563345413000,
"masked_author": "username_1",
"text": "I encounter the same error on Fedora 30. I manage to get around this by setting environment variables CPLUS_INCLUDE_PATH to \"/usr/lib/gcc/x86_64-redhat-linux/9/include\".",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Ten0",
"comment_id": 512748194,
"datetime": 1563443592000,
"masked_author": "username_2",
"text": "Same error on my fully up to date ArchLinux and on our Debian stretch deployment system. Latest release broke quite some builds.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "light4",
"comment_id": 512790088,
"datetime": 1563452043000,
"masked_author": "username_3",
"text": "Same here, I find that cargo cache don't have `third_party/protobuf` directory.\r\n\r\n```\r\nusername_3@matrix ~> ls -lh .cargo/registry/src/github.com-1ecc6299db9ec823/grpcio-sys-0.5.0-alpha.2/grpc/third_party/\r\n总用量 72K\r\ndrwxr-xr-x 3 username_3 username_3 4.0K 7月 18 17:33 address_sorting/\r\n-rw-rw-r-- 1 username_3 username_3 291 3月 19 16:59 benchmark.BUILD\r\ndrwxr-xr-x 11 username_3 username_3 4.0K 7月 18 17:33 boringssl/\r\n-rw-rw-r-- 1 username_3 username_3 266 3月 19 16:59 BUILD\r\ndrwxr-xr-x 9 username_3 username_3 4.0K 7月 18 17:33 cares/\r\n-rw-rw-r-- 1 username_3 username_3 134 3月 19 16:59 constantly.BUILD\r\n-rw-rw-r-- 1 username_3 username_3 667 3月 19 16:59 cython.BUILD\r\n-rw-rw-r-- 1 username_3 username_3 777 3月 19 16:59 gtest.BUILD\r\n-rw-rw-r-- 1 username_3 username_3 178 3月 19 16:59 incremental.BUILD\r\ndrwxr-xr-x 8 username_3 username_3 4.0K 7月 18 17:33 nanopb/\r\n-rw-rw-r-- 1 username_3 username_3 323 3月 19 16:59 nanopb.BUILD\r\ndrwxr-xr-x 2 username_3 username_3 4.0K 7月 18 17:33 py/\r\ndrwxr-xr-x 2 username_3 username_3 4.0K 7月 18 17:33 toolchains/\r\n-rw-rw-r-- 1 username_3 username_3 375 3月 19 16:59 twisted.BUILD\r\n-rw-rw-r-- 1 username_3 username_3 164 3月 19 16:59 yaml.BUILD\r\ndrwxr-xr-x 14 username_3 username_3 4.0K 7月 18 17:33 zlib/\r\n-rw-rw-r-- 1 username_3 username_3 632 3月 19 16:59 zlib.BUILD\r\n-rw-rw-r-- 1 username_3 username_3 241 3月 19 16:59 zope_interface.BUILD\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "BusyJay",
"comment_id": 512795600,
"datetime": 1563453150000,
"masked_author": "username_1",
"text": "@username_2 from 0.5.0-alpha.2, grpcio-sys depends on clang and clang library. Have you installed those libraries?\r\n\r\n@username_3 It has nothing to do with protobuf. The error is reported by bindgen, which requires clang toolchains.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "light4",
"comment_id": 513103312,
"datetime": 1563516304000,
"masked_author": "username_3",
"text": "# quick fix\r\n\r\n```\r\n# centos\r\nyum install clang clang-devel llvm llvm-devel\r\n\r\n# archlinx\r\n# add this line to bindgen config at build.rs:259\r\n.clang_arg(\"-I/usr/lib/clang/8.0.0/include\")\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "BusyJay",
"comment_id": 513106130,
"datetime": 1563517001000,
"masked_author": "username_1",
"text": "Hmm, can you help us to add the dependencies to Prerequisites? @username_3",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hunterlxt",
"comment_id": 513109152,
"datetime": 1563517747000,
"masked_author": "username_0",
"text": "it seems like there is only 3.4.2 in yum repo",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "light4",
"comment_id": 513118428,
"datetime": 1563519995000,
"masked_author": "username_3",
"text": "yep, but it should be fine, can you test it?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hunterlxt",
"comment_id": 513119143,
"datetime": 1563520156000,
"masked_author": "username_0",
"text": "doesn't work, I think building from source could fix @username_3",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "light4",
"comment_id": 513187882,
"datetime": 1563534033000,
"masked_author": "username_3",
"text": "I try to build it on centos 7.1, it builds successfully. Can you paste the error logs here?\r\n```\r\nllvm 3.4.2\r\nclang 3.4.2\r\nCentOS Linux release 7.1.1503 (Core)\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "BusyJay",
"comment_id": 513269789,
"datetime": 1563549605000,
"masked_author": "username_1",
"text": "Related issue: rust-lang/rust-bindgen#1226",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hunterlxt",
"comment_id": 513270633,
"datetime": 1563549743000,
"masked_author": "username_0",
"text": "Be sad /(ㄒoㄒ)/~~ @username_1",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "BusyJay",
"comment_id": null,
"datetime": 1563959638000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 4 | 14 | 4,302 | false | false | 4,302 | true |
hatena/Hatena-Blog-Theme-Boilerplate | hatena | 312,302,951 | 7 | null | [
{
"action": "opened",
"author": "Git-aTn",
"comment_id": null,
"datetime": 1523191747000,
"masked_author": "username_0",
"text": "はてな担当様\r\n\r\n新規にサブサイトを作成したらアドレスはhttpsでした。\r\nnpm start を実行してもbrowser-syncはhttpのアドレスしか認識してなく自動更新されません。\r\n\r\nローカル環境のアクセス先:\r\nLocal: http://localhost:3000\r\nExternal: http://192.168.3.8:3000\r\n管理画面のアクセス先:\r\nUI: http://localhost:3001\r\nUI External: http://192.168.3.8:3001\r\n\r\n質問:\r\npacage.jnsnのbrowser-syncにどのようなオプションを追加すれば\r\nhttpsのサイトと同期できるか教えて頂けないでしょうか。\r\n\r\n宜しくお願いいたします。",
"title": "httpsでのbrowser-sync実行方法について",
"type": "issue"
},
{
"action": "created",
"author": "ueday",
"comment_id": 380036194,
"datetime": 1523352729000,
"masked_author": "username_1",
"text": "@username_0 お返事が遅れて申し訳ございません。 Browsersync が HTTPS のブログで正常に動作しないことを手元でも確認しました。\r\n本件の対応について、チームで現在検討中です。ご不便をおかけしますが、何卒ご理解ください。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Git-aTn",
"comment_id": 380062793,
"datetime": 1523358900000,
"masked_author": "username_0",
"text": "さっそくのご確認ありがとうございます。お忙しいところ、ご検討恐れ入ります。\r\n当方でもいろいろトライしてみましたが、上手く行きませんでした。\r\n例えば、--https のオプションを追加するとローカル環境のアクセス先はhttpsに変わるのですが、管理画面のアクセス先はhttpのままでした。\r\n以上、宜しくお願い致します。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ueday",
"comment_id": 380730053,
"datetime": 1523523539000,
"masked_author": "username_1",
"text": "@username_0 チームで検討した結果、Browsersync の自動リロードは今後サポート対象外といたします。大変申し訳ございません。\r\nBrowsersync を HTTPS で動かすオプションは存在するものの、SSLの証明書を手元で作成するか、もしくは安全でないスクリプトの読み込みを許容する必要があり、公式にサポートするのは困難という結論になりました。\r\n\r\nご不便をおかけしますが、 SCSS の保存後には手動でブラウザを更新していただきますようお願いいたします。\r\n\r\nREADMEのセットアップ方法については、こちらの pull request で修正しました: https://github.com/hatena/Hatena-Blog-Theme-Boilerplate/pull/8",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Git-aTn",
"comment_id": 381124872,
"datetime": 1523623753000,
"masked_author": "username_0",
"text": "はてな開発チームのご担当様、上記ご回答ありがとうございます。\r\n\r\nさっそく以下で動作確認しました。\r\n\r\n■変更前:カスタマイズ > ヘッダ > タイトル下 に下記を貼り付けるる。\r\n<link rel=\"stylesheet\" href=\"http://localhost:3000/index.css\"/>\r\n<script async src='http://localhost:3000/browser-sync/browser-sync-client.js'></script>\r\n\r\n■変更後:カスタマイズ > {}デザインCSS に下記を貼り付ける\r\n@import url(\"http://localhost:3000/boilerplate.css\");\r\n\r\nSCSSを保存したら手動でリロードをする。\r\n\r\nテストした結果、httpsアドレスでも画面は更新できました。\r\n\r\nBrowsersyncを公式サポート出来ないのは非常に残念ですが、\r\nいつか対応できる日が来ることを期待いたします。\r\n\r\n今後とも宜しくお願い致します。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Git-aTn",
"comment_id": 381127185,
"datetime": 1523624313000,
"masked_author": "username_0",
"text": "はてな開発チームの皆様、上記ご回答ありがとうございます。\r\n\r\nさっそく、カスタマイズ > {}デザインCSS に下記を貼り付けてみました。\r\n@import url(\"http://localhost:3000/boilerplate.css\");\r\n\r\nSCSSを保存→手動でリロード後、httpsアドレスでも画面は更新できました。\r\n\r\n正直上記方法は従来に比べとても面倒ですが、諸般の事情で現時点ではBrowsersyncを公式サポート出来ないのはやむを得ないと理解致しました。\r\n\r\nしかし、いつか対応できる日が来ることを期待いたします。\r\n\r\n今後とも宜しくお願い致します。",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ueday",
"comment_id": null,
"datetime": 1547804095000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 7 | 1,755 | false | false | 1,755 | true |
powercord-community/betterfriends | powercord-community | 647,512,979 | 20 | {
"number": 20,
"repo": "betterfriends",
"user_login": "powercord-community"
} | [
{
"action": "opened",
"author": "ADoesGit",
"comment_id": null,
"datetime": 1593448306000,
"masked_author": "username_0",
"text": "",
"title": "replace depracated functions",
"type": "issue"
},
{
"action": "created",
"author": "Juby210",
"comment_id": 676118526,
"datetime": 1597833489000,
"masked_author": "username_1",
"text": "no longer needed (as `54996d604a9313dd5ee2de01b7fcccdf4cfbfb0e` is merged)",
"title": null,
"type": "comment"
}
] | 2 | 2 | 74 | false | false | 74 | false |
Samuell1/cdn-plugin | null | 520,569,836 | 2 | null | [
{
"action": "opened",
"author": "Ta2Ta2",
"comment_id": null,
"datetime": 1573374090000,
"masked_author": "username_0",
"text": "Hello, Thanks for the amazing plugin. I'm using below to serve my assets, combining resources.\r\n\r\nOnly first file is being served. Rest of files are not loaded.\r\n\r\n`<script src=\"{{ cdn('assets/javascript/jquery.js','assets/vendor/bootstrap/js/transition.js','assets/vendor/bootstrap/js/alert.js','assets/vendor/bootstrap/js/button.js','assets/vendor/bootstrap/js/carousel.js','assets/vendor/bootstrap/js/collapse.js','assets/vendor/bootstrap/js/dropdown.js','assets/vendor/bootstrap/js/modal.js','assets/vendor/bootstrap/js/tooltip.js','assets/vendor/bootstrap/js/popover.js','assets/vendor/bootstrap/js/scrollspy.js','assets/vendor/bootstrap/js/tab.js','assets/vendor/bootstrap/js/affix.js','assets/javascript/app.js') }}\"></script>`",
"title": "combining assets",
"type": "issue"
},
{
"action": "created",
"author": "Samuell1",
"comment_id": 552182914,
"datetime": 1573382452000,
"masked_author": "username_1",
"text": "Hey, Thanks!\r\n\r\nSorry but this plugin doesnt support combining assets, you need to do this in your webpack enviroment.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "Samuell1",
"comment_id": null,
"datetime": 1584604951000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 852 | false | false | 852 | false |
dankamongmen/notcurses | null | 647,037,153 | 743 | null | [
{
"action": "opened",
"author": "dankamongmen",
"comment_id": null,
"datetime": 1593390415000,
"masked_author": "username_0",
"text": "Now that we cna have true glyph transparency, it's time to improve the demo's HUD, ideally turning it on by default. The right hand side has already been handled -- quadblitter-rendered graphics show up perfectly. The left is still filling gaps with spaces due to use of minimum widths in `ncplane_printf()`. We instead want to use no maximum width, and then reposition the cursor. Once this is done, we ought be able to see the background perfectly through the HUD, and can turn it on in all cases.",
"title": "Make HUD glyph-transparent",
"type": "issue"
},
{
"action": "created",
"author": "dankamongmen",
"comment_id": 650845718,
"datetime": 1593390888000,
"masked_author": "username_0",
"text": "Got the spaces padding the frame count. Still need to get the spaces right-aligning the timer, and the space between timer and name.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "dankamongmen",
"comment_id": null,
"datetime": 1593391669000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 631 | false | false | 631 | false |
neherlab/covid19_scenarios_data | neherlab | 588,660,580 | 55 | null | [
{
"action": "opened",
"author": "rneher",
"comment_id": null,
"datetime": 1585249751000,
"masked_author": "username_0",
"text": "We have the country Brazil now, but the smaller divisions are missing. We have case data for them already. Maybe @username_1 can help?",
"title": "Brazil regions for populationData.tsv",
"type": "issue"
},
{
"action": "created",
"author": "gstvribs",
"comment_id": 604682317,
"datetime": 1585256307000,
"masked_author": "username_1",
"text": "Yep, of course!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "nnoll",
"comment_id": 607392334,
"datetime": 1585762815000,
"masked_author": "username_2",
"text": "I'm transferring this to the main repo as we are no longer going to keep this as submodule but rather just a directory within covid19_scenarios.",
"title": null,
"type": "comment"
}
] | 3 | 3 | 291 | false | false | 291 | true |
dmitrybubyakin/nova-medialibrary-field | null | 596,880,689 | 63 | {
"number": 63,
"repo": "nova-medialibrary-field",
"user_login": "dmitrybubyakin"
} | [
{
"action": "opened",
"author": "kaysersoze",
"comment_id": null,
"datetime": 1586381985000,
"masked_author": "username_0",
"text": "I took an initial stab at updating references and relevant classes to use v8 of spatie/laravel-medialibrary.\r\n\r\nResolves #54",
"title": "spatie/laravel-medialibrary v8 compatibility",
"type": "issue"
},
{
"action": "created",
"author": "kaysersoze",
"comment_id": 611209310,
"datetime": 1586382044000,
"masked_author": "username_0",
"text": "This additionally merges some PHP syntax changes from @Grayda, which I thought appropriate.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dmitrybubyakin",
"comment_id": 611338669,
"datetime": 1586410702000,
"masked_author": "username_1",
"text": "@username_0 Thank you! I'll review it a bit later.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dmitrybubyakin",
"comment_id": 611368578,
"datetime": 1586416123000,
"masked_author": "username_1",
"text": "@username_0 This changes are in v3 now.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 304 | false | false | 304 | true |
pwa-builder/PWABuilder | pwa-builder | 628,988,265 | 787 | null | [
{
"action": "opened",
"author": "koenvd",
"comment_id": null,
"datetime": 1591084808000,
"masked_author": "username_0",
"text": "Currently PWA's installed through the Microsoft Store are run in the WWAHost app which internally still uses the EdgeHTML renderer. \r\n\r\nAs explained here https://github.com/MicrosoftEdge/WebViewFeedback/issues/68 a Chromium based host app would have a tremendous impact:\r\n- Immediate support for currently not working features in the app due to missing browser API's in Edge.\r\n- The improved Indexeddb technology (faster inserts, compound indexes, support for IndexedDB v2.0 API) would have big impact on the speed of the app.\r\n- Closer feedback loop in development and QA. Practically all development already happens in Chrome but still needs to be verified and eventually adapted for the app. A Chromium based host app would mean a big improvement here and win lot's of time.\r\n- Easier to explain to external developers that a PWA just has the same behavior as a Chromium based app iso EdgeHTML renderer.\r\n- Customer expectations where - after the release of the Chromium based Edge browser - it's not obvious to them the PWA app is still using the EdgeHTML renderer while they expect the same behavior between the app and a web app.\r\n\r\nThe great news here [SK122 Building rich app experiences with Progressive Web Apps ](https://www.youtube.com/watch?v=y4p_QHZtMKM&feature=youtu.be&t=2075) indicated that currently this feature is being looked at and a timeline for release is \"somewhere this year\".\r\n\r\nIt would be very nice to be able to follow more closely progress related to this feature because of the huge business value it has. Also here https://github.com/MicrosoftEdge/WebViewFeedback/issues/68#issuecomment-635215154 it was mentioned that this repo is probably the best place for doing that. Hence the reason I'm duplicating the original ticket here.\r\n\r\nImmediate questions:\r\n- How would upgrade work? Will registered service workers and existing Indexeddb instances survive a migration to the new host app? 
Or would it mean that when the OS is upgraded the user would see a fresh installation of the app?",
"title": "[FEATURE] Chromium based host app for PWA's installed through the Microsoft Store",
"type": "issue"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 637719824,
"datetime": 1591121762000,
"masked_author": "username_1",
"text": "Your existing PWAs in the Store will still work. We won't migrate your existing apps in the Store. If you published a PWA using the old EdgeHTML-based model, it'll stay that way until you update the app.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "koenvd",
"comment_id": 637775107,
"datetime": 1591128103000,
"masked_author": "username_0",
"text": "Thanks for already sharing that apps will require an update In the Store to start using the Chromium based renderer and for explaining the pain points you’re currently facing.\r\n\r\nMuch appreciated!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 662619061,
"datetime": 1595443122000,
"masked_author": "username_1",
"text": "Just to give an update on this:\r\n\r\nThe Edge team and Store team have agreed on a technical path forward. We created a proof of concept that resulted in a PWA powered by Chromium-based Edge running in the Store.\r\n\r\nWe're now doing engineering work to turn this into a real thing.\r\n\r\nMore updates to follow in August.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "koenvd",
"comment_id": 662834403,
"datetime": 1595484392000,
"masked_author": "username_0",
"text": "Thanks for update @username_1 💪 !",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "koenvd",
"comment_id": 671050628,
"datetime": 1596978765000,
"masked_author": "username_0",
"text": "@username_1 Just wondering if a Chromium based PWA will also have the unlimited storage capacity that an EdgeHTML based PWA already has?\r\n\r\nSee the comment here for some more context: https://github.com/MicrosoftEdge/WebViewFeedback/issues/68#issuecomment-670202167",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 671564405,
"datetime": 1597090313000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 681007184,
"datetime": 1598461508000,
"masked_author": "username_1",
"text": "August update: we've completed engineering on this and have a working solution in place. We're beginning to test this with select partners. We'll also be deploying this to preview.pwabuilder.com for testing soon.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "behroozbc",
"comment_id": 681537454,
"datetime": 1598506173000,
"masked_author": "username_2",
"text": "hi @username_1 \r\nin your solution can you authorize pwa website is my own or not ?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 683018368,
"datetime": 1598638100000,
"masked_author": "username_1",
"text": "No, at least initially, we won't be doing any sort of domain ownership checks. That may change in the future - but it's up to the Edge engineering team to drive that.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "behroozbc",
"comment_id": 684569991,
"datetime": 1598949052000,
"masked_author": "username_2",
"text": "@username_1 okay !\r\nload speed ?? for submit my pwa apps to store check load speed? like google play store twa !",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 686088185,
"datetime": 1599088856000,
"masked_author": "username_1",
"text": "We are considering doing some quality checks around offline support, given Google's new policy that your Android app will crash if there's an unhandled 404.\r\n\r\nWe don't have any current plans for load speed checks at this time.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "koenvd",
"comment_id": 692123727,
"datetime": 1600096384000,
"masked_author": "username_0",
"text": "Is Chromium using a specific strategy for this? Just wondering since if a user has installed a PWA and in case due to other user actions the disk becomes full and Chromium would decide to remove the main.js from cache storage suddenly the installed PWA won't work anymore offline. Also the user would not be aware of this upfront and would just notice a malfunctioning PWA. Is this something that the PWA has to take into account?\r\n\r\nFor same reason wondering if a PWA installed through an appx or Microsoft Store with a fixed start url would ever support unlimited storage and no eviction by providing the `kUnlimitedStorage` option from here: https://source.chromium.org/chromium/chromium/src/+/master:chrome/common/chrome_switches.cc?q=kUnlimitedStorage&ss=chromium%2Fchromium%2Fsrc. So it would have similar behaviour as current Edge based PWA's.\r\n\r\nAlso interested to try this out whenever it would be possible to do so 😅 ! Thanks!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 694511174,
"datetime": 1600378377000,
"masked_author": "username_1",
"text": "We have the service published, but haven't shared it with the general public yet. We plan on releasing it in October, as it depends on Edge 86 which is released in October. If you want to preview the service before then, email me juhimang @ microsoft dot com.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "koenvd",
"comment_id": 694945638,
"datetime": 1600444217000,
"masked_author": "username_0",
"text": "Thanks a lot for sharing this info @username_1 ! Much appreciate the time you took for providing the answers.\r\n\r\nOther question I still had is if we will still have the possibility to call into the Windows 10 API or a windows runtime component like currently supported in EdgeHTML based PWA's on the Store? \r\n\r\n(When creating this issue I kind of expected it would be supported since it's supported in an EdgeHTML based PWA but thought maybe it would be good to explicitly verify with you as well).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 696449847,
"datetime": 1600734265000,
"masked_author": "username_1",
"text": "No, last I heard. My understanding was Edge was going for pure web standards - if there is stuff you need to do that isn't available via web standards, let's make that available via web standards.\r\n\r\nThat said, I heard some recent murmurings from Edge folks about the possibility of still injecting WinRT / WinJS for installed PWAs, like we did for legacy Edge-based PWAs. I'll find out from Edge team whether there's a definitive answer.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "koenvd",
"comment_id": 696584534,
"datetime": 1600763670000,
"masked_author": "username_0",
"text": "The alternative to achieve this might be a custom UWP WinUI3.0 app embedding a single Webview2 control and to use `window.chrome.webview.postMessage` to be able to call into the C# part of the app (and get access to the winRT API) as illustrated here https://github.com/hawkerm/monaco-editor-uwp/pull/32\r\n\r\nFor some extra context but in our current Edge-based PWA we call the winRT API:\r\n- to bring up the `WebAuthenticationBroker` for the OAuth flow.\r\n- `Launcher.LaunchUriAsync` to explicitly open a browser instance when we don't want to handle a link inside the app.\r\n\r\nSimilarly being able to call `Launcher.LaunchUriAsync` would be useful (as part of SSO support) to initiate an OAuth flow with the external default browser as demo-ed here https://github.com/googlesamples/oauth-apps-for-windows/blob/master/OAuthUniversalApp/OAuthUniversalApp/MainPage.xaml.cs#L81\r\n\r\nTherefore it would still be very interesting to know if this mechanism would be on the roadmap for Chromium based installed PWA's as well. \r\n\r\nThanks again for sharing the info and checking internally!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 705044117,
"datetime": 1602087393000,
"masked_author": "username_1",
"text": "Update on this: we're building the UI for specifying the options of the new Chromium-based platform. We anticipate this will be released either later this week or next week.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "JudahGabriel",
"comment_id": 713086548,
"datetime": 1603221888000,
"masked_author": "username_1",
"text": "I'm pleased to announce [we've released a preview of our new Chromium-based packaging](https://medium.com/pwabuilder/bringing-chromium-edge-pwas-progressive-web-apps-to-the-microsoft-store-c0bd07914ed9?source=friends_link&sk=04ca8b2ae2bd094b04ef6b53780b5698).\r\n\r\nTo access the preview, analyze your PWA through PWABuilder -> Build Package -> Windows. you'll now see some options to try the Chromium-based Edge preview:\r\n\r\n\r\nIf you want to just try it out quick, click `Download Test Package`. If you want to actually submit to the Store, click `Open Store Options` and fill in your publisher details.\r\n\r\nYour download will contain [instructions for testing your app locally and submitting to the Store](https://github.com/pwa-builder/pwabuilder-windows-chromium-docs/blob/master/next-steps.md).\r\n\r\nWe'd be stoked if you gave it a try on your PWA. Let us know how it goes for you!",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "davrous",
"comment_id": 713662383,
"datetime": 1603294326000,
"masked_author": "username_3",
"text": "Thanks all for this great discussion and feedback! \r\n\r\n@username_0, we're not planning to use a WebView approach mechanism in the future. This doesn't mean you won't be able to build yourself an hybrid app using WebView2 inside a WinUI 3.0 app and manage the bridge between the webview and the native API. However, this is not our preferred choice today. We'd like to bet on standard API available with the browser. \r\n\r\nLooks like we can now close this issue following the shipping we've done yesterday. Please continue the discussion on Twitter if you have more questions and/or ideas: https://twitter.com/pwabuilder",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "davrous",
"comment_id": null,
"datetime": 1603294327000,
"masked_author": "username_3",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "koenvd",
"comment_id": 715432170,
"datetime": 1603469000000,
"masked_author": "username_0",
"text": "Thanks @username_3 for giving extra context. \r\n\r\nIndeed from our end we will consider both tracks (browser based or winui hybrid) to move away from the EdgeHTML based host app. \r\n\r\nAlso congrats to @username_1 and the whole team for reaching this milestone!! \r\n\r\nWould love to share our findings as well somewhere later after we started on this.",
"title": null,
"type": "comment"
}
] | 5 | 23 | 9,366 | false | true | 9,168 | true |
desktop/desktop | desktop | 403,811,227 | 6,734 | null | [
{
"action": "opened",
"author": "legoman8304",
"comment_id": null,
"datetime": 1548681695000,
"masked_author": "username_0",
"text": "I’m having a continuous problem with my GitHub desktop (windows 10, GitHub desktop build 1.6.1). This is much like error #5055 but the steps used there do not work for me. I have a repo (let’s call it repo A) that I have made a website by hand on. Uploading all the files and changes. I also have another repo (repo B) which I am trying to use with GitHub pages to host my site temporarily. Now when I try to clone repo B (currently empty) onto my desktop it instantly changes the name of repo B to the name of repo A and syncing the files between them, hence rendering it useless for GitHub pages as repo B must be named “_username_.github.io” to host. Why is this happening and how do I fix it. Thanks!",
"title": "Repo titles changing and syncing with other repos on github desktop",
"type": "issue"
},
{
"action": "created",
"author": "steveward",
"comment_id": 458340042,
"datetime": 1548717129000,
"masked_author": "username_1",
"text": "Thanks for the report @username_0. The local repository name should reflect the remote repository name when cloning a repository from GitHub, but a repository can point to a repository `origin` URL that has a different name than the local repository. \r\n\r\nA few questions:\r\n- Have you tried cloning the repository to a different directory to see if you you can reproduce this issue?\r\n- Can you share the specific directory structure of where you are trying to clone this repository?\r\n- If you go to the menu in GitHub Desktop and select `Repository` > `Repository Settings` > `Remote` is the URL there pointing to the correct remote repository on GitHub?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "steveward",
"comment_id": 458340257,
"datetime": 1548717179000,
"masked_author": "username_1",
"text": "@username_0 it would also be helpful if you could upload the log file from GitHub Desktop.\r\n\r\nTo access the log files go to the file menu in GitHub Desktop and select `Help` > `Show Logs in Finder (macOS) or Explorer (Windows)`. \r\n\r\nThe log files are created daily -- please upload a log file as an attachment from a day where you experienced the issue.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "steveward",
"comment_id": 458805176,
"datetime": 1548821513000,
"masked_author": "username_1",
"text": "Thanks for sending over the logs @username_0. Here's what I'm seeing in the log:\r\n\r\n```\r\n\r\n2019-01-29T13:17:06.951Z - info: [ui] [AppStore.getAccountForRemoteURL] account found for remote: https://github.com/username_0/Farm-Bureau-Site.git - username_0 (has token)\r\n2019-01-29T13:17:19.389Z - info: [ui] Executing clone: git -c credential.helper= clone --recursive --progress -- https://github.com/username_0/Farm-Bureau-Site.git C:\\Users\\nultonw1\\Desktop\\Coding\\GHUB\\username_0.github.io \r\n```\r\n\r\nIt looks like you are cloning the `Farm-Bureau-Site` into the `username_0.github.io` directory -- are you expecting `username_0.github.io` to be the local repository name? \r\n\r\nFor reference, if you are trying to use GitHub Pages you can either do a project page or a user page:\r\n\r\nhttps://help.github.com/articles/user-organization-and-project-pages/\r\n\r\nThe GitHub Pages user page requires the `<username>.github.io` naming convention, but a project page does not.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "steveward",
"comment_id": 461166336,
"datetime": 1549483285000,
"masked_author": "username_1",
"text": "Apologies for the delay @username_0. Can you share the config information of that repository? \r\n\r\n1. Open the repository in the command line (Menu > `Repository` > `Open in command prompt`)\r\n2. Run the command `git config --local --list`\r\n3. Share the output of that command in a reply \r\n\r\nTo try and get things working again you can clear the cached data that GitHub Desktop stores. Here's how to do that:\r\n\r\n1. Close GitHub Desktop\r\n2. Remove all of the files in `%APPDATA%\\GitHub Desktop\\`\r\n3. Restart GitHub Desktop -- you will need to log in and add your repositories again.\r\n4. Try cloning the repository to your local machine\r\n\r\nLet me know if that fixes this issue.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "legoman8304",
"comment_id": 461219210,
"datetime": 1549493296000,
"masked_author": "username_0",
"text": "Here is the output of `git config --local --list`\r\n```\r\ncore.repositoryformatversion=0\r\ncore.filemode=false\r\ncore.bare=false\r\ncore.logallrefupdates=true\r\ncore.symlinks=false\r\ncore.ignorecase=true\r\nsubmodule.active=.\r\nremote.origin.url=https://github.com/username_0/Farm-Bureau-Site.git\r\nremote.origin.fetch=+refs/heads/*:refs/remotes/origin/*\r\nbranch.master.remote=origin\r\nbranch.master.merge=refs/heads/master\r\nbranch.dev.remote=origin\r\nbranch.dev.merge=refs/heads/dev\r\nbranch.PR-test-branch.remote=origin\r\nbranch.PR-test-branch.merge=refs/heads/PR-test-branch\r\natomgithub.historysha=983bffda6cf5dd4dd3263f50080b7a83097963ce\r\n```\r\n\r\nand as for your second solution, should this be the folder I'm deleting the files from, I don't want to screw something up. \r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "legoman8304",
"comment_id": 464074076,
"datetime": 1550241801000,
"masked_author": "username_0",
"text": "Hello? I understand you all could be busy, but I've been waiting for 9 days now with no answer. I know someone has viewed it too, the label support was added 4 days ago, so please get back to me as soon as possible",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "steveward",
"comment_id": 464094301,
"datetime": 1550245162000,
"masked_author": "username_1",
"text": "@username_0 that is the correct directory -- you can delete all of the files there. You will need to add your log in to GitHub Desktop again and add your repositories back, but it should get things working correctly.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "legoman8304",
"comment_id": null,
"datetime": 1550245904000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "legoman8304",
"comment_id": 464098840,
"datetime": 1550245904000,
"masked_author": "username_0",
"text": "That seemed to do the trick!! That you for your help.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "billygriffin",
"comment_id": 464099345,
"datetime": 1550245989000,
"masked_author": "username_2",
"text": "Thanks @username_0, and sorry for the delay! We were in the midst of a release last week and things got hectic, so really appreciate your patience and the gentle nudge. Glad you got things resolved!",
"title": null,
"type": "comment"
}
] | 3 | 11 | 4,904 | false | false | 4,904 | true |
guardian/typerighter | guardian | 572,259,356 | 36 | {
"number": 36,
"repo": "typerighter",
"user_login": "guardian"
} | [
{
"action": "opened",
"author": "jonathonherbert",
"comment_id": null,
"datetime": 1582826782000,
"masked_author": "username_0",
"text": "Add a set of endpoints to proxy CAPI searches, in preparation for the management frontend's rule audit service.\r\n\r\nThis service will allow users to query CAPI for content, and then run this content through our matcher service for audit.",
"title": "Add a CAPI proxy",
"type": "issue"
},
{
"action": "created",
"author": "jonathonherbert",
"comment_id": 603771459,
"datetime": 1585133281000,
"masked_author": "username_0",
"text": "@akash1810 – I sorted a one line merge conflict and now require reapproval – if you could be so kind 😁",
"title": null,
"type": "comment"
}
] | 1 | 2 | 338 | false | false | 338 | false |
T-Systems-MMS/phonebook | T-Systems-MMS | 595,813,635 | 592 | {
"number": 592,
"repo": "phonebook",
"user_login": "T-Systems-MMS"
} | [
{
"action": "opened",
"author": "friedaxvictoria",
"comment_id": null,
"datetime": 1586260695000,
"masked_author": "username_0",
"text": "Before: The user could only view their bookmarked people on the dashboard.\r\n\r\nNow: User can switch between their team and bookmarks. If the user doesn't have any bookmarks, the dashboard will automatically display the team and next to the dashboard button it says how to bookmark somebody. Otherwise, the bookmarks are the default view. If the user isn't logged in, there won't be the option to look at the team. \r\n\r\nOpen question: Currently, the team is displayed in the following order: supervisor, assistents, employees, and learners. Should we add subheadings to showcase the different groups or leave it the way it is now?",
"title": "feat(dashboard): Show Team on Dashboard",
"type": "issue"
},
{
"action": "created",
"author": "paule96",
"comment_id": 610365907,
"datetime": 1586263633000,
"masked_author": "username_1",
"text": "there is a bug. If you navigate direct to https://pr-592.demo-phonebook.me/en/dashboard nothing is to see.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 610368189,
"datetime": 1586263924000,
"masked_author": "username_2",
"text": "already mentioned in my comments ;)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paule96",
"comment_id": 610878331,
"datetime": 1586341500000,
"masked_author": "username_1",
"text": "/azp run T-Systems-MMS.phonebook-preview",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paule96",
"comment_id": 610904792,
"datetime": 1586345470000,
"masked_author": "username_1",
"text": "sad demo broken. :) So I can't check the stuff..",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "friedaxvictoria",
"comment_id": 617725181,
"datetime": 1587555621000,
"masked_author": "username_0",
"text": "For some reason the mock user doesn't work in the preview. The id that I used is a person and can be found in the preview but unfortunately it doesn't register as the user which is the reason why you can't see the team...",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "friedaxvictoria",
"comment_id": 618193562,
"datetime": 1587621101000,
"masked_author": "username_0",
"text": "@username_2 Do you know how I can define a Mock UserTreeNode[] and not a UserTreeNode to compare with the Mock Person's OrgUnit?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 618195431,
"datetime": 1587621456000,
"masked_author": "username_2",
"text": "Just use `Partial<UserTreeNode>`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "friedaxvictoria",
"comment_id": 618216740,
"datetime": 1587624973000,
"masked_author": "username_0",
"text": "To use it in getNodeForUser(user, organigram, 0), the organigram needs to be a UnitTreeNode[]. It doesn't accept Partial <UnitTreeNode>.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 618218762,
"datetime": 1587625248000,
"masked_author": "username_2",
"text": "Then just cast it to UserTreeNode:\n\nYour Object `as UserTreeNode[]`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "friedaxvictoria",
"comment_id": 618221318,
"datetime": 1587625608000,
"masked_author": "username_0",
"text": "Doesn't really work either. The only version that isn't an error is organigram: UnitTreeNode[] = []. I don't know whether that will work though because there is an error with the user as well. I copied a mock person's data and the data itself seems to be fine but apparently the Person is missing properties (isLearner, isSupervisor, isAssistent, isOfStatus). However, these 'properties' are functions from Person.ts and I can't add them to the Mock Person.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paule96",
"comment_id": 623002390,
"datetime": 1588447751000,
"masked_author": "username_1",
"text": "/azbot",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paule96",
"comment_id": 623002403,
"datetime": 1588447757000,
"masked_author": "username_1",
"text": "/az",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mschwrdtnr",
"comment_id": 623091578,
"datetime": 1588503520000,
"masked_author": "username_3",
"text": "/azp run T-Systems-MMS.phonebook-preview",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paule96",
"comment_id": 623132752,
"datetime": 1588521994000,
"masked_author": "username_1",
"text": "This stuff is broken?\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paule96",
"comment_id": 623132795,
"datetime": 1588522016000,
"masked_author": "username_1",
"text": "also this switch doesn't working:\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "friedaxvictoria",
"comment_id": 623272597,
"datetime": 1588571936000,
"masked_author": "username_0",
"text": "@username_1 My cards aren't broken. There is a bit of an issue with the small cards, though. I'll see what I can do about that. \r\nThe switch only works for the booksmarks so far. Should we implement it for the team as well @username_2?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "friedaxvictoria",
"comment_id": 623278208,
"datetime": 1588573058000,
"masked_author": "username_0",
"text": "The Layout switch should now be working for the team as well. If you don't want that we can change it back although I would hide the button when the team is on display if this is the case. Otherwise, I think it would get to confusing. \r\n\r\nThe cards also shouldn't spread out anymore. Furthermore, I had to add another if-statement to the organigram.service.ts since it only checked the first node of every depth. If the user is in the second or third node, it would return null. @username_2",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paule96",
"comment_id": 623292664,
"datetime": 1588575506000,
"masked_author": "username_1",
"text": "no it's looks good 👍 I will start reviewing this code to night.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 632719831,
"datetime": 1590157477000,
"masked_author": "username_2",
"text": "@username_0 waiting for merge",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 635296213,
"datetime": 1590667337000,
"masked_author": "username_2",
"text": "do you? @username_1",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "friedaxvictoria",
"comment_id": 635875402,
"datetime": 1590744735000,
"masked_author": "username_0",
"text": "The function getUnitTreeById() has been added and it works when the UnitTreeNodes have been loaded. At the moment, the dashboard is finished loading before the UnitTreeNodes are. Therefore, the getUnitTreeById() is emty and doesn't return the user's UnitTreeNode. It is just a matter of getting the nodes before returning the dashboard.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 635880912,
"datetime": 1590745496000,
"masked_author": "username_2",
"text": "forgot to push functions are still not there",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "friedaxvictoria",
"comment_id": 637312715,
"datetime": 1591080170000,
"masked_author": "username_0",
"text": "I added the function and deleted some imports. However, I can’t check in localhost whether everything is all right because I can’t seem to get localhost to work in my browser...",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 640189540,
"datetime": 1591524149000,
"masked_author": "username_2",
"text": "/azp run T-Systems-MMS.phonebook-preview",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mschwrdtnr",
"comment_id": 669029333,
"datetime": 1596612319000,
"masked_author": "username_3",
"text": "@username_1 Please test it again locally",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 691904884,
"datetime": 1600072164000,
"masked_author": "username_2",
"text": "/azp run T-Systems-MMS.phonebook-preview",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paule96",
"comment_id": 700799474,
"datetime": 1601394778000,
"masked_author": "username_1",
"text": "/azp run T-Systems-MMS.phonebook-preview",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 700852852,
"datetime": 1601399278000,
"masked_author": "username_2",
"text": "Preview is not working at the moment ;(",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "DanielHabenicht",
"comment_id": 706274312,
"datetime": 1602260330000,
"masked_author": "username_2",
"text": "/azp run T-Systems-MMS.phonebook-preview",
"title": null,
"type": "comment"
}
] | 6 | 50 | 5,213 | false | true | 3,871 | true |
kubernetes/kubernetes | kubernetes | 398,063,573 | 72,795 | {
"number": 72795,
"repo": "kubernetes",
"user_login": "kubernetes"
} | [
{
"action": "opened",
"author": "verult",
"comment_id": null,
"datetime": 1547163299000,
"masked_author": "username_0",
"text": "Cherry pick of #71681 on release-1.11.\n\n#71681: Add e2e test for file exec",
"title": "Automated cherry pick of #71681: Add e2e test for file exec",
"type": "issue"
},
{
"action": "created",
"author": "verult",
"comment_id": 453303344,
"datetime": 1547163584000,
"masked_author": "username_0",
"text": "/kind bug\r\n/priority important-soon\r\n/assign @username_2 \r\n/sig storage",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "verult",
"comment_id": 453304073,
"datetime": 1547163740000,
"masked_author": "username_0",
"text": "This is adding a new test case for a bug found with certain kernel versions.\r\n\r\n/assign @username_1",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "verult",
"comment_id": 453305087,
"datetime": 1547163957000,
"masked_author": "username_0",
"text": "Addressed comments",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "foxish",
"comment_id": 453664274,
"datetime": 1547242518000,
"masked_author": "username_1",
"text": "@username_2, is this good to go?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "verult",
"comment_id": 453680391,
"datetime": 1547246541000,
"masked_author": "username_0",
"text": "Fixed the gofmt error, but I'm not sure where the bazel build error came from. Maybe a flake?\r\n\r\n@username_2 btw I changed the test image from Nginx to NginxSlim (since the former isn't defined in 1.11). Also simplified the test function a bit by removing the bits tailored to the new e2e testing framework",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "msau42",
"comment_id": 453690350,
"datetime": 1547249515000,
"masked_author": "username_2",
"text": "/lgtm\r\n/approve",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "verult",
"comment_id": 454977887,
"datetime": 1547679796000,
"masked_author": "username_0",
"text": "@username_1 good to go!",
"title": null,
"type": "comment"
}
] | 4 | 17 | 9,480 | false | true | 618 | true |
commercialhaskell/stack | commercialhaskell | 641,438,057 | 5,320 | null | [
{
"action": "opened",
"author": "aryairani",
"comment_id": null,
"datetime": 1592502368000,
"masked_author": "username_0",
"text": "In `doc/README.md`, it says \"the easiest way to install is to run `curl -sSL https://get.haskellstack.org/ | sh`\"\r\n\r\nbut n `doc/travis_ci.md`, it says \"there is only one reasonable way to install Stack: fetch precompiled binary from the Github.\"\r\n```yaml\r\nbefore_install:\r\n# Download and unpack the stack executable\r\n- mkdir -p ~/.local/bin\r\n- export PATH=$HOME/.local/bin:$PATH\r\n- travis_retry curl -L https://get.haskellstack.org/stable/linux-x86_64.tar.gz | tar xz --wildcards --strip-components=1 -C ~/.local/bin '*/stack'\r\n```\r\n\r\nI'm trying to read between the lines here to understand the difference between easy and reasonable. Are these Travis instructions still current?\r\n\r\nThanks in advance.",
"title": "reasonable way to install stack?",
"type": "issue"
},
{
"action": "created",
"author": "mpilgrem",
"comment_id": 674311350,
"datetime": 1597448146000,
"masked_author": "username_1",
"text": "If you examine the script at `https://get.haskellstack.org/` (the one obtained by `curl` and then piped to `sh` for the easy method), I think you will see that it also actually fetches the pre-compiled binary from the same place. I think 'from the Github` is not accurate.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 974 | false | false | 974 | false |
PyFilesystem/s3fs | PyFilesystem | 621,254,421 | 70 | null | [
{
"action": "opened",
"author": "poudro",
"comment_id": null,
"datetime": 1589919135000,
"masked_author": "username_0",
"text": "Been having a strange issue with `fs.open` on various files. \r\n\r\nThe code is basically\r\n```\r\n if not fs.exists(path_in_fs):\r\n return None\r\n\r\n try:\r\n with fs.open(path_in_fs, 'rb') as f:\r\n <- read file\r\n except Exception as err:\r\n raise err\r\n```\r\n\r\nSometimes `fs.exists` will say that a file exists in s3 but the `fs.open` step will return a `ResourceNotFound` error.\r\n\r\nI tracked the error in `fs.open` to this section of the code https://github.com/PyFilesystem/s3fs/blob/master/fs_s3fs/_s3fs.py#L433-L441, but I'm a bit at a loss as to why an exception should be triggered here but not for the `fs.exists`.\r\n\r\nFinally, if I delete the file via other means (awscli or in s3 browser), and recreate the file (by copying a local copy to s3 via awscli), it will then work without triggering an error.\r\n\r\nAny help on what I'm doing wrong would be greatly appreciated.\r\n\r\nBecause of this last element it's a bit hard to reproduce, but if you need more details please let me know.",
"title": "`fs.open` returns `ResourceNotFound` (404) even though `fs.exists` finds the path",
"type": "issue"
},
{
"action": "created",
"author": "poudro",
"comment_id": 631492252,
"datetime": 1589983241000,
"masked_author": "username_0",
"text": "I was able to reproduce, seems like if some other user wrote the file this occurs.\r\n\r\nIf I comment out line 439 it works as expected.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "shadiakiki1986",
"comment_id": 888967348,
"datetime": 1627551559000,
"masked_author": "username_1",
"text": "I ran in to this issue and investigated a bit. \r\n\r\nTL;DR: This behavior is intended by `s3fs` and was introduced as a fix to issue #17 in PR #21\r\n\r\nIt turns out that it's related to how S3 handles the directories that represent the path to the file. S3 is not really a filesystem as it doesn't create the directory structure to the file, but only stores the full path as a key. So if you upload to s3 something like `s3://bucket/folder/file.txt` **using aw scli**, then `/folder` doesn't exist in itself, whereas on a regular filesystem (say in my ubuntu terminal) it would. For `s3fs` in particular, and since it is part of `PyFilesystem`, it has to comply with how other filesystems work, so it does some extra stuff that would not be done if you were using the aws cli. For example, if you upload a file to s3 with s3fs to `s3://bucket/folder/file.txt`, it would create both `/folder/` and `/folder/file.txt`. At the same time, if you read that same path **after** having created it with `s3fs`, then the above issue won't happen. That's because the `obj.load` (that you commented out in PR #71 ) is verifying that the `/folder/` path exists, as would make sense in a normal filesystem. However, if you read that path **after** having uploaded it with aws cli for example, then the above issue will happen because the aws cli doesn't create `/folder/`. The `s3fs` way to handle this would be to use the `dir_path` argument in the `S3FS` constructor. 
Here is a full example to illustrate in code what I blaberred about in the prose above:\r\n\r\n```\r\nfrom fs_s3fs import S3FS\r\nfrom zipfile import ZipFile\r\n\r\ns3fs = S3FS('dolphicom')\r\nassert s3fs.exists(\"mobysound.org-mirror-v20210704/workshops/5th_DCL_data_bottlenose.zip\")\r\ntry:\r\n s3fs.getinfo(\"5th_DCL_data_bottlenose.zip\") # raises exception\r\n assert False\r\nexcept Exception as e:\r\n assert True\r\n print(f\"Got exception: {e}\") # Got exception: resource '5th_DCL_data_bottlenose.zip' not found\r\n\r\ns3fs = S3FS('dolphicom', dir_path=\"mobysound.org-mirror-v20210704/workshops/\")\r\nassert s3fs.exists(\"5th_DCL_data_bottlenose.zip\")\r\ns3fs.getinfo(\"5th_DCL_data_bottlenose.zip\") # does not raise exception\r\nassert True\r\n```\r\n\r\nAs a last note, I would suggest to add `s3` to the issue title so that others facing the same issue could find this and to close this issue as well as the PR because I don't think that `s3fs` will change this behaviour.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "poudro",
"comment_id": 973273281,
"datetime": 1637269770000,
"masked_author": "username_0",
"text": "The issue I mention only happens when a different user uploads the file whether it be via s3fs or aws cli even when all permitions are ok. \r\n\r\nIf the same user as the one trying ot read it via s2fs is the one that did the upload, via either s3fs or aws cli, it works fine. \r\n\r\nSo your explanation doesn't hold since it does in fact work, but only sometimes, depending on who did the upload with aws cli.\r\n\r\nI'm pretty sure this is not intended behavior on s3fs part as sometimes it works sometimes it doesn't depending on who uploaded the file even when the permitions are ok.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "geoffjukes",
"comment_id": 1055982284,
"datetime": 1646178515000,
"masked_author": "username_2",
"text": "@username_0 - The explanation does hold. Let me explain it differently....\r\n\r\nIn a POSIX filesystem, the full file path always exists. So you can 'stat' every part of it. So with \"/temp/file.ext\" as an example; `temp` exists, and so does `file.ext`\r\n\r\nIn Object storage (such as S3) it is possible for `/temp/file.ext` to exist WITHOUT `/temp/` existing - because \"folders\" aren't a thing. Objects use keys that *look* like folder paths, but are not. For `/temp/` to \"exist\" it has to be created. Quite literally, and empty object with the key `/temp/`.\r\n\r\nAll that said - I have the same issue (and gripe) with S3FS, because not all S3 clients create the `/foldername/` objects, and I don't always have control over the bucket. If I glob or walk a bucket with this issue, I get errors, and don't have any good way to solve it.\r\n\r\nI wish there was a way to tell S3FS to not 'stat' dirs for specific mounts....",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "geoffjukes",
"comment_id": 1057085931,
"datetime": 1646236495000,
"masked_author": "username_2",
"text": "@username_0 Are you saying that files uploaded by 2 users, with the same key prefix, result in an error? e.g. /some/place/user1.txt vs /some/place/user2.txt\n\nOr are the key prefixes different? e.g. /user1/file.ext vs /user2/file.ext\n\nAre the users using the same software to upload/create the objects? Or different software? I'd like to try and recreate your issue, so some specifics would help me to do that.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "poudro",
"comment_id": 1057317774,
"datetime": 1646250756000,
"masked_author": "username_0",
"text": "@username_2 here's the more detailed scenario:\r\n\r\n- There is a single file `foo` that contains `bar`\r\n- There are two users U1 and U2\r\n- U1 uploads the file to `s3://bucket/path/foo` (via s3fs, aws webinterface or aws cli, all three result in same outcome) and sets read permissions so both U1 and U2 can read it\r\n\r\nIn this scenario U1 can read file via s3fs, but U2 will raise a `ResourceNotFound` even though U2 can successfully access file via aws webinterface and aws cli.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "geoffjukes",
"comment_id": 1057337945,
"datetime": 1646252163000,
"masked_author": "username_2",
"text": "@username_0 `s3://bucket/path/foo` does not imply that an object the object `s3://bucket/path/` exists, which is what causes the error you are seeing.\r\n\r\nWhen U1 creates the object at `s3://bucket/path/foo` are they first creating `s3://bucket/path` (such as with `makedirs`)? or just creating `s3://bucket/path/foo` directly? I know that the AWS web interface and the AWS CLI allow for direct object creation, without creating the intermediate objects for \"faking\" the directories - which, again, is the source of the issue you (and I) are experiencing.\r\n\r\nNote that \"seeing\" a \"directory\" in the web interface, does not imply an object exists with that key.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "poudro",
"comment_id": 1057391546,
"datetime": 1646255332000,
"masked_author": "username_0",
"text": "The permissions in my experiment were actually set at the bucket level, seems unlikely to me then that it could be linked to permissions at the path level.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "geoffjukes",
"comment_id": 1057401847,
"datetime": 1646255927000,
"masked_author": "username_2",
"text": "It happens. I've experienced it myself many times.",
"title": null,
"type": "comment"
}
] | 3 | 10 | 6,763 | false | false | 6,763 | true |
getsentry/sentry-javascript | getsentry | 492,065,429 | 2,235 | null | [
{
"action": "opened",
"author": "KangYoosam",
"comment_id": null,
"datetime": 1568186988000,
"masked_author": "username_0",
"text": "## What\r\nIn Expo(ReactNative) documents, it indicates we can use `enableInExpoDevelopment` props.\r\n```\r\nSentry.init({\r\n dsn: 'DSN',\r\n enableInExpoDevelopment: true,\r\n debug: true\r\n});\r\n```\r\n\r\nAnd it works with `\"sentry-expo\": \"^2.0.0\"`.\r\n\r\nSo we should add type definition of `enableInExpoDevelopment`.\r\n\r\n## Refs\r\nhttps://docs.expo.io/versions/latest/guides/using-sentry/",
"title": "add enableInExpoDevelopment type to options.d.ts",
"type": "issue"
},
{
"action": "closed",
"author": "kamilogorek",
"comment_id": null,
"datetime": 1568194566000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "kamilogorek",
"comment_id": 530303136,
"datetime": 1568194566000,
"masked_author": "username_1",
"text": "This option is not applicable to all @sentry/browser environments, thus it's not applicable here. It should be applied to either @sentry/react-native or directly to sentry-expo by extending `Options` type. Cheers!",
"title": null,
"type": "comment"
}
] | 2 | 3 | 588 | false | false | 588 | false |
kubeflow/manifests | kubeflow | 558,755,877 | 833 | {
"number": 833,
"repo": "manifests",
"user_login": "kubeflow"
} | [
{
"action": "opened",
"author": "cliveseldon",
"comment_id": null,
"datetime": 1580676680000,
"masked_author": "username_0",
"text": "Fixes #832\n\n<!-- Reviewable:start -->\n---\nThis change is [<img src=\"https://reviewable.io/review_button.svg\" height=\"34\" align=\"absmiddle\" alt=\"Reviewable\"/>](https://reviewable.io/reviews/kubeflow/manifests/833)\n<!-- Reviewable:end -->",
"title": "Remove 1.15 selectors for Seldon",
"type": "issue"
},
{
"action": "created",
"author": "WillBeebe",
"comment_id": 581188961,
"datetime": 1580685503000,
"masked_author": "username_1",
"text": "These changes LGTM but can we add a comment or console log to where they're generated in the Makefile?\r\n\r\nhttps://github.com/kubeflow/manifests/blob/master/seldon/Makefile#L5-L8",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jlewi",
"comment_id": 581530238,
"datetime": 1580751534000,
"masked_author": "username_2",
"text": "/lgtm\r\n/approve\r\n\r\nWe will need to cherry-pick this into the 1.0 branch after it is merged.",
"title": null,
"type": "comment"
}
] | 4 | 5 | 2,090 | false | true | 504 | false |
mhhollomon/yalr | null | 506,319,969 | 25 | null | [
{
"action": "opened",
"author": "mhhollomon",
"comment_id": null,
"datetime": 1570970288000,
"masked_author": "username_0",
"text": "Currently you can only set the precedence of a terminal by predefining it and using `@prec=`.\r\n\r\nFor large sets of terminals this can be rather tedious. Also, it requires you to come up with a name for each terminal.\r\n\r\ninstead, allow the following:\r\n~~~\r\nprecedence 200 'x', 'y', FOO;\r\n~~~\r\n\r\nthe precedence statement should define any new terminals used inline like productions do.\r\n\r\nMaybe do the same for associativity?",
"title": "Create 'precedence' statement",
"type": "issue"
},
{
"action": "created",
"author": "mhhollomon",
"comment_id": 544205134,
"datetime": 1571527141000,
"masked_author": "username_0",
"text": "Forgot to mention in commit 27e5a5e \r\n\r\nthis is complete.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "mhhollomon",
"comment_id": null,
"datetime": 1571527142000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 480 | false | false | 480 | false |
ValveSoftware/Proton | ValveSoftware | 624,515,005 | 3,913 | null | [
{
"action": "opened",
"author": "Gman1988",
"comment_id": null,
"datetime": 1590448025000,
"masked_author": "username_0",
"text": "# Compatibility Report\r\n- Name of the game with compatibility issues: Black Desert Online\r\n- Steam AppID of the game: 582660\r\n\r\n## System Information\r\n- GPU: Radeon R9 290/290X / 390/390X\r\n- Driver/LLVM version: Mesa 20.0.7\r\n- Kernel version: 5.6.13-300.fc32.x86_64\r\n- Link to full system information report as [Gist](https://gist.github.com/username_0/de2744bb73414bf6282af4d4b3c1561b):\r\n- Proton version: 5.0.7\r\n\r\n## I confirm:\r\n- [ x] that I haven't found an existing compatibility report for this game.\r\n- [ x] that I have checked whether there are updates for my system available.\r\n\r\n<!-- Please add `PROTON_LOG=1 %command%` to the game's launch options and drag\r\nand drop the generated `$HOME/steam-$APPID.log` into this issue report -->\r\n\r\nhttps://gist.github.com/username_0/966d43fac5e79323652626a29ec9d617\r\n\r\n[steam-582660.log](https://github.com/ValveSoftware/Proton/files/4679121/steam-582660.log)\r\n\r\n\r\n## Symptoms <!-- What's the problem? -->\r\n\r\nGame installs fine. Launcher open up, when I press \"start game\", account creation shows up for 1 sec and that is it, nothing else is happening.\r\n\r\n## Reproduction\r\n\r\n1. Install game.\r\n2. Launch game.\r\n3. Press \"start game\"\r\n4. Observe issue.",
"title": "Black Desert Online - game launcher crashes ",
"type": "issue"
},
{
"action": "closed",
"author": "kisak-valve",
"comment_id": null,
"datetime": 1590448249000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "kisak-valve",
"comment_id": 633738791,
"datetime": 1590448249000,
"masked_author": "username_1",
"text": "Hello @username_0, we're using one issue report per unofficially supported game title, so I've gone ahead and transferred this issue report to https://github.com/ValveSoftware/Proton/issues/3620#issuecomment-633738742.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,411 | false | false | 1,411 | true |
sibson/redbeat | null | 561,732,143 | 151 | null | [
{
"action": "opened",
"author": "joelbixbyrevel",
"comment_id": null,
"datetime": 1581091853000,
"masked_author": "username_0",
"text": "I've noticed that my tasks are scheduled strangely between restarts. For internals, they wait the entire amount of time of the internal.\r\n\r\nAfter digging into the source, I've narrowed down the issue to not reading the meta data from the schedules stored on Redis. `setup_schedule` is currently calling a method `maybe_entry` which creates a new entry with a last_run_at set to the current time.\r\n\r\nhttps://github.com/username_1/redbeat/blob/ff7685f8a199cf6c5e0dc9084ff30a0e14b2f5da/redbeat/schedulers.py#L397\r\n\r\nIs it expected that the interval timer would be restarted between beat restarts? or am I correct in thinking that it should factor the last_run_at in from the meta data in Redis after restart.\r\n\r\nI did a little research into what the other schedulers are doing and I believe some are taking into account the last_run_at saved in their respective stores. django-celery-beat for example does a `create_or_update` when instantiating the Entry class in the `from_entry` method. That method is very similar to your own `from_key`\r\n\r\nhttps://github.com/celery/django-celery-beat/blob/master/django_celery_beat/schedulers.py#L183-L186\r\n\r\nredbeat: 0.13.0\r\ndjango: 2.2.5\r\nredis-py: 3.3.11",
"title": "`Entry.last_run_at` Not Respected Between Restarts",
"type": "issue"
},
{
"action": "created",
"author": "joelbixbyrevel",
"comment_id": 583482193,
"datetime": 1581092699000,
"masked_author": "username_0",
"text": "Heres a sample of output:\r\n\r\n```\r\n[INFO][celery.beat][2020-02-07T16:18:14.501950] beat: Starting...\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:14.579097] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:19.586736] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:24.590713] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:29.599073] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:34.607386] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:39.611007] Loading 1 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:39.624242] Scheduler: Sending due task health check every 30 (revel.common.celery.health_check)\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:44.657929] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:49.665227] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:54.673581] Loading 0 tasks\r\n\r\n\r\n[INFO][celery.beat][2020-02-07T16:18:59.274981] beat: Starting...\r\n[INFO][redbeat.schedulers][2020-02-07T16:18:59.346911] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:19:04.350679] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:19:09.357298] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:19:14.361769] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:19:19.365981] Loading 0 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:19:24.375015] Loading 1 tasks\r\n[INFO][redbeat.schedulers][2020-02-07T16:19:24.393131] Scheduler: Sending due task health check every 30 (revel.common.celery.health_check)\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sibson",
"comment_id": 588007549,
"datetime": 1582080460000,
"masked_author": "username_1",
"text": "Without looking too deeply, I suspect redbeat isn't properly handling ```relative``` as defined in https://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#available-fields. If that's the case then as you've surmised the usage of last_run_at is problematic.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mathieu-lemay",
"comment_id": 887052613,
"datetime": 1627336430000,
"masked_author": "username_2",
"text": "I don't think it has anything to do with the `relative` property. It's as @username_0 found, when celery beat is started, the schedules are recreated in redis with a `last_run_time` set to now instead of being taken from the data in redis.\r\n\r\nSo if I have a task scheduled to run every hour that ran 55 minutes ago and I restart celery beat, the task will only be sent again in 1h, not 5 minutes.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "sibson",
"comment_id": 1002100397,
"datetime": 1640697307000,
"masked_author": "username_1",
"text": "Related? https://github.com/username_1/redbeat/issues/210",
"title": null,
"type": "comment"
}
] | 3 | 5 | 3,451 | false | false | 3,451 | true |
Shuttle/Shuttle.Esb.RabbitMQ | Shuttle | 601,921,642 | 14 | {
"number": 14,
"repo": "Shuttle.Esb.RabbitMQ",
"user_login": "Shuttle"
} | [
{
"action": "opened",
"author": "zgabi",
"comment_id": null,
"datetime": 1587124579000,
"masked_author": "username_0",
"text": "",
"title": "Upgrade to RabbitMQ 6.0.0",
"type": "issue"
},
{
"action": "created",
"author": "eben-roux",
"comment_id": 616306368,
"datetime": 1587358081000,
"masked_author": "username_1",
"text": "Thanks for the PR Gábor. I'll get to it as soon as I can. My home Internet was down from Friday 13h30 but up again today (Monday).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zgabi",
"comment_id": 617698337,
"datetime": 1587551996000,
"masked_author": "username_0",
"text": "Please do not merge it yet... I think there is a bug in it... i'm still investigating the problem.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "eben-roux",
"comment_id": 618162864,
"datetime": 1587614343000,
"masked_author": "username_1",
"text": "No problem. I'll also take a look this weekend.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zgabi",
"comment_id": 618283516,
"datetime": 1587633231000,
"masked_author": "username_0",
"text": "I've investigated the problem, and it is not in this PR.\r\nThe new RabbitMQ using Task.Run. which is using ThreadPool.\r\n\r\nAnd other part of our application (more exactly another 3rd party maps library) is also using ThreadPool, and some tasks are taking very long time (because of a bug), and no free threads for returning the messages.\r\n\r\nSo this PR should be OK, please check it when you have time.",
"title": null,
"type": "comment"
}
] | 2 | 5 | 677 | false | false | 677 | false |
igrr/mkspiffs | null | 660,512,022 | 75 | null | [
{
"action": "opened",
"author": "morcillo",
"comment_id": null,
"datetime": 1595123536000,
"masked_author": "username_0",
"text": "Hello, I have a project that I just need to create a new file system to upload with my firmware every time I upload it. The thing is that my VM with esp-open-rtos isn't working well anymore and I wanted to upgrade to windows since it's my host OS. The thing is that comparing the binaries generated by both mkspiffs I noticed that there are some differences that may affect the functionality of the code, since it may not be able to read the file system. \r\n\r\nUnfortunately I won't be able to test this for a few days, maybe weeks. Has anyone changes host OS me? are those difference significant or not? My firmware uses esp-open-sdk that uses version 0.3.6",
"title": "Compatibility with esp-open-rtos mkspiffs",
"type": "issue"
}
] | 1 | 1 | 656 | false | false | 656 | false |
fsprojects/Paket | fsprojects | 588,412,174 | 3,816 | null | [
{
"action": "opened",
"author": "TimLariviere",
"comment_id": null,
"datetime": 1585228501000,
"masked_author": "username_0",
"text": "### Description\r\n\r\nStarting v4.5.0, [Xamarin.Forms](https://www.nuget.org/packages/Xamarin.Forms/4.5.0.495) added 2 new folders in their NuGet packages, under the folder `build`, named `XCODE10` and `XCODE11`.\r\n\r\nWhen installing this package with Paket, the following warnings appear:\r\n```\r\nXamarin.Forms 4.5.0.495 unzipped to /Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms\r\nCould not detect any platforms from 'XCODE10' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE10/Xamarin.Forms.Platform.iOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE10' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE10/Xamarin.Forms.Platform.macOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE10' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE10/Xamarin.Forms.Platform.iOS.pdb', please tell the package authors\r\nCould not detect any platforms from 'XCODE11' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE11/Xamarin.Forms.Platform.iOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE11' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE11/Xamarin.Forms.Platform.macOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE11' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE11/Xamarin.Forms.Platform.iOS.pdb', please tell the package authors\r\nCould not detect any platforms from 'XCODE10' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE10/Xamarin.Forms.Platform.iOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE10' in 
'/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE10/Xamarin.Forms.Platform.macOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE10' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE10/Xamarin.Forms.Platform.iOS.pdb', please tell the package authors\r\nCould not detect any platforms from 'XCODE11' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE11/Xamarin.Forms.Platform.iOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE11' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE11/Xamarin.Forms.Platform.macOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE11' in '/Users/timothelariviere/Git/GitHub/Fabulous/packages/androidapp/Xamarin.Forms/build/XCODE11/Xamarin.Forms.Platform.iOS.pdb', please tell the package authors\r\n```\r\n\r\n### Repro steps\r\n\r\n1. Add a reference to `Xamarin.Forms 4.5.0.495`\r\n```\r\nnuget Xamarin.Forms 4.5.0.495\r\n```\r\n\r\n2. Run `paket install`\r\n\r\n3. Warnings about `XCODE10` and `XCODE11` should appear (see example above)\r\n\r\n### Expected behavior\r\n\r\nI'm not exactly sure what these warnings mean, but I would expect no warning at all.\r\n\r\n### Actual behavior\r\n\r\nWarnings are appearing.\r\n\r\n### Known workarounds\r\n\r\nNo workaround known.",
"title": "Could not detect any platforms from 'XCODE10' / 'XCODE11'",
"type": "issue"
},
{
"action": "created",
"author": "vshapenko",
"comment_id": 616132571,
"datetime": 1587302274000,
"masked_author": "username_1",
"text": "Same for my project, looks like paket does not properly processes content inside the build folder of nuget package",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "forki",
"comment_id": 616138003,
"datetime": 1587303809000,
"masked_author": "username_2",
"text": "What even is xcode10? Do we have similar?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vshapenko",
"comment_id": 616138748,
"datetime": 1587304016000,
"masked_author": "username_1",
"text": "That is xamarin forms magic. If we look inside xamarin.forms 4.5 nuget\npackage,we find xcode 11 and xcode10 folder inside 'build' folder. Files\nthere are very important to build an ios app. Honestly,i do not know,why\nxamarin team took such decision.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TimLariviere",
"comment_id": 616142031,
"datetime": 1587304927000,
"masked_author": "username_0",
"text": "I noticed that an assembly named `Xamarin.Forms.Platform.iOS` (containing iOS-specific code - same with macOS) which was previously under `lib\\Xamarin.iOS10` has been removed and replaced by 2 other ones under `build\\XCode10` and `build\\XCode11`.\r\n\r\nI'm guessing the Xamarin.Forms team had to provide dlls specifically compatible with the XCode version you have installed on your mac.\r\n\r\nLike @username_1, some of my iOS projects managed by Paket failed to build once updated by Paket.\r\nThey were missing links to `Xamarin.Forms.props` and `Xamarin.Forms.targets`, for some reasons.\r\nThose files are at the root of the `build` directory.\r\nOnce I added the links manually, the projects built correctly.\r\n\r\nThe `targets` file automatically chooses the appropriate XCode10/11 dll, so I don't think it really concerns Paket.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "forki",
"comment_id": 616142416,
"datetime": 1587305046000,
"masked_author": "username_2",
"text": "Can you please describe the workaround in detail? Thx",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vshapenko",
"comment_id": 616143302,
"datetime": 1587305285000,
"masked_author": "username_1",
"text": "@username_2 , does paket processes .props and .targets files from 'build' folder?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TimLariviere",
"comment_id": 616144550,
"datetime": 1587305639000,
"masked_author": "username_0",
"text": "Workaround of the build issue once upgrading to Xamarin.Forms 4.5.0.396 or newer:\r\n\r\nAfter running `paket install`, Paket should have removed the reference to `Xamarin.Forms.Platform.iOS.dll` inside the iOS fsproj file.\r\n\r\nTo be able to continue building your iOS (or macOS) project, you'll need to do the following:\r\n- Add this line at the start of the iOS fsproj file, after the `<Project>` start tag\r\n```\r\n<Import Project=\"..\\packages\\Xamarin.Forms\\build\\Xamarin.Forms.props\" Condition=\"Exists('..\\packages\\Xamarin.Forms\\build\\Xamarin.Forms.props')\" />\r\n```\r\n- Add this line at the endof the iOS fsproj file, before the `</Project>` end tag\r\n```\r\n<Import Project=\"..\\packages\\Xamarin.Forms\\build\\Xamarin.Forms.targets\" Condition=\"Exists('..\\packages\\Xamarin.Forms\\build\\Xamarin.Forms.targets')\" />\r\n```\r\n\r\nFor reference, see this commit:\r\nhttps://github.com/username_0/FabulousPlanets/commit/e5e8b6e39f9343896d91fc427decd9b0d94a61e1#diff-b5787cbb1a1fa09cbd9f445d240677dc",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vshapenko",
"comment_id": 616207563,
"datetime": 1587322772000,
"masked_author": "username_1",
"text": "Ok, after some deep dive into paket sources, i think i can identify the problem: \r\nInstallModel.fs, part there we parse build folder. According to nuspec : https://docs.microsoft.com/ru-ru/nuget/create-packages/creating-a-package#include-msbuild-props-and-targets-in-a-package props and targets files from build folder should be included in project. However, paket just scans subfolders for dll with proper tfm, and it works not in 100% cases, as we can see.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "forki",
"comment_id": 616334097,
"datetime": 1587363320000,
"masked_author": "username_2",
"text": "https://github.com/fsprojects/Paket/pull/3830",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "forki",
"comment_id": 620408367,
"datetime": 1588055440000,
"masked_author": "username_2",
"text": "Please try with 6.0.0-alpha025",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TimLariviere",
"comment_id": 620594074,
"datetime": 1588079150000,
"masked_author": "username_0",
"text": "@username_2 I'm still having the issue with the following `paket.dependencies`.\r\n\r\n```\r\nversion 6.0.0-alpha025\r\nframework xamarinios\r\nsource https://www.nuget.org/api/v2\r\nnuget Xamarin.Forms ~> 4.5.0\r\n```\r\n\r\n```\r\ntimothelariviere@Timothes-MacBook-Pro FabulousPlanets % mono .paket/paket.exe install\r\nPaket version 6.0.0-alpha025\r\nResolving packages for group Main:\r\n - Xamarin.Forms 4.5.0.657+219-sha.3d0108ce6-azdo.3648568\r\nLocked version resolution written to /Users/timothelariviere/Git/GitHub/FabulousPlanets/paket.lock\r\nInstalling into projects:\r\n - Creating model and downloading packages.\r\nCould not detect any platforms from 'XCODE10' in '/Users/timothelariviere/Git/GitHub/FabulousPlanets/packages/Xamarin.Forms/build/XCODE10/Xamarin.Forms.Platform.iOS.dll', please tell the package authors\r\nCould not detect any platforms from 'XCODE11' in '/Users/timothelariviere/Git/GitHub/FabulousPlanets/packages/Xamarin.Forms/build/XCODE11/Xamarin.Forms.Platform.iOS.dll', please tell the package authors\r\n - Installing for projects\r\n - FabulousPlanets.Android/paket.references -> FabulousPlanets.Android/FabulousPlanets.Android.fsproj\r\n - FabulousPlanets.iOS/paket.references -> FabulousPlanets.iOS/FabulousPlanets.iOS.fsproj\r\n - FabulousPlanets/paket.references -> FabulousPlanets/FabulousPlanets.fsproj\r\nF# project /Users/timothelariviere/Git/GitHub/FabulousPlanets/FabulousPlanets/FabulousPlanets.fsproj does not reference FSharp.Core.\r\nPerformance:\r\n - Resolver: 4 seconds (1 runs)\r\n - Runtime: 115 milliseconds\r\n - Blocked (retrieving package details): 504 milliseconds (1 times)\r\n - Blocked (retrieving package versions): 3 seconds (1 times)\r\n - Disk IO: 671 milliseconds\r\n - Average Request Time: 1 second\r\n - Number of Requests: 3\r\n - Runtime: 5 seconds\r\nPaket omitted 10 warnings similar to the ones above. You can see them in verbose mode.\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "forki",
"comment_id": 621026403,
"datetime": 1588143874000,
"masked_author": "username_2",
"text": "can you please upload a zip with a small sample?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TimLariviere",
"comment_id": 621211877,
"datetime": 1588167391000,
"masked_author": "username_0",
"text": "[Sample-Paket-XCode.zip](https://github.com/fsprojects/Paket/files/4551968/Sample-Paket-XCode.zip)\r\n\r\nI created the default Xamarin iOS template and added Xamarin.Forms 4.5.0 via `paket install` (magic mode).\r\n\r\nIf you remove `paket.lock` and run `paket install` (`mono paket.exe install` on macOS), you'll see the warnings and also #3831 (the Xamarin.Forms.targets/props is not imported in the csproj by Paket).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vshapenko",
"comment_id": 622305173,
"datetime": 1588322855000,
"masked_author": "username_1",
"text": "\\build\r\n \\netstandard1.4\r\n \\Contoso.Utility.UsefulStuff.props\r\n \\Contoso.Utility.UsefulStuff.targets\r\n \\net462\r\n \\Contoso.Utility.UsefulStuff.props\r\n \\Contoso.Utility.UsefulStuff.targets\r\n\r\nSo, the proper algorithm would be following: \r\n1. Determine tfm in project we add nuget package to \r\n2. search build folder for either global props and targets or for tfm/.build or .props\r\n3. Include search results into the project file.\r\n4. import_targets flag looks useless, because setting it false breaks the expected behavior for nuget package and does not include important parts of package into project.\r\n5. Dll files from build folder should not be included as project references (i hope paket follows this rule).\r\n\r\nSorry for being some rude, but i have problems with clean and proper description of issues.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "forki",
"comment_id": 622994287,
"datetime": 1588443612000,
"masked_author": "username_2",
"text": "ok I pushed another fix. It's no longer complaining.\r\n\r\nBut which props and targets file from the sample should it import exactly?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "TimLariviere",
"comment_id": 623010340,
"datetime": 1588451992000,
"masked_author": "username_0",
"text": "@username_2 I confirm that the warnings disappeared with 6.0.0-alpha027.\r\n\r\nFor the props/targets, Xamarin.Forms requires to import both `build\\Xamarin.Forms.props` and `build\\Xamarin.Forms.targets` (inside Xamarin.Forms NuGet package) in order to compile.\r\nIt's these files that import the dlls from the XCODE10/XCODE11 folders.\r\n\r\nNote that `build\\Xamarin.Forms.DefaultItems.props` is imported by `build\\Xamarin.Forms.props`, same with targets.\r\nSo I guess a props/targets file inside the `build` folder with the same name as the package is meant to be imported by default?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "vshapenko",
"comment_id": 623063706,
"datetime": 1588488444000,
"masked_author": "username_1",
"text": "@username_2 , if we look into the srtucture of directory build, we will see the following:\r\n- Xamarin.Forms.props\r\n-Xamarin.Forms.targets\r\n-Xamarin.Forms.Default.props\r\n-Xamarin.Forms.Default.targets\r\nMonoAndroid10\\\r\nnet46\\\r\nnetstandard2.0\\\r\nXCODE10\\\r\nXCODE11\\\r\n\r\nWe need to put <packagename>.props at the beginning of project file, put <packagename>.targets to the end of project file. But before we do this, we should scan folders for props and targets for tfm related files, i.e. there may be <tfm>/<packagename>.props, which we should take instead root props file. \r\nImportant: Only props and targets should be taken, neither dll or other one. Name of file should match the package name. \r\nP.,S. There can be a buildMultitarget directory in package : \r\n\r\n MSBuild .props and .targets files for cross-framework targeting can be placed in the \\buildMultiTargeting folder. During package installation, NuGet adds the corresponding <Import> elements to the project file with the condition, that the target framework is not set (the MSBuild property $(TargetFramework) must be empty).\r\n\r\nP.S.S [Here](url) is more detailed and full information about possible folder names and their behaviour.",
"title": null,
"type": "comment"
}
] | 3 | 18 | 11,256 | false | false | 11,256 | true |
modichirag/flowpm | null | 518,367,510 | 14 | null | [
{
"action": "opened",
"author": "EiffL",
"comment_id": null,
"datetime": 1573034850000,
"masked_author": "username_0",
"text": "This issue is to track the development of a benchmark script to test the scaling of a distributed FFT in Mesh TensorFlow under different environments.\r\n\r\nI have a prototype script for Cori in https://github.com/username_1/flowpm/tree/mesh/scripts\r\n\r\nBut there is a lot of room for improvement, in particular:\r\n\r\n - [ ] Measuring the communication and compute times\r\n - [ ] Support TPU enviroments as well as SLURM cluster\r\n - [ ] Automatically run a scaling experiment and report the results\r\n\r\nHelp is most welcome :-)",
"title": "Building FFT benchmark script",
"type": "issue"
},
{
"action": "created",
"author": "EiffL",
"comment_id": 552157947,
"datetime": 1573355280000,
"masked_author": "username_0",
"text": "Adding a PR to support exporting timeline to profile the execution of FFTs, look at this beautiful graph of flows between the 16 GPUs of a 1D mesh on 2 cori-gpu nodes: \r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "EiffL",
"comment_id": 552263768,
"datetime": 1573437656000,
"masked_author": "username_0",
"text": "We now have an FFT benchmark that runs on both TPUs and Slurm cluster. And the first results are super interesting!\r\n\r\nThe scripts running these benchmarks are here: https://github.com/username_1/flowpm/tree/mesh/scripts\r\nAll it does is sample a batch of random 3D cubes i.e. tensors of size 128x512x512x512, execute a forward/backward FFT transform, and check accuracy. This runs on a 1D mesh, with the array splitted along the first spatial dimension.\r\n\r\nHere are 2 traces captured on Cori on a one vs two nodes for the same volume size:\r\n - 1 Node: [timeline_1_node.log](https://github.com/username_1/flowpm/files/3829345/timeline_1_node.log)\r\n - 2 Nodes: [timeline_2_nodes.log](https://github.com/username_1/flowpm/files/3829346/timeline_2_nodes.log)\r\nTo see these traces, simply go to Chrome [tracing plugin](about:tracing) and load these files.\r\n\r\nHere, the computation on 2 nodes is about 10x slower than on a single node. It is certainly expected that cross-nodes communications are going to be the bottleneck, but we are using completely default TensorFlow with gRPC at the moment. There is a lot of room for improvement there.\r\n\r\nAlso ran a trace on a TPU slice with the same code essentially. There we get very interesting numbers and insight into our very naive first implementation of 3D FFTs\r\n - Dominated by all-to-all communications, 56% of the time, and 10% of the time doing memory operations, during array transpose:\r\n\r\nFFTs account for 18% of the time, and even there they only use a tiny amount of the compute.\r\n - It looks like we can improve on the amount of communication by a factor 2, by combining the transpose and reshape steps we are currently using to transpose and split the memory array between two FFTs:\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "EiffL",
"comment_id": 582455102,
"datetime": 1580915658000,
"masked_author": "username_0",
"text": "@username_1 can you just report on your latest experiments. and then we can tick the last box and close this issue :-)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "modichirag",
"comment_id": 585510695,
"datetime": 1581559015000,
"masked_author": "username_1",
"text": "Timings for a single FFT+iFFT for different grids and configurations of mesh on 1, 2, 4, 8 nodes of GPUs on Cori\r\n\r\n\r\n\r\nAnd this is the comparison with timings for the same operation in the differentiable python code ([vmad](https://github.com/rainwoodman/vmad)) run on Cori Haswell ( 1 and 2 nodes respectively as required by the number of processes)\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "EiffL",
"comment_id": null,
"datetime": 1614974827000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 3,451 | false | false | 3,451 | true |
CSCfi/metadata-submitter | CSCfi | 652,264,966 | 74 | {
"number": 74,
"repo": "metadata-submitter",
"user_login": "CSCfi"
} | [
{
"action": "opened",
"author": "otahontas",
"comment_id": null,
"datetime": 1594123261000,
"masked_author": "username_0",
"text": "### Description\r\n\r\nThis PR adds pagination to query responses, following our api specs.\r\n\r\n### Related issues\r\nFixes #54 \r\n\r\n### Type of change\r\n\r\n- [x] New feature (non-breaking change which adds functionality)\r\n\r\n### Changes Made\r\n- Added pagination parameter checks and pagination response to handlers module\r\n- Added pagination logic to operators module\r\n- Added db operations for size of query results (yepp, this can't be done directly with cursor...) \r\n- Moved away json dumps from operators module --> now operators are only returning strings, lists or dicts, which can then be dumped to json / xml in handlers module. Makes program logic clearer imo.\r\n\r\n+ added some tiny fixes I noticed to handlers errors \r\n\r\n### Testing\r\n- [x] Unit Tests\r\n- [x] Integration Tests\r\n\r\n### Mentions\r\nAre we ok with this or do we wan't to include for example link headers (see https://stackoverflow.com/questions/12168624/pagination-response-payload-from-a-restful-api)?",
"title": "Feature/add pagination",
"type": "issue"
},
{
"action": "created",
"author": "otahontas",
"comment_id": 655285644,
"datetime": 1594184558000,
"masked_author": "username_0",
"text": "Added issue for links: #75",
"title": null,
"type": "comment"
}
] | 1 | 2 | 987 | false | false | 987 | false |
gnembon/fabric-carpet | null | 603,649,877 | 239 | null | [
{
"action": "opened",
"author": "supersaiyansubtlety",
"comment_id": null,
"datetime": 1587434654000,
"masked_author": "username_0",
"text": "I've noticed that right clicking a dispenser with a cactus only makes it face the direction opposite its current direction, rather than cycling through 4 different directions like it does with other blocks.",
"title": "flippinCactus only inverts direction of dispensers",
"type": "issue"
},
{
"action": "created",
"author": "gnembon",
"comment_id": 616918027,
"datetime": 1587437121000,
"masked_author": "username_1",
"text": "that has been this way by design to flip hoppers, dispensers etc - things that can be placed facing 6 directions, and for blocks that inherently face only 4 directions, like side hoppers or repeaters - to rotate them. \r\n\r\ntldr, its WAI",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "gnembon",
"comment_id": null,
"datetime": 1587437123000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "supersaiyansubtlety",
"comment_id": 616959489,
"datetime": 1587446668000,
"masked_author": "username_0",
"text": "Can I ask why? It doesn't seem like the most useful behavior.",
"title": null,
"type": "comment"
}
] | 2 | 4 | 502 | false | false | 502 | false |
GameDevLiege/GameDevTwitchPlays12 | GameDevLiege | 294,712,369 | 53 | {
"number": 53,
"repo": "GameDevTwitchPlays12",
"user_login": "GameDevLiege"
} | [
{
"action": "opened",
"author": "didztm",
"comment_id": null,
"datetime": 1517913947000,
"masked_author": "username_0",
"text": "Added 2 classes:\r\nPhysicsManager <- Class to implement during integration (facade)\r\nFactionManager: manages the creation of the 4 factions\r\n\r\nMole movement is functional",
"title": "Add Fix movement and facing class",
"type": "issue"
},
{
"action": "created",
"author": "didztm",
"comment_id": 363506709,
"datetime": 1517939368000,
"masked_author": "username_0",
"text": "Added itemEvent ---> lifts the found item and returns it + the player",
"title": null,
"type": "comment"
}
] | 1 | 2 | 243 | false | false | 243 | false |
neiesc/neiesc.github.io | null | 588,914,075 | 275 | {
"number": 275,
"repo": "neiesc.github.io",
"user_login": "neiesc"
} | [
{
"action": "created",
"author": "neiesc",
"comment_id": 605405888,
"datetime": 1585378650000,
"masked_author": "username_0",
"text": "#56",
"title": null,
"type": "comment"
}
] | 2 | 2 | 5,635 | false | true | 3 | false |
crate/crate | crate | 212,452,492 | 5,049 | {
"number": 5049,
"repo": "crate",
"user_login": "crate"
} | [
{
"action": "opened",
"author": "mfussenegger",
"comment_id": null,
"datetime": 1488897780000,
"masked_author": "username_0",
"text": "Seems like private functions (invokespecial vs invokevirtual) and\ndead-code elimination works better than hiding the logic in a\nspecialized predicate/ondoc lambda implementation\n\nfrom\n\n 343.276 ±(99.9%) 9.036 ms/op [Average]\n\nto\n\n 326.965 ±(99.9%) 6.435 ms/op [Average]",
"title": "Optimize LuceneBatchIterator",
"type": "issue"
}
] | 2 | 2 | 275 | false | true | 275 | false |
architecture-building-systems/CityEnergyAnalyst | architecture-building-systems | 430,950,004 | 1,866 | null | [
{
"action": "opened",
"author": "daren-thomas",
"comment_id": null,
"datetime": 1554814312000,
"masked_author": "username_0",
"text": "As a User, when I change the selected folder in the \"Projects\" dialog on the Dashboard (http://localhost:5050/project/show), I want the list of scenario-names to be updated to valid entries for that project.\n\nCurrently, the list from the previous folder is left until the dialog is saved. This is a bug.",
"title": "Reload list of scenario-names when project folder is changed in the Dashboard",
"type": "issue"
},
{
"action": "created",
"author": "daren-thomas",
"comment_id": 492998742,
"datetime": 1558000007000,
"masked_author": "username_0",
"text": "I will check to see if still relevant after landing page changes",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "daren-thomas",
"comment_id": 494419141,
"datetime": 1558449417000,
"masked_author": "username_0",
"text": "@username_1 do you think this is still relevant? I think, after the work you've done on the project management blueprint, that we can maybe just close this?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "reyery",
"comment_id": 494676121,
"datetime": 1558507931000,
"masked_author": "username_1",
"text": "@username_0 currently \"Projects\" can be accessed using the gear icon in the bottom left. We can probably close this issue already since we can already switch scenarios without using this page. However, I am curious regarding what we can do with the remaining settings. Do we still need them? We could move them to project management, then we can remove the gear icon thing. \r\n\r\nAt the same time, maybe we can just remove the bar in the bottom left if we have no plans to do anything with the icons, so we can close #1911. I might just put the glossary search bar that I have in the top left to that bar. What do you think?\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "daren-thomas",
"comment_id": 494690020,
"datetime": 1558510752000,
"masked_author": "username_0",
"text": "@username_1 oh, so you moved the project page down there... i thought you just removed it...\r\n\r\nthere is an [issue somewhere](https://github.com/architecture-building-systems/CityEnergyAnalyst/issues/1925) about these:\r\n\r\n\r\n\r\nI'm actually fine with removing this page entirely. I think the tools that use these parameters are sufficient for setting them.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "reyery",
"comment_id": 496066667,
"datetime": 1558928083000,
"masked_author": "username_1",
"text": "@username_0 I am guessing I can go ahead and remove this page entirely to close this issue? The only concern left is the weather setting, which I feel is a separate unrelated issue.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "daren-thomas",
"comment_id": 496472755,
"datetime": 1559042037000,
"masked_author": "username_0",
"text": "@username_1 did you make progress on this?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "reyery",
"comment_id": 496473927,
"datetime": 1559042284000,
"masked_author": "username_1",
"text": "@username_0 not yet. I'm just planning to remove it anyway.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "daren-thomas",
"comment_id": null,
"datetime": 1560350015000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 9 | 2,054 | false | false | 2,054 | true |
freifunk-gluon/gluon | freifunk-gluon | 383,920,128 | 1,580 | {
"number": 1580,
"repo": "gluon",
"user_login": "freifunk-gluon"
} | [
{
"action": "opened",
"author": "ecsv",
"comment_id": null,
"datetime": 1543004140000,
"masked_author": "username_0",
"text": "Collect module symvers for all external modules to make them available\r\nfor modpost. This fixes dependencies for most external modules.\r\n\r\n root@ffv-525400123456:/# modinfo batman-adv\r\n module: /lib/modules/4.4.153/batman-adv.ko\r\n alias: net-pf-16-proto-16-family-batadv\r\n alias: rtnl-link-batadv\r\n version: openwrt-2018.1-5\r\n description: B.A.T.M.A.N. advanced\r\n author: Marek Lindner <mareklindner@neomailbox.ch>, Simon Wunderlich <sw@simonwunderlich.de>\r\n license: GPL\r\n depends:\r\n\r\nAfter:\r\n\r\n root@ffv-525400123456:/# modinfo batman-adv\r\n module: /lib/modules/4.4.153/batman-adv.ko\r\n alias: net-pf-16-proto-16-family-batadv\r\n alias: rtnl-link-batadv\r\n version: openwrt-2018.1-5\r\n description: B.A.T.M.A.N. advanced\r\n author: Marek Lindner <mareklindner@neomailbox.ch>, Simon Wunderlich <sw@simonwunderlich.de>\r\n license: GPL\r\n depends: libcrc32c,cfg80211",
"title": "v2018.1.x: kernel: collect module symvers for external modules",
"type": "issue"
},
{
"action": "created",
"author": "rubo77",
"comment_id": 441353440,
"datetime": 1543049947000,
"masked_author": "username_1",
"text": "When was this bug introduced?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ecsv",
"comment_id": 441355245,
"datetime": 1543052216000,
"masked_author": "username_0",
"text": "It was triggered by https://github.com/freifunk-gluon/gluon/commit/0bb3742f51d1af0ebc1394c72e899524b0adf81d (which moved the the openwrt-routing feed to a newer commit in the openwrt-18.06 branch). The first gluon release with these commit is v2018.1.2.\r\n\r\nThe underlying bug (missing dependencies in external kernel modules) is there since the beginning of external kernel modules in LEDE/OpenWrt. Kernel modules before OpenWrt 18.06 usually used workarounds to force loading other kernel modules in the correct order. This kind of WAR was dropped for batman-adv in the openwrt-18.06 branch and its kernel loading mechanism was switched to AutoProbe.\r\n\r\n* https://github.com/openwrt-routing/packages/commit/1ba424a4d040210fdded11a7d6848f9b82857918\r\n\r\nThe fix here is a backport of the required OpenWrt-18.06 commit to LEDE 17.01. It is required because gluon uses the openwrt-18.06 branch of openwrt-routing with LEDE 17.01.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "heini66",
"comment_id": 441387099,
"datetime": 1543084478000,
"masked_author": "username_2",
"text": "my er-x is working with this patch and i've got a bat0 interface",
"title": null,
"type": "comment"
}
] | 3 | 4 | 2,039 | false | false | 2,039 | false |
xiph/rav1e | xiph | 494,127,910 | 1,662 | {
"number": 1662,
"repo": "rav1e",
"user_login": "xiph"
} | [
{
"action": "opened",
"author": "rzumer",
"comment_id": null,
"datetime": 1568648301000,
"masked_author": "username_0",
"text": "All sampling formats are supported (though not fully for non-4:2:0 as mentioned above).",
"title": "Remove input chroma sampling restriction in readme",
"type": "issue"
},
{
"action": "created",
"author": "shssoichiro",
"comment_id": 531857497,
"datetime": 1568651895000,
"masked_author": "username_1",
"text": "We should probably mention that monochrome (4:0:0) inputs are not yet supported.",
"title": null,
"type": "comment"
}
] | 3 | 3 | 445 | false | true | 167 | false |
Askannz/optimus-manager | null | 529,034,738 | 181 | null | [
{
"action": "opened",
"author": "noctuid",
"comment_id": null,
"datetime": 1574812415000,
"masked_author": "username_0",
"text": "**Describe the bug**\r\nI'm not using a display manager; I have `prime-offload` in my .xinitrc and am running `prime-switch` as root after stopping X. When switching to nvidia and running `startx`, I get a bunch of red dots as the background. If I start a program, it is visible for a fraction of a second, and then everything goes black. I've tried PCI, bbswitch, and no configuration (both PCI and bbswitch work with nvidia-xrun).\r\n\r\nIf I start X with a monitor plugged in, the screen doesn't go black at least. I assume this is the same problem mentioned in the FAQ.\r\n\r\n**System info**\r\nPlease include :\r\n- Distro: Arch\r\n- WM/DM: bspwm, no display manager\r\n- The version of optimus-manager you are using : optimus-manager-git\r\n\r\n**Logs**\r\nOutput of `xrand --listproviders` (after switching to intel) if it matters:\r\n```\r\nProviders: number : 1\r\nProvider 0: id: 0x43 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 3 outputs: 1 associated providers: 0 name:modesetting\r\n```\r\n\r\nThere were no errors or anything interesting in any of the optimus-manager log files, but I can provide them if you think they would help.\r\n\r\nHere is `Xorg.0.log`:\r\n[xorg.log](https://github.com/Askannz/optimus-manager/files/3894469/xorg.log)",
"title": "Screen goes black after starting X with nvidia",
"type": "issue"
},
{
"action": "created",
"author": "noctuid",
"comment_id": 613025659,
"datetime": 1586802053000,
"masked_author": "username_0",
"text": "Updated today to try again, and this issue no longer happens.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "noctuid",
"comment_id": null,
"datetime": 1586802053000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 1,305 | false | false | 1,305 | false |
rancher/rke | rancher | 519,702,071 | 1,761 | null | [
{
"action": "opened",
"author": "vvkkhjt",
"comment_id": null,
"datetime": 1573186779000,
"masked_author": "username_0",
"text": "**RKE version:**\r\n0.2.2\r\n**Docker version: (`docker version`,`docker info` preferred)**\r\n18.03.0-ce\r\n**Operating system and kernel: (`cat /etc/os-release`, `uname -r` preferred)**\r\nNAME=\"CentOS Linux\"\r\nVERSION=\"7 (Core)\"\r\nID=\"centos\"\r\nID_LIKE=\"rhel fedora\"\r\nVERSION_ID=\"7\"\r\nPRETTY_NAME=\"CentOS Linux 7 (Core)\"\r\nANSI_COLOR=\"0;31\"\r\nCPE_NAME=\"cpe:/o:centos:centos:7\"\r\nHOME_URL=\"https://www.centos.org/\"\r\nBUG_REPORT_URL=\"https://bugs.centos.org/\"\r\n\r\nCENTOS_MANTISBT_PROJECT=\"CentOS-7\"\r\nCENTOS_MANTISBT_PROJECT_VERSION=\"7\"\r\nREDHAT_SUPPORT_PRODUCT=\"centos\"\r\nREDHAT_SUPPORT_PRODUCT_VERSION=\"7\"\r\n**Type/provider of hosts: (VirtualBox/Bare-metal/AWS/GCE/DO)**\r\nBare-metal\r\n**cluster.yml file:**\r\n```yaml\r\n# If you intened to deploy Kubernetes in an air-gapped environment,\r\n# please consult the documentation on how to configure custom RKE images.\r\nnodes:\r\n- address: 192.168.20.121\r\n port: \"22\"\r\n internal_address: 192.168.20.121\r\n role:\r\n - controlplane\r\n - worker\r\n - etcd\r\n hostname_override: rancher121\r\n user: rketest\r\n docker_socket: /var/run/docker.sock\r\n ssh_key: \"\"\r\n ssh_key_path: ~/.ssh/id_rsa\r\n labels: {}\r\n- address: 192.168.20.77\r\n port: \"22\"\r\n internal_address: 192.168.20.77\r\n role:\r\n - worker\r\n hostname_override: rancher77\r\n user: rketest\r\n docker_socket: /run/docker.sock\r\n ssh_key: \"\"\r\n ssh_key_path: ~/.ssh/id_rsa\r\n labels: {}\r\n- address: 192.168.2.193\r\n port: \"22\"\r\n internal_address: 192.168.2.193\r\n role:\r\n - worker\r\n hostname_override: rancher193\r\n user: rketest\r\n docker_socket: /run/docker.sock\r\n ssh_key: \"\"\r\n ssh_key_path: ~/.ssh/id_rsa\r\n labels: {}\r\n- address: 192.168.220.50\r\n port: \"22\"\r\n internal_address: 192.168.220.50\r\n role:\r\n - worker\r\n hostname_override: rancher50\r\n user: rketest\r\n docker_socket: /run/docker.sock\r\n ssh_key: \"\"\r\n ssh_key_path: ~/.ssh/id_rsa\r\n labels: {}\r\nservices:\r\n etcd:\r\n image: \"\"\r\n extra_args: {}\r\n extra_binds: []\r\n extra_env: []\r\n external_urls: []\r\n ca_cert: \"\"\r\n cert: \"\"\r\n key: \"\"\r\n path: \"\"\r\n snapshot: false\r\n retention: \"\"\r\n creation: \"\"\r\n #backup_config:\r\n # interval_hours: 12\r\n # retention: 6\r\n # s3backupconfig:\r\n # access_key: 6227KC1ZC4EUGMM2PLGZ\r\n # secret_key: jPS+U6GlV1wY4P9JATACS52T2dIQPEmHZ+WDxIXh\r\n # bucket_name: k8setcd\r\n # region: \"\"\r\n # endpoint: http://minio.digi-sky.com:9000\r\n kube-api:\r\n nginx_proxy: rancher/rke-tools:v0.1.13\r\n cert_downloader: rancher/rke-tools:v0.1.13\r\n kubernetes_services_sidecar: rancher/rke-tools:v0.1.13\r\n kubedns: rancher/k8s-dns-kube-dns-amd64:1.14.10\r\n dnsmasq: rancher/k8s-dns-dnsmasq-nanny-amd64:1.14.10\r\n kubedns_sidecar: rancher/k8s-dns-sidecar-amd64:1.14.10\r\n kubedns_autoscaler: rancher/cluster-proportional-autoscaler-amd64:1.0.0\r\n kubernetes: rancher/hyperkube:v1.11.1-rancher1\r\n flannel: rancher/coreos-flannel:v0.9.1\r\n flannel_cni: rancher/coreos-flannel-cni:v0.2.0\r\n calico_node: rancher/calico-node:v3.1.1\r\n calico_cni: rancher/calico-cni:v3.1.1\r\n calico_controllers: \"\"\r\n calico_ctl: rancher/calico-ctl:v2.0.0\r\n canal_node: rancher/calico-node:v3.1.1\r\n canal_cni: rancher/calico-cni:v3.1.1\r\n canal_flannel: rancher/coreos-flannel:v0.9.1\r\n wave_node: weaveworks/weave-kube:2.1.2\r\n weave_cni: weaveworks/weave-npc:2.1.2\r\n pod_infra_container: rancher/pause-amd64:3.1\r\n ingress: rancher/nginx-ingress-controller:0.16.2-rancher1\r\n ingress_backend: rancher/nginx-ingress-controller-defaultbackend:1.4\r\n metrics_server: rancher/metrics-server-amd64:v0.2.1\r\nssh_key_path: ~/.ssh/id_rsa\r\nssh_agent_auth: false\r\nauthorization:\r\n mode: rbac\r\n options: {}\r\nignore_docker_version: true\r\nkubernetes_version: \"\"\r\nprivate_registries: []\r\ningress:\r\n provider: \"\"\r\n options: {}\r\n node_selector: {}\r\n extra_args: {}\r\ncluster_name: \"\"\r\ncloud_provider:\r\n name: \"\"\r\nprefix_path: \"\"\r\naddon_job_timeout: 0\r\nbastion_host:\r\n address: \"\"\r\n port: \"\"\r\n user: \"\"\r\n ssh_key: \"\"\r\n ssh_key_path: \"\"\r\nmonitoring:\r\n provider: \"\"\r\n options: {}\r\n```\r\n**Steps to Reproduce:**\r\n1,cluster cert is expire\r\n2,use rke0.2.2 rotate cert\r\n3,restart node\r\n4,cluster is fine,but api-server log has those error log\r\n**Results:**\r\n```bash\r\nI1108 04:18:54.522728 1 logs.go:49] http: TLS handshake error from 192.168.20.77:36946: EOF\r\nI1108 04:18:54.975992 1 logs.go:49] http: TLS handshake error from 127.0.0.1:49140: EOF\r\nI1108 04:18:55.688478 1 logs.go:49] http: TLS handshake error from 192.168.2.253:42702: EOF\r\nI1108 04:18:55.881587 1 logs.go:49] http: TLS handshake error from 192.168.20.77:36980: EOF\r\nI1108 04:18:55.893539 1 logs.go:49] http: TLS handshake error from 192.168.2.253:42604: EOF\r\nI1108 04:18:57.358746 1 logs.go:49] http: TLS handshake error from 192.168.220.50:33830: EOF\r\nI1108 04:18:57.579623 1 logs.go:49] http: TLS handshake error from 192.168.220.50:33824: EOF\r\nI1108 04:18:58.701217 1 logs.go:49] http: TLS handshake error from 10.42.0.224:49746: EOF\r\nI1108 04:18:59.061299 1 logs.go:49] http: TLS handshake error from 192.168.2.253:42646: EOF\r\nI1108 04:18:59.934138 1 controller.go:105] OpenAPI AggregationController: Processing item v1beta1.metrics.k8s.io\r\nE1108 04:18:59.971130 1 controller.go:111] loading OpenAPI spec for \"v1beta1.metrics.k8s.io\" failed with: OpenAPI spec does not exists\r\nI1108 04:18:59.971180 1 controller.go:119] OpenAPI AggregationController: action for item v1beta1.metrics.k8s.io: Rate Limited Requeue.\r\nE1108 04:19:00.086635 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.088799 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.091591 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.093609 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.095432 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.097300 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.103310 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.106256 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.117610 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\nE1108 04:19:00.118393 1 authentication.go:62] Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]\r\n```",
"title": "Unable to authenticate the request due to an error: [x509: certificate has expired or is not yet valid, x509: certificate has expired or is not yet valid]",
"type": "issue"
},
{
"action": "closed",
"author": "vvkkhjt",
"comment_id": null,
"datetime": 1602229462000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 7,768 | false | true | 7,552 | false |
DataDog/datadog-serverless-functions | DataDog | 599,511,917 | 250 | {
"number": 250,
"repo": "datadog-serverless-functions",
"user_login": "DataDog"
} | [
{
"action": "opened",
"author": "azakordonets",
"comment_id": null,
"datetime": 1586864951000,
"masked_author": "username_0",
"text": "### What does this PR do?\r\nFixes a bug when `message` in log is `dict` and not a `string`\r\n### Motivation\r\nWe were upgrading our datadog lambda to the newest code and found this problem. \r\n\r\nWhat inspired you to submit this pull request?\r\nI love fixing bugs :) \r\n\r\n### Checklist\r\n\r\n- [ ] Member of the datadog team has run integration tests",
"title": "Ensure that message is a string before sending enhanced lambda metrics",
"type": "issue"
},
{
"action": "created",
"author": "DarcyRaynerDD",
"comment_id": 613484090,
"datetime": 1586875369000,
"masked_author": "username_1",
"text": "Hi @username_0, thanks for the fix. We'll put this into the next release.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "azakordonets",
"comment_id": 613512808,
"datetime": 1586878356000,
"masked_author": "username_0",
"text": "Wow, that was fast. Thanks!",
"title": null,
"type": "comment"
}
] | 2 | 3 | 443 | false | false | 443 | true |
kubernetes/minikube | kubernetes | 614,842,824 | 8,050 | null | [
{
"action": "opened",
"author": "afbjorklund",
"comment_id": null,
"datetime": 1588954867000,
"masked_author": "username_0",
"text": "Something that we discussed about the Docker VM, which allocates 2 GB RAM + 1 GB swap.\r\n\r\nMaybe it is time to increase our default memory allocation to 4 GB, at least where available ?\r\n\r\nSimilarly, we could increase the default to 40 GB, since it will only occupy as much as used.\r\n\r\nThe idea would be to have something like computer games, \"recommended\" vs \"minimum\":\r\n\r\n**Minimum**:\r\n2 CPUs or more\r\n2GB of free memory\r\n20GB of free disk space\r\n\r\n**Recommended**:\r\n4 CPUs or more\r\n4GB of free memory\r\n40GB of free disk space\r\n\r\nCurrently we follow the recommendations from `kubeadm`, but it seems to be a bit low ?\r\n\r\nhttps://kubernetes.io/docs/setup/production-environment/tools/kubeadm/install-kubeadm/\r\n\r\nIdeally this would be followed up by an analysis of where the memory is actually going...\r\n\r\nLike monitor the memory usage over time, like we did previously with the cpu usage ?\r\n\r\nAs discussed in https://github.com/kubernetes/minikube/issues/7980#issuecomment-623096792",
"title": "Increase default memory and diskimage allocation ?",
"type": "issue"
},
{
"action": "created",
"author": "afbjorklund",
"comment_id": 625899428,
"datetime": 1588955461000,
"masked_author": "username_0",
"text": "We could use `vmstat` for monitoring, or write our own like we did for `iostat` (\"[cstat](https://github.com/tstromberg/cstat)\")\r\n\r\n```\r\nprocs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----\r\n r b swpd free buff cache si so bi bo in cs us sy id wa st\r\n 1 2 1536 3625936 9905112 15017912 0 0 20 125 6 16 10 3 87 0 0\r\n 2 0 1536 3063600 9905216 15625908 0 0 9 73857 5365 36577 9 9 73 9 0\r\n25 0 1536 3009120 9905632 15632688 0 0 0 1424 3566 12279 8 3 88 1 0\r\n28 0 1536 2721084 9905684 15635796 0 0 0 171 5222 12392 23 4 72 0 0\r\n20 0 1536 2715396 9906596 15641064 0 0 13 2548 7161 23262 10 6 82 3 0\r\n 5 0 1536 2674888 9906792 15671304 0 0 2936 329 5553 18045 7 5 87 1 0\r\n17 0 1536 2589184 9908292 15678200 0 0 33 2283 6654 19200 11 7 80 1 0\r\n 0 0 1536 2576704 9908520 15679836 0 0 2 387 4598 12778 6 4 89 1 0\r\n```\r\n\r\nWe might need something more advanced for KIC, since `free` will show the host.\r\n\r\ne.g. `docker stat` https://docs.docker.com/engine/reference/commandline/stats/",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "afbjorklund",
"comment_id": 625901506,
"datetime": 1588955743000,
"masked_author": "username_0",
"text": "Somewhat related to #3574",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "afbjorklund",
"comment_id": 625944498,
"datetime": 1588961211000,
"masked_author": "username_0",
"text": "\r\n\r\nMemory usage of \"docker\", from start to idle.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "medyagh",
"comment_id": 718140719,
"datetime": 1603911312000,
"masked_author": "username_1",
"text": "@username_0 I believe our new Auto Detect memory already does this for memory. but for disk size we have not done anything.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "medyagh",
"comment_id": 718141455,
"datetime": 1603911393000,
"masked_author": "username_1",
"text": "I would accept a PR that would try to allocate more disk space for VM if it is available",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "medyagh",
"comment_id": 718141632,
"datetime": 1603911416000,
"masked_author": "username_1",
"text": "if we need disk space issue, lets create another one",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "medyagh",
"comment_id": null,
"datetime": 1603911416000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "afbjorklund",
"comment_id": 719297291,
"datetime": 1604040271000,
"masked_author": "username_0",
"text": "https://www.vagrantup.com/docs/boxes/base\r\n\r\n----\r\n\r\nUser are free to choose whatever, using the `disk-size` config.\r\n\r\nI will probably use 32G as a compromise, size of an SD card\r\nLike a modest 63% size increase, from 20000M to 32768M ?\r\n\r\nIt would be more interesting to work on the _monitoring_: #3574",
"title": null,
"type": "comment"
}
] | 3 | 11 | 3,581 | false | true | 2,898 | true |
OmnesRes/prepub | null | 365,139,077 | 13 | {
"number": 13,
"repo": "prepub",
"user_login": "OmnesRes"
} | [
{
"action": "opened",
"author": "katrinleinweber",
"comment_id": null,
"datetime": 1538232956000,
"masked_author": "username_0",
"text": "Hello :-)\r\n\r\nThe DOI foundation recommends [this new resolver](https://www.doi.org/doi_handbook/3_Resolution.html#3.8). Yes, a bit ironic that they would change the URL of their service, but it's now [encrypted](https://www.ssllabs.com/ssltest/analyze.html?d=doi.org). Because the old links with the `dx` subdomain continue to work, there is no urgent need to do anything.\r\n\r\nHowever, I'd hereby like to suggest to follow the new recommendation and update all static DOI links and any code that generates new DOI links.\r\n\r\nCheers!",
"title": "Hyperlink DOIs to preferred resolver",
"type": "issue"
},
{
"action": "created",
"author": "OmnesRes",
"comment_id": 425685645,
"datetime": 1538269689000,
"masked_author": "username_1",
"text": "Thanks, I didn't know about that, I'll look into it.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "katrinleinweber",
"comment_id": 425699197,
"datetime": 1538290246000,
"masked_author": "username_0",
"text": "You're welcome :-) I hope it was OK to `sed` all those data files straight as well. If you want, I can revert that part, though, and reduce this PR down to the `f.write` function.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 761 | false | false | 761 | false |
bw-med-cabinet-3/Front-End | bw-med-cabinet-3 | 561,283,491 | 46 | {
"number": 46,
"repo": "Front-End",
"user_login": "bw-med-cabinet-3"
} | [
{
"action": "opened",
"author": "lyndsiWilliams",
"comment_id": null,
"datetime": 1581024890000,
"masked_author": "username_0",
"text": "",
"title": "Carl sachs",
"type": "issue"
}
] | 2 | 2 | 328 | false | true | 0 | false |
sudara/alonetone | null | 365,570,442 | 221 | {
"number": 221,
"repo": "alonetone",
"user_login": "sudara"
} | [
{
"action": "created",
"author": "sudara",
"comment_id": 426001283,
"datetime": 1538416390000,
"masked_author": "username_0",
"text": "@dependabot ignore this dependency",
"title": null,
"type": "comment"
}
] | 2 | 3 | 4,534 | false | true | 34 | false |
OGRECave/ogre | OGRECave | 573,256,460 | 1,476 | null | [
{
"action": "opened",
"author": "LMCrashy",
"comment_id": null,
"datetime": 1582966358000,
"masked_author": "username_0",
"text": "Therefore, the values aren't uploaded at the right place.\r\n\r\nMaybe the GpuConstantDefinitionMap should use an unordered map ?",
"title": "Shared params & Uniform buffer",
"type": "issue"
},
{
"action": "created",
"author": "LMCrashy",
"comment_id": 592936058,
"datetime": 1582976022000,
"masked_author": "username_0",
"text": "Actually, there is another issue, at least in GL3+.\r\n\r\nOf course if one wants to share the same uniform buffer across different shaders it nees the buffer to use the \"shared\" layout to have the same offsets everywhere.\r\n\r\nThe offset for each uniform is not \"guessable\", one need to retrieve it from the program handle.\r\nThis pic from nSight show that \r\n\r\n\r\n\r\nHere, BetaRay is at offset 48 which is not 4*4(sunDirection) + 4*4 (sunColor) + 4(sunPower) = 36. \"sunPower\" is having a 4*4bytes size, but other floats from my constant buffer( I've only displayed a small part of it) are indeed using a 4bytes size.\r\n\r\nPreviously, there was this code in the GL3+RS to retrieve the offsets\r\n\r\n```\r\n// Get active block parameter properties.\r\nGpuConstantDefinitionIterator sharedParamDef = blockSharedParams->getConstantDefinitionIterator();\r\nstd::vector<const char*> sharedParamNames;\r\nfor (; sharedParamDef.current() != sharedParamDef.end(); sharedParamDef.moveNext())\r\n{\r\n\tsharedParamNames.push_back(sharedParamDef.current()->first.c_str());\r\n}\r\n\r\nstd::vector<GLuint> uniformParamIndices(sharedParamNames.size());\r\nstd::vector<GLint> uniformParamOffsets(sharedParamNames.size());\r\n\r\nOGRE_CHECK_GL_ERROR(glGetUniformIndices(mGLProgramHandle, sharedParamNames.size(), &sharedParamNames[0], &uniformParamIndices[0]));\r\n//FIXME debug this (see stdout)\r\nOGRE_CHECK_GL_ERROR(glGetActiveUniformsiv(mGLProgramHandle, uniformParamIndices.size(), &uniformParamIndices[0], GL_UNIFORM_OFFSET, &uniformParamOffsets[0]));\r\n\r\nGpuNamedConstants& consts = const_cast<GpuNamedConstants&>(blockSharedParams->getConstantDefinitions());\r\nMapIterator<GpuConstantDefinitionMap> sharedParamDefMut(consts.map);\r\n```\r\n\r\nBut in current implementation, nothing retrieves the actual location of uniforms in the buffer.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paroj",
"comment_id": 592939678,
"datetime": 1582978806000,
"masked_author": "username_1",
"text": "you are required to use `std140` for shared_params at the moment\r\n\r\nhttps://www.khronos.org/opengl/wiki/Interface_Block_(GLSL)#Memory_layout",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "LMCrashy",
"comment_id": 592941143,
"datetime": 1582979854000,
"masked_author": "username_0",
"text": "I've also tried with std140 layout qualifier before, this doesn't help with the offset issue:\r\n\r\n\r\n\r\nSometimes a float uses 4bytes, sometimes 16bytes, vec3 uses 16bytes here and not 9.\r\nThis is not predictable and should be retrieved from the gpu program",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paroj",
"comment_id": 592942551,
"datetime": 1582980823000,
"masked_author": "username_1",
"text": "no, it is specified as per the link above. e.g. vec3 is required to be 16byte aligned. We just dont adhere to the rules in ogre.\r\n\r\nThe real question is whether its easier to implement the rules or query the driver.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paroj",
"comment_id": 592944574,
"datetime": 1582982180000,
"masked_author": "username_1",
"text": "#1477 should yield the above offsets and fix the upload iteration",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "LMCrashy",
"comment_id": 592956523,
"datetime": 1582990097000,
"masked_author": "username_0",
"text": "Thanks for your time and clarification. I was suspecting an aligment issue per uniform but not this way, it makes sense now.\r\n\r\nUnfortunatly, this patch gives the same logical index for BetaMie and BetaRay: 48\r\n\r\nWouln't it be easier to accumulate offset in a member value when adding a new constant and calculate the next aligned position ? Using the size of the mXXXConstants lists doesn't give an accurate information of previous values alignment.\r\n\r\nI'm working on a fix done this way, I'll tell if it works.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "paroj",
"comment_id": null,
"datetime": 1582990313000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "paroj",
"comment_id": 592971774,
"datetime": 1582997405000,
"masked_author": "username_1",
"text": "yeah, you are right. As of now, we are discarding any previous padding when calculating the offset.",
"title": null,
"type": "comment"
},
{
"action": "reopened",
"author": "paroj",
"comment_id": null,
"datetime": 1582997405000,
"masked_author": "username_1",
"text": "Therefore, the values aren't uploaded at the right place.\r\n\r\nMaybe the GpuConstantDefinitionMap should use an unordered map, or the offset should be calculated using the physical index instead.",
"title": "Shared params & Uniform buffer",
"type": "issue"
},
{
"action": "closed",
"author": "paroj",
"comment_id": null,
"datetime": 1583005171000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "LMCrashy",
"comment_id": 618374766,
"datetime": 1587645059000,
"masked_author": "username_0",
"text": "Woops, \r\nMy fix proposal wasn't properly handling matrices.\r\n\r\nI reworked it (again) in this commit : https://github.com/username_0/ogre/commit/ab74f5c931b585c52518d3adbbe036d0a653e234\r\n\r\nTested it against this Uniform block \r\n```\r\nlayout(std140) uniform TestAlignment\r\n{\r\n\tmat3 _mat33 ; \t//0\r\n\tmat4x3 _mat43 ; //48\r\n\tfloat _float ; \t//112\r\n\tmat3x4 _mat34 ; //128\r\n\tvec3 _vec3;\t//176\r\n\tvec2 _vec2; \t//192\r\n\tmat4 _mat44; \t//208\r\n\tfloat _float1; \t//272\r\n\tfloat _float2; \t//276\r\n\tmat3 _mat33_2; \t//288\r\n\tmat2 _mat22 ;\t//336\r\n};\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "paroj",
"comment_id": 618376303,
"datetime": 1587645257000,
"masked_author": "username_1",
"text": "can you create a PR with this fix?",
"title": null,
"type": "comment"
}
] | 2 | 13 | 4,167 | false | false | 4,167 | true |
mtoqeer/personal-porfolio | null | 562,191,832 | 1 | null | [
{
"action": "closed",
"author": "mtoqeer",
"comment_id": null,
"datetime": 1581262060000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 328 | false | true | 0 | false |
carbon-design-system/carbon-tutorial | carbon-design-system | 573,407,771 | 3,813 | {
"number": 3813,
"repo": "carbon-tutorial",
"user_login": "carbon-design-system"
} | [
{
"action": "opened",
"author": "ahmnouira",
"comment_id": null,
"datetime": 1583008863000,
"masked_author": "username_0",
"text": "",
"title": "feat(tutorial): complete step 2",
"type": "issue"
}
] | 2 | 2 | 184 | false | true | 0 | false |
bcgov/rems | bcgov | 602,253,738 | 46 | null | [
{
"action": "created",
"author": "ateucher",
"comment_id": 616873494,
"datetime": 1587427568000,
"masked_author": "username_0",
"text": "Added topics: `env`, `rstats`, `r`, `data-science`.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ateucher",
"comment_id": null,
"datetime": 1587427569000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 3,429 | false | true | 51 | false |
storybookjs/addon-jsx | storybookjs | 511,324,044 | 91 | {
"number": 91,
"repo": "addon-jsx",
"user_login": "storybookjs"
} | [
{
"action": "opened",
"author": "ndelangen",
"comment_id": null,
"datetime": 1571837338000,
"masked_author": "username_0",
"text": "<!-- GITHUB_RELEASE PR BODY: canary-version -->\nPublished PR with canary version: `7.1.11-canary.91.230`\n<!-- GITHUB_RELEASE PR BODY: canary-version -->",
"title": "UPGRADES",
"type": "issue"
},
{
"action": "created",
"author": "hipstersmoothie",
"comment_id": 545471936,
"datetime": 1571841011000,
"masked_author": "username_1",
"text": "<!-- GITHUB_RELEASE COMMENT: released -->\n:rocket: PR was released in v7.1.11 :rocket:",
"title": null,
"type": "comment"
}
] | 2 | 2 | 238 | false | false | 238 | false |
apple/coremltools | apple | 385,031,446 | 299 | null | [
{
"action": "opened",
"author": "manuelcosta74",
"comment_id": null,
"datetime": 1543360901000,
"masked_author": "username_0",
"text": "@username_1 back to standardization. \r\n\r\nIs there an \"off the shelf\" solution to do a per channel and per image standardization (mean & stdev) like Tensorflow tf.image.per_image_standardization? \r\nNote that this is not the same as #244 where avg & sdtDev were constant.\r\nJust have a model with this requirement. \r\n\r\nthanks",
"title": "Per image standardization",
"type": "issue"
},
{
"action": "created",
"author": "aseemw",
"comment_id": 442261383,
"datetime": 1543362152000,
"masked_author": "username_1",
"text": "The MVN layer in CoreML can do this operation: https://github.com/apple/coremltools/blob/master/mlmodel/format/NeuralNetwork.proto#L2066",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "manuelcosta74",
"comment_id": 442498338,
"datetime": 1543420551000,
"masked_author": "username_0",
"text": "@username_1 thanks!\r\n\r\nYou can find it here working here:\r\n\r\nhttps://github.com/username_0/keras-facenet/blob/master/code/Convert2CoreML.py",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "manuelcosta74",
"comment_id": null,
"datetime": 1543420621000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 4 | 592 | false | false | 592 | true |
libra/libra | libra | 609,530,729 | 3,666 | null | [
{
"action": "opened",
"author": "dayadam",
"comment_id": null,
"datetime": 1588215858000,
"masked_author": "username_0",
"text": "[ERROR] Error minting coins: Failed to query remote faucet server[status=503]: \"<html><body><h1>503 Service Unavailable</h1>\\nNo server is available to handle this request.\\n</body></html>\\n\"\r\n`\r\n\r\nMaybe some basic package I'm missing?",
"title": "Failing to create account after installation on Ubuntu 20, error: missing field `validator_change_proof`",
"type": "issue"
},
{
"action": "created",
"author": "dayadam",
"comment_id": 621937443,
"datetime": 1588261588000,
"masked_author": "username_0",
"text": "needed a git pull",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "dayadam",
"comment_id": null,
"datetime": 1588261589000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 252 | false | false | 252 | false |
likejazz/likejazz.github.io | null | 339,267,504 | 6 | null | [
{
"action": "opened",
"author": "likejazz",
"comment_id": null,
"datetime": 1531092347000,
"masked_author": "username_0",
"text": "",
"title": "GitHub Page/Wiki Images 5",
"type": "issue"
},
{
"action": "created",
"author": "likejazz",
"comment_id": 404495844,
"datetime": 1531398673000,
"masked_author": "username_0",
"text": "<img width=\"750\" alt=\"screen shot 2018-07-12 at 9 30 04 pm\" src=\"https://user-images.githubusercontent.com/1250095/42633263-dc2d78ca-861a-11e8-970c-4bd856e1e287.png\">",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "likejazz",
"comment_id": 422663348,
"datetime": 1537336510000,
"masked_author": "username_0",
"text": "\r\n\r\n(통계학 도감, 2017)",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "likejazz",
"comment_id": 429628796,
"datetime": 1539525564000,
"masked_author": "username_0",
"text": "<img width=\"873\" alt=\"screen shot 2018-10-14 at 10 58 51 pm\" src=\"https://user-images.githubusercontent.com/1250095/46917700-cc401b00-d004-11e8-8ab5-29b290dc81cb.png\">\r\n<img width=\"961\" alt=\"screen shot 2018-10-14 at 10 58 58 pm\" src=\"https://user-images.githubusercontent.com/1250095/46917701-cc401b00-d004-11e8-8665-f7d9eeb60c6c.png\">",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "likejazz",
"comment_id": 433337298,
"datetime": 1540544062000,
"masked_author": "username_0",
"text": "\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "likejazz",
"comment_id": null,
"datetime": 1540544072000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 6 | 1,089 | false | false | 1,089 | false |
Azure/autorest.python | Azure | 595,465,033 | 553 | null | [
{
"action": "opened",
"author": "lmazuel",
"comment_id": null,
"datetime": 1586211708000,
"masked_author": "username_0",
"text": "``\r\nC:\\Users\\username_0\\AppData\\Roaming\\npm\\autorest.CMD C:\\Users\\username_0\\Git\\azure-rest-api-specs\\specification\\reservations\\resource-manager\\readme.md --blabla=@autorest/python@5.0.0-dev.20200326.1 --keep-version-file --multiapi --no-async --python --python-mode=update --python-sdks-folder=C:\\Users\\username_0\\Git\\azure-sdk-for-python\\TEST --use=C:/Users/username_0/Git/autorest.python.v3/autorest-python-5.0.0-preview.1.tgz --version=3.0.6265\r\nAutoRest code generation utility [cli version: 3.0.6187; node: v10.15.3, max-memory: 8192 gb]\r\n(C) 2018 Microsoft Corporation.\r\nhttps://aka.ms/autorest\r\n Loading AutoRest core 'C:\\Users\\username_0\\.autorest\\@autorest_core@3.0.6265\\node_modules\\@autorest\\core\\dist' (3.0.6265)\r\n Loading AutoRest extension '@autorest/python' (C:/Users/username_0/Git/autorest.python.v3/autorest-python-5.0.0-preview.1.tgz->5.0.0-preview.1)\r\n Loading AutoRest extension '@autorest/modelerfour' (4.12.301->4.12.301)WARNING: [autorest.codegen.models.lro_operation.set_lro_response_type:61] Multiple schema types in responses: [<SchemaResponse [200]>, <SchemaResponse [201]>]\r\nWARNING: [autorest.codegen.models.lro_operation.set_lro_response_type:61] Multiple schema types in responses: [<SchemaResponse [200]>, <SchemaResponse [201]>]\r\n[7.7 s] Generation Complete\r\n``\r\n\r\nWe should check if it's a Swagger issue, or a m4 / Python issue.",
"title": "Investigate LRO multiple schema",
"type": "issue"
},
{
"action": "created",
"author": "iscai-msft",
"comment_id": 611017993,
"datetime": 1586358902000,
"masked_author": "username_1",
"text": "Not an autorest python issue. The swagger is specifying two responses with different schemas: https://github.com/Azure/azure-rest-api-specs/blob/master/specification/reservations/resource-manager/Microsoft.Capacity/preview/2019-07-19/quota.json#L133",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "iscai-msft",
"comment_id": null,
"datetime": 1586369336000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 1,607 | false | false | 1,607 | true |
trailofbits/cb-multios | trailofbits | 475,839,516 | 65 | null | [
{
"action": "opened",
"author": "woodruffw",
"comment_id": null,
"datetime": 1564686035000,
"masked_author": "username_0",
"text": "#64 makes macOS use ninja; Windows should do the same on AppVeyor.",
"title": "Use ninja on Windows CI",
"type": "issue"
},
{
"action": "created",
"author": "ekilmer",
"comment_id": 517416799,
"datetime": 1564686159000,
"masked_author": "username_1",
"text": "I believe Windows already uses Ninja for at least the `clang` build here \r\nhttps://github.com/trailofbits/cb-multios/blob/561f2369e20e2727376028578631a7363ef9d2be/.appveyor.yml#L26-L30",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "ekilmer",
"comment_id": 517417597,
"datetime": 1564686313000,
"masked_author": "username_1",
"text": "Admittedly, it _only_ uses Ninja for the `clang` build https://github.com/trailofbits/cb-multios/blob/561f2369e20e2727376028578631a7363ef9d2be/build.ps1#L39-L43 \r\n\r\nHowever, it uses the default CMake generator for the (broken) MSVC compiler build",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "woodruffw",
"comment_id": null,
"datetime": 1564686362000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "woodruffw",
"comment_id": 517417874,
"datetime": 1564686362000,
"masked_author": "username_0",
"text": "Yep, my bad. I was looking at the AppVeyor scrollback and didn't see \"Ninja\", so I assumed it was using the default backend.",
"title": null,
"type": "comment"
}
] | 2 | 5 | 620 | false | false | 620 | false |
archriss/react-native-snap-carousel | archriss | 619,385,227 | 694 | {
"number": 694,
"repo": "react-native-snap-carousel",
"user_login": "archriss"
} | [
{
"action": "opened",
"author": "abbasmoosavi",
"comment_id": null,
"datetime": 1589607537000,
"masked_author": "username_0",
"text": "const AnimatedFlatList = FlatList ? Animated.createAnimatedComponent(FlatList) : null;\r\nconst AnimatedScrollView = Animated.createAnimatedComponent(ScrollView);\r\n\r\nchange to: \r\n\r\nconst AnimatedFlatList = FlatList ? Animated.FlatList : null;\r\nconst AnimatedScrollView = Animated.ScrollView;\r\n\r\nand\r\n\r\nreturn this._carouselRef && this._carouselRef.getNode && this._carouselRef.getNode();\r\n\r\nchange to: \r\n\r\nreturn this._carouselRef;",
"title": "Fix getnode() warnings",
"type": "issue"
},
{
"action": "created",
"author": "bd-arc",
"comment_id": 634439738,
"datetime": 1590557875000,
"masked_author": "username_1",
"text": "Fixed in version `3.9.1` with backward-compatible code.",
"title": null,
"type": "comment"
}
] | 2 | 2 | 484 | false | false | 484 | false |
moby/buildkit | moby | 599,451,189 | 1,440 | null | [
{
"action": "opened",
"author": "mbarbero",
"comment_id": null,
"datetime": 1586858833000,
"masked_author": "username_0",
"text": "Similar to #1143\r\n\r\nWhen exporting cache to quay.io, it fails with \r\n\r\n```error: failed to solve: rpc error: code = Unknown desc = error writing manifest blob: failed commit on ref \"sha256:c2aba47e903ef19d459785c7e5750ef7da0f6f86657d9b40c329d5268dfe2185\": unexpected status: 401 Unauthorized```\r\n\r\nThe error is the same with both modes: `mode=max` or `mode=min`\r\n\r\n```bash\r\nbuildctl\" build \\\r\n --progress=plain \\\r\n --frontend=dockerfile.v0 \\\r\n --local context=\"${context}\" \\\r\n --local dockerfile=\"$(dirname \"${dockerfile}\")\" \\\r\n --opt filename=\"$(basename \"${dockerfile}\")\" \\\r\n --output \"type=image,\\\"name=${name}\\\",push=${push}\" \\\r\n --export-cache \"type=registry,mode=max,ref=${image}:${tag}-buildcache\" \\\r\n --import-cache \"type=registry,ref=${image}:${tag}-buildcache\" \\\r\n \"${@}\"\r\n```\r\n\r\nWhen I do not `--export-cache`, images are pushed properly to quay.io so the credentials are correct.",
"title": "Cache can't be exported to Quay.io",
"type": "issue"
},
{
"action": "created",
"author": "mbarbero",
"comment_id": 613459071,
"datetime": 1586872674000,
"masked_author": "username_0",
"text": "I should add that the cache export works OK with the docker.io registry.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cpuguy83",
"comment_id": 638978041,
"datetime": 1591289520000,
"masked_author": "username_1",
"text": "I have not done any real investigation here, but this sounds like quay does not support cache manifests.\r\nYou may want to try an inline cache? This may still not work. I know I've had issues with some other registries with it.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mbarbero",
"comment_id": 638979630,
"datetime": 1591289709000,
"masked_author": "username_0",
"text": "Issue is that inline cache does not support max mode, which is a requirement in my case.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cpuguy83",
"comment_id": 638982269,
"datetime": 1591290010000,
"masked_author": "username_1",
"text": "Speaking with someone else, they are having trouble pushing 2 tags... e.g. `-t foo -t bar`, getting the same 401 error.\r\nMaybe it's not even a cache manifest issue, and the 401 is legit... like not authorized to push 2 tags with the same auth token?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "mbarbero",
"comment_id": 639328557,
"datetime": 1591344694000,
"masked_author": "username_0",
"text": "I've tried to push 2 tags at the same time with a cache type=local and it works. So the issue is not about quay not supporting pushing 2 tags at once.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "tonistiigi",
"comment_id": 652807791,
"datetime": 1593670797000,
"masked_author": "username_2",
"text": "#1550",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "tonistiigi",
"comment_id": null,
"datetime": 1593670798000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "mbarbero",
"comment_id": 812555277,
"datetime": 1617373819000,
"masked_author": "username_0",
"text": "Coming back at this issue as using quay.io may come back as a topic on my side. I tried with latest buildkit (0.8.2), and I still cannot export cache to quay. Both\r\n\r\n```\r\n--export-cache \"type=registry,mode=max,oci-mediatypes=false,ref=${image}:${tag}-buildcache\" \\\r\n--export-cache \"type=registry,mode=min,oci-mediatypes=false,ref=${image}:${tag}-buildcache\" \\",
"title": null,
"type": "comment"
}
] | 3 | 9 | 2,068 | false | false | 2,068 | false |
orange-cloudfoundry/paas-templates | orange-cloudfoundry | 642,199,034 | 821 | null | [
{
"action": "opened",
"author": "poblin-orange",
"comment_id": null,
"datetime": 1592597907000,
"masked_author": "username_0",
"text": "### Expected behavior\r\n\r\n* As a paas-templates operator | cf user | marketplace user | paas-templates author | paas-templates maintainer\r\n* In order to be able to use pure k8s declarative approach, including helm install / update\r\n* I need a standard mechanism available to align helm charts installations\r\n\r\n### Observed behavior\r\nThe helm chart installation and indempotency check is done with\r\n\r\nUsing flux helm operator would ease debug phase (just a yaml to edit and apply in k8S, instead of a full git/COA/bosh deploy for kubectl-helm bosh release).\r\n\r\nIt will also use future COA => k8s generic mapping mechanism implementation \r\n\r\ncc @o-orand @obeyler \r\n\r\n ### references\r\n- https://docs.fluxcd.io/projects/helm-operator/en/stable/\r\n\r\n### Affected releases\r\n\r\n* x.y\r\n* earlier versions\r\n\r\n<!--\r\n### Traces and logs\r\n\r\nRemember this is a public repo. DON'T leak credentials or Orange internal URLs. \r\nAutomation may be applied in the future. \r\n\r\n* [ ] I have reviewed provided traces against secrets (credentials, internal URLs) that should not be leake, manually of using some tools such as [truffle-hog file:///user/dxa4481/codeprojects/mytraces.txt](https://github.com/dxa4481/truffleHog#truffle-hog)\r\n -->",
"title": "leverage helm operator to ease gitops interoperability",
"type": "issue"
},
{
"action": "closed",
"author": "poblin-orange",
"comment_id": null,
"datetime": 1594913696000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "poblin-orange",
"comment_id": 662579059,
"datetime": 1595438240000,
"masked_author": "username_0",
"text": "reopen as it must be fixed",
"title": null,
"type": "comment"
},
{
"action": "reopened",
"author": "poblin-orange",
"comment_id": null,
"datetime": 1595438241000,
"masked_author": "username_0",
"text": "### Expected behavior\r\n\r\n* As a paas-templates author / maintainer\r\n* In order to be able to use pure k8s declarative approach, including helm install / update\r\n* I need a standard mechanism available to align helm charts installations\r\n\r\n### Observed behavior\r\nThe helm chart installation and indempotency check is done with\r\n\r\nUsing flux helm operator would ease debug phase (just a yaml to edit and apply in k8S, instead of a full git/COA/bosh deploy for kubectl-helm bosh release).\r\n\r\nIt will also ease future COA k8s generic mapping mechanism implementation (possibly based on flux / argocd)\r\n\r\ncc @o-orand @obeyler \r\n\r\n ### references\r\n- https://docs.fluxcd.io/projects/helm-operator/en/stable/\r\n- https://github.com/fluxcd/helm-operator-get-started\r\n- https://github.com/fluxcd/helm-operator/issues/377\r\n### Affected releases\r\n\r\n* 46.x\r\n* earlier versions\r\n\r\n<!--\r\n### Traces and logs\r\n\r\nRemember this is a public repo. DON'T leak credentials or Orange internal URLs. \r\nAutomation may be applied in the future. \r\n\r\n* [ ] I have reviewed provided traces against secrets (credentials, internal URLs) that should not be leake, manually of using some tools such as [truffle-hog file:///user/dxa4481/codeprojects/mytraces.txt](https://github.com/dxa4481/truffleHog#truffle-hog)\r\n -->",
"title": "leverage helm k8s operator to ease debug and gitops interoperability",
"type": "issue"
},
{
"action": "created",
"author": "poblin-orange",
"comment_id": 662579152,
"datetime": 1595438252000,
"masked_author": "username_0",
"text": "consider alertnative https://github.com/rancher/helm-controller",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "poblin-orange",
"comment_id": null,
"datetime": 1629724495000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 6 | 2,593 | false | false | 2,593 | false |
computerjazz/react-native-draggable-flatlist | null | 622,669,646 | 185 | null | [
{
"action": "opened",
"author": "malashkevich",
"comment_id": null,
"datetime": 1590083314000,
"masked_author": "username_0",
"text": "**Is your feature request related to a problem? Please describe.**\r\nIs it possible to render some custom view at the spacer? It would be nice to have such feature\r\n\r\n\r\n**Describe the solution you'd like**\r\nI would say it could be done the same way as renderItem\r\n\r\n**Describe alternatives you've considered**\r\nI've tried to relate on `isActive`, but the full item is rendered on top, so there is no way to do this.\r\n\r\nIf you can suggest how to implement this I could try to create PR. But it would be nice to have some suggestions/hints.",
"title": "Custom view instead of spacer",
"type": "issue"
},
{
"action": "created",
"author": "computerjazz",
"comment_id": 633251186,
"datetime": 1590335575000,
"masked_author": "username_1",
"text": "what would happen when the spacer index changes as you drag? Would the custom view animate/translate to the new spacer index?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "malashkevich",
"comment_id": 633396813,
"datetime": 1590387222000,
"masked_author": "username_0",
"text": "Yes, it could be just animation/translation between spacer and row item. Currently it animates leaving free space there.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "computerjazz",
"comment_id": 663782635,
"datetime": 1595636822000,
"masked_author": "username_1",
"text": "Whoops sorry for the delayed response. \r\nI took a stab at implementing this -- would be great to get a test in before merging:\r\nhttps://github.com/username_1/react-native-draggable-flatlist/pull/208\r\n\r\nsnack with the new code dropped in:\r\nhttps://snack.expo.io/@username_1/renderplaceholder",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "malashkevich",
"comment_id": 664829819,
"datetime": 1595921480000,
"masked_author": "username_0",
"text": "@username_1 well done! Looks really good.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "computerjazz",
"comment_id": null,
"datetime": 1596321864000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 6 | 1,228 | false | false | 1,228 | true |
techiediaries/ngx-qrcode | null | 418,935,564 | 33 | null | [
{
"action": "opened",
"author": "sebasmdl",
"comment_id": null,
"datetime": 1552075124000,
"masked_author": "username_0",
"text": "",
"title": "ngx-qrcode2.component.d.ts -- cannot find module '@angular/core'",
"type": "issue"
},
{
"action": "closed",
"author": "rafa-as",
"comment_id": null,
"datetime": 1588082269000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "rafa-as",
"comment_id": 620624798,
"datetime": 1588082269000,
"masked_author": "username_1",
"text": "Closed by #43",
"title": null,
"type": "comment"
}
] | 2 | 3 | 13 | false | false | 13 | false |
ooni/probe | ooni | 624,700,483 | 1,169 | null | [
{
"action": "opened",
"author": "lorenzoPrimi",
"comment_id": null,
"datetime": 1590482035000,
"masked_author": "username_0",
"text": "Crashlytics is reporting a crash in DashboardTableViewController.m line 120 trending up.\nNeed to try to reproduce and fix it.",
"title": "Crash in DashboardTableViewController.m line 120",
"type": "issue"
},
{
"action": "created",
"author": "lorenzoPrimi",
"comment_id": 633889145,
"datetime": 1590482219000,
"masked_author": "username_0",
"text": "Apparently it happens when tapping on a row to get into TestOverview.\n```\n else if ([[segue identifier] isEqualToString:@\"toTestOverview\"]){\n NSIndexPath* indexPath = [self.tableView indexPathForSelectedRow];\n TestOverviewViewController *vc = (TestOverviewViewController * )segue.destinationViewController;\n AbstractSuite *testSuite = [items objectAtIndex:indexPath.row];\n [vc setTestSuite:testSuite];\n } \n```\nthe item array causes a \n`\nFatal Exception: NSRangeException\n*** -[__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty array\n-[DashboardTableViewController prepareForSegue:sender:]\n`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "lorenzoPrimi",
"comment_id": 633891683,
"datetime": 1590482521000,
"masked_author": "username_0",
"text": "Some useful info: https://stackoverflow.com/questions/9073309/indexpathforselectedrow-returning-nil",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "lorenzoPrimi",
"comment_id": null,
"datetime": 1590585484000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 4 | 861 | false | false | 861 | false |
avalonmediasystem/avalon | avalonmediasystem | 572,856,118 | 4,011 | null | [
{
"action": "opened",
"author": "joncameron",
"comment_id": null,
"datetime": 1582906872000,
"masked_author": "username_0",
"text": "### Description\nOur long-term goal is to update the IIIF player. We need a comparison of what the player in Avalon currently provides (\n\n### Done Looks Like",
"title": "Create Comparison of Avalon's MediaElement and the React IIIF Player",
"type": "issue"
},
{
"action": "created",
"author": "Dananji",
"comment_id": 593558467,
"datetime": 1583175098000,
"masked_author": "username_1",
"text": "1. Add quality selector \n2. Add lil' scrub\n3. Add captions support\n4. Add marker support in playlists\n5. Support multiple sections\n6. Identify the media type from manifest and create audio/video element from that\n7. Parse section/structure labels properly",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "joncameron",
"comment_id": null,
"datetime": 1584372759000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 411 | false | false | 411 | false |
google/rust_icu | google | 637,407,059 | 125 | {
"number": 125,
"repo": "rust_icu",
"user_login": "google"
} | [
{
"action": "opened",
"author": "filmil",
"comment_id": null,
"datetime": 1591922311000,
"masked_author": "username_0",
"text": "Here is a draft implementation of ECMA 402 API surface that was proposed here: https://github.com/google/rust_icu/blob/master/experimental/ecma402_listformat/src/lib.rs\r\n\r\nBefore I equip it with better docs and README content, initial thoughts from y'all would be welcome.\r\n\r\n@kpozin `uloc` now depends on the `ecma402_traits`, which means we need to import it in Fuchsia too. It seemed that if I created a new aliased type for the locale, we'd not get much in return except we'd get the decoupling.",
"title": "Ecma402 traits",
"type": "issue"
},
{
"action": "created",
"author": "filmil",
"comment_id": 643434877,
"datetime": 1591988276000,
"masked_author": "username_0",
"text": "@zbraniecki",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "filmil",
"comment_id": 643438825,
"datetime": 1591988870000,
"masked_author": "username_0",
"text": "An amusing bit is that the implementation revealed a few practical issues that `rust_icu` must resolve in order to support ECMA 402. One of them is that the list formatting support we need for ECMA 402 is available only since ICU 67 (!), and that in practice, systems use ICUs as far back as ICU 63. (each version increment is about 6 months IIRC).\r\n\r\nSo, a few conditional compilation experiments later, there's something that works on ICU 67, and gives suboptimal but still useful results on ICU 66 and lower.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "filmil",
"comment_id": 643461826,
"datetime": 1591992400000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "comment"
}
] | 1 | 4 | 1,023 | false | false | 1,023 | false |
denoland/deno | denoland | 618,217,257 | 5,345 | null | [
{
"action": "opened",
"author": "therealadityashankar",
"comment_id": null,
"datetime": 1589461715000,
"masked_author": "username_0",
"text": "any version management service requires being able to discover all the available versions,\r\n\r\nlets say in a module, I use\r\n\r\n```js\r\nimport something from \"https://deno.land/x/something@1.2.0/main.ts\"\r\n```\r\n\r\nit'd be nice if I could do something similar to\r\n\r\n```sh\r\ncurl https://deno.land/x/something@versions\r\n```\r\nif it'd return\r\n\r\n```json\r\n{\"versions\": [\"0.0.1\", \"1.0.0\", \"1.0.1\", \"1.1.0\", \"1.2.0\", \"2.0\", \"2.1\"]}\r\n```\r\n\r\nthere's currently no standardised way of discovering all available versions of a file,\r\n\r\ncurrently, for example, [deno-udd](https://github.com/hayd/deno-udd), uses a variety of techniques to find all available versions, including\r\n- checking github releases\r\n- checking https://raw.githubusercontent.com/denoland/deno_website2/master/src/database.json for package releases\r\n- web scraping\r\n\r\napart from the other issues in this (like github's rate limiting), there is no standard, meaning that every person who hosts a file on their site will have to tell deno-udd to be able to support their website, and every other deno version management service to be able to support their website, and like, what may be a standard way in one version management service might not be supported in another\r\n\r\nI like the `@versions` method cause is simple, and probably super easy to implement\r\n\r\nfurther, it'd be even better if it could support additional parameters like\r\n```json\r\n{\r\n \"versions\": [\"0.0.1\", \"1.0.0\", \"1.0.1\", \"1.1.0\", \"2.0\", \"2.6.9\"],\r\n \"deprecated\": [\"1.*\"],\r\n \"fatal-risk-of-use\": [\"<1.0.0\"],\r\n \"dog-person-wrote-code\": false,\r\n \"cat-person-wrote-code\": true\r\n}\r\n```\r\n\r\nrelevant issue https://github.com/hayd/deno-udd/issues/10",
"title": "@versions to list down all versions of a module [Enhancement]",
"type": "issue"
},
{
"action": "created",
"author": "lucacasonato",
"comment_id": 628642456,
"datetime": 1589463592000,
"masked_author": "username_1",
"text": "As you said GIthub rate limiting prevents us from doing this. We only have 60 server side requests per hour, but 235 modules, with more added every day. That means that at most we can update each modules' versions once every ~4 hours (235 / 60). That is not a great experience for the end user. And it only gets worse with an increase in the available modules.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "therealadityashankar",
"comment_id": 628723423,
"datetime": 1589471371000,
"masked_author": "username_0",
"text": "how does deno.land currently retrieve package content, is raw.githubusercontent.com not subject to the same rate limits ?\r\n\r\nIf the above is true, I guess a temporary alternative could be asking individual package owners to have an versions.txt in the master branch, which could be represented in `@versions`",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "therealadityashankar",
"comment_id": 628848706,
"datetime": 1589485357000,
"masked_author": "username_0",
"text": "another option - although untested - might be GraphQl github requests, which I believe work with lesser rate limiting restrictions",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kitsonk",
"comment_id": 628877794,
"datetime": 1589489172000,
"masked_author": "username_2",
"text": "This should be moved to denoland/deno_website2.",
"title": null,
"type": "comment"
}
] | 3 | 5 | 2,518 | false | false | 2,518 | false |
aabaker/controldlna | null | 602,633,708 | 19 | {
"number": 19,
"repo": "controldlna",
"user_login": "aabaker"
} | [
{
"action": "opened",
"author": "matgoebl",
"comment_id": null,
"datetime": 1587276836000,
"masked_author": "username_0",
"text": "Fix crash when starting playback from a playlist with only a single item, fixes #13\r\nReplaces #14 (forgot cherry pick branch there)",
"title": "Fix crash when starting playback from a playlist with only a single item, fixes #13",
"type": "issue"
},
{
"action": "created",
"author": "matgoebl",
"comment_id": 617292579,
"datetime": 1587489019000,
"masked_author": "username_0",
"text": "I changed the fix as you suggested.\r\nThe crash happens, when you start up with an empty playlist, long-click on any song and choose \"add single item to playlist\". \r\nWith both fixes it does not start playing. You have to click once on the song when its in the playlist to start playing.\r\nNevertheless, its more important to fix the more annoying crash.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "aabaker",
"comment_id": 619367486,
"datetime": 1587815435000,
"masked_author": "username_1",
"text": "I wasn't happy with a solution where there was a button with a play symbol active that did nothing when pressed, calling play() if the current track isn't valid seems like a safe alternative.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 673 | false | false | 673 | false |
canada-ca/aia-eia-js | canada-ca | 603,165,047 | 205 | {
"number": 205,
"repo": "aia-eia-js",
"user_login": "canada-ca"
} | [
{
"action": "created",
"author": "ShadeWyrm",
"comment_id": 616567767,
"datetime": 1587390693000,
"masked_author": "username_0",
"text": "@dependabot rebase",
"title": null,
"type": "comment"
}
] | 2 | 2 | 6,189 | false | true | 18 | false |
shadowsocks/shadowsocks-windows | shadowsocks | 578,982,159 | 2,828 | null | [
{
"action": "opened",
"author": "jarvok",
"comment_id": null,
"datetime": 1583896758000,
"masked_author": "username_0",
"text": "<!--\r\n- Shadowsocks is a non-profit open source project. If you bought the service from a provider, please contact them.\r\n 影梭(Shadowsocks)是一个开源非盈利项目,不提供任何托管服务。如果你是从服务提供商购买的服务,请联系他们。\r\n- If you have questions rather than Shadowsocks Windows client, please go to https://github.com/shadowsocks\r\n 如果你有非影梭Windows客户端相关的问题,请去 https://github.com/shadowsocks\r\n- Please read Wiki carefully, especially https://github.com/shadowsocks/shadowsocks-windows/wiki/Troubleshooting\r\n 提问前请先阅读wiki https://github.com/shadowsocks/shadowsocks-windows/wiki/Troubleshooting.\r\n- And search from Issue Board https://github.com/shadowsocks/shadowsocks-windows/issues?utf8=%E2%9C%93&q=is%3Aissue\r\n 并在Issue Board中搜索 https://github.com/shadowsocks/shadowsocks-windows/issues?utf8=%E2%9C%93&q=is%3Aissue\r\n- Please include the following information. Questions lacking details will be closed.\r\n 请按照以下格式描述你的问题,描述不清的问题将会被关闭。\r\n-->\r\n\r\n### Shadowsocks version / 影梭版本\r\n\r\n\r\n### Environment (Operating system, .NET Framework, etc) / 使用环境(操作系统,.NET Framework等)\r\n\r\n\r\n### Steps you have tried / 操作步骤\r\n\r\n\r\n### What did you expect to see? / 期望的结果\r\n\r\n\r\n### What did you see instead? / 实际结果\r\n\r\n\r\n### Config and error log in detail (with all sensitive info masked) / 配置文件和日志文件(请隐去敏感信息)",
"title": "禁用状态和PAC模式的图标都是灰色的, 不易区分. 建议把PAC模式换成绿色图标.",
"type": "issue"
},
{
"action": "created",
"author": "studentmain",
"comment_id": 597444994,
"datetime": 1583902787000,
"masked_author": "username_1",
"text": "顺便一提,在蓝色主题下,窗口图标可见性不佳。或许我们需要重新设计图标来同时根治这两个问题。\r\n\r\n",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "celeron533",
"comment_id": 601981238,
"datetime": 1584757946000,
"masked_author": "username_2",
"text": "Option A. Contour\r\nOption B. Add background color\r\nOption C. Use at least two color theme and let user to select",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "studentmain",
"comment_id": 601992442,
"datetime": 1584764573000,
"masked_author": "username_1",
"text": "选项B可能太像Telegram了(虽然现在就很像,ss-android 也有这毛病,公共素材小飞机……)\r\nC代价太大,单纯的显示问题我认为还用不着这么干",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "AnnDad",
"comment_id": 601993233,
"datetime": 1584765189000,
"masked_author": "username_3",
"text": "那就A",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "tompotter00",
"comment_id": 603347909,
"datetime": 1585067323000,
"masked_author": "username_4",
"text": "建议把图标加个边,或者加些渐变色",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "rolle33",
"comment_id": 603760529,
"datetime": 1585131818000,
"masked_author": "username_5",
"text": "增加调色板以防止颜色的冲突",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "studentmain",
"comment_id": 603789006,
"datetime": 1585135785000,
"masked_author": "username_1",
"text": "\r\n\r\n\r\n用勾边也比较瞎眼……\r\n\r\nC选项如果是自动选择倒是不错,不过得和win32 api打交道,https://docs.microsoft.com/en-us/windows/win32/api/uxtheme/nf-uxtheme-getthemesyscolorbrush",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Zhwt",
"comment_id": 604886293,
"datetime": 1585299039000,
"masked_author": "username_6",
"text": "在不重新设计图标的情况下, 建议可以留一个选项来读取自定义 .ico 或者 png 文件, 像 SSR 一样, 用户可以配置使用自定义图标",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "AnnDad",
"comment_id": 604908585,
"datetime": 1585302316000,
"masked_author": "username_3",
"text": "这个主意好",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "lssmoon2019",
"comment_id": 630858556,
"datetime": 1589898693000,
"masked_author": "username_7",
"text": "淡灰色和深灰色颜色还是很好区分的。建议不要绿色,换其他色。",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "GamenRoll",
"comment_id": 715123836,
"datetime": 1603440090000,
"masked_author": "username_8",
"text": "has wished new pac color for a long long time",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "database64128",
"comment_id": 715129384,
"datetime": 1603440280000,
"masked_author": "username_9",
"text": "We are not planning to implement a tray icon (`NotifyIcon`) in the upcoming version 5.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "database64128",
"comment_id": null,
"datetime": 1603440281000,
"masked_author": "username_9",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "GamenRoll",
"comment_id": 715175471,
"datetime": 1603441827000,
"masked_author": "username_8",
"text": "there has already been 3 colors for all the time, \r\nplz change the pac gray into black with white sideline (means empty inside, when it doesn't work), or\r\nhalf white half blue\r\n\r\nif you would\r\nthanx",
"title": null,
"type": "comment"
}
] | 10 | 15 | 2,409 | false | false | 2,409 | false |
apache/incubator-echarts | apache | 662,468,195 | 12,999 | null | [
{
"action": "opened",
"author": "yvv11520",
"comment_id": null,
"datetime": 1595296424000,
"masked_author": "username_0",
"text": "### Version\r\n4.8.0\r\n\r\n### Reproduction link\r\n[https://gallery.echartsjs.com/editor.html?c=xyfKams4BO&v=1](https://gallery.echartsjs.com/editor.html?c=xyfKams4BO&v=1)\r\n\r\n### Steps to reproduce\r\n点击拆线文字区域\r\n\r\n### What is expected?\r\n显示 alert\r\n\r\n### What is actually happening?\r\n无法进行点击事件\r\n\r\n<!-- This issue is generated by echarts-issue-helper. DO NOT REMOVE -->",
"title": "series.bar.label 及 series.line.label 点击无效",
"type": "issue"
},
{
"action": "created",
"author": "plainheart",
"comment_id": 664132403,
"datetime": 1595828705000,
"masked_author": "username_1",
"text": "The coming next big version `5.0` should support the clickable label.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "susiwen8",
"comment_id": null,
"datetime": 1628944622000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 4 | 4 | 1,159 | false | true | 425 | false |
1767914826/github-slideshow | null | 614,568,528 | 1 | null | [
{
"action": "created",
"author": "1767914826",
"comment_id": 625688342,
"datetime": 1588924095000,
"masked_author": "username_0",
"text": "OK\r\n\r\n\r\n\r\n\r\n\r\n\r\n------------------ 原始邮件 ------------------",
"title": null,
"type": "comment"
}
] | 2 | 3 | 6,213 | false | true | 68 | false |
fortran-lang/stdlib | fortran-lang | 547,249,971 | 105 | null | [
{
"action": "opened",
"author": "leonfoks",
"comment_id": null,
"datetime": 1578545883000,
"masked_author": "username_0",
"text": "There is quite a wide variety of proposals.\r\n\r\nDoes it make sense ahead of time to split the stdlib into defined parts like numpy and scipy have done?\r\nI can see maths, sorting, random numbers, linalg etc. going into NumFortran?\r\nKdTrees, interpolation, splines, etc. going into SciFortran? (Theres already a SciFortran but you get what I mean)\r\nMeshing? Python has a package called Discretize, but something similar could be handled in Fortran.\r\nThey could all be under the umbrella of the stdlib.\r\n\r\nIf i'm missing the point of anything stdlib related, please let me know!",
"title": "stdlib group with separate but related repositories?",
"type": "issue"
},
{
"action": "created",
"author": "scivision",
"comment_id": 572631975,
"datetime": 1578586329000,
"masked_author": "username_1",
"text": "That structure does seem to work well for a lot of projects in other languages. It could help ensure each part is kept to high quality while keeping agility of development. It could also facilitate the continuum of such efforts as many projects will probably be affiliated and API compatible, like the Numpy stack, h5py, xarray, pandas, etc.\r\n\r\nFor example w.r.t. file I/O, some libraries will simply have an iso_c_binding interface, some will be shims to polymorphic Fortran interfaces, some will be pure Fortran modules, etc.\r\nThis Fortran stdlib doesn't map exactly to the C++ concept of stdlib or other languages. It considers the math/analysis nature of Fortran.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "certik",
"comment_id": 572649093,
"datetime": 1578588359000,
"masked_author": "username_2",
"text": "So having a rich ecosystem of Fortran packages is something we need. But we need #44 to make that approach work well. Languages like Julia first started with richer standard library, but lately it seems their approach is to use separate packages for functionality and a smaller standard library. This part of a blog post argues this point also:\r\n\r\nhttp://cliffle.com/blog/m4vga-in-rust/#rust-has-a-package-manager\r\n\r\nBut: even if we succeed and in few years we have a rich ecosystem of packages and a large community and easy to create your own package, I think there is large value in having `stdlib` with the scope as defined in the README (https://github.com/fortran-lang/stdlib#scope), and the reason is that it will make Fortran very much with batteries included just like Matlab or Python+SciPy. The reason NumPy is a separate package is that Python itself is not numerically oriented, unlike Fortran, which already has a large subset of NumPy already built-in. Then `stdlib` is roughly in the range of SciPy (plus utilities). SciPy could be split into 10 packages, but there is high value in having just one package, as the \"base\", that gets you started to prototype your code and has everything you need to do numerical computing. Then, as you need more specialized packages, you will install such packages using `fpm` (Fortran Package Manager), as discussed in #44.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "marshallward",
"comment_id": 572655402,
"datetime": 1578589145000,
"masked_author": "username_3",
"text": "I would personally prefer a leaner standard library focused on generic \"computer science\" operations such as sorting and I/O, and rely on custom libraries for numerical work, perhaps provided by a robust distribution (if not packaging) system. There's only a few variations in output when sorting a list (stable and unstable, for example) but many ways to do an interpolation.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "leonfoks",
"comment_id": 572658282,
"datetime": 1578589553000,
"masked_author": "username_0",
"text": "Good points. I never really thought about the importance of the package manager and its influence on project development. \r\n\r\nThis probably goes without saying but I might suggest that repo organization follows an approach where we are splitting the code base into well defined sections such that. If in the future a package manager makes it feasible, the code base could be split at a later date if it makes sense.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "milancurcic",
"comment_id": 572665257,
"datetime": 1578590583000,
"masked_author": "username_4",
"text": "I think the target audience is a factor. Python started out as a general-purpose language and its stdlib reflects that. \r\n\r\nThis exercise is interesting to do with any language that has a stdlib. Try Python, C, C++, Haskell, Go, Rust, Nim. The result: Every stdlib has different scope.\r\n\r\nIf we look at Fortran's target audience (read: who uses Fortran?), then the functionality covered by numpy and scipy is largely in the center, not on the outskirts. Based on this, I'd argue that numpy+scipy stuff is more in scope of Fortran's stdlib than collections and sorting (though I think they are very much in scope too).",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "milancurcic",
"comment_id": 572667836,
"datetime": 1578590989000,
"masked_author": "username_4",
"text": "Sorry, I didn't answer the question. \r\n\r\nI think yes (for more specialized stuff), but only eventually down the road. Splitting by functionality would be counterproductive this early. We're still learning who uses Fortran, how they use it, and what do they want.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "certik",
"comment_id": 572669329,
"datetime": 1578591229000,
"masked_author": "username_2",
"text": "To add to @username_4's answer --- everything in `stdlib` currently is in the \"experimental\" namespace. That means we are reserving the freedom for ourselves to change things, to move things around, or to (down the road) split them into separate packages. Logistically, because we are just starting, and our community is still small, let's collaborate around just one repository. As our community becomes big and `stdlib` becomes big and hopefully we have a working `fpm` at that point, let's revisit this discussion.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "certik",
"comment_id": 572808174,
"datetime": 1578613425000,
"masked_author": "username_2",
"text": "So I suggest we keep trying to agree on common functionality that we all want, and agree an an API and put into experimental. At the same time, let's work on `fpm` (#44) and try to create a healthy Fortran package ecosystem like in Rust. But even if we are successful, as Rust shows, there is still a need for a high quality `stdlib`. And for the scope of `stdlib`, we already have about 10 Fortran packages that people use, see the list at #1. So we are at the point when it's time to consolidate all these libraries into just one library and all of us to collaborate on that.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "leonfoks",
"comment_id": 572847160,
"datetime": 1578623871000,
"masked_author": "username_0",
"text": "Cool yeah these are all great reasons and ideas to get moving!",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "milancurcic",
"comment_id": null,
"datetime": 1615474755000,
"masked_author": "username_4",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "milancurcic",
"comment_id": 796798276,
"datetime": 1615474755000,
"masked_author": "username_4",
"text": "Stdlib is currently a single repository with not plan to split into sub-packages. If a need comes up, let's revisit it.",
"title": null,
"type": "comment"
}
] | 5 | 12 | 5,566 | false | false | 5,566 | true |
pnp/pnpjs | pnp | 515,783,620 | 906 | null | [
{
"action": "opened",
"author": "harrychen1",
"comment_id": null,
"datetime": 1572560322000,
"masked_author": "username_0",
"text": "### Category\r\n- [ ] Enhancement\r\n- [X ] Bug\r\n- [ ] Question\r\n- [ ] Documentation gap/issue\r\n\r\n### Version \r\n\r\nPlease specify what version of the library you are using: [ 1.3.2 ]\r\n\r\nPlease specify what version(s) of SharePoint you are targeting: [ SharePoint online ]\r\n\r\n*If you are not using the latest release, please update and see if the issue is resolved before submitting an issue.*\r\n\r\n### Observed Behavior\r\nUse PnP/Js to create an SharePoint item and then add attachment, run the gulp serve locally, and test with Version 78.0.3904.70 (Official Build) (64-bit), the attach file code failed with error.\r\n\"TypeError: Failed to fetch\"\r\n\r\nActually the files are attached but it through exception.\r\n\r\nThis was not happen using previous Chrome version. I've noticed the error people also notice:\r\n\r\nMore issues reported here:\r\nhttps://stackoverflow.com/questions/58270663/samesite-warning-chrome-77\r\n\r\nA cookie associated with a cross-site resource at <URL> was set without the `SameSite` attribute. A future release of Chrome will only deliver cookies with cross-site requests if they are set with `SameSite=None` and `Secure`. You can review cookies in developer tools under Application>Storage>Cookies and see more details at <URL> and <URL>.\r\n\r\nIt seems we need to add in response header\r\n\r\nresponse.setHeader(\"Set-Cookie\", \"HttpOnly;Secure;SameSite=Strict\");\r\n\r\nHowever, we are using PnP/JS package and could not set this.\r\n\r\n**Can PnP/JS package enhance and ad this header?\r\nresponse.setHeader(\"Set-Cookie\", \"HttpOnly;Secure;SameSite=Strict\");**\r\n\r\n### Steps to Reproduce\r\n1. Create a spfx project\r\n2. Add code to add item\r\n3. Add code to upload files\r\n4. Run locally gulp serve\r\n5. 
Use Chrome Version 78.0.3904.70 (Official Build) (64-bit)\r\n\r\nYou will found the issue.\r\n\r\nThis also happens sometimes in all SharePoint tenant after code deployed.\r\n\r\nThis is urgent.\r\n\r\n```typescript \r\nCode sample:\r\nthis.web.lists.getByTitle('Pubs').items.add({\r\n Title: localItem.Title, PubsType: localItem.PubsType\r\n }).then((iar: ItemAddResult) => {\r\n\r\n var item = this.web.lists.getByTitle('Pubs').items.getById(iar.data.Id);\r\n if (!(this.state.selectedFile == null) && this.state.selectedFile.length > 0)\r\n {\r\n item.attachmentFiles.addMultiple(this.state.selectedFile).then(v => { \r\n \r\n item.update({\r\n 'WorkflowAutoStart': localItem.WorkflowAutoStart\r\n }).then((result: ItemUpdateResult) => {//Promise< boolean>\r\n\r\n // set messgae after files attached\r\n this._updateUIAfterSave(true);\r\n this.setState({\r\n isChildSectionVisible: false,\r\n status: \"\"\r\n });\r\n }, (error: any) => {\r\n // Do not start workflow if attach file failed\r\n this._updateUIAfterSave(true);\r\n\r\n });\r\n\r\n this._updateUIAfterSave(true);\r\n }).catch((err) => {\r\n \r\n**// Always failed here**\r\n alert('Attachment problem for new Pubs. Please check Pubs list and verift if files attached, if not, please try to attache again. If issue consistent, please contact Pubs support. ' + err);\r\n }\r\n });\r\n\r\n ….\r\n\r\nThanks.",
"title": "TypeError: Failed to fetch when attach files to SharePoint item",
"type": "issue"
},
{
"action": "created",
"author": "koltyakov",
"comment_id": 548679251,
"datetime": 1572588843000,
"masked_author": "username_1",
"text": "@username_0 If I got you right. Nope, PnPjs is the client-side library, we can not add anything to SharePoint API responses headers, we only make consuming the API nicer and easier.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "patrick-rodgers",
"comment_id": 548782333,
"datetime": 1572613854000,
"masked_author": "username_2",
"text": "@username_0 - yes looking at this and the SO article you linked the fix is required on the server side. We can't control what/how cookies are sent from the client. You could report this as well to the [sp-dev-docs repo](https://github.com/SharePoint/sp-dev-docs). Sorry this caught you but unfortunately not within our power to fix.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "harrychen1",
"comment_id": 548841890,
"datetime": 1572623762000,
"masked_author": "username_0",
"text": "This is ok. I found a way to fix the issue from our end.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "harrychen1",
"comment_id": null,
"datetime": 1572623763000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 4 | 6 | 3,954 | false | true | 3,804 | true |
alpacahq/alpaca-trade-api-python | alpacahq | 527,729,043 | 123 | null | [
{
"action": "opened",
"author": "keramtalb",
"comment_id": null,
"datetime": 1574616171000,
"masked_author": "username_0",
"text": "Polygon has announced retiring some of the tick end points still used in this library on Jan 1, 2020 - https://polygon.io/blog/action-required-v1-tick-api-will-be-deprecated-on-01-01-2020/.\r\n\r\nIs there any work underway to address this?",
"title": "Depreciation of some V1 endpoints",
"type": "issue"
},
{
"action": "created",
"author": "ttt733",
"comment_id": 558668633,
"datetime": 1574780402000,
"masked_author": "username_1",
"text": "We have not done any deprecation warnings in the past, but I will see about adding a warning in a future release. I just opened a draft PR allowing use of the V2 endpoints; once I've been able to confirm something about the response from Polygon I should be able to add the replacements.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "ttt733",
"comment_id": null,
"datetime": 1580271831000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 523 | false | false | 523 | false |
warpnet/salt-lint | warpnet | 667,891,604 | 179 | {
"number": 179,
"repo": "salt-lint",
"user_login": "warpnet"
} | [
{
"action": "opened",
"author": "chrisvanbreeden",
"comment_id": null,
"datetime": 1596033024000,
"masked_author": "username_0",
"text": "",
"title": "Added rule: check that jinja template files have the correct extension",
"type": "issue"
},
{
"action": "created",
"author": "jbouter",
"comment_id": 667003462,
"datetime": 1596184199000,
"masked_author": "username_1",
"text": "When using `file.recursive`, one might have jinja templates in there that can't be renamed to have the `.j2` extension, because then they'll have that extension on filesystem.\r\n\r\nFor example:\r\n\r\n```code\r\nnginx_vhosts:\r\n file.recursive:\r\n - name: /etc/nginx/sites-enabled/\r\n - source: salt://nginx/vhosts\r\n [..]\r\n```\r\n\r\nIf `/nginx/vhosts/` is filled with `vhost.conf.j2` files, they will be put on the server's filesystem as `vhost.conf.j2` files.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "jbouter",
"comment_id": 667132891,
"datetime": 1596203822000,
"masked_author": "username_1",
"text": "First of all, thank you for your contribution!\r\n\r\nWe are, however, sadly closing this pull request due to the following reasons:\r\n\r\n* We should only be linting SaltStack files, meaning only lint files ending in `.sls`. We don't want our users to lint all of their files, scanning for possible jinja.\r\n* We can't rule out false positives as stated in the comment above.\r\n\r\nI suggest this PR is rewritten to do the following:\r\n\r\n* Check **only** `.sls` files\r\n* Check if `file.managed` is being used\r\n* If so, check if `- template: jinja` is set\r\n* If these conditions match, check if a `- source: ` is set\r\n* Check if the file path in `- source:` ends with `.j2`\r\n\r\nIf you do wish however to use this code, you can use `-r` option to specify your own set of rules in your own SaltStack linting pipeline. But we won't be merging this into the codebase.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,306 | false | false | 1,306 | false |
boltdesignsystem/bolt | boltdesignsystem | 539,677,194 | 1,626 | {
"number": 1626,
"repo": "bolt",
"user_login": "boltdesignsystem"
} | [
{
"action": "opened",
"author": "sghoweri",
"comment_id": null,
"datetime": 1576675372000,
"masked_author": "username_0",
"text": "## Jira\r\nN/A\r\n\r\n## Summary\r\nAuto-format + re-order any `package.json` files that get updated.\r\n\r\n## Details\r\nUses https://www.npmjs.com/package/sort-package-json + our new super fast commit hooks to automatically take care of everything!\r\n\r\n## How to test\r\nIt just works + only applies to the monorepo. Super low risk quality of life improvement! 😉",
"title": "Automatically Format Package.json Files",
"type": "issue"
},
{
"action": "created",
"author": "sghoweri",
"comment_id": 571146614,
"datetime": 1578318957000,
"masked_author": "username_0",
"text": "<!-- GITHUB_RELEASE COMMENT: released -->\nPR was released with v2.14.0",
"title": null,
"type": "comment"
}
] | 1 | 2 | 418 | false | false | 418 | false |
V4Fire/Client | V4Fire | 616,524,024 | 230 | null | [
{
"action": "opened",
"author": "bonkalol",
"comment_id": null,
"datetime": 1589277459000,
"masked_author": "username_0",
"text": "Репорт:\r\n\"Ставишь медленный интернет. И начинаешь переключать сегменты, не дожидаясь их загрузки. Ты заметишь, что в какой-то момент пропадают скелетоны\"\r\n\r\nЛог:\r\n\r\n```\r\n==== scroll-render::reInit [tagUuid: 689545f6-94c3-4be9-845f-2d441e3228f0] ==== {page: 0, limit: 10, tagUuid: \"689545f6-94c3-4be9-845f-2d441e3228f0\"}\r\nref tombstones visibility changed: show = true\r\nref loader visibility changed: show = true\r\nref retry visibility changed: show = false\r\nref done visibility changed: show = false\r\nref empty visibility changed: show = false\r\nmod progress set to true [old: undefined]\r\nmod status set to before-ready [old: loading]\r\nmod progress set to false [old: true]\r\nlocalReady\r\n==== scroll-render::onReady [tagUuid: 689545f6-94c3-4be9-845f-2d441e3228f0]==== {page: 0, limit: 10, tagUuid: \"689545f6-94c3-4be9-845f-2d441e3228f0\"}\r\nref tombstones visibility changed: show = false\r\nref loader visibility changed: show = false\r\nmod status set to ready [old: before-ready]\r\n==== scroll-request::try:start [tagUuid: 689545f6-94c3-4be9-845f-2d441e3228f0] ==== {page: 0, limit: 10, tagUuid: \"689545f6-94c3-4be9-845f-2d441e3228f0\"}\r\nref tombstones visibility changed: show = true\r\nref loader visibility changed: show = true\r\nmod progress set to true [old: false]\r\nmod status set to loading [old: ready]\r\n==== scroll-render::reInit [tagUuid: 8c935c42-8afa-4be8-ae50-7359a966bde7] ==== {page: 0, limit: 10, tagUuid: \"8c935c42-8afa-4be8-ae50-7359a966bde7\"}\r\nref tombstones visibility changed: show = true\r\nref loader visibility changed: show = true\r\nref retry visibility changed: show = false\r\nref done visibility changed: show = false\r\nref empty visibility changed: show = false\r\n==== scroll-request::try:finish [tagUuid: 689545f6-94c3-4be9-845f-2d441e3228f0] ==== {page: 0, limit: 10, tagUuid: \"689545f6-94c3-4be9-845f-2d441e3228f0\"}\r\nref tombstones visibility changed: show = false\r\nref loader visibility changed: 
show = false\r\n==== scroll-render::onRequestDone [tagUuid: 689545f6-94c3-4be9-845f-2d441e3228f0]==== {page: 0, limit: 10, tagUuid: \"689545f6-94c3-4be9-845f-2d441e3228f0\"}\r\nref tombstones visibility changed: show = false\r\nref loader visibility changed: show = false\r\nref done visibility changed: show = true\r\nmod status set to before-ready [old: loading]\r\nlocalReady\r\n==== scroll-render::onReady [tagUuid: 8c935c42-8afa-4be8-ae50-7359a966bde7]==== {page: 0, limit: 10, tagUuid: \"8c935c42-8afa-4be8-ae50-7359a966bde7\"}\r\nref tombstones visibility changed: show = false\r\nref loader visibility changed: show = false\r\nmod status set to ready [old: before-ready]\r\n```",
"title": "Пропадают скелетоны в b-virtual-scroll",
"type": "issue"
},
{
"action": "created",
"author": "bonkalol",
"comment_id": 627240182,
"datetime": 1589277515000,
"masked_author": "username_0",
"text": "https://github.com/V4Fire/Client/issues/203",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "kobezzza",
"comment_id": null,
"datetime": 1591172593000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 3 | 2,613 | false | false | 2,613 | false |
open-mmlab/mmdetection | open-mmlab | 618,184,110 | 2,720 | null | [
{
"action": "opened",
"author": "Mxbonn",
"comment_id": null,
"datetime": 1589458798000,
"masked_author": "username_0",
"text": "By default COCO doesn't use a single IoU to calculate the mAP but instead averages over a range of IoUs (IoU=0.50:0.95).\r\nThe evaluate function also outputs the mAP for IoU 0.5 and 0.75.\r\nHowever calculating the mAP for IoU 0.5 and 0.75 using the `evaluate` method from the `super` class `CustomDataset` gives a different result.\r\n\r\n**This means that implementing a new dataset will give different evaluation results depending on whether it is implemented as a new subclass of `CustomDataset` or with COCO compatible labels as a `CocoDataset`.**\r\n\r\nMinimal example:\r\n```python\r\nimport os\r\n\r\nimport mmdet.datasets\r\nimport mmdet.apis\r\nimport mmcv\r\n```\r\n\r\n\r\n```python\r\ncheckpoint_path = \"checkpoints/ssd300_coco_20200307-a92d2092.pth\"\r\nconfig_path = \"configs/ssd/ssd300_coco.py\"\r\n\r\ncfg = mmcv.Config.fromfile(config_path)\r\ncfg.model.pretrained = None\r\ncfg.data.test.test_mode = True\r\ncfg.test_cfg.max_per_img = 100 # Set to 100 instead of 200 as CustomDataset.evaluate has no maxDets\r\n```\r\n\r\n\r\n```python\r\ntest_dataset = mmdet.datasets.build_dataset(cfg.data.test)\r\ntest_dataloader = mmdet.datasets.build_dataloader(\r\n test_dataset,\r\n samples_per_gpu=1,\r\n workers_per_gpu=cfg.data.workers_per_gpu,\r\n dist=False,\r\n shuffle=False\r\n)\r\n```\r\n\r\n loading annotations into memory...\r\n Done (t=0.52s)\r\n creating index...\r\n index created!\r\n\r\n\r\n\r\n```python\r\nraw_model = mmdet.models.build_detector(cfg.model, train_cfg=None, test_cfg=cfg.test_cfg)\r\ncheckpoint = mmcv.runner.load_checkpoint(raw_model, checkpoint_path, map_location='cpu')\r\nraw_model.CLASSES = test_dataset.CLASSES\r\nmodel = mmcv.parallel.MMDataParallel(raw_model, device_ids=[0])\r\n```\r\n\r\n\r\n```python\r\nresults = mmdet.apis.single_gpu_test(model, test_dataloader)\r\n```\r\n\r\n [>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 5000/5000, 39.0 task/s, elapsed: 128s, ETA: 
0s\r\n\r\n\r\n```python\r\ntest_dataset.evaluate(results)\r\n```\r\n\r\n\r\n Evaluating bbox...\r\n Loading and preparing results...\r\n DONE (t=3.77s)\r\n creating index...\r\n index created!\r\n Running per image evaluation...\r\n Evaluate annotation type *bbox*\r\n DONE (t=53.15s).\r\n Accumulating evaluation results...\r\n DONE (t=7.20s).\r\n Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.255\r\n Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=100 ] = 0.436\r\n Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=100 ] = 0.263\r\n Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.065\r\n Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.277\r\n Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.422\r\n Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 1 ] = 0.237\r\n Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 10 ] = 0.347\r\n Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.364\r\n Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.108\r\n Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.401\r\n Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.582\r\n\r\n\r\n\r\n\r\n\r\n {'bbox_mAP': 0.255,\r\n 'bbox_mAP_50': 0.436,\r\n 'bbox_mAP_75': 0.263,\r\n 'bbox_mAP_s': 0.065,\r\n 'bbox_mAP_m': 0.277,\r\n 'bbox_mAP_l': 0.422,\r\n 'bbox_mAP_copypaste': '0.255 0.436 0.263 0.065 0.277 0.422'}\r\n\r\n\r\n\r\n\r\n```python\r\nsuper(type(test_dataset), test_dataset).evaluate(results, iou_thr=0.5)\r\n```\r\n\r\n\r\n +----------------+-------+--------+--------+-------+\r\n | class | gts | dets | recall | ap |\r\n +----------------+-------+--------+--------+-------+\r\n | person | 10777 | 151136 | 0.781 | 0.619 |\r\n | bicycle | 314 | 1935 | 0.503 | 0.376 |\r\n | car | 1918 | 17858 | 0.626 | 0.440 |\r\n | motorcycle | 367 | 2282 | 0.703 | 0.571 |\r\n | airplane | 
143 | 3942 | 0.895 | 0.737 |\r\n | bus | 283 | 875 | 0.742 | 0.693 |\r\n | train | 190 | 4048 | 0.863 | 0.795 |\r\n | truck | 414 | 2610 | 0.693 | 0.431 |\r\n | boat | 424 | 8890 | 0.547 | 0.308 |\r\n | traffic light | 634 | 12564 | 0.465 | 0.236 |\r\n | fire hydrant | 101 | 274 | 0.703 | 0.662 |\r\n | stop sign | 75 | 479 | 0.640 | 0.581 |\r\n | parking meter | 60 | 147 | 0.583 | 0.498 |\r\n | bench | 411 | 12074 | 0.457 | 0.267 |\r\n | bird | 427 | 13891 | 0.433 | 0.296 |\r\n | cat | 202 | 1221 | 0.931 | 0.851 |\r\n | dog | 218 | 1027 | 0.821 | 0.713 |\r\n | horse | 272 | 2077 | 0.794 | 0.671 |\r\n | sheep | 354 | 5733 | 0.782 | 0.559 |\r\n | cow | 372 | 4083 | 0.766 | 0.534 |\r\n | elephant | 252 | 3908 | 0.897 | 0.738 |\r\n | bear | 71 | 308 | 0.831 | 0.777 |\r\n | zebra | 266 | 5540 | 0.880 | 0.797 |\r\n | giraffe | 232 | 5620 | 0.901 | 0.812 |\r\n | backpack | 371 | 2462 | 0.305 | 0.111 |\r\n | umbrella | 407 | 4079 | 0.634 | 0.462 |\r\n | handbag | 540 | 4092 | 0.274 | 0.079 |\r\n | tie | 252 | 4043 | 0.480 | 0.314 |\r\n | suitcase | 299 | 1934 | 0.515 | 0.334 |\r\n | frisbee | 115 | 566 | 0.687 | 0.499 |\r\n | skis | 240 | 5994 | 0.592 | 0.301 |\r\n | snowboard | 69 | 343 | 0.478 | 0.311 |\r\n | sports ball | 260 | 4571 | 0.504 | 0.382 |\r\n | kite | 327 | 7137 | 0.648 | 0.385 |\r\n | baseball bat | 145 | 1113 | 0.441 | 0.283 |\r\n | baseball glove | 148 | 1161 | 0.480 | 0.296 |\r\n | skateboard | 179 | 1872 | 0.721 | 0.559 |\r\n | surfboard | 267 | 4453 | 0.592 | 0.413 |\r\n | tennis racket | 225 | 2015 | 0.631 | 0.524 |\r\n | bottle | 1013 | 7547 | 0.523 | 0.277 |\r\n | wine glass | 341 | 1171 | 0.408 | 0.290 |\r\n | cup | 895 | 4976 | 0.534 | 0.330 |\r\n | fork | 215 | 1029 | 0.414 | 0.228 |\r\n | knife | 325 | 1714 | 0.237 | 0.065 |\r\n | spoon | 253 | 1237 | 0.249 | 0.088 |\r\n | bowl | 623 | 3130 | 0.652 | 0.431 |\r\n | banana | 370 | 7206 | 0.635 | 0.300 |\r\n | apple | 236 | 2028 | 0.424 | 0.154 |\r\n | sandwich | 177 | 1156 | 0.678 | 0.462 |\r\n | 
orange | 285 | 2710 | 0.561 | 0.285 |\r\n | broccoli | 312 | 13007 | 0.769 | 0.325 |\r\n | carrot | 365 | 9667 | 0.573 | 0.193 |\r\n | hot dog | 125 | 470 | 0.480 | 0.363 |\r\n | pizza | 284 | 2390 | 0.736 | 0.600 |\r\n | donut | 328 | 2525 | 0.649 | 0.389 |\r\n | cake | 310 | 1742 | 0.626 | 0.412 |\r\n | chair | 1771 | 33923 | 0.537 | 0.279 |\r\n | couch | 261 | 1659 | 0.785 | 0.590 |\r\n | potted plant | 342 | 12017 | 0.661 | 0.308 |\r\n | bed | 163 | 1625 | 0.785 | 0.632 |\r\n | dining table | 695 | 8174 | 0.662 | 0.381 |\r\n | toilet | 179 | 3298 | 0.866 | 0.730 |\r\n | tv | 288 | 2276 | 0.795 | 0.671 |\r\n | laptop | 231 | 913 | 0.740 | 0.647 |\r\n | mouse | 106 | 627 | 0.708 | 0.557 |\r\n | remote | 283 | 2146 | 0.378 | 0.138 |\r\n | keyboard | 153 | 1376 | 0.784 | 0.624 |\r\n | cell phone | 262 | 2546 | 0.462 | 0.287 |\r\n | microwave | 55 | 366 | 0.727 | 0.562 |\r\n | oven | 143 | 3618 | 0.783 | 0.523 |\r\n | toaster | 9 | 16 | 0.111 | 0.007 |\r\n | sink | 225 | 8280 | 0.702 | 0.447 |\r\n | refrigerator | 126 | 2632 | 0.794 | 0.599 |\r\n | book | 1129 | 21431 | 0.417 | 0.100 |\r\n | clock | 267 | 4601 | 0.674 | 0.562 |\r\n | vase | 274 | 1612 | 0.504 | 0.315 |\r\n | scissors | 36 | 256 | 0.472 | 0.379 |\r\n | teddy bear | 190 | 2154 | 0.674 | 0.535 |\r\n | hair drier | 11 | 1 | 0.000 | 0.000 |\r\n | toothbrush | 57 | 326 | 0.263 | 0.146 |\r\n +----------------+-------+--------+--------+-------+\r\n | mAP | | | | 0.430 |\r\n +----------------+-------+--------+--------+-------+\r\n\r\n\r\n\r\n\r\n\r\n {'mAP': 0.43032437562942505}\r\n\r\n\r\n\r\n\r\n```python\r\nsuper(type(test_dataset), test_dataset).evaluate(results, iou_thr=0.75)\r\n```\r\n\r\n\r\n +----------------+-------+--------+--------+-------+\r\n | class | gts | dets | recall | ap |\r\n +----------------+-------+--------+--------+-------+\r\n | person | 10777 | 151136 | 0.424 | 0.322 |\r\n | bicycle | 314 | 1935 | 0.220 | 0.141 |\r\n | car | 1918 | 17858 | 0.301 | 0.209 |\r\n | motorcycle | 367 | 
2282 | 0.351 | 0.275 |\r\n | airplane | 143 | 3942 | 0.664 | 0.538 |\r\n | bus | 283 | 875 | 0.625 | 0.589 |\r\n | train | 190 | 4048 | 0.679 | 0.623 |\r\n | truck | 414 | 2610 | 0.425 | 0.236 |\r\n | boat | 424 | 8890 | 0.200 | 0.094 |\r\n | traffic light | 634 | 12564 | 0.129 | 0.056 |\r\n | fire hydrant | 101 | 274 | 0.594 | 0.532 |\r\n | stop sign | 75 | 479 | 0.600 | 0.567 |\r\n | parking meter | 60 | 147 | 0.433 | 0.373 |\r\n | bench | 411 | 12074 | 0.221 | 0.138 |\r\n | bird | 427 | 13891 | 0.267 | 0.194 |\r\n | cat | 202 | 1221 | 0.757 | 0.617 |\r\n | dog | 218 | 1027 | 0.651 | 0.529 |\r\n | horse | 272 | 2077 | 0.540 | 0.418 |\r\n | sheep | 354 | 5733 | 0.432 | 0.290 |\r\n | cow | 372 | 4083 | 0.462 | 0.333 |\r\n | elephant | 252 | 3908 | 0.643 | 0.511 |\r\n | bear | 71 | 308 | 0.718 | 0.664 |\r\n | zebra | 266 | 5540 | 0.624 | 0.552 |\r\n | giraffe | 232 | 5620 | 0.668 | 0.582 |\r\n | backpack | 371 | 2462 | 0.127 | 0.029 |\r\n | umbrella | 407 | 4079 | 0.327 | 0.218 |\r\n | handbag | 540 | 4092 | 0.093 | 0.019 |\r\n | tie | 252 | 4043 | 0.183 | 0.118 |\r\n | suitcase | 299 | 1934 | 0.261 | 0.143 |\r\n | frisbee | 115 | 566 | 0.539 | 0.393 |\r\n | skis | 240 | 5994 | 0.133 | 0.046 |\r\n | snowboard | 69 | 343 | 0.217 | 0.134 |\r\n | sports ball | 260 | 4571 | 0.342 | 0.255 |\r\n | kite | 327 | 7137 | 0.321 | 0.175 |\r\n | baseball bat | 145 | 1113 | 0.172 | 0.067 |\r\n | baseball glove | 148 | 1161 | 0.257 | 0.117 |\r\n | skateboard | 179 | 1872 | 0.358 | 0.237 |\r\n | surfboard | 267 | 4453 | 0.270 | 0.156 |\r\n | tennis racket | 225 | 2015 | 0.293 | 0.183 |\r\n | bottle | 1013 | 7547 | 0.247 | 0.116 |\r\n | wine glass | 341 | 1171 | 0.194 | 0.115 |\r\n | cup | 895 | 4976 | 0.345 | 0.214 |\r\n | fork | 215 | 1029 | 0.172 | 0.080 |\r\n | knife | 325 | 1714 | 0.098 | 0.029 |\r\n | spoon | 253 | 1237 | 0.119 | 0.046 |\r\n | bowl | 623 | 3130 | 0.456 | 0.293 |\r\n | banana | 370 | 7206 | 0.265 | 0.117 |\r\n | apple | 236 | 2028 | 0.250 | 0.094 |\r\n | 
sandwich | 177 | 1156 | 0.486 | 0.318 |\r\n | orange | 285 | 2710 | 0.411 | 0.210 |\r\n | broccoli | 312 | 13007 | 0.308 | 0.108 |\r\n | carrot | 365 | 9667 | 0.216 | 0.064 |\r\n | hot dog | 125 | 470 | 0.304 | 0.233 |\r\n | pizza | 284 | 2390 | 0.521 | 0.434 |\r\n | donut | 328 | 2525 | 0.463 | 0.293 |\r\n | cake | 310 | 1742 | 0.374 | 0.199 |\r\n | chair | 1771 | 33923 | 0.244 | 0.126 |\r\n | couch | 261 | 1659 | 0.521 | 0.321 |\r\n | potted plant | 342 | 12017 | 0.266 | 0.102 |\r\n | bed | 163 | 1625 | 0.534 | 0.405 |\r\n | dining table | 695 | 8174 | 0.397 | 0.230 |\r\n | toilet | 179 | 3298 | 0.659 | 0.549 |\r\n | tv | 288 | 2276 | 0.635 | 0.507 |\r\n | laptop | 231 | 913 | 0.576 | 0.499 |\r\n | mouse | 106 | 627 | 0.557 | 0.459 |\r\n | remote | 283 | 2146 | 0.166 | 0.073 |\r\n | keyboard | 153 | 1376 | 0.484 | 0.333 |\r\n | cell phone | 262 | 2546 | 0.282 | 0.177 |\r\n | microwave | 55 | 366 | 0.545 | 0.438 |\r\n | oven | 143 | 3618 | 0.413 | 0.277 |\r\n | toaster | 9 | 16 | 0.111 | 0.007 |\r\n | sink | 225 | 8280 | 0.351 | 0.179 |\r\n | refrigerator | 126 | 2632 | 0.556 | 0.424 |\r\n | book | 1129 | 21431 | 0.117 | 0.024 |\r\n | clock | 267 | 4601 | 0.468 | 0.339 |\r\n | vase | 274 | 1612 | 0.270 | 0.167 |\r\n | scissors | 36 | 256 | 0.250 | 0.189 |\r\n | teddy bear | 190 | 2154 | 0.374 | 0.240 |\r\n | hair drier | 11 | 1 | 0.000 | 0.000 |\r\n | toothbrush | 57 | 326 | 0.070 | 0.040 |\r\n +----------------+-------+--------+--------+-------+\r\n | mAP | | | | 0.259 |\r\n +----------------+-------+--------+--------+-------+\r\n\r\n\r\n {'mAP': 0.2592655122280121}",
"title": "COCO mAP and CustomDataset mAP don't match on single IoU value.",
"type": "issue"
},
{
"action": "created",
"author": "ZwwWayne",
"comment_id": 630053838,
"datetime": 1589793245000,
"masked_author": "username_1",
"text": "The `customdataset` uses VOC-style AP calculation, its implementation when calculating TP and FPs are a little bit different from COCO evaluation.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "hellock",
"comment_id": 633182840,
"datetime": 1590298790000,
"masked_author": "username_2",
"text": "@username_0 If you use the official evaluation tool provided by COCO to evaluate the results of Pascal VOC dataset or ImageNet DET dataset, you will also get slightly different mAPs. The implementation of evaluation metrics of different datasets are not exactly the same.",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "hellock",
"comment_id": null,
"datetime": 1590298790000,
"masked_author": "username_2",
"text": "",
"title": null,
"type": "issue"
}
] | 3 | 4 | 14,123 | false | false | 14,123 | true |
johnrengelman/gradle-processes | null | 472,534,072 | 16 | null | [
{
"action": "opened",
"author": "ddelponte",
"comment_id": null,
"datetime": 1564002911000,
"masked_author": "username_0",
"text": "I have a multi-project build.\r\n\r\nIt contains four sample apps.\r\n\r\nI would like to have one gradle task that will:\r\n\r\n1. start sample app 1:\r\n - `./gradlew :simple:app1:run`\r\n2. once app 1 is started, execute tests against the running sample app 1:\r\n - `./gradlew :simple:loadtests-app1:gatlingRun`\r\n3. shut down sample app 1\r\n4. start sample app 2\r\n5. once app 2 is started, execute tests against the running sample app 2\r\n\r\netc.\r\n\r\nAny advice on how best to accomplish this?\r\n\r\nThanks!",
"title": "How to use the plugin to startup subprojects and run tests against them",
"type": "issue"
},
{
"action": "closed",
"author": "ddelponte",
"comment_id": null,
"datetime": 1564153596000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 2 | 492 | false | false | 492 | false |
cmiscm/leonsans | null | 488,304,622 | 9 | null | [
{
"action": "opened",
"author": "rcrath",
"comment_id": null,
"datetime": 1567454068000,
"masked_author": "username_0",
"text": "I really like the look of metaball, but can't figure out where it gets is unique \"burnt-in\" look. I tried to figure it out from the code, but all I see are standard weight, leading, tracking, etc, nothing that would account for the unique shape of the characters (ie, larger at the joints). Can you explain this one a little? Thanks. \r\n\r\nPS, reminds me a little of [averia](http://iotic.com/averia/).",
"title": "Metaball?",
"type": "issue"
},
{
"action": "created",
"author": "bbbooo313",
"comment_id": 528160317,
"datetime": 1567648051000,
"masked_author": "username_1",
"text": "让我想起了一点dapingtai4455.com",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "kawa-yoiko",
"comment_id": 528177373,
"datetime": 1567652935000,
"masked_author": "username_2",
"text": "I was curious as well and dug into it, sharing it here ^ ^\r\n\r\n(tl;dr: blur + threshold)\r\n\r\nYou can see that sprites are placed along the strokes and animated, which in its final form looks like this:\r\n\r\n<img width=\"421\" alt=\"metaball-1\" src=\"https://user-images.githubusercontent.com/45892908/64308727-d8421c00-cfcc-11e9-84c5-6a4c49119fde.png\">\r\n\r\nBut before being drawn to the screen the entire canvas goes through two filters.\r\n\r\nhttps://github.com/username_3/leonsans/blob/920bc2b2d0fdd23a220590c36c9bfc34e56b44ac/examples/metaball-pixi.html#L93-L94\r\n\r\nThe blur filter is provided by PixiJS:\r\n\r\nhttps://github.com/username_3/leonsans/blob/920bc2b2d0fdd23a220590c36c9bfc34e56b44ac/examples/metaball-pixi.html#L61-L63\r\n\r\nThe image after this step:\r\n\r\n<img width=\"445\" alt=\"metaball-2\" src=\"https://user-images.githubusercontent.com/45892908/64308828-2f47f100-cfcd-11e9-8d62-a844ab3ea07d.png\">\r\n\r\nThe threshold filter is built through a WebGL fragment shader:\r\n\r\nhttps://github.com/username_3/leonsans/blob/920bc2b2d0fdd23a220590c36c9bfc34e56b44ac/examples/metaball-pixi.html#L66-L82\r\n\r\nThe shader is executed for each pixel. It takes the corresponding pixel on the previously drawn image, finds its opacity, and return a given color if it's larger than a given threshold, or transparency otherwise. Uniforms are basically global constants in shaders. 
In case you are interested, [the book of shaders](https://thebookofshaders.com/) is a good resource.\r\n\r\nFinal result:\r\n\r\n<img width=\"433\" alt=\"metaball-3\" src=\"https://user-images.githubusercontent.com/45892908/64308767-ff98e900-cfcc-11e9-831c-6ed2646a8919.png\">\r\n\r\n---\r\n\r\nTo see the very first image for yourself, apply the following diff, run a local static file server (`python3 -m http.server 8000`, otherwise sprite images won't load) and visit `http://localhost:8000/examples/metaball-pixi.html`:\r\n\r\n```diff\r\n--- a/examples/metaball-pixi.html\r\n+++ b/examples/metaball-pixi.html\r\n@@ -74,11 +74,7 @@\r\n '{',\r\n ' vec4 color = texture2D(uSampler, vTextureCoord);',\r\n ' vec3 mcolor = vec3(mr, mg, mb);',\r\n- ' if (color.a > threshold) {',\r\n- ' gl_FragColor = vec4(mcolor, 1.0);',\r\n- ' } else {',\r\n- ' gl_FragColor = vec4(vec3(0.0), 0.0);',\r\n- ' }',\r\n+ ' gl_FragColor = vec4(mcolor * color.a, 1.0);',\r\n '}'\r\n ].join('\\n');\r\n \r\n@@ -90,7 +86,7 @@\r\n };\r\n \r\n const thresholdFilter = new PIXI.Filter(null, fragSource, uniformsData);\r\n- stage.filters = [blurFilter, thresholdFilter];\r\n+ stage.filters = [thresholdFilter];\r\n stage.filterArea = renderer.screen;\r\n \r\n controll = {\r\n@@ -106,7 +102,7 @@\r\n y: 0\r\n });\r\n TweenMax.to(particles[i].scale, 3, {\r\n- delay: 0.001 * i,\r\n+ delay: 0.02 * i,\r\n x: particles[i].saveScale,\r\n y: particles[i].saveScale,\r\n ease: Circ.easeOut\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "cmiscm",
"comment_id": 528185686,
"datetime": 1567655711000,
"masked_author": "username_3",
"text": "Thanks for the detailed comment, also this article I wrote last year would be helpful.\r\nhttps://medium.com/@username_3/gooey-effect-for-chamoy-creative-4ce0b5ff5baf",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "rcrath",
"comment_id": null,
"datetime": 1567674718000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "rcrath",
"comment_id": 528275268,
"datetime": 1567674718000,
"masked_author": "username_0",
"text": "Thanks @username_2 and @username_3! Looking forward to playing with this. If I can figure things out, I might even replace my [logo](http://way.net), which has been on my site since 1995!",
"title": null,
"type": "comment"
}
] | 4 | 6 | 4,002 | false | false | 4,002 | true |
hasktorch/hasktorch | hasktorch | 317,014,788 | 84 | null | [
{
"action": "opened",
"author": "stites",
"comment_id": null,
"datetime": 1524525135000,
"masked_author": "username_0",
"text": "Might be a bit tricky, but it would be nice to say something like:\r\n```\r\nmain :: IO ()\r\nmain = do\r\n t :: DoubleTensor '[1,2,3] <- fromList [1..]\r\n printTensor t\r\n```\r\n\r\nCurrently the types take precedence, so it actually doesn't matter what you put into the list: if the list is too short, it will be filled with (I assume) whatever was last in memory, whereas if the list is too long it will be truncated.",
"title": "Make fromList take an infinite list",
"type": "issue"
},
{
"action": "created",
"author": "stites",
"comment_id": 429498570,
"datetime": 1539392524000,
"masked_author": "username_0",
"text": "this is infeasible due to the handling of the FFI (as far as I can tell).",
"title": null,
"type": "comment"
},
{
"action": "closed",
"author": "stites",
"comment_id": null,
"datetime": 1539392524000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 1 | 3 | 481 | false | false | 481 | false |
apache/hbase | apache | 570,577,998 | 1,206 | {
"number": 1206,
"repo": "hbase",
"user_login": "apache"
} | [
{
"action": "opened",
"author": "Apache9",
"comment_id": null,
"datetime": 1582637947000,
"masked_author": "username_0",
"text": "",
"title": "HBASE-23890 Update the rsgroup section in our ref guide",
"type": "issue"
},
{
"action": "created",
"author": "Apache9",
"comment_id": 592305420,
"datetime": 1582863462000,
"masked_author": "username_0",
"text": "Any other concerns? @busbey",
"title": null,
"type": "comment"
}
] | 2 | 6 | 9,496 | false | true | 27 | false |
facebook/react | facebook | 607,715,560 | 18,755 | {
"number": 18755,
"repo": "react",
"user_login": "facebook"
} | [
{
"action": "opened",
"author": "bl00mber",
"comment_id": null,
"datetime": 1588007879000,
"masked_author": "username_0",
"text": "",
"title": "Allow Node 14.x",
"type": "issue"
}
] | 3 | 4 | 1,126 | false | true | 0 | false |
arlac77/svelte-guard-history-router | null | 644,026,725 | 244 | null | [
{
"action": "closed",
"author": "arlac77",
"comment_id": null,
"datetime": 1592935574000,
"masked_author": "username_0",
"text": "",
"title": null,
"type": "issue"
}
] | 2 | 2 | 6,174 | false | true | 0 | false |
ros2/rosbag2 | ros2 | 564,885,938 | 294 | {
"number": 294,
"repo": "rosbag2",
"user_login": "ros2"
} | [
{
"action": "opened",
"author": "zmichaels11",
"comment_id": null,
"datetime": 1581620219000,
"masked_author": "username_0",
"text": "### Changes\r\n* Require `compression_mode_` to not be none within `load_next_file`\r\n* Require `decompressor_` to not be null if `compression_mode_` is `FILE` in `load_next_file`",
"title": "Fix throw in playback of split+compressed bagfiles",
"type": "issue"
},
{
"action": "created",
"author": "zmichaels11",
"comment_id": 585996792,
"datetime": 1581631832000,
"masked_author": "username_0",
"text": "@ros2/aws-oncall - please run this CI job\r\nGist: https://gist.githubusercontent.com/username_0/50bcd011e1f31e0477fe5c58c6f99c1a/raw/2d07178c55d483933adb583f2f4184765f0a4128/ros2.repos\r\nBUILD args: --packages-up-to rosbag2_compression\r\nTEST args: --packages-select rosbag2_compression\r\nJob: ci_launcher",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "emersonknapp",
"comment_id": 586007292,
"datetime": 1581633610000,
"masked_author": "username_1",
"text": "Is this a bugfix? If so, can we put in a regression test for it?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "dabonnie",
"comment_id": 586413709,
"datetime": 1581705127000,
"masked_author": "username_2",
"text": "* Linux [](http://ci.ros2.org/job/ci_linux/9347/)\r\n* Linux-aarch64 [](http://ci.ros2.org/job/ci_linux-aarch64/5029/)\r\n* macOS [](http://ci.ros2.org/job/ci_osx/7653/)\r\n* Windows [](http://ci.ros2.org/job/ci_windows/9269/)\r\n* Windows-container []",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zmichaels11",
"comment_id": 586425929,
"datetime": 1581707075000,
"masked_author": "username_0",
"text": "A proper unit test for this regression will require a `CompressionFactory` so we can mock out `ZstdDecompressor`. I'll try and find some time to introduce that in a separate project.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "piraka9011",
"comment_id": 586437751,
"datetime": 1581708264000,
"masked_author": "username_3",
"text": "Open a ticket to track this please.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zmichaels11",
"comment_id": 586440816,
"datetime": 1581708751000,
"masked_author": "username_0",
"text": "@username_3 Opened https://github.com/ros2/rosbag2/issues/297",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "Karsten1987",
"comment_id": 597266249,
"datetime": 1583867893000,
"masked_author": "username_4",
"text": "@username_0 what's the status of this PR?",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "zmichaels11",
"comment_id": 597308107,
"datetime": 1583873468000,
"masked_author": "username_0",
"text": "We want to put sufficient testing on the changes in this PR before merging it in.\r\n@username_3 is going to merge in the work required to write the unit tests shortly.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "piraka9011",
"comment_id": 597958911,
"datetime": 1583975964000,
"masked_author": "username_3",
"text": "The main fix comes from this: \r\n\r\n```c++\r\nif (decompressor_ == nullptr) {\r\n throw std::runtime_error{\r\n \"The bag file was not properly opened. \"\r\n \"Somehow the compression mode was set without opening a decompressor.\"\r\n };\r\n}\r\n```\r\n\r\nWhere the previous check was simply:\r\n\r\n```c++\r\nif (decompressor_) {\r\n...\r\n}\r\n```",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "piraka9011",
"comment_id": 597960121,
"datetime": 1583976269000,
"masked_author": "username_3",
"text": "* Linux [](http://ci.ros2.org/job/ci_linux/9613/)\r\n* Linux-aarch64 [](http://ci.ros2.org/job/ci_linux-aarch64/5309/)\r\n* macOS [](http://ci.ros2.org/job/ci_osx/7836/)\r\n* Windows [](http://ci.ros2.org/job/ci_windows/9518/)",
"title": null,
"type": "comment"
}
] | 5 | 11 | 2,526 | false | false | 2,526 | true |
teamcapybara/capybara | teamcapybara | 255,003,088 | 1,910 | {
"number": 1910,
"repo": "capybara",
"user_login": "teamcapybara"
} | [
{
"action": "opened",
"author": "herwinw",
"comment_id": null,
"datetime": 1504521621000,
"masked_author": "username_0",
"text": "",
"title": "Fixed typo in docs of capybara/minitest",
"type": "issue"
},
{
"action": "created",
"author": "twalpole",
"comment_id": 326993868,
"datetime": 1504540811000,
"masked_author": "username_1",
"text": "Thanks",
"title": null,
"type": "comment"
}
] | 2 | 2 | 6 | false | false | 6 | false |
github/linguist | github | 545,995,457 | 4,762 | {
"number": 4762,
"repo": "linguist",
"user_login": "github"
} | [
{
"action": "opened",
"author": "lpil",
"comment_id": null,
"datetime": 1578353586000,
"masked_author": "username_0",
"text": "The Elixir template language sometimes uses the extension `.leex`, when used with the LiveView library.\r\n\r\nThis is documented here https://hexdocs.pm/phoenix_live_view/Phoenix.LiveView.html#module-assigns-and-liveeex-templates\r\n\r\nI've not added an example as it is the same file contents as the regular `.eex` version. Shall I add this?\r\n\r\n## Checklist:\r\n\r\n- [x] **I am associating a language with a new file extension.**\r\n - [x] The new extension is used in hundreds of repositories on GitHub.com\r\n - Search results for each extension:\r\n - https://github.com/search?q=extension%3Aleex+div&type=Code\r\n - [ ] I have included a real-world usage sample for all extensions added in this PR:\r\n - Sample source(s):\r\n - [URL to each sample source, if applicable]\r\n - Sample license(s):\r\n - [ ] I have included a change to the heuristics to distinguish my language from others using the same extension.",
"title": "Add leex extension to HTML+EEX",
"type": "issue"
},
{
"action": "created",
"author": "pchaigno",
"comment_id": 571583222,
"datetime": 1578403340000,
"masked_author": "username_1",
"text": "For other reviewers: Usage looks okay. I downloaded 1139 `.html.leex` (which is likely all such files) and counted 486 repositories by 403 users.",
"title": null,
"type": "comment"
},
{
"action": "created",
"author": "lpil",
"comment_id": 571599841,
"datetime": 1578405885000,
"masked_author": "username_0",
"text": "An unrelated question around the eex extension seems to have got mixed up in this PR for the leex extension, so I'm going to close this.",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,197 | false | false | 1,197 | false |
dotnet/EntityFramework.Docs | dotnet | 576,208,642 | 2,165 | null | [
{
"action": "opened",
"author": "jeffw-wherethebitsroam",
"comment_id": null,
"datetime": 1583410291000,
"masked_author": "username_0",
"text": "These methods \"throw an exception if there is not exactly one element in the sequence.\" It would be great if it was documented *which* exception was thrown so that I could catch it.\n\n---\n#### Document Details\n\n⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*\n\n* ID: d3b7deb0-a592-b6dc-1564-fba73a4dcb4c\n* Version Independent ID: 1836b7d9-419e-d8a4-51ff-09f504a56462\n* Content: [EntityFrameworkQueryableExtensions.SingleAsync Method (Microsoft.EntityFrameworkCore)](https://docs.microsoft.com/en-us/dotnet/api/microsoft.entityframeworkcore.entityframeworkqueryableextensions.singleasync?view=efcore-3.1)\n* Content Source: [dotnet/xml/Microsoft.EntityFrameworkCore/EntityFrameworkQueryableExtensions.xml](https://github.com/aspnet/EntityFramework.ApiDocs/blob/live/dotnet/xml/Microsoft.EntityFrameworkCore/EntityFrameworkQueryableExtensions.xml)\n* Product: **entity-framework-core**\n* GitHub Login: @dotnet-bot\n* Microsoft Alias: **divega**",
"title": "Which Exception is thrown?",
"type": "issue"
},
{
"action": "closed",
"author": "ajcvickers",
"comment_id": null,
"datetime": 1583426059000,
"masked_author": "username_1",
"text": "",
"title": null,
"type": "issue"
},
{
"action": "created",
"author": "ajcvickers",
"comment_id": 595323169,
"datetime": 1583426059000,
"masked_author": "username_1",
"text": "Duplicate of https://github.com/dotnet/efcore/issues/20086",
"title": null,
"type": "comment"
}
] | 2 | 3 | 1,039 | false | false | 1,039 | false |