| column             | dtype  | values      |
|--------------------|--------|-------------|
| repo               | string | lengths 7–67 |
| org                | string | lengths 2–32 |
| issue_id           | int64  | 780k – 941M |
| issue_number       | int64  | 1 – 134k    |
| pull_request       | dict   |             |
| events             | list   |             |
| user_count         | int64  | 1 – 77      |
| event_count        | int64  | 1 – 192     |
| text_size          | int64  | 0 – 329k    |
| bot_issue          | bool   | 1 class     |
| modified_by_bot    | bool   | 2 classes   |
| text_size_no_bots  | int64  | 0 – 279k    |
| modified_usernames | bool   | 2 classes   |
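The column schema above can be sanity-checked per row with a minimal sketch. This is an assumption-laden illustration, not part of the dataset: the column names and types come from the table, but the loading mechanism is omitted and the example row is constructed inline from values visible in the first record below rather than read from the real dataset.

```python
# Minimal sketch: check one row against the column schema above.
# EXPECTED_TYPES mirrors the schema table; None is accepted for
# `org` and `pull_request`, which are null in several rows below.

EXPECTED_TYPES = {
    "repo": str,
    "org": (str, type(None)),            # null for user-owned repos
    "issue_id": int,
    "issue_number": int,
    "pull_request": (dict, type(None)),  # null when the issue is not a PR
    "events": list,
    "user_count": int,
    "event_count": int,
    "text_size": int,
    "bot_issue": bool,
    "modified_by_bot": bool,
    "text_size_no_bots": int,
    "modified_usernames": bool,
}

def validate_row(row):
    """Return a list of schema violations for one row (empty if valid)."""
    problems = []
    for column, expected in EXPECTED_TYPES.items():
        if column not in row:
            problems.append(f"missing column: {column}")
        elif not isinstance(row[column], expected):
            problems.append(f"{column}: expected {expected}, got {type(row[column])}")
    return problems

# Example row assembled by hand from the first record in this dump
# (events abbreviated to a single entry for brevity).
example_row = {
    "repo": "toorop/go-pusher",
    "org": None,
    "issue_id": 216097620,
    "issue_number": 8,
    "pull_request": {"number": 8, "repo": "go-pusher", "user_login": "toorop"},
    "events": [{"action": "opened", "type": "issue"}],
    "user_count": 2,
    "event_count": 2,
    "text_size": 109,
    "bot_issue": False,
    "modified_by_bot": False,
    "text_size_no_bots": 109,
    "modified_usernames": False,
}

print(validate_row(example_row))  # an empty list means the row matches
```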
toorop/go-pusher
null
216,097,620
8
{ "number": 8, "repo": "go-pusher", "user_login": "toorop" }
[ { "action": "opened", "author": "Deleplace", "comment_id": null, "datetime": 1490195325000, "masked_author": "username_0", "text": "Library user may close Client, to reclaim back its Pusher connection and stop its network activity.", "title": "(*Client).Close", "type": "issue" }, { "action": "created", "author": "toorop", "comment_id": 288646265, "datetime": 1490256890000, "masked_author": "username_1", "text": "Thanks ; )", "title": null, "type": "comment" } ]
2
2
109
false
false
109
false
christopherdro/react-native-calendar
null
191,662,121
78
null
[ { "action": "opened", "author": "challenger532", "comment_id": null, "datetime": 1480067260000, "masked_author": "username_0", "text": "Hello, \r\n\r\nInstead of showing the words 'prev' and 'next', it's better to show the name and previous month and next month, example:\r\nIf current displayed month is 'Nov', show 'Oct' on the left and 'Dec' on the right.\r\n\r\nThanks,", "title": "Prev month and next month", "type": "issue" }, { "action": "created", "author": "christopherdro", "comment_id": 265076813, "datetime": 1481007890000, "masked_author": "username_1", "text": "You can do this by using props `prevButtonText` and `nextButtonText` or create your own menu bar and disable the built in one by setting the prop `showControls` to false.", "title": null, "type": "comment" }, { "action": "closed", "author": "christopherdro", "comment_id": null, "datetime": 1481007890000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
3
397
false
false
397
false
iverberk/larasearch
null
60,628,825
72
null
[ { "action": "opened", "author": "pfeiferchristopher", "comment_id": null, "datetime": 1426064888000, "masked_author": "username_0", "text": "I myself do not know what is needed to change in order for this to work within Laravel 5.0 but I would love to see it, I imagine it's a few method name changes or something not too much.", "title": "Laravel 5.0 update possible?", "type": "issue" }, { "action": "created", "author": "thangngoc89", "comment_id": 78242630, "datetime": 1426071510000, "masked_author": "username_1", "text": "I'm working on a L5 branch \r\nYou can install and try it (using composer vcs)\r\nhttps://github.com/username_3/larasearch/pull/71", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 84911094, "datetime": 1427102936000, "masked_author": "username_0", "text": "@username_1 if you could explain the steps to do so I've be very thankful lol", "title": null, "type": "comment" }, { "action": "created", "author": "thangngoc89", "comment_id": 84935665, "datetime": 1427106424000, "masked_author": "username_1", "text": "@username_0 you can do something like this\r\n\r\n```js\r\n\"repositories\": [\r\n {\r\n \"type\": \"vcs\",\r\n \"url\": \"https://github.com/username_1/larasearch\"\r\n }\r\n ],\r\n \"require\": {\r\n \"username_3/larasearch\": \"dev-develop\"\r\n }\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 84936349, "datetime": 1427106563000, "masked_author": "username_0", "text": "@username_1 Thanks I'll give it a shot.", "title": null, "type": "comment" }, { "action": "created", "author": "thangngoc89", "comment_id": 84973861, "datetime": 1427114051000, "masked_author": "username_1", "text": "@username_0 OK. 
PRs welcome", "title": null, "type": "comment" }, { "action": "created", "author": "marijang", "comment_id": 92273178, "datetime": 1428914528000, "masked_author": "username_2", "text": "Is it L5 Possible?", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 92279230, "datetime": 1428915838000, "masked_author": "username_0", "text": "@username_2 I'm currently using the version shown above and it is working properly for pretty much everything. I haven't been able to get on-change indexing working properly I believe it may be because of the changes to the Queue system between L4 and L5 but I haven't had time to look into it and try to create a PR to fix it.", "title": null, "type": "comment" }, { "action": "created", "author": "marijang", "comment_id": 92288420, "datetime": 1428916911000, "masked_author": "username_2", "text": "Are you using lennynyktyk version?", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 92301460, "datetime": 1428920412000, "masked_author": "username_0", "text": "@username_2 no just googled it. I'm going to try it tonight.", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 93153173, "datetime": 1429064047000, "masked_author": "username_0", "text": "@username_2 how do I install that version of this package? I'm not good at version control.", "title": null, "type": "comment" }, { "action": "created", "author": "marijang", "comment_id": 93241456, "datetime": 1429083401000, "masked_author": "username_2", "text": "@username_3 must approve pull request\r\nhttps://github.com/username_3/larasearch/pull/75\r\n\r\nI will wait L5 version. This is great package. 
I need it for autocomplete for search users:D\r\n\r\nThanks @username_3", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 93291105, "datetime": 1429091378000, "masked_author": "username_0", "text": "@username_2 by specifying the location of @username_1 project I was able to install his pull requests before being approved. I can't do that with the lennynyktyk version?", "title": null, "type": "comment" }, { "action": "created", "author": "iverberk", "comment_id": 94435701, "datetime": 1429531847000, "masked_author": "username_3", "text": "I just updated a bunch of stuff to make this compatible with L5. Could people check the L5 branch and see if it works within Laravel 5 installations?", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 94436083, "datetime": 1429532006000, "masked_author": "username_0", "text": "I will in about 12 hours. I'll give it a thorough testing.", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 94609918, "datetime": 1429581093000, "masked_author": "username_0", "text": "@username_3 I'm so terrible at using git to install specific versions. 
What version should I set to install the changes you want?", "title": null, "type": "comment" }, { "action": "created", "author": "thangngoc89", "comment_id": 94615192, "datetime": 1429582677000, "masked_author": "username_1", "text": "@username_0 It should be dev-L5 branch in composer", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 94616553, "datetime": 1429583004000, "masked_author": "username_0", "text": "Yeah I'm mean `\"username_3/larasearch\": \"dev-develop\",` what should go where `\"dev-develop\"`", "title": null, "type": "comment" }, { "action": "created", "author": "iverberk", "comment_id": 94663893, "datetime": 1429600720000, "masked_author": "username_3", "text": "@username_0 you should add \"username_3/larasearch\": \"dev-L5\" to your composer dependencies.", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 94723879, "datetime": 1429609850000, "masked_author": "username_0", "text": "@username_3 okay I ran into the same issue I had previously manually fixed.\r\n```ssh\r\n[ErrorException]\r\n Undefined index: errors\r\n```\r\nduring the first batch. Going through the code I found this in `\\Iverberk\\Larasearch\\Index.php`\r\n```php\r\nif ($results['errors'])\r\n```\r\nand by changing it to\r\n```php\r\nif ( array_key_exists('errors', $results) )\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "iverberk", "comment_id": 94760590, "datetime": 1429617515000, "masked_author": "username_3", "text": "I believe the bulk import should always return if there are errors or not? So how are you hitting this case? Which version of Elasticsearch are you using?", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 95039067, "datetime": 1429680370000, "masked_author": "username_0", "text": "@username_3 no I'm not saying the bulk isn't returning properly. 
It seems as though when there are no errors present in the `$results` when no errors occur. So by using `$results['errors']` an exception for `Undefined index: errors` is thrown. Where as with my fix it looks to see if the key `['errors']` exists within `$results` or not.\r\n\r\nWith my change shown above the entire package now works wonderfully. Artisan commands for `paths` and `reindex` with `--relations` as well as `--dir=app/Models` worked perfectly. Also, was able to text that the observer and queue jobs are queuing and firing as they should. Everything except the `$results['errors']` is working flawlessly.", "title": null, "type": "comment" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 95039147, "datetime": 1429680430000, "masked_author": "username_0", "text": "Here's my `->bulk()` method:\r\n```php\r\npublic function bulk($records)\r\n {\r\n $params['index'] = $this->getName();\r\n $params['type'] = $this->getProxy()->getType();\r\n $params['body'] = $records;\r\n\r\n $results = self::getClient()->bulk($params);\r\n\r\n if ( array_key_exists('errors', $results) )\r\n {\r\n $errorItems = [];\r\n\r\n foreach ($results['items'] as $item)\r\n {\r\n if (array_key_exists('error', $item['index']))\r\n {\r\n $errorItems[] = $item;\r\n }\r\n }\r\n\r\n throw new ImportException('Bulk import with errors', 1, $errorItems);\r\n }\r\n }\r\n```", "title": null, "type": "comment" }, { "action": "closed", "author": "pfeiferchristopher", "comment_id": null, "datetime": 1430802194000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "pfeiferchristopher", "comment_id": 98950402, "datetime": 1430802194000, "masked_author": "username_0", "text": "@username_3 I'm sorry I forgot to post this. But I did in fact have an older version of ES installed on my local environment and that was causing the issue. 
I have been using purely `dev-L5` and it's been working great.", "title": null, "type": "comment" } ]
4
25
4,310
false
false
4,310
true
stcorp/coda
stcorp
194,593,452
17
null
[ { "action": "opened", "author": "svniemeijer", "comment_id": null, "datetime": 1481289087000, "masked_author": "username_0", "text": "It happens in practice that GRIB files contain a mix of GRIB1 and GRIB2 messages.\r\n\r\nTo support this we will have to:\r\n\r\n- [ ] Combine `coda_format_grib1` and `coda_format_grib2` into a single `coda_format_grib`\r\n- [ ] Introduce support for unions in the memory backend of CODA\r\n- [ ] Change the CODA type mapping of GRIB files from an array of records (with each record being a grib message) into an array of unions with each union having `grib1` and `grib2` fields pointing to the GRIB1 or GRIB2 record.", "title": "Allow GRIB files to contain combination of GRIB1 and GRIB2 messages", "type": "issue" }, { "action": "closed", "author": "svniemeijer", "comment_id": null, "datetime": 1482253237000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" } ]
1
2
505
false
false
505
false
spotify/HubFramework
spotify
182,774,963
75
{ "number": 75, "repo": "HubFramework", "user_login": "spotify" }
[ { "action": "opened", "author": "krris", "comment_id": null, "datetime": 1476361465000, "masked_author": "username_0", "text": "This PR resolves an issue https://github.com/spotify/HubFramework/issues/51", "title": "Image loader: Replace cache-check with timer", "type": "issue" }, { "action": "created", "author": "krris", "comment_id": 255058373, "datetime": 1476956357000, "masked_author": "username_0", "text": "@JohnSundell 🎾", "title": null, "type": "comment" }, { "action": "created", "author": "krris", "comment_id": 255079574, "datetime": 1476962691000, "masked_author": "username_0", "text": "@JohnSundell 🎾", "title": null, "type": "comment" } ]
3
5
625
false
true
103
false
scylladb/seastar
scylladb
168,726,598
176
null
[ { "action": "opened", "author": "tgrabiec", "comment_id": null, "datetime": 1470080895000, "masked_author": "username_0", "text": "HEAD = 0bcdd282c54d00651dd0518da91944a6ec97dab6 + the following diff:\r\n```diff\r\ndiff --git a/core/reactor.cc b/core/reactor.cc\r\nindex 55164df..4f68279 100644\r\n--- a/core/reactor.cc\r\n+++ b/core/reactor.cc\r\n@@ -534,13 +534,14 @@ void reactor_backend_epoll::complete_epoll_event(pollable_fd_state& pfd, promise\r\n auto pr = std::make_unique<promise<io_event>>();\r\n iocb io;\r\n prepare_io(io);\r\n- io.data = pr.get();\r\n+ auto f = pr->get_future();\r\n+ io.data = pr.release();\r\n _pending_aio.push_back(io);\r\n if ((_io_queue->queued_requests() > 0) ||\r\n (_pending_aio.size() >= std::min(max_aio / 4, _io_queue->_capacity / 2))) {\r\n flush_pending_aio();\r\n }\r\n- return pr.release()->get_future();\r\n+ return f;\r\n });\r\n }\r\n \r\n@@ -2864,7 +2865,8 @@ void engine_exit(std::exception_ptr eptr) {\r\n }\r\n \r\n void report_failed_future(std::exception_ptr eptr) {\r\n- report_exception(\"WARNING: exceptional future ignored\", eptr);\r\n+ abort();\r\n+// report_exception(\"WARNING: exceptional future ignored\", eptr);\r\n }\r\n \r\n future<> check_direct_io_support(sstring path) {\r\n```\r\n\r\n```\r\n#2 0x00000000004d25f9 in report_failed_future (eptr=...) 
at core/reactor.cc:2868\r\n#3 0x0000000000f845f6 in future<foreign_ptr<lw_shared_ptr<query::result> > >::~future (this=<optimized out>, __in_chrg=<optimized out>)\r\n at /home/centos/src/scylla/build/rpmbuild/BUILD/scylla-1.2.1/seastar/core/future.hh:772\r\n#4 do_void_futurize_apply<auto rpc::recv_helper<net::serializer, net::messaging_verb, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>, future<foreign_ptr<lw_shared_ptr<query::result> > >, query::read_command, range<dht::ring_position>, rpc::do_want_client_info>(rpc::signature<future<foreign_ptr<lw_shared_ptr<query::result> > > (query::read_command, range<dht::ring_position>)>, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>&&, rpc::do_want_client_info)::{lambda(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)#1}::operator()(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)::{lambda()#1}::operator()()::{lambda(future<foreign_ptr<lw_shared_ptr<query::result> > >)#1}, future<foreign_ptr<lw_shared_ptr<query::result> > > >(std::enable_if&&, auto rpc::recv_helper<net::serializer, net::messaging_verb, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>, future<foreign_ptr<lw_shared_ptr<query::result> > >, query::read_command, range<dht::ring_position>, rpc::do_want_client_info>(rpc::signature<future<foreign_ptr<lw_shared_ptr<query::result> > > (query::read_command, range<dht::ring_position>)>, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>&&, rpc::do_want_client_info)::{lambda(lw_shared_ptr<rpc::protocol<net::serializer, 
net::messaging_verb>::server::connection>, long, temporary_buffer<char>)#1}::operator()(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)::{lambda()#1}::operator()()::{lambda(future<foreign_ptr<lw_shared_ptr<query::result> > >)#1}) (\r\n func=func@entry=<unknown type in /usr/bin/scylla, CU 0xf30d9b3, DIE 0xfa67b4b>) at /home/centos/src/scylla/build/rpmbuild/BUILD/scylla-1.2.1/seastar/core/future.hh:1179\r\n#5 0x0000000000f8583e in futurize<void>::apply<auto rpc::recv_helper<net::serializer, net::messaging_verb, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>, future<foreign_ptr<lw_shared_ptr<query::result> > >, query::read_command, range<dht::ring_position>, rpc::do_want_client_info>(rpc::signature<future<foreign_ptr<lw_shared_ptr<query::result> > > (query::read_command, range<dht::ring_position>)>, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>&&, rpc::do_want_client_info)::{lambda(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)#1}::operator()(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)::{lambda()#1}::operator()()::{lambda(future<foreign_ptr<lw_shared_ptr<query::result> > >)#1}, future<foreign_ptr<lw_shared_ptr<query::result> > > >(auto rpc::recv_helper<net::serializer, net::messaging_verb, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>, future<foreign_ptr<lw_shared_ptr<query::result> > >, query::read_command, range<dht::ring_position>, rpc::do_want_client_info>(rpc::signature<future<foreign_ptr<lw_shared_ptr<query::result> > > (query::read_command, range<dht::ring_position>)>, 
std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>&&, rpc::do_want_client_info)::{lambda(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)#1}::operator()(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)::{lambda()#1}::operator()()::{lambda(future<foreign_ptr<lw_shared_ptr<query::result> > >)#1}&&, future<foreign_ptr<lw_shared_ptr<query::result> > >&&)\r\n (func=<optimized out>) at /home/centos/src/scylla/build/rpmbuild/BUILD/scylla-1.2.1/seastar/core/future.hh:1227\r\n#6 future<> future<foreign_ptr<lw_shared_ptr<query::result> > >::then_wrapped<auto rpc::recv_helper<net::serializer, net::messaging_verb, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>, future<foreign_ptr<lw_shared_ptr<query::result> > >, query::read_command, range<dht::ring_position>, rpc::do_want_client_info>(rpc::signature<future<foreign_ptr<lw_shared_ptr<query::result> > > (query::read_command, range<dht::ring_position>)>, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>&&, rpc::do_want_client_info)::{lambda(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)#1}::operator()(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)::{lambda()#1}::operator()()::{lambda(future<foreign_ptr<lw_shared_ptr<query::result> > >)#1}, future<> >(auto rpc::recv_helper<net::serializer, net::messaging_verb, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>, 
future<foreign_ptr<lw_shared_ptr<query::result> > >, query::read_command, range<dht::ring_position>, rpc::do_want_client_info>(rpc::signature<future<foreign_ptr<lw_shared_ptr<query::result> > > (query::read_command, range<dht::ring_position>)>, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>&&, rpc::do_want_client_info)::{lambda(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)#1}::operator()(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)::{lambda()#1}::operator()()::{lambda(future<foreign_ptr<lw_shared_ptr<query::result> > >)#1}&&)::{lambda(future<>)#1}::operator()<future_state<foreign_ptr<lw_shared_ptr<query::result> > > >(auto, future<>) (state=<unknown type in /usr/bin/scylla, CU 0xf30d9b3, DIE 0xfa6b004>, __closure=0x60704739cc20)\r\n at /home/centos/src/scylla/build/rpmbuild/BUILD/scylla-1.2.1/seastar/core/future.hh:909\r\n#7 continuation<future<> future<foreign_ptr<lw_shared_ptr<query::result> > >::then_wrapped<auto rpc::recv_helper<net::serializer, net::messaging_verb, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>, future<foreign_ptr<lw_shared_ptr<query::result> > >, query::read_command, range<dht::ring_position>, rpc::do_want_client_info>(rpc::signature<future<foreign_ptr<lw_shared_ptr<query::result> > > (query::read_command, range<dht::ring_position>)>, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>&&, rpc::do_want_client_info)::{lambda(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)#1}::operator()(lw_shared_ptr<rpc::protocol<net::serializer, 
net::messaging_verb>::server::connection>, long, temporary_buffer<char>)::{lambda()#1}::operator()()::{lambda(future<foreign_ptr<lw_shared_ptr<query::result> > >)#1}, future<> >(auto rpc::recv_helper<net::serializer, net::messaging_verb, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>, future<foreign_ptr<lw_shared_ptr<query::result> > >, query::read_command, range<dht::ring_position>, rpc::do_want_client_info>(rpc::signature<future<foreign_ptr<lw_shared_ptr<query::result> > > (query::read_command, range<dht::ring_position>)>, std::function<future<foreign_ptr<lw_shared_ptr<query::result> > > (rpc::client_info const&, query::read_command, range<dht::ring_position>)>&&, rpc::do_want_client_info)::{lambda(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)#1}::operator()(lw_shared_ptr<rpc::protocol<net::serializer, net::messaging_verb>::server::connection>, long, temporary_buffer<char>)::{lambda()#1}::operator()()::{lambda(future<foreign_ptr<lw_shared_ptr<query::result> > >)#1}&&)::{lambda(future<>)#1}, foreign_ptr<lw_shared_ptr<query::result> > >::run() (this=0x60704739cc00) at /home/centos/src/scylla/build/rpmbuild/BUILD/scylla-1.2.1/seastar/core/future.hh:402\r\n#8 0x00000000004d132e in reactor::run_tasks (this=this@entry=0x607000111000, tasks=...) 
at core/reactor.cc:1500\r\n#9 0x00000000004fbc6b in reactor::run (this=0x607000111000) at core/reactor.cc:1853\r\n#10 0x0000000000515c79 in smp::<lambda()>::operator()(void) const (__closure=0x600000085700) at core/reactor.cc:2640\r\n#11 0x00000000004ce24e in std::function<void ()>::operator()() const (this=<optimized out>) at /opt/scylladb/include/c++/5.3.1/functional:2271\r\n#12 dpdk_thread_adaptor (f=<optimized out>) at core/reactor.cc:2449\r\n#13 0x00000000006dac3b in eal_thread_loop ()\r\n---Type <return> to continue, or q <return> to quit---select-frame 6\r\n#14 0x00007ffa7ceefdc5 in start_thread (arg=0x7ffa718cd700) at pthread_create.c:308\r\n#15 0x00007ffa7cc1d21d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:113\r\n```", "title": "Failed future ignored somewhere inside rpc", "type": "issue" }, { "action": "closed", "author": "avikivity", "comment_id": null, "datetime": 1477382841000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
2
11,316
false
false
11,316
false
CompuMasterGmbH/cammIntegrationPortal
CompuMasterGmbH
167,823,888
13
null
[ { "action": "opened", "author": "jochenwezel", "comment_id": null, "datetime": 1469616008000, "masked_author": "username_0", "text": "Major improvements shall be\r\n\r\n- improved data integrity after deleting items sometimes left over some foreign-key-items\r\n- performance, especially for\r\n - navigation data lookup\r\n - IsUserAuthorized checks\r\n- Allow and Deny rules for memberships \r\n- Allow and Deny rules for authorizations\r\n- pave the way for (multiple) membership inheritance\r\n - inheritance of completely calculated membership sets, so that inheriting from a 2nd group with a deny rule for user A **doesn't** automatically deny user A being Allow-member of 1st group\r\n- pave the way for (multiple) authorizations inheritance\r\n - inheritance of all Allow and Deny rules, so that inheriting from a 2nd security object with a deny rule for user A **does** automatically deny user A being authorized of 1st security object\r\n- pave the way for splitting application objects into security objects + 0...n navigation items", "title": "Redesign of whole security concept", "type": "issue" }, { "action": "created", "author": "jochenwezel", "comment_id": 235553922, "datetime": 1469617259000, "masked_author": "username_0", "text": "What was the purpose of the server IDs with negative numbers and the dependencies of related code? Do they point to changed database structure? If yes, code update is required.", "title": null, "type": "comment" } ]
1
2
1,065
false
false
1,065
false
wso2/product-ei
wso2
217,208,804
382
null
[ { "action": "opened", "author": "milindaperera", "comment_id": null, "datetime": 1490612262000, "masked_author": "username_0", "text": "Relates #353", "title": "[Docs] Remove start-all.sh script information from EI Documentation", "type": "issue" }, { "action": "created", "author": "Nashaath", "comment_id": 289800624, "datetime": 1490713629000, "masked_author": "username_1", "text": "Removed information related to start-all.sh script from https://docs.wso2.com/display/EI610/Running+the+Product.", "title": null, "type": "comment" }, { "action": "closed", "author": "Nashaath", "comment_id": null, "datetime": 1490713629000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
3
124
false
false
124
false
retf/Boost.Application
null
56,913,286
35
{ "number": 35, "repo": "Boost.Application", "user_login": "retf" }
[ { "action": "opened", "author": "Heather", "comment_id": null, "datetime": 1423320040000, "masked_author": "username_0", "text": "", "title": "correct path_impl for NTDDI_VERSION >= 0x06010000", "type": "issue" }, { "action": "created", "author": "retf", "comment_id": 73368417, "datetime": 1423322520000, "masked_author": "username_1", "text": "tks for the fix", "title": null, "type": "comment" }, { "action": "created", "author": "Heather", "comment_id": 73368802, "datetime": 1423323165000, "masked_author": "username_0", "text": "yey!", "title": null, "type": "comment" } ]
3
5
581
false
true
19
false
kubernetes/kubernetes
kubernetes
149,542,540
24,486
null
[ { "action": "opened", "author": "a-robinson", "comment_id": null, "datetime": 1461090245000, "masked_author": "username_0", "text": "Its lack of coverage is quite sad, given how much logic it contains. It shouldn't be that much work to mock out the GCE service object and get significantly more coverage than what we currently have.", "title": "Write unit tests for the gce cloud provider package", "type": "issue" }, { "action": "created", "author": "spiffxp", "comment_id": 308879640, "datetime": 1497564600000, "masked_author": "username_1", "text": "/sig cluster-lifecycle\r\n/area platform/gce\r\nsince we lack a sig-gcp at the moment", "title": null, "type": "comment" } ]
4
6
1,269
false
true
280
false
DragonFlyBSD/DragonFlyBSD
DragonFlyBSD
44,809,459
2
{ "number": 2, "repo": "DragonFlyBSD", "user_login": "DragonFlyBSD" }
[ { "action": "opened", "author": "victoredwardocallaghan", "comment_id": null, "datetime": 1412347184000, "masked_author": "username_0", "text": "", "title": "Upstream fixes", "type": "issue" }, { "action": "created", "author": "jrmarino", "comment_id": 161682122, "datetime": 1449157566000, "masked_author": "username_1", "text": "The repository at GitHub is only a mirror (iow read-only). Pull requests are not supported.", "title": null, "type": "comment" } ]
2
2
92
false
false
92
false
Schamper/nodebb-plugin-shoutbox
null
72,175,381
76
null
[ { "action": "opened", "author": "kurt-stolle", "comment_id": null, "datetime": 1430403342000, "masked_author": "username_0", "text": "The plugin does not work with the latest version of NodeBB", "title": "Plugin does not work with 0.7.x", "type": "issue" }, { "action": "created", "author": "Aeternax", "comment_id": 106571916, "datetime": 1432841118000, "masked_author": "username_1", "text": "I have it running fine with 0.7.x.\r\nWhat errors are you getting?", "title": null, "type": "comment" }, { "action": "created", "author": "alesaint", "comment_id": 115686294, "datetime": 1435325340000, "masked_author": "username_2", "text": "On my side that's work but I can't remove and edit message since 0.7.0", "title": null, "type": "comment" }, { "action": "closed", "author": "Schamper", "comment_id": null, "datetime": 1448301116000, "masked_author": "username_3", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "Schamper", "comment_id": 159010453, "datetime": 1448301116000, "masked_author": "username_3", "text": "New version is compatible with 0.9.x.", "title": null, "type": "comment" } ]
4
5
229
false
false
229
false
langateam/sails-auth
langateam
142,634,818
139
null
[ { "action": "opened", "author": "aman-gautam", "comment_id": null, "datetime": 1458649743000, "masked_author": "username_0", "text": "Hi,\r\n\r\nIt's a general observation that this repo hasn't been too active in 2016. The number of open issues have crossed 50, the only code related commits have been the merging of pull requests, the dependencies are out of date. \r\n\r\nIs it still safe to use this project in a production system?\r\n\r\nBest\r\nAman", "title": "Is the project still active?", "type": "issue" }, { "action": "closed", "author": "tjwebb", "comment_id": null, "datetime": 1459105774000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "tjwebb", "comment_id": 202126352, "datetime": 1459105774000, "masked_author": "username_1", "text": "See https://github.com/langateam/sails-auth/issues/118.", "title": null, "type": "comment" } ]
2
3
361
false
false
361
false
stormpath/stormpath-sdk-android
stormpath
138,324,887
9
null
[ { "action": "opened", "author": "ericlw", "comment_id": null, "datetime": 1457046457000, "masked_author": "username_0", "text": "Several ways to store encrypted keys in android. Gets more secure in newer versions, conditionally add this logic for storing the refresh tokens.", "title": "Store keys in conditional ways for different android versions", "type": "issue" }, { "action": "created", "author": "ericlw", "comment_id": 213158705, "datetime": 1461281383000, "masked_author": "username_0", "text": "OpenSSL broken on Android N.\r\n\r\nStatically linked OpenSSL also broken.\r\n\r\nInvestigating....", "title": null, "type": "comment" }, { "action": "created", "author": "ericlw", "comment_id": 213573911, "datetime": 1461355875000, "masked_author": "username_0", "text": "Problem replacing deprecated Keygen method with the new one\r\n\r\nhttp://stackoverflow.com/questions/36802523/keypairgeneratorspec-replacement-with-keygenparameterspec-builder-equivalents", "title": null, "type": "comment" } ]
1
3
420
false
false
420
false
LLNL/spack
LLNL
195,795,181
2,599
null
[ { "action": "opened", "author": "citibeth", "comment_id": null, "datetime": 1481805424000, "masked_author": "username_0", "text": "Re-running `spack compiler find` does not find compilers that have been installed since the last time it was run. See:\r\n\r\nhttps://groups.google.com/forum/#!topic/spack/9r4CR9jmNjY", "title": "Re-running `spack compiler find`...", "type": "issue" }, { "action": "created", "author": "davydden", "comment_id": 267319074, "datetime": 1481805937000, "masked_author": "username_1", "text": "so the suggestion is to always update `compilers.yaml` with the new paths, in case Spack find the same compiler/version but in a new location now?", "title": null, "type": "comment" }, { "action": "created", "author": "adamjstewart", "comment_id": 267665166, "datetime": 1481913553000, "masked_author": "username_2", "text": "Just to be more clear, the issue isn't that `spack compiler find` doesn't find new compilers. It's that it doesn't update _existing_ Spack compilers with _newly added_ Fortran compilers. If you run `spack compiler remove gcc` and re-run `spack compiler find`, I suspect it will work as expected.\r\n\r\nThis is definitely a troublesome bug, and probably happens for a lot of macOS users.", "title": null, "type": "comment" }, { "action": "created", "author": "tgamblin", "comment_id": 267708240, "datetime": 1481926242000, "masked_author": "username_3", "text": "The idea with the current behavior is that it will not update an existing user configuration with a \"better\" compiler if the user doesn't want one. I suspect people actually want their Fortran compilers updated when they run `spack compiler find`.\r\n\r\nIs it sufficient to notice that some compilers are set to `None` and to update *those* if they are found?", "title": null, "type": "comment" }, { "action": "created", "author": "adamjstewart", "comment_id": 267708635, "datetime": 1481926375000, "masked_author": "username_2", "text": "I think that would be enough. The only thing I want to make sure is that it won't overwrite my `cflags` and friends.", "title": null, "type": "comment" }, { "action": "created", "author": "citibeth", "comment_id": 267708642, "datetime": 1481926379000, "masked_author": "username_0", "text": "`spack compiler find` could root around, find stuff, show it to the user,\nand then ask for direction on what to do.", "title": null, "type": "comment" }, { "action": "created", "author": "davydden", "comment_id": 267751734, "datetime": 1481965212000, "masked_author": "username_1", "text": "and then ask for direction on what to do.\r\n\r\nthat would also do, but I don't think it's necessary for the current problem and is also more complicated to implement. I would not bother asking a user when updating `None` to `something`, jus t confirm to him/her that this happened.", "title": null, "type": "comment" }, { "action": "created", "author": "citibeth", "comment_id": 267774273, "datetime": 1481994646000, "masked_author": "username_0", "text": "I agree, sounds good to me.", "title": null, "type": "comment" }, { "action": "created", "author": "aprokop", "comment_id": 268423009, "datetime": 1482289639000, "masked_author": "username_4", "text": "Got hit by that in #2342 when using a container with only gcc/g++ installed. Updated HEAD which included a newer variant of openmpi (which needs Fortran), installed gfortran, even blew away spack but the problem was only solved by removing `~/.spack/linux/complers.py`", "title": null, "type": "comment" }, { "action": "created", "author": "alalazo", "comment_id": 563222599, "datetime": 1575895605000, "masked_author": "username_5", "text": "Closing as it appears to be solved:\r\n```\r\n$ spack compilers\r\n==> Available compilers\r\n-- clang ubuntu18.04-x86_64 -------------------------------------\r\nclang@9.0.1 clang@8.0.0 clang@7.0.0 clang@6.0.1 clang@5.0.1 clang@4.0.1 clang@3.9.1\r\n\r\n-- gcc ubuntu18.04-x86_64 ---------------------------------------\r\ngcc@9.0.1 gcc@8.3.0 gcc@7.4.0 gcc@6.5.0 gcc@5.5.0 gcc@4.8\r\n\r\n$ spack compiler remove gcc@5.5.0\r\n==> Removed compiler gcc@5.5.0\r\n\r\n$ spack compiler find\r\n==> Added 1 new compiler to /home/culpo/.spack/linux/compilers.yaml\r\ngcc@5.5.0\r\n==> Compilers are defined in the following files:\r\n /home/culpo/.spack/linux/compilers.yaml\r\n```", "title": null, "type": "comment" }, { "action": "closed", "author": "alalazo", "comment_id": null, "datetime": 1575895606000, "masked_author": "username_5", "text": "", "title": null, "type": "issue" } ]
6
11
2,521
false
false
2,521
false
kubernetes/helm
kubernetes
182,370,096
1,342
null
[ { "action": "opened", "author": "triplem", "comment_id": null, "datetime": 1476218858000, "masked_author": "username_0", "text": "If I execute the above mentioned `helm dependencies update` in a chart, which contains a valid requirements.yaml, but without a chart-subdirectory, I do get an error (no such file or directory). If I put an invalid file in this directory (so that it is later stored in git), I do get an error as well, that this file is invalid (the file was called .do-not-delete). \r\n\r\nIs this on purpose?", "title": "charts-subdirectory when doing helm dependencies update is not created", "type": "issue" }, { "action": "created", "author": "technosophos", "comment_id": 253068951, "datetime": 1476225743000, "masked_author": "username_1", "text": "That was by design, but I could probably make it ignore files that start with `.` and `_`. I suppose calling a `mkdir` call would also be fine.", "title": null, "type": "comment" }, { "action": "closed", "author": "technosophos", "comment_id": null, "datetime": 1476300866000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
3
532
false
false
532
false
mozilla/rr
mozilla
126,965,831
1,620
null
[ { "action": "opened", "author": "dreiss", "comment_id": null, "datetime": 1452894794000, "masked_author": "username_0", "text": "I just watched one of Robert's presentations, and he mentioned that to replay asynchronous signals at the right time, you count conditional branches retired and record the full set of registers, to be compared during replay. This will fail if the code is running in a loop that has no conditional branches, and all register values are periodic. The example below uses an infinite loop (unconditional branch) and an in-place increment instruction (doesn't copy the data value into any registers). When replaying this program, the signal is generated on a very early iteration of the loop, and the output value is not the same as when the original program was run.\r\n\r\n\r\n```c\r\n#include <unistd.h>\r\n#include <signal.h>\r\n\r\n// Value that will be incremented in the main program\r\n// and printed from the signal handler.\r\nlong value;\r\n\r\nstatic void alarm_handler(int sig) {\r\n // Convert the value to a human-readable form.\r\n // The details are not important, but we must only use async-signal-safe\r\n // operations, and we must have a fixed number of conditional branches.\r\n long copy = value;\r\n char buf[64];\r\n int idx;\r\n for (idx = 0; idx < 16; idx++) {\r\n buf[idx] = 'a' + (copy & 0xf);\r\n copy >>= 4;\r\n }\r\n buf[idx++] = '\\n';\r\n // Write out to the user and exit.\r\n ssize_t ret = write(STDOUT_FILENO, buf, idx);\r\n _exit(ret != idx);\r\n}\r\n\r\nint main(int argc, char* argv[]) {\r\n // Request an async signal after 1 second.\r\n signal(SIGALRM, alarm_handler);\r\n alarm(1);\r\n\r\n // Infinite loop. No conditional branches.\r\n for (;;) {\r\n // This does not need to be an atomic operation,\r\n // but that was the easiest way to get gcc to emit code\r\n // that would add to value without changing register values.\r\n // Making value volatile resulted in a read-modify-write\r\n // (which leaves the value in a register),\r\n // and making this a normal increment resulted in\r\n // the increment being elided entirely.\r\n __sync_fetch_and_add(&value, 1);\r\n }\r\n\r\n return 0;\r\n}\r\n```", "title": "Divergence with infinite loops and async signals", "type": "issue" }, { "action": "created", "author": "rocallahan", "comment_id": 172138225, "datetime": 1452905463000, "masked_author": "username_1", "text": "You're right, this is a known limitation of rr's approach. Fortunately this\ndoesn't seem to cause problems in practice.", "title": null, "type": "comment" }, { "action": "closed", "author": "rocallahan", "comment_id": null, "datetime": 1454365155000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "rocallahan", "comment_id": 178225630, "datetime": 1454365155000, "masked_author": "username_1", "text": "I'm going to close this because this is basically a design limitation of rr that doesn't seem to cause problems in practice.", "title": null, "type": "comment" } ]
2
4
2,243
false
false
2,243
false
beltex/SMCKit
null
191,678,465
28
null
[ { "action": "opened", "author": "PKBeam", "comment_id": null, "datetime": 1480072187000, "masked_author": "username_0", "text": "Running this code:\r\n\r\n```\r\nimport Cocoa\r\nimport SMCKit\r\n\r\ndo {\r\n let fans = try SMCKit.fanCount()\r\n print(fans)\r\n} catch {\r\n print(\"Error\")\r\n}\r\n```\r\n\r\nPrints \"Error\". It happens with fan speed, fan names, etc. What am I doing wrong?", "title": "Xcode 8.1 - Can't return anything with SMCKit", "type": "issue" }, { "action": "created", "author": "beltex", "comment_id": 262976944, "datetime": 1480086961000, "masked_author": "username_1", "text": "Hello @username_0!\r\n\r\n`open()` must be called first to get a connection to the SMC driver (and make sure to call `close()` when your done!). See [SMCKitTool](https://github.com/username_1/SMCKit/blob/master/SMCKitTool/main.swift#L309-L314) for an example. `README`/docs should be updated to make that clear, since a number of folks have asked about this, (so my bad!).\r\n\r\nHope that helps!\r\n\r\n**P.S** New Macbook? Nice! :)", "title": null, "type": "comment" }, { "action": "created", "author": "PKBeam", "comment_id": 262983671, "datetime": 1480088899000, "masked_author": "username_0", "text": "Ah, I see. Thanks.", "title": null, "type": "comment" }, { "action": "closed", "author": "PKBeam", "comment_id": null, "datetime": 1480088900000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" } ]
2
4
672
false
false
672
true
algolia/instantsearch.js
algolia
184,243,274
1,450
{ "number": 1450, "repo": "instantsearch.js", "user_login": "algolia" }
[ { "action": "opened", "author": "spoeken", "comment_id": null, "datetime": 1476973245000, "masked_author": "username_0", "text": "", "title": "show search button on ios and close when user hits search", "type": "issue" }, { "action": "created", "author": "spoeken", "comment_id": 255127220, "datetime": 1476974768000, "masked_author": "username_0", "text": "Sorry. I ran a test now and realize the test needs to be fixed. I guess a new context has to be made since it's no longer wrapped in a div but a form element. Unfortunately I'm not too familiar with writing tests.", "title": null, "type": "comment" }, { "action": "created", "author": "spoeken", "comment_id": 255128688, "datetime": 1476975052000, "masked_author": "username_0", "text": "changing line 138 to `container = document.createElement('form');` and 145 to `const wrapper = container.querySelectorAll('form.ais-search-box')[0];` in `search-box-test.js`will make the test pass. But its not true that it wraps the input in a div.", "title": null, "type": "comment" }, { "action": "created", "author": "bobylito", "comment_id": 255165715, "datetime": 1476982880000, "masked_author": "username_1", "text": "Thanks for your contribution @username_0.\r\n\r\nI think that changing the root from `div` to `form` is alright but is still an unexpected practice for those doing only desktop web dev :) What do you think @username_2?", "title": null, "type": "comment" }, { "action": "created", "author": "vvo", "comment_id": 255301394, "datetime": 1477029976000, "masked_author": "username_2", "text": "Hi @username_0 could you walk us through the context of your pull request in more details? We understood it has a relation with mobile and searchbox but since we are not active on those subjects (yet), we need to understand your issue, what was not working and how what your propose is solving it.\r\n\r\nThanks a lot for contributing :)", "title": null, "type": "comment" }, { "action": "created", "author": "vvo", "comment_id": 255302834, "datetime": 1477030624000, "masked_author": "username_2", "text": "Also if you could provide maybe a reference link speaking about that mobile fix (stackoverflow/blog) that would help us commenting the code", "title": null, "type": "comment" }, { "action": "created", "author": "bobylito", "comment_id": 255322380, "datetime": 1477037953000, "masked_author": "username_1", "text": "@username_2 there is a discussion on the subject on SO http://stackoverflow.com/questions/4864167/show-search-button-in-iphone-ipad-safari-keyboard \r\nTruth is that this is for fixing a specific behavior on some browsers in specific version.", "title": null, "type": "comment" }, { "action": "created", "author": "spoeken", "comment_id": 255341778, "datetime": 1477043774000, "masked_author": "username_0", "text": "The reference @username_1 posted is a good one. \r\nAnd yes, this one is specifically for ios safari.\r\n\r\nThe first problem was that the keyboard was showing a `return` button instead of `search`. to fix this the input needs to be wrapped in a form that has an action, and the input type must be set to `search`. Since this is now a form with an action I needed to `preventDefault` on submit to prevent the page from reloading. \r\n\r\nThe second problem was that when I hit search, the keyboard would not disappear. To fix this I made it so `document.activeElement.blur()` is triggered on submit. We know that the active element is the input field since that's the only way to submit the form.", "title": null, "type": "comment" }, { "action": "created", "author": "vvo", "comment_id": 255744407, "datetime": 1477316730000, "masked_author": "username_2", "text": "There's one thing I am worried about if we do this change is that if a user already had a wrapping `form` then it will do this: `form > form > input`. Not sure this is an actual issue.\r\n\r\n@username_0 Do you think in the meantime you could fix this issue by creating your own searchbox?", "title": null, "type": "comment" }, { "action": "created", "author": "vvo", "comment_id": 256385269, "datetime": 1477496204000, "masked_author": "username_2", "text": "@username_0 Looking at it again, if you create a `<form>` yourself and instantiate the searchBox on a div inside it then you should be able to accomplish what you want here I think. Without the need to break the current implementation.", "title": null, "type": "comment" }, { "action": "created", "author": "spoeken", "comment_id": 256632210, "datetime": 1477572779000, "masked_author": "username_0", "text": "@username_2 Yes, I thought about that. But I would still need the input type to be set to `search`.", "title": null, "type": "comment" }, { "action": "created", "author": "vvo", "comment_id": 256632725, "datetime": 1477572922000, "masked_author": "username_2", "text": "I think if you target a container which is already an input, this should be ok", "title": null, "type": "comment" }, { "action": "created", "author": "vvo", "comment_id": 257584585, "datetime": 1478011439000, "masked_author": "username_2", "text": "@username_0 did you try my proposition to target an already existing input element? Thanks", "title": null, "type": "comment" }, { "action": "created", "author": "spoeken", "comment_id": 258889008, "datetime": 1478537000000, "masked_author": "username_0", "text": "@username_2 I'll try it out!", "title": null, "type": "comment" }, { "action": "created", "author": "vvo", "comment_id": 259102865, "datetime": 1478601440000, "masked_author": "username_2", "text": "Let's close this until we have a better overview/feedback. Thanks", "title": null, "type": "comment" }, { "action": "created", "author": "spoeken", "comment_id": 259118772, "datetime": 1478606386000, "masked_author": "username_0", "text": "This works!\r\n\r\nMarkup.\r\n`<form action=\"#\" id=\"ios-hide-keyboard-on-submit\" >\r\n <input id=\"search-box\" type=\"search\"/>\r\n </form>`\r\n`\r\nJs to prevent form from submitting and closing keyboard.\r\n`var iosForm = document.getElementById('ios-hide-keyboard-on-submit');\r\n iosForm.onsubmit = function(e){\r\n e.preventDefault();\r\n var inputField = document.activeElement;\r\n inputField.blur();\r\n };`\r\n\r\nAs it is now, it will trigger blur on desktop as well, which might not be wanted. Thank you for the good feedback and for this great application. Without you this would't be as smooth to make as it was: https://www.vingruppen.no/produkter", "title": null, "type": "comment" }, { "action": "created", "author": "vvo", "comment_id": 259123136, "datetime": 1478607809000, "masked_author": "username_2", "text": ":heart_eyes: Thanks for feedback too", "title": null, "type": "comment" } ]
4
18
3,732
false
true
3,604
true
Falkirks/SimpleWarp
null
120,090,103
16
{ "number": 16, "repo": "SimpleWarp", "user_login": "Falkirks" }
[ { "action": "opened", "author": "MCPEGamerJPatGitHub", "comment_id": null, "datetime": 1449115332000, "masked_author": "username_0", "text": "", "title": "Disabling FastTransfer support, this is not possible in 013", "type": "issue" }, { "action": "created", "author": "TahaTheHacker", "comment_id": 161663322, "datetime": 1449154562000, "masked_author": "username_1", "text": ":+1: Transfer packets are removed from MCPE", "title": null, "type": "comment" }, { "action": "created", "author": "MCPEGamerJPatGitHub", "comment_id": 161729050, "datetime": 1449164876000, "masked_author": "username_0", "text": "which is the most annoying part of 0.13 :/", "title": null, "type": "comment" }, { "action": "created", "author": "Falkirks", "comment_id": 161866585, "datetime": 1449201004000, "masked_author": "username_2", "text": "Isn't this going to be added back in a future release? I don't see the need to change the SimpleWarp API, this way the system is backwards compatible and compatible with the current versions. If there is an issue with warps being broken, just remove FastTransfer and it won't try to use them.", "title": null, "type": "comment" } ]
3
4
377
false
false
377
false
kubernetes/kubernetes
kubernetes
57,708,420
4,447
null
[ { "action": "opened", "author": "thockin", "comment_id": null, "datetime": 1423951886000, "masked_author": "username_0", "text": "In some places we expect an IP where a hostname might suffice. We should make a pass over the API and decide for each case and document/test it properly. As per #1300 the \"must be IP\" fields can self-document as net.IP. The hostname-or-IP fields we can think about decoding tricks.", "title": "API: analyze all \"IP\" fields and decide if hostname is valid", "type": "issue" }, { "action": "created", "author": "girishkalele", "comment_id": 236397692, "datetime": 1469923411000, "masked_author": "username_1", "text": "I guess this one is still open - #11838 was closed in favor of https://github.com/kubernetes/kubernetes/issues/13748 but that one is also still open.\r\n\r\nTo summarize my understanding, if an EndpointsAddress entry is added to a service endpoint that contains a string IP or hostname field, we respond with CNAME records ?\r\nCan CNAME and A records be mixed in a response ? i.e. if the service has both IP and hostname endpoints, can we serve a mixed CNAME and A record response ?", "title": null, "type": "comment" }, { "action": "created", "author": "thockin", "comment_id": 236499092, "datetime": 1470032800000, "masked_author": "username_0", "text": "#13748 will be worked on by @rata.\n\nYou can not mix CNAME and A records, and in fact you can not even have\nmultiple CNAME records.\n\nNeither of those are what this one is about, though. This is about\nauditing the API and documenting when a field *must* be an IP (and\nadding/tightening validation and tests in those cases) vs when a field may\nbe an IP or a hostname (and adding/loosening validation and tests in those\ncases).", "title": null, "type": "comment" }, { "action": "created", "author": "kargakis", "comment_id": 305931812, "datetime": 1496446444000, "masked_author": "username_2", "text": "@kubernetes/sig-network-misc", "title": null, "type": "comment" } ]
6
9
2,601
false
true
1,215
false
eligrey/FileSaver.js
null
176,472,631
263
null
[ { "action": "opened", "author": "fabikopa", "comment_id": null, "datetime": 1473711234000, "masked_author": "username_0", "text": "Hi,\r\ntoBlob is working for the elements added to my canvas but not backgrounds that I wrote in css.\r\nIs there a way to make save the background-image and background-color with the canvas?\r\n\r\nHere is the function:\r\n```javascript\r\n$(\"#savepng\").click(function(){\r\n coucou = document.getElementById(\"dodo\"), ctx = canvas.getContext(\"2d\");\r\n coucou.toBlob(function(blob){\r\n saveAs(blob, \"myToon.png\");\r\n });\r\n });\r\n```\r\n\r\nthankyou!\r\nFabi", "title": "not saving the background-image for the canvas", "type": "issue" }, { "action": "closed", "author": "jimmywarting", "comment_id": null, "datetime": 1473713510000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "jimmywarting", "comment_id": 246488880, "datetime": 1473713510000, "masked_author": "username_1", "text": "You need to paint the background image to the canvas instead of using css\r\n```javascript\r\nvar background = new Image();\r\nbackground.src = \"http://www.samskirrow.com/background.png\";\r\n\r\n// Make sure the image is loaded first otherwise nothing will draw.\r\nbackground.onload = function(){\r\n\tctx.drawImage(background,0,0); \r\n}​\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "fabikopa", "comment_id": 246517621, "datetime": 1473719847000, "masked_author": "username_0", "text": "thank you, but finally I did it with easelJS and JQuery\r\n```javascript\r\nvar backg = new createjs.Bitmap(\"myPath\");\r\n\r\n//in init()\r\n$(\"canvas\").on(\"load\",loadBackground());\r\n\r\n//and the function:\r\nfunction loadBackground(){\r\n stage.addChildAt(backg,0);\r\n}\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "fabikopa", "comment_id": 246517771, "datetime": 1473719885000, "masked_author": "username_0", "text": "instead of \"canvas\" goes the id of the canvas", "title": null, "type": "comment" }, { "action": "created", "author": "NKW-UA", "comment_id": 631383709, "datetime": 1589969805000, "masked_author": "username_2", "text": "what is it stage mean?", "title": null, "type": "comment" } ]
3
6
1,127
false
false
1,127
false
apache/spark
apache
227,585,259
17,933
{ "number": 17933, "repo": "spark", "user_login": "apache" }
[ { "action": "opened", "author": "ueshin", "comment_id": null, "datetime": 1494399890000, "masked_author": "username_0", "text": "## What changes were proposed in this pull request?\r\n\r\nBecause the method `TimeZone.getTimeZone(String ID)` is synchronized on the TimeZone class, concurrent call of this method will become a bottleneck.\r\nThis especially happens when casting from string value containing timezone info to timestamp value, which uses `DateTimeUtils.stringToTimestamp()` and gets TimeZone instance on the site.\r\n\r\nThis pr makes a cache of the generated TimeZone instances per thread to avoid the synchronization.\r\n\r\n## How was this patch tested?\r\n\r\nExisting tests.", "title": "[SPARK-20588][SQL] Cache TimeZone instances per thread.", "type": "issue" }, { "action": "created", "author": "viirya", "comment_id": 301358915, "datetime": 1494813801000, "masked_author": "username_1", "text": "LGTM", "title": null, "type": "comment" }, { "action": "created", "author": "gatorsmile", "comment_id": 301636373, "datetime": 1494892372000, "masked_author": "username_2", "text": "Thanks! Merging to master/2.2", "title": null, "type": "comment" } ]
5
19
3,718
false
true
578
false
flarum/core
flarum
106,542,791
485
null
[ { "action": "opened", "author": "tobscure", "comment_id": null, "datetime": 1442319284000, "masked_author": "username_0", "text": "", "title": "is:unread gambit not working", "type": "issue" }, { "action": "created", "author": "Luceos", "comment_id": 142736259, "datetime": 1443043941000, "masked_author": "username_1", "text": "@franzliedke or @username_0 just to make sure i understand the logic:\r\nhttps://github.com/flarum/core/blob/master/src/Core/Discussions/Search/Gambits/UnreadGambit.php#L55\r\n\r\nFrom what I can interpret, the problem lies with the fact that the $actor->read_time is very decisive. From my table I see this column remains `null`; can you verify this in the live database?", "title": null, "type": "comment" }, { "action": "closed", "author": "tobscure", "comment_id": null, "datetime": 1443049326000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "tobscure", "comment_id": 142755786, "datetime": 1443049357000, "masked_author": "username_0", "text": "Looked into it, turns out it was some bad logic in DiscussionRepository::getReadIds().", "title": null, "type": "comment" } ]
2
4
450
false
false
450
true
xu-li/cordova-plugin-wechat
null
131,212,696
157
null
[ { "action": "opened", "author": "hyitclj", "comment_id": null, "datetime": 1454550359000, "masked_author": "username_0", "text": "我在开放平台申请应用的时候,只填了ios端的,android端没填,现在android端支付报普通错误,ios端可以支付,什么问题啊?然后我把android端信息也填进去了,还是报普通错误,什么问题啊?求帮忙", "title": "报普通错误", "type": "issue" }, { "action": "created", "author": "xu-li", "comment_id": 179578091, "datetime": 1454552008000, "masked_author": "username_1", "text": "你的android能不能分享信息?如果不能,基本就是android的应用签名错了。", "title": null, "type": "comment" }, { "action": "created", "author": "hyitclj", "comment_id": 179589010, "datetime": 1454554103000, "masked_author": "username_0", "text": "签名不会错的,软件签的,要不你试试我的账号,看能不能付款?", "title": null, "type": "comment" }, { "action": "created", "author": "hyitclj", "comment_id": 179681726, "datetime": 1454569560000, "masked_author": "username_0", "text": "it's ok.Thanks", "title": null, "type": "comment" }, { "action": "created", "author": "xu-li", "comment_id": 179698999, "datetime": 1454572661000, "masked_author": "username_1", "text": "请问是什么错误?", "title": null, "type": "comment" }, { "action": "created", "author": "whistlebabystar", "comment_id": 179703067, "datetime": 1454573384000, "masked_author": "username_2", "text": "请问怎么解决的?我这里微信分享可以调用,但是支付的时候报普通错误", "title": null, "type": "comment" }, { "action": "created", "author": "xu-li", "comment_id": 179852731, "datetime": 1454594538000, "masked_author": "username_1", "text": "@username_2 你的支付参数的签名错了。看一下我的那个DEMO,或者看一下以前的几个issue。", "title": null, "type": "comment" }, { "action": "created", "author": "hyitclj", "comment_id": 180202814, "datetime": 1454649597000, "masked_author": "username_0", "text": "@username_1 @username_2 是审核??我什么都没动,前一天晚上不能用,然后我把android平台的填写了,第二天上午也不能用,然后到中午就可以用了。。。奇葩", "title": null, "type": "comment" }, { "action": "created", "author": "whistlebabystar", "comment_id": 180203037, "datetime": 1454649731000, "masked_author": "username_2", "text": "@username_1 请问哪些参数需要生成签名呢?", "title": null, "type": "comment" }, { "action": "created", "author": "whistlebabystar", "comment_id": 180205313, "datetime": 1454650087000, "masked_author": "username_2", "text": "@username_0 微信确实要审核一段时间,不过我用的是已上线的app的包名和签名及开发账号,所以不会存在这样的问题", "title": null, "type": "comment" }, { "action": "created", "author": "whistlebabystar", "comment_id": 180226444, "datetime": 1454655005000, "masked_author": "username_2", "text": "@username_1 搞定了,的确是参数签名的问题,弄了一个星期了,非常感谢!", "title": null, "type": "comment" }, { "action": "closed", "author": "xu-li", "comment_id": null, "datetime": 1455762328000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
3
12
486
false
false
486
true
crate/crate
crate
168,601,105
3,891
{ "number": 3891, "repo": "crate", "user_login": "crate" }
[ { "action": "opened", "author": "mfussenegger", "comment_id": null, "datetime": 1470042844000, "masked_author": "username_0", "text": "", "title": "Use SQLOperations in TransportSQL(Bulk)Action", "type": "issue" }, { "action": "created", "author": "mfussenegger", "comment_id": 236531650, "datetime": 1470043450000, "masked_author": "username_0", "text": "@username_1 @username_2 please review", "title": null, "type": "comment" }, { "action": "created", "author": "matriv", "comment_id": 236588129, "datetime": 1470059964000, "masked_author": "username_1", "text": "Through. @username_2 plz proceed with review", "title": null, "type": "comment" }, { "action": "created", "author": "seut", "comment_id": 236601911, "datetime": 1470062717000, "masked_author": "username_2", "text": "lgtm", "title": null, "type": "comment" } ]
3
4
69
false
false
69
true
openframeworks/openFrameworks
openframeworks
118,206,344
4,609
null
[ { "action": "opened", "author": "yasuhirohoshino", "comment_id": null, "datetime": 1448122711000, "masked_author": "username_0", "text": "I tried to load .hdr and .exr images using ofFloatImage. But R and B channel are swapped.\r\nI think the cause is 'ofImage.cpp's' line 136 - 143.\r\n```\r\n#ifdef TARGET_LITTLE_ENDIAN\r\nif(swapForLittleEndian){\r\n\tif(channels==3) pixFormat=OF_PIXELS_BGR;\r\n\tif(channels==4) pixFormat=OF_PIXELS_BGRA;\r\n}else{\r\n\tif(channels==3) pixFormat=OF_PIXELS_RGB;\r\n\tif(channels==4) pixFormat=OF_PIXELS_RGBA;\r\n}\r\n```\r\nUsing 'ofFloatImage::load()', 'swapForLittleEndian' is always true.", "title": "ofFloatImage swaps R and B channel of HDR Image", "type": "issue" }, { "action": "created", "author": "arturoc", "comment_id": 158661006, "datetime": 1448123979000, "masked_author": "username_1", "text": "can you post an example image? most image formats will have BGR format instead of RGB in little endian architectures (both intel and arm cpus are little endian) but probably the specific format you are using doesn't so we might need to check that instead of assuming that B and R are swapped by default", "title": null, "type": "comment" }, { "action": "created", "author": "yasuhirohoshino", "comment_id": 158662539, "datetime": 1448125681000, "masked_author": "username_0", "text": "I downloaded HDR images from this website. \r\nhttp://www.hdrlabs.com/sibl/archive.html\r\nI tried to load top rows images in the site.(Alexs Apartment, Arches PineTree, Barcelona Rooftops, Backetball Court)\r\nAnd this pack(studio033.exr and studio034.exr)\r\nhttp://zbyg.deviantart.com/art/HDRi-Pack-3-112847728", "title": null, "type": "comment" }, { "action": "closed", "author": "arturoc", "comment_id": null, "datetime": 1448128895000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "arturoc", "comment_id": 158667639, "datetime": 1448128916000, "masked_author": "username_1", "text": "there was a bug introduced with some changes in 0.9 should be fixed now", "title": null, "type": "comment" } ]
2
5
1,140
false
false
1,140
false
GoodwayGroup/intacct-api
GoodwayGroup
214,799,224
8
{ "number": 8, "repo": "intacct-api", "user_login": "GoodwayGroup" }
[ { "action": "opened", "author": "mrlannigan", "comment_id": null, "datetime": 1489689240000, "masked_author": "username_0", "text": "Addresses #7", "title": "Ability to create readMore controlFunctions from previous read queries", "type": "issue" } ]
2
2
12
false
true
12
false
nats-io/gnatsd
nats-io
142,305,249
227
{ "number": 227, "repo": "gnatsd", "user_login": "nats-io" }
[ { "action": "opened", "author": "piotrkowalczuk", "comment_id": null, "datetime": 1458555264000, "masked_author": "username_0", "text": "Logger is writing to os.Stderr not os.Stdout", "title": "README log output information fix", "type": "issue" }, { "action": "created", "author": "derekcollison", "comment_id": 199347588, "datetime": 1458575107000, "masked_author": "username_1", "text": "Writing to stderr is by design and when writing to the tty is the accepted unix standard.", "title": null, "type": "comment" } ]
3
3
426
false
true
133
false
Kitware/SMTK
Kitware
89,646,310
210
{ "number": 210, "repo": "SMTK", "user_login": "Kitware" }
[ { "action": "opened", "author": "johnkit", "comment_id": null, "datetime": 1434743376000, "masked_author": "username_0", "text": "The main issue is that the installed version of the __init__.py is getting written to the \"install/install/libexec/...\", that is, there are 2 \"install\" items in the path when there should be one.\r\n\r\nThis mod also restores the configure_file command that is applied to the __init__.py file\r\nthat is written to the install directory.\r\n\r\nNote that this change will affect superbuilds that use SMTK (which are probably already hosed by the extra \"install\" item in the path).", "title": "Fix install logic for smtk/__init__.py", "type": "issue" }, { "action": "created", "author": "johnkit", "comment_id": 114139310, "datetime": 1434984661000, "masked_author": "username_0", "text": "After a bit more thought, I think we should merge this commit as is. By removing CMAKE_INSTALL_PREFIX from line 155, the value for SMTK_PYTHON_MODULEDIR will *always* be a relative path, so that the downstream install command will act in a consistent way. Otherwise, developers who user SMTK will have to know that the python module gets installed differently depending on whether they specify CMAKE_INSTALL_PREFIX as a relative or absolute path.\r\n\r\nIf it helps, we *could* rename SMTK_PYTHON_MODULEDIR to something like SMTK_PYTHON_MODULE_RELATIVE_PATH; but personally, I think the variable name is long enough already :)", "title": null, "type": "comment" }, { "action": "created", "author": "vibraphone", "comment_id": 114660061, "datetime": 1435098736000, "masked_author": "username_1", "text": "@username_0 I guess I am OK with relative paths there.", "title": null, "type": "comment" }, { "action": "created", "author": "johnkit", "comment_id": 116854566, "datetime": 1435613901000, "masked_author": "username_0", "text": "Can this PR be merged? Our new intern Dylan is building CMB on Ubuntu and hitting the same problem.", "title": null, "type": "comment" } ]
2
4
1,243
false
false
1,243
true
jbevain/cecil
null
215,602,901
361
{ "number": 361, "repo": "cecil", "user_login": "jbevain" }
[ { "action": "opened", "author": "xen2", "comment_id": null, "datetime": 1490059633000, "masked_author": "username_0", "text": "Would it be OK that way?", "title": "Allow creation of AsyncMethodBodyDebugInformation with a null catch handler (encoded as offset -1 internally)", "type": "issue" }, { "action": "created", "author": "jbevain", "comment_id": 287950882, "datetime": 1490060345000, "masked_author": "username_1", "text": "I think it would be better if we had a parameterless constructor for this case. Also could you please follow the Mono coding guidelines, this way I could merge this as is without fixing it up :) Thanks!", "title": null, "type": "comment" }, { "action": "created", "author": "xen2", "comment_id": 287956530, "datetime": 1490062708000, "masked_author": "username_0", "text": "OK, force pushed with your proposed change.\r\nSorry for the Mono coding guideline, VS is fighting against me :)\r\n\r\nMaybe I could commit a Resharper setting file that have those rules out of the box?", "title": null, "type": "comment" }, { "action": "created", "author": "jbevain", "comment_id": 287956732, "datetime": 1490062784000, "masked_author": "username_1", "text": "Thanks! There's a DotSettings for R# in the root folder.", "title": null, "type": "comment" } ]
2
4
479
false
false
479
false
HIPS/autograd
HIPS
191,107,900
163
null
[ { "action": "opened", "author": "j-towns", "comment_id": null, "datetime": 1479844395000, "masked_author": "username_0", "text": "dtype('float64')\r\n```\r\n\r\nI'm pretty sure this is because the `1.0` value on [this line]( https://github.com/HIPS/autograd/blob/master/autograd/core.py#L49) defaults to a float64, so it should be pretty easy to properly support `float32` by testing the type of the value of the endnode and copying that type.", "title": "Support for numpy.float32 dtype", "type": "issue" }, { "action": "closed", "author": "mattjj", "comment_id": null, "datetime": 1482072742000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
2
307
false
false
307
false
mbj4668/pyang
null
126,055,760
195
null
[ { "action": "opened", "author": "mpainenz", "comment_id": null, "datetime": 1452552627000, "masked_author": "username_0", "text": "When trying to convert a large YANG model to XML, I get \"too many files to convert\". (16 Files)\r\n\r\nWhen attempting to do each file at once, most error out due to missing dependencies that are located in the other files. (error: module \"missing-module-name\" not found in search path)", "title": "YIN conversion with large model fails with \"too many files to convert\"", "type": "issue" }, { "action": "created", "author": "mbj4668", "comment_id": 486636500, "datetime": 1556192216000, "masked_author": "username_1", "text": "The yin converter can only convert one file at the time. Add flag -p to add to the search path.", "title": null, "type": "comment" }, { "action": "closed", "author": "mbj4668", "comment_id": null, "datetime": 1556192216000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
3
378
false
false
378
false
mozilla/serviceworker-cookbook
mozilla
204,193,877
268
null
[ { "action": "opened", "author": "perrin4869", "comment_id": null, "datetime": 1485832176000, "masked_author": "username_0", "text": "This is a question regarding all the push cookbooks. In all of them, we subscribe to push subscriptions from within the webpage, more or less as follows:\r\n\r\n```js\r\nnavigator.serviceWorker.register('service-worker.js')\r\n.then(function(registration) {\r\n // Use the PushManager to get the user's subscription to the push service.\r\n return registration.pushManager.getSubscription()\r\n .then(function(subscription) {\r\n // If a subscription was found, return it.\r\n if (subscription) {\r\n return subscription;\r\n }\r\n\r\n // Otherwise, subscribe the user (userVisibleOnly allows to specify that we don't plan to\r\n // send notifications that don't have a visible effect for the user).\r\n return registration.pushManager.subscribe({ userVisibleOnly: true });\r\n });\r\n}).then(function(subscription) {\r\n // send to server...\r\n});\r\n```\r\n\r\nCouldn't the same code be included inside the service worker's activation or installation phase, and wouldn't it make more sense to confine it to it?\r\n\r\n```js\r\nself.addEventListener('activate', function(event) {\r\n event.waitUntil(\r\n registration.pushManager.getSubscription()\r\n .then(function(subscription) {\r\n // If a subscription was found, return it.\r\n if (subscription) {\r\n return subscription;\r\n }\r\n\r\n return registration.pushManager.subscribe({ userVisibleOnly: true });\r\n })\r\n .then(function(subscription) {\r\n // send subscription to server...\r\n return self.clients.claim();\r\n }));\r\n});\r\n```", "title": "Why not subscribe to push subscriptions within the service worker's onactivate event?", "type": "issue" }, { "action": "created", "author": "delapuente", "comment_id": 282702803, "datetime": 1488197182000, "masked_author": "username_1", "text": "I can not imagine if there is a good reason to do this. Perhaps to reduce client/service worker messages but maybe @username_2 can answer better.", "title": null, "type": "comment" }, { "action": "created", "author": "marco-c", "comment_id": 282714255, "datetime": 1488200764000, "masked_author": "username_2", "text": "Honestly, I don't see any difference in the two approaches. @username_0 why do you think it would make more sense?", "title": null, "type": "comment" }, { "action": "created", "author": "perrin4869", "comment_id": 284233427, "datetime": 1488725988000, "masked_author": "username_0", "text": "Well, there are 2 reasons I think it makes more sense:\r\n\r\n1) Purely semantic reason. Seems like all code related to push notifications belongs in service workers, which are the only ones that can intercept these notifications in any case. Placing the code outside of service workers just feels out of place.\r\n\r\n2) If we subscribe outside of service workers, we are submitting a subscription to the server every time the page is loaded, even if the subscription hasn't changed. Not a big deal, but it's a small detail.", "title": null, "type": "comment" }, { "action": "created", "author": "nitsas", "comment_id": 369956364, "datetime": 1520005241000, "masked_author": "username_3", "text": "I agree with perrin4849 that all code about push notifications seems to belong in the service worker itself.\r\n\r\nUnfortunately, subscriptions don't stay the same for ever. When we call `subscribe()`, the browser talks with a push service to get a push subscription. I don't think the push service guarantees that it'll always return the same subscription. So we have to `subscribe()` again every now and then, just to be safe.\r\n\r\nThe service worker's `activate` event fires only once after the worker updates, installs and takes the place of an old worker. So it doesn't seem like a good place to resubscribe. If the worker never gets a new update, we'll never resubscribe to push notifications.", "title": null, "type": "comment" }, { "action": "created", "author": "perrin4869", "comment_id": 370106937, "datetime": 1520042451000, "masked_author": "username_0", "text": "@username_3 that's why you'd subscribe to the 'pushsubscriptionchange` event\r\n\r\n```\r\nself.addEventListener('pushsubscriptionchange', () => (registerPushSubscription()));\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "pazguille", "comment_id": 371309931, "datetime": 1520462125000, "masked_author": "username_4", "text": "Hi!\r\n\r\nI am trying to subscribe inside a service worker but I got an error:\r\n\r\n“ERROR DOMException: Registration failed - missing applicationServerKey, and gcm_sender_id not found in manifest”\r\n\r\nWhen I subscribe to push outside service worker every works fine.\r\n\r\nWhy it happend?", "title": null, "type": "comment" }, { "action": "created", "author": "marco-c", "comment_id": 371311236, "datetime": 1520462442000, "masked_author": "username_2", "text": "@username_4 I guess you haven't set a `applicationServerKey` in your call to `subscribe`?", "title": null, "type": "comment" }, { "action": "created", "author": "pazguille", "comment_id": 371313590, "datetime": 1520462978000, "masked_author": "username_4", "text": "Hi @username_2. In this case I'm using a `manifest.json` with a valid `gcm_sender_id` value. Is it necessary to set a `applicationServerKey`?", "title": null, "type": "comment" }, { "action": "created", "author": "marco-c", "comment_id": 371314302, "datetime": 1520463152000, "masked_author": "username_2", "text": "No, it shouldn't be. I would suggest it since it's the standard now and `gcm_sender_id` is probably deprecated.\r\nIt seems like a bug in Chrome, but I wouldn't expect them to fix it since the new way of doing things is VAPID.", "title": null, "type": "comment" }, { "action": "created", "author": "pazguille", "comment_id": 371320304, "datetime": 1520464611000, "masked_author": "username_4", "text": "@username_2 you are right! Unfortunately is a bug in Chrome: https://bugs.chromium.org/p/chromium/issues/detail?id=597317&can=1&q=gcm_sender_id&colspec=ID%20Pri%20M%20Stars%20ReleaseBlock%20Component%20Status%20Owner%20Summary%20OS%20Modified\r\n\r\nThank you!", "title": null, "type": "comment" }, { "action": "created", "author": "h43z", "comment_id": 469764476, "datetime": 1551805470000, "masked_author": "username_5", "text": "Subscribing within a service worker will not request the permission to show notification (like when done from regular javascript file) and instantly throws an exception.\r\n\r\n`Registration failed - permission denied`", "title": null, "type": "comment" } ]
6
12
4,310
false
false
4,310
true
tensorflow/tensorflow
tensorflow
158,692,784
2,686
{ "number": 2686, "repo": "tensorflow", "user_login": "tensorflow" }
[ { "action": "opened", "author": "Undo1", "comment_id": null, "datetime": 1465223634000, "masked_author": "username_0", "text": "Missing backtick was causing unhappiness.", "title": "Fix formatting issue", "type": "issue" }, { "action": "created", "author": "Undo1", "comment_id": 223977658, "datetime": 1465223723000, "masked_author": "username_0", "text": "@googlebot I signed it!", "title": null, "type": "comment" }, { "action": "created", "author": "girving", "comment_id": 224019156, "datetime": 1465232133000, "masked_author": "username_1", "text": "@username_0: Thank you for the fix! Backticks are depressing hard to see. :)", "title": null, "type": "comment" } ]
4
6
1,014
false
true
137
true
LeaVerou/bliss
null
120,852,012
36
{ "number": 36, "repo": "bliss", "user_login": "LeaVerou" }
[ { "action": "opened", "author": "zdfs", "comment_id": null, "datetime": 1449517589000, "masked_author": "username_0", "text": "This if for #35. Can be rejected, but I don’t think we need to call\nChrome by default for the build. Developers can add it to their\nenvironment at any time.", "title": "#35 Remove Chrome from karma.config", "type": "issue" }, { "action": "created", "author": "zdfs", "comment_id": 162677814, "datetime": 1449525389000, "masked_author": "username_0", "text": "WE ARE GREEN, PEOPLE.", "title": null, "type": "comment" }, { "action": "created", "author": "LeaVerou", "comment_id": 162678283, "datetime": 1449525476000, "masked_author": "username_1", "text": "I’m fine with removing it, but I’d like to check with @dperrymorrow first, since he put it in.", "title": null, "type": "comment" }, { "action": "created", "author": "zdfs", "comment_id": 162678707, "datetime": 1449525548000, "masked_author": "username_0", "text": "Anyone have an idea why the build skips a test? Same thing happens when running `npm test`.", "title": null, "type": "comment" }, { "action": "created", "author": "zdfs", "comment_id": 162678865, "datetime": 1449525580000, "masked_author": "username_0", "text": "@dperrymorrow signed off here: https://github.com/username_1/bliss/issues/35", "title": null, "type": "comment" }, { "action": "created", "author": "LeaVerou", "comment_id": 162679282, "datetime": 1449525655000, "masked_author": "username_1", "text": "Yeah, I just saw! Also, yay for being green, finally! Wheee!\r\nWhat is `.idea`? It’s added in the gitignore…", "title": null, "type": "comment" }, { "action": "created", "author": "zdfs", "comment_id": 162679711, "datetime": 1449525738000, "masked_author": "username_0", "text": "Sorry. It was for my webstorm IDE. We can remove it, if you'd like. I switched to atom.", "title": null, "type": "comment" }, { "action": "created", "author": "LeaVerou", "comment_id": 162679854, "datetime": 1449525766000, "masked_author": "username_1", "text": "Yeah, that looks like something that should be in your global gitignore :)", "title": null, "type": "comment" }, { "action": "created", "author": "zdfs", "comment_id": 162680525, "datetime": 1449525889000, "masked_author": "username_0", "text": "Noted. I'll update.", "title": null, "type": "comment" }, { "action": "created", "author": "zdfs", "comment_id": 162681706, "datetime": 1449526105000, "masked_author": "username_0", "text": "All done.", "title": null, "type": "comment" }, { "action": "created", "author": "zdfs", "comment_id": 162685131, "datetime": 1449526739000, "masked_author": "username_0", "text": "I found why it was only running 15 of the 16 tests. Not my code, so I'll leave it as is. There an `xit` in `setSpec.js`.", "title": null, "type": "comment" } ]
2
11
852
false
false
852
true
AzureAD/azure-activedirectory-library-for-dotnet
AzureAD
152,647,441
382
{ "number": 382, "repo": "azure-activedirectory-library-for-dotnet", "user_login": "AzureAD" }
[ { "action": "opened", "author": "abhishek58g", "comment_id": null, "datetime": 1462223770000, "masked_author": "username_0", "text": "Reverts AzureAD/azure-activedirectory-library-for-dotnet#377", "title": "Revert \"ADAL v2 :- CredScan CleanUp\"", "type": "issue" } ]
2
2
400
false
true
60
false
dzenbot/DZNEmptyDataSet
null
168,454,733
287
null
[ { "action": "opened", "author": "florinzf", "comment_id": null, "datetime": 1469863619000, "masked_author": "username_0", "text": "In my table view setup, I also use AMScrollingNavbar*. This makes my navigation bar scroll up when scrolling the table view. After using DZNEmptyDataSet, whenever my navigation bar is scrolling, it becomes black. Any idea on how to deal with this? I'd hate to let go one of the two libraries.\r\n\r\n\r\n\r\n[*] - https://github.com/andreamazz/AMScrollingNavbar\r\n\r\n\r\n![navigationbar - normal](https://cloud.githubusercontent.com/assets/15258293/17268847/033999ec-5640-11e6-99e4-d3afc8806a62.jpg)\r\n\r\n![navigationbar - when scrolling](https://cloud.githubusercontent.com/assets/15258293/17268846/031405f6-5640-11e6-9fce-c4b829398164.jpg)", "title": "Nav Bar gets black", "type": "issue" }, { "action": "created", "author": "florinzf", "comment_id": 239819464, "datetime": 1471272040000, "masked_author": "username_0", "text": "Thanks for the answer.\r\nNo, this happens after displaying an empty view on the same table view. Once an empty view is shown, and then I populate that table, the black bar appears when scrolling.", "title": null, "type": "comment" } ]
2
3
1,022
false
true
821
false
pinceladasdaweb/Socialight
null
154,915,924
6
null
[ { "action": "opened", "author": "adanylov", "comment_id": null, "datetime": 1463328332000, "masked_author": "username_0", "text": "Linkedin count gives 400 error", "title": "Linkedin stop working", "type": "issue" }, { "action": "closed", "author": "pinceladasdaweb", "comment_id": null, "datetime": 1463333128000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "pinceladasdaweb", "comment_id": 219298726, "datetime": 1463333128000, "masked_author": "username_1", "text": "Linkedin does not accept numbers in the jsonp callback function. I created a function to generate a dynamic string in this situation. Thanks.", "title": null, "type": "comment" } ]
2
3
171
false
false
171
false
damieng/Linq.Translations
null
6,187,022
2
null
[ { "action": "opened", "author": "AlexKeySmith", "comment_id": null, "datetime": 1344853520000, "masked_author": "username_0", "text": "Hi,\nIs it possible to use translations for navigation properties?\n\nI'm using foreign keys in dbcontext, and have succesfully overriden the FK value via an expression. But I cannot think of a way of overriding the actual navigation?\n\n-thanks\nAlex.", "title": "Is it possible to use translations for navigation properties?", "type": "issue" }, { "action": "closed", "author": "damieng", "comment_id": null, "datetime": 1445035022000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
2
246
false
false
246
false
pimentel/atom-r-exec
null
191,411,056
36
null
[ { "action": "opened", "author": "lancasterthethird", "comment_id": null, "datetime": 1479948882000, "masked_author": "username_0", "text": "Instead of sending the R code outside of Atom, is it possible to use a built-in console (as in Juno) to send the code to? A workspace with all the variables and a plotting area, which are also features in Juno (and that hopefully you can leverage!) would be awesome. Indeed, this would put Atom on par with (or really, beyond) RStudio. Thanks for the great package.", "title": "Send code to console within atom", "type": "issue" }, { "action": "created", "author": "hugeme", "comment_id": 263088305, "datetime": 1480197426000, "masked_author": "username_1", "text": "I had a similar question: https://github.com/username_2/atom-r-exec/issues/32\r\n\r\nThat feature would be awesome and incite more data lovers to use atom!", "title": null, "type": "comment" }, { "action": "created", "author": "pimentel", "comment_id": 263100582, "datetime": 1480218367000, "masked_author": "username_2", "text": "This is definitely next in the queue. Thanks for the tip on Juno. Looks like a great package and might prove helpful!\r\n\r\nI'll keep you folks posted.", "title": null, "type": "comment" }, { "action": "created", "author": "rdahis", "comment_id": 316090461, "datetime": 1500389747000, "masked_author": "username_3", "text": "+1 to R console within Atom!", "title": null, "type": "comment" } ]
4
4
690
false
false
690
true
tabulapdf/tabula-java
tabulapdf
112,619,544
38
null
[ { "action": "opened", "author": "jazzido", "comment_id": null, "datetime": 1445442389000, "masked_author": "username_0", "text": "We're stable enough to do a release to [OSSRH](http://central.sonatype.org/pages/ossrh-guide.html)", "title": "Release to Maven Central", "type": "issue" }, { "action": "created", "author": "jazzido", "comment_id": 149940355, "datetime": 1445442678000, "masked_author": "username_0", "text": "I've just posted a request to SonaType ([OSSRH-18411](https://issues.sonatype.org/browse/OSSRH-18411)", "title": null, "type": "comment" }, { "action": "closed", "author": "jazzido", "comment_id": null, "datetime": 1445722301000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "jazzido", "comment_id": 200133251, "datetime": 1458699133000, "masked_author": "username_0", "text": "For reference, here's how to push a release to Sonatype (needs credentials):\r\n\r\n```\r\n# set proper version in pom.xml and run:\r\nmvn clean deploy -Dmaven.test.skip=true\r\n```", "title": null, "type": "comment" } ]
1
4
370
false
false
370
false
jonahoffline/filepreviews-ruby
null
92,189,178
8
null
[ { "action": "opened", "author": "haggen", "comment_id": null, "datetime": 1435695414000, "masked_author": "username_0", "text": "Just by calling `Filepreviews.generate` I'm getting `JSON::ParserError`; see the trace below.\r\n\r\n```\r\njson (1.8.2) lib/json/common.rb:155:in `initialize'\r\njson (1.8.2) lib/json/common.rb:155:in `new'\r\njson (1.8.2) lib/json/common.rb:155:in `parse'\r\nfilepreviews (1.2.0) lib/filepreviews/http.rb:69:in `parse'\r\nfilepreviews (1.2.0) lib/filepreviews/http.rb:62:in `fetch'\r\nfilepreviews (1.2.0) lib/filepreviews.rb:20:in `generate'\r\nactionpack (3.2.21) lib/action_view/template.rb:145:in `block in render'\r\nactivesupport (3.2.21) lib/active_support/notifications.rb:125:in `instrument'\r\nactionpack (3.2.21) lib/action_view/template.rb:143:in `render'\r\nactionpack (3.2.21) lib/action_view/renderer/template_renderer.rb:47:in `block (2 levels) in render_template'\r\nactionpack (3.2.21) lib/action_view/renderer/abstract_renderer.rb:38:in `block in instrument'\r\nactivesupport (3.2.21) lib/active_support/notifications.rb:123:in `block in instrument'\r\nactivesupport (3.2.21) lib/active_support/notifications/instrumenter.rb:20:in `instrument'\r\nactivesupport (3.2.21) lib/active_support/notifications.rb:123:in `instrument'\r\nactionpack (3.2.21) lib/action_view/renderer/abstract_renderer.rb:38:in `instrument'\r\nactionpack (3.2.21) lib/action_view/renderer/template_renderer.rb:46:in `block in render_template'\r\nactionpack (3.2.21) lib/action_view/renderer/template_renderer.rb:54:in `render_with_layout'\r\nactionpack (3.2.21) lib/action_view/renderer/template_renderer.rb:45:in `render_template'\r\nactionpack (3.2.21) lib/action_view/renderer/template_renderer.rb:18:in `render'\r\nactionpack (3.2.21) lib/action_view/renderer/renderer.rb:36:in `render_template'\r\nactionpack (3.2.21) lib/action_view/renderer/renderer.rb:17:in `render'\r\nactionpack (3.2.21) lib/abstract_controller/rendering.rb:110:in `_render_template'\r\nactionpack (3.2.21) lib/action_controller/metal/streaming.rb:225:in `_render_template'\r\nactionpack (3.2.21) lib/abstract_controller/rendering.rb:103:in `render_to_body'\r\nactionpack (3.2.21) lib/action_controller/metal/renderers.rb:28:in `render_to_body'\r\nactionpack (3.2.21) lib/action_controller/metal/compatibility.rb:50:in `render_to_body'\r\nactionpack (3.2.21) lib/abstract_controller/rendering.rb:88:in `render'\r\nactionpack (3.2.21) lib/action_controller/metal/rendering.rb:16:in `render'\r\nactionpack (3.2.21) lib/action_controller/metal/instrumentation.rb:40:in `block (2 levels) in render'\r\nactivesupport (3.2.21) lib/active_support/core_ext/benchmark.rb:5:in `block in ms'\r\n/usr/local/lib/ruby/2.2.0/benchmark.rb:303:in `realtime'\r\nactivesupport (3.2.21) lib/active_support/core_ext/benchmark.rb:5:in `ms'\r\nactionpack (3.2.21) lib/action_controller/metal/instrumentation.rb:40:in `block in render'\r\nactionpack (3.2.21) lib/action_controller/metal/instrumentation.rb:83:in `cleanup_view_runtime'\r\nactionpack (3.2.21) lib/action_controller/metal/instrumentation.rb:39:in `render'\r\nactionpack (3.2.21) lib/action_controller/metal/implicit_render.rb:10:in `default_render'\r\nactionpack (3.2.21) lib/action_controller/metal/implicit_render.rb:5:in `send_action'\r\nactionpack (3.2.21) lib/abstract_controller/base.rb:167:in `process_action'\r\nactionpack (3.2.21) lib/action_controller/metal/rendering.rb:10:in `process_action'\r\nactionpack (3.2.21) lib/abstract_controller/callbacks.rb:18:in `block in process_action'\r\nactivesupport (3.2.21) lib/active_support/callbacks.rb:436:in `_run__1199963992045643328__process_action__354792127446437228__callbacks'\r\nactivesupport (3.2.21) lib/active_support/callbacks.rb:405:in `__run_callback'\r\nactivesupport (3.2.21) lib/active_support/callbacks.rb:385:in `_run_process_action_callbacks'\r\nactivesupport (3.2.21) lib/active_support/callbacks.rb:81:in `run_callbacks'\r\nactionpack (3.2.21) lib/abstract_controller/callbacks.rb:17:in `process_action'\r\nactionpack (3.2.21) lib/action_controller/metal/rescue.rb:29:in `process_action'\r\nactionpack (3.2.21) lib/action_controller/metal/instrumentation.rb:30:in `block in process_action'\r\nactivesupport (3.2.21) lib/active_support/notifications.rb:123:in `block in instrument'\r\nactivesupport (3.2.21) lib/active_support/notifications/instrumenter.rb:20:in `instrument'\r\nactivesupport (3.2.21) lib/active_support/notifications.rb:123:in `instrument'\r\nactionpack (3.2.21) lib/action_controller/metal/instrumentation.rb:29:in `process_action'\r\nactionpack (3.2.21) lib/action_controller/metal/params_wrapper.rb:207:in `process_action'\r\nactionpack (3.2.21) lib/abstract_controller/base.rb:121:in `process'\r\nactionpack (3.2.21) lib/abstract_controller/rendering.rb:45:in `process'\r\nactionpack (3.2.21) lib/action_controller/metal.rb:203:in `dispatch'\r\nactionpack (3.2.21) lib/action_controller/metal/rack_delegation.rb:14:in `dispatch'\r\nactionpack (3.2.21) lib/action_controller/metal.rb:246:in `block in action'\r\nactionpack (3.2.21) lib/action_dispatch/routing/route_set.rb:73:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/routing/route_set.rb:73:in `dispatch'\r\nactionpack (3.2.21) lib/action_dispatch/routing/route_set.rb:36:in `call'\r\njourney (1.0.4) lib/journey/router.rb:68:in `block in call'\r\njourney (1.0.4) lib/journey/router.rb:56:in `each'\r\njourney (1.0.4) lib/journey/router.rb:56:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/routing/route_set.rb:608:in `call'\r\nmongoid (3.1.6) lib/rack/mongoid/middleware/identity_map.rb:34:in `block in call'\r\nmongoid (3.1.6) lib/mongoid/unit_of_work.rb:39:in `unit_of_work'\r\nmongoid (3.1.6) lib/rack/mongoid/middleware/identity_map.rb:34:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/best_standards_support.rb:17:in `call'\r\nrack (1.4.5) lib/rack/etag.rb:23:in `call'\r\nrack (1.4.5) lib/rack/conditionalget.rb:25:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/head.rb:14:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/params_parser.rb:21:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/flash.rb:242:in `call'\r\nrack (1.4.5) lib/rack/session/abstract/id.rb:210:in `context'\r\nrack (1.4.5) lib/rack/session/abstract/id.rb:205:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/cookies.rb:341:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/callbacks.rb:28:in `block in call'\r\nactivesupport (3.2.21) lib/active_support/callbacks.rb:405:in `_run__2601463367027478767__call__4555912327375473480__callbacks'\r\nactivesupport (3.2.21) lib/active_support/callbacks.rb:405:in `__run_callback'\r\nactivesupport (3.2.21) lib/active_support/callbacks.rb:385:in `_run_call_callbacks'\r\nactivesupport (3.2.21) lib/active_support/callbacks.rb:81:in `run_callbacks'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/callbacks.rb:27:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/reloader.rb:65:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/remote_ip.rb:31:in `call'\r\nbetter_errors (2.1.1) lib/better_errors/middleware.rb:59:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/debug_exceptions.rb:16:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/show_exceptions.rb:56:in `call'\r\nrailties (3.2.21) lib/rails/rack/logger.rb:32:in `call_app'\r\nrailties (3.2.21) lib/rails/rack/logger.rb:16:in `block in call'\r\nactivesupport (3.2.21) lib/active_support/tagged_logging.rb:22:in `tagged'\r\nrailties (3.2.21) lib/rails/rack/logger.rb:16:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/request_id.rb:22:in `call'\r\nrack (1.4.5) lib/rack/methodoverride.rb:21:in `call'\r\nrack (1.4.5) lib/rack/runtime.rb:17:in `call'\r\nactivesupport (3.2.21) lib/active_support/cache/strategy/local_cache.rb:72:in `call'\r\nrack (1.4.5) lib/rack/lock.rb:15:in `call'\r\nactionpack (3.2.21) lib/action_dispatch/middleware/static.rb:83:in `call'\r\nrack-timeout (0.2.0) lib/rack/timeout.rb:108:in `call'\r\nrailties (3.2.21) lib/rails/engine.rb:484:in `call'\r\nrailties (3.2.21) lib/rails/application.rb:231:in `call'\r\nrailties (3.2.21) lib/rails/railtie/configurable.rb:30:in `method_missing'\r\nrack (1.4.5) lib/rack/deflater.rb:13:in `call'\r\nrack-cors (0.3.1) lib/rack/cors.rb:72:in `call'\r\nrack (1.4.5) lib/rack/content_length.rb:14:in `call'\r\nrailties (3.2.21) lib/rails/rack/log_tailer.rb:17:in `call'\r\nthin (1.6.3) lib/thin/connection.rb:86:in `block in pre_process'\r\nthin (1.6.3) lib/thin/connection.rb:84:in `catch'\r\nthin (1.6.3) lib/thin/connection.rb:84:in `pre_process'\r\nthin (1.6.3) lib/thin/connection.rb:53:in `process'\r\nthin (1.6.3) lib/thin/connection.rb:39:in `receive_data'\r\neventmachine (1.0.7) lib/eventmachine.rb:187:in `run_machine'\r\neventmachine (1.0.7) lib/eventmachine.rb:187:in `run'\r\nthin (1.6.3) lib/thin/backends/base.rb:73:in `start'\r\nthin (1.6.3) lib/thin/server.rb:162:in `start'\r\nrack (1.4.5) lib/rack/handler/thin.rb:13:in `run'\r\nrack (1.4.5) lib/rack/server.rb:268:in `start'\r\nrailties (3.2.21) lib/rails/commands/server.rb:70:in `start'\r\nrailties (3.2.21) lib/rails/commands.rb:55:in `block in <top (required)>'\r\nrailties (3.2.21) lib/rails/commands.rb:50:in `tap'\r\nrailties (3.2.21) lib/rails/commands.rb:50:in `<top (required)>'\r\nscript/rails:6:in `require'\r\nscript/rails:6:in `<main>'\r\n```", "title": "Getting when generating preview JSON::ParserError", "type": "issue" }, { "action": "created", "author": "haggen", "comment_id": 117803399, "datetime": 1435779612000, "masked_author": "username_0", "text": "The same error happens when using the CLI. Please note that I'm trying to run it from inside a Docker container running on Debian \"jessie\". If a component or library is missing, it should have complained, shouldn't it ?", "title": null, "type": "comment" }, { "action": "created", "author": "jonahoffline", "comment_id": 120648021, "datetime": 1436637064000, "masked_author": "username_1", "text": "@username_0 sorry I didn't see this sooner. I just tried to reproduce this on my box with no success. Can you share your `Dockerfile` (or anything else like ruby version, etc).", "title": null, "type": "comment" }, { "action": "created", "author": "jpadilla", "comment_id": 120648930, "datetime": 1436637979000, "masked_author": "username_2", "text": "This is part implementation issue on all v1 client libraries which did\npolling for results and hanged previews on our side. Implementation of v2\nlibraries remove this behavior by default.", "title": null, "type": "comment" }, { "action": "created", "author": "jonahoffline", "comment_id": 120651736, "datetime": 1436639814000, "masked_author": "username_1", "text": "@username_2 my implementation of `v1` doesn't poll for results (first request only returns image and metadata urls; for metadata you have to call the method).", "title": null, "type": "comment" }, { "action": "created", "author": "haggen", "comment_id": 121247520, "datetime": 1436882586000, "masked_author": "username_0", "text": "The image I'm using is this https://registry.hub.docker.com/u/username_0/rails/dockerfile/ which is based on this https://registry.hub.docker.com/u/username_0/ruby/dockerfile/.", "title": null, "type": "comment" }, { "action": "closed", "author": "jonahoffline", "comment_id": null, "datetime": 1473882512000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
3
7
9,960
false
false
9,960
true
elastic/logstash
elastic
92,773,496
3,554
null
[ { "action": "opened", "author": "fcntl", "comment_id": null, "datetime": 1435888302000, "masked_author": "username_0", "text": "I updated to 1.5.2 and my codec stopped working\r\nI was dropping it in another codec dir /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-json-0.1.7/lib/logstash/codecs/\r\nIt worked fine under 1.5.1. Now under 1.5.2 I get \"Couldn't find any codec plugin...\" message.\r\nWhere can I just drop a custom .rb file somewhere so it will be loaded as a codec?", "title": "custom codec no longer loaded in 1 .5.2", "type": "issue" }, { "action": "closed", "author": "suyograo", "comment_id": null, "datetime": 1436307787000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "suyograo", "comment_id": 119361835, "datetime": 1436307787000, "masked_author": "username_1", "text": "@username_0 currently, LS on upgrade loses locally installed plugins. We will be working on a fix for this soon. Closing this as a dup of https://github.com/elastic/logstash/issues/3540.", "title": null, "type": "comment" }, { "action": "created", "author": "suyograo", "comment_id": 119362299, "datetime": 1436307862000, "masked_author": "username_1", "text": "@username_0 not a great workaround, but still: you can reinstall custom/local plugins after the upgrade using `bin/plugin install`", "title": null, "type": "comment" } ]
2
4
663
false
false
663
true
binaryberry/seal
null
109,775,281
37
{ "number": 37, "repo": "seal", "user_login": "binaryberry" }
[ { "action": "opened", "author": "johnsyweb", "comment_id": null, "datetime": 1444043528000, "masked_author": "username_0", "text": "Context\r\n-------\r\n\r\nhttps://github.com/username_1/seal/pull/31 [broke our CI](https://travis-ci.org/envato/seal/builds/83603855) when I [merged it into our own organisation's repo](https://github.com/envato/seal/pull/14).\r\n\r\nIt didn't break CI upstream because that CI environment sets [`export SEAL_ORGANISATION=alphagov`]( https://travis-ci.org/username_1/seal/builds/82113946#L89).\r\n\r\nChange\r\n------\r\n\r\nRather than enforce setting environment variables to run the specs, I've\r\nstubbed `ENV`. This gives us greater control over the behaviour of the\r\nclass under test allowing us to check the file that would be loaded.\r\n\r\nConsiderations\r\n--------------\r\n\r\nThe behaviour of the class using environment-based configuration for a\r\nteam remains untested.", "title": "Fix Seal spec", "type": "issue" }, { "action": "created", "author": "binaryberry", "comment_id": 145584598, "datetime": 1444061411000, "masked_author": "username_1", "text": "Sorry about the break and thanks for the repair, Pete! :rainbow:", "title": null, "type": "comment" }, { "action": "created", "author": "johnsyweb", "comment_id": 145635795, "datetime": 1444072471000, "masked_author": "username_0", "text": "No worries. ☺", "title": null, "type": "comment" } ]
2
3
831
false
false
831
true
OneBusAway/onebusaway-iphone
OneBusAway
175,125,252
706
null
[ { "action": "opened", "author": "aaronbrethorst", "comment_id": null, "datetime": 1473107916000, "masked_author": "username_0", "text": "![img_1102](https://cloud.githubusercontent.com/assets/2254/18256721/0833c6de-736e-11e6-9fc5-3d968067b7d7.PNG)", "title": "Yellow backgrounds for cells sometimes get 'stuck'", "type": "issue" }, { "action": "created", "author": "chadsy", "comment_id": 249424584, "datetime": 1474813504000, "masked_author": "username_1", "text": "@username_0 I'm taking a look at this", "title": null, "type": "comment" }, { "action": "closed", "author": "aaronbrethorst", "comment_id": null, "datetime": 1475208152000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" } ]
2
3
151
false
false
151
true
tpbrown/modernbuild
null
187,416,207
1
null
[ { "action": "opened", "author": "jhinrichsen", "comment_id": null, "datetime": 1478286850000, "masked_author": "username_0", "text": "Hi Tim,\r\n\r\nin need to migrate a project from Maven to Buck, your tool comes in very handy. Unfortunately, our devs make excessive use of resource includes/ excludes, a feature that is not supported in the latest version of modernbuild.\r\n\r\nBefore i go out and get a Python book, i wanted to make sure this is not something you already have in the console right in front of you, waiting for a push...\r\n\r\nThanks in advance\r\n\r\nJochen", "title": "Support more than 1 includes/excludes", "type": "issue" }, { "action": "created", "author": "tpbrown", "comment_id": 258536501, "datetime": 1478290591000, "masked_author": "username_1", "text": "Sorry - no fix for that sitting local.\r\n\r\nI don't have time to update it, but happy to take a PR.\r\n\r\nEssentially what you'll need to do is split the include/exclude columns from the resources table and put them in a new table. \r\n\r\nFrom there you'd update the conditionals where we're filtering resources.\r\n\r\nShouldn't be too difficult :)", "title": null, "type": "comment" } ]
2
2
766
false
false
766
false
dotse/zonemaster-gui
dotse
98,246,358
118
null
[ { "action": "opened", "author": "tobbe-eklov", "comment_id": null, "datetime": 1438285245000, "masked_author": "username_0", "text": "Hello.\r\nIf you test skinnskatteberg.se it takes hours.\r\n\r\n/Tobbe", "title": "skinnskatteberg.se", "type": "issue" }, { "action": "created", "author": "sandoche2k", "comment_id": 126593444, "datetime": 1438328257000, "masked_author": "username_1", "text": "Hi Tobbe,\n\n Thanks for your message.\n\n We did identify there is an issue in Zonemaster where even when certain\nnameservers for a domain does not respond, Zonemaster tends to run all the\nremaining tests. I have assigned this issue as priority to be solved.\n\n Since this is holiday period thanks for expecting a delay to having this\nsolved.\n\nSandoche.", "title": null, "type": "comment" }, { "action": "created", "author": "sandoche2k", "comment_id": 130549632, "datetime": 1439446344000, "masked_author": "username_1", "text": "@vlevigneron seems to be the result of this issue : https://github.com/dotse/zonemaster-engine/issues/96", "title": null, "type": "comment" }, { "action": "created", "author": "sandoche2k", "comment_id": 140056281, "datetime": 1442234221000, "masked_author": "username_1", "text": "Hi @username_0 , \r\n\r\n The issue has been fixed.", "title": null, "type": "comment" }, { "action": "created", "author": "sandoche2k", "comment_id": 140056310, "datetime": 1442234231000, "masked_author": "username_1", "text": "@username_2 close ?", "title": null, "type": "comment" }, { "action": "created", "author": "mtoma", "comment_id": 140070006, "datetime": 1442236379000, "masked_author": "username_2", "text": "Can be closed.\n\n----- Original Message -----", "title": null, "type": "comment" }, { "action": "closed", "author": "sandoche2k", "comment_id": null, "datetime": 1442236707000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
3
7
628
false
false
628
true
robcalcroft/monzoweb
null
221,859,455
73
null
[ { "action": "opened", "author": "markwilliamfirth", "comment_id": null, "datetime": 1492193098000, "masked_author": "username_0", "text": "Hi @username_1! Love that you're building this!\r\n\r\nI get to this page:\r\n<img width=\"403\" alt=\"screen shot 2017-04-14 at 18 58 37\" src=\"https://cloud.githubusercontent.com/assets/5732209/25051422/69d114a8-2144-11e7-859b-f121497d36da.png\">\r\n\r\nAnd I enter my email to authorise but when I push the button nothing happens\r\n\r\nI think I may have done the set up incorrectly?\r\n![a](https://cloud.githubusercontent.com/assets/5732209/25051520/f2e8c790-2144-11e7-9f68-f9d0525f4156.png)\r\n\r\n![b](https://cloud.githubusercontent.com/assets/5732209/25051563/2e1c0cb4-2145-11e7-8442-8980f910cb70.png)\r\n\r\n<img width=\"712\" alt=\"screen shot 2017-04-14 at 19 00 12\" src=\"https://cloud.githubusercontent.com/assets/5732209/25051526/fb2f1382-2144-11e7-8ed8-47606b56e8fd.png\">\r\n\r\nAny ideas what I'm doing wrong?", "title": "Set up", "type": "issue" }, { "action": "created", "author": "robcalcroft", "comment_id": 294206192, "datetime": 1492193689000, "masked_author": "username_1", "text": "Hey! It looks like you haven't configured your redirect URL. If you're running Monzoweb locally then you can use `http://localhost:8000/callback` as your `REDIRECT_URL`. That will tell Monzo to redirect you back to Monzoweb once you've finished authenticating ☺️ Lemme know if you need some more help", "title": null, "type": "comment" }, { "action": "closed", "author": "markwilliamfirth", "comment_id": null, "datetime": 1492349527000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "markwilliamfirth", "comment_id": 294351983, "datetime": 1492349527000, "masked_author": "username_0", "text": "@username_1 thanks! It works 😄", "title": null, "type": "comment" } ]
2
4
1,122
false
false
1,122
true
jminardi/syncnet
null
127,850,653
16
null
[ { "action": "opened", "author": "isaacmao", "comment_id": null, "datetime": 1453355136000, "masked_author": "username_0", "text": "I didn't see any progresses about the prjoect for while. Is it still on going, or any barriers to move forward?", "title": "any updates along this project? ", "type": "issue" }, { "action": "created", "author": "jminardi", "comment_id": 184890454, "datetime": 1455659658000, "masked_author": "username_1", "text": "Hi Isaac, I don't have any plans to take it further at this point. It was more a proof of concept. In fact I think the bittorrent team took the idea and started building their own browser (although last I looked it was not open source and was windows only)", "title": null, "type": "comment" } ]
2
2
367
false
false
367
false
tensorflow/tensorflow
tensorflow
187,692,328
5,448
null
[ { "action": "opened", "author": "ramnath-k", "comment_id": null, "datetime": 1478519479000, "masked_author": "username_0", "text": "### What related GitHub issues or StackOverflow threads have you found by searching the web for your problem?\r\nClosest I found was https://github.com/tensorflow/tensorflow/issues/4767\r\nBut I don't intend to change the datatypes of any variables...\r\n\r\n### Environment info\r\nOperating System: Ubuntu 14.04\r\n\r\nInstalled version of CUDA and cuDNN: CUDA 7.5 and CUDNN 4.0.7\r\n(please attach the output of `ls -l /path/to/cuda/lib/libcud*`):\r\n-rw-r--r-- 1 root root 189170 Aug 25 02:29 /usr/local/cuda/lib/libcudadevrt.a\r\nlrwxrwxrwx 1 root root 16 Aug 25 02:29 /usr/local/cuda/lib/libcudart.so -> libcudart.so.7.5\r\nlrwxrwxrwx 1 root root 19 Aug 25 02:29 /usr/local/cuda/lib/libcudart.so.7.5 -> libcudart.so.7.5.18\r\n-rwxr-xr-x 1 root root 311596 Aug 25 02:29 /usr/local/cuda/lib/libcudart.so.7.5.18\r\n-rw-r--r-- 1 root root 558020 Aug 25 02:29 /usr/local/cuda/lib/libcudart_static.a\r\n\r\nCUDNN libs are in /cuda/lib64/ (output of `ls -l /path/to/cuda/lib64/libcud*`):\r\n-rw-r--r-- 1 root root 322936 Aug 25 02:29 /usr/local/cuda/lib64/libcudadevrt.a\r\nlrwxrwxrwx 1 root root 16 Aug 25 02:29 /usr/local/cuda/lib64/libcudart.so -> libcudart.so.7.5\r\nlrwxrwxrwx 1 root root 19 Aug 25 02:29 /usr/local/cuda/lib64/libcudart.so.7.5 -> libcudart.so.7.5.18\r\n-rwxr-xr-x 1 root root 383336 Aug 25 02:29 /usr/local/cuda/lib64/libcudart.so.7.5.18\r\n-rw-r--r-- 1 root root 720192 Aug 25 02:29 /usr/local/cuda/lib64/libcudart_static.a\r\n-rwxr-xr-x 1 root root 61453024 Aug 25 02:36 /usr/local/cuda/lib64/libcudnn.so\r\n-rwxr-xr-x 1 root root 61453024 Aug 25 02:36 /usr/local/cuda/lib64/libcudnn.so.4\r\n-rwxr-xr-x 1 root root 61453024 Aug 25 02:36 /usr/local/cuda/lib64/libcudnn.so.4.0.7\r\n-rw-r--r-- 1 root root 62025862 Aug 25 02:36 /usr/local/cuda/lib64/libcudnn_static.a\r\n\r\nBut **I have installed Tensorflow CPU only version in the virtualenv I 
am testing this code on**, hence it might not be using CUDA at all as far as this issue is concerned (I am debugging an issue that occurred on a EC2 machine which has no GPU, by reproducing it in my local system)\r\n \r\nIf installed from binary pip package, provide:\r\n\r\n1. A link to the pip package you installed: https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.11.0rc2-cp27-none-linux_x86_64.whl\r\n\r\n2. The output from `python -c \"import tensorflow; print(tensorflow.__version__)\"`.0.11.0rc2\r\n\r\n### If possible, provide a minimal reproducible example (We usually don't have time to read hundreds of lines of your code)\r\nI have saved a minimal working code which reproduces this issue at [gist](https://gist.github.com/username_0/d299964fead58c0d1e0df9c2190a4f91)\r\n\r\nTo reproduce the issue:\r\n1. Call saver.save in the main thread and checkpoint at least one tf.Variable\r\n2. loading this checkpoint from main thread and evaluating the variable works fine\r\n3. if I now launch a subprocess and try to load the checkpoint in that, Tensorflow hangs at sess.run(tf.initialize_all_variables())\r\n\r\n### What other attempted solutions have you tried?\r\nI tried putting a container with names suffixed by the subprocess pid but it didn't help. 
I also tried a basic version of create_local_server and it didn't work too\r\n\r\n### Logs or other output that would be helpful\r\n(If logs are large, please upload as attachment or provide link).\r\nI don't see any errors in the console when the code hangs.", "title": "Tensorflow hangs when initializing variables in a multi process setting", "type": "issue" }, { "action": "created", "author": "mrry", "comment_id": 258934405, "datetime": 1478546393000, "masked_author": "username_1", "text": "It looks like the issue here is that Python 2.7 unconditionally uses `os.fork()` in to implement `multiprocessing.Process.start()`, but the TensorFlow runtime is not fork-safe (for one thing, because it creates various internal threads and initialized static members at startup). Since in your example the parent process invokes the TensorFlow runtime, the child will be in an unspecified state when it attempts to do the first `sess.run()` call. (When I ran your code, it was blocking on a condition variable, waiting for constant propagation to finish, for example.)\r\n\r\nAs far as I can tell, there is no way to make the Python 2.7 multiprocessing module do the \"right thing\" here. If you upgrade to Python 3.4, you can use [`multiprocessing.set_start_method('spawn')`](https://docs.python.org/3/library/multiprocessing.html#multiprocessing.set_start_method) to avoid the issues over fork-safety. Alternatively, you could structure your program so that the `import tensorflow as tf` statement only runs in child processes.", "title": null, "type": "comment" }, { "action": "created", "author": "ramnath-k", "comment_id": 258939298, "datetime": 1478547467000, "masked_author": "username_0", "text": "Thanks @username_1 \r\nI tried out your suggestion to move the tensorflow code over to child processes. 
\r\nFirst up I moved the saving and restoring code to another module as in this [basic_saver_restore.py](https://gist.github.com/username_0/2990b46b84278e5e24919beb44968a00)\r\nThen I called the saver and restore as before using multiprocessing as in [multi_process_saver_restore.py](https://gist.github.com/username_0/fa25696f3f1def6ba32855f612353914)\r\nIn basic_saver_restore, I put a module level variable server equal to create_local_server (which I assume creates a child process to run the session right?). \r\nThis seems to fix the hang I was getting. \r\nWill this approach always work? Or is it that I am only fixing some issues but not all with this approach?\r\n\r\nMoving to Python 3 is something I will attempt later.\r\n\r\nMy actual hang occurs when a Hive launches a python user defined function which calls a restore tensorflow model function. Do you think this approach of having a module level create_local_server fix such an issue also?", "title": null, "type": "comment" }, { "action": "created", "author": "sherrym", "comment_id": 258940888, "datetime": 1478547807000, "masked_author": "username_2", "text": "Hi @username_0 ,\r\n\r\nIt appears to have hung because it's waiting for the main thread (or process) to yield the global interpreter lock. Make your processes daemon processes and it should work.\r\n\r\n for i in range(num_procs):\r\n p = Process(target=subprocess, args=(i,))\r\n p.daemon = True <<<<< Add this line.\r\n p.start() # process fails to get past initialize all variables\r\n procs.append(p)\r\n\r\nI just tried it and it worked for me. 
I also made some simplifications:\r\n\r\ndef add_model():\r\n a = tf.Variable(2, name='a')\r\n b = tf.Variable(5, name='b')\r\n c = tf.mul(a, b, name='c')\r\n return a, b, c\r\n\r\ndef save_session(unused_arg=None):\r\n with tf.Session(graph=tf.Graph()) as sess:\r\n a, b, c = add_model()\r\n saver = tf.train.Saver(\r\n tf.global_variables(),\r\n max_to_keep=1)\r\n init = [\r\n tf.global_variables_initializer(),\r\n tf.local_variables_initializer()]\r\n sess.run(init)\r\n e = tf.assign(a, 3, name='e')\r\n f = tf.assign(b, 4, name='f')\r\n sess.run([e, f])\r\n val = sess.run(c)\r\n print('val=', val)\r\n checkpoint_dir = './debug'\r\n if not os.path.exists(checkpoint_dir):\r\n os.makedirs(checkpoint_dir)\r\n checkpoint_prefix = os.path.join(checkpoint_dir, 'debug')\r\n path = saver.save(\r\n sess, checkpoint_prefix)\r\n print('save session complete')\r\n\r\ndef restore_session(unused_arg=None):\r\n graph = tf.Graph()\r\n with tf.Session(graph=graph) as sess:\r\n pid = os.getpid()\r\n container_name = 'worker{}'.format(pid)\r\n print('container:{}'.format(container_name))\r\n with graph.container(container_name):\r\n a, b, c = add_model()\r\n saver = tf.train.Saver(tf.global_variables())\r\n print('add model complete')\r\n init = [\r\n tf.global_variables_initializer(),\r\n tf.local_variables_initializer()]\r\n sess.run(init)\r\n print('init model complete')\r\n graph.finalize()\r\n model_checkpoint ='debug/debug'\r\n saver.restore(sess, model_checkpoint)\r\n val = sess.run(c)\r\n print('val=', val)\r\n\r\n\r\ndef subprocess(i):\r\n print('inside subprocess {}'.format(i))\r\n restore_session()\r\n print('exiting subprocess {}'.format(i))\r\n\r\n\r\ndef main(unused_argv):\r\n save_p = Process(target=save_session, args=(1,))\r\n save_p.start()\r\n save_p.join()\r\n restore_p = Process(target=restore_session, args=(1,))\r\n restore_p.start()\r\n restore_p.join()\r\n procs = []\r\n num_procs = 3\r\n for i in range(num_procs):\r\n p = Process(target=subprocess, 
args=(i,))\r\n p.daemon = True\r\n p.start() # process fails to get past initialize all variables\r\n procs.append(p)\r\n for p in procs:\r\n p.join()\r\n\r\nif __name__ == \"__main__\":\r\n tf.app.run()\r\n\r\n\r\nPlease let me know if this works for you. Thanks.\r\n\r\nSherry", "title": null, "type": "comment" }, { "action": "created", "author": "ramnath-k", "comment_id": 258944847, "datetime": 1478548654000, "masked_author": "username_0", "text": "Thanks for the update @username_2\r\nYes moving everything to child processes makes it work. But to clarify, in the modified code you have provided I don't even need to set daemon=True as no Tensorflow session has been created in the main process.\r\nIt hangs only if I start a session in the main process and then fork a child process.\r\n\r\nScenario is something like, only after the saver completes it must launch multiple workers which process the saved checkpoint.", "title": null, "type": "comment" }, { "action": "closed", "author": "sherrym", "comment_id": null, "datetime": 1478562736000, "masked_author": "username_2", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "hiemal", "comment_id": 275709914, "datetime": 1485534992000, "masked_author": "username_3", "text": "I use keras as wrapper. Moving all the imports into subprocess solves the issue.", "title": null, "type": "comment" }, { "action": "created", "author": "nottombrown", "comment_id": 314953888, "datetime": 1499913600000, "masked_author": "username_4", "text": "@username_1's Python3 solution worked for me. 
Thanks!\r\n\r\n```python\r\nimport multiprocessing\r\n\r\nimport tensorflow as tf\r\n\r\ndef f(x):\r\n session = tf.Session()\r\n a = tf.Variable(x, name='a')\r\n b = tf.Variable(100, name='b')\r\n c = tf.multiply(a, b, name='c')\r\n session.run(tf.global_variables_initializer())\r\n\r\n out = session.run(c)\r\n print(\"OK: %s\" % out)\r\n\r\nif __name__ == '__main__':\r\n multiprocessing.set_start_method('spawn') # Comment me out to hang\r\n f(0)\r\n multiprocessing.Pool().map(f, range(10))\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "markroxor", "comment_id": 517976068, "datetime": 1564897799000, "masked_author": "username_5", "text": "@username_1 Your solution raises -\r\n```python3\r\n_pickle.PicklingError: Can't pickle <class 'module'>: attribute lookup module on builtins failed\r\n```\r\nI can provide more information if required.", "title": null, "type": "comment" }, { "action": "created", "author": "zaccharieramzi", "comment_id": 557016107, "datetime": 1574331359000, "masked_author": "username_6", "text": "I am trying to solve https://github.com/tensorflow/tensorflow/issues/34456 , so I tried the main solution provided here (`multiprocessing.set_start_method('spawn')`), but I got the following error: `TypeError: can't pickle _thread.lock objects`.\r\n\r\nThis is because [tf is initializing a pool with a queue](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/utils/data_utils.py#L821-L823) which is [apparently not doable](https://stackoverflow.com/a/7865512/4332585).", "title": null, "type": "comment" } ]
7
10
9,916
false
false
9,916
true
puppetlabs/puppetlabs-apache
puppetlabs
109,139,327
1,215
{ "number": 1215, "repo": "puppetlabs-apache", "user_login": "puppetlabs" }
[ { "action": "opened", "author": "orthographic-pedant", "comment_id": null, "datetime": 1443633561000, "masked_author": "username_0", "text": "puppetlabs, I've corrected a typographical error in the documentation of the [puppetlabs-apache](https://github.com/puppetlabs/puppetlabs-apache) project. You should be able to merge this pull request automatically. However, if this was intentional or you enjoy living in linguistic squalor please let me know and [create an issue](https://github.com/username_2/username_0/issues/new) on my home repository.", "title": "Fixed typographical error, changed accomodate to accommodate in README.", "type": "issue" }, { "action": "created", "author": "igalic", "comment_id": 144706271, "datetime": 1443700628000, "masked_author": "username_1", "text": "thanks @username_0 / @username_2!", "title": null, "type": "comment" }, { "action": "created", "author": "thoppe", "comment_id": 144740080, "datetime": 1443708839000, "masked_author": "username_2", "text": "I'm happy that my snarky robots make the world a better place. Thanks you for building cool software!", "title": null, "type": "comment" } ]
3
3
551
false
false
551
true
zooniverse/wildcam-gorongosa-education
zooniverse
168,598,196
252
null
[ { "action": "opened", "author": "aliburchard", "comment_id": null, "datetime": 1470041884000, "masked_author": "username_0", "text": "@username_1 Based on conversation friday, basic requirements for \"viewing\" assingments are:\r\n1. Display the metadata that teachers put in: Name, description, due date, #classifications per student\r\n2. List the students in the assignment\r\n\r\nIf easy, we should also\r\n3. Display the number of subjects selected and, if possible \r\n4. Display the filters used to create the subject set; @shaunanoordin will provide in \"plain english\" IF EASY.", "title": "View Assignments: MVP", "type": "issue" }, { "action": "closed", "author": "simoneduca", "comment_id": null, "datetime": 1470411410000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
2
441
false
false
441
true
drewnoakes/metadata-extractor-dotnet
null
91,595,043
11
null
[ { "action": "opened", "author": "drewnoakes", "comment_id": null, "datetime": 1435494766000, "masked_author": "username_0", "text": "Diff between Java and C# versions:\r\n\r\n```diff\r\ndiff --git a/jpg/metadata/epson photopc 3000z.jpg.txt b/jpg/metadata/epson photopc 3000z.jpg.txt\r\nindex 35cb037..9c7af1a 100644\r\n--- a/jpg/metadata/epson photopc 3000z.jpg.txt\t\r\n+++ b/jpg/metadata/epson photopc 3000z.jpg.txt\t\r\n@@ -48,32 +48,5 @@ FILE: Epson PhotoPC 3000Z.jpg\r\n [Exif SubIFD - 0xa300] File Source = Digital Still Camera (DSC)\r\n [Exif SubIFD - 0xa301] Scene Type = Directly photographed image\r\n \r\n-[Olympus Makernote - 0x0200] Special Mode = Unknown picture taking mode / 208th in a sequence\r\n-[Olympus Makernote - 0x0201] JPEG Quality = Super High Quality\r\n-[Olympus Makernote - 0x0202] Macro = Macro\r\n-[Olympus Makernote - 0x0203] BW Mode = Off\r\n-[Olympus Makernote - 0x0204] DigiZoom Ratio = Unknown (1)\r\n-[Olympus Makernote - 0x0209] Camera Id = EPSON DIGITAL CAMERA\r\n-[Olympus Makernote - 0x020a] Unknown tag (0x020a) = [512 values]\r\n-[Olympus Makernote - 0x020b] Image Width = 2048\r\n-[Olympus Makernote - 0x020c] Image Height = 1536\r\n-[Olympus Makernote - 0x020d] Original Manufacturer Model = SX321\r\n-[Olympus Makernote - 0x0300] Pre Capture Frames = 0\r\n-[Olympus Makernote - 0x0f00] Data Dump = [174 values]\r\n-\r\n-[Interoperability - 0x0001] Interoperability Index = Recommended Exif Interoperability Rules (ExifR98)\r\n-[Interoperability - 0x0002] Interoperability Version = 1.00\r\n-\r\n-[Exif Thumbnail - 0x0103] Compression = JPEG (old-style)\r\n-[Exif Thumbnail - 0x011a] X Resolution = 72 dots per inch\r\n-[Exif Thumbnail - 0x011b] Y Resolution = 72 dots per inch\r\n-[Exif Thumbnail - 0x0128] Resolution Unit = Inch\r\n-[Exif Thumbnail - 0x0201] Thumbnail Offset = 4084 bytes\r\n-[Exif Thumbnail - 0x0202] Thumbnail Length = 4700 bytes\r\n-\r\n-[File - 0x0001] File Name = Epson PhotoPC 3000Z.jpg\r\n-[File - 0x0002] File Size = 744723 
bytes\r\n-[File - 0x0003] File Modified Date = Sun Jan 27 02:31:15 +00:00 2013\r\n```", "title": "Epson PhotoPC image missing several directories", "type": "issue" }, { "action": "created", "author": "drewnoakes", "comment_id": 116277440, "datetime": 1435498185000, "masked_author": "username_0", "text": "This occurs in many images during Olympus makernote processing.\r\n\r\nThis one is particularly bad as all processing halts:\r\n\r\n```diff\r\ndiff --git a/jpg/metadata/minolta dimage s404.jpg.txt b/jpg/metadata/minolta dimage s404.jpg.txt\r\nindex fca5fb4..4e11678 100644\r\n--- a/jpg/metadata/minolta dimage s404.jpg.txt\t\r\n+++ b/jpg/metadata/minolta dimage s404.jpg.txt\t\r\n@@ -51,60 +51,18 @@ FILE: Minolta DiMAGE S404.jpg\r\n [Olympus Makernote - 0xf006] Image Quality = Raw\r\n [Olympus Makernote - 0xf007] Shooting Mode = Single\r\n [Olympus Makernote - 0xf008] Metering Mode = Unknown (43)\r\n-[Olympus Makernote - 0xf009] Apex Film Speed Value = 450.0\r\n+[Olympus Makernote - 0xf009] Apex Film Speed Value = 450\r\n [Olympus Makernote - 0xf00a] Apex Shutter Speed Time Value = 1.890625 sec\r\n-[Olympus Makernote - 0xf00b] Apex Aperture Value = f/0.2\r\n+[Olympus Makernote - 0xf00b] Apex Aperture Value = f/0.3\r\n [Olympus Makernote - 0xf00c] Macro Mode = Off\r\n [Olympus Makernote - 0xf00d] Digital Zoom = Unknown (6)\r\n-[Olympus Makernote - 0xf00e] Exposure Compensation = -1.6666666666666667 EV\r\n+[Olympus Makernote - 0xf00e] Exposure Compensation = -1.66666666666667 EV\r\n [Olympus Makernote - 0xf00f] Bracket Step = 1/3 EV\r\n [Olympus Makernote - 0xf010] Unknown tag (0xf010) = 0\r\n [Olympus Makernote - 0xf011] Interval Length = N/A\r\n [Olympus Makernote - 0xf012] Interval Number = N/A\r\n-[Olympus Makernote - 0xf013] Focal Length = 0.0 mm\r\n+[Olympus Makernote - 0xf013] Focal Length = 0 mm\r\n [Olympus Makernote - 0xf014] Focus Distance = Infinity\r\n [Olympus Makernote - 0xf015] Flash Fired = Unknown (131204368)\r\n-[Olympus Makernote - 0xf016] Date = Sat 
Apr 26 00:00:00 BST 2014\r\n-[Olympus Makernote - 0xf017] Time = 00:00:38\r\n-[Olympus Makernote - 0xf018] Max Aperture at Focal Length = f/3.5\r\n-[Olympus Makernote - 0xf019] Unknown tag (0xf019) = 0\r\n-[Olympus Makernote - 0xf01a] Unknown tag (0xf01a) = 0\r\n-[Olympus Makernote - 0xf01b] File Number Memory = Off\r\n-[Olympus Makernote - 0xf01c] Last File Number = 671\r\n-[Olympus Makernote - 0xf01d] White Balance Red = 1.0\r\n-[Olympus Makernote - 0xf01e] White Balance Green = 1.44140625\r\n-[Olympus Makernote - 0xf01f] White Balance Blue = 0.01171875\r\n-[Olympus Makernote - 0xf020] Saturation = 0\r\n-[Olympus Makernote - 0xf021] Contrast = -2\r\n-[Olympus Makernote - 0xf022] Sharpness = Normal\r\n-[Olympus Makernote - 0xf023] Subject Program = Unknown (6)\r\n-[Olympus Makernote - 0xf024] Flash Compensation = -0.6666666666666666 EV\r\n-[Olympus Makernote - 0xf025] ISO Setting = 800\r\n-[Olympus Makernote - 0xf026] Camera Model = DiMAGE 7\r\n-[Olympus Makernote - 0xf027] Interval Mode = Time Lapse Movie\r\n-[Olympus Makernote - 0xf028] Folder Name = Data Form\r\n-[Olympus Makernote - 0xf029] Color Mode = Unknown (8000)\r\n-[Olympus Makernote - 0xf02a] Color Filter = 5\r\n-[Olympus Makernote - 0x0010] Unknown tag (0x0010) = [12968 values]\r\n-[Olympus Makernote - 0x0020] Unknown tag (0x0020) = [354 values]\r\n-[Olympus Makernote - 0x0040] Compressed Image Size = 1826901\r\n-[Olympus Makernote - 0x0e00] Print Image Matching (PIM) Info = [40 values]\r\n-\r\n-[Interoperability - 0x0001] Interoperability Index = Recommended Exif Interoperability Rules (ExifR98)\r\n-[Interoperability - 0x0002] Interoperability Version = 1.00\r\n-\r\n-[Exif Thumbnail - 0x0103] Compression = JPEG (old-style)\r\n-[Exif Thumbnail - 0x0112] Orientation = Top, left side (Horizontal / normal)\r\n-[Exif Thumbnail - 0x011a] X Resolution = 72 dots per inch\r\n-[Exif Thumbnail - 0x011b] Y Resolution = 72 dots per inch\r\n-[Exif Thumbnail - 0x0128] Resolution Unit = Inch\r\n-[Exif Thumbnail - 
0x0201] Thumbnail Offset = 14338 bytes\r\n-[Exif Thumbnail - 0x0202] Thumbnail Length = 4381 bytes\r\n-[Exif Thumbnail - 0x0213] YCbCr Positioning = Center of pixel array\r\n-\r\n-[File - 0x0001] File Name = Minolta DiMAGE S404.jpg\r\n-[File - 0x0002] File Size = 1897844 bytes\r\n-[File - 0x0003] File Modified Date = Sun Jan 27 02:31:15 +00:00 2013\r\n-\r\n Generated using metadata-extractor\r\n https://username_0.com/code/exif/\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "drewnoakes", "comment_id": 118561845, "datetime": 1436051866000, "masked_author": "username_0", "text": "This bug is seen because the Java version stored `int32u_t` as `long` (because there are no unsigned integers in Java), but the C# version uses `uint`. The descriptor class was failing to cast this to `long[]`, as `long` is used to model `uint` in Java.", "title": null, "type": "comment" }, { "action": "closed", "author": "drewnoakes", "comment_id": null, "datetime": 1436052044000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" } ]
1
4
6,105
false
false
6,105
true
elastic/beats
elastic
165,765,835
2,043
null
[ { "action": "opened", "author": "lightkz", "comment_id": null, "datetime": 1468581829000, "masked_author": "username_0", "text": "Based on discussion opening enhancement feature request.\r\nhttps://discuss.elastic.co/t/reload-if-modification-time-changes-in-config-files/55556", "title": "Live reload config files without restarting filebeat process", "type": "issue" }, { "action": "created", "author": "slava-vishnyakov", "comment_id": 258393888, "datetime": 1478255462000, "masked_author": "username_1", "text": "It would probably be better to reload, like nginx with something like `kill -s HUP`", "title": null, "type": "comment" }, { "action": "created", "author": "OferE", "comment_id": 261173029, "datetime": 1479366926000, "masked_author": "username_2", "text": "I'm working with microservices architecture and restarting the filebeat process is a big pain to me,\r\nI have many services coming and going and i want them to register themselves with filebeats without restarting the filebeat process everytime a service joins.\r\nRestart a process is a very aggressive operation and has consequences regarding the exactly once guarantees.", "title": null, "type": "comment" }, { "action": "created", "author": "ruflin", "comment_id": 261522029, "datetime": 1479472523000, "masked_author": "username_3", "text": "Does every service need its own config / prospector? Or do you know all of them in advance and could provide all the prospectors on first start?", "title": null, "type": "comment" }, { "action": "created", "author": "OferE", "comment_id": 261526909, "datetime": 1479474227000, "masked_author": "username_2", "text": "knowing all services in advance is a bad practice since i have multiple machine types, each type runs a different set of services. 
each machine may run different services even with the same machine type.\r\ncurrently this is how i workaround everything: by providing one huge config.\r\nThe correct approach IMHO will be that every service will register itself with its own config - the current filebeat process will reload the configuration on every change/inroduction of new config and adjust in a clean way.\r\nBy the way - the outputs themselves should have the ability to change dynamically and accrding to each prospector, since some services may want to write to kafka (even into different topics) and others to elasctic/logstash.\r\nThis should be per prospector and not as a global definition.\r\nThe world is moving into microservices architectures - filebeat is a great project - but it is not a perfect match for micro services.", "title": null, "type": "comment" }, { "action": "created", "author": "ruflin", "comment_id": 261874422, "datetime": 1479717088000, "masked_author": "username_3", "text": "@username_2 I was also more thinking of current work arounds, but it seems you found one. It is definitively something we are looking into also related to https://github.com/elastic/beats/issues/464", "title": null, "type": "comment" }, { "action": "created", "author": "OferE", "comment_id": 261905171, "datetime": 1479725536000, "masked_author": "username_2", "text": "@username_3 - great to see that i am useful - also great to see that u hear community feedback! great product - i tend to use it in production for streaming messages to Kafka un huge scale.", "title": null, "type": "comment" }, { "action": "created", "author": "aqiao", "comment_id": 270550877, "datetime": 1483584810000, "masked_author": "username_4", "text": "hi guys, may i know the latest progress of this feature?", "title": null, "type": "comment" }, { "action": "created", "author": "ruflin", "comment_id": 270649192, "datetime": 1483624545000, "masked_author": "username_3", "text": "@username_4 Interesting timing. 
There is not directly a filebeat update but I just pushed this PR here: https://github.com/elastic/beats/pull/3281 This could be reused for filebeat prospectors in the future. It's a little bit trickier for filebeat as it must be ensured that a file is only harvested once and brings up the question when exactly a harvester should shut down ...", "title": null, "type": "comment" }, { "action": "created", "author": "ruflin", "comment_id": 275058795, "datetime": 1485336323000, "masked_author": "username_3", "text": "Closing as https://github.com/elastic/beats/pull/3362 was merged to master. This allows to dynamically reload prospectors.", "title": null, "type": "comment" }, { "action": "closed", "author": "ruflin", "comment_id": null, "datetime": 1485336323000, "masked_author": "username_3", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "OferE", "comment_id": 275317745, "datetime": 1485412553000, "masked_author": "username_2", "text": "@username_3 thanks for this feature!", "title": null, "type": "comment" } ]
5
12
2,631
false
false
2,631
true
bblum/landslide
null
106,455,183
190
null
[ { "action": "opened", "author": "bblum", "comment_id": null, "datetime": 1442278165000, "masked_author": "username_0", "text": "", "title": "There's still a race that makes the ID wrapper hang waiting for landslide jobs after TIME_UP.", "type": "issue" }, { "action": "created", "author": "bblum", "comment_id": 140586098, "datetime": 1442362823000, "masked_author": "username_0", "text": "It's not to do with particular jobs.\r\n\r\nAll I know is the last line output is ```[MESSAGING] suspending time``` and it's resistant to kill -9.", "title": null, "type": "comment" }, { "action": "created", "author": "bblum", "comment_id": 140587082, "datetime": 1442363330000, "masked_author": "username_0", "text": "```[JOB 136] State space too big (32 brs elapsed, time rem 205, eta 2316) -- blocking```\r\nis printed\r\nbut its status never changes from ```Running...```", "title": null, "type": "comment" }, { "action": "created", "author": "bblum", "comment_id": 140650306, "datetime": 1442387192000, "masked_author": "username_0", "text": "infodump\r\n\r\ni was stumped for a long time until i noticed in the contents of the job struct, the state of blocking_cvar is clobbered. turns out it's never cond_init'ed. 
i'm running some tests with that fixed and hoping this never comes up again.\r\n\r\nmortals were not meant to understand the sequence of events in which uninitialized reads in blocking_cvar corrupted the state of its neighbour done_cvar.\r\n```\r\n(gdb) thread 6\r\n[Switching to thread 6 (Thread 0x7f7bb53ee700 (LWP 10521))]\r\n(gdb) bt\r\n#0 0x00007f7bb8fc4f7d in __lll_lock_wait ()\r\n#1 0x00007f7bb8fc74ec in _L_cond_lock_792 ()\r\n#2 0x00007f7bb8fc73c8 in __pthread_mutex_cond_lock ()\r\n#3 0x00007f7bb8fc2795 in pthread_cond_wait@@GLIBC_2.3.2 ()\r\n#4 0x00000000004092e3 in wait_on_job (j=0x7f7b98002110) at job.c:277\r\n#5 0x0000000000417b02 in process_work (j=0x7f7b98002110, was_blocked=false) at work.c:261\r\n#6 0x0000000000417e18 in workqueue_thread (arg=0x6) at work.c:290\r\n#7 0x00007f7bb8fbedf5 in start_thread ()\r\n#8 0x00007f7bb8cec1ad in clone ()\r\n\r\n(gdb) thread 2\r\n[Switching to thread 2 (Thread 0x7f7bb33ea700 (LWP 22078))]\r\n(gdb) bt\r\n#0 0x00007f7bb8fc4f7d in __lll_lock_wait ()\r\n#1 0x00007f7bb8fc27e2 in pthread_cond_wait@@GLIBC_2.3.2 ()\r\n#2 0x0000000000408fd9 in job_block (j=0x7f7b98002110) at job.c:251\r\n#3 0x000000000040a719 in handle_estimate (...) at messaging.c:214\r\n#4 0x000000000040ba27 in talk_to_child (...) 
at messaging.c:362\r\n#5 0x000000000040890c in run_job (arg=0x7f7b98002110) at job.c:198\r\n#6 0x00007f7bb8fbedf5 in start_thread ()\r\n#7 0x00007f7bb8cec1ad in clone ()\r\n\r\n(gdb) print *(struct job *)0x7f7b98002110\r\n$1 = {\r\nconfig = 0x7f7b98002af0,\r\nid = 117,\r\ngeneration = 3,\r\nshould_reproduce = true,\r\nconfig_file = {fd = 68, filename = \"../pebsim/config-id.landslide.Bk1nx8\"},\r\nlog_stdout = { fd = 69, filename = \"ls-setup.log.gTNRPI\"},\r\nlog_stderr = {fd = 70, filename = \"ls-output.log.djBt8i\"},\r\nstats_lock = {__data = {__lock = 0,\r\n __nr_readers = 0, __readers_wakeup = 0, __writer_wakeup = 0,\r\n __nr_readers_queued = 0, __nr_writers_queued = 0, __writer = 0, __shared = 0,\r\n __pad1 = 0, __pad2 = 0, __flags = 0}, __size = '\\000' <repeats 55 times>,\r\n __align = 0},\r\nelapsed_branches = 32,\r\nestimate_proportion = 0.0146484375,\r\nestimate_elapsed = {secs = 38, mins = 0, hours = 0, days = 0, years = 0, inf = false}\r\nestimate_eta = {secs = 43, mins = 37, hours = 0, days = 0, years = 0, inf = false}\r\nestimate_eta_numeric = 2263454568,\r\ncancelled = false,\r\ncomplete = false,\r\ntimed_out = false,\r\nkill_job = false,\r\nlog_filename = \"ls-output.log.djBt8i\",\r\ntrace_filename = 0x0,\r\nstatus = JOB_BLOCKED,\r\ndone_cvar = {__data = {__lock = 0, __futex = 2, __total_seq = 1,\r\n __wakeup_seq = 1, __woken_seq = 1, __mutex = 0x7f7b980022a0, __nwaiters = 0,\r\n __broadcast_seq = 1},\r\n __size = \"\\000\\000\\000\\000\\002\\000\\000\\000\\001\\000\\000\\000\\000\\000\\000\\000\\001\\000\\000\\000\\000\\000\\000\\000\\001\\000\\000\\000\\000\\000\\000\\000\\240\\\"\\000\\230{\\177\\000\\000\\000\\000\\000\\000\\001\\000\\000\", __align = 8589934592},\r\nblocking_cvar = {__data = {__lock = 2, __futex = 32635,\r\n __total_seq = 140168740738608, __wakeup_seq = 140168740739104,\r\n __woken_seq = 140168740739600, __mutex = 0x7f7b980022a0, __nwaiters = 2483035632,\r\n __broadcast_seq = 32635},\r\n __size = 
\"\\002\\000\\000\\000{\\177\\000\\000\\060\\026\\000\\224{\\177\\000\\000 \\030\\000\\224{\\177\\000\\000\\020\\032\\000\\224{\\177\\000\\000\\240\\\"\\000\\230{\\177\\000\\000\\360\\035\\000\\224{\\177\\000\",\r\n __align = 140166257704962},\r\nlifecycle_lock = {__data = {__lock = 2, __count = 0,\r\n __owner = 22078, __nusers = 2, __kind = 0, __spins = 0, __list = {__prev = 0x0,\r\n __next = 0x0}},\r\n __size = \"\\002\\000\\000\\000\\000\\000\\000\\000>V\\000\\000\\002\", '\\000' <repeats 26 times>,\r\n __align = 2}}\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "bblum", "comment_id": 140844512, "datetime": 1442430257000, "masked_author": "username_0", "text": "Figured out why this broke the way it did.\r\n\r\nWorkqueue thread (6) has been broadcasted, but is waiting for ```lifecycle_lock``` to be released before it can exit ```cond_wait```.\r\n\r\nJob thread (2) is holding ```lifecycle_lock```, getting ready to AU&D on ```blocking_cvar```. But before it can let go of ```lifecycle_lock```, it needs to get the internal ```blocking_cvar``` lock.\r\n\r\nBecause the lock is uninitialized, it randomly appears to be already locked. No thread holds it so the job thread (2) waits forever. Hence ```lifecycle_lock``` is never dropped and WQ thread 6 is stuck at the very end of ```cond_wait``` forever.", "title": null, "type": "comment" }, { "action": "closed", "author": "bblum", "comment_id": null, "datetime": 1443596814000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" } ]
1
6
4,693
false
false
4,693
false
gravitee-io/gravitee-management-rest-api
gravitee-io
138,962,337
48
null
[ { "action": "opened", "author": "aelamrani", "comment_id": null, "datetime": 1457352210000, "masked_author": "username_0", "text": "Allows to export a JSON file which contains the definition of an API.\r\nAllows to import it to update an API, or import to create.", "title": "Import/export a JSON API definition file", "type": "issue" }, { "action": "closed", "author": "NicolasGeraud", "comment_id": null, "datetime": 1457478492000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
2
129
false
false
129
false
babel/babel
babel
115,088,005
2,815
null
[ { "action": "opened", "author": "timmolendijk", "comment_id": null, "datetime": 1446653400000, "masked_author": "username_0", "text": "Let's say we have the following piece of code:\r\n\r\n```js\r\nexport async function beforeBluebirdRequire() {\r\n\tawait getMeSomePromise();\r\n}\r\n\r\nexport default async function afterBluebirdRequire() {\r\n\tawait getMeSomePromise();\r\n}\r\n```\r\n\r\nTo deal with the async/wait we transpile this with a `.babelrc` as follows:\r\n\r\n```json\r\n{\r\n\t\"plugins\": [\r\n\t\t[\"transform-async-to-module-method\", {\r\n\t\t\t\"module\": \"bluebird\",\r\n\t\t\t\"method\": \"coroutine\"\r\n\t\t}]\r\n\t]\r\n}\r\n```\r\n\r\nThis results in the following code:\r\n\r\n```js\r\nimport { coroutine as _coroutine } from \"bluebird\";\r\nexport default _coroutine(function* afterBluebirdRequire() {\r\n\tyield getMeSomePromise();\r\n});\r\n\r\nexport let beforeBluebirdRequire = (function () {\r\n\tvar ref = _coroutine(function* beforeBluebirdRequire() {\r\n\t\tyield getMeSomePromise();\r\n\t});\r\n\r\n\treturn function beforeBluebirdRequire() {\r\n\t\treturn ref.apply(this, arguments);\r\n\t};\r\n})();\r\n```\r\n\r\nWhich will not run on Node because it assumes ES6 modules, so we change `.babelrc` to:\r\n\r\n```json\r\n{\r\n\t\"plugins\": [\r\n\t\t[\"transform-async-to-module-method\", {\r\n\t\t\t\"module\": \"bluebird\",\r\n\t\t\t\"method\": \"coroutine\"\r\n\t\t}],\r\n\t\t\"transform-es2015-modules-commonjs\"\r\n\t]\r\n}\r\n```\r\n\r\nBut now the transpiled code ends up as:\r\n\r\n```js\r\n\"use strict\";\r\n\r\nlet beforeBluebirdRequire = exports.beforeBluebirdRequire = (function () {\r\n\tvar ref = (0, _bluebird.coroutine)(function* beforeBluebirdRequire() {\r\n\t\tyield getMeSomePromise();\r\n\t});\r\n\treturn function beforeBluebirdRequire() {\r\n\t\treturn ref.apply(this, arguments);\r\n\t};\r\n})();\r\n\r\nObject.defineProperty(exports, \"__esModule\", {\r\n\tvalue: true\r\n});\r\nexports.beforeBluebirdRequire = 
undefined;\r\n\r\nvar _bluebird = require(\"bluebird\");\r\n\r\nexports.default = (0, _bluebird.coroutine)(function* afterBluebirdRequire() {\r\n\tyield getMeSomePromise();\r\n});\r\n```\r\n\r\nThis won't run because `_bluebird.coroutine` is used before it is required.\r\n\r\n---\r\n\r\nThe silly thing is that when you separate the two transpile operations into two steps, it will all work fine and as expected.\r\n\r\nSo first we substitute the async/await using `transform-async-to-module-method`, and then we take the result as input to another transpilation call that substitutes the ES6 modules for CommonJS modules using `transform-es2015-modules-commonjs`.\r\n\r\nThe result is exactly what we need:\r\n\r\n```js\r\n\"use strict\";\r\n\r\nObject.defineProperty(exports, \"__esModule\", {\r\n\tvalue: true\r\n});\r\nexports.beforeBluebirdRequire = undefined;\r\n\r\nvar _bluebird = require(\"bluebird\");\r\n\r\nexports.default = (0, _bluebird.coroutine)(function* afterBluebirdRequire() {\r\n\tyield getMeSomePromise();\r\n});\r\nlet beforeBluebirdRequire = exports.beforeBluebirdRequire = (function () {\r\n\tvar ref = (0, _bluebird.coroutine)(function* beforeBluebirdRequire() {\r\n\t\tyield getMeSomePromise();\r\n\t});\r\n\r\n\treturn function beforeBluebirdRequire() {\r\n\t\treturn ref.apply(this, arguments);\r\n\t};\r\n})();\r\n```\r\n\r\n**Why won't we get this result when using the two plugins together, in the same transpile operation?**", "title": "`transform-async-to-module-method` entails needing `transform-es2015-modules-commonjs` but the two don't go well together", "type": "issue" }, { "action": "closed", "author": "loganfsmyth", "comment_id": null, "datetime": 1446654688000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "loganfsmyth", "comment_id": 153784021, "datetime": 1446654688000, "masked_author": "username_1", "text": "As you mentioned, this is a dup.", "title": null, "type": "comment" } ]
3
4
3,180
false
true
2,999
false
bazelbuild/bazel
bazelbuild
197,705,117
2,304
null
[ { "action": "opened", "author": "philbinj", "comment_id": null, "datetime": 1482846271000, "masked_author": "username_0", "text": "There seems to be an issue where globs produce different results when bazel is run from scratch (i.e. after a shutdown) vs when it is cached. Here is a python script that reproduces the issue:\r\n\r\n```\r\nimport os, random, shutil, sys \r\n\r\nNUM_FILES = 1000\r\nNEW_PATH_PROB = 0.5\r\nBASE = 'src/'\r\n\r\nrandom.seed(42)\r\n\r\n\r\ndef generate_path(prefix):\r\n p = random.random()\r\n suffix = str(chr(ord('a') + random.randint(0, 25)))\r\n if p < NEW_PATH_PROB:\r\n return generate_path(os.path.join(prefix, suffix + '/'))\r\n return os.path.join(prefix, suffix)\r\n\r\n\r\ndef generate_paths():\r\n paths = []\r\n for i in range(NUM_FILES):\r\n paths.append(generate_path(BASE))\r\n paths = list(set(paths))\r\n return paths\r\n\r\n\r\ndef create_source_and_header(path):\r\n try:\r\n os.makedirs(os.path.dirname(path))\r\n except OSError, e:\r\n pass\r\n ccsrc = path + '.cc'\r\n hsrc = path + '.h'\r\n funcname = path.replace('/', '_')\r\n with open(ccsrc, 'w') as f:\r\n print >>f, \"\"\"int %s() { return 1; }\\n\"\"\" % funcname\r\n with open(hsrc, 'w') as f:\r\n print >>f, \"\"\"int %s();\\n\"\"\" % funcname\r\n\r\n\r\ndef create_build():\r\n buildname = 'BUILD'\r\n with open(buildname, 'w') as f:\r\n print >>f, \"\"\"\r\ncc_library(\r\n name = \"src\",\r\n srcs = glob([\"src/**/*.cc\"]),\r\n hdrs = glob([\"src/**/*.h\"]),\r\n)\r\n\"\"\"\r\n\r\n\r\ndef create_workspace():\r\n with open('WORKSPACE', 'w') as f:\r\n pass\r\n\r\n\r\ndef main():\r\n # Reset and rebuild local tree.\r\n try:\r\n shutil.rmtree(BASE)\r\n except OSError, e:\r\n pass\r\n paths = generate_paths()\r\n for p in paths:\r\n create_source_and_header(p)\r\n create_build()\r\n create_workspace()\r\n\r\n # Now start doing rebuilds.\r\n idx = 0\r\n os.system('bazel clean')\r\n while True:\r\n print '==== RUN %05d ====' % idx\r\n os.system('bazel shutdown')\r\n 
os.system('bazel build -c opt :src')\r\n with open('BUILD', 'a') as f:\r\n print >>f, '\\n'\r\n os.system('bazel build -c opt :src')\r\n with open('BUILD', 'a') as f:\r\n print >>f, '\\n'\r\n idx += 1\r\n\r\n\r\nif __name__ == \"__main__\":\r\n main()\r\n```\r\n\r\nIf you create a fresh directory and run the above script from within it a random tree of c++ files will be created with a corresponding BUILD and WORKSPACE. The script then alters the BUILD file and interlaces building after a shutdown with building when the server is running. With every invocation of `bazel build` all of the target c++ files are re-compiled from scratch despite the fact that neither the source files or the BUILD target ever changes.\r\n\r\nIf you alter the above script to always run `bazel shutdown` before build then no rebuilds are seen other than the first one. If you change the script to never run `bazel shutdown` then no rebuilds are seen other than the first one. So this is clearly some interaction between cached globs and the bazel server.\r\n\r\nI believe this is high priority as it affects both correctness (bazel believes it needs to build when it shouldn't) and speed.", "title": "Unnecessary rebuilds observed when compiling targets with globs.", "type": "issue" }, { "action": "created", "author": "damienmg", "comment_id": 269336732, "datetime": 1482850802000, "masked_author": "username_1", "text": "Sorry I added -s where it was usefull and never saw any rebuild, though there was loading invalidation which is expected. 
\r\n\r\nReplacing the main loop with that give me 0 rebuild (-s print no action):\r\n\r\n```\r\n while idx < 10:\r\n print '==== RUN %05d ====' % idx\r\n os.system('bazel shutdown')\r\n os.system('bazel build -s -c opt :src' if idx > 0 else 'bazel build -c opt :src')\r\n with open('BUILD', 'a') as f:\r\n print >>f, '\\n'\r\n os.system('bazel build -s -c opt :src')\r\n with open('BUILD', 'a') as f:\r\n print >>f, '\\n'\r\n idx += 1\r\n```", "title": null, "type": "comment" }, { "action": "closed", "author": "damienmg", "comment_id": null, "datetime": 1482854085000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "damienmg", "comment_id": 269343923, "datetime": 1482854085000, "masked_author": "username_1", "text": "No idea but we have been working hard a few months ago to fix invalidation issue with globs, could be it.", "title": null, "type": "comment" }, { "action": "created", "author": "philbinj", "comment_id": 269357117, "datetime": 1482860117000, "masked_author": "username_0", "text": "Got it, thank you!", "title": null, "type": "comment" } ]
2
5
4,097
false
false
4,097
false
OdooCommunityWidgets/website_multi_image
OdooCommunityWidgets
53,874,399
14
null
[ { "action": "opened", "author": "treviser", "comment_id": null, "datetime": 1420814678000, "masked_author": "username_0", "text": "http://www.bloopark.de/portfolio/belsonno", "title": "Luke: have you seen what bloopark did?", "type": "issue" }, { "action": "created", "author": "lukebranch", "comment_id": 69349672, "datetime": 1420817664000, "masked_author": "username_1", "text": "@username_0 ,\r\n\r\nI have, they've done some very nice work with belsonno.de. I'll be looking at building a few of the features they've added to that website to integrate into my own projects soon.", "title": null, "type": "comment" } ]
2
2
234
false
false
234
true
walter-cd/walter
walter-cd
54,403,585
72
null
[ { "action": "opened", "author": "takahi-i", "comment_id": null, "datetime": 1421287254000, "masked_author": "username_0", "text": "This issue is a part of #69.\r\n\r\nWalter is going to support web hook by Github. When a commit is push to GitHub, GitHub submits the event to the Walter simple server. Then the Walter server executes the pipeline.\r\n\r\n**Note:**\r\n\r\n - need queue\r\n - need server", "title": "Web hook support", "type": "issue" }, { "action": "created", "author": "mizzy", "comment_id": 277600571, "datetime": 1486364555000, "masked_author": "username_1", "text": "I close this for cleaning issues.", "title": null, "type": "comment" }, { "action": "closed", "author": "mizzy", "comment_id": null, "datetime": 1486364555000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
3
298
false
false
298
false
18F/intake
18F
140,761,754
70
{ "number": 70, "repo": "intake", "user_login": "18F" }
[ { "action": "opened", "author": "melodykramer", "comment_id": null, "datetime": 1457980743000, "masked_author": "username_0", "text": "fixed some typos", "title": "Update index.md", "type": "issue" }, { "action": "created", "author": "melodykramer", "comment_id": 196463344, "datetime": 1457980749000, "masked_author": "username_0", "text": "@username_1", "title": null, "type": "comment" }, { "action": "created", "author": "wslack", "comment_id": 196566449, "datetime": 1457998207000, "masked_author": "username_1", "text": "Closing as this should be against pages-staging, but will correct there and send the PR to Mel to merge.", "title": null, "type": "comment" } ]
2
3
127
false
false
127
true
sphinx-doc/sphinx
sphinx-doc
202,711,524
3,366
null
[ { "action": "opened", "author": "zygoloid", "comment_id": null, "datetime": 1485226337000, "masked_author": "username_0", "text": "Subject: Support programs with options that differ only by case or punctuation\r\n\r\n### Problem\r\n- `clang`, an open-source C/C++/... compiler, supports lots of options, and in some cases the options differ only by punctuation (`-ObjC` and `-ObjC++` mean different things, as do `-I` and `-I-`); in other cases, options contain only punctuation (`/?`). Sphinx does not support documenting such options.\r\n\r\n#### Procedure to reproduce the problem\r\n```\r\n.. option:: -ObjC\r\n\r\nDescription of option ``-ObjC``.\r\n\r\n.. option:: -ObjC++\r\n\r\nDescription of option ``-ObjC++``.\r\n```\r\n\r\n#### Error logs / results\r\n```\r\nClangCommandLineReference.rst:9: SEVERE: Duplicate ID: \"cmdoption-clang-ObjC\".\r\n```\r\n\r\n#### Expected results\r\nProduce distinct IDs for distinct flags. Do not assume that punctuation can be stripped. Support options containing only punctuation. Either that, or provide a mechanism to override the ID generated for a particular option name.\r\n\r\n### Reproducible project / your project\r\n- https://github.com/llvm-mirror/clang/tree/master/docs/ClangCommandLineReference.rst (not submitted yet, will be there shortly)\r\n\r\n### Environment info\r\n- Python version: 2.7.6\r\n- Sphinx version: 1.2.2", "title": "Support programs with options that differ only by case or punctuation", "type": "issue" }, { "action": "created", "author": "tk0miya", "comment_id": 275302616, "datetime": 1485403000000, "masked_author": "username_1", "text": "In my short investigation:\r\nThe option directive uses following regexp to parse given option:\r\n```\r\noption_desc_re = re.compile(r'((?:/|--|-|\\+)?[-\\.?@#_a-zA-Z0-9]+)(=?\\s*.*)')\r\n```\r\nAt present, it does not contain `+` sign. So `-ObjC++` is recognized as `-ObjC` option having `++` argument. 
Then it is a cause of the duplication warning.\r\nTo fix this, I would add `+` sign to the regexp. In my local, it works fine.\r\n\r\nBTW, now we can use only `-.?@#_` signs for option name. This means the following characters could not be used:\r\n`!\"#$%&()*+,/:;<>[\\]^{|}~`, ` (back quote) and any non ASCII characters.\r\nIs this intended?\r\nI feel we should reconsider these definition.\r\n\r\n@username_2 any ideas?", "title": null, "type": "comment" }, { "action": "created", "author": "tk0miya", "comment_id": 275303089, "datetime": 1485403360000, "masked_author": "username_1", "text": "Side note:\r\nI don't list up `=` character to the unsupported signs for option name with intent.\r\nIt is used for a delimiter of option name and argument. So it means Sphinx cannot use `=` sign for an option name. I think it is a restriction of Sphinx.", "title": null, "type": "comment" }, { "action": "created", "author": "shimizukawa", "comment_id": 277527181, "datetime": 1486308928000, "masked_author": "username_2", "text": "IMO, it is nice if every `string.punctuation` characters were accepted exclude `=` char.", "title": null, "type": "comment" }, { "action": "closed", "author": "tk0miya", "comment_id": null, "datetime": 1487091710000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "tk0miya", "comment_id": 279768737, "datetime": 1487091786000, "masked_author": "username_1", "text": "Fixed at b9bba67 and baa065f.\r\nOnly `+` sign is allowed in stable branch to keep compatibility. It will be released as 1.5.3.\r\nAnd all punctuations are allowed in master branch.\r\n\r\nThank you for reporting.", "title": null, "type": "comment" } ]
3
6
2,430
false
false
2,430
true
ARMmbed/yotta
ARMmbed
148,949,650
767
null
[ { "action": "opened", "author": "qymab", "comment_id": null, "datetime": 1460898270000, "masked_author": "username_0", "text": "using latest yotta.app for OSX running on Capitan.\r\n```\r\nTraceback (most recent call last):\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/bin/yotta\", line 4, in <module>\r\n yotta.main()\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/lib/python2.7/site-packages/yotta/main.py\", line 239, in main\r\n status = args.command(args, following_args)\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/lib/python2.7/site-packages/yotta/search.py\", line 53, in execCommand\r\n for result in registry_access.search(query=args.query, keywords=args.kw, registry=args.registry):\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/lib/python2.7/site-packages/yotta/lib/registry_access.py\", line 679, in search\r\n response = requests.get(url, headers=headers, params=params)\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/lib/python2.7/site-packages/requests/api.py\", line 69, in get\r\n return request('get', url, params=params, **kwargs)\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/lib/python2.7/site-packages/requests/api.py\", line 50, in request\r\n response = session.request(method=method, url=url, **kwargs)\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/lib/python2.7/site-packages/requests/sessions.py\", line 471, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/lib/python2.7/site-packages/requests/sessions.py\", line 579, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/Applications/yotta.app/Contents/Resources/workspace/lib/python2.7/site-packages/requests/adapters.py\", line 430, in send\r\n raise SSLError(e, request=request)\r\nrequests.exceptions.SSLError: (\"bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify 
failed')],)\",)\r\n```", "title": "yotta search --limit 1000 target error", "type": "issue" }, { "action": "created", "author": "autopulated", "comment_id": 211275051, "datetime": 1460969156000, "masked_author": "username_1", "text": "Please make sure you have the latest certifi module installed: `pip install -U certifi`", "title": null, "type": "comment" }, { "action": "created", "author": "qymab", "comment_id": 211278513, "datetime": 1460969632000, "masked_author": "username_0", "text": "using `pip install -U certifi` I've got this \r\n`Collecting certifi\r\n Using cached certifi-2016.2.28-py2.py3-none-any.whl\r\nInstalling collected packages: certifi\r\nSuccessfully installed certifi-2016.2.28\r\nYou are using pip version 7.1.2, however version 8.1.1 is available.\r\nYou should consider upgrading via the 'pip install --upgrade pip' command.`\r\n\r\nand then I'm trying with `pip install --upgrade pip' command`\r\nIt's working", "title": null, "type": "comment" }, { "action": "closed", "author": "qymab", "comment_id": null, "datetime": 1460969643000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "autopulated", "comment_id": 211279440, "datetime": 1460969850000, "masked_author": "username_1", "text": "👍\r\n\r\nI think this could have happened if you already had an old version of certifi installed before installing yotta with `pip install yotta`, possibly we should suggest installing with `pip install -U yotta`", "title": null, "type": "comment" } ]
2
5
2,580
false
false
2,580
false
yahoo/elide
yahoo
198,750,216
367
{ "number": 367, "repo": "elide", "user_login": "yahoo" }
[ { "action": "opened", "author": "DeathByTape", "comment_id": null, "datetime": 1483547464000, "masked_author": "username_0", "text": "", "title": "Add RequestScope to AuditLogger interface.", "type": "issue" }, { "action": "created", "author": "yahoocla", "comment_id": 270415878, "datetime": 1483547467000, "masked_author": "username_1", "text": "CLA is valid!", "title": null, "type": "comment" }, { "action": "created", "author": "clayreimann", "comment_id": 270424362, "datetime": 1483549283000, "masked_author": "username_2", "text": "👍", "title": null, "type": "comment" } ]
3
3
14
false
false
14
false
fordnox/Zend_Auth_Adapter_Facebook
null
24,940,526
9
null
[ { "action": "opened", "author": "vijay2579", "comment_id": null, "datetime": 1388581864000, "masked_author": "username_0", "text": "{\r\n \"error\": {\r\n \"message\": \"Invalid redirect_uri: Given URL is not allowed by the Application configuration.\",\r\n \"type\": \"OAuthException\",\r\n \"code\": 191\r\n }\r\n}", "title": "Error on implementing Zend Framework 1.11", "type": "issue" }, { "action": "closed", "author": "fordnox", "comment_id": null, "datetime": 1431603737000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
2
179
false
false
179
false
tensorflow/tensorflow
tensorflow
188,917,286
5,564
null
[ { "action": "opened", "author": "lordna", "comment_id": null, "datetime": 1478964468000, "masked_author": "username_0", "text": "Hello Team,\r\n\r\nYesterday evening I completed your tutorial (https://codelabs.developers.google.com/codelabs/tensorflow-for-poets/#5) to take my first steps with tensorflow. During the process I had to complete the code as in label_image.py there is no import sys line.\r\n\r\nCheers!", "title": "tutorial - code", "type": "issue" }, { "action": "closed", "author": "wolffg", "comment_id": null, "datetime": 1482359580000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "wolffg", "comment_id": 268661110, "datetime": 1482359581000, "masked_author": "username_1", "text": "Fixed! Thanks for pointing that out. It was correct in the gist, but not written in the code.", "title": null, "type": "comment" } ]
2
3
374
false
false
374
false
paypal/PayPal-node-SDK
paypal
68,988,984
84
null
[ { "action": "opened", "author": "h2non", "comment_id": null, "datetime": 1429211008000, "masked_author": "username_0", "text": "I'm creating a payout and with only one destinatary and the amount is sent three times and I don't know why. This is so estrange.\r\n\r\nI'm just using the same code as in the examples:\r\n```js\r\nvar params = {\r\n \"sender_batch_header\": {\r\n \"sender_batch_id\": \"4f4ezsemi\",\r\n \"email_subject\": \"New payment from Guidecentral\"\r\n },\r\n \"items\": [\r\n {\r\n \"recipient_type\": \"EMAIL\",\r\n \"receiver\": \"test@email.com\",\r\n \"note\": \"Thank you for\",\r\n \"sender_item_id\": \"Test title\",\r\n \"amount\": {\r\n \"value\": 0.1,\r\n \"currency\": \"EUR\"\r\n }\r\n }\r\n ]\r\n}\r\n```\r\n\r\nAnd then I call:\r\n```js\r\npaypal.payout.create(params, 'true', callback)\r\n```\r\n\r\nThis is generating duplicated payments. I can confirm that the payment was recieved multiple times by the destinatary. This is happening in production `live` mode", "title": "Payouts is sending duplicated payments", "type": "issue" }, { "action": "created", "author": "avidas", "comment_id": 94857009, "datetime": 1429632874000, "masked_author": "username_1", "text": "Hey @username_0 , do you have the [debug_id](https://github.com/paypal/PayPal-node-SDK#debugging) for a case where the payment was received multiple times? 
Would really help us to get to root cause.", "title": null, "type": "comment" }, { "action": "created", "author": "h2non", "comment_id": 94864716, "datetime": 1429634324000, "masked_author": "username_0", "text": "I got two `debug-id` when running the script:\r\n```\r\npaypal-debug-id: 3b682f067c319\r\npaypal-debug-id: 7af7a818b7277\r\n```\r\n\r\nAs I said, it happen when running as `live` mode", "title": null, "type": "comment" }, { "action": "created", "author": "avidas", "comment_id": 94889685, "datetime": 1429639043000, "masked_author": "username_1", "text": "Hey from the logs it looks like there is logic from your app that are retrying failed payouts. These payouts have failed due to insufficient funds on the merchant account and the recipients should not have received any money. If you have reasons to believe that what we are seeing is incorrect, please reach out to https://ppmts.custhelp.com for accounts related issues.", "title": null, "type": "comment" }, { "action": "created", "author": "h2non", "comment_id": 95650375, "datetime": 1429807512000, "masked_author": "username_0", "text": "Despite the last debug, it was due to missing funds. \r\nNow I tried it again with funds, and the transaction was success, but the amount was sent 4 times.\r\n\r\nCuriously, it generates only two `debug ids`:\r\n```\r\npaypal-debug-id: 4ba5be5ced36a\r\npaypal-debug-id: de7f22efee0d3\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "avidas", "comment_id": 95667111, "datetime": 1429810900000, "masked_author": "username_1", "text": "We are seeing four individual calls for payouts from our end and these are the debug ids for each are 7876c5ba58346, 55da39d1e0d65, c072fa30e002f, de7f22efee0d3. Are you just running the script separately or inside your app? 
Are you able to reproduce the issue with just individual CURL request to create a payout?", "title": null, "type": "comment" }, { "action": "created", "author": "h2non", "comment_id": 95668416, "datetime": 1429811345000, "masked_author": "username_0", "text": "I'm running it in a dead simple one-file script, based on your payouts example, like:\r\n```\r\nnode pay.js\r\n```\r\n\r\nTo be more specific, here's the script I'm actually running:\r\n```js\r\nvar paypal = require('paypal-rest-sdk')\r\n\r\npaypal.configure({\r\n 'mode': 'live',\r\n 'client_id': 'super-client-id',\r\n 'client_secret': 'super-client-s3cret!' \r\n})\r\n\r\nvar payout = function (items, callback) {\r\n var batchId = Math.random().toString(36).substring(9)\r\n\r\n var items = items.map(function (item) {\r\n var amount = +(+item.amount || 0).toFixed(2)\r\n\r\n return {\r\n 'recipient_type': 'EMAIL',\r\n 'receiver': item.email,\r\n 'note': 'Test note.',\r\n 'sender_item_id': 'Test sender',\r\n 'amount': {\r\n 'value': amount,\r\n 'currency': 'EUR'\r\n }\r\n }\r\n })\r\n .filter(function (payment) {\r\n return payment != null\r\n })\r\n\r\n var payoutParams = {\r\n 'sender_batch_header': {\r\n 'sender_batch_id': batchId,\r\n 'email_subject': 'New test payment'\r\n },\r\n items: items\r\n }\r\n\r\n paypal.payout.create(payoutParams, 'true', callback)\r\n}\r\n\r\n// test it!\r\npayout([{\r\n email: 'me@mail.com',\r\n amount: 0.05\r\n}], function (err, res) {\r\n console.log(JSON.stringify(res, null, 2))\r\n})\r\n```", "title": null, "type": "comment" }, { "action": "created", "author": "h2non", "comment_id": 95668932, "datetime": 1429811486000, "masked_author": "username_0", "text": "I've just noticed the error with my script. The `items` variable is reused.", "title": null, "type": "comment" }, { "action": "created", "author": "h2non", "comment_id": 95671864, "datetime": 1429812022000, "masked_author": "username_0", "text": "The problem persist. It continues sending duplicated payments. 
Here're the new debug ids:\r\n```\r\npaypal-debug-id: f185bc5ee3c1b\r\npaypal-debug-id: 3706cc464bedd\r\n```\r\n\r\nThe `items` array has only length 1, so I don't understand why this happen. The **payment the made 4 times** as well", "title": null, "type": "comment" }, { "action": "created", "author": "avidas", "comment_id": 95680956, "datetime": 1429814617000, "masked_author": "username_1", "text": "Sorry to be asking for more tests, but this is different from the behavior you notice with curl requests? https://developer.paypal.com/docs/api/#create-a-batch-or-single-payout", "title": null, "type": "comment" }, { "action": "created", "author": "h2non", "comment_id": 95723390, "datetime": 1429824423000, "masked_author": "username_0", "text": "I made some debugging in the library, and there're only two outgoing requests: the first one for the oauth2 autorization, and the second one is a `POST` request to `/v1/payments/payouts`.\r\n\r\nSo the problem is not (probably) on the client side, which is thing that concerns me. \r\nCould you scale this issue to dig into it in more detail? I can provide specific identifiers about the payments done.", "title": null, "type": "comment" }, { "action": "created", "author": "h2non", "comment_id": 101782463, "datetime": 1431544958000, "masked_author": "username_0", "text": "Any news about this?", "title": null, "type": "comment" }, { "action": "created", "author": "avidas", "comment_id": 103923191, "datetime": 1432135086000, "masked_author": "username_1", "text": "We thought it was likely that the issue was client side, perhaps some sort of retry logic. The logic could be getting invoked if the response from paypal is delayed, and since it gets resend with a different batch id, (its generated by using random so it would be different with each retry), on the server side it seems like separate payout requests are being made and hence multiple payouts are getting made. 
This is one possible scenerio of course, but seems to be a likely one as an issue like this server side would be felt by many more clients.", "title": null, "type": "comment" }, { "action": "created", "author": "h2non", "comment_id": 103950117, "datetime": 1432139528000, "masked_author": "username_0", "text": "Thanks for the reply. That sounds a bit estrange. I don't have a clear idea about the retry logic, just because the status code is 400, instead of 500-600, so there isn't needed any kind of retry.\r\n\r\nSo... what's the solution? Any specific idea to dig into it in detail?", "title": null, "type": "comment" }, { "action": "created", "author": "avidas", "comment_id": 115393005, "datetime": 1435265408000, "masked_author": "username_1", "text": "Unfortunately, this is the best we can offer in debugging this, since multiple debug ids are getting received, there are likely multiple requests being made somehow, likely via retries.", "title": null, "type": "comment" }, { "action": "closed", "author": "avidas", "comment_id": null, "datetime": 1435265408000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
16
5,340
false
false
5,340
true
dotnet/orleans
dotnet
110,305,290
883
{ "number": 883, "repo": "orleans", "user_login": "dotnet" }
[ { "action": "opened", "author": "butlertj", "comment_id": null, "datetime": 1444248207000, "masked_author": "username_0", "text": "Added Mesh Systems to the list of companies using Orleans.", "title": "Update Who-Is-Using-Orleans.md", "type": "issue" }, { "action": "created", "author": "gabikliot", "comment_id": 146313654, "datetime": 1444248461000, "masked_author": "username_1", "text": "I am so glad to merge that! :+1:", "title": null, "type": "comment" }, { "action": "created", "author": "sergeybykov", "comment_id": 146315821, "datetime": 1444249035000, "masked_author": "username_2", "text": "Great! No mousetraps? ;-)", "title": null, "type": "comment" } ]
4
4
460
false
true
115
false
kennethreitz/requests
null
124,580,502
2,957
null
[ { "action": "opened", "author": "zhangi", "comment_id": null, "datetime": 1451721634000, "masked_author": "username_0", "text": "Not sure if this is a bug from urllib3 or requests, the problem is that there is an extra delay introduced for all subrequests other than the first one when reusing a session object.\r\nThe test are performed in Ubuntu 14 to a http server running on the localhost\r\n```python\r\nwith requests.Session() as session:\r\n session.get(url) # this takes <1ms\r\n session.get(url) # this takes around 40ms\r\n session.get(url) # this takes around 40ms\r\n```\r\nSimilar case was observed in httplib2 with Python 2.6(https://code.google.com/p/httplib2/issues/detail?id=91), and was fixed in Python 2.7.", "title": "Persistent Connection suffer from delay of Nagle", "type": "issue" }, { "action": "created", "author": "Lukasa", "comment_id": 168376260, "datetime": 1451727718000, "masked_author": "username_1", "text": "", "title": null, "type": "comment" }, { "action": "created", "author": "zhangi", "comment_id": 168379449, "datetime": 1451731598000, "masked_author": "username_0", "text": "version: 2.9.1\r\ninstalled by: sudo pip install requests --upgrade", "title": null, "type": "comment" }, { "action": "created", "author": "Lukasa", "comment_id": 168379561, "datetime": 1451731874000, "masked_author": "username_1", "text": "So [we disable Nagle's algorithm by default](https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connection.py#L97), which means that your diagnosis of this cannot be right.\r\n\r\nAre you familiar with wireshark or tcpdump?", "title": null, "type": "comment" }, { "action": "created", "author": "zhangi", "comment_id": 168379902, "datetime": 1451732008000, "masked_author": "username_0", "text": "Let me investigate more, maybe it has something to do with server side. 
I will close it first.", "title": null, "type": "comment" }, { "action": "closed", "author": "zhangi", "comment_id": null, "datetime": 1451732008000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "Lukasa", "comment_id": 168379946, "datetime": 1451732097000, "masked_author": "username_1", "text": "To be clear, if the server-side has delayed ACKs turned on it may cause this problem.", "title": null, "type": "comment" } ]
2
7
1,080
false
false
1,080
false
nobitagit/material-floating-button
null
56,593,699
2
null
[ { "action": "opened", "author": "amchang", "comment_id": null, "datetime": 1423086958000, "masked_author": "username_0", "text": "", "title": "Publish to NPM", "type": "issue" }, { "action": "created", "author": "nobitagit", "comment_id": 82398736, "datetime": 1426604977000, "masked_author": "username_1", "text": "Available on npm now.", "title": null, "type": "comment" }, { "action": "closed", "author": "nobitagit", "comment_id": null, "datetime": 1426604977000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
3
21
false
false
21
false
greenplum-db/gpdb
greenplum-db
209,629,148
1,834
{ "number": 1834, "repo": "gpdb", "user_login": "greenplum-db" }
[ { "action": "opened", "author": "tushar-dadlani", "comment_id": null, "datetime": 1487811037000, "masked_author": "username_0", "text": "- This will ensure that PRs will not be skipped\r\n- We also moved the `trigger` from bin_gpdb to gpdb_pr", "title": "Use `version: every` for the PR pipeline jobs", "type": "issue" }, { "action": "created", "author": "pivotal-issuemaster", "comment_id": 281854490, "datetime": 1487811038000, "masked_author": "username_1", "text": "@username_0 Please sign the [Contributor License Agreement](https://cla.pivotal.io/sign/greenplum?repositoryId=greenplum-db/gpdb&pullRequestId=1834)!\n\n[Click here](https://cla.pivotal.io/sync/greenplum?repositoryId=greenplum-db/gpdb&pullRequestId=1834) to manually synchronize the status of this Pull Request.\n\nSee the [FAQ](https://cla.pivotal.io/about) for frequently asked questions.", "title": null, "type": "comment" }, { "action": "created", "author": "pivotal-issuemaster", "comment_id": 281854883, "datetime": 1487811170000, "masked_author": "username_1", "text": "@username_0 Thank you for signing the [Contributor License Agreement](https://cla.pivotal.io/sign/greenplum?repositoryId=greenplum-db/gpdb&pullRequestId=1834)!", "title": null, "type": "comment" }, { "action": "created", "author": "ashwinstar", "comment_id": 281855768, "datetime": 1487811487000, "masked_author": "username_2", "text": "Please state why this is necessary ? Like I am still trying to understand why `trigger: true` needs to be moved up.", "title": null, "type": "comment" }, { "action": "created", "author": "tushar-dadlani", "comment_id": 281865850, "datetime": 1487814698000, "masked_author": "username_0", "text": "@username_2 we are closing the PR as this doesn't fix the issue we hoped it would. We are still being impacted by an [open issue in concourse](https://github.com/concourse/concourse/issues/736)", "title": null, "type": "comment" } ]
3
5
964
false
false
964
true
google/auto
google
225,315,472
486
null
[ { "action": "opened", "author": "MFlisar", "comment_id": null, "datetime": 1493543906000, "masked_author": "username_0", "text": "When using `Memoized` and builders in common, there does not seem a way to copy the memoized data from one object to another.\r\n\r\nExample:\r\n\r\n Data d = Data.create();\r\n d.getHeavyData(); // this field is annotated with @Memoized\r\n Data copy = d.toBuilder().build();\r\n copy.getHeavyData(); // this will execute the heavy calculation again\r\n\r\nTherefore I would need an (optional at best) possibility to copy the memoized data to the new object. Can this be done? If yes, how? If no, I think it would make sense to add this.", "title": "Memoized - copy data to new object", "type": "issue" }, { "action": "created", "author": "netdpb", "comment_id": 298341421, "datetime": 1493648420000, "masked_author": "username_1", "text": "Hmm. `@Memoized` is intended for properties that are derived from other immutable properties on the same object. If you were able to copy that property onto a new object, that would invalidate that invariant, wouldn't it? 
What's to stop you from copying the wrong value?\r\n\r\nIf your `@Memoized` property is derived from only a subset of the properties of your object, maybe you could consider extracting out the `@Memoized` property and those it depends on into a new `@AutoValue` object, which itself could become a property:\r\n\r\nInstead of:\r\n\r\n```\r\n@AutoValue abstract class Foo {\r\n abstract A a();\r\n abstract B b();\r\n abstract C c();\r\n @Memoized D d() {\r\n return someFunctionOf(a(), b());\r\n }\r\n}\r\n```\r\n\r\nyou could do:\r\n\r\n```\r\n@AutoValue abstract class Foo {\r\n abstract Bar bar();\r\n abstract C c();\r\n}\r\n\r\n@AutoValue abstract class Bar {\r\n abstract A a();\r\n abstract B b();\r\n @Memoized D d() {\r\n return someFunctionOf(a(), b());\r\n }\r\n}\r\n```\r\n\r\nOf course, you could provide convenience methods to get `A`, `B`, and `D` directly from an instance of `Foo` if that helps.", "title": null, "type": "comment" }, { "action": "created", "author": "MFlisar", "comment_id": 298405448, "datetime": 1493665780000, "masked_author": "username_0", "text": "That looks fine, this way I can hand on the Bar object to a new Foo object.\r\n\r\nAnd I see the problem now with exposing a setter for a builder, this could lead to inconsistent states in the new object...", "title": null, "type": "comment" }, { "action": "created", "author": "netdpb", "comment_id": 298412275, "datetime": 1493667556000, "masked_author": "username_1", "text": "Right. I'm going to close this issue for now. Feel free to reopen if you have more thoughts on it.", "title": null, "type": "comment" }, { "action": "closed", "author": "netdpb", "comment_id": null, "datetime": 1493667557000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
5
1,915
false
false
1,915
false
SwiftWeekly/swiftweekly.github.io
SwiftWeekly
189,840,164
135
{ "number": 135, "repo": "swiftweekly.github.io", "user_login": "SwiftWeekly" }
[ { "action": "opened", "author": "BasThomas", "comment_id": null, "datetime": 1479329397000, "masked_author": "username_0", "text": "", "title": "[47] Add interesting-but-out-of-scope proposals", "type": "issue" } ]
2
2
1,942
false
true
0
false
dpayne/cli-visualizer
null
151,825,240
22
{ "number": 22, "repo": "cli-visualizer", "user_login": "dpayne" }
[ { "action": "opened", "author": "DuckThom", "comment_id": null, "datetime": 1461921607000, "masked_author": "username_0", "text": "I noticed `doctoc` also added links to the sub-headers in the Configuration block.\r\nIf these should not be there, please let me know and I will remove them.\r\nI mainly updated the ToC to reflect the recent changes in the Setup block.", "title": "Update table of contents in readme", "type": "issue" }, { "action": "created", "author": "dpayne", "comment_id": 215840591, "datetime": 1461954701000, "masked_author": "username_1", "text": "Sure, looks good", "title": null, "type": "comment" } ]
2
2
248
false
false
248
false
xamarin/Xamarin.Auth
xamarin
216,005,238
159
null
[ { "action": "opened", "author": "bpBily", "comment_id": null, "datetime": 1490176187000, "masked_author": "username_0", "text": "hi,\r\nOn OAuth1 i don't have token before beginning registered. And when i used OAuth1.getsignature without token secret the method return me a empty signature.\r\nCan you help me?\r\nthx", "title": "OAuth 1.0a (One Legge)", "type": "issue" }, { "action": "created", "author": "moljac", "comment_id": 289711894, "datetime": 1490692659000, "masked_author": "username_1", "text": "Hi @username_0 \r\n\r\nCan I get back to you in few days (as soon as NativeUI support is published - SFSafariViewController and Custom Tabs).", "title": null, "type": "comment" }, { "action": "created", "author": "moljac", "comment_id": 304081023, "datetime": 1495735662000, "masked_author": "username_1", "text": "@username_0 \r\nWhich OAuth service provider?\r\nI need for testing.", "title": null, "type": "comment" }, { "action": "closed", "author": "newky2k", "comment_id": null, "datetime": 1535662736000, "masked_author": "username_2", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "newky2k", "comment_id": 417465103, "datetime": 1535662736000, "masked_author": "username_2", "text": "No response from over a year", "title": null, "type": "comment" } ]
3
5
403
false
false
403
true
angular-ui/ui-router
angular-ui
200,393,044
3,260
null
[ { "action": "opened", "author": "AntiAsko", "comment_id": null, "datetime": 1484234115000, "masked_author": "username_0", "text": "I'm creating a state in Typescript like this: \r\n\r\n```\r\n state: {\r\n name: 'mainState.import',\r\n url: '/import',\r\n reloadOnSearch: false,\r\n onEnter: (['$uibModal', (uibModal: ng.ui.bootstrap.IModalService) => {\r\n let modalOptions: any = {\r\n headerText: 'Import State goes here'\r\n };\r\n ...\r\n uibModal.open(modalConfig);\r\n\r\n }])\r\n}\r\n```\r\n\r\nWhile it works fine I get a typescript compilation error: \r\n```[0] ERROR in ./webapp/pages.ts\r\n[0] (14,14): error TS2322: Type '({ name: string; icon: string; keyboardShortcut: string; state: { name: string; component: string...' is not assignable to type 'ISidebarPage[]'.\r\n[0] Type '{ name: string; icon: string; keyboardShortcut: string; state: { name: string; component: string;...' is not assignable to type 'ISidebarPage'.\r\n[0] Type '{ name: string; icon: string; accessibleForRoles: string[]; state: { name: string; url: string; r...' is not assignable to type 'ISidebarPage'.\r\n[0] Types of property 'state' are incompatible.\r\n[0] Type '{ name: string; url: string; reloadOnSearch: boolean; onEnter: (string | ((uibModal: IModalServic...' is not assignable to type 'StateDeclaration & { component?: string; template?: string; }'.\r\n[0] Type '{ name: string; url: string; reloadOnSearch: boolean; onEnter: (string | ((uibModal: IModalServic...' 
is not assignable to type 'StateDeclaration'.\r\n[0] Types of property 'onEnter' are incompatible.\r\n[0] Type '(string | ((uibModal: IModalService) => void))[]' is not assignable to type 'TransitionStateHookFn'.\r\n[0] Type '(string | ((uibModal: IModalService) => void))[]' provides no match for the signature '(transition: Transition, state: State): boolean | TargetState | void | Promise<boolean | TargetState | void>'\r\n```\r\n\r\nIs this some sort of bug with the typings?", "title": "Typings error when using onEnter in state declaration", "type": "issue" }, { "action": "created", "author": "christopherthielen", "comment_id": 272191509, "datetime": 1484234648000, "masked_author": "username_1", "text": "that should be fixed by #3076\r\nWhat version of angular-ui-router are you using?", "title": null, "type": "comment" }, { "action": "created", "author": "AntiAsko", "comment_id": 272201799, "datetime": 1484236735000, "masked_author": "username_0", "text": "I'm using 1.0.0-rc.1 which should be the latest right? And I still see the error", "title": null, "type": "comment" }, { "action": "created", "author": "christopherthielen", "comment_id": 272210968, "datetime": 1484238636000, "masked_author": "username_1", "text": "Yep rc.1 is latest. I'll take a look and see if there's still a problem", "title": null, "type": "comment" }, { "action": "created", "author": "dartzki", "comment_id": 283603306, "datetime": 1488447238000, "masked_author": "username_2", "text": "#3076 did not add typings for inline-annotated functions - those are still missing", "title": null, "type": "comment" }, { "action": "closed", "author": "christopherthielen", "comment_id": null, "datetime": 1489453818000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
3
6
2,262
false
false
2,262
false
jgallagher/rusqlite
null
98,549,037
57
null
[ { "action": "opened", "author": "davidpbrown", "comment_id": null, "datetime": 1438447601000, "masked_author": "username_0", "text": "I'm getting this error below and for searching cannot see a way through it.\r\nsqlite3 is installed and working fine.\r\n\r\nI cannot see sqlite3.pc or similar and I'm not familiar with those .pc files then to know an obvious answer.\r\n\r\nCurious that majkcramer commented in May on the other issue that he's been successful in Linux.. I tried building only this from git clone too but same error.\r\n\r\nI'm using Cargo build; unsure what the rustc equivalent to Cargo.toml is to try rustc but I would expect the same error perhaps will follow from that too.\r\n```\r\n Fresh libc v0.1.8\r\n Fresh winapi v0.2.1\r\n Fresh pkg-config v0.3.5\r\n Compiling libsqlite3-sys v0.2.0 (file:///home/username_0/rust/firefox/rusqlite)\r\n Running `/home/username_0/rust/firefox/rusqlite/target/debug/build/libsqlite3-sys-863cdbd64abd8e99/build-script-build`\r\n Fresh bitflags v0.1.1\r\n Fresh winapi-build v0.1.1\r\n Fresh advapi32-sys v0.1.2\r\n Fresh rand v0.3.9\r\n Fresh tempdir v0.3.4\r\n Fresh kernel32-sys v0.1.3\r\n Fresh time v0.1.32\r\nfailed to run custom build command for `libsqlite3-sys v0.2.0 (file:///home/username_0/rust/firefox/rusqlite)`\r\nProcess didn't exit successfully: `/home/username_0/rust/firefox/rusqlite/target/debug/build/libsqlite3-sys-863cdbd64abd8e99/build-script-build` (exit code: 101)\r\n--- stderr\r\nthread '<main>' panicked at 'called `Result::unwrap()` on an `Err` value: \"`\\\"pkg-config\\\" \\\"--libs\\\" \\\"--cflags\\\" \\\"sqlite3\\\"` did not exit successfully: exit code: 1\\n--- stderr\\nPackage sqlite3 was not found in the pkg-config search path.\\nPerhaps you should add the directory containing `sqlite3.pc\\'\\nto the PKG_CONFIG_PATH environment variable\\nNo package \\'sqlite3\\' found\\n\"', ../src/libcore/result.rs:732\r\n```", "title": "Error building Linux Mint - cannot find sqlite3", "type": "issue" }, { 
"action": "created", "author": "jgallagher", "comment_id": 126948060, "datetime": 1438457620000, "masked_author": "username_1", "text": "`sqlite3.pc` is the configuration file that should be installed so you can use `pkg-config` to figure out where the headers / library files are located. I haven't used Linux Mint, but I know on some systems, you have to install a `-dev` version of the sqlite package to get the headers and `.pc` file (e.g., [libsqlite3-dev](https://packages.debian.org/squeeze/amd64/libsqlite3-dev/filelist) on Debian). Does it look like there's an equivalent on Mint?", "title": null, "type": "comment" }, { "action": "created", "author": "davidpbrown", "comment_id": 126955342, "datetime": 1438464856000, "masked_author": "username_0", "text": "Yes.. it needed libsqlite3-dev .. thanks!", "title": null, "type": "comment" }, { "action": "closed", "author": "davidpbrown", "comment_id": null, "datetime": 1438464856000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "doabit", "comment_id": 146754480, "datetime": 1444365868000, "masked_author": "username_2", "text": "@username_1 I have the same problem on osx el.\r\n\r\n```\r\n Compiling libsqlite3-sys v0.2.0\r\nfailed to run custom build command for `libsqlite3-sys v0.2.0`\r\nProcess didn't exit successfully: `/Users/username_2/Codes/rust/projects/TencentOAuth/target/debug/build/libsqlite3-sys-2fe684602508b1c3/build-script-build` (exit code: 101)\r\n--- stderr\r\nthread '<main>' panicked at 'called `Result::unwrap()` on an `Err` value: \"`\\\"pkg-config\\\" \\\"--libs\\\" \\\"--cflags\\\" \\\"sqlite3\\\"` did not exit successfully: exit code: 1\\n--- stderr\\nPackage sqlite3 was not found in the pkg-config search path.\\nPerhaps you should add the directory containing `sqlite3.pc\\'\\nto the PKG_CONFIG_PATH environment variable\\nNo package \\'sqlite3\\' found\\n\"', ../src/libcore/result.rs:736\r\n```", "title": null, "type": "comment" }, { 
"action": "created", "author": "doabit", "comment_id": 146754873, "datetime": 1444366235000, "masked_author": "username_2", "text": "Ok, use ` PKG_CONFIG_PATH=`echo /usr/local/Cellar/sqlite/*/lib/pkgconfig/ cargo build`` can build success.", "title": null, "type": "comment" }, { "action": "created", "author": "Dashed", "comment_id": 151661352, "datetime": 1445984090000, "masked_author": "username_3", "text": "@username_2 thanks! `PKG_CONFIG_PATH=$(echo /usr/local/Cellar/sqlite/3.9.1/lib/pkgconfig/) cargo run` works for me", "title": null, "type": "comment" } ]
4
7
3,237
false
false
3,237
true
jawee/language-blade
null
158,039,145
45
{ "number": 45, "repo": "language-blade", "user_login": "jawee" }
[ { "action": "opened", "author": "nawatts", "comment_id": null, "datetime": 1464829353000, "masked_author": "username_0", "text": "Deactivate is called when the Atom window closes. So currently, if there are multiple Atom windows open, closing one overrides the Blade comment configuration and all other open windows start using HTML comments even though \"Use Blade Comments\" is still checked in the package settings UI.", "title": "Don't remove comment configuration on deactivate.", "type": "issue" }, { "action": "created", "author": "Ingramz", "comment_id": 223196708, "datetime": 1464844123000, "masked_author": "username_1", "text": "Thank you for the pull request.\r\n\r\nI noticed the same issue and wasn't quite sure where it came from.", "title": null, "type": "comment" } ]
2
2
390
false
false
390
false
IBM-Bluemix/phonebot
IBM-Bluemix
82,801,621
10
null
[ { "action": "opened", "author": "jeanfrancis", "comment_id": null, "datetime": 1432989132000, "masked_author": "username_0", "text": "Anyone ha tried it with Plivo ?", "title": "Plivo", "type": "issue" }, { "action": "created", "author": "jthomas", "comment_id": 107233851, "datetime": 1433098719000, "masked_author": "username_1", "text": "I have not. \r\nWould Plivo offer any advantages over Twilio?", "title": null, "type": "comment" }, { "action": "created", "author": "jeanfrancis", "comment_id": 108025179, "datetime": 1433266424000, "masked_author": "username_0", "text": "basically the same service but they have an open source code and they are a bit cheaper ,always better to have the possibility of being integrated with more than one supplier so users get the choice", "title": null, "type": "comment" }, { "action": "closed", "author": "jthomas", "comment_id": null, "datetime": 1433320038000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "jthomas", "comment_id": 108245680, "datetime": 1433320038000, "masked_author": "username_1", "text": "Thanks for letting me know. This is not going to be a priority for the project but I'd happily accept a pull request if you want to add support for this.", "title": null, "type": "comment" } ]
2
5
441
false
false
441
false
coderanger/kitchen-sync
null
173,726,746
25
null
[ { "action": "opened", "author": "pietervogelaar", "comment_id": null, "datetime": 1472460732000, "masked_author": "username_0", "text": "```", "title": "Could not load 'rsync' transport from load path ", "type": "issue" }, { "action": "created", "author": "jetaro1", "comment_id": 252327977, "datetime": 1475865258000, "masked_author": "username_1", "text": "thank you :)", "title": null, "type": "comment" }, { "action": "created", "author": "dragon788", "comment_id": 342228675, "datetime": 1509990443000, "masked_author": "username_2", "text": "Have you run `bundle install` before you run `ktichen list`?", "title": null, "type": "comment" }, { "action": "created", "author": "dgreeninger", "comment_id": 561744436, "datetime": 1575479951000, "masked_author": "username_3", "text": "Try running `chef gem install kitchen-sync`", "title": null, "type": "comment" } ]
4
4
118
false
false
118
false
peter-murach/tty
null
54,986,973
15
{ "number": 15, "repo": "tty", "user_login": "peter-murach" }
[ { "action": "opened", "author": "igas", "comment_id": null, "datetime": 1421825750000, "masked_author": "username_0", "text": "", "title": "Add ruby 2.2 for travis", "type": "issue" } ]
2
2
277
false
true
0
false
weprovide/valet-plus
weprovide
249,818,329
41
{ "number": 41, "repo": "valet-plus", "user_login": "weprovide" }
[ { "action": "opened", "author": "JKetelaar", "comment_id": null, "datetime": 1502551587000, "masked_author": "username_0", "text": "Otherwise this would simply do nothing.", "title": "[BUGFIX] Set quotation mark to correct place", "type": "issue" }, { "action": "created", "author": "JKetelaar", "comment_id": 321988122, "datetime": 1502551756000, "masked_author": "username_0", "text": "```\r\necho \"<?php echo 'Valet+ at your service' > index.php\"\r\n```\r\nSimply outputs `<?php echo 'Valet+ at your service' > index.php`.\r\n\r\n```\r\necho \"<?php echo 'Valet+ at your service'\" > index.php\r\n```\r\nCreates a file\r\n\r\nExample:\r\n```\r\njeroens-mbp:~ jeroen$ mkdir test && cd test\r\n\r\njeroens-mbp:test jeroen$ echo \"<?php echo 'Valet+ at your service' > index.php\"\r\n<?php echo 'Valet+ at your service' > index.php\r\n\r\njeroens-mbp:test jeroen$ ls -l\r\ntotal 0\r\n\r\njeroens-mbp:test jeroen$ echo \"<?php echo 'Valet+ at your service'\" > index.php\r\n\r\njeroens-mbp:test jeroen$ ls -l\r\ntotal 8\r\n-rw-r--r-- 1 jeroen staff 36 Aug 12 17:28 index.php\r\n```", "title": null, "type": "comment" } ]
1
2
678
false
false
678
false
nolimits4web/Swiper
null
16,644,305
255
null
[ { "action": "opened", "author": "szgeri", "comment_id": null, "datetime": 1373565310000, "masked_author": "username_0", "text": "Hi,\r\n\r\nFirst of all, thank you for Swiper, it's amazing, I love it!\r\nIt would be nice to have a public function that exposes the swipeToPosition function. Sometimes you find yourself in a situation when you badly need to scroll down (or up) the swiper by code and you can't just simply change scrollTop of the div.\r\n\r\nI did it on my own and it's really just two lines of code but I think it should get a nice place in the official API.\r\n\r\nThanks and keep up the good work!\r\n\r\n-Gergely", "title": "Expose swipeToPosition", "type": "issue" }, { "action": "closed", "author": "nolimits4web", "comment_id": null, "datetime": 1423686560000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
2
484
false
false
484
false
SciLifeLab/scilifelab
SciLifeLab
70,684,131
476
{ "number": 476, "repo": "scilifelab", "user_login": "SciLifeLab" }
[ { "action": "opened", "author": "senthil10", "comment_id": null, "datetime": 1429880411000, "masked_author": "username_0", "text": "in current RNA bp, the ref genome is always fetched from statusDB. Its convenient to have a parameter to pass in at times.\r\n\r\nIts a small PR, so will great if anyone can merge it soon :smile:", "title": "add option for genome", "type": "issue" }, { "action": "created", "author": "ewels", "comment_id": 95927892, "datetime": 1429880813000, "masked_author": "username_1", "text": "Looks good! Though that help message made me scratch my head a little..", "title": null, "type": "comment" }, { "action": "created", "author": "senthil10", "comment_id": 95929273, "datetime": 1429881128000, "masked_author": "username_0", "text": "Haha, TRUE. Its friday evening man. I could not think much for the `help` :stuck_out_tongue: Changed it now though :+1:", "title": null, "type": "comment" }, { "action": "created", "author": "guillermo-carrasco", "comment_id": 95930136, "datetime": 1429881329000, "masked_author": "username_2", "text": ":+1:", "title": null, "type": "comment" } ]
3
4
386
false
false
386
false
llooker/blocks_redshift_admin
llooker
210,827,467
12
{ "number": 12, "repo": "blocks_redshift_admin", "user_login": "llooker" }
[ { "action": "opened", "author": "stankud", "comment_id": null, "datetime": 1488297286000, "masked_author": "username_0", "text": "Update README.md to include all system tables requiring SELECT permission", "title": "add additional tables needing select permission", "type": "issue" }, { "action": "created", "author": "fabio-looker", "comment_id": 283082139, "datetime": 1488297943000, "masked_author": "username_1", "text": "Thanks!", "title": null, "type": "comment" } ]
2
2
80
false
false
80
false
thoughtbot/suspenders
thoughtbot
149,854,080
749
{ "number": 749, "repo": "suspenders", "user_login": "thoughtbot" }
[ { "action": "opened", "author": "tute", "comment_id": null, "datetime": 1461178156000, "masked_author": "username_0", "text": "Related issues:\n\n- #738\n- #748\n\nA new suspended app's scaffold works with this fix. To replicate the\nissue, you can run these steps:\n\n```bash\nsuspenders test-`date +\"%Y%m%d\"`\ncd test-*\n./bin/setup\nrails g scaffold User email password\nrake db:migrate\nrails s\nopen http://localhost:3000/users\n```\n\nThat page fails in current master, with an error message on `base/base`\nnot found.\n\nCalling in @username_1 or @brendastorer for help on this. Where was\n`base/base` coming from? Is it an ok fix to drop that line? Do we have\nnewer refills versions to include with suspenders apps? Thanks!", "title": "Fix SCSS misconfigurations", "type": "issue" }, { "action": "created", "author": "tysongach", "comment_id": 212560553, "datetime": 1461179114000, "masked_author": "username_1", "text": "@username_0 We need this. `base/base` is [Bitters](https://github.com/thoughtbot/bitters), which seems to not be getting installed anymore? Bitters used to be a dependency, but was removed in https://github.com/thoughtbot/suspenders/commit/ef3428e14cf8a43213040e9224c7a9eecbea7921.", "title": null, "type": "comment" }, { "action": "created", "author": "tute", "comment_id": 212561128, "datetime": 1461179242000, "masked_author": "username_0", "text": "aha! Thank you, on it.", "title": null, "type": "comment" }, { "action": "created", "author": "tute", "comment_id": 212567422, "datetime": 1461180051000, "masked_author": "username_0", "text": "Added back bitters, which fixes that issue. Updated commit and PR description. Mind reviewing again, @username_1? Thank you!", "title": null, "type": "comment" }, { "action": "created", "author": "tute", "comment_id": 212569524, "datetime": 1461180481000, "masked_author": "username_0", "text": "Checked it's the latest version, merging. Thank you again, Tyson.", "title": null, "type": "comment" }, { "action": "created", "author": "tysongach", "comment_id": 212569557, "datetime": 1461180488000, "masked_author": "username_1", "text": "@username_0 Does this bring this issue back: https://github.com/thoughtbot/suspenders/pull/739?\r\n\r\n@derekprior made an interesting point about this on https://github.com/thoughtbot/suspenders/issues/738#issuecomment-200961157", "title": null, "type": "comment" }, { "action": "created", "author": "tute", "comment_id": 212569958, "datetime": 1461180578000, "masked_author": "username_0", "text": "Aw, I made a mess. Yes, it does, fixing and adding this context to the commit message.", "title": null, "type": "comment" }, { "action": "created", "author": "tute", "comment_id": 212570583, "datetime": 1461180692000, "masked_author": "username_0", "text": "Pushed your commit from that PR.", "title": null, "type": "comment" }, { "action": "created", "author": "tysongach", "comment_id": 212572006, "datetime": 1461180848000, "masked_author": "username_1", "text": "@username_0 👍 💯", "title": null, "type": "comment" } ]
2
9
1,412
false
false
1,412
true
Samfundet/SamfundetAuth
Samfundet
175,126,301
14
{ "number": 14, "repo": "SamfundetAuth", "user_login": "Samfundet" }
[ { "action": "opened", "author": "konstahm", "comment_id": null, "datetime": 1473108734000, "masked_author": "username_0", "text": "", "title": "add-konstahm", "type": "issue" }, { "action": "created", "author": "alfiehub", "comment_id": 244810203, "datetime": 1473108751000, "masked_author": "username_1", "text": "Looks nice.", "title": null, "type": "comment" } ]
2
2
11
false
false
11
false
jenkinsci/jenkins-design-language
jenkinsci
160,576,994
55
{ "number": 55, "repo": "jenkins-design-language", "user_login": "jenkinsci" }
[ { "action": "opened", "author": "i386", "comment_id": null, "datetime": 1466052585000, "masked_author": "username_0", "text": "@username_1", "title": "Add a small pulsing dot to the running indicators", "type": "issue" }, { "action": "created", "author": "reviewbybees", "comment_id": 226387722, "datetime": 1466052700000, "masked_author": "username_1", "text": "This pull request originates from a [CloudBees](https://www.cloudbees.com/) employee. At CloudBees, we require that all pull requests be reviewed by other CloudBees employees before we seek to have the change accepted. If you want to learn more about our process please see [this explanation](https://github.com/username_1/about#about-username_1).", "title": null, "type": "comment" }, { "action": "created", "author": "cliffmeyers", "comment_id": 226420660, "datetime": 1466065623000, "masked_author": "username_2", "text": "LGTM 🐝", "title": null, "type": "comment" } ]
3
3
370
false
false
370
true
quintel/etmoses
quintel
100,307,642
268
null
[ { "action": "opened", "author": "ChaelKruip", "comment_id": null, "datetime": 1439298675000, "masked_author": "username_0", "text": "If the group is larger than 30, ETMoses does not seem to respect the choice of the user to assign an individual profile anyway.", "title": "When editing a LES, the assignment of individual profiles for households is overwritten", "type": "issue" }, { "action": "closed", "author": "antw", "comment_id": null, "datetime": 1441103024000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
2
127
false
false
127
false
silverstripe-australia/silverstripe-content-services
silverstripe-australia
123,184,591
6
null
[ { "action": "opened", "author": "tractorcow", "comment_id": null, "datetime": 1450647222000, "masked_author": "username_0", "text": "", "title": "Please add some unit tests. :)", "type": "issue" }, { "action": "created", "author": "nyeholt", "comment_id": 166165930, "datetime": 1450655561000, "masked_author": "username_1", "text": "https://github.com/silverstripe-australia/silverstripe-content-services/blob/master/code/tests/TestContentServices.php\r\n\r\nI have got into the habit of having unit tests in module/code/test, and Functional/Selenium level tests in module/tests. Though it looks like the module standard is to move things out to module/tests regardless... will keep in mind when next touching things.", "title": null, "type": "comment" }, { "action": "created", "author": "tractorcow", "comment_id": 166172239, "datetime": 1450662137000, "masked_author": "username_0", "text": "Oh, right! Sorry to be confusing. :)\r\n\r\nAny chance that you could hook it up to travis to get some CI going? I don't personally need it but it'd be good for others to see how great your module is. :)", "title": null, "type": "comment" }, { "action": "created", "author": "nglasl", "comment_id": 400148062, "datetime": 1529976799000, "masked_author": "username_2", "text": "Thank you for your contribution here. I just wanted to let you know that we’re looking to improve our management of issues and pull requests, with a goal to providing clear direction on how Symbiote contributes to the open source community. As part of this ongoing effort:\r\n\r\n* issues and pull requests that have been open prior to **December 2017 with no recent activity** are being closed\r\n\r\n * **this is included**\r\n\r\n * if you can confirm this remains applicable for SS3.6+ please do open it again\r\n\r\n* remaining issues and pull requests will be\r\n\r\n * categorised\r\n\r\n * on a clear path to resolution (whatever the next step may be).\r\n\r\nThis ultimately brings us one step closer to our ambition; being able to prioritise **current** issues and contributions in a timely manner. If you have any questions, please see [https://www.symbiote.com.au/contributing](https://www.symbiote.com.au/contributing) for further information.", "title": null, "type": "comment" }, { "action": "closed", "author": "nglasl", "comment_id": null, "datetime": 1529976799000, "masked_author": "username_2", "text": "", "title": null, "type": "issue" } ]
3
5
1,519
false
false
1,519
false
VSCodeVim/Vim
VSCodeVim
171,277,882
614
null
[ { "action": "opened", "author": "johnfn", "comment_id": null, "datetime": 1471300455000, "masked_author": "username_0", "text": "------\r\n\r\nPlease *thumbs-up* 👍 this issue if it personally affects you! You can do this by clicking on the emoji-face on the top right of this post. Issues with more thumbs-up will be prioritized.\r\n\r\n-----", "title": "search highlight is incorrect briefly after undo", "type": "issue" }, { "action": "created", "author": "rebornix", "comment_id": 255893268, "datetime": 1477351205000, "masked_author": "username_1", "text": "@username_0 can you help me understand why highlight is not correct after undo? From what I observer, all cases are good.", "title": null, "type": "comment" }, { "action": "closed", "author": "johnfn", "comment_id": null, "datetime": 1480631637000, "masked_author": "username_0", "text": "", "title": null, "type": "issue" } ]
2
3
322
false
false
322
true
skywinder/ActionSheetPicker-3.0
null
68,282,935
168
null
[ { "action": "opened", "author": "hzorr", "comment_id": null, "datetime": 1428994297000, "masked_author": "username_0", "text": "Hi,\r\nIs it possible to customize the view, display some other controls f.e text field with pickerview?\r\nThanks.", "title": "Display text field below picker", "type": "issue" }, { "action": "created", "author": "skywinder", "comment_id": 92818877, "datetime": 1429017144000, "masked_author": "username_1", "text": "Hi. You can find property in `AbstractActionSheetPicker`\r\n\r\n @property (nonatomic, strong) UIView *pickerView;\r\n\r\nTry to play around this property to achieve your needs.", "title": null, "type": "comment" }, { "action": "closed", "author": "skywinder", "comment_id": null, "datetime": 1429260536000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
3
283
false
false
283
false
webpack/webpack
webpack
124,394,061
1,824
null
[ { "action": "opened", "author": "pavellishin", "comment_id": null, "datetime": 1451506044000, "masked_author": "username_0", "text": "Even if no --color option is passed, output is colorized.", "title": "Webpack ignores --color option", "type": "issue" }, { "action": "created", "author": "bebraw", "comment_id": 168138319, "datetime": 1451545832000, "masked_author": "username_1", "text": "By the looks of it, the code relies on [supports-color](https://www.npmjs.com/package/supports-color). Reading from there I can see it respects `--no-color`. So if you pass that, it likely won't colorize your output.", "title": null, "type": "comment" }, { "action": "created", "author": "pavellishin", "comment_id": 168237397, "datetime": 1451590152000, "masked_author": "username_0", "text": "Sure, but this doesn't address this particular issue. \r\n\r\n1. --no-color isn't documented in webpack.\r\n2. If webpack stops using `supports-color`, the `--no-color` option goes away.\r\n3. Even if webpack keeps using it forever, it seems silly to pass arguments based on specific internal implementation details.", "title": null, "type": "comment" }, { "action": "created", "author": "bebraw", "comment_id": 168238504, "datetime": 1451591121000, "masked_author": "username_1", "text": "Yeah, I agree it's not ideal. Let's wait to hear what the author has to say. :)", "title": null, "type": "comment" }, { "action": "created", "author": "scottaddie", "comment_id": 179594558, "datetime": 1454556000000, "masked_author": "username_2", "text": "What is the recommended way to enable colors? [This issue](https://github.com/madskristensen/WebPackTaskRunner/issues/14) indicates that the `--colors` switch isn't working in Webpack 2.0.5-beta. I'm looking for a single CLI switch that's compatible with both Webpack 1.x and 2.x. Does such a switch exist?", "title": null, "type": "comment" }, { "action": "created", "author": "bebraw", "comment_id": 179685209, "datetime": 1454570135000, "masked_author": "username_1", "text": "@username_2 On a quick look Webpack 2 seems to depend on **supports-color** as well. Here are [the flags it supports](https://www.npmjs.com/package/supports-color#info). You can also force color through env.\r\n\r\nI think `--colors` was dropped a while back.", "title": null, "type": "comment" }, { "action": "created", "author": "pavellishin", "comment_id": 179820991, "datetime": 1454590795000, "masked_author": "username_0", "text": "The author seems pretty committed to using the `supports-color` options: https://github.com/webpack/webpack/pull/1825\r\n\r\nI still disagree, but it ain't my decision :)", "title": null, "type": "comment" }, { "action": "closed", "author": "bebraw", "comment_id": null, "datetime": 1460180131000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" }, { "action": "created", "author": "ericnewton76", "comment_id": 405712462, "datetime": 1531858358000, "masked_author": "username_3", "text": "so this is still an issue, the command line --colors doesnt work.\r\n\r\nhowever, --no-color did work.", "title": null, "type": "comment" }, { "action": "created", "author": "bebraw", "comment_id": 405962470, "datetime": 1531926038000, "masked_author": "username_1", "text": "@username_3 The logic is different now. Colors are enabled if the terminal supports them and you have to explicitly disable colors with `--no-color` if you don't want them.", "title": null, "type": "comment" }, { "action": "created", "author": "ericnewton76", "comment_id": 405981392, "datetime": 1531929223000, "masked_author": "username_3", "text": "@username_1 either way, --color, which is what is documented, doesn't work as explained. Nor does `--color=false` or `--color false` with a terminal that supports colors.\r\n\r\n`--no-color` does work on mine, but is not mentioned anywhere", "title": null, "type": "comment" }, { "action": "created", "author": "bebraw", "comment_id": 406393930, "datetime": 1532029886000, "masked_author": "username_1", "text": "@username_3 Ok, perfect.", "title": null, "type": "comment" } ]
4
12
1,917
false
false
1,917
true
linode/manager
linode
160,484,206
138
{ "number": 138, "repo": "manager", "user_login": "linode" }
[ { "action": "opened", "author": "eatonphil", "comment_id": null, "datetime": 1466013231000, "masked_author": "username_0", "text": "", "title": "Details style suggestions", "type": "issue" }, { "action": "created", "author": "eatonphil", "comment_id": 226267243, "datetime": 1466013269000, "masked_author": "username_0", "text": "This has some changes I'd like to see after #129 gets merged.", "title": null, "type": "comment" }, { "action": "created", "author": "eatonphil", "comment_id": 226267978, "datetime": 1466013424000, "masked_author": "username_0", "text": "![screen shot 2016-06-15 at 1 56 42 pm](https://cloud.githubusercontent.com/assets/3925912/16091295/09d59cf4-3301-11e6-97be-bdde7b304f5c.png)", "title": null, "type": "comment" }, { "action": "created", "author": "na3d", "comment_id": 226272308, "datetime": 1466014287000, "masked_author": "username_1", "text": "Is the screen the same?", "title": null, "type": "comment" }, { "action": "created", "author": "eatonphil", "comment_id": 226273915, "datetime": 1466014624000, "masked_author": "username_0", "text": "The screen?", "title": null, "type": "comment" }, { "action": "created", "author": "eatonphil", "comment_id": 226275930, "datetime": 1466015017000, "masked_author": "username_0", "text": "Differences you may or may not be able to not in the screenshot:\r\n\r\n* Decreased margins around the content of the main card and summary/access section.\r\n* Label colors are #333\r\n* Input and button font and height are standardized\r\n\r\nThat's all I can think of.", "title": null, "type": "comment" }, { "action": "created", "author": "SirCmpwn", "comment_id": 226276770, "datetime": 1466015189000, "masked_author": "username_2", "text": ":+1:", "title": null, "type": "comment" }, { "action": "created", "author": "eatonphil", "comment_id": 226278395, "datetime": 1466015522000, "masked_author": "username_0", "text": "Thanks!", "title": null, "type": "comment" } ]
4
11
1,418
false
true
506
false
mesonbuild/wrapweb
mesonbuild
231,639,303
15
null
[ { "action": "opened", "author": "riemass", "comment_id": null, "datetime": 1495808177000, "masked_author": "username_0", "text": "Hello, \r\nI'm trying to add cpr to wrapdb. Can you make a repo so I can fork it and push the changes.\r\n\r\nLibrary: https://github.com/whoshuu/cpr\r\nFork with meson.build that I currenty include in my project: https://github.com/username_0/cpr\r\nSomething I created on my local account for test: https://github.com/username_0/cpr\r\n\r\nAlso, before pushing to your repo I'm planning on adding version and licence information to meson.build files.", "title": "I'm trying to add cpr to wrapdb", "type": "issue" }, { "action": "created", "author": "jpakkane", "comment_id": 304318246, "datetime": 1495813812000, "masked_author": "username_1", "text": "Which version are you planning on wrapping? Must be an official release, as wrapdb does not support \"git head\" releases or the like.", "title": null, "type": "comment" }, { "action": "created", "author": "riemass", "comment_id": 304481761, "datetime": 1495927285000, "masked_author": "username_0", "text": "I've succeeded in wrapping the 1.3.0 version of the project, as that is the most recent tagged version.", "title": null, "type": "comment" }, { "action": "created", "author": "jpakkane", "comment_id": 304482020, "datetime": 1495927767000, "masked_author": "username_1", "text": "Thanks, created.", "title": null, "type": "comment" }, { "action": "closed", "author": "jpakkane", "comment_id": null, "datetime": 1495927767000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
2
5
683
false
false
683
true
porygonco/porybox
porygonco
164,733,717
329
null
[ { "action": "opened", "author": "not-an-aardvark", "comment_id": null, "datetime": 1468184005000, "masked_author": "username_0", "text": "In the long-term, we would like to have a 3DS Homebrew app that allows uploading to Porybox. However, it might be annoying to have to input your Porybox username and password into a 3DS, and it also leads to complications if the person who uploads the pk6 does not want to actually be its owner (e.g. if they are checking a Pokémon for someone else).\r\n\r\nSo I think it would be good to have a flow for anonymously uploading:\r\n\r\n* A client, usually on a 3DS, sends one or more pk6 files to the server\r\n* The server returns an unguessable, randomly-generated \"claim code\" to the client\r\n* The homebrew app displays the claim code on the screen so that the user can read it\r\n* Using a computer, the user logs into Porybox and enters the claim code into a form\r\n* The uploaded pk6 files get moved to the client's account\r\n\r\nThis would provide a secure method for people to upload files anonymously from a 3DS, and claim them later. It would also allow people to check things and upload for other users; the checker could upload the data anonymously, then send the claim code to the Pokémon's owner, then the owner would type the claim code into Porybox.\r\n\r\n---\r\n\r\nFunctional requirements for the porybox website:\r\n\r\n* The server must have a method to accept pk6 data<sup>1</sup> without the client being logged in.\r\n* The server must tie a valid uploaded chunk of data<sup>1</sup> to a unique, unguessable claim code.\r\n * Ideally the claim code should be human-readable, and short enough to type in manually without too much annoyance.\r\n* If the uploaded data is invalid/unreadable/malformed, the server should reject it, and must not send a claim code. (i.e. validation should take place when the data is uploaded, not when it's claimed.)\r\n* If the uploaded data is valid and readable but contains Pokémon that are prohibited from being uploaded (e.g. Kyurem-W), the server must accept all the Pokémon that are not prohibited and still send a claim code. The server should also indicate to the client that some of the uploads failed. When the client redeems this claim code, the prohibited Pokémon should not be included in the data that gets sent to the client's account.\r\n* The client must be able to enter that claim code on the website, and send it to the server.\r\n* If the ID number matches a previously-uploaded chunk of data<sup>1</sup>, that chunk of data gets sent to the account of the user that entered the claim code.\r\n* If the ID number does not match any previous upload, the server must return an error and not transfer any data to the user's account. Codes should be case-sensitive if being case-sensitive increases their entropy (e.g. they should be case-sensitive if they're generated as base64, but not if they're generated as hex strings).\r\n* If the user enters an ID number that has already been claimed by someone else, the server must return an error and not transfer any data to the user's account.\r\n * If two users enter the same claim code at the same time, only one of the requests should succeed (i.e. avoid race condition bugs)\r\n* The server should provide a method for a client to verify that a claim code is valid, without actually redeeming it. This would be useful for event checkers to double-check that they are sending the owner the correct claim code.\r\n\r\n<sup>1</sup> The server should be able to accept more than one pk6 file at a time; if a user wanted to upload their entire save from a 3DS, it would be very tedious to enter a claim code for each Pokémon. Possibly limit the upload size to 1000 Pokémon at a time; this would still be a reasonably small payload (200-300KB) and it would allow users to upload entire saves with only one claim code.\r\n\r\nQuestions to address:\r\n\r\n* Should unclaimed uploads be deleted after a certain amount of time?\r\n* What should claim codes look like? Hex strings are long enough that they're annoying to type. Some options include:\r\n * base64 strings (short with high entropy, but might contain some characters that are hard to read, e.g. 'I' vs 'l'\r\n * Random strings of a reduced set of characters (e.g. never include 'O' because it could be confused with '0')\r\n * Strings containing multiple random English words (easier to type, but the would need to be longer in order to have enough entropy)\r\n* Should it be possible to view the contents of an unclaimed upload without redeeming it? This would allow for giveaways of the form \"here's what I'm giving away, and I'll give you the claim code if you win the contest\". However, it might complicate things a bit, because it would require us to have visibility settings for unclaimed things (and someone might want to make unclaimed things private, rather than viewable).", "title": "Spec for allowing anonymous uploads", "type": "issue" }, { "action": "created", "author": "NotRaia", "comment_id": 231613084, "datetime": 1468187622000, "masked_author": "username_1", "text": "I like the idea, and it looks good so far.\r\n\r\nAnswers to questions:\r\n\r\n* Yes, we should probably delete unclaimed things after 24-48 hours.\r\n* I don't really like base64 strings because they're a pain to type. I would prefer options 2 or 3.\r\n* Maybe limit showing the contents to just a list of species? I'm not really sure I understand the visibility issue though.", "title": null, "type": "comment" }, { "action": "created", "author": "not-an-aardvark", "comment_id": 231614240, "datetime": 1468189115000, "masked_author": "username_0", "text": "* Deleting unclaimed things after awhile: My concern is that if a checker does a check for an owner, and the owner is busy/in a different timezone, they might not get around to claiming the uploads within 24-48 hours.\r\n* The issue with visibility is that we might want to allow unclaimed things to be viewed (but not claimed) by linking to them, the same way that we allow people to link to Pokémon without downloading them. This might require us to give unclaimed things a \"private\" setting if people wanted to opt-out of having their anonymous upload viewable with a link. Allowing uploads to be viewed with a link would also require us to display another random ID to the user when they upload, since they would need to have the link to view the uploaded thing.", "title": null, "type": "comment" }, { "action": "created", "author": "NotRaia", "comment_id": 231619037, "datetime": 1468195446000, "masked_author": "username_1", "text": "I probably wouldn't want to do links just to keep it simple. I think the main purpose of anonymous uploads should just be to get your files online; we probably don't need to consider giveaways.\r\n\r\nI'm envisioning something like Mystery Gift, where you enter a code to get a preview of what you're receiving, then select Yes/No on whether you want to claim it.", "title": null, "type": "comment" }, { "action": "created", "author": "not-an-aardvark", "comment_id": 231619183, "datetime": 1468195629000, "masked_author": "username_0", "text": "Maybe delete it after 7 days or so? 24 hours just seems a bit too short in my opinion.", "title": null, "type": "comment" }, { "action": "created", "author": "NotRaia", "comment_id": 231621285, "datetime": 1468198106000, "masked_author": "username_1", "text": "7 days sounds like a lot of time to leave possibly hundreds of files in limbo, but I agree that 24 hours is too short. I'd lean towards 48-72 hours.", "title": null, "type": "comment" }, { "action": "created", "author": "TheSonAlsoRises", "comment_id": 231728323, "datetime": 1468242349000, "masked_author": "username_2", "text": "The outline seems good so far!\r\n\r\n* I think 72 hours should be enough to claim the Pokémon.\r\n* Typing a chain of 30 random characters when setting up the 3DS WiFi was painful enough. I think random English words would be a lot easier to input, even if they take longer to type.\r\n* Probably not worth the effort in my opinion.", "title": null, "type": "comment" } ]
3
7
6,741
false
false
6,741
false
triniwiz/nativescript-sse
null
207,039,969
4
null
[ { "action": "opened", "author": "eldadj", "comment_id": null, "datetime": 1486892781000, "masked_author": "username_0", "text": "Hi,\r\n\r\nThe source installed when doing \r\n`npm install nativescript-sse` and \r\n`git clone https://github.com/username_3/nativescript-sse.git`\r\nare different. The npm version is pure Javascript and the git clone version is Typescript. \r\n\r\nAlso trying to use either doesn't work. Creating a new issue for that.", "title": "npm install !== git clone", "type": "issue" }, { "action": "created", "author": "1u0n", "comment_id": 299412393, "datetime": 1493973687000, "masked_author": "username_1", "text": "found the same problem, and same simple solution. This plugin's android version fails out of the box otherwise, which makes it difficult to depend on, isn't the repo creator active anymore?", "title": null, "type": "comment" }, { "action": "created", "author": "bradmartin", "comment_id": 299522566, "datetime": 1494004432000, "masked_author": "username_2", "text": "Hi @username_0 -\r\n\r\nThe source is different on npm vs. git - because typescript can't be run anywhere. Only the transpiled .js files can actually execute. Does that make sense? So typically, if you're a good developer who cares about consumers 😄, you don't ship the typescript to npm because it's wasted files. The only TS file that should be in a npm module is a `.d.ts` file. So this package if you `clone` the repo - you'll need to transpile the .ts files into .js and then use the output .js files in your app.\r\n\r\nHope that makes sense 👍", "title": null, "type": "comment" }, { "action": "closed", "author": "triniwiz", "comment_id": null, "datetime": 1512616524000, "masked_author": "username_3", "text": "", "title": null, "type": "issue" } ]
4
4
1,032
false
false
1,032
true
hallmark/gitwebhook
null
79,402,623
2,333
null
[ { "action": "opened", "author": "hallmark", "comment_id": null, "datetime": 1432291576000, "masked_author": "username_0", "text": "Content-Type: Multipart/Related; boundary=wallawalla82349-1; --wallawalla82349-1 Content-Type: application/json; { \"type\": \"popularArticle\", \"author\": \"KIT EATON\", \"published\": \"May 20, 2015 at 05:00PM\", \"section\": \"Technology\", \"url\": \"http://ift.tt/1R441Ni\" } --wallawalla82349-1 Content-Type: text/plain Content-ID: title Video Feature: As California Thirsts, These Apps Help Save Water --wallawalla82349-1 Content-Type: text/plain Content-ID: blurb Some of these apps help track one&rsquo;s own water use, while others make it easy to publicly shame water wasters. --wallawalla82349-1--", "title": "Video Feature: As California Thirsts These Apps Help Save Water", "type": "issue" } ]
1
1
590
false
false
590
false
astefanutti/decktape
null
103,542,577
15
null
[ { "action": "opened", "author": "VinceZK", "comment_id": null, "datetime": 1440691461000, "masked_author": "username_0", "text": "Tested the MAC version, it works perfect. Would you also help to compile a Linux version ?", "title": "Would you help to compile a Linux(Centos) version?", "type": "issue" }, { "action": "created", "author": "astefanutti", "comment_id": 135495394, "datetime": 1440695486000, "masked_author": "username_1", "text": "I don't have Linux boxes readily available at the moment though I'll be uploading corresponding binaries once I have access to them.", "title": null, "type": "comment" }, { "action": "created", "author": "gavincarr", "comment_id": 136596326, "datetime": 1441088007000, "masked_author": "username_2", "text": "I've got a working CentOS7 version here if you'd like it posted somewhere.", "title": null, "type": "comment" }, { "action": "created", "author": "astefanutti", "comment_id": 136674906, "datetime": 1441105195000, "masked_author": "username_1", "text": "@username_2 definitely!\r\n\r\nWould you mind creating a PR into the [`gh-pages`](../tree/gh-pages/) branch to upload it into something like `downloads/phantomjs-linux-centos7-amd64`. Just make sure you've run [UPX](http://upx.sourceforge.net) to compress the binary size.", "title": null, "type": "comment" }, { "action": "created", "author": "gavincarr", "comment_id": 136684848, "datetime": 1441107780000, "masked_author": "username_2", "text": "@username_1 Done: https://github.com/username_1/decktape/pull/16", "title": null, "type": "comment" }, { "action": "created", "author": "astefanutti", "comment_id": 136819069, "datetime": 1441131711000, "masked_author": "username_1", "text": "@username_2 Thanks a lot!\r\n\r\n@username_0 could you please test the uploaded version for Linux CentOS 7 and close that issue if that works OK.", "title": null, "type": "comment" }, { "action": "created", "author": "VinceZK", "comment_id": 137692823, "datetime": 1441360266000, "masked_author": "username_0", "text": "@username_1 , I tested in my CentOS6.4 (Sorry I do not have CentOS7 right now), the following error reports: \"error while loading shared libraries: libpng15.so.15: cannot open shared object file: No such file or directory\"\r\n\r\nI thought some lib is missing. And I found the original CentOS binary is 35MB, while @username_2 gives the binary with 14MB.", "title": null, "type": "comment" }, { "action": "created", "author": "gavincarr", "comment_id": 137697552, "datetime": 1441362175000, "masked_author": "username_2", "text": "@username_0 CentOS6 only comes with libpng12.so.0, so the C7 version won't work for you. I'll try and build a C6 version over the weekend. Are you using 64-bit or 32-bit?\r\n\r\nMy original binary was around 35MB as well, but running it through UPX dropped it down to 14MB.", "title": null, "type": "comment" }, { "action": "created", "author": "VinceZK", "comment_id": 137744614, "datetime": 1441375114000, "masked_author": "username_0", "text": "@username_2 Thanks for the clarifying. My CentOS6 runs on 64-bit machine.", "title": null, "type": "comment" }, { "action": "created", "author": "gavincarr", "comment_id": 138143371, "datetime": 1441586385000, "masked_author": "username_2", "text": "@username_1 I've just created a pull request with a CentOS6 x86_64 build of phantomjs, if you could merge when you get a chance.\r\n\r\n@username_0 Could you test this version once it's merged?\r\n\r\nThanks.", "title": null, "type": "comment" }, { "action": "created", "author": "astefanutti", "comment_id": 138274646, "datetime": 1441625219000, "masked_author": "username_1", "text": "@username_2 I've just merged it. Thanks a lot!", "title": null, "type": "comment" }, { "action": "created", "author": "VinceZK", "comment_id": 138295038, "datetime": 1441631678000, "masked_author": "username_0", "text": "@username_1 I tested CentOS6 x86_64 version, and got some situations. When printing some pages, the size is not right. It doesn't happen in normal phantomjs binary. \r\n@username_2 At first, I think it's only the CentOS6 binary. Then I tested MAC binary, the same problem. \r\n\r\nYou can find the difference from the 2 attached images. The first is the normal binary, and the second is the desktape bianry. I am sorry the page is in Chinese, but you can figure out the problem. And if you try to print page ' http://expressjs.com/ ', the same problem occurred. So it is not because of some specific pages cause the problem, it seems more a general issue. \r\n![normal_phantomjs_binary](https://cloud.githubusercontent.com/assets/4759678/9717378/d6bc1e36-55a4-11e5-8d49-4e78596007a2.jpg)\r\n![desktape_phantomjs_binary](https://cloud.githubusercontent.com/assets/4759678/9717381/d9e35d9a-55a4-11e5-94d2-8912c8ccb22a.jpg)", "title": null, "type": "comment" }, { "action": "created", "author": "VinceZK", "comment_id": 138300538, "datetime": 1441632785000, "masked_author": "username_0", "text": "And I tested to snapshot images (.png), it works fine. So it only effects .pdf snapshots.", "title": null, "type": "comment" }, { "action": "created", "author": "astefanutti", "comment_id": 138316240, "datetime": 1441636992000, "masked_author": "username_1", "text": "@username_0 I've tested the `examples/rasterize.js` script with the PhantomJS 2.0.0 Mac OS X official binary and the DeckTape Mac OS X binary and cannot see any differences for PDF output on http://expressjs.com/ and http://www.jrj.com.cn. What version of PhantomJS _normal_ binary are you using?\r\n\r\nThat being said, DeckTape is meant to be used to export HTML presentations to PDF. If this issue is meant to be provided with PhantomJS Linux CentOS binary from the PhantomJS official code line, I would suggest you contact PhantomJS project support directly.", "title": null, "type": "comment" }, { "action": "created", "author": "VinceZK", "comment_id": 138331216, "datetime": 1441641608000, "masked_author": "username_0", "text": "@username_1 I did more tests, and found if you set page format to 'A4', it will reproduce the error. \r\nPlease try command line: $phantomjs examples/rasterize.js http://www.jrj.com.cn out.pdf A4\r\n\r\nMaybe DeckTape is free of the issue. But the official code line has no confirmed schedule to the \"link\" issue. I'd like to take some time to figure it out based on your code line. \r\n\r\nThanks anyway:-)", "title": null, "type": "comment" }, { "action": "closed", "author": "astefanutti", "comment_id": null, "datetime": 1447616057000, "masked_author": "username_1", "text": "", "title": null, "type": "issue" } ]
3
16
3,651
false
false
3,651
true