id (stringlengths 4-10) | text (stringlengths 4-2.14M) | source (stringclasses, 2 values) | created (timestamp[s], 2001-05-16 21:05:09 to 2025-01-01 03:38:30) | added (stringdate, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | metadata (dict) |
|---|---|---|---|---|---|
169070427
|
When will the Ajax feature be added?
Hello!
This is Ahn Jongtae (qnibus@godo.co.kr) from NHN Godo.
We are using this component,
and we need Ajax support. Do you have any plans to add it?
If so, I'd like to know roughly when it might land.
If this doesn't fit the purpose of this board, feel free to delete the post after reading it,
but I'd really appreciate a reply, even by email.
Thanks for your work!
Hello :) This is the NHN Entertainment FE Development Lab.
We will reply to your inquiry by email.
Thank you.
|
gharchive/issue
| 2016-08-03T07:56:28
|
2025-04-01T06:39:45.939625
|
{
"authors": [
"qnibus",
"seonim-ryu"
],
"repo": "nhnent/tui.component.tree",
"url": "https://github.com/nhnent/tui.component.tree/issues/17",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
358775402
|
How can I restrict the file types that a user chooses for uploading an image?
How can I restrict the file types that a user chooses for uploading an image?
Thanks!
The feature is not yet supported.
Let us discuss the feature here.
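Since the editor had no built-in option at the time, the usual client-side approach is an `accept` attribute on the file input plus a MIME-type check on the selected file. A minimal sketch (generic browser technique, not a tui.editor API; names are illustrative):

```javascript
// Generic client-side file-type filter: pair this with
// <input type="file" accept="image/png,image/jpeg,image/gif">
// and reject anything whose MIME type isn't whitelisted.
function isAllowedImage(file, allowedTypes = ['image/png', 'image/jpeg', 'image/gif']) {
  return allowedTypes.includes(file.type);
}

console.log(isAllowedImage({ name: 'photo.png', type: 'image/png' }));     // true
console.log(isAllowedImage({ name: 'doc.pdf', type: 'application/pdf' })); // false
```

Note that the `accept` attribute is only a hint to the file picker; the check above (and ideally a server-side check) is still needed.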
|
gharchive/issue
| 2018-09-10T20:10:29
|
2025-04-01T06:39:45.940779
|
{
"authors": [
"SteveSchreiner",
"kyuwoo-choi"
],
"repo": "nhnent/tui.editor",
"url": "https://github.com/nhnent/tui.editor/issues/295",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2024112270
|
fix(ci): pin @nhost/nhost-js dep version in sveltekit quickstart
This will ensure CI doesn't complain about trying to install an unpublished version of the @nhost/nhost-js package in the SvelteKit quickstart.
Codecov Report
Attention: 1 line in your changes is missing coverage. Please review.
Comparison is base (f7c2148) 86.84% compared to head (278a641) 86.80%.
Report is 12 commits behind head on main.
Files | Patch % | Lines
packages/nhost-js/src/utils/helpers.ts | 80.00% | 1 Missing :warning:
:exclamation: Your organization needs to install the Codecov GitHub app to enable full functionality.
Additional details and impacted files
@@ Coverage Diff @@
## main #2402 +/- ##
==========================================
- Coverage 86.84% 86.80% -0.05%
==========================================
Files 85 85
Lines 9353 9328 -25
Branches 495 489 -6
==========================================
- Hits 8123 8097 -26
- Misses 1230 1231 +1
:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.
|
gharchive/pull-request
| 2023-12-04T15:21:40
|
2025-04-01T06:39:45.947628
|
{
"authors": [
"codecov-commenter",
"onehassan"
],
"repo": "nhost/nhost",
"url": "https://github.com/nhost/nhost/pull/2402",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
270995654
|
Typo in capability statement.
Documentation for the practitioner search in PractitionerRole has a typo. organation should be organization.
In core hapi - see https://github.com/jamesagnew/hapi-fhir/search?utf8=✓&q=organation&type=
|
gharchive/issue
| 2017-11-03T14:25:02
|
2025-04-01T06:39:45.949154
|
{
"authors": [
"KevinMayfield",
"VictorHarris"
],
"repo": "nhsconnect/careconnect-reference-implementation",
"url": "https://github.com/nhsconnect/careconnect-reference-implementation/issues/28",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1554618446
|
Create 2 new components: tabs and character count
To be released in February release - frontend and service manual
Related issue: https://github.com/nhsuk/nhsuk-service-manual/issues/1857
Draft in branch review/feb-release.
Published:
https://service-manual.nhs.uk/design-system/components/character-count
https://service-manual.nhs.uk/design-system/components/tabs
|
gharchive/issue
| 2023-01-24T09:27:26
|
2025-04-01T06:39:45.956547
|
{
"authors": [
"sarawilcox"
],
"repo": "nhsuk/nhsuk-service-manual",
"url": "https://github.com/nhsuk/nhsuk-service-manual/issues/1860",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
243637538
|
Feature/upgrade ci scripts
Supersedes #68
:rocket: deployment of nhsuk/profiles-db succeeded (http://profiles-db-pr-71.dev.beta.nhschoices.net)
|
gharchive/pull-request
| 2017-07-18T08:25:38
|
2025-04-01T06:39:45.957724
|
{
"authors": [
"c2s-dev",
"st3v3nhunt"
],
"repo": "nhsuk/profiles-db",
"url": "https://github.com/nhsuk/profiles-db/pull/71",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
420999419
|
Embed original gist comments?
This issue is to track discussion whether to render the comments on the original gist or not.
You can see the current implementation here: https://nicegist.github.io/84aec347c6a1b90890dad8953d7e8c67#gist-comments
The comment section is only shown for gists that have received comments. (Compare the link above to this Nicegist without comments)
What I like about it is, that you can easily link to comments (since I create unique anchor links for each comment). See: https://nicegist.github.io/84aec347c6a1b90890dad8953d7e8c67#comment-1876170
Let me know what you think. I'm still not sure if I should keep it or if its too distracting (because, ultimately, the idea of Nicegist is to offer a clean representation for gists).
Hmm... I'm torn. But that sort of also means I don't have a strong opinion either way. :) But design-wise I think they look nice.
Closing, since it's not an open issue. Further discussion welcome. Pinned this issue.
|
gharchive/issue
| 2019-03-14T12:48:35
|
2025-04-01T06:39:45.981633
|
{
"authors": [
"eyecatchup",
"fuzzy76"
],
"repo": "nicegist/nicegist.github.io",
"url": "https://github.com/nicegist/nicegist.github.io/issues/10",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1871084288
|
remove mention of summer school ?
in the title (probably in the yaml file), and in the intro text on the welcome page.
We (you and I and others) will reuse this for regular workshops as well ;)
In the readme file: maybe you could say instead that it was initially created for the LMU OSC summer school 2023, linking to the website https://malikaihle.github.io/OSC-Open-Research-Summer-School-2023/
For the title: it could simply be "Introduction to R"?
Also, the mention
Note
This chapter is optional. It is not necessary to follow the rest of the conference, but you will get started with plotting in R.
in https://nickhaf.github.io/r_tutorial/qmd/plotting/plotting.html can be removed, I think:
the title already says it's optional.
|
gharchive/issue
| 2023-08-29T07:41:28
|
2025-04-01T06:39:46.053654
|
{
"authors": [
"MalikaIhle"
],
"repo": "nickhaf/r_tutorial",
"url": "https://github.com/nickhaf/r_tutorial/issues/54",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
94931887
|
Smoothing/ Running Averages
Hi
I'm interested in producing a running average and/or smoothing via the average of a definable number of previous values (in a date series). I could produce this via SQL on the database I'm getting the data from, but thought it could be either a derived attribute or an aggregation function. I'm a bit stuck on the best approach, and also on how to reference the values in previous cells. Any guidance gratefully received.
Edit
It's amazing that just after posting I came across #140 which has given me something to work from. I'll see what I can come up with.
OK, let me know how it goes! I think there's enough info in that other ticket to put you on the right track :)
Both of these are minor modifications of the code in #140 by philliproso.
There are probably more efficient and elegant ways of doing this, but I've never used CoffeeScript before, so apologies in advance.
runningAverage takes account of whether some of the cells are empty or not; smoothedAverage does not.
In the code for smoothedAverage, the value in a cell is the average of the current cell and the previous 3 cell values; in the earlier cells it's the average of the current cell and the previous cells.
runningAverage: (formatter=usFmt) -> ([attr]) -> (data, rowKey, colKey) ->
    sum: 0
    push: (record) -> @sum += parseFloat(record[attr]) if not isNaN parseFloat(record[attr])
    value: ->
        colKeys = data.getColKeys()
        counter = 0
        flat_col_key = colKey.join(String.fromCharCode(0))
        for item in colKeys
            flat_item = item.join(String.fromCharCode(0))
            if flat_item is flat_col_key
                itter = counter
            counter++
        prev_value = 0
        if itter > 0
            denom = 1
            for i in [1...itter+1]
                aggregator = data.getAggregator(rowKey, colKeys[itter-i])
                if 'sum' of aggregator
                    prev_value += aggregator.sum
                    denom++
            return (@sum + prev_value) / (denom - 1)
    format: formatter
    numInputs: 1
smoothedAverage: (formatter=usFmt) -> ([attr]) -> (data, rowKey, colKey) ->
    sum: 0
    push: (record) -> @sum += parseFloat(record[attr]) if not isNaN parseFloat(record[attr])
    value: ->
        colKeys = data.getColKeys()
        counter = 0
        flat_col_key = colKey.join(String.fromCharCode(0))
        for item in colKeys
            flat_item = item.join(String.fromCharCode(0))
            if flat_item is flat_col_key
                itter = counter
            counter++
        for i in [1...itter+1]
            prev_value = 0
            if (itter - 1) of colKeys
                aggregatorone = data.getAggregator(rowKey, colKeys[itter-i])
                if 'sum' of aggregatorone
                    prev_value += aggregatorone.sum
            if (itter - 2) of colKeys
                aggregatortwo = data.getAggregator(rowKey, colKeys[itter-(i+1)])
                if 'sum' of aggregatortwo
                    prev_value += aggregatortwo.sum
            if (itter - 3) of colKeys
                aggregatorthree = data.getAggregator(rowKey, colKeys[itter-(i+2)])
                if 'sum' of aggregatorthree
                    prev_value += aggregatorthree.sum
        if itter > 3
            return ((@sum + prev_value) / 4)
        return ((@sum + prev_value) / (5 - (4 - itter)))
    format: formatter
    numInputs: 1
Both of these are minor modifications of the code by philliproso in #140.
runningAverage calculates a running average of non-blank cells.
smoothedAverage calculates the average of the current cell and the 3 previous cells (it doesn't take account of blank cells), or, if there have been fewer than 3 previous values, the average of the current cell and its predecessors.
There are probably lots of more efficient and elegant ways of doing this in CoffeeScript, but this was my first time using it, so please indulge my inefficiencies (I got part of the way there in CoffeeScript, hacked the resulting JavaScript, then tried to work out the necessary CoffeeScript to get that JS).
Anyway I've now got node.js / npm installed so may refine things further, but it's working sufficiently for my needs.
runningAverage: (formatter=usFmt) -> ([attr]) -> (data, rowKey, colKey) ->
    sum: 0
    push: (record) -> @sum += parseFloat(record[attr]) if not isNaN parseFloat(record[attr])
    value: ->
        colKeys = data.getColKeys()
        counter = 0
        flat_col_key = colKey.join(String.fromCharCode(0))
        for item in colKeys
            flat_item = item.join(String.fromCharCode(0))
            if flat_item is flat_col_key
                itter = counter
            counter++
        prev_value = 0
        if itter > 0
            denom = 1
            for i in [1...itter+1]
                aggregator = data.getAggregator(rowKey, colKeys[itter-i])
                if 'sum' of aggregator
                    prev_value += aggregator.sum
                    denom++
            return ((@sum + prev_value) / (denom - 1))
    format: formatter
    numInputs: 1
smoothedAverage: (formatter=usFmt) -> ([attr]) -> (data, rowKey, colKey) ->
    sum: 0
    push: (record) -> @sum += parseFloat(record[attr]) if not isNaN parseFloat(record[attr])
    value: ->
        colKeys = data.getColKeys()
        counter = 0
        flat_col_key = colKey.join(String.fromCharCode(0))
        for item in colKeys
            flat_item = item.join(String.fromCharCode(0))
            if flat_item is flat_col_key
                itter = counter
            counter++
        for i in [1...itter+1]
            prev_value = 0
            if itter-1 of colKeys
                aggregatorone = data.getAggregator(rowKey, colKeys[itter-1])
                if 'sum' of aggregatorone
                    prev_value += aggregatorone.sum
            if itter-2 of colKeys
                aggregatortwo = data.getAggregator(rowKey, colKeys[itter-2])
                if 'sum' of aggregatortwo
                    prev_value += aggregatortwo.sum
            if itter-3 of colKeys
                aggregatorthree = data.getAggregator(rowKey, colKeys[itter-3])
                if 'sum' of aggregatorthree
                    prev_value += aggregatorthree.sum
        if itter > 3
            return ((@sum + prev_value) / 4)
        return ((@sum + prev_value) / (5 - (4 - itter)))
    format: formatter
    numInputs: 1
Again, another minor modification
Percentage change from previous value
percentageChange: (formatter=usFmt) -> ([attr]) -> (data, rowKey, colKey) ->
    sum: 0
    push: (record) -> @sum += parseFloat(record[attr]) if not isNaN parseFloat(record[attr])
    value: ->
        colKeys = data.getColKeys()
        counter = 0
        flat_col_key = colKey.join(String.fromCharCode(0))
        for item in colKeys
            flat_item = item.join(String.fromCharCode(0))
            if flat_item is flat_col_key
                itter = counter
            counter++
        if itter > 0
            aggregator = data.getAggregator(rowKey, colKeys[itter-1])
            if 'sum' of aggregator
                return ((@sum - aggregator.sum) / aggregator.sum)
        return
    format: formatter
    numInputs: 1
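Stripped of pivottable's aggregator machinery, the windowing arithmetic these snippets implement can be sketched as a plain function over a numeric series (names and the array-in/array-out shape are illustrative):

```javascript
// Trailing moving average: each output value averages the current value and
// up to (window - 1) preceding values. Near the start of the series the
// window shrinks, mirroring smoothedAverage's shrinking divisor.
function trailingAverage(values, window = 4) {
  return values.map((_, i) => {
    const start = Math.max(0, i - window + 1);
    const slice = values.slice(start, i + 1);
    return slice.reduce((a, b) => a + b, 0) / slice.length;
  });
}

console.log(trailingAverage([1, 2, 3, 4, 5], 4)); // [1, 1.5, 2, 2.5, 3.5]
```

The aggregator versions above do the same thing, except they pull the previous values out of `data.getAggregator(rowKey, colKeys[...])` instead of an array.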
I'm going to close this as a duplicate of #140 and leave a link there to here.
|
gharchive/issue
| 2015-07-14T12:13:13
|
2025-04-01T06:39:46.125771
|
{
"authors": [
"50percentDave",
"nicolaskruchten"
],
"repo": "nicolaskruchten/pivottable",
"url": "https://github.com/nicolaskruchten/pivottable/issues/355",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
191709008
|
How to deal with large data set getting from server side
My situation is that I have more than 200k rows of records fetched from the server side. To use pivottable.js, the client browser needs to download the whole dataset, e.g. via an ajax request, which causes the browser to hang and stop responding.
I'm thinking that instead of processing the pivot table on the front end, I would do it on the back end. However, I would love to use the JS library for its drag-and-drop UI. That means when I manipulate the pivot table, it would send a request to the server. Is there any way I could achieve this?
Or is there any suggestion to keep the browser responsive while downloading a large data set to initialize the pivot table?
Thanks
I'm working on the same issue now.
My idea is to load the data in portions (maybe 20k rows), then init pivotUI with this array.
That way pivottable will not reload all the data for each drag-and-drop move.
@ilyaguy: yeah, that's one way I've tried, but one downside is that the user needs to wait until the client gets all the portions. And the server needs to query the database for every portion (20k rows), so it will take time to load the whole dataset.
My idea is that we only load the data from the database once and store it in a cache or local file. For each drag-and-drop move, the client sends a request to the server; the server doesn't need to query again, it just gets the data from the cache and processes the pivot table (e.g. using the Pandas Python library).
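The "load by portions" idea boils down to paging: fetch pages 0..N-1 and concatenate them before calling pivotUI once. A minimal sketch of the bookkeeping (the endpoint and page size are illustrative, not part of pivottable):

```javascript
// How many chunked requests are needed to fetch `total` rows `pageSize` at
// a time. In the browser each page would come from an ajax call such as
// $.getJSON('/data?page=' + p) (hypothetical endpoint).
function pageCount(total, pageSize) {
  return Math.ceil(total / pageSize);
}

// Merge the fetched chunks back into the single array pivotUI() expects.
function mergeChunks(chunks) {
  return [].concat(...chunks);
}

console.log(pageCount(200000, 20000));          // 10
console.log(mergeChunks([[1, 2], [3], [4, 5]])); // [1, 2, 3, 4, 5]
```

As the thread notes, this only spreads the download out; the pivoting itself still happens on the client, over whatever has been loaded so far.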
Hello guys,
I had to work on a similar issue...
In my case, the final user has to deal with several data sets.
When the user logs in, for each data set he has subscribed to, I run an asynchronous ajax request
that stores the result of the SQL query in the PHP session.
During this asynchronous request, the front end shows a progress bar (Loaded 1 of X...).
When finished, you can easily reload the full page without doing any other request against the database.
Your pivot table should easily deal with the 200k rows stored in the PHP session.
@chapt0011: storing data in the PHP session means the data is in server memory, right? If the data grows quickly, the server will run out of memory. Is there any way to store the data in the client's local storage, so we don't affect server memory?
I think you need to choose where the pivoting needs to happen!
If it happens on the server side, it will help reduce the amount of data transferred to the client... but then you wouldn't need a pivot table UI on the client side (please correct me if I'm wrong).
On the other hand, if you want the user to have the entire data set and pivot over it, the data volume transferred to the client may go up!
Good point to discuss... waiting to learn from this discussion :)
@datnguyen0606 Sure, but assume the data has been loaded into the PHP session.
Rendering the page on the server side doesn't mean you can't unset the session variable on the server side after it has been sent to the client.
First:
While loading data into the PHP session variable:
// SQL request
// $_SESSION['dataset'] = SQL result, JSON formatted
display loading progress!
Once loaded,
redirect the user to the following page:
<?php
session_start();
// if you don't use any template engine
echo '<html>
<head>[...]</head>
<body>
<script>
$(function(){
    var derivers = $.pivotUtilities.derivers;
    $("#output").pivotUI('.$_SESSION["dataset"].', {
        // your pivotUI params here
    });
});
</script>
<div id="output"></div>
</body>';
// FINALLY unset the session variable, which has already been written out!
unset($_SESSION["dataset"]);
?>
Performing the pivot on the server side is not a good idea. Any time the final user wants to change the rows or columns of the table, he will have to reload the dataset or the pivot result.
Actually, the data in pivotUI is already stored on the client side. I think you are just looking for the best way to get it from the server as fast as possible.
1st: you need to optimize your request! If you are using MySQL you can try using a ramdisk...
2nd: use ajax asynchronously to perform your request; you don't really need PHP sessions. This way you won't get any JavaScript timeout.
My feeling is that the architecture of this library is incompatible with a server-side integration along these lines... See #150
@chapt0011 yes, you can bring data in smaller chunks. But the pivot result on the client will be based on the data loaded so far. And the performance will depend on the client machine's configuration!
@nicolaskruchten it would be nice if you could point us to references for architectures that address such requirements. Thanks :)
https://github.com/nicolaskruchten/pivottable/wiki/Frequently-Asked-Questions#server-side-integration
|
gharchive/issue
| 2016-11-25T14:06:06
|
2025-04-01T06:39:46.135764
|
{
"authors": [
"chapt0011",
"datnguyen0606",
"ilyaguy",
"nagarajanchinnasamy",
"nicolaskruchten"
],
"repo": "nicolaskruchten/pivottable",
"url": "https://github.com/nicolaskruchten/pivottable/issues/584",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1036062966
|
not installable on mac M1 machine
The output log is:
gyp info spawn args [ 'BUILDTYPE=Release', '-C', 'build' ]
TOUCH Release/obj.target/libvips-cpp.stamp
CC(target) Release/obj.target/nothing/node_modules/node-addon-api/nothing.o
LIBTOOL-STATIC Release/nothing.a
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/libtool: archive library: Release/nothing.a the table of contents is empty (no object file members in the library define global symbols)
CXX(target) Release/obj.target/sharp/src/common.o
../src/common.cc:24:10: fatal error: 'vips/vips8' file not found
#include <vips/vips8>
^~~~~~~~~~~~
1 error generated.
make: *** [Release/obj.target/sharp/src/common.o] Error 1
gyp ERR! build error
gyp ERR! stack Error: `make` failed with exit code: 2
gyp ERR! stack at ChildProcess.onExit (/Users/grimmer/.nvm/versions/node/v15.14.0/lib/node_modules/npm/node_modules/node-gyp/lib/build.js:194:23)
gyp ERR! stack at ChildProcess.emit (node:events:369:20)
gyp ERR! stack at Process.ChildProcess._handle.onexit (node:internal/child_process:290:12)
gyp ERR! System Darwin 20.6.0
gyp ERR! command "/Users/grimmer/.nvm/versions/node/v15.14.0/bin/node" "/Users/grimmer/.nvm/versions/node/v15.14.0/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js" "rebuild"
gyp ERR! cwd /Users/grimmer/git/webpack-demo/node_modules/sharp
gyp ERR! node -v v15.14.0
gyp ERR! node-gyp -v v7.1.2
gyp ERR! not ok
@nicolaspanel,
another related thing: I forked this repo and am considering releasing a TypeScript version as another npm package. One of the reasons is that I removed the image manipulation feature. The changes are here: https://github.com/grimmer0125/numjs/pull/4, and I did not change the algorithm part.
Should you have any concerns or suggestions, please tell me.
Hi @grimmer0125,
I just gave you write access to this repo => feel free to improve it in any way.
One piece of advice though: sticking to the numpy API (or as close as possible) makes it easier for numpy users.
Best regards
see #125 that fixes this issue
in the meantime, adding
"resolutions": {
"sharp": "0.30.7" // or 0.29.2
},
should fix the issue
ref #110
|
gharchive/issue
| 2021-10-26T09:33:35
|
2025-04-01T06:39:46.139865
|
{
"authors": [
"grimmer0125",
"nicolaspanel",
"rawpixel-vincent"
],
"repo": "nicolaspanel/numjs",
"url": "https://github.com/nicolaspanel/numjs/issues/106",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
588313088
|
refactor(db): Make log handling explicit in StorageAPI.setState
Move the log concatenation logic out of setState and into a separate appendLog method.
Add a deltalog parameter to the StorageAPI.setState method to make it more obvious that setState should also handle appending entries to the game log.
Closes #577.
Two quick details:
This only uses appendLog during onUpdate in the game master. I think this is the only place that will produce deltalogs, but wanted to flag it just in case a setState call elsewhere could have been adding to the log.
deltalog is still being included in the state object sent to setState. It might not be necessary to store it in state, but I wasn’t sure (and it’s usually a pretty small object in any case).
deltalog resides in the state object primarily because reducer.ts passes around a state object. I'd love to separate it out if it's possible to do that in a clean way, but like you pointed out, it's a pretty small object.
OK, I’ve stripped deltalog from the state that gets persisted. I’m going to refrain from stripping it out elsewhere for now, because I’m not very clear where it is needed (e.g. on the client).
Thanks, and sorry about the additional work to revert some of the changes!
No worries — it’s probably good to keep the API surface small too.
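The refactor this PR describes (an explicit deltalog argument, with log concatenation pulled out into appendLog, and deltalog stripped before persisting) has roughly the following shape. This is a sketch only, not the real boardgame.io StorageAPI; all names and signatures here are illustrative:

```javascript
// Illustrative in-memory storage mirroring the PR description: setState
// takes an explicit deltalog, delegates appending to appendLog, and strips
// deltalog from the persisted state.
class InMemoryStorage {
  constructor() {
    this.state = null;
    this.log = [];
  }

  appendLog(deltalog) {
    // Log concatenation now lives in one explicit place.
    this.log = this.log.concat(deltalog);
  }

  setState(state, deltalog = []) {
    this.appendLog(deltalog);
    // Persist the state without its deltalog field.
    const { deltalog: _ignored, ...rest } = state;
    this.state = rest;
  }
}

const db = new InMemoryStorage();
db.setState({ turn: 1, deltalog: [{ action: 'move' }] }, [{ action: 'move' }]);
console.log(db.log.length);         // 1
console.log('deltalog' in db.state); // false
```

The benefit of the explicit parameter is that a caller can no longer append to the log by accident just by passing a state object that happens to carry a deltalog.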
|
gharchive/pull-request
| 2020-03-26T10:29:02
|
2025-04-01T06:39:46.144414
|
{
"authors": [
"delucis",
"nicolodavis"
],
"repo": "nicolodavis/boardgame.io",
"url": "https://github.com/nicolodavis/boardgame.io/pull/581",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
229053472
|
Remove misplaced .gitignore entry
OS-specific entries, like *.DS_Store, should reside in a global .gitignore, created by each user to fit their needs.
Example:
git config --global core.excludesfile '~/.gitignore'
echo '*.DS_Store' >> ~/.gitignore
@migueldemoura I agree but I do not want users having to do this separately. Adding this line to the .gitignore isn't really impacting anything anyway. If you have any additions for Windows users, please let me know!
Fair enough.
I'd suggest separating the two types of ignores with two new lines.
As for other suggestions:
Windows:
Thumbs.db
Desktop.ini
If you want, I can push another commit to address this.
@migueldemoura Please do! Thanks for the help :-)
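Taken together, the macOS and Windows entries discussed above would make a global ignore file along these lines (illustrative contents):

```
# ~/.gitignore — global excludes, registered via:
#   git config --global core.excludesfile '~/.gitignore'

# macOS
*.DS_Store

# Windows
Thumbs.db
Desktop.ini
```

Entries here apply to every repository for that user, which is exactly why OS-specific noise belongs in the global file rather than each project's .gitignore.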
|
gharchive/pull-request
| 2017-05-16T14:27:37
|
2025-04-01T06:39:46.160900
|
{
"authors": [
"migueldemoura",
"nielsenramon"
],
"repo": "nielsenramon/chalk",
"url": "https://github.com/nielsenramon/chalk/pull/85",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2403747883
|
🛑 Coffee Senses (testing) is down
In 9591c30, Coffee Senses (testing) (http://coffee.naupahouse.com) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Coffee Senses (testing) is back up in cccd047 after 53 minutes.
|
gharchive/issue
| 2024-07-11T17:55:52
|
2025-04-01T06:39:46.242614
|
{
"authors": [
"nigr0mante"
],
"repo": "nigr0mante/upptime",
"url": "https://github.com/nigr0mante/upptime/issues/27",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1260155897
|
Added multiuser support
List of fixed issues (if they exist):
Demos work on:
[ ] iOS Safari
[ ] iOS Firefox
[ ] iOS Chrome
[ ] Android Chrome
[ ] Android Firefox
[ ] macOS Safari
[ ] macOS Firefox
[ ] macOS Chrome
[ ] Windows Chrome
[ ] Windows Firefox
@cdrake, is this ready to be merged in? looks like the checks have passed ok.
|
gharchive/pull-request
| 2022-06-03T17:10:22
|
2025-04-01T06:39:46.253333
|
{
"authors": [
"cdrake",
"hanayik"
],
"repo": "niivue/niivue",
"url": "https://github.com/niivue/niivue/pull/333",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
297100369
|
Added deccan ruby conf 2018
Deccan Ruby Conf 2018 happening in Pune on 4th Aug 2018
2018 is almost over, so I'll close this. Thanks for creating the PR, much appreciated!
|
gharchive/pull-request
| 2018-02-14T13:44:55
|
2025-04-01T06:39:46.256525
|
{
"authors": [
"nikhita",
"razasayed"
],
"repo": "nikhita/tech-conferences-india",
"url": "https://github.com/nikhita/tech-conferences-india/pull/67",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2686279806
|
⚠️ LabBricks has degraded performance
In 64aac7e, LabBricks (https://labbricks.com) experienced degraded performance:
HTTP code: 523
Response time: 3210 ms
Resolved: LabBricks performance has improved in 2b9573f after 20 minutes.
|
gharchive/issue
| 2024-11-23T17:07:07
|
2025-04-01T06:39:46.268273
|
{
"authors": [
"nikolasibalic"
],
"repo": "nikolasibalic/status",
"url": "https://github.com/nikolasibalic/status/issues/201",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
192491730
|
Set up error-chain
@nikomatsakis I fixed this up to use the error chain in this PR and got it compiling. I expanded the throw macro to take fmt arguments as well and it is indeed pretty cool. I don't know about the word 'throw', but I think this macro should be in error-chain.
cc @Yamakaky this project contains an interesting throw! macro that removes some of the boilerplate of creating a new error.
@nikomatsakis check out that second commit that uses chain_err. That's what this lib encourages when adding information to an error (instead of creating a new error via e.g. format!("some error happened: {}", inner_error)). Then the third commit sets up main to print the chain of errors.
@brson nice! Did this require any changes to error-chain?
Regarding throw!, The original ? included a throw keyword roughly equivalent to the macro here -- we removed it but in the time since I've come to think such a thing would be useful.
@nikomatsakis No, this code itself didn't require any error-chain patches.
|
gharchive/pull-request
| 2016-11-30T07:09:51
|
2025-04-01T06:39:46.271441
|
{
"authors": [
"brson",
"nikomatsakis"
],
"repo": "nikomatsakis/cargo-chrono",
"url": "https://github.com/nikomatsakis/cargo-chrono/pull/1",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
925415821
|
issue #335 solved
Description
I've updated the About page's card images
Checklist
[x] I am making a proper pull request, not a spam.
[x] I've checked the issue list before deciding what to submit.
Related Issues or Pull Requests
(Write your answer here.)
Add relevant screenshot or video (if any)
@niloysikdar the conflicts are resolved, please check it!
@niloysikdar please review it!
Hey @ashuydv, please check: this doesn't seem fine.
#368 issue solved
Make UI more engaging #337,
added a new loader; please review it.
@ashuydv the pull request is good to merge you just need to fix the footer. Please check this
And the preloader looks great!
On my localhost the loader looks like this,
why is it expanding?
Check the CSS, maybe something has been overwritten
Can you please explain what I should fix in the footer? Should I change the background, or should I make it like this?
Please guide me
The footer looks like this. Just make it span the full screen width.
It's working now, please review it
@ashuydv your preloader is working fine now. Just resolve the conflicts I'll merge it. And extremely sorry for the delay.
ok, will do
Sorry for the delay, review it one last time.
index.html is fine, but it seems you have made changes on the about page too. Please remove those changes. Check this: the About page looks different from the rest of the pages.
I've only added cards, I haven't touched the navbar in any of them.
Seems fine now. Thank you!
|
gharchive/pull-request
| 2021-06-19T15:25:46
|
2025-04-01T06:39:46.290369
|
{
"authors": [
"ashuydv",
"nilisha-jais"
],
"repo": "nilisha-jais/Musicophilia",
"url": "https://github.com/nilisha-jais/Musicophilia/pull/368",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2348692164
|
Notes on Listing Warehouses (Listar Almoxarifados)
I don't know the cause, but certain warehouses are not being deleted when I click the delete icon. That doesn't make sense, since I was able to delete some before. The only action I performed on them was registering stock entries.
It is probably part of the business rules not to allow deletion of warehouses that have some kind of record.
@joaopontes22 check whether this behavior is correct.
If a warehouse has any entries, it must not be deleted.
If the warehouse has no entries, it can be deleted.
@nilsonLazarin correct, warehouses with entries cannot be deleted.
@joaopontes22 I'll be closing this issue then, since the behavior is as expected.
|
gharchive/issue
| 2024-06-12T12:48:52
|
2025-04-01T06:39:46.293096
|
{
"authors": [
"GabrielPintoSouza",
"joaopontes22",
"nilsonLazarin"
],
"repo": "nilsonLazarin/WeGIA",
"url": "https://github.com/nilsonLazarin/WeGIA/issues/601",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1104963355
|
🛑 Nimbus9 Billing API is down
In b537501, Nimbus9 Billing API (https://api.nimbus9.io/v2/billing/health) was down:
HTTP code: 502
Response time: 1337 ms
Resolved: Nimbus9 Billing API is back up in d32e870.
|
gharchive/issue
| 2022-01-16T05:37:45
|
2025-04-01T06:39:46.382569
|
{
"authors": [
"schrodingersket"
],
"repo": "nimbus9inc/api-monitor",
"url": "https://github.com/nimbus9inc/api-monitor/issues/117",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2704545700
|
Rtransparent Publication Ingestion
Working on issue https://github.com/nimh-dsst/dsst-etl/issues/10
alembic
done, please help me check @leej3
|
gharchive/pull-request
| 2024-11-29T09:37:48
|
2025-04-01T06:39:46.383985
|
{
"authors": [
"quang-ng"
],
"repo": "nimh-dsst/dsst-etl",
"url": "https://github.com/nimh-dsst/dsst-etl/pull/12",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
247125663
|
Update jquery.collection.js
Fixing label issue (#28)
Hello
Thank you for your contribution.
I quickly read through the problem, and I'm wondering if you checked this page:
https://symfony-collection.fuz.org/symfony3/troubleshoot/hide-form-labels
Yeah, but that's just hiding the label.
With my addition, it's now possible to actually use that label and show numbered "steps" or "items" if you will.
Nice.
Will check this closer within a couple of hours.
Okay, so when adding fields, it works.
But when loading in such a collection, it gets rendered by Twig; currently I'm solving it by adding
{%- if name matches '/^\\d+$/' -%}
{%- set name = 'item ' ~ (name|number_format + 1) -%}
{%- endif -%}
But users need to add this to their templates if they want labels to be displayed properly.
And I think moving elements won't renumber those labels correctly.
We may think of a better solution, like a selector to write the position number somewhere.
Position is being stored in a hidden field, though.
Sure, but it may be more friendly to set a selector like span.position so you could use <label>Item #<span class="position"></span></label> to automatically fill Item #42 on your view without tricky hacks.
Yeah, but keep in mind it wouldn't always be "Item" either ;)
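The span.position idea amounts to renumbering labels whenever the collection changes. A sketch of the logic (the selector and markup are illustrative, not part of the plugin):

```javascript
// Pure renumbering helper: given the current item count, produce the texts
// to write into each position placeholder after an add/move/remove.
// In the browser this would be driven by something like:
//   $('.collection span.position').each(function (i) { $(this).text(i + 1); });
// using <label>Item #<span class="position"></span></label> in the markup.
function positionLabels(count) {
  return Array.from({ length: count }, (_, i) => String(i + 1));
}

console.log(positionLabels(3)); // ['1', '2', '3']
```

Because the numbers are recomputed from the DOM order on every change, moving an element keeps the visible numbering consistent, which the Twig-side workaround above cannot do.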
|
gharchive/pull-request
| 2017-08-01T16:21:28
|
2025-04-01T06:39:46.401070
|
{
"authors": [
"iSDP",
"ninsuo"
],
"repo": "ninsuo/symfony-collection",
"url": "https://github.com/ninsuo/symfony-collection/pull/78",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
378005109
|
Build Error building the latest source. DiscordBridge.cs line 117. Missing ;
Severity Code Description Project File Line Suppression State
Error CS1002 ; expected iTunesRichPresence-Rewrite D:\Desktop_Stuff\Github_Clones\C#\iTunesRichPresense\iTunesRichPresence\DiscordBridge.cs 117 Active
Here
catch (EntryPointNotFoundException) {
var newPresence = new DiscordRpc.RichPresence {
largeImageKey = "itunes_logo_big",
details = "No song playing",
state = "Re-install iTunesRichPresence to clear this message"
}
}
Adding a semicolon after the ending curly bracket lets the project build successfully.
As you can tell, I put extensive testing into this fix. I'd noticed the error when I went to build a patch, but thanks for pointing it out to me.
Resolved in 06cfd81
|
gharchive/issue
| 2018-11-06T19:45:34
|
2025-04-01T06:39:46.410973
|
{
"authors": [
"JMccormick264",
"nint8835"
],
"repo": "nint8835/iTunesRichPresence",
"url": "https://github.com/nint8835/iTunesRichPresence/issues/26",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1620171758
|
ENH: JupyterLite, WebGL
https://github.com/emscripten-forge/empack
https://github.com/emscripten-forge/recipes
https://www.google.com/search?q=jupyterlite+emscripten-forge
https://jupyterlite.rtfd.io -> Lab
https://jupyterlite.readthedocs.io/en/latest/howto/configure/simple_extensions.html#the-case-of-jupyter-widgets-and-custom-renderers
https://jupyterlite.readthedocs.io/en/latest/howto/configure/rtc.html#enabling-rtc-in-jupyterlite
https://www.google.com/search?q=emscripten-forge+webgl
I don't think it can be made to work without a SIGNIFICANT amount of work
Hopefully the compiler will do most of that work. Maybe @martinRenou knows of an example emscripten-forge empack package that has OpenGL/EGL code?
ipygany does 3D in notebooks with VTK, which may or may not already be transpiled to WASM IIUC
https://github.com/QuantStack/ipygany
https://emscripten.org/docs/optimizing/Optimizing-WebGL.html#which-gl-mode-to-target
There may need to be a 'polyfill' like requests-wasm-polyfill?
https://github.com/emscripten-forge/recipes/blob/main/recipes/recipes_emscripten/requests-wasm-polyfill/recipe.yaml
https://www.google.com/search?q="glfw"+emscripten
https://www.google.com/search?q="moderngl"+emscripten
https://github.com/emscripten-core/emscripten/blob/main/system/include/GL/glfw.h
https://gist.github.com/ousttrue/0f3a11d5d28e365b129fe08f18f4e141?permalink_comment_id=4484709#gistcomment-4484709
-sUSE_GLFW=3 option
Students could easily develop and share STEM games with a WASM compilation of jupylet that works in JupyterLite or VSCode.dev
Maybe @martinRenou knows of an example emscripten-forge empack package that has OpenGL/EGL code?
I don't!
ipygany does 3D in notebooks with VTK, which may or may not already be transpiled to WASM IIUC
ipygany only uses VTK on the back-end for file loading, the rendering is done entirely using WebGL.
From time to time I google to see if Kitware folks provide an easy way to compile VTK for WASM, thankfully today you triggered that search and it seems to be fruitful https://gitlab.kitware.com/vtk/vtk-wasm-docker (initial commit a month ago).
Though I guess all of this VTK discussion is completely out of scope for jupylet, so sorry for the spamming. If you want, @westurner, I'd be happy to have this discussion under an ipygany issue.
hi @martinRenou, you were one of the first people to star jupylet when it first came out. thanks! :)
@westurner, I don't have the time to jump into such a project at the moment, but I see that jupyterlite supports numpy, ipywidgets and ipyevents, and is using async just as classic jupyter, so actually it may be possible to put aside audio, 3d, and shadertoy support and just reimplement the sprites and labels modules, start with them, and get an initial version of jupylet for jupyterlite and continue from there (that is how jupylet started anyway).
Probably worth compiling to WASM and running the tests first
This is the requests-wasm-polyfill, because actual requests is not built on the ES/JS fetch api:
https://github.com/emscripten-forge/requests-wasm-polyfill/tree/main/requests
an example emscripten-forge empack package that has OpenGL/EGL code?
It looks like there's already glfw support in emscripten.
Is VTK actually out of scope? There probably could be a VTK backend; IDK what the advantage would be.
actually it may be possible to put aside audio, 3d, and shadertoy support
W3C Web Audio API
https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API
https://github.com/topics/web-audio-api
W3C Gamepad API
https://developer.mozilla.org/en-US/docs/Web/API/Gamepad_API
https://github.com/pybricks/support/issues/995
W3C Sensor API
https://developer.mozilla.org/en-US/docs/Web/API/Sensor_APIs
Like JupyterLite (and the vscode pyodide extensions) pyscript is also built on pyodide:
https://dev.jeff.glass/pyscript-audio/index.html
https://realpython.com/pyscript-python-in-browser/#sensor-api
3d
https://news.ycombinator.com/item?id=32657051 :
[ ] ENH: SensorCraft: replace Pyglet (OpenGL) with an alternate WebGL/WebGPU implementation
https://github.com/quobit/awesome-python-in-education/issues/50
|
gharchive/issue
| 2023-03-11T23:01:12
|
2025-04-01T06:39:46.429832
|
{
"authors": [
"martinRenou",
"nir",
"westurner"
],
"repo": "nir/jupylet",
"url": "https://github.com/nir/jupylet/issues/38",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
195860873
|
Can i limit the reading area?
Is there any way I can use the full screen camera area to display but limit the reading area to a subview?
Yes. The reading area is your SurfaceView, so how you size it (be it full screen or a specific height and width) in your layout is on you. Next is where you display your read data, which again depends on where you show it.
I may be able to answer more clearly if you gave me a specific case.
Hi, I'm sorry I took so long to answer, but I've been a little busy.
What I was trying to say is that I wanted to show the camera on the whole screen but limit the QR reading area to another inside view, like this:
I ended up modifying your QRDataListener by adding another onDetected method that sends the whole Barcode object so I could get the Rect of the read data and compare it to the Rect of my inside view in order to limit the reading area.
QRDataListener:
public interface QRDataListener {
/**
* On detected.
*
* @param data
* the data
*/
// Called from not main thread. Be careful
void onDetected(final String data);
// Allows to use the whole captured data
void onDetected(final Barcode data);
}
QREader:
@Override
public void receiveDetections(Detector.Detections<Barcode> detections) {
final SparseArray<Barcode> barcodes = detections.getDetectedItems();
if (barcodes.size() != 0 && qrDataListener != null) {
qrDataListener.onDetected(barcodes.valueAt(0));
qrDataListener.onDetected(barcodes.valueAt(0).displayValue);
}
}
My implementation:
...
qReader = new QREader.Builder(ScanActivity.this, scanner, new QRDataListener() {
@Override
public void onDetected(final String data) {
}
@Override
public void onDetected(Barcode barcode) {
if(areaRect.contains(barcode.getBoundingBox())) {
final String data = barcode.displayValue;
qReader.stop();
...
}
}
}).facing(QREader.BACK_CAM)
.enableAutofocus(true)
.height(width)
.width(height)
.build();
...
I hope this could help somebody else and I thank you for your attention.
@primissus if you added a feature, do consider sending a PR.
I am facing a problem with the aspect ratio of the camera preview.
It is always displayed with a bad aspect ratio.
What should I do?
@simonkarmy I am facing the same issue too. Did you find any solution for that?
|
gharchive/issue
| 2016-12-15T17:07:14
|
2025-04-01T06:39:46.450910
|
{
"authors": [
"NilaxSpaceo",
"nisrulz",
"primissus",
"simonkarmy"
],
"repo": "nisrulz/qreader",
"url": "https://github.com/nisrulz/qreader/issues/29",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1138017564
|
chore: add example deployment
Deploys to vercel
Codecov Report
Merging #1 (aab750f) into develop (84c2fbf) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## develop #1 +/- ##
========================================
Coverage 82.08% 82.08%
========================================
Files 1 1
Lines 67 67
Branches 21 21
========================================
Hits 55 55
Misses 12 12
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 84c2fbf...aab750f. Read the comment docs.
|
gharchive/pull-request
| 2022-02-15T00:42:03
|
2025-04-01T06:39:46.462240
|
{
"authors": [
"codecov-commenter",
"davemooreuws"
],
"repo": "nitrictech/react-animated-term",
"url": "https://github.com/nitrictech/react-animated-term/pull/1",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2010573617
|
add test actions code
Summary
Added a test Actions workflow
#28
Closing because I got the branch name wrong
|
gharchive/pull-request
| 2023-11-25T11:06:37
|
2025-04-01T06:39:46.467639
|
{
"authors": [
"niwaniwa"
],
"repo": "niwaniwa/Sakura-Pi-Node",
"url": "https://github.com/niwaniwa/Sakura-Pi-Node/pull/29",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2644616408
|
Lorri displays "error extending watch paths" warning during successful operation
I recently upgraded my version of nix and lorri and began to encounter the following warning lines during lorri builds. It does this on every single new build, but not when loading from a cached build.
Nov 08 16:51:03.439 WARN error extending watch paths:, paths: [Normal("/Users/pd/.config/nixpkgs/config.nix"), Normal("<nix/fetchurl.nix>"), Normal("/Users/pd/Desktop/lorri-init-path-errors/shell.nix"), Normal("<nix/derivation-internal.nix>")], error: Error { kind: Io(Os { code: 2, kind: NotFound, message: "No such file or directory" }), paths: [] }, nix_file: /Users/pd/Desktop/lorri-init-path-errors/shell.nix
I was able to create a minimal reproduction using a brand new repository where the only files were created by lorri init, you should be able to check it out here:
https://github.com/peterldowns/lorri-init-path-warnings
Hopefully you can reproduce the behavior by checking out the repository and running:
cd lorri-init-path-warnings
direnv allow .
lorri watch --once
Expected behavior
Lorri functions correctly, updates the current shell environment to add the hello binary to the $PATH.
Actual behavior
Lorri functions correctly, updates the current shell environment to add the hello binary to the $PATH, and prints the warning log visible above.
Metadata
$ lorri info --shell-file shell.nix
Project Shell File: /Users/pd/Desktop/lorri-init-path-errors/shell.nix
Project Garbage Collector Root: /Users/pd/Library/Caches/com.github.nix-community.lorri.lorri.lorri/gc_roots/bad6431c93487fe2821f8b802ac8a0d8/gc_root/shell_gc_root
General:
Lorri User GC Root Dir: /Users/pd/Library/Caches/com.github.nix-community.lorri.lorri.lorri/gc_roots
Lorri Daemon Socket: /Users/pd/Library/Caches/com.github.nix-community.lorri.lorri.lorri/daemon.socket
Lorri Daemon Status: `lorri daemon` is running
$ uname -a
Darwin pld-mbp-22 23.4.0 Darwin Kernel Version 23.4.0: Fri Mar 15 00:10:42 PDT 2024; root:xnu-10063.101.17~1/RELEASE_ARM64_T6000 arm64 arm Darwin
Additional context
macOS 14.4.1 Sonoma
lorri 1.7.1 installed via service.lorri.enable = true; in my system flake
system flake nixpkgs (where lorri comes from) is github:NixOS/nixpkgs/85f7e662eda4fa3a995556527c87b2524b691933?narHash=sha256-JwQZIGSYnRNOgDDoIgqKITrPVil%2BRMWHsZH1eE1VGN0%3D (2024-11-07 05:50:23)
Interesting. Thanks for putting together a reproduction. I'll see if I can take a look over the weekend, but I'm on the road next week.
I agree: the warning should describe which path can't be found, at the very least.
Cool, thanks for the quick response. This isn't an urgent bug in any way, since the only problem is that there is a warning shown — lorri still works great. Thanks for maintaining and for the improvements you've been making, I'm looking forward to the improved flakes support!
|
gharchive/issue
| 2024-11-08T17:01:09
|
2025-04-01T06:39:46.488519
|
{
"authors": [
"nyarly",
"peterldowns"
],
"repo": "nix-community/lorri",
"url": "https://github.com/nix-community/lorri/issues/137",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1809641283
|
Use Nix copy command instead of nix-closure-copy
This relates to #63
I tested it, and it's very fast.
It only works fast on NixOS unstable though, as NixOS 23.05 contains Nix 2.13, which doesn't support that yet.
On older NixOS it just works slowly.
I tested it, and it's very fast.
Can you provide some numbers to go with this? How does it deal with latency?
@exarkun sure,
here is a time terraform apply -auto-approve on some example terraform on AWS (with around 100ms latency) and sample nix config without the patch:
terraform apply -auto-approve 14.52s user 4.95s system 3% cpu 8:26.58 total
and here is with the patch (on completely new machine):
terraform apply -auto-approve 7.47s user 1.55s system 14% cpu 1:00.89 total
@exarkun i don’t have capacity to test this right now. I’d be happy to merge it if it was opt-in. Can we add a flag in the terraform module to switch between original implementation and this one? Thanks!
|
gharchive/pull-request
| 2023-07-18T10:24:25
|
2025-04-01T06:39:46.493527
|
{
"authors": [
"adrian-gierakowski",
"exarkun",
"smulikHakipod"
],
"repo": "nix-community/terraform-nixos",
"url": "https://github.com/nix-community/terraform-nixos/pull/76",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1607650525
|
Release tree-sitter-nix on Crates.io
Hi, I'd love to include tree-sitter-nix in my markdown to html converter crate, but to do that I can't have git dependencies.
Do you have any interest or plans on releasing the Rust adapter on crates.io?
Oh yeah, that's a good idea! I'll see if I can push it up tonight or this weekend.
@benwis I just now published the crate: https://crates.io/crates/tree-sitter-nix :tada:
Lemme know if there's anything else I can do!
|
gharchive/issue
| 2023-03-02T22:56:40
|
2025-04-01T06:39:46.496021
|
{
"authors": [
"benwis",
"cstrahan"
],
"repo": "nix-community/tree-sitter-nix",
"url": "https://github.com/nix-community/tree-sitter-nix/issues/36",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
173582523
|
Error building on FreeBSD
Been getting the following while with nix
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:49:5: 49:11 error: unresolved import `libc::SIGPWR`. There is no `SIGPWR` in `libc` [E0432]
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:49 SIGPWR,
^~~~~~
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:49:5: 49:11 help: run `rustc --explain E0432` to see a detailed explanation
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:50:5: 50:14 error: unresolved import `libc::SIGSTKFLT`. There is no `SIGSTKFLT` in `libc`. Did you mean to use `SIGSTKSZ`? [E0432]
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:50 SIGSTKFLT,
^~~~~~~~~
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:50:5: 50:14 help: run `rustc --explain E0432` to see a detailed explanation
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:52:5: 52:12 error: unresolved import `libc::SIGPOLL`. There is no `SIGPOLL` in `libc`. Did you mean to use `SIGILL`? [E0432]
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:52 SIGPOLL, // Alias for SIGIO
^~~~~~~
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:52:5: 52:12 help: run `rustc --explain E0432` to see a detailed explanation
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:53:5: 53:14 error: unresolved import `libc::SIGUNUSED`. There is no `SIGUNUSED` in `libc` [E0432]
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:53 SIGUNUSED, // Alias for 31
^~~~~~~~~
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/sys/signal.rs:53:5: 53:14 help: run `rustc --explain E0432` to see a detailed explanation
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/unistd.rs:188:47: 188:50 error: mismatched types [E0308]
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/unistd.rs:188 let res = unsafe { libc::sethostname(ptr, len) };
^~~
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/unistd.rs:188:47: 188:50 help: run `rustc --explain E0308` to see a detailed explanation
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/unistd.rs:188:47: 188:50 note: expected type `i32`
/home/abuilder/.cargo/registry/src/github.com-1ecc6299db9ec823/nix-0.6.0/src/unistd.rs:188:47: 188:50 note: found type `usize`
Looks like the issue is fixed in 0.6.1-pre.
|
gharchive/issue
| 2016-08-27T07:08:07
|
2025-04-01T06:39:46.498023
|
{
"authors": [
"dariusc93"
],
"repo": "nix-rust/nix",
"url": "https://github.com/nix-rust/nix/issues/409",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
212023668
|
PTRACE_O_SYSGOOD flag is interpreted as part of a signal
In sys::wait, if waitpid returns a status originating from ptrace and PTRACE_O_SYSGOOD is set, the seventh (0x80) bit of the status will be flipped. This causes status parsing to fail.
The flag technically isn't a ptrace event either, but I feel like that would be the best place to put it. Alternatively, a SYSGOOD-status could be its own unique kind of status, or a boolean entry could be added in WaitStatus::Stopped to mark it.
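This isn't the nix-rust code — just a hedged Python sketch of the status encoding involved, assuming the Linux convention that a stopped `waitpid` status is encoded as `(sig << 8) | 0x7f`, so a parser has to strip the SYSGOOD marker bit before interpreting the signal:

```python
import os
import signal

def stop_signal(status):
    """Extract the stop signal from a waitpid status, stripping the
    PTRACE_O_TRACESYSGOOD marker bit (0x80) that a traced syscall stop sets."""
    assert os.WIFSTOPPED(status)
    sig = os.WSTOPSIG(status)           # may be SIGTRAP | 0x80
    syscall_stop = bool(sig & 0x80)     # True for a syscall-entry/exit stop
    return sig & ~0x80, syscall_stop

# A stopped waitpid status on Linux is encoded as (sig << 8) | 0x7f:
status = ((signal.SIGTRAP | 0x80) << 8) | 0x7f
sig, is_syscall = stop_signal(status)
```

A parser that compares the raw `WSTOPSIG` value directly against signal numbers (as the status-parsing code described above effectively does) sees `0x85` instead of `SIGTRAP` and fails.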
I've opened a pull request simply ignoring the bit for now (#549)
@chaosagent thanks for the report. What's the platform / architecture this happens on?
This only happens on Linux, and I can confirm it happens on x86_64, but the Linux man page says that it might not work on all platforms.
The name of the flag is PTRACE_O_TRACESYSGOOD btw; I'll change the issue title to reflect that.
|
gharchive/issue
| 2017-03-06T04:51:23
|
2025-04-01T06:39:46.500321
|
{
"authors": [
"chaosagent",
"kamalmarhubi"
],
"repo": "nix-rust/nix",
"url": "https://github.com/nix-rust/nix/issues/550",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1858675220
|
Prerequisites
[X] I have tried updating the template version
[X] I have searched the template manual
[X] I have searched the project wiki
[X] I have confirmed that this issue has not been raised in any other issues.
Expected behavior
The thesis title on the English abstract page should render without problems
What happened
Below is the setting for the English title:
Below is the English title on the abstract page:
Minimal working example
See **What happened**
njuthesis template version
v1.3.0
How the template was obtained
Downloaded via GitHub Releases or a mirror site
Operating system
Windows
TeX distribution
No response
Compilation engine
XeLaTeX
Additional information
No response
Fixed.
Are you planning to publish a release? If none is coming soon, I'll just clone the repo directly.
The template format files generated for each commit can now be downloaded from the Artifacts section of https://github.com/nju-lug/NJUThesis/actions/runs/5924875459.
|
gharchive/issue
| 2023-08-21T06:17:26
|
2025-04-01T06:39:46.520243
|
{
"authors": [
"atxy-blip",
"elem-azar-unis"
],
"repo": "nju-lug/NJUThesis",
"url": "https://github.com/nju-lug/NJUThesis/issues/227",
"license": "LPPL-1.3c",
"license_type": "permissive",
"license_source": "github-api"
}
|
1327192667
|
IndexError: only integers, slices (:), ellipsis (...), numpy.newaxis (None) Error
Using this example:
from sklearn import datasets
import numpy as np
from classix import CLASSIX
X, y = datasets.make_blobs(n_samples=5000, centers=2, n_features=2, cluster_std=1, random_state=1)
clx = CLASSIX(sorting='pca', radius=0.15, group_merging='density', verbose=1, minPts=13, post_alloc=False)
clx.fit(X)
I am getting the following error:
---------------------------------------------------------------------------
IndexError Traceback (most recent call last)
<ipython-input-569-bb587af68fc1> in <module>
5 X, y = datasets.make_blobs(n_samples=5000, centers=2, n_features=2, cluster_std=1, random_state=1)
6 clx = CLASSIX(sorting='pca', radius=0.15, group_merging='density', verbose=1, minPts=13, post_alloc=False)
----> 7 clx.fit(X)
8
9 X
~/miniconda3/envs/ltf-analysis/lib/python3.8/site-packages/classix/clustering.py in fit(self, data)
506 self.labels_ = copy.deepcopy(self.groups_)
507 else:
--> 508 self.labels_ = self.clustering(
509 data=self.data,
510 agg_labels=self.groups_,
~/miniconda3/envs/ltf-analysis/lib/python3.8/site-packages/classix/clustering.py in clustering(self, data, agg_labels, splist, sorting, radius, method, minPts)
724 # self.merge_groups = merge_pairs(self.connected_pairs_)
725
--> 726 self.merge_groups, self.connected_pairs_ = self.fast_agglomerate(data, splist, radius, method, scale=self.scale)
727 maxid = max(labels) + 1
728
~/miniconda3/envs/ltf-analysis/lib/python3.8/site-packages/classix/merging.py in fast_agglomerate(data, splist, radius, method, scale)
115 # den1 = splist[int(i), 2] / volume # density(splist[int(i), 2], volume = volume)
116 for j in select_stps.astype(int):
--> 117 sp2 = data[splist[j, 0]] # splist[int(j), 3:]
118
119 c2 = np.linalg.norm(data-sp2, ord=2, axis=-1) <= radius
IndexError: only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`) and integer or boolean arrays are valid indices
python==3.8.12
classixclustering==0.6.5
numpy==1.22.0
scipy==1.7.3
I'll give it a spin and report back! Thanks for the prompt action!!
This seems to be working now. It does say that it isn't using Cython, but it was quick enough anyway for what I was doing.
Many thanks!
|
gharchive/issue
| 2022-08-03T13:07:16
|
2025-04-01T06:39:46.586085
|
{
"authors": [
"joshdunnlime"
],
"repo": "nla-group/classix",
"url": "https://github.com/nla-group/classix/issues/9",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
52691391
|
Linux
Is this compatible with Linux ETS2?
The current revisions are DLLs and only run on Windows.
I am unaware of any support for SDK plugins in ETS2 for Linux either. All SDK examples are for Windows, and they also use Windows-specific includes in their code.
I don't run Linux as a gaming OS at this moment, so I haven't got ETS2 installed to try this out.
If you do have any related links to the ETS2 SDK and Linux, please let me know.
|
gharchive/issue
| 2014-12-22T21:39:33
|
2025-04-01T06:39:46.587683
|
{
"authors": [
"TheAifam5",
"nlhans"
],
"repo": "nlhans/ets2-sdk-plugin",
"url": "https://github.com/nlhans/ets2-sdk-plugin/issues/4",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1307918060
|
Multiple versions causing conflict
Description
I'm using a 3rd party library that uses nlohmann/json under the hood. I am also using it in my code.
I have found that unless these two versions are exactly the same, there are side effects that cause the code not to function in the 3rd party library. (The failure happens with 3.9.2 in the 3rd party library and 3.10.5 in mine.) The exact failure I can't pinpoint, because it happens deep in code I'm not familiar with, but it causes no build-time error or crash, but DOES cause incorrect behavior that appears to be null JSON objects.
Keeping my version matched to the library gives me temporary respite, but is a pain to maintain, and becomes unsolvable if it happens with yet another library.
The problem is clearly a collision in object code resolution. Could we have it customizable to change from nlohmann to MYNAMESPACE::nlohmann? Wrapping the header file in a namespace doesn't work for a variety of reasons. I CAN do a very cheap hack by doing
#define nlohmann my_nlohmann
but I suspect that might not be very sustainable.
Or perhaps there is another solution that I haven't found online?
Reproduction steps
The 3rd party library in question is depthai-core, in a dependency.
Expected vs. actual results
The exact throw is "cannot use at() with null", but this is due to a previous problem.
Minimal code example
No response
Error messages
No response
Compiler and operating system
Linux gcc
Library version
3.9.2, 3.10.5
Validation
[ ] The bug also occurs if the latest version from the develop branch is used.
[ ] I can successfully compile and run the unit tests.
What about a versioned inline namespace? (@nlohmann)
I might try it later, but it'll need some support from your release scripts to bump the namespace version.
But I agree; if the header file used
namespace nlohmann_3_10_5 {...}
namespace nlohmann = nlohmann_3_10_5;
then that would easily solve the problem forevermore.
That's not quite what I'm suggesting.
namespace nlohmann {
inline namespace v3_10_5 {
// ...
}
}
Nothing changes for end users unless there's an ambiguity because both headers are included in one translation unit.
See https://en.cppreference.com/w/cpp/language/namespace#Inline_namespaces for an explanation.
Are you having this issue because two different versions of the library headers are being included in the same file (translation unit) or because two different files (translation units) with different versions are linked together? The former is now prevented by https://github.com/nlohmann/json/pull/3418 in the develop branch.
Are you having this issue because two different versions of the library headers are being included in the same file (translation unit) or because two different files (translation units) with different versions are linked together? The former is now prevented by #3418 in the develop branch.
Based on the proposed workaround, I'm assuming this is about linking different versions together. An inline namespace would result in different symbol names. JSON_DIAGNOSTICS would still pose a problem, but could be solved with an inline namespace as well, as mentioned during one of the last ABI-related discussions.
FYI, we'll soon check whether incompatible versions are used together, see https://json.nlohmann.me/api/macros/json_skip_library_version_check/#runtime-assertions. That is, 3.11.0 will detect when it is used by earlier versions.
@nlohmann That only helps with including two different versions of the header in the same file. It doesn't help with ODR violations that occur at link time when you use different versions in different files. There are things that can help with that too, but we're not doing any of them yet.
https://docs.microsoft.com/en-us/cpp/preprocessor/detect-mismatch?view=msvc-170
This can also be used to detect one file built with diagnostics, and one without. It will only help going forward as it can only detect when the pragmas exist and are different between files. It won't prevent previous versions being used together or with the new version.
@gregmarr The problem is specifically two object files that use diffent versions of the header-only json, so that the wrong code gets executed leading to bad data. The inlined versioning looks like it would work, and is much more elegant than my brute force #define
|
gharchive/issue
| 2022-07-18T12:55:16
|
2025-04-01T06:39:46.650655
|
{
"authors": [
"falbrechtskirchinger",
"gregmarr",
"nathanieltagg",
"nlohmann"
],
"repo": "nlohmann/json",
"url": "https://github.com/nlohmann/json/issues/3588",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
228480051
|
fix doxygen error of basic_json::get()
@ref command do not seems to work in markdown code span. so i change
`void from_json(const @ref basic_json&, ValueType&)`
into
`void from_json(const ` @ref basic_json`&, ValueType&)`
Is there a better solution for this problem?
The mentioned functions are not meant to be linked in the first place, because they are most likely in the client's code - it can be any from_json function based on ValueType. I'm not sure whether fixing this so that basic_json can be clicked brings much value here.
Coverage remained the same at 99.722% when pulling dfa371c436fa14926544eb11f3207d0439cd2c43 on zhaohuaxishi:doxygen_error into 9b764ee5d671b41255d390ab9089f12036b2d38a on nlohmann:develop.
Does that mean the @ref command should just be removed, then?
Coverage remained the same at 99.722% when pulling b8dff3bc1674acb6a30cd733ecae545b93b1baeb on zhaohuaxishi:doxygen_error into 723c87560459eaacb3e3b3f0d4666bd2d02be317 on nlohmann:develop.
|
gharchive/pull-request
| 2017-05-13T15:22:13
|
2025-04-01T06:39:46.656990
|
{
"authors": [
"coveralls",
"nlohmann",
"zhaohuaxishi"
],
"repo": "nlohmann/json",
"url": "https://github.com/nlohmann/json/pull/583",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
642520078
|
how to understand use_bert_emb?
I changed the BERT model to an ALBERT model.
But when use_bert_emb is set to true, the model reports an error: RuntimeError: The size of tensor a (128) must match the size of tensor b (768) at non-singleton dimension 2.
If use_bert_emb is set to false, the model works normally.
How can I solve this problem?
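I can't speak to PreSumm's internals, but the 128-vs-768 mismatch looks like ALBERT's factorized embeddings: its word embeddings are 128-dimensional while its hidden states are 768-dimensional, so the two can't be added directly. A hedged NumPy sketch of the usual fix — projecting the embeddings up to the hidden size before combining (all names and sizes here are illustrative, not PreSumm's):

```python
import numpy as np

rng = np.random.default_rng(0)

vocab, emb_dim, hidden_dim = 30000, 128, 768       # ALBERT-style sizes (assumed)
emb_table = rng.standard_normal((vocab, emb_dim))
proj = rng.standard_normal((emb_dim, hidden_dim))  # learned weight in practice

tokens = rng.integers(0, vocab, size=(2, 16))
hidden = np.zeros((2, 16, hidden_dim))

# emb_table[tokens] is (2, 16, 128); projecting it gives (2, 16, 768),
# which can now be added to the 768-dim hidden states without a shape error.
out = hidden + emb_table[tokens] @ proj
```

In a PyTorch model this would be an `nn.Linear(128, 768)` applied to the embedding output wherever use_bert_emb feeds the word embeddings into the 768-dim pathway.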
I have the same problem with you. Have you solved it?
Sorry, the problem has not been solved for the time being.
|
gharchive/issue
| 2020-06-21T08:26:02
|
2025-04-01T06:39:46.665480
|
{
"authors": [
"17839192463",
"GuanNiPiShi123"
],
"repo": "nlpyang/PreSumm",
"url": "https://github.com/nlpyang/PreSumm/issues/179",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
113699096
|
Is generate() coming back?
Would really love to see this function back again. The book says it will be reinstated in a later version, but when?
+1. Given the recent RNN generation from @karpathy (https://github.com/karpathy/char-rnn), it might be very easy to create a generate algorithm, but support for any NN implementation is very unstable; the API changes almost every 2-3 weeks =(
Is there a reason why this function was removed? Just curious.
|
gharchive/issue
| 2015-10-27T21:59:33
|
2025-04-01T06:39:46.667169
|
{
"authors": [
"JonathanReeve",
"alvations"
],
"repo": "nltk/nltk",
"url": "https://github.com/nltk/nltk/issues/1180",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
103795622
|
#2404 Modal open calls should complete in order
This is a second PR attempt for issue #2404. (PR #2443 has rebase issues and will be closed.)
Oops...PR'd to the fork instead of upstream.
|
gharchive/pull-request
| 2015-08-28T21:06:05
|
2025-04-01T06:39:46.668158
|
{
"authors": [
"nlwillia"
],
"repo": "nlwillia/bootstrap",
"url": "https://github.com/nlwillia/bootstrap/pull/1",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2381198262
|
Stage 6 updates
This is based on #3, so that should get merged first.
This does the bare minimum to allow the user to go to Stage 6. Guidelines with mc or fr questions will not appear, so if you try to navigate past the first guideline, you will be stuck.
I did not attempt to set up any gates, but I can work on that once the new mc & fr methods are set up.
I did not attempt to wire up the layer toggle or hubble viewer. @johnarban or @Carifio24, perhaps one of you could work on that once you're finished with the other content you are updating. Thanks!
Opened new PR at https://github.com/cosmicds/hubbleds/pull/436 to merge this to cosmicds main, so closing this.
Reopening this and closing https://github.com/cosmicds/hubbleds/pull/436
|
gharchive/pull-request
| 2024-06-28T21:03:41
|
2025-04-01T06:39:46.687332
|
{
"authors": [
"patudom"
],
"repo": "nmearl/hubbleds",
"url": "https://github.com/nmearl/hubbleds/pull/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2433869434
|
Benefit of GRAAL VM integration
Theoretical, mostly.
Current situation with GRAAL, native-image does not run properly.
Rhino is an order of magnitude faster than Graal's JS implementation.
https://github.com/oracle/graaljs/issues/836
It is also evident that graal is slow, at least by 12% for our use case.
More update: https://github.com/mozilla/rhino/issues/1555
|
gharchive/issue
| 2024-07-28T10:46:44
|
2025-04-01T06:39:46.692869
|
{
"authors": [
"nmondal"
],
"repo": "nmondal/cowj",
"url": "https://github.com/nmondal/cowj/issues/111",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2621104257
|
Re-add deprecated function deleteThirdPartyOwnedRelationshipAttributeAndNotifyPeer
For backwards compatibility
Readiness checklist
[ ] I added/updated tests.
[x] I ensured that the PR title is good enough for the changelog.
[x] I labeled the PR.
@Milena-Czierlinski as we are the only consumers we can keep this removed silently! :)
@sebbi08 expressed his concern that the previous release introduced a breaking change for integrators.
IMO as the API is unchanged and we don't know about any consumers of the js sdk other than us, we can ignore it.
also I don't expect that someone is already using this functionality
@sebbi08 as you requested this, any word from your side for or against this?
If we are sure that no one uses this, I am also fine with just removing it. Otherwise, this would be a breaking change.
|
gharchive/pull-request
| 2024-10-29T12:29:23
|
2025-04-01T06:39:46.700369
|
{
"authors": [
"Milena-Czierlinski",
"jkoenig134",
"sebbi08"
],
"repo": "nmshd/connector",
"url": "https://github.com/nmshd/connector/pull/299",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
639072185
|
Release version 0.3.2
Creating a New Release
Changelog
[x] First, update the CHANGELOG file in the project root directory. Use the command
git log --oneline v0.3.1..HEAD to get the changes since the last tag. Add
an entry like the following:
## [X.Y.Z] - YYYY-MM-DD
### Breaking Changes
- ...
### New Features
- ...
### Bug Fixes
- ...
Tagging
[x] Tag new release in git.
[x] Ensure that the vdsm tests ran against this state.
# Make sure your local git repo is sync with upstream.
# The whole version string should be like `v0.0.3`.
# Put strings like `nmstate 0.0.3 release` as commit message.
git tag --sign v<version>
git push upstream --tags
[x] If you need to remove a tag because something needs to be fixed:
# Remove local tag
git tag -d <tag_name>
# Remove upstream tag
git push --delete upstream <tag_name>
GitHub Release
[x] Generate and sign the tarball.
git clean -x -d -n
# before running the next command check, that it is ok to remove the files
git clean -x -d -f
# Please remove python3-setuptools_scm, or it will add all git files into the tarball.
env --unset=PYTHONPATH python3 setup.py sdist
gpg2 --armor --detach-sign dist/nmstate-<version>.tar.gz
[x] Visit github draft release page.
[x] Make sure you are in Release tab.
[x] Choose the git tag just pushed.
[x] Title should be like Version 0.0.3 release.
[x] The content should be copied from the CHANGELOG file.
[x] Click Attach binaries by dropping them here or selecting them. and upload
the dist/nmstate-<version>.tar.gz and dist/nmstate-<version>.tar.gz.asc.
[x] Download the tarball and the signature.
[x] Check if the signature is correct.
curl --silent https://www.nmstate.io/nmstate.gpg | gpg2 --import
gpg2 --verify nmstate-<version>.tar.gz.asc nmstate-<version>.tar.gz
[x] Check in a clean Fedora/centOS container if the package builds and installs correctly.
podman run -d -it --name <name> docker.io/library/fedora:31 bash
podman cp nmstate-<version>.tar.gz <name>:/home/nmstate-<version>.tar.gz
podman exec -it <name> bash
cd /home/
tar xzvf nmstate-<version>.tar.gz
python3 setup.py build
python3 setup.py install
[x] Click Save draft and ask for review.
[x] Click Publish release once approved.
PyPi Release
# Make sure you installed python package: wheel and twine.
yum install twine python3-wheel
rm -rf dist
python3 setup.py sdist bdist_wheel
# Upload to pypi test.
python3 -m twine upload --repository-url https://test.pypi.org/legacy/ dist/*
# Now, check https://test.pypi.org/project/nmstate/
# If it works, now upload to pypi.
python3 -m twine upload dist/*
Post Release
[x] 1. Create a pull request with increased version number in the VERSION file
and merge it before any other PR. This is necessary to ensure that the
development RPMs are newer than the stable version in distributions.
[x] 2. Update the SPEC files in Fedora, create new builds and updates as necessary
[x] 3. Rebuild Copr repositories for stable releases as necessary (this requires
the SPEC files in Fedora to be updated, first)
https://copr.fedorainfracloud.org/coprs/nmstate/
[x] 4. Send out a notification to the fedorahosted mailing list:
nmstate-devel@lists.fedorahosted.org
All done, thanks!
|
gharchive/issue
| 2020-06-15T19:04:41
|
2025-04-01T06:39:46.710585
|
{
"authors": [
"cathay4t",
"ffmancera"
],
"repo": "nmstate/nmstate",
"url": "https://github.com/nmstate/nmstate/issues/1116",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1464807763
|
Draft: Add torch-based CMA-ES implementation
This pull request:
Refactors the existing evotorch.algorithms.cmaes.CMAES, which is simply a wrapper for pycma, into evotorch.algorithms.pycmaes.PyCMAES.
Introduces a new, torch-based, implementation of CMAES under the name evotorch.algorithms.cmaes.CMAES. This aims to faithfully reimplement the most recent version of pycma, while also benefiting from the evotorch ecosystem e.g. vectorized, GPU based, working with torch tensors for compatibility with other features.
Introduces evotorch.algorithms.restarters, which provides basic functionality for meta-algorithms that wrap around a SearchAlgorithm class + parameterization, and allow automatic restarting of the algorithm. This creates basic functionality for the eventual re-implementation e.g. of IPOP-CMA-ES and BIPOP-CMA-ES, and equivalent algorithms for XNES, SNES etc.
Adds the is_terminated flag to all SearchAlgorithm classes, that will allow the user to define their own general functionality for detecting termination states that should trigger restarts. Currently, this value defaults to False, but in the future it is intended to add example termination states to the new CMAES implementation.
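As a rough illustration of the restarter idea described above (class and attribute names here are illustrative stand-ins, not the actual evotorch API): a meta-algorithm can hold a factory for the wrapped search algorithm and re-instantiate it whenever `is_terminated` becomes true.

```python
class ToyAlgorithm:
    """Stand-in for a SearchAlgorithm: pretends to terminate after a fixed budget."""
    def __init__(self, budget):
        self.steps_taken = 0
        self.budget = budget

    @property
    def is_terminated(self):
        return self.steps_taken >= self.budget

    def step(self):
        self.steps_taken += 1

class SimpleRestarter:
    """Re-create the wrapped algorithm whenever it terminates.
    IPOP/BIPOP-style variants would additionally grow or alternate the
    population size inside the factory call."""
    def __init__(self, algorithm_factory):
        self._factory = algorithm_factory
        self.inner = algorithm_factory()
        self.num_restarts = 0

    def step(self):
        if self.inner.is_terminated:
            self.inner = self._factory()
            self.num_restarts += 1
        self.inner.step()

restarter = SimpleRestarter(lambda: ToyAlgorithm(budget=3))
for _ in range(10):
    restarter.step()
print(restarter.num_restarts)  # 3 restarts within a 10-step outer loop
```

The point is only the control flow: the restarter owns the parameterization (the factory), so the same wrapper works for CMAES, XNES, SNES, etc.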
Codecov Report
Merging #41 (63f3cc0) into master (d3e3f0a) will decrease coverage by 2.14%.
The diff coverage is 12.45%.
:exclamation: Current head 63f3cc0 differs from pull request most recent head b1ec581. Consider uploading reports for the commit b1ec581 to get more accurate results
@@ Coverage Diff @@
## master #41 +/- ##
==========================================
- Coverage 52.97% 50.82% -2.15%
==========================================
Files 43 47 +4
Lines 6233 6471 +238
==========================================
- Hits 3302 3289 -13
- Misses 2931 3182 +251
| Impacted Files | Coverage Δ |
|---|---|
| src/evotorch/algorithms/restarter/__init__.py | 0.00% <0.00%> (ø) |
| ...rc/evotorch/algorithms/restarter/modify_restart.py | 0.00% <0.00%> (ø) |
| src/evotorch/algorithms/restarter/restart.py | 0.00% <0.00%> (ø) |
| src/evotorch/algorithms/cmaes.py | 15.47% <9.33%> (-5.88%) :arrow_down: |
| src/evotorch/algorithms/pycmaes.py | 21.34% <21.34%> (ø) |
| src/evotorch/algorithms/searchalgorithm.py | 39.26% <66.66%> (+0.43%) :arrow_up: |
| src/evotorch/algorithms/__init__.py | 100.00% <100.00%> (ø) |
| src/evotorch/core.py | 61.59% <0.00%> (-3.96%) :arrow_down: |
| src/evotorch/operators/base.py | 33.01% <0.00%> (+0.30%) :arrow_up: |
|
gharchive/pull-request
| 2022-11-25T17:52:33
|
2025-04-01T06:39:46.728870
|
{
"authors": [
"NaturalGradient",
"codecov-commenter"
],
"repo": "nnaisense/evotorch",
"url": "https://github.com/nnaisense/evotorch/pull/41",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
782663638
|
Add lib Changelog to the PR feature
Feature Request
I known this is not simple but if the lib has a CHANGELOG file on it's repo (if it's a github repo or something like that), add the changelog to the PR description would be great
I'll take a bit to think through this one. I don't think it'll be too hard to write a generic header and the same info as is in the commit message, but I also suspect this would be best implemented with some configurable options around the CHANGELOG message.
Yeah, at work (Nubank) we have a bot that opens PRs called Bumpito, it's pretty similar to your github action, but it's closed source, unfortunately...
It opens PRs for our common libs bumps and in the description, it prints something like this:
PR description:
Bumpito has found new common libraries versions.
Refer to the following changelogs to see what is new :newspaper_roll:
Changelogs
common-kafka
10.86.0
Remove jackson-core explicit dependency from common-kafka
10.85.1
remove consumer cron tick code
common-i18n
4.5.0
Fix Federal Holidays for Mexico
common-crypto
10.24.1
improve the way of communicating error getting s3 keys
common-metrics
10.6.0
Add prometheus text format parser
common-redis
Could not parse changelog :(
@ericdallo Following up on this, antq now has the ability to surface links to the GitHub diff between the old and the new version. It's slightly less ergonomic than a compiled list of changelog updates- but it is available for little extra effort. I've updated the commit messages to include these links in this PR: https://github.com/nnichols/clojure-dependency-update-action/pull/9
Would you find these sufficient for now?
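For reference, the diff links in question can be built mechanically from GitHub's compare-view URL scheme. A minimal sketch (the repository and tag names below are examples, and real projects vary in whether tags carry a `v` prefix):

```python
def compare_url(repo, old_version, new_version, tag_prefix="v"):
    """Build a GitHub compare link between two release tags."""
    old_tag = f"{tag_prefix}{old_version}"
    new_tag = f"{tag_prefix}{new_version}"
    return f"https://github.com/{repo}/compare/{old_tag}...{new_tag}"

print(compare_url("nnichols/clojure-dependency-update-action", "1.0.0", "1.1.0"))
# → https://github.com/nnichols/clojure-dependency-update-action/compare/v1.0.0...v1.1.0
```

This is less ergonomic than a parsed changelog, but it needs no per-library parsing and never fails with "Could not parse changelog".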
@nnichols I think it's worth a shot, better than nothing! thank you
Closed by: https://github.com/nnichols/clojure-dependency-update-action/pull/9
|
gharchive/issue
| 2021-01-09T18:44:22
|
2025-04-01T06:39:46.745069
|
{
"authors": [
"ericdallo",
"nnichols"
],
"repo": "nnichols/clojure-dependency-update-action",
"url": "https://github.com/nnichols/clojure-dependency-update-action/issues/4",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
57215196
|
Highlight Lines instead of Events in LineChart
Hi, thanks for the nice library!
I haven't come across whether this is possible or not. Is it possible to highlight a line, instead of the event, on mouse hover? The use case would be to easily follow the history of a particular line throughout all events. With anything more than a dozen or so lines, it becomes difficult to track the progress of a line.
Thanks.
Hello, I'm sorry this issue hasn't received much attention yet. It seems this is an implementation question, could you please post your question to StackOverflow with the Chart.js tag and include a link here. This will ensure your question reaches the largest audience.
https://stackoverflow.com/questions/tagged/chart.js
Huzzah! The first alpha of Chart.js 2.0 has landed and should fix this issue. Check out the release and try it out! We've got a lot of momentum right now, so please help us test so we can launch 2.0 Gold by the end of the month.
https://github.com/nnnick/Chart.js/releases/tag/v2.0-alpha
I'm closing this issue for now, but if you have implementation questions or find bugs, please create a jsfiddle and post the link here and we'll reopen this issue and get it fixed.
|
gharchive/issue
| 2015-02-10T18:28:58
|
2025-04-01T06:39:46.748559
|
{
"authors": [
"derekperkins",
"fulldecent",
"mazubieta"
],
"repo": "nnnick/Chart.js",
"url": "https://github.com/nnnick/Chart.js/issues/930",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2496573796
|
[android] Set envvar for Qualcomm related filters
QNN / SNPE needs some additional tasks. This job has been done by each of nnstreamer's sub-plugins.
Let the task be done in nnstreamer-native-api.c
https://github.com/nnstreamer/nnstreamer/pull/4563 and this PR should be merged together.
:memo: TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #556. Please a submit 1commit/1PR (one commit per one PR) policy to get comments quickly from reviewers. Your PR must pass all verificiation processes of cibot before starting a review process from reviewers. If you are new member to join this project, please read manuals in documentation folder and wiki page. In order to monitor a progress status of your PR in more detail, visit http://ci.nnstreamer.ai/.
|
gharchive/pull-request
| 2024-08-30T08:26:24
|
2025-04-01T06:39:46.751531
|
{
"authors": [
"anyj0527",
"taos-ci"
],
"repo": "nnstreamer/api",
"url": "https://github.com/nnstreamer/api/pull/556",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
930054204
|
[Wait for #1339] [Resnet] Connect the model with cifar100
[Pending commit: #1339]
[Resnet] Connect the model with cifar100
**Changes proposed in this PR:**
- Implement cifar100dataloader
- Connect the data loader
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped
Signed-off-by: Jihoon Lee <jhoon.it.lee@samsung.com>
:memo: TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #1340. Please a submit 1commit/1PR (one commit per one PR) policy to get comments quickly from reviewers. Your PR must pass all verificiation processes of cibot before starting a review process from reviewers. If you are new member to join this project, please read manuals in documentation folder and wiki page. In order to monitor a progress status of your PR in more detail, visit http://nnstreamer.mooo.com/.
:octocat: cibot: @zhoonit, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://nnstreamer.mooo.com/nntrainer/ci/repo-workers/pr-checker/1340-202106301147160.68477702140808-b72c6f2e4171fd48f8f7f3d062b38a56ca92f95f/.
:octocat: cibot: @zhoonit, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://nnstreamer.mooo.com/nntrainer/ci/repo-workers/pr-checker/1340-202107051429260.86057209968567-39d732fa59d7212522da55b25c70736d7248e2ae/.
|
gharchive/pull-request
| 2021-06-25T10:35:28
|
2025-04-01T06:39:46.760792
|
{
"authors": [
"taos-ci",
"zhoonit"
],
"repo": "nnstreamer/nntrainer",
"url": "https://github.com/nnstreamer/nntrainer/pull/1340",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2729316766
|
[ Meson ] BCQTensor dependency is added for Android build
This commit adds the BiQGEMM path to nntrainer_inc_abs to support the Android build with the enable-biqgemm option.
The path for BiQGEMM corresponds to the one in the top meson.build
Self evaluation:
Build test: [X]Passed [ ]Failed [ ]Skipped
Run test: [X]Passed [ ]Failed [ ]Skipped
LGTM! One minor suggestion: How about adding it to the tensor/meson.build for consistency?
https://github.com/nnstreamer/nntrainer/blob/cd17a66b8eec45a2e91e543a884722763b5e608e/nntrainer/tensor/meson.build#L79-L83
Following the suggestion from @djeong20, I moved the code to nntrainer/tensor/meson.build.
Since the existence check for BiQGEMM folder is completed at the precedent meson file, the duplicate check code is removed.
meson.source_root() / '..' / 'BiQGEMM'
DO NOT HARDCODE EXTERNAL PATHS IN MESON SCRIPT!
If pkgconfig or cmake is not ready with BiQGEMM,
get such information via meson_options and specify such "default paths" at meson_options.
And..
if get_option('enable-biqgemm')
# check if BiQGEMM directory exist. otherwise, throw an error
fs = import('fs')
if fs.is_dir('../BiQGEMM')
extra_defines += '-DENABLE_BIQGEMM=1'
biqgemm_inc = include_directories('../BiQGEMM')
else
error ('BiQGEMM cannot be enabled without BiQGEMM library.')
endif
endif
is complete nonsense. Other developers (e.g., nntrainer users in other departments) won't understand what's going on here.
You should try
get biqgemm info from pkgconfig/cmake.
(if 1 fails) try to load it from common path (/usr/include, /usr/include/biqgemm)
(if 2 fails) try to load it from user-defined path (meson_option)
(if 3 fails) try to load it from such hardcoded path. but specify such external hardcoded path at meson_options.txt, not in this script.
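A hedged sketch of that lookup order in Meson (the option name `biqgemm-path` and the dependency name `biqgemm` are assumptions for illustration, not the final design):

```meson
# meson_options.txt -- the hardcoded fallback lives here, not in the build script
# option('biqgemm-path', type: 'string', value: '/usr/include/biqgemm',
#        description: 'Fallback include path for BiQGEMM headers')

if get_option('enable-biqgemm')
  # 1) try pkg-config / cmake first
  biqgemm_dep = dependency('biqgemm', required: false)
  if biqgemm_dep.found()
    extra_defines += '-DENABLE_BIQGEMM=1'
  else
    # 2) / 3) fall back to a common or user-given path from meson_options
    fs = import('fs')
    biqgemm_path = get_option('biqgemm-path')
    if fs.is_dir(biqgemm_path)
      extra_defines += ['-DENABLE_BIQGEMM=1', '-I' + biqgemm_path]
    else
      error('BiQGEMM cannot be enabled: not found via pkg-config or @0@'.format(biqgemm_path))
    endif
  endif
endif
```

With this shape, other teams can override the path with `-Dbiqgemm-path=...` instead of relying on a sibling checkout at `../BiQGEMM`.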
@myungjoo, Thank you for the detailed comment and guidance :) I will make additional PR to make it feasible following your suggestion. Sincerely.
|
gharchive/pull-request
| 2024-12-10T07:56:20
|
2025-04-01T06:39:46.766863
|
{
"authors": [
"EunjuYang",
"myungjoo"
],
"repo": "nnstreamer/nntrainer",
"url": "https://github.com/nnstreamer/nntrainer/pull/2823",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1941742213
|
Dataset
Hello, I want to know how the KITTI dataset is split. In the paper there are 39180 samples for training, but Monodepth2 has 39810 for training. Will the difference affect the results?
Hi, I think this should be a typo in our paper. We used the same training dataset as Monodepth2.
|
gharchive/issue
| 2023-10-13T11:10:26
|
2025-04-01T06:39:46.792216
|
{
"authors": [
"Rookie764",
"noahzn"
],
"repo": "noahzn/Lite-Mono",
"url": "https://github.com/noahzn/Lite-Mono/issues/75",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1446933773
|
Socket.write not writing commands
OS: Ubuntu 20.04
Node version: v16.18.1 / v12.12.0 / v10.15.3
node-bluetooth-hci-socket: v0.5.3-8
The following code is used to create an LE connection with the Bluetooth device, but it is not working and the connection is not created. This is used in the @abandonware/noble package.
const { minInterval = 0x0006, maxInterval = 0x000c, latency = 0x0000, timeout = 0x00c8 } = parameters;
const cmd = Buffer.alloc(29);
// header
cmd.writeUInt8(HCI_COMMAND_PKT, 0);
cmd.writeUInt16LE(LE_CREATE_CONN_CMD, 1);
// length
cmd.writeUInt8(0x19, 3);
// data
cmd.writeUInt16LE(0x0060, 4); // interval
cmd.writeUInt16LE(0x0030, 6); // window
cmd.writeUInt8(0x00, 8); // initiator filter
cmd.writeUInt8(addressType === 'random' ? 0x01 : 0x00, 9); // peer address type
(Buffer.from(address.split(':').reverse().join(''), 'hex')).copy(cmd, 10); // peer address
cmd.writeUInt8(0x00, 16); // own address type
cmd.writeUInt16LE(minInterval, 17); // min interval
cmd.writeUInt16LE(maxInterval, 19); // max interval
cmd.writeUInt16LE(latency, 21); // latency
cmd.writeUInt16LE(timeout, 23); // supervision timeout
cmd.writeUInt16LE(0x0004, 25); // min ce length
cmd.writeUInt16LE(0x0006, 27); // max ce length
debug(`create le conn - writing: ${cmd.toString('hex')}`);
this._socket.write(cmd);
Is this issue also present on latest release of:
https://www.npmjs.com/package/@abandonware/bluetooth-hci-socket/v/0.5.3-12
|
gharchive/issue
| 2022-11-13T13:48:38
|
2025-04-01T06:39:46.794658
|
{
"authors": [
"k-bharath-7",
"rzr"
],
"repo": "noble/node-bluetooth-hci-socket",
"url": "https://github.com/noble/node-bluetooth-hci-socket/issues/159",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2308899991
|
matmul-elementwise fusion failed to legalize operation 'vector.contract'
With the support of fusing a consumer (tensor.unpack) into the loop, I've modified the matmul-elementwise fusion with the pack-peel pipeline. Now the i32 test compiles and has correct outputs on hardware. However, the bf16 test failed at the AIE stage with the error:
error: failed to legalize operation 'vector.contract' that was explicitly marked illegal
I've attached the dump IR here
afterall_aie.txt
Already talked to @erwei-xilinx about the issue, but would like to have more eyes on it, since this seems to be related to the vectorization problem. @newling @jsetoain
I've taking a look at the first few and they all seem valid, I'll take a closer look tomorrow.
The vector.contract is all bf16. What is the input IR, does the elementwise operation happen in f32 or bf16?
Currently we don't support matmul with result type bf16. I've been meaning to improve the error diagnostic error in the case where the matmul is bf16->bf16 as this issue has been encountered at least twice before.
The vector.contract is all bf16. What is the input IR, does the elementwise operation happen in f32 or bf16?
Currently we don't support matmul with result type bf16. I've been meaning to improve the error diagnostic error in the case where the matmul is bf16->bf16 as this issue has been encountered at least twice before.
Okay, I see. The output type of this test is bf16. Let me try another test with elementwise/output operation in f32.
Then here comes another question... Do you know what is the data type of elementwise op in tres leches bf16 model?
The vector.contract is all bf16. What is the input IR, does the elementwise operation happen in f32 or bf16?
Currently we don't support matmul with result type bf16. I've been meaning to improve the error diagnostic error in the case where the matmul is bf16->bf16 as this issue has been encountered at least twice before.
Okay, I see. The output type of this test is bf16. Let me try another test with elementwise/output operation in f32.
Then here comes another question... Do you know what is the data type of elementwise op in tres leches bf16 model?
Yes, I just confirmed that if I change the matmul output and elementwise type to f32 it works without problem! @newling Thanks for pointing out the issue.
|
gharchive/issue
| 2024-05-21T18:46:00
|
2025-04-01T06:39:46.844159
|
{
"authors": [
"jsetoain",
"newling",
"yzhang93"
],
"repo": "nod-ai/iree-amd-aie",
"url": "https://github.com/nod-ai/iree-amd-aie/issues/363",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
159014832
|
node-red crashed with error "Error: write EIO at errnoException (net.js:904:11) at Object.afterWrite (net.js:720:19)"
Hi,
Since I added a flow with a "Watson IoT" outbound node I am having frequent node-red crashes.
FYI I am running node-red version 0.12.5 (Node.js version 0.10.29) on raspberry pi (Linux raspberrypi 4.1.13+ #826 PREEMPT Fri Nov 13 20:13:22 GMT 2015 armv6l GNU/Linux)
Here below you find the information that is logged in the node-red log file for 3 such crashes:
instance 1:
....
30 May 23:42:38 - [warn] [google-credentials:942c464d.6bd3b8] trying to refresh token due to expiry
Connection was closed.
31 May 00:01:30 - [red] Uncaught Exception:
31 May 00:01:30 - Error: write EIO
at errnoException (net.js:904:11)
at Object.afterWrite (net.js:720:19)
instance 2:
....
5 Jun 23:48:03 - [warn] [google-credentials:942c464d.6bd3b8] trying to refresh token due to expiry
5 Jun 23:56:12 - [warn] [function:arp-scan output to json] hosts responded (=11) != packets received (=12)
Connection was closed.
6 Jun 00:11:10 - [red] Uncaught Exception:
6 Jun 00:11:10 - Error: write EIO
at errnoException (net.js:904:11)
at Object.afterWrite (net.js:720:19)
instance 3:
....
7 Jun 21:35:59 - [warn] [google-credentials:942c464d.6bd3b8] trying to refresh t oken due to expiry
7 Jun 21:39:27 - [warn] [function:arp-scan output to json] hosts responded (=13) != packets received (=41)
Connection was closed.
7 Jun 21:59:03 - [red] Uncaught Exception:
7 Jun 21:59:03 - Error: write EIO
at errnoException (net.js:904:11)
at Object.afterWrite (net.js:720:19)
Hi, to help identify what node is causing this, can you list what nodes you have that might be accessing the network? The challenge is this is an error thrown asynchronously to the main runtime - the nodes ought to have appropriate error handling in place for this sort of async event.
Hi Nick,
I am using following nodes:
node-red-contrib-ibm-watson-iot
node-red-contrib-ui
node-red-node-emoncms
node-red-node-google
node-red-node-openweathermap
mqtt
I think that when I added the watson-iot node I started having these problems (4 instances in the last 2 weeks).
many thanks for the support.
Jan.
Here below you find the npm modules installed.
`pi@raspberrypi:/usr/lib/node_modules $ npm -g ls
/usr/lib
├─┬ node-red@0.12.5
│ ├── basic-auth@1.0.3
│ ├─┬ bcrypt@0.8.5
│ │ ├── bindings@1.2.1
│ │ └── nan@2.0.5
│ ├── bcryptjs@2.3.0
│ ├─┬ body-parser@1.14.2
│ │ ├── bytes@2.2.0
│ │ ├── content-type@1.0.1
│ │ ├─┬ debug@2.2.0
│ │ │ └── ms@0.7.1
│ │ ├── depd@1.1.0
│ │ ├─┬ http-errors@1.3.1
│ │ │ ├── inherits@2.0.1
│ │ │ └── statuses@1.2.1
│ │ ├── iconv-lite@0.4.13
│ │ ├─┬ on-finished@2.3.0
│ │ │ └── ee-first@1.1.1
│ │ ├── qs@5.2.0
│ │ └─┬ type-is@1.6.10
│ │ └─┬ mime-types@2.1.9
│ │ └── mime-db@1.21.0
│ ├─┬ cheerio@0.19.0
│ │ ├─┬ css-select@1.0.0
│ │ │ ├── boolbase@1.0.0
│ │ │ ├── css-what@1.0.0
│ │ │ ├─┬ domutils@1.4.3
│ │ │ │ └── domelementtype@1.3.0
│ │ │ └── nth-check@1.0.1
│ │ ├─┬ dom-serializer@0.1.0
│ │ │ └── domelementtype@1.1.3
│ │ ├── entities@1.1.1
│ │ ├─┬ htmlparser2@3.8.3
│ │ │ ├── domelementtype@1.3.0
│ │ │ ├── domhandler@2.3.0
│ │ │ ├── domutils@1.5.1
│ │ │ ├── entities@1.0.0
│ │ │ └─┬ readable-stream@1.1.13
│ │ │ ├── core-util-is@1.0.2
│ │ │ ├── inherits@2.0.1
│ │ │ ├── isarray@0.0.1
│ │ │ └── string_decoder@0.10.31
│ │ └── lodash@3.10.1
│ ├── clone@1.0.2
│ ├─┬ cors@2.7.1
│ │ └── vary@1.1.0
│ ├─┬ cron@1.1.0
│ │ └─┬ moment-timezone@0.3.1
│ │ └── moment@2.11.0
│ ├─┬ express@4.13.3
│ │ ├─┬ accepts@1.2.13
│ │ │ ├─┬ mime-types@2.1.9
│ │ │ │ └── mime-db@1.21.0
│ │ │ └── negotiator@0.5.3
│ │ ├── array-flatten@1.1.1
│ │ ├── content-disposition@0.5.0
│ │ ├── content-type@1.0.1
│ │ ├── cookie@0.1.3
│ │ ├── cookie-signature@1.0.6
│ │ ├─┬ debug@2.2.0
│ │ │ └── ms@0.7.1
│ │ ├── depd@1.0.1
│ │ ├── escape-html@1.0.2
│ │ ├── etag@1.7.0
│ │ ├─┬ finalhandler@0.4.0
│ │ │ └── unpipe@1.0.0
│ │ ├── fresh@0.3.0
│ │ ├── merge-descriptors@1.0.0
│ │ ├── methods@1.1.1
│ │ ├─┬ on-finished@2.3.0
│ │ │ └── ee-first@1.1.1
│ │ ├── parseurl@1.3.0
│ │ ├── path-to-regexp@0.1.7
│ │ ├─┬ proxy-addr@1.0.10
│ │ │ ├── forwarded@0.1.0
│ │ │ └── ipaddr.js@1.0.5
│ │ ├── qs@4.0.0
│ │ ├── range-parser@1.0.3
│ │ ├─┬ send@0.13.0
│ │ │ ├── destroy@1.0.3
│ │ │ ├─┬ http-errors@1.3.1
│ │ │ │ └── inherits@2.0.1
│ │ │ ├── mime@1.3.4
│ │ │ ├── ms@0.7.1
│ │ │ └── statuses@1.2.1
│ │ ├── serve-static@1.10.0
│ │ ├─┬ type-is@1.6.10
│ │ │ └─┬ mime-types@2.1.9
│ │ │ └── mime-db@1.21.0
│ │ ├── utils-merge@1.0.0
│ │ └── vary@1.0.1
│ ├─┬ follow-redirects@0.0.7
│ │ ├─┬ debug@2.2.0
│ │ │ └── ms@0.7.1
│ │ └── stream-consume@0.1.0
│ ├─┬ fs-extra@0.26.4
│ │ ├── graceful-fs@4.1.2
│ │ ├── jsonfile@2.2.3
│ │ ├── klaw@1.1.3
│ │ ├── path-is-absolute@1.0.0
│ │ └─┬ rimraf@2.5.0
│ │ └─┬ glob@6.0.3
│ │ ├─┬ inflight@1.0.4
│ │ │ └── wrappy@1.0.1
│ │ ├── inherits@2.0.1
│ │ ├─┬ minimatch@3.0.0
│ │ │ └─┬ brace-expansion@1.1.2
│ │ │ ├── balanced-match@0.3.0
│ │ │ └── concat-map@0.0.1
│ │ └─┬ once@1.3.3
│ │ └── wrappy@1.0.1
│ ├─┬ fs.notify@0.0.4
│ │ ├── async@0.1.22
│ │ └── retry@0.6.1
│ ├─┬ i18next@1.10.6
│ │ ├─┬ cookies@0.5.1
│ │ │ └── keygrip@1.0.1
│ │ ├── i18next-client@1.10.3
│ │ └── json5@0.2.0
│ ├── is-utf8@0.2.1
│ ├── media-typer@0.3.0
│ ├─┬ mqtt@1.6.3
│ │ ├─┬ commist@1.0.0
│ │ │ └── leven@1.0.2
│ │ ├─┬ concat-stream@1.5.1
│ │ │ ├─┬ readable-stream@2.0.5
│ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ ├── process-nextick-args@1.0.6
│ │ │ │ ├── string_decoder@0.10.31
│ │ │ │ └── util-deprecate@1.0.2
│ │ │ └── typedarray@0.0.6
│ │ ├─┬ end-of-stream@1.1.0
│ │ │ └─┬ once@1.3.3
│ │ │ └── wrappy@1.0.1
│ │ ├─┬ help-me@0.1.0
│ │ │ └─┬ pump@1.0.1
│ │ │ └─┬ once@1.3.3
│ │ │ └── wrappy@1.0.1
│ │ ├── inherits@2.0.1
│ │ ├── minimist@1.2.0
│ │ ├─┬ mqtt-connection@2.1.1
│ │ │ ├── reduplexer@1.1.0
│ │ │ └── through2@0.6.5
│ │ ├─┬ mqtt-packet@3.4.4
│ │ │ └── bl@0.9.4
│ │ ├─┬ readable-stream@1.0.33
│ │ │ ├── core-util-is@1.0.2
│ │ │ ├── isarray@0.0.1
│ │ │ └── string_decoder@0.10.31
│ │ ├── reinterval@1.0.2
│ │ ├─┬ websocket-stream@2.3.0
│ │ │ ├─┬ duplexify@3.4.2
│ │ │ │ ├─┬ end-of-stream@1.0.0
│ │ │ │ │ └─┬ once@1.3.3
│ │ │ │ │ └── wrappy@1.0.1
│ │ │ │ └─┬ readable-stream@2.0.5
│ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ ├── process-nextick-args@1.0.6
│ │ │ │ ├── string_decoder@0.10.31
│ │ │ │ └── util-deprecate@1.0.2
│ │ │ └─┬ through2@2.0.0
│ │ │ └─┬ readable-stream@2.0.5
│ │ │ ├── core-util-is@1.0.2
│ │ │ ├── isarray@0.0.1
│ │ │ ├── process-nextick-args@1.0.6
│ │ │ ├── string_decoder@0.10.31
│ │ │ └── util-deprecate@1.0.2
│ │ └── xtend@4.0.1
│ ├── mustache@2.2.1
│ ├─┬ node-red-node-email@0.1.0
│ │ ├─┬ imap@0.8.14
│ │ │ ├─┬ readable-stream@1.1.13
│ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ ├── inherits@2.0.1
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ └── string_decoder@0.10.31
│ │ │ └── utf7@1.0.0
│ │ └─┬ nodemailer@1.3.4
│ │ ├─┬ buildmail@1.3.0
│ │ │ ├── addressparser@0.3.2
│ │ │ ├── libbase64@0.1.0
│ │ │ └── libqp@1.1.0
│ │ ├─┬ hyperquest@1.2.0
│ │ │ ├─┬ duplexer2@0.0.2
│ │ │ │ └─┬ readable-stream@1.1.13
│ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ ├── inherits@2.0.1
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ └── string_decoder@0.10.31
│ │ │ └─┬ through2@0.6.5
│ │ │ ├─┬ readable-stream@1.0.33
│ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ ├── inherits@2.0.1
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ └── string_decoder@0.10.31
│ │ │ └── xtend@4.0.1
│ │ ├─┬ libmime@1.2.0
│ │ │ ├── iconv-lite@0.4.13
│ │ │ ├── libbase64@0.1.0
│ │ │ └── libqp@1.1.0
│ │ ├─┬ nodemailer-direct-transport@1.1.0
│ │ │ └── smtp-connection@1.3.8
│ │ └─┬ nodemailer-smtp-transport@1.1.0
│ │ ├── nodemailer-wellknown@0.1.7
│ │ └── smtp-connection@1.3.8
│ ├─┬ node-red-node-feedparser@0.1.3
│ │ ├─┬ feedparser@1.1.3
│ │ │ ├── addressparser@0.1.3
│ │ │ ├── array-indexofobject@0.0.1
│ │ │ ├─┬ readable-stream@1.0.33
│ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ ├── inherits@2.0.1
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ └── string_decoder@0.10.31
│ │ │ └── sax@0.6.1
│ │ └─┬ request@2.65.0
│ │ ├── aws-sign2@0.6.0
│ │ ├─┬ bl@1.0.0
│ │ │ └─┬ readable-stream@2.0.5
│ │ │ ├── core-util-is@1.0.2
│ │ │ ├── inherits@2.0.1
│ │ │ ├── isarray@0.0.1
│ │ │ ├── process-nextick-args@1.0.6
│ │ │ ├── string_decoder@0.10.31
│ │ │ └── util-deprecate@1.0.2
│ │ ├── caseless@0.11.0
│ │ ├─┬ combined-stream@1.0.5
│ │ │ └── delayed-stream@1.0.0
│ │ ├── extend@3.0.0
│ │ ├── forever-agent@0.6.1
│ │ ├─┬ form-data@1.0.0-rc3
│ │ │ └── async@1.5.1
│ │ ├─┬ har-validator@2.0.3
│ │ │ ├─┬ chalk@1.1.1
│ │ │ │ ├── ansi-styles@2.1.0
│ │ │ │ ├── escape-string-regexp@1.0.4
│ │ │ │ ├─┬ has-ansi@2.0.0
│ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ ├─┬ strip-ansi@3.0.0
│ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ └── supports-color@2.0.0
│ │ │ ├─┬ commander@2.9.0
│ │ │ │ └── graceful-readlink@1.0.1
│ │ │ ├─┬ is-my-json-valid@2.12.3
│ │ │ │ ├── generate-function@2.0.0
│ │ │ │ ├─┬ generate-object-property@1.2.0
│ │ │ │ │ └── is-property@1.0.2
│ │ │ │ ├── jsonpointer@2.0.0
│ │ │ │ └── xtend@4.0.1
│ │ │ └─┬ pinkie-promise@2.0.0
│ │ │ └── pinkie@2.0.1
│ │ ├─┬ hawk@3.1.2
│ │ │ ├── boom@2.10.1
│ │ │ ├── cryptiles@2.0.5
│ │ │ ├── hoek@2.16.3
│ │ │ └── sntp@1.0.9
│ │ ├─┬ http-signature@0.11.0
│ │ │ ├── asn1@0.1.11
│ │ │ ├── assert-plus@0.1.5
│ │ │ └── ctype@0.5.3
│ │ ├── isstream@0.1.2
│ │ ├── json-stringify-safe@5.0.1
│ │ ├─┬ mime-types@2.1.9
│ │ │ └── mime-db@1.21.0
│ │ ├── node-uuid@1.4.7
│ │ ├── oauth-sign@0.8.0
│ │ ├── qs@5.2.0
│ │ ├── stringstream@0.0.5
│ │ ├── tough-cookie@2.2.1
│ │ └── tunnel-agent@0.4.2
│ ├── node-red-node-rbe@0.1.1
│ ├─┬ node-red-node-serialport@0.1.0
│ │ └─┬ serialport@2.0.6
│ │ ├── async@0.9.0
│ │ ├── bindings@1.2.1
│ │ ├─┬ debug@2.2.0
│ │ │ └── ms@0.7.1
│ │ ├── nan@2.0.9
│ │ ├─┬ node-pre-gyp@0.6.18
│ │ │ ├─┬ mkdirp@0.5.1
│ │ │ │ └── minimist@0.0.8
│ │ │ ├─┬ nopt@3.0.6
│ │ │ │ └── abbrev@1.0.7
│ │ │ ├─┬ npmlog@2.0.0
│ │ │ │ ├── ansi@0.3.0
│ │ │ │ ├─┬ are-we-there-yet@1.0.5
│ │ │ │ │ ├── delegates@0.1.0
│ │ │ │ │ └─┬ readable-stream@2.0.5
│ │ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ │ ├── inherits@2.0.1
│ │ │ │ │ ├── isarray@0.0.1
│ │ │ │ │ ├── process-nextick-args@1.0.6
│ │ │ │ │ ├── string_decoder@0.10.31
│ │ │ │ │ └── util-deprecate@1.0.2
│ │ │ │ └─┬ gauge@1.2.2
│ │ │ │ ├── has-unicode@1.0.1
│ │ │ │ ├─┬ lodash.pad@3.1.1
│ │ │ │ │ ├── lodash._basetostring@3.0.1
│ │ │ │ │ └─┬ lodash._createpadding@3.6.1
│ │ │ │ │ └── lodash.repeat@3.0.1
│ │ │ │ ├─┬ lodash.padleft@3.1.1
│ │ │ │ │ ├── lodash._basetostring@3.0.1
│ │ │ │ │ └─┬ lodash._createpadding@3.6.1
│ │ │ │ │ └── lodash.repeat@3.0.1
│ │ │ │ └─┬ lodash.padright@3.1.1
│ │ │ │ ├── lodash._basetostring@3.0.1
│ │ │ │ └─┬ lodash._createpadding@3.6.1
│ │ │ │ └── lodash.repeat@3.0.1
│ │ │ ├─┬ rc@1.1.5
│ │ │ │ ├── deep-extend@0.4.0
│ │ │ │ ├── ini@1.3.4
│ │ │ │ ├── minimist@1.2.0
│ │ │ │ └── strip-json-comments@1.0.4
│ │ │ ├─┬ request@2.67.0
│ │ │ │ ├── aws-sign2@0.6.0
│ │ │ │ ├─┬ bl@1.0.0
│ │ │ │ │ └─┬ readable-stream@2.0.5
│ │ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ │ ├── inherits@2.0.1
│ │ │ │ │ ├── isarray@0.0.1
│ │ │ │ │ ├── process-nextick-args@1.0.6
│ │ │ │ │ ├── string_decoder@0.10.31
│ │ │ │ │ └── util-deprecate@1.0.2
│ │ │ │ ├── caseless@0.11.0
│ │ │ │ ├─┬ combined-stream@1.0.5
│ │ │ │ │ └── delayed-stream@1.0.0
│ │ │ │ ├── extend@3.0.0
│ │ │ │ ├── forever-agent@0.6.1
│ │ │ │ ├─┬ form-data@1.0.0-rc3
│ │ │ │ │ └── async@1.5.0
│ │ │ │ ├─┬ har-validator@2.0.3
│ │ │ │ │ ├─┬ chalk@1.1.1
│ │ │ │ │ │ ├── ansi-styles@2.1.0
│ │ │ │ │ │ ├── escape-string-regexp@1.0.3
│ │ │ │ │ │ ├─┬ has-ansi@2.0.0
│ │ │ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ │ │ ├─┬ strip-ansi@3.0.0
│ │ │ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ │ │ └── supports-color@2.0.0
│ │ │ │ │ ├─┬ commander@2.9.0
│ │ │ │ │ │ └── graceful-readlink@1.0.1
│ │ │ │ │ ├─┬ is-my-json-valid@2.12.3
│ │ │ │ │ │ ├── generate-function@2.0.0
│ │ │ │ │ │ ├─┬ generate-object-property@1.2.0
│ │ │ │ │ │ │ └── is-property@1.0.2
│ │ │ │ │ │ ├── jsonpointer@2.0.0
│ │ │ │ │ │ └── xtend@4.0.1
│ │ │ │ │ └─┬ pinkie-promise@2.0.0
│ │ │ │ │ └── pinkie@2.0.1
│ │ │ │ ├─┬ hawk@3.1.2
│ │ │ │ │ ├── boom@2.10.1
│ │ │ │ │ ├── cryptiles@2.0.5
│ │ │ │ │ ├── hoek@2.16.3
│ │ │ │ │ └── sntp@1.0.9
│ │ │ │ ├─┬ http-signature@1.1.0
│ │ │ │ │ ├── assert-plus@0.1.5
│ │ │ │ │ ├─┬ jsprim@1.2.2
│ │ │ │ │ │ ├── extsprintf@1.0.2
│ │ │ │ │ │ ├── json-schema@0.2.2
│ │ │ │ │ │ └── verror@1.3.6
│ │ │ │ │ └─┬ sshpk@1.7.1
│ │ │ │ │ ├── asn1@0.2.3
│ │ │ │ │ ├── assert-plus@0.2.0
│ │ │ │ │ ├─┬ dashdash@1.10.1
│ │ │ │ │ │ └── assert-plus@0.1.5
│ │ │ │ │ ├── ecc-jsbn@0.1.1
│ │ │ │ │ ├── jodid25519@1.0.2
│ │ │ │ │ ├── jsbn@0.1.0
│ │ │ │ │ └── tweetnacl@0.13.2
│ │ │ │ ├── is-typedarray@1.0.0
│ │ │ │ ├── isstream@0.1.2
│ │ │ │ ├── json-stringify-safe@5.0.1
│ │ │ │ ├─┬ mime-types@2.1.8
│ │ │ │ │ └── mime-db@1.20.0
│ │ │ │ ├── node-uuid@1.4.7
│ │ │ │ ├── oauth-sign@0.8.0
│ │ │ │ ├── qs@5.2.0
│ │ │ │ ├── stringstream@0.0.5
│ │ │ │ ├── tough-cookie@2.2.1
│ │ │ │ └── tunnel-agent@0.4.2
│ │ │ ├─┬ rimraf@2.4.4
│ │ │ │ └─┬ glob@5.0.15
│ │ │ │ ├─┬ inflight@1.0.4
│ │ │ │ │ └── wrappy@1.0.1
│ │ │ │ ├── inherits@2.0.1
│ │ │ │ ├─┬ minimatch@3.0.0
│ │ │ │ │ └─┬ brace-expansion@1.1.2
│ │ │ │ │ ├── balanced-match@0.3.0
│ │ │ │ │ └── concat-map@0.0.1
│ │ │ │ ├─┬ once@1.3.3
│ │ │ │ │ └── wrappy@1.0.1
│ │ │ │ └── path-is-absolute@1.0.0
│ │ │ ├── semver@5.1.0
│ │ │ ├─┬ tar@2.2.1
│ │ │ │ ├── block-stream@0.0.8
│ │ │ │ ├─┬ fstream@1.0.8
│ │ │ │ │ └── graceful-fs@4.1.2
│ │ │ │ └── inherits@2.0.1
│ │ │ └─┬ tar-pack@3.1.2
│ │ │ ├── debug@0.7.4
│ │ │ ├─┬ fstream@1.0.8
│ │ │ │ ├── graceful-fs@4.1.2
│ │ │ │ └── inherits@2.0.1
│ │ │ ├─┬ fstream-ignore@1.0.3
│ │ │ │ ├── inherits@2.0.1
│ │ │ │ └─┬ minimatch@3.0.0
│ │ │ │ └─┬ brace-expansion@1.1.2
│ │ │ │ ├── balanced-match@0.3.0
│ │ │ │ └── concat-map@0.0.1
│ │ │ ├── once@1.1.1
│ │ │ ├─┬ readable-stream@2.0.5
│ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ ├── inherits@2.0.1
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ ├── process-nextick-args@1.0.6
│ │ │ │ ├── string_decoder@0.10.31
│ │ │ │ └── util-deprecate@1.0.2
│ │ │ └── uid-number@0.0.3
│ │ ├── UNMET DEPENDENCY node-pre-gyp-github@^1.1.0
│ │ ├─┬ optimist@0.6.1
│ │ │ ├── minimist@0.0.10
│ │ │ └── wordwrap@0.0.3
│ │ └── sf@0.1.7
│ ├─┬ node-red-node-twitter@0.1.4
│ │ ├── oauth@0.9.14
│ │ ├─┬ request@2.67.0
│ │ │ ├── aws-sign2@0.6.0
│ │ │ ├─┬ bl@1.0.0
│ │ │ │ └─┬ readable-stream@2.0.5
│ │ │ │ ├── core-util-is@1.0.2
│ │ │ │ ├── inherits@2.0.1
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ ├── process-nextick-args@1.0.6
│ │ │ │ ├── string_decoder@0.10.31
│ │ │ │ └── util-deprecate@1.0.2
│ │ │ ├── caseless@0.11.0
│ │ │ ├─┬ combined-stream@1.0.5
│ │ │ │ └── delayed-stream@1.0.0
│ │ │ ├── extend@3.0.0
│ │ │ ├── forever-agent@0.6.1
│ │ │ ├─┬ form-data@1.0.0-rc3
│ │ │ │ └── async@1.5.1
│ │ │ ├─┬ har-validator@2.0.3
│ │ │ │ ├─┬ chalk@1.1.1
│ │ │ │ │ ├── ansi-styles@2.1.0
│ │ │ │ │ ├── escape-string-regexp@1.0.4
│ │ │ │ │ ├─┬ has-ansi@2.0.0
│ │ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ │ ├─┬ strip-ansi@3.0.0
│ │ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ │ └── supports-color@2.0.0
│ │ │ │ ├─┬ commander@2.9.0
│ │ │ │ │ └── graceful-readlink@1.0.1
│ │ │ │ ├─┬ is-my-json-valid@2.12.3
│ │ │ │ │ ├── generate-function@2.0.0
│ │ │ │ │ ├─┬ generate-object-property@1.2.0
│ │ │ │ │ │ └── is-property@1.0.2
│ │ │ │ │ ├── jsonpointer@2.0.0
│ │ │ │ │ └── xtend@4.0.1
│ │ │ │ └─┬ pinkie-promise@2.0.0
│ │ │ │ └── pinkie@2.0.1
│ │ │ ├─┬ hawk@3.1.2
│ │ │ │ ├── boom@2.10.1
│ │ │ │ ├── cryptiles@2.0.5
│ │ │ │ ├── hoek@2.16.3
│ │ │ │ └── sntp@1.0.9
│ │ │ ├─┬ http-signature@1.1.0
│ │ │ │ ├── assert-plus@0.1.5
│ │ │ │ ├─┬ jsprim@1.2.2
│ │ │ │ │ ├── extsprintf@1.0.2
│ │ │ │ │ ├── json-schema@0.2.2
│ │ │ │ │ └── verror@1.3.6
│ │ │ │ └─┬ sshpk@1.7.2
│ │ │ │ ├── asn1@0.2.3
│ │ │ │ ├── assert-plus@0.2.0
│ │ │ │ ├─┬ dashdash@1.11.0
│ │ │ │ │ └── assert-plus@0.1.5
│ │ │ │ ├── ecc-jsbn@0.1.1
│ │ │ │ ├── jodid25519@1.0.2
│ │ │ │ ├── jsbn@0.1.0
│ │ │ │ └── tweetnacl@0.13.2
│ │ │ ├── is-typedarray@1.0.0
│ │ │ ├── isstream@0.1.2
│ │ │ ├── json-stringify-safe@5.0.1
│ │ │ ├─┬ mime-types@2.1.9
│ │ │ │ └── mime-db@1.21.0
│ │ │ ├── node-uuid@1.4.7
│ │ │ ├── oauth-sign@0.8.0
│ │ │ ├── qs@5.2.0
│ │ │ ├── stringstream@0.0.5
│ │ │ ├── tough-cookie@2.2.1
│ │ │ └── tunnel-agent@0.4.2
│ │ └── twitter-ng@0.6.2
│ ├─┬ nopt@3.0.6
│ │ └── abbrev@1.0.7
│ ├─┬ oauth2orize@1.2.0
│ │ ├─┬ debug@2.2.0
│ │ │ └── ms@0.7.1
│ │ ├── uid2@0.0.3
│ │ └── utils-merge@1.0.0
│ ├── on-headers@1.0.1
│ ├─┬ passport@0.3.2
│ │ ├── passport-strategy@1.0.0
│ │ └── pause@0.0.1
│ ├─┬ passport-http-bearer@1.0.1
│ │ └── passport-strategy@1.0.0
│ ├─┬ passport-oauth2-client-password@0.1.2
│ │ └── passport-strategy@1.0.0
│ ├─┬ raw-body@2.1.5
│ │ ├── bytes@2.2.0
│ │ ├── iconv-lite@0.4.13
│ │ └── unpipe@1.0.0
│ ├── semver@5.1.0
│ ├─┬ sentiment@1.0.4
│ │ └─┬ lodash.assign@3.2.0
│ │ ├─┬ lodash._baseassign@3.2.0
│ │ │ └── lodash._basecopy@3.0.1
│ │ ├─┬ lodash._createassigner@3.1.1
│ │ │ ├── lodash._bindcallback@3.0.1
│ │ │ ├── lodash._isiterateecall@3.0.9
│ │ │ └── lodash.restparam@3.6.1
│ │ └─┬ lodash.keys@3.1.2
│ │ ├── lodash._getnative@3.9.1
│ │ ├── lodash.isarguments@3.0.4
│ │ └── lodash.isarray@3.0.4
│ ├─┬ uglify-js@2.6.1
│ │ ├── async@0.2.10
│ │ ├── source-map@0.5.3
│ │ ├── uglify-to-browserify@1.0.2
│ │ └─┬ yargs@3.10.0
│ │ ├── camelcase@1.2.1
│ │ ├─┬ cliui@2.1.0
│ │ │ ├─┬ center-align@0.1.2
│ │ │ │ ├─┬ align-text@0.1.3
│ │ │ │ │ ├─┬ kind-of@2.0.1
│ │ │ │ │ │ └── is-buffer@1.1.1
│ │ │ │ │ ├── longest@1.0.1
│ │ │ │ │ └── repeat-string@1.5.2
│ │ │ │ └── lazy-cache@0.2.7
│ │ │ ├─┬ right-align@0.1.3
│ │ │ │ └─┬ align-text@0.1.3
│ │ │ │ ├─┬ kind-of@2.0.1
│ │ │ │ │ └── is-buffer@1.1.1
│ │ │ │ ├── longest@1.0.1
│ │ │ │ └── repeat-string@1.5.2
│ │ │ └── wordwrap@0.0.2
│ │ ├─┬ decamelize@1.1.2
│ │ │ └── escape-string-regexp@1.0.4
│ │ └── window-size@0.1.0
│ ├── when@3.7.7
│ ├─┬ ws@0.8.1
│ │ ├─┬ bufferutil@1.2.1
│ │ │ ├── bindings@1.2.1
│ │ │ └── nan@2.1.0
│ │ ├── options@0.0.6
│ │ ├── ultron@1.0.2
│ │ └─┬ utf-8-validate@1.2.1
│ │ ├── bindings@1.2.1
│ │ └── nan@2.1.0
│ └─┬ xml2js@0.4.15
│ ├── sax@1.1.4
│ └─┬ xmlbuilder@4.2.0
│ └── lodash@3.10.1
├─┬ node-red-admin@0.1.2
│ ├─┬ bcrypt@0.8.5
│ │ ├── bindings@1.2.1
│ │ └── nan@2.0.5
│ ├── bcryptjs@2.3.0
│ ├─┬ cli-table@0.3.1
│ │ └── colors@1.0.3
│ ├── colors@1.1.2
│ ├── minimist@1.2.0
│ ├─┬ read@1.0.7
│ │ └── mute-stream@0.0.5
│ ├─┬ request@2.67.0
│ │ ├── aws-sign2@0.6.0
│ │ ├─┬ bl@1.0.0
│ │ │ └─┬ readable-stream@2.0.5
│ │ │ ├── core-util-is@1.0.2
│ │ │ ├── inherits@2.0.1
│ │ │ ├── isarray@0.0.1
│ │ │ ├── process-nextick-args@1.0.6
│ │ │ ├── string_decoder@0.10.31
│ │ │ └── util-deprecate@1.0.2
│ │ ├── caseless@0.11.0
│ │ ├─┬ combined-stream@1.0.5
│ │ │ └── delayed-stream@1.0.0
│ │ ├── extend@3.0.0
│ │ ├── forever-agent@0.6.1
│ │ ├─┬ form-data@1.0.0-rc3
│ │ │ └── async@1.5.1
│ │ ├─┬ har-validator@2.0.3
│ │ │ ├─┬ chalk@1.1.1
│ │ │ │ ├── ansi-styles@2.1.0
│ │ │ │ ├── escape-string-regexp@1.0.4
│ │ │ │ ├─┬ has-ansi@2.0.0
│ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ ├─┬ strip-ansi@3.0.0
│ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ └── supports-color@2.0.0
│ │ │ ├─┬ commander@2.9.0
│ │ │ │ └── graceful-readlink@1.0.1
│ │ │ ├─┬ is-my-json-valid@2.12.3
│ │ │ │ ├── generate-function@2.0.0
│ │ │ │ ├─┬ generate-object-property@1.2.0
│ │ │ │ │ └── is-property@1.0.2
│ │ │ │ ├── jsonpointer@2.0.0
│ │ │ │ └── xtend@4.0.1
│ │ │ └─┬ pinkie-promise@2.0.0
│ │ │ └── pinkie@2.0.1
│ │ ├─┬ hawk@3.1.2
│ │ │ ├── boom@2.10.1
│ │ │ ├── cryptiles@2.0.5
│ │ │ ├── hoek@2.16.3
│ │ │ └── sntp@1.0.9
│ │ ├─┬ http-signature@1.1.0
│ │ │ ├── assert-plus@0.1.5
│ │ │ ├─┬ jsprim@1.2.2
│ │ │ │ ├── extsprintf@1.0.2
│ │ │ │ ├── json-schema@0.2.2
│ │ │ │ └── verror@1.3.6
│ │ │ └─┬ sshpk@1.7.2
│ │ │ ├── asn1@0.2.3
│ │ │ ├── assert-plus@0.2.0
│ │ │ ├─┬ dashdash@1.11.0
│ │ │ │ └── assert-plus@0.1.5
│ │ │ ├── ecc-jsbn@0.1.1
│ │ │ ├── jodid25519@1.0.2
│ │ │ ├── jsbn@0.1.0
│ │ │ └── tweetnacl@0.13.2
│ │ ├── is-typedarray@1.0.0
│ │ ├── isstream@0.1.2
│ │ ├── json-stringify-safe@5.0.1
│ │ ├─┬ mime-types@2.1.9
│ │ │ └── mime-db@1.21.0
│ │ ├── node-uuid@1.4.7
│ │ ├── oauth-sign@0.8.0
│ │ ├── qs@5.2.0
│ │ ├── stringstream@0.0.5
│ │ ├── tough-cookie@2.2.1
│ │ └── tunnel-agent@0.4.2
│ └── when@3.7.5
├─┬ node-red-contrib-ibm-watson-iot@0.2.5
│ └─┬ ibmiotf@0.2.12
│ ├─┬ axios@0.5.4
│ │ └── es6-promise@2.3.0
│ ├── bluebird@2.10.2
│ ├── btoa@1.1.2
│ ├── events@1.1.0
│ ├── format@0.2.2
│ ├── loglevel@1.4.0
│ └─┬ mqtt@1.5.0
│ ├─┬ commist@1.0.0
│ │ └── leven@1.0.2
│ ├─┬ concat-stream@1.5.1
│ │ ├─┬ readable-stream@2.0.6
│ │ │ ├── core-util-is@1.0.2
│ │ │ ├── isarray@1.0.0
│ │ │ ├── process-nextick-args@1.0.7
│ │ │ ├── string_decoder@0.10.31
│ │ │ └── util-deprecate@1.0.2
│ │ └── typedarray@0.0.6
│ ├─┬ end-of-stream@1.1.0
│ │ └─┬ once@1.3.3
│ │ └── wrappy@1.0.2
│ ├─┬ help-me@0.1.0
│ │ └─┬ pump@1.0.1
│ │ └─┬ once@1.3.3
│ │ └── wrappy@1.0.2
│ ├── inherits@2.0.1
│ ├── minimist@1.2.0
│ ├─┬ mqtt-connection@2.1.1
│ │ ├── reduplexer@1.1.0
│ │ └── through2@0.6.5
│ ├─┬ mqtt-packet@3.4.6
│ │ └── bl@0.9.5
│ ├─┬ readable-stream@1.0.34
│ │ ├── core-util-is@1.0.2
│ │ ├── isarray@0.0.1
│ │ └── string_decoder@0.10.31
│ ├─┬ websocket-stream@2.3.0
│ │ ├─┬ duplexify@3.4.3
│ │ │ ├─┬ end-of-stream@1.0.0
│ │ │ │ └─┬ once@1.3.3
│ │ │ │ └── wrappy@1.0.2
│ │ │ └─┬ readable-stream@2.1.4
│ │ │ ├── buffer-shims@1.0.0
│ │ │ ├── core-util-is@1.0.2
│ │ │ ├── isarray@1.0.0
│ │ │ ├── process-nextick-args@1.0.7
│ │ │ ├── string_decoder@0.10.31
│ │ │ └── util-deprecate@1.0.2
│ │ ├─┬ through2@2.0.1
│ │ │ └─┬ readable-stream@2.0.6
│ │ │ ├── core-util-is@1.0.2
│ │ │ ├── isarray@1.0.0
│ │ │ ├── process-nextick-args@1.0.7
│ │ │ ├── string_decoder@0.10.31
│ │ │ └── util-deprecate@1.0.2
│ │ └─┬ ws@0.8.1
│ │ ├─┬ bufferutil@1.2.1
│ │ │ ├── bindings@1.2.1
│ │ │ └── nan@2.3.3
│ │ ├── options@0.0.6
│ │ ├── ultron@1.0.2
│ │ └─┬ utf-8-validate@1.2.1
│ │ ├── bindings@1.2.1
│ │ └── nan@2.3.3
│ └── xtend@4.0.1
├─┬ node-red-contrib-kodi@0.2.5
│ ├─┬ kodi-ws@2.4.0
│ │ ├─┬ babel-eslint@5.0.0-beta9
│ │ │ ├── acorn-to-esprima@2.0.8
│ │ │ ├─┬ babel-traverse@6.4.5
│ │ │ │ ├─┬ babel-code-frame@6.3.13
│ │ │ │ │ ├─┬ chalk@1.1.1
│ │ │ │ │ │ ├── ansi-styles@2.1.0
│ │ │ │ │ │ ├── escape-string-regexp@1.0.4
│ │ │ │ │ │ ├─┬ has-ansi@2.0.0
│ │ │ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ │ │ ├─┬ strip-ansi@3.0.0
│ │ │ │ │ │ │ └── ansi-regex@2.0.0
│ │ │ │ │ │ └── supports-color@2.0.0
│ │ │ │ │ ├── esutils@2.0.2
│ │ │ │ │ ├── js-tokens@1.0.2
│ │ │ │ │ └─┬ line-numbers@0.2.0
│ │ │ │ │ └── left-pad@0.0.3
│ │ │ │ ├── babel-messages@6.3.18
│ │ │ │ ├─┬ babel-runtime@5.8.35
│ │ │ │ │ └── core-js@1.2.6
│ │ │ │ ├─┬ debug@2.2.0
│ │ │ │ │ └── ms@0.7.1
│ │ │ │ ├── globals@8.18.0
│ │ │ │ ├─┬ invariant@2.2.0
│ │ │ │ │ └─┬ loose-envify@1.1.0
│ │ │ │ │ └── js-tokens@1.0.2
│ │ │ │ ├── lodash@3.10.1
│ │ │ │ └─┬ repeating@1.1.3
│ │ │ │ └─┬ is-finite@1.0.1
│ │ │ │ └── number-is-nan@1.0.0
│ │ │ ├─┬ babel-types@6.4.5
│ │ │ │ ├─┬ babel-runtime@5.8.35
│ │ │ │ │ └── core-js@1.2.6
│ │ │ │ ├── esutils@2.0.2
│ │ │ │ ├── lodash@3.10.1
│ │ │ │ └── to-fast-properties@1.0.1
│ │ │ ├─┬ babylon@6.4.5
│ │ │ │ └─┬ babel-runtime@5.8.35
│ │ │ │ └── core-js@1.2.6
│ │ │ ├─┬ lodash.assign@3.2.0
│ │ │ │ ├─┬ lodash._baseassign@3.2.0
│ │ │ │ │ └── lodash._basecopy@3.0.1
│ │ │ │ ├─┬ lodash._createassigner@3.1.1
│ │ │ │ │ ├── lodash._bindcallback@3.0.1
│ │ │ │ │ ├── lodash._isiterateecall@3.0.9
│ │ │ │ │ └── lodash.restparam@3.6.1
│ │ │ │ └─┬ lodash.keys@3.1.2
│ │ │ │ ├── lodash._getnative@3.9.1
│ │ │ │ ├── lodash.isarguments@3.0.6
│ │ │ │ └── lodash.isarray@3.0.4
│ │ │ └─┬ lodash.pick@3.1.0
│ │ │ ├─┬ lodash._baseflatten@3.1.4
│ │ │ │ ├── lodash.isarguments@3.0.6
│ │ │ │ └── lodash.isarray@3.0.4
│ │ │ ├── lodash._bindcallback@3.0.1
│ │ │ ├── lodash._pickbyarray@3.0.2
│ │ │ ├─┬ lodash._pickbycallback@3.0.0
│ │ │ │ ├── lodash._basefor@3.0.3
│ │ │ │ └─┬ lodash.keysin@3.0.8
│ │ │ │ ├── lodash.isarguments@3.0.6
│ │ │ │ └── lodash.isarray@3.0.4
│ │ │ └── lodash.restparam@3.6.1
│ │ ├─┬ has-value@0.2.1
│ │ │ ├── get-value@2.0.3
│ │ │ └── has-values@0.1.3
│ │ ├─┬ jrpc-schema@2.0.0
│ │ │ └─┬ skeemas@1.2.2
│ │ │ ├── skeemas-json-pointer@1.0.0
│ │ │ └── skeemas-json-refs@1.0.1
│ │ ├─┬ set-value@0.2.0
│ │ │ ├── isobject@1.0.2
│ │ │ └── noncharacters@1.1.0
│ │ └─┬ ws@0.7.2
│ │ ├─┬ bufferutil@1.1.0
│ │ │ ├── bindings@1.2.1
│ │ │ └── nan@1.8.4
│ │ ├── options@0.0.6
│ │ └── ultron@1.0.2
│ └─┬ machina@1.1.2
│ └── lodash@3.10.1
├─┬ node-red-contrib-ui@1.2.19
│ ├─┬ serve-static@1.10.2
│ │ ├── escape-html@1.0.3
│ │ ├── parseurl@1.3.1
│ │ └─┬ send@0.13.1
│ │ ├── debug@2.2.0
│ │ ├── depd@1.1.0
│ │ ├── destroy@1.0.4
│ │ ├── etag@1.7.0
│ │ ├── fresh@0.3.0
│ │ ├─┬ http-errors@1.3.1
│ │ │ └── inherits@2.0.1
│ │ ├── mime@1.3.4
│ │ ├── ms@0.7.1
│ │ ├─┬ on-finished@2.3.0
│ │ │ └── ee-first@1.1.1
│ │ ├── range-parser@1.0.3
│ │ └── statuses@1.2.1
│ └─┬ socket.io@1.4.5
│ ├─┬ debug@2.2.0
│ │ └── ms@0.7.1
│ ├─┬ engine.io@1.6.8
│ │ ├─┬ accepts@1.1.4
│ │ │ ├─┬ mime-types@2.0.14
│ │ │ │ └── mime-db@1.12.0
│ │ │ └── negotiator@0.4.9
│ │ ├── base64id@0.1.0
│ │ ├─┬ engine.io-parser@1.2.4
│ │ │ ├── after@0.8.1
│ │ │ ├── arraybuffer.slice@0.0.6
│ │ │ ├── base64-arraybuffer@0.1.2
│ │ │ ├── blob@0.0.4
│ │ │ ├─┬ has-binary@0.1.6
│ │ │ │ └── isarray@0.0.1
│ │ │ └── utf8@2.1.0
│ │ └─┬ ws@1.0.1
│ │ ├── options@0.0.6
│ │ └── ultron@1.0.2
│ ├─┬ has-binary@0.1.7
│ │ └── isarray@0.0.1
│ ├─┬ socket.io-adapter@0.4.0
│ │ └─┬ socket.io-parser@2.2.2
│ │ ├── benchmark@1.0.0
│ │ ├── component-emitter@1.1.2
│ │ ├── debug@0.7.4
│ │ ├── isarray@0.0.1
│ │ └── json3@3.2.6
│ ├─┬ socket.io-client@1.4.5
│ │ ├── backo2@1.0.2
│ │ ├── component-bind@1.0.0
│ │ ├── component-emitter@1.2.0
│ │ ├─┬ engine.io-client@1.6.8
│ │ │ ├── component-emitter@1.1.2
│ │ │ ├── component-inherit@0.0.3
│ │ │ ├─┬ engine.io-parser@1.2.4
│ │ │ │ ├── after@0.8.1
│ │ │ │ ├── arraybuffer.slice@0.0.6
│ │ │ │ ├── base64-arraybuffer@0.1.2
│ │ │ │ ├── blob@0.0.4
│ │ │ │ ├─┬ has-binary@0.1.6
│ │ │ │ │ └── isarray@0.0.1
│ │ │ │ └── utf8@2.1.0
│ │ │ ├── has-cors@1.1.0
│ │ │ ├─┬ parsejson@0.0.1
│ │ │ │ └─┬ better-assert@1.0.2
│ │ │ │ └── callsite@1.0.0
│ │ │ ├─┬ parseqs@0.0.2
│ │ │ │ └─┬ better-assert@1.0.2
│ │ │ │ └── callsite@1.0.0
│ │ │ ├─┬ ws@1.0.1
│ │ │ │ ├── options@0.0.6
│ │ │ │ └── ultron@1.0.2
│ │ │ ├── xmlhttprequest-ssl@1.5.1
│ │ │ └── yeast@0.1.2
│ │ ├── indexof@0.0.1
│ │ ├── object-component@0.0.3
│ │ ├─┬ parseuri@0.0.4
│ │ │ └─┬ better-assert@1.0.2
│ │ │ └── callsite@1.0.0
│ │ └── to-array@0.1.4
│ └─┬ socket.io-parser@2.2.6
│ ├── benchmark@1.0.0
│ ├── component-emitter@1.1.2
│ ├── isarray@0.0.1
│ └── json3@3.3.2
├── node-red-node-emoncms@0.0.9
├─┬ node-red-node-google@0.1.0
│ ├── clone@0.1.11
│ ├─┬ minimatch@2.0.4
│ │ └─┬ brace-expansion@1.1.2
│ │ ├── balanced-match@0.3.0
│ │ └── concat-map@0.0.1
│ └─┬ request@2.40.0
│ ├── aws-sign2@0.5.0
│ ├── forever-agent@0.5.2
│ ├─┬ form-data@0.1.4
│ │ ├── async@0.9.2
│ │ ├─┬ combined-stream@0.0.7
│ │ │ └── delayed-stream@0.0.5
│ │ └── mime@1.2.11
│ ├─┬ hawk@1.1.1
│ │ ├── boom@0.4.2
│ │ ├── cryptiles@0.2.2
│ │ ├── hoek@0.9.1
│ │ └── sntp@0.2.4
│ ├─┬ http-signature@0.10.1
│ │ ├── asn1@0.1.11
│ │ ├── assert-plus@0.1.5
│ │ └── ctype@0.5.3
│ ├── json-stringify-safe@5.0.1
│ ├── mime-types@1.0.2
│ ├── node-uuid@1.4.7
│ ├── oauth-sign@0.3.0
│ ├── qs@1.0.2
│ ├── stringstream@0.0.5
│ ├── tough-cookie@2.2.1
│ └── tunnel-agent@0.4.2
├── node-red-node-ledborg@0.0.13
├── node-red-node-openweathermap@0.1.7
├── node-red-node-ping@0.0.7
├── node-red-node-random@0.0.5
├── node-red-node-rbe@0.1.1
├── node-red-node-smooth@0.0.5
└─┬ node-red-node-weather-underground@0.1.2
└─┬ wundergroundnode@0.9.0
├── limiter@1.0.5
├── moment@2.11.1
├─┬ request@2.67.0
│ ├── aws-sign2@0.6.0
│ ├─┬ bl@1.0.0
│ │ └─┬ readable-stream@2.0.5
│ │ ├── core-util-is@1.0.2
│ │ ├── inherits@2.0.1
│ │ ├── isarray@0.0.1
│ │ ├── process-nextick-args@1.0.6
│ │ ├── string_decoder@0.10.31
│ │ └── util-deprecate@1.0.2
│ ├── caseless@0.11.0
│ ├─┬ combined-stream@1.0.5
│ │ └── delayed-stream@1.0.0
│ ├── extend@3.0.0
│ ├── forever-agent@0.6.1
│ ├─┬ form-data@1.0.0-rc3
│ │ └── async@1.5.2
│ ├─┬ har-validator@2.0.3
│ │ ├─┬ chalk@1.1.1
│ │ │ ├── ansi-styles@2.1.0
│ │ │ ├── escape-string-regexp@1.0.4
│ │ │ ├─┬ has-ansi@2.0.0
│ │ │ │ └── ansi-regex@2.0.0
│ │ │ ├─┬ strip-ansi@3.0.0
│ │ │ │ └── ansi-regex@2.0.0
│ │ │ └── supports-color@2.0.0
│ │ ├─┬ commander@2.9.0
│ │ │ └── graceful-readlink@1.0.1
│ │ ├─┬ is-my-json-valid@2.12.3
│ │ │ ├── generate-function@2.0.0
│ │ │ ├─┬ generate-object-property@1.2.0
│ │ │ │ └── is-property@1.0.2
│ │ │ ├── jsonpointer@2.0.0
│ │ │ └── xtend@4.0.1
│ │ └─┬ pinkie-promise@2.0.0
│ │ └── pinkie@2.0.1
│ ├─┬ hawk@3.1.2
│ │ ├── boom@2.10.1
│ │ ├── cryptiles@2.0.5
│ │ ├── hoek@2.16.3
│ │ └── sntp@1.0.9
│ ├─┬ http-signature@1.1.0
│ │ ├── assert-plus@0.1.5
│ │ ├─┬ jsprim@1.2.2
│ │ │ ├── extsprintf@1.0.2
│ │ │ ├── json-schema@0.2.2
│ │ │ └── verror@1.3.6
│ │ └─┬ sshpk@1.7.2
│ │ ├── asn1@0.2.3
│ │ ├── assert-plus@0.2.0
│ │ ├─┬ dashdash@1.12.1
│ │ │ └── assert-plus@0.1.5
│ │ ├── ecc-jsbn@0.1.1
│ │ ├── jodid25519@1.0.2
│ │ ├── jsbn@0.1.0
│ │ └── tweetnacl@0.13.3
│ ├── is-typedarray@1.0.0
│ ├── isstream@0.1.2
│ ├── json-stringify-safe@5.0.1
│ ├─┬ mime-types@2.1.9
│ │ └── mime-db@1.21.0
│ ├── node-uuid@1.4.7
│ ├── oauth-sign@0.8.0
│ ├── qs@5.2.0
│ ├── stringstream@0.0.5
│ ├── tough-cookie@2.2.1
│ └── tunnel-agent@0.4.2
└── underscore@1.8.3
npm ERR! missing: node-pre-gyp-github@^1.1.0, required by serialport@2.0.6
pi@raspberrypi:/usr/lib/node_modules $
Which version of Node.js and npm do you have installed? What do
node -v
npm -v
report?
Hi,
As requested:
pi@raspberrypi:/usr/lib/node_modules $ npm -v
2.14.15
pi@raspberrypi:/usr/lib/node_modules $ node -v
v0.10.29
pi@raspberrypi:/usr/lib/node_modules $ npm -g -v
2.14.15
pi@raspberrypi:/usr/lib/node_modules $ node -g -v
v0.10.29
pi@raspberrypi:/usr/lib/node_modules $
Just had another instance of this problem. It seemed to be triggered by accessing my node-red UI at URL: http://192.168.1.15:1880/#
Just had another instance. It happened when I accessed my node-red UI (see below screenshot).
FYI this time the following was logged in the node-red log file:
10 Jun 07:43:49 - [warn] [google-credentials:942c464d.6bd3b8] trying to refresh token due to expiry
10 Jun 08:43:54 - [warn] [google-credentials:942c464d.6bd3b8] trying to refresh token due to expiry
10 Jun 09:21:24 - [warn] [function:arp-scan output to json] hosts responded (=13) != packets received (=14)
10 Jun 09:43:59 - [warn] [google-credentials:942c464d.6bd3b8] trying to refresh token due to expiry
Connection was closed.
10 Jun 10:37:12 - [red] Uncaught Exception:
10 Jun 10:37:12 - Error: write EIO
at errnoException (net.js:904:11)
at Object.afterWrite (net.js:720:19)
There have been a couple updates recently to the Watson IoT nodes - would be interested to know if this is still an issue
Given this is not coming from a core node (plus it's 4 years old) I'm going to close.
|
gharchive/issue
| 2016-06-07T20:18:51
|
2025-04-01T06:39:47.053763
|
{
"authors": [
"dceejay",
"janvda",
"knolleary"
],
"repo": "node-red/node-red",
"url": "https://github.com/node-red/node-red/issues/903",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
304432755
|
avoid unneeded call to Isolate::GetCurrent() in converters
I have seen an unneeded call to Isolate::GetCurrent() in converters while stepping through the code. Even though the overhead of this is small, it's unneeded.
Thank you. Seems nobody has noticed this in over a year.
|
gharchive/pull-request
| 2018-03-12T15:58:19
|
2025-04-01T06:39:47.135983
|
{
"authors": [
"Flarna",
"kkoopa"
],
"repo": "nodejs/nan",
"url": "https://github.com/nodejs/nan/pull/754",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
145934926
|
less generic token.
fixes #3
we are going to have multiple tokens configured in this app, so we have a shared namespace.
merged but github broke during
|
gharchive/pull-request
| 2016-04-05T09:28:55
|
2025-04-01T06:39:47.491619
|
{
"authors": [
"AdrianRossouw"
],
"repo": "nodezoo/nodezoo-github",
"url": "https://github.com/nodezoo/nodezoo-github/pull/47",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
197694652
|
Ejecting express-middleware to its own module
It doesn't really belong to the network layer. It's just a means to support the "batching protocol" and should be separated from the rest.
Not everybody uses express. I use koa and I made koa-graphql-batch to use this network layer, so I don't need express-middleware.
If one doesn't bother to do minification/tree-shaking properly, the middleware code will end up on the client side, where it's anything but needed.
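To illustrate the separation being proposed, a batch middleware can live in its own module with no dependency on the network layer at all. The shapes below are hypothetical (this is not the actual react-relay-network-layer or express API, just a sketch): the middleware only needs a body, a json() responder, and a next() callback.

```typescript
// Hypothetical request/response shapes -- only what the middleware needs,
// so this module depends neither on express nor on the network layer.
type GraphQLRequest = { id: string; query: string; variables?: object };
type Req = { body: unknown };
type Res = { json(value: unknown): void };
type Next = () => void;

// Executes each request in a batched body and replies with one array,
// matching each payload back to its request id.
export function graphqlBatchMiddleware(
  execute: (r: GraphQLRequest) => Promise<object>
) {
  return async (req: Req, res: Res, next: Next): Promise<void> => {
    if (!Array.isArray(req.body)) {
      next(); // not a batched request; let the normal handler take it
      return;
    }
    const results = await Promise.all(
      (req.body as GraphQLRequest[]).map(async (r) => ({
        id: r.id,
        payload: await execute(r),
      }))
    );
    res.json(results);
  };
}
```

A koa adapter would then only need to map koa's ctx onto the same minimal shapes, which is roughly the role koa-graphql-batch plays for this network layer.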
I could take care of this if @nodkz agrees.
@mattecapu 👍
I agree with you. Express-middleware should be extracted from RRNL. In the next month I'll start implementing subscriptions and bump new major version.
Before this moment let keep untouched this extra 1.6 kb of express-middleware for semver compatibility.
BTW. Please add a link to your koa-graphql-batch to readme. Thanks!
|
gharchive/issue
| 2016-12-27T12:14:01
|
2025-04-01T06:39:47.495183
|
{
"authors": [
"mattecapu",
"nodkz"
],
"repo": "nodkz/react-relay-network-layer",
"url": "https://github.com/nodkz/react-relay-network-layer/issues/30",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1335661244
|
Remove parallel streaming
Parallel streams are run on a separate thread, causing AssertionError: must be run on client thread when calling ItemManager.getItemPrice
I confirmed this fixes the crash. Can we get this merged in?
|
gharchive/pull-request
| 2022-08-11T08:52:49
|
2025-04-01T06:39:47.496181
|
{
"authors": [
"Barragek0",
"mrhappyasthma"
],
"repo": "nofatigue/runelite-profit-tracker",
"url": "https://github.com/nofatigue/runelite-profit-tracker/pull/32",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
2454632995
|
docker log format
The Docker log format does not include the source IP address. Correct it so that it does (including using the proxy's X-Forwarded-For header).
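As a sketch of the fix (a hypothetical helper, not this project's actual logging code): when the service sits behind a trusted reverse proxy, the client address is the first entry of X-Forwarded-For; otherwise fall back to the socket's remote address.

```typescript
// Resolve the client IP for the access log. The first address in
// X-Forwarded-For is the original client when each proxy hop appends
// itself; only trust this header when the request came via a known proxy.
export function clientIp(
  headers: Record<string, string | undefined>,
  remoteAddr: string
): string {
  const forwarded = headers["x-forwarded-for"];
  if (forwarded) {
    const first = forwarded.split(",")[0].trim();
    if (first.length > 0) return first;
  }
  return remoteAddr; // direct connection: use the socket address
}
```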
added 2m of time spent
Closing, as release v1.3 (#371) made changes that render this issue stale.
|
gharchive/issue
| 2024-05-23T16:02:37
|
2025-04-01T06:39:47.499696
|
{
"authors": [
"jon-nfc"
],
"repo": "nofusscomputing/centurion_erp",
"url": "https://github.com/nofusscomputing/centurion_erp/issues/26",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1361128609
|
segfaults when calling setup
Hi. Would you mind sharing what versions of neovim, nvim-oxi and rustc you use to test nvim-completion?
I've tried with rustc 1.65.0-nightly/1.63.0, neovim (NVIM v0.8.0-dev-1021-g24fbda04b
Build type: Release), nvim-oxi latest master. Could you also share your config options for nvim-completion? Thanks.
rustc: rustc 1.65.0-nightly (e1b28cd2f 2022-08-19);
nvim: NVIM v0.7.2;
nvim-oxi: latest master, updated in (d6f48b06a448b73135dafbfbc771ce9dd61efead);
Config:
-- config.lua
local completion = require("nvim_completion")
completion.setup({
sources = {
lipsum = { enable = function(_buf) return true end },
lsp = { enable = true },
},
})
Starting Neovim w/ nvim --clean -u ./config.lua.
Then you need to open a new buffer for the sources to attach to via :e <somefilename>.
Can you share your config? It shouldn't segfault. Does the segfault also happen if you use Neovim 0.7.2 instead of nightly?
Yup, just confirmed it only segfaults with neovim latest master.
That's not a segfault, that's just a panic (which I'm aware of).
Yup, I figured you would.
|
gharchive/issue
| 2022-09-04T12:04:20
|
2025-04-01T06:39:47.509193
|
{
"authors": [
"MurdeRM3L0DY",
"noib3"
],
"repo": "noib3/nvim-completion",
"url": "https://github.com/noib3/nvim-completion/issues/19",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2309956460
|
Add zsh-completions
[ ] Add documents
[x] Add cli flag
# e.g
--generate-completion-script zsh/bash/etc..
|
gharchive/issue
| 2024-05-22T08:51:56
|
2025-04-01T06:39:47.510920
|
{
"authors": [
"hahwul"
],
"repo": "noir-cr/noir",
"url": "https://github.com/noir-cr/noir/issues/305",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1475339859
|
noirup fails due to Barretenberg build failure
Description
Aim
Successful installation of the nargo tool in order to perform Noir development.
Expected behavior
For invocation of noirup to successfully install nargo. Something along the lines of:
$ noirup
# ...
$ echo $?
0
$ nargo
# ...
$ echo $?
0
Bug
$ sudo apt-get install libomp-dev cmake
$ curl -L https://raw.githubusercontent.com/noir-lang/noir/master/noirup/install | bash
$ noirup
HEAD is now at e239f415 Handle predicate operator during inlining (#544)
Installing nargo v0.1.0 (/home/jmcph4/.nargo/noir-lang/noir/crates/nargo)
Updating crates.io index
Compiling psm v0.1.21
Compiling barretenberg_wrapper v0.1.0 (https://github.com/AztecProtocol/barretenberg?rev=804c7dcf21111acd1302a768a8fa2f453dcec50f#804c7dcf)
error: failed to run custom build command for `barretenberg_wrapper v0.1.0 (https://github.com/AztecProtocol/barretenberg?rev=804c7dcf21111acd1302a768a8fa2f453dcec50f#804c7dcf)`
Caused by:
process didn't exit successfully: `/home/jmcph4/.nargo/noir-lang/noir/target/release/build/barretenberg_wrapper-6e0ec251307286f5/build-script-build` (exit status: 101)
--- stdout
CMAKE_TOOLCHAIN_FILE_x86_64-unknown-linux-gnu = None
CMAKE_TOOLCHAIN_FILE_x86_64_unknown_linux_gnu = None
HOST_CMAKE_TOOLCHAIN_FILE = None
CMAKE_TOOLCHAIN_FILE = None
CMAKE_GENERATOR_x86_64-unknown-linux-gnu = None
CMAKE_GENERATOR_x86_64_unknown_linux_gnu = None
HOST_CMAKE_GENERATOR = None
CMAKE_GENERATOR = None
CMAKE_PREFIX_PATH_x86_64-unknown-linux-gnu = None
CMAKE_PREFIX_PATH_x86_64_unknown_linux_gnu = None
HOST_CMAKE_PREFIX_PATH = None
CMAKE_PREFIX_PATH = None
CMAKE_x86_64-unknown-linux-gnu = None
CMAKE_x86_64_unknown_linux_gnu = None
HOST_CMAKE = None
CMAKE = None
running: "cmake" "/home/jmcph4/.cargo/git/checkouts/barretenberg-6ce83dfea69613eb/804c7dc/barretenberg_wrapper/../barretenberg" "-DCMAKE_INSTALL_PREFIX=/home/jmcph4/.nargo/noir-lang/noir/target/release/build/barretenberg_wrapper-24257eaf3fa79116/out" "-DCMAKE_C_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_C_COMPILER=/usr/bin/cc" "-DCMAKE_CXX_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_CXX_COMPILER=/usr/bin/c++" "-DCMAKE_ASM_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_ASM_COMPILER=/usr/bin/cc" "-DCMAKE_BUILD_TYPE=Release"
-- The CXX compiler identification is GNU 12.2.0
-- The C compiler identification is GNU 12.2.0
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Found OpenMP_C: -fopenmp (found version "4.5")
-- Found OpenMP_CXX: -fopenmp (found version "4.5")
-- Found OpenMP: TRUE (found version "4.5")
-- Multithreading is enabled.
-- Found Python: /home/jmcph4/.pyenv/shims/python3 (found version "3.9.0") found components: Interpreter
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Failed to find LLVM FileCheck
-- Found Git: /usr/bin/git (found version "2.37.2")
-- git version: v1.6.1 normalized to 1.6.1
-- Version: 1.6.1
-- Looking for shm_open in rt
-- Looking for shm_open in rt - found
-- Performing Test HAVE_CXX_FLAG_STD_CXX11
-- Performing Test HAVE_CXX_FLAG_STD_CXX11 - Success
-- Performing Test HAVE_CXX_FLAG_WALL
-- Performing Test HAVE_CXX_FLAG_WALL - Success
-- Performing Test HAVE_CXX_FLAG_WEXTRA
-- Performing Test HAVE_CXX_FLAG_WEXTRA - Success
-- Performing Test HAVE_CXX_FLAG_WSHADOW
-- Performing Test HAVE_CXX_FLAG_WSHADOW - Success
-- Performing Test HAVE_CXX_FLAG_WERROR
-- Performing Test HAVE_CXX_FLAG_WERROR - Success
-- Performing Test HAVE_CXX_FLAG_WSUGGEST_OVERRIDE
-- Performing Test HAVE_CXX_FLAG_WSUGGEST_OVERRIDE - Success
-- Performing Test HAVE_CXX_FLAG_PEDANTIC
-- Performing Test HAVE_CXX_FLAG_PEDANTIC - Success
-- Performing Test HAVE_CXX_FLAG_PEDANTIC_ERRORS
-- Performing Test HAVE_CXX_FLAG_PEDANTIC_ERRORS - Success
-- Performing Test HAVE_CXX_FLAG_WSHORTEN_64_TO_32
-- Performing Test HAVE_CXX_FLAG_WSHORTEN_64_TO_32 - Failed
-- Performing Test HAVE_CXX_FLAG_FSTRICT_ALIASING
-- Performing Test HAVE_CXX_FLAG_FSTRICT_ALIASING - Success
-- Performing Test HAVE_CXX_FLAG_WNO_DEPRECATED_DECLARATIONS
-- Performing Test HAVE_CXX_FLAG_WNO_DEPRECATED_DECLARATIONS - Success
-- Performing Test HAVE_CXX_FLAG_WNO_DEPRECATED
-- Performing Test HAVE_CXX_FLAG_WNO_DEPRECATED - Success
-- Performing Test HAVE_CXX_FLAG_WSTRICT_ALIASING
-- Performing Test HAVE_CXX_FLAG_WSTRICT_ALIASING - Success
-- Performing Test HAVE_CXX_FLAG_WD654
-- Performing Test HAVE_CXX_FLAG_WD654 - Failed
-- Performing Test HAVE_CXX_FLAG_WTHREAD_SAFETY
-- Performing Test HAVE_CXX_FLAG_WTHREAD_SAFETY - Failed
-- Performing Test HAVE_CXX_FLAG_COVERAGE
-- Performing Test HAVE_CXX_FLAG_COVERAGE - Success
-- Performing Test HAVE_STD_REGEX
-- Performing Test HAVE_STD_REGEX
-- Performing Test HAVE_STD_REGEX -- success
-- Performing Test HAVE_GNU_POSIX_REGEX
-- Performing Test HAVE_GNU_POSIX_REGEX
-- Performing Test HAVE_GNU_POSIX_REGEX -- failed to compile
-- Performing Test HAVE_POSIX_REGEX
-- Performing Test HAVE_POSIX_REGEX
-- Performing Test HAVE_POSIX_REGEX -- success
-- Performing Test HAVE_STEADY_CLOCK
-- Performing Test HAVE_STEADY_CLOCK
-- Performing Test HAVE_STEADY_CLOCK -- success
-- Using optimized assembly for field arithmetic.
-- Looking for unistd.h
-- Looking for unistd.h - found
-- Looking for crc32c_value in crc32c
-- Looking for crc32c_value in crc32c - not found
-- Looking for snappy_compress in snappy
-- Looking for snappy_compress in snappy - not found
-- Looking for malloc in tcmalloc
-- Looking for malloc in tcmalloc - not found
-- Looking for fdatasync
-- Looking for fdatasync - found
-- Looking for F_FULLFSYNC
-- Looking for F_FULLFSYNC - not found
-- Performing Test HAVE_CLANG_THREAD_SAFETY
-- Performing Test HAVE_CLANG_THREAD_SAFETY - Failed
-- Performing Test HAVE_CXX17_HAS_INCLUDE
-- Performing Test HAVE_CXX17_HAS_INCLUDE - Success
-- Looking for sqlite3_open in sqlite3
-- Looking for sqlite3_open in sqlite3 - found
-- Performing Test HAVE_KYOTOCABINET
-- Performing Test HAVE_KYOTOCABINET - Failed
-- Configuring done
-- Generating done
-- Build files have been written to: /home/jmcph4/.nargo/noir-lang/noir/target/release/build/barretenberg_wrapper-24257eaf3fa79116/out/build
running: "cmake" "--build" "." "--target" "install" "--config" "Release"
[ 0%] Building CXX object src/aztec/env/CMakeFiles/env_objects.dir/logstr.cpp.o
[ 0%] Built target env_objects
[ 1%] Building CXX object src/aztec/numeric/CMakeFiles/numeric_objects.dir/random/engine.cpp.o
[ 1%] Built target numeric_objects
[ 2%] Building CXX object _deps/googletest-build/googletest/CMakeFiles/gtest.dir/src/gtest-all.cc.o
--- stderr
CMake Warning:
Manually-specified variables were not used by the project:
CMAKE_ASM_COMPILER
CMAKE_ASM_FLAGS
In file included from /usr/include/c++/12/ios:40,
from /usr/include/c++/12/ostream:38,
from /usr/include/c++/12/bits/unique_ptr.h:41,
from /usr/include/c++/12/memory:76,
from /home/jmcph4/.nargo/noir-lang/noir/target/release/build/barretenberg_wrapper-24257eaf3fa79116/out/build/_deps/googletest-src/googletest/include/gtest/gtest.h:57,
from /home/jmcph4/.nargo/noir-lang/noir/target/release/build/barretenberg_wrapper-24257eaf3fa79116/out/build/_deps/googletest-src/googletest/src/gtest-all.cc:38:
In static member function ‘static constexpr std::char_traits<char>::char_type* std::char_traits<char>::copy(char_type*, const char_type*, std::size_t)’,
inlined from ‘static constexpr void std::__cxx11::basic_string<_CharT, _Traits, _Alloc>::_S_copy(_CharT*, const _CharT*, size_type) [with _CharT = char; _Traits = std::char_traits<char>; _Alloc = std::allocator<char>]’ at /usr/include/c++/12/bits/basic_string.h:423:21,
inlined from ‘constexpr std::__cxx11::basic_string<_CharT, _Traits, _Allocator>& std::__cxx11::basic_string<_CharT, _Traits, _Alloc>::_M_replace(size_type, size_type, const _CharT*, size_type) [with _CharT = char; _Traits = std::char_traits<char>; _Alloc = std::allocator<char>]’ at /usr/include/c++/12/bits/basic_string.tcc:532:22,
inlined from ‘constexpr std::__cxx11::basic_string<_CharT, _Traits, _Alloc>& std::__cxx11::basic_string<_CharT, _Traits, _Alloc>::replace(size_type, size_type, const _CharT*, size_type) [with _CharT = char; _Traits = std::char_traits<char>; _Alloc = std::allocator<char>]’ at /usr/include/c++/12/bits/basic_string.h:2171:19,
inlined from ‘constexpr std::__cxx11::basic_string<_CharT, _Traits, _Alloc>& std::__cxx11::basic_string<_CharT, _Traits, _Alloc>::insert(size_type, const _CharT*) [with _CharT = char; _Traits = std::char_traits<char>; _Alloc = std::allocator<char>]’ at /usr/include/c++/12/bits/basic_string.h:1928:22,
inlined from ‘constexpr std::__cxx11::basic_string<_CharT, _Traits, _Allocator> std::operator+(const _CharT*, __cxx11::basic_string<_CharT, _Traits, _Allocator>&&) [with _CharT = char; _Traits = char_traits<char>; _Alloc = allocator<char>]’ at /usr/include/c++/12/bits/basic_string.h:3541:36,
inlined from ‘static std::string testing::internal::StreamingListener::UrlEncode(const char*)’ at /home/jmcph4/.nargo/noir-lang/noir/target/release/build/barretenberg_wrapper-24257eaf3fa79116/out/build/_deps/googletest-src/googletest/src/gtest.cc:4882:27:
/usr/include/c++/12/bits/char_traits.h:431:56: error: ‘void* __builtin_memcpy(void*, const void*, long unsigned int)’ accessing 9223372036854775810 or more bytes at offsets [2, 9223372036854775807] and 1 may overlap up to 9223372036854775813 bytes at offset -3 [-Werror=restrict]
431 | return static_cast<char_type*>(__builtin_memcpy(__s1, __s2, __n));
| ~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
cc1plus: all warnings being treated as errors
gmake[2]: *** [_deps/googletest-build/googletest/CMakeFiles/gtest.dir/build.make:76: _deps/googletest-build/googletest/CMakeFiles/gtest.dir/src/gtest-all.cc.o] Error 1
gmake[1]: *** [CMakeFiles/Makefile2:1185: _deps/googletest-build/googletest/CMakeFiles/gtest.dir/all] Error 2
gmake: *** [Makefile:146: all] Error 2
thread 'main' panicked at '
command did not execute successfully, got: exit status: 2
build script failed, must exit now', /home/jmcph4/.cargo/registry/src/github.com-1ecc6299db9ec823/cmake-0.1.49/src/lib.rs:1104:5
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
error: failed to compile `nargo v0.1.0 (/home/jmcph4/.nargo/noir-lang/noir/crates/nargo)`, intermediate artifacts can be found at `/home/jmcph4/.nargo/noir-lang/noir/target`
noirup: command failed: cargo install --path ./crates/nargo --bins --locked --force --root /home/jmcph4/.nargo
To reproduce
Install noirup (as per https://github.com/noir-lang/noir/tree/1d1b592039fa73ce561d6b6886c36b02b36ada53/noirup#readme)
Invoke noirup
Environment
$ uname --all
Linux foobar 5.19.0-2-amd64 #1 SMP PREEMPT_DYNAMIC Debian 5.19.11-1 (2022-09-24) x86_64 GNU/Linux
$ cat /etc/os-release
PRETTY_NAME="Debian GNU/Linux bookworm/sid"
NAME="Debian GNU/Linux"
ID=debian
HOME_URL="https://www.debian.org/"
SUPPORT_URL="https://www.debian.org/support"
BUG_REPORT_URL="https://bugs.debian.org/"
Additional context
I initially attempted to follow the official documentation, but this failed for other reasons. I then found noirup and decided it was the more graceful approach.
The build appears to be failing during the Barretenberg build section. I appear to satisfy the Barretenberg dependencies.
Since moving to nix, this issue has been solved. Feel free to reopen if you encounter it again!
|
gharchive/issue
| 2022-12-05T00:30:08
|
2025-04-01T06:39:47.522163
|
{
"authors": [
"jmcph4",
"kevaundray"
],
"repo": "noir-lang/noir",
"url": "https://github.com/noir-lang/noir/issues/556",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
2632438868
|
Test brillig_cow fails with inliner = -Inf
Aim
Trying to run test_programs with different --inliner-aggressiveness in https://github.com/noir-lang/noir/pull/6429
Expected Behavior
execution_success tests should pass with any inliner setting
Bug
test_brillig_cow fails with the following error message:
❯ nargo execute --force --inliner-aggressiveness -9223372036854775808
error: Failed to solve program: 'Failed to solve brillig function'
┌─ /Users/aakoshh/Work/aztec/noir/test_programs/execution_success/brillig_cow/src/main.nr:46:12
│
46 │ assert(expected_result.is_equal(modify_in_inlined_constrained(original, index)));
│ ------------------------------------------------------------------------
│
= Call stack:
1. /Users/aakoshh/Work/aztec/noir/test_programs/execution_success/brillig_cow/src/main.nr:46:12
Failed to solve program: 'Failed to solve brillig function'
To Reproduce
cd test_programs/execution_success/brillig_cow
nargo execute --force --inliner-aggressiveness -9223372036854775808
Alternatively:
cargo run -p nargo_cli -- --program-dir . execute --force --inliner-aggressiveness -9223372036854775808
Workaround
None
Workaround Description
No response
Additional Context
No response
Project Impact
None
Blocker Context
No response
Nargo Version
nargo version = 0.36.0 noirc version = 0.36.0+2f0cb3e80f3d93a1dee77fffacc397811e300257 (git version hash: 2f0cb3e80f3d93a1dee77fffacc397811e300257, is dirty: false)
NoirJS Version
No response
Proving Backend Tooling & Version
No response
Would you like to submit a PR for this Issue?
None
Support Needs
No response
A bit more detail.
I modified the program like so:
unconstrained fn main(original: [Field; ARRAY_SIZE], index: u64, expected_result: ExecutionResult) {
let result_uncons = modify_in_unconstrained(original, index);
let result_cons = modify_in_inlined_constrained(original, index);
std::println(f"uncons: {result_uncons}");
std::println(f"cons: {result_cons}");
std::println(f"expect: {expected_result}");
assert(expected_result.is_equal(result_uncons));
assert(expected_result.is_equal(result_cons));
}
The print shows that the constrained and unconstrained results agree with each other, but both differ from the expected modified_once:
uncons: ExecutionResult { original: [0x00, 0x01, 0x02, 0x03, 0x04], modified_once: [0x00, 0x01, 0x1b, 0x1b, 0x04], modified_twice: [0x00, 0x01, 0x1b, 0x1b, 0x04] }
cons: ExecutionResult { original: [0x00, 0x01, 0x02, 0x03, 0x04], modified_once: [0x00, 0x01, 0x1b, 0x1b, 0x04], modified_twice: [0x00, 0x01, 0x1b, 0x1b, 0x04] }
expect: ExecutionResult { original: [0x00, 0x01, 0x02, 0x03, 0x04], modified_once: [0x00, 0x01, 0x1b, 0x03, 0x04], modified_twice: [0x00, 0x01, 0x1b, 0x1b, 0x04] }
error: Failed to solve program: 'Failed to solve brillig function'
┌─ /Users/aakoshh/Work/aztec/noir/test_programs/execution_success/brillig_cow/src/main.nr:55:12
│
55 │ assert(expected_result.is_equal(result_uncons));
│ ---------------------------------------
The modify function looks like this:
let mut modified = original;
modified[index] = 27;
let modified_once = modified;
modified[index + 1] = 27;
ExecutionResult { original, modified_once, modified_twice: modified }
So modified_once ends up being a reference to modified rather than a snapshot of it taken before modified is further mutated.
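The aliasing can be sketched in a small Python model (an illustrative toy, not the compiler's actual SSA lowering). With the value semantics the test expects, `modified_once` should be an independent snapshot; the buggy lowering instead keeps it as an alias of `modified`:

```python
# Toy model of the copy-on-write semantics the brillig_cow test expects.
# Python lists alias by default, so the "correct" model copies explicitly.

def modify_cow(original, index):
    """Correct semantics: the snapshot is an independent copy."""
    modified = list(original)
    modified[index] = 27
    modified_once = list(modified)  # snapshot taken before the second write
    modified[index + 1] = 27
    return original, modified_once, modified

def modify_aliased(original, index):
    """Buggy semantics: the 'snapshot' aliases the mutated array."""
    modified = list(original)
    modified[index] = 27
    modified_once = modified        # alias, not a copy
    modified[index + 1] = 27
    return original, modified_once, modified

if __name__ == "__main__":
    orig = [0, 1, 2, 3, 4]
    _, once_ok, _ = modify_cow(orig, 2)
    _, once_bad, _ = modify_aliased(orig, 2)
    print(once_ok)   # [0, 1, 27, 3, 4]  -- matches expected_result (0x1b = 27)
    print(once_bad)  # [0, 1, 27, 27, 4] -- matches the failing output
```

The aliased variant reproduces exactly the `modified_once` mismatch shown in the printed `ExecutionResult` values above.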
What is strange is that if I call the CLI with the --show-ssa option to see what's happening under the hood, the problem goes away:
❯ nargo execute --force --inliner-aggressiveness -9223372036854775808 --show-ssa
Initial SSA:
brillig(inline) fn main f0 {
...
brillig(inline) fn eq f11 {
b0(v0: Field, v1: Field):
v2 = eq v0, v1
return v2
}
uncons: ExecutionResult { original: [0x00, 0x01, 0x02, 0x03, 0x04], modified_once: [0x00, 0x01, 0x1b, 0x03, 0x04], modified_twice: [0x00, 0x01, 0x1b, 0x1b, 0x04] }
cons: ExecutionResult { original: [0x00, 0x01, 0x02, 0x03, 0x04], modified_once: [0x00, 0x01, 0x1b, 0x03, 0x04], modified_twice: [0x00, 0x01, 0x1b, 0x1b, 0x04] }
expect: ExecutionResult { original: [0x00, 0x01, 0x02, 0x03, 0x04], modified_once: [0x00, 0x01, 0x1b, 0x03, 0x04], modified_twice: [0x00, 0x01, 0x1b, 0x1b, 0x04] }
[brillig_cow] Circuit witness successfully solved
[brillig_cow] Witness saved to /Users/aakoshh/Work/aztec/noir/test_programs/execution_success/brillig_cow/target/brillig_cow.gz
A genuine Heisenbug 👀
I changed printing the SSA to do it without normalising the IDs, which avoids "fixing" the bug:
fn print(mut self, msg: &str) -> Self {
if self.print_ssa_passes {
//self.ssa.normalize_ids();
println!("{msg}\n{}", self.ssa);
}
self
}
With this the last SSA printed before the error looks like this:
After Array Set Optimizations:
```
brillig(inline) fn main f0 {
b0(v0: [Field; 5], v1: u64, v2: [Field; 5], v3: [Field; 5], v4: [Field; 5]):
v38, v39, v40 = call f1(v0, v1)
call f9([u8 117, u8 110, u8 99, u8 111, u8 110, u8 115, u8 58, u8 32, u8 123, u8 114, u8 101, u8 115, u8 117, u8 108, u8 116, u8 95, u8 117, u8 110, u8 99, u8 111, u8 110, u8 115, u8 125], u32 1, v38, v39, v40)
call f10([u8 101, u8 120, u8 112, u8 101, u8 99, u8 116, u8 58, u8 32, u8 123, u8 101, u8 120, u8 112, u8 101, u8 99, u8 116, u8 101, u8 100, u8 95, u8 114, u8 101, u8 115, u8 117, u8 108, u8 116, u8 125], u32 1, v2, v3, v4)
v41 = call f11(v2, v3, v4, v38, v39, v40)
constrain v41 == u1 1
return
}
brillig(inline) fn modify_in_unconstrained f1 {
b0(v0: [Field; 5], v1: u64):
inc_rc v0
v23 = cast v1 as u32
v24 = array_set v0, index v23, value Field 27
inc_rc v24
v26 = add v1, u64 1
v27 = cast v26 as u32
v28 = array_set mut v24, index v27, value Field 27
inc_rc v0
inc_rc v24
return v0, v24, v28
}
brillig(inline) fn print_unconstrained f7 {
b0(v0: u1, v1: [u8; 25], v2: Field, v3: [Field; 5], v4: [Field; 5], v5: [Field; 5]):
call v6(v0, v1, v2, v3, v4, v5, [u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 115, u8 116, u8 114, u8 117, u8 99, u8 116, u8 34, u8 44, u8 34, u8 110, u8 97, u8 109, u8 101, u8 34, u8 58, u8 34, u8 69, u8 120, u8 101, u8 99, u8 117, u8 116, u8 105, u8 111, u8 110, u8 82, u8 101, u8 115, u8 117, u8 108, u8 116, u8 34, u8 44, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 115, u8 34, u8 58, u8 91, u8 91, u8 34, u8 111, u8 114, u8 105, u8 103, u8 105, u8 110, u8 97, u8 108, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 44, u8 91, u8 34, u8 109, u8 111, u8 100, u8 105, u8 102, u8 105, u8 101, u8 100, u8 95, u8 111, u8 110, u8 99, u8 101, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 44, u8 91, u8 34, u8 109, u8 111, u8 100, u8 105, u8 102, u8 105, u8 101, u8 100, u8 95, u8 116, u8 119, u8 105, u8 99, u8 101, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 93, u8 125], 
u1 1)
return
}
brillig(inline) fn print_unconstrained f8 {
b0(v0: u1, v1: [u8; 23], v2: Field, v3: [Field; 5], v4: [Field; 5], v5: [Field; 5]):
call v6(v0, v1, v2, v3, v4, v5, [u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 115, u8 116, u8 114, u8 117, u8 99, u8 116, u8 34, u8 44, u8 34, u8 110, u8 97, u8 109, u8 101, u8 34, u8 58, u8 34, u8 69, u8 120, u8 101, u8 99, u8 117, u8 116, u8 105, u8 111, u8 110, u8 82, u8 101, u8 115, u8 117, u8 108, u8 116, u8 34, u8 44, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 115, u8 34, u8 58, u8 91, u8 91, u8 34, u8 111, u8 114, u8 105, u8 103, u8 105, u8 110, u8 97, u8 108, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 44, u8 91, u8 34, u8 109, u8 111, u8 100, u8 105, u8 102, u8 105, u8 101, u8 100, u8 95, u8 111, u8 110, u8 99, u8 101, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 44, u8 91, u8 34, u8 109, u8 111, u8 100, u8 105, u8 102, u8 105, u8 101, u8 100, u8 95, u8 116, u8 119, u8 105, u8 99, u8 101, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 93, u8 125], 
u1 1)
return
}
brillig(inline) fn println f9 {
b0(v0: [u8; 23], v1: Field, v2: [Field; 5], v3: [Field; 5], v4: [Field; 5]):
call f8(u1 1, v0, v1, v2, v3, v4)
return
}
brillig(inline) fn println f10 {
b0(v0: [u8; 25], v1: Field, v2: [Field; 5], v3: [Field; 5], v4: [Field; 5]):
call f7(u1 1, v0, v1, v2, v3, v4)
return
}
brillig(inline) fn is_equal f11 {
b0(v0: [Field; 5], v1: [Field; 5], v2: [Field; 5], v3: [Field; 5], v4: [Field; 5], v5: [Field; 5]):
v20 = call f12(v0, v3)
v21 = call f12(v1, v4)
v22 = mul v20, v21
v23 = call f12(v2, v5)
v24 = mul v22, v23
return v24
}
brillig(inline) fn eq f12 {
b0(v0: [Field; 5], v1: [Field; 5]):
v26 = allocate
store u1 1 at v26
jmp b1(u32 0)
b1(v4: u32):
v27 = lt v4, u32 5
jmpif v27 then: b2, else: b3
b2():
v29 = load v26
v30 = array_get v0, index v4
v31 = array_get v1, index v4
v32 = call f13(v30, v31)
v33 = mul v29, v32
store v33 at v26
v34 = add v4, u32 1
jmp b1(v34)
b3():
v28 = load v26
return v28
}
brillig(inline) fn eq f13 {
b0(v0: Field, v1: Field):
v5 = eq v0, v1
return v5
}
```
My branch does include https://github.com/noir-lang/noir/pull/6355 but since this test is also about mutable arrays, I tried to run it without this last SSA pass, which is where the previous bug was. Without the array set optimisations, the test passes. The last SSA in that case looks like this:
After Simplifying:
```
brillig(inline) fn main f0 {
b0(v0: [Field; 5], v1: u64, v2: [Field; 5], v3: [Field; 5], v4: [Field; 5]):
v38, v39, v40 = call f1(v0, v1)
call f9([u8 117, u8 110, u8 99, u8 111, u8 110, u8 115, u8 58, u8 32, u8 123, u8 114, u8 101, u8 115, u8 117, u8 108, u8 116, u8 95, u8 117, u8 110, u8 99, u8 111, u8 110, u8 115, u8 125], u32 1, v38, v39, v40)
call f10([u8 101, u8 120, u8 112, u8 101, u8 99, u8 116, u8 58, u8 32, u8 123, u8 101, u8 120, u8 112, u8 101, u8 99, u8 116, u8 101, u8 100, u8 95, u8 114, u8 101, u8 115, u8 117, u8 108, u8 116, u8 125], u32 1, v2, v3, v4)
v41 = call f11(v2, v3, v4, v38, v39, v40)
constrain v41 == u1 1
return
}
brillig(inline) fn modify_in_unconstrained f1 {
b0(v0: [Field; 5], v1: u64):
inc_rc v0
v23 = cast v1 as u32
v24 = array_set v0, index v23, value Field 27
inc_rc v24
v26 = add v1, u64 1
v27 = cast v26 as u32
v28 = array_set v24, index v27, value Field 27
inc_rc v0
inc_rc v24
return v0, v24, v28
}
brillig(inline) fn print_unconstrained f7 {
b0(v0: u1, v1: [u8; 25], v2: Field, v3: [Field; 5], v4: [Field; 5], v5: [Field; 5]):
call v6(v0, v1, v2, v3, v4, v5, [u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 115, u8 116, u8 114, u8 117, u8 99, u8 116, u8 34, u8 44, u8 34, u8 110, u8 97, u8 109, u8 101, u8 34, u8 58, u8 34, u8 69, u8 120, u8 101, u8 99, u8 117, u8 116, u8 105, u8 111, u8 110, u8 82, u8 101, u8 115, u8 117, u8 108, u8 116, u8 34, u8 44, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 115, u8 34, u8 58, u8 91, u8 91, u8 34, u8 111, u8 114, u8 105, u8 103, u8 105, u8 110, u8 97, u8 108, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 44, u8 91, u8 34, u8 109, u8 111, u8 100, u8 105, u8 102, u8 105, u8 101, u8 100, u8 95, u8 111, u8 110, u8 99, u8 101, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 44, u8 91, u8 34, u8 109, u8 111, u8 100, u8 105, u8 102, u8 105, u8 101, u8 100, u8 95, u8 116, u8 119, u8 105, u8 99, u8 101, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 93, u8 125], 
u1 1)
return
}
brillig(inline) fn print_unconstrained f8 {
b0(v0: u1, v1: [u8; 23], v2: Field, v3: [Field; 5], v4: [Field; 5], v5: [Field; 5]):
call v6(v0, v1, v2, v3, v4, v5, [u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 115, u8 116, u8 114, u8 117, u8 99, u8 116, u8 34, u8 44, u8 34, u8 110, u8 97, u8 109, u8 101, u8 34, u8 58, u8 34, u8 69, u8 120, u8 101, u8 99, u8 117, u8 116, u8 105, u8 111, u8 110, u8 82, u8 101, u8 115, u8 117, u8 108, u8 116, u8 34, u8 44, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 115, u8 34, u8 58, u8 91, u8 91, u8 34, u8 111, u8 114, u8 105, u8 103, u8 105, u8 110, u8 97, u8 108, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 44, u8 91, u8 34, u8 109, u8 111, u8 100, u8 105, u8 102, u8 105, u8 101, u8 100, u8 95, u8 111, u8 110, u8 99, u8 101, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 44, u8 91, u8 34, u8 109, u8 111, u8 100, u8 105, u8 102, u8 105, u8 101, u8 100, u8 95, u8 116, u8 119, u8 105, u8 99, u8 101, u8 34, u8 44, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 97, u8 114, u8 114, u8 97, u8 121, u8 34, u8 44, u8 34, u8 108, u8 101, u8 110, u8 103, u8 116, u8 104, u8 34, u8 58, u8 53, u8 44, u8 34, u8 116, u8 121, u8 112, u8 101, u8 34, u8 58, u8 123, u8 34, u8 107, u8 105, u8 110, u8 100, u8 34, u8 58, u8 34, u8 102, u8 105, u8 101, u8 108, u8 100, u8 34, u8 125, u8 125, u8 93, u8 93, u8 125], 
u1 1)
return
}
brillig(inline) fn println f9 {
b0(v0: [u8; 23], v1: Field, v2: [Field; 5], v3: [Field; 5], v4: [Field; 5]):
call f8(u1 1, v0, v1, v2, v3, v4)
return
}
brillig(inline) fn println f10 {
b0(v0: [u8; 25], v1: Field, v2: [Field; 5], v3: [Field; 5], v4: [Field; 5]):
call f7(u1 1, v0, v1, v2, v3, v4)
return
}
brillig(inline) fn is_equal f11 {
b0(v0: [Field; 5], v1: [Field; 5], v2: [Field; 5], v3: [Field; 5], v4: [Field; 5], v5: [Field; 5]):
v20 = call f12(v0, v3)
v21 = call f12(v1, v4)
v22 = mul v20, v21
v23 = call f12(v2, v5)
v24 = mul v22, v23
return v24
}
brillig(inline) fn eq f12 {
b0(v0: [Field; 5], v1: [Field; 5]):
v26 = allocate
store u1 1 at v26
jmp b1(u32 0)
b1(v4: u32):
v27 = lt v4, u32 5
jmpif v27 then: b2, else: b3
b2():
v29 = load v26
v30 = array_get v0, index v4
v31 = array_get v1, index v4
v32 = call f13(v30, v31)
v33 = mul v29, v32
store v33 at v26
v34 = add v4, u32 1
jmp b1(v34)
b3():
v28 = load v26
return v28
}
brillig(inline) fn eq f13 {
b0(v0: Field, v1: Field):
v5 = eq v0, v1
return v5
}
```
Since we know that calling ssa.normalize_ids() fixes the problem, it's worth comparing the SSA with and without normalisation.
With normalisation it looks like this:
brillig(inline) fn modify_in_unconstrained f1 {
b0(v0: [Field; 5], v1: u64):
inc_rc v0
v2 = cast v1 as u32
v4 = array_set v0, index v2, value Field 27
inc_rc v4
v6 = add v1, u64 1
v7 = cast v6 as u32
v8 = array_set v4, index v7, value Field 27
inc_rc v0
inc_rc v4
return v0, v4, v8
}
Without normalisation the IDs of the variables have higher values:
brillig(inline) fn modify_in_unconstrained f1 {
b0(v0: [Field; 5], v1: u64):
inc_rc v0
v23 = cast v1 as u32
v24 = array_set v0, index v23, value Field 27
inc_rc v24
v26 = add v1, u64 1
v27 = cast v26 as u32
v28 = array_set mut v24, index v27, value Field 27
inc_rc v0
inc_rc v24
return v0, v24, v28
}
The code that decides whether the array set on v4 (a.k.a. v24) can be made mutable says it can be, unless the old identifier is used again in a later array_get, appears in a return value, or comes from a potentially shared reference.
The problem is that if we print the value of the terminator and the ID of the array we set, we see this without normalisation:
TERMINATOR: Return { return_values: [Id(0), Id(5), Id(11)], call_stack: [Location { span: Span(Span { start: ByteIndex(1084), end: ByteIndex(1103) }), file: FileId(69) }] }
ARRAY SET: Id(0) to Id(4)
ARRAY SET: Id(24) to Id(4)
v24 is identified by Id(24); however, in the return statement it corresponds to Id(5). By contrast, with normalisation we get matching IDs:
TERMINATOR: Return { return_values: [Id(0), Id(4), Id(8)], call_stack: [Location { span: Span(Span { start: ByteIndex(1084), end: ByteIndex(1103) }), file: FileId(69) }] }
ARRAY SET: Id(0) to Id(3)
ARRAY SET: Id(4) to Id(3)
Here we see v4 is in the return value, and this check works:
let mut array_in_terminator = false;
terminator.for_each_value(|value| {
if value == array {
array_in_terminator = true;
}
});
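A toy model of the ValueId mismatch (hypothetical, much simplified from the real DFG): suppose the terminator holds a stale id that is linked to the current one through a replacement map. A naive `value == array` comparison misses the alias; resolving both sides to their canonical id before comparing detects the array in the terminator:

```python
# Hypothetical sketch: the same SSA value can be known under several ids
# when replacements are stored in an alias map instead of being rewritten
# in place (which is effectively what normalize_ids does).

def resolve(aliases, vid):
    """Follow the replacement chain to the canonical id."""
    while vid in aliases:
        vid = aliases[vid]
    return vid

def array_in_terminator(terminator_values, array, aliases):
    """Correct check: compare canonical ids, not raw ids."""
    canon = resolve(aliases, array)
    return any(resolve(aliases, v) == canon for v in terminator_values)

if __name__ == "__main__":
    # Return { return_values: [Id(0), Id(5), Id(11)] }; the array being set
    # is known as Id(24), and the stale Id(5) resolves to Id(24).
    aliases = {5: 24}
    terminator = [0, 5, 11]
    # Naive comparison misses the alias and wrongly allows `array_set mut`:
    print(any(v == 24 for v in terminator))              # False
    # Resolving both sides detects the array in the terminator:
    print(array_in_terminator(terminator, 24, aliases))  # True
```

The exact direction of the alias link is an assumption here; either direction illustrates why the raw-id comparison in the snippet above only works after normalisation.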
|
gharchive/issue
| 2024-11-04T10:37:38
|
2025-04-01T06:39:47.540413
|
{
"authors": [
"aakoshh"
],
"repo": "noir-lang/noir",
"url": "https://github.com/noir-lang/noir/issues/6439",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1728166166
|
chore(ssa refactor): Adds basic program with empty body to experimental-ssa test corpus
Description
This adds the program from #1403 into the test corpus.
Copying the CI output:
---- tests::noir_integration_ssa_refactor stdout ----
Running test simple_program_no_body
Initial SSA:
fn main f0 {
b0(v0: Field, v1: Field):
return unit 0
}
After Inlining:
fn main f1 {
b0(v0: Field, v1: Field):
return unit 0
}
After Unrolling:
fn main f1 {
b0(v0: Field, v1: Field):
return unit 0
}
After Simplifying:
fn main f1 {
b0(v0: Field, v1: Field):
return unit 0
}
After Flattening:
fn main f1 {
b0(v0: Field, v1: Field):
return unit 0
}
After Mem2Reg:
fn main f1 {
b0(v0: Field, v1: Field):
return unit 0
}
thread 'main' panicked at 'ICE: Return of value not yet encountered', crates/noirc_evaluator/src/ssa_refactor/acir_gen/mod.rs:113:28
The error occurs because, when we try to do acir_gen for the return instruction, we:
Get the return values
Check to see if we've acir_gen'd this return value and panic if not
The second point assumes that the return value will always be a variable that was encountered before, whereas return unit 0 is a constant that the acir_gen pass has never seen before
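The fix direction can be sketched in a toy model (not the real acir_gen; names and the cache/constant split are assumptions for illustration): instead of panicking when a return value has no cached variable, handle constants by materializing a witness for them, and treat a unit return as having no return witness at all:

```python
# Illustrative sketch of return handling in an acir_gen-like pass.
# `cache` maps previously generated SSA values to witness indices;
# `constants` maps constant values to their contents.

def acir_gen_return(return_values, cache, constants, next_witness):
    """Return the list of witness indices for the circuit's return values."""
    witnesses = []
    for v in return_values:
        if v == "unit":
            continue                        # unit return => no return witness
        if v in cache:
            witnesses.append(cache[v])      # variable generated earlier
        elif v in constants:
            witnesses.append(next_witness)  # materialize the constant
            next_witness += 1
        else:
            raise AssertionError(f"ICE: Return of value not yet encountered: {v}")
    return witnesses

if __name__ == "__main__":
    print(acir_gen_return(["unit"], {}, {}, 3))       # [] -- no return indices
    print(acir_gen_return(["c0"], {}, {"c0": 7}, 3))  # [3] -- constant return
```

With this shape, `return unit 0` produces `return value indices : []` instead of the ICE.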
Looks good to me but we should avoid merging until the issue in acir-gen is fixed
ssa-gen.
Yep will debug this -- I think it would also trigger, if this returns any ValueId which was not used in another part of the program, so I imagine just returning a constant would also trigger this bug
Seems there is another error in acir_gen:
CompiledProgram {
circuit: current witness index : 3
public parameters indices : [1, 2]
return value indices : [3]
EXPR [ (-1, _3) 0 ]
,
abi: Abi {
parameters: [
AbiParameter {
name: "_x",
typ: Field,
visibility: Private,
},
AbiParameter {
name: "_y",
typ: Field,
visibility: Public,
},
],
param_witnesses: {
"_x": [
Witness(
1,
),
],
"_y": [
Witness(
2,
),
],
},
return_type: None,
return_witnesses: [
Witness(
3,
),
],
},
}
Serialized transcript does not contain the required number of bytes
Serialized transcript does not contain the required number of bytes
This error really only happens in barretenberg when there is a discrepancy between the circuit and the proving system. Since the proving system has not changed, then we can then deduce that something is malformed about the circuit.
In particular, that error arises when there is a mismatch between public inputs.
To fix:
If the return is unit 0 then we should treat this as not having a return parameter. We should then see return value indices : [], i.e. no returns. This should also get rid of EXPR [ (-1, _3) 0 ].
The next issue is `public parameters indices : [1, 2]`; we should have `public parameters indices : [2]`, indicating that there is one public parameter and it is the second parameter, which should have index 2.
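The expected index computation can be sketched as follows (assuming, as in the dump above, that witnesses are assigned 1..N to parameters in declaration order; this layout is an assumption, not confirmed by the source):

```python
# Sketch: only parameters declared `pub` contribute public parameter indices,
# so main(_x: Field, _y: pub Field) should yield [2], not [1, 2].

def public_parameter_indices(params):
    """params: list of (name, visibility) in declaration order."""
    return [i for i, (_, vis) in enumerate(params, start=1) if vis == "public"]

if __name__ == "__main__":
    params = [("_x", "private"), ("_y", "public")]
    print(public_parameter_indices(params))  # [2]
```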
Also going to block this PR until we check for:
fn main() -> pub Field {
return <constant>;
}
To make sure we don't mess something up there
Tested this with a constant return and that also passes :) This passes for me locally so just waiting for it to pass on CI
|
gharchive/pull-request
| 2023-05-26T19:59:24
|
2025-04-01T06:39:47.552117
|
{
"authors": [
"jfecher",
"kevaundray"
],
"repo": "noir-lang/noir",
"url": "https://github.com/noir-lang/noir/pull/1418",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1501432885
|
Raise specific Exceptions when LDAP-connections fail due to SSL-problems
Issue:
I recently ran multiple times into the problem that LDAP connection attempts were unsuccessful because the server certificates could not be verified. This was due to the corresponding CA certificate missing from the central system CA store (in our case Ubuntu Linux).
When I was tracking down these connection problems, I had to do trial & error to nail it down to this certificate issue, because bonsai just raises bonsai.errors.ConnectionError: Can't contact LDAP server. (unknown error code) (0xFFFF [-1]).
Is there a possibility to raise more specific exceptions on these SSL-related errors?
Setup:
Linux Ubuntu 20.04
Python 3.11.0
bonsai v1.5.1
Testscript:
#!/usr/bin/env python
import bonsai
from bonsai.errors import AuthenticationError, ConnectionError
bonsai.set_debug(True)
client = bonsai.LDAPClient("ldaps://ldap.domain.local")
client.set_credentials("SIMPLE", user="testuser", password="testpassword")
# client.set_cert_policy("allow")
try:
returnValue = client.connect()
except ConnectionError as err:
print(f"LDAP-Connection failed: {err}")
except AuthenticationError as err:
print(f"LDAP-Authentication failed: {err}")
except Exception as e:
print(f"Generic exception: {e}")
Output:
DBG: ldapconnection_new [self:0x7f7bcc9a35b0]
DBG: ldapconnection_init (self:0x7f7bcc9a35b0)
DBG: ldapconnection_open (self:0x7f7bcc9a35b0)
DBG: connecting (self:0x7f7bcc9a35b0)
DBG: create_conn_info (mech:SIMPLE, sock:-1, creds:0x7f7bcdf03f00)
DBG: ldapconnectiter_new [self:0x7f7bcc802740]
DBG: create_init_thread_data (client:0x7f7bcdc476d0, sock:-1)
DBG: create_init_thread (ld:0x24db110, info:0x24e7860, thread:0)
DBG: ldapconnection_result (self:0x7f7bcc9a35b0, args:0x7f7bcdd2ee80, kwds:(nil))[msgid:-1]
DBG: LDAPConnection_Result (self:0x7f7bcc9a35b0, msgid:-1, millisec:-1)
DBG: LDAPConnectIter_Next (self:0x7f7bcc802740, timeout:-1) [tls:0, state:0]
DBG: _ldap_finish_init_thread (async:0, thread:140169686472448, timeout:-1, misc:0x24db110)
DBG: _pthread_mutex_timedlock
DBG: ldap_init_thread_func (params:0x24db110)
DBG: set connecting async: 0
DBG: ldap_init_thread_func [retval:0]
DBG: LDAPConnectIter_Next (self:0x7f7bcc802740, timeout:-1) [tls:0, state:0]
DBG: _ldap_finish_init_thread (async:0, thread:140169686472448, timeout:-1, misc:0x24db110)
DBG: _pthread_mutex_timedlock
DBG: set_certificates (self:0x7f7bcc802740)
DBG: binding [state:3]
DBG: _ldap_bind (ld:0x7f7bc4000b60, info:0x24e7860, ppolicy:0, result:(nil), msgid:0)
DBG: ldapconnectiter_dealloc (self:0x7f7bcc802740)
DBG: dealloc_conn_info (info:0x24e7860)
LDAP-Connection failed: Can't contact LDAP server. (unknown error code) (0xFFFF [-1])
DBG: ldapconnection_dealloc (self:0x7f7bcc9a35b0)
(When I uncomment the line client.set_cert_policy("allow") in my code the connection gets successfully established.)
Unfortunately, if OpenLDAP doesn't set a specific return value or a diagnostic message, then I don't think it's possible to raise a specific error.
The raised exception is based on the LDAP error code (returned by an LDAP function call or set in the LDAP structure's corresponding field), and if an additional diagnostic message is provided, it is concatenated to the exception's error message.
TLS related errors are usually only shown in libldap's trace logs. You can set trace level logging with bonsai.set_debug(True, -1).
OK, I agree that this is an OpenLDAP library issue, so I'll close the issue.
But thanks for the hint about the trace logs. There I can see the real error:
[...]
DBG: _ldap_bind (ld:0x7f3764000b60, info:0x1943560, ppolicy:0, result:(nil), msgid:0)
ldap_sasl_bind
ldap_send_initial_request
ldap_new_connection 1 1 0
ldap_int_open_connection
ldap_connect_to_host: TCP ldap.domain.local:636
ldap_new_socket: 3
ldap_prepare_socket: 3
ldap_connect_to_host: Trying 10.10.10.10:636
ldap_pvt_connect: fd: 3 tm: -1 async: 0
attempting to connect:
connect success
TLS: peer cert untrusted or revoked (0x42)
TLS: can't connect: (unknown error code).
ldap_msgfree
ldap_err2string
DBG: ldapconnectiter_dealloc (self:0x7f376a99a740)
[...]
|
gharchive/issue
| 2022-12-17T15:08:54
|
2025-04-01T06:39:47.559404
|
{
"authors": [
"noirello",
"senfomat"
],
"repo": "noirello/bonsai",
"url": "https://github.com/noirello/bonsai/issues/75",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
952306719
|
fix: fix types for event listeners, remove typedoc
fixes #191
After looking at dom.d.ts a bit I figured out that I was missing some declarations for addEventListener/removeEventListener, which is why it would fail in strict mode. Unfortunately I couldn't figure out how to make these changes in the .ts files, but we can put them directly in the .d.ts files and modify those, since I'm not really using Typedoc anymore and there's no need to keep it around.
This is the magic; you need these type: string declarations too:
https://github.com/nolanlawson/emoji-picker-element/blob/39e1ce51fb1f9efa7d5d292c2966be3dbcf19247/picker.d.ts#L20-L23
|
gharchive/pull-request
| 2021-07-25T16:13:53
|
2025-04-01T06:39:47.567072
|
{
"authors": [
"nolanlawson"
],
"repo": "nolanlawson/emoji-picker-element",
"url": "https://github.com/nolanlawson/emoji-picker-element/pull/193",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
241272538
|
Implement zoom methods and implement mousewheel support
This is a (multiple allowed):
[ ] bug
[x] enhancement
[ ] feature-discussion (RFC)
There should be zoomIn() zoomOut() and zoomReset() methods. The zoom should also have an option to work with mousewheel.
They are in v4
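For reference, enabling this in Swiper 4 could be sketched roughly as follows (the container selector and maxRatio value are placeholders, not part of the original issue):

```javascript
// Sketch of a Swiper 4 setup with zoom and mousewheel enabled.
// The '.swiper-container' selector and maxRatio value are assumptions.
const swiperOptions = {
  zoom: { maxRatio: 3 },   // pinch / double-tap zoom on slides
  mousewheel: true,        // lets the mouse wheel move the slider
};

// In the browser (requires the DOM and the Swiper bundle):
// const swiper = new Swiper('.swiper-container', swiperOptions);
// swiper.zoom.in();   // programmatic zoom in
// swiper.zoom.out();  // programmatic zoom out
```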
|
gharchive/issue
| 2017-07-07T13:59:16
|
2025-04-01T06:39:47.568855
|
{
"authors": [
"julmot",
"nolimits4web"
],
"repo": "nolimits4web/Swiper",
"url": "https://github.com/nolimits4web/Swiper/issues/2148",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
40257872
|
width and height using percentage
Hey, I want to make something that is a mix between responsive and partial. I set the swiper-container { width:100%; height:100% } and the swiper-slide { width:90%; height:100% }, but in the output I get the swiper-slide at width 100% (the same as the container). How can I correct this?
Check the demo http://jsfiddle.net/fnhb137d/
But if we know the width of the parent (on page load), can't we define the width of the slider-wrapper depending on the number of items, then set their width in % of the slider-wrapper width?
I'm asking because I have a project with "fluid" typography, and when I increase the font size on the body, the slider breaks in IE10/11 (because the slide widths are in pixels).
|
gharchive/issue
| 2014-08-14T14:13:53
|
2025-04-01T06:39:47.570832
|
{
"authors": [
"mnifakram",
"zapatoche"
],
"repo": "nolimits4web/Swiper",
"url": "https://github.com/nolimits4web/Swiper/issues/896",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
204652109
|
mousewheelControl param setting when updating this using MousewheelCo…
Fix for:
https://github.com/nolimits4web/Swiper/issues/1998
Always follow the contribution guidelines when submitting a pull request.
Merged, thanks
|
gharchive/pull-request
| 2017-02-01T17:35:35
|
2025-04-01T06:39:47.572474
|
{
"authors": [
"klojniewski",
"nolimits4web"
],
"repo": "nolimits4web/Swiper",
"url": "https://github.com/nolimits4web/Swiper/pull/1999",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
861456826
|
Pagination renderBullet putting last bullet in front of other bullets
Swiper Version: Swiper 6.5.7
Platform/Target and Browser Versions: Windows, Chrome
What You Did
Using a custom pagination renderBullet so we can define buttons as data attributes on each slide item.
JSFiddle with Example
Expected Behavior
Show custom rendered bullets in correct order
Actual Behavior
Last custom rendered bullet is being shown first
How about more minimal example without extra logic (that could be a root of the problem/issue)?
How about more minimal example without extra logic (that could be a root of the problem/issue)?
I'll edit the JS Fiddle, and let you know when it's been minimized
So while updating the JSFiddle to a more minimal example I found the root of the cause.
With loop enabled, it looks like the custom bullet renderer is grabbing a duplicated slide and trying to create a button from that.
Everything is correct with the pagination render. The index it receives in renderBullet is not the index of a slide; it is the index of the bullet. You are trying to use the same index for the slide, which is wrong, as in loop mode duplicated slides are added.
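A minimal sketch of this point (the slide labels and class names here are made-up placeholders): index bullets against your original slide data, not against swiper.slides, because loop mode appends duplicated slides:

```javascript
// Labels for the original slides; in a real page these would come from
// data attributes on each non-duplicated .swiper-slide element.
const slideLabels = ['One', 'Two', 'Three'];

// renderBullet receives the BULLET index (0..bullets-1), which is safe to
// use against the original slide data even when loop mode duplicates slides.
function renderBullet(index, className) {
  return '<button class="' + className + '">' + slideLabels[index] + '</button>';
}

// Browser-side wiring (requires the DOM):
// new Swiper('.swiper-container', {
//   loop: true,
//   pagination: { el: '.swiper-pagination', renderBullet: renderBullet },
// });
```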
|
gharchive/issue
| 2021-04-19T15:46:25
|
2025-04-01T06:39:47.578078
|
{
"authors": [
"Bimmr",
"nolimits4web"
],
"repo": "nolimits4web/swiper",
"url": "https://github.com/nolimits4web/swiper/issues/4454",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1804373739
|
🛑 mastodonczech.cz is down
In e270308, mastodonczech.cz (https://mastodonczech.cz/nodeinfo/2.0) was down:
HTTP code: 502
Response time: 882 ms
Resolved: mastodonczech.cz is back up in deb46ce.
|
gharchive/issue
| 2023-07-14T07:33:36
|
2025-04-01T06:39:47.581249
|
{
"authors": [
"matejdivecky"
],
"repo": "nolog-it/mastodon-uptime",
"url": "https://github.com/nolog-it/mastodon-uptime/issues/190",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2299776447
|
Plugin: nonebot-plugin-tsugu-bangdream-bot
PyPI project name
nonebot-plugin-tsugu-bangdream-bot
Plugin import package name
nonebot_plugin_tsugu_bangdream_bot
Tags
[{"label":"tsugu","color":"#ffee88"}]
Plugin configuration options
No response
Modified
Why is there yet another tsugu (
Why is there yet another tsugu (
That tsugu's code was honestly unbearable to read, so I came and wrote one (
ycm
The plugin metadata is missing the supported adapters; please inherit them from the dependent plugin
The plugin metadata is missing the supported adapters; please inherit them from the dependent plugin
Supported-adapter metadata has been added
Modified again, following the structural changes in tsugu-api-python
I tripped myself up
Didn't that trigger the check?
|
gharchive/issue
| 2024-05-16T08:55:15
|
2025-04-01T06:39:47.597759
|
{
"authors": [
"GreyElaina",
"RF-Tar-Railt",
"WindowsSov8forUs",
"yanyongyu"
],
"repo": "nonebot/nonebot2",
"url": "https://github.com/nonebot/nonebot2/issues/2718",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1994701350
|
[NC | NSFS] WAL based tape migrations and recalls
Explain the changes
This PR adds WAL based migrations and recalls to NooBaa. The PR adds 2 commands to manage_nsfs:
tape recall - Uses eeadm to restore all the files that were requested for restore.
tape migrate - Uses eeadm to migrate all the files to the tape.
NOTE: There were many races, but I now hope I am dealing with all of them. What are the races I accounted for?
Multiple nodes issuing the migrate/recall at the same time (maybe cron doing it?).
The same node issuing migrate and recall at the same time such that both of them interleave. This is not problematic on the surface, as they don't share the log, but it is problematic because eventually they all deal with the same S3 objects. Imagine: (1) upload a file to GLACIER, (2) issue a restore-object, (3) cron schedules tape recall as well as tape migrate, (4) a file gets "recalled" but then is migrated.
A single instance of migrate is running (probably because WAL was huge and eeadm is slow) while another one got scheduled. Same for recall.
NOTE: This solution does NOT allow running tape recall and tape migrate at the same time. It is intentional. One call will block another.
[ ] Doc added/updated
[ ] Tests added
Ah, there is still one possible race here:
P1 opens a file
P2 renames the file
Someone issues a tape migrate which gets exclusive access to the recently renamed file
P1 gets blocked, as tape migrate has the exclusive access
tape migrate unlinks the file on completion
P1 gets unblocked, happily performs write (the file won't be deleted as there is an open FD).
P1 closes the FD upon swap, the file gets deleted
RESULT: Lost writes.
Working on it now...
|
gharchive/pull-request
| 2023-11-15T12:40:16
|
2025-04-01T06:39:47.605725
|
{
"authors": [
"tangledbytes"
],
"repo": "noobaa/noobaa-core",
"url": "https://github.com/noobaa/noobaa-core/pull/7601",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
783731929
|
IMAGE
|
gharchive/issue
| 2021-01-11T21:52:48
|
2025-04-01T06:39:47.644491
|
{
"authors": [
"nopnop2002"
],
"repo": "nopnop2002/esp-idf-parallel-tft",
"url": "https://github.com/nopnop2002/esp-idf-parallel-tft/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
974981003
|
Problem: line spacing opens up even in the middle of a sentence
「あああああああああ
ああああ。
ああああああああ。」
Rather than this,
「ああああああああ
ああああ。
ああああああああ。」
this is what gets displayed, which feels hard to read (is it just me?).
However, the processing order is
・assign the replaced string to otayori_adjusted
・adjust the line spacing
so I can't think of a good fix.
@kitokun
Please tell me the environment, conditions, and input.
Windows/Chrome (since this is a spec question, the environment probably doesn't matter)
「朝起きたらとなりでルイズが寝ていた。
俺は死ぬほど嬉しかった。
何せ夢にまで見たルイズだ。
童貞の俺にもついに彼女が出来て幸せに暮らしてい
ける。
そしてマイホームを買って子供をつくって毎日幸せ
な生活をする。」
This is how it currently looks; instead:
「朝起きたらとなりでルイズが寝ていた。
俺は死ぬほど嬉しかった。
何せ夢にまで見たルイズだ。
童貞の俺にもついに彼女が出来て幸せに暮らしてい
ける。
そしてマイホームを買って子供をつくって毎日幸せ
な生活をする。」
Wouldn't it be easier to read like this? (When a sentence continues across lines, tighter line spacing seems easier to read.)
That is my point.
So you want different line spacing for "line breaks in the original text" versus "line breaks caused by the browser width", right?
I looked into it but couldn't figure out a solution, so please keep this pending. Sorry…
|
gharchive/issue
| 2021-08-19T19:33:41
|
2025-04-01T06:39:47.653757
|
{
"authors": [
"kitokun",
"norihitoishida"
],
"repo": "norihitoishida/adjust-posts",
"url": "https://github.com/norihitoishida/adjust-posts/issues/33",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2429502384
|
fix: BUG-8180/INC-B17940
Updated navigation to use 'replace' so that the ?code entry in history is replaced with the intended new URL without ?code
Hi @akshaykalaskar1,
So as I understand it, replace forces the page to reload, therefore running the process to remove code from the URL and load correctly. Is that right?
|
gharchive/pull-request
| 2024-07-25T09:36:56
|
2025-04-01T06:39:47.660373
|
{
"authors": [
"IanOvenden",
"akshaykalaskar1"
],
"repo": "norm-l/odx",
"url": "https://github.com/norm-l/odx/pull/641",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
146667484
|
Append custom regex to errorMatch for a project
Hi,
I am using a Makefile to compile an OCaml project. This makefile actually calls ocamlbuild, which is a wrapper around ocamlc/ocamlopt (the OCaml compilers).
The errors that are provided by those compilers follow this schema:
File "foo/bar.ml", line 277, characters 12-13:
Warning 26: unused variable g.
File "foo/bar.ml", line 280, characters 6-64:
Warning 34: unused type t.
File "foo/bar.ml", line 279, characters 0-6:
Error: Syntax error
I have a regex to match the warnings and errors:
(?<file>[\\/0-9a-zA-Z\\._\\-]+)", line (?<line>\\d+), characters (?<col>\\d+)-(?<col_end>\\d+):\\n(?<message>.+)
For the moment, I have a fork of build-make in which I added my regex to the errorMatch array.
I wonder if there is a way with .atom-build.{js,json,cson,yml} to specify my regex rather than to modify build-make?
I'm thinking about using preBuild to append my regex to the existing array, but I'm not sure this is the clean way to do it. Also, I would have to create a custom build command for all the targets in my makefile. This is not really convenient, I would like to use this regex for all the targets.
Or should I create a build provider for ocamlc/ocamlopt, so that even if I launch a make task, the regex from my provider will be applied? As my workflow (Makefile + ocamlopt) is common in the OCaml world, I would be happy to provide a solution that works for everyone.
I have the problem for ocaml now, but will probably have the same one with C/C++ or other languages/tools later. I am curious to know what is the best solution.
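For what it's worth, a project-local config along these lines might look roughly like this (a sketch, not a tested setup: the target name and command are made up, and errorMatch reuses the OCaml regex from above):

```javascript
// Hypothetical .atom-build.js for an OCaml project driven by make.
// errorMatch takes the same named-group regex strings build providers use.
const config = {
  cmd: 'make',
  name: 'make: all',
  errorMatch: [
    'File "(?<file>[\\/0-9a-zA-Z\\._\\-]+)", line (?<line>\\d+), characters (?<col>\\d+)-(?<col_end>\\d+):\\n(?<message>.+)'
  ]
};

module.exports = config;
```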
Also, subsidiary questions:
is there a way to display the warnings and the errors in a different way? (I suspect it is not possible because of this line)
can I access col and col_end from postBuild? I'd like to modify their value.
Thanks
Can you not make a PR to build-make with your regex?
Sure, I will do it today.
Oh, also saw your other questions now:
There is no way to differentiate between errors and warnings right now. I wouldn't mind a way of doing this, I just don't know the best way yet.
You cannot access col or col_end from postBuild, nor change them. Before enabling things like this I want to verify that it makes sense in all (or at least most) scenarios. I haven't thought too much about what one might want to achieve in postBuild; it's quite a new feature.
Ok, thanks for the answers and for your great tool :)
|
gharchive/issue
| 2016-04-07T15:54:54
|
2025-04-01T06:39:47.681752
|
{
"authors": [
"Khady",
"noseglid"
],
"repo": "noseglid/atom-build",
"url": "https://github.com/noseglid/atom-build/issues/384",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
325622538
|
Preact compatibility
Would you be open to a PR to make it compatible with Preact?
Just checking..
Unfortunately we have limited resources to make this work and keep up to date with all those awesome libs, so it is not in the plan yet.
Also, sorry for the late response.
Closing it for now.
|
gharchive/issue
| 2018-05-23T09:28:12
|
2025-04-01T06:39:47.683514
|
{
"authors": [
"nosir",
"viktor-izettle"
],
"repo": "nosir/cleave.js",
"url": "https://github.com/nosir/cleave.js/issues/353",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
902939635
|
How to check what moderator permissions client has?
I can see what permissions I (an account) have by a GET to https://www.reddit.com/subreddits/mine/moderator/.json and looking at the mod_permissions property on each subreddit.
I don't see this available through getMe() or getModeratedSubreddits() in snoowrap. And the only place permissions are mentioned in typings is for inviteModerator() and setModeratorPermissions() as arguments.
How, using snoowrap, can I check that the authenticated account has permission to perform mod actions on a subreddit, other than just trying and seeing if I get a 400/401/403 back?
It is done like this:
// getModerators returns a promise; the options argument is optional, but filtering by name ensures index 0 of the returned array is the user you are looking for
const mods = await snoowrap.getSubreddit('mySubreddit').getModerators({name: 'modName'});
const modPermissions = mods[0].mod_permissions;
|
gharchive/issue
| 2021-05-26T21:33:17
|
2025-04-01T06:39:47.689703
|
{
"authors": [
"FoxxMD"
],
"repo": "not-an-aardvark/snoowrap",
"url": "https://github.com/not-an-aardvark/snoowrap/issues/324",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
1375561584
|
Develop
Added get_guest_user_ISE_API.py
Added Headlines to the exportet csv File
|
gharchive/pull-request
| 2022-09-16T07:48:52
|
2025-04-01T06:39:47.826348
|
{
"authors": [
"nouse4it"
],
"repo": "nouse4it/ISE_API_scripts",
"url": "https://github.com/nouse4it/ISE_API_scripts/pull/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1939137735
|
feat: last-modified date colorizer
Adds ability to colorize by "last modified date".
Example: https://npmgraph-git-datecolorizer-npmgraph.vercel.app/?q=express#color=modified
I'm a little unsure whether to merge this. The "last modified date" is the last time the package record was modified in the NPM registry. That will generally be the date of the last version that was published, but could also be something unrelated (e.g. when a version was unpublished, or an owner changed, or... whatever). I'm concerned this could be more confusing or misleading than actually helpful.
Thoughts?
Closing unmerged. This is going to be more confusing than it will be useful, I think.
|
gharchive/pull-request
| 2023-10-12T04:08:20
|
2025-04-01T06:39:48.029590
|
{
"authors": [
"broofa"
],
"repo": "npmgraph/npmgraph",
"url": "https://github.com/npmgraph/npmgraph/pull/170",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
161699581
|
Bug: audio player and web vr icon aren't showing up on mobile by default
rotating the device causes the controls to show up.
Fixed.
|
gharchive/issue
| 2016-06-22T14:33:38
|
2025-04-01T06:39:48.031481
|
{
"authors": [
"TylerFisher",
"lindamood"
],
"repo": "nprapps/rockymountain",
"url": "https://github.com/nprapps/rockymountain/issues/100",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
118202136
|
Facebook login works, but profile view gives error in android emulator(works fine in plnkr, postman, even pasting url in browser)
I am able to login to Facebook and get back access_token , but when making angular $http request for viewing profile page, if throws error:
$http request code:
var url = "https://graph.facebook.com/me?access_token=" + $localStorage.accessToken;
$http({
method: 'GET',
url: url
}).then(function successCallback(response) {
alert("success");
alert(JSON.stringify(response.data));
}, function errorCallback(response) {
alert("error");
alert(JSON.stringify(response));
});
above code works properly in plnkr, postman and browser but throws error in android emulator:
error:
"{"data":"","status":404,"config":{"method":"GET","transformRequest":[null],"transformResponse":[null],"url":"https://graph.facebook.com/me?access_token=*****","headers":{"Accept":"application/json, text/plain, /"}},"statusText":"Not Found"}",
adding this : ionic plugin add cordova-plugin-whitelist
solved the problem
|
gharchive/issue
| 2015-11-21T15:11:52
|
2025-04-01T06:39:48.035500
|
{
"authors": [
"spartan1234"
],
"repo": "nraboy/ng-cordova-oauth",
"url": "https://github.com/nraboy/ng-cordova-oauth/issues/166",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
257276901
|
URL without www loads page, but has a CORS error for the data.
I.e., visiting https://rustaceans.org/findwork rather than https://www.rustaceans.org/findwork (and presumably the http versions) loads the website, but is stuck at loading... forever because fetching the data gives a CORS error.
Similarly, navigating to the http site (http://rustaceans.org/findwork) will lead you somewhere else entirely.
https://github.com/aturon/rfcs/blob/roadmap-2018/text/0000-roadmap-2018.md links to the broken page...
|
gharchive/issue
| 2017-09-13T06:59:22
|
2025-04-01T06:39:48.038923
|
{
"authors": [
"jwatt",
"nrc",
"skade"
],
"repo": "nrc/find-work",
"url": "https://github.com/nrc/find-work/issues/8",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
89409950
|
Rustfmt always recompiles
Rustfmt always recompiles on my machine, even if no source code has been altered. Is this happening for other people as well? If so, why does it happen and how can we prevent it?
It's also happening for me. It seems to be related to the build script. Commenting out the line build = "build.rs" prevents the rebuild when nothing has changed. (I don't know why?)
Interesting. Maybe it updates the last edit timestamp on default.toml in the root directory, which could trick Cargo into believing code may have changed?
Commenting out the line std::fs::copy(in_file, out_file).unwrap(); doesn't change anything. It's maybe a default behavior of Cargo to rebuild everything when there is a build script... needs more investigation.
Weird -- commenting out that line does the trick here...
The source of trouble was a symlink rustfmt/rustfmt -> rustfmt/target/debug/rustfmt. Cargo seems to consider all files under rustfmt/ to decide when to rebuild.
Did you create this symlink by hand?
Unfortunately, https://github.com/nrc/rustfmt/pull/111 does not prevent recompilation one when of the test files in /tests/source/ or /tests/target/ changes... Any ideas on how we could fix that?
Yes, I created the symlink by hand (for tests, because cargo run always recompiled).
I don't know how to avoid a rebuild when a non-source file is touched, but I maybe have an explanation why it occurs: the build script is intended to build external dependencies, but cargo doesn't know which files the script builds, so it takes a conservative approach: rebuild if any file changed. If you remove the build script from Cargo.toml, a change in test files doesn't trigger a rebuild anymore.
I think the good thing would be to remove the build script and use something like a deploy script, but it doesn't seem to exist (yet?). There is maybe another way: cargo is made to receive build commands from the stdout of the build script; there are maybe commands to indicate dependencies (but at first sight there isn't anything like this in the docs).
Removing the build script sounds good to me. Maybe we can add the default.toml to the root directory in the repository and also add it to .gitignore, so that editing it will not register as a change to the repository.
Might be worth asking on #cargo about this, seems like there ought to be a solution.
It's a known issue: https://github.com/rust-lang/cargo/issues/1162. No known workarounds at the moment. Maybe it's a nice issue to work on.
Build script was removed by #165.
|
gharchive/issue
| 2015-06-18T22:12:15
|
2025-04-01T06:39:48.044925
|
{
"authors": [
"cassiersg",
"marcusklaas",
"nrc"
],
"repo": "nrc/rustfmt",
"url": "https://github.com/nrc/rustfmt/issues/110",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
364303326
|
Reviewers Doc
Need a doc for reviewers. Not only syntax but the spirit of the project. Focusing on the workflow rather than just "automating the network". Being neat with diagrams and examples. Not only for reviewers but also so contributors know what to expect. Convey that our goal is to make this the first impression for automators, so it has to preserve that.
Closing in lieu of #17
|
gharchive/issue
| 2018-09-27T04:45:34
|
2025-04-01T06:39:48.046709
|
{
"authors": [
"Mierdin"
],
"repo": "nre-learning/antidote",
"url": "https://github.com/nre-learning/antidote/issues/41",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
807344772
|
Is it possible to get help to create a new pull request?
Hello,
I would like to create new pull requests (for #356 and #332), but I am not familiar with the branch/fork and pull technologies....
Is it possible to get help? (or better, someone creates the pull reqests directly!)
My changes are already well tested; documentation is also available.
Regards
Ess Bee
Hi!
did you make your changes on a cloned Git repository of brouter-web?
Hi @EssBee59, the procedure should be:
Click on Fork button on https://github.com/nrenner/brouter-web/ page:
Then clone your repository locally with: git clone git@github.com:EssBee59/brouter-web.git
Modify the code as you wish.
Commit with it with a brief explanation of the changes: git commit -am "Improve voice hint generation. Fix #356" for instance.
Push it: git push
Create a pull request by opening https://github.com/nrenner/brouter-web/compare/master...EssBee59:master
And voilà!
As I am not familiar with repositories I have a local installation on my PC.
So I make the changes locally, and test locally first.
When the tests are succesfull, I install the "dist" on an instance of brouter.de (brouter.de/essbee)
The changes are documented in both issues, by need I can deliver more documentation
Thanks for your help and regards
Ess Bee
Maybe it's easiest to create an archive of your local BRouter installation and provide it as download or send it as mail attachment so that someone else can integrate the patches and create the pull-requests. I've some time tomorrow and would take a look if it's ok with you.
mjaschen,
Thanks for your proposal, which I am glad to accept!!
Please send me a mail address where I can send my changes
Regards
Ess bee
mjaschen@gmail.com should work :-)
Thanks, all, for your help!
Some more information:
GitHub used to have nice, short step-by-step guides that I can't find anymore. They seem to have rewritten the docs, and now I fear this is too much for starting:
Collaborating with issues and pull requests - GitHub Docs
From a quick search this seems to be a nice guide:
How to make your first pull request on GitHub
There is also GitHub Desktop if you prefer using a UI over command line (Windows and MacOS, also as Linux Fork):
Setting up GitHub Desktop and Git
Cloning and forking repositories from GitHub Desktop
Committing and reviewing changes to your project
Pushing changes to GitHub
Creating an issue or pull request (missing!?)
|
gharchive/issue
| 2021-02-12T15:54:30
|
2025-04-01T06:39:48.058231
|
{
"authors": [
"EssBee59",
"bagage",
"mjaschen",
"nrenner"
],
"repo": "nrenner/brouter-web",
"url": "https://github.com/nrenner/brouter-web/issues/368",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2284309601
|
[nrf fromtree] net: if: Extend the usage of rejoining the multicast groups
It may happen that the interface is up but not yet running when we issue the rejoin_ipv6_mcast_groups(). This can be fixed by calling this function again right after the iface is set to 'running' state in the notify_iface_up handler.
Signed-off-by: Marcin Kajor marcin.kajor@nordicsemi.no
(cherry picked from commit b571e45d80923a786979460f218618e2000da8f3)
Green light on the corresponding sdk-nrf PR. We can merge this one I guess.
|
gharchive/pull-request
| 2024-05-07T22:09:56
|
2025-04-01T06:39:48.110895
|
{
"authors": [
"markaj-nordic"
],
"repo": "nrfconnect/sdk-zephyr",
"url": "https://github.com/nrfconnect/sdk-zephyr/pull/1686",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
1314889357
|
hotfix: add try-catch in setTimeout/setImmediate
What kind of change does this PR introduce? (Bug fix, feature, docs update, ...)
What is the current behavior? (You can also link to an open issue here)
What is the new behavior (if this is a feature change)?
Other information:
Status
[ ] Code documentation for the relevant parts in the code have been added/updated by the PR author
[ ] The functionality has been tested by the PR author
[ ] The functionality has been tested by NRK
Codecov Report
Merging #748 (379e4b3) into release41 (c80781a) will increase coverage by 0.02%.
The diff coverage is 16.66%.
@@ Coverage Diff @@
## release41 #748 +/- ##
=============================================
+ Coverage 69.86% 69.88% +0.02%
=============================================
Files 302 302
Lines 34428 34434 +6
Branches 4662 4664 +2
=============================================
+ Hits 24052 24065 +13
+ Misses 9925 9917 -8
- Partials 451 452 +1
Impacted Files / Coverage Δ:
meteor/server/api/userActions.ts: 20.97% <0.00%> (-0.28%) :arrow_down:
meteor/lib/api/userActions.ts: 100.00% <100.00%> (ø)
packages/job-worker/src/ingest/rundownInput.ts: 78.33% <0.00%> (+2.07%) :arrow_up:
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c80781a...379e4b3. Read the comment docs.
|
gharchive/pull-request
| 2022-07-22T11:30:06
|
2025-04-01T06:39:48.121775
|
{
"authors": [
"codecov-commenter",
"jstarpl"
],
"repo": "nrkno/sofie-core",
"url": "https://github.com/nrkno/sofie-core/pull/748",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
1606804457
|
Repository transfer
Hi @mechairoi, I got this updated in the PureScript Registry quicker than expected. :-)
At this point, I would like to migrate the repository to a different namespace, e.g. purescript-contrib, rowtype-yoga, or similar. This can distribute the maintenance burden and avoid either of us being a "single point of failure".
In the interim, though, would you please transfer this repository to my username?
Thanks again!
Thanks @nsaunders. I'm glad to hear that.
I have requested to transfer the repository to you. I think it's a good idea to migrate to a namespace like those.
Thanks again for your contribution!
Thank you @mechairoi!
|
gharchive/issue
| 2023-03-02T13:24:26
|
2025-04-01T06:39:48.185103
|
{
"authors": [
"mechairoi",
"nsaunders"
],
"repo": "nsaunders/purescript-unique",
"url": "https://github.com/nsaunders/purescript-unique/issues/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
54184849
|
SSH Deployer key based authentication not working
First things first: Thanks for this great plugin!
There seems to be a bug with SSH Deployer authentication when using SSH keys. Password authentication works fine; however, when selecting "Default Private Key" I cannot enter a username with which to authenticate with the server (even when using a default private key, it is still mandatory to specify the user that should be used on the remote host for authentication).
It is also not possible to circumvent the problem by selecting "Custom Private Key" (with which it is possible to specify a user name), as the path to the "custom identity file" is incorrectly resolved. It always tries to find the path one specifies relative to the buildagent work dir instead of the home dir, and it is not possible to specify an absolute path either, as that always just gets appended to the current working dir. Hence, specifying something like ~/.ssh/id_rsa does not work. Using %agent.home.dir% doesn't work either (it appends the home dir to the temp dir).
Bitbucket: https://bitbucket.org/nskvortsov/deployer/issue/23
Originally reported by: Sebastian Siemssen
Originally created at: 2014-01-07T14:29:06.547
fixed in this build
allow to enter username for default key
allow absolute paths to custom key
Original comment by: Nikita Skvortsov
|
gharchive/issue
| 2015-01-13T12:10:00
|
2025-04-01T06:39:48.200791
|
{
"authors": [
"nskvortsov"
],
"repo": "nskvortsov/deployer",
"url": "https://github.com/nskvortsov/deployer/issues/23",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
}
|
91729259
|
verifyCredentialsWithSuccessBlock not returning
So here goes - I had an old code of STTwitter not via cocoapods.
Using pods now.
The problem:
verifyCredentialsWithSuccessBlock is not returning: neither the success block nor the errorBlock is called.
It started with the latest version (0.2.1), so I thought it had something to do with the deprecation of the function.
Reverted to 0.2.0 (with patching of the extern to get it to link)
Still same thing.
When a user has already logged in in the past, I save the tokens in NSUserDefaults and retry with the following code.
Any assistance with investigating will be appreciated.
NSUserDefaults *prefs = [NSUserDefaults standardUserDefaults];
NSString *oauthAccessToken = [prefs stringForKey:@"twitterAccessToken"];
NSString *oauthAccessTokenSecret = [prefs stringForKey:@"twitterAccessTokenSecret"];
NSString *key = [[NSBundle mainBundle] objectForInfoDictionaryKey:@"twitterKey"];
NSString *secret = [[NSBundle mainBundle] objectForInfoDictionaryKey:@"twitterSecret"];
if (oauthAccessToken != nil) //we are logged in
{
STTwitterAPI *twitter = [STTwitterAPI twitterAPIWithOAuthConsumerKey:key consumerSecret:secret oauthToken:oauthAccessToken oauthTokenSecret:oauthAccessTokenSecret];
[twitter verifyCredentialsWithSuccessBlock:
^(NSString *username) {
//great... do stuff and login to the app
} errorBlock:^(NSError *error) {
NSLog(@"error verifying credentials: %@", error);
[self gotoLoginScreen];
}];
}
I updated the code, still not working.
From what I was able to dig in the code:
successBlock(username, userID); (ln 247 - in the success block of verifyCredentialsRemotelyWithSuccessBlock)
is getting called twice, with nil in username and userID,
and only afterwards does the GET request from Twitter return (ln 246 in STTwitterOAuth.m), and nothing is done with the result.
Not sure why :(
I'm confused. The following code works for me with the latest version of STTwitter. Can you double check that you're also using it? (by typing git pull)
NSString *oauthAccessToken = @"";
NSString *oauthAccessTokenSecret = @"";
NSString *key = @"";
NSString *secret = @"";
STTwitterAPI *twitter = [STTwitterAPI twitterAPIWithOAuthConsumerKey:key
consumerSecret:secret
oauthToken:oauthAccessToken
oauthTokenSecret:oauthAccessTokenSecret];
[twitter verifyCredentialsWithSuccessBlock:^(NSString *username) {
NSLog(@"-- username: %@", username);
} errorBlock:^(NSError *error) {
NSLog(@"-- error: %@", error);
}];
If it still doesn't work for you, can you please setup a minimal Xcode project, so that I'll be able to reproduce and fix the bug? Thank you for your help.
sorry for the noob question - how can I git pull if I'm using pod?
maybe you can git pull somewhere else, replace your current STTwitter directory with the new one and run the project again
P.S. you can get a pod from GIT by doing:
pod 'STTwitter', :git => 'https://github.com/nst/STTwitter.git'
Anyway - a small project works, so it is not the code itself... I'll look for differences between the projects.
Do you think it is related to the fact that STTwitter used to be included directly as code in my project? (leftover frameworks?)
Well... after commenting out verifyCredentialsLocallyWithSuccessBlock to see if it works, and figuring out it still gets executed, I just removed all pods and the pod cache, reinstalled, and now everything works....
I hate Cocoapods!!!!
Well, thanks again! I believe we found a real bug in the process so it wasn't all for nothing...
Glad to hear that.
Don't hesitate to report any other issue you may find in the future.
|
gharchive/issue
| 2015-06-29T07:56:42
|
2025-04-01T06:39:48.231973
|
{
"authors": [
"boazin",
"nst"
],
"repo": "nst/STTwitter",
"url": "https://github.com/nst/STTwitter/issues/199",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
}
|
272715174
|
License
Thanks for creating this library, it looks nice and simple to use 👍
I notice the mix.exs file says the license is MIT, but I don't see a LICENSE file in the repo.
Could you add one just to clarify.
Thanks again!
Thanks for the heads up @mbuhot. Missing LICENSE has been added. Will be available in the next hex release.
|
gharchive/issue
| 2017-11-09T20:34:41
|
2025-04-01T06:39:48.236933
|
{
"authors": [
"mbuhot",
"nsweeting"
],
"repo": "nsweeting/authex",
"url": "https://github.com/nsweeting/authex/issues/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
}
|
2598427125
|
Add cyrillic letters to sort tests
Fixes https://github.com/nt4f04uNd/sweyer/issues/158
Thanks for adding the tests.
Thanks for making them =)
I've seen this CI failure before, but I can't replicate it locally. My guess is that we start the initialization in a test, the test ends while the initialization is still waiting on an await point, and we then start a new test, overwriting the queue controller. For now I think you can just re-run the test, but a real solution is to either find a way to cancel / await all un-awaited futures that were started, or to not have singletons and instead pass in all controllers when creating the app.
Created an issue https://github.com/nt4f04uNd/sweyer/issues/162
|
gharchive/pull-request
| 2024-10-18T21:38:16
|
2025-04-01T06:39:48.239286
|
{
"authors": [
"Abestanis",
"nt4f04uNd"
],
"repo": "nt4f04uNd/sweyer",
"url": "https://github.com/nt4f04uNd/sweyer/pull/161",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
144081168
|
ResultView: Add action panel to text/html output
Closes #26
Just make the pre element clickable if it's a single line output.
I see a bug but I can't reproduce it reliably.
When it fails:
When it works:
The logs in the dev console look the same.
I will look at this more carefully. But I'm not sure if I'll be able to look at this tomorrow.
Thanks for taking a look.
I can't reproduce this. I always get the second output no matter how many times I try.
It happens more often when I use C-A-enter.
Strangely enough this doesn't happen on my machine.
Does it happen when you add back the previousText check at line 111 like this?
previousText = @getAllText()
container.appendChild htmlElement
if mimeType is 'text/plain'
text = @getAllText()
if previousText is '' and text.length < 50 and text.indexOf('\n')
@lgeiger I think that was it. After adding the check, I don't see it fail anymore.
It should work now.
I think it's easier for the user to be able to click anywhere in the container.
Another problem with the pre hack is that it breaks when a block with multiple stdouts is executed, e.g.:
print('A plot')
plt.plot(t, s)
OK I'll revert my changes except those directly related to #26.
All right, back to the basics :wink:
LGTM
|
gharchive/pull-request
| 2016-03-28T20:56:52
|
2025-04-01T06:39:48.245510
|
{
"authors": [
"lgeiger",
"n-riesco"
],
"repo": "nteract/hydrogen",
"url": "https://github.com/nteract/hydrogen/pull/232",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|
2721500635
|
Provide cell corners
Moin Nicolas,
First of all, thanks a lot for providing the healpix package!
To add HEALPix support to UXarray, an xarray extension for unstructured grids, we need a function that computes the vertices, and ideally also a mapping of the vertices to the cells.
I've played a bit with the healpix package, and found that the _uv functions will provide the corners if supplied with the correct arguments, as in this gist. From there, one can move on with unique functions as in this gist by @philipc2.
I'm sure this is not the most efficient way of generating the vertices, but I was wondering if you would be interested in supporting this operation in your library. I'd be more than happy to create a merge request if that helps.
Pinging @philipc2 and @erogluorhan, as they are core developers of UXarray.
Cheers
Flo
Hi Flo!
Thanks for getting in touch. It would be great if we added add a healpy-compatible function such as
def boundaries(nside, ipix, step=1, nest=False):
...
to the code. In the first instance, I think that using the pix2ang_uv functions at the Python layer, as in your examples, is perfectly fine. In a second step, it could make sense to add the boundaries() function directly to the C library. That might save a bit of overhead -- but it would anyway be good to have the Python functions in place to make the comparison.
How does that sound?
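For reference, a healpy-compatible boundaries() returns unit xyz vectors, so a second, angle-based function mostly needs a conversion back to (lon, lat) in degrees. A minimal pure-Python sketch (the helper name vec2lonlat is illustrative, not an existing API in either library):

```python
import math

def vec2lonlat(x, y, z):
    """Convert a vector on the unit sphere to (lon, lat) in degrees."""
    lon = math.degrees(math.atan2(y, x)) % 360.0
    lat = math.degrees(math.asin(z / math.sqrt(x * x + y * y + z * z)))
    return lon, lat

print(vec2lonlat(0.0, 0.0, 1.0))  # north pole: lat approximately 90
print(vec2lonlat(1.0, 0.0, 0.0))  # equator at lon 0: approximately (0, 0)
```

Applied row-wise to the output of boundaries(), this would give the (lon, lat) corner arrays UXarray needs.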
Thanks for pinging me @florianziemen!
We are excited to use this package to add support for HEALPix within UXarray. To add onto what Flo already mentioned:
UXarray is written around the UGRID conventions, which requires at least the vertices (node_lon, node_lat) and cell- vertex indicies(face_node_connectivity) to represent an arbitrary 2D unstructured grid. This would be for loading a HEALPix grid into UXarray.
Going the other way, converting from UGRID to HEALPix can be achieved using a nearest-neighbor remap; however, for conservative remapping, knowing the boundaries is important
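The deduplication step described above (turning per-cell corner lists into UGRID-style node arrays plus a face_node_connectivity table) can be sketched in pure Python. The helper name and the rounding tolerance are illustrative assumptions, not part of either library:

```python
def corners_to_ugrid(cell_corners, decimals=9):
    """Deduplicate per-cell (lon, lat) corners into node arrays plus
    a face -> node index table (UGRID-style connectivity)."""
    vertex_index = {}            # rounded (lon, lat) -> node id
    node_lon, node_lat = [], []
    face_node_connectivity = []
    for corners in cell_corners:
        face = []
        for lon, lat in corners:
            key = (round(lon, decimals), round(lat, decimals))
            if key not in vertex_index:
                vertex_index[key] = len(node_lon)
                node_lon.append(lon)
                node_lat.append(lat)
            face.append(vertex_index[key])
        face_node_connectivity.append(face)
    return node_lon, node_lat, face_node_connectivity

# Two quads sharing an edge: 6 unique nodes, not 8
cells = [
    [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)],
    [(1.0, 0.0), (2.0, 0.0), (2.0, 1.0), (1.0, 1.0)],
]
lon, lat, conn = corners_to_ugrid(cells)
print(len(lon))  # 6
print(conn)      # [[0, 1, 2, 3], [1, 4, 5, 2]]
```

A hash-based pass like this is O(n); for large grids a vectorized unique (as in the gist) would be preferable, but the connectivity it produces is the same.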
Hi,
sorry for the late reply. I've been at the AGU conference and took a couple days off afterwards. I've looked into the boundaries function. In principle, I know what to do to replicate it. I'm not sure we want to mimic exactly this function, as it kind of assumes a single point going in, and only goes to xyz (if I remember correctly).
My basic thought would be to provide the behavior of boundaries with a signature matching the original call for compatibility, and to provide a second function that produces angles in (theta/phi) and (lon/lat) notation.
I won't manage to do this before the holidays, but it's high on my todo list for early next year.
Cheers
Flo
|
gharchive/issue
| 2024-12-05T21:48:35
|
2025-04-01T06:39:48.252590
|
{
"authors": [
"florianziemen",
"ntessore",
"philipc2"
],
"repo": "ntessore/healpix",
"url": "https://github.com/ntessore/healpix/issues/66",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
1697671314
|
Speed up generate function
Describe what you would like to achieve
Provide a mechanism to speed up generation of large numbers of prefixes or ASNs.
Describe why the current solution (if any) is not satisfactory
When passing a large number of ASNs or Prefixes to the generate function, the process takes several hours.
Provide an example
time ./bin/bgpalerter-macos-x64 generate -m -l .output/prefixes.txt -o output/prefixes-test.yml
real 245m27.073s
user 8m16.969s
sys 1m10.959s
Your information
Ryan Harden: AS11537
Hi @ancker010,
What does prefixes.txt contain, and how many ASes are you monitoring?
prefixes.txt is a list of prefixes, one per line, currently 6887.
And approximately 1045 ASNs.
|
gharchive/issue
| 2023-05-05T13:58:54
|
2025-04-01T06:39:48.279506
|
{
"authors": [
"ancker010",
"massimocandela"
],
"repo": "nttgin/BGPalerter",
"url": "https://github.com/nttgin/BGPalerter/issues/1082",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
}
|
619448061
|
How can I change the English in the listing caption to Chinese?
I'm using the listings package and inserted a code block in the following format:
\begin{lstlisting}[caption={不透明谓词示例一}, label={tab:opaque1}, frame=single]
xxxxxx
\end{lstlisting}
After compiling, the output looks like this:
How can I change the word "Listing" here to Chinese?
Also, no matter how I set the label, it never seems to be referenced by \autoref.
The template doesn't currently adapt common packages (my excuse for being lazy). You can add the following to global.tex:
\renewcommand{\lstlistingname}{演算法}
\newcommand\lstlistingautorefname{算法}
Of course, the names above are just examples; please change them to whatever suits your needs.
Thank you very much!
|
gharchive/issue
| 2020-05-16T11:18:47
|
2025-04-01T06:39:48.291216
|
{
"authors": [
"40m41h42t",
"yzwduck"
],
"repo": "nuaatug/nuaathesis",
"url": "https://github.com/nuaatug/nuaathesis/issues/38",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
}
|
259382873
|
Switch to Encodable for JSONification
Now that Swift 4 is out, we have Codable! Yaaaay!
Except that it wasn't meant for situations where you use some structs and some random dictionaries. It really doesn't like trying to encode [String: Any]s or even [String: Encodable]s (trying to encode either with the default implementation throws a FatalError saying fatal error: Array<Encodable> does not conform to Encodable because Encodable does not conform to itself. You must use a concrete type to encode or decode.)
Is the new implementation hackish? A little. But I think it's a little less hackish than the one it's replacing. Also, we can finally follow the Swift API Guidelines on acronyms (I'm talking about you, Id and Url) without an overly complicated snakecase function (though that's not implemented here).
Yeah, whenever conditional conformance arrives it should get rid of the hacky things we have to do here, since it'll let the stdlib handle these.
@tellowkrinkle Have you toyed around with trying bring deserialization to the holding types? It would be fantastic if we could get rid of all that crufty stuff we have now.
Although thinking about it more, I guess conditional conformance really won't help with the case where it's just Any, since it can't know what to encode. And from toying around, it doesn't look like you can define a custom protocol that just declares Codable conformance to use, which is kind of a bummer.
I haven't looked into deserialization much yet. I do know that Codable doesn't support autogenerating with defaults, so we'll still need to manually implement init(from decoder:).
|
gharchive/pull-request
| 2017-09-21T05:53:46
|
2025-04-01T06:39:48.301601
|
{
"authors": [
"nuclearace",
"tellowkrinkle"
],
"repo": "nuclearace/SwiftDiscord",
"url": "https://github.com/nuclearace/SwiftDiscord/pull/51",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
}
|