| added (string, dates 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | created (timestamp[us], dates 2001-10-09 16:19:16 to 2025-01-01 03:51:31) | id (string, lengths 4 to 10) | metadata (dict) | source (string, 2 classes) | text (string, lengths 0 to 1.61M) |
|---|---|---|---|---|---|
2025-04-01T06:38:44.156706
| 2015-09-09T09:04:03
|
105553008
|
{
"authors": [
"BasvdM",
"fabiogel",
"frapontillo"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6093",
"repo": "frapontillo/angular-bootstrap-duallistbox",
"url": "https://github.com/frapontillo/angular-bootstrap-duallistbox/issues/13"
}
|
gharchive/issue
|
info-all = 'false' attribute not working
Hi!
Thanks for creating this angular version of the duallistbox.
I tried to use the info-all = 'false' setting, but instead of hiding the info text it shows the text 'false' above the dual list box.
I changed the code myself by adding an extra transformFn to the infoAll attribute.
var getFalseOrStringValue = function (attributeValue) {
if (attributeValue === false || attributeValue === 'false') return false;
return attributeValue;
};
...and...
'infoAll': {
changeFn: 'setInfoText',
defaultValue: 'Showing all {0}',
transformFn: getFalseOrStringValue
},
I am not keen on making my own changes. If you could provide a fix I would be very grateful.
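The transformFn described above can be exercised in isolation; a minimal sketch (the expected behavior is inferred from the report, not from the library's test suite):

```javascript
// Map the string 'false' (and boolean false) to false so the info text
// is hidden; pass every other value through unchanged.
var getFalseOrStringValue = function (attributeValue) {
  if (attributeValue === false || attributeValue === 'false') return false;
  return attributeValue;
};

console.log(getFalseOrStringValue('false'));           // false
console.log(getFalseOrStringValue('Showing all {0}')); // "Showing all {0}"
```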
Hello BasvdM! You can use a special character such as ''.
It works for me. Cya
Hi,
Thanks for your reply. Unfortunately I don't understand your answer....
What should I do with the soft hyphen?
Gr. Bas
Wow, sorry! Use info-all="".
Can I see a live demo of the issue please?
|
2025-04-01T06:38:44.270908
| 2017-06-19T00:30:34
|
236755757
|
{
"authors": [
"louiss0",
"nmdoliveira"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6095",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/issues/15437"
}
|
gharchive/issue
|
Counting Cards
Challenge Counting Cards has an issue.
User Agent is: Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36.
Please describe how to reproduce this issue, and include links to screenshots if possible.
My code:
var count = 0;
function cc(card) {
// Only change code below this line
switch(card) {
case 2:
case 3:
case 4:
case 5:
case 6:
count+=1;
break;
case 10:
case 'J':
case 'Q':
case 'K':
case 'A':
count-=1;
break;
}
return count + (count > 0 ? " Bet " : " Hold ");
// Only change code above this line
}
// Add/remove calls to test your function.
// Note: Only the last will display
cc(2); cc(3); cc(7); cc('K'); cc('A');
I got "0 Hold" but it didn't go through.
You have extra spaces after "Bet" and "Hold".
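With the trailing spaces removed as suggested above, the function might read as follows (a sketch based on the reporter's code, not the verified challenge solution):

```javascript
// Card-counting sketch: low cards (2-6) raise the count, high cards
// (10, J, Q, K, A) lower it; 7-9 leave it unchanged.
var count = 0;
function cc(card) {
  switch (card) {
    case 2: case 3: case 4: case 5: case 6:
      count += 1;
      break;
    case 10: case 'J': case 'Q': case 'K': case 'A':
      count -= 1;
      break;
  }
  // No padding around "Bet"/"Hold", so the output is e.g. "5 Bet" or "0 Hold".
  return count + (count > 0 ? " Bet" : " Hold");
}

cc(2); cc(3); cc(7); cc('K');
console.log(cc('A')); // "0 Hold"
```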
|
2025-04-01T06:38:44.272904
| 2019-07-04T11:03:00
|
464205419
|
{
"authors": [
"raisedadead",
"sadikn"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6096",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/issues/36372"
}
|
gharchive/issue
|
Login 401 error
I am trying to set up a login in the application using the LoopBack login, and it's throwing a 401 Unauthorized error. Can you guys help?
Closing this for lack of sufficient information. Can you please elaborate in the comments below with detailed steps for how you see the error? Please make sure you fill out the information template presented to you while opening the issue:
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.
|
2025-04-01T06:38:44.282178
| 2019-11-10T22:38:25
|
520669122
|
{
"authors": [
"RandellDawson",
"allison-strandberg",
"moT01",
"ojeytonwilliams"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6097",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/issues/37734"
}
|
gharchive/issue
|
Confusing task in Basic JavaScript: Comparisons with the Logical And Operator
Describe your problem and how to reproduce it:
For this challenge, the starting code in my editor contains the following lines:
if (val) {
if (val) {
return "Yes";
}
}
The challenge is
Combine the two if statements into one statement which will return "Yes" if val is less than or equal to 50 and greater than or equal to 25. Otherwise, it will return "No".
Since the idea behind this challenge is the ability to combine conditions with the && operator, I would expect the starting code to instead read:
if (val <= 50) {
if (val >= 25) {
return "Yes";
}
}
Add a Link to the page with the problem:
https://www.freecodecamp.org/learn/javascript-algorithms-and-data-structures/basic-javascript/comparisons-with-the-logical-and-operator
Tell us about your browser and operating system:
Browser Name: Chrome
Browser Version: 76.0.3809.100
Operating System: MacOS Sierra
If possible, add a screenshot here (you can drag and drop, png, jpg, gif, etc. in this box):'
The instructions tell you what to do regardless of the seed code. If the seed code was provided as you describe, it would not be much of a challenge to solve.
I think the point of the exercise is to demonstrate the usage of the && operator, so even though it's easier with my suggestion, it's not trivial. If you'd rather not give that hint, though, there's no reason for the nested if statements in the seed code, and I was confused by them. The seed code should read:
if (val) {
return "Yes";
}
The point of the exercise is two-fold. One is to teach the user how to use the && operator. The other is to have the user figure out how to achieve the same logic with the && operator as without it, using the nested if statements. I do not see any reason to change this challenge.
I will let the other collaborators/mods give their opinions on this issue.
I'm with @RandellDawson on this one - I don't think the seed needs changing. I redid the challenge and found that the nested if statements were helpful. You can create the intermediate solution:
function testLogicalAnd(val) {
// Only change code below this line
if (val >= 25) {
if (val <= 50) {
return "Yes";
}
}
// Only change code above this line
return "No";
}
if you're more comfortable with nested if statements and that will pass all but the first two tests. This allows you to confirm that you've got the logic right before refactoring it into a single conditional.
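Once the intermediate nested version passes, the refactor into a single conditional that the challenge asks for might look like this (a sketch of the usual && solution, not quoted from the curriculum):

```javascript
function testLogicalAnd(val) {
  // Only change code below this line
  // Both bounds checked in one statement with the && operator.
  if (val >= 25 && val <= 50) {
    return "Yes";
  }
  // Only change code above this line
  return "No";
}

console.log(testLogicalAnd(30)); // "Yes"
console.log(testLogicalAnd(24)); // "No"
```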
The only thing I would consider changing is
Combine the two if statements into one statement
to
Replace the two if statements with one statement, using the && operator,
since 'combine' slightly implies that the new statement is built out of the original two, which isn't really the case here.
Doing that should make it less confusing without watering the task down.
Thanks @ojeytonwilliams: I like your suggestion to change the wording to
Replace the two if statements with one statement, using the && operator,
Any interest in creating a PR for that @allison-strandberg?
Thank you for the link @moT01!
|
2025-04-01T06:38:44.289325
| 2020-02-06T04:09:37
|
560766651
|
{
"authors": [
"RandellDawson",
"andrescaroc",
"moT01"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6098",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/issues/38171"
}
|
gharchive/issue
|
Broken link on Redux Challenge Intro Page
Describe your problem and how to reproduce it:
1) Go to: https://www.freecodecamp.org/learn/front-end-libraries/redux/
2) Find this text: "Improve this intro on GitHub."
3) Link to GitHub to improve the intro is broken
Add a Link to the page with the problem:
https://www.freecodecamp.org/learn/front-end-libraries/redux/
Tell us about your browser and operating system:
Browser Name: Chrome
Browser Version: 79
Operating System: Mac OS 10.13.6
If possible, add a screenshot here (you can drag and drop, png, jpg, gif, etc. in this box):
I am not even sure why that last sentence is present. I think it is best if it is removed completely from the introduction. If you see any more links like this, please add them to this issue, so someone can create a PR to fix them all at the same time.
What do you think about removing the link to the redux website @RandellDawson? (the first word in the intro)
@moT01 I think for now, we can leave it alone. I think the discussion of external links should be on a separate issue (if there is not already an outstanding issue). I thought I had an open issue at one point about external links. It may have gotten closed at some point (can't remember why).
If you see any more links like this, please add them to this issue, so someone can create a PR to fix them all at the same time.
Hey guys, the same issue is happening in the React and Redux section, please check: https://www.freecodecamp.org/learn/front-end-libraries/react-and-redux/
|
2025-04-01T06:38:44.292830
| 2021-02-15T15:24:07
|
808626218
|
{
"authors": [
"RandellDawson",
"sondregi"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6099",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/issues/41130"
}
|
gharchive/issue
|
No image at coding challenge "Add a Text Alternative to Images for Visually Impaired Accessibility"
Describe your problem and how to reproduce it:
No image at coding challenge
Add a Link to the page with the problem:
https://www.freecodecamp.org/learn/responsive-web-design/applied-accessibility/add-a-text-alternative-to-images-for-visually-impaired-accessibility
Tell us about your browser and operating system:
Browser Name: Brave
Browser Version: V1.20.103
Operating System: MacOS Big Sur v11.1
If possible, add a screenshot here (you can drag and drop, png, jpg, gif, etc. in this box):
Duplicate of issue #41131, so closing.
|
2025-04-01T06:38:44.297907
| 2017-12-23T07:35:48
|
284292210
|
{
"authors": [
"profoundhub"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6100",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/pull/16287"
}
|
gharchive/pull-request
|
Master: update my repo
Pre-Submission Checklist
[ ] Your pull request targets the staging branch of freeCodeCamp.
[ ] Branch starts with either fix/, feature/, or translate/ (e.g. fix/signin-issue)
[ ] You have only one commit (if not, squash them into one commit).
[ ] All new and existing tests pass the command npm test. Use git commit --amend to amend any fixes.
Type of Change
[ ] Small bug fix (non-breaking change which fixes an issue)
[ ] New feature (non-breaking change which adds new functionality)
[ ] Breaking change (fix or feature that would change existing functionality)
[ ] Add new translation (feature adding new translations)
Checklist:
[ ] Tested changes locally.
[ ] Addressed currently open issue (replace XXXXX with an issue no in next line)
Closes #XXXXX
Description
Oops, made that by mistake.
|
2025-04-01T06:38:44.300972
| 2018-10-21T00:57:36
|
372267452
|
{
"authors": [
"TrollzorFTW",
"ezioda004"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6101",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/pull/24848"
}
|
gharchive/pull-request
|
Typo on strcpy
[x] I have read freeCodeCamp's contribution guidelines.
[x] My pull request has a descriptive title (not a vague title like Update index.md)
[x] My pull request targets the master branch of freeCodeCamp.
[x] None of my changes are plagiarized from another source without proper attribution.
[x] My article does not contain shortened URLs or affiliate links.
If your pull request closes a GitHub issue, replace the XXXXX below with the issue number.
Closes #XXXXX
Thank you @TrollzorFTW for your contribution and congratulations on your first PR merge to this repo! 🎉
Hope to see more contributions from you in the future.
|
2025-04-01T06:38:44.303987
| 2018-10-29T11:57:45
|
374981280
|
{
"authors": [
"CodingHero97",
"RandellDawson",
"thecodingaviator"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6102",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/pull/31715"
}
|
gharchive/pull-request
|
Added brief description and example
[x] I have read freeCodeCamp's contribution guidelines.
[x] My pull request has a descriptive title (not a vague title like Update index.md)
[x] My pull request targets the master branch of freeCodeCamp.
[x] None of my changes are plagiarized from another source without proper attribution.
[x] My article does not contain shortened URLs or affiliate links.
If your pull request closes a GitHub issue, replace the XXXXX below with the issue number.
Closes #XXXXX
Closed and Reopened this PR to attempt to resolve a specific Travis build failure.
Changes to this file have been made by #28260, in favour of which I'm closing this PR.
|
2025-04-01T06:38:44.309080
| 2019-09-05T13:21:43
|
489752486
|
{
"authors": [
"ahmadabdolsaheb",
"moT01",
"raisedadead"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6103",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/pull/36760"
}
|
gharchive/pull-request
|
fix(style): rework color scheme for icons
I reworked the colors of our icons to go with the new theme. I'm not sure if that was wanted - I think they look pretty good - see photos.
[x] I have read freeCodeCamp's contribution guidelines.
[x] My pull request has a descriptive title (not a vague title like Update index.md)
[x] My pull request targets the master branch of freeCodeCamp.
[x] None of my changes are plagiarized from another source without proper attribution.
[x] All the files I changed are in the same world language (for example: only English changes, or only Chinese changes, etc.)
[x] My changes do not use shortened URLs or affiliate links.
Closes #36666
Not related, but could you add a 1-2px padding or margin between those test lists? I have always hated them getting squished like that.
We could even add more space.
Alternate background colors for list items could be re-evaluated.
See #36681 mocks for reference.
we could also experiment with smaller icons for the tests
I added my changes to the PR in case anyone wants to see the code or test it locally.
it looks much better already
|
2025-04-01T06:38:44.314218
| 2022-07-28T16:35:55
|
1321202691
|
{
"authors": [
"JohnathanTWebster",
"Sboonny"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6104",
"repo": "freeCodeCamp/freeCodeCamp",
"url": "https://github.com/freeCodeCamp/freeCodeCamp/pull/47064"
}
|
gharchive/pull-request
|
Fix(curriculum) accessibility issue with labels in registration form course
Checklist:
[x] I have read freeCodeCamp's contribution guidelines.
[x] My pull request has a descriptive title (not a vague title like Update index.md)
[x] My pull request targets the main branch of freeCodeCamp.
[x] I have tested these changes either locally on my machine, or GitPod.
Closes #46593
Adjusted all steps starting from when labels are first introduced to incorporate accessibility formatting.
hi @JohnathanTWebster, the files have been changed after you worked on them, which led to a merge conflict.
Clicking "use the command line" will open instructions on how to fix the conflict.
If you have any questions, feel free to ask in the 'Contributors' category on our forum or the contributors chat room.
Fairly confident I just removed my changes
@Sboonny I just adjusted the pull request. Sorry, it wasn't cooperating, but this pull should be correct.
@JohnathanTWebster no worries. I have seen the new pull request; there are issues with it, but I am not able to describe them well.
I have contacted another mod, and hopefully they will guide you better.
|
2025-04-01T06:38:44.375252
| 2016-07-28T02:39:24
|
168001771
|
{
"authors": [
"arderyp",
"mlissner"
],
"license": "bsd-2-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6105",
"repo": "freelawproject/juriscraper",
"url": "https://github.com/freelawproject/juriscraper/pull/146"
}
|
gharchive/pull-request
|
Fixing vt_p
There are some bogus records at the bottom of the page. There are a lot of cases on the page in general, 130+. Since we are only concerned with the newest cases, I've added a limit to extract, at most, the 100 most recent cases. This solves the problem presented by the bogus records and reduces the network load a bit. That being said, the bogus records look like they are valid opinions that were simply improperly entered. I will try to reach out to the court to have them fix it. Added new text file.
Good solution. Merged!
I should have commented here as well (I did in Slack), but they fixed those bogus records.
|
2025-04-01T06:38:44.395686
| 2017-12-06T19:13:03
|
279872437
|
{
"authors": [
"joshglick",
"npmccallum"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6106",
"repo": "freeotp/freeotp-ios",
"url": "https://github.com/freeotp/freeotp-ios/pull/97"
}
|
gharchive/pull-request
|
Fix apparent swift issues with swift 4
I had issues when cloning, so I fixed them. I'd be happy to find out this is an unnecessary PR.
I fixed these in my repo on my computer at home, but I forgot to push them. Also, I'm pretty sure that your attempt at fixing FreeOTP/TokenStore.swift breaks compatibility with anyone using the code from master. This isn't necessarily a deal-breaker because we haven't released this code yet. I'm also not excited to learn that NSCoding seems to be so fragile.
That's good to hear. I thought I was going crazy when I cloned and re-cloned and still had build issues. I'm doing a clean fork to use for a newer project I am working on, so I didn't have issues with the token store, but I'm happy to try and figure out how to fix that for others.
Or to cancel my PR if you have a working version already that you'd like to push.
@joshglick What are these new commits?
Sorry! These were pushed to the wrong remote, closing now
@joshglick I'm still curious what you're working on. Are you able to talk about it?
I actually can't talk about it quite yet, but I will let you know once we are ready to make something more public.
|
2025-04-01T06:38:44.402678
| 2017-05-25T20:02:21
|
231443494
|
{
"authors": [
"codecov-io",
"raulraja"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6107",
"repo": "frees-io/freestyle",
"url": "https://github.com/frees-io/freestyle/pull/327"
}
|
gharchive/pull-request
|
My apologies for this println
:(
Codecov Report
Merging #327 into master will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #327 +/- ##
======================================
Coverage 82.4% 82.4%
======================================
Files 25 25
Lines 216 216
Branches 2 2
======================================
Hits 178 178
Misses 38 38
|
2025-04-01T06:38:44.456574
| 2016-11-01T22:56:16
|
186670156
|
{
"authors": [
"fsasaki",
"jnehring",
"m1ci",
"sandroacoelho",
"xFran"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6108",
"repo": "freme-project/freme-ner",
"url": "https://github.com/freme-project/freme-ner/issues/160"
}
|
gharchive/issue
|
nif:taMsClassRef
@m1ci, @koidl have requested to fill the property nif:taMsClassRef with the most specific type based on the DBpedia ontology.
E.g:
<http://freme-project.eu/#offset_0_14>
a nif:OffsetBasedString , nif:Phrase ;
nif:anchorOf "Diego Maradona"^^xsd:string ;
nif:annotationUnit [ a nif:EntityOccurrence ;
itsrdf:taAnnotatorsRef <http://freme-project.eu/tools/freme-ner> ;
itsrdf:taClassRef <http://nerd.eurecom.fr/ontology#Person> , <http://dbpedia.org/ontology/SoccerManager> , <http://dbpedia.org/ontology/Agent> , <http://dbpedia.org/ontology/SportsManager> , <http://dbpedia.org/ontology/Person> ;
nif:taMsClassRef <http://dbpedia.org/ontology/SoccerManager> ;
itsrdf:taConfidence "0.9869992701528016"^^xsd:double ;
itsrdf:taIdentRef <http://dbpedia.org/resource/Diego_Maradona>
] ;
nif:beginIndex "0"^^xsd:nonNegativeInteger ;
nif:endIndex "14"^^xsd:nonNegativeInteger ;
nif:referenceContext <http://freme-project.eu/#offset_0_33> .
Basically, it can be retrieved with the following SPARQL query:
SELECT ?type WHERE {
<http://dbpedia.org/resource/Diego_Maradona> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
FILTER NOT EXISTS {
?subtype ^a <http://dbpedia.org/resource/Diego_Maradona> ;
rdfs:subClassOf ?type .
}
FILTER regex(str(?type), "dbpedia.org/ontology/")
}
Hi @m1ci: I ran the same query at http://rv2622.1blu.de:8890/sparql and did not get the same result as the DBpedia SPARQL endpoint. I would bet that this problem is related to our indexed data. Just to ensure that I am on the right track, could you please check if my SPARQL is correct?
Thank you
Hi @sandroacoelho
the query at the FREME sparql endpoint did not work because the dbpedia ontology wasn't loaded and the required subClassOf statements were missing. Now it should work.
Try http://rv2622.1blu.de:8890/sparql
@sandroacoelho can you now implement so that we have the nif:taMsClassRef property in the NIF output?
thanks!
Hi @m1ci. As I promised, nif:taMsClassRef is already implemented. Could you please test it?
Best
I don't see the nif:taMsClassRef in the output, see
https://api-dev.freme-project.eu/current/e-entity/freme-ner/documents?language=en&dataset=dbpedia&mode=all&outformat=turtle&informat=text&input=Diego Maradona is from Argentina.&nif-version=2.1
Hi @m1ci ,
Checking Jenkins, I saw that the main jar is not using SNAPSHOTS. The build is using our last stable version, 0.11, which does not contain this new feature.
I forced it to give you a chance to test - (Note: This is wrong and should be reversed as soon as possible) .
At SPARQLProcessor.java, the address "http://www.freme-project.eu/datasets/types" is used as defaultGraph. Could you please load the DBpedia ontology inside it?
Best,
At SPARQLProcessor.java, the address "http://www.freme-project.eu/datasets/types" is used as defaultGraph. Could you please load the DBpedia ontology inside it?
Can you try without specifying the default graph?
Just
QueryExecution qexec = QueryExecutionFactory.sparqlService(this.endpoint, query);
Hi @m1ci , Done!
Thanks, however for https://api-dev.freme-project.eu/current/e-entity/freme-ner/documents?language=en&dataset=dbpedia&mode=all&outformat=turtle&informat=text&input=Diego Maradona is from Argentina.&nif-version=2.1
I get one type for Diego Maradona, which is fine, but two types for Argentina - problem. Why are there two types for Argentina?
I found the reason why:
actually, in the dbpedia ontology there is
<http://dbpedia.org/ontology/Place> <http://www.w3.org/2002/07/owl#equivalentClass> <http://dbpedia.org/ontology/Location> .
which means they are the same type. In order to return only one, just add LIMIT 1 at the end of the query. That will solve the issue.
I found why: there is no subclass relation between http://dbpedia.org/ontology/Country and http://dbpedia.org/ontology/Location, and Location is also sameAs Place. We need to fix the SPARQL query.
By definition, our query retrieves leaves (types that do not have subtypes) as the most specific types in the DBpedia ontology.
For Argentina, we have two "leaf" types, and this could happen with others.
If we want just one resource to fill in nif:taMsClassRef, we could:
Take the first (I don't like this solution because it is not deterministic);
Define filters to decide what type we should select to be the nif:taMsClassRef.
Best,
No, it's one leaf and it's Country. Location is a superclass.
Here is the solution
SELECT ?type WHERE {
<http://dbpedia.org/resource/Argentina> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> ?type .
FILTER NOT EXISTS {
<http://dbpedia.org/resource/Argentina> a ?subtype .
?subtype rdfs:subClassOf|owl:equivalentClass ?type .
}
FILTER regex(str(?type), "dbpedia.org/ontology/")
}
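The selection this query performs (keep only the types that no other type of the resource is a subclass of, or equivalent to) can be sketched in plain code; the function and data below are a hypothetical illustration, not the actual DBpedia ontology or FREME NER code:

```javascript
// Keep a type only if no *other* type of the resource is its subclass
// or its equivalent class (mirroring the FILTER NOT EXISTS pattern).
function mostSpecificTypes(types, subClassOf, equivalentTo) {
  return types.filter(function (type) {
    return !types.some(function (other) {
      return other !== type &&
        ((subClassOf[other] || []).indexOf(type) !== -1 ||
         (equivalentTo[other] || []).indexOf(type) !== -1);
    });
  });
}

// Hypothetical type data for "Argentina":
var types = ['Country', 'PopulatedPlace', 'Place', 'Location'];
var subClassOf = { Country: ['PopulatedPlace'], PopulatedPlace: ['Place'] };
var equivalentTo = { Place: ['Location'], Location: ['Place'] };

console.log(mostSpecificTypes(types, subClassOf, equivalentTo)); // ['Country']
```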
@sandroacoelho please update the query
@m1ci : done!
thanks @sandroacoelho
@koidl @xFran please test, for example: https://api-dev.freme-project.eu/current/e-entity/freme-ner/documents?language=en&dataset=dbpedia&mode=all&outformat=turtle&informat=text&input=Diego Maradona is from Argentina.&nif-version=2.1
the query at the FREME sparql endpoint did not work because the dbpedia ontology wasn't loaded and the required subClassOf statements were missing. Now it should work.
Do we need to update the dataset dumps for the docker installation because of that?
Oh, forget the last comment; I just read https://github.com/freme-project/freme-docker/issues/17#issuecomment-258323248, which says we need to update the dataset.
@sandroacoelho do we need to update the dataset? I think it is irrelevant in which graph the datasets are loaded; we query all data in all graphs. Correct me if I'm wrong.
@m1ci it is working with turtle but not with json-ld
https://api-dev.freme-project.eu/current/e-entity/freme-ner/documents?language=en&dataset=dbpedia&mode=all&outformat=json-ld&informat=text&input=Diego Maradona is from Argentina.
I can't see nif:taMsClassRef or something similar in the response.
Hi, @xFran: I will check our jsonld.
It works with NIF version 2.1, see http://tinyurl.com/jedem72
Works with NIF version 2.1 indeed.
https://api-dev.freme-project.eu/current/e-entity/freme-ner/documents?language=en&dataset=dbpedia&mode=all&outformat=json-ld&informat=text&input=Diego Maradona is from Argentina.&nif-version=2.1
Thank you @fsasaki
I have a few questions about this feature:
Would it be complicated to add a configuration option to switch off the feature? I guess it has a huge impact on the performance of NER.
What happens when the data for this feature is not contained in the triple store? Does it produce no MFS value? Or an error message?
Is there any documentation about the feature? We should add a (brief) documentation to https://freme-project.github.io/knowledge-base/freme-for-api-users/freme-ner.html
Would it be complicated to add a configuration option to switch off the feature? I guess it has a huge impact on the performance of NER.
I would not complicate things, and would provide this info always. In other words, leave it as it is.
What happens when the data for this feature is not contained in the triple store? Does it produce no MFS value? Or an error message?
If there is no data, then we don't provide the mfs value. If there is, then we provide it. Also, if there is just one type, then we list this type as the most-specific type and also as itsrdf:taClassRef.
Is there any documentation about the feature? We should add a (brief) documentation to https://freme-project.github.io/knowledge-base/freme-for-api-users/freme-ner.html
I'm afraid this is not documented. Please add a section to the doc called "NIF Output explained" and explain each piece of information (1-2 sentences).
| Would it be complicated to add a configuration option to switch off the feature? I guess it has a huge impact on the performance of NER.
| I would not complicate the things. And provide this info always. In other words, leave as it is.
Ok
thanks for the information
I tested, and this is installed on freme-live as well. The documentation issue has been moved to https://github.com/freme-project/freme-project.github.io/issues/320. So this issue is done.
|
2025-04-01T06:38:44.492189
| 2024-09-30T14:49:56
|
2556917727
|
{
"authors": [
"Marenz"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6109",
"repo": "frequenz-floss/frequenz-dispatch-python",
"url": "https://github.com/frequenz-floss/frequenz-dispatch-python/pull/62"
}
|
gharchive/pull-request
|
Hack-Fix bug that eventloop leaks into other tests
This is so far the only reliable way to make the bug not happen.
Unfortunately it also happens with this
|
2025-04-01T06:38:44.494379
| 2020-01-07T15:36:05
|
546351018
|
{
"authors": [
"JGwilliams",
"bguidolim",
"steventnorris-40AU"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6110",
"repo": "freshOS/Then",
"url": "https://github.com/freshOS/Then/issues/66"
}
|
gharchive/issue
|
Cocoapods support
Hi!
I saw that Cocoapods support is deprecated now; I would like to suggest adding it back, since it doesn't change the way you develop for SPM.
This is a very useful library, and Cocoapods is still in use by a huge number of projects out there; it would be nice to keep using it the way it is.
Thanks.
Well, at least set it as deprecated on the cocoapods-specs repo.
I second this. Adding Pods support back in would be much appreciated. We have several projects that are using Cocoapods, and I'd like to not mix my dependency management.
I third this. I've inherited a codebase that has a third-party Cocoapod library which includes Then as a dependency, along with other Cocoapods. This is making it difficult - and maybe impossible - for me to compile since I can't find a way to get Cocoapods to work with SPM dependencies.
|
2025-04-01T06:38:44.520252
| 2013-08-04T19:46:58
|
17612051
|
{
"authors": [
"rgrp"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6111",
"repo": "frictionlessdata/ideas",
"url": "https://github.com/frictionlessdata/ideas/issues/63"
}
|
gharchive/issue
|
Google spreadsheet data package (simple data format) example
What data?
Timeline data
Spend data example (should be easy to get one)
...?
Would be nice to try that out with the create tool as well
DUPLICATE of #31
|
2025-04-01T06:38:44.599404
| 2021-09-06T04:54:44
|
988764888
|
{
"authors": [
"joelthe1"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6112",
"repo": "friendsofagape/autographa",
"url": "https://github.com/friendsofagape/autographa/issues/286"
}
|
gharchive/issue
|
Scroll USFM Editor Reference
On click of a verse in the USFM editor component, the reference USFM editor should scroll to that verse and highlight it. Try scrolling so that the verse is in the center of the section.
Same as #450.
|
2025-04-01T06:38:44.632786
| 2019-10-11T13:36:02
|
505857431
|
{
"authors": [
"baygulovd",
"frontend-tinkoff-ekb"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6113",
"repo": "frontend-tinkoff-ekb/rtf-lab-1",
"url": "https://github.com/frontend-tinkoff-ekb/rtf-lab-1/pull/27"
}
|
gharchive/pull-request
|
'first_task'
Attempt number one
/home/vsts/work/1/s/lab-tests/01/index.js
1:4 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
2:56 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
3:82 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
4:3 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
5:36 error Trailing spaces not allowed no-trailing-spaces
5:37 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
6:5 error Expected { after 'if' condition curly
6:74 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
7:9 error Expected no linebreak before this statement nonblock-statement-body-position
7:21 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
8:1 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
9:27 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
10:43 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
11:5 error Opening curly brace does not appear on the same line as controlling statement brace-style
11:6 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
12:9 error Expected { after 'if' condition curly
12:56 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
13:13 error Expected no linebreak before this statement nonblock-statement-body-position
13:38 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
14:14 error Expected { after 'if' condition curly
14:18 error Unary word operator 'typeof' must be followed by whitespace space-unary-ops
14:39 error Strings must use singlequote quotes
14:48 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
15:13 error Expected no linebreak before this statement nonblock-statement-body-position
15:25 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
16:6 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
17:1 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
18:9 error 'intNumber' is never reassigned. Use 'const' instead prefer-const
18:21 error Missing radix parameter radix
18:53 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
19:9 error 'res' is never reassigned. Use 'const' instead prefer-const
19:36 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
20:9 error 'arr' is never reassigned. Use 'const' instead prefer-const
20:18 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
21:22 error Expected '===' and instead saw '==' eqeqeq
21:27 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
22:5 error Opening curly brace does not appear on the same line as controlling statement brace-style
22:6 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
23:26 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
24:9 error Expected blank line before this statement padding-line-between-statements
24:20 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
25:6 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
26:1 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
27:5 error Expected { after 'for' condition curly
27:41 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
28:9 error Expected no linebreak before this statement nonblock-statement-body-position
28:18 error Missing radix parameter radix
28:36 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
29:1 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
30:16 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
31:2 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
32:1 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
33:19 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
34:14 error Expected linebreaks to be 'LF' but found 'CRLF' linebreak-style
34:14 error Missing trailing comma comma-dangle
35:2 error Missing semicolon semi
35:2 error Newline required at end of file but not found eol-last
|
2025-04-01T06:38:44.715771
| 2020-10-29T15:25:57
|
732414379
|
{
"authors": [
"barryvdh",
"driesvints"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6114",
"repo": "fruitcake/laravel-cors",
"url": "https://github.com/fruitcake/laravel-cors/pull/511"
}
|
gharchive/pull-request
|
Enable PHP 8 builds
PHP 8 is almost upon us so we should make sure laravel-cors runs properly on it in GitHub Actions.
In draft until https://github.com/asm89/stack-cors/pull/83 is merged.
I've tagged stack-cors, so converted this PR to ready for review + rerun tests. Might take a few minutes for the tag to be found though.
I can't seem to retrigger the builds myself for some reason..
Oh phpunit is the culprit. Might need to update that.
Hmm https://github.com/laravel/dusk is not yet PHP8 compatible?
@barryvdh trying to get that fixed in a few. Gonna concentrate on the skeleton atm first 👍
Seems to be passing now, thanks!
|
2025-04-01T06:38:44.723958
| 2021-04-26T11:03:12
|
867578581
|
{
"authors": [
"scala-steward"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6115",
"repo": "fs2-blobstore/fs2-blobstore",
"url": "https://github.com/fs2-blobstore/fs2-blobstore/pull/375"
}
|
gharchive/pull-request
|
Update s3 to 2.16.47
Updates software.amazon.awssdk:s3 from 2.16.45 to 2.16.47.
I'll automatically update this PR to resolve conflicts as long as you don't change it yourself.
If you'd like to skip this version, you can just close this PR. If you have any feedback, just mention me in the comments below.
Configure Scala Steward for your repository with a .scala-steward.conf file.
Have a fantastic day writing Scala!
Ignore future updates
Add this to your .scala-steward.conf file to ignore future updates of this dependency:
updates.ignore = [ { groupId = "software.amazon.awssdk", artifactId = "s3" } ]
labels: library-update, semver-patch
Superseded by #380.
|
2025-04-01T06:38:44.731976
| 2018-06-13T15:25:23
|
332046496
|
{
"authors": [
"KeAWang",
"LaurentHayez",
"Michielskilian",
"Servinjesus1",
"fsavje"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6116",
"repo": "fsavje/math-with-slack",
"url": "https://github.com/fsavje/math-with-slack/issues/36"
}
|
gharchive/issue
|
Stopped working in Slack Version 3.2.0?
Hi,
I have used your script for quite some time now and it always worked fine. But for the past week or so, it's just not rendering anymore.
I'm not sure if it's the new Slack version (I have v3.2.0 installed)?
I already tried everything, like downloading the script again from GitHub, re-running the script and checking with some basic math that should normally work.
Thanks in advance for any help!
Hi Kilian,
Me and another user both have this issue with Slack v3.2.0 (and now 3.2.1) on Ubuntu 18.04. Do you also use Ubuntu 18.04 or do you use another OS?
It doesn't work on Windows 10 with Slack v3.2.0 either
Hi Laurent,
I'm using MacOS High Sierra (latest version).
Hi @Michielskilian and thanks for the bug report. Please try this script: https://github.com/fsavje/math-with-slack/blob/d385db532d6698b16a79c6d06fe1bdcb52732888/math-with-slack.sh Please let me know how it works.
(Working on Windows version at the moment, @KeAWang )
@KeAWang Could you please try whether the updated script works, and report back to here?
Hi @fsavje, the script you attached works!
Thank you for the quick fix!
Unfortunately the newest version is not working for me in Win10, v3.2.0
A few of the paths aren't working even though the .js files are in the directory. The previous release, v0.2.3 runs without a hitch but none of the math is rendering either way...
@fsavje It still doesn't seem to work on Windows 10. The new script gives the following errors
Using Slack installation at: ...\slack\app-3.2.0\resources\app.asar.unpacked\src\static
The system cannot find the path specified.
FINDSTR: Cannot open ...\slack\app-3.2.0\resources\app.asar.unpacked\src\ssb-interop.js
The system cannot find the path specified.
FINDSTR: Cannot open ...\slack\app-3.2.0\resources\app.asar.unpacked\src\ssb-interop.js
Backup already exists: ...\slack\app-3.2.0\resources\app.asar.unpacked\src\static/ssb-interop.js.mwsbak
Press any key to continue . . .
@Servinjesus1 and @KeAWang Thanks for trying it out. I don't have access to a Windows machine atm, so I can't debug this properly. Could you try this script: https://raw.githubusercontent.com/fsavje/math-with-slack/7d7bc39d2723e8c299bff47d4d80d465d044a1af/math-with-slack.bat Thanks!
Works great now, thanks!
Thanks for checking!
|
2025-04-01T06:38:44.773780
| 2018-05-10T02:31:43
|
321788424
|
{
"authors": [
"damaestro"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6118",
"repo": "fspin-k8s/fspin-infrastructure",
"url": "https://github.com/fspin-k8s/fspin-infrastructure/issues/18"
}
|
gharchive/issue
|
Implement GC for Repo, Build Results and Images
Automatically clean up:
repo
build-results
gce images
Determine how many of each makes the most sense.
repo trimmed down to 5 snapshots.
Only keep around the latest release for the mirrors: https://github.com/fspin-k8s/fspin-infrastructure/pull/31
Only keep around 5 builder images: https://github.com/fspin-k8s/fspin-infrastructure/pull/33
|
2025-04-01T06:38:44.775045
| 2024-12-07T18:22:57
|
2724758046
|
{
"authors": [
"MaxenceDC",
"fspoettel"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6119",
"repo": "fspoettel/advent-of-code-rust",
"url": "https://github.com/fspoettel/advent-of-code-rust/issues/74"
}
|
gharchive/issue
|
Uses u32 for answers, today's answer required u64.
I wasted a lot of time because of this :(
It's documented in the readme and should error in non-release mode, but maybe we should just use u64 everywhere.
I would accept a PR that changes this to u64 by default.
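For illustration — a hypothetical sketch, not part of the template itself (Python stands in for the template's Rust, and the numbers and helper name are made up) — a puzzle answer that fits in u64 gets silently mangled under 32-bit wrapping arithmetic, which is what Rust's release mode does on overflow:

```python
# Hypothetical sketch: why a u32 answer type bites on large puzzle answers.
# Rust's u32 holds values up to 2**32 - 1; in release builds arithmetic
# wraps silently, so an answer above that limit comes out wrong.

U32_MAX = 2**32 - 1

def wrapping_u32(value: int) -> int:
    """Simulate Rust's release-mode wrapping arithmetic on u32."""
    return value & 0xFFFFFFFF

answer = 250_000_000 * 20  # a made-up AoC-sized product: 5_000_000_000
assert answer > U32_MAX                 # does not fit in u32
assert wrapping_u32(answer) != answer   # u32 wraps silently; u64 would not
```

In debug mode Rust panics on the overflow instead of wrapping, which is why the template "should error in non-release mode".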
|
2025-04-01T06:38:44.782662
| 2016-04-01T23:22:10
|
145308081
|
{
"authors": [
"alfonsogarciacaro",
"fdcastel"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6120",
"repo": "fsprojects/Fable",
"url": "https://github.com/fsprojects/Fable/issues/89"
}
|
gharchive/issue
|
Output to ES6? (Possible? Worth it?)
Would it be possible to output "pure" ES6 code (instead of "ES5 with some ES6 classes")?
I know Babel is supposed to translate ES6 to ES5. But... just being curious.:)
Why?
I'm including the generated .js files in a project written in ES6. And I already have a build pipeline to convert ES6 to ES5. It would be nice to use only one language for everything.
Currently I'm simply importing the Fable-generated files with
import MyModule from '../imports/MyModule';
and it works nicely.
I don't know if this would have any benefit. Just because. ;)
In the next release it will be possible to output ES6 modules to allow tree shaking with tools like WebPack. I wanted to make Fable as easy to use as possible and that's why I'm including the necessary Babel plugins to compile to ES5 by default. But it shouldn't be difficult to add an option to compile to ES6 (ES5 being the default) like TypeScript has, you only need to omit all Babel plugins but the first four (I've found Babel has problems when you don't transform the property mutators):
https://github.com/fsprojects/Fable/blob/master/src/fable-js/index.js#L127-L134
I'll try to do it for the next release :)
Yeah! After I posted this I did read more about WebPack 2 (currently in beta) planned optimizations and improvements to ES6. Great!
And thanks for the pointers about BabelPlugins. It was extremely valuable for learning more about how Fable works! :+1:
I will play with them. Let's see.. ;)
This has been also implemented in the imports branch with the --ecma argument :)
Note: For consistency with --module argument, es2015 must be used instead of es6 (though we may add es6 as an alternative).
|
2025-04-01T06:38:44.783840
| 2019-07-11T16:34:02
|
466995334
|
{
"authors": [
"TimLariviere"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6121",
"repo": "fsprojects/Fabulous",
"url": "https://github.com/fsprojects/Fabulous/pull/515"
}
|
gharchive/pull-request
|
Remove Android designer files
Removing these files since they are automatically generated each time, and they pollute our commits
/azp run full build
|
2025-04-01T06:38:44.811452
| 2017-08-26T13:22:14
|
253092156
|
{
"authors": [
"forki",
"lexarchik",
"matthid",
"tebeco",
"trondd",
"vbfox"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6122",
"repo": "fsprojects/Paket",
"url": "https://github.com/fsprojects/Paket/issues/2668"
}
|
gharchive/issue
|
[Discussion] Magic mode working in a team? Aligning teammate versions?
Description
Hello there,
in the past we encountered issues many times because we had to tell teammates to update the Paket CLI
once that was done we had other issues using magic mode (aka #2656)
Expected behavior
Looking for an idea to align version of paket for developer + CI
we have about 30 developers and 10 build agents, and the number of updates per day can be unpredictable sometimes
I'm wondering if there's a way to use magic mode across multiple different computers, pinned to a specified version rather than always the latest
It could be an ENV_VAR that for example point to a :
server url
shared folder
exposed from a Git repo used by the team
...
Actual behavior
Magic mode always get latest (if there's a bug we have to fallback manually and communicate fast + change agent configuration)
Known workarounds
Manually change paket.exe from magic mode to "normal mode", slow and kind of flaky
You can "lock" the version in your paket.dendencies file:
version 5.86.0-alpha005
source http://....
nuget MyPackage
The "Magic"-Mode-Bootstrapper will download the version locked in the depsfile.
well we would have to align 60 repositories
if we decide to change, we have 60 repos to change (we have around 100 repos just for our team, and 40 of them are yet to be migrated to paket)
I do understand it is a step forward but this is still a huge manual change
Yes I agree that it is not perfect. But is there a reason that all repositories need to use the same version?
/cc @forki what do you think about this scale?
It's more like we need to be sure the CI Agent will run on the exact same configuration as any developer
We ended up rolling back manually, but nobody had the same version
but even CI had an older version (and this is not easy to change CI versions)
We have a process hashing paket.exe on CI + a server dedicated to maintaining CI integrity
it ensures that all agents are aligned, not only for Paket but also the SDK, etc ...
Another way would be to specified another URL for the Magic mode
for example, the CI agents do not have firewall rights to access github.com (this site specifically)
So if you could for example say to the magic mode
please dont use github.com/fsproject/paket/releases, use this other feed
it would solve everything
say to the magic mode
It can be done by options in dependencies file.
E.g. line version <version> --force-nuget --nuget-source=<feed>
We use network share as source for paket package in our projects.
so paket.exe is released as a NuGet package? and nuget-source is a private NuGet feed that provides Paket as a package?
there's no CLI argument for this, right?
I don't want to confuse people by asking them to add this param every time
Also magic mode is reading a config file automatically. Put paket.exe.config alongside paket.exe. It allows you to pin the version
@forki as said, I want it to be driven by a feed external to each repo, avoiding having to edit the .config 60 times, once per repo
IIRC there was already an url parameter in the config file. If not we would accept a PR. This url could provide a blessed version.
edit 60 times the .config for each repo
If you need to synchronize the Paket version (and its changes) among many repos, you can set the nuget-feed in each dependencies file and not set a version (also set --max-file-age to 0).
To update Paket everywhere you can just add the new version of the Paket package to your NuGet feed.
will dig into paket code to see that nuget-feed
thx everyone here :)
Ok seems url is not yet in
https://fsprojects.github.io/Paket/bootstrapper.html#In-application-settings
But I bet it's not hard to add. It's even a C# project ;-)
The reason I added version configuration in the paket.dependencies for magic mode is exactly this case : Synchronizing versions between all devs and our CI while still using magic mode.
BTW having the version pinned per-repository also allow you to build old git commits with old paket versions and get the same result. It can also allow teams to update paket at their own pace instead of forcing "big bang" upgrades but I can understand that upgrading a big number of repo can be painful. (We're mostly in monorepo so we sidestep that)
As for CI not having access to github.com it's also our case and we use the already cited --force-nuget --nuget-source= in our "version" line in paket.dependencies to point to our local NuGet cache (ProGet)
You can use the following solution to sync paket versions internally:
Add paket.bootstrapper.exe 5.92.2 or later and rename it to paket.exe to your repos.
Add the following paket.exe.config file as well:
<configuration>
<appSettings>
<add key="ForceNuget" value="True"/>
<add key="IgnoreCache" value="True"/>
<add key="NugetSource" value="http://internal-feed" />
</appSettings>
</configuration>
Now all users will pick you that latest NuGet package that you push to your internal feed.
btw the version line in paket.dependencies can accomplish the same
version -f --force-nuget --nuget-source=http://internal-feed
(version being a misleading name in this uncommon case, it's in fact the command line of the bootstrapper)
|
2025-04-01T06:38:44.817457
| 2019-07-11T12:20:45
|
466858634
|
{
"authors": [
"BlythMeister",
"matthid"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6123",
"repo": "fsprojects/Paket",
"url": "https://github.com/fsprojects/Paket/issues/3613"
}
|
gharchive/issue
|
net471 doesn't default to netstandard but a older net framework
When taking a dependency into a net471 project, i would expect a netstandard2.0 dependency to be favoured over an older (pre standard) framework version.
Since netstandard2.0 is fully compatible with net471, a project maintainer is unable to support newer features in an older framework version (as seen here: https://github.com/EasyNetQ/EasyNetQ/issues/623#issuecomment-510444144)
This means that if a project maintainer is using a multi-target switch to support the newer features, and is publishing these as netstandard2.0 and, say, net451, paket will always default to the net451 version because it's a full framework target.
Short of asking the maintainer to also publish the netstandard2.0 version as net471 (which isn't always practical) would it not make more sense for paket to use netstandard2.0 with the same level of "penalty" as the directly compatible framework version? - or at least give users a preference that they would like to use netstandard over older netframework versions.
This has nothing to do with paket. Target framework preferences are spec'd by NuGet and I doubt they will change them. In any case you should open an issue over there.
so paket isn't the one choosing to use the net451 over netstandard2.0 when both are present in the package?
I'm a little confused, since paket is adding the reference to the net471 project and stating to use the net451 version. for example:
<Choose>
<When Condition="$(TargetFrameworkIdentifier) == '.NETFramework' And $(TargetFrameworkVersion) == 'v4.7.1'">
<ItemGroup>
<Reference Include="EasyNetQ">
<HintPath>..\..\packages\EasyNetQ\lib\net451\EasyNetQ.dll</HintPath>
<Private>True</Private>
<Paket>True</Paket>
</Reference>
</ItemGroup>
</When>
</Choose>
on package: https://www.nuget.org/packages/EasyNetQ/ which has net451 and netstandard2.0
The code is in paket, but the spec is on NuGet, we follow the spec closely.
What I mean is: If you use nuget client or dotnet sdk it will/should reference the same dll.
If we diverge here this change will not be in dotnet sdk style projects as there we only forward to NuGet.
As we don't care to much about legacy projects we won't do such a radical change for old-style projects only. And I don't see this working on dotnet sdk style projects
Ok, I think I understand.
Thanks for clarifying.
I'm going to speak to the project maintainer and see if we can get a net461 version added with the relevant newer framework features!
|
2025-04-01T06:38:44.819394
| 2016-04-28T04:05:43
|
151545134
|
{
"authors": [
"forki",
"matthid"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6124",
"repo": "fsprojects/Paket",
"url": "https://github.com/fsprojects/Paket/pull/1651"
}
|
gharchive/pull-request
|
Revert Netstandard to see if build works
don't merge
@matthid I think I reverted to something that already built. I don't really understand that error. Is that a mono issue?
I would say nunit bug, because appveyor fails as well. Can you try to revert that update as well.
Now it fails for different reasons. So looks like nunit is really broken
yep, looks like either a paket bug or unit tests that weren't updated, to me...
|
2025-04-01T06:38:44.826593
| 2015-08-07T22:14:44
|
99739128
|
{
"authors": [
"ashtonkj",
"cloudRoutine",
"dungpa",
"vasily-kirichenko"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6125",
"repo": "fsprojects/VisualFSharpPowerTools",
"url": "https://github.com/fsprojects/VisualFSharpPowerTools/pull/1070"
}
|
gharchive/pull-request
|
Skip object identifiers when checking for unused symbols
UserVoice request: http://vfpt.uservoice.com/forums/247560-general/suggestions/8793808-do-not-mark-orange-in-the-right-bar-unused-self-re
Graying out object identifiers could be annoying. Since it has little practical value, we skip it while checking for unused references.
I'm afraid I disagree. What's more, that User Voice has only 4 votes. I like how it works now. Maybe we should ask people?
Sure, please ask around on Twitter and Slack.
@cloudroutine wrote on Slack:
I'd prefer another toggle like "Gray out unused member identifiers" and a toggle for whether unused items show up in the scroll bar
I agree, we should add two options:
Syntax coloring
Gray out unused opens
Gray out unused declarations
Gray out unused self identifiers
Show unused symbols on scroll bar
BTW, what do you think about using "Highlight" instead of "Gray out"?
Highlight is more accurate since this is in the context of syntax highlighting and because the color for unused items doesn't have to be grey.
Gray out unused self identifiers: the option is too fine-grained to support a very specific use case. I wouldn't want to go that way.
Show unused symbols on scroll bar: It might be useful, but it doesn't solve the problem here.
What do you think about using "Highlight" instead of "Gray out"?
Yes, it's better with 'Highlight' as @cloudRoutine said.
If you don't like the idea to add a setting for "Gray out unused self identifiers" and not all people like removing the feature (including me), I suggest to drop this PR.
Please do make it optional. I consider the "this" identifier to be used as soon as a member is declared using the member this.membername syntax. I don't like being told that it isn't being used when it is required by the syntax (it feels like returning a false positive).
this is completely redundant. It's just a variable, and if you don't need it, you should say so explicitly, like _self or __. I think putting this everywhere is just an old C# habit.
This tweet convinced me https://twitter.com/CarstenK_Dev/status/631013723700903936.
Happy to close this and forget it forever.
|
2025-04-01T06:38:44.828102
| 2022-10-19T20:26:54
|
1415543864
|
{
"authors": [
"lucien-sim",
"martindurant"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6126",
"repo": "fsspec/kerchunk",
"url": "https://github.com/fsspec/kerchunk/pull/239"
}
|
gharchive/pull-request
|
Fix for lat/lon 1D array in grib2
Fixes #238
Needs a test
I've tested this with the GEFS files, it does the job.
OK, merging this, but finding a suitable test is still TODO. I don't know if we can persuade xarray to write a grib2 file of this sort.
|
2025-04-01T06:38:44.830849
| 2021-02-25T12:47:06
|
816405132
|
{
"authors": [
"fstab",
"jesperronn"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6127",
"repo": "fstab/grok_exporter",
"url": "https://github.com/fstab/grok_exporter/issues/145"
}
|
gharchive/issue
|
verbose logging in grok_exporter?
First, thanks for creating and maintaining this tool.
Forgive my lack of knowledge, but I am wondering if there is an easy way to pass a --verbose flag to the tool (or --debug, or --trace) which can help me debug my configuration when I am working on the matchers.
thanks in advance!
Unfortunately there is no good verbose option right now.
If you want to debug grok pattern, there are multiple grok pattern test websites on the Internet where you can test a pattern and example log lines.
For debugging within grok_exporter itself, the best way is to look at the grok_exporter_lines_matching_total built-in metric. I am currently working on improving this (branch built-in-metrics-improvement), but that's work in progress.
Thanks for the update @fstab. Looking forward to improved debug/logging abilities later :)
|
2025-04-01T06:38:44.833335
| 2023-12-23T17:30:11
|
2054850765
|
{
"authors": [
"ftde0",
"gigigigi53",
"jackhacksren"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6128",
"repo": "ftde0/yt2009",
"url": "https://github.com/ftde0/yt2009/issues/28"
}
|
gharchive/issue
|
just a suggestion: ad module
is it possible to put the ad module into yt2009, or at least emulate it, even in Flash form?
it can be enabled by checking emulate_ads in the flags
haven't messed with the ad module as i won't be adding ads into yt2009. leaving this open however as a reminder
Any update? I really want to try out ads on YT2009 (mobile and web), which I'm making a new issue about...
|
2025-04-01T06:38:44.855475
| 2024-02-22T21:26:50
|
2149967420
|
{
"authors": [
"tytyvillus"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6129",
"repo": "fuhrmanator/pandoc-filter-fr-nbsp",
"url": "https://github.com/fuhrmanator/pandoc-filter-fr-nbsp/issues/3"
}
|
gharchive/issue
|
Filter is too greedy when checking for pre-existing nbsp
Using a no-break space before the word preceding the punctuation prevents punctuation from spacing correctly.
In Markdown, typing the text on the left gives that on the right:
test: --> test : (CORRECT)
1 test: --> 1 test : (CORRECT)
1\ test: --> 1 test: (INCORRECT)
Something like 1\ test: would be a sensible thing to type for example when wanting to keep a number with its unit.
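One plausible way such over-greedy behavior can arise — a hypothetical Python sketch, not the actual filter code; the `NBSP` constant, function names, and regexes here are illustrative assumptions (the real filter may use a narrow no-break space):

```python
import re

NBSP = "\u00a0"  # no-break space; stands in for whatever the filter inserts

def spacify_greedy(text: str) -> str:
    # Buggy sketch: skips the whole string if ANY nbsp occurs anywhere
    # before the ':' -- so "1\u00a0test:" never gets its space.
    if NBSP in text.split(":", 1)[0]:
        return text
    return re.sub(r"\s*:", NBSP + ":", text)

def spacify_correct(text: str) -> str:
    # Only skip when an nbsp sits immediately before the punctuation.
    return re.sub(r"(?<!\u00a0)\s*:", NBSP + ":", text)

assert spacify_greedy("1" + NBSP + "test:") == "1" + NBSP + "test:"  # bug
assert spacify_correct("1" + NBSP + "test:") == "1" + NBSP + "test" + NBSP + ":"
```

The fix is to test only the character directly before the punctuation rather than the whole preceding run of text.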
|
2025-04-01T06:38:44.897875
| 2022-03-17T00:29:21
|
1171728324
|
{
"authors": [
"SG2019"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6131",
"repo": "fullprofile/fullprofile-status-monitor",
"url": "https://github.com/fullprofile/fullprofile-status-monitor/issues/4"
}
|
gharchive/issue
|
Xero Scheduled Maintenance In Progress
Xero is currently conducting scheduled maintenance. This outage may result in the delayed processing of invoices status.
Xero status page - Click here
Xero scheduled maintenance has been completed. Xero status page.
Click here
Xero is currently conducting scheduled maintenance. This outage may result in the delayed processing of invoices status.
Xero status page - Click here
|
2025-04-01T06:38:44.902701
| 2021-09-11T08:35:49
|
993786463
|
{
"authors": [
"akema-trebla",
"iammukeshm"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6132",
"repo": "fullstackhero/dotnet-webapi-boilerplate",
"url": "https://github.com/fullstackhero/dotnet-webapi-boilerplate/pull/45"
}
|
gharchive/pull-request
|
Add Swagger Filter
Added Swagger Filters to TokensController to enable adding tenant Id to header for requests.
there seem to be some errors with the build, could you check it. And also add in a brief description of what all is covered in this commit . Thanks :)
Sure. Will do
Build errors resolved.
Edited Comment to include a description.
|
2025-04-01T06:38:44.938696
| 2023-02-02T01:26:55
|
1567157100
|
{
"authors": [
"Aiuan",
"GoroYeh56",
"LadissonLai",
"YoushaaMurhij",
"samueleruffino99"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6133",
"repo": "fundamentalvision/BEVFormer",
"url": "https://github.com/fundamentalvision/BEVFormer/issues/155"
}
|
gharchive/issue
|
ImportError: cannot import name 'ball_query_ext' from partially initialized module 'mmdet3d.ops.ball_query'
I got this error when running
python tools/create_data.py nuscenes --root-path ./data/nuscenes --out-dir ./data/nuscenes --extra-tag nuscenes --version v1.0 --canbus ./data
(open-mmlab) goroyeh56@Goros-MacBook-Air mmdetection3d % python3 tools/create_data.py nuscenes --root-path ./data/nuscenes --out-dir ./data/nuscenes --extra-tag nuscenes --version v1.0 --canbus ./data
Traceback (most recent call last):
File "tools/create_data.py", line 6, in <module>
from tools.data_converter import kitti_converter as kitti
File "/Users/goroyeh56/mmdetection3d/tools/data_converter/kitti_converter.py", line 8, in <module>
from mmdet3d.core.bbox import box_np_ops
File "/Users/goroyeh56/mmdetection3d/mmdet3d/core/__init__.py", line 3, in <module>
from .bbox import * # noqa: F401, F403
File "/Users/goroyeh56/mmdetection3d/mmdet3d/core/bbox/__init__.py", line 5, in <module>
from .iou_calculators import (AxisAlignedBboxOverlaps3D, BboxOverlaps3D,
File "/Users/goroyeh56/mmdetection3d/mmdet3d/core/bbox/iou_calculators/__init__.py", line 2, in <module>
from .iou3d_calculator import (AxisAlignedBboxOverlaps3D, BboxOverlaps3D,
File "/Users/goroyeh56/mmdetection3d/mmdet3d/core/bbox/iou_calculators/iou3d_calculator.py", line 6, in <module>
from ..structures import get_box_type
File "/Users/goroyeh56/mmdetection3d/mmdet3d/core/bbox/structures/__init__.py", line 2, in <module>
from .base_box3d import BaseInstance3DBoxes
File "/Users/goroyeh56/mmdetection3d/mmdet3d/core/bbox/structures/base_box3d.py", line 6, in <module>
from mmdet3d.ops.iou3d import iou3d_cuda
File "/Users/goroyeh56/mmdetection3d/mmdet3d/ops/__init__.py", line 6, in <module>
from .ball_query import ball_query
File "/Users/goroyeh56/mmdetection3d/mmdet3d/ops/ball_query/__init__.py", line 1, in <module>
from .ball_query import ball_query
File "/Users/goroyeh56/mmdetection3d/mmdet3d/ops/ball_query/ball_query.py", line 4, in <module>
from . import ball_query_ext
ImportError: cannot import name 'ball_query_ext' from partially initialized module 'mmdet3d.ops.ball_query' (most likely due to a circular import) (/Users/goroyeh56/mmdetection3d/mmdet3d/ops/ball_query/__init__.py)
(open-mmlab) goroyeh56@Goros-MacBook-Air mmdetection3d
Anyone knows how to resolve this?
Thank you!
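The bottom of this traceback — `from . import ball_query_ext` failing inside a partially initialized package — is the classic signature of a compiled extension module that was never built, rather than a true circular import. A minimal, self-contained reproduction with invented module names (nothing below is actual mmdetection3d code):

```python
import os
import sys
import tempfile
import textwrap

# Build a throwaway package whose __init__ imports a submodule that in turn
# imports a sibling "compiled extension" that does not exist on disk.
tmp = tempfile.mkdtemp()
pkg_dir = os.path.join(tmp, "fake_ops")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("from .ball_query import ball_query\n")
with open(os.path.join(pkg_dir, "ball_query.py"), "w") as f:
    f.write(textwrap.dedent("""\
        from . import ball_query_ext  # the extension that was never built

        def ball_query():
            return ball_query_ext
        """))

sys.path.insert(0, tmp)
err = ""
try:
    import fake_ops  # noqa: F401
except ImportError as e:
    # Python reports the missing sibling as a "partially initialized module"
    # error because fake_ops/__init__.py is still mid-execution at this point.
    err = str(e)
print(err)
```

If the same message appears for `ball_query_ext`, it usually means mmdet3d's CUDA/C++ ops were never compiled for the current environment, so rebuilding/reinstalling mmdet3d from source is the typical fix.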
I have the same problem. It's related to the version of mmdet-3d and mmcv
Could you please provide a suggestion?
Thanks
@whai362 Thanks for your awesome work.
I also met this problem. I installed mmdetection3d following the official install tutorial [https://github.com/fundamentalvision/BEVFormer/blob/master/docs/install.md],
mmcv-full = 1.4.0 ,
mmdet = 2.14.0
mmsegmentation = 0.14.1
mmdet3d = v0.17.1
I installed all the requirements without error, but I still meet the bug, the same as in this issue.
When I run the Prepare nuScenes Data step, I get the error.
python tools/create_data.py nuscenes --root-path ./data/nuscenes --out-dir ./data/nuscenes --extra-tag nuscenes --version v1.0-mini --canbus ./data
The error info is as follows.
Traceback (most recent call last):
File "tools/create_data.py", line 6, in <module>
from data_converter.create_gt_database import create_groundtruth_database
File "/home/ubt2t/AL/BEVFormer/tools/data_converter/create_gt_database.py", line 11, in <module>
from mmdet3d.core.bbox import box_np_ops as box_np_ops
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/core/__init__.py", line 3, in <module>
from .bbox import * # noqa: F401, F403
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/core/bbox/__init__.py", line 5, in <module>
from .iou_calculators import (AxisAlignedBboxOverlaps3D, BboxOverlaps3D,
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/core/bbox/iou_calculators/__init__.py", line 2, in <module>
from .iou3d_calculator import (AxisAlignedBboxOverlaps3D, BboxOverlaps3D,
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/core/bbox/iou_calculators/iou3d_calculator.py", line 6, in <module>
from ..structures import get_box_type
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/core/bbox/structures/__init__.py", line 2, in <module>
from .base_box3d import BaseInstance3DBoxes
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/core/bbox/structures/base_box3d.py", line 6, in <module>
from mmdet3d.ops.iou3d import iou3d_cuda
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/ops/__init__.py", line 6, in <module>
from .ball_query import ball_query
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/ops/ball_query/__init__.py", line 1, in <module>
from .ball_query import ball_query
File "/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/ops/ball_query/ball_query.py", line 4, in <module>
from . import ball_query_ext
ImportError: cannot import name 'ball_query_ext' from partially initialized module 'mmdet3d.ops.ball_query' (most likely due to a circular import) (/home/ubt2t/AL/BEVFormer/mmdetection3d/mmdet3d/ops/ball_query/__init__.py)
Thanks for your reply, sincerely!!
Same problem here!
Same problem
|
2025-04-01T06:38:44.945100
| 2022-11-21T15:02:27
|
1458092131
|
{
"authors": [
"Anoesj",
"ErwinAI",
"andrevferreiraa",
"d3xter-dev",
"jankohlbach",
"mariuscdejong",
"offline-first"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6134",
"repo": "funkenstudio/sitemap-module-nuxt-3",
"url": "https://github.com/funkenstudio/sitemap-module-nuxt-3/pull/13"
}
|
gharchive/pull-request
|
Hardcoded .output/public causing issues on Netlify
Hey!
I've run into some issues with this module when deploying a statically generated Nuxt app (using nuxt generate) to Netlify.
This module seems to assume that the output dir is always .output/public in the Nuxt source directory, because sitemaps are always basically written to path.join(nuxtInstance.options.srcDir, '.output/public'), but Nitro recognizes Netlify (probably by checking process.env.NETLIFY), sets NITRO_PRESET to netlify and that changes the output directory to dist instead of .output/public. I tried to fix this in a PR, but I'm still getting some failing tests.
Could you look into this? Consider my PR just an experiment, I'm not a very experienced open source dev :sweat_smile:.
You can mimic Nitro's behavior on Netlify by running NITRO_PRESET=netlify yarn build-module. If ran, the current tests fail too.
Thanks! I will have a look
Hey @d3xter-dev, I edited the original post, because I found out it has to do with the NITRO_PRESET environment variable. You can find all Nitro presets here: https://github.com/unjs/nitro/blob/main/src/presets/index.ts. This is the file where Nitro presets are resolved: https://github.com/unjs/nitro/blob/main/src/options.ts.
Changes look good to me. @d3xter-dev Any chance this can be merged / released?
Tbh this probably needs some extra work if some tests are failing. It just needs an extra set of eyes for a few hours.
Any updates?
As a workaround for now, you can use this guide, replace the query with the dynamic routes.
https://content.nuxtjs.org/guide/recipes/sitemap
import { SitemapStream, streamToPromise } from 'sitemap'
export default defineEventHandler(async (event) => {
const config = useRuntimeConfig()
const links = await $fetch('/api/routes')
const sitemap = new SitemapStream({
hostname: config.public.storyblok.siteUrl,
})
for (const link of links) {
sitemap.write({
url: link,
changefreq: 'monthly'
})
}
sitemap.end()
return streamToPromise(sitemap)
})
same issue here with vercel
at least now I know what's the issue 😅
thanks @Anoesj for the fix
@d3xter-dev is there a plan to merge this? otherwise I need to look for another solution, but this one would be the best
Any plans to merge this?
|
2025-04-01T06:38:44.953350
| 2018-06-14T03:07:18
|
332236351
|
{
"authors": [
"Bencey"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6135",
"repo": "funkypenguin/geek-cookbook",
"url": "https://github.com/funkypenguin/geek-cookbook/issues/35"
}
|
gharchive/issue
|
Discord rss
Find a bot (or make one) to post the RSS feed to #socialfeed or #changelogs
Or maybe even a webhook? (Not sure if it's possible)
Added bot. Awaiting rss edit to test
Bot working
Ticket closed
|
2025-04-01T06:38:44.968584
| 2018-03-05T01:08:04
|
302142692
|
{
"authors": [
"furrary",
"iambudi"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6138",
"repo": "furrary/livereload",
"url": "https://github.com/furrary/livereload/issues/7"
}
|
gharchive/issue
|
Not Found Response
Hi,
I use livereload on angulardart 5.
pub run livereload --spa on
[INFO] Serving a WebSocket server at ws://localhost:4242
[INFO] Entrypoint: Generating build script...
[INFO] Entrypoint: Generating build script completed, took 324ms
...
[INFO] Build: Succeeded after 215ms with 0 outputs
[INFO] `build_runner` starts serving `on` on port 8080
Browse your web app at http://localhost:8000
Accessing localhost:8000 returns 404.
When i run with pub run build_runner serve it displays page normally.
From the log:
[INFO] `build_runner` starts serving `on` on port 8080
It started serving the directory `on` instead of `web`.
I'm sorry, the document is not clear enough.
To enable SPA, you don't need to specify the option because it's the default. If you want to disable it, here's the way.
pub run livereload --no-spa
|
2025-04-01T06:38:44.976555
| 2017-12-22T11:42:17
|
284143148
|
{
"authors": [
"jollytoad",
"nchanged"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6139",
"repo": "fuse-box/fuse-box",
"url": "https://github.com/fuse-box/fuse-box/issues/989"
}
|
gharchive/issue
|
Problems using Node APIs in Electron renderer
I'm attempting to use fuse-box (2.5.0-beta.1) to bundle the renderer side of my Electron app, which makes use of the native Node modules available in the environment, ie, path, fs, etc.
Even after setting target: 'electron' these modules are empty.
I've read through a lot of issues here on this matter, e.g. the serverBundle option - which appears to have been removed now, and looked through the codebase, the way that these modules check FuseBox.isServer, and that FuseBox.isServer = !isBrowser
It seems to me that the most obvious fix for this is to have both FuseBox.isBrowser & FuseBox.isServer set to true for the 'electron' target. Unless isServer has some deeper meaning that I've not discovered yet.
If you agree with this course of action I'd be happy to experiment, test, and submit a PR - or any other suggestions?
hi @jollytoad
Thanks for submitting ;-). Let's determine first the cause of the issue. Maybe you could create a repository which reproduces the bug? and we could move forward after that.
And please, use the latest fuse-box@next
Hmm, seemed to be a configuration issue, coming from an ejected 'create-react-app', after stripping out old configs and restarting with something more like the fuse-box-electron-seed project i'm having more luck.
@jollytoad there still an issue with server polyfill, this will be fixed soon enough ;-)
|
2025-04-01T06:38:44.980598
| 2016-05-04T16:24:48
|
153056241
|
{
"authors": [
"mithrandi"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6140",
"repo": "fusionapp/fusion-index",
"url": "https://github.com/fusionapp/fusion-index/pull/45"
}
|
gharchive/pull-request
|
[requires.io] dependency update on master branch
This change is
Reviewed 1 of 1 files at r1.
Review status: all files reviewed at latest revision, all discussions resolved.
Comments from Reviewable
|
2025-04-01T06:38:44.999163
| 2014-05-10T23:43:24
|
33251536
|
{
"authors": [
"disbelief"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6141",
"repo": "futuresimple/dropbox-api",
"url": "https://github.com/futuresimple/dropbox-api/pull/47"
}
|
gharchive/pull-request
|
OAuth2 Support
I needed to modify dropbox-api to support OAuth2 and figured I'd submit my changes as a PR. Dropbox mentions in their docs that:
OAuth 1.0 continues to be supported for all API requests, but OAuth 2.0 is now preferred.
This change adds a new config setting auth_type which can be either oauth (default) or oauth2:
Dropbox::API::Config.auth_type = "oauth2" # default is oauth
It also slightly changes the way that Access Tokens are requested when OAuth2 is used:
## Manual Access Token retrieval:
consumer = ::Dropbox::API::OAuth2.consumer(:authorize)
authorize_uri = consumer.authorize_url(client_id: APP_KEY, response_type: 'code')
# open authorize_uri in browser, sign in, grant permission, copy code that is displayed
access_token = consumer.auth_code.get_token('code_from_dropbox')
# Browser-based Access Token retrieval:
consumer = ::Dropbox::API::OAuth2.consumer(:authorize)
authorize_uri = consumer.authorize_url(client_id: APP_KEY, response_type: 'code', redirect_uri: 'https://yoursite.com/dropbox_landing', state: 'optional string')
# redirect user to authorize_uri
# upon return to https://yoursite.com/dropbox_landing?code=SOME_CODE&uid=SOME_ID
access_token = consumer.auth_code.get_token('code_from_query_string')
I'm closing this PR because I accidentally submitted it from the master branch of my fork. Will re-open a new PR using a feature branch.
|
2025-04-01T06:38:45.015348
| 2018-03-27T04:01:00
|
308814281
|
{
"authors": [
"deusaquilus",
"fwbrasil"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6142",
"repo": "fwbrasil/arrows",
"url": "https://github.com/fwbrasil/arrows/issues/1"
}
|
gharchive/issue
|
Is this what I think it is?
This looks really cool. Is it based on the John Hughes paper?
@deusaquilus
|
2025-04-01T06:38:45.048302
| 2021-04-05T00:02:55
|
849989002
|
{
"authors": [
"citelao",
"fxcoudert"
],
"license": "BSD-2-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6143",
"repo": "fxcoudert/homebrew-core",
"url": "https://github.com/fxcoudert/homebrew-core/pull/3"
}
|
gharchive/pull-request
|
Completely fix libsvg
brew install --build-from-source libsvg completes on my M1 MacBook Pro
brew reinstall --build-from-source libsvg also completes, after removing CarloCab's additional lines.
[ ] Have you followed the guidelines for contributing?
[ ] Have you checked that there aren't other open pull requests for the same formula update/change?
[ ] Have you built your formula locally with brew install --build-from-source <formula>, where <formula> is the name of the formula you're submitting?
[ ] Is your test running fine brew test <formula>, where <formula> is the name of the formula you're submitting?
[ ] Does your build pass brew audit --strict <formula> (after doing brew install <formula>)?
Please open it as a PR to homebrew-core's repo, not on my branch
|
2025-04-01T06:38:45.056354
| 2022-12-10T00:26:26
|
1487688304
|
{
"authors": [
"Thanos1716",
"lunathelemon"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6144",
"repo": "fxys/Super-Secret-Settings",
"url": "https://github.com/fxys/Super-Secret-Settings/issues/2"
}
|
gharchive/issue
|
[Request] Commit 1.19 code
I noticed that the Curseforge page has been updated to 1.19, however this repository has not, so I was wondering if you could commit the changes made in 1.19 if you still have them, thanks!
Just updated to 1.19.4 #3
|
2025-04-01T06:38:45.062314
| 2022-09-07T11:15:50
|
1364506555
|
{
"authors": [
"Dimple16"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6145",
"repo": "fylein/fyle-mobile-app",
"url": "https://github.com/fylein/fyle-mobile-app/pull/1325"
}
|
gharchive/pull-request
|
View team advance request revamp
Note
These were Sandeep's changes, we had to revert due to some issues faced during the release
Previously approved PR - https://github.com/fylein/fyle-mobile-app/pull/1082
Changes I've made:
Used the previously done changes
Added a bunch of refactoring
Fixed the bugs which were faced during testing last time
Note - This is a new PR checked out from the latest master, I've manually added the previous changes to this branch
|
2025-04-01T06:38:45.124539
| 2024-03-25T04:30:19
|
2204862320
|
{
"authors": [
"TeomanEgeSelcuk",
"fynnfluegge"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6146",
"repo": "fynnfluegge/codeqai",
"url": "https://github.com/fynnfluegge/codeqai/issues/38"
}
|
gharchive/issue
|
Indexing Error with codeqai on Conda Environment: Continuous Indexing Without Completion
While using the codeqai tool within a conda environment, I encountered an issue during the indexing process where it continuously attempts to index without completion. This problem occurred when I tried to utilize codeqai's search functionality in my project directory. Specifically, the error IndexError: list index out of range was thrown, indicating an issue with handling the document vector indexing. Below are the detailed steps to reproduce, along with the specific environment setup.
Steps to Reproduce:
Installed codeqai using pip within a conda environment.
Ran codeqai configure and configured the tool with the following settings:
Selected "y" for using local embedding models.
Chose "Instructor-Large" for the local embedding model.
Selected "N" for using local chat models and chose "OpenAI" with "gpt-4" as the remote LLM.
Attempted to start the codeqai search by navigating to my project directory (2-006), which includes .m, .mat, and .txt files, and running codeqai search in the terminal.
Received a message indicating no vector store was found for 2-006 and that initial indexing may take a few minutes. Shortly after, the indexing process started but then failed with an IndexError: list index out of range.
Expected Behavior:
The indexing process should be completed, allowing for subsequent searches within the codebase using codeqai.
Actual Behavior:
The application failed to complete the indexing process due to an IndexError in the vector indexing step, specifically indicating a problem with handling the document vectors.
Environment:
codeqai version: 0.0.14
langchain-community version: 0.0.17
sentence-transformers version: 2.3.1
Python version: 3.11
Conda version: 4.12.0
Operating System: Windows (with Conda environment)
Full Terminal Output and Error
{GenericDirectory>}conda activate condaqai-env
(condaqai-env) {GenericDirectory>}codeqai search
Not a git repository. Exiting.
(condaqai-env) {GenericDirectory>}ls
'ls' is not recognized as an internal or external command,
operable program or batch file.
(condaqai-env) {GenericDirectory>}cd 2-006
(condaqai-env) {GenericDirectory}\2-006>codeqai search
No vector store found for 2-006. Initial indexing may take a few minutes.
⠋ 💾 Indexing vector store...Traceback (most recent call last):
File "C:\Users\Edge\anaconda3\envs\condaqai-env\lib\runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "C:\Users\Edge\anaconda3\envs\condaqai-env\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\Edge\anaconda3\envs\condaqai-env\Scripts\codeqai.exe\__main__.py", line 7, in <module>
sys.exit(main())
File "C:\Users\Edge\anaconda3\envs\condaqai-env\lib\site-packages\codeqai\__main__.py", line 5, in main
app.run()
File "C:\Users\Edge\anaconda3\envs\condaqai-env\lib\site-packages\codeqai\app.py", line 146, in run
vector_store.index_documents(documents)
File "C:\Users\Edge\anaconda3\envs\condaqai-env\lib\site-packages\codeqai\vector_store.py", line 34, in index_documents
self.db = FAISS.from_documents(documents, self.embeddings)
File "C:\Users\Edge\anaconda3\envs\condaqai-env\lib\site-packages\langchain_core\vectorstores.py", line 508, in from_documents
return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
File "C:\Users\Edge\anaconda3\envs\condaqai-env\lib\site-packages\langchain_community\vectorstores\faiss.py", line 960, in from_texts
return cls.__from(
File "C:\Users\Edge\anaconda3\envs\condaqai-env\lib\site-packages\langchain_community\vectorstores\faiss.py", line 919, in __from
index = faiss.IndexFlatL2(len(embeddings[0]))
IndexError: list index out of range
⠴ 💾 Indexing vector store...
Additional Context:
This issue seems to stem from the vector indexing process within the langchain-community package, possibly due to an empty or malformed document set being processed for vectorization. Given the configuration steps and the use of a conda environment, there might be specific dependencies or configurations that contribute to this problem.
Thanks for that detailed report! I think the cause is probably an empty split set for a document, as you also mentioned already.
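Assuming that diagnosis is right, a hypothetical pre-indexing guard (not codeqai's actual code) would turn the opaque IndexError into an actionable message. The traceback ends in faiss.IndexFlatL2(len(embeddings[0])), which is exactly what raises "list index out of range" when no document produced any text to embed:

```python
def prepare_documents(raw_docs):
    """Drop empty/whitespace-only documents and fail early with a clear error.

    Sketch only: raw_docs here stands in for the split document texts that
    would otherwise be handed to FAISS.from_documents for embedding.
    """
    docs = [d for d in raw_docs if d and d.strip()]
    if not docs:
        raise ValueError(
            "No non-empty documents found to index; check that the target "
            "directory contains supported, non-empty source files."
        )
    return docs
```

With a guard like this, a directory of unsupported or empty files would fail with the ValueError above instead of crashing deep inside the FAISS indexing call.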
|
2025-04-01T06:38:45.126472
| 2016-06-22T10:15:37
|
161643746
|
{
"authors": [
"Achilles-96",
"fzaninotto"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6147",
"repo": "fzaninotto/Faker",
"url": "https://github.com/fzaninotto/Faker/issues/941"
}
|
gharchive/issue
|
Question: Using name and column type guessers
I am not using any ORM libraries. Is it possible to use name and column type guessers with just having Faker library?
Check out https://github.com/nelmio/alice for a complement to Faker.
Other than that, good idea for a feature! Feel free to work on it.
|
2025-04-01T06:38:45.128963
| 2023-06-05T20:40:38
|
1742587981
|
{
"authors": [
"ahmdt",
"fzyzcjy",
"neiljaywarner"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6148",
"repo": "fzyzcjy/flutter_convenient_test",
"url": "https://github.com/fzyzcjy/flutter_convenient_test/issues/341"
}
|
gharchive/issue
|
7 months is a long time, is there a release coming to work with flutter 3.10.x and dart 3 please
thanks so much :)
Sure! First, try the master branch before the release. Btw, did you know that new features have been added to the master branch? ;) Mainly:
run the same test code using the host machine without a simulator, much faster (e.g. 10x) and more stable
a simple yet useful monkey (to be open sourced)
Because of the new features, I hope I can find a bit of time to update README and then release the new version.
is this project dead?
NO! I use it personally everyday!
v1.3.0 is published :)
|
2025-04-01T06:38:45.130402
| 2020-12-08T20:59:24
|
759776590
|
{
"authors": [
"g-andrade"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6149",
"repo": "g-andrade/locus",
"url": "https://github.com/g-andrade/locus/issues/19"
}
|
gharchive/issue
|
Fix tls_certificate_check build errors on OTP 20.1+, when on top of macOS Big Sur
As described under tls_certificate_check's issue #3.
Released under 1.13.1 (published to Hex as well.)
|
2025-04-01T06:38:45.287072
| 2019-02-15T00:25:03
|
410552154
|
{
"authors": [
"Ldcabansay",
"agduncan94",
"denis-yuen"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6152",
"repo": "ga4gh/dockstore",
"url": "https://github.com/ga4gh/dockstore/issues/2119"
}
|
gharchive/issue
|
Editable display name for organizations and collections
Feature Request
Desired behaviour
Organizations and Collections would have an editable display name that is separate from the UUID and url. This display name should be unique (given on a first come, first served basis), and allow for special characters and spaces. The display name should be editable on the user interface for organization/collections maintainers.
┆Issue is synchronized with this Jira Story
┆Sprint: Seabright Sprint 3 Electric
┆Issue Number: DOCK-516
Just to clarify, will the display name ever be used to link directly to an organisation?
Also, this was what I had in mind for displaying the display name alongside the main name
Ah Twitter. That's actually a good example. Twitter handles don't change, but you do see the display change in lists, replies, and the like.
What should we allow? alphanumeric, spaces, and punctuation (\p{Punct})
During review, one issue:
Expected the non-display names (just called "names") to be used in the URLs when browsing around normally (currently using database ids)
Expected display name to be used when browsing (seems working)
What should we allow? alphanumeric, spaces, and punctuation (\p{Punct})
Was this either specified? I'm finding that only some special characters and punctuation are accepted. Good news: consistent between organizations and collections, and also consistent between UI and swagger.
Accepted: &(),'-_
Rejected: `~!@#$%^*+={}[]|.?<>;:"
@Ldcabansay Denis and I decided on the accepted &(),'-_ and not all punctuation. The logic was that it is better to be more restrictive at first and slowly relax constraints than to go the other way. I had updated the constraints in the code but I forgot to update them here in this issue, thanks for looking into it.
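As a sketch of the agreed constraint (Dockstore itself is Java; this mirrors the rule in Python, and the function name is invented for illustration): display names allow alphanumerics, spaces, and the punctuation &(),'-_ while everything else is rejected.

```python
import re

# Character class built directly from the accepted set above:
# letters, digits, space, and & ( ) , ' _ -  (hyphen last so it stays literal).
DISPLAY_NAME_RE = re.compile(r"^[A-Za-z0-9 &(),'_-]+$")

def is_valid_display_name(name: str) -> bool:
    """Return True iff the name uses only the accepted characters."""
    return bool(DISPLAY_NAME_RE.match(name))
```

Starting from this restrictive set and relaxing it later is cheaper than the reverse, since loosening a validation rule never invalidates existing names.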
Verified that this works on 1.6.0-beta.3
|
2025-04-01T06:38:45.289905
| 2016-11-08T23:15:10
|
188130243
|
{
"authors": [
"achave11",
"david4096",
"dcolligan"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6153",
"repo": "ga4gh/ga4gh-server",
"url": "https://github.com/ga4gh/ga4gh-server/issues/1463"
}
|
gharchive/issue
|
Replace call to run_tests in documentation
In http://ga4gh-reference-implementation.readthedocs.io/en/latest/installation.html#installing-the-development-version-on-mac-os-x
Under Test and run, when I ran the script for testing it wouldn't run, and complained IOError: [Errno 2] No such file or directory: u'.travis.yml'
That should probably just say python -m nose tests to run tests instead. That script isn't in the source tree, is it?
The ga4gh_common package should install a executable called ga4gh_run_tests which replaces the scripts/run_tests.py scripts
Closed with https://github.com/ga4gh/ga4gh-server/commit/77d42db46268875549511edfffab2dca22528c5e
|
2025-04-01T06:38:45.325109
| 2013-09-14T00:05:17
|
19482587
|
{
"authors": [
"gabriel",
"rdingman"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6154",
"repo": "gabriel/yajl-objc",
"url": "https://github.com/gabriel/yajl-objc/issues/35"
}
|
gharchive/issue
|
Update to yajl 2.0
It would be great if you could update to yajl 2.0.
Released new pod as YAJLO. See new README.
|
2025-04-01T06:38:45.341102
| 2021-04-12T18:32:38
|
856237479
|
{
"authors": [
"GameGC",
"PythonCreator27",
"gadicc"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6155",
"repo": "gadicc/node-yahoo-finance2",
"url": "https://github.com/gadicc/node-yahoo-finance2/issues/131"
}
|
gharchive/issue
|
sectorTrend and industryTrend not working
Output is empty for TSLA, AAPL, etc.: {"industryTrend":{"maxAge":1,"symbol":null,"estimates":[]}}
Please, please, please, fill out the issue template! It exists for a reason! Please open a new issue with the issue template filled out! Now nobody has any idea about what the problem is!
Quick question, though: How did you open a new issue without a label? You shouldn't be able to.
This is what the API gives us. Go to https://query2.finance.yahoo.com/v10/finance/quoteSummary/AAPL?modules=sectorTrend and you will get the same output. This is not our problem. This is Yahoo's problem. Maybe they removed the submodule. We have no control over it.
Hey @GameGC, thanks for your issue. I believe we found the problem why your issue didn't have any template, so don't worry about that. As @PythonCreator27 said, however, all this library does make it easier to consume data from Yahoo, and certain data is only available for particular stocks / symbols / markets.
As Yahoo's API isn't a public service, there's no documentation about what we can expect where (but fortunately, this library goes to great lengths to ensure data is always returned in a consistent format). However, if you figure out any patterns in this regard, and feel inclined to add it to the wiki here on this repository, I think many other users could benefit from it too.
Thanks and good luck!
|
2025-04-01T06:38:45.346839
| 2021-08-10T18:03:30
|
965210518
|
{
"authors": [
"aronrodrigues",
"gadicc"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6156",
"repo": "gadicc/node-yahoo-finance2",
"url": "https://github.com/gadicc/node-yahoo-finance2/issues/255"
}
|
gharchive/issue
|
Failed validation: #/definitions/SearchResult
Validation Error
Minimal Reproduction
y2.search("AMZN").then(d=> d.quotes);
Symbol(s) that it happened for
AMZN
Error Message
> The following result did not validate with schema: #/definitions/SearchResult
[
{
instancePath: '',
schemaPath: '#/additionalProperties',
keyword: 'additionalProperties',
params: { additionalProperty: 'screenerFieldResults' },
message: 'must NOT have additional properties'
},
{
instancePath: '',
schemaPath: '#/additionalProperties',
keyword: 'additionalProperties',
params: { additionalProperty: 'timeTakenForScreenerField' },
message: 'must NOT have additional properties'
}
]
This may happen intermittently and you should catch errors appropriately.
However: 1) if this recently started happening on every request for a symbol
that used to work, Yahoo may have changed their API. 2) If this happens on
every request for a symbol you've never used before, but not for other
symbols, you've found an edge-case. Please see if anyone has reported
this previously:
https://github.com/gadicc/node-yahoo-finance2/issues?q=is%3Aissue+Failed%20validation%3A%20%23%2Fdefinitions%2FSearchResult
or open a new issue (and mention the symbol):
https://github.com/gadicc/node-yahoo-finance2/issues/new?labels=bug%2C+validation&template=validation.md&title=Failed%20validation%3A%20%23%2Fdefinitions%2FSearchResult
For information on how to turn off the above logging or skip these errors,
see https://github.com/gadicc/node-yahoo-finance2/tree/devel/docs/validation.md.
Uncaught FailedYahooValidationError: Failed Yahoo Schema validation
at Object.validate [as default] (/Users/aronrodrigues/workspace/tribie-app/functions/node_modules/yahoo-finance2/dist/cjs/src/lib/validateAndCoerceTypes.js:194:15) {
result: {
explains: [],
count: 10,
quotes: [ [Object], [Object], [Object], [Object], [Object], [Object] ],
news: [ [Object], [Object], [Object], [Object] ],
nav: [],
lists: [],
researchReports: [],
screenerFieldResults: [],
totalTime: 83,
timeTakenForQuotes: 478,
timeTakenForNews: 700,
timeTakenForAlgowatchlist: 400,
timeTakenForPredefinedScreener: 400,
timeTakenForCrunchbase: 400,
timeTakenForNav: 400,
timeTakenForResearchReports: 0,
timeTakenForScreenerField: 0
},
errors: [
{
instancePath: '',
schemaPath: '#/additionalProperties',
keyword: 'additionalProperties',
params: [Object],
message: 'must NOT have additional properties'
},
{
instancePath: '',
schemaPath: '#/additionalProperties',
keyword: 'additionalProperties',
params: [Object],
message: 'must NOT have additional properties'
}
]
}
// Returned quotes:
[
{
exchange: 'NMS',
shortname: 'Amazon.com, Inc.',
quoteType: 'EQUITY',
symbol: 'AMZN',
index: 'quotes',
score: 31857500,
typeDisp: 'Equity',
longname: 'Amazon.com, Inc.',
isYahooFinance: true
},
{
exchange: 'MEX',
shortname: 'AMAZON COM INC',
quoteType: 'EQUITY',
symbol: 'AMZN.MX',
index: 'quotes',
score: 20328,
typeDisp: 'Equity',
longname: 'Amazon.com, Inc.',
isYahooFinance: true
},
{
exchange: 'NEO',
shortname: 'AMAZON.COM CDR (CAD HEDGED)',
quoteType: 'EQUITY',
symbol: 'AMZN.NE',
index: 'quotes',
score: 20152,
typeDisp: 'Equity',
isYahooFinance: true
},
{
exchange: 'BUE',
shortname: 'AMAZON COM INC',
quoteType: 'EQUITY',
symbol: 'AMZN.BA',
index: 'quotes',
score: 20064,
typeDisp: 'Equity',
longname: 'Amazon.com, Inc.',
isYahooFinance: true
},
{
exchange: 'OPR',
shortname: 'AMZN Aug 2021 3300.000 put',
quoteType: 'OPTION',
symbol: 'AMZN210813P03300000',
index: 'quotes',
score: 20026,
typeDisp: 'Option',
isYahooFinance: true
},
{
exchange: 'OPR',
shortname: 'AMZN Aug 2021 3400.000 call',
quoteType: 'OPTION',
symbol: 'AMZN210813C03400000',
index: 'quotes',
score: 20025,
typeDisp: 'Option',
isYahooFinance: true
}
]
Environment
Node
Node version 16.5.0
Npm version: 6.14.13
Library version 1.14.3
Additional Context
Hey @aronrodrigues, thanks for reporting. Looks like Yahoo just added this field. This is fixed in our devel branch, but since it's very new, I'm not releasing yet until we've had a little more experience with it.
In particular, we're accepting { screenerFieldResults?: Array<any>; } since we have no data yet on what this new field is meant to contain.
Hey @aronrodrigues, v1.14.4 partially fixes this, so that it no longer throws an error, but we still don't actually know what this data looks like. Will track further developments in #259, but closing at least this validation error for now. Thanks again for reporting!
|
2025-04-01T06:38:45.356496
| 2016-12-24T04:12:16
|
197447269
|
{
"authors": [
"kiwenlau",
"robmcguinness"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6157",
"repo": "gaearon/gitbook-plugin-prism",
"url": "https://github.com/gaearon/gitbook-plugin-prism/issues/13"
}
|
gharchive/issue
|
How to use Google Light?
I want to use Google Light
{
"plugins": ["prism", "-highlight"]
}
"pluginsConfig": {
"prism": {
"css": [
"syntax-highlighting/assets/css/prism/prism-base16-google.light.css"
]
}
}
But I get the error:
Error: Cannot find module 'syntax-highlighting/assets/css/prism/prism-base16-google.light.css'
Did you npm i atelierbram/syntax-highlighting -D?
I installed atelierbram/syntax-highlighting from GitHub and it works, thank you!
npm install https://github.com/atelierbram/syntax-highlighting/tarball/master
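For anyone hitting the same error: both fragments above belong in the same book.json. A combined sketch (assuming atelierbram/syntax-highlighting was installed via the tarball command above) might look like:

```json
{
  "plugins": ["prism", "-highlight"],
  "pluginsConfig": {
    "prism": {
      "css": [
        "syntax-highlighting/assets/css/prism/prism-base16-google.light.css"
      ]
    }
  }
}
```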
|
2025-04-01T06:38:45.390510
| 2023-04-22T03:46:26
|
1679310252
|
{
"authors": [
"elizabethdinella",
"prestwich"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6158",
"repo": "gakonst/ethers-rs",
"url": "https://github.com/gakonst/ethers-rs/pull/2371"
}
|
gharchive/pull-request
|
fix bug 2370
fixes #2370
Can you point me to where to update the docs?
Thanks for the contribution!
Can you point me to where to update the docs?
Docs can be updated in the /// rustdoc above the relevant code items
|
2025-04-01T06:38:45.400597
| 2015-11-23T09:10:53
|
118341976
|
{
"authors": [
"afgane",
"fabiorjvieira"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6159",
"repo": "galaxyproject/ansible-galaxy-os",
"url": "https://github.com/galaxyproject/ansible-galaxy-os/pull/6"
}
|
gharchive/pull-request
|
Include the package pkg-config on role.
This is necessary for the correct installation of many Galaxy tools, and it is also listed on https://wiki.galaxyproject.org/Admin/Config/ToolDependenciesList as a required package.
Thanks.
|
2025-04-01T06:38:45.470707
| 2020-08-05T15:27:16
|
673628570
|
{
"authors": [
"bgruening",
"wm75"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6160",
"repo": "galaxyproject/training-material",
"url": "https://github.com/galaxyproject/training-material/pull/2023"
}
|
gharchive/pull-request
|
101 for everyone: fix workflow configuration & licensing concerns
The customization of the scatterplot steps that needs to be done by the
user in the workflow run dialog was described incompletely, which caused
problems in trainings when users selected column 4 both for plotting and
for grouping. The updated version explains in more detail the purpose of
the customization and clearly states which columns need to be selected
for plotting and grouping.
In addition, the "4Cs of diamonds" image got removed from the tutorial
(and is now only linked to) because of licensing concerns.
fixes #1972
@annefou @jennaj does this look good to you?
Thanks!
|
2025-04-01T06:38:45.487415
| 2018-11-16T02:25:37
|
381419746
|
{
"authors": [
"bryceswarm",
"gamagori",
"zedin27"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6161",
"repo": "gamagori/pizzapi",
"url": "https://github.com/gamagori/pizzapi/issues/38"
}
|
gharchive/issue
|
How to add a coupon in my order?
I tried to use one of the coupons from the list. In this case, I'm using coupon number 9175, which is "Any Large Specialty Pizza". The number is supposed to retrieve the product code, but the add_coupon function of the Order class raises: missing 1 required positional argument: 'code'
What am I missing in here?
@zedin27 Could you share some code that would let me reproduce your issue? Everything you need to do, from `from pizzapi import *` to `Order.add_coupon()`, would be very helpful.
@gamagori sure thing. I realized my mistake after noticing I missed one tiny line in the code: I forgot to have an order variable to store the order before adding the coupon. However, I encountered another error after trying to add the coupon with order.add_coupon(9103), which displays a KeyError: 9013. This is what I have in my python file (this is my first time playing with python lol):
from pizzapi import *
def ordertest():
    zeid = Customer('Zeid', 'Tisnes', <EMAIL_ADDRESS>, '5555555555')
    address = Address('ur address here', 'ur city', 'UR', '00000')
    local_dominos = address.closest_store(zeid)
    menu = local_dominos.get_menu()
    order = Order(local_dominos, zeid, address)
    order.add_coupon(9103) #1 Medium 3 Topping Pizza (here is where it complains)
Error message:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/ztisnes/Desktop/pizzapi/pizzapi/order.py", line 52, in add_coupon
item = self.menu.variants[code]
KeyError: 9118
You can remove ordertest and just write everything I have from that function in the interpreter.
@zedin27 I see what's going on - this is a code issue, not a usage issue.
In the add_coupon() function, the item = self.menu.variants[code] is looking for the coupon ID that you're passing it, but it's looking for it in the 'variants' part of the menu, rather than the 'coupons' part of the menu.
You should be able to fix this locally by changing that line to item = self.menu.coupons[code]. I need to get some other things in place before I can fix and test this properly, but I'll be sure to let you know once it's good to go.
I made a PR for that specifically. Let me know when it is all good :). Thank you!
That still won't work, as menu.coupons just returns a list of the coupon objects...
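The shape of the fix, in a minimal sketch with hypothetical stand-ins (as the last comment notes, the real menu.coupons may be a list rather than a dict, so only the lookup target matters here):

```python
# Minimal stand-ins illustrating the one-line fix; the real pizzapi classes differ.
class Menu:
    def __init__(self):
        self.variants = {"P12IPAZA": {"Name": "Medium Pizza"}}        # product codes
        self.coupons = {"9103": {"Name": "1 Medium 3 Topping Pizza"}}  # coupon codes

class Order:
    def __init__(self, menu):
        self.menu = menu
        self.data = {"Coupons": []}

    def add_coupon(self, code):
        code = str(code)  # menu keys are strings here, so an int input would KeyError
        # Was: item = self.menu.variants[code]  -> KeyError, coupons are not variants
        item = self.menu.coupons[code]
        self.data["Coupons"].append({"Code": code})
        return item

order = Order(Menu())
print(order.add_coupon(9103)["Name"])  # 1 Medium 3 Topping Pizza
```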
|
2025-04-01T06:38:45.556513
| 2020-10-06T19:58:27
|
715982079
|
{
"authors": [
"GernotMaier",
"RaulRPrado",
"orelgueta"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6162",
"repo": "gammasim/simtools",
"url": "https://github.com/gammasim/simtools/issues/79"
}
|
gharchive/issue
|
Error bars
(With the danger that you take away my github account, I'll continue starting issues, even for minor stuff)
Noticed that we have to think as well about error bars.
Looking at the following plot makes me think that some are missing? Would that be difficult?
Same for effective focal length, d80, ...
I agree.
The difficulty really depends on the specifics of each plot.
In this case, the eff area comes from a fraction of photons detected/simulated. A binomial uncertainty should give what we want.
I'll implement that.
It's important to note that it won't always be so easy nor worth it. In this case, the eff area is basically flat, which makes very small fluctuations look so ugly. If the error calculation here was complicated, I'd say it's not worth it because it's in general very small.
Agree - don't put a huge amount of work into it. I didn't notice the very suppressed y-axis, which amplifies the fluctuations.
Errors bars should be wherever they make sense and can be achieved with reasonable effort.
Ok.
I think it's better to implement this ones later. I'll finish going through the documentation issues first.
I can help implement error bars here if you wish. Also, I am not sure the fluctuations are that small (1-2 m^2 is non-negligible, no?).
Orel is actually right - where does the uncertainty of 1 m^2 come from? Although this is a ray-tracing experiment, the result should be almost deterministic. Of course we should see partly the imprint of the structure / beams etc.
You don't see the imprint of the structure because we do not simulate it in sim_telarray. To include it we use the telescope transmission function, but for the LST (which is what I assume is simulated, should add a label to the plot), that function is flat.
However, shouldn't we see the imprint of the camera? That we do simulate I think.
I suggest to close this issue.
The requirement on having uncertainties on all results is a generic task and written down in the requirements and concept document.
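The binomial uncertainty mentioned above is straightforward to sketch. Assuming the effective area is a scaled detected fraction, eff = A * k/N, the standard binomial error on the fraction propagates as sigma = A * sqrt(p(1-p)/N). This is a generic sketch, not simtools' actual code:

```python
import math

def effective_area_with_error(n_detected, n_simulated, scaled_area):
    """Binomial uncertainty on an efficiency-based effective area.

    Generic sketch, not simtools' actual implementation.
    """
    p = n_detected / n_simulated
    eff_area = scaled_area * p
    # Binomial error on the detected fraction, propagated to the area.
    err = scaled_area * math.sqrt(p * (1 - p) / n_simulated)
    return eff_area, err

area, err = effective_area_with_error(900, 1000, 100.0)
print(area, round(err, 3))  # 90.0 0.949
```

As noted in the thread, for a nearly flat curve these error bars will be tiny compared to the plotted range, which is why a suppressed y-axis makes the fluctuations look worse than they are.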
|
2025-04-01T06:38:45.610812
| 2024-11-08T20:56:18
|
2645143147
|
{
"authors": [
"ganghuo2024"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6163",
"repo": "ganghuo2024/sampling",
"url": "https://github.com/ganghuo2024/sampling/pull/2"
}
|
gharchive/pull-request
|
Gang Huo - Submission of assignment 2 questionnaire design part_a
… for sampling course
What changes are you trying to make? (e.g. Adding or removing code, refactoring existing code, adding reports)
Completed all the tasks in assignment 2 (questionnaire design part a) of the sampling course.
What did you learn from the changes you have made?
Based on the scenario #1, I described the survey purpose, target population, sampling frame, and sampling strategy of the survey and designed some survey questions. I learned how to design the survey questions.
Was there another approach you were thinking about making? If so, what approach(es) were you thinking of?
Were there any challenges? If so, what issue(s) did you face? How did you overcome it?
How were these changes tested?
A reference to a related issue in your repository (if applicable)
Checklist
[ ] I can confirm that my changes are working as intended
A2 Observational units missing, please be more specific about sampling units
I don't understand what you mean by observational units missing. Can you please elaborate on that? Is my assignment 2 complete or not? Please confirm. Thanks
A2 Observational units missing, please be more specific about sampling units
Hi Amanda, I have made amendments to my answer based on your comments. Please confirm whether my assignment 2 is complete or not. Looking forward to your early reply. Thanks
|
2025-04-01T06:38:45.612342
| 2016-02-15T03:58:31
|
133618107
|
{
"authors": [
"atlithorn",
"wushaobo"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6164",
"repo": "gangverk/flask-swagger",
"url": "https://github.com/gangverk/flask-swagger/pull/24"
}
|
gharchive/pull-request
|
improve the compatibility for python 3
For python3 (I was using python 3.4), there are two cases as follows that flask-swagger throws exceptions. This commit is to fix it.
python code: swagger(app, template=a_template)
command line: flaskswagger -h
Thanks!
|
2025-04-01T06:38:45.615636
| 2018-07-27T11:19:31
|
345189091
|
{
"authors": [
"ganqqwerty",
"mavrik"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6165",
"repo": "ganqqwerty/123-Essential-JavaScript-Interview-Questions",
"url": "https://github.com/ganqqwerty/123-Essential-JavaScript-Interview-Questions/issues/45"
}
|
gharchive/issue
|
Question No.3 - Answer example incorrect
The function displayIncreasedSalary(), defined on the prototype, is trying to call the private function increaseSalary() defined in the constructor function. This doesn't work.
I don't think we need that section of code anyway
|
2025-04-01T06:38:45.672774
| 2020-09-08T13:11:56
|
695857712
|
{
"authors": [
"GSzabados",
"MimbaMonkeyHouse",
"arjen-w",
"cpuks",
"garbled1",
"gilperme",
"mr-sneezy",
"ronjtaylor",
"scooper1"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6166",
"repo": "garbled1/homeassistant_ecowitt",
"url": "https://github.com/garbled1/homeassistant_ecowitt/issues/23"
}
|
gharchive/issue
|
hassio version 115 issue
the integration will not load due to missing constant UNIT_PERCENTAGE
They have removed the UNIT_ from the constant
To fix, edit all instances of UNIT_PERCENTAGE to PERCENTAGE in the __init__.py file
When replacing all instances from UNIT_PERCENTAGE to PERCENTAGE, I get this error in the logs:
Setup failed for ecowitt: Unable to import component: cannot import name 'PERCENTAGE' from 'homeassistant.const' (/usr/src/homeassistant/homeassistant/const.py)
23:20:11 – setup.py (ERROR)
My mistake, this error appeared before I updated to the 115 beta. It disappears when upgrading from 114. I'm still left with these errors and no created entities:
2020-09-13 23:32:38 ERROR (MainThread) [aiohttp.server] Error handling request
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/aiohttp/web_protocol.py", line 418, in start
resp = await task
File "/usr/local/lib/python3.8/site-packages/pyecowitt/ecowitt.py", line 262, in handler
weather_data = self.convert_units(data_copy)
File "/usr/local/lib/python3.8/site-packages/pyecowitt/ecowitt.py", line 163, in convert_units
data["windspdkmh_avg10m"] = float(data["windspdmph_avg10m"]
TypeError: float expected at most 1 argument, got 2
2020-09-13 23:33:16 WARNING (MainThread) [custom_components.ecowitt] Unhandled sensor type maxdailygustms
I can confirm that the issue is solved with changing UNIT_PERCENTAGE to PERCENTAGE in version 0.115.
Confirm - replacing UNIT_PERCENTAGE to PERCENTAGE solved error. It's related to 0.115 as I made change on 0.114.4 prior to update and addon couldn't start.
I just updated to 115.1 and get this notification
and this is the display
Been working fine up until 115.1
I've created a Pull request to get this solved.
The suggestion above worked for my install as well. Thank you.
I've created a Pull request to get this solved.
Seems like the pull request was not successful:
File "/hacs/custom_components/hacs/repositories/integration.py", line 32, in localpath
return f"{self.hacs.core.config_path}/custom_components/{self.data.domain}"
AttributeError: type object 'HacsCore' has no attribute 'config_path'
@arjen-w, to be honest, I have no idea how to pass the HACS validator.
The changed lines of code are working fine with HASS.IO v0.115 without any issue. If you have experience with the HACS validator, then please pass your advice here, and I will look at it. Meanwhile the owner of the integration/repo has not replied to any messages, neither here nor on the Home Assistant community forum.
If we can't get a PR into this repo, can we just create a new fork (until the repo owner wants to get back into it) and move over to that?
What is the etiquette for doing that on Git?
If we can't get a PR into this repo, can we just create a new fork (until the repo owner wants to get back into it) and move over to that?
What is the etiquette for doing that on Git?
Not sure, but as long as you are referring to this project as source in your release notes and code, I personally think it won't be an issue.
On the other hand, for me everything is working after fixing things manually, so for me having a fork isn't necessary at the moment.
Sorry about this; it is now fixed in 0.3.1. I was very busy for a few months with annoying life stuff.
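For custom components that need to survive such a rename, a version-tolerant import is a common pattern. The sketch below demonstrates it with a stand-in module, since homeassistant itself isn't importable outside Home Assistant:

```python
import sys
import types

# Stand-in for homeassistant.const after the 0.115 rename (UNIT_PERCENTAGE -> PERCENTAGE).
fake_const = types.ModuleType("ha_const")
fake_const.PERCENTAGE = "%"
sys.modules["ha_const"] = fake_const

# The same import then works on both old and new versions:
try:
    from ha_const import PERCENTAGE
except ImportError:  # pre-0.115 fallback
    from ha_const import UNIT_PERCENTAGE as PERCENTAGE  # type: ignore

print(PERCENTAGE)  # %
```

In a real integration the import would of course target homeassistant.const instead of the ha_const stand-in.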
|
2025-04-01T06:38:45.676058
| 2024-03-04T14:57:48
|
2167049334
|
{
"authors": [
"oliver-goetz",
"petersutter"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6167",
"repo": "gardener/ci-infra",
"url": "https://github.com/gardener/ci-infra/pull/1274"
}
|
gharchive/pull-request
|
Enable renovate for UI and CLI repositories
/kind enhancement
What this PR does / why we need it:
With this PR the UI and CLI repositories will be watched by renovate 🤖
Which issue(s) this PR fixes:
Fixes #
Special notes for your reviewer:
Awesome, welcome 🥳
/lgtm
/approve
|
2025-04-01T06:38:45.679461
| 2022-06-22T10:11:41
|
1279900541
|
{
"authors": [
"gardener-robot-ci-3",
"himanshu-kun"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6168",
"repo": "gardener/gardener",
"url": "https://github.com/gardener/gardener/pull/6160"
}
|
gharchive/pull-request
|
[ci:component:github.com/gardener/autoscaler:v1.21.0->v1.22.0]
Release Notes:
sync the changes up to v1.22.0 of the upstream autoscaler
The IT retries scaling the pod up and down in case of conflicts. A retry of 5 times with an interval of 10 milliseconds is kept.
This needs a manual change; will update it.
The PR doing the relevant change https://github.com/gardener/gardener/pull/6163
|
2025-04-01T06:38:45.684026
| 2023-03-10T15:41:49
|
1619182169
|
{
"authors": [
"MartinWeindel"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6169",
"repo": "gardener/gardener",
"url": "https://github.com/gardener/gardener/pull/7624"
}
|
gharchive/pull-request
|
Provide value `gardener.seed.name` for controller registration helm charts.
How to categorize this PR?
/area control-plane
/kind enhancement
What this PR does / why we need it:
The value gardener.seed.identity is deprecated, as "identity" implies global uniqueness. Instead it is recommended to use the value gardener.seed.clusterIdentity. For the dns-shoot-service we would prefer to continue using the seed name as part of the DNSOwner id, as it already contains the gardener.garden.clusterIdentity to make it unique.
With introducing gardener.seed.name, we can still use the seed name for better readability.
Which issue(s) this PR fixes:
Fixes #
Special notes for your reviewer:
Related to #2851
Release note:
provide value `gardener.seed.name` for controller registration helm charts.
/invite @rfranzke
/test pull-gardener-integration
/test pull-gardener-e2e-kind-upgrade
/test pull-gardener-integration
|
2025-04-01T06:38:45.691697
| 2023-08-24T06:55:22
|
1864527074
|
{
"authors": [
"ialidzhikov",
"rfranzke",
"seshachalam-yv",
"shafeeqes"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6170",
"repo": "gardener/gardener",
"url": "https://github.com/gardener/gardener/pull/8389"
}
|
gharchive/pull-request
|
Split codegen target and further enhance generate script
How to categorize this PR?
/area dev-productivity
/kind enhancement
What this PR does / why we need it:
This PR splits the codegen target as well for various groups, and also fixes a minor bug to make it run in parallel. Now, MODE is also available for codegen target.
Invalid options for codegen are skipped.
It is possible to pass any folder path for manifests target.
❯ make generate PRINT_HELP=y
# Usage: make generate [WHAT="<targets>"] [MODE="<mode>"] [CODEGEN_GROUPS="<groups>"] [MANIFESTS_FOLDERS="<folders>"]
#
# Options:
# WHAT - Specify the targets to run (e.g., "protobuf codegen manifests logcheck gomegacheck monitoring-docs")
# CODEGEN_GROUPS - Specify which groups to run the 'codegen' target for, not applicable for other targets (e.g., "authentication core extensions resources operator seedmanagement operations settings operatorconfig controllermanager admissioncontroller scheduler gardenlet resourcemanager shoottolerationrestriction shootdnsrewriting provider_local extensions_config")
# MANIFESTS_FOLDERS - Specify which folders to run the 'manifests' target in, not applicable for other targets (Default folders are "charts cmd example extensions imagevector pkg plugin test")
# MODE - Specify the mode for the 'manifests' or 'codegen' target (e.g., "parallel" or "sequential")
#
# Examples:
# make generate
# make generate WHAT="codegen protobuf"
# make generate WHAT="codegen protobuf" MODE="sequential"
# make generate WHAT="manifests" MANIFESTS_FOLDERS="pkg/component plugin" MODE="sequential"
# make generate WHAT="codegen" CODEGEN_GROUPS="core extensions"
# make generate WHAT="codegen manifests" CODEGEN_GROUPS="operator controllermanager" MANIFESTS_FOLDERS="charts example/provider-local"
#
Which issue(s) this PR fixes:
Fixes #
Special notes for your reviewer:
/cc @timuthy @seshachalam-yv
Release note:
NONE
/test pull-gardener-unit
/test pull-gardener-e2e-kind-upgrade
/test pull-gardener-e2e-kind-ha-multi-zone
/test pull-gardener-e2e-kind-ha-multi-zone
/hold
/test pull-gardener-unit
@seshachalam-yv I have reworked this PR to allow passing any folder to manifests target. Can you PTAL?
/test pull-gardener-unit
/unhold
ping @seshachalam-yv
/assign
PTAL! @ary1992 @ialidzhikov
|
2025-04-01T06:38:45.695116
| 2023-03-05T02:48:56
|
1610031095
|
{
"authors": [
"garfias06"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6171",
"repo": "garfias06/prework-study-guide",
"url": "https://github.com/garfias06/prework-study-guide/issues/5"
}
|
gharchive/issue
|
Deployment
# DEPLOYMENT
## User Story
As a boot camp student,
I want the prework notes to be structured on a webpage,
So that I can easily find and read the information.
# Acceptance Criteria
GIVEN a Prework Study Guide website,
WHEN I visit the website using the URL,
THEN I can access my website from any browser
Deployment completed through GitHub Pages
|
2025-04-01T06:38:45.697796
| 2020-09-25T12:32:45
|
708899572
|
{
"authors": [
"dev-event",
"garganurag893"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6172",
"repo": "garganurag893/react-native-phone-number-input",
"url": "https://github.com/garganurag893/react-native-phone-number-input/issues/8"
}
|
gharchive/issue
|
Update props
Thanks for the library. Excellent. Please see all props and add them to the documentation. For instance - textContainerStyle;
Add the ability to indicate your own icon (an arrow next to the flag);
Ability to change the size of the flag
Thanks! package 5 stars
Thanks for the advice. Kindly check v1.1.0 for new updates.
|
2025-04-01T06:38:45.705810
| 2024-12-12T16:21:08
|
2736350573
|
{
"authors": [
"Caladius",
"garrettjoecox"
],
"license": "CC0-1.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6173",
"repo": "garrettjoecox/2ship2harkinian",
"url": "https://github.com/garrettjoecox/2ship2harkinian/pull/60"
}
|
gharchive/pull-request
|
Add Tingle Map Shuffle
Shuffles Tingle and his Maps into the Pool.
Adds CAN_USE_PROJECTILE Logic Check to the list.
Adds TWIN_ISLANDS to the Regions List
Renamed Dungeon Map RI's since overworld maps are listed as such.
Build Artifacts
2ship-linux.zip
2ship-mac.zip
2ship-windows.zip
Merged
|
2025-04-01T06:38:45.719631
| 2021-11-25T16:12:57
|
1063761262
|
{
"authors": [
"Amerousful",
"slandelle"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6174",
"repo": "gatling/gatling",
"url": "https://github.com/gatling/gatling/issues/4170"
}
|
gharchive/issue
|
[Netty] Failed in the runtime
After upgrading to version 3.7.1 I catch this error during launch. The error appears after choosing the simulation and filling in the description.
I tried to do this:
changing different JDKs (8, 11)
getting the demo project and launching it there
Error log:
computerdatabase.BasicSimulation is the only simulation, executing it.
Select run description (optional)
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x0000000133fb11f2, pid=1636, tid=5635
#
# JRE version: OpenJDK Runtime Environment AdoptOpenJDK-11.0.11+9 (11.0.11+9) (build 11.0.11+9)
# Java VM: OpenJDK 64-Bit Server VM AdoptOpenJDK-11.0.11+9 (11.0.11+9, mixed mode, tiered, compressed oops, g1 gc, bsd-amd64)
# Problematic frame:
# C [libnetty_tcnative_osx_x86_6417779003216119739014.dylib+0x1671f2] __isPlatformOrVariantPlatformVersionAtLeast.cold.1+0x152
#
# No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /Users/pavelbairov/IdeaProjects/gatling-maven-plugin-demo-scala/hs_err_pid1636.log
#
# If you would like to submit a bug report, please visit:
# https://github.com/AdoptOpenJDK/openjdk-support/issues
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
Process finished with exit code 134 (interrupted by signal 6: SIGABRT)
And there is full log
hs_err_pid1636.log
As you can see from the log, the problem is somewhere in the netty library itself...
Weird, I don't have any issue.
% uname -a
Darwin Stephanes-MacBook-Pro 20.5.0 Darwin Kernel Version 20.5.0: Sat May 8 05:10:33 PDT 2021; root:xnu-7195.121.3~9/RELEASE_X86_64 x86_64
% java -version
openjdk version "1.8.0_312"
OpenJDK Runtime Environment (Zulu <IP_ADDRESS>-CA-macosx) (build 1.8.0_312-b07)
OpenJDK 64-Bit Server VM (Zulu <IP_ADDRESS>-CA-macosx) (build 25.312-b07, mixed mode)
Could you please try a different OpenJDK build than the ones from AdoptOpenJDK, eg Zulu?
If you still experience an issue, please provide your uname -a and your java -version.
No issue either with Java 11:
% java -version
openjdk version "11.0.13" 2021-10-19 LTS
OpenJDK Runtime Environment Zulu11.52+13-CA (build 11.0.13+8-LTS)
OpenJDK 64-Bit Server VM Zulu11.52+13-CA (build 11.0.13+8-LTS, mixed mode)
Nor Java 17:
% java -version
openjdk version "17.0.1" 2021-10-19 LTS
OpenJDK Runtime Environment Zulu17.30+15-CA (build 17.0.1+12-LTS)
OpenJDK 64-Bit Server VM Zulu17.30+15-CA (build 17.0.1+12-LTS, mixed mode, sharing)
uname -a
Darwin bairov.local 19.0.0 Darwin Kernel Version 19.0.0: Thu Oct 17 16:17:15 PDT 2019; root:xnu-6153.41.3~29/RELEASE_X86_64 x86_64
I tried Zulu. Also Corretto, AdoptOpenJDK
Something strange. Ok, I will try JDK 17 and then maybe try to update my OS
Yeah, your MacOS version is pretty old, that could be the issue.
Still, if that fixes your issue, it might be worth reporting it to Netty.
Great!
|
2025-04-01T06:38:45.722004
| 2017-05-26T14:59:39
|
231651502
|
{
"authors": [
"joemeszaros",
"slandelle"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6175",
"repo": "gatling/gatling",
"url": "https://github.com/gatling/gatling/pull/3309"
}
|
gharchive/pull-request
|
Fix feeder shuffle description
Problem
Feeder shuffle description had a typo:
shuffle entries, then behave live queue
Solution
Change live -> like
Thanks! I had already fixed this on our private branch, but I'll cherry-pick yours instead so you're credited with the commit :)
:-) thanks
cherry-picked
|
2025-04-01T06:38:45.835790
| 2020-07-30T08:33:25
|
668492312
|
{
"authors": [
"kishan-tocca"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6176",
"repo": "gatteo/react-jitsi",
"url": "https://github.com/gatteo/react-jitsi/issues/20"
}
|
gharchive/issue
|
CONFERENCE FAILED: conference.authenticationRequired
It throws this error; I didn't find a solution.
<Jitsi
roomName={roomName}
displayName={displayName}
password={password}
config={{ startWithAudioMuted: true, startWithVideoMuted: true }}
interfaceConfig={{ filmStripOnly: false }}
containerStyle={{ width: '100%', height: '100%' }}
onAPILoad={handleAPI}
domain={process.env.REACT_APP_JITSI_SERVER}
/>
i m using react-jitsi library
sorry for the noise, it was a mistake in my displayName and password
|
2025-04-01T06:38:45.839485
| 2021-10-29T13:54:11
|
1039594249
|
{
"authors": [
"GaiaB0t",
"JamesGreen31",
"adrisj7"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6177",
"repo": "gaucho-matrero/altoclef",
"url": "https://github.com/gaucho-matrero/altoclef/issues/125"
}
|
gharchive/issue
|
ignored gear
Bug Description
the bot ignores already provided netherite gear and makes iron gear for some reason, on all modes
Steps to Reproduce (as best as you can)
have netherite gear, do a command like @gamer or something
Expected Behavior
utilize the provided gear
Actual Behavior
completely ignore the provided gear and make iron tools
Crashlogs and Screenshots (if applicable)
Fix Idea (Personal Noes for when I fix this later)
gearSatisfied(Slot) -> returns whether a gear of a certain material or better is in that slot. From now on use that whenever checking for gear equipment.
EquipArmorOrBetterTask -> Keep EquipArmorTask, but have this extra option where it will equip the best armor in our inventory and use gearSatisfied to determine if it has enough gear to equip.
btw this applies to tools also
I believe that this has been applied for tools. Possibly armor.
|
2025-04-01T06:38:45.872718
| 2021-02-24T07:56:10
|
815222371
|
{
"authors": [
"MortenHofft"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6178",
"repo": "gbif/hosted-portals",
"url": "https://github.com/gbif/hosted-portals/issues/26"
}
|
gharchive/issue
|
feat: add option to link from top level menu items
From the mailing list
Is it possible to add a landing page to the name of the dropdown menu?
It is not possible. Simply because I find it a bit confusing to have such a menu, but my limited capacity shouldn't hinder others from doing so.
This needs more testing and consideration. It is a bit too easy to create nonsense menus.
Mobile support needs thinking through and probably also some guidelines on how to do proper landing pages for such top level links
|
2025-04-01T06:38:45.877872
| 2017-02-23T16:01:35
|
209803966
|
{
"authors": [
"kbraak",
"nestorjal"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6179",
"repo": "gbif/ipt",
"url": "https://github.com/gbif/ipt/issues/1323"
}
|
gharchive/issue
|
Failed to update DOI
IPT version: 2.3.3-r01
When trying to publish a new version of a dataset the following message appears:
Publication log:
An error of type DOI was encountered during publishing: Failed to update doi:10.15472/xcpipy metadata: HTTP 500:
Restored version #2.0 of resource bios_microorg_seaflower_2016 after publishing failure 3:58:38 PM
I've already checked both data and metadata for inconsistencies but they are fine.
Any advice?
Thanks Nestor,
As per DataCite's API documentation when encountering a 500 server internal error, you should "try later and if problem persists please contact us". Can you please try again and report back if it works this time?
Otherwise, I'd recommend turning on debugging mode in your IPT settings and looking for more detailed information in your IPT logs.
By the way, it says your IPT is based on version "2.3.3-r01" but the officially released version is "2.3.3-rdb4ab13". I should raise again the warning in https://github.com/gbif/ipt/issues/1319#issuecomment-280310595 regarding using non-officially released versions in production.
If you are just customizing the style there is no problem of course. Here is one problem with the re-styling that I noticed by the way:
Thanks Kyle, today finally the publication was successful. I'll check the issues with the re-styling.
Wonderful, glad to hear that. Thanks for this feedback.
I will close this issue then.
|
2025-04-01T06:38:45.930164
| 2022-11-19T21:00:24
|
1456658436
|
{
"authors": [
"alordash",
"gbj"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6180",
"repo": "gbj/leptos",
"url": "https://github.com/gbj/leptos/issues/96"
}
|
gharchive/issue
|
Fetch demo: cats count input on:input callback is not working
I used code from fetch example.
When running this example locally, nothing changes when I change number of cats.
However, if I add on:input callback to input's label, then it works:
<label on:input=move |ev| {
let val = event_target_value(&ev).parse::<u32>().unwrap_or(0);
set_cat_count(val);
}>
"How many cats would you like?"
<input type="number"
value={move || cat_count.get().to_string()}
/>
</label>
Removing label also fixes callback not firing.
Hm... I'm not able to reproduce this.
I'm going to guess that wrapping "How many cats" in a <span> also fixes it? Is that correct?
Are you running this within the repository, or as a separate example and if so, using what Leptos version? (If you're running locally using leptos = "0.0" is it 0.0.16 or 0.0.17? Is it fixed if you update to 0.0.17 or use a git dependency on the repo, i.e., leptos = { git = "https://github.com/gbj/leptos" } in your Cargo.toml)
I was running it using leptos = "0.0.16". Switching to 0.0.17 actually fixed the issue!
Perfect! Yeah 0.0.16 had a rendering bug unfortunately.
|
2025-04-01T06:38:45.932733
| 2022-06-14T07:40:10
|
1270389157
|
{
"authors": [
"gbouras13",
"ronepz"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6181",
"repo": "gbouras13/pharokka",
"url": "https://github.com/gbouras13/pharokka/issues/103"
}
|
gharchive/issue
|
Phrokka gff contig name
phrokka version: 0.1.0
Python version: 3.9
Operating System: MacOS
Description
Small thing, but I noticed that in the .gff output the names for the CDS and the tRNA hits are different (they seem trimmed).
Should be fixed with v0.1.7 removal of hhsuite
|
2025-04-01T06:38:45.955539
| 2017-07-22T07:18:17
|
244830216
|
{
"authors": [
"SimonMeskens",
"gcanti"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6182",
"repo": "gcanti/fp-ts",
"url": "https://github.com/gcanti/fp-ts/pull/167"
}
|
gharchive/pull-request
|
add Distributive functor
This PR provides the notion that is categorically dual to Traversable.
A Distributive Functor is one that you can push any functor inside of.
distribute :: (Functor f, Distributive g) => f (g a) -> g (f a)
Compare this with the corresponding Traversable notion, sequence.
sequence :: (Applicative f, Traversable g) => g (f a) -> f (g a)
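As a concrete sketch in TypeScript (the function name and shape are mine, not fp-ts exports): functions from a fixed input R form a Distributive functor, so any functor, here Array, can be pushed inside it.

```typescript
// Hypothetical example of distribute, specialised to the Array functor and
// the reader functor (r: R) => A.
// distributeArray : Array<(r: R) => A> -> (r: R) => Array<A>
const distributeArray =
  <R, A>(fns: Array<(r: R) => A>) =>
  (r: R): Array<A> =>
    fns.map((f) => f(r));

// An array of readers becomes a single reader producing an array.
const readers = [(n: number) => n + 1, (n: number) => n * 2];
const combined = distributeArray(readers);
// combined(3) yields [4, 6]
```

This is the dual flow to sequence: instead of pulling an Applicative outward through a Traversable, the Distributive structure is pulled outward through an arbitrary Functor.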
Is there a reason why Distributive and Traversable don't have corresponding fantasy interfaces?
AFAIK FantasyTraversable does exist. FantasyDistributive seems not possible since the outer type is HKT<G, _>. What do you propose?
The spec for fantasy-land Traversable is here:
https://github.com/fantasyland/fantasy-land#traversable
If I understand correctly, Distributive would just be a reversion of the arrows?
AFAIK FantasyTraversable does exist
I mean that is already defined in fp-ts https://github.com/gcanti/fp-ts/blob/ee67de4b897cb9a5b87be9d39cee6751a1353251/src/Traversable.ts#L11
The signature
distribute :: (Functor f, Distributive g) => f (g a) -> g (f a)
suggests that adding a method to the prototype of Distributive g is useless since the value at hand is a f (g a)
Oh my bad, right, it's FantasyFilterable that doesn't exist, I looked at the wrong file.
I'm not sure I understand how your version of Distributive works yet, I'll play around with it a bit tomorrow and look if I can come up with a sensible Fantasy version.
it's FantasyFilterable that doesn't exist
Ah you are right (and Witherable as well). Would you like to send a PR for them? Writing the Fantasy* instances for Filterable and Witherable should make clear why I think it is not possible to do the same for Distributive: both the missing definitions involve a HKT<F, A> value where F is the "main" type parameter of the interfaces, while in Filterable the main type parameter is F but the involved value is G-parametrised. Does it make sense?
Ah, cool, yeah, I was finishing up one of the Immutable collections yesterday and ran into this part of the library. I'll have a look if I can solve the conundrum.
You are correct.
I looked up all the theory behind it (and why a monadic operation such as distribute or traverse only needs, respectively, a Functor or an Applicative) and I get it now. Without extension methods in TypeScript, you cannot attach the Distributive to anything. You can always write a Fantasy type of a dual, unless that dual forces the focus type into the output slot. Makes total sense. There's no hidden way to express it that would give rise to a Fantasy type.
|
2025-04-01T06:38:45.985972
| 2019-08-27T17:18:58
|
485933858
|
{
"authors": [
"LabanSkollerDefensify",
"joshbarth",
"mifriis"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6183",
"repo": "gchq/CyberChef",
"url": "https://github.com/gchq/CyberChef/issues/624"
}
|
gharchive/issue
|
Bug report: JWT Verify doesn't require an algorithm
As detailed here, JWT verification functions should require specifying the algorithm that should have been used, in order to prevent an attacker from changing the algorithm to a symmetric algorithm from an asymmetric one and using the public key to sign the token. Probably low priority for this particular app, but it would be good to at least have the option.
Hi Josh,
Correct me if I am wrong or misunderstand you, but isn't the problem you link to a server-side issue, where developers simply trust whatever alg the token specifies? Removing "none" from CyberChef won't help the problem. Servers need to interpret the tokens with the alg they write the tokens with.
Having the "none" alg there is a great tool for testing your implementation of JWT verification, to catch the problems mentioned in the link.
I don't mean removing the none algorithm, I mean allowing the user to specify which algorithm is expected and raising an error if the JWT defines it differently. The relevant part of the article is this: https://auth0.com/blog/critical-vulnerabilities-in-json-web-token-libraries/#RSA-or-HMAC-. Specifically "If a server is expecting a token signed with RSA, but actually receives a token signed with HMAC, it will think the public key is actually an HMAC secret key." Basically, the solution is, rather than trusting the JWT alg field, to allow the user to define the algorithm. Hope that makes more sense.
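For context, the fix being asked for pins the algorithm on the verifier's side. A standard-library-only Python sketch (illustrative only: this is not CyberChef's code and no substitute for a maintained JWT library):

```python
import base64
import hashlib
import hmac
import json

def _b64url_decode(part: str) -> bytes:
    # JWTs use unpadded base64url; restore the padding before decoding.
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def verify_hs256(token: str, secret: bytes) -> dict:
    """Verify a JWT while pinning the algorithm to HS256."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    header = json.loads(_b64url_decode(header_b64))
    # Never let the token's own "alg" field choose the verification routine:
    # reject anything other than the algorithm the caller expects.
    if header.get("alg") != "HS256":
        raise ValueError(f"unexpected algorithm: {header.get('alg')!r}")
    expected = hmac.new(
        secret, f"{header_b64}.{payload_b64}".encode(), hashlib.sha256
    ).digest()
    if not hmac.compare_digest(expected, _b64url_decode(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(_b64url_decode(payload_b64))
```

A token whose header claims "alg": "none" or "RS256" is rejected before any signature logic runs, which closes the HMAC/RSA confusion attack for this verifier.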
Yeah, that would be good especially since the JWT header isn't shown at all. I recommend using https://jwt.io/ for playing with JWT tokens.
|
2025-04-01T06:38:45.986978
| 2017-07-06T07:38:53
|
240869851
|
{
"authors": [
"p013570"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6184",
"repo": "gchq/Gaffer",
"url": "https://github.com/gchq/Gaffer/issues/1044"
}
|
gharchive/issue
|
Add example of applying validation to multiple properties in a schema
This should be added to the Schema walkthrough in the dev guide.
Merged into develop.
|
2025-04-01T06:38:45.988312
| 2017-11-03T13:13:02
|
270973372
|
{
"authors": [
"ac74475",
"gaffer01"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6185",
"repo": "gchq/Gaffer",
"url": "https://github.com/gchq/Gaffer/issues/1493"
}
|
gharchive/issue
|
Parquet Store generates too many files for HDFS to handle when importing RDD
There is an issue where importing an RDD with lots of partitions, while wanting to split the data into lots of files per group, will in the worst case cause (number of partitions × number of files per group) files to be generated, which HDFS will struggle to handle.
This bug was fixed during the major rewrite of the Parquet store in #1884 .
|
2025-04-01T06:38:45.990504
| 2023-11-16T15:14:51
|
1997097085
|
{
"authors": [
"tb06904"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6186",
"repo": "gchq/Gaffer",
"url": "https://github.com/gchq/Gaffer/pull/3099"
}
|
gharchive/pull-request
|
Gh-3098 Improve Testing of GafferEntityGenerator
Improves tests for the GafferEntityGenerator; coverage is now > 80%.
Small tweak to the main class so it checks for null values and uses easier-to-read lambdas.
Related issue
Resolve #3098
You've added null checks for the vertex properties, but this is showing as untested.
I haven't added them; they were in the existing logic. Although, looking at the code a bit more, it is impossible to actually make a GafferPop Edge or Vertex with a property that has a null key, as it is validated when added; also, the properties are technically stored in a HashMap, which fundamentally does not allow it. Therefore these checks can likely just be removed, as they are adding nothing.
|
2025-04-01T06:38:45.993375
| 2019-10-08T10:51:08
|
503971006
|
{
"authors": [
"CLAassistant",
"dev930018"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6187",
"repo": "gchq/Palisade",
"url": "https://github.com/gchq/Palisade/pull/476"
}
|
gharchive/pull-request
|
PAL 185 - Added .clone() .equals() .hashCode() methods to hr-generator types
While not likely necessary for production, this is required for testing to validate (a lack of) changes to records after applying Rules
Thank you for your submission, we really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.You have signed the CLA already but the status is still pending? Let us recheck it.
Held back for a bit by a JUnit bug, be aware if using Theories in the future
https://github.com/junit-team/junit4/issues/1629#issue-504602826
|
2025-04-01T06:38:45.994530
| 2019-05-13T07:48:07
|
443252704
|
{
"authors": [
"d47853",
"p013570"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6188",
"repo": "gchq/gaffer-tools",
"url": "https://github.com/gchq/gaffer-tools/issues/728"
}
|
gharchive/issue
|
Allow an operation chain to be executed on load of the UI
Create a url query param 'preQuery' that a user/system could set to invoke an operation chain when the UI first loads. This would allow click through from other apps and queries to be shared between users.
Merged into develop
|
2025-04-01T06:38:45.996066
| 2018-12-13T13:10:54
|
390672163
|
{
"authors": [
"stroomdev10",
"stroomdev66"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6189",
"repo": "gchq/stroom",
"url": "https://github.com/gchq/stroom/issues/1024"
}
|
gharchive/issue
|
SupersededOutputHelper is not initialised
20181213.txt
For some reason the ReferenceDataLoad pipeline is trying to write an output stream. This isn't expected to be the case.
The SupersededOutputHelper will no longer check that output is superseded for pipelines that exist outside of normal processing.
Change will be available in v6.0-beta.21 onwards
|
2025-04-01T06:38:46.020925
| 2024-09-25T11:16:49
|
2547701311
|
{
"authors": [
"dhavalmnjtech",
"gdelataillade"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6191",
"repo": "gdelataillade/alarm",
"url": "https://github.com/gdelataillade/alarm/issues/244"
}
|
gharchive/issue
|
How to check for alarm stops when stopping from notification stop button? and onTap of Notification.
Alarm plugin version
4.0.0-dev.3
How do I check whether my alarm was stopped from the notification stop button?
In my app I use Alarm.stop(id); to stop the alarm, so after calling it I can run some follow-up code.
But what about stopping the alarm from the notification button? There I need a function to be called when the alarm is stopped from the notification.
My suggestion: provide an onStop method that fires when the alarm is stopped from the notification stop button.
With a method like this, everyone could perform some task when the alarm is stopped via the notification's stop button.
An onTap for the notification is also needed.
With an onTap for the notification, the user could perform a task accordingly.
Additional context
For my app I have to show a non-dismissible notification when the user stops the alarm. This is possible when I use Alarm.stop(id);, but I need it to work from the notification stop button as well. I used flutter_local_notifications for the non-dismissible notification.
I also want to open the alarm screen when the notification is tapped, so onTap would be useful for that.
Hi @dhavalmnjtech
Thanks for your interest in the package.
I'll consider adding an onStop callback for when the alarm is stopped from the notification and an onTap callback for when the app is opened from the notification. I'll keep you updated here.
Hello @gdelataillade
I was working on it but faced a technical limitation on Android. If someone with Kotlin experience could help me with this PR #275, it would be great!
|
2025-04-01T06:38:46.097616
| 2019-12-26T17:10:54
|
542609411
|
{
"authors": [
"felixhaeberle",
"gearsdigital"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6192",
"repo": "gearsdigital/kirby-reporter",
"url": "https://github.com/gearsdigital/kirby-reporter/pull/21"
}
|
gharchive/pull-request
|
update composer.json to fix install in site/plugins
update composer.json to fix install in site/plugins
Thank you very much for fixing this issue. This will make the plugin even more convenient 👍
|
2025-04-01T06:38:46.099414
| 2011-11-20T18:16:00
|
2296398
|
{
"authors": [
"joshsz"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6193",
"repo": "ged/rspec-formatter-webkit",
"url": "https://github.com/ged/rspec-formatter-webkit/issues/1"
}
|
gharchive/issue
|
Fix for a MatchData issue
I was getting an exception using this formatter:
rspec-formatter-webkit/lib/rspec/core/formatters/webkit.rb:176:in `expand_path': can't convert nil into String (TypeError)
from rspec-formatter-webkit/lib/rspec/core/formatters/webkit.rb:176:in `block in backtrace_line'
The issue (for me at least) was that gsub wasn't matching the line appropriately and wasn't generating the correct MatchData variables ($1, $2).
I fixed the .gsub syntax that was not parsing lines correctly and switched to .match instead, which cleared up my issue.
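A minimal sketch of the safer pattern (hypothetical code, not the formatter's actual implementation): use String#match and bail out on nil instead of relying on $1/$2 being set.

```ruby
# Parse "path:line" out of a backtrace line; return nil when it doesn't match,
# so callers never hand a nil path to File.expand_path.
def backtrace_parts(line)
  m = line.match(/^([^:]+):(\d+)/)
  return nil unless m
  [File.expand_path(m[1]), Integer(m[2])]
end
```

Checking the MatchData object directly avoids the implicit global state that made the original version fragile on non-matching lines.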
nevermind, I was dumb and didn't see what all this is actually doing. I've lost some functionality in this commit. I'll rework it, sorry about that.
So weird, I can't reproduce the error now. Ah well, guess I was jumping at shadows :)
|
2025-04-01T06:38:46.137387
| 2024-03-17T16:19:07
|
2190747774
|
{
"authors": [
"crjaensch",
"iorisa"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6194",
"repo": "geekan/MetaGPT",
"url": "https://github.com/geekan/MetaGPT/issues/1022"
}
|
gharchive/issue
|
Generated Python code for Game of 2048 largely inconsistent
Bug description
I tried to create the Game of 2048 as instructed in the Intro section and I used the OpenAI model "gpt-4-turbo-preview". I set up MetaGPT using conda on my macOS system, using Python 3.9. I initialized the config to use OpenAI with a valid API-KEY and the best model available (see above).
Unfortunately, the generated code is not self-consistent. MetaGPT created multiple Python files and the generated code looked superficially good. But when trying to run the main.py code, many errors were raised. Most errors are the result of code references to Python constants, methods, or constructors of generated code that did not exist or were referenced with an incorrect name.
The errors are too numerous to fix. I have attached the generated Python code for further details. I renamed all Python files to use the filename extension '.txt', since an upload of Python files seems to be prohibited by Github.
main.txt
game.txt
ui.txt
logic.txt
constants.txt
Let me add that I tried the --code-review as well as the -no-code-review option. Moreover, the --run-tests option does not seem to work. Here I would have expected that MetaGPT recognizes that there are errors and tries to incrementally fix the errors through revised code generation trials.
About the generated code not being self-consistent: the LLM does not always produce a good design and then write good code. I suggest you retry if it failed.
About the --run-tests option not working: the value of --n-round is too small. --n-round is a safety valve that terminates the inference if the number of inference rounds exceeds it. Unit testing will not start until the code is successfully written. The QA not running indicates that the code has not been successfully written.
Thanks for the specific suggestions on how to get better results with MetaGPT. I will certainly try the suggestions.
|
2025-04-01T06:38:46.139815
| 2024-05-08T06:58:28
|
2284834133
|
{
"authors": [
"Anxhul10",
"NitkarshChourasia",
"yanliutafewa"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6195",
"repo": "geekcomputers/Python",
"url": "https://github.com/geekcomputers/Python/issues/2180"
}
|
gharchive/issue
|
Wish to add a program that convert wind directions in degrees to compass directions.
Hi, Repo Owner,
I would like to add a program to this repo. The program converts a wind direction in degrees, like 44.5, to a compass direction like NE.
Would you allow me?
Best,
Yan
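The proposed program could be sketched like this (the names are mine, not from the issue; it assumes a 16-point compass where each point covers a 22.5° sector):

```python
# Hypothetical sketch: map a wind direction in degrees to one of the 16
# standard compass points.
DIRECTIONS = [
    "N", "NNE", "NE", "ENE", "E", "ESE", "SE", "SSE",
    "S", "SSW", "SW", "WSW", "W", "WNW", "NW", "NNW",
]

def degrees_to_compass(degrees: float) -> str:
    """Each compass point is centred on a multiple of 22.5 degrees."""
    index = round(degrees / 22.5) % 16
    return DIRECTIONS[index]
```

For example, 44.5° falls in the sector centred on 45°, so it maps to "NE"; values near 360° wrap back to "N".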
Hi, can you assign me this issue?
You both are assigned the task.
@yanliutafewa @Anxhul10
Submit a PR for review and tag me along.
It will be merged.
Thank you,
@NitkarshChourasia
|
2025-04-01T06:38:46.145651
| 2024-05-15T11:53:35
|
2297700507
|
{
"authors": [
"NitkarshChourasia",
"Xceptions"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6196",
"repo": "geekcomputers/Python",
"url": "https://github.com/geekcomputers/Python/pull/2196"
}
|
gharchive/pull-request
|
Adding a Search Engine to the repo
This PR contributes a search engine that can be used to search for documents that contain a search term
Hi, @geekcomputers , kindly see that @NitkarshChourasia left a comment for me to squash some commits, I planned on working on that this weekend, but I see that you have merged the PR regardless. Will the squashing still be needed?
No
It is to reduce the number of commits, so as to handle it in future. If needed.
Next time try to squash them when the commits are huge in numbers as per to the features introduced. @Xceptions
Thank you,
@NitkarshChourasia
okay thank you.
Looking to make more contributions soon
sure!
|
2025-04-01T06:38:46.158617
| 2021-03-08T07:06:49
|
824250445
|
{
"authors": [
"Aayush-hub",
"geekquad"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6197",
"repo": "geekquad/Image-Processing-OpenCV",
"url": "https://github.com/geekquad/Image-Processing-OpenCV/issues/24"
}
|
gharchive/issue
|
Add Github Action Welcome Bot
Add Github action greeting bot which greets first time contributors with a welcome message and message to follow contributing guidelines.
@geekquad @kritikaparmar-programmer Can I look into this issue?
Sure @Aayush-hub. Go ahead!
@geekquad @kritikaparmar-programmer Waiting for my #23 PR to be merged. Will make a PR soon after that solving this issue!
Hey @Aayush-hub, you can make a new branch for the same and continue your work.
@geekquad Done :)
|
2025-04-01T06:38:46.165356
| 2016-01-08T20:48:12
|
125690775
|
{
"authors": [
"geerlingguy",
"lihop"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6199",
"repo": "geerlingguy/ansible-for-devops",
"url": "https://github.com/geerlingguy/ansible-for-devops/issues/19"
}
|
gharchive/issue
|
nodejs: forever is not installed
Following the nodejs example I get the error:
TASK [Check list of running Node.js apps.] *************************************
fatal: [default]: FAILED! => {"changed": false, "cmd": "forever list", "failed": true, "msg": "[Errno 2] No such file or directory", "rc": 2}
Further invocations of vagrant provision produce the same error and interestingly the "Install Forever (to run our Node.js app)" task is marked as changed every time. I logged into the machine and found out forever wasn't being installed at all.
Changing the "Install Forever" task so that state=present rather than state=latest fixes the problem.
However, my understanding is that state=latest should also ensure the package is installed so maybe this is a bug in the ansible npm module?
I am using Ansible version: 2.0.0.
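The workaround described above would look roughly like this in the playbook (a sketch; the task name and npm arguments are assumed from the book's example, not copied from it):

```yaml
# Hypothetical Ansible task: state=present only ensures the package is
# installed, avoiding the buggy state=latest path in the 2.0.0 npm module.
- name: Install Forever (to run our Node.js app).
  npm: name=forever global=yes state=present
```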
@lihop - Yes, state=present should be preferred, and should work fine... it looks like there's a chance the npm module has a bug in 2.0.0. Note that it's still pre-release software—you may want to report this bug in the Ansible project's queue.
Thanks for confirming @geerlingguy. I've downgraded to ansible 1.9.4 and the playbook runs fine as is. I was using Ansible 2.0.0 because that is the version shown in the "Installing Ansible" section of the book. I didn't realize the software was pre-release, so I will look into making a bug report.
Looks like the bug has already been reported here: ansible/ansible-modules-extras#1375
Thanks for the update!
|
2025-04-01T06:38:46.208149
| 2020-04-16T20:58:48
|
601466785
|
{
"authors": [
"geirev",
"rafaeljmoraes"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6200",
"repo": "geirev/EnKF_seir",
"url": "https://github.com/geirev/EnKF_seir/pull/3"
}
|
gharchive/pull-request
|
Adapts python plotting functionality to adhere to new tecplot output files
Hi Geir,
this PR adapts the python code to your new tecplot files and fixes Peter issue.
Cheers,
Rafael
Very good. Peter will be testing it :-)
Thanks a lot for contributing.
|
2025-04-01T06:38:46.230868
| 2023-09-01T17:45:11
|
1877814327
|
{
"authors": [
"BobbyRBruce",
"abmerop",
"mkjost0"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:6201",
"repo": "gem5/gem5",
"url": "https://github.com/gem5/gem5/pull/256"
}
|
gharchive/pull-request
|
misc: Add LULESH GPU tests
This adds the LULESH tests, which currently run successfully, though they still use the gem5-resources directory to build the binary used as opposed to using a pre-built binary
Change-Id: I91c511fe92b7f9d11dfb027f435573f826bc6714
Thanks for helping to move the GPU weekly tests out of the bash script! This looks fine for GCN3. One thing I would like though is to test the Vega ISA as well. I realize that will take more resources. @mattsinc and I have been planning to discuss in detail if GCN3 can be deprecated. I would also like to see Vega as part of the ALL build, which would further reduce compilation for testing, I think (not sure if each yaml file is rebuilding).
I have many more detailed thoughts on moving to Vega but it is probably better for a discussions thread rather than this PR.
Yes, this is something we can look into! I think if we do that, we can put it into a separate PR so we can update all the GPU tests to use Vega at the same time after we ensure it works locally first, unless you have other thoughts on that.
thanks! Please let @abmerop confirm before you merge in though.
done
FYI, when this is good to go (soon, I promise), i'll do a merge commit with this one so it'll be added to develop as one commit.
Just giving a heads up before people start to notice all the touch-up commits i've been doing.
Hi @Harshil2107 , I was actually just looking at this and was wondering how this command was even working. It seems it's not. I will post a comment on what I suspect is the issue.
|