| added (string, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | created (timestamp[us], 2001-10-09 16:19:16 to 2025-01-01 03:51:31) | id (string, length 4 to 10) | metadata (dict) | source (string, 2 classes) | text (string, length 0 to 1.61M) |
|---|---|---|---|---|---|
2025-04-01T04:54:40.820955
| 2021-10-09T23:16:12
|
1021830375
|
{
"authors": [
"DesmondTo",
"codecov-commenter"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13039",
"repo": "AY2122S1-CS2103T-W10-4/tp",
"url": "https://github.com/AY2122S1-CS2103T-W10-4/tp/pull/62"
}
|
gharchive/pull-request
|
Update JavaDoc and resolve Java coding style violation
Fix #62
Update JavaDoc for isName() method in StringUtil.java
Use Egyptian style for curly bracket
Codecov Report
Merging #62 (e504db1) into master (c8f6c04) will not change coverage.
The diff coverage is 100.00%.
@@ Coverage Diff @@
## master #62 +/- ##
=========================================
Coverage 68.71% 68.71%
Complexity 424 424
=========================================
Files 75 75
Lines 1397 1397
Branches 156 156
=========================================
Hits 960 960
Misses 387 387
Partials 50 50
Impacted Files | Coverage Δ
...in/java/seedu/address/commons/util/StringUtil.java | 95.45% <ø> (ø)
...rc/main/java/seedu/address/model/ModelManager.java | 95.83% <100.00%> (ø)
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c8f6c04...e504db1. Read the comment docs.
|
2025-04-01T04:54:40.823614
| 2021-10-30T08:33:29
|
1040088535
|
{
"authors": [
"lhw-1",
"nus-pe-script",
"vigneshsankariyer1234567890"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13040",
"repo": "AY2122S1-CS2103T-W15-4/tp",
"url": "https://github.com/AY2122S1-CS2103T-W15-4/tp/issues/133"
}
|
gharchive/issue
|
[PE-D] Edit Command: not all tags cleared
Steps to reproduce: edit 1 -t
is INCOMPLETE / COMPLETE a tag?
Labels: severity.High type.FunctionalityBug
original: tsinyee/ped#9
I guess we should explicitly state in the UG that every person will have a completion status tag. Documentation error
Resolved by #152
|
2025-04-01T04:54:40.835146
| 2021-10-10T10:02:14
|
1021937505
|
{
"authors": [
"KelvinSoo",
"codecov-commenter",
"lwj1711"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13041",
"repo": "AY2122S1-CS2103T-W17-4/tp",
"url": "https://github.com/AY2122S1-CS2103T-W17-4/tp/pull/89"
}
|
gharchive/pull-request
|
Add Ui logic for weekly panel
Move daily boxes into WeeklyPanel.fxml
Codecov Report
Merging #89 (500c938) into master (83ecbdb) will decrease coverage by 1.55%.
The diff coverage is 0.00%.
@@ Coverage Diff @@
## master #89 +/- ##
============================================
- Coverage 71.14% 69.58% -1.56%
Complexity 399 399
============================================
Files 71 73 +2
Lines 1251 1279 +28
Branches 128 128
============================================
Hits 890 890
- Misses 329 357 +28
Partials 32 32
Impacted Files | Coverage Δ
src/main/java/seedu/unify/ui/DailyPanel.java | 0.00% <0.00%> (ø)
src/main/java/seedu/unify/ui/MainWindow.java | 0.00% <0.00%> (ø)
src/main/java/seedu/unify/ui/TaskCard.java | 0.00% <0.00%> (ø)
src/main/java/seedu/unify/ui/WeeklyPanel.java | 0.00% <0.00%> (ø)
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 83ecbdb...500c938. Read the comment docs.
LGTM
|
2025-04-01T04:54:40.841840
| 2022-03-06T14:43:21
|
1160628528
|
{
"authors": [
"alfredkohhh",
"codecov-commenter"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13042",
"repo": "AY2122S2-CS2103-F11-1/tp",
"url": "https://github.com/AY2122S2-CS2103-F11-1/tp/pull/40"
}
|
gharchive/pull-request
|
Updated ReadMe CI Status link to match with our project's CI
Changed the CI Status link to ours.
Codecov Report
Merging #40 (4642437) into master (d3e617b) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #40 +/- ##
=========================================
Coverage 72.15% 72.15%
Complexity 399 399
=========================================
Files 70 70
Lines 1232 1232
Branches 125 125
=========================================
Hits 889 889
Misses 311 311
Partials 32 32
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update d3e617b...4642437. Read the comment docs.
|
2025-04-01T04:54:40.859038
| 2022-02-26T06:39:43
|
1151328844
|
{
"authors": [
"bakano98",
"codecov-commenter",
"lawwm"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13043",
"repo": "AY2122S2-CS2103T-T13-4/tp",
"url": "https://github.com/AY2122S2-CS2103T-T13-4/tp/pull/16"
}
|
gharchive/pull-request
|
Clear data branch
Two new commands are added, which are "deletemodule" and "clearmodules".
deletemodule : Delete specified modules for someone
deletemodule 5 t/CS1231S t/CS3230
clearmodules : Clear all modules for someone
clearmodules 5
Added a regex to ensure module codes are correct:
a 2-3 letter prefix followed by 4 digits and 1 optional letter.
[a-zA-Z]{2,3}\d{4}[a-zA-Z]?
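For illustration only (not part of the PR), a minimal Java sketch of how this pattern could back such validation; the class and method names here are hypothetical:
import java.util.regex.Pattern;

// Hypothetical helper applying the module-code pattern described above:
// a 2-3 letter prefix, 4 digits, then one optional letter.
public class ModuleCodeValidator {
    private static final Pattern MODULE_CODE =
            Pattern.compile("[a-zA-Z]{2,3}\\d{4}[a-zA-Z]?");

    public static boolean isValidModuleCode(String code) {
        return MODULE_CODE.matcher(code).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidModuleCode("CS1231S")); // true
        System.out.println(isValidModuleCode("CS12"));    // false
    }
}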
Was lazy to change all tags to modules in variable names. Maybe after midterms.
LGTM. Might need to do some refactoring after this has been merged since new fields are added
Codecov Report
Merging #16 (a43fc93) into master (17dc973) will increase coverage by 0.67%.
The diff coverage is 83.90%.
@@ Coverage Diff @@
## master #16 +/- ##
============================================
+ Coverage 72.31% 72.99% +0.67%
- Complexity 437 463 +26
============================================
Files 76 80 +4
Lines 1333 1418 +85
Branches 139 148 +9
============================================
+ Hits 964 1035 +71
- Misses 331 335 +4
- Partials 38 48 +10
Impacted Files | Coverage Δ
...va/seedu/address/logic/commands/CommandResult.java | 88.23% <ø> (ø)
.../seedu/address/logic/parser/AddressBookParser.java | 76.19% <0.00%> (-8.03%) :arrow_down:
...du/address/logic/commands/DeleteModuleCommand.java | 77.41% <77.41%> (ø)
...ddress/logic/parser/DeleteModuleCommandParser.java | 82.35% <82.35%> (ø)
...du/address/logic/commands/ClearModulesCommand.java | 91.30% <91.30%> (ø)
...address/logic/parser/ClearModuleCommandParser.java | 100.00% <100.00%> (ø)
src/main/java/seedu/address/model/AddressBook.java | 93.33% <100.00%> (+1.66%) :arrow_up:
src/main/java/seedu/address/model/tag/Tag.java | 90.00% <100.00%> (ø)
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 17dc973...a43fc93. Read the comment docs.
|
2025-04-01T04:54:40.860505
| 2022-11-05T08:09:04
|
1436896023
|
{
"authors": [
"ChryslineLim",
"riccqi"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13044",
"repo": "AY2223S1-CS2103T-F12-1/tp",
"url": "https://github.com/AY2223S1-CS2103T-F12-1/tp/issues/240"
}
|
gharchive/issue
|
matchbuyer: should match properties that are below their budget range too
and also for matchprops: should match buyers whose price range is above the property price too
Will not be fixed, instead will explain intended behavior in the UG
|
2025-04-01T04:54:40.865535
| 2022-10-28T18:04:28
|
1427606109
|
{
"authors": [
"carriezhengjr",
"nus-pe-script",
"sltsheryl"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13045",
"repo": "AY2223S1-CS2103T-T11-4/tp",
"url": "https://github.com/AY2223S1-CS2103T-T11-4/tp/issues/178"
}
|
gharchive/issue
|
[PE-D][Tester D] Syntax of user commands are inconsistent
The interest-related commands are in camel case, whereas the module-related commands are delimited with whitespace. E.g., addInt vs mod add.
Moving forward, perhaps the team could standardize to the same format for all commands such that it is easier for users to remember and use them.
Labels: type.FeatureFlaw severity.VeryLow
original: shwene/ped#2
@carriezhengjr We can possibly add an explanation why it is designed this way (i.e. any commands relating to student information are done through add ..., findInt and addInt, but we deliberately use the syntax mod <command word> to distinguish Student commands from Module commands). (Should we place this in UG or DG?)
Maybe we can put it in the UG, before all mod commands, as a Tip.
Yes
|
2025-04-01T04:54:40.868749
| 2022-09-18T09:31:41
|
1376997611
|
{
"authors": [
"Nephelite",
"codecov-commenter"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13046",
"repo": "AY2223S1-CS2103T-T13-4/tp",
"url": "https://github.com/AY2223S1-CS2103T-T13-4/tp/pull/58"
}
|
gharchive/pull-request
|
Add tjanenggerkevin PPP
Fixes #57
Codecov Report
Merging #58 (4a8e2e8) into master (d9c88f6) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #58 +/- ##
=========================================
Coverage 72.15% 72.15%
Complexity 399 399
=========================================
Files 70 70
Lines 1232 1232
Branches 125 125
=========================================
Hits 889 889
Misses 311 311
Partials 32 32
:mega: We’re building smart automated test selection to slash your CI/CD build times. Learn more
|
2025-04-01T04:54:40.871989
| 2022-11-07T07:58:07
|
1437920633
|
{
"authors": [
"Kok-je",
"codecov-commenter"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13047",
"repo": "AY2223S1-CS2103T-T15-1/tp",
"url": "https://github.com/AY2223S1-CS2103T-T15-1/tp/pull/273"
}
|
gharchive/pull-request
|
Ppp update shawn
Fixed UML diagrams
Codecov Report
Merging #273 (b4a0649) into master (4d9f856) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #273 +/- ##
=========================================
Coverage 65.21% 65.21%
Complexity 604 604
=========================================
Files 93 93
Lines 2199 2199
Branches 271 271
=========================================
Hits 1434 1434
Misses 687 687
Partials 78 78
:mega: We’re building smart automated test selection to slash your CI/CD build times. Learn more
|
2025-04-01T04:54:40.881176
| 2022-10-01T12:06:38
|
1393378789
|
{
"authors": [
"codecov-commenter",
"marcuspang"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13048",
"repo": "AY2223S1-CS2103T-W13-4/tp",
"url": "https://github.com/AY2223S1-CS2103T-W13-4/tp/pull/21"
}
|
gharchive/pull-request
|
Add team field to Person
Changes
Add team field to Person
Fixes #20
Codecov Report
Base: 72.15% // Head: 70.72% // Decreases project coverage by -1.43% :warning:
Coverage data is based on head (6569d31) compared to base (103c409).
Patch coverage: 0.00% of modified lines in pull request are covered.
Additional details and impacted files
@@ Coverage Diff @@
## master #21 +/- ##
============================================
- Coverage 72.15% 70.72% -1.44%
Complexity 399 399
============================================
Files 70 71 +1
Lines 1232 1257 +25
Branches 125 129 +4
============================================
Hits 889 889
- Misses 311 336 +25
Partials 32 32
Impacted Files | Coverage Δ
src/main/java/seedu/address/model/team/Team.java | 0.00% <0.00%> (ø)
Help us with your feedback. Take ten seconds to tell us how you rate us. Have a feature suggestion? Share it here.
:umbrella: View full report at Codecov.
:loudspeaker: Do you have feedback about the report comment? Let us know in this issue.
Good adherence to coding standards. Looks good to merge. Would be nice to see a delete_member command and a visual effect in the UI upon adding a member to the team.
The delete_member command will be done in a separate PR. I think I won't add a visual effect to the UI so early, perhaps only when the Team model is implemented properly.
|
2025-04-01T04:54:40.882514
| 2022-09-23T03:04:00
|
1383237632
|
{
"authors": [
"Ferusel",
"yixiann"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13049",
"repo": "AY2223S1-CS2103T-W16-2/tp",
"url": "https://github.com/AY2223S1-CS2103T-W16-2/tp/issues/5"
}
|
gharchive/issue
|
As a purchasing manager, I can rename my inventory items
... so that I can update items with an incorrect name.
Closed by #151 during refactoring of Person to Item. This is already a command available in AB3.
|
2025-04-01T04:54:40.884864
| 2023-11-03T21:16:00
|
1976941755
|
{
"authors": [
"evanyan13",
"nus-se-script",
"peiran18"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13050",
"repo": "AY2324S1-CS2103T-W12-3/tp",
"url": "https://github.com/AY2324S1-CS2103T-W12-3/tp/issues/158"
}
|
gharchive/issue
|
[PE-D][Tester D] Being able to add multiple interviews at the same time slot
I was adding interviews and had not realised that I had clashes between different types of interviews for the same company. I think having a warning message to tell me I have clashing interviews would be great :)
Labels: type.FunctionalityBug severity.High
original: wasjoe1/ped#9
Can include interviews at the same time slots
Thank you for the feedback. But as discussed by the team, we think it is more logical to allow the user to add interviews with clashing timeslots, as recording an interview does not necessarily mean the user must attend it.
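For context, a minimal sketch (not the project's code) of the clash detection the tester asked for, assuming a hypothetical Interview type with start and end times:
import java.time.LocalDateTime;
import java.util.List;

// Hypothetical sketch of overlap detection between interview time slots.
record Interview(String company, LocalDateTime start, LocalDateTime end) {
    // Two slots clash when each starts before the other ends.
    boolean clashesWith(Interview other) {
        return start.isBefore(other.end) && other.start.isBefore(end);
    }
}

class ClashChecker {
    // True if the candidate interview overlaps any existing one,
    // which is where a warning message could be raised.
    static boolean hasClash(List<Interview> existing, Interview candidate) {
        return existing.stream().anyMatch(candidate::clashesWith);
    }
}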
|
2025-04-01T04:54:40.886706
| 2023-11-03T19:40:42
|
1976824644
|
{
"authors": [
"Elijah5399",
"nus-pe-script"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13051",
"repo": "AY2324S1-CS2103T-W16-4/tp",
"url": "https://github.com/AY2324S1-CS2103T-W16-4/tp/issues/130"
}
|
gharchive/issue
|
[PE-D][Tester B] edit-friend allows name to be replaced by numbers
UG indicates that name must be a string but the app allowed integers to replace name
Labels: severity.Low type.FunctionalityBug
original: imkwokyong/ped#5
Closing this issue, since add-friend also allows numerical names to be used (i.e. alphanumeric characters and spaces).
|
2025-04-01T04:54:40.890187
| 2024-03-18T14:18:27
|
2192353329
|
{
"authors": [
"Lalelulilulela",
"dabzpengu"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13052",
"repo": "AY2324S2-CS2103T-T08-3/tp",
"url": "https://github.com/AY2324S2-CS2103T-T08-3/tp/pull/30"
}
|
gharchive/pull-request
|
Add company name feature into AddressBook
Add Company Name feature into AddressBook.
Use cn/ as parameter
Should only be up to 100 characters
resolved conflicts
LGTM :)
|
2025-04-01T04:54:40.893085
| 2024-04-05T22:04:32
|
2228867789
|
{
"authors": [
"Joseph31416",
"nus-pe-script"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13053",
"repo": "AY2324S2-CS2103T-W13-1/tp",
"url": "https://github.com/AY2324S2-CS2103T-W13-1/tp/issues/182"
}
|
gharchive/issue
|
[PE-D][Tester A] marking feature moves position of loan in display
Initial state:
Input: mark loan 3
New state:
Expected: No change in index/position of loan in list display
Actual: the loan with amount 666 moves to the first position.
This can confuse the user, more so if the list is long
Labels: severity.Medium type.FunctionalityBug
original: Murugan-Maniish/ped#14
Stability in sorting by end date
Fix by sorting by creation time? Not exactly a bug anyway.
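A minimal sketch (not the project's code) of the stable ordering suggested here: sort by end date, breaking ties by creation time so that marking a loan cannot shuffle equal entries; the Loan type is hypothetical:
import java.time.Instant;
import java.time.LocalDate;
import java.util.Comparator;
import java.util.List;

// Hypothetical Loan with an end date and a creation timestamp.
record Loan(double amount, LocalDate endDate, Instant createdAt) {}

class LoanSorter {
    // Primary key: end date. Secondary key: creation time, so loans sharing
    // an end date keep a fixed, predictable position in the list display.
    static void sortStably(List<Loan> loans) {
        loans.sort(Comparator.comparing(Loan::endDate)
                             .thenComparing(Loan::createdAt));
    }
}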
|
2025-04-01T04:54:40.895921
| 2024-11-08T19:30:38
|
2644978061
|
{
"authors": [
"DesSnowy",
"Incogdino",
"nus-se-script"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13054",
"repo": "AY2425S1-CS2103T-T11-2/tp",
"url": "https://github.com/AY2425S1-CS2103T-T11-2/tp/issues/307"
}
|
gharchive/issue
|
[PE-D][Tester D] Edit command with incorrect index returns inconsistent error message depending on whether the index is too high or too low
As seen above, when the edit command is used with index 0 (the list is one-indexed), I get an invalid command format message, whereas when an index such as 6 that is too high (the list has 5 people) is used, I get an invalid person index instead. This could slow down a user's ability to understand what is wrong if they accidentally key in index 0.
Labels: severity.Low type.FunctionalityBug
original: naythee169/ped#5
Can implement this fix
The errors are not wrong, so we won't be changing this.
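For illustration (not the project's code), a sketch of validating the index in one place so that too-low and too-high indices produce the same message; the names are hypothetical:
// Hypothetical unified index check: index 0 and an index past the end of
// the list both yield the same "invalid person index" message.
class IndexValidator {
    static final String MESSAGE_INVALID_INDEX = "The person index provided is invalid";

    static int requireValidIndex(int oneBasedIndex, int listSize) {
        if (oneBasedIndex < 1 || oneBasedIndex > listSize) {
            throw new IllegalArgumentException(MESSAGE_INVALID_INDEX);
        }
        return oneBasedIndex - 1; // convert to zero-based
    }
}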
|
2025-04-01T04:54:40.897946
| 2022-02-22T11:38:02
|
1146801105
|
{
"authors": [
"PalmEmanuel",
"egullbrandsson"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13055",
"repo": "AZDOPS/AZDevOPS",
"url": "https://github.com/AZDOPS/AZDevOPS/issues/21"
}
|
gharchive/issue
|
Update PAT when it expires
I want to be able to replace the PAT for a specific organization, e.g. when I'm regenerating the PAT or when it expires.
My suggestion is to allow Connect-AZDOPS to update an existing connection. Either by default or using something like a -Force switch.
|
2025-04-01T04:54:40.900461
| 2022-12-24T11:08:40
|
1510051919
|
{
"authors": [
"Aadarsh805",
"Ninad-Patil"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13056",
"repo": "Aadarsh805/TweetSage",
"url": "https://github.com/Aadarsh805/TweetSage/issues/83"
}
|
gharchive/issue
|
Documentation update
I think instead of saying it like in the above screenshot we could say it like this:
"It works by fetching the user's recent tweets from the Twitter API"
so that it's easy to understand
If approved I would love to make this change.
Yes sir go ahead
made a pull request
|
2025-04-01T04:54:40.901555
| 2021-10-01T11:22:30
|
1013228542
|
{
"authors": [
"Taikelenn",
"tulikavijay"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13057",
"repo": "Aalto-LeTech/aplus-courses",
"url": "https://github.com/Aalto-LeTech/aplus-courses/issues/767"
}
|
gharchive/issue
|
Fix privacy policy link in error reporting window
The window says at the bottom "click here to view privacy policy" but the link doesn't actually go anywhere
Hi @Taikelenn I would like to take this up
|
2025-04-01T04:54:40.914307
| 2019-02-22T15:43:56
|
413460500
|
{
"authors": [
"AaronJackson",
"NarenBabuR"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13058",
"repo": "AaronJackson/vrn",
"url": "https://github.com/AaronJackson/vrn/issues/119"
}
|
gharchive/issue
|
Unsupported image type, must be 8bit gray or RGB image.
/home/tango/usr/local/torch/install/bin/luajit: ./facedetection_dlib.lua:21: Python error: opaque ref: call
RuntimeError: Unsupported image type, must be 8bit gray or RGB image.
either open the image in an image editor and save it as an RGB image, or
modify the code to accommodate such images.
But this issue arises with the images provided by you...
Could you please help me?
This issue occurs with every other image as well. Not able to solve it.
You might need to upgrade Pillow
|
2025-04-01T04:54:40.942884
| 2023-01-20T16:16:58
|
1551120926
|
{
"authors": [
"nre-ableton"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13059",
"repo": "Ableton/ansible-role-jenkins-jcasc",
"url": "https://github.com/Ableton/ansible-role-jenkins-jcasc/pull/267"
}
|
gharchive/pull-request
|
Use unsafe_writes when writing PIMT logfile
Use unsafe_writes when writing PIMT logfile
Version 0.3.2
@ablbot rebase
Hmm, in spite of this fix, the error still occurs. :confused:
|
2025-04-01T04:54:40.963450
| 2023-08-18T17:22:07
|
1857045722
|
{
"authors": [
"kwokcb",
"marwie"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13060",
"repo": "AcademySoftwareFoundation/MaterialX",
"url": "https://github.com/AcademySoftwareFoundation/MaterialX/issues/1464"
}
|
gharchive/issue
|
Graph Editor: Implicit cast from "vec3" to "vec2" - using texcoord both as vector3 and vector2
This seems to be related to texcoord 0 being used for the normalmap as well as the basecolor - I'm still investigating this and will try to reproduce it with a fresh material but for the sake of documentation I'll open an issue here now 😅
<?xml version="1.0"?>
<materialx name="" version="1.38" >
<nodegraph name="needle_shadergraph" >
<!-- MainTex -->
<input name="_MainTex" type="filename" value="Tile_Albedo.jpg" />
<dot name="dot__MainTex" type="filename" >
<input name="in" type="filename" interfacename="_MainTex" />
</dot>
<texcoord name="uv" type="vector3" >
<input name="index" type="integer" value="0" />
</texcoord>
<convert name="convert_to_vector2" type="vector2" >
<input name="in" type="vector3" nodename="uv" />
</convert>
<image name="tex" type="color4" >
<input name="file" type="filename" nodename="dot__MainTex" />
<input name="texcoord" type="vector2" nodename="convert_to_vector2" />
</image>
<convert name="convert_to_color3" type="color3" >
<input name="in" type="color4" nodename="tex" />
</convert>
<output name="out" type="color3" nodename="convert_to_color3" />
<!-- Normal -->
<input name="_Normal" type="filename" value="Tile_Normal.jpg" />
<dot name="dot__Normal" type="filename" >
<input name="in" type="filename" interfacename="_Normal" />
</dot>
<texcoord name="uv_2" type="vector2" >
<input name="index" type="integer" value="0" />
</texcoord>
<image name="tex_1" type="color4" >
<input name="file" type="filename" nodename="dot__Normal" />
<input name="texcoord" type="vector2" nodename="uv_2" />
</image>
<swizzle name="swizzle_0_tex_1" type="vector3" >
<input name="in" type="color4" nodename="tex_1" />
<input name="channels" type="string" value="rgb" />
</swizzle>
<output name="out_1" type="vector3" nodename="normalmap" />
<normalmap name="normalmap" type="vector3" >
<input name="in" type="vector3" nodename="swizzle_0_tex_1" />
</normalmap>
<constant name="specular_roughness" type="float" >
<input name="value" type="float" value="0.0" />
</constant>
<output name="out_2" type="float" nodename="invert" />
<invert name="invert" type="float" >
<input name="in" type="float" nodename="specular_roughness" />
</invert>
</nodegraph>
<standard_surface name="needle_standard_surface" type="surfaceshader" >
<!-- BaseColor -->
<input name="base_color" type="color3" nodegraph="needle_shadergraph" output="out" />
<!-- NormalTS -->
<input name="normal" type="vector3" nodegraph="needle_shadergraph" output="out_1" />
<!-- Smoothness -->
<input name="specular_roughness" type="float" nodegraph="needle_shadergraph" output="out_2" />
</standard_surface>
<surfacematerial name="Default" type="material" >
<input name="surfaceshader" type="surfaceshader" nodename="needle_standard_surface" />
</surfacematerial>
</materialx>
Textures
The material compiles when I remove one of the texcoord usages (both use channel 0)
Here is a minimal material that reproduces the issue above
<?xml version="1.0"?>
<materialx version="1.38" colorspace="lin_rec709">
<standard_surface name="SR_marble1" type="surfaceshader" xpos="6.159420" ypos="-0.568965">
<input name="base" type="float" value="1" />
<input name="base_color" type="color3" nodename="image_color3" />
<input name="specular_roughness" type="float" value="0.1" />
<input name="subsurface" type="float" value="0.4" />
<input name="subsurface_color" type="color3" nodename="image_color4" />
</standard_surface>
<surfacematerial name="Marble_3D" type="material" xpos="8.695652" ypos="0.000000">
<input name="surfaceshader" type="surfaceshader" nodename="SR_marble1" />
</surfacematerial>
<image name="image_color3" type="color3" xpos="3.528986" ypos="-1.241379">
<input name="file" type="filename" value="Tile_Albedo.jpg" />
<input name="texcoord" type="vector2" nodename="texcoord_vector2" />
</image>
<texcoord name="texcoord_vector2" type="vector2" xpos="1.666667" ypos="-1.413793" />
<texcoord name="texcoord_vector3" type="vector3" xpos="0.275362" ypos="1.077586" />
<image name="image_color4" type="color3" xpos="3.485507" ypos="1.905172">
<input name="file" type="filename" value="Tile_Albedo.jpg" />
<input name="texcoord" type="vector2" nodename="swizzle_vector3_vector2" />
</image>
<swizzle name="swizzle_vector3_vector2" type="vector2" xpos="1.637681" ypos="1.767241">
<input name="in" type="vector3" nodename="texcoord_vector3" />
<input name="channels" type="string" value="xy" />
</swizzle>
</materialx>
This is the generated GLSL code snippet:
void main()
{
in VertexData
{
vec3 normalWorld;
vec3 tangentWorld;
vec2 texcoord_0;
vec3 positionWorld;
} vd;
vec3 geomprop_Nworld_out1 = normalize(vd.normalWorld);
vec3 geomprop_Tworld_out1 = normalize(vd.tangentWorld);
vec2 texcoord_vector2_out = vd.texcoord_0;
vec3 texcoord_vector3_out = vd.texcoord_0;
vec3 image_color3_out = vec3(0.0);
Seems we are declaring one stream input (vec2), then skipping adding in the second (vec3), and then reusing the same stream route. Hence there is a vec2 to vec3 cast.
Adding @niklasharrysson for thoughts on this.
I'm guessing that a type check is not being performed when determining whether to create a geometry stream ShaderNode?
BTW @marwie, the MaterialX Viewer has more diagnostic functionality, as it allows you to do things like dump out the GLSL code (G key) in case you find this useful.
As suggested offline, declare unique stream inputs of type vec3 as a workaround for now, to avoid this.
Hi @madmann91,
Thanks a lot for taking a look at this!
I think the choices are:
As you suggest, always keep vec3 and add code to extract a vec2 from vec3 if the lookup is vec2. (Basically you want to avoid explicit casts again). I like the idea to widen only when necessary.
Keep both a vec2 and a vec3. This means no code generator changes are required but there needs to be a way to avoid 2 streams having the same name. The published naming convention indicates a renderer can bind to a stream with name of the form: <stream_type>_<stream_number>. All I can think of is to add an additional qualifier to get something like texcoord_0:2, texcoord_0:3. Binding code needs to be updated.
I'm leaning towards 1 since it does not affect integrators.
In this case vec3->vec2 conversion can be done by:
1a. inserting additional shader code on lookups,
1b. replace/insert a vec3-to-vec2 conversion ShaderNode instead of a texcoord ShaderNode. ShaderNodes only exist for code generation, and generators do insert additional nodes as necessary, including geometry nodes.
Option 1b allows you to not worry about each shading language's syntax and I think it should be pretty robust.
There may be a 1c. but this is all that comes to mind currently :)
If this is okay to work on, it would be great if you look at this for dev days. Of course there will be folks around to help out in this area :)
|
2025-04-01T04:54:40.966779
| 2022-06-10T00:05:12
|
1266789019
|
{
"authors": [
"BrianSharpe",
"jstone-lucasfilm"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13061",
"repo": "AcademySoftwareFoundation/MaterialX",
"url": "https://github.com/AcademySoftwareFoundation/MaterialX/issues/986"
}
|
gharchive/issue
|
variable R21 in mx_fresnel_airy (glsl) is unused. bug?
There is a variable called R21 in mx_fresnel_airy() (libraries/pbrlib/genglsl/lib/mx_microfacet_specular.glsl, line 360) that is unused.
Is this a bug?
Or is it safe to simply comment it out?
cheers!
Great catch, @BrianSharpe, and that same unused variable seems to be present in Laurent Belcour's original code. The link has changed since @niklasharrysson developed the GLSL version for MaterialX, and can now be found here:
https://belcour.github.io/blog/research/publication/2017/05/01/brdf-thin-film.html
@BrianSharpe If you have the bandwidth, feel free to remove this line from the GLSL code, and we should likely update the link to Laurent Belcour's paper to the new location above.
done (https://github.com/AcademySoftwareFoundation/MaterialX/pull/989)
cheers!
|
2025-04-01T04:54:40.974205
| 2020-05-31T12:23:29
|
627952713
|
{
"authors": [
"VVD",
"hodoulp"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13062",
"repo": "AcademySoftwareFoundation/OpenColorIO",
"url": "https://github.com/AcademySoftwareFoundation/OpenColorIO/issues/1023"
}
|
gharchive/issue
|
Build error with python 3.7
FreeBSD 12.1 amd64.
With the DOCS option OFF, it builds fine with Python 3.7 and Python 2.7.
But with the DOCS option ON, it builds fine with Python 2.7 only. With 3.7 I got an error and searching didn't help.
Build log:
-- The C compiler identification is Clang 8.0.1
-- The CXX compiler identification is Clang 8.0.1
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc - works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ - works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Setting Build Type to: Release
-- Setting Namespace to: OpenColorIO
-- Exec prefix not specified, defaulting to /usr/local
-- Use Boost Ptr: OFF
-- Setting python bin to: /usr/local/bin/python3.7
Python library: PYTHON_LIBRARY-NOTFOUND
-- Setting EXTDIST_BINPATH: /tmp/work/usr/ports/graphics/opencolorio/work/.build/ext/dist/bin
-- Setting EXTDIST_PYTHONPATH: /tmp/work/usr/ports/graphics/opencolorio/work/.build/ext/dist/lib/python3.7/site-packages
-- Found TinyXML: /usr/local/lib/libtinyxml.so
-- TinyXML version: 2.6.2
-- External TinyXML will be used.
-- Found PkgConfig: pkgconf (found version "1.6.3")
-- Found yaml-cpp: /usr/local/lib/libyaml-cpp03.so
-- Generate Documentation: true
-- Create sphinx conf.py from conf.py.in
-- Copying doc to staging area
-- Copy extra doc files to staging area
-- Extracting .rst files from C++ headers
-- SSE Optimizations: ON
-- Could NOT find Truelight (missing: Truelight_INCLUDE_DIR Truelight_LIBRARIES Truelight_LIBRARY_DIR)
-- Not building truelight transform support. Add the flag -D TRUELIGHT_INSTALL_PATH=... or set the TRUELIGHT_ROOT environment variable
-- Create OpenColorABI.h from OpenColorABI.h.in
-- Setting OCIO SOVERSION to: 1
-- Create OpenColorIO.pc from OpenColorIO.pc.in
-- Build Unit Tests: ON
-- Create ocio_core_tests.sh.in from ocio_core_tests.sh
-- Disable build of apps. See cmake options : OCIO_BUILD_APPS and OCIO_BUILD_SHARED/OCIO_BUILD_STATIC (requiered)
-- Python library to include 'lib' prefix: OFF
-- Python 3.7 okay (UCS: ucs4), will build the Python bindings against /usr/local/include/python3.7m
-- Python variant path is lib/python3.7/site-packages
-- Found PythonLibs: /usr/local/lib/libpython3.7m.so (found version "3.7.7")
PYTHON_VARIANT_PATH: lib/python3.7/site-packages
-- Configuring done
-- Generating done
CMake Warning:
Manually-specified variables were not used by the project:
BOOST_PYTHON_SUFFIX
CMAKE_CXX_FLAGS_DEBUG
CMAKE_C_FLAGS_DEBUG
CMAKE_C_FLAGS_RELEASE
CMAKE_VERBOSE_MAKEFILE
THREADS_HAVE_PTHREAD_ARG
-- Build files have been written to: /tmp/work/usr/ports/graphics/opencolorio/work/.build
===> Building for opencolorio-1.1.1_1
[1/200] cd /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs && /usr/local/bin/cmake -E make_directory /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools && /usr/local/bin/cmake -E make_directory /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools && /usr/local/bin/cmake -E make_directory /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix && /usr/local/bin/cmake -E make_directory /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/tmp && /usr/local/bin/cmake -E make_directory /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp && /usr/local/bin/cmake -E make_directory /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src && /usr/local/bin/cmake -E make_directory /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp && /usr/local/bin/cmake -E touch /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/setuptools-mkdir
[2/200] cd /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src && /usr/local/bin/cmake -P /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/verify-setuptools.cmake && /usr/local/bin/cmake -P /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/extract-setuptools.cmake && /usr/local/bin/cmake -E touch /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/setuptools-download
CMake Warning at setuptools-stamp/verify-setuptools.cmake:15 (message):
File will not be verified since no URL_HASH specified
-- extracting...
src='/tmp/work/usr/ports/graphics/opencolorio/work/OpenColorIO-1.1.1/ext/setuptools-1.1.6.tar.gz'
dst='/tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools'
-- extracting... [tar xfz]
-- extracting... [analysis]
-- extracting... [rename]
-- extracting... [clean up]
-- extracting... done
[3/200] cd /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs && /usr/local/bin/cmake -E echo_append && /usr/local/bin/cmake -E touch /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/setuptools-update
[4/200] cd /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs && /usr/local/bin/cmake -E echo_append && /usr/local/bin/cmake -E touch /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/setuptools-patch
[5/200] cd /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools && /usr/local/bin/cmake -E make_directory /tmp/work/usr/ports/graphics/opencolorio/work/.build/ext/dist/lib/python3.7/site-packages && /usr/local/bin/cmake -E touch /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/setuptools-configure
[6/200] cd /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools && PYTHONPATH=/tmp/work/usr/ports/graphics/opencolorio/work/.build/ext/dist/lib/python3.7/site-packages: /usr/local/bin/python3.7 setup.py build && /usr/local/bin/cmake -E touch /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/setuptools-build
FAILED: docs/setuptools-prefix/src/setuptools-stamp/setuptools-build
cd /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools && PYTHONPATH=/tmp/work/usr/ports/graphics/opencolorio/work/.build/ext/dist/lib/python3.7/site-packages: /usr/local/bin/python3.7 setup.py build && /usr/local/bin/cmake -E touch /tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools-stamp/setuptools-build
Traceback (most recent call last):
File "setup.py", line 17, in <module>
exec(init_file.read(), command_ns)
File "<string>", line 8, in <module>
File "/tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools/setuptools/__init__.py", line 11, in <module>
from setuptools.extension import Extension
File "/tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools/setuptools/extension.py", line 5, in <module>
from setuptools.dist import _get_unpatched
File "/tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools/setuptools/dist.py", line 16, in <module>
import pkg_resources
File "/tmp/work/usr/ports/graphics/opencolorio/work/.build/docs/setuptools-prefix/src/setuptools/pkg_resources.py", line 1435, in <module>
register_loader_type(importlib_bootstrap.SourceFileLoader, DefaultProvider)
AttributeError: module 'importlib._bootstrap' has no attribute 'SourceFileLoader'
[7/200] /usr/bin/c++ -DOpenColorIO_EXPORTS -DUSE_SSE -I/tmp/work/usr/ports/graphics/opencolorio/work/OpenColorIO-1.1.1/export -Iexport -I/tmp/work/usr/ports/graphics/opencolorio/work/OpenColorIO-1.1.1/ext/oiio/src/include -Iext/dist/include -O2 -pipe -march=core2 -fstack-protector-strong -isystem /usr/local/include -fno-strict-aliasing -Wno-deprecated-register -isystem /usr/local/include -msse2 -O2 -pipe -march=core2 -fstack-protector-strong -isystem /usr/local/include -fno-strict-aliasing -Wno-deprecated-register -isystem /usr/local/include -fPIC -DOLDYAML -fPIC -fvisibility=hidden -Werror -std=c++11 -MD -MT src/core/CMakeFiles/OpenColorIO.dir/AllocationOp.cpp.o -MF src/core/CMakeFiles/OpenColorIO.dir/AllocationOp.cpp.o.d -o src/core/CMakeFiles/OpenColorIO.dir/AllocationOp.cpp.o -c /tmp/work/usr/ports/graphics/opencolorio/work/OpenColorIO-1.1.1/src/core/AllocationOp.cpp
ninja: build stopped: subcommand failed.
Hi @VVD
OCIO 1.x does not support Python 3. But the coming OCIO v2 supports both Python versions.
That's a known limitation of OCIO v1, so I am closing the defect.
Please feel free to reopen it if needed.
But it builds and works with Python 3.7 without DOCS…
When will v2 be released?
OCIOv2 feature complete is planned for this summer.
Note: As any open source project, we are open to contributions from the community. So you can definitely submit a pull request to fix Python 3 support in OCIO v1 if that's a blocker for you. Refer to CONTRIBUTING.md for details.
Thanks. I'm not python developer and I'll wait v2.
Does master support Python 3 already?
All FreeBSD ports with a mandatory dependency on Python 2 will be removed on 2020-12-31 if not fixed:
https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=249695
There is an issue already: https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=249534
|
2025-04-01T04:54:40.986173
| 2024-05-05T03:46:16
|
2279307461
|
{
"authors": [
"etsach",
"lgritz"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13063",
"repo": "AcademySoftwareFoundation/OpenShadingLanguage",
"url": "https://github.com/AcademySoftwareFoundation/OpenShadingLanguage/pull/1812"
}
|
gharchive/pull-request
|
build: llvm 18 compat issue - include libclangAPINotes
Fixes #1809
@etsach are you able to try this change on your side and let us know if it works?
It seems safe, and doesn't break anything. But then again, I could never get it to fail. Maybe because you were using llvm 18.0 and I was using 18.1 and maybe they fixed something on their end in between?
Hi,
This fixes the link error, but then I have another series of link errors:
undefined reference to clang::SourceMgrAdapter::~SourceMgrAdapter()
undefined reference to clang::SourceMgrAdapter::handleDiag(llvm::SMDiagnostic const&, void*)
etc...
And I can't find a library that fixes them.
Definitely something else causing an issue.
I forgot, this is with LLVM 18.1.3
@etsach, any comments? Should I merge what I have here and then we can try to find any remaining issues? Or are you worried that these changes are not correct?
This has languished for a while without a reply, but I think it's safe. Merging.
|
2025-04-01T04:54:40.998754
| 2021-06-24T21:28:23
|
929625405
|
{
"authors": [
"cary-ilm",
"lgritz",
"meshula"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13064",
"repo": "AcademySoftwareFoundation/openexr",
"url": "https://github.com/AcademySoftwareFoundation/openexr/pull/1069"
}
|
gharchive/pull-request
|
Clean up library VERSION and SOVERSION
Reduce confusion between "VERSION", "REVISION", and "SOVERSION":
Label the internal variables IMATH_LIBTOOL_* to indicate their purpose.
Use terminology closer to the libtool description
Add comment documenting the library update process
Signed-off-by: Cary Phillips<EMAIL_ADDRESS>
I'm bringing this up for the exact reason you cite, I strongly suspect we're not updating these numbers properly, because it's so confusing. I'm certainly open to other ideas or solutions.
I think you want the patch to be reflected in the age, not current, because it's ok to substitute a new patch release, right?
I believe that's true, I'm just wondering if that's a real use case? If someone is releasing a new ubuntu, it's unlikely they're going to cleverly avoid building openexr. If it's you or I building for our own not /usr/local directory, why are we doing that? If it's to rev an app like Blender, recompiling exr is 1% of 1% of the build time. So what I'm questioning is whether age is a pedantic correctness thing as opposed to a practical correctness. If it's merely pedantic, I'm all for all sails to the wind, let's do the thing that requires the least brain power, and only bump one number, because right now I'm feeling like maybe it's only an issue for PedanticKitty :) ... I am looking to be educated here, I feel like we've been coddling this problem along for almost plural decades now and still confused about it.
I don't disagree, as far as my personal workflow and priorities. I don't know if anybody else really cares.
I will note that if we wrap the whole thing up in current, we can never make a substitution that breaks (that's the single most important constraint), and although it doesn't allow for some possibly correct substitutions, it's no worse than the situation people would be in if they were linking statically.
Raising this again, as we need to resolve it before the 3.1 release.
In spite of how much I'd like to (a) not think about this again, and (b) not make mistakes, I'm not sure the automatic formula conforms to the libtool instructions: https://www.gnu.org/software/libtool/manual/html_node/Updating-version-info.html, which also say: "Never try to set the interface numbers so that they correspond to the release number of your package. This is an abuse that only fosters misunderstanding of the purpose of library versions." Since the project has traditionally followed this library versioning policy, I'd prefer to stick with it, and just be deliberate.
The 3.0.5 release has library version 29.0.0. The 3.1 release makes a few minor internal fixes (most have been patched into 3.0.5) and adds the OpenEXRCore library. So:
Start with version information of ‘0:0:0’ for each libtool library.
Update the version information only immediately before a public release of your software.
If the library source code has changed at all since the last update, then increment revision (‘c:r:a’ becomes ‘c:r+1:a’).
yes -> 29.1.0
If any interfaces have been added, removed, or changed since the last update, increment current, and set revision to 0.
yes -> 30.0.0
If any interfaces have been added since the last public release, then increment age.
yes -> 30.0.1
If any interfaces have been removed or changed since the last public release, then set age to 0.
no -> 30.0.1
Sound right?
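To make the arithmetic concrete, a small sketch (illustration only, not project tooling) applying the quoted libtool current:revision:age rules in order:
// Illustration of the libtool version-info rules quoted above. Each flag
// says what changed since the last public release.
class LibtoolVersion {
    static int[] update(int current, int revision, int age,
                        boolean sourceChanged,
                        boolean interfacesChanged,
                        boolean interfacesAdded,
                        boolean interfacesRemovedOrChanged) {
        if (sourceChanged) { revision++; }                   // c:r:a -> c:r+1:a
        if (interfacesChanged) { current++; revision = 0; }  // added/removed/changed
        if (interfacesAdded) { age++; }                      // back-compat extends
        if (interfacesRemovedOrChanged) { age = 0; }         // back-compat resets
        return new int[] { current, revision, age };
    }

    public static void main(String[] args) {
        // The 29.0.0 -> 30.0.1 walk-through from the comment above:
        int[] v = update(29, 0, 0, true, true, true, false);
        System.out.println(v[0] + "." + v[1] + "." + v[2]); // 30.0.1
    }
}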
Since we have an entirely different library name for every minor release (libOpenEXR-3.1.so vs libOpenEXR-3.2.so), do we start at 0:0:0 again for every one of those? Or does 3.2 pick up at where 3.1 left off?
I'm not sure your examples are correct. I don't think you can ever have a x.0.1 because age says how far BACK in revision is considered compatible, and there is no revision prior to 0.
This is so hard to get right.
We also install libOpenEXR.so in the chain of symlinks to libOpenEXR-3_<IP_ADDRESS>, and I think that's where the "drop-in" policy figures in, so I don't think the -3_1 suffix resets things.
And as pointed out in the new issue, drop-in of arbitrary OpenEXR libs into a cascade of symlinks such that one can simply link libOpenEXR.so doesn't work anyway because the version is burned into the ABI.
I'd like to propose that we go with this PR as is for now. It doesn't change behavior or policy, it simply names variables in a slightly less confusing way. We approved similar changes for Imath, which went into 3.1. I don't think we should hold up the 3.1 release of OpenEXR in hopes of resolving this any more effectively, although I'm open to ideas.
Can someone approve the review so I can merge it?
|
2025-04-01T04:54:41.004578
| 2024-06-02T21:43:31
|
2329876832
|
{
"authors": [
"Xe",
"jamiec7919",
"stuartaccent"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13065",
"repo": "AccentDesign/gcss",
"url": "https://github.com/AccentDesign/gcss/issues/4"
}
|
gharchive/issue
|
Media queries
I've been trying to port Xess to gcss as an exercise to see what I think about it and I've run into an issue, I can't properly express media queries like this:
@media only screen and (max-device-width: 736px) {
main {
padding: 0;
}
}
What's the gcss way to do this?
Hi @Xe
I've added a basic example of how you could do it. The end goal with any of it is you just need to wrap styles with something like the media string to the writer. In short, there isn't a built-in way, and I'm keen not to impose one, as that will keep the repo free of problems.
But here is an example:
package main
import (
"github.com/AccentDesign/gcss"
"github.com/AccentDesign/gcss/props"
"io"
"os"
)
type (
Styles []gcss.Style
Media struct {
Query string
Styles Styles
}
Stylesheet struct {
Styles Styles
Medias []Media
}
)
// WriteCSS writes the CSS for the media query to the writer
func (m Media) WriteCSS(w io.Writer) error {
if _, err := io.WriteString(w, m.Query); err != nil {
return err
}
if _, err := io.WriteString(w, "{"); err != nil {
return err
}
for _, style := range m.Styles {
if err := style.CSS(w); err != nil {
return err
}
}
if _, err := io.WriteString(w, "}"); err != nil {
return err
}
return nil
}
// WriteCSS writes the CSS for the stylesheet to the writer
func (ss Stylesheet) WriteCSS(w io.Writer) error {
// Write the base styles first
for _, style := range ss.Styles {
if err := style.CSS(w); err != nil {
return err
}
}
// Write the media queries next
for _, media := range ss.Medias {
if err := media.WriteCSS(w); err != nil {
return err
}
}
return nil
}
var (
base = Styles{
{
Selector: "body",
Props: gcss.Props{
Margin: props.UnitRaw(0),
},
},
}
screen736 = Media{
Query: "@media only screen and (max-device-width: 736px)",
Styles: Styles{
{
Selector: "main",
Props: gcss.Props{
Padding: props.UnitRaw(0),
},
},
},
}
stylesheet = Stylesheet{
Styles: base,
Medias: []Media{screen736},
}
)
// This is just a basic idea of how you could structure your CSS.
// The goal here is just to wrap the css however you wish, with whatever you wish;
// construct your stylesheet to suit your needs.
// The end goal is just to call CSS on each style with the object to write to.
func main() {
file, err := os.Create("media.css")
if err != nil {
panic(err)
}
defer file.Close()
if err := stylesheet.WriteCSS(file); err != nil {
panic(err)
}
}
hope this helps in some way
PS: when you get a commit you're happy with, feel free to PR it to the examples, I would love to see it :)
Sorry to wade in here but I've been evaluating gcss - couldn't this also be solved (and also enable nested styles) by just including a []Styles child field in Style?
hi @jamiec7919,
Yeah, there are a multitude of ways really, whatever suits the needs best. I did have a play around with a sort of starting point after this issue at https://github.com/AccentDesign/gcss-starter as a "could it actually generate a more complete basic stylesheet to cover the basics" and "does it actually need anything adding that can serve this kind of use more specifically".
I did toy with the nested idea, but thought I would wait and chew it over. The only thought off the cuff would be validity of the selectors and whether it is actually a media query or standard selectors.
I'm not sure where the lib is going at present. Still trying to work that out. It feels like we should either do it properly (aka more complete), the bare basics, or nothing :)
Stu
|
2025-04-01T04:54:41.008285
| 2024-01-31T19:47:28
|
2110842576
|
{
"authors": [
"Pubudu-Basnayaka-COS"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13066",
"repo": "Accessible360/accessible-slick",
"url": "https://github.com/Accessible360/accessible-slick/issues/98"
}
|
gharchive/issue
|
Carousel syncing does not work when using 'responsive' setting
I have carousels that are synced using the following example
Using this example https://codepen.io/A360/pen/BaKwaGa
But I'm trying to get the thumbnail nav to become horizontal for smaller screens. I'm doing this via the 'responsive' setting, but it seems that once the breakpoint happens the syncing is broken (even if I drag the window back to a larger size).
e.g.
on https://codepen.io/A360/pen/BaKwaGa
If you add the responsive setting to the '.thumbnail-slider' e.g.
responsive: [ { breakpoint: 1000, settings: { vertical: false, } } ],
This will switch the slider to horizontal as expected but clicking on the thumbnail slider buttons do not change the main slider, syncing between the two is broken.
Note: that example will look weird, as its CSS doesn't accommodate that behavior, but the clicking behavior should still work, I believe.
I just realized that example is doing carousel syncing manually, not using the settings, so it's probably an issue with that.
|
2025-04-01T04:54:41.015606
| 2020-03-31T11:33:37
|
591031645
|
{
"authors": [
"StevenPyle",
"glewis-ANet"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13067",
"repo": "AchievementNetwork/static",
"url": "https://github.com/AchievementNetwork/static/pull/55"
}
|
gharchive/pull-request
|
AT2-464 - Standalone service without vasco - [do not merge]
Following the pattern used in assess-api, this allows the static service to run securely as a standalone service. The vasco repo is still referenced, but only as a library.
It can't be merged to master yet since all services must be upgraded simultaneously. (They require AWS Load Balancer changes.)
Is there a way to avoid this big bang approach so that each service can be done one at a time? If so, what is the cost?
Since this change decouples each service from vasco, the AT2 application itself fails until the AWS load balancer's listeners are reconfigured to not route to vasco, but rather route directly to the services. This is all due to vasco doing both routing and service discovery. I suppose it might be possible, but I'd have to do some investigation. The biggest risk is that QT development is interrupted, since QT calls the AT2 learnosity-mirror service to retrieve Learnosity data. I'll give it some thought. Thanks!
|
2025-04-01T04:54:41.035626
| 2018-09-20T14:35:34
|
362218143
|
{
"authors": [
"CLAassistant",
"jx-activiti-cloud"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13068",
"repo": "Activiti/activiti-api",
"url": "https://github.com/Activiti/activiti-api/pull/9"
}
|
gharchive/pull-request
|
update Activiti/activiti-build to master
UpdateBot pushed version changes from the source code in repository: Activiti/activiti-build ref: master
UpdateBot commands:
updatebot push --ref master https://github.com/Activiti/activiti-build.git
Thank you for your submission, we really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution. jenkins-x-bot seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. You have signed the CLA already but the status is still pending? Let us recheck it.
|
2025-04-01T04:54:41.055734
| 2014-10-10T16:50:05
|
45500745
|
{
"authors": [
"allaire",
"suan",
"vaughanj10",
"zzak"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13069",
"repo": "ActsAsParanoid/acts_as_paranoid",
"url": "https://github.com/ActsAsParanoid/acts_as_paranoid/issues/20"
}
|
gharchive/issue
|
validates_uniqueness_of_without_deleted undefined in Rails 4.2
More of an FYI I suppose since 4.2 is not official yet, but the above method is undefined both on master and on 0.5.0.beta1 when used as instructed in the README
Any update on this @zzak ?
@allaire there is a patch, maybe you can point your Gemfile to @mvz's fix-build branch?
https://github.com/mvz/acts_as_paranoid/tree/fix-build
If you try it please report any issues to #26, thank you!
+1
This should be fixed in the last release, please check it. <3
Please try 0.5.0, this should be fixed now
|
2025-04-01T04:54:41.059705
| 2021-07-03T15:58:32
|
936281605
|
{
"authors": [
"Crementif",
"dimanaum"
],
"license": "CC0-1.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13070",
"repo": "ActualMandM/cemu_graphic_packs",
"url": "https://github.com/ActualMandM/cemu_graphic_packs/issues/526"
}
|
gharchive/issue
|
[Breath of the Wild] FPS++ - High Refresh Rate Causes Excessive Bokoblin/Moblin Ragdolling
Describe the issue
I'm playing the game at 144hz using the FPS++ mod. I have noticed that the ragdolling of enemies increases greatly. For example, when I get a headshot on a bokoblin, they go absolutely flying, several times further than expected. I would assume this issue has to do with the speed of the engine being accelerated due to the higher FPS.
To Reproduce
Steps to reproduce the behavior:
Enable a higher refresh rate in FPS++ (it is extremely noticeable past 100FPS)
Cause a bokoblin to go flying due to a critical hit of some sort (I recommend a headshot with an arrow)
The bokoblin will be yeeted across the environment instead of flying backwards a few feet.
Expected behavior
I would expect the knockback/ragdolling to behave similarly to how it does in 30fps.
Desktop (please complete the following information):
OS: Windows
GPU: Nvidia
Renderer: Vulkan
Version: 1.2.170
This is now fixed with the new major graphic pack update, thanks to the work of @Exzap!
|
2025-04-01T04:54:41.119625
| 2021-07-17T01:34:13
|
946678997
|
{
"authors": [
"Capri2014",
"LeeYiyuan"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13071",
"repo": "AdaCompNUS/magic",
"url": "https://github.com/AdaCompNUS/magic/issues/2"
}
|
gharchive/issue
|
“models/: Contains the neural networks used. Also contains the trained models for each task.”
I did not see the trained models in this folder; could you elaborate on how to play with the different examples shown in the paper?
Hi @Capri2014, so sorry about that! We had .gitignore'ed the trained models by accident. They have now been added to the models/ folder. The instructions have also been updated with examples on how to run the scripts.
|
2025-04-01T04:54:41.251399
| 2021-07-08T04:51:26
|
939466466
|
{
"authors": [
"ShilpaAmbi",
"by-4x1",
"davidjgonzalez",
"justinedelson",
"kwin",
"viveksachdeva"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13072",
"repo": "Adobe-Consulting-Services/acs-aem-commons",
"url": "https://github.com/Adobe-Consulting-Services/acs-aem-commons/issues/2637"
}
|
gharchive/issue
|
Issues flagged by Cloud Manager - Clientlib versioning
Required Information
[ ] AEM Version, including Service Packs, Cumulative Fix Packs, etc: 6.5.5
[ ] ACS AEM Commons Version: 4.11.2
[ ] Reproducible on Latest? yes/no Yes
Expected Behavior
As per the versioned clientlib doc https://adobe-consulting-services.github.io/acs-aem-commons/features/versioned-clientlibs/index.html, we created a config at Level 4... /apps/myapp/config/rewriter/versioned-clientlibs. With that, we are seeing a major bug in scan reports from Cloud Manager:
ConfigAndInstallShouldOnlyContainOsgiNodes -- Node /apps/myapp/config is an OSGi config or install path which contains non-OSGi-related children that will not be visible to non-administrative users.
Actual Behavior
There shouldn't be a Cloud Manager issue with the versioned clientlib config.
Steps to Reproduce
Install ACS commons package
Configure versioned clientlibs at /apps/myapp/config/rewriter/versioned-clientlibs and run a CM pipeline
Links
N.A.
@viveksachdeva The newer AEM Archetype splits configs into another package (ui.config) separate from the code (ui.apps)... These might be part of the recent "cloud ready" changes?
https://experienceleague.adobe.com/docs/experience-manager-core-components/using/developing/archetype/using.html?lang=en
@viveksachdeva mm - interesting. Just a quick question - any reason you aren't using OOTB client lib versioning? I highly recommend you do so over ACS Commons, unless you have a good reason.
@justinedelson any insights on this? The rewriter config is a sling:Folder under the /apps/.../config node ... CM is unhappy that the node is not an OSGi config (ie. sling:OsgiConfig/.cfg.json).
Pretty sure this is a false positive in the CM build. Configuration pipelines must be configured in that path: https://sling.apache.org/documentation/bundles/output-rewriting-pipelines-org-apache-sling-rewriter.html#configuring-a-processor-1
@kwin that was my expectation as well. JE might be busy - I'll log an issue with the CM team internally and ask them to review the rule.
@davidjgonzalez @kwin I'd never say never, but this was actually an issue 2 years ago that I don't think has regressed.
FWIW, the content structure used in testing this specific case is:
/apps
/testco
/config
/com.day.cq.rewriter.linkchecker.impl.LinkCheckerImpl.xml (sling:OsgiConfig)
/rewriter (sling:Folder)
/pdf (nt:unstructured)
i.e. /apps/testco/config/rewriter/.content.xml declares the node as sling:Folder and /apps/testco/config/rewriter/pdf/.content.xml declares that node as nt:unstructured.
I see that the ACS AEM Commons docs say to use a sling:Folder node for the configuration. This, however, does not have an impact (just validated this as well).
@viveksachdeva If you can provide the execution ID, I can probably look up some more detail. We could also move this to an AEM support issue if you would be more comfortable with that.
This test project does not generate any issues.
I have connected with our PS to relay this information, as this could have some client info. Thanks!
Hi,
We too are facing the same issue. Could you please let me know how it was resolved?
Thanks,
Shilpa
|
2025-04-01T04:54:41.266102
| 2020-06-24T14:03:12
|
644641392
|
{
"authors": [
"smlambert",
"sophia-guo"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13073",
"repo": "AdoptOpenJDK/build-jdk",
"url": "https://github.com/AdoptOpenJDK/build-jdk/pull/4"
}
|
gharchive/pull-request
|
Update buildJDK id
Fix the run java version step
https://github.com/AdoptOpenJDK/build-jdk/runs/801659565?check_suite_focus=true
Signed-off-by: Sophia Guo<EMAIL_ADDRESS>
Hotspot builds now ok, but java -version step still failing for OpenJ9 @sophia-guo
|
2025-04-01T04:54:41.269938
| 2019-04-17T12:37:20
|
434259296
|
{
"authors": [
"M-Davies",
"MeFisto94",
"fede-green",
"karianna"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13074",
"repo": "AdoptOpenJDK/openjdk-build",
"url": "https://github.com/AdoptOpenJDK/openjdk-build/issues/1043"
}
|
gharchive/issue
|
how should i install this in my custom docker environment?
Hello,
Trying to install JDK 8 on my Docker machine. Running this:
git clone https://github.com/AdoptOpenJDK/openjdk-build
cd openjdk-build
./makejdk-any-platform.sh --docker jdk8u
I am getting the following error:
Starting ./makejdk-any-platform.sh to configure, build (Adopt)OpenJDK binary
Parsing opt: --docker
Possible opt arg: jdk8u
Working dir is ./build/
[debug] COPY_MACOSX_FREE_FONT_LIB_FOR_JDK_FLAG=true
[debug] COPY_MACOSX_FREE_FONT_LIB_FOR_JRE_FLAG=true
JDK Image folder name: j2sdk-image
JRE Image folder name: j2re-image
Searching for JDK_BOOT_DIR
readlink: missing operand
Try 'readlink --help' for more information.
dirname: missing operand
Try 'dirname --help' for more information.
dirname: missing operand
Try 'dirname --help' for more information.
Any suggestions?
Thanks
Searching the repo for readlink gives only this as result:
https://github.com/AdoptOpenJDK/openjdk-build/blob/95dc19f9b5a9ecc02addae45312d047ad154cfe8/sbin/common/common.sh#L129
Install a JDK on your system so that 'which javac' succeeds and building works again (just tried that, had the same problem).
What O/S are you running? what happens when you run:
which javac and subsequently readlink -f $(which javac)?
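For reference, here is a minimal sketch of what that check amounts to (the healthy output shown is illustrative; actual paths depend on your install):
# With no JDK on the PATH, 'which javac' prints nothing, so readlink and
# dirname receive no operand -- exactly the errors quoted above.
which javac                    # prints nothing when no JDK is installed
readlink -f $(which javac)     # readlink: missing operand
# After installing any JDK:
which javac                    # e.g. /usr/bin/javac
readlink -f $(which javac)     # e.g. /usr/lib/jvm/java-8-openjdk-amd64/bin/javac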
Hi @fede-green . Have you tried Martijn's comment above?
This was resolved some time ago (we enhanced the detection)
|
2025-04-01T04:54:41.275920
| 2020-04-20T16:24:16
|
603365798
|
{
"authors": [
"karianna",
"sxa",
"tmancill"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13075",
"repo": "AdoptOpenJDK/openjdk-build",
"url": "https://github.com/AdoptOpenJDK/openjdk-build/pull/1682"
}
|
gharchive/pull-request
|
remove zulu-7 from the jdk8 build
zulu-7 no longer appears to be needed or used during the build.
(Perhaps it was used previously as part of a bootstrap?)
The PR also installs ca-certificates, which is still needed and was
being pulled in by software-properties-common.
I realize that this doesn't change much. This saves one apt-get update during the build process and avoids unnecessary traffic to Azul's repo, so perhaps speeds up the build just a tad. But I was curious about "why we can't have nice things..."
@tmancill - how does the docker build bootstrap itself without the Java 7?
@tmancill - how does the docker build bootstrap itself without the Java 7?
@karianna I was puzzling over this as well when I started looking into it. Ostensibly the assumption is that we're already successfully bootstrapped and can retrieve jdk8 from api.adoptopenjdk.net in that Dockerfile here; these lines changed in https://github.com/AdoptOpenJDK/openjdk-build/pull/1278.
To test, I cleaned my environment with docker system prune -a to ensure that a container with a JDK wasn't being reused from cache, and then built with ./makejdk-any-platform.sh --docker jdk8u:
...
Step 4/24 : RUN mkdir -p /openjdk/target
---> Running in 4b5869d461f2
Removing intermediate container 4b5869d461f2
---> a06a1507ac08
Step 5/24 : RUN wget 'https://api.adoptopenjdk.net/v2/binary/releases/openjdk8?openjdk_impl=hotspot&os=linux&arch=x64&release=latest&type=jdk' -O jdk8.tar.gz
---> Running in 44d266989564
--2020-04-20 18:36:59-- https://api.adoptopenjdk.net/v2/binary/releases/openjdk8?openjdk_impl=hotspot&os=linux&arch=x64&release=latest&type=jdk
Resolving api.adoptopenjdk.net (api.adoptopenjdk.net)... <IP_ADDRESS>, <IP_ADDRESS>
Connecting to api.adoptopenjdk.net (api.adoptopenjdk.net)|<IP_ADDRESS>|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://github.com/AdoptOpenJDK/openjdk8-binaries/releases/download/jdk8u252-b09/OpenJDK8U-jdk_x64_linux_hotspot_8u252b09.tar.gz [following]
...
And then later on in openjdk_container:latest:
...
Searching for JDK_BOOT_DIR
Guessing JDK_BOOT_DIR: /usr/lib/jvm/jdk8
If this is incorrect explicitly configure JDK_BOOT_DIR
Boot dir set to /usr/lib/jvm/jdk8
Running gradle with /usr/lib/jvm/jdk8
...
So we're currently building jdk8 with itself. Should we break the recursion?
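For illustration, a rough sketch of the bootstrap steps implied by the log above (the URL and boot dir are taken verbatim from the log; the exact Dockerfile steps may differ):
# Fetch the latest prebuilt JDK8 from the AdoptOpenJDK v2 API (Step 5 above)
wget 'https://api.adoptopenjdk.net/v2/binary/releases/openjdk8?openjdk_impl=hotspot&os=linux&arch=x64&release=latest&type=jdk' -O jdk8.tar.gz
# Unpack it where the build scripts guess the boot JDK to be
mkdir -p /usr/lib/jvm/jdk8
tar -xzf jdk8.tar.gz -C /usr/lib/jvm/jdk8 --strip-components=1
export JDK_BOOT_DIR=/usr/lib/jvm/jdk8    # matches "Boot dir set to /usr/lib/jvm/jdk8"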
While building with JDK8 works, we generally use the previous version. But if we've already got it pulling the AdoptOpenJDK 8 binary, then this seems like a reasonable change... thoughts @karianna?
I'm actually going to be a pain here and state that we should stay with the Zulu 7 as the bootstrap JDK. We should ideally always build with version -1 (as recommended by the upstream guides).
Makes sense to me. Closing this PR. I'll have another look to see what it will take to build with 7 while having Java 8 to run Gradle.
|
2025-04-01T04:54:41.283123
| 2018-02-07T03:50:48
|
294998003
|
{
"authors": [
"gdams"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13076",
"repo": "AdoptOpenJDK/openjdk-infrastructure",
"url": "https://github.com/AdoptOpenJDK/openjdk-infrastructure/pull/181"
}
|
gharchive/pull-request
|
Initial attempt at formatting hostnames/inventory
So I have used some of the fantastic code written at https://github.com/nodejs/build to setup an inventory based system. The key thing to this is that the backend python does some pretty clever stuff. You can see the inventory here.
Example:
hosts:
- build:
- cloudcone:
ubuntu1604-x64-1: {ip: <IP_ADDRESS>}
Becomes build-cloudcone-ubuntu1604-x64-1
And this in turn means that the ansible variables are a bit more useful:
ok: [build-cloudcone-ubuntu1604-x64-1] => {
"msg": [
"inventory_hostname: build-cloudcone-ubuntu1604-x64-1 ",
"ansible_hostname: build-cloudcone-x64-ubuntu-16-04-1",
"ansible_ssh_private_key_file: ~/.ssh/id_rsa",
"ansible_fqdn: build-cloudcone-x64-ubuntu-16-04-1.cloudcone.com",
"ansible_default_ipv4.address: <IP_ADDRESS>",
"ansible_os_family: Debian ",
"ansible_distribution: Ubuntu ",
"ansible_distribution_major_version: 16 ",
"ansible_architecture: x86_64 ",
"ansible_processor_vcpus: 4 ",
"ansible_processor_cores: 1 ",
"Jenkins_Username: jenkins ",
"Superuser_Account: Enabled",
"Vendor_File: ***Undefined***",
"Nagios_Plugins: Enabled ",
"Nagios_Monitoring: Enabled",
"Nagios_Master_IP: <IP_ADDRESS>"
]
}
This should then allow us to standardise the hostnames, which was part of the problem, and it can also be used as the hostname for the Nagios master updates, etc.
To Run:
ansible-playbook playbooks/AdoptOpenJDK_Linux_Playbook/main.yml --limit "build-cloudcone*"
macOS requires you to run:
export PYTHONPATH=$(pip2 show pyyaml | grep Location | awk '{print $2}')
(thanks @gibfahn for working that one out)
I have managed to get AWX to pick up this inventory style, so I am going to go ahead and merge the PR and update AWX to pull from our inventory.yml. I am then keen to go through Nagios and remove most of the Unix hosts, as a lot of them will have the wrong name, and then we can let the playbook auto-set the new hostname.
|
2025-04-01T04:54:41.289031
| 2021-02-25T23:53:50
|
816903210
|
{
"authors": [
"aahlenst",
"markathomas"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13077",
"repo": "AdoptOpenJDK/openjdk-support",
"url": "https://github.com/AdoptOpenJDK/openjdk-support/issues/266"
}
|
gharchive/issue
|
JVM crash on ARM64
Summary
The JVM inside a Docker container is crashing in the C1 Compiler Thread after around 5-10 minutes of running on an AArch64 Amazon Linux 2 host
Steps to reproduce
Not sure what to provide here. This happens on one of three containers using the same JVM version, same Spring Boot version, same everything; the class named in the current compile task is only used on the container that is crashing. I'd be happy to provide this class privately. Let me know what else I can provide.
Expected results
JVM does not crash
Actual results
Hard crash
Triaging info
Java version:
openjdk version "11.0.10" 2021-01-19
OpenJDK Runtime Environment AdoptOpenJDK (build 11.0.10+9)
OpenJDK 64-Bit Server VM AdoptOpenJDK (build 11.0.10+9, mixed mode)
What is your operating system and platform?
Container is based on arm64v8/adoptopenjdk:11-jdk-hotspot
Host: AArch64 Amazon Linux 2 - 4.14.219-161.340.amzn2.aarch64 #1 SMP Thu Feb 4 05:54:27 UTC 2021 aarch64 aarch64 aarch64 GNU/Linux
How did you install Java?
arm64v8/adoptopenjdk:11-jdk-hotspot container comes with it
Did it work before?
Switched to Arm64
Did you test with other Java versions?
yes, java 11.0.10 openj9 works fine; only hotspot having issues
hs_err_pid7.log
replay_pid7.log
No rush on this, I'm using the OpenJ9 build currently. This was more an FYI ticket.
Gets fixed in April: https://bugs.openjdk.java.net/browse/JDK-8247766. Nightly builds should have it.
Might need a backport to 8.
Good enough for me.
|
2025-04-01T04:54:41.303716
| 2020-11-05T02:34:13
|
736560774
|
{
"authors": [
"smlambert",
"sophia-guo"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13078",
"repo": "AdoptOpenJDK/run-aqa",
"url": "https://github.com/AdoptOpenJDK/run-aqa/issues/39"
}
|
gharchive/issue
|
cmdLineTester_SCCommandLineOptionTests_1 fail (jdk8/j9)
https://github.com/AdoptOpenJDK/run-aqa/runs/1354982566?check_suite_focus=true#step:4:50735
Testing: Run !printallcallsites
Test start time: 2020/11/04 21:36:14 Coordinated Universal Time
Running command: /opt/hostedtoolcache/jdk-8-openj9/1.0.0/x64/bin/jdmpview -core j9core.dmp
Time spent starting: 2 milliseconds
Time spent executing: 689 milliseconds
Test result: FAILED
Output from test:
[OUT] DTFJView version 4.29.5, using DTFJ version 1.12.29003
[OUT] Loading image from DTFJ...
[OUT]
[OUT] Could not load dump file and/or could not load XML file: null
[OUT] For a list of commands, type "help"; for how to use "help", type "help help"
[OUT] > DDR is not enabled for this core file, '!' commands are disabled
[OUT] >
https://github.com/AdoptOpenJDK/run-aqa/runs/1354982566?check_suite_focus=true#step:4:53130
[OUT] nameOption2: Expected to find the cache name
[OUT] TEST PASSED
[ERR] JVMSHRC806I Compressed references persistent shared cache "runner" has been destroyed. Use option -Xnocompressedrefs if you want to destroy a non-compressed references cache.
Looks like cmdLineTester_callsitedbgddrext_openj9 passed in recent running. https://github.com/AdoptOpenJDK/run-aqa/runs/1476255594?check_suite_focus=true#step:4:33468. Not sure if it's intermittent or fixed.
Updating this issue to remove cmdLineTester_callsitedbgddrext_openj9 from the heading. That test was failing consistently a few weeks ago, but has not failed for a while. I do not see an obvious PR/fix that would change behaviour, but we can raise it as a separate issue if it's turned into something intermittent.
Update cmdLineTester_SCCommandLineOptionTests_1 failures on linux with jdk8/11/15 ( Original one is the failure of cmdLineTester_callsitedbgddrext_openj9):
Testing: nameOption2
Test start time: 2020/12/02 20:42:49 Coordinated Universal Time
Running command: sh nameOption2.sh /opt/hostedtoolcache/jdk-8-openj9/1.0.0/x64/bin
Time spent starting: 1 milliseconds
Time spent executing: 272 milliseconds
Test result: FAILED
[OUT] Hello, world!
[OUT] TEST PASSED
[OUT] nameOption2: TEST FAILED
[OUT] nameOption2: Expected to find the cache name
[OUT] TEST PASSED
[ERR] JVMSHRC806I Compressed references persistent shared cache "runner" has been destroyed. Use option -Xnocompressedrefs if you want to destroy a non-compressed references cache.
Success condition was found: [Output match: TEST PASSED]
Failure condition was found: [Output match: TEST FAILED]
Failure condition was not found: [Output match: Error:]
Failure condition was not found: [Output match: Unhandled Exception]
Failure condition was not found: [Output match: Exception:]
...
Testing: nameOption4
Test start time: 2020/12/02 20:42:50 Coordinated Universal Time
Running command: sh nameOption4.sh /opt/hostedtoolcache/jdk-8-openj9/1.0.0/x64/bin
Time spent starting: 8 milliseconds
Time spent executing: 276 milliseconds
Test result: FAILED
[OUT] Hello, world!
[OUT] TEST PASSED
[OUT] nameOption4: TEST FAILED
[OUT] nameOption4: Expected to find the cache name
[OUT] TEST PASSED
[ERR] JVMSHRC806I Compressed references persistent shared cache "_runner_docker" has been destroyed. Use option -Xnocompressedrefs if you want to destroy a non-compressed references cache.
Success condition was found: [Output match: TEST PASSED]
Failure condition was found: [Output match: TEST FAILED]
Failure condition was not found: [Output match: Error:]
Failure condition was not found: [Output match: Unhandled Exception]
Failure condition was not found: [Output match: Exception:]
cmdLineTester_callsitedbgddrext_openj9 reopened in #50
Test nameOption2 and nameOption4 get the username from the environment variable LOGNAME
export TESTUSER=$LOGNAME
to match the shared Cache generated by NAME=_%u.
In the GitHub runner ubuntu environment, _%u is runner and $LOGNAME turns out to be empty (on the GitHub macOS runner, $LOGNAME is runner, so the test passes on macOS).
This should actually also affect test nameOption, which sets DEFAULT_CACHE_NAME="sharedcc_$TESTUSER". However, as TESTUSER is empty, grep "$DEFAULT_CACHE_NAME" $TESTSCRIPT.out will return 0. Kind of lucky; the test is actually fragile.
The next step would be to figure out why $LOGNAME is empty on the ubuntu runner but is runner on the macOS runner. Is this expected?
LOGNAME is not in the Ubuntu GitHub Actions Environment Variables List. Instead, RUNNER_USER is the environment variable for the user runner. There's always a chance that the list will get outdated as soon as a new container is configured. We will put a warning if that happens and update our fixes.
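As a sketch of one possible hardening (a suggestion, not the actual fix), TESTUSER could be derived with fallbacks so an unset LOGNAME never yields an empty name:
# Prefer LOGNAME, then RUNNER_USER (documented for the ubuntu runner),
# then the portable `id -un` as a last resort.
export TESTUSER=${LOGNAME:-${RUNNER_USER:-$(id -un)}}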
|
2025-04-01T04:54:41.310451
| 2020-04-11T11:31:29
|
598247493
|
{
"authors": [
"AdrianWilczynski",
"niem94"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13079",
"repo": "AdrianWilczynski/NamespaceAutocompletion",
"url": "https://github.com/AdrianWilczynski/NamespaceAutocompletion/issues/2"
}
|
gharchive/issue
|
Problem with new namespace autocompletion
When I create a new namespace with the C# snippet, I have to click away from the auto-focus on the namespace name for namespace autocompletion to occur. This is a mild annoyance, as it would be cool to create a new namespace and immediately have this extension recognize the namespace for autocompletion.
Edit: This is a problem when typing the full namespace, haven't tried namespace-fill, which might work just fine.
Hi,
Sorry for not responding earlier. I didn't notice this issue 😳
I think it's really just how VSCode works. I'm not getting autocompletion from the official C# extension in snippets either.
The good thing is that you don't have to click away; you can pull up the autocompletion list (without exiting the snippet) by pressing Ctrl + Space.
@AdrianWilczynski that's okay :-)
Thank you for the suggestion, I'll try to make a habit of activating autocompletion manually in this case.
|
2025-04-01T04:54:41.353358
| 2019-03-20T13:58:32
|
423265925
|
{
"authors": [
"gerald-dev"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13080",
"repo": "Agile-Defense/pcf-pipelines",
"url": "https://github.com/Agile-Defense/pcf-pipelines/pull/1"
}
|
gharchive/pull-request
|
Sync with pivotal-cf/pcf-pipeline master
Thanks for submitting a pull request to pcf-pipelines.
To speed up the process of reviewing your pull request please provide us with:
A short explanation of the proposed change:
An explanation of the use cases your change solves:
Expected result after the change:
Current result before the change:
Links to any other associated PRs or issues:
[x] I have viewed, signed, and submitted the Contributor License Agreement
[x] I have made this pull request to the master branch
[x] I have run all the unit tests
Please sync up Agile with Pivotal's latest changes
|
2025-04-01T04:54:41.359875
| 2020-04-30T12:51:13
|
609941445
|
{
"authors": [
"nathanramli"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13081",
"repo": "Agneese-Saini/SA-MP",
"url": "https://github.com/Agneese-Saini/SA-MP/issues/39"
}
|
gharchive/issue
|
PreviewModelDialog include x rotation doesnt work
I have tried to use a custom model rotation, but it doesn't work properly. The x rotation doesn't work with any number, although the y and z rotations work perfectly. The x rotation parameter doesn't do anything.
Any idea about this?
Thank you.
I have fixed it on #42
Thanks
|
2025-04-01T04:54:41.370129
| 2022-06-15T18:20:51
|
1272582849
|
{
"authors": [
"TheCoordinator",
"maxxfrazer",
"plutoless"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13082",
"repo": "AgoraIO/AgoraRtcEngine_iOS",
"url": "https://github.com/AgoraIO/AgoraRtcEngine_iOS/issues/18"
}
|
gharchive/issue
|
XCFramework binary size too big
Hey all,
Are there any plans to reduce the binary frameworks' size? It seems to me the sizes are way too large.
Currently seeing AgoraRTCKit at ~430MB and BeQuic at 130MB.
Thanks.
@TheCoordinator I think the reason is that it includes the SDK for all architectures. Does it increase your final app product a lot as well? I believe not all files will be linked when building a release app.
Hi @plutoless, yes you are spot on. However, when looking at the App Store bundle size on my device (iPhone 11 Pro), my app size has indeed increased by 40MB, which I would still say is quite a lot. Wondering if there's a way I can reduce this given my use-case? i.e. only link dependencies that I need via a custom package?
@TheCoordinator
The baseline is that a media SDK will be a lot bigger than a normal SDK, mainly due to its complicated dependencies (e.g. decoder/encoder codecs, etc.), so a 40MB increase does not sound too surprising to me.
The chances are:
See if you are using audio-only features. If you don't need video, this will be a big advantage when you want to reduce your app size. In this case you can choose to use our audio SDK, which is basically a slim SDK version that removes the video modules.
For some advanced features (like beauty), we have designed these features as separate dynamic libraries. If you are not using them, you can choose not to link them in Xcode, which may also help in this case.
Thanks for that @plutoless.
I am using liveBroadcasting mode so would need both video/audio and nothing fancy at all.
Which libs do you recommend I drop from this list so I can make sure I don't break anything?
https://github.com/agorabuilder/AgoraRtcEngine_iOS_Beta/blob/master/Package.swift
@TheCoordinator
this page should solve your concern,
https://docs.agora.io/en/Video/reduce_rtc_app_size?platform=iOS
OK thanks @plutoless.
Not sure if removing the extensions will change dramatically but will give it a go anyway. In the long run, do you have any plans to reduce the video framework size?
Not in the pipeline for now. We'll raise it internally to see if a lightweight version could be possible in the future, as there's many benefits to that.
Thank you @maxxfrazer. That'll be hugely beneficial.
Hey @maxxfrazer, thanks for the update. I see there have been some changes since the beta in how products can be linked via SPM. Is this issue now resolved if I use RtcBasic?
Which, I assume, means I can use the bare minimum and none of the extensions.
You're correct @TheCoordinator, we'll update the README here to make that clearer. The other packages are for various things like background segmentation.
The largest package product, AgoraRtcKit, is still included in RtcBasic though, so won’t solve everything in terms of size.
Thanks @maxxfrazer, I think this is already an improvement. I got 20MB back just from this change and removing unnecessary extensions, compared to the last beta. 👏🏼
But you're right, it'd be great if we could also have a more basic version of AgoraRtcKit in future updates.
Also, each xcframework contains multiple architectures: at least one for the physical device and two for the simulator. Only one will actually be bundled when it goes to a device.
They really add up in the xcframework format!
Obviously the xcframework itself is not a huge issue; as discussed above, what gets linked based on the arch is what matters. If there is room for more improvement there, it'd be highly appreciated.
@TheCoordinator I believe we've made some huge progress on reducing the app size. The XCFramework for the core SDK (those included in RtcBasic).
Great to hear @maxxfrazer. Will give it a go for our next version.
Going to assume it's better now, please re-open if you're still facing the issue.
|
2025-04-01T04:54:41.568934
| 2024-05-12T06:32:43
|
2291190094
|
{
"authors": [
"Akshat111111",
"Amarta113",
"Satyam0775",
"amishhaa",
"apooyadv",
"diptarup794",
"palayushi293",
"tejasvinigoel"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13083",
"repo": "Akshat111111/Hedging-of-Financial-Derivatives",
"url": "https://github.com/Akshat111111/Hedging-of-Financial-Derivatives/issues/59"
}
|
gharchive/issue
|
Crude Oil Analysis
It involves data analysis and prediction of crude oil
@Akshat111111 Assign me this issue; I will work on it.
Assign this to me, I'm interested in working on this.
Can I work on this? My approach is to build a Random Forest crude oil price prediction model.
Hello, please assign this issue to me; I will give my best.
Waiting for your response.
I'm interested in this, kindly assign it to me.
I would like to work on this issue. Please assign this task to me.
It involves data analysis and prediction of crude oil under the main folder: Hedging by crude oil.
@apooyadv @tejasvinigoel let's give others some tasks; you can then improve upon them.
@amishhaa has done the work; are you also working on a new feature? @Amarta113
@Akshat111111 Yes, I am working on it; the dataset is too large, therefore I am delayed.
If you are facing any issues uploading the dataset, convert it to a zip file.
It's not ready to upload yet; I need some time to complete it. Is it necessary to upload now?
Yes, it is needed.
@Akshat111111 I am interested in working on this issue. Kindly assign it to me if possible.
One implementation is already there, so create a new issue with a specific functionality.
|
2025-04-01T04:54:41.572256
| 2023-06-07T12:17:20
|
1745768961
|
{
"authors": [
"AkshitIreddy",
"Thrqureshi"
],
"license": "Unlicense",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13084",
"repo": "AkshitIreddy/AI-Powered-Video-Tutorial-Generator",
"url": "https://github.com/AkshitIreddy/AI-Powered-Video-Tutorial-Generator/issues/1"
}
|
gharchive/issue
|
error NameError: name 'video_main_function' is not defined
I am a beginner with Python and NodeJS.
When I run this app, I get this error:
NameError: name 'video_main_function' is not defined
On the backend terminal, I get this:
INFO: Started server process [9272]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://<IP_ADDRESS>:8000 (Press CTRL+C to quit)
INFO: <IP_ADDRESS>:52834 - "GET /videoCreate HTTP/1.1" 405 Method Not Allowed
Any idea why I am getting these errors?
Oh, I made a mistake: in main.py, video_main_function was supposed to be video_main. I fixed the bug; you can try again by cloning, or go to main.py and rename video_main_function to video_main in line 30 (path = ...).
|
2025-04-01T04:54:41.583220
| 2019-09-23T12:59:04
|
497082593
|
{
"authors": [
"chuckmitchell",
"farfromrefug"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13085",
"repo": "Akylas/nativescript-material-components",
"url": "https://github.com/Akylas/nativescript-material-components/issues/58"
}
|
gharchive/issue
|
Demo does not run --syncAllFiles option deprecated {N} CLI 6
Make sure to check the demo app(s) for sample usage
Did this, demo does not run.
Make sure to check the existing issues in this repository
Didn't see this issue mentioned.
If the demo apps cannot help and there is no issue for your problem, tell us about it
The demo app does not run when following the instructions in Readme.md.
npm run demo.android
<EMAIL_ADDRESS>demo.android /Users/charles/projects/kanayo/mymobile-native-26/current/mymobile-native/nativescript-material-components
cd ./demo && tns run android --syncAllFiles
The option 'syncAllFiles' is not supported.
Run 'tns run android --help' for more information.
Which platform(s) does your issue occur on?
Both
All versions
Emulator and device
Please, provide the following version numbers that your issue occurs with:
CLI: 6.1.2
Please, tell us how to recreate the issue in as much detail as possible.
Describe the steps to reproduce it.
npm i
npm run tsc
npm run demo.ios
npm run demo.android
Is there any code involved?
The solution is simple, unless we want to support loading node_modules in {N} < 6 AND {N} >= 6
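A minimal sketch of the updated script command, assuming we only target {N} CLI 6+, is simply the same invocation without the removed flag:
# {N} CLI 6 dropped --syncAllFiles; plain `tns run` handles file syncing on its own
cd ./demo && tns run android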
Thanks! scripts have been updated
Thanks, closing 👍
|
2025-04-01T04:54:41.591115
| 2021-09-02T01:03:03
|
985918378
|
{
"authors": [
"blackfalcon",
"jason-capsule42"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13086",
"repo": "AlaskaAirlines/auro-card",
"url": "https://github.com/AlaskaAirlines/auro-card/pull/34"
}
|
gharchive/pull-request
|
feat(banner): banner removed from repo
Alaska Airlines Pull Request
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Fixes: #31
Summary:
This change removes auro-banner from auro-card.
Type of change:
Please delete options that are not relevant.
[ ] New capability
[ ] Revision of an existing capability
[ ] Infrastructure change (automation, etc.)
[x] Other - Removes the auro-banner feature set
Checklist:
[x] My update follows the CONTRIBUTING guidelines of this project
[x] I have performed a self-review of my own update
By submitting this Pull Request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
Pull Requests will be evaluated by their quality of update and whether it is consistent with the goals and values of this project. Any submission is to be considered a conversation between the submitter and the maintainers of this project and may require changes to your submission.
Thank you for your submission!
-- Auro Design System Team
Given that https://github.com/AlaskaAirlines/auro-card/pull/52/commits is released, this PR should be closed.
|
2025-04-01T04:54:41.599390
| 2024-09-30T20:15:07
|
2557557381
|
{
"authors": [
"CLAassistant",
"jason-capsule42"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13087",
"repo": "AlaskaAirlines/auro-input",
"url": "https://github.com/AlaskaAirlines/auro-input/pull/337"
}
|
gharchive/pull-request
|
Jjones/beta merge conflicts
Alaska Airlines Pull Request
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Resolves: # (issue, if applicable)
Summary:
Please summarize the scope of the changes you have submitted, what the intent of the work is and anything that describes the before/after state of the project.
Type of change:
Please delete options that are not relevant.
[ ] New capability
[ ] Revision of an existing capability
[ ] Infrastructure change (automation, etc.)
[ ] Other (please elaborate)
Checklist:
[ ] My update follows the CONTRIBUTING guidelines of this project
[ ] I have performed a self-review of my own update
By submitting this Pull Request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
Pull Requests will be evaluated by their quality of update and whether it is consistent with the goals and values of this project. Any submission is to be considered a conversation between the submitter and the maintainers of this project and may require changes to your submission.
Thank you for your submission!
-- Auro Design System Team
Summary by Sourcery
Simplify SCSS imports by removing file extensions in various style files to streamline the codebase.
Enhancements:
Simplify SCSS imports by removing file extensions across multiple style files.
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution. 2 out of 3 committers have signed the CLA. :white_check_mark: jordanjones243 :white_check_mark: jason-capsule42 :x: semantic-release-bot. You have signed the CLA already but the status is still pending? Let us recheck it.
|
2025-04-01T04:54:41.605229
| 2024-12-17T22:02:25
|
2746178406
|
{
"authors": [
"blackfalcon",
"rmenner"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13088",
"repo": "AlaskaAirlines/auro-select",
"url": "https://github.com/AlaskaAirlines/auro-select/pull/226"
}
|
gharchive/pull-request
|
Force dropdown bib min-width to not expand outside container #225
Alaska Airlines Pull Request
Type of change:
Please delete options that are not relevant.
[ ] New capability
[x] Revision of an existing capability
[ ] Infrastructure change (automation, etc.)
[ ] Other (please elaborate)
Checklist:
[x] My update follows the CONTRIBUTING guidelines of this project
[x] I have performed a self-review of my own update
By submitting this Pull Request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
Pull Requests will be evaluated by their quality of update and whether it is consistent with the goals and values of this project. Any submission is to be considered a conversation between the submitter and the maintainers of this project and may require changes to your submission.
Thank you for your submission!
-- Auro Design System Team
Summary by Sourcery
Bug Fixes:
Fix dropdown menu to prevent it from expanding outside its container by setting a minimum width.
:tada: This PR is included in version 3.3.2 :tada:
The release is available on:
npm package (@latest dist-tag)
GitHub release
Your semantic-release bot :package::rocket:
|
2025-04-01T04:54:41.653826
| 2022-03-28T04:42:25
|
1182874767
|
{
"authors": [
"MasterZitron",
"Si1kn"
],
"license": "CC0-1.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13089",
"repo": "AlephInfinity1/ForgeBlock",
"url": "https://github.com/AlephInfinity1/ForgeBlock/issues/17"
}
|
gharchive/issue
|
ForgeBlock encountered an error during the common setup event phase
Built the mod all good and all, but when loading it up I get this.
So I tried running with forge version 31.2.36 and I got a new error; perhaps I did not build the mod correctly after all.
Hello! Is it possible you could send your full crash log?
Yep, sorry for the late reply: https://paste.ee/p/AhUN9
Ah, I see the mistake: when compiling the mod, the rpc library isn't being added. I'll see what I can do when I get home; otherwise you could mess around with the gradle build file :)
|
2025-04-01T04:54:41.679519
| 2024-10-18T06:55:33
|
2596612535
|
{
"authors": [
"slycordinator"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13090",
"repo": "Alex313031/Thorium-Win",
"url": "https://github.com/Alex313031/Thorium-Win/issues/287"
}
|
gharchive/issue
|
Inconsistent Documentation For Build Versions (AVX/AVX2)
Problem
The repo readme recommends going to https://github.com/Alex313031/Thorium-AVX2 for AVX2 builds, which then redirects to https://github.com/Alex313031/Thorium-Win-AVX2
But according to https://github.com/Alex313031/Thorium-Win-AVX2/releases/tag/M120, AVX2 builds are being provided here from now on.
If unclear, I mean that the latest release in Thorium-Win-AVX2 says that from now on, builds are being provided at Thorium-Win instead.
|
2025-04-01T04:54:41.687196
| 2023-12-20T02:16:19
|
2049717635
|
{
"authors": [
"Alex313031",
"InventorXtreme",
"gz83"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13091",
"repo": "Alex313031/thorium",
"url": "https://github.com/Alex313031/thorium/issues/477"
}
|
gharchive/issue
|
Build Issues
System Details
OS: Arch Linux
Thorium Version [e.g.118.0.5993.177]
Problem
Unable to generate ninja build files. All attempts to run gn args out/thorium result in
Waiting for editor on "/mnt/linuxextrassd/chromium/src/out/thorium/args.gn"...
Generating files...
ERROR at //build/config/compiler/BUILD.gn:183:15: Duplicate build argument declaration.
use_cxx17 = false
^----
Here you're declaring an argument that was already declared elsewhere.
You can only declare each argument once in the entire build so there is one
canonical place for documentation and the default value. Either move this
argument to the build config file (for visibility everywhere) or to a .gni file
that you "import" from the files where you need it (preferred).
See //build_overrides/build.gni:61:15: Previous declaration.
use_cxx17 = false
^----
See also "gn help buildargs" for more on how build arguments work.
See //build/config/BUILDCONFIG.gn:334:3: which caused the file to be included.
"//build/config/compiler:afdo_optimize_size",
^-------------------------------------------
Please note that the file referenced on the last line of the error changes each run.
Additional Notes
I have recloned multiple times and followed word for word the instructions in the build guide.
There seems to be an error with circular references. Make sure that there is only one copy of the full Chromium source clone on the device, and that all dependencies are pulled.
Also, it is recommended to use Ubuntu or Debian systems for compilation.
@InventorXtreme
Is there any way to sort out the circular references? I really don't want to spin up a VM and reclone everything for the fourth time. Thanks for the response by the way.
Ok, by manually setting the version in the version script to the version on the releases page I got past the gn script, but now I am unable to build because clang is not recognizing x86_64-v3 as a real arch. I will probably try to reclone everything tomorrow.
@InventorXtreme What you should do is follow the chromium documentation up to the point where it wants you to run gn args out. At that point (assuming depot_tools, the thorium repo, and the chromium repo are in $HOME):
Run ./trunk.sh to fetch tags and full git history.
Run ./version.sh to check out the Chromium repo at the revision Thorium is currently at.
Run ./setup.sh --help. This will show you the flags that can be used for the different platforms. For regular linux, you can run setup.sh with no arguments. For an AVX2 build like what you are wanting, use setup.sh --avx2
After this you can manually run gn args out/thorium, and use the args.gn appropriate for the platform. Make sure to update the PGO Profile location and version at the bottom, and you can add API Keys at the top if you wish.
Then you can use the build scripts (build.sh, build_win.sh, etc.) to build for your platform. The scripts take one integer afterwards to tell ninja how many jobs to spawn.
For example, to build thorium for linux on an 8-core system:
./build.sh 8
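Putting those steps together, here is a consolidated sketch (the cd paths are assumptions based on the $HOME layout mentioned above; adjust to your checkout locations):
cd ~/thorium
./trunk.sh               # fetch tags and full git history
./version.sh             # check out Chromium at the revision Thorium targets
./setup.sh --avx2        # set up sources/args for an AVX2 build
cd ~/chromium/src
gn args out/thorium      # paste the platform args.gn; update the PGO profile path/version
cd ~/thorium
./build.sh 8             # build with 8 ninja jobs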
@InventorXtreme I updated the docs, see > https://thorium.rocks/docs/building.html
|
2025-04-01T04:54:41.694257
| 2022-02-16T12:20:56
|
1139945111
|
{
"authors": [
"AlexCuse"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13092",
"repo": "AlexCuse/watermill-jetstream",
"url": "https://github.com/AlexCuse/watermill-jetstream/issues/7"
}
|
gharchive/issue
|
Inline SubscribeInitialize?
Right now, even if auto-provisioning is enabled, clients need to call SubscribeInitialize to allocate on the broker before subscribing.
Feels like there may be a case for calling this inside subscribe; not sure. It may be worth caching what's initialized and what's not if the change is made, though NATS lookups seem very fast.
In general, what should probably happen is moving the configuration check into the publisher/subscriber and calling SubscribeInitialize if it's enabled in the subscriber. The publisher can still call it directly.
|
2025-04-01T04:54:41.708847
| 2024-01-04T15:03:56
|
2065817647
|
{
"authors": [
"lgaribaldi"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13093",
"repo": "AlexEidt/Vidio",
"url": "https://github.com/AlexEidt/Vidio/issues/10"
}
|
gharchive/issue
|
Getting scrambled frames for portrait videos
By using the example code to read a video and write the output directly and save the frames, I am getting all frames and the output video scrambled for certain input videos. This seems to only happen in portrait videos as far as I can tell.
This is an example frame, as you can see it looks like the stride is off:
I'll submit a PR with the fix that I am using
I can add a test if you would like, but I will need to add a new test video for this, as it does not happen with the current test videos
|
2025-04-01T04:54:41.727306
| 2024-08-07T11:24:53
|
2453243190
|
{
"authors": [
"AlexFlipnote",
"StupidMurderDroneFanThatUsesArchBtw"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13094",
"repo": "AlexFlipnote/neofetch-win",
"url": "https://github.com/AlexFlipnote/neofetch-win/issues/27"
}
|
gharchive/issue
|
it doesnt work
What Python version are you using?
Python 3.12.4
What's the version of psutil you have?
C:\Users\AlexFlipnote>python --version
Python 3.12.4
>>> import psutil
>>> psutil.__version__
'6.0.0'
Tested with my local machine and 6.0.0 works on my end at least, which seems to be latest version.
It says I have psutil installed, but when I try to use anything psutil-related it says I don't have psutil.
It feels a bit out of scope for neofetch, since with the same settings that you have, but on my own machine, it works. I'd recommend checking your Windows to see if psutil simply has no access to its DLL; maybe your Python and Windows are on two different partitions, etc.
Could also try pip install psutil==5.9.3 and see if a lower version works for your case.
pip install psutil==5.9.3
Collecting psutil==5.9.3
Downloading psutil-5.9.3.tar.gz (483 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: psutil
Building wheel for psutil (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for psutil (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [38 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-cpython-312
creating build\lib.win-amd64-cpython-312\psutil
copying psutil\_common.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\_compat.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\_psaix.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\_psbsd.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\_pslinux.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\_psosx.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\_psposix.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\_pssunos.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\_pswindows.py -> build\lib.win-amd64-cpython-312\psutil
copying psutil\__init__.py -> build\lib.win-amd64-cpython-312\psutil
creating build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\runner.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_aix.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_bsd.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_connections.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_contracts.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_linux.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_memleaks.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_misc.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_osx.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_posix.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_process.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_sunos.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_system.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_testutils.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_unicode.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\test_windows.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\__init__.py -> build\lib.win-amd64-cpython-312\psutil\tests
copying psutil\tests\__main__.py -> build\lib.win-amd64-cpython-312\psutil\tests
running build_ext
building 'psutil._psutil_windows' extension
error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for psutil
Failed to build psutil
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (psutil)
I ALREADY HAVE C++ 14.0.85 INSTALLED
The error says you apparently do not, so there is not much I can say or do. Essentially, try different versions of psutil and see if that helps; I will look into whether the build tool is required in later updates.
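As a hedged sketch of that suggestion (version list illustrative), one can step down through psutil releases, preferring prebuilt wheels so pip never attempts the MSVC source build that failed above:
pip install --only-binary :all: psutil==6.0.0
python -c "import psutil; print(psutil.__version__)"
# if the import still fails, try progressively older releases the same way:
pip install --only-binary :all: psutil==5.9.8
python -c "import psutil; print(psutil.__version__)"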
|
2025-04-01T04:54:41.738205
| 2024-01-26T03:46:55
|
2101552847
|
{
"authors": [
"AlexanderJGael",
"Jbrockhoff"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13095",
"repo": "AlexanderJGael/social-searcher",
"url": "https://github.com/AlexanderJGael/social-searcher/issues/10"
}
|
gharchive/issue
|
Search Function
As a user
When I enter a query into the searchbar
Then I am redirected to a separate page displaying content which includes my query
Complete
|
2025-04-01T04:54:41.749030
| 2019-09-30T15:12:01
|
500342794
|
{
"authors": [
"dtpoirot",
"gokulmanohar"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13099",
"repo": "AlexanderWillner/runMacOSinVirtualBox",
"url": "https://github.com/AlexanderWillner/runMacOSinVirtualBox/pull/82"
}
|
gharchive/pull-request
|
Added the code for enabling 1080p full screen resolution.
Many here had issues regarding full-screen mode. To enable 1080p resolution in VirtualBox, try:
cd "C:\Program Files\Oracle\VirtualBox"
VBoxManage setextradata "YOUR VB NAME" VBoxInternal2/EfiGraphicsResolution 1920x1080
This worked for me. Run it in CMD and then run the virtual machine.
Alternatively, use EfiGopMode and replace N with one of 0, 1, 2, 3, 4, 5. These numbers correspond to the 640x480, 800x600, 1024x768, 1280x1024, 1440x900, and 1920x1200 screen resolutions, respectively:
VBoxManage setextradata "YOUR VB NAME" VBoxInternal2/EfiGopMode N
|
2025-04-01T04:54:41.755558
| 2023-03-22T02:57:59
|
1634976617
|
{
"authors": [
"marcos-diazg",
"toddajohnson"
],
"license": "BSD-2-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13100",
"repo": "AlexandrovLab/SigProfilerAssignment",
"url": "https://github.com/AlexandrovLab/SigProfilerAssignment/issues/81"
}
|
gharchive/issue
|
SBS96 signature assignment reproducibility using SigProfilerAssignment?
I ran SigProfilerAssignment for 396 HCC samples across a mutational matrix constructed using SigProfilerMatrixGenerator from PCAWG vcfs (lifted over to GRCh38) or vcfs from our own in house mutation calling pipeline (called using GRCh38 as ref.). Twenty-six of the samples were called in both pipelines (our lab submitted the LIRI-JP PCAWG data). Of some concern, the signature profiles sometimes show very large mis-matches for the assigned SBS signatures, so I don't know what values I can "trust". I could understand that would happen if the two pipelines were outputting very different calls for the same sample, but this occurs even when the values in the mutational matrix for a sample's duplicates appear to be almost identical (r=0.9986 for the two columns) and the SigProfilerMatrixGenerator output/plots/SBS_96_plots...pdf appear identical to me for those samples.
I have attached the JOB_METADATA_SPA file and an Excel file for one sample (RK001_T) with the mutational matrix data and SBS96 signatures that were called in either of the two samples.
As one can see looking at the SBS96 signature summary table, SBS5 was assigned in both samples, but SBS8, SBS46, and SBS92 were assigned only in LIRI-JP_RK001_T and SBS12 and SBS40 were assigned only to the data from my current somatic calling pipeline. I did not see such differences for ID or DBS in this sample.
JOB_METADATA_SPA.txt
RK001_T_matrix_and_COSMIC_sigs.xlsx
Dear @toddajohnson,
Please accept my apologies for not getting back to you sooner. Unfortunately, what you described is one of the biggest challenges that we currently face in the field of signature assignment. Considering the large pool of SBS reference signatures available, the possibility of different signature reconstructions having very similar reconstruction similarity is relatively high. Indeed, as you pointed out, almost no difference was observed for ID and DBS, with both having a much smaller number of COSMIC reference signatures.
However, there are several strategies to try to overcome this uncertainty. First, one can focus on those signatures previously related to the cancer type of interest, liver cancer in your particular case. Although this would restrict the analysis to previous knowledge, it would avoid unreliable assignment of signatures, such as SBS46 in your specific case, which is a signature related to a sequencing artifact observed in early releases of TCGA. You can exclude certain reference signatures from the assignment by using the exclude_signature_subgroups parameter (following the instructions on the README) or directly providing a modified signature database using the signature_database parameter. Other options can include checking the transcriptional strand bias of your samples compared to the one observed in the reference signatures (available in the topographical features section of the COSMIC Mutational Signatures website) or excluding certain etiologies based on additional information about the input samples (e.g., restricting the assignment of chemotherapy-related signatures in treatment-naive cases).
I hope this helps, and please feel free to reopen the ticket or reach out by email<EMAIL_ADDRESS>if you have further comments or questions. Thanks for your interest!
|
2025-04-01T04:54:41.758626
| 2022-11-26T15:35:56
|
1465185435
|
{
"authors": [
"aacebedo"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13101",
"repo": "Alexays/Waybar",
"url": "https://github.com/Alexays/Waybar/issues/1834"
}
|
gharchive/issue
|
Missing tray icons for blueberry-tray and some other applications
Hi
I am using waybar on a sway based nixos install and when I try to run blueberry-tray or some other applications (such as youtube-music or mailspring) their tray icons do not show up in waybar even if they are activated.
When starting the apps I don't see any specific logs regarding an error. I don't know if it comes from my nixos install or from waybar itself but I was wondering how I can find the reason why it is not working.
waybar version: 0.9.15
blueberry-tray version: 1.4.8
youtube-music version: 1.17.0
mailspring version: 1.10.5
Closing this, as I solved it by correctly installing libappindicator on NixOS.
|
2025-04-01T04:54:41.762213
| 2023-07-14T05:58:05
|
1804235269
|
{
"authors": [
"BijanRegmi",
"MithicSpirit",
"NoahFraiture",
"oniGino",
"robertgzr"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13102",
"repo": "Alexays/Waybar",
"url": "https://github.com/Alexays/Waybar/issues/2312"
}
|
gharchive/issue
|
mpris when no player is active generates errors in logs
When no player is currently active, a lot of log noise is generated:
(waybar:5688): playerctl-WARNING **: 22:56:30.449: Spotify does not use the D-Bus property cache, getting properties directly
(waybar:5688): playerctl-WARNING **: 22:56:30.449: Spotify does not use the D-Bus property cache, getting properties directly
(waybar:5688): playerctl-WARNING **: 22:56:30.449: Spotify does not use the D-Bus property cache, getting properties directly
(waybar:5688): playerctl-WARNING **: 22:56:30.450: Spotify does not use the D-Bus property cache, getting properties directly
[2023-07-13 22:56:30.450] [error] mpris[playerctld]: GDBus.Error:com.github.altdesktop.playerctld.NoActivePlayer: No player is being controlled by playerctld
repeating every few seconds
Same issue. the error keeps repeating
I also have this issue. I suspect that it is logged on this line: https://github.com/Alexays/Waybar/blob/a90e275d5e26226c9e69abbb6f9be4d7391ba3c1/src/modules/mpris/mpris.cpp#L570C1-L570C1
Is there any news ? I also have this problem and I'm afraid of battery impact
Is there any news ? I also have this problem and I'm afraid of battery impact
Since it's just logging, I doubt it has a significant impact on energy usage.
should be fixed by https://github.com/Alexays/Waybar/pull/2622
|
2025-04-01T04:54:41.830114
| 2022-10-24T20:01:08
|
1421361189
|
{
"authors": [
"davidsmejia",
"nozomione"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13103",
"repo": "AlexsLemonade/refinebio-web",
"url": "https://github.com/AlexsLemonade/refinebio-web/issues/30"
}
|
gharchive/issue
|
Implement About Page
Context
Add the necessary components and implement them on the about page.
@davidsmejia This PR will be created based on the branch nozomione/26-create-global-components-for-app-structure, as it is dependent on issue #26, thank you!
|
2025-04-01T04:54:41.838295
| 2022-11-06T17:02:38
|
1437471955
|
{
"authors": [
"AlexxIT",
"Cruzy404",
"ZiggySatdust",
"camsaway",
"dipdipotat0chip",
"santiniuk"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13104",
"repo": "AlexxIT/SonoffLAN",
"url": "https://github.com/AlexxIT/SonoffLAN/issues/1026"
}
|
gharchive/issue
|
FIX - SONOFF BULB B02-BL seen as switch and no dimmer or temperature options
SONOFF B02-BL seen as switch and no dimmer or temperature option
I found no reference to a fix for this, nor any mention of a workaround.
SOLUTION
On the device, go to DOWNLOAD DIAGNOSTICS, open the file, and find the UIID (mine was 135).
open /config/custom_components/sonoff/core/devices.py
you should find the entry for UIID 135 missing
add the line
135: [XLightB02, RSSI], # Sonoff B02-BL
The B02-BL should now be visible as a light in your dropdowns, and you will be able to add the dimmer and the color temperature.
Hope this helps anyone.
DIPDIPOTATOCH1P
This needs to please be added/pulled into code base.
Thanks!
Fix worked a treat!
Thanks for sharing.
Just what i needed. Thank-you!
Un grand merci !!!!!
https://github.com/AlexxIT/SonoffLAN/releases/tag/v3.4.0
|
2025-04-01T04:54:41.841194
| 2022-05-04T10:42:30
|
1225200169
|
{
"authors": [
"AlexxIT",
"lgxmedia"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13105",
"repo": "AlexxIT/SonoffLAN",
"url": "https://github.com/AlexxIT/SonoffLAN/issues/814"
}
|
gharchive/issue
|
Sonoff TH10 slow update interval after 3.0.0
I use two Sonoff TH10s to monitor and control incubator temperature and humidity. I need fast and accurate readings to keep those in check. The only way to have reliable readings and reactions is to use older versions (2.4.7 works great) with these settings in configuration.yaml:
mode: local
force_update: [temperature, humidity]
scan_interval: "00:00:30"
sensors: [temperature, humidity]
Without those settings, temperature and humidity go up or down way too much until an update is received. Five minutes is bad for my use case, but even so, the update sometimes takes more than 10 minutes. Another way to update the sensors is to toggle the switch, but that interferes with automations.
Is there any way I can force the update in new versions? Are the settings in the configuration ignored in the new version?
Added to latest master version. Will be in next release
https://github.com/AlexxIT/SonoffLAN#force-update
|
2025-04-01T04:54:41.859722
| 2024-10-31T09:34:49
|
2626402014
|
{
"authors": [
"Subashree-selvaraj",
"pankaj-bind"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13106",
"repo": "AlgoGenesis/C",
"url": "https://github.com/AlgoGenesis/C/issues/1525"
}
|
gharchive/issue
|
Divide and Conquer Maximum Subarray Problem
The Maximum Subarray Problem is a classical algorithmic problem that involves finding the contiguous subarray within a one-dimensional array of numbers which has the largest sum. This problem can be efficiently solved using the Divide and Conquer approach.
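For concreteness, here is a short Python sketch of the O(n log n) divide-and-conquer solution (illustrative only; a C implementation for this repository would follow the same structure):
def max_subarray(a, lo, hi):
    # Maximum subarray sum within a[lo:hi], solved by divide and conquer.
    if hi - lo == 1:
        return a[lo]
    mid = (lo + hi) // 2
    # Best sum of a subarray crossing the midpoint: extend left, then right.
    best_left = cur = a[mid - 1]
    for i in range(mid - 2, lo - 1, -1):
        cur += a[i]
        best_left = max(best_left, cur)
    best_right = cur = a[mid]
    for i in range(mid + 1, hi):
        cur += a[i]
        best_right = max(best_right, cur)
    return max(max_subarray(a, lo, mid), max_subarray(a, mid, hi), best_left + best_right)

print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4], 0, 9))  # 6, from [4, -1, 2, 1]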
Do not create multiple issues.
|
2025-04-01T04:54:41.862853
| 2024-11-09T15:38:54
|
2646242667
|
{
"authors": [
"SimranShaikh20",
"pankaj-bind"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13107",
"repo": "AlgoGenesis/C",
"url": "https://github.com/AlgoGenesis/C/issues/1807"
}
|
gharchive/issue
|
[NEW ALGORITHM] Cheapest Flights Within K Stops Graph Probelm
Issue will be closed if:
The problem is about finding the cheapest way to travel between two cities, given a set of flights, while also respecting a limit on the number of stops. It tests your ability to work with graphs and optimize paths based on costs and constraints.
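A standard approach is a Bellman-Ford relaxation capped at K+1 rounds; a quick Python sketch for reference (illustrative, not code from this repository):
def cheapest_price(n, flights, src, dst, k):
    # flights: list of (u, v, price) edges; at most k intermediate stops allowed.
    INF = float("inf")
    cost = [INF] * n
    cost[src] = 0
    for _ in range(k + 1):
        nxt = cost[:]  # relax only against the previous round's costs
        for u, v, price in flights:
            if cost[u] + price < nxt[v]:
                nxt[v] = cost[u] + price
        cost = nxt
    return cost[dst] if cost[dst] < INF else -1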
Name:
Cheapest Flights Within K Stops
About:
Propose a new algorithm to be added to the repository
Labels:
new algorithm, gssoc-ext, hacktoberfest, level1
Assignees:
[x] Contributor in GSSoC-ext
[x] Want to work on it
@pankaj-bind pls assign me this issue!
@pankaj-bind !
Do not create multiple issues.
|
2025-04-01T04:54:41.866963
| 2024-10-18T09:34:55
|
2596995391
|
{
"authors": [
"Swastimp",
"biswajit-sarkar-007",
"hari-dev-003",
"pankaj-bind"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13108",
"repo": "AlgoGenesis/C",
"url": "https://github.com/AlgoGenesis/C/issues/981"
}
|
gharchive/issue
|
[NEW ALGORITHM] Memory Management in C (malloc, calloc, realloc, free)
Description:
A comprehensive understanding of memory management is crucial for effective C programming. This issue proposes the addition of a dedicated section in the repository to cover memory management techniques, specifically focusing on the following:
1. Dynamic Memory Allocation:
Explain the importance of dynamic memory allocation in C programming.
Introduce functions like malloc, calloc, and realloc for allocating memory at runtime.
2. Memory Deallocation:
Describe how to properly deallocate memory using the free function to prevent memory leaks.
3. Best Practices:
Discuss best practices for managing memory in C, including checking for null pointers, avoiding memory leaks, and understanding memory fragmentation.
Name:
[]
About:
Propose a new algorithm to be added to the repository
Labels:
new algorithm, gssoc-ext, hacktoberfest, level1
Assignees:
[x] Contributor in GSSoC-ext
[x] Want to work on it
Can you assign this to me?
Since I have a good grasp of the basics, I was wondering if you'd be up for assigning me a task related to malloc, calloc, realloc, and free.
Do not create unnecessary issues until you complete the previous one.
|
2025-04-01T04:54:41.883124
| 2017-01-03T09:16:05
|
198434232
|
{
"authors": [
"AlgorithmX2",
"Pitigoi"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13109",
"repo": "AlgorithmX2/FlatColoredBlocks",
"url": "https://github.com/AlgorithmX2/FlatColoredBlocks/issues/35"
}
|
gharchive/issue
|
full colours
how good of a computer is required for 16777216 colors of one block type?
I wouldn't know, you would at least need "NotEnoughIds" to extend the block id limit.
FCB isn't that demanding, but that would use a sizeable amount of memory just to keep track of the ids and references; I imagine what's in MC to keep track of those blocks would exceed what FCB would use to accomplish said task.
|
2025-04-01T04:54:41.888406
| 2021-06-27T14:56:37
|
930950332
|
{
"authors": [
"nurbek91",
"rockeynebhwani"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13110",
"repo": "AliasIO/wappalyzer",
"url": "https://github.com/AliasIO/wappalyzer/issues/4072"
}
|
gharchive/issue
|
Improve Osano detection
Example site - https://www.hobbycraft.co.uk/
Current detection -
"Osano": {
"cats": [
67
],
"description": "Osano is a data privacy platform that helps your website become compliant with laws such as GDPR and CCPA.",
"icon": "osano.png",
"scripts": "cookieconsent\\.min\\.js",
"website": "https://www.osano.com/"
},
To improve detection, look for JS object 'Osano'. Keep the current script detection as well
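For illustration, the improved entry might look like this (assuming Wappalyzer's "js" detection field, where an empty pattern just checks that the global object exists; untested):
"Osano": {
  "cats": [
    67
  ],
  "description": "Osano is a data privacy platform that helps your website become compliant with laws such as GDPR and CCPA.",
  "icon": "osano.png",
  "js": {
    "Osano": ""
  },
  "scripts": "cookieconsent\\.min\\.js",
  "website": "https://www.osano.com/"
},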
Hey there, updated.
|
2025-04-01T04:54:41.954254
| 2024-10-16T17:37:27
|
2592593798
|
{
"authors": [
"Alitindrawan24",
"shreya-paul-17"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13111",
"repo": "Alitindrawan24/Binary-Calculator",
"url": "https://github.com/Alitindrawan24/Binary-Calculator/issues/137"
}
|
gharchive/issue
|
Description bar size fixing needed
The bar beneath the calculation bar needs a bit of width adjustment to remove the irregularity, so that it becomes more visually appealing.
Here is a reference image of the above case, marked in red:
Kindly assign the issue to me
Duplicate of #136.
|
2025-04-01T04:54:42.041257
| 2024-11-22T23:05:36
|
2684887034
|
{
"authors": [
"XX-Yin",
"alexpiet",
"ellahiltonvano",
"micahwoodard"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13112",
"repo": "AllenNeuralDynamics/dynamic-foraging-task",
"url": "https://github.com/AllenNeuralDynamics/dynamic-foraging-task/issues/1059"
}
|
gharchive/issue
|
We need to investigate whether this is an issue on scopes or with code.
Cross side issues per box:
ephys1A: 2
ephys3A: 1
ephys4A: 1
2B: 1
3C: 1
3D: 1
6A: 2
6B: 1
7A: 1
7C: 1
7D: 2
8D: 1
7B: 1
Same side issues per box:
ephys3A: 5
1A: 1
1D: 1
6A: 1
7B: 1
7C: 1
Updated list of issues per box (posted week of 11/25/2024):
Cross side issues per box:
ephys1: 5
ephys3: 1
ephys4: 2
2B: 1
3C: 1
3D: 1
6A: 3
6B: 3
6D: 2
7A: 2
7B: 3
7C: 7
7D: 5
8A: 1
8C: 1
8D: 2
9D: 2
Same side issues per box:
ephys3: 13
ephys4: 1
1A: 1
1C: 1
1D: 1
6A: 2
7B: 2
7C: 1
@ZhixiaoSu recommends changing the definition of cross-side licks
https://github.com/AllenNeuralDynamics/dynamic-foraging-task/issues/1026#issuecomment-2513330895
@JeremiahYCohen suggests decreasing 100ms to 50ms
Updated 12/23/24:
Cross side issues per box:
1B: 2
1C: 1
1D: 2
3C: 3
3D: 1
6A: 2
6B: 3
6D: 3
7A: 2
7B: 5
7C: 4
7D: 6
8A: 2
8B: 2
8C: 3
8D: 2
ephys1: 5
ephys3: 5
ephys4: 3
9D: 2
Same side issues per box:
1A: 3
1C: 3
1D: 3
2B: 1
3C: 2
6A: 2
6D: 2
7B: 3
7C: 1
8B: 1
ephys3: 15
ephys4: 2
This appears to be a very serious issue and might be worth a discussion in a separate place.
While some of the lick detection problems could potentially be mitigated by adjusting parameters and the sensitivity of the lick detector, I’ve observed that the Janelia lick detector often lacks stability and consistency across different mice and days. This seems to be an inherent hardware limitation of the device.
It might be worth discussing with SIPE the possibility of developing a more stable lick detector specifically for the behavior rig. It's OK if it has licking artefacts during ephys recording while we lack an ideal solution; that could be a separate issue.
@XX-Yin I agree we need to discuss more. We are waiting on some feedback from Sue and will likely discuss in a meeting in January.
|
2025-04-01T04:54:42.048888
| 2020-04-06T19:26:59
|
595368637
|
{
"authors": [
"AmplabJenkins",
"ZacBlanco",
"ns1123"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13113",
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/11241"
}
|
gharchive/pull-request
|
Remove doubled PK defaults for master client gc threshold
Defaults were set twice accidentally. Removed them.
Merged build finished. Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/9009/
Test PASSed.
alluxio-bot, merge this please
alluxio-bot, cherry-pick this to branch-2.1 please
alluxio-bot, cherry-pick this to branch-2.2 please
|
2025-04-01T04:54:42.050606
| 2022-06-24T23:38:52
|
1284314519
|
{
"authors": [
"jiacheliu3",
"maobaolong"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13114",
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/15775"
}
|
gharchive/pull-request
|
Print snapshot file size
What changes are proposed in this pull request?
Just print the size of the snapshot file which has just been downloaded or generated.
Why are the changes needed?
Printing the snapshot file size can help to get more info when an issue is encountered.
Does this PR introduce any user facing changes?
No
@jiacheliu3 Would you please take a look at this PR? Thanks!
alluxio-bot, merge this please
|
2025-04-01T04:54:42.053030
| 2022-11-08T14:11:39
|
1440281312
|
{
"authors": [
"Xenorith",
"jja725",
"vimalKeshu"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13115",
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/16503"
}
|
gharchive/pull-request
|
Improve job worker health report
What changes are proposed in this pull request?
Restructure the worker health report class so that users of the class can get the worker health report at a point in time, and so that it is easy to add more system information when calculating the worker health.
#16502
Why are the changes needed?
It would make it easy to add new system information about the worker, like memory, IO, etc., when calculating the worker health. The new structure of the class provides the worker health report at a point in time.
Does this PR introduce any user facing changes?
No
@vimalKeshu mostly LGTM, please fix the checkStyle. @luzhang6 PTAL since it's observability related
Yes, let me fix the style. Thank you @jja725.
alluxio-bot, merge this please
|
2025-04-01T04:54:42.056727
| 2016-02-29T18:46:28
|
137336301
|
{
"authors": [
"AmplabJenkins",
"aaudiber",
"calvinjia",
"gpang"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13116",
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/2779"
}
|
gharchive/pull-request
|
[ALLUXIO-1739] Move Dataserver to Alluxio Worker
https://tachyon.atlassian.net/browse/ALLUXIO-1739
This is a step to implementing ALLUXIO-1739. After moving the data server out of the block worker, we can share the data server between different types of workers.
Merged build finished. Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/8709/
Test PASSed.
Merged build finished. Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/8710/
Test PASSed.
LGTM
|
2025-04-01T04:54:42.059692
| 2016-08-09T08:37:46
|
170110963
|
{
"authors": [
"AmplabJenkins",
"aaudiber",
"apc999",
"lshmouse"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13117",
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/3806"
}
|
gharchive/pull-request
|
[ALLUXIO-1893] Support deploying alluxio on secure yarn cluster
Send a PR because #3060 has been closed.
Thanks for @gpang @apc999 @aaudiber 's review~
https://alluxio.atlassian.net/browse/ALLUXIO-1893
alluxio-bot, check this please
alluxio-bot, check this please
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/11058/
Test PASSed.
Merged build finished. Test PASSed.
LGTM
LGTM
|
2025-04-01T04:54:42.074418
| 2017-08-21T12:55:56
|
251651432
|
{
"authors": [
"AmplabJenkins",
"ifcharming",
"uronce-cc"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13118",
"repo": "Alluxio/alluxio",
"url": "https://github.com/Alluxio/alluxio/pull/5914"
}
|
gharchive/pull-request
|
[ALLUXIO-2997] Implement PUT bucket in S3RestServiceHandler.
This PR implemented http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUT.html in Alluxio Proxy.
The following Python script using AWS S3 Python SDK to create a bucket works.
import boto
import boto.s3.connection
access_key = 'put your access key here!'
secret_key = 'put your secret key here!'
conn = boto.connect_s3(
aws_access_key_id = access_key,
aws_secret_access_key = secret_key,
host = 'localhost',
port = 39999,
path = '/api/v1/s3',
is_secure=False,
calling_format = boto.s3.connection.OrdinaryCallingFormat(),
)
# This will succeed.
conn.create_bucket('not-mount-point')
# This will raise an error formatted as XML with error message:
# "The specified bucket is not a directory directly under a mount point".
try:
conn.create_bucket('not-mount-point:bucket')
except boto.exception.S3ResponseError as e:
print(e)
@uronce-cc @calvinjia FYI, I've updated the PR to target the s3 feature branch. The s3 branch was created based on the latest master.
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/16483/
Build result: FAILURE [...truncated 2778 lines...] (long [JENKINS] artifact-archiving log truncated; ends with "channel stopped")
Test FAILed.
Merged build finished. Test FAILed.
jenkins, test this please
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/16485/
Test PASSed.
Merged build finished. Test PASSed.
jenkins, test this please
Merged build finished. Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/16490/
Test PASSed.
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/Alluxio-Pull-Request-Builder/16493/
Test PASSed.
|
2025-04-01T04:54:42.109755
| 2024-01-12T19:42:14
|
2079582431
|
{
"authors": [
"Alorel",
"Ekleog"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13119",
"repo": "Alorel/rust-indexed-db",
"url": "https://github.com/Alorel/rust-indexed-db/issues/32"
}
|
gharchive/issue
|
Default transactions to abort on drop
Hey!
Here's an idea for indexed-db-futures v0.5: what if transactions aborted on drop, instead of committing on drop?
This'd take them further from the indexed-db standard, for sure, but the indexed-db standard is based on callbacks, which are pretty different from futures anyway. And, in Rust, it's very easy to miss an early-return point when trying to make sure that returning Err from a function aborts the transaction.
TL;DR:
async fn foo() -> Result<(), Error> {
let transaction = [...];
transaction.add_key_val(...).unwrap().await.unwrap();
do_some_check(&transaction).await?;
Ok(())
}
This will (AFAICT) commit the transaction if do_some_check were to return an Err. The behavior I'd expect from the code if just reading it intuitively, would be for the transaction to be aborted.
In order to get the behavior I'd instinctively expect, I need to use the following code, which is quite a bit less pleasant to both write and read:
async fn foo() -> Result<(), Error> {
let transaction = [...];
transaction.add_key_val(...).unwrap().await.unwrap();
if let Err(e) = do_some_check(&transaction).await {
transaction.abort().unwrap();
return Err(e);
}
Ok(())
}
WDYT about adding a commit(self) function to transaction, that'd commit it (ie. just drop it), and to have IdbTransaction's Drop implementation abort the transaction if it has not been explicitly committed?
Anyway, I'm just starting using indexed-db-futures, but it already seems much, much more usable than the web-sys version. So, thank you! :D
After some more investigation, I learned that this would be a nontrivial feature.
So I decided to go ahead and implement it, and this resulted in the indexed-db crate, that has a misuse-resistant API with transactions that always abort upon errors and never commit early.
Do you want to keep this issue open to track this feature inside indexed_db_futures, or should we close it? :)
This issue has been included in v0.6.0 :tada:
- Your friendly neighbourhood :robot: semantic release bot
|
2025-04-01T04:54:42.111156
| 2023-09-08T01:13:28
|
1886789323
|
{
"authors": [
"Scrippy"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13120",
"repo": "Alpha-Authority/Szeebe",
"url": "https://github.com/Alpha-Authority/Szeebe/issues/10"
}
|
gharchive/issue
|
exec_aud.sh <- Need Push Fix to Github
Not sure how to correctly execute the Git Push / Commit / Add, feel free to make a working version.
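For reference, a working version would presumably reduce to the standard sequence below (the commit message and branch name are assumptions):
git add exec_aud.sh                  # stage the changed script
git commit -m "Fix exec_aud.sh"      # record the change
git push origin main                 # push; replace 'main' with the actual branch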
Entering the Danger Zone.
Fixed with @brplcc , with contributions from @TacticalTux and @wes34.
|
2025-04-01T04:54:42.300823
| 2023-02-18T16:14:38
|
1590390724
|
{
"authors": [
"Amark19"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13122",
"repo": "Amark19/calcont.in",
"url": "https://github.com/Amark19/calcont.in/issues/21"
}
|
gharchive/issue
|
footer padding issue
Hey @Amark19, Can i work on this?? Please assign me.
yeah sure !
Hey, I am not able to run this project locally. Can you please guide me? Here are the error screenshots:
which python version are you using ?
@rajpatel17-bot which python version are you using ?
Can you explicitly run this command: python -m pip install flake8?
You can simply run pip install -r requirements.txt, but first you need to remove the psycopg2 module from it, then run that command.
hey @rajpatel17-bot, I'll look into the issue and try it from my end, then I'll ping you..
hey @rajpatel17-bot, you can now try all the steps
|
2025-04-01T04:54:42.335758
| 2017-10-31T16:30:42
|
270027708
|
{
"authors": [
"andyanderso",
"ccorey",
"jimurl"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13123",
"repo": "AmericanAvalanche/avalanche.org-API-Client",
"url": "https://github.com/AmericanAvalanche/avalanche.org-API-Client/issues/1"
}
|
gharchive/issue
|
Is there a way to customize the base map, map size, zoom level?
I would like to be able to customize the base map, map size, and zoom level. AKA set the map height to 600px and zoom level to 8 or something. Is this possible through the API?
Hi Andy, we were checking that out too. Clark is working on a different base map (colored instead of BW). Perhaps there will be an option to set this in the HTTP POST data array.
I found that to fix the size of the map, I had to put it into a div of certain width and height.
The zoom, we found, is set automatically so that none of your regions are cut off. I had to tweak the height of ours because it was a hair too small to fit all the regions into. Once I made it slightly taller, it defaulted to the next zoom level up.
The <div style="width:100%; height:600px;"> works great for the size. Thanks, Jim, that is a nice simple solution! I would still like to set the zoom one level higher than it currently displays on our map, to make our region take up all the space in the map window.
you can check out the 3 options on my dev site if you want to compare them:
http://dev.sierraavalanchecenter.org
Yeah, I see what you mean. Try increasing the height a bit. I think whatever function auto zooms in or out, it likes to have some padding around the advisory area. Go big to find a height that will definitely work, then try dropping it down to the smallest height that will still give you the better zoom level.
See docs for latest update. Client can now supply map height as well as desired zoom level.
|
2025-04-01T04:54:42.344175
| 2019-03-06T00:02:05
|
417559262
|
{
"authors": [
"akatsukilevi",
"moelrobi"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13124",
"repo": "AminoJS/Amino.JS",
"url": "https://github.com/AminoJS/Amino.JS/issues/56"
}
|
gharchive/issue
|
Invite someone in the Chat
Describe the solution you'd like
Endpoint:
POST /api/v1//s/chat/thread//member/invite HTTP/1.1
JSON:
{
"uids": ["Array of UserID's to be invited to Chat", "MUST BE ABLE TO (i.e. the invitee needs to follow the inviter)"],
"timestamp": "A JS Timestamp"
}
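For illustration, a rough Python sketch of issuing that request (the API base, community ID, and thread ID are placeholder assumptions, since the double slashes in the endpoint above elide them; auth headers are omitted because they aren't specified here):
import time
import requests

BASE = "<api-base>"               # assumed host, not specified above
community_id = "<community-id>"   # hypothetical placeholder
thread_id = "<thread-id>"         # hypothetical placeholder

resp = requests.post(
    f"{BASE}/api/v1/{community_id}/s/chat/thread/{thread_id}/member/invite",
    json={
        "uids": ["<user-id>"],                 # users to invite; they must follow the inviter
        "timestamp": int(time.time() * 1000),  # JS-style millisecond timestamp
    },
)
print(resp.status_code)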
Got it. Will be adding at the update
I did the inviteChat.js module and wrote the example already. This new feature will be added in 3.0.0-nightly.
Closing as finished.
|
2025-04-01T04:54:42.346250
| 2021-06-18T22:10:43
|
925236566
|
{
"authors": [
"AmionSky",
"rezural"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13125",
"repo": "AmionSky/bevy_obj",
"url": "https://github.com/AmionSky/bevy_obj/pull/7"
}
|
gharchive/pull-request
|
create normal and uv data where needed so bevy's renderer doesnt complain
I'm not sure if this is the right place to solve this problem (perhaps in an AssetCleanerPlugin), but this adds normal data (which is obviously broken right now; I could compute it from the face orientation...) and vertex UV data.
I have only tested this with a file missing UV data, so it is probably horribly broken when trying to inject normal data...
Is this something you would add to the crate? If so I will test it with normal data. Otherwise I will look at creating a AssetCleanerPlugin.
Cheers
I'm definitely interested in adding this to the crate. In fact, I see this as a bugfix.
I have tested it with both 'UV and Normal' and 'only UV' missing, and it seems to work fine.
I haven't looked into how logging works in bevy, but it would be nice to maybe write out a small warning.
Also, if you could generate some simple normals, it would be even better.
But none of these are required.
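For what it's worth, flat normals from face orientation are just the normalized cross product of two triangle edges; a small Python sketch of the math (the crate itself is Rust, so this is purely illustrative):
def flat_normal(p0, p1, p2):
    # Face normal of triangle (p0, p1, p2): cross product of two edge vectors.
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    nx = uy * vz - uz * vy
    ny = uz * vx - ux * vz
    nz = ux * vy - uy * vx
    length = (nx * nx + ny * ny + nz * nz) ** 0.5 or 1.0  # guard degenerate triangles
    return (nx / length, ny / length, nz / length)  # assign to all three vertices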
|
2025-04-01T04:54:42.358711
| 2024-11-11T18:46:00
|
2650112641
|
{
"authors": [
"calinfaja",
"vikash28-cloud"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13126",
"repo": "AmruthPillai/Reactive-Resume",
"url": "https://github.com/AmruthPillai/Reactive-Resume/issues/2098"
}
|
gharchive/issue
|
[Bug] Education Category, summary text box not auto-sizing. Create Button missing
Is there an existing issue for this?
[X] Yes, I have searched the existing issues and none of them match my problem.
Product Variant
Cloud (https://rxresu.me)
Current Behavior
When you try to add something in the Education category and paste something into the Summary box, it does not auto-size, and the Create button disappears. You can't scroll in this window.
Expected Behavior
The summary box autosizes and you can press the button.
Steps To Reproduce
Go to Education Category
Paste something into the Summary Box
Create button missing.
What browsers are you seeing the problem on?
Chrome
What template are you using?
None
Anything else?
No response
I want to try this issue #2098
|
2025-04-01T04:54:42.370436
| 2023-02-10T23:18:49
|
1580474713
|
{
"authors": [
"anaconda-pkg-build",
"bkreider"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13127",
"repo": "AnacondaRecipes/atpublic-feedstock",
"url": "https://github.com/AnacondaRecipes/atpublic-feedstock/pull/1"
}
|
gharchive/pull-request
|
CFFL
CFFL
Linter check found the following problems:
The following problems have been found:
ERROR: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:22: missing_wheel: For pypi packages, wheel should be present in the host section
ERROR: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:30: missing_pip_check: For pypi packages, pip check should be present in the test commands
ERROR: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:40: missing_dev_url: The recipe is missing a dev_url
WARNING: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:40: missing_description: The recipe is missing a description
ERROR: /tmp/abs_cfbortivdo/clone/recipe/meta.yaml:40: http_url: http://public.readthedocs.io/ is not https
Errors were found
|
2025-04-01T04:54:42.373160
| 2023-11-07T16:42:03
|
1981781682
|
{
"authors": [
"boldorider4",
"lorepirri"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13128",
"repo": "AnacondaRecipes/dlfcn-win32-feedstock",
"url": "https://github.com/AnacondaRecipes/dlfcn-win32-feedstock/pull/1"
}
|
gharchive/pull-request
|
build 1.4.1
Upstream
Jira ticket
is this a SF package?
is this a SF package?
it's a dependency for pykx. Maybe it should go to SF channel.
is this a SF package?
it's a dependency for pykx. Maybe it should go to SF channel.
it's not on defaults already, so yes, it goes to SF 👌 thanks!
|
2025-04-01T04:54:42.376963
| 2023-02-13T04:13:39
|
1581624589
|
{
"authors": [
"anaconda-pkg-build",
"bkreider"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13129",
"repo": "AnacondaRecipes/duckdb-engine-feedstock",
"url": "https://github.com/AnacondaRecipes/duckdb-engine-feedstock/pull/1"
}
|
gharchive/pull-request
|
CFFL
CFFL
Linter check found the following problems:
The following problems have been found:
ERROR: /tmp/abs_a5xx2q0rsr/clone/recipe/meta.yaml:15: pip_install_args: pip install should be run with --no-deps.
ERROR: /tmp/abs_a5xx2q0rsr/clone/recipe/meta.yaml:22: missing_wheel: For pypi packages, wheel should be present in the host section
ERROR: /tmp/abs_a5xx2q0rsr/clone/recipe/meta.yaml:39: missing_pip_check: For pypi packages, pip check should be present in the test commands
ERROR: /tmp/abs_a5xx2q0rsr/clone/recipe/meta.yaml:48: missing_documentation: The recipe is missing a doc_url or doc_source_url
Errors were found
Linter check found the following problems:
The following problems have been found:
ERROR: /tmp/abs_64oznm803a/clone/recipe/meta.yaml:15: pip_install_args: pip install should be run with --no-deps.
ERROR: /tmp/abs_64oznm803a/clone/recipe/meta.yaml:22: missing_wheel: For pypi packages, wheel should be present in the host section
ERROR: /tmp/abs_64oznm803a/clone/recipe/meta.yaml:39: missing_pip_check: For pypi packages, pip check should be present in the test commands
ERROR: /tmp/abs_64oznm803a/clone/recipe/meta.yaml:48: missing_documentation: The recipe is missing a doc_url or doc_source_url
Errors were found
|
2025-04-01T04:54:42.394408
| 2024-06-05T21:59:13
|
2336900757
|
{
"authors": [
"ViridianMelody",
"anaconda-pkg-build",
"lorepirri"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13130",
"repo": "AnacondaRecipes/hmmlearn-feedstock",
"url": "https://github.com/AnacondaRecipes/hmmlearn-feedstock/pull/2"
}
|
gharchive/pull-request
|
[PKG-4871] 0.3.2
hmmlearn 0.3.2 :snowflake:
Destination channel: Snowflake
Links
PKG-4871
Upstream repository
Upstream changelog/diff
Explanation of changes:
Forced openblas on osx-64 to prevent a segfault error.
Added upstream tests.
The tests require the package to be built in place and editable, so it's currently being rebuilt in the test block. Please let me know if there's a better way to go about that.
Added $RPATH/ld64.so.1 to missing_dso_whitelist for s390x
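For context, that whitelist entry would sit under the build section of meta.yaml, roughly like this (a sketch, not the exact recipe):
build:
  missing_dso_whitelist:
    - "$RPATH/ld64.so.1"   # s390x: allow this unresolved loader reference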
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_b8iflxnl06/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:61: missing_description: The recipe is missing a description
clone/recipe/meta.yaml:28: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== ERRORS =====
clone/recipe/meta.yaml:28: missing_wheel: For pypi packages, wheel should be present in the host section
===== Final Report: =====
1 Error and 2 Warnings were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_04w0vgazwm/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:28: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_cffhv4rr3j/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_2d3roj_dif/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_47fy0mrt8y/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_e7wyvczymq/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
The tests on osx-64 are triggering a segfault any time _AbstractHMM.fit is called. I suspect it might be connected to the scikit-learn version being pulled in.
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_34zy_3spyu/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== ERRORS =====
clone/recipe/meta.yaml:38: version_constraints_missing_whitespace: Packages and their version constraints must be space separated
===== Final Report: =====
1 Error and 1 Warning were found
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_1ckgcfl1nh/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
let's create and merge first 0.3.0 and then this one, otherwise we will have to have a versioned feedstock for 0.3.0
let's create and merge first 0.3.0 and then this one, otherwise we will have to have a versioned feedstock for 0.3.0
It's still unclear whether the client actually needs both versions. The production code is identical aside from a single exception message. Jack asked them about it last week, but there was no response.
Linter check found the following problems:
ERROR conda.cli.main_run:execute(125): `conda run conda-lint /tmp/abs_73g3bjwvqe/clone` failed. (See above for error)
The following problems have been found:
===== WARNINGS =====
clone/recipe/meta.yaml:30: host_section_needs_exact_pinnings: Linked libraries host should have exact version pinnings.
===== Final Report: =====
0 Errors and 1 Warning were found
|
2025-04-01T04:54:42.396898
| 2023-02-10T15:05:42
|
1579842585
|
{
"authors": [
"anaconda-pkg-build",
"bkreider"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13131",
"repo": "AnacondaRecipes/uri-template-feedstock",
"url": "https://github.com/AnacondaRecipes/uri-template-feedstock/pull/1"
}
|
gharchive/pull-request
|
CFFL
CFFL
Linter check found the following problems:
The following problems have been found:
ERROR: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:18: pip_install_args: pip install should be run with --no-deps.
ERROR: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:23: missing_wheel: For pypi packages, wheel should be present in the host section
ERROR: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:49: missing_documentation: The recipe is missing a doc_url or doc_source_url
ERROR: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:49: missing_dev_url: The recipe is missing a dev_url
WARNING: /tmp/abs_e3pm9q3n55/clone/recipe/meta.yaml:49: missing_description: The recipe is missing a description
Errors were found
|
2025-04-01T04:54:42.405776
| 2024-06-05T20:13:35
|
2336755544
|
{
"authors": [
"AndGem",
"corop"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13132",
"repo": "AndGem/osm_extract_polygon",
"url": "https://github.com/AndGem/osm_extract_polygon/issues/45"
}
|
gharchive/issue
|
Not working
╰─$ ./osm_extract_polygon -f kazakhstan-latest.osm.pbf
error: cannot set both -o (--overwrite) and -s (--skip)!
uff. sorry.. I must have broken the recent build. Will fix asap.
Hey @corop ,
can you try the new release https://github.com/AndGem/osm_extract_polygon/releases/tag/v.0.5.5 ?
I think this fixes the issue you have reported. Many thanks for pointing this out.
Hey @corop ,
can you try the new release https://github.com/AndGem/osm_extract_polygon/releases/tag/v.0.5.5 ?
I think this fixes the issue you have reported. Many thanks for pointing this out.
My test of the 0.5.5 version still fails ((( need to check again... i5, Debian 12, zsh
-rw-r--r-- 1 asergeev asergeev 34M Jun 5 02:26 china-latest-admin.osm.pbf
-rwxr-xr-x 1 asergeev docker 2.5M Jun 6 23:39 osm_extract_polygon
-rw-r--r-- 1 asergeev docker 6.2K Jun 5 08:40 README.md
╭─asergeev@uran ~/work/osm/country-borders/china
╰─$ ./osm_extract_polygon --file china-latest-admin.osm.pbf
error: cannot set both -o (--overwrite) and -s (--skip)!
Hey @corop ,
thanks for the update. I understand that you want to check again?
Either 0.5.5 or the newest 0.5.6 should both work. Please let me know if there is an issue, and ideally with a pbf file (or a link where I can download it from) to try it out.
i assume everything is fine now
|
2025-04-01T04:54:42.431834
| 2024-09-02T21:50:50
|
2501615892
|
{
"authors": [
"AndreaPontrandolfo"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13133",
"repo": "AndreaPontrandolfo/sheriff",
"url": "https://github.com/AndreaPontrandolfo/sheriff/issues/252"
}
|
gharchive/issue
|
Crash for missing deps
Crash on Windows, with some errors about missing deps.
Invalid. It was an issue with Node.js on my Windows system.
|
2025-04-01T04:54:42.456672
| 2020-02-19T09:16:30
|
567424159
|
{
"authors": [
"AndrewRissing",
"holigo1"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13134",
"repo": "AndrewRissing/GenericParsing",
"url": "https://github.com/AndrewRissing/GenericParsing/pull/11"
}
|
gharchive/pull-request
|
Open DataSource without locking it
This allows keeping the datasource file writable by the application generating it while parsing.
(Sorry for the online editing of github screwing file encoding)
Good call. I'll get this merged in, along with a contributor file. At the latest by the end of the week, hopefully.
|
2025-04-01T04:54:42.481892
| 2021-11-09T14:59:06
|
1048707101
|
{
"authors": [
"Andy-Python-Programmer",
"JCapucho"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13135",
"repo": "Andy-Python-Programmer/stivale",
"url": "https://github.com/Andy-Python-Programmer/stivale/pull/5"
}
|
gharchive/pull-request
|
Allow mutating the SMP info tag
Allows getting a mutable reference to StivaleSmpTag and mutating the StivaleSmpInfo structs, this can be used to start the application processors.
Closes #4
Thanks @JCapucho!
|
2025-04-01T04:54:42.485078
| 2023-07-13T18:17:22
|
1803537740
|
{
"authors": [
"AndyEveritt",
"MajliTech"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13136",
"repo": "AndyEveritt/GcodeParser",
"url": "https://github.com/AndyEveritt/GcodeParser/issues/4"
}
|
gharchive/issue
|
Way to calculate grams?
How can I calculate grams from a given Gcode file?
Thanks.
Built CuraEngine, wontfix
Sorry, I missed this issue. You would need to use the diameter of the filament to calculate the cross-sectional area of the filament. Then multiply by the extrusion distance in the G1 commands to get volume. Multiply volume by the density of the filament to get mass.
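As a worked example in Python (the 1.75 mm diameter and the PLA density of 1.24 g/cm3 are assumed values, not from this thread):
import math

FILAMENT_DIAMETER_MM = 1.75   # assumed filament diameter
DENSITY_G_PER_CM3 = 1.24      # assumed density (PLA)

def extrusion_grams(total_e_mm):
    # Cross-sectional area of the filament in mm^2.
    area_mm2 = math.pi * (FILAMENT_DIAMETER_MM / 2) ** 2
    volume_cm3 = area_mm2 * total_e_mm / 1000.0  # mm^3 -> cm^3
    return volume_cm3 * DENSITY_G_PER_CM3

# total_e_mm is the summed E-axis distance parsed from the G1 commands.
print(extrusion_grams(5000))  # ~14.9 g for 5 m of 1.75 mm PLA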
|
2025-04-01T04:54:42.489476
| 2018-10-16T16:11:10
|
370689021
|
{
"authors": [
"AngeloStavrow",
"sauerj"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13137",
"repo": "AngeloStavrow/indigo",
"url": "https://github.com/AngeloStavrow/indigo/issues/20"
}
|
gharchive/issue
|
Homepage URL on 404.html is broken
Hey, thanks for your work. I like your theme.
I noticed that on the 404.html page the link back to the homepage is broken.
Instead of a valid URL, a link to Page("Homepage name") is created.
I have a fix ready and can submit a pull request if you want.
Thanks for reporting this, and for your kind words!
If you've got a pull request ready, please go ahead and submit it — I'll review it ASAP.
|
2025-04-01T04:54:42.529618
| 2023-03-21T16:41:13
|
1634317222
|
{
"authors": [
"Anjok07",
"Dyslexicon",
"sabaasa"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13138",
"repo": "Anjok07/ultimatevocalremovergui",
"url": "https://github.com/Anjok07/ultimatevocalremovergui/issues/460"
}
|
gharchive/issue
|
Manually Import Model Files?
Hello, could a feature be added to allow importing a .th Demucs model file from a local directory, please?
Not a code-warrior, so I would appreciate an option to import model files manually from my hard drive.
I like this idea too. I've wanted an option to add my own models without issues
Importing your own models is very possible in UVR. For Demucs, only custom v3 and v4 models can be imported. You will need to create a .yaml for them and reference the models accordingly.
Importing your own models is very possible in UVR. For Demucs, only custom v3 and v4 models can be imported. You will need to create a .yaml for them and reference the models accordingly.
How does one create a .yaml file and correlate it to the .th file so that UVR recognizes them as a pair?
And then would the model show up in UVR automatically after relaunch, under one of the Demucs categories?
Very specific directions appreciated. Thank you :)
Importing your own models is very possible in UVR. For Demucs, only custom v3 and v4 models can be imported. You will need to create a .yaml for them and reference the models accordingly.
How does one create a .yaml file and correlate it to the .th file so that UVR recognizes them as a pair?
And then would the model show up in UVR automatically after relaunch, under one of the Demucs categories?
Very specific directions appreciated. Thank you :)
Here's an example:
Download and rename this file and use it as a template yaml file.
Change the yaml file name to whatever you want (obviously keep the .yaml)
Edit the yaml file to reference your Demucs model.
Example: Change the "955717e8" in the yaml to the name of the .th model you wish to associate with it.
models: ['955717e8']
Make sure the yaml and .th model are in the following directory - models/Demucs_Models/v3_v4_models
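So, if your model file were named my_model.th (a hypothetical name), the edited line in the yaml would read, with the rest of the template left as downloaded:
models: ['my_model']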
Thank you, this process worked! UVR found the model and was able to begin the separation process.
However, an error occurs straight away: "bag_num" not defined. I am sure there may be subsequent errors that need troubleshooting to get this to work; is there an email I can contact you at for a bit of guidance through any errors that may pop up?
|
2025-04-01T04:54:42.544532
| 2024-04-24T17:49:13
|
2261855189
|
{
"authors": [
"Ansub",
"dakshsinghrathore",
"epoll31"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13140",
"repo": "Ansub/SyntaxUI",
"url": "https://github.com/Ansub/SyntaxUI/issues/69"
}
|
gharchive/issue
|
✨: Create issue forms for respective buttons
Is your feature request related to a problem? Please describe.
Currently, we lack issue forms, which are crucial in making the whole process of raising an issue smooth for newbie contributors.
Describe the solution you'd like
Describe alternatives you've considered
NULL
Additional context
Add any other context or screenshots about the feature request here.
this actually makes sense, thanks for this idea.
if you have the template ready can you raise a PR for this?
@Ansub yes will be raising one shortly.
@dakshsinghrathore This is pretty cool
@dakshsinghrathore Can you still upload photos in the forms?
Yes @epoll31 check this out
That's neat! I didn't realize those were md fields.
@epoll31 github recently added this feature, do give it a read.
#72
Merged PR #72
|
2025-04-01T04:54:42.603599
| 2022-04-04T17:40:31
|
1192106980
|
{
"authors": [
"Antonio-Laguna",
"nzakas"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13141",
"repo": "Antonio-Laguna/markdown-it-image-figures",
"url": "https://github.com/Antonio-Laguna/markdown-it-image-figures/issues/10"
}
|
gharchive/issue
|
Bug: Can't load ESM module in Node.js
Environment:
Node v16.3.0
npm v7.10.0
What I did
Tried to load the plugin in an ESM file like this:
import figuresPlugin from "markdown-it-image-figures";
What happened
Got the following error:
import figuresPlugin from "markdown-it-image-figures";
^^^^^^^^^^^^^
SyntaxError: The requested module 'markdown-it-image-figures' does not provide an export named 'default'
at ModuleJob._instantiate (node:internal/modules/esm/module_job:121:21)
at async ModuleJob.run (node:internal/modules/esm/module_job:171:5)
at async Loader.import (node:internal/modules/esm/loader:178:24)
at async Object.loadESM (node:internal/process/esm_loader:68:5)
Notes
From looking through the dist folder locally, it appears that all of the generated files that should be ESM are actually CommonJS, including the ones marked as .module.js and .mjs. I'm guessing this has something to do with microbundle.
Has been released as 2.0.2, @nzakas could you double-check on your end?
Confirmed! This fixes the issue. Thanks so much.
Thanks for confirming! Closing this!
|
2025-04-01T04:54:42.608252
| 2024-02-04T11:05:56
|
2117038587
|
{
"authors": [
"AntonyLeons",
"nicandris"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13142",
"repo": "AntonyLeons/Ward",
"url": "https://github.com/AntonyLeons/Ward/pull/107"
}
|
gharchive/pull-request
|
Feature/background color
Add ability to enable/disable the background fog
If fog is disabled, the default background color is set based on your theme.
The background color can be customized using a hex value, e.g. #3c3c3c, if fog is disabled.
Thanks for your contribution. I think we need to add this to the setup flow too, not just environment variables. I can work on this in a few weeks if you don't have time.
I added it in the setup also. Please take a good look, since I'm rusty on JS/HTML.
The setup looks good; one issue though:
the default behaviour has now changed: fog is now disabled by default. Consider calling the flag "disable fog", at least internally.
This will change behaviour for people who auto-update with Docker.
I added it as a "migration" method in IndexService. Can be moved to a separate migration service if needed
Looks good, will merge soon; just a minor suggestion, but it doesn't matter.
|
2025-04-01T04:54:42.655676
| 2023-05-22T08:27:30
|
1719172900
|
{
"authors": [
"Tarun0951",
"chiki012"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:13143",
"repo": "Anupkjha2601/food-recipes-website",
"url": "https://github.com/Anupkjha2601/food-recipes-website/issues/98"
}
|
gharchive/issue
|
[GSSOC' 23 ] HOVERING IS NOT WORKING FOR TWO BUTTONS
I am an Open source Contributor
DESCRIPTION:-
The hover effect of the Instagram and LinkedIn icons isn't working on the site, so please give me an opportunity to correct those; I would also like to change the UI a bit to make it more impressive.
I CAN FIX THIS AND MAKE IT BETTER WITH GOOD ICONS
CODE OF CONDUCT
- I follow the Contributing Guidelines of this project
Hello, I have already worked on a food recipe app, so I'm well placed to handle this issue. Please assign me this issue.
I'm a GSSOC'23 contributor.
|