| added (string, 2025-04-01 04:05:38 – 2025-04-01 07:14:06) | created (timestamp[us], 2001-10-09 16:19:16 – 2025-01-01 03:51:31) | id (string, length 4–10) | metadata (dict) | source (string, 2 classes) | text (string, length 0–1.61M) |
|---|---|---|---|---|---|
2025-04-01T06:37:06.146651
| 2019-03-15T18:29:58
|
421651727
|
{
"authors": [
"OscarHernandezG",
"ValdiviaDev"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1366",
"repo": "JellyBitStudios/JellyBitEngine",
"url": "https://github.com/JellyBitStudios/JellyBitEngine/issues/50"
}
|
gharchive/issue
|
When Alita walks near enemies her rotation goes crazy
Description:
When Alita walks near enemies her rotation goes crazy
Build:
v<IP_ADDRESS>
Type:
Alita
Steps to reproduce:
Walk near an enemy
Frequency:
Sometimes
This was caused by the enemies pushing the player, but it should be alright now.
|
2025-04-01T06:37:06.161969
| 2015-05-02T22:10:30
|
72705815
|
{
"authors": [
"JeremySkinner",
"SeanKilleen",
"csaloio",
"jrharmon"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1367",
"repo": "JeremySkinner/FluentValidation",
"url": "https://github.com/JeremySkinner/FluentValidation/issues/60"
}
|
gharchive/issue
|
Will there be a DNX Core 5 version of this library created?
It would be great if this was useable w/ DNX core projects.
Yes, there will be.
Any rough time frame on when you are planning on this?
Thanks!
The initial work is done, but it was done against the VS2015 beta before the aspnetcore -> dnxcore rename, so that all still needs doing.
All the automatic integration with ASP.NET MVC needs writing too, but this needs the new validation API in MVC6 to be finished first.
I'd say this is still several months away from being finished, and I'm away for all of July, August and half of September, so it's unlikely to be until later in the year - sorry.
Cleanup: Closing this issue since it hasn't had activity in about a year.
|
2025-04-01T06:37:06.163391
| 2019-05-15T08:16:50
|
444298478
|
{
"authors": [
"JeremySkinner",
"ry8806"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1368",
"repo": "JeremySkinner/FluentValidation",
"url": "https://github.com/JeremySkinner/FluentValidation/pull/1122"
}
|
gharchive/pull-request
|
Spelling corrections and readability changes
Thanks for a great lib!
Came across a couple of small changes needed to the ASP.NET Core docs as I was reading through - hence this PR
Merged, thanks!
|
2025-04-01T06:37:06.180477
| 2015-11-23T03:20:16
|
118312930
|
{
"authors": [
"EMCP"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1369",
"repo": "JesusGuerrero/MAHRIO",
"url": "https://github.com/JesusGuerrero/MAHRIO/issues/41"
}
|
gharchive/issue
|
error in Android
Not finding a certain file on the landing page of the Android build.
Patched in the latest PR and merged with master. It was just a typo in a reference line.
|
2025-04-01T06:37:06.185639
| 2021-07-27T16:49:54
|
954073296
|
{
"authors": [
"funkstermonster",
"zaleslaw"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1370",
"repo": "JetBrains/KotlinDL",
"url": "https://github.com/JetBrains/KotlinDL/pull/163"
}
|
gharchive/pull-request
|
Bugfix/161 correct temporary dataset path
Now the data.zip is downloaded to the correct path.
This fixes the issue: Incorrect temporary folder for the cats-vs-dogs dataset LINK
fixes #161
Hi @zaleslaw! Did you have a chance to take a look at it?
Fixed in #235
|
2025-04-01T06:37:06.198887
| 2021-07-16T14:55:03
|
946367262
|
{
"authors": [
"Schahen",
"okushnikov"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1371",
"repo": "JetBrains/compose-multiplatform",
"url": "https://github.com/JetBrains/compose-multiplatform/issues/902"
}
|
gharchive/issue
|
Deprecate Color.RGB, Color.HSL etc. functions in favor of top-level rgb, hsl and so on
We've decided that this:
borderColor(rgb(10, 20, 30))
is better than this
borderColor(Color.rgb(200, 20, 10))
Please check the following ticket on YouTrack for follow-ups to this issue. GitHub issues will be closed in the coming weeks.
|
2025-04-01T06:37:06.216452
| 2020-12-18T12:51:50
|
770863596
|
{
"authors": [
"IKrukov-HORIS",
"alshan"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1372",
"repo": "JetBrains/lets-plot",
"url": "https://github.com/JetBrains/lets-plot/issues/270"
}
|
gharchive/issue
|
size_unit for geom_text should take into account user specified label_format
Otherwise labels can overlap.
IMO, it would be better to have a parameter specifying the number of symbols that constitute a label taken as a unit.
|
2025-04-01T06:37:06.217807
| 2015-06-10T05:57:43
|
86851982
|
{
"authors": [
"gep13",
"striglone"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1373",
"repo": "JetBrains/meta-runner-power-pack",
"url": "https://github.com/JetBrains/meta-runner-power-pack/pull/51"
}
|
gharchive/pull-request
|
Compatibility fix for Chocolatey v<IP_ADDRESS>
The chocolatey executable files were moved a couple of versions back breaking this meta-runner.
This change updates the path to the new location and appends the '-y' option to suppress the user confirmation to install the package.
:+1:
|
2025-04-01T06:37:06.220743
| 2021-09-30T20:24:11
|
1012589533
|
{
"authors": [
"dmprusak",
"philipto"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1374",
"repo": "JetBrains/swot",
"url": "https://github.com/JetBrains/swot/pull/12611"
}
|
gharchive/pull-request
|
Add high school
High school where one of the faculties is computer science.
School website: https://brzozowa5.edu.pl/
@dmprusak Pull request merged. Thank you!
|
2025-04-01T06:37:06.222843
| 2024-02-27T13:52:32
|
2156650615
|
{
"authors": [
"masghi",
"philipto"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1375",
"repo": "JetBrains/swot",
"url": "https://github.com/JetBrains/swot/pull/20215"
}
|
gharchive/pull-request
|
Create ITTS.txt
please add my college
@masghi The first verification stage is complete: the official domain for **Institut Teknologi Tangerang Selatan** is itts.ac.id, to the best of my knowledge. This is just an informational note. The pull request is still under review. The review may take some more time. I greatly appreciate your patience. Here is the proof of domain ownership: https://opencourse.itts.ac.id/
@masghi Pull request merged. Thank you.
|
2025-04-01T06:37:06.228061
| 2024-12-11T01:07:55
|
2731572531
|
{
"authors": [
"bobderf",
"philipto"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1376",
"repo": "JetBrains/swot",
"url": "https://github.com/JetBrains/swot/pull/25775"
}
|
gharchive/pull-request
|
Add files via upload
Blount County
@bobderf Please provide us with:
the school official website URL
the school street address, including city and country
a proof, which shows that the school recognizes the domain you are submitting as an official email domain for the students.
@bobderf You might have missed my previous comment. Please react.
Sorry.
On Sat, Dec 14, 2024 at 6:02 AM Philip Torchinsky wrote:
@bobderf You might have missed my previous comment. Please react.
On Thu, Dec 12, 2024 at 5:37 AM Philip Torchinsky wrote:
@bobderf Please provide us with:
the school official website URL = https://heritagehigh.blountk12.org/
the school street address, including city and country = 3741 E Lamar Alexander Pkwy, Maryville, TN 37804
a proof, which shows that the school recognizes the domain you are submitting as an official email domain for the students. I'm not sure what you mean, I am a student?
@bobderf Thank you for the clarifications. Pull request merged. Please start requesting the licenses in about an hour, to let the changes propagate through our system.
|
2025-04-01T06:37:06.229386
| 2020-01-07T13:37:39
|
546287756
|
{
"authors": [
"SacretFlyer",
"philipto"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1377",
"repo": "JetBrains/swot",
"url": "https://github.com/JetBrains/swot/pull/7427"
}
|
gharchive/pull-request
|
Adds Harrow International School Beijing
北京哈罗国际学校
Harrow International School Beijing
www.harrowbeijing.cn
@SacretFlyer Pull request merged. Thank you!
|
2025-04-01T06:37:06.230644
| 2020-04-21T21:21:26
|
604294472
|
{
"authors": [
"haydhook",
"philipto"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1378",
"repo": "JetBrains/swot",
"url": "https://github.com/JetBrains/swot/pull/8177"
}
|
gharchive/pull-request
|
Uploaded: Ron Dearing UTC - University Technical College
Website: https://www.rondearingutc.com/
@haydhook Pull request merged. Thank you!
|
2025-04-01T06:37:06.234834
| 2020-10-20T17:02:27
|
725781687
|
{
"authors": [
"philipto",
"uhCHS"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1379",
"repo": "JetBrains/swot",
"url": "https://github.com/JetBrains/swot/pull/9653"
}
|
gharchive/pull-request
|
Adding hwbcymru domain
https://www.cardiffhigh.cardiff.sch.uk/
https://i.imgur.com/XQofrpK.png
@uhCHS Please provide us with a proof, which shows that the school recognizes the domain you are submitting as an official email domain for the students. I am sorry, the screenshot of a mailbox is not enough. The screenshot shows that you probably have access to the mailbox there, but it says nothing about recognition of the domain by the school.
@philipto Thanks for getting back to me. You can see this link, which is from the government, showing that the hwbcymru domain has been set up for 85% of Wales schools. This is something that was enforced by the government and not by the school itself. Thanks
forgot to provide the link https://gov.wales/wales-leads-way-microsoft-schools
@uhCHS Pull request merged. Thank you!
FTR: additional proof of the domain validity is at https://hwb.gov.wales/news/articles/dc715915-79e3-4582-8789-5ea9f664c908
@philipto Does it take time to activate?
https://i.imgur.com/XOsnS0q.png
working now thanks :+1:
|
2025-04-01T06:37:06.251302
| 2020-03-27T09:55:07
|
589015268
|
{
"authors": [
"FineStrokes",
"JiHong88",
"rwaldron"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1380",
"repo": "JiHong88/SunEditor",
"url": "https://github.com/JiHong88/SunEditor/issues/300"
}
|
gharchive/issue
|
Facing css import issue in create react app in typescript
I am facing a CSS import issue in a create-react-app project with TypeScript.
Can you please let me know how I can resolve this issue, as I have imported the CSS into my TypeScript file as below
import 'suneditor/dist/css/suneditor.min.css';
Which version are you using?
<EMAIL_ADDRESS>
I have added the path of the suneditor.min.css in CopyWebpackPlugin in Webpack
as node_modules/suneditor/dist/css/suneditor.min.css
And in
new HtmlWebpackIncludeAssetsPlugin({
assets: [
'css/suneditor.min.css'] })
And use in my typescript file as
import 'suneditor/dist/css/suneditor.min.css';
It works in my dev env but fails in Prod after build, as shown in the figure above
It seems that the web font is not loaded.
You must have a file loader.
{
test: /\.(eot|svg|ttf|woff|woff2)(\?v=[0-9]\.[0-9]\.[0-9])?$/,
use: [{
loader: "file-loader",
options: {
publicPath: '../',
name: 'fonts/[hash].[ext]',
limit: 5000,
mimetype: 'application/font-woff'
}
}]
}
Or
The latest version uses "inline svg" rather than a web font.
Please update the editor to the latest version.
ok how do I use font import in CopyWebpackPlugin
Sorry, I don't know..😭
I recommend updating the version.
I set it like this in my webpack environment.
{
test: /\.(eot|svg|ttf|woff|woff2)(\?v=[0-9]\.[0-9]\.[0-9])?$/,
use: [{
loader: "file-loader",
options: {
publicPath: '../',
name: 'fonts/[hash].[ext]',
limit: 5000,
mimetype: 'application/font-woff'
}
}]
}
Even if the web font is loaded with this setting, it is recommended to use the latest version.
Can you let me know what the improvement is in switching to the newer version?
Please refer to the release history.
https://github.com/JiHong88/SunEditor/releases
I was experiencing similar issues, as I have
import 'suneditor/dist/css/suneditor.min.css';
...in my code as well.
Here's what I've learned: in development mode, webpack is just copying the contents of that file and putting it into a <style> element, without interacting with the source. In production mode, webpack will open the source, copy the contents into memory and attempt to minify it with Terser, however there must be a syntax error in 'suneditor/dist/css/suneditor.min.css', because it fails to parse and therefore fails to include it in the final production bundle. There is also evidence that 'suneditor/dist/css/suneditor.min.css' has a syntax error, because my text editor cannot apply css syntax highlighting to it. I will report back if I can find more information.
According to csslint.com: Expected RBRACE at line 1, col 30372.
I think my CSS syntax highlighter, Terser, and CSS Lint all don't understand the 1turn value here:
@keyframes spinner{
to {
transform: rotate(1turn);
}
}
When I change it to:
@keyframes spinner{
to {
transform: rotate(360deg);
}
}
CSS Lint is happy.
@rwaldron Thanks for your feedback!
I have checked this issue, and it seems that "cssnano" converts "360deg" to "1turn".
https://github.com/cssnano/cssnano/issues/823
This issue does not seem to be resolved. :(
I will fix this issue by changing "360deg" to "361deg".
The 2.30.0 version has been updated.
If this issue has not been resolved, please reopen this issue.
Thank you.
|
2025-04-01T06:37:06.258375
| 2020-08-09T22:32:31
|
675784038
|
{
"authors": [
"10000TB"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1381",
"repo": "Jiar/SegementSlide",
"url": "https://github.com/Jiar/SegementSlide/issues/60"
}
|
gharchive/issue
|
collectionView controller as contentviewcontroller ?
It looks like all the examples use a table view controller as the content view controller. Is a collection view controller supported as content? Any example?
nvm, it also works 👍
There seems to be no limitation on what type of view controller it can accept as content. Pretty general.
|
2025-04-01T06:37:06.262665
| 2024-03-06T17:35:46
|
2172074950
|
{
"authors": [
"FDT2k",
"JibayMcs",
"RicLeP",
"invaders-xx"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1382",
"repo": "JibayMcs/filament-tour",
"url": "https://github.com/JibayMcs/filament-tour/issues/12"
}
|
gharchive/issue
|
Can’t open tour from getHeaderActions but does work from widget button.
What happened?
Launching a tour from a page header action does not work, I have the following:
protected function getHeaderActions(): array
{
return [
Action::make('Tour')->dispatch('filament-tour::open-tour', ['tour_dashboard']),
];
}
When clicking the button I get this error Tour with id 'tour_dashboard' not found.
However, if I add this to a widget’s view on the same page, it will launch the tour as expected.
<button wire:click="$dispatch('filament-tour::open-tour', 'tour_dashboard')">Tour</button>
How to reproduce the bug
Create a tour on a custom dashboard then include a header action to dispatch the open-tour event.
Package Version
<IP_ADDRESS>
PHP Version
8.2.15
Laravel Version
10.46.0
Which operating systems does this happen with?
Windows
Notes
No response
I have the same issue: the console displays "Tour with id 'blabla' not found".
Same for me.
Solution here #8 Adding tour_ before the id
Action::make('demo')->link()->action(fn($livewire)=>$livewire->dispatch('filament-tour::open-tour', 'tour_grow-page-list')),
// or
Action::make('demo')->link()->dispatch('filament-tour::open-tour', ['tour_grow-page-list']),
Hi ! This bug will be fixed in the next release !
|
2025-04-01T06:37:06.265837
| 2018-09-25T22:07:03
|
363784797
|
{
"authors": [
"bemasc",
"fortuna",
"shatlyktma"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1383",
"repo": "Jigsaw-Code/Intra",
"url": "https://github.com/Jigsaw-Code/Intra/issues/80"
}
|
gharchive/issue
|
Enable TCP_NODELAY
It might make sense to enable TCP_NODELAY in our SOCKS server to avoid unnecessary Nagling.
I think this is fixed with the migration to Go?
Yeah, Go is nodelay by default.
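For reference, disabling Nagle's algorithm is a one-line socket option in most languages. A minimal Python sketch (illustrative only; the Intra code itself is Go, where `net.TCPConn` has NoDelay enabled by default):

```python
import socket

def make_nodelay_socket() -> socket.socket:
    """Create a TCP socket with Nagle's algorithm disabled.

    With TCP_NODELAY set, small writes (e.g. a SOCKS handshake or a short
    DNS-over-TCP query) are sent immediately instead of being buffered
    while waiting for an ACK of earlier data.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    return sock
```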
|
2025-04-01T06:37:06.266982
| 2020-09-23T19:12:11
|
707614250
|
{
"authors": [
"Sebas3525",
"honk7777"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1384",
"repo": "Jiiks/BetterDiscordApp",
"url": "https://github.com/Jiiks/BetterDiscordApp/issues/955"
}
|
gharchive/issue
|
bugged "Click To Update!"
as the title says, the click to update at the top is bugged and can't be clicked on.
image
I'm having the same issue :/
|
2025-04-01T06:37:06.276701
| 2019-09-23T14:22:58
|
497129848
|
{
"authors": [
"Jinmo",
"tmr232"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1385",
"repo": "Jinmo/idapkg",
"url": "https://github.com/Jinmo/idapkg/issues/13"
}
|
gharchive/issue
|
Add a license
Can you please add a license to the repository?
The absence of a license makes usage and contribution tricky.
Thanks for the suggestion! I'll add MIT license.
|
2025-04-01T06:37:06.293422
| 2024-07-14T23:20:47
|
2407637123
|
{
"authors": [
"JoeGruffins",
"ukane-philemon"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1386",
"repo": "JoeGruffins/cake_wallet",
"url": "https://github.com/JoeGruffins/cake_wallet/pull/13"
}
|
gharchive/pull-request
|
Add decred to macos/ios scripts and rescan page bug fix
Builds on #12.
Tested on simnet: https://github.com/JoeGruffins/cake_wallet/issues/11#issuecomment-2227516005
Modified commits:
https://github.com/JoeGruffins/cake_wallet/commit/ceef7e0f23689659fd96fc608c22e6e423b969c1
https://github.com/JoeGruffins/cake_wallet/commit/97ad5c308c89d6aae5b0b555de93aef2e6d46f67
https://github.com/JoeGruffins/cake_wallet/commit/f5305670a2a775a54534820e32283efd54abdf66
Is 7531942 for testing or do we need those changes?
Is 7531942 for testing or do we need those changes?
Errm, most of it was from @itswisdomagain's commit on the stale branch. But we need the script and readme changes.
We would clean that commit of test changes once we are ready to push upstream.
@JoeGruffins you might need to replace the upstream commit with this one: https://github.com/JoeGruffins/cake_wallet/pull/13/commits/55d837ccc2a1a42083b95d7d9794e4e184a170ea
Ok I updated it.
Thanks.
getting an error when building:
lib/entities/provider_types.dart:50:29: Error: A non-null value must be returned since the return type 'List<ProviderType>' doesn't allow null.
- 'List' is from 'dart:core'.
- 'ProviderType' is from 'package:cake_wallet/entities/provider_types.dart' ('lib/entities/provider_types.dart').
static List<ProviderType> getAvailableBuyProviderTypes(WalletType walletType) {
^
lib/entities/provider_types.dart:87:29: Error: A non-null value must be returned since the return type 'List<ProviderType>' doesn't allow null.
- 'List' is from 'dart:core'.
- 'ProviderType' is from 'package:cake_wallet/entities/provider_types.dart' ('lib/entities/provider_types.dart').
static List<ProviderType> getAvailableSellProviderTypes(WalletType walletType) {
I think this comment was intended for #14, it has been resolved.
Yeah sorry about that.
Changes have been added to the upstream pr so closing.
|
2025-04-01T06:37:06.339324
| 2023-03-16T19:18:15
|
1628121859
|
{
"authors": [
"John4064"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1387",
"repo": "John4064/WoW-Ranking",
"url": "https://github.com/John4064/WoW-Ranking/issues/7"
}
|
gharchive/issue
|
Pick Color Scheme & name
For fun easy issue
On this now, as well as Figma
https://www.figma.com/file/ExU4vVOZXjSc01fPM60rU5/Untitled?node-id=10%3A39&t=AV6VEJNCelnS5bnE-1
|
2025-04-01T06:37:06.340845
| 2019-11-25T04:05:26
|
527831334
|
{
"authors": [
"admiral-ackbar",
"glouel"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1388",
"repo": "JohnCoates/Aerial",
"url": "https://github.com/JohnCoates/Aerial/issues/887"
}
|
gharchive/issue
|
Show Battery
It would be nice if there was a setting to show the current battery level or to hide it if the battery was fully charged
Adding this as a maybe
This is now in beta3 : https://github.com/JohnCoates/Aerial/releases/tag/v1.6.5beta3
Beware after installing, you will need to set again your video format (1080p, 4K, HDR) as I changed some more internal stuff.
|
2025-04-01T06:37:06.350422
| 2022-01-02T13:52:41
|
1092024936
|
{
"authors": [
"JohnDTill"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1389",
"repo": "JohnDTill/Forscape",
"url": "https://github.com/JohnDTill/Forscape/issues/19"
}
|
gharchive/issue
|
Static typing and sizing
Currently Forscape does all type and dimension checking at runtime. We would like to transfer this responsibility to compile time, not only for performance, but also for semantic formatting. We can highlight errors prior to running the interpreter, and even bold matrix identifiers.
There are challenges to consider:
Generic functions
Symbolic dimensions, e.g. an m×n matrix
Aggregate types, e.g. lists of X, sets of X, etc.
Nonhomogeneous aggregate types
Subtypes, e.g. int is a subtype of rational is a subtype of real
Inference
Trying something too difficult and failing
Probably use "call-site template function instantiation", or whatever it's really called. I'm having trouble finding educational material.
With call-site instantiation of generic functions:
The function declaration doesn't give you much, just the number of parameters, number of default args, and their types
A function plus a list of arg types lets you deduce the return type
The function to call is the result of an expression, so you need to be sure it is instantiated, and you need to deduce the return type without being able to statically determine the function to call
Can you return a generic function, then deduce its type at call sites? You need the body to deduce the type. When the called function could evaluate to different options, you need all the options.
I halfway think it can work, but it's doing my head in. Maybe I should sleep.
Actually, I think I'm biting off much more than I can chew with generic functions. I don't need to get bogged down on tangents, just make a restriction that the function declaration includes a fully specified signature, and make a note to relax that restriction later.
Well declare-time deduction is incredibly limited. It has challenges with recursion and functions passed as arguments. The closure examples are a little contrived, but the isolated function definitions have ambiguous signatures. The root finding example fails because for f(x), f could be a function ℝ → ℝ or just a member of ℝ resulting in implicit multiplication. To have any chance of success, you'll need to consider the call site. But then you're statically tying identifiers to functions, which is not cool.
But static typing is the gateway to most of the interesting features, and it should be done non-intrusively. Get it together!
Well this just won't happen quickly. First things first, I need another Very Sophisticated Vector™ to represent types. I expect it will be similar to the "tree" in KiCAS. Then with that support in place, a lot of experimentation to find what works. Probably not a weekend project...
Perhaps a call graph would be helpful?
The problem with any type deduction involving the call site is that we cannot generally determine which function will be called. Take for example:
alg add(x, y) return x + y
alg mult(x, y) return x * y
f = addfalsemulttrue
print(f(3, 3))
However, I believe we can determine all the functions which could possibly be called, and instantiate them with the correct types.
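That approach can be sketched as a toy model in Python (not Forscape's implementation; the rule functions stand in for walking a real function body): each candidate function is instantiated with the argument types seen at the call site, and the call's type is the set of possible return types.

```python
# Toy model of call-site instantiation of generic functions. A "function"
# here is a rule mapping argument types to a return type; real code would
# type-check the function body against the argument types instead.

def instantiate(fn, arg_types, cache):
    """Deduce fn's return type for these argument types, memoized."""
    key = (fn, arg_types)
    if key not in cache:
        cache[key] = fn(*arg_types)
    return cache[key]

def check_call(candidates, arg_types, cache):
    """Type of a call whose callee could be any function in candidates."""
    return {instantiate(fn, arg_types, cache) for fn in candidates}

# alg add(x, y) and alg mult(x, y) both preserve the numeric type of
# their operands in this toy model.
def add_rule(tx, ty):
    return tx if tx == ty else "type-error"

def mult_rule(tx, ty):
    return tx if tx == ty else "type-error"

cache = {}
# f may evaluate to either add or mult, so both are instantiated:
call_type = check_call((add_rule, mult_rule), ("int", "int"), cache)
```

When every candidate agrees, the set collapses to a single type; when they disagree, the checker has to carry a union or report an ambiguity.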
Static type checking basically works. It still lacks some tedious details around references captures, function prototypes, and recursion. It needs to clone functions rather than just use the original definition (vital for recursion) while keeping valid references (can think about frame offsets in the symbol table). Meaningful error messages will require some book-keeping to report at the declare and/or call sites. The whole thing could use a refactor, especially before pressing on to static dimensioning and autodiff. But it works 😭.
I almost have test coverage, but I worry that "cursed_factorial" will expose my recursion strategy as too naive. I had thought that if you hit recursion, you resolve as much as possible in your pass, then take an additional pass to either finish or conclude you have a cycle without an exit. It may not be that simple, but I still think it's feasible. This is a nice puzzle.
It seems to work now. Hopefully the recursion handling doesn't admit invalid Forscape programs. The IDE crashes constantly now, and the typing error messages are terrible, but that's cleanup work.
This is set to close with https://github.com/JohnDTill/Forscape/pull/38. There is still more to explore with instantiation as opposed to just checking, and static dimensioning, but the scope of this issue has been large enough!
|
2025-04-01T06:37:06.352388
| 2016-07-13T01:55:10
|
165219842
|
{
"authors": [
"JohnEstropia",
"jannon"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1390",
"repo": "JohnEstropia/GCDKit",
"url": "https://github.com/JohnEstropia/GCDKit/pull/13"
}
|
gharchive/pull-request
|
mark iOS and tvOS frameworks safe for app extensions
Just ticking the boxes in the iOS and tvOS schemes to mark them safe for app extensions
Thanks! 👍 This seems ok, but I'll need to raise GCDKit's version as well or Carthage won't see this change if it cached the previous version's repository. I'll merge your fork and push it as 1.2.6 later when I get free time :) (Same with CoreStore)
|
2025-04-01T06:37:06.357775
| 2017-02-21T21:06:59
|
209272719
|
{
"authors": [
"JohnMaguire",
"scarescrow"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1391",
"repo": "JohnMaguire/Cardinal",
"url": "https://github.com/JohnMaguire/Cardinal/pull/126"
}
|
gharchive/pull-request
|
Add handling for disambiguation pages in wikipedia
Fixes #97
Hey @scarescrow -- Thanks for this. 👍 I'll probably lint it against flake8 locally unless you'd like to do so yourself.
However, I ran into an issue: When I hit a disambiguation page, I seemed to get an error. If I remember correctly I tried .wiki Test and .wiki A (disambiguation).
I'm willing to look into it and fix it up, but I'd have to find some free time. 🙂 It'd be nice if you could verify that you are seeing the same thing, or if you could provide a working disambiguation page to see how it works! Thanks again! 😄
Hey @JohnMaguire , did you get errors for both pages, Test and A (disambiguation)?
I checked for .wiki hello and got the link to the disambiguation page correctly.
I'll check it once more with the pages you mentioned to see if I'm getting the errors as well.
Will reopen if a fix is posted.
|
2025-04-01T06:37:06.365119
| 2024-10-20T02:02:52
|
2599837236
|
{
"authors": [
"JohnnyMorganz",
"nightcycle"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1392",
"repo": "JohnnyMorganz/luau-lsp",
"url": "https://github.com/JohnnyMorganz/luau-lsp/issues/799"
}
|
gharchive/issue
|
RemoteEvent / BindableEvent / RemoteFunction / BindableFunction should be "unknown?" instead of "any"
As the firing of these Roblox instances have no way to enforce type safety, I feel it's cleanest to make developers validate the types that are received.
In this below example, no errors are detected - everything is fine. That player variable could be passed into another function and cause a really weird error somewhere far away from the event.
local bindableEvent: BindableEvent
bindableEvent.Event:Connect(function(player: Player)
-- the player is actually a string, but the dev is expecting a player
end)
bindableEvent:Fire(Players.LocalPlayer.Name)
What I propose is that it's better for the code to accept this uncertainty, and refine it in the following code. For example:
local bindableEvent: BindableEvent
bindableEvent.Event:Connect(function(player: unknown?)
assert(typeof(player) == "Instance", `player is not instance, got type "{typeof(player)}"`)
assert(player:IsA("Player"), `player is not player, got class "{player.ClassName}"`)
-- the type is now certainly a player, or has given the developer a clear error at the earliest point possible
end)
bindableEvent:Fire(Players.LocalPlayer.Name)
I feel this handling of it is safe and more accurate to what's actually happening in the code, similar to the improvement made to attributes earlier.
Thank you for your time!
The main reason I am hesitant about this is from a user experience PoV, as switching this from any to unknown will lead to loads of type errors throughout existing code bases, I imagine.
I can understand that, I guess I just feel it's a purer version of type safety that some developers will find desirable. If having it as the default behavior is too extreme, would considering it as a setting / flag be viable? Similar to strict datamodels.
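The refinement style proposed above is language-agnostic; a hypothetical Python analogue (illustrative only — the `Player` class and handler are invented for the example, not part of any real API):

```python
class Player:
    """Stand-in for Roblox's Player instance."""
    def __init__(self, name: str):
        self.name = name

def on_player_event(payload: object) -> Player:
    # The event payload arrives untyped; narrow it at the earliest
    # possible point instead of trusting a declared parameter type.
    assert isinstance(payload, Player), (
        f"player is not a Player, got type {type(payload).__name__!r}"
    )
    return payload  # now certainly a Player
```

Passing a string (the bug in the first example) now fails immediately with a clear message instead of propagating a wrongly typed value to code far from the event.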
|
2025-04-01T06:37:06.420740
| 2017-11-06T10:23:12
|
271424386
|
{
"authors": [
"JonNRb",
"khalkash"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1393",
"repo": "JonNRb/gthread",
"url": "https://github.com/JonNRb/gthread/issues/15"
}
|
gharchive/issue
|
Calling malloc/free should be done by scheduler
void * mymalloc(size_t size, gthread_task_t* owner);
void myfree(void * data, gthread_task_t* owner);
Both of these need to be invoked by scheduler to indicate the calling thread. Currently, I am not sure as to what the best way to do this is (what files need to be changed) so if one of you guys with more knowledge of the scheduler code could do it, that would be cool.
I'll take a look at this. There's one or two places where the scheduler uses libc malloc() and free() and I want to get rid of those as well. I know the assignment description says there should be an indication for mymalloc()s done by the scheduler and "kernel" but I think we can get rid of these and only have mymalloc() used by user code.
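A minimal sketch of the owner-tagged interface, written in Python for brevity (the real gthread code is C, and the `TaggedAllocator` name is invented): the scheduler passes the current task into every allocation, letting the allocator attribute each block to a thread.

```python
class TaggedAllocator:
    """Toy model of mymalloc/myfree with an explicit owner argument."""

    def __init__(self):
        self._owners = {}  # id(block) -> owning task

    def mymalloc(self, size, owner):
        block = bytearray(size)  # stand-in for a real heap block
        self._owners[id(block)] = owner
        return block

    def myfree(self, block, owner):
        # Refuse cross-task frees, which the owner tag makes detectable.
        if self._owners.get(id(block)) != owner:
            raise ValueError("task freeing a block it does not own")
        del self._owners[id(block)]
```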
|
2025-04-01T06:37:06.461337
| 2024-10-23T12:13:01
|
2608374859
|
{
"authors": [
"JoostVanVelthoven",
"TLA020"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1394",
"repo": "JoostVanVelthoven/Nullness.Bang",
"url": "https://github.com/JoostVanVelthoven/Nullness.Bang/issues/1"
}
|
gharchive/issue
|
Code Fix for CS8603 Possible Null Reference Return
We have been using the Nullness.Bang analyzer to ensure null safety in our .NET C# projects, particularly in helping us migrate to strict nullable reference types mode while maintaining legacy code.
However, we would like to address the CS8603 warning for possible null reference returns. Currently, the following code results in a CS8603 warning:
public string demo()
{
return null;
}
To resolve this, we would like to introduce a code fix to modify the code to handle nullable returns explicitly. Here's how the code should look after the fix:
public string? demo()
{
return null;
}
This change should ensure that nullable reference types are correctly handled, and CS8603 warnings are resolved.
Steps to Implement:
Modify the code fix to identify CS8603 warnings.
Automatically update return types to nullable (string? instead of string) when the method can return null.
Ensure this change is compatible with existing code and maintains functionality.
Expected Outcome:
The analyzer should modify methods that return null to use nullable return types, resolving the CS8603 warning.
Null safety should be improved while keeping flexibility for legacy code.
Nice !
|
2025-04-01T06:37:06.467096
| 2018-07-26T13:02:50
|
344834727
|
{
"authors": [
"JordyMoos"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1395",
"repo": "JordyMoos/elm-pixel-boulder-game",
"url": "https://github.com/JordyMoos/elm-pixel-boulder-game/issues/61"
}
|
gharchive/issue
|
Add setting to limit the amount of logic updates per render update
To prevent enemies from jumping too much.
The downside is that the game speed will be lowered.
Done
|
2025-04-01T06:37:06.475648
| 2024-03-28T19:39:30
|
2213992048
|
{
"authors": [
"Jorge-Lopes",
"otoole-brendan"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1396",
"repo": "Jorge-Lopes/KREAd",
"url": "https://github.com/Jorge-Lopes/KREAd/issues/4"
}
|
gharchive/issue
|
KREAd frontend should remove legacy queries
Description
Creating this issue in Kread repo to match https://github.com/Agoric/agoric-sdk/issues/9145
In the upcoming agoric-upgrade-16, we plan to upgrade our cosmos-sdk major version to 0.47. This version of cosmos-sdk removes support for "legacy queries", which had been deprecated since version 0.45. Our codebase originally demonstrated use of legacy queries as the mechanism for reading "vstorage", and this pattern had been copied by some versions of the KREAd frontend code.
Proposed Solution
The Kryha/KREAd frontend appears to have eliminated legacy queries in at least a recent version of the development branch. However, we are uncertain whether that is the version currently running in production.
This ticket is to ensure that the public-facing KREAd frontend does not use legacy queries, whether that involves development, deployment, or if it's already done this ticket can be closed.
See https://github.com/Agoric/agoric-sdk/issues/9096 for more information on legacy queries and how to switch to alternative query methods.
Acceptance criteria
Confirmation legacy queries not used
Additional info
No response
Conclusion
In the current state of KREAd running in production, commit 1f281a83b1bdf374a5bf11358c537829da614508, the @agoric/rpc pkg being imported has the version "^0.6.0", package.json.
This confirms that the KREAd version running in production still uses legacy queries.
Note: the same is true for the development branch, package.json.
Context
At PR #40, where the rpc pkg is upgraded to version 0.6.0, the ChainStorageWatcher executes a method used to query the vstorage called batchVstorageQuery, where we can see that the structure passed as an option to the fetch method is the same as described as legacy queries in issue #9096:
const options = {
method: 'POST',
body: JSON.stringify(
paths.map((path, index) => ({
jsonrpc: '2.0',
id: index,
method: 'abci_query',
params: { path: `/custom/vstorage/${path[0]}/${path[1]}` },
})),
),
};
Source
On PR #55, the batchVstorageQuery is updated to use the JSON API instead of the deprecated RPC method.
On PR #65, the batchQuery file is replaced with vstorageQuery, which still uses the expected JSON API to execute the queries.
This feature is present from @agoric/rpc version 0.7.2 onward.
The latest version is 0.9.0
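For context, the shape difference between the deprecated RPC query and the JSON API can be sketched as follows. This is an illustrative sketch based on the snippets in this thread; the exact JSON API endpoint path is an assumption, not taken from the KREAd source:

```javascript
// Legacy ABCI query body (the deprecated shape shown above):
function legacyQueryBody(paths) {
  return JSON.stringify(
    paths.map((path, index) => ({
      jsonrpc: '2.0',
      id: index,
      method: 'abci_query',
      params: { path: `/custom/vstorage/${path[0]}/${path[1]}` },
    })),
  );
}

// JSON (REST) API equivalent: one GET URL per path instead of a
// batched POST body. The "/agoric/vstorage/..." route is an assumption.
function jsonApiUrl(apiBase, path) {
  return `${apiBase}/agoric/vstorage/${path[0]}/${path[1]}`;
}
```

The migration described in the PRs above amounts to replacing the first shape with requests of the second shape.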
|
2025-04-01T06:37:06.507003
| 2015-06-21T08:16:38
|
89881169
|
{
"authors": [
"JoshuaBrockschmidt"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1397",
"repo": "JoshuaBrockschmidt/ideal_ANN",
"url": "https://github.com/JoshuaBrockschmidt/ideal_ANN/issues/4"
}
|
gharchive/issue
|
Decreasing Average Fitness
Of the last four simulations I've run, the average fitness has consistently decreased each generation. This is obviously an issue. There is virtually no improvement occurring.
With my next simulation, I'm going to try and give more weight to chromosomes with a higher fitness when selecting parents during reproduction.
And of course, there is always the possibility that my fundamental implementation of a genetic algorithm is flawed, as this is only my second project utilizing such.
|
2025-04-01T06:37:06.519358
| 2019-06-21T12:06:56
|
459164926
|
{
"authors": [
"Dipsip6969",
"JoshuaWise"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1398",
"repo": "JoshuaWise/better-sqlite3",
"url": "https://github.com/JoshuaWise/better-sqlite3/issues/278"
}
|
gharchive/issue
|
Can not install better-sqlite3
I keep getting these errors when I try to install it
gyp ERR! configure error
gyp ERR! stack Error: Command failed: C:\Users\Ahmad Alfalasi\AppData\Local\Programs\Python\Python37-32\python.EXE -c import sys; print "%s.%s.%s" % sys.version_info[:3];
gyp ERR! stack File "", line 1
gyp ERR! stack import sys; print "%s.%s.%s" % sys.version_info[:3];
gyp ERR! stack ^
gyp ERR! stack SyntaxError: invalid syntax
gyp ERR! stack
gyp ERR! stack at ChildProcess.exithandler (child_process.js:294:12)
gyp ERR! stack at ChildProcess.emit (events.js:182:13)
gyp ERR! stack at maybeClose (internal/child_process.js:962:16)
gyp ERR! stack at Process.ChildProcess._handle.onexit (internal/child_process.js:251:5)
gyp ERR! System Windows_NT 10.0.17134
gyp ERR! command "C:\Program Files\nodejs\node.exe" "C:\Users\Ahmad Alfalasi\AppData\Roaming\npm\node_modules\npm\node_modules\node-gyp\bin\node-gyp.js" "rebuild"
gyp ERR! cwd C:\Users\Ahmad Alfalasi\Desktop\Current bots\bot\node_modules\integer
gyp ERR! node -v v10.15.0
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok
npm WARN enoent ENOENT: no such file or directory, open 'C:\Users\Ahmad Alfalasi\Desktop\Current bots\bot\package.json'
npm WARN<EMAIL_ADDRESS>requires a peer of<EMAIL_ADDRESS>but none is installed. You must install peer dependencies yourself.
npm WARN<EMAIL_ADDRESS>requires a peer of erlpack@discordapp/erlpack but none is installed. You must install peer dependencies yourself.
npm WARN<EMAIL_ADDRESS>requires a peer of<EMAIL_ADDRESS>but none is installed. You must install peer dependencies yourself.
npm WARN<EMAIL_ADDRESS>requires a peer of<EMAIL_ADDRESS>but none is installed. You must install peer dependencies yourself.
npm WARN<EMAIL_ADDRESS>requires a peer of<EMAIL_ADDRESS>but none is installed. You must install peer dependencies yourself.
npm WARN<EMAIL_ADDRESS>requires a peer of<EMAIL_ADDRESS>but none is installed. You must install peer dependencies yourself.
npm WARN<EMAIL_ADDRESS>requires a peer of<EMAIL_ADDRESS>but none is installed. You must install peer dependencies yourself.
npm WARN bot No description
npm WARN bot No repository field.
npm WARN bot No README data
npm WARN bot No license field.
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR<EMAIL_ADDRESS>install: node-gyp rebuild
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the<EMAIL_ADDRESS>install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! C:\Users\Ahmad Alfalasi\AppData\Roaming\npm-cache_logs\2019-06-21T12_06_10_612Z-debug.log
#279 is a duplicate of this, but I'll favor that one since it already got a response.
|
2025-04-01T06:37:06.523185
| 2022-02-03T09:25:59
|
1122842980
|
{
"authors": [
"Prinzhorn",
"beenotung"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1399",
"repo": "JoshuaWise/better-sqlite3",
"url": "https://github.com/JoshuaWise/better-sqlite3/issues/760"
}
|
gharchive/issue
|
Possible to use CURRENT_TIMESTAMP in update function?
better-sqlite3 provides a helper function db.update(table, data, where, whiteList). For the data parameter, we can pass in a key-value pairs to specify the column name and column value.
However, it treats the value as a literal value, e.g. db.update('user', {updated_at:'CURRENT_TIMESTAMP'}, {id:1}) will set the updated_at column to the string literal 'CURRENT_TIMESTAMP', instead of setting the column to the current database timestamp.
A workaround is to use a prepared statement; however, that is much more verbose. In the current design, is there a way to express an SQL expression in the column value part?
better-sqlite3 provides a helper function db.update(table, data, where, whiteList)
Does it? Can't find it in the docs and that sounds oddly specific and high level, not like something this package would offer. https://github.com/JoshuaWise/better-sqlite3/blob/master/docs/api.md#class-database
const Database = require('better-sqlite3');
const db = new Database(':memory:');
console.log(db.update);
Logs undefined.
I don't think this question is related to better-sqlite3. If it is, can you provide a minimal example?
Oh I was referring to better-sqlite3-helper, submitted to the wrong repo
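One way to express this, sketched below, is a small raw-SQL marker so an update helper can tell literal values apart from SQL expressions. Note that RawSql, raw, and buildSetClause are hypothetical names for illustration, not part of better-sqlite3-helper's actual API:

```javascript
// Hypothetical marker: wraps a string that should be emitted as raw SQL
// rather than bound as a literal parameter.
class RawSql {
  constructor(sql) { this.sql = sql; }
}
const raw = (sql) => new RawSql(sql);

// Build the SET clause and parameter list for an UPDATE statement,
// keeping RawSql values inline and binding everything else with '?'.
function buildSetClause(data) {
  const fragments = [];
  const params = [];
  for (const [column, value] of Object.entries(data)) {
    if (value instanceof RawSql) {
      fragments.push(`${column} = ${value.sql}`);
    } else {
      fragments.push(`${column} = ?`);
      params.push(value);
    }
  }
  return { clause: fragments.join(', '), params };
}
```

With this, `buildSetClause({ name: 'x', updated_at: raw('CURRENT_TIMESTAMP') })` yields the clause `name = ?, updated_at = CURRENT_TIMESTAMP` with `['x']` as the bound parameters, which could then be fed to a prepared statement.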
|
2025-04-01T06:37:06.583788
| 2021-08-16T18:16:41
|
971986072
|
{
"authors": [
"Ju99ernaut",
"shkhalid"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1400",
"repo": "Ju99ernaut/grapesjs-ga",
"url": "https://github.com/Ju99ernaut/grapesjs-ga/issues/1"
}
|
gharchive/issue
|
Can not drag the blocks to body?
How can I use this plugin? I tried the demo but the blocks are not draggable.
Not sure why it's not possible to drag the blocks onto the canvas anymore.
|
2025-04-01T06:37:06.630494
| 2022-02-06T14:19:08
|
1125215544
|
{
"authors": [
"Juansero29"
],
"license": "Unlicense",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1401",
"repo": "Juansero29/Loop",
"url": "https://github.com/Juansero29/Loop/issues/11"
}
|
gharchive/issue
|
[INIT] Create data model from UML diagram
Create the data model via classes and their associated tables in an SQLite database
https://github.com/Juansero29/Loop/blob/main/docs/uml.svg
Folders per modules
modules/models.ts
modules/services.ts
const create = ...
const update = ...
export default { create, update }
For exporting useful methods from services
https://ionicframework.com/docs/native/sqlite
https://github.com/storesafe/cordova-sqlite-storage
https://github.com/storesafe/cordova-sqlite-storage#android-database-provider
https://github.com/alex-steinberg/ionic-react-sqlite-example/blob/master/src/pages/Home.tsx
https://github.com/typeorm/typeorm
Dependency Injection of repositories: https://thomasburlesonia.medium.com/https-medium-com-thomasburlesonia-universal-dependency-injection-86a8c0881cbc
That dependency injection system looks quite nice.
npm install @mindspace-io/utils --save
It allows decoupling of use and construction, and easy testing via mocks. However, I didn't fully understand how to tell it when to use the mock and when to use the "real" implementation. Do I need to comment out the real one to use the mock? Seems weird => need to investigate this library further
Also:
Don’t forget to build a custom hook to make DI lookups super easy!
Looks nice
Updated typeorm site: https://typeorm.io/
Also, updated dependency injector is this one: https://www.npmjs.com/package/@mindspace-io/react - oddly enough it requires use of --legacy-deps in order to be installed
|
2025-04-01T06:37:06.710523
| 2020-10-27T04:40:52
|
730100108
|
{
"authors": [
"PavithraPurushothaman-T2S",
"mpetrenco",
"radhakrishnant2s",
"selvamariappant2s"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1402",
"repo": "Judopay/JudoKit-ReactNative",
"url": "https://github.com/Judopay/JudoKit-ReactNative/issues/72"
}
|
gharchive/issue
|
Apple pay window does not show for the first time apple pay is initiated
Version used:
2.1.0
On the first launch of the app, the first time Apple Pay is initiated, the Apple Pay sheet does not show and a "user canceled" message is triggered.
On further attempts, this works without any problem.
Please check out 3.1.0 and see if it persists: https://www.npmjs.com/package/judokit-react-native
Tried this latest version and got an error with apple pay. More details here https://github.com/Judopay/JudoKit-ReactNative/issues/82
Issue was handled at #82
It occurs in the 3.3.5 version of judokit-react-native
Issue is happening now in version 3.3.7 of judokit-react-native also.
|
2025-04-01T06:37:06.712609
| 2021-08-09T01:20:51
|
963568821
|
{
"authors": [
"JujuAdams",
"offalynne"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1403",
"repo": "JujuAdams/Input",
"url": "https://github.com/JujuAdams/Input/pull/139"
}
|
gharchive/pull-request
|
Safe keyboard-direct checks
Blocks keyboard on iOS/tvOS native (I lack the hardware atm to test), allows conditional Android and Switch keyboard support (default off).
Tasty. Good work!
|
2025-04-01T06:37:06.718195
| 2021-02-09T20:52:27
|
804915552
|
{
"authors": [
"DragoniteSpam",
"JujuAdams",
"OmegaX1000",
"patchuby"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1404",
"repo": "JujuAdams/Scribble",
"url": "https://github.com/JujuAdams/Scribble/issues/184"
}
|
gharchive/issue
|
Add underline and strikethrough
I don't know if you can do this with normal game maker studio text functions let alone with scribble but this feels like a nice thing to have...
[slant] can be used to emulate italics (but tends to be frowned on by typographers). Bold doesn't have an equivalent; I don't know what Juju's stance on adding one would be. With that said, you can set up a font with actual bold and italics attributes in GameMaker and change the font to it on the fly with [font_name].
Strikethrough and underline are unaccounted for since those are just lines being drawn and not actually font attributes. I imagine those could be added.
It would be easier if this was built into Scribble; it eliminates a few extra steps...
Actually, I was looking at an old version of the documentation: you can now combine font families together, and switch between them using formatting tags: https://github.com/JujuAdams/Scribble/wiki/Functions-(Font-Modification)
As Dragonite kindly points out, swapping between bold and italic (and bold+italic) fonts is supported using the [b] and [i] and [bi] command tags. You need to set these up manually using scribble_font_set_style_family as Scribble does not procedurally associate fonts (it used to, but it was unreliable).
Adding underline and strikethrough is... something I'll think about. I'm not 100% sure if GM reveals enough information to handle these features, though I will check.
I'm sure you can put together something that's "good enough" just based on the dimensions of each glyph and or the final positions of each vertex. No idea how much doing something simple like that would offend typographers though.
I just implemented underline. I'm adding my (badly written) functions to get individual line widths and draw an underline, in case it's useful as a starting point. It's of course only for horizontal, left-to-right languages.
The text argument is the scribble element.
function scribble_get_lines_width(text)
{
    var _line_count = text.get_line_count();
    var _glyph_count = text.get_glyph_count();
    var _previous_x = 99999;
    var _last_leftmost_glyph = -1;
    text[$ "__lines_width"] = array_create(_line_count);
    if (_line_count == 1) text.__lines_width[0] = text.get_bbox().width;
    else
    {
        var j = 0;
        for (var i = 0; i < _glyph_count; i++)
        {
            var _current_x = text.get_glyph_data(i).left;
            if (_current_x < _previous_x || i == _glyph_count - 1)
            {
                if (_last_leftmost_glyph != -1)
                {
                    text.__lines_width[j] = text.get_glyph_data(i - 1).right - text.get_glyph_data(_last_leftmost_glyph).left;
                    j++;
                }
                _last_leftmost_glyph = i;
            }
            _previous_x = _current_x;
        }
    }
    text[$ "__total_width"] = 0;
    for (var i = 0; i < _line_count; i++) text.__total_width += text.__lines_width[i];
}

function scribble_draw_underline(text, xx, yy, underline_pos, typist = undefined, color = c_white, alpha = 1, thickness = 3)
{
    if (underline_pos < 0) exit;
    if (text[$ "__lines_width"] == undefined) show_debug_message("Error trying to draw underline before having called 'scribble_get_lines_width'");
    draw_set_color(color);
    draw_set_alpha(alpha);
    if (typist != undefined) var _underline_pos = min(typist.__revealed_total_width, underline_pos);
    else _underline_pos = underline_pos;
    var _current_left = 0;
    var _bbox = text.get_bbox();
    for (var i = 0; i < text.get_line_count(); i++)
    {
        if (_underline_pos < _current_left) break;
        if (_underline_pos >= _current_left + text.__lines_width[i]) var line_progress = text.__lines_width[i];
        else line_progress = _underline_pos - _current_left;
        var dist = line_height/2 + (i * line_height);
        var x1 = xx + lengthdir_x(dist, -90) + (_bbox.width - text.__lines_width[i])/2;
        var y1 = yy + lengthdir_y(dist, -90);
        draw_rectangle(x1, y1, x1 + line_progress, y1 + thickness, false);
        _current_left += text.__lines_width[i];
    }
}
|
2025-04-01T06:37:06.742072
| 2023-09-30T19:23:59
|
1920373396
|
{
"authors": [
"JuliaTagBot"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1405",
"repo": "JuliaAI/MLJBalancing.jl",
"url": "https://github.com/JuliaAI/MLJBalancing.jl/issues/6"
}
|
gharchive/issue
|
TagBot trigger issue
This issue is used to trigger TagBot; feel free to unsubscribe.
If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.
If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!
Triggering TagBot for merged registry pull request: https://github.com/JuliaRegistries/General/pull/92349
Triggering TagBot for merged registry pull request: https://github.com/JuliaRegistries/General/pull/92881
Triggering TagBot for merged registry pull request: https://github.com/JuliaRegistries/General/pull/93181
Triggering TagBot for merged registry pull request: https://github.com/JuliaRegistries/General/pull/93821
Triggering TagBot for merged registry pull request: https://github.com/JuliaRegistries/General/pull/98497
Triggering TagBot for merged registry pull request: https://github.com/JuliaRegistries/General/pull/108275
|
2025-04-01T06:37:06.758299
| 2022-01-03T22:23:00
|
1092862129
|
{
"authors": [
"ablaom",
"codecov-commenter"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1406",
"repo": "JuliaAI/MLJBase.jl",
"url": "https://github.com/JuliaAI/MLJBase.jl/pull/707"
}
|
gharchive/pull-request
|
For a 0.19.2 release
#706
Codecov Report
Merging #707 (483263e) into master (5d8c78c) will decrease coverage by 2.68%.
The diff coverage is 100.00%.
:exclamation: Current head 483263e differs from pull request most recent head a852ddc. Consider uploading reports for the commit a852ddc to get more accurate results
@@ Coverage Diff @@
## master #707 +/- ##
==========================================
- Coverage 86.53% 83.84% -2.69%
==========================================
Files 36 36
Lines 3401 2904 -497
==========================================
- Hits 2943 2435 -508
- Misses 458 469 +11
Impacted Files | Coverage Δ
src/MLJBase.jl | 100.00% <100.00%> (+7.14%) :arrow_up:
src/interface/data_utils.jl | 91.30% <100.00%> (-2.03%) :arrow_down:
src/sources.jl | 70.00% <0.00%> (-18.00%) :arrow_down:
src/composition/models/transformed_target_model.jl | 85.45% <0.00%> (-14.55%) :arrow_down:
src/data/datasets.jl | 86.84% <0.00%> (-13.16%) :arrow_down:
src/measures/continuous.jl | 89.13% <0.00%> (-7.17%) :arrow_down:
src/show.jl | 29.92% <0.00%> (-7.12%) :arrow_down:
src/measures/measures.jl | 64.63% <0.00%> (-6.24%) :arrow_down:
src/measures/probabilistic.jl | 58.46% <0.00%> (-4.70%) :arrow_down:
... and 26 more
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 5d8c78c...a852ddc. Read the comment docs.
|
2025-04-01T06:37:06.801556
| 2024-10-24T05:03:31
|
2610471795
|
{
"authors": [
"asbisen",
"joshday"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1407",
"repo": "JuliaComputing/EasyConfig.jl",
"url": "https://github.com/JuliaComputing/EasyConfig.jl/issues/9"
}
|
gharchive/issue
|
is it possible to embed javascript in Config()
I have been playing around with Apache ECharts and the JSON object it uses for plotting can take Javascript functions as callbacks. For example
{
type: "scatter",
symbol: "circle",
symbolSize: function (val) {return val[2] * 2; }
}
The Config() object can take in a string, but how do I render the callback functions in JSON without the quotes around the string? Performing JSON3.write() gets me "symbolSize: function (val) {return val[2] * 2; }" but I would like to render symbolSize: function (val) {return val[2] * 2; }
I think you'd need to find/create a type that JSON3 will write as you want.
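To illustrate the idea on the JavaScript side (the same pattern would apply to a custom Julia type that JSON3 writes as raw text), a writer can leave values tagged as raw code unquoted. The JsRaw marker and writeOption function here are illustrative assumptions, not part of any library:

```javascript
// Marker for values that should be emitted verbatim (e.g. JS callbacks).
class JsRaw {
  constructor(code) { this.code = code; }
}

// Minimal JSON-like writer that emits JsRaw values without quotes.
function writeOption(value) {
  if (value instanceof JsRaw) return value.code;
  if (Array.isArray(value)) return `[${value.map(writeOption).join(',')}]`;
  if (value !== null && typeof value === 'object') {
    const entries = Object.entries(value)
      .map(([k, v]) => `${JSON.stringify(k)}:${writeOption(v)}`);
    return `{${entries.join(',')}}`;
  }
  return JSON.stringify(value);
}
```

With this, the ECharts option from the question serializes with the callback left as executable code rather than a quoted string.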
|
2025-04-01T06:37:07.000022
| 2023-12-03T23:22:35
|
2022762812
|
{
"authors": [
"jdeldre",
"masteral456"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1408",
"repo": "JuliaIBPM/ComputationalHeatTransfer.jl",
"url": "https://github.com/JuliaIBPM/ComputationalHeatTransfer.jl/pull/9"
}
|
gharchive/pull-request
|
ComputationalHeatTransfer with Immersed Layers
Added functionality of Dirichlet and Neumann problems with method dispatching for unbounded problems
I've merged the pull request, but there are still a few small issues with the documentation. There needs to be some documentation for the APIs.
|
2025-04-01T06:37:07.014989
| 2019-03-18T17:31:54
|
422345670
|
{
"authors": [
"TheSodesa",
"stevengj"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1409",
"repo": "JuliaLang/IJulia.jl",
"url": "https://github.com/JuliaLang/IJulia.jl/issues/824"
}
|
gharchive/issue
|
Failed to start kernel
Trying to open up a notebook after starting Jupyter up with IJulia.notebook() on Julia 1.1 nets me the following error message:
Traceback (most recent call last):
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/web.py", line 1592, in _execute
result = yield result
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/gen.py", line 1133, in run
value = future.result()
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/gen.py", line 1141, in run
yielded = self.gen.throw(*exc_info)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/notebook/services/sessions/handlers.py", line 73, in post
type=mtype))
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/gen.py", line 1133, in run
value = future.result()
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/gen.py", line 1141, in run
yielded = self.gen.throw(*exc_info)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/notebook/services/sessions/sessionmanager.py", line 79, in create_session
kernel_id = yield self.start_kernel_for_session(session_id, path, name, type, kernel_name)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/gen.py", line 1133, in run
value = future.result()
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/gen.py", line 1141, in run
yielded = self.gen.throw(*exc_info)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/notebook/services/sessions/sessionmanager.py", line 92, in start_kernel_for_session
self.kernel_manager.start_kernel(path=kernel_path, kernel_name=kernel_name)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/gen.py", line 1133, in run
value = future.result()
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/tornado/gen.py", line 326, in wrapper
yielded = next(result)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/notebook/services/kernels/kernelmanager.py", line 160, in start_kernel
super(MappingKernelManager, self).start_kernel(**kwargs)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/jupyter_client/multikernelmanager.py", line 110, in start_kernel
km.start_kernel(**kwargs)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/jupyter_client/manager.py", line 259, in start_kernel
**kw)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/jupyter_client/manager.py", line 204, in _launch_kernel
return launch_kernel(kernel_cmd, **kw)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/site-packages/jupyter_client/launcher.py", line 128, in launch_kernel
proc = Popen(cmd, **kwargs)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/subprocess.py", line 769, in __init__
restore_signals, start_new_session)
File "/Users/soderhos/.julia/conda/3/lib/python3.7/subprocess.py", line 1516, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: '/Applications/Julia-1.0.app/Contents/Resources/julia/bin/julia': '/Applications/Julia-1.0.app/Contents/Resources/julia/bin/julia'
It seems like Jupyter is trying to locate Julia 1.0, even though the notebook session was started with Julia 1.1. Why would this happen?
When you installed the new version of Julia and deleted the old one, you have to launch the Julia command line and run build IJulia at the package prompt (to tell Jupyter where to find Julia).
See https://github.com/JuliaLang/IJulia.jl#updating-julia-and-ijulia
|
2025-04-01T06:37:07.137222
| 2021-11-24T15:36:44
|
1062575020
|
{
"authors": [
"DNF2",
"ItzLevvie",
"MatthijsBlom",
"davidanthoff"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1410",
"repo": "JuliaLang/juliaup",
"url": "https://github.com/JuliaLang/juliaup/issues/175"
}
|
gharchive/issue
|
Add alternative installation instructions
Linux & Mac: I couldn't find specific instructions, but I'm guessing they exist, as it sounds like Juliaup should work on these platforms.
Windows: For some, installing from the Store may not be an option. How should these people install Juliaup?
It would be nice if the README listed these installation instructions.
The only option that is ready at this point is the Windows Store option. We are getting closer with Linux and MacOS support, but at this point it is not ready for general consumption or feedback, so for now we should wait with install instructions in the README until things are ready :) But then, yes, agreed, we need to add them! I'll keep this issue open to track that.
Related to this: I understand that juliaup should update itself. Does this, too, rely on a connection with the Windows Store? Getting the Windows Store to work is very hard, so after the first install, it would be good to not rely on juliaup communicating with the store.
Yes, the version from the Windows Store gets updated by the Windows Store. It solves a lot of problems as doing background updates is not trivial...
My current plan is to next figure out automatic self-update for Linux and Mac, and then come back to Windows and see whether we can improve the situation for folks where the Windows Store is in some form blocked.
[…] doing background updates is not trivial...
What exactly do you mean by 'background updates'?
Off the top of my head, Powershell (pwsh) warns that an upgrade is available on start, and pip likewise warns at least when used to install a package (and maybe in more cases). Both require the user to initiate the upgrade – typically through manual download in the case of pwsh, but a simple one-liner in the case of pip (and rustup, and stack, and…). I'm fine with the latter strategy.
Does the new-ish Windows package manager, winget, change the picture at all, or is that just a CLI for the Store app?
[…] or is that just a CLI for the Store app?
It is not, but it can function as one:
PS ~ > winget search julia
Name Id Version Source
-----------------------------------------------------
Julia 9NJNWW8PVKMN Unknown msstore
Julian Date Selector 9NSGP4VDNW0R Unknown msstore
Julia Julialang.Julia 1.6.2 winget
I tried to circumvent the Store by using winget, but it doesn't work: installing 9NJNWW8PVKMN still requires logging in.
I tried to circumvent the Store by using winget, but it doesn't work: installing 9NJNWW8PVKMN still requires logging in.
Blocked by https://github.com/microsoft/winget-cli/issues/1585#issuecomment-974509235.
|
2025-04-01T06:37:07.148008
| 2018-03-16T17:57:28
|
306023557
|
{
"authors": [
"appleparan",
"stevengj"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1411",
"repo": "JuliaMath/AbstractFFTs.jl",
"url": "https://github.com/JuliaMath/AbstractFFTs.jl/issues/13"
}
|
gharchive/issue
|
Improve documentation
I had a misunderstanding of the plan_* functions.
The main reason for this was the term "inverse" in the documentation.
It says:
You can compute the inverse-transform plan by inv(P) and apply the inverse plan with P \ Â (the inverse plan is cached and reused for subsequent calls to inv or ), and apply the inverse plan to a pre-allocated output array A with ldiv!(A, P, Â).
The purpose of this explanation was
you have FORWARD transform plan
using mul! : FORWARD transform
using ldiv! : BACKWARD transform
plan_fft is for FORWARD transform so it doesn't matter.
However, the other plan_* documentation basically says "Same as plan_fft", so if you apply this structure to the BACKWARD transform, the documentation has two possible meanings:
you have BACKWARD transform plan
using mul! : BACKWARD transform
using ldiv! : FORWARD transform
or
you have BACKWARD transform plan
using ldiv : BACKWARD transform
because the terms "INVERSE" and "BACKWARD" are often used interchangeably. It was hard to distinguish without investigating the source code. I know there is a "-" between inverse and transform, but it is really confusing.
Could we have a clearer explanation of this?
"forward" and "backward" just refer to the sign of the exponent in the Fourier transform. To get the inverse of a given transform, you have to flip the sign of the exponent (going from forward to backward or vice versa) and also scale by 1/n where n is the length of the transform.
The inverse of a plan (as computed by inv or applied by ldiv!) is actually this inverse: e.g. the inverse of a forwards transform is a backwards transform scaled by 1/n. The inverse of an inverse plan is the original plan.
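A minimal NumPy sketch of this convention (the variable names are illustrative; the sign/scaling rules are the same as in the Julia docs being discussed):

```python
import numpy as np

# 8-point test signal
x = np.random.default_rng(0).normal(size=8)
n = len(x)

# Forward transform: exponent sign is negative.
forward = np.fft.fft(x)

# Backward transform: flip the exponent sign, no scaling.
# (Computed here via conjugation, since sum_k X_k e^{+2πijk/n}
#  equals conj(fft(conj(X))).)
backward = np.conj(np.fft.fft(np.conj(forward)))

# The *inverse* of the forward transform is the backward
# transform additionally scaled by 1/n, which recovers x:
inverse = backward / n
```

So "backward" alone is not the inverse; only after the 1/n scaling does it undo the forward transform.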
|
2025-04-01T06:37:07.234008
| 2024-06-27T20:21:37
|
2379029207
|
{
"authors": [
"DilumAluthge",
"gbruer15"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1412",
"repo": "JuliaRegistries/CompatHelper.jl",
"url": "https://github.com/JuliaRegistries/CompatHelper.jl/pull/498"
}
|
gharchive/pull-request
|
GitHub: get API url and server URL from env
I encountered difficulties trying to use CompatHelper.jl on an Enterprise server, but it turns out I just needed to set the API url and the hostname correctly.
This pull request updates auto_detect_ci_service to automatically handle non-GitHub.com GitHub servers.
I am not sure how to add tests for this.
@DilumAluthge I see you are probably the most active here. Can you take a look at this and advise on tests?
So, if we add this functionality into CompatHelper itself, it'll be a little difficult for us to test it.
Instead, could you modify your CompatHelper.yml file?
For example, if you look at the recommended CompatHelper.yml file, part of it looks like this:
https://github.com/JuliaRegistries/CompatHelper.jl/blob/97e9dcdde383ea5bea89d51de221698958b47630/.github/workflows/CompatHelper.yml#L37-L41
You could instead modify your CompatHelper.yml file to look like this:
- name: "Run CompatHelper"
run: |
import CompatHelper
my_ci_cfg = CompatHelper.GitHubActions(;
username = "...",
email = "...",
api_hostname = "...",
clone_hostname = "...",
)
CompatHelper.main(ENV, my_ci_cfg)
shell: julia --color=yes {0}
Where you'd fill in the relevant values in the GitHubActions(; ...) constructor.
That way, you can specify the exact values you need, without us needing to modify the source code of CompatHelper.jl (and thus needing to figure out a way to test it).
|
2025-04-01T06:37:07.239171
| 2020-07-26T16:06:01
|
665809612
|
{
"authors": [
"DilumAluthge",
"drcxcruz"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1413",
"repo": "JuliaRegistries/General",
"url": "https://github.com/JuliaRegistries/General/issues/18468"
}
|
gharchive/issue
|
ERROR: LoadError: UndefVarError: include not defined
hi
include is part of the Julia language. Thus, I do not understand the reason for this error. thanks for your time
2020-07-26T16:03:27.4974582Z VegaLite
2020-07-26T16:03:27.9252965Z ERROR: LoadError: UndefVarError: include not defined
2020-07-26T16:03:28.5092161Z Stacktrace:
2020-07-26T16:03:28.5093738Z [1] top-level scope at /tmp/jl_dglOg0/packages/InvariantCausalPrediction/dnTFi/src/InvariantCausalPrediction.jl:39
2020-07-26T16:03:28.6607536Z [2] include(::Module, ::String) at ./Base.jl:377
2020-07-26T16:03:28.6608348Z [3] top-level scope at none:2
2020-07-26T16:03:28.6637636Z [4] eval at ./boot.jl:331 [inlined]
2020-07-26T16:03:28.6638488Z [5] eval(::Expr) at ./client.jl:449
2020-07-26T16:03:28.6639540Z [6] top-level scope at ./none:3
2020-07-26T16:03:28.6640050Z in expression starting at /tmp/jl_dglOg0/packages/InvariantCausalPrediction/dnTFi/src/InvariantCausalPrediction.jl:39
2020-07-26T16:03:29.9026532Z ERROR: Failed to precompile InvariantCausalPrediction [5fe40f08-422b-4ec7-90aa-ba60e31ac74e] to /tmp/jl_dglOg0/compiled/v1.4/InvariantCausalPrediction/3Q6yc_c0C9F.ji.
2020-07-26T16:03:30.0841146Z Stacktrace:
It sounds like you are having some issues getting your package to work.
It doesn't sound like these issues are specific to AutoMerge or the registration process.
Please ask questions on the Julia Discourse forum.
hi,
I have asked the question in the forum.
https://discourse.julialang.org/t/automerge-decision-new-package-error-loaderror-undefvarerror-include-not-defined/43730
thanks
|
2025-04-01T06:37:07.242345
| 2021-06-10T09:23:04
|
917119614
|
{
"authors": [
"jlbuild",
"maleadt"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1414",
"repo": "JuliaRegistries/General",
"url": "https://github.com/JuliaRegistries/General/pull/38547"
}
|
gharchive/pull-request
|
New version: SPIRV_LLVM_Translator_jll v9.0.0+4
Autogenerated JLL package registration
Registering JLL package SPIRV_LLVM_Translator_jll.jl
Repository: https://github.com/JuliaBinaryWrappers/SPIRV_LLVM_Translator_jll.jl
Version: v9.0.0+4
Commit: 72ec82700b502b6b770a9eb04fefad091a1d82b0
Revision on Yggdrasil: https://github.com/JuliaPackaging/Yggdrasil/commit/55777a3d0eecb57e7f1362cef33da805c20c9aa2
Created by: @maleadt
Same as https://github.com/JuliaRegistries/General/pull/38546.
|
2025-04-01T06:37:07.245974
| 2021-11-24T17:51:27
|
1062713220
|
{
"authors": [
"JuliaRegistrator",
"giordano"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1415",
"repo": "JuliaRegistries/General",
"url": "https://github.com/JuliaRegistries/General/pull/49332"
}
|
gharchive/pull-request
|
New version: NDTensors v0.1.33
Registering package: NDTensors
Repository: https://github.com/ITensor/ITensors.jl
Created by: @mtfishman
Version: v0.1.33
Commit: f029107358903335a7e0242fc8fe74f12f35b8d2
Reviewed by: @mtfishman
Reference: https://github.com/ITensor/ITensors.jl/commit/f029107358903335a7e0242fc8fe74f12f35b8d2#commitcomment-60824698
Description: A Julia library for efficient tensor computations and tensor network calculations
@mtfishman please see the message above. Once you fix the automerge issues, you can register the new revision again without changing the version number, and this pull request will be automatically updated.
[noblock]
|
2025-04-01T06:37:07.251500
| 2022-09-25T22:32:38
|
1385185519
|
{
"authors": [
"giordano",
"jlbuild",
"stemann"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1416",
"repo": "JuliaRegistries/General",
"url": "https://github.com/JuliaRegistries/General/pull/68946"
}
|
gharchive/pull-request
|
New version: Torch_jll v1.10.2+0
Autogenerated JLL package registration
Registering JLL package Torch_jll.jl
Repository: https://github.com/JuliaBinaryWrappers/Torch_jll.jl
Version: v1.10.2+0
Commit: 2bafb001677b1eb77ac26303f5d958319f302c45
Revision on Yggdrasil: https://github.com/JuliaPackaging/Yggdrasil/commit/dc06b9903eba8408087120ed962ed7e5d3716a31
Created by: @Wimmerer
@stemann Now the problem is MKL:
ERROR: InitError: could not load library "/tmp/jl_mho9f5/artifacts/929e07419d06d190327bd30982e5b0b510a49664/lib/libtorch.so"
libmkl_intel_lp64.so.2: cannot open shared object file: No such file or directory
Can we avoid MKL entirely? It's a mess. We can't link to the libmkl_intel libraries in a sane way, enjoy reading https://github.com/JuliaPackaging/Yggdrasil/pull/1075 if you want to learn more.
[noblock]
OK, I see.
I will try to exclude building with MKL: https://github.com/JuliaPackaging/Yggdrasil/pull/5583
[noblock]
|
2025-04-01T06:37:07.271008
| 2024-01-24T20:15:49
|
2099013920
|
{
"authors": [
"JuliaRegistrator",
"goerz",
"juliohm"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1417",
"repo": "JuliaRegistries/General",
"url": "https://github.com/JuliaRegistries/General/pull/99463"
}
|
gharchive/pull-request
|
New package: GeoStatsFunctions v0.1.0
Registering package: GeoStatsFunctions
Repository: https://github.com/JuliaEarth/GeoStatsFunctions.jl
Created by: @juliohm
Version: v0.1.0
Commit: a6a2c8dad93770da666d93fb8f20ae0671bd4027
Reviewed by: @juliohm
Reference: https://github.com/JuliaEarth/GeoStatsFunctions.jl/commit/a6a2c8dad93770da666d93fb8f20ae0671bd4027#commitcomment-137741950
Description: Geostatistical functions for the GeoStats.jl framework
Even though this seems to be part of a larger framework, I'd still ask that you add some documentation before registering. This could be a README that's a little more expansive, with a list of functions and a paragraph or two on how these functions fit into GeoStats, and/or a minimal Documenter-based page with all the docstrings. Or, a link to the specific part of the GeoStats documentation where these functions are documented.
Sorry @goerz, but I disagree. We are documenting the framework in a single place to avoid outdated README's in submodules of the project. That is why the README explicitly mentions the official GeoStats.jl docs and community channel. These modules are not intended for end-users, they are intended for developers of the framework only.
[noblock]
[noblock] They gotta be documented somewhere. Just picking something at random, ballsearch.jl has a docstring for BallSearchAccum, but if I search https://juliaearth.github.io/GeoStatsDocs/stable/search.html?q=BallSearchAccum I'm not seeing that docstring.
I have a similar situation with my own packages, where there's a *Base package that's not for public consumption, but even that can have a minimal listing of docstrings: https://juliaquantumcontrol.github.io/QuantumControlBase.jl/dev/
You definitely don't have to put anything in the README that could become outdated, but a few words about the functions contained in this package and where to find their documentation would be helpful.
Thank you for the suggestions, we are constantly improving our documentation, it is just not the right place in our viewpoint. Many docstrings are not intended for end-users either, and we are constantly evolving internals to accommodate documented functionality in the main documentation website.
Appreciate if you can add a [noblock] to your first comment to avoid blocking the auto-merge by the bot.
it is just not the right place
Then what is the right place?
It doesn't seem like a high bar to require that every registered package, even an auxiliary package, has some form – any form – of documentation.
Appreciate if you can add a [noblock] to your first comment
Sorry, no, the requirement that every package needs some minimal form of documentation is a line I'm willing to hold. So if you insist on not having any documentation, you would have to get a registry maintainer to override my veto. Or, preferably, just add a documentation stub like the example I gave before. I don't think it's something that'll take more than 20 minutes or so to set up, and I'd be happy to unblock then.
If this package is indeed 100% internal functions, it probably shouldn't be a package, but a submodule of GeoStats.
Then what is the right place?
I already explained that the documentation of this module lives inside the main documentation of the project, and that internal functions not intended for end-users are not present on purpose.
Sorry, no, the requirement that every package needs some minimal form of documentation is a line I'm willing to hold. So if you insist on not having any documentation, you would have to get a registry maintainer to override my veto.
Do you really feel that this requirement is helping with the quality of the general registry? We have so many other modules already registered, all pointing to the main docs of the project:
https://github.com/JuliaEarth/GeoStatsBase.jl
https://github.com/JuliaEarth/GeoTables.jl
https://github.com/JuliaEarth/GeoStatsTransforms.jl
The requirement of documentation is important, but you are forcing us to write the documentation in a specific place of your preference. We prefer to point end-users to a central well-maintained documentation, and yes, we believe that it still makes sense to develop this as a separate package in a separate repository where people can contribute specific PRs.
Besides, it is useless to enforce contributors of packages to add a README at registration time. They can always undo the README later on with additional commits. You have to give contributors the freedom to write good documentation where they feel is best for the community.
[noblock]
[noblock] Yeah, looking at the project organization, I can see the pattern. So I think it's okay. The overall documentation of the organization is far better than many other new registrations, so it seems unfair to block it.
I'd still say if I was a user of GeoStats, and I'd ever have to explicitly import GeoStatsFunctions (or Meshes, or any of the other listed package), I'd find it useful to have a complete reference API documentation for that sub-package. But yeah, that's up to you. It's certainly not a prerequisite for registration that a package has perfect documentation according to my tastes ;-)
[noblock] Thank you, it is a lot of work to keep these packages synced in both source and docs. That is the most productive way we found. GeoStats.jl is an umbrella package that loads the full stack of packages using Reexport.jl; it hosts the docs and is the goto place for end-users.
|
2025-04-01T06:37:07.365891
| 2023-09-06T15:32:53
|
1884273180
|
{
"authors": [
"gdalle",
"pat-alt"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1418",
"repo": "JuliaTrustworthyAI/CounterfactualExplanations.jl",
"url": "https://github.com/JuliaTrustworthyAI/CounterfactualExplanations.jl/issues/300"
}
|
gharchive/issue
|
Use Enzyme or DifferentiationInterface for autodiff
Current Status
The AD part of the package has not undergone any major overhaul since I first implemented it around the start of the project. Back then I relied on Flux/Zygote because the package was tailored to Flux models anyway (at the time) and I was entirely new to AD and Julia; I could use Zygote to differentiate through structs, like so:
"""
∂ℓ(
generator::AbstractGradientBasedGenerator,
ce::AbstractCounterfactualExplanation,
)
The default method to compute the gradient of the loss function at the current counterfactual state for gradient-based generators.
It assumes that `Zygote.jl` has gradient access.
"""
function ∂ℓ(
generator::AbstractGradientBasedGenerator, ce::AbstractCounterfactualExplanation
)
return Flux.gradient(ce -> ℓ(generator, ce), ce)[1][:counterfactual_state]
end
Pain Points
The current implementation is less than ideal for various reasons:
Zygote cannot handle nested AD, which is necessary for some counterfactual generators (see #376).
Gradients are still taken implicitly, which is not in line with where the broader ecosystem is headed, I believe.
The previous point also makes it difficult to implement forward-over-reverse to solve the nested AD issue.
The AD implementation has never been optimized for performance, so I guess there's a lot of room for improvement here.
To Do
[ ] Double-check https://gdalle.github.io/JuliaCon2024-AutoDiff/#/title-slide
[ ] Try out DifferentiationInterface.jl
This seems like a good idea (he said, completely unbiased). Want a hand with that @pat-alt ?
@gdalle fancy seeing you here 😄
That would be amazing, of course, if you could help out, but only if it's not too much trouble for you. I want to look at this soon, but I might look at #495 first, because I think I may need this for a research project I'm currently working on.
I've updated the description a little bit. If you have any pointers, I'd much appreciate if you could share them here.
Thanks!
|
2025-04-01T06:37:07.368205
| 2018-10-02T10:21:38
|
365832651
|
{
"authors": [
"Julian",
"gaetano-guerriero"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1419",
"repo": "Julian/jsonschema",
"url": "https://github.com/Julian/jsonschema/pull/472"
}
|
gharchive/pull-request
|
Fix py2 urlopen
On Python 2, urlopen() was also being used as a context manager, which the returned object does not support there.
This happened when resolving remote refs without using requests, for instance when resolving file:/// refs.
This solves the same problem as #439, but better ;).
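A sketch of the compatible pattern (`fetch` and `opener` are illustrative names, not the actual jsonschema code): on Python 2 the object returned by urlopen() has no `__enter__`/`__exit__`, so wrapping it in contextlib.closing() works on both versions while still guaranteeing close().

```python
import contextlib

def fetch(url, opener):
    # `opener` stands in for urllib2.urlopen / urllib.request.urlopen.
    # contextlib.closing() supplies the context-manager protocol that
    # the Python 2 urlopen result lacks, and calls close() either way.
    with contextlib.closing(opener(url)) as response:
        return response.read()
```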
Hi! Thanks a lot for this, it'll close #439.
Left a comment on how to shorten up the test -- it also looks like the test might be failing on Windows.
I've applied the comments.
In order to fix the added test on Windows I need to find a Windows host, so it will take some days.
|
2025-04-01T06:37:07.374051
| 2021-05-27T23:43:28
|
904391180
|
{
"authors": [
"Julian",
"gebner"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1420",
"repo": "Julian/lean.nvim",
"url": "https://github.com/Julian/lean.nvim/pull/29"
}
|
gharchive/pull-request
|
Reset the cursor to the top of the infoview when updating.
Closes: #27
Bit hairy getting some testing in place, but @gebner care to review this does what you expect?
Hmm, unfortunately this does not work as well as I'd hoped. The cursor jumps back to the top as soon as you switch back to the window for the lean file. Maybe it just takes getting used to, but it is very counter-intuitive that switching windows moves the cursor. The infoview moves to the top while I'm still reading it.
-- long trace message for testing
#eval let y := (List.range 100).map fun x => dbg_trace x; x + 1 dbg_trace "this is important" y.length
Maybe a better heuristic is to move the cursor to the top only when you move to a new line.
The behavior of scrolloff=10 would also be okay, but that doesn't seem to work for automatic updates.
Aha, ok, I think I follow, you want it to move just if the new contents are shorter than the window length? Or do you want it to move even if they're long but have changed? (Whereas now it moves always even if nothing whatsoever changes in the infoview contents, just CursorHold fires again and it repopulates)
I think I can imagine both of those other behaviors being the expected one in different scenarios, right?
Aha, ok, I think I follow, you want it to move just if the new infoview contents are shorter than the window length?
My main issue with the current (= main branch) behavior is that it is easy to miss errors: you scroll down a long trace message.
Then you move a line up, the infoview is empty, and now you think that everything is fine or that lean is still processing the file.
So for me it's only important to scroll up when the content changes and becomes shorter.
Or do you want it to move even if they're long but have changed?
Ideally not. When I look at the middle of a long trace message and it's updated to similarly long message, then chances are I still want to see the same part, and see if anything's changed.
I think I can imagine both of those other behaviors being the expected one in different scenarios, right?
Good question, and I can't really tell. That's why I'd like to keep the automatic movements to a minimum. Scrolling the infoview has never bothered me in vscode, but it's also never empty.
OK, I think I convinced myself this is actually likely a neovim bug, which I filed as neovim/neovim#14663.
I found a way to hack around it for now, which I pushed to this PR (it actually I think has nothing to do with the cursor position but I'm not 100% and didn't go off and read the neovim C source yet).
Feel free to have another look, hopefully this is closer to what you expect, though perhaps we could be even more sticky on restoring line position. But in trying your long trace example and adding some other lines around it, I suspect this is an improvement.
This is exactly what I wanted! Thanks!
|
2025-04-01T06:37:07.405795
| 2016-12-29T14:46:04
|
198018091
|
{
"authors": [
"ramezsaeed",
"rkhamis"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1421",
"repo": "Jumpscale/jscockpit",
"url": "https://github.com/Jumpscale/jscockpit/issues/326"
}
|
gharchive/issue
|
scality service doesn't have the server data
when trying to run the s3 scality service to install s3server using this blueprint
g8client__main:
url: 'du-conv-2.demo.greenitglobe.com'
login: 's3user'
password: '123456789'
account: 's3_acc'
vdc__scality22:
g8client: 'main'
location: 'du-conv-2'
disk.ovc__disk1:
size: 1000
s3__s3vm:
vdc: 'scality22'
disk:
- 'disk1'
hostprefix: 's3app22'
scenario:
1- create new repo from cockpit portal
2- execute the bp from cockpit portal
3- the run status will be OK and all the steps are OK too
4- however if I go to the scality service to get the secret/access keys, it will be an empty JSON
root@vm-14:/optvar/cockpit_repos/s3server4/services/vdcfarm!auto_82/vdc!scality24/node!s3vm/os!s3vm/node!app/os!app/scality!app# cat data.json
{
"domain":"",
"keyAccess":"",
"keySecret":"",
"os":"app",
"storageData":"\/data\/data",
"storageMeta":"\/data\/meta"
5- now from the cockpit machine, do ays destroy & ays blueprint & ays install
now the data.json will be OK and include all the needed info
root@vm-14:/optvar/cockpit_repos/s3server4/services/vdcfarm!auto_82/vdc!scality24/node!s3vm/os!s3vm/node!app/os!app/scality!app# cat data.json
{
"domain":"s3app21-3232242740.gigapps.io",
"keyAccess":"W7X2BxGxWXxO",
"keySecret":"YBbEjNdeQrqp",
"os":"app",
"storageData":"\/data\/data",
"storageMeta":"\/data\/meta"
Doing the execute action from the portal only inits the services. These values are set during the install.
To have these values set, you should've also run the install action from the portal.
This is the designed behavior.
|
2025-04-01T06:37:07.408221
| 2016-11-28T13:13:55
|
192003006
|
{
"authors": [
"abdulrahmantkhalifa",
"despiegk",
"xmonader"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1422",
"repo": "Jumpscale/jumpscale_core8",
"url": "https://github.com/Jumpscale/jumpscale_core8/issues/588"
}
|
gharchive/issue
|
cuisine executor? does not show output while going
When I start, I see no output until the command is done.
This is not ok.
I need to see the output while the command is running.
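A hypothetical Python sketch of the expected behavior — echoing each line as the child process produces it instead of buffering until exit (`run_streaming` is an illustrative name, not cuisine's actual API):

```python
import subprocess
import sys

def run_streaming(cmd):
    """Run cmd, forwarding each output line as soon as it arrives."""
    proc = subprocess.Popen(
        cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True
    )
    captured = []
    for line in proc.stdout:       # yields lines while the command runs
        sys.stdout.write(line)     # show progress immediately
        captured.append(line)
    proc.stdout.close()
    return proc.wait(), "".join(captured)
```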
Checked on Ubuntu: works
https://gist.github.com/xmonader/ac7398c0edeae39fd16364c410e660da
Needs to be validated on osx
verified on osx
|
2025-04-01T06:37:07.455680
| 2020-07-14T22:15:55
|
656931289
|
{
"authors": [
"AutumnClove",
"ibrahimk157"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1423",
"repo": "Just-Some-Bots/MusicBot",
"url": "https://github.com/Just-Some-Bots/MusicBot/issues/2094"
}
|
gharchive/issue
|
Download YouTube playlist videos in a different way
Please tick all applicable boxes.
[X] I am using Python 3.5.3 or higher (run python --version on the command line)
[X] I have followed the official guides to install the bot for my system
[X] I have updated my dependencies to the latest version using the appropriate update script
Which version are you using?
[ ] The latest master version (release-260419)
[X] The latest review version
What type of issue are you creating?
[X] Bug
[ ] Feature request
[ ] Question
Description of issue
When the bot is commanded to play a YouTube playlist, it downloads all the videos of the playlist at once, which for a large playlist gets the VPS the bot is running on IP-blocked by YouTube.
This behavior doesn’t make sense!
Why doesn’t the bot instead download only the first song, play it, then download the second song once the first is done, and so on?
That would be much better than downloading all the songs of a playlist at once, and it would surely prevent YouTube’s annoying IP block.
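The proposed behavior could be sketched like this (hypothetical `download`/`play` callbacks, not MusicBot's actual internals):

```python
from collections import deque

def play_playlist_lazily(urls, download, play):
    """Fetch one track at a time: download a song only when it is
    next in the queue, instead of prefetching the whole playlist."""
    queue = deque(urls)
    order = []
    while queue:
        url = queue.popleft()
        track = download(url)   # at most one in-flight download
        play(track)
        order.append(url)
    return order
```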
Steps to reproduce
Make it play any YouTube playlist with many videos (example: https://www.youtube.com/playlist?list=PL4o29bINVT4EG_y-k5jGoOu3-Am8Nvi10)
It will try to download all of the videos at once and will get IP-blocked by YouTube in no time.
Log file
Please attach your MusicBot log file (located at logs/musicbot.log) to this issue. You can do so by dragging and dropping the file here. If you do not include your log file, you WILL be asked to provide one.
[30.515432358] 2020-07-14 21:31:48,849 - INFO - launcher: Checking for Python 3.5+
[30.718088150] 2020-07-14 21:31:48,849 - INFO - launcher: Checking console encoding
[30.900955200] 2020-07-14 21:31:48,849 - INFO - launcher: Ensuring we're in the right environment
[448.570251465] 2020-07-14 21:31:49,267 - INFO - launcher: Required checks passed.
[449.181795120] 2020-07-14 21:31:49,268 - INFO - launcher: Optional checks passed.
[449.469566345] 2020-07-14 21:31:49,268 - INFO - launcher: Moving old musicbot log
######################### PRE-RUN SANITY CHECKS PASSED #########################
[1.0954689979553223] 2020-07-14 21:31:49,914 - WARNING - musicbot.config | In config.py::MainThread(140250957956928), line 123 in run_checks: i18n file does not exist. Trying to fallback to config/i18n/en.json.
[1.0958342552185059] 2020-07-14 21:31:49,914 - INFO - musicbot.config | In config.py::MainThread(140250957956928), line 134 in run_checks: Using i18n: config/i18n/en.json
[1.0980954170227051] 2020-07-14 21:31:49,917 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 239 in _setup_logging: Set logging level to INFO
[1.1010162830352783] 2020-07-14 21:31:49,920 - DEBUG - musicbot.json | In json.py::MainThread(140250957956928), line 8 in __init__: Init JSON obj with config/i18n/en.json
[1.1118435859680176] 2020-07-14 21:31:49,930 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 90 in __init__: Starting MusicBot release-260819-72-g2350384
[1.1124629974365234] 2020-07-14 21:31:49,931 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 96 in __init__: Loaded autoplaylist with 2531 entries
[1.1938691139221191] 2020-07-14 21:31:50,012 - DEBUG - musicbot.spotify | In spotify.py::MainThread(140250957956928), line 81 in get_token: Created a new access token: {'access_token': 'BQBYcGaTU42K96Gnt82HevLsnJEwJOVWAES6nNenj54CuwRTc6jCZH0ua598Lsd790dqgTGSY1IefZahBno', 'token_type': 'Bearer', 'expires_in': 3600, 'scope': '', 'expires_at':<PHONE_NUMBER>}
[1.1942601203918457] 2020-07-14 21:31:50,013 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 121 in __init__: Authenticated with Spotify successfully using client ID and secret.
[3.6479089260101318] 2020-07-14 21:31:52,466 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 965 in on_ready: Connection established, ready to go.
[3.6483943462371826] 2020-07-14 21:31:52,467 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 356 in _cache_app_info: Caching app info
[3.7539911270141602] 2020-07-14 21:31:52,573 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 787 in _scheck_ensure_env: Ensuring data folders exist
[3.7552502155303955] 2020-07-14 21:31:52,574 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 803 in _scheck_server_permissions: Checking server permissions
[3.7554564476013184] 2020-07-14 21:31:52,574 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 807 in _scheck_autoplaylist: Auditing autoplaylist
[3.7556393146514893] 2020-07-14 21:31:52,574 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 811 in _scheck_configs: Validating config
[3.7558276653289795] 2020-07-14 21:31:52,574 - DEBUG - musicbot.config | In config.py::MainThread(140250957956928), line 237 in async_validate: Validating options...
[3.7559976577758789] 2020-07-14 21:31:52,575 - DEBUG - musicbot.config | In config.py::MainThread(140250957956928), line 251 in async_validate: Acquired owner id via API
[3.7561569213867188] 2020-07-14 21:31:52,575 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 814 in _scheck_configs: Validating permissions config
[3.7563233375549316] 2020-07-14 21:31:52,575 - DEBUG - musicbot.permissions | In permissions.py::MainThread(140250957956928), line 94 in async_validate: Validating permissions...
[3.7565703392028809] 2020-07-14 21:31:52,575 - DEBUG - musicbot.permissions | In permissions.py::MainThread(140250957956928), line 98 in async_validate: Fixing automatic owner group
[3.7567837238311768] 2020-07-14 21:31:52,575 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 979 in on_ready: Connected:<PHONE_NUMBER>16830987/Merde#6205
[3.7572257518768311] 2020-07-14 21:31:52,576 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 987 in on_ready: Owner:<PHONE_NUMBER>99267585/ٴٴ#9103
[3.7575387954711914] 2020-07-14 21:31:52,576 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 993 in on_ready: Guild List:
[3.7578756809234619] 2020-07-14 21:31:52,576 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 997 in on_ready: - Harder
[3.7582430839538574] 2020-07-14 21:31:52,577 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 1053 in on_ready: Not bound to any text channels
[3.7584948539733887] 2020-07-14 21:31:52,577 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 1079 in on_ready: Not autojoining any voice channels
[3.7594964504241943] 2020-07-14 21:31:52,578 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 286 in _join_startup_channels: Found owner in "General"
[3.7597861289978027] 2020-07-14 21:31:52,578 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 295 in _join_startup_channels: Attempting to join Harder/General
[6.4041831493377686] 2020-07-14 21:31:55,223 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 747 in deserialize_queue: Deserializing queue for<PHONE_NUMBER>14011738
[6.6349246501922607] 2020-07-14 21:31:55,453 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 439 in get_player: Created player via deserialization for guild<PHONE_NUMBER>14011738 with 1 entries
[6.6354134082794189] 2020-07-14 21:31:55,454 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 311 in _join_startup_channels: Joined Harder/General
[6.6366314888000488] 2020-07-14 21:31:55,455 - DEBUG - musicbot.entry | In entry.py::MainThread(140250957956928), line 62 in get_ready_future: Created future for None
[6.6373357772827148] 2020-07-14 21:31:55,456 - INFO - musicbot.entry | In entry.py::MainThread(140250957956928), line 366 in _really_download: Download started: https://soundcloud.com/izanagibang/best-song-ever-created
[8.2095448970794678] 2020-07-14 21:31:57,028 - INFO - musicbot.entry | In entry.py::MainThread(140250957956928), line 376 in _really_download: Download complete: https://soundcloud.com/izanagibang/best-song-ever-created
[8.2132229804992676] 2020-07-14 21:31:57,032 - FFMPEG - musicbot.player | In player.py::MainThread(140250957956928), line 293 in _play: Creating player with options: -nostdin -vn audio_cache/soundcloud-90213207-Best_Song_Ever_Created_Maximbady_-_Hey_baby.mp3
[8.2230744361877441] 2020-07-14 21:31:57,042 - DEBUG - musicbot.player | In player.py::MainThread(140250957956928), line 306 in _play: Playing <musicbot.player.SourcePlaybackCounter object at 0x7f8eb49a2400> using <discord.voice_client.VoiceClient object at 0x7f8eb521fb50>
[8.2273652553558350] 2020-07-14 21:31:57,046 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 474 in on_player_play: Running on_player_play
[8.2278752326965332] 2020-07-14 21:31:57,046 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 723 in serialize_queue: Serializing queue for<PHONE_NUMBER>14011738
[27.0019505023956299] 2020-07-14 21:32:15,820 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 2716 in on_message:<PHONE_NUMBER>99267585/ٴٴ#9103: !skip
[27.0131967067718506] 2020-07-14 21:32:15,832 - DEBUG - musicbot.player | In player.py::Thread-3(140250671998720), line 220 in _playback_finished: Deleting file: audio_cache/soundcloud-90213207-Best_Song_Ever_Created_Maximbady_-_Hey_baby.mp3
[27.0143411159515381] 2020-07-14 21:32:15,833 - DEBUG - musicbot.player | In player.py::Thread-3(140250671998720), line 225 in _playback_finished: File deleted: audio_cache/soundcloud-90213207-Best_Song_Ever_Created_Maximbady_-_Hey_baby.mp3
[27.0178744792938232] 2020-07-14 21:32:15,836 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 550 in on_player_finished_playing: Running on_player_finished_playing
[27.2232422828674316] 2020-07-14 21:32:16,042 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 723 in serialize_queue: Serializing queue for<PHONE_NUMBER>14011738
[27.2246158123016357] 2020-07-14 21:32:16,043 - DEBUG - musicbot.bot | In bot.py::MainThread(140250957956928), line 546 in on_player_stop: Running on_player_stop
[104.0940260887145996] 2020-07-14 21:33:32,913 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 2716 in on_message:<PHONE_NUMBER>99267585/ٴٴ#9103: !stream https://youtu.be/-aLYvZ5sX28
[104.9392480850219727] 2020-07-14 21:33:33,758 - ERROR - musicbot.bot | In bot.py::MainThread(140250957956928), line 2845 in on_message: Error in stream: ExtractionError: Unknown error: [0;31mERROR:[0m Unable to download webpage: HTTP Error 429: Too Many Requests (caused by <HTTPError 429: 'Too Many Requests'>)
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/extractor/common.py", line 627, in _request_webpage
return self._downloader.urlopen(url_or_request)
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/YoutubeDL.py", line 2238, in urlopen
return self._opener.open(req, timeout=self._socket_timeout)
File "/usr/lib/python3.8/urllib/request.py", line 531, in open
response = meth(req, response)
File "/usr/lib/python3.8/urllib/request.py", line 640, in http_response
response = self.parent.error(
File "/usr/lib/python3.8/urllib/request.py", line 563, in error
result = self._call_chain(*args)
File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
result = func(*args)
File "/usr/lib/python3.8/urllib/request.py", line 755, in http_error_302
return self.parent.open(new, timeout=req.timeout)
File "/usr/lib/python3.8/urllib/request.py", line 531, in open
response = meth(req, response)
File "/usr/lib/python3.8/urllib/request.py", line 640, in http_response
response = self.parent.error(
File "/usr/lib/python3.8/urllib/request.py", line 569, in error
return self._call_chain(*args)
File "/usr/lib/python3.8/urllib/request.py", line 502, in _call_chain
result = func(*args)
File "/usr/lib/python3.8/urllib/request.py", line 649, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 429: Too Many Requests
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/YoutubeDL.py", line 797, in extract_info
ie_result = ie.extract(url)
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/extractor/common.py", line 530, in extract
ie_result = self._real_extract(url)
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/extractor/youtube.py", line 1782, in _real_extract
video_webpage, urlh = self._download_webpage_handle(url, video_id)
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/extractor/youtube.py", line 276, in _download_webpage_handle
return super(YoutubeBaseInfoExtractor, self)._download_webpage_handle(
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/extractor/common.py", line 660, in _download_webpage_handle
urlh = self._request_webpage(url_or_request, video_id, note, errnote, fatal, data=data, headers=headers, query=query, expected_status=expected_status)
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/extractor/common.py", line 645, in _request_webpage
raise ExtractorError(errmsg, sys.exc_info()[2], cause=err)
youtube_dl.utils.ExtractorError: Unable to download webpage: HTTP Error 429: Too Many Requests (caused by <HTTPError 429: 'Too Many Requests'>)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ikk157/MusicBot/musicbot/playlist.py", line 123, in add_stream_entry
info = await self.downloader.extract_info(self.loop, song_url, download=False)
File "/home/ikk157/MusicBot/musicbot/downloader.py", line 84, in extract_info
return await loop.run_in_executor(self.thread_pool, functools.partial(self.unsafe_ytdl.extract_info, *args, **kwargs))
File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/YoutubeDL.py", line 820, in extract_info
self.report_error(compat_str(e), e.format_traceback())
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/YoutubeDL.py", line 625, in report_error
self.trouble(error_message, tb)
File "/usr/local/lib/python3.8/dist-packages/youtube_dl/YoutubeDL.py", line 595, in trouble
raise DownloadError(message, exc_info)
youtube_dl.utils.DownloadError: [0;31mERROR:[0m Unable to download webpage: HTTP Error 429: Too Many Requests (caused by <HTTPError 429: 'Too Many Requests'>)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ikk157/MusicBot/musicbot/bot.py", line 2823, in on_message
response = await handler(**handler_kwargs)
File "/home/ikk157/MusicBot/musicbot/bot.py", line 1709, in cmd_stream
await player.playlist.add_stream_entry(song_url, channel=channel, author=author)
File "/home/ikk157/MusicBot/musicbot/playlist.py", line 138, in add_stream_entry
raise ExtractionError("Unknown error: {}".format(e))
musicbot.exceptions.ExtractionError: Unknown error: [0;31mERROR:[0m Unable to download webpage: HTTP Error 429: Too Many Requests (caused by <HTTPError 429: 'Too Many Requests'>)
[133.1778171062469482] 2020-07-14 21:34:01,996 - INFO - musicbot.bot | In bot.py::MainThread(140250957956928), line 2716 in on_message:<PHONE_NUMBER>99267585/ٴٴ#9103: !shutdown
[133.707516432] launcher-INFO: All done.
This won't fix anything about the IP block, because you'll still be downloading from the same IP. We do want to add a built-in proxy in the future to help prevent the IP block, however.
Ah, I see... I appreciate your efforts!
|
2025-04-01T06:37:07.464882
| 2018-09-12T04:03:37
|
359307385
|
{
"authors": [
"ReddeR1337",
"TheerapakG"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1424",
"repo": "Just-Some-Bots/MusicBot",
"url": "https://github.com/Just-Some-Bots/MusicBot/pull/1727"
}
|
gharchive/pull-request
|
Autoplaylist Functionalities Phase 1: Autostream Implementation
After creating your pull request, tick these boxes if they are applicable to you.
[x] I have tested my changes against the review branch (the latest developmental version), and this pull request is targeting that branch as a base
[x] I have tested my changes on Python 3.5/3.6
Description
I planned to do several pull requests to rewrite and add functionalities to autoplaylist which cover a range of feature requests. This will happen in different phases. Phases are arranged in the order that it won't interfere with each other very much.
phase 1: add autostream and related functionalities
phase 2: make options to allow totally different autoplaylist.txt in different servers (or guild)
phase 3: add an option to show autoplaylist "now playing" notifications
This pull request introduces autostream. It's similar to autoplaylist, but it's for streams.
You can add streams for autoplaying in config/autostream.txt.
If the configuration for randomization and toggling is not set, autostream will play after autoplaylist.
Update 1: autostream and autoplaylist could be set to skip automatically when someone adds stuff to the queue.
Update 2: autostream and autoplaylist can play in 2 modes:
merge: merge autoplaylist and autostream when playback
toggle: use the toggleplaylist command to toggle between them
Update 3:
modes are saved
different modes for different servers now possible
I am currently satisfied with these changes. I don't think I will push more commits except commits for clarity and bug fixing from b06c8b3 onwards. (This marks an end to further feature implementation in this little PR.)
As of now, if #1740 got merged before this. This PR will require some modifications to work properly. If there's plan to merge #1740, merge it first. I will make the required modifications on this PR after #1740 got merged. If #1740 got merged and this message is still here, DO NOT MERGE THIS PR as it will definitely break. I will merge changes required from TheerapakG/MusicBot:autostream_permissions into this.
Related issues (if applicable)
(#1462, #1587)
not a direct implementation of #1587 but introduce "roughly equivalent" features, here is the breakdown:
autostream.txt now exist
you can add or remove a stream using autostream command
toggling can be set using the config file
state of toggling does save so next startup it'll be on whatever mode you've set
you can use the config file to make the autos skip when a new entry is added (self-closing streams do exist, so I added the config to let you choose whether to force the autostream to skip or not)
Waiting for AutoStream so much! Thanks for the work, man!
Any ETA ?
Any ETA for autostream function ?
You can try it out now by pulling from the branch in my repository. There might be some bugs (I would expect only two or three at most) which I haven't found. If you try it out and find them, please let me know! As for when this will be merged into the main branch, I don't know yet. It depends on the collaborators of this repository.
|
2025-04-01T06:37:07.468698
| 2023-09-19T16:06:06
|
1903308752
|
{
"authors": [
"JustAnyones"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1425",
"repo": "JustAnyones/Plugin-creator-website",
"url": "https://github.com/JustAnyones/Plugin-creator-website/issues/3"
}
|
gharchive/issue
|
Allow to create .plugin file from any plugin
Might make sense, in accordance with https://github.com/JustAnyones/Plugin-creator-website/issues/2, to allow creating a .plugin file from any plugin, or from a previous plugin made in PCA in case of an update.
It has been decided to not implement it for any kind of plugin. The API for encrypting plugins can be made public, but PCA will not support this feature out of the box, only for its own projects.
|
2025-04-01T06:37:07.512761
| 2021-03-28T02:41:00
|
842646266
|
{
"authors": [
"Jyouhou",
"wushilian"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1426",
"repo": "Jyouhou/UnrealText",
"url": "https://github.com/Jyouhou/UnrealText/issues/23"
}
|
gharchive/issue
|
Can you share the word crop code
In the paper: "We crop from the proposed multilingual dataset."
But I ended up with more than 7 million text line images.
How did you crop the text regions? Did you use axis-aligned boxes or quadrilaterals?
@Jyouhou I use axis-aligned boxes, and only rectangles with width and height greater than 32 are retained
Thanks for the reply.
Most text is highly oriented in the dataset. I filtered by the shortest edge of the quadrilaterals (not of the axis-aligned boxes).
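That shortest-edge filter can be sketched as a small helper (hypothetical code, not from the paper's release; the `min_edge=32` threshold mirrors the value mentioned above):

```python
from math import hypot

def shortest_edge(quad):
    """Length of the shortest side of a quadrilateral given as four (x, y) points."""
    return min(
        hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(quad, quad[1:] + quad[:1])
    )

def keep_region(quad, min_edge=32):
    """Keep a text region only if its shortest edge is at least min_edge pixels."""
    return shortest_edge(quad) >= min_edge
```

For highly oriented text, this rejects thin rotated boxes that an axis-aligned width/height check would let through.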
@Jyouhou Can you share your wechat? It's more convenient to communicate
Sure. You can send your wechat account to my cmu email<EMAIL_ADDRESS>
|
2025-04-01T06:37:07.576566
| 2024-10-27T20:04:21
|
2616837114
|
{
"authors": [
"KEINOS"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1427",
"repo": "KEINOS/Dockerfile_of_SQLite3",
"url": "https://github.com/KEINOS/Dockerfile_of_SQLite3/pull/61"
}
|
gharchive/pull-request
|
Changes by create-pull-request action
Automated changes by create-pull-request GitHub action
Your image
keinos/sqlite3:latest
Current base image
alpine:latest
Overview
Image reference
keinos/sqlite3:3.46.1
keinos/sqlite3:latest
- digest
ae78ae013f46
cdc9a2c3c976
- tag
3.46.1
latest
- stream
latest
- provenance
https://github.com/KEINOS/Dockerfile_of_SQLite3/commit/9e8d27bfac0f790f9a6de7babbb20270ab9611b7
- vulnerabilities
- platform
linux/amd64
linux/amd64
- size
7.4 MB
9.7 MB (+2.3 MB)
- packages
17
14 (-3)
Base Image
alpine:3 (also known as: 3.20, 3.20.3, latest)
alpine:latest (also known as: 3, 3.20, 3.20.3)
- vulnerabilities
Policies (0 improved, 0 worsened, 7 missing data)
Policy Name
keinos/sqlite3:3.46.1
keinos/sqlite3:latest
Change
Standing
Default non-root user
:white_check_mark:
:question: No data
No AGPL v3 licenses
:white_check_mark:
:question: No data
No fixable critical or high vulnerabilities
:white_check_mark:
:question: No data
No high-profile vulnerabilities
:white_check_mark:
:question: No data
No outdated base images
:white_check_mark:
:question: No data
No unapproved base images
:white_check_mark:
:question: No data
Supply chain attestations
:white_check_mark:
:question: No data
Packages and Vulnerabilities (5 package changes and 0 vulnerability changes)
:heavy_minus_sign: 3 packages removed
:infinity: 2 packages changed
12 packages unchanged
Changes for packages of type apk (5 changes)
Package
Version (keinos/sqlite3:3.46.1)
Version (keinos/sqlite3:latest)
:heavy_minus_sign:
ca-certificates
20240705-r0
:infinity:
libcrypto3
3.3.2-r0
3.3.2-r1
:infinity:
libssl3
3.3.2-r0
3.3.2-r1
:heavy_minus_sign:
openssl
3.3.2-r0
:heavy_minus_sign:
pax-utils
1.3.7-r2
|
2025-04-01T06:37:07.582426
| 2017-06-03T07:09:02
|
233352048
|
{
"authors": [
"KELiON",
"danielmelogpi",
"maximbaz",
"nazar-pc"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1428",
"repo": "KELiON/cerebro",
"url": "https://github.com/KELiON/cerebro/issues/333"
}
|
gharchive/issue
|
How could I disable Google search?
I'm using Cerebro as an app launcher and never want to use it for web search. However, I sometimes end up executing search queries I don't want to, especially considering it is a Google search (my search engine of choice is DuckDuckGo).
I can't find corresponding plugin and do not see an option to disable this. I remember you've being planning to decouple everything into plugins, will it be possible to disable/remove search entirely then?
It would be nice to have google search as a plugin so we could have more flexibility
It is already extracted from the main repository as a separate plugin, so we are all eagerly awaiting the 0.2.9 release to finally be able to disable it 🙂
I see.
It seems that the same applies to Yandex translate:
https://github.com/KELiON/cerebro-yandex-translate
Yes, I've finished extracting plugins, but there are a few minor bugs that prevent releasing it :) But it will be soon, I promise! :D
In version 0.3.0 you can uninstall google plugin as any other
|
2025-04-01T06:37:07.632618
| 2016-08-10T20:55:24
|
170516603
|
{
"authors": [
"KN4CK3R",
"L0ginErr0r"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1430",
"repo": "KN4CK3R/KeePassBrowserImporter",
"url": "https://github.com/KN4CK3R/KeePassBrowserImporter/issues/7"
}
|
gharchive/issue
|
Plugin cannot import time records from Firefox (password created, password last used/modified)
Would it be possible for the plugin to take both dates available in Firefox and transfer them to KeePass?
I know KeePass has two columns:
Creation Time
Last Modification Time
After importing all passwords they are both set at the time of the import.
However, this information is stored in Firefox under:
First time used
Last Change
Would it be possible to import all those dates together with the passwords?
added @ v1.0.2
|
2025-04-01T06:37:07.670004
| 2023-08-01T11:13:15
|
1831038700
|
{
"authors": [
"HebaruSan",
"ink0r"
],
"license": "CC0-1.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1432",
"repo": "KSP-CKAN/NetKAN",
"url": "https://github.com/KSP-CKAN/NetKAN/issues/9747"
}
|
gharchive/issue
|
Kopernicus.dll is registered to Kopernicus but has not been removed
Is there an existing issue for this?
[X] I have searched the existing issues
Operating System
Win 10
CKAN Version
1.33.2
Game Version
<IP_ADDRESS>90
Did you make any manual changes to your game folder (i.e., not via CKAN)?
No response
Describe the bug
Can't install an update to Kopernicus; CKAN gives an error about inconsistencies found.
No idea if this is a NetKAN or CKAN bug (sorry if this is in the wrong place), but a metadata problem sounds more likely?
Steps to reproduce
cleared Kopernicus archive from CKAN cache
attempted to re apply update through CKAN as if it were a download error, still happens
Relevant log output
* Upgrade: Kopernicus Planetary System Modifier 2:release-1.12.1-176 to 2:release-1.12.1-177 (cached)
The following inconsistencies were found:
* D:/Games/SteamLibrary/steamapps/common/Kerbal Space Program/GameData/Kopernicus/Plugins/Kopernicus.Parser.dll is registered to Kopernicus but has not been removed!
* D:/Games/SteamLibrary/steamapps/common/Kerbal Space Program/GameData/Kopernicus/Plugins/Kopernicus.dll is registered to Kopernicus but has not been removed!
Error during installation!
If the above message indicates a download error, please try again. Otherwise, please open an issue for us to investigate.
If you suspect a metadata problem: https://github.com/KSP-CKAN/NetKAN/issues/new/choose
If you suspect a bug in the client: https://github.com/KSP-CKAN/CKAN/issues/new/choose
That means your OS wouldn't let CKAN delete that file, which usually means either the permissions got messed up (possibly because CKAN was run as administrator previously) or the game is still running. Make sure the game is closed (reboot if necessary), check the permissions, and try again.
|
2025-04-01T06:37:07.673395
| 2015-09-06T01:36:43
|
105061140
|
{
"authors": [
"KerbalStuffBot",
"plague006"
],
"license": "CC0-1.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1433",
"repo": "KSP-CKAN/NetKAN",
"url": "https://github.com/KSP-CKAN/NetKAN/pull/2255"
}
|
gharchive/pull-request
|
Add Alternis Kerbol Rekerjiggered from Kerbal Stuff
This pull request was automatically generated by Kerbal Stuff on behalf of GregroxMun, to add Alternis Kerbol Rekerjiggered to CKAN.
Please direct questions about this pull request to GregroxMun.
This mod wants to overwrite configs for PlanetShine and DistantObject, depends on Kopernicus, and recommends KopernicusExpansion and PlanetShine.
PlanetShine and DistantObject already have config splits, so it's just a matter of creating AlternisKerbolRekerjiggered, DistantObject-AlternisKerbolRekerjiggered, and AlternisKerbolRekerjiggered-PlanetShine
Closing in favour of https://github.com/KSP-CKAN/NetKAN/pull/2443
|
2025-04-01T06:37:07.684727
| 2020-02-11T23:07:34
|
563579718
|
{
"authors": [
"HebaruSan",
"Space-Duck",
"whale2"
],
"license": "CC0-1.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1434",
"repo": "KSP-CKAN/NetKAN",
"url": "https://github.com/KSP-CKAN/NetKAN/pull/7689"
}
|
gharchive/pull-request
|
Add Infernal RO-Robotics from SpaceDock
This pull request was automatically generated by SpaceDock on behalf of whale_2, to add Infernal RO-Robotics to CKAN.
Please direct questions about this pull request to whale_2.
Mod details:
name = /mod/2329/Infernal%20RO-Robotics
author = whale_2
abstract = Infernal Robotics fork for Realism Overhaul
license = GPLv3
Homepage = https://github.com/whale2/InfernalRobotics/tree/master
description =
This is special fork of Infernal Robotics targeted at Realism Overhaul, though it should work just fine in usual KSP too.
Features autostrut-like mechanism for moving parts if needed.
Hey @whale2, this is marked as compatible with KSP 1.8 but RealismOverhaul is still on KSP 1.7. Is that OK?
Also what about this is particular to RO? It kind of sounds like this is an adoption/continuation of InfernalRobotics just using RO as branding...?
Hi HebaruSan,
It's quite a complicated story with many parties involved. As you probably know, Infernal Robotics was picked up by Rudolf Meier and completely rewritten along with some other mods like KJR, on which IR Next kind of depends. At that point there already was KJR Continued by pap1723, which was a part of the RO set of mods. Currently, the RO folks refuse to provide any support if KJR Next is used (don't ask me why), but this rendered IR Next a less viable option for RO. I was eager to add IR to my RO gameplay, so I picked up the old IR and added some features to it as well as RO rebalancing. This move was discussed with sirkut and ZodiusInfuser (I also messaged Ziw but got no answer), and I said I won't advertise it much on the forums so as not to add to the confusion with different IR forks.
So technically, my IR fork works as well in a non-RO game as in RO, but from a "product placement" POV it is intended for RO, where IR Next is harder to use. Hope that explains the matter.
As for 1.8: RO is gradually moving to this version after Kopernicus got released for 1.8.1; you can see it in the so-called Golden Spreadsheet (https://docs.google.com/spreadsheets/d/1Ldf_nCZw0MiCP-Y5YFuvQioEKihY1SNgD-5yS983itE/edit?usp=sharing). Also, the preferred method of installing RO is CKAN, so I wanted to add this RO-branded IR to it.
Hmm, would it make sense to add IR Next as a conflicting mod? I believe it uses the same directory under GameData.
Yes, I think several new relationships are probably necessary here.
Can this mod be used with KJRNext?
Can this mod be used with KJRNext?
Frankly - I didn't test it.
OK, if any users complain about problems, let us know and we can add a conflict.
|
2025-04-01T06:37:07.690558
| 2023-11-23T19:41:08
|
2008725577
|
{
"authors": [
"kaarmu",
"sulthansf"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1435",
"repo": "KTH-SML/sidewalk_mobility_demo",
"url": "https://github.com/KTH-SML/sidewalk_mobility_demo/pull/1"
}
|
gharchive/pull-request
|
Add sidewalk segmentation feature to svea_vision package
This PR introduces sidewalk segmentation feature to svea_vision package. The implementation is primarily done in the sidewalk_segmentation.py node.
The new feature includes the following key functionalities:
Image segmentation: The segment_image method is used to segment the sidewalk from the input image using FastSAM model.
Prompting: The segmentation can be guided with a bounding box, a set of points, or text. The type of prompt can be specified using the prompt_type parameter.
Point cloud extraction: The extract_pointcloud method is used to extract the point cloud data corresponding to the segmented sidewalk.
Customization: A diverse set of ROS parameters are used in this feature, allowing for flexibility in configuring the sidewalk segmentation such as the prompt type, the value of the prompt, topic names etc.
Logging: The time taken for each step (inference, prompt, postprocess, extract point cloud, publish) can be logged for performance analysis using the verbose parameter.
Static image publisher: Alongside the main sidewalk_segmentation.py node, a utility node static_image_publisher.py is included in this PR. This script is used to publish a static image or a set of images to a ROS topic, which can be useful for testing and debugging.
Really nice. We can discuss in more detail what/how to bring this into svea.
|
2025-04-01T06:37:07.692788
| 2019-06-11T12:48:17
|
454670045
|
{
"authors": [
"willu47"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1436",
"repo": "KTH-dESA/OSeMOSYS",
"url": "https://github.com/KTH-dESA/OSeMOSYS/issues/23"
}
|
gharchive/issue
|
Generate model input files from one common format
At present, each of the three (four if OSeMOSYS_PuLP is included) language versions of OSeMOSYS use a different input file format.
Write a script which generates a correctly formatted datafile for the:
[ ] GAMS version
[ ] PuLP version
[ ] Pyomo version
from a GNU MathProg dat file.
Alternatively, generate a common file format (possibly CSV) which can be read by all language versions of OSeMOSYS.
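A rough sketch of what such a converter could look like for the simplest case — one-dimensional `param` blocks of a MathProg dat file flattened into CSV-style rows. The parameter and region names below are made-up illustration data, and real dat files have multi-dimensional parameters that would need a proper parser:

```python
import re

def mathprog_params_to_rows(dat_text):
    """Extract (parameter, key, value) rows from one-dimensional
    `param NAME := key value ... ;` blocks of a MathProg dat file."""
    rows = []
    # Each block runs from `param NAME :=` to the terminating `;`.
    for name, body in re.findall(r"param\s+(\w+)\s*:=(.*?);", dat_text, re.S):
        # Inside a block, pairs look like `KEY 0.05` on their own lines.
        for key, value in re.findall(r"(\w+)\s+([-+0-9.Ee]+)", body):
            rows.append((name, key, float(value)))
    return rows
```

The resulting tuples could then be written with the `csv` module and read back by each language-specific front end.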
Split into #70 and #71
|
2025-04-01T06:37:07.730580
| 2022-12-07T12:31:15
|
1481809985
|
{
"authors": [
"sherifkayad"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1438",
"repo": "Kaginari/terraform-provider-mongodb",
"url": "https://github.com/Kaginari/terraform-provider-mongodb/issues/34"
}
|
gharchive/issue
|
Updating / Deleting a user (using Terraform) results in Error: User does not exist
Related to #33
Deleting a user by removing the resource from Terraform results in Error: User does not exist. The same thing happens when we try to update, e.g., the username of the user.
<EMAIL_ADDRESS>Destroying... [id=YWRtaW4uc2hlcmlmLmF5YWRAcGFydG5lci5pb25pdHkuZXU=]
╷
│ Error: User does not exist
The following is my config for the user resource:
locals {
personalized_users = {
<EMAIL_ADDRESS> = {
      username = <EMAIL_ADDRESS>
      password = "some_secure_pass"
// ... some other stuff
}
}
}
resource "mongodb_db_user" "personalized_user" {
for_each = local.personalized_users
auth_database = "admin"
name = each.value.username
password = each.value.password
role {
db = "my_db_1"
role = "readWrite"
}
role {
db = "my_db_2"
role = "readWrite"
}
}
Found the reason! I was using usernames containing dot (.) and at (@) characters. Despite the fact that DocumentDB doesn't seem to complain, these were causing issues with the provider. I switched to using underscores (_) instead, and the deletion/update of users is working like a charm.
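The fix described above boils down to a tiny sanitization rule, sketched here in Python purely for illustration (in Terraform itself, the built-in `replace()` function does the same job on the locals):

```python
def sanitize_username(name):
    """Replace the characters the provider choked on (dots and @) with underscores."""
    return name.replace(".", "_").replace("@", "_")
```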
|
2025-04-01T06:37:07.757277
| 2024-01-06T21:13:15
|
2068837027
|
{
"authors": [
"Kaiede",
"s-dukes"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1439",
"repo": "Kaiede/Bedrockifier",
"url": "https://github.com/Kaiede/Bedrockifier/issues/76"
}
|
gharchive/issue
|
Backup doesn't work when Bedrock server is running on a docker swarm
When the Minecraft Bedrock server is running on a Docker swarm, the container name is appended with random numbers/letters each time the stack is started; because of this, the backup cannot connect to the container using the defined name.
This could be fixed by taking the name defined in the config.yml, in this instance "bds_minecraft_bedrock", and running the command below to get the container id, which can then be used successfully as the name.
$(docker ps -q -f name=bds_minecraft_bedrock)
Cheers
Si
Kubernetes throws a wrench in things too. There's something happening recently that may or may not impact you here?
I've been working with itzg on adding SSH console support to the java/bedrock containers, and I've just merged the support in bedrockifier a few minutes ago. So instead of connecting via the container's name, it connects to the hostname for the container.
However, I think the catch might be that the hostname in this case is still the container name, or is the individual node's container hidden behind a more usable hostname? If the swarm hostnames are random, this might not help much. Unfortunately, I'm not super up to speed on swarm, as everything lives on a single VM for me.
Thanks for the reply; I'll take a look at the SSH option and see if it's viable.
I'm completely new to docker, but as usual jumped in at the deep end with the whole swarm thing.
Currently I'm running a script on the host which literally runs the command below to switch out the name in the config file with the current container id, which has it working at least. "bds_minecraft_bedrock" in this case is the stack service name.
sudo sed -i -r "s/^( - name: ).*/\1 $(docker ps -q -f name=bds_minecraft_bedrock)/" /backup/config.yml
Cheers
Si
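The substitution the sed one-liner performs can also be sketched in Python (the container id would come from `docker ps -q -f name=bds_minecraft_bedrock` as above; the `containers:` snippet below is a hypothetical config fragment, not the full Bedrockifier schema):

```python
import re

def set_container_name(config_text, container_id):
    """Replace the value after '- name:' in the backup config.yml with the
    freshly resolved container id, mirroring the sed command above."""
    # A function replacement avoids backreference surprises in container_id.
    return re.sub(r"^(\s*- name: ).*$",
                  lambda m: m.group(1) + container_id,
                  config_text, flags=re.M)
```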
It's in the test tag now, and itzg's containers should have support in the latest tag. Examples using docker-compose are here until I can write up more complete documentation: https://github.com/Kaiede/Bedrockifier/tree/main/Examples
I would be interested to see how it helps, as one reason for this work was to make Kubernetes easier to support.
Just tested… it works!!! 😊 But I did have to set the hostname in the docker compose via the hostname key.
Below is a sanitised copy of my docker compose:
version: '3.4'
services:
  minecraft_bedrock:
    image: itzg/minecraft-bedrock-server
    deploy:
      placement:
        constraints:
          - node.hostname == docker-node-2
      restart_policy:
        condition: on-failure
        delay: 5s
        max_attempts: 3
        window: 120s
    hostname: bedrock_private
    environment:
      EULA: "TRUE"
      VERSION: LATEST
      UID: 0
      GID: 0
      TZ: europe/london
      PACKAGE_BACKUP_KEEP: 2
      ENABLE_SSH: "true"
      SSH_ENABLE: "TRUE"
      RCON_PASSWORD: fauxverysecurepassword
      SERVER_NAME: Bedrock at Home
      SERVER_PORT: 19132
      SERVER_PORT_V6: 19133
      GAMEMODE: survival
      DIFFICULTY: hard
      LEVEL_TYPE:
      ALLOW_CHEATS: "false"
      MAX_PLAYERS: 10
      ONLINE_MODE: "true"
      ALLOW_LIST:
      VIEW_DISTANCE: 32
      TICK_DISTANCE: 8
      PLAYER_IDLE_TIMEOUT: 0
      MAX_THREADS: 0
      LEVEL_NAME: Bedrock at Home
      LEVEL_SEED: myseed
      DEFAULT_PLAYER_PERMISSION_LEVEL: member
      TEXTUREPACK_REQUIRED: "false"
      SERVER_AUTHORITATIVE_MOVEMENT: server-auth
      PLAYER_MOVEMENT_SCORE_THRESHOLD: 20
      PLAYER_MOVEMENT_DISTANCE_THRESHOLD: 0.3
      PLAYER_MOVEMENT_DURATION_THRESHOLD_IN_MS: 500
      CORRECT_PLAYER_MOVEMENT: "false"
      ALLOW_LIST_USERS:
      OPS:
      MEMBERS:
      VISITORS:
      ENABLE_LAN_VISIBILITY: "true"
    expose:
      - 2222
    ports:
      - "19132:19132/udp"
    volumes:
      - bedrock_data:/data
    stdin_open: true
    tty: true
  backup:
    image: kaiede/minecraft-bedrock-backup:test
    deploy:
      placement:
        constraints:
          - node.hostname == docker-node-2
    restart: on-failure
    depends_on:
      - "minecraft_bedrock"
    environment:
      BACKUP_INTERVAL: "3h"
      TZ: "Europe/London"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - bedrock_backup:/backups
      - bedrock_data:/server
volumes:
  bedrock_data:
    driver: glusterfs
    name: "gv0/volumes/minecraft_data"
  bedrock_backup:
    driver: glusterfs
    name: "gv0/volumes/minecraft_backup"
Can you confirm which of these are correct?
ENABLE_SSH: "true"
SSH_ENABLE: "TRUE"
I think your documentation has it one way and itzg the other.
Thanks for the help!
I did have to set the hostname in the docker compose
Hmm, so when using docker-compose, I’ve seen that it does create hostnames for the services that are reachable from within the private network. It’s not quite the container name in my experience. For example, I have ~/minecraft/docker-compose.yml
services:
  # My Java Server
  yosemite:
    … etc …
  # My Bedrock Server
  cascades:
    … etc …
  backup:
    … etc …
The containers themselves wind up being minecraft_yosemite, minecraft_cascades and minecraft_backup. However, my config.yml for the backup uses yosemite:2222 and cascades:2222 for the ssh address, since that's the hostname generated for me by docker-compose. Does this work for docker swarm or not, I wonder?
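For reference, a hypothetical config.yml for the backup container under that layout. The field names below are illustrative only (the sed workaround earlier in the thread implies a list of `- name:` entries, but check the project's Examples for the real schema); the key point is simply that the SSH addresses can use the compose service names:

```yaml
# Sketch only: field names here are guesses, not the confirmed Bedrockifier schema.
servers:
  - name: yosemite
    ssh: yosemite:2222   # compose service name, not the container name
  - name: cascades
    ssh: cascades:2222
```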
This is one reason I need time to update the documentation, to try to capture some of the subtleties of the new system.
Can you confirm which of these are correct?
The docker-compose.yml on my personal server is using ENABLE_SSH, so that’s the correct one. I’ll double check the examples and fix it there.
Tested and yes using the service name also works with swarm, setting a hostname isn't required if you use the service name set in the compose for the SSH connection.
So my bad, I just assumed the hostname would be needed; clearly Docker is doing some name resolution in the background that translates service names as well.
Tested and yes using the service name also works with swarm, setting a hostname isn't required if you use the service name set in the compose for the SSH connection.
Good to know. I’ll keep that in mind when writing up the docs.
Since it sounds like SSH is working well for this scenario, and the work is now fully released with updates to the Wiki, I'll go ahead and close this.
|
2025-04-01T06:37:07.841615
| 2019-08-09T14:18:33
|
479011407
|
{
"authors": [
"JaiLuthra1",
"Swapnilr1",
"Varunvaruns9",
"vsvipul"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1440",
"repo": "KamandPrompt/kamandprompt.github.io",
"url": "https://github.com/KamandPrompt/kamandprompt.github.io/pull/91"
}
|
gharchive/pull-request
|
Removed broken links and fixed typing error and Added Legacy Page
Broken links - Swapnil Sharma and Pinank Solanki
@Varunvaruns9 @Swapnilr1
Pinank Solanki's Facebook ID seems to be https://www.facebook.com/psolanki10 so you can just update I think. Also need to remove his GitHub.
will do it and may I remove the github link because I can't find one?
@Swapnilr1 fixed it
@Varunvaruns9 please tell how to solve the issue.
Add some media queries here:
https://github.com/KamandPrompt/kamandprompt.github.io/blob/eab42c64b0e2b7e610fd84b8d77b890aa3b5fdb7/css/styles.css#L620
@JaiLuthra1 If you're not working on this, you should close this.
|
2025-04-01T06:37:07.859860
| 2024-06-01T15:54:32
|
2329211364
|
{
"authors": [
"Karl-HeinzSchneider",
"RadeghostWM"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1441",
"repo": "Karl-HeinzSchneider/WoW-DragonflightUI",
"url": "https://github.com/Karl-HeinzSchneider/WoW-DragonflightUI/issues/86"
}
|
gharchive/issue
|
Backport (paid?) to 3.3.5a
Would be interested in that. Thanks.
I don't think you have enough money for that 😅
It might be a lot of work/many hours.
|
2025-04-01T06:37:07.929090
| 2022-03-10T08:04:08
|
1164887888
|
{
"authors": [
"SergeStinckwich",
"coveralls"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1442",
"repo": "KendrickOrg/kendrick",
"url": "https://github.com/KendrickOrg/kendrick/pull/435"
}
|
gharchive/pull-request
|
Start to fix visualization class
Fix name of tests.
Start to fix visualization class
Pull Request Test Coverage Report for Build<PHONE_NUMBER>
2 of 8 (25.0%) changed or added relevant lines in 2 files are covered.
9 unchanged lines in 1 file lost coverage.
Overall coverage decreased (-0.02%) to 37.697%
Changes Missing Coverage:

| File | Covered Lines | Changed/Added Lines | % |
| --- | --- | --- | --- |
| src/Kendrick/Visualization.class.st | 0 | 6 | 0.0% |

Files with Coverage Reduction:

| File | New Missed Lines | % |
| --- | --- | --- |
| src/Kendrick/Visualization.class.st | 9 | 0% |

Totals:

- Change from base Build<PHONE_NUMBER>: -0.02%
- Covered Lines: 5488
- Relevant Lines: 14558
💛 - Coveralls
|
2025-04-01T06:37:07.931492
| 2016-07-19T19:06:50
|
166409720
|
{
"authors": [
"Keno",
"josefsachsconning"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1443",
"repo": "Keno/Gallium.jl",
"url": "https://github.com/Keno/Gallium.jl/issues/136"
}
|
gharchive/issue
|
Gallium fails to precompile
Using master branches of julia and Gallium and its dependencies. I'm pretty sure that it worked at b480ce3 of julia.
AWS-Sachs-Ubuntu$ usr/bin/julia
_
_ _ _(_)_ | A fresh approach to technical computing
(_) | (_) (_) | Documentation: http://docs.julialang.org
_ _ _| |_ __ _ | Type "?help" for help.
| | | | | | |/ _` | |
| | |_| | | | (_| | | Version 0.5.0-dev+5510 (2016-07-19 18:28 UTC)
_/ |\__'_|_|_|\__'_| | Commit 0524a52 (0 days old master)
|__/ | x86_64-linux-gnu
julia> using Gallium
INFO: Precompiling module Gallium...
ERROR: LoadError: LoadError: syntax: invalid operator ".!"
in include_from_node1(::String) at ./loading.jl:426 (repeats 2 times)
in macro expansion; at ./none:2 [inlined]
in anonymous at ./<missing>:?
in eval(::Module, ::Any) at ./boot.jl:234
in process_options(::Base.JLOptions) at ./client.jl:239
in _start() at ./client.jl:318
while loading /home/sachs/.julia/v0.5/JuliaParser/src/lexer.jl, in expression starting on line 46
while loading /home/sachs/.julia/v0.5/JuliaParser/src/JuliaParser.jl, in expression starting on line 9
ERROR: LoadError: Failed to precompile JuliaParser to /home/sachs/.julia/lib/v0.5/JuliaParser.ji
in compilecache(::String) at ./loading.jl:505
in require(::Symbol) at ./loading.jl:337
in include_from_node1(::String) at ./loading.jl:426
in macro expansion; at ./none:2 [inlined]
in anonymous at ./<missing>:?
in eval(::Module, ::Any) at ./boot.jl:234
in process_options(::Base.JLOptions) at ./client.jl:239
in _start() at ./client.jl:318
while loading /home/sachs/.julia/v0.5/ASTInterpreter/src/ASTInterpreter.jl, in expression starting on line 8
ERROR: LoadError: Failed to precompile ASTInterpreter to /home/sachs/.julia/lib/v0.5/ASTInterpreter.ji
in compilecache(::String) at ./loading.jl:505
in require(::Symbol) at ./loading.jl:337
in include_from_node1(::String) at ./loading.jl:426
in macro expansion; at ./none:2 [inlined]
in anonymous at ./<missing>:?
in eval(::Module, ::Any) at ./boot.jl:234
in process_options(::Base.JLOptions) at ./client.jl:239
in _start() at ./client.jl:318
while loading /home/sachs/.julia/v0.5/Gallium/src/Gallium.jl, in expression starting on line 3
ERROR: Failed to precompile Gallium to /home/sachs/.julia/lib/v0.5/Gallium.ji
in compilecache(::String) at ./loading.jl:505
in require(::Symbol) at ./loading.jl:364
in eval(::Module, ::Any) at ./boot.jl:234
in macro expansion at ./REPL.jl:92 [inlined]
in (::Base.REPL.##1#2{Base.REPL.REPLBackend})() at ./event.jl:46
814c974026b8c5dc1a19a19f403e8a49395455ea will fix this soon
I'm confused. 814c974 was prior to 0524a52, where the problem was reported.
sorry, what I meant was 814c974, but I will fix this soon
|
2025-04-01T06:37:07.968604
| 2023-04-16T10:02:22
|
1669829016
|
{
"authors": [
"Anas-Mughal",
"KevCui",
"lord8266"
],
"license": "WTFPL",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1444",
"repo": "KevCui/animepahe-dl",
"url": "https://github.com/KevCui/animepahe-dl/issues/88"
}
|
gharchive/issue
|
Error. While downloading
./animepahe-dl.sh: line 294: 6270 Killed xargs -I {} -P "$(get_thread_number "$1")" bash -c 'url="{}"; file="${url##*/}.encrypted"; download_file "$url" "${op}/${file}"' < <(grep "^https" "$1")
[Process completed (signal 9) - press Enter]
Which anime and episode?
Hey @Anas-Mughal, following the question from @lord8266, are you able to download this episode using normal download mode without -t?
Hey @Anas-Mughal, following the question from @lord8266, are you able to download this episode using normal download mode without -t?
Yes
@Anas-Mughal which anime and episode? I can quickly check it.
@Anas-Mughal which anime and episode? I can quickly check it.
At the time I was downloading Parasyte. But now you can check by downloading Black Clover or Mob Psycho 100; the error is the same when downloading any anime.
Hey @Anas-Mughal, I tried the mentioned animes and I don't see the error. It seems the error [Process completed (signal 9) - press Enter] is related to termux on Android. If you are using termux to run the script, please try to use a smaller number with -t, or search online for this error and you may find a solution to this termux error.
Hey @Anas-Mughal, I tried the mentioned animes and I don't see the error. It seems the error [Process completed (signal 9) - press Enter] is related to termux on Android. If you are using termux to run the script, please try to use a smaller number with -t, or search online for this error and you may find a solution to this termux error.
Do you know how I can fix this? Earlier I used to download from Termux and this problem did not occur. I also reduced the value of the -t argument and the same problem occurs.
@Anas-Mughal, I don't know how since I don't have this problem... Here is a googled result that looks like a solution: https://github.com/agnostic-apollo/Android-Docs/blob/master/en/docs/apps/processes/phantom-cached-and-empty-processes.md#how-to-disable-the-phantom-processes-killing. You can try to search [Process completed (signal 9) - press Enter] and find more results. Good luck!
|
2025-04-01T06:37:08.068410
| 2024-04-19T16:33:33
|
2253424953
|
{
"authors": [
"KevinVoell",
"matthiasbeyer"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1445",
"repo": "KevinVoell/network_manager",
"url": "https://github.com/KevinVoell/network_manager/issues/12"
}
|
gharchive/issue
|
Version in Cargo.toml not matching released version
crates.io lists version 0.5.1, but the Cargo.toml in the repository says 0.5.0, where does that mismatch come from?
Pushed cargo.toml file change
|
2025-04-01T06:37:08.073646
| 2024-04-17T15:00:24
|
2248515171
|
{
"authors": [
"danielsaidi",
"rubenspessoa"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1446",
"repo": "KeyboardKit/KeyboardKit",
"url": "https://github.com/KeyboardKit/KeyboardKit/issues/710"
}
|
gharchive/issue
|
Autocomplete is not applied if the UITextDocumentProxy.currentWord is nil
Hi everyone,
First of all, congrats on the amazing library! Kudos to the team 😄
I am facing a small issue after updating my project from the 7th version to the 8th version of KeyboardKit.
In the application that I'm currently working on, we add default AutoComplete.Suggestion objects when the text field is empty. This behaviour is similar to what Apple does natively with their keyboard. For example, for the en-US locale and in an empty text field, the keyboard suggests "I", "The" and "I'm".
Using KeyboardKit 8+ I can add these options on the keyboard toolbar; however, I cannot apply these suggestions when there's no text or spaces in the text field yet. That's happening due to some guard checks in the code.
I investigated the code a bit and I believe KeyboardInputViewController.performAutocomplete could be slightly changed to handle cases where there's no text present yet. I have made the changes and tested them myself; it worked fine, but it would be nice to run some other tests to guarantee the functionality is not broken.
I appreciate your time reading my request and I apologize if there's a misunderstanding on my part. I'd be more than happy to help in case it's needed.
Cheers,
Rubens Pessoa
Hi @rubenspessoa
Thank you for reaching out with this!
In KK 8+, the controller resets the autocomplete context if the text is empty. While I'm sure the intention with this was good, it's still incorrect, since it's the responsibility of the autocomplete provider to decide this.
I will release this fix as 8.5.1. 👍
|
2025-04-01T06:37:08.087905
| 2024-12-01T19:41:15
|
2709751406
|
{
"authors": [
"HaraldNordgren",
"benjaminjkraft"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1447",
"repo": "Khan/genqlient",
"url": "https://github.com/Khan/genqlient/pull/365"
}
|
gharchive/pull-request
|
Move websocket headers to opt function 'WithWebsocketHeaders'
Follow-up up on the discussion in https://github.com/Khan/genqlient/pull/360#pullrequestreview-2471038655.
Move websocket headers to an opt function 'WithWebsocketHeaders'.
I have:
[x] Written a clear PR title and description (above)
[x] Signed the Khan Academy CLA
[x] Added tests covering my changes, if applicable
[x] Included a link to the issue fixed, if applicable
[x] Included documentation, for new features
[x] Added an entry to the changelog
@benjaminjkraft Here is a follow-up PR regarding our discussion here: https://github.com/Khan/genqlient/pull/360#pullrequestreview-2471038655
This is without tests now, which is not great. I'm struggling to find an easy way to test this.
Normally I would set up a middleware and set the authentication ctx key for the header. But it seems that the httptest server does not have middleware support out-of-the-box. Maybe I'm wrong?
Could be wise to hold off merging this until tests have been added 🤗
Thanks!
I think you should be able to do middleware? gqlgenServer is just an http.Handler, so if you have a middleware that's func(http.Handler) http.Handler you can just do NewServer(middleware(gqlgenServer)).
@benjaminjkraft Thanks! Tests added now!
|
2025-04-01T06:37:08.125177
| 2021-02-12T23:36:18
|
807614694
|
{
"authors": [
"billhollings"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1448",
"repo": "KhronosGroup/MoltenVK",
"url": "https://github.com/KhronosGroup/MoltenVK/pull/1266"
}
|
gharchive/pull-request
|
Remove official support for direct MSL shader loading from documentation.
Shader code should be submitted as SPIR-V. Although some simple direct MSL shaders may work,
direct loading of MSL source code or compiled MSL code is not officially supported at this time.
Future versions of MoltenVK may support direct MSL submission again.
Addresses issue #1253
It's really my fault we have to do this. I wanted to fix this, but I'm so busy with other stuff at the moment that I don't have time. I guess this will do for now.
Meh. Don't be too hard on yourself. Direct MSL has fallen behind along several dimensions. There have been previous issues raised around how to map Metal resource indexes in a more sophisticated manner. And future Metal argument buffer use will complicate it even further. I expect it will need a fair bit of work to recover effectively and continue to maintain.
I'm also interested in following up on the relatively new pipeline caching options available through Metal. If we can mesh that with Vulkan's pipeline caching, that might provide a more Vulkan-friendly and maintainable approach to improving shader conversion and compiling performance, and mitigate some of the need to directly support MSL.
|
2025-04-01T06:37:08.127066
| 2023-06-03T15:06:28
|
1739608888
|
{
"authors": [
"Try"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1449",
"repo": "KhronosGroup/SPIRV-Cross",
"url": "https://github.com/KhronosGroup/SPIRV-Cross/issues/2160"
}
|
gharchive/issue
|
MSL: vertex shader's return type is void
Hit a bug when the Metal vertex shader is generated:
Full shader: https://shader-playground.timjones.io/c11d83135a1eac44c5753e147eba5fe6
Relevant part:
vertex void main0(constant uint* spvBufferSizeConstants [[buffer(25)]], device Input& _10 [[buffer(0)]], uint gl_VertexIndex [[vertex_id]])
{
main0_out out = {};
constant uint& _10BufferSize = spvBufferSizeConstants[0];
_10.ssbo[int(gl_VertexIndex)] = uint(int((_10BufferSize - 0) / 4));
out.gl_Position = float4(0.0, 0.0, 0.0, 1.0);
// no return here, so gl_Position was discarded
}
My bad: I didn't know that the MSL version has to be set to 2.1 or newer for vertex-shader side effects to work.
|
2025-04-01T06:37:08.132573
| 2021-09-02T20:53:30
|
987154934
|
{
"authors": [
"dneto0",
"s-perron"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1450",
"repo": "KhronosGroup/SPIRV-Tools",
"url": "https://github.com/KhronosGroup/SPIRV-Tools/pull/4501"
}
|
gharchive/pull-request
|
fix parsing of bad binary exponents in hex floats
The binary exponent must have some decimal digits.
A + or - after the binary exponent digits should not be interpreted as
part of the binary exponent.
Fixes: #4500
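As an illustration of the grammar being enforced (this is not the SPIRV-Tools code itself), Python's float.fromhex follows the same hex-float rule: the p exponent marker must be followed by at least one decimal digit, and a bare trailing sign is not a valid exponent.

```python
def is_valid_hex_float(text: str) -> bool:
    """Return True if `text` parses as a complete hex-float literal."""
    try:
        float.fromhex(text)
        return True
    except ValueError:
        return False

# Well-formed: digits follow the binary exponent marker.
assert float.fromhex("0x1.8p3") == 12.0   # 1.5 * 2**3
assert is_valid_hex_float("0x1.8p+3")     # a signed exponent is fine

# Malformed: the exponent must have at least one decimal digit.
assert not is_valid_hex_float("0x1p")
assert not is_valid_hex_float("0x1p+")
```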
Amazingly, the CI-macos-clang-release build started only 30 minutes ago (roughly).
Amazingly, the CI-macos-clang-release build started only 30 minutes ago (roughly).
It had failed because of a machine issue, so I restarted it.
|
2025-04-01T06:37:08.145961
| 2023-08-21T13:36:38
|
1859392065
|
{
"authors": [
"SaschaWillems",
"lobneroO"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1451",
"repo": "KhronosGroup/Vulkan-Samples",
"url": "https://github.com/KhronosGroup/Vulkan-Samples/issues/783"
}
|
gharchive/issue
|
Vulkan Instance creation is broken when using VK_EXT_VALIDATION_FEATURES_EXTENSION_NAME
I am trying to run the vulkan ray_tracing_extended sample. This used to work a few weeks back (when it was still named raytracing_extended), but I pulled the new version today and now instance creation does not work anymore. This probably is not a problem of the sample but of the entire framework though.
The error I get is:
[error] [framework\platform\platform.cpp:169] Error Message: Could not create Vulkan instance : ERROR_EXTENSION_NOT_PRESENT
The problematic extension seems to be VK_EXT_VALIDATION_FEATURES_EXTENSION_NAME:
If I comment out enabled_extensions.push_back(VK_EXT_VALIDATION_FEATURES_EXTENSION_NAME); in Vulkan-Samples/framework/core/instance.cpp (on my current main branch, this is line 226), the instance creation throws no error.
I thought this was a driver issue at first, since the querying of the extension looks correct to me, but I have tried on the following hardware (both on Windows 11, Visual Studio 2022):
NVidia RTX A1000 Laptop GPU, Driver version 536.96 and
NVidia RTX 2080 Ti, Driver version 536.40
The behaviour is the same on both machines: it only works, if I comment out the push_back line.
Thanks for raising this issue. We are already aware of it and will fix this soon with #774.
#774 has been merged, and this should be fixed. If not, please feel free to open a new issue.
|
2025-04-01T06:37:08.150148
| 2022-10-28T00:28:37
|
1426412981
|
{
"authors": [
"charles-lunarg",
"ci-tester-lunarg"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1452",
"repo": "KhronosGroup/Vulkan-Tools",
"url": "https://github.com/KhronosGroup/Vulkan-Tools/pull/699"
}
|
gharchive/pull-request
|
Vulkaninfo: Escape json strings
Fix #696
Also:
Parse & print `VkPhysicalDeviceProperties:driverVersion` according to driver-specific formats. Falls back on the Vulkan Major.Minor.Patch if on an unknown platform.
Switch the `VkPhysicalDeviceProperties:apiVersion` printing to put the major.minor.patch outside of the parentheses. `apiVersion = 1.3.245 (232343235)` is how it is now printed, instead of `apiVersion = 232343235 (1.3.245)`.
Cleanup the logic around printing UUID arrays. Took some effort but uses the operator<< rather than requiring explicit conversion to strings.
CI Vulkan-Tools build queued with queue ID 22165.
CI Vulkan-Tools build # 819 running.
CI Vulkan-Tools build # 819 passed.
Added bob as a reviewer since he would like to know about any changes to the format vulkaninfo uses.
CI Vulkan-Tools build queued with queue ID 22858.
CI Vulkan-Tools build # 820 running.
CI Vulkan-Tools build # 820 passed.
|
2025-04-01T06:37:08.172388
| 2018-11-08T00:41:43
|
378530458
|
{
"authors": [
"SurlyBird",
"j-conrad",
"julienduroure",
"pafurijaz"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1453",
"repo": "KhronosGroup/glTF-Blender-IO",
"url": "https://github.com/KhronosGroup/glTF-Blender-IO/issues/73"
}
|
gharchive/issue
|
Sharp Edges Ignored
Exporter ignores manually-assigned hard/sharp edges in Blender 2.8, resulting in an 'all-soft' mesh.
Repro in Blender 2.80 Experimental with glTF-Blender-IO:
Create a cube
Display>Shading>Smooth Edges
In Edit mode, select all edges. Edge>Edge Data>Mark Sharp
Select the Object Data tab, Turn on Auto Smooth and set the angle to 180.
Edges of cube should appear hard/sharp in the viewport.
With cube selected File>Export glTF 2.0 (glb). Make sure Export normals and Export Tangents are checked.
Write file to disk.
Load file in sandbox.babylonjs.com/
Hard edges will not be preserved.
Contrast the same workflow with Blender 2.79 using glTF-Blender-Exporter: Hard edges are preserved in .glb written to disk.
Note: In 2.80, setting Display>Shading>Flat faces does export a .glb with hard edges, but all edges are hard. What we want is the ability to define custom hard/soft edges on any given mesh (like we can do in Blender 2.79 with the glTF-Blender-Exporter addon).
Example files attached.
cube_hard_soft.zip
Note that an issue has been opened in developer.blender.org regarding sharp edges management:
https://developer.blender.org/T58638
Apologies for adding this comment on multiple issues, but I've seen several issues discussing 'Apply Modifiers' as a solution to different problems. Unfortunately, it appears to be a less-than-ideal fix for certain situations like this:
It appears to be a known issue that one cannot export a model that has manually-defined smooth/hard edges unless 'Apply Modifiers' is turned on. (Doesn't make much sense considering defining smooth/hard edges can be done without the use of any modifiers, but 🤷♂️)
The problem I'm having now is that I'd like to be able to export my Shape Keys AND have proper smooth/hard edges. However, Shape Keys don't export when 'Apply Modifiers' is turned on, aka the opposite problem. (Also a bit of a misnomer since Shape Keys appear to have zero to do with modifiers.)
Does anyone know how to get around this?
I'm the author of the thread in Blender; here is a link to my model. I get some errors and I can't export: https://drive.google.com/open?id=1RnLZ1OHdxNRgTU9E_jKdSc-1zp5SogkH
Anyway, I noticed that when you export with the script with the apply modifiers option, I get a very clean result, with smooth and sharp edges correct; but if I do the conversion of the model manually and then export it, I get wrong shading, losing all sharp edges.
|
2025-04-01T06:37:08.221320
| 2018-11-27T21:16:44
|
384989419
|
{
"authors": [
"Wyqer",
"veteran29"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1454",
"repo": "KillahPotatoes/KP-Liberation",
"url": "https://github.com/KillahPotatoes/KP-Liberation/pull/538"
}
|
gharchive/pull-request
|
Virtual module - Scripted curator management
| Q | A |
| --- | --- |
| Bug fix? | no |
| New feature? | yes |
| Needs wipe? | yes |
| Fixed issues | #517 |
Description:
This module adds BIS Curator module as default mean of AI commanding for commander and sub-commander slots.
It offers three curator modes, every slot can be configured to use different mode via CBA settings.
Content:
[x] Removed curator modules from mission file
[x] Created virtual module in liberation framework
[x] Full curator mode
[x] Limited mode with free camera movement
[x] Limited mode with locked camera
[x] No curator mode
[x] Changed build module pos load/save to use posWorld
[x] Added common_getFobAlphabetName
Successfully tested on:
[x] Local MP Vanilla
[ ] Dedicated MP Vanilla
Compatibility checked with:
NOTHING
Conflicts and tests in dedicated environment? 🙂
Conflicts resolved.
I will test it later on dedicated, if something will not work on it the code won't change much anyway so this can be reviewed anyway IMO.
|
2025-04-01T06:37:08.240291
| 2021-09-16T18:54:48
|
998528737
|
{
"authors": [
"KinDR007",
"lucasimons",
"syssi"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1455",
"repo": "KinDR007/VictronMPPT-ESPHOME",
"url": "https://github.com/KinDR007/VictronMPPT-ESPHOME/issues/21"
}
|
gharchive/issue
|
Some errors
Hi, I am using this software with a 100/20 MPPT and I have encountered problems with decimals. Let me explain better: sometimes, for example, the battery voltage is 26 V but the sensor sends 0.26 V. This also happens with the total energy. I have already put in a filter that averages, but it is not enough.
Could you try to provide the raw data of your solar charger?
@lucasimons
may i ask how are you wiring ?
I have connected the TX to D7 and GND to GND. I would not want my NodeMCU with the ESP8266 to struggle, because it actually sends a lot of data; is it not possible to decrease the refresh rate? As soon as I get one, I will use an ESP32 to see if that solves it.
@lucasimons
power supply how did you deal with? external source? or stepdown from Battery
Yes, I have the 24 V battery and a Victron Orion 24 V to 12 V converter, and then I connected a Quick Charge 3.0 module that powers a Raspberry Pi and the NodeMCU. However, I think it's like putting a step-down from 24 V to 5 V.
Ok,
Please disconnect GND from the VE.Direct port and try
(VE.Direct connected with TX only; power supply VCC and GND from the step-down)
|
2025-04-01T06:37:08.255847
| 2022-09-13T14:28:57
|
1371551744
|
{
"authors": [
"bjungbogati",
"liero0",
"morhadi"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1457",
"repo": "Kingsman44/Pixelify",
"url": "https://github.com/Kingsman44/Pixelify/issues/121"
}
|
gharchive/issue
|
Solved: How to install on Windows 11
Thanks for this pixel hack, really appreciated. I already tried this on Android X86 9.0 on VirtualBox and it works fine.
Could someone help with installing this on Windows Subsystem for Android?
I installed the Android for Windows 11 using the instructions on this site:
https://www.androidsage.com/2022/06/28/download-wsa-android-12-1-with-magisk-root-google-play-store/
Then I installed Magisk using these instructions:
https://www.getdroidtips.com/root-windows-subsystem-for-android-via-magisk/
So, now I have a fully working WSA with Google services and Magisk works, but how to flash Pixelify? The 'normal' version stops with an error about volume keys, so I think I need to download Pixelify-v2.1-no-VK.zip.
How about installation? I already copied the config.prop to the internal storage folder, but then what? This is clearly not enough. Then I flashed the zip with Magisk, but I still don't have unlimited storage in Google Photos (I did erase the Photos app's data after flashing).
EDIT:
I got this. I forgot to enable 'zygisk'. Now it works.
I have Zygisk enabled, but every time I flash the module and reboot, Magisk shows nothing in installed modules. Do you have any idea?
After reading logs I can see, It is showing 2 errors
Installing Google Photos from backups
Failure [INSTALL_FAILED_VERSION_DOWNGRADE: Downgrade detected: Update version code 48627807 is older than current 48849424]
Please Disable Google Photos Auto Update on Playstore
chmod: /data/data/com.google.android.dialer/files/phenotype: No such file or directory
Google is installed.
Comment : This shows even though auto update is turned off on play store
Try using release WSA_2311.40000.5.0_x64_Release-Nightly-with-Magisk-26.4-stable-MindTheGapps-13.0.7z
from https://github.com/MustardChef/WSABuilds/releases
Every time I flash the module and reboot, Magisk shows nothing in installed modules.
Update:
If I don't reboot, it shows Pixelify in installed modules and works fine, but if I close WSA once or reboot, Magisk doesn't show it in installed modules.
|
2025-04-01T06:37:08.267149
| 2022-10-24T14:56:49
|
1420971834
|
{
"authors": [
"Kiril95",
"Vancho99"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1458",
"repo": "Kiril95/EntertainmentHub",
"url": "https://github.com/Kiril95/EntertainmentHub/issues/1"
}
|
gharchive/issue
|
Unnecessary setting
You can safely remove .UseQueryTrackingBehavior(QueryTrackingBehavior.NoTracking); from your contacts tests inside the ContactServiceTests.cs file, because you are detaching the entity from the context on line 150
Okaaaay. 10x.
|
2025-04-01T06:37:08.276148
| 2023-03-10T23:34:13
|
1619756340
|
{
"authors": [
"Kirill5k",
"ioleo"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1459",
"repo": "Kirill5k/mongo4cats",
"url": "https://github.com/Kirill5k/mongo4cats/pull/27"
}
|
gharchive/pull-request
|
WIP ZIO JSON
This is work in progress, missing (and failing) tests. Feel free to contribute @Kirill5k I've invited you to edit my fork.
I needed to add snapshot dependency to ZIO JSON as some useful methods are not yet available in latest stable release.
Fixes #26
@Kirill5k can you take a look? there is some error when running testOnly JsonMapperSpec, but it's late and my mind is no longer working xD perhaps a fresh view will reveal the bug
For some reason
Some("{"_id": {"$oid": "640c698699f7394fe1fa68b9"}, "string": "string", "null": null, "boolean": true, "long": 1.678535046665E12, "int": 1.0, "bigDecimal": 100.0, "array": ["a", "b"], "dateInstant": {"$date": "2023-03-11T11:44:06.665Z"}, "dateEpoch": {"$date": "2023-03-11T11:44:06.665Z"}, "dateLocalDate": {"$date": "2022-01-01T00:00:00Z"}, "document": {"field1": "1", "field2": 2.0}}")
was not equal to
Some("{"_id": {"$oid": "640c698699f7394fe1fa68b9"}, "string": "string", "null": null, "boolean": true, "long":<PHONE_NUMBER>665, "int": 1, "bigDecimal": {"$numberDecimal": "100.0"}, "array": ["a", "b"], "dateInstant": {"$date": "2023-03-11T11:44:06.665Z"}, "dateEpoch": {"$date": "2023-03-11T11:44:06.665Z"}, "dateLocalDate": {"$date": "2022-01-01T00:00:00Z"}, "document": {"field1": "1", "field2": 2}}"
Int is represented as 1.0, long appears once in normal notation and once in scientific notation, and BigDecimal is weird.
I'm not sure what's going on. I suppose it's related to the internal representation of Json.Num, which is BigDecimal.
Now only BigDecimal is an issue; int and long are fixed.
Some("{"_id": {"$oid": "640c80dda25fe1341f6b231c"}, "string": "string", "null": null, "boolean": true, "long":<PHONE_NUMBER>538, "int": 1, "bigDecimal": 100, "array": ["a", "b"], "dateInstant": {"$date": "2023-03-11T13:23:41.538Z"}, "dateEpoch": {"$date": "2023-03-11T13:23:41.538Z"}, "dateLocalDate": {"$date": "2022-01-01T00:00:00Z"}, "document": {"field1": "1", "field2": 2}}")
was not equal to
Some("{"_id": {"$oid": "640c80dda25fe1341f6b231c"}, "string": "string", "null": null, "boolean": true, "long":<PHONE_NUMBER>538, "int": 1, "bigDecimal": {"$numberDecimal": "100.0"}, "array": ["a", "b"], "dateInstant": {"$date": "2023-03-11T13:23:41.538Z"}, "dateEpoch": {"$date": "2023-03-11T13:23:41.538Z"}, "dateLocalDate": {"$date": "2022-01-01T00:00:00Z"}, "document": {"field1": "1", "field2": 2}}")
Hmm OK, it seems the issue is that JsonNumSyntax#toBsonValue will interpret BigDecimal(100.0) as isValidInt and encode it as BsonValue.int (that's why we see just 100); on the other hand, we have a JsonEncoder that directly encodes BigDecimal(100.0) (not cast to int), hence the difference. I wonder how it worked with circe.
Fixed. However, in circe toBigDecimal returned an Option; if it failed, they defaulted to BsonValue.double.
In our case Json.Num internally is already a BigDecimal, so it will always map to BsonValue.bigDecimal; there is no case for BsonValue.double.
The test case succeeds, but I guess only because we don't check double values. I'm not sure if that's OK.
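The dispatch described in this thread can be sketched in stdlib Python (the function name and tuple tags below are illustrative, not the actual mongo4cats API): a number with no fractional part that fits an Int collapses to an integer on the BSON side, while the JSON encoder keeps it as a decimal, hence the mismatch.

```python
from decimal import Decimal

def num_to_bson_value(n: Decimal):
    # Mirrors the isValidInt / isValidLong dispatch discussed above
    # (illustrative names; not the real JsonNumSyntax#toBsonValue).
    if n == n.to_integral_value():  # no fractional part, e.g. 100.0
        i = int(n)
        if -2**31 <= i < 2**31:
            return ("int32", i)     # BigDecimal(100.0) lands here, printed as 100
        if -2**63 <= i < 2**63:
            return ("int64", i)
    return ("decimal128", n)        # only truly fractional values survive
```

Under this sketch there is no branch producing a double, matching the observation that Json.Num always maps to either an integer or a decimal.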
MongoCollectionSpec and MongoJsonCodecsSpec are failing.
This looks very good. I will give it a better look after several hours when I am free.
It looks like you forgot to commit Dependencies.scala changes
This is puzzling me now. The only difference I see is the missing quotation marks on the left-hand side (or the extra ones on the right-hand side, depending how you look at it).
I have no idea why it is the way it is.
{"$oid":"640ce1fcc850af29e11314c9"}
was not equal to
"{"$oid":"640ce1fcc850af29e11314c9"}"
It was a CharSequence; calling .toString solved the problem.
@Kirill5k all tests are passing :tada: Please review.
Hmm, a StackOverflowError on the CI. I wonder why; there is no issue locally (using JDK 17).
@Kirill5k can we merge this and cut a release?
Sure. all looks good to me.
Will do a release later today.
|
2025-04-01T06:37:08.280113
| 2017-10-15T02:32:30
|
265540222
|
{
"authors": [
"KirkMcDonald",
"MakerBurst"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1460",
"repo": "KirkMcDonald/kirkmcdonald.github.io",
"url": "https://github.com/KirkMcDonald/kirkmcdonald.github.io/issues/55"
}
|
gharchive/issue
|
Suggestion: Accept donations
I wanted to give you $5 for this awesome tool, but I couldn't find any way to do so. A donate link would be nice.
P.S. I've used many other factorio calculators in the past, but this was by far the best one, and the only one that really gave me what I wanted. Thank you!
I have added a link to my Patreon page in e81df4f1a1e0c5f113f393df233ac50d31b674d7.
|
2025-04-01T06:37:08.282438
| 2018-07-18T00:34:25
|
342135094
|
{
"authors": [
"KirkMcDonald",
"terite"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1461",
"repo": "KirkMcDonald/kirkmcdonald.github.io",
"url": "https://github.com/KirkMcDonald/kirkmcdonald.github.io/issues/91"
}
|
gharchive/issue
|
Out of memory error while trying to build seablock recipes
I loaded Seablock data into this calculator, but I encounter out-of-memory errors in Chrome during the simplex method.
The OOM seems to be related to BigInteger (either the object size or the garbage created). I encounter it pretty frequently while doing matrix manipulation in the simplex method.
This bug report lacks details, so it is difficult to say with certainty what the problem is. However, I can make an educated guess.
The current algorithm for detecting which portions of the recipe graph require representation with a linear program is a hack that works for the vanilla graph, but which I fully expect to fail catastrophically with other, more complex recipe graphs. Replacing this algorithm is one of the major barriers to supporting Bob's Mods (see also #35), among other alternate recipe graphs.
These sorts of failures could potentially take many forms. What you describe sounds like some sort of infinite recursion, which continually multiplies some numbers together until it OOMs.
I am closing this issue. Seablock is not supported at this time, and loading its recipe graph into the calculator is not expected to work. This may change in the future, but the relevant efforts are already covered by #35.
|
2025-04-01T06:37:08.338274
| 2022-12-13T13:14:21
|
1494217921
|
{
"authors": [
"0xAeterno"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1462",
"repo": "KlimaDAO/klimadao",
"url": "https://github.com/KlimaDAO/klimadao/issues/820"
}
|
gharchive/issue
|
[retirements] Wrap retirement details
Lengthy retirement details break page layout spacing.
Wrapping the text should fix this, but I'm curious how the page would look 🤔
Questions:
What string length validation do we have on beneficiaryName?
Retirement to validate changes against:
https://www.klimadao.finance/retirements/carboncar.klima/1
Might no longer be an issue with https://github.com/KlimaDAO/klimadao/pull/834
|
2025-04-01T06:37:08.364199
| 2024-09-12T04:16:29
|
2521300415
|
{
"authors": [
"Knightro63",
"RansomBroker"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1463",
"repo": "Knightro63/three_js",
"url": "https://github.com/Knightro63/three_js/issues/10"
}
|
gharchive/issue
|
example using VideoTexture from live feed camera
I am facing this problem: E/flutter ( 983): [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: NoSuchMethodError: Class 'VideoTexture' has no instance method '[]'.
It happens when passing the camera to VideoTexture from Flutter.
HI @RansomBroker,
There is currently a bug with VideoTexture. I have resolved it but have not pushed to pub.dev because of other things that are being worked on as well.
Please use CanvasTexture as a workaround for the moment. Here is an example of how to do that.
Hope this helps.
When I use that code example, the image shows up like that.
Hi @RansomBroker,
I am sorry for the issue. This is the only way currently to use the video feed.
What platform are you using?
I just tested it on Mac and it works fine, but I have not tested it on Android or web.
There is currently an issue with windows and camera feeds. I will work on it soon.
Sorry for the issue. I hope to resolve it soon.
Yeah, I'm testing it on Android and desktop.
Will VideoTexture be released in the near future, or a webcam update for those issues on Android and web?
Hi @RansomBroker,
I have updated the examples to fix the issue you are having with the android and web camera versions.
Hope this helps.
Hi @RansomBroker,
I think we should be able to close this issue.
|
2025-04-01T06:37:08.365641
| 2022-07-11T15:29:53
|
1300878675
|
{
"authors": [
"GerkinDev",
"mokone91"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1464",
"repo": "KnodesCommunity/typedoc-plugins",
"url": "https://github.com/KnodesCommunity/typedoc-plugins/issues/119"
}
|
gharchive/issue
|
v0.23 support
Hi!
Do you plan to support the current newest version?
Thanks!
Hi, yes, I'm currently working on it. There are a couple of non-trivial changes that still requires work, but I'm getting there
Released in v0.23.0
|
2025-04-01T06:37:08.379652
| 2020-04-02T17:43:58
|
592814061
|
{
"authors": [
"LucaCappelletti94",
"deepakunni3",
"justaddcoffee"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1465",
"repo": "Knowledge-Graph-Hub/kg-covid-19",
"url": "https://github.com/Knowledge-Graph-Hub/kg-covid-19/issues/63"
}
|
gharchive/issue
|
Speed up STRING ingest
Luca said he might want to try speeding up STRING ingest, possibly by parallelization.
Here is the ingest in question.
Possibly could parallelize here while reading lines, or maybe, as @deepakunni3 said, this utility function could be sped up.
Some questions on the utility: is there any reason why it returns None?
> Some questions on the utility: is there any reason why it returns None?
No reason, other than to return a type of some kind for type-checking. Could refactor to return a boolean I guess?
Though it is possible to parse the lines using multiprocessing, it is not possible to write them in the same way. How much memory do you think it would take to keep them all in memory and write them out after the first read has finished?
> Though it is possible to parse the lines using multiprocessing it is not possible to write the lines in the same way, how much memory do you think would it take to keep them in memory and write them all after the first read has finished?
Not sure, but you could get some idea by running:
python run.py transform -s StringTransform
then looking at the size of the files generated:
ls -lh data/transformed/STRING/
At least a few gigs I bet
The file size is at 122M. So ingesting that into an in-memory data structure shouldn't be a problem at the moment. But note that we are dealing with just a subset of the original data by restricting to human PPIs.
The actual master file is 68GB (compressed). I doubt we will ever need to parse all of the file. Just highlighting the upper bounds of memory requirements.
Hi @LucaCappelletti94 - for clarity, can you confirm you are working on this? @deepakunni3 said he could handle this if you are not. Just trying to avoid duplicating effort
Yes, it should be done by the end of the morning.
I see that the same "seen" list is used to check for both the already-seen proteins and genes: could this cause a name clash?
I'm working on this issue here.
I have refactored the code and replaced the lists with sets, so now that aspect should run a bit faster. I am not sure if parallelizing the I/O can add a significant speedup as the remaining data elaboration is minimal. What do you think?
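The refactor amounts to swapping the seen list for a set; a minimal sketch of the dedup pattern (simplified, with hypothetical names, not the actual StringTransform code):

```python
def collect_new_nodes(rows):
    # Membership tests against a set are O(1) on average,
    # versus O(n) for a list, which dominated the old runtime.
    seen = set()   # was: seen = []
    nodes = []
    for protein1, protein2 in rows:
        for node in (protein1, protein2):
            if node not in seen:
                seen.add(node)
                nodes.append(node)
    return nodes
```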
@LucaCappelletti94 There shouldn't be a name clash for seen. The identifiers are mutually exclusive for gene and protein
> I have refactored the code and replaced the lists with sets, so now that aspect should run a bit faster. I am not sure if parallelizing the I/O can add a significant speedup as the remaining data elaboration is minimal. What do you think?
Thanks Luca! A very dramatic speed-up, STRING ingest is now down to 2m!
(venv) ~/PycharmProjects/kg-emerging-viruses *speed_up_string $ time python run.py transform -s StringTransform
WARNING:ToolkitGenerator:class "pairwise interaction association" slot "interacting molecules category" does not reference an existing slot. New slot was created.
WARNING:ToolkitGenerator:Unrecognized prefix: SEMMEDDB
[snip]
INFO:root:Parsing StringTransform
[transform.py][ transform] INFO: Parsing StringTransform
real 2m19.998s
user 2m14.680s
sys 0m2.388s
It almost leaves me with the doubt that I have done something wrong with the code: does the output file look okay? I think that, other than the pythonification of the code, the only real source of speedup was replacing the list with a set.
If this helps:
edges.tsv produced from your branch and from master is exactly the same (see m5 hash below, bak/ dir is produced from master).
Your nodes.tsv has fewer entries than master, but the few I checked out are just extra ENSEMBL ids not mentioned in edges.tsv
(venv) ~/PycharmProjects/kg-emerging-viruses $ ls -l data/transformed/STRING/*.tsv data/transformed/STRING/bak/*tsv
-rw-r--r-- 1 jtr4v staff<PHONE_NUMBER> Apr 3 11:57 data/transformed/STRING/bak/edges.tsv
-rw-r--r-- 1 jtr4v staff 2456430 Apr 3 11:57 data/transformed/STRING/bak/nodes.tsv
-rw-r--r-- 1 jtr4v staff<PHONE_NUMBER> Apr 5 09:12 data/transformed/STRING/edges.tsv
-rw-r--r-- 1 jtr4v staff 2391930 Apr 5 09:12 data/transformed/STRING/nodes.tsv
(venv) ~/PycharmProjects/kg-emerging-viruses $ md5 data/transformed/STRING/*.tsv data/transformed/STRING/bak/*tsv
MD5 (data/transformed/STRING/edges.tsv) = c960835ae79f6235d8b4a7be6ce4372f
MD5 (data/transformed/STRING/nodes.tsv) = 21ea74367b01086301f4a9dfba6e1bb9
MD5 (data/transformed/STRING/bak/edges.tsv) = c960835ae79f6235d8b4a7be6ce4372f
MD5 (data/transformed/STRING/bak/nodes.tsv) = 408ed8393fbce7abcf20864cee5bbd77
Ok! That seems promising; I was worried that the speedup was caused by some coding mistake, such as an if-condition that was never met. Out of curiosity, how much time was required on the same machine before?
> Out of curiosity, how much time was it required on the same machine before?
14 hours on my laptop - a very dramatic speed-up
Wow, @LucaCappelletti94 This is amazing! A simple change made all the difference!
For posterity: In Python, lists are slow in lookup for membership. Whereas, sets are dramatically faster because of the way set stores its values: hash tables.
> For posterity: In Python, lists are slow in lookup for membership. Whereas, sets are dramatically faster because of the way set stores its values: hash tables.
Good to know - I didn't realize either how much quicker sets are
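The gap is easy to measure with the stdlib alone (the sizes here are illustrative, far smaller than the real STRING data):

```python
import timeit

n = 20_000
items = [f"ENSP{i:011d}" for i in range(n)]
as_list, as_set = items, set(items)
probe = items[-1]  # worst case for the list: scans all n entries

t_list = timeit.timeit(lambda: probe in as_list, number=200)
t_set = timeit.timeit(lambda: probe in as_set, number=200)
print(f"list: {t_list:.4f}s  set: {t_set:.6f}s")
```

Each hit against the list scans up to n entries, so total work grows quadratically as the seen collection fills; the hash-based set stays near-constant per lookup.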
Should I proceed with the pull request?
@LucaCappelletti94 Yes, the output looks proper. Feel free to make a PR
|
2025-04-01T06:37:08.384412
| 2024-08-25T13:57:31
|
2485275426
|
{
"authors": [
"gsteckman",
"mediavrog"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1466",
"repo": "KoalaPlot/koalaplot-core",
"url": "https://github.com/KoalaPlot/koalaplot-core/pull/83"
}
|
gharchive/pull-request
|
feat(PieChart): mark proper composable scope for labels and holeContent
To be able to use the proper modifiers for alignment etc., mark the pie composable lambdas for label and holeContent with BoxScope.
I think this is a good idea for the hole content, since there's a clear "box" that fits inside the hole.
For the labels, they are positioned so the left/right edge is next to the label connector depending on if they are on the right/left side of the pie. What's the use case for having them in a Box, and what would be the bounds of the box for each label?
I guess it is not that useful for labels; I just saw they were wrapped in a Box and wanted to make sure the proper scope is set. But I also don't see a clear use case, so I will update this PR to target only holeContent.
Or if https://github.com/KoalaPlot/koalaplot-core/pull/85 is a candidate for merging, we can just close this PR.
Yes let's use #85 since that also adds the content padding.
|
2025-04-01T06:37:08.492961
| 2021-09-29T02:37:15
|
1010414066
|
{
"authors": [
"Kolkir",
"jcyhcs"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1467",
"repo": "Kolkir/superpoint",
"url": "https://github.com/Kolkir/superpoint/issues/1"
}
|
gharchive/issue
|
how can i got superpoint.pt files?
Hi, professor:
Thanks for your code! The examples require a superpoint.pt file. Where can I find it?
Please!
Hello,
If you are talking about the main.cc file, then it's assumed that you've exported superpoint.pt from the Python program as a PyTorch jit script. You can see how it can be done in the inferencewrapper.py.
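For reference, exporting a model as a TorchScript file looks roughly like this (TinyNet is a stand-in, not the actual SuperPoint network; see inferencewrapper.py for the real export):

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    # Stand-in module; replace with the real SuperPoint network in eval mode.
    def forward(self, x):
        return x * 2

scripted = torch.jit.script(TinyNet().eval())
scripted.save("superpoint.pt")  # loadable from C++ via torch::jit::load
```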
|
2025-04-01T06:37:08.557723
| 2023-08-10T14:54:45
|
1845361510
|
{
"authors": [
"barockok",
"gszr"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1468",
"repo": "Kong/go-pdk",
"url": "https://github.com/Kong/go-pdk/pull/157"
}
|
gharchive/pull-request
|
Fixes Validate Client.Authenticate when given
Fixes Validate Client.Authenticate when given consumer or credential both nil
Fixes #153
Closing this one in favor of https://github.com/Kong/go-pdk/pull/153.
@barockok Thank you for your PR! We really appreciate it.
|
2025-04-01T06:37:08.562960
| 2015-03-13T01:48:14
|
60957510
|
{
"authors": [
"darrenjennings",
"thibaultCha",
"thibaultcha"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:1469",
"repo": "Kong/httpsnippet",
"url": "https://github.com/Kong/httpsnippet/issues/28"
}
|
gharchive/issue
|
[objc/native] Improvements
I want to implement a couple more things:
About literals (I like literals because they generate less cluttered code snippets and allow more flexibility to the people pasting them):
[ ] Add a literals or verbose flag (verbose would be more consistent with other targets): if true, generate literals from parameters/headers as currently; if false, just use the objects computed by httpsnippet. It would be a nice option for people who don't want too much verbosity at the cost of flexibility for "pasters".
[x] Make body parameters literals (not sure about this one: very verbose)
[x] Make querystring parameters literals (not sure about this one either: very verbose)
About code style (JS):
[x] Comment what's going on in the generation (native.js)
[x] Improve the code climate rating (:cry:)
About improvements that could be made but are waiting on general guidelines:
[ ] Parse the response body from NSData. Need to wait on response definition for now.
[ ] Add an option for explanatory comments generation? Depends if other languages will do that too in the future. It could be a new guideline.
About making literal querystrings, a huge downside would be the extra verbosity:
NSURLComponents *components = [[NSURLComponents alloc] init];
NSURLQueryItem *q1 = [NSURLQueryItem queryItemWithName:@"foo" value:@"bar"];
NSURLQueryItem *q2 = [NSURLQueryItem queryItemWithName:@"hello" value:@"world"];
components.queryItems = @[ q1, q2 ];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:components.URL
cachePolicy:NSURLRequestUseProtocolCachePolicy
timeoutInterval:10.0];
// execute request through NSURLSession
Not sure about this. In the same way, multipart requests are already very verbose in order to build the body.
@thibaultcha is this resolved?
@darrenjennings No way I could remember, sorry. Better close it and move on.
Sounds like a plan. Thanks!
|