| added (string; date, 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | created (timestamp[us]; date, 2001-10-09 16:19:16 to 2025-01-01 03:51:31) | id (string; length 4 to 10) | metadata (dict) | source (string; 2 classes) | text (string; length 0 to 1.61M) |
|---|---|---|---|---|---|
2025-04-01T06:41:07.390145
| 2024-04-16T06:48:14
|
2245250253
|
{
"authors": [
"bnpfeife"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12423",
"repo": "influxdata/influx-cli",
"url": "https://github.com/influxdata/influx-cli/pull/541"
}
|
gharchive/pull-request
|
chore: upgrade to go 1.21.9
Can we go to 1.21.9 (the latest 1.21 patch at the time of this review)? Otherwise, LGTM
Oops! Thanks, I've updated it to 1.21.9!
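For context, this kind of toolchain bump is typically a one-line go.mod change; a sketch (the module path is assumed from the repo name, not taken from the actual diff):

```
module github.com/influxdata/influx-cli/v2

go 1.21.9
```

Since Go 1.21, the `go` directive may include a patch version, so pinning 1.21.9 directly is valid.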
|
2025-04-01T06:41:07.402352
| 2023-03-18T11:15:20
|
1630281113
|
{
"authors": [
"bosd"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12424",
"repo": "invoice-x/invoice2data",
"url": "https://github.com/invoice-x/invoice2data/pull/498"
}
|
gharchive/pull-request
|
Support coerce types in table plugin
Quickly drafted this; did not test it yet. Seems good, tests are green :smile:
|
2025-04-01T06:41:07.405388
| 2024-04-05T22:05:55
|
2228869608
|
{
"authors": [
"Uzhgsm01",
"abdullahdevrel"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12425",
"repo": "ipinfo/cli",
"url": "https://github.com/ipinfo/cli/issues/216"
}
|
gharchive/issue
|
brew install ipinfo-cli
Homebrew includes IPinfo's CLI. It is called ipinfo-cli.
Here is the link to the page: https://formulae.brew.sh/formula/ipinfo-cli#default
|
2025-04-01T06:41:07.407792
| 2024-04-06T12:57:13
|
2229231362
|
{
"authors": [
"AlvinRey",
"rabbitism"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12426",
"repo": "irihitech/Ursa.Avalonia",
"url": "https://github.com/irihitech/Ursa.Avalonia/pull/208"
}
|
gharchive/pull-request
|
Fix Ipv4
As discussed with @AlvinRey , PR https://github.com/irihitech/Ursa.Avalonia/pull/207 will continue in this one.
|
2025-04-01T06:41:07.439944
| 2023-12-30T05:01:00
|
2060634487
|
{
"authors": [
"j2kun"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12427",
"repo": "j2kun/todo-backlinks",
"url": "https://github.com/j2kun/todo-backlinks/issues/2"
}
|
gharchive/issue
|
Support more regex patterns
This issue has 1 outstanding TODO:
entrypoint.py:26 : Support more regex patterns
This comment was autogenerated by todo-backlinks
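A tool like todo-backlinks scans source lines for TODO comments and records their locations. A minimal sketch of that kind of scan (the pattern and function name are assumptions, not the actual todo-backlinks regex):

```python
import re

# Assumed pattern: match "# TODO" comments and capture the trailing text.
TODO_RE = re.compile(r"#\s*TODO[:(]?\s*(?P<text>.+)")

def find_todos(source: str):
    """Return (line_number, text) pairs for TODO comments in source."""
    todos = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        m = TODO_RE.search(line)
        if m:
            todos.append((lineno, m.group("text").strip()))
    return todos
```

Supporting "more regex patterns" would then amount to trying a list of compiled patterns per line instead of this single one.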
|
2025-04-01T06:41:07.447012
| 2023-02-07T09:47:32
|
1574007387
|
{
"authors": [
"Olen",
"longstone"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12428",
"repo": "jaroslawhartman/withings-sync",
"url": "https://github.com/jaroslawhartman/withings-sync/pull/103"
}
|
gharchive/pull-request
|
add documentation for cron and docker-compose
An alternative way that might be better is to just run crond as the entrypoint and add a random minute:
diff --git a/Dockerfile b/Dockerfile
index 7017e13..08392ed 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -14,4 +14,6 @@ COPY . /src
RUN cd /src && \
python3 ./setup.py install
-ENTRYPOINT ["withings-sync"]
+RUN echo "$(( $RANDOM % 59 + 0 )) */2 * * * withings-sync" >> /var/spool/cron/crontabs/root
+
+ENTRYPOINT ["crond", "-f", "-l", "5", "-L", "/dev/stdout"]
This will run the cron job every 2 hours at a random minute, so that not everyone runs it at the same time.
crond -f = run crond in the foreground
-l 5 = crond log level (0 = debug, 8 = almost nothing)
-L /dev/stdout = log to stdout (which means the logs are visible with docker logs)
I really like the part with the random minutes!! I'll change that.
I won't change the Dockerfile, as I do not want to change the default behavior; instead I'll override the entrypoint and point it to a script.
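The script-based alternative can be sketched in docker-compose terms, overriding the image's entrypoint without touching the Dockerfile (service name, image tag, and script path are hypothetical, not from this PR):

```yaml
services:
  withings-sync:
    image: ghcr.io/jaroslawhartman/withings-sync:latest   # assumed tag
    volumes:
      - ./entrypoint.sh:/entrypoint.sh:ro   # hypothetical wrapper script
    entrypoint: ["/bin/sh", "/entrypoint.sh"]   # overrides the image default
```

This keeps the published image's default behavior intact while letting one deployment opt into the crond-based schedule.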
|
2025-04-01T06:41:07.457337
| 2020-06-13T00:10:31
|
638057463
|
{
"authors": [
"jebunnasrin"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12429",
"repo": "jebunnasrin/jn-d8-composer-project",
"url": "https://github.com/jebunnasrin/jn-d8-composer-project/pull/14"
}
|
gharchive/pull-request
|
Update Composer dependencies (2020-06-13-00-10)
Visual regression test passed!
View the visual regression test report
|
2025-04-01T06:41:07.459130
| 2023-10-07T19:28:33
|
1931474546
|
{
"authors": [
"ocrosby"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12430",
"repo": "jedi-knights/scaffit",
"url": "https://github.com/jedi-knights/scaffit/pull/3"
}
|
gharchive/pull-request
|
docs: add status badges
:tada: This PR is included in version 1.0.0 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket:
|
2025-04-01T06:41:07.521708
| 2023-04-01T10:54:16
|
1650417536
|
{
"authors": [
"jonnymaserati"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12431",
"repo": "jonnymaserati/welleng",
"url": "https://github.com/jonnymaserati/welleng/pull/154"
}
|
gharchive/pull-request
|
Try and get regular pip install working in colab again
Seems to be working... couldn't see any DeprecationWarnings.
|
2025-04-01T06:41:07.702561
| 2024-11-07T19:31:49
|
2641984473
|
{
"authors": [
"jzombie"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12432",
"repo": "jzombie/etf-matcher",
"url": "https://github.com/jzombie/etf-matcher/pull/142"
}
|
gharchive/pull-request
|
Reuse TickerDetail layouts for bucket views
@coderabbitai, this PR is growing to be quite large, and I still have a ways to go. Can you give me a preliminary review that is very not nitpicky? I will likely do some refactoring later, but right now I just want to get this functionality moving forward.
|
2025-04-01T06:41:07.712876
| 2024-01-07T22:25:08
|
2069317319
|
{
"authors": [
"kadhirvelm"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12433",
"repo": "kadhirvelm/the-tower-of-cultivation",
"url": "https://github.com/kadhirvelm/the-tower-of-cultivation/pull/80"
}
|
gharchive/pull-request
|
Add concept of NPCs
[!WARNING]
This pull request is not mergeable via GitHub because a downstack PR is open. Once all requirements are satisfied, merge this PR as a stack on Graphite.
Learn more
Current dependencies on/for this PR:
main
PR #79
PR #80 👈
PR #81
This stack of pull requests is managed by Graphite.
Merge activity
Jan 7, 5:25 PM: @kadhirvelm started a stack merge that includes this pull request via Graphite.
|
2025-04-01T06:41:07.763751
| 2022-10-07T15:24:23
|
1401353081
|
{
"authors": [
"KernelDeimos",
"jlhughes"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12434",
"repo": "kgrgreer/foam3",
"url": "https://github.com/kgrgreer/foam3/pull/2108"
}
|
gharchive/pull-request
|
Separate MutableX and ProxyX
@KernelDeimos the intent was to just use the MutableX in the boot sequence, it should be the only place it is needed.
|
2025-04-01T06:41:07.779780
| 2023-01-08T07:01:04
|
1524365938
|
{
"authors": [
"DeniseSkidmore"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12435",
"repo": "komsa-ag/Camunda.Api.Client",
"url": "https://github.com/komsa-ag/Camunda.Api.Client/pull/20"
}
|
gharchive/pull-request
|
Add missing ids in Variable Get return data.
looks right to me
Do you have the power to approve it?
|
2025-04-01T06:41:07.782415
| 2024-07-01T16:44:31
|
2384282932
|
{
"authors": [
"shawn-hurley"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12436",
"repo": "konveyor/tackle2-addon-analyzer",
"url": "https://github.com/konveyor/tackle2-addon-analyzer/pull/101"
}
|
gharchive/pull-request
|
:bug: During source only analysis, we should not also set the scope.
fixes for https://issues.redhat.com/browse/MTA-3169
@jortel This should be changed once success with errors is in so that we can display that the error is running depth retrieval.
|
2025-04-01T06:41:07.790916
| 2022-07-21T08:50:49
|
1312906869
|
{
"authors": [
"codecov-commenter",
"marcosschroh"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12437",
"repo": "kpn/kstreams",
"url": "https://github.com/kpn/kstreams/pull/13"
}
|
gharchive/pull-request
|
Create LICENSE
Codecov Report
:exclamation: No coverage uploaded for pull request base (master@7f22afb). Click here to learn what that means.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #13 +/- ##
=========================================
Coverage ? 97.23%
=========================================
Files ? 19
Lines ? 470
Branches ? 0
=========================================
Hits ? 457
Misses ? 13
Partials ? 0
Flag: unittests, Coverage Δ: 97.23% <ø> (?)
Flags with carried forward coverage won't be shown. Click here to find out more.
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 7f22afb...6ff8b75. Read the comment docs.
|
2025-04-01T06:41:07.796718
| 2024-03-22T18:04:43
|
2203038781
|
{
"authors": [
"coveralls",
"kristof-mattei"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12438",
"repo": "kristof-mattei/rust-end-to-end-application",
"url": "https://github.com/kristof-mattei/rust-end-to-end-application/pull/1290"
}
|
gharchive/pull-request
|
chore: use semgrep action, not container
Pull Request Test Coverage Report for Build<PHONE_NUMBER>
Details
0 of 0 changed or added relevant lines in 0 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage remained the same at 34.066%
Totals
Change from base Build<PHONE_NUMBER>:
0.0%
Covered Lines:
31
Relevant Lines:
91
💛 - Coveralls
|
2025-04-01T06:41:07.856231
| 2022-03-12T15:09:07
|
1167313318
|
{
"authors": [
"leroycep",
"xoich"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12439",
"repo": "leroycep/sqlite-zig",
"url": "https://github.com/leroycep/sqlite-zig/pull/3"
}
|
gharchive/pull-request
|
Fix some issues with zig 0.9.1
Thanks for making a pull request!
I've done some work that brings it up to 0.9 on the sqlite-v3.37.0 branch; I just haven't pushed that to master yet. It also changes the API in significant ways; I just need to get around to finishing the update.
|
2025-04-01T06:41:08.403744
| 2022-09-12T17:54:34
|
1370304917
|
{
"authors": [
"callebtc",
"codecov-commenter"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12441",
"repo": "lnbits/lnbits-legend",
"url": "https://github.com/lnbits/lnbits-legend/pull/964"
}
|
gharchive/pull-request
|
Fix/db reuse connection mark pending
Codecov Report
Merging #964 (83d4e04) into main (1660b9d) will decrease coverage by 1.12%.
The diff coverage is 60.00%.
:exclamation: Current head 83d4e04 differs from pull request most recent head 9d0c6db. Consider uploading reports for the commit 9d0c6db to get more accurate results
@@ Coverage Diff @@
## main #964 +/- ##
==========================================
- Coverage 43.18% 42.06% -1.13%
==========================================
Files 244 244
Lines 13545 13546 +1
==========================================
- Hits 5850 5698 -152
- Misses 7695 7848 +153
Impacted Files (Coverage Δ):
lnbits/tasks.py: 36.84% <0.00%> (ø)
lnbits/core/models.py: 65.57% <75.00%> (+0.28%) :arrow_up:
lnbits/extensions/boltz/mempool.py: 60.34% <0.00%> (-32.76%) :arrow_down:
lnbits/wallets/cln.py: 19.00% <0.00%> (-29.76%) :arrow_down:
lnbits/wallets/lndrest.py: 17.09% <0.00%> (-29.06%) :arrow_down:
lnbits/extensions/boltz/boltz.py: 49.75% <0.00%> (-18.91%) :arrow_down:
lnbits/db.py: 80.80% <0.00%> (-9.61%) :arrow_down:
lnbits/wallets/macaroon/macaroon.py: 27.53% <0.00%> (-5.80%) :arrow_down:
lnbits/extensions/boltz/crud.py: 81.53% <0.00%> (-3.08%) :arrow_down:
lnbits/bolt11.py: 78.57% <0.00%> (-2.39%) :arrow_down:
... and 2 more
:mega: We’re building smart automated test selection to slash your CI/CD build times. Learn more
|
2025-04-01T06:41:08.409164
| 2024-11-13T08:34:44
|
2654668814
|
{
"authors": [
"codecov-commenter",
"shivamG640"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12442",
"repo": "lordrip/kaoto",
"url": "https://github.com/lordrip/kaoto/pull/36"
}
|
gharchive/pull-request
|
Feat(DND): Enable Simple DND
Welcome to Codecov :tada:
Once you merge this PR into your default branch, you're all set! Codecov will compare coverage reports and display results in all future pull requests.
Thanks for integrating Codecov - We've got you covered :open_umbrella:
|
2025-04-01T06:41:08.430179
| 2023-01-28T09:15:48
|
1560778593
|
{
"authors": [
"doriaviram",
"mmanciop"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12443",
"repo": "lumigo-io/opentelemetry-js-distro",
"url": "https://github.com/lumigo-io/opentelemetry-js-distro/pull/155"
}
|
gharchive/pull-request
|
chore: update OpenTelemetry baseline to 1.4.0
:tada: This PR is included in version 1.17.0 :tada:
The release is available on:
npm package (@latest dist-tag)
GitHub release
Your semantic-release bot :package::rocket:
|
2025-04-01T06:41:08.442739
| 2018-09-21T03:42:20
|
362445021
|
{
"authors": [
"macMikey"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12444",
"repo": "macMikey/scanhammer",
"url": "https://github.com/macMikey/scanhammer/issues/1"
}
|
gharchive/issue
|
Up/Down arrows to navigate file list
Works until the browser widget gets the focus. Can't seem to take the focus away, yet.
|
2025-04-01T06:41:08.443688
| 2023-11-11T23:59:03
|
1989161591
|
{
"authors": [
"macterra"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12445",
"repo": "macterra/artx-market",
"url": "https://github.com/macterra/artx-market/issues/172"
}
|
gharchive/issue
|
Add default images to Admin page
Done in https://github.com/macterra/artx-market/compare/5f3f14e2bfa4...54e36da0a8d2
|
2025-04-01T06:41:08.444970
| 2024-02-10T17:52:08
|
2128627203
|
{
"authors": [
"maddsua"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12446",
"repo": "maddsua/lambda",
"url": "https://github.com/maddsua/lambda/pull/155"
}
|
gharchive/pull-request
|
Feat id tracking
Todo: put request id header on streamHandler too
nah, won't do; everything is moving in the serverlessHandler direction anyway
now gotta fix a few logging bugs
|
2025-04-01T06:41:08.447174
| 2024-04-24T18:59:06
|
2261974768
|
{
"authors": [
"Mostafa12248",
"mahmood-mohie"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12447",
"repo": "mahmood-mohie/the-career-journey",
"url": "https://github.com/mahmood-mohie/the-career-journey/pull/8"
}
|
gharchive/pull-request
|
Login
Where is the code for the login page?
Did you change anything on the About page?
I need you to push only the login page code in this pull request, not all of these changes in these files.
|
2025-04-01T06:41:08.451639
| 2015-08-12T11:01:25
|
100519104
|
{
"authors": [
"ambrinchaudhary"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12448",
"repo": "mairatma/alloy-ui",
"url": "https://github.com/mairatma/alloy-ui/pull/242"
}
|
gharchive/pull-request
|
AUI-1956 Use the "noconflict" version of ACE editor
New PR sent to master-deprecated: https://github.com/mairatma/alloy-ui/pull/243
|
2025-04-01T06:41:08.456387
| 2022-10-02T03:46:59
|
1393630832
|
{
"authors": [
"manan-shxrma",
"samyakjain26"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12449",
"repo": "manan-shxrma/dsalgo",
"url": "https://github.com/manan-shxrma/dsalgo/pull/90"
}
|
gharchive/pull-request
|
Create CoinChange.cpp
Kindly provide a description or a question link for your code.
Given an integer array coins[] of size N representing different types of currency and an integer sum, the task is to find the number of ways to make the sum using different combinations from coins[].
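The problem described is the classic coin-change counting problem; a minimal sketch of the standard dynamic-programming solution (function name is mine, not from the PR):

```python
def count_coin_combinations(coins, target):
    """Count combinations of coins summing to target (order-insensitive).

    Classic unbounded-knapsack DP: dp[s] = number of ways to form sum s.
    """
    dp = [0] * (target + 1)
    dp[0] = 1  # one way to make 0: use no coins
    # Iterate coins in the outer loop so each combination is counted once,
    # regardless of the order the coins are picked in.
    for coin in coins:
        for s in range(coin, target + 1):
            dp[s] += dp[s - coin]
    return dp[target]
```

For example, coins [1, 2, 3] and sum 4 give 4 combinations: 1+1+1+1, 1+1+2, 2+2, and 1+3.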
|
2025-04-01T06:41:08.463966
| 2022-07-07T03:12:25
|
1296738468
|
{
"authors": [
"marcusblake",
"seokhakang9"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12450",
"repo": "marcusblake/software-product-sprint-2022",
"url": "https://github.com/marcusblake/software-product-sprint-2022/pull/9"
}
|
gharchive/pull-request
|
Event info
Weren't these changes already merged into development? https://github.com/marcusblake/software-product-sprint-2022/pull/4
|
2025-04-01T06:41:08.465485
| 2024-02-03T10:04:38
|
2116398296
|
{
"authors": [
"skejeton",
"vtereshkov"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12451",
"repo": "marekmaskarinec/tophat",
"url": "https://github.com/marekmaskarinec/tophat/issues/143"
}
|
gharchive/issue
|
ui.TextBox: Delete key not working
I implemented Delete alongside other shortcuts like Home, End, Ctrl+Left, Ctrl+Right, Ctrl+C, Ctrl+V. Try them out!
Verified and closed.
|
2025-04-01T06:41:08.652826
| 2022-05-10T05:39:14
|
1230642949
|
{
"authors": [
"codacy-badger",
"ivan-ristovic"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12452",
"repo": "matf-pp/2022_MATDAQ",
"url": "https://github.com/matf-pp/2022_MATDAQ/pull/33"
}
|
gharchive/pull-request
|
Add a Codacy badge to README.md
Note: Codacy analysis is meant to be a tool that can help you make your code better and more robust (and also learn to use code analysis tools). It is not mandatory to implement suggested changes.
|
2025-04-01T06:41:08.691503
| 2022-04-26T14:12:31
|
1216022749
|
{
"authors": [
"johannahemminger"
],
"license": "CC-BY-4.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12453",
"repo": "mendix/docs",
"url": "https://github.com/mendix/docs/pull/4502"
}
|
gharchive/pull-request
|
Proofread last Catalog docs
@Adam-Dupaski these are the final docs I have on the list for my proofreading task. I made a few changes and additions, please check it out when you have some time.
|
2025-04-01T06:41:08.710662
| 2024-04-09T07:42:33
|
2232815184
|
{
"authors": [
"michaelpeterswa"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12454",
"repo": "michaelpeterswa/talvi",
"url": "https://github.com/michaelpeterswa/talvi/pull/1"
}
|
gharchive/pull-request
|
fix: first pass
:tada: This PR is included in version 1.0.0 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket:
|
2025-04-01T06:41:08.714238
| 2022-08-01T17:39:48
|
1324755148
|
{
"authors": [
"hassanhabib",
"mimendel"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12455",
"repo": "microsoft/mixedreality.dmx.gatekeeper",
"url": "https://github.com/microsoft/mixedreality.dmx.gatekeeper/pull/65"
}
|
gharchive/pull-request
|
DATA: Lab Commands
@mimendel @terrypalo data is for data migration; models alone are not data. They just become a part of whatever task you are trying to accomplish.
|
2025-04-01T06:41:08.715342
| 2021-08-11T16:53:24
|
967072525
|
{
"authors": [
"cleonard-git"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12456",
"repo": "microsoft/moc-sdk-for-go",
"url": "https://github.com/microsoft/moc-sdk-for-go/pull/70"
}
|
gharchive/pull-request
|
moc-sdk : null ptr check for SourceType for backward compatibility
Can we make sure we test everything end to end before raising PRs? That way we don't have to keep making SDK changes.
Testing with wssdtest was done with my changes, which tested the feature, so this was not caught earlier. Vanilla wssdtest with the latest tags from all repos should be tested. Will keep this scenario in mind next time.
|
2025-04-01T06:41:08.716966
| 2023-07-25T22:06:12
|
1821237384
|
{
"authors": [
"justinchuby"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12457",
"repo": "microsoft/onnxscript",
"url": "https://github.com/microsoft/onnxscript/pull/921"
}
|
gharchive/pull-request
|
Create action to display test results on Github | test
Nice errors:
@jcwchen could you take a look? Thanks!
After approval I will revert the test changes
|
2025-04-01T06:41:08.718487
| 2023-01-17T11:56:27
|
1536258033
|
{
"authors": [
"JonLiu1993",
"autoantwort"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12458",
"repo": "microsoft/vcpkg",
"url": "https://github.com/microsoft/vcpkg/pull/29004"
}
|
gharchive/pull-request
|
[open62541] Update version to 1.3.4
Please get failure logs here.
Pinging @autoantwort for response. Is work still being done for this PR?
I think I can close this temporarily
|
2025-04-01T06:41:08.726227
| 2022-10-25T13:53:30
|
1422507278
|
{
"authors": [
"coveralls",
"miguelnietoa"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12459",
"repo": "miguelnietoa/stellar_sdk",
"url": "https://github.com/miguelnietoa/stellar_sdk/pull/15"
}
|
gharchive/pull-request
|
Update CI
Coverage remained the same at 99.745% when pulling 067478d147d9c9b0351a6762ba3cc043f9156e07 on update-ci into 0fdde73c1118df94ff53ce5769c1d8995bb3a8e9 on main.
|
2025-04-01T06:41:08.729258
| 2024-03-13T14:22:42
|
2184143434
|
{
"authors": [
"mikesmithgh"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12460",
"repo": "mikesmithgh/kitty-scrollback.nvim",
"url": "https://github.com/mikesmithgh/kitty-scrollback.nvim/pull/209"
}
|
gharchive/pull-request
|
chore: latest backport changes for version.lua
:tada: This PR is included in version 4.2.3 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket:
|
2025-04-01T06:41:08.735151
| 2022-07-29T13:46:31
|
1322274752
|
{
"authors": [
"milankl",
"white-alistair"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12461",
"repo": "milankl/SpeedyWeather.jl",
"url": "https://github.com/milankl/SpeedyWeather.jl/pull/117"
}
|
gharchive/pull-request
|
Implement LowerTriangularMatrix for spectral transform
@white-alistair This PR is a bit hectic, but I've been restructuring the PrognosticVariables and DiagnosticVariables such that one can now do something like
julia> using SpeedyWeather
julia> progn,diagn,model = initialize_speedy();
julia> progn.
layers lmax mmax n_leapfrog nlev pres
so PrognosticVariables contains another struct with layers and pres (as it's just the surface separated out). Hence, access to the variables is straightforward, like
julia> progn.layers[1].leapfrog[1].vor
32×32 LowerTriangularMatrix{ComplexF32}:
...
and it should be super easy to loop over all layers like (which can be a single loop very early on at every time step)
for layer in progn.layers
func!(layer,...)
end
With LowerTriangularMatrix as the default struct for spectral coefficients looping in spectral domain is either as before
for m in 1:mmax+1
for l in m:mmax+1
vor[l,m]
or more convenient (and faster)
for lm in eachharmonic(vor)
vor[lm]
which is equivalent to
for lm in 1:length(vor)
vor[lm]
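A rough Python analogue of that single-index traversal over the packed lower-triangular storage (function name and 1-based convention chosen to mirror the Julia loop; this is not the SpeedyWeather.jl API):

```python
def each_harmonic(lmax, mmax):
    """Yield (l, m, lm): the position of each stored spherical-harmonic
    coefficient together with its single running index lm into
    column-major packed lower-triangular storage (1-based)."""
    lm = 0
    for m in range(1, mmax + 2):        # m in 1:mmax+1
        for l in range(m, lmax + 2):    # l in m:lmax+1
            lm += 1
            yield l, m, lm
```

With lmax = mmax = 31 (a 32×32 matrix as above) this yields 32·33/2 = 528 entries, matching the `for lm in 1:length(vor)` form.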
@milankl looks good, nice one!
I see that you moved the parametrization vars into a file column_variables.jl :+1: On a related note, say there are intermediate quantities used in the calculation of some parametrization, would you put those arrays in the IntermediateVariables struct or the ParametrizationVariables struct?
Let me know if you want an actual review of this at some point.
@white-alistair I answered that question in #118, I suggest creating a ColumnVariables struct.
|
2025-04-01T06:41:08.738211
| 2023-09-06T13:29:58
|
1884024947
|
{
"authors": [
"SamitHuang",
"wtomin"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12462",
"repo": "mindspore-lab/mindone",
"url": "https://github.com/mindspore-lab/mindone/pull/128"
}
|
gharchive/pull-request
|
Fix videocomposer bugs
Remove model_weights_ddd/clip/bpe_simple_vocab_16e6.txt.gz pls. I was supposed to git add model_weights/bpe_simple_vocab_16e6.txt.gz, but accidentally added this one.
|
2025-04-01T06:41:08.748115
| 2024-10-01T14:39:38
|
2559449693
|
{
"authors": [
"mirpedrol"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12463",
"repo": "mirpedrol/customnonnfcoresync",
"url": "https://github.com/mirpedrol/customnonnfcoresync/pull/1"
}
|
gharchive/pull-request
|
Template update for nf-core/tools version 2.14.2.dev0
Version 2.14.2.dev0 of the nf-core/tools pipeline template has just been released. This pull-request is now outdated and has been closed in favour of https://github.com/mirpedrol/customnonnfcoresync/pull/2
Please use https://github.com/mirpedrol/customnonnfcoresync/pull/2 to merge in the new changes from the nf-core template as soon as possible.
|
2025-04-01T06:41:08.760655
| 2016-04-15T17:41:01
|
148722507
|
{
"authors": [
"mjackson",
"slorber",
"taion"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12464",
"repo": "mjackson/history",
"url": "https://github.com/mjackson/history/pull/274"
}
|
gharchive/pull-request
|
Add regression test for base href on separate origin
Looks like this needs to run on a restricted set of browsers.
@taion so it seems you did add my proposed change but the tests still don't pass :(
Tell me if I can help, but I'm not sure how to interpret these test results. Is the problem "Some of your tests did a full page reload!"?
That's correct – that will fail a test in Karma.
Looks like progress here has stalled. Closing unless someone wants to pick this up and run with it.
As far as I know, our dev env, which uses a base href, does not work on Safari because of this issue, and we still haven't found a workaround.
|
2025-04-01T06:41:08.761438
| 2021-11-11T22:06:34
|
1051399248
|
{
"authors": [
"mjlomeli"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12465",
"repo": "mjlomeli/jcp",
"url": "https://github.com/mjlomeli/jcp/issues/27"
}
|
gharchive/issue
|
Initial Setup - The root.html.erb will only have one html tag: <div id="root">React Broken</div>
finished
|
2025-04-01T06:41:08.771955
| 2022-01-04T16:26:18
|
1093517810
|
{
"authors": [
"mlondschien"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12466",
"repo": "mlondschien/biosphere",
"url": "https://github.com/mlondschien/biosphere/pull/31"
}
|
gharchive/pull-request
|
Use introsort.
This appears to improve performance very slightly for medium-sized n and one-hot variables, but reduces performance for continuous variables.
Gnuplot not found, using plotters backend
Benchmarking forest
Benchmarking forest: Warming up for 3.0000 s
Warning: Unable to complete 10 samples in 5.0s. You may wish to increase target time to 106.0s.
Benchmarking forest: Collecting 10 samples in estimated 106.02 s (10 iterations)
Benchmarking forest: Analyzing
forest time: [9.9991 s 10.128 s 10.281 s]
change: [-2.8555% -1.0205% +0.8443%] (p = 0.33 > 0.05)
No change in performance detected.
Running unittests (target/release/deps/bench_tree-e4e43edb68de800e)
WARNING: HTML report generation will become a non-default optional feature in Criterion.rs 0.4.0.
This feature is being moved to cargo-criterion (https://github.com/bheisler/cargo-criterion) and will be optional in a future version of Criterion.rs. To silence this warning, either switch to cargo-criterion or enable the 'html_reports' feature in your Cargo.toml.
Gnuplot not found, using plotters backend
Benchmarking tree_split/tree_n=100, d=10, max_depth=4, mtry=10
Benchmarking tree_split/tree_n=100, d=10, max_depth=4, mtry=10: Warming up for 3.0000 s
Benchmarking tree_split/tree_n=100, d=10, max_depth=4, mtry=10: Collecting 100 samples in estimated 5.0350 s (56k iterations)
Benchmarking tree_split/tree_n=100, d=10, max_depth=4, mtry=10: Analyzing
tree_split/tree_n=100, d=10, max_depth=4, mtry=10
time: [91.210 us 91.496 us 91.794 us]
change: [+19.423% +19.829% +20.227%] (p = 0.00 < 0.05)
Performance has regressed.
Found 11 outliers among 100 measurements (11.00%)
2 (2.00%) low severe
9 (9.00%) low mild
Benchmarking tree_split/tree_n=1000, d=10, max_depth=4, mtry=10
Benchmarking tree_split/tree_n=1000, d=10, max_depth=4, mtry=10: Warming up for 3.0000 s
Benchmarking tree_split/tree_n=1000, d=10, max_depth=4, mtry=10: Collecting 100 samples in estimated 9.0133 s (10k iterations)
Benchmarking tree_split/tree_n=1000, d=10, max_depth=4, mtry=10: Analyzing
tree_split/tree_n=1000, d=10, max_depth=4, mtry=10
time: [872.86 us 876.92 us 881.02 us]
change: [+1.0759% +1.5909% +2.1139%] (p = 0.00 < 0.05)
Performance has regressed.
Benchmarking tree_split/tree_n=10000, d=10, max_depth=4, mtry=10
Benchmarking tree_split/tree_n=10000, d=10, max_depth=4, mtry=10: Warming up for 3.0000 s
Benchmarking tree_split/tree_n=10000, d=10, max_depth=4, mtry=10: Collecting 100 samples in estimated 5.4537 s (600 iterations)
Benchmarking tree_split/tree_n=10000, d=10, max_depth=4, mtry=10: Analyzing
tree_split/tree_n=10000, d=10, max_depth=4, mtry=10
time: [9.1067 ms 9.1431 ms 9.1797 ms]
change: [+0.1906% +0.7948% +1.4053%] (p = 0.01 < 0.05)
Change within noise threshold.
Benchmarking tree_split/tree_n=10000, d=100, max_depth=4, mtry=10
Benchmarking tree_split/tree_n=10000, d=100, max_depth=4, mtry=10: Warming up for 3.0000 s
Benchmarking tree_split/tree_n=10000, d=100, max_depth=4, mtry=10: Collecting 100 samples in estimated 6.7835 s (200 iterations)
Benchmarking tree_split/tree_n=10000, d=100, max_depth=4, mtry=10: Analyzing
tree_split/tree_n=10000, d=100, max_depth=4, mtry=10
time: [33.324 ms 33.964 ms 34.607 ms]
change: [+11.553% +14.812% +18.152%] (p = 0.00 < 0.05)
Performance has regressed.
Benchmarking tree_split/tree_n=10000, d=100, max_depth=4, mtry=100
Benchmarking tree_split/tree_n=10000, d=100, max_depth=4, mtry=100: Warming up for 3.0000 s
Warning: Unable to complete 100 samples in 5.0s. You may wish to increase target time to 10.6s, or reduce sample count to 40.
Benchmarking tree_split/tree_n=10000, d=100, max_depth=4, mtry=100: Collecting 100 samples in estimated 10.622 s (100 iterations)
Benchmarking tree_split/tree_n=10000, d=100, max_depth=4, mtry=100: Analyzing
tree_split/tree_n=10000, d=100, max_depth=4, mtry=100
time: [101.90 ms 102.33 ms 102.78 ms]
change: [+0.5477% +1.3369% +2.1588%] (p = 0.00 < 0.05)
Change within noise threshold.
Benchmarking tree_split/tree_n=100000, d=10, max_depth=4, mtry=10
Benchmarking tree_split/tree_n=100000, d=10, max_depth=4, mtry=10: Warming up for 3.0000 s
Warning: Unable to complete 100 samples in 5.0s. You may wish to increase target time to 11.3s, or reduce sample count to 40.
Benchmarking tree_split/tree_n=100000, d=10, max_depth=4, mtry=10: Collecting 100 samples in estimated 11.270 s (100 iterations)
Benchmarking tree_split/tree_n=100000, d=10, max_depth=4, mtry=10: Analyzing
tree_split/tree_n=100000, d=10, max_depth=4, mtry=10
time: [111.56 ms 112.08 ms 112.60 ms]
change: [+3.9732% +4.5678% +5.1672%] (p = 0.00 < 0.05)
Performance has regressed.
Running unittests (target/release/deps/bench_utils-5e89e0c474c48fdf)
WARNING: HTML report generation will become a non-default optional feature in Criterion.rs 0.4.0.
This feature is being moved to cargo-criterion (https://github.com/bheisler/cargo-criterion) and will be optional in a future version of Criterion.rs. To silence this warning, either switch to cargo-criterion or enable the 'html_reports' feature in your Cargo.toml.
Gnuplot not found, using plotters backend
Benchmarking argsort/argsort_continuous/1000
Benchmarking argsort/argsort_continuous/1000: Warming up for 3.0000 s
Benchmarking argsort/argsort_continuous/1000: Collecting 100 samples in estimated 5.0855 s (76k iterations)
Benchmarking argsort/argsort_continuous/1000: Analyzing
argsort/argsort_continuous/1000
time: [67.907 us 68.091 us 68.277 us]
change: [+52.530% +53.327% +54.157%] (p = 0.00 < 0.05)
Performance has regressed.
Found 2 outliers among 100 measurements (2.00%)
2 (2.00%) low mild
Benchmarking argsort/argsort_one_hot/1000
Benchmarking argsort/argsort_one_hot/1000: Warming up for 3.0000 s
Benchmarking argsort/argsort_one_hot/1000: Collecting 100 samples in estimated 5.0135 s (965k iterations)
Benchmarking argsort/argsort_one_hot/1000: Analyzing
argsort/argsort_one_hot/1000
time: [5.1262 us 5.1462 us 5.1686 us]
change: [-28.386% -28.054% -27.745%] (p = 0.00 < 0.05)
Performance has improved.
Benchmarking argsort/sample_weight, size=1000
Benchmarking argsort/sample_weight, size=1000: Warming up for 3.0000 s
Benchmarking argsort/sample_weight, size=1000: Collecting 100 samples in estimated 5.0142 s (672k iterations)
Benchmarking argsort/sample_weight, size=1000: Analyzing
argsort/sample_weight, size=1000
time: [7.4495 us 7.4902 us 7.5280 us]
change: [-5.6246% -5.1484% -4.6678%] (p = 0.00 < 0.05)
Performance has improved.
Benchmarking argsort/sample_indices_from_weights, size=1000
Benchmarking argsort/sample_indices_from_weights, size=1000: Warming up for 3.0000 s
Benchmarking argsort/sample_indices_from_weights, size=1000: Collecting 100 samples in estimated 5.0007 s (2.0M iterations)
Benchmarking argsort/sample_indices_from_weights, size=1000: Analyzing
argsort/sample_indices_from_weights, size=1000
time: [2.5101 us 2.5212 us 2.5324 us]
change: [+0.9455% +1.5571% +2.1822%] (p = 0.00 < 0.05)
Change within noise threshold.
Found 1 outliers among 100 measurements (1.00%)
1 (1.00%) low mild
Benchmarking argsort/oob_samples_from_weights, size=1000
Benchmarking argsort/oob_samples_from_weights, size=1000: Warming up for 3.0000 s
Benchmarking argsort/oob_samples_from_weights, size=1000: Collecting 100 samples in estimated 5.0036 s (5.6M iterations)
Benchmarking argsort/oob_samples_from_weights, size=1000: Analyzing
argsort/oob_samples_from_weights, size=1000
time: [915.81 ns 920.57 ns 925.28 ns]
change: [+4.3305% +5.1706% +5.9416%] (p = 0.00 < 0.05)
Performance has regressed.
Found 4 outliers among 100 measurements (4.00%)
3 (3.00%) low mild
1 (1.00%) high mild
Benchmarking argsort/argsort_continuous/10000
Benchmarking argsort/argsort_continuous/10000: Warming up for 3.0000 s
Warning: Unable to complete 100 samples in 5.0s. You may wish to increase target time to 6.2s, enable flat sampling, or reduce sample count to 60.
Benchmarking argsort/argsort_continuous/10000: Collecting 100 samples in estimated 6.1799 s (5050 iterations)
Benchmarking argsort/argsort_continuous/10000: Analyzing
argsort/argsort_continuous/10000
time: [1.2114 ms 1.2169 ms 1.2233 ms]
change: [+45.201% +46.079% +47.020%] (p = 0.00 < 0.05)
Performance has regressed.
Benchmarking argsort/argsort_one_hot/10000
Benchmarking argsort/argsort_one_hot/10000: Warming up for 3.0000 s
Benchmarking argsort/argsort_one_hot/10000: Collecting 100 samples in estimated 5.3663 s (61k iterations)
Benchmarking argsort/argsort_one_hot/10000: Analyzing
argsort/argsort_one_hot/10000
time: [87.918 us 88.335 us 88.737 us]
change: [-3.7658% -3.2375% -2.7066%] (p = 0.00 < 0.05)
Performance has improved.
Benchmarking argsort/sample_weight, size=10000
Benchmarking argsort/sample_weight, size=10000: Warming up for 3.0000 s
Benchmarking argsort/sample_weight, size=10000: Collecting 100 samples in estimated 5.4432 s (30k iterations)
Benchmarking argsort/sample_weight, size=10000: Analyzing
argsort/sample_weight, size=10000
time: [180.80 us 181.41 us 181.99 us]
change: [-5.6198% -5.1232% -4.6308%] (p = 0.00 < 0.05)
Performance has improved.
Found 3 outliers among 100 measurements (3.00%)
3 (3.00%) low mild
Benchmarking argsort/sample_indices_from_weights, size=10000
Benchmarking argsort/sample_indices_from_weights, size=10000: Warming up for 3.0000 s
Benchmarking argsort/sample_indices_from_weights, size=10000: Collecting 100 samples in estimated 5.3596 s (50k iterations)
Benchmarking argsort/sample_indices_from_weights, size=10000: Analyzing
argsort/sample_indices_from_weights, size=10000
time: [105.11 us 105.36 us 105.60 us]
change: [-3.1431% -2.7386% -2.3162%] (p = 0.00 < 0.05)
Performance has improved.
Found 5 outliers among 100 measurements (5.00%)
2 (2.00%) low mild
3 (3.00%) high mild
Benchmarking argsort/oob_samples_from_weights, size=10000
Benchmarking argsort/oob_samples_from_weights, size=10000: Warming up for 3.0000 s
Benchmarking argsort/oob_samples_from_weights, size=10000: Collecting 100 samples in estimated 5.1426 s (116k iterations)
Benchmarking argsort/oob_samples_from_weights, size=10000: Analyzing
argsort/oob_samples_from_weights, size=10000
time: [43.120 us 43.273 us 43.428 us]
change: [-4.3595% -3.9207% -3.5132%] (p = 0.00 < 0.05)
Performance has improved.
Benchmarking argsort/argsort_continuous/100000
Benchmarking argsort/argsort_continuous/100000: Warming up for 3.0000 s
Benchmarking argsort/argsort_continuous/100000: Collecting 100 samples in estimated 6.5469 s (400 iterations)
Benchmarking argsort/argsort_continuous/100000: Analyzing
argsort/argsort_continuous/100000
time: [16.423 ms 16.485 ms 16.546 ms]
change: [+51.802% +52.474% +53.105%] (p = 0.00 < 0.05)
Performance has regressed.
Benchmarking argsort/argsort_one_hot/100000
Benchmarking argsort/argsort_one_hot/100000: Warming up for 3.0000 s
Warning: Unable to complete 100 samples in 5.0s. You may wish to increase target time to 5.2s, enable flat sampling, or reduce sample count to 60.
Benchmarking argsort/argsort_one_hot/100000: Collecting 100 samples in estimated 5.2193 s (5050 iterations)
Benchmarking argsort/argsort_one_hot/100000: Analyzing
argsort/argsort_one_hot/100000
time: [1.0226 ms 1.0258 ms 1.0288 ms]
change: [-23.345% -23.032% -22.716%] (p = 0.00 < 0.05)
Performance has improved.
Found 2 outliers among 100 measurements (2.00%)
1 (1.00%) low mild
1 (1.00%) high mild
Benchmarking argsort/sample_weight, size=100000
Benchmarking argsort/sample_weight, size=100000: Warming up for 3.0000 s
Warning: Unable to complete 100 samples in 5.0s. You may wish to increase target time to 6.5s, enable flat sampling, or reduce sample count to 60.
Benchmarking argsort/sample_weight, size=100000: Collecting 100 samples in estimated 6.4582 s (5050 iterations)
Benchmarking argsort/sample_weight, size=100000: Analyzing
argsort/sample_weight, size=100000
time: [1.2706 ms 1.2736 ms 1.2766 ms]
change: [-0.2038% +0.2651% +0.7174%] (p = 0.26 > 0.05)
No change in performance detected.
Benchmarking argsort/sample_indices_from_weights, size=100000
Benchmarking argsort/sample_indices_from_weights, size=100000: Warming up for 3.0000 s
Warning: Unable to complete 100 samples in 5.0s. You may wish to increase target time to 7.1s, enable flat sampling, or reduce sample count to 50.
Benchmarking argsort/sample_indices_from_weights, size=100000: Collecting 100 samples in estimated 7.0850 s (5050 iterations)
Benchmarking argsort/sample_indices_from_weights, size=100000: Analyzing
argsort/sample_indices_from_weights, size=100000
time: [1.3802 ms 1.3859 ms 1.3921 ms]
change: [-2.1769% -1.5761% -0.9605%] (p = 0.00 < 0.05)
Change within noise threshold.
Benchmarking argsort/oob_samples_from_weights, size=100000
Benchmarking argsort/oob_samples_from_weights, size=100000: Warming up for 3.0000 s
Benchmarking argsort/oob_samples_from_weights, size=100000: Collecting 100 samples in estimated 7.4265 s (15k iterations)
Benchmarking argsort/oob_samples_from_weights, size=100000: Analyzing
argsort/oob_samples_from_weights, size=100000
time: [489.68 us 491.38 us 493.10 us]
change: [-1.6254% -1.1573% -0.6621%] (p = 0.00 < 0.05)
Change within noise threshold.
Found 2 outliers among 100 measurements (2.00%)
2 (2.00%) low mild
Benchmarking argsort/argsort_continuous/1000000
Benchmarking argsort/argsort_continuous/1000000: Warming up for 3.0000 s
Warning: Unable to complete 100 samples in 5.0s. You may wish to increase target time to 21.1s, or reduce sample count to 20.
Benchmarking argsort/argsort_continuous/1000000: Collecting 100 samples in estimated 21.079 s (100 iterations)
Benchmarking argsort/argsort_continuous/1000000: Analyzing
argsort/argsort_continuous/1000000
time: [223.35 ms 224.81 ms 226.20 ms]
change: [+70.409% +71.754% +73.015%] (p = 0.00 < 0.05)
Performance has regressed.
Found 1 outliers among 100 measurements (1.00%)
1 (1.00%) low mild
Benchmarking argsort/argsort_one_hot/1000000
Benchmarking argsort/argsort_one_hot/1000000: Warming up for 3.0000 s
Benchmarking argsort/argsort_one_hot/1000000: Collecting 100 samples in estimated 5.2668 s (500 iterations)
Benchmarking argsort/argsort_one_hot/1000000: Analyzing
argsort/argsort_one_hot/1000000
time: [10.465 ms 10.509 ms 10.553 ms]
change: [+5.3742% +5.9750% +6.5720%] (p = 0.00 < 0.05)
Performance has regressed.
Benchmarking argsort/sample_weight, size=1000000
Benchmarking argsort/sample_weight, size=1000000: Warming up for 3.0000 s
Benchmarking argsort/sample_weight, size=1000000: Collecting 100 samples in estimated 5.2993 s (600 iterations)
Benchmarking argsort/sample_weight, size=1000000: Analyzing
argsort/sample_weight, size=1000000
time: [8.7805 ms 8.8160 ms 8.8525 ms]
change: [-3.3162% -2.7423% -2.1684%] (p = 0.00 < 0.05)
Performance has improved.
Benchmarking argsort/sample_indices_from_weights, size=1000000
Benchmarking argsort/sample_indices_from_weights, size=1000000: Warming up for 3.0000 s
Benchmarking argsort/sample_indices_from_weights, size=1000000: Collecting 100 samples in estimated 5.6539 s (400 iterations)
Benchmarking argsort/sample_indices_from_weights, size=1000000: Analyzing
argsort/sample_indices_from_weights, size=1000000
time: [14.087 ms 14.159 ms 14.231 ms]
change: [-4.9967% -4.4051% -3.9018%] (p = 0.00 < 0.05)
Performance has improved.
Found 1 outliers among 100 measurements (1.00%)
1 (1.00%) high mild
Benchmarking argsort/oob_samples_from_weights, size=1000000
Benchmarking argsort/oob_samples_from_weights, size=1000000: Warming up for 3.0000 s
Benchmarking argsort/oob_samples_from_weights, size=1000000: Collecting 100 samples in estimated 5.4853 s (1100 iterations)
Benchmarking argsort/oob_samples_from_weights, size=1000000: Analyzing
argsort/oob_samples_from_weights, size=1000000
time: [5.0249 ms 5.0364 ms 5.0470 ms]
change: [+3.1074% +3.5555% +3.9981%] (p = 0.00 < 0.05)
Performance has regressed.
Found 5 outliers among 100 measurements (5.00%)
1 (1.00%) low severe
4 (4.00%) low mild
|
2025-04-01T06:41:08.781490
| 2024-11-07T22:06:36
|
2642309108
|
{
"authors": [
"NathanQingyangXu"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12467",
"repo": "mongodb/mongo-hibernate",
"url": "https://github.com/mongodb/mongo-hibernate/pull/11"
}
|
gharchive/pull-request
|
add static analysis section to README.md
Commented back.
I updated the description to make it more accurate and concise. I don't think we need to provide concrete gradle commands, since end-users should know the basics of gradle usage. As long as we list the relevant gradle tasks, that is good enough. It also keeps the section short and to-the-point.
|
2025-04-01T06:41:08.807892
| 2022-02-18T21:04:56
|
1143776056
|
{
"authors": [
"aauker",
"codecov-commenter"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12468",
"repo": "msk-mind/luna",
"url": "https://github.com/msk-mind/luna/pull/283"
}
|
gharchive/pull-request
|
fix: Fixes on slide ingestion
Codecov Report
Merging #283 (6f75193) into dev (f43337c) will increase coverage by 2.28%.
The diff coverage is 100.00%.
@@ Coverage Diff @@
## dev #283 +/- ##
==========================================
+ Coverage 70.75% 73.03% +2.28%
==========================================
Files 100 51 -49
Lines 5057 3542 -1515
==========================================
- Hits 3578 2587 -991
+ Misses 1479 955 -524
Impacted Files
Coverage Δ
pyluna-pathology/luna/pathology/cli/slide_etl.py
90.47% <100.00%> (+0.18%)
:arrow_up:
...y/proxy_table/regional_annotation/test_generate.py
...pathology/tests/luna/pathology/cli/test_dsa_viz.py
...a-pathology/tests/luna/pathology/common/test_ml.py
...tests/luna/radiology/cli/test_extract_radiomics.py
...hology/tests/luna/pathology/cli/test_save_tiles.py
...ogy/tests/luna/pathology/common/test_preprocess.py
pyluna-core/tests/luna/project/test_generate.py
...thology/tests/luna/pathology/spatial/test_stats.py
...logy_annotations/test_get_pathology_annotations.py
... and 40 more
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f43337c...6f75193. Read the comment docs.
|
2025-04-01T06:41:08.839537
| 2024-12-21T06:55:46
|
2753811525
|
{
"authors": [
"Li-Jihong",
"Vishu26",
"jacobsn",
"yinliaoabc"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12469",
"repo": "mvrl/GeoSynth",
"url": "https://github.com/mvrl/GeoSynth/issues/10"
}
|
gharchive/issue
|
Could you please provide the Canny Control dataset?
Hi
You can just run the canny algorithm in the opencv package on the satellite images to create the dataset.
Thanks
With default parameters?
All default parameters except the 1st threshold set to 100 and the 2nd threshold set to 200.
Probably worth throwing a script in to do that so people can replicate results.
Could you please provide the Canny dataset? The effect I generated myself is not good. Thanks!
Below I'm sharing a script to load the Canny dataset. After reading the satellite image, I run the canny detector algorithm defined in the ControlNet repo (which uses opencv):
import json
import cv2
import numpy as np
from ..ControlNet.annotator.canny import CannyDetector
from ..ControlNet.annotator.util import HWC3
from torch.utils.data import Dataset
class Dataset(Dataset):
def __init__(self, prompt_path):
self.data = json.load(open(prompt_path, "rt"))
self.apply_canny = CannyDetector()
def __len__(self):
return len(self.data)
def __getitem__(self, idx):
item = self.data[idx]
target_filename = item["target"]
prompt = item["prompt"]
target = cv2.imread("../" + target_filename)
# Do not forget that OpenCV read images in BGR order.
target = cv2.cvtColor(target, cv2.COLOR_BGR2RGB)
# Apply Canny Edge Algorithm
source = self.apply_canny(target, 100, 200)
source = HWC3(source)
# Normalize source images to [0, 1].
source = source.astype(np.float32) / 255.0
# Normalize target images to [-1, 1].
target = (target.astype(np.float32) / 127.5) - 1.0
return dict(
jpg=target, txt=prompt, hint=source
)
Thanks for the code, I will try it out and thanks again for this great work.
|
2025-04-01T06:41:08.946218
| 2023-11-16T07:20:31
|
1996220638
|
{
"authors": [
"herwinw"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12470",
"repo": "natalie-lang/natalie",
"url": "https://github.com/natalie-lang/natalie/pull/1491"
}
|
gharchive/pull-request
|
Implement $+
This should resolve a few compile errors in the nightly specs (library/defined, library/predefined and English)
|
2025-04-01T06:41:08.963786
| 2018-08-13T21:07:15
|
350194589
|
{
"authors": [
"dtraleigh",
"jcwlib"
],
"license": "CC0-1.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12471",
"repo": "ncopenpass/NCOpenPass",
"url": "https://github.com/ncopenpass/NCOpenPass/issues/1132"
}
|
gharchive/issue
|
Figure out blog posts for marketing Civic Camp
https://docs.google.com/document/d/1H4V5DmP4OEIGkGE_GiK9DblUudXb4TkT_F_3TuaBPQ0/edit?usp=sharing
Can you give me a deadline for my post?
|
2025-04-01T06:41:08.974730
| 2022-09-28T02:53:27
|
1388641440
|
{
"authors": [
"denis-tingaikin",
"mayulin123456"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12472",
"repo": "networkservicemesh/deployments-k8s",
"url": "https://github.com/networkservicemesh/deployments-k8s/issues/7469"
}
|
gharchive/issue
|
Does NSM plan to use the k8s CNI to communicate? Some pods' functionality is bound to the k8s CNI; if NSM doesn't use the k8s CNI, those pods can't work.
Could you say more?
I think it'd be more concrete if you could provide the following points:
1. scenario
2. actual behavior
3. expected behavior
|
2025-04-01T06:41:09.017586
| 2022-04-09T16:52:50
|
1198712483
|
{
"authors": [
"codecov-commenter",
"nikitanovosibirsk"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12473",
"repo": "nikitanovosibirsk/vedro",
"url": "https://github.com/nikitanovosibirsk/vedro/pull/27"
}
|
gharchive/pull-request
|
add config
Codecov Report
Merging #27 (bb899da) into master (f376656) will increase coverage by 1.85%.
The diff coverage is 93.54%.
@@ Coverage Diff @@
## master #27 +/- ##
==========================================
+ Coverage 81.96% 83.81% +1.85%
==========================================
Files 62 72 +10
Lines 1558 1718 +160
Branches 242 189 -53
==========================================
+ Hits 1277 1440 +163
+ Misses 266 257 -9
- Partials 15 21 +6
Impacted Files
Coverage Δ
vedro/plugins/skipper/_skipper.py
20.61% <0.00%> (ø)
vedro/__init__.py
70.37% <50.00%> (+11.39%)
:arrow_up:
vedro/core/_lifecycle.py
89.47% <71.42%> (-10.53%)
:arrow_down:
vedro/plugins/director/_director_init_event.py
85.71% <85.71%> (ø)
vedro/core/_module_loader/_module_file_loader.py
92.59% <92.59%> (ø)
vedro/_config.py
100.00% <100.00%> (ø)
vedro/_context.py
75.00% <100.00%> (+8.33%)
:arrow_up:
vedro/_interface.py
100.00% <100.00%> (ø)
vedro/core/__init__.py
100.00% <100.00%> (ø)
vedro/core/_config_loader/__init__.py
100.00% <100.00%> (ø)
... and 21 more
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f376656...bb899da. Read the comment docs.
|
2025-04-01T06:41:09.032561
| 2021-11-29T05:21:58
|
1065624316
|
{
"authors": [
"codecov-commenter",
"tjholm"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12474",
"repo": "nitrictech/boxygen",
"url": "https://github.com/nitrictech/boxygen/pull/7"
}
|
gharchive/pull-request
|
Decouple dockerfile backend from gRPC server.
Codecov Report
Merging #7 (0ccf0cf) into develop (a88acaf) will increase coverage by 8.44%.
The diff coverage is 96.77%.
@@ Coverage Diff @@
## develop #7 +/- ##
===========================================
+ Coverage 43.96% 52.41% +8.44%
===========================================
Files 9 7 -2
Lines 257 145 -112
===========================================
- Hits 113 76 -37
+ Misses 132 68 -64
+ Partials 12 1 -11
Flag
Coverage Δ
unittests
52.41% <96.77%> (+8.44%)
:arrow_up:
Flags with carried forward coverage won't be shown. Click here to find out more.
Impacted Files
Coverage Δ
pkg/server/docker/from.go
80.00% <85.71%> (+2.85%)
:arrow_up:
pkg/server/docker/add.go
100.00% <100.00%> (ø)
pkg/server/docker/config.go
100.00% <100.00%> (+64.28%)
:arrow_up:
pkg/server/docker/copy.go
100.00% <100.00%> (ø)
pkg/server/docker/run.go
100.00% <100.00%> (ø)
pkg/server/docker/server.go
100.00% <100.00%> (ø)
pkg/backend/dockerfile/container.go
pkg/backend/dockerfile/store.go
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update a88acaf...0ccf0cf. Read the comment docs.
|
2025-04-01T06:41:09.041920
| 2023-08-11T03:02:48
|
1846129733
|
{
"authors": [
"codecov-commenter",
"noamteyssier"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12475",
"repo": "noamteyssier/bedrs",
"url": "https://github.com/noamteyssier/bedrs/pull/88"
}
|
gharchive/pull-request
|
87 redefine containment
Codecov Report
Merging #88 (74125d6) into main (5deec81) will increase coverage by 0.17%.
The diff coverage is 100.00%.
:exclamation: Your organization is not using the GitHub App Integration. As a result you may experience degraded service beginning May 15th. Please install the Github App Integration for your organization. Read more.
@@ Coverage Diff @@
## main #88 +/- ##
==========================================
+ Coverage 96.72% 96.90% +0.17%
==========================================
Files 23 23
Lines 3909 3937 +28
==========================================
+ Hits 3781 3815 +34
+ Misses 128 122 -6
Files Changed
Coverage Δ
src/traits/interval/overlap.rs
100.00% <100.00%> (ø)
src/traits/interval/subtract.rs
99.31% <100.00%> (+2.36%)
:arrow_up:
|
2025-04-01T06:41:09.067055
| 2024-04-02T10:27:28
|
2220137628
|
{
"authors": [
"recap"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12476",
"repo": "obeliss-nlesc/otree-waiting-room",
"url": "https://github.com/obeliss-nlesc/otree-waiting-room/issues/47"
}
|
gharchive/issue
|
Get variables from oTree server
It is possible to decode the base64 variable encoding. In oTree the base64 is generated as follows
binascii.b2a_base64(pickle.dumps(dict(value))).decode('utf-8')
and loaded as
pickle.loads(binascii.a2b_base64(value.encode('utf-8')))
from database.py
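For reference, the encode/decode pair quoted above round-trips as follows (a minimal sketch using only the standard library; `encode_value`/`decode_value` are illustrative names, not oTree API):

```python
import binascii
import pickle

def encode_value(value):
    # Serialize a dict and wrap it in base64, mirroring the oTree snippet above.
    return binascii.b2a_base64(pickle.dumps(dict(value))).decode("utf-8")

def decode_value(encoded):
    # Reverse the encoding: base64 text -> raw bytes -> unpickled dict.
    return pickle.loads(binascii.a2b_base64(encoded.encode("utf-8")))

payload = {"round": 3, "score": 42.5}
assert decode_value(encode_value(payload)) == payload
```

Note that unpickling untrusted input is unsafe in general; oTree applies this only to values it wrote to its own database.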
|
2025-04-01T06:41:09.082113
| 2022-07-15T20:11:04
|
1306444969
|
{
"authors": [
"builder-247",
"howardchung"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12477",
"repo": "odota/core",
"url": "https://github.com/odota/core/pull/2583"
}
|
gharchive/pull-request
|
Average medal always returns at least 1 star
Wait how are we getting 0 here?
If avgStars mod 5 is less than 0.5
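The rounding behavior under discussion can be illustrated with a small sketch (the function name and exact formula are hypothetical, reconstructed from the comment above rather than taken from the actual odota code):

```python
def stars_from_average(avg_stars):
    # Hypothetical illustration: stars within a medal tier are avg_stars mod 5,
    # rounded to the nearest integer. When the fractional tier value is below
    # 0.5, plain rounding yields 0 stars, so clamp to at least 1.
    raw = round(avg_stars % 5)
    return max(1, raw)

assert stars_from_average(10.2) == 1  # would round to 0 without the clamp
assert stars_from_average(12.7) == 3
```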
|
2025-04-01T06:41:09.088204
| 2022-03-14T07:41:52
|
1168001308
|
{
"authors": [
"coveralls",
"sbchaos",
"sravankorumilli"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12478",
"repo": "odpf/optimus",
"url": "https://github.com/odpf/optimus/pull/216"
}
|
gharchive/pull-request
|
refactor: propagate context to downstream calls
Pull Request Test Coverage Report for Build<PHONE_NUMBER>
7 of 8 (87.5%) changed or added relevant lines in 3 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage remained the same at 72.72%
Changes Missing Coverage
Covered Lines
Changed/Added Lines
%
run/service.go
2
3
66.67%
Totals
Change from base Build<PHONE_NUMBER>:
0.0%
Covered Lines:
5158
Relevant Lines:
7093
💛 - Coveralls
@sbchaos lets resolve the conflicts
|
2025-04-01T06:41:09.090621
| 2022-03-04T16:27:17
|
1159824582
|
{
"authors": [
"lucky-edu",
"rbrisuda"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12479",
"repo": "ohpensource/platform-cicd",
"url": "https://github.com/ohpensource/platform-cicd/pull/74"
}
|
gharchive/pull-request
|
feat: GMP-449 minor fixes related to runner environment
Update the commit messages to reflect the "fix" (not the feature!)
|
2025-04-01T06:41:09.107045
| 2022-09-26T15:08:18
|
1386251741
|
{
"authors": [
"tdelabro"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12480",
"repo": "onlydustxyz/marketplace-backend",
"url": "https://github.com/onlydustxyz/marketplace-backend/pull/252"
}
|
gharchive/pull-request
|
Store events metadata
what is the goal of the origin field? Can't it be included in the metadata json?
The idea behind the origin field is to discriminate a bit between the different metadata.
Events from starknet will have different metadata than events from the API or from another chain.
But we cannot know which type of metadata it will be just based on the aggregate type itself. So this extra field will allow us to programmatically discriminate between this diversity of metadata schema.
It can be included in the json, but I think it's better to have it outside in order to know the target before deserialization.
It's also more high-level/domain-related than block_number or transaction_hash. Like timestamp it is common to all events, regardless of their nature/origin. For those reasons I think it's better to have it flattened.
I think the concept of event metadata should only exist at the infrastructure layer.
StorableEvent is defined in the domain. It contains everything to be stored with an event.
If we want to take metadata out of StorableEvent we change the Store trait
fn append(&self, aggregate_id: &A::Id, events: Vec<StorableEvent<A>>) -> Result<(), Error>;
can become
fn append(&self, aggregate_id: &A::Id, events: Vec<(StorableEvent<A>, json::Value)>) -> Result<(), Error>;
But this trait is still defined in the domain, so it changes nothing regarding your concern.
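The design argument — keeping the discriminator outside the JSON so the target schema is known before deserialization — can be sketched as follows (Python here for brevity; `StoredEvent` and `parse_metadata` are illustrative names, not the actual marketplace-backend types):

```python
import json
from dataclasses import dataclass

@dataclass
class StoredEvent:
    # Hypothetical flattened shape: `origin` lives outside the metadata JSON,
    # so a schema can be chosen before the payload is deserialized.
    origin: str
    metadata_json: str

def parse_metadata(event):
    # Dispatch on the discriminator first, then parse with the matching schema.
    parsers = {
        "starknet": lambda d: {"block_number": d["block_number"]},
        "api": lambda d: {"caller": d["caller"]},
    }
    return parsers[event.origin](json.loads(event.metadata_json))

evt = StoredEvent("starknet", json.dumps({"block_number": 7, "tx": "0xabc"}))
assert parse_metadata(evt) == {"block_number": 7}
```

If `origin` were buried inside the JSON, every consumer would have to parse the payload speculatively before knowing which schema applies.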
|
2025-04-01T06:41:09.110684
| 2023-08-25T05:53:33
|
1866351163
|
{
"authors": [
"L-M-Sherlock",
"asukaminato0721",
"dae"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12481",
"repo": "open-spaced-repetition/fsrs-optimizer-burn",
"url": "https://github.com/open-spaced-repetition/fsrs-optimizer-burn/pull/30"
}
|
gharchive/pull-request
|
use path join
I'm not sure whether it is best practice in Rust. You can ask for a review from @dae.
The rest of the changes here seem reasonable, and it actually wouldn't be a big deal if this were merged without waiting - once Burn gets updated, it would just be a case of removing the .to_str().unwrap() to get non-UTF8 support.
|
2025-04-01T06:41:09.120985
| 2023-01-10T21:11:22
|
1528014047
|
{
"authors": [
"bmtcril"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12482",
"repo": "openedx/openedx-oars",
"url": "https://github.com/openedx/openedx-oars/issues/26"
}
|
gharchive/issue
|
ADR: Analytic Reporting and Display Tech Selection
https://github.com/openedx/openedx-oars/pull/37 is in review
https://github.com/openedx/openedx-oars/blob/main/docs/decisions/0003_superset.rst
|
2025-04-01T06:41:09.131852
| 2022-04-13T00:35:17
|
1202609300
|
{
"authors": [
"serverless-qe"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12483",
"repo": "openshift-knative/eventing-kafka",
"url": "https://github.com/openshift-knative/eventing-kafka/pull/644"
}
|
gharchive/pull-request
|
:robot: Triggering CI on branch 'release-next' after synching to upstream/main
OCF Webhook is merging this PR
|
2025-04-01T06:41:09.157268
| 2024-06-20T14:22:12
|
2364587953
|
{
"authors": [
"as-suvorov"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12485",
"repo": "openvinotoolkit/openvino.genai",
"url": "https://github.com/openvinotoolkit/openvino.genai/pull/538"
}
|
gharchive/pull-request
|
Enable win lib build
@Wovchena , the Win build was fixed prior to this PR. In this PR I enabled causal_lm_cpp and genai_package for win. There are still 3 failing tests in genai_python_lib for win; I propose to fix them in the next PR.
|
2025-04-01T06:41:09.159155
| 2024-01-16T14:11:48
|
2084058874
|
{
"authors": [
"RechieKho"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12486",
"repo": "openxla/iree",
"url": "https://github.com/openxla/iree/pull/16121"
}
|
gharchive/pull-request
|
Add IREE_ENABLE_WERROR_FLAG CMake option.
Sorry, this was an accident. I should have checked which branch I was merging into.
|
2025-04-01T06:41:09.166407
| 2024-08-05T22:36:17
|
2449589305
|
{
"authors": [
"overlookmotel"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12487",
"repo": "oxc-project/oxc",
"url": "https://github.com/oxc-project/oxc/pull/4674"
}
|
gharchive/pull-request
|
refactor(semantic): simplify setting scope flags
#4674 👈
main
This stack of pull requests is managed by Graphite. Learn more about stacking.
Join @overlookmotel and the rest of your teammates on Graphite
|
2025-04-01T06:41:09.178490
| 2022-07-05T12:16:45
|
1294219895
|
{
"authors": [
"codecov-commenter",
"goflutterjava"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12488",
"repo": "paashzj/dev-tools-kotlin",
"url": "https://github.com/paashzj/dev-tools-kotlin/pull/152"
}
|
gharchive/pull-request
|
use ui resp
Codecov Report
Merging #152 (d029326) into main (d154d37) will decrease coverage by 0.23%.
The diff coverage is 0.00%.
:exclamation: Current head d029326 differs from pull request most recent head 027a3ea. Consider uploading reports for the commit 027a3ea to get more accurate results
@@ Coverage Diff @@
## main #152 +/- ##
============================================
- Coverage 22.42% 22.19% -0.24%
Complexity 160 160
============================================
Files 87 89 +2
Lines 1516 1532 +16
Branches 92 92
============================================
Hits 340 340
- Misses 1160 1176 +16
Partials 16 16
Flag
Coverage Δ
unittests
22.19% <0.00%> (-0.24%)
:arrow_down:
Flags with carried forward coverage won't be shown. Click here to find out more.
Impacted Files
Coverage Δ
.../java/com/github/shoothzj/dev/dump/DumpAction.java
0.00% <0.00%> (ø)
...ava/com/github/shoothzj/dev/module/UiListResp.java
0.00% <0.00%> (ø)
...in/java/com/github/shoothzj/dev/module/UiResp.java
0.00% <0.00%> (ø)
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update d154d37...027a3ea. Read the comment docs.
|
2025-04-01T06:41:09.209214
| 2023-04-04T08:42:47
|
1653448536
|
{
"authors": [
"paschendale"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12489",
"repo": "paschendale/webgis-itabirito",
"url": "https://github.com/paschendale/webgis-itabirito/pull/1"
}
|
gharchive/pull-request
|
add dark mode
:tada: This PR is included in version 1.0.0 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket:
|
2025-04-01T06:41:09.215199
| 2023-03-14T12:56:15
|
1623424387
|
{
"authors": [
"abderhim",
"patrickmichalik"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12490",
"repo": "patrykandpatrick/vico",
"url": "https://github.com/patrykandpatrick/vico/issues/211"
}
|
gharchive/issue
|
How to add my own Font style to the axis labels ? it only accepts TypeFace.
Hello! The Typeface class doesn’t restrict you to any particular font family. See, for example, ResourcesCompat.getFont.
Thank you.
You’re welcome!
|
2025-04-01T06:41:09.265396
| 2021-11-10T14:23:19
|
1049893479
|
{
"authors": [
"JNKPercona",
"hors"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12491",
"repo": "percona/percona-postgresql-operator",
"url": "https://github.com/percona/percona-postgresql-operator/pull/153"
}
|
gharchive/pull-request
|
K8SPG-120 keep backups and PVC by default
Test name
Status
init-deploy
failed
demand-backup
failed
data-migration-gcs
passed
scaling
failed
recreate
failed
affinity
failed
monitoring
passed
scheduled-backup
failed
self-healing
passed
operator-self-healing
passed
clone-cluster
passed
tls-check
passed
upgrade
passed
commit: https://github.com/percona/percona-postgresql-operator/pull/153/commits/ff3b63894af3b41281c855eb4958a2a796299051
image: perconalab/percona-postgresql-operator:PR-153-ff3b6389-pgo-apiserver
image: perconalab/percona-postgresql-operator:PR-153-ff3b6389-pgo-event
image: perconalab/percona-postgresql-operator:PR-153-ff3b6389-pgo-rmdata
image: perconalab/percona-postgresql-operator:PR-153-ff3b6389-pgo-scheduler
image: perconalab/percona-postgresql-operator:PR-153-ff3b6389-postgres-operator
image: perconalab/percona-postgresql-operator:PR-153-ff3b6389-pgo-deployer
Test name
Status
demand-backup
failed
init-deploy
failed
data-migration-gcs
passed
scaling
failed
recreate
failed
affinity
failed
monitoring
passed
scheduled-backup
failed
self-healing
passed
operator-self-healing
passed
clone-cluster
passed
tls-check
passed
upgrade
passed
commit: https://github.com/percona/percona-postgresql-operator/pull/153/commits/25fa1fb54d34725542ca71c6bede146600f73f17
image: perconalab/percona-postgresql-operator:PR-153-25fa1fb5-pgo-apiserver
image: perconalab/percona-postgresql-operator:PR-153-25fa1fb5-pgo-event
image: perconalab/percona-postgresql-operator:PR-153-25fa1fb5-pgo-rmdata
image: perconalab/percona-postgresql-operator:PR-153-25fa1fb5-pgo-scheduler
image: perconalab/percona-postgresql-operator:PR-153-25fa1fb5-postgres-operator
image: perconalab/percona-postgresql-operator:PR-153-25fa1fb5-pgo-deployer
Test name
Status
demand-backup
passed
init-deploy
passed
data-migration-gcs
passed
scaling
passed
recreate
passed
affinity
passed
monitoring
passed
scheduled-backup
failed
self-healing
passed
operator-self-healing
passed
clone-cluster
passed
tls-check
failed
upgrade
failed
commit: https://github.com/percona/percona-postgresql-operator/pull/153/commits/83673b77c657e37e737fe46d5d120754d91959c5
image: perconalab/percona-postgresql-operator:PR-153-83673b77-pgo-apiserver
image: perconalab/percona-postgresql-operator:PR-153-83673b77-pgo-event
image: perconalab/percona-postgresql-operator:PR-153-83673b77-pgo-rmdata
image: perconalab/percona-postgresql-operator:PR-153-83673b77-pgo-scheduler
image: perconalab/percona-postgresql-operator:PR-153-83673b77-postgres-operator
image: perconalab/percona-postgresql-operator:PR-153-83673b77-pgo-deployer
Test name
Status
demand-backup
passed
init-deploy
passed
data-migration-gcs
passed
scaling
passed
recreate
passed
affinity
passed
monitoring
passed
scheduled-backup
passed
self-healing
passed
operator-self-healing
passed
clone-cluster
passed
tls-check
passed
upgrade
passed
users
passed
commit: https://github.com/percona/percona-postgresql-operator/pull/153/commits/f36863108ad0ab542d79529e286f9c6491d46715
image: perconalab/percona-postgresql-operator:PR-153-f3686310-pgo-apiserver
image: perconalab/percona-postgresql-operator:PR-153-f3686310-pgo-event
image: perconalab/percona-postgresql-operator:PR-153-f3686310-pgo-rmdata
image: perconalab/percona-postgresql-operator:PR-153-f3686310-pgo-scheduler
image: perconalab/percona-postgresql-operator:PR-153-f3686310-postgres-operator
image: perconalab/percona-postgresql-operator:PR-153-f3686310-pgo-deployer
|
2025-04-01T06:41:09.287788
| 2021-11-01T16:32:30
|
1041363826
|
{
"authors": [
"hamidsamak",
"titonova"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12492",
"repo": "pheditor/pheditor",
"url": "https://github.com/pheditor/pheditor/issues/52"
}
|
gharchive/issue
|
How can we add auto completion?
Hi @titonova
There are some addons for CodeMirror to add auto completion.
Check this: https://codemirror.net/demo/complete.html
|
2025-04-01T06:41:09.295973
| 2024-06-23T19:21:15
|
2368838829
|
{
"authors": [
"easchaefer",
"helgibbons"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12493",
"repo": "pimoroni/badger2040",
"url": "https://github.com/pimoroni/badger2040/issues/84"
}
|
gharchive/issue
|
Can not back out of menu item after selected, deleting
Try pressing A and C together to return to the launcher :)
https://learn.pimoroni.com/article/getting-started-with-badger-2040
Oh, just seen #85 - glad to hear you're sorted.
Closing this duplicate :)
|
2025-04-01T06:41:09.298071
| 2023-01-31T23:31:18
|
1565150997
|
{
"authors": [
"rfleur01",
"robbintt"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12494",
"repo": "pinterest/teletraan",
"url": "https://github.com/pinterest/teletraan/pull/1119"
}
|
gharchive/pull-request
|
Revert host type UI updates to support dropdown search and multiple pages
Can you add in the description which file was not reverted so it is clear to someone that this isn't a complete revert?
|
2025-04-01T06:41:09.317970
| 2022-12-27T08:59:17
|
1511585969
|
{
"authors": [
"jordanlambrecht"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12495",
"repo": "pixelbakery/pixelbakery-website",
"url": "https://github.com/pixelbakery/pixelbakery-website/pull/240"
}
|
gharchive/pull-request
| |
2025-04-01T06:41:09.322661
| 2024-09-04T19:25:43
|
2506115259
|
{
"authors": [
"plengauer"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12496",
"repo": "plengauer/opentelemetry-bash",
"url": "https://github.com/plengauer/opentelemetry-bash/issues/571"
}
|
gharchive/issue
|
Deep injection into Github Renovate Action not working
ubuntu@xxx:~$ \sudo docker run --entrypoint /bin/cat --rm ghcr.io/renovatebot/renovate:latest /usr/local/sbin/renovate
#!/bin/bash
if [[ -f "/usr/local/etc/env" && -z "${CONTAINERBASE_ENV+x}" ]]; then
# shellcheck source=/dev/null
. /usr/local/etc/env
fi
/usr/local/renovate/node --use-openssl-ca "${RENOVATE_NODE_ARGS[@]}" /usr/local/renovate/dist/renovate.js "$@"
we see the span for that file with shebang, theory is we are injecting, (second part of the condition is not empty) and then we do not inject into an absolute path and detect that its a node process.
|
2025-04-01T06:41:09.372871
| 2024-07-26T05:58:53
|
2431483342
|
{
"authors": [
"jan-janssen"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12497",
"repo": "pyiron/atomistics",
"url": "https://github.com/pyiron/atomistics/pull/311"
}
|
gharchive/pull-request
|
Test matgl with python 3.12
The following packages are incompatible
├─ matgl 1.1.2** is installable and it requires
│ ├─ dgl >=2.0.0 with the potential options
│ │ ├─ dgl [2.0.0|2.1.0] would require
│ │ │ └─ python_abi 3.10.* *_cp310, which can be installed;
│ │ ├─ dgl [2.0.0|2.1.0] would require
│ │ │ └─ python_abi 3.11.* *_cp311, which can be installed;
│ │ ├─ dgl [2.0.0|2.1.0] would require
│ │ │ └─ python_abi 3.9.* *_cp39, which can be installed;
│ │ ├─ dgl [2.0.0|2.1.0] would require
│ │ │ └─ pytorch * *cuda* with the potential options
│ │ │ ├─ pytorch [1.11.0|1.12.0|...|2.3.1] would require
│ │ │ │ └─ python_abi 3.10.* *_cp310, which can be installed;
│ │ │ ├─ pytorch [1.10.0|1.10.1|...|2.3.1] would require
│ │ │ │ └─ python_abi 3.8.* *_cp38, which can be installed;
│ │ │ ├─ pytorch [1.10.0|1.10.1|...|2.3.1] would require
│ │ │ │ └─ python_abi 3.9.* *_cp39, which can be installed;
│ │ │ ├─ pytorch [1.11.0|1.12.0|...|2.3.1] would require
│ │ │ │ └─ __cuda, which is missing on the system;
│ │ │ ├─ pytorch [2.0.1|2.3.0] would require
│ │ │ │ └─ __cuda >=11.8 , which is missing on the system;
│ │ │ ├─ pytorch [1.10.0|1.10.1|...|1.9.1] would require
│ │ │ │ └─ python_abi 3.7.* *_cp37m, which can be installed;
│ │ │ ├─ pytorch [1.6.0|1.7.1|1.8.0|1.9.0|1.9.1] would require
│ │ │ │ └─ python_abi 3.6.* *_cp36m, which can be installed;
│ │ │ ├─ pytorch [0.2.0|0.3.0] would require
│ │ │ │ └─ cudatoolkit 7.5* , which does not exist (perhaps a missing channel);
│ │ │ ├─ pytorch [0.2.0|0.3.0|0.3.1] would require
│ │ │ │ └─ cudatoolkit 8.0* , which does not exist (perhaps a missing channel);
│ │ │ ├─ pytorch [1.0.1|1.1.0|...|1.4.0] conflicts with any installable versions previously reported;
│ │ │ └─ pytorch 1.0.1 would require
│ │ │ └─ cudatoolkit >=8.0,<8.1.0a0 , which does not exist (perhaps a missing channel);
│ │ ├─ dgl [2.0.0|2.1.0] would require
│ │ │ └─ pytorch >=2.3.0,<2.4.0a0 with the potential options
│ │ │ ├─ pytorch [1.11.0|1.12.0|...|2.3.1], which can be installed (as previously explained);
│ │ │ ├─ pytorch [1.13.0|1.13.1|...|2.3.1] would require
│ │ │ │ └─ python_abi 3.11.* *_cp311, which can be installed;
│ │ │ ├─ pytorch 2.3.0, which can be installed;
│ │ │ ├─ pytorch [1.10.0|1.10.1|...|2.3.1], which can be installed (as previously explained);
│ │ │ ├─ pytorch [1.10.0|1.10.1|...|2.3.1], which can be installed (as previously explained);
│ │ │ ├─ pytorch [1.11.0|1.12.0|...|2.3.1], which cannot be installed (as previously explained);
│ │ │ ├─ pytorch 2.3.1, which can be installed;
│ │ │ └─ pytorch [2.0.1|2.3.0], which cannot be installed (as previously explained);
│ │ └─ dgl 2.1.0 would require
│ │ └─ pytorch >=2.3.1,<2.4.0a0 with the potential options
│ │ ├─ pytorch [1.11.0|1.12.0|...|2.3.1], which can be installed (as previously explained);
│ │ ├─ pytorch [1.13.0|1.13.1|...|2.3.1], which can be installed (as previously explained);
│ │ ├─ pytorch [1.10.0|1.10.1|...|2.3.1], which can be installed (as previously explained);
│ │ ├─ pytorch [1.10.0|1.10.1|...|2.3.1], which can be installed (as previously explained);
│ │ ├─ pytorch [1.11.0|1.12.0|...|2.3.1], which cannot be installed (as previously explained);
│ │ └─ pytorch 2.3.1, which can be installed;
│ └─ pytorch <=2.2.1 with the potential options
│ ├─ pytorch [1.11.0|1.12.0|...|2.3.1], which can be installed (as previously explained);
│ ├─ pytorch [1.13.0|1.13.1|...|2.3.1], which can be installed (as previously explained);
│ ├─ pytorch [1.10.0|1.10.1|...|2.3.1], which can be installed (as previously explained);
│ ├─ pytorch [1.10.0|1.10.1|...|2.3.1], which can be installed (as previously explained);
│ ├─ pytorch [1.11.0|1.12.0|...|2.3.1], which cannot be installed (as previously explained);
│ ├─ pytorch [2.0.1|2.3.0], which cannot be installed (as previously explained);
│ ├─ pytorch [0.4.0|0.4.1|...|2.2.0] conflicts with any installable versions previously reported;
│ ├─ pytorch [1.10.0|1.10.1|...|1.9.1], which can be installed (as previously explained);
│ ├─ pytorch [1.6.0|1.7.1|1.8.0|1.9.0|1.9.1], which can be installed (as previously explained);
│ ├─ pytorch [0.2.0|0.3.0], which cannot be installed (as previously explained);
│ ├─ pytorch [0.2.0|0.3.0|0.3.1], which cannot be installed (as previously explained);
│ ├─ pytorch [0.3.1|0.4.0] would require
│ │ └─ cudatoolkit 8.0.* , which does not exist (perhaps a missing channel);
│ ├─ pytorch [1.0.1|1.1.0|...|1.4.0] conflicts with any installable versions previously reported;
│ └─ pytorch 1.0.1, which cannot be installed (as previously explained);
├─ python 3.12** is not installable because there are no viable options
│ ├─ python [3.12.0|3.12.1|3.12.2|3.12.3|3.12.4] would require
│ │ └─ python_abi 3.12.* *_cp312, which conflicts with any installable versions previously reported;
│ ├─ python 3.12.0rc3 would require
│ │ └─ _python_rc, which does not exist (perhaps a missing channel);
│ └─ python [3.12.0|3.12.1|3.12.2|3.12.3|3.12.4] conflicts with any installable versions previously reported;
└─ scipy 1.14.0** is installable with the potential options
├─ scipy 1.14.0 would require
│ └─ python_abi 3.10.* *_cp310, which can be installed;
├─ scipy 1.14.0 would require
│ └─ python_abi 3.11.* *_cp311, which can be installed;
└─ scipy 1.14.0 would require
└─ python_abi 3.12.* *_cp312, which conflicts with any installable versions previously reported.
|
2025-04-01T06:41:09.383764
| 2024-04-05T00:49:52
|
2226759655
|
{
"authors": [
"codecov-commenter",
"quambene"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12498",
"repo": "quambene/pigeon-rs",
"url": "https://github.com/quambene/pigeon-rs/pull/9"
}
|
gharchive/pull-request
|
Fix write image
Codecov Report
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 78.14%. Comparing base (611fe53) to head (3632fe8).
Additional details and impacted files
@@ Coverage Diff @@
## main #9 +/- ##
==========================================
+ Coverage 75.73% 78.14% +2.41%
==========================================
Files 27 27
Lines 2089 2082 -7
==========================================
+ Hits 1582 1627 +45
+ Misses 507 455 -52
:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.
|
2025-04-01T06:41:09.397428
| 2024-02-13T18:46:18
|
2132943381
|
{
"authors": [
"raghavN13"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12499",
"repo": "raghavN13/kdu-coursework",
"url": "https://github.com/raghavN13/kdu-coursework/pull/28"
}
|
gharchive/pull-request
|
added homework-6
Metric
Value
alert_status
:white_check_mark: OK
bugs
0
code_smells
0
reliability_rating
A (0 Bugs)Note: A being best, E being worst
security_rating
A (0 Vulnerabilities)Note: A being best, E being worst
vulnerabilities
0
|
2025-04-01T06:41:09.399371
| 2021-12-14T09:22:19
|
1079508243
|
{
"authors": [
"Itxaka",
"mudler"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12500",
"repo": "rancher-sandbox/cOS-toolkit",
"url": "https://github.com/rancher-sandbox/cOS-toolkit/issues/964"
}
|
gharchive/issue
|
Unit test coverage - cover the core features
We have a 73,78% coverage of the main base. Mainly all the main stuff is covered by tests, the only thing mostly missing is a piece of the disk.go and mostly erroring out.
|
2025-04-01T06:41:09.400137
| 2020-03-29T16:50:16
|
589833876
|
{
"authors": [
"ibuildthecloud",
"rmweir"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12501",
"repo": "rancher/channelserver",
"url": "https://github.com/rancher/channelserver/pull/5"
}
|
gharchive/pull-request
|
Don't include prerelease tags
@ibuildthecloud actually, the way the PR is written doesn't even have an effect on the releases, just the channels API which rancher doesn't use.
|
2025-04-01T06:41:09.405990
| 2023-06-02T14:32:35
|
1738316762
|
{
"authors": [
"fedfontana",
"rcastellotti"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12502",
"repo": "rcastellotti/tinyben",
"url": "https://github.com/rcastellotti/tinyben/issues/10"
}
|
gharchive/issue
|
log what is happening (pre start run start post start)
we are logging in the base class, we need to implement a way to have meaningful log positions
should also log module name and function name (%(module) and %(funcName) are valid logging formatter templates, but they do not work: with the current setup they always print base and run)
|
2025-04-01T06:41:09.408755
| 2022-06-13T18:59:52
|
1269830369
|
{
"authors": [
"cletustboone",
"jlacivita",
"kpears201"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12503",
"repo": "rdkcentral/firebolt-openrpc",
"url": "https://github.com/rdkcentral/firebolt-openrpc/pull/41"
}
|
gharchive/pull-request
|
Wire up the correlationId to pull APIs
This is not backwards compatible and will break XClass
Worked with @kpears201 to name the variables in a way that only breaks unimplemented use cases.
:tada: This PR is included in version 1.4.1-next.1 :tada:
The release is available on:
npm package (@next dist-tag)
GitHub release
Your semantic-release bot :package::rocket:
|
2025-04-01T06:41:09.434491
| 2024-05-22T11:52:54
|
2310337066
|
{
"authors": [
"kdubois"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12504",
"repo": "redhat-scholars/containers-tutorial",
"url": "https://github.com/redhat-scholars/containers-tutorial/pull/25"
}
|
gharchive/pull-request
|
Change docker commands to podman
@cedricclyburn do you want to review and merge if it looks good?
|
2025-04-01T06:41:09.440588
| 2024-02-08T00:36:37
|
2124153259
|
{
"authors": [
"rekola"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12505",
"repo": "rekola/radix-cpp",
"url": "https://github.com/rekola/radix-cpp/issues/9"
}
|
gharchive/issue
|
Add non-linear probing
This was slower than linear. However, we could try to make the hash function worse and then use non-linear probing.
|
2025-04-01T06:41:09.449365
| 2024-02-22T04:31:29
|
2148169544
|
{
"authors": [
"cartermp",
"mattt"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12506",
"repo": "replicate/replicate-go",
"url": "https://github.com/replicate/replicate-go/pull/44"
}
|
gharchive/pull-request
|
Create polling example
Hey @cartermp. Thanks for opening this PR. We actually have a built-in Wait method that takes care of this for you, as well as a Run convenience method for running a model and waiting for its output. I just opened #45 with an example of how to use that.
There's a lot more we can do to document this library in the README and Go docs, and I'm excited to invest more in that.
Ah, I see, thanks! I think additional examples would help the most, since I immediately went with the readme, then looked at the REST docs and saw about a need to poll: https://replicate.com/docs/reference/http#predictions.create
To get the final result of the prediction you should either provide a webhook HTTPS URL for us to call when the results are ready, or poll the get a prediction endpoint until it has finished.
|
2025-04-01T06:41:09.452540
| 2015-04-21T15:04:10
|
69870891
|
{
"authors": [
"jviotti"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12507",
"repo": "resin-io/resin-request",
"url": "https://github.com/resin-io/resin-request/pull/5"
}
|
gharchive/pull-request
|
See node-request state information with DEBUG=true
Ignore hound, will send the proper Hound CI configuration in another PR.
|
2025-04-01T06:41:09.482702
| 2024-07-22T04:25:40
|
2421888847
|
{
"authors": [
"NathanFlurry"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12508",
"repo": "rivet-gg/plugin-godot",
"url": "https://github.com/rivet-gg/plugin-godot/pull/163"
}
|
gharchive/pull-request
|
chore: impl extended backend configs
[!WARNING]
This pull request is not mergeable via GitHub because a downstack PR is open. Once all requirements are satisfied, merge this PR as a stack on Graphite.
Learn more
#164
#163 👈
#162
#161
#160
#159
#158
#157
#156
#155
#154
#153
#152
#151
#150
#149
#148
#147
#146
#145
#144
#143
#142
#141
#140
#139
#138
#137
#136
main
This stack of pull requests is managed by Graphite. Learn more about stacking.
Join @NathanFlurry and the rest of your teammates on Graphite
|
2025-04-01T06:41:09.500213
| 2023-06-27T19:11:09
|
1777597578
|
{
"authors": [
"rohany"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12509",
"repo": "rohany/legate.core",
"url": "https://github.com/rohany/legate.core/pull/5"
}
|
gharchive/pull-request
|
Kernel fusion branch 23.07
This still needs a fix from wonchan (or i can add it) that caches array types in the core.
|
2025-04-01T06:41:09.507416
| 2024-02-20T15:40:16
|
2144672379
|
{
"authors": [
"logan-markewich",
"nerdai"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12510",
"repo": "run-llama/llama_index",
"url": "https://github.com/run-llama/llama_index/pull/11032"
}
|
gharchive/pull-request
|
fixes, so many fixes [circular import bonanza]
so many fixes is right lol - 100 files changed ❤️
@nerdai yea most of these were in integrations 🤔 The other issue was somehow circular imports got introduced (not sure how), so I moved the BaseLLM class and generic_utils
Ah okay. So odd though as I do remember string search and replace on those integrations. Perhaps I'm wrong tho. 🙏🙏🙏
|
2025-04-01T06:41:09.513820
| 2023-04-17T09:18:40
|
1670758548
|
{
"authors": [
"bors",
"oli-obk"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12511",
"repo": "rust-lang/miri",
"url": "https://github.com/rust-lang/miri/pull/2845"
}
|
gharchive/pull-request
|
Rustup
@bors r+
:pushpin: Commit 393159aceb600aad528a4cd9e741d184d8e7bbb8 has been approved by oli-obk
It is now in the queue for this repository.
:hourglass: Testing commit 393159aceb600aad528a4cd9e741d184d8e7bbb8 with merge 41ca2861418274f16dfef86718b14c3bca942896...
:sunny: Test successful - checks-actions
Approved by: oli-obk
Pushing 41ca2861418274f16dfef86718b14c3bca942896 to master...
|
2025-04-01T06:41:09.527811
| 2023-03-13T05:15:50
|
1620770372
|
{
"authors": [
"coveralls",
"sagnikgh1899"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12512",
"repo": "sagnikgh1899/FraudDetection",
"url": "https://github.com/sagnikgh1899/FraudDetection/pull/126"
}
|
gharchive/pull-request
|
Update Component Specification.md
Pull Request Test Coverage Report for Build<PHONE_NUMBER>
0 of 0 changed or added relevant lines in 0 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage remained the same at 100.0%
Totals
Change from base Build<PHONE_NUMBER>:
0.0%
Covered Lines:
78
Relevant Lines:
78
💛 - Coveralls
|
2025-04-01T06:41:09.546242
| 2022-04-24T15:36:28
|
1213687480
|
{
"authors": [
"codecov-commenter",
"sanders41"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12513",
"repo": "sanders41/meilisearch-python-async",
"url": "https://github.com/sanders41/meilisearch-python-async/pull/243"
}
|
gharchive/pull-request
|
Add additional pre-commit hooks
Codecov Report
Merging #243 (0e07f55) into main (33ef15a) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## main #243 +/- ##
=========================================
Coverage 100.00% 100.00%
=========================================
Files 14 14
Lines 837 829 -8
=========================================
- Hits 837 829 -8
Impacted Files
Coverage Δ
meilisearch_python_async/index.py
100.00% <0.00%> (ø)
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 33ef15a...0e07f55. Read the comment docs.
|
2025-04-01T06:41:09.548609
| 2024-12-24T09:49:43
|
2757514601
|
{
"authors": [
"sargunv"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12514",
"repo": "sargunv/maplibre-compose",
"url": "https://github.com/sargunv/maplibre-compose/pull/175"
}
|
gharchive/pull-request
|
disable broken property and fix some docs
#175 👈 (View in Graphite)
main
This stack of pull requests is managed by Graphite. Learn more about stacking.
|
2025-04-01T06:41:09.564695
| 2022-11-25T04:33:36
|
1464025216
|
{
"authors": [
"kunxian-xia",
"roynalnaruto"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12515",
"repo": "scroll-tech/zkevm-circuits",
"url": "https://github.com/scroll-tech/zkevm-circuits/pull/225"
}
|
gharchive/pull-request
|
RLP Circuit and RLP Table
The constraints for "Nonce", "GasPrice", "Gas", "To", "Value" are very similar. It will simplify the codes greatly if we can use one set of constraints to handle all these tags. A possible solution is
Add a lookup argument in which for each tag we can look up the expected next tag in the next_tag column.
tag
next_tag
Nonce
GasPrice
GasPrice
Gas
Gas
To
To
Value
@lispc This PR is ready to be mock tested for mainnet blocks.
|
2025-04-01T06:41:09.570297
| 2022-11-20T23:17:24
|
1457084948
|
{
"authors": [
"seanbreckenridge"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12516",
"repo": "seanbreckenridge/ttally",
"url": "https://github.com/seanbreckenridge/ttally/issues/5"
}
|
gharchive/issue
|
cache export command by using modification times of files
[ ~/Repos/ttally | master ] $ hyperfine 'ttally export food' 'python3 -m ttally export food'
Benchmark 1: ttally export food
Time (mean ± σ): 428.8 ms ± 30.3 ms [User: 398.5 ms, System: 30.2 ms]
Range (min … max): 401.7 ms … 480.1 ms 10 runs
Benchmark 2: python3 -m ttally export food
Time (mean ± σ): 154.8 ms ± 10.3 ms [User: 129.3 ms, System: 23.9 ms]
Range (min … max): 140.5 ms … 169.9 ms 17 runs
Summary
'python3 -m ttally export food' ran
2.77 ± 0.27 times faster than 'ttally export food'
[ ~/Repos/ttally | master ] $ hyperfine 'ttally recent food' 'python3 -m ttally recent food'
Benchmark 1: ttally recent food
Time (mean ± σ): 244.5 ms ± 12.6 ms [User: 219.7 ms, System: 24.5 ms]
Range (min … max): 223.3 ms … 259.3 ms 11 runs
Benchmark 2: python3 -m ttally recent food
Time (mean ± σ): 228.3 ms ± 10.9 ms [User: 202.7 ms, System: 23.8 ms]
Range (min … max): 213.8 ms … 240.6 ms 12 runs
Summary
'python3 -m ttally recent food' ran
1.07 ± 0.08 times faster than 'ttally recent food'
added in f1efa02f99864d86c0773c51a307e49bb66ba82d
|
2025-04-01T06:41:09.571056
| 2024-07-03T00:46:24
|
2387404430
|
{
"authors": [
"seanno"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12517",
"repo": "seanno/vdj",
"url": "https://github.com/seanno/vdj/issues/12"
}
|
gharchive/issue
|
import samples from agate
this is working now --- may have bugs but they should get their own issues!
|
2025-04-01T06:41:09.579745
| 2022-09-13T01:40:03
|
1370725943
|
{
"authors": [
"codecov-commenter",
"neutralino1"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12518",
"repo": "sematic-ai/sematic",
"url": "https://github.com/sematic-ai/sematic/pull/134"
}
|
gharchive/pull-request
|
Fix dictionary visualization bug
Codecov Report
Base: 87.43% // Head: 87.43% // No change to project coverage :thumbsup:
Coverage data is based on head (07357fe) compared to base (e7f07d9).
Consider uploading reports for the commit 46c42b2 to get more accurate results
Additional details and impacted files
@@ Coverage Diff @@
## main #134 +/- ##
=======================================
Coverage 87.43% 87.43%
=======================================
Files 129 129
Lines 9226 9226
=======================================
Hits 8067 8067
Misses 1159 1159
Help us with your feedback. Take ten seconds to tell us how you rate us. Have a feature suggestion? Share it here.
:umbrella: View full report at Codecov.
:loudspeaker: Do you have feedback about the report comment? Let us know in this issue.
|
2025-04-01T06:41:09.609612
| 2022-05-02T17:52:26
|
1223167537
|
{
"authors": [
"andyw8",
"schonert"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12519",
"repo": "setup-rails/setup-rails",
"url": "https://github.com/setup-rails/setup-rails/issues/8"
}
|
gharchive/issue
|
Add Redis support
Example: https://github.com/actions/example-services/blob/main/.github/workflows/redis-service.yml
Hey again! I'll most likely be migrating our current moonshine setup over to something like this within the month.
So, sadly, I can't test it yet. Setting up a Redis outside is no biggie - so if passing the connection through is possible there, there will be no need to choke everyone's pipelines for a nice-to-have 😊
|
2025-04-01T06:41:09.612076
| 2024-03-20T21:09:16
|
2198573354
|
{
"authors": [
"sharkov63"
],
"license": "Unlicense",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12520",
"repo": "sharkov63/dotfiles",
"url": "https://github.com/sharkov63/dotfiles/issues/3"
}
|
gharchive/issue
|
rofi: automatically switch to EN keyboard layout
Completed by https://github.com/sharkov63/dotfiles/commit/088859a4d0b2c181d6144785bc5f004f7e77ef3e
|
2025-04-01T06:41:09.613675
| 2024-11-25T11:43:48
|
2690459291
|
{
"authors": [
"shashank-mishra-appdev"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12521",
"repo": "shashank-mishra-appdev/semantic-version",
"url": "https://github.com/shashank-mishra-appdev/semantic-version/pull/16"
}
|
gharchive/pull-request
|
feat: Used dynamic value for version
:tada: This PR is included in version 1.14.0 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket:
|
2025-04-01T06:41:09.625749
| 2021-12-07T00:43:16
|
1072783436
|
{
"authors": [
"coveralls",
"sunny4381"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12522",
"repo": "shirasagi/shirasagi",
"url": "https://github.com/shirasagi/shirasagi/pull/4284"
}
|
gharchive/pull-request
|
[modify] update gem 'rdoc'
Coverage decreased (-0.03%) to 84.898% when pulling 0b64b534313eb7d2225ae9cdb1698e59b4d43776 on pr-update-rdoc into ab547b2edca39ac2569ff394d5bd4085062af9ec on master.
|
2025-04-01T06:41:09.640746
| 2023-09-30T17:34:26
|
1920342261
|
{
"authors": [
"EmpiricEmpire",
"cakekindel",
"f-f",
"sigma-andex"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12523",
"repo": "sigma-andex/purescript-httpurple",
"url": "https://github.com/sigma-andex/purescript-httpurple/pull/16"
}
|
gharchive/pull-request
|
feat: generalize ResponseM to any MonadAff
@cakekindel what's the use case for this?
I noticed that Response (and maybe some other structures) are generic over any MonadAff and am running HTTPurple in a custom pipe-based MTL, and figured this would improve the UX a bit.
See CustomStack for the concrete value-add.
I removed ResponseM because it's not super useful anymore but this change does not have to be breaking since it can easily maintain the previous API while just adding the general server' function I wrote.
Note that there has also been an open issue in httpure for several years if you want a little more context: #134
thx @cakekindel , currently on vacation. Will have a look at it next week.
Will have a look at it next week.
Which one, is it the week of September 16 or the one of October 7th?
For sure!
On Wed, Sep 11, 2024, 10:39 Fabrizio Ferrai @.***>
wrote:
@.**** approved this pull request.
I think this is a good change - CI is currently failing and the logs are
not available anymore, @cakekindel https://github.com/cakekindel could
you have a look at this?
I don't know how the tests ever passed - 2 main issues:
1. mockResponse didn't implement Stream#write correctly, so the tests that tried to write to those objects always timed out. Replaced the record with a MockResponse class extending http.OutgoingMessage
2. all tests that ran serve had race conditions where requests were issued before the server was actually running. For some, I used a serveAwaitReady function that resolves an Aff with {onStarted :: Maybe (Effect Unit)}. For others that already specified that, I just did the naive thing of spinning until a request can connect
just noticed my git & gpg emails mismatched, force push was just changing the committer email
@cakekindel awesome! thanks!
@f-f I don't know what the current process is to publish a new version with the new registry.
@sigma-andex if you cut a new tag it should be picked up by the registry, but switching out the spago.dhall with a spago.yaml should be good for future proofing
The registry currently has a cronjob to pick up all legacy packages once a day, but using spago publish will publish it immediately
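As an aside, the "spinning until a request can connect" approach mentioned in this thread can be sketched as a small readiness poll. This is an illustrative sketch only, not code from the PR; the helper name `waitForReady` and the timeout/interval defaults are assumptions:

```javascript
// Poll a probe function until it succeeds or a deadline passes.
// Avoids racing test requests against a server that is still starting up.
async function waitForReady(probe, { timeoutMs = 5000, intervalMs = 50 } = {}) {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    try {
      // probe should reject while the server is not yet accepting
      // connections (e.g. an attempted TCP connect or HTTP GET).
      await probe();
      return;
    } catch (err) {
      if (Date.now() >= deadline) {
        throw new Error(`server not ready after ${timeoutMs}ms: ${err}`);
      }
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
}
```

The `serveAwaitReady` variant described above sidesteps the polling entirely by having the server signal readiness via a callback, which is the more robust choice when the server API exposes one.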
|
2025-04-01T06:41:09.661169
| 2024-08-06T11:40:31
|
2450670228
|
{
"authors": [
"piotmag769",
"wawel37"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:12524",
"repo": "software-mansion/scarb",
"url": "https://github.com/software-mansion/scarb/issues/1508"
}
|
gharchive/issue
|
Create Scarb documentation for scarb doc
@wawel37 make sure to tag with scarb-doc before creating the issue; this way it won't be assigned to the Scarb project, only to scarb-doc (as @maciektr requested before)
|