| added (string; 2025-04-01 04:05:38 – 2025-04-01 07:14:06) | created (timestamp[us]; 2001-10-09 16:19:16 – 2025-01-01 03:51:31) | id (string; length 4–10) | metadata (dict) | source (string; 2 classes) | text (string; length 0–1.61M) |
|---|---|---|---|---|---|
2025-04-01T04:10:19.204630
| 2015-11-06T11:58:35
|
115492430
|
{
"authors": [
"ckross01",
"in-in",
"ronag",
"xHN35RQ"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13926",
"repo": "DanBrooker/file-icons",
"url": "https://github.com/DanBrooker/file-icons/issues/227"
}
|
gharchive/issue
|
No colours
After the recent update colours stopped working. I've tried uninstalling, restarting atom, reinstalling and finally restarting atom without success. Also tried unchecking and rechecking the "coloured" setting.
@ronag do your settings look the same?
I have no colours either. atom 1.2.2, file-icons 1.6.12, settings identical to the above
Going to close this issue; if enabling the colored setting doesn't fix it, please reopen.
Thanks!
|
2025-04-01T04:10:19.233045
| 2022-03-07T11:34:20
|
1161299689
|
{
"authors": [
"Daniel-Itzul",
"DoDzilla-ai"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13927",
"repo": "DanielMoralisSamples/25_NFT_MARKET_PLACE",
"url": "https://github.com/DanielMoralisSamples/25_NFT_MARKET_PLACE/issues/4"
}
|
gharchive/issue
|
market_place.sol doesn't support royalties (EIP2981)
The title explains everything. I've deployed market_place.sol and an ERC721 contract on Remix. I minted, listed, and sold an NFT with an EIP2981-compliant NFT contract, but the royalty recipient didn't receive anything and the seller received everything. I think this is needed since EIP2981 is an industry standard at the moment. An ERC721 NFT contract (with EIP2981 support) can be found here: https://github.com/dievardump/EIP2981-implementation/blob/main/contracts/mocks/ERC721WithRoyalties.sol
Hi DoDzilla-ai, this contract is merely for learning purposes, not for specific use cases. But I might cover that in future tutorials.
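For reference, the royalty math EIP-2981 describes boils down to a basis-points split of the sale price. A minimal sketch (Python, illustrative only — a real marketplace would call the NFT contract's royaltyInfo(tokenId, salePrice) on-chain and pay the returned receiver):

```python
def royalty_info(sale_price: int, royalty_bps: int) -> tuple[int, int]:
    """Split a sale into (royalty, seller_proceeds).

    royalty_bps is the royalty fee in basis points (500 = 5%), mirroring
    the fraction an EIP-2981 royaltyInfo() implementation typically returns.
    """
    royalty = sale_price * royalty_bps // 10_000
    return royalty, sale_price - royalty

# A 5% royalty on a 1 ETH (in wei) sale
royalty, seller_proceeds = royalty_info(10**18, 500)
```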
|
2025-04-01T04:10:19.265739
| 2024-02-26T07:03:40
|
2153405003
|
{
"authors": [
"DannyVanpoucke",
"lsorber"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13928",
"repo": "DannyVanpoucke/LSSVMlib",
"url": "https://github.com/DannyVanpoucke/LSSVMlib/issues/7"
}
|
gharchive/issue
|
Neo LS-SVM
Hi there, since this is one of the very few LS-SVM implementations, I thought you might be interested to know that I've recently released a new LS-SVM package called Neo LS-SVM. The idea is that LS-SVMs have a lot of untapped potential, but that they could use a little more love ❤️ now that most of the attention is going to deep learning and gradient-boosted decision trees.
Thank you for the info. Feel free to link to our repository from Neo-LS-SVM. Best regards, Danny.
|
2025-04-01T04:10:19.269031
| 2021-12-20T09:20:41
|
1084549481
|
{
"authors": [
"Dantevg",
"coco0325"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13929",
"repo": "Dantevg/WebStats",
"url": "https://github.com/Dantevg/WebStats/issues/14"
}
|
gharchive/issue
|
MySQL Auto Reconnect
The plugin should reconnect automatically after the connection is closed.
https://pastebin.com/eN1LkPM8
Thanks for opening an issue again! I believe I already fixed this in the not-yet-released version 1.6.
I attached the latest development version which contains the fix: WebStats-1.5.1-dev3.zip
https://pastebin.com/rV04vJh8
The issue still exists.
I tried something else, hopefully that fixed it. Latest development version attached again: WebStats-1.5.1-dev4.zip
(note: not properly tested, it might still not work or there might be other bugs)
I assume it is fixed now, closing. Please re-open and comment if the bug is still there :)
|
2025-04-01T04:10:19.322249
| 2020-05-12T06:44:05
|
616403844
|
{
"authors": [
"AnkitPathak41",
"DarqueWarrior",
"SebastianSchuetze"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13930",
"repo": "DarqueWarrior/vsteam",
"url": "https://github.com/DarqueWarrior/vsteam/issues/307"
}
|
gharchive/issue
|
Group type (aad) not handled yet for Add-VSTeamGitRepositoryPermission
Please add support for the aad group type in the Add-VSTeamGitRepositoryPermission cmdlet. It is able to add VSTS groups; however, it does not work with AAD groups.
Below is the output of the command when running the cmdlet with the aad group type.
Steps to reproduce
$OwnerPermission = @()
$OwnerPermission += "CreateBranch"
Add-VSTeamGitRepositoryPermission -Project $ProjectName -RepositoryId $repoID -BranchName "master" -Group $groups -Allow $OwnerPermission -Deny ManagePermissions
Expected behavior
Actual behavior
Repository: https://github.com/DarqueWarrior/vsteam/issues
Environment data
OS
[ ] Windows
Server
[ ] Azure DevOps Service
> Get-VSTeamAPIVersion
Name Value
---- -----
Release 5.1-preview
Version VSTS
MemberEntitlementManagement 5.1-preview
TaskGroups 5.1-preview.1
DistributedTask 5.0-preview
Core 5.0
Packaging 5.1-preview
ServiceFabricEndpoint 5.0-preview
VariableGroups 5.0-preview.1
Build 5.1-preview
Git 5.1-preview
ExtensionsManagement 5.1-preview
Graph 5.1-preview
Tfvc 5.0
> 5.1.18362.752
Thank you for reporting. We're going to investigate this.
Is this an issue or a feature request?
It is a feature request, as repository permissions only support VSTS-type groups, not Azure AD groups.
Is this an issue or a feature request?
Looking at the docs this looks more like a request for the Azure DevOps team than this module. Once the API provides it we can offer it.
I am not really understanding that API with the permission. Where can you see that this must be added via API and cannot / should not be added via the module?
When I understand it, I would open a feature request and close this issue. Or we fix it in the module.
|
2025-04-01T04:10:19.488151
| 2018-09-19T23:56:09
|
361969148
|
{
"authors": [
"coverbeck"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13931",
"repo": "DataBiosphere/cgp-dashboard",
"url": "https://github.com/DataBiosphere/cgp-dashboard/pull/69"
}
|
gharchive/pull-request
|
Upgrade nginx version
After changing the version, Postgres failed to install during the build
of the Docker image. The error made me realize we don't need Postgres,
so I removed that as well. These changes are all in Dockerfile.
Then to be safe, removed a bunch of Python/HTML code that is no longer
being used, and that has had dependencies on Postgres, e.g., invoicing
service that stored/read data.
Running deployment at https://charles.ucsc-cgp-dev.org
The old code is in Git history, so if I went too far, we could always restore it.
We showed the removal of the menu choices at sprint review, and neither Brian nor Cricket complained (in fact, they asked me to remove vestiges of the old site at the sprint review before last, which is why I changed the home page content and the menu), so I think it's safe to remove the code that those removed menu choices were invoking without double-checking with them.
|
2025-04-01T04:10:19.489959
| 2023-07-25T15:49:06
|
1820635069
|
{
"authors": [
"dillydally414",
"nawatts"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13932",
"repo": "DataBiosphere/terra-ui",
"url": "https://github.com/DataBiosphere/terra-ui/pull/4074"
}
|
gharchive/pull-request
|
Update SubmissionConfig test to follow conventions
This updates the SubmissionConfig unit test to follow conventions. Details in inline comments.
@nawatts Some of these changes will be addressed in https://github.com/DataBiosphere/terra-ui/pull/4063 - I think the only remaining one is fixing the imports, which I could handle in that PR as well if it's easier
Excellent! I'll close this PR then.
|
2025-04-01T04:10:19.501345
| 2022-10-06T14:02:42
|
1399698842
|
{
"authors": [
"HadhemiDD",
"pducolin"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13933",
"repo": "DataDog/datadog-agent",
"url": "https://github.com/DataDog/datadog-agent/pull/13779"
}
|
gharchive/pull-request
|
[WIP] : Use datadog agent SNMP configuration as options for the integrated snmpwalk
What does this PR do?
The goal behind this PR is to parse the customer SNMP configuration from the agent files (under snmp.d and eventually from the datadog.yaml) then use that snmp configuration to create an integrated snmpwalk command within the agent that only needs the IP address to run.
The new snmpwalk will use the IP address provided to search through the snmp configuration already provided within the agent, then run a classic snmpwalk command combining the IP address and the snmp options that were found in the agent.
Motivation
When investigating snmp issues, we have to blindly trust the customer is using the same SNMP configuration within the agent as the ones passed to the snmpwalk command.
Unfortunately, in many cases the customer makes typos or confuses different options, which leaves us with a lot of back and forth just to fix simple configurations.
So parsing the agent snmp configuration and passing it as options to the snmpwalk integrated command will help us validate the customer configuration in a timely manner.
Additional Notes
Possible Drawbacks / Trade-offs
Describe how to test/QA your changes
Reviewer's Checklist
[ ] If known, an appropriate milestone has been selected; otherwise the Triage milestone is set.
[ ] Use the major_change label if your change has a major impact on the code base, impacts multiple teams, or changes important well-established internals of the Agent. This label will be used during QA to make sure each team pays extra attention to the changed behavior. For any customer-facing change, use a release note.
[ ] A release note has been added or the changelog/no-changelog label has been applied.
[ ] Changed code has automated tests for its functionality.
[ ] Adequate QA/testing plan information is provided if the qa/skip-qa label is not applied.
[ ] At least one team/.. label has been applied, indicating the team(s) that should QA this change.
[ ] If applicable, docs team has been notified or an issue has been opened on the documentation repo.
[ ] If applicable, the need-change/operator and need-change/helm labels have been applied.
[ ] If applicable, the k8s/<min-version> label, indicating the lowest Kubernetes version compatible with this feature.
[ ] If applicable, the config template has been updated.
Great starting point! What I would add here is a function that takes an IP address as input and returns the configuration, if any, or nil. This could later be used by the snmpwalk command to try loading a configuration.
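The helper the reviewer describes could look roughly like this (Python sketch; the field names are assumptions, not the agent's actual config schema):

```python
def config_for_ip(ip_address, instances):
    """Return the SNMP instance config matching ip_address, or None.

    Hypothetical lookup the integrated snmpwalk command could use to load
    a device's configuration from the parsed snmp.d instances.
    """
    for instance in instances:
        if instance.get("ip_address") == ip_address:
            return instance
    return None
```

The snmpwalk command would then merge the returned options (community string, version, timeout, ...) with the IP address before running.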
|
2025-04-01T04:10:19.504920
| 2024-06-04T19:25:40
|
2334221294
|
{
"authors": [
"ken-schneider"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13934",
"repo": "DataDog/datadog-agent",
"url": "https://github.com/DataDog/datadog-agent/pull/26348"
}
|
gharchive/pull-request
|
[SNMP] Ping instance config should override init_config settings
What does this PR do?
Fixes behavior where init_config for ping would override any instance-level config set for a device. It should be the other way around; e.g., if ping is generally enabled at the init_config level but an instance has ping disabled, that instance should have ping disabled. Before this change, it would still be enabled.
Motivation
Noticed the behavior during some local testing. While this behavior is technically valid, it's at odds with the way other configs inside the SNMP integration work today.
Additional Notes
Possible Drawbacks / Trade-offs
Describe how to test/QA your changes
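The precedence rule the PR describes — instance-level config wins over the init_config default — can be sketched as follows (Python, illustrative; the key names are assumptions, not the integration's exact schema):

```python
def ping_enabled(init_config, instance):
    """Instance-level ping setting takes precedence over the init_config default."""
    default = init_config.get("ping", {}).get("enabled", False)
    return instance.get("ping", {}).get("enabled", default)

# ping enabled globally, but disabled for this one device
assert ping_enabled({"ping": {"enabled": True}}, {"ping": {"enabled": False}}) is False
# no instance-level override: the init_config default applies
assert ping_enabled({"ping": {"enabled": True}}, {}) is True
```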
/merge
/merge
/merge
/merge
|
2025-04-01T04:10:19.509474
| 2024-08-20T15:33:57
|
2475952273
|
{
"authors": [
"DylanLovesCoffee"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13935",
"repo": "DataDog/datadog-agent",
"url": "https://github.com/DataDog/datadog-agent/pull/28601"
}
|
gharchive/pull-request
|
[SVLS-5268] Disable flush ticker for AWS Lambda
What does this PR do?
Adds back what was removed in https://github.com/DataDog/datadog-agent/pull/27722 but gates it to only AWS Lambda so that this doesn't affect serverless-init (GCP + Azure).
Motivation
We've been seeing a very infrequent panic in the Lambda Extension: panic: sync: WaitGroup is reused before previous Wait has returned. The panic appeared when we first introduced using WaitGroups to sync a flush trigger to the logs payload request to intake.
The current theory is that the flush ticker will sometimes call WaitGroup.Add() after WaitGroup.Wait() is called, triggering the panic. A flush ticker is not necessary in Lambda, where we'll always attempt to flush logs at some point during an invocation.
Additional Notes
Will need to also test this in self-monitoring for serverless-init to ensure it does not cause a regression.
Possible Drawbacks / Trade-offs
Describe how to test/QA your changes
Changes are a re-implementation, but can be tested again. As the panic is very difficult to reproduce, I deployed the Extension to dogfood apps and monitored if a panic occurs over several hours/days.
/merge
|
2025-04-01T04:10:19.526169
| 2022-05-11T15:34:40
|
1232844952
|
{
"authors": [
"fuzzybinary"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13936",
"repo": "DataDog/dd-sdk-flutter",
"url": "https://github.com/DataDog/dd-sdk-flutter/pull/119"
}
|
gharchive/pull-request
|
✨ Allow setting a sample rate for tracing
What and why?
This adds support for setting a trace sampling rate for both datadog_tracking_http_client and datadog_grpc_interceptor. A sample rate sets the probability that a given network request will include a sampling priority of '1' (meaning it should be sampled) and send trace and span IDs.
There is also a minor change to send a sampling priority of '0' on first-party calls when the network request does not pass the sample test.
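The sampling decision described above amounts to a clamped probability check. A language-agnostic sketch in Python (illustrative only — this is not the SDK's implementation):

```python
import random

def clamp(rate: float) -> float:
    """Clamp a sample rate into [0.0, 1.0]."""
    return max(0.0, min(1.0, rate))

def should_sample(sample_rate: float) -> bool:
    """True with probability sample_rate: the request gets priority '1'
    and sends trace/span IDs; otherwise priority '0' is sent."""
    return random.random() < clamp(sample_rate)
```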
Review checklist
[x] This pull request has appropriate unit and / or integration tests
[x] This pull request references a Github or JIRA issue
I've renamed sessionRate and added tests that check for it being clamped, as well as properly testing that the tracing sampling rate is properly clamped.
|
2025-04-01T04:10:19.985568
| 2024-10-11T15:12:41
|
2581619448
|
{
"authors": [
"HadhemiDD",
"sergiiplevako-hnb"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13937",
"repo": "DataDog/integrations-core",
"url": "https://github.com/DataDog/integrations-core/issues/18820"
}
|
gharchive/issue
|
Request for release plan for PR #17957 (librdkafka SSL path option)
Hello DataDog team,
We are DataDog enterprise customers, and we rely heavily on your monitoring services. We are currently using Datadog version 3.62.1 and are trying to upgrade to version 3.74.1. However, we have encountered an issue with the librdkafka SSL path configuration that was addressed in PR #17957, which was merged but hasn't been released yet.
Could you please provide information on when this PR will be included in an upcoming release? Any guidance on your release plan would be greatly appreciated, as it is currently blocking our upgrade.
Thanks for your time and assistance.
This has been released with agent 7.56
|
2025-04-01T04:10:20.032853
| 2021-12-15T01:01:21
|
1080446278
|
{
"authors": [
"justiniso",
"kei6u"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13938",
"repo": "DataDog/integrations-core",
"url": "https://github.com/DataDog/integrations-core/pull/10861"
}
|
gharchive/pull-request
|
Use % for MySQL integration setup document
What does this PR do?
Replace localhost with %.
Motivation
Describe what happened:
I've tried Agent based MySQL integration according to this setup guide.
I could not create the connection between the Agent and MySQL when I used 'datadog'@'localhost' or <EMAIL_ADDRESS>. Therefore, I suggest using 'datadog'@'%' to prevent the connection error.
Describe what you expected:
No one faces the connection error when reading the setup guide.
Steps to reproduce the issue:
Execute docker compose up after preparing the following files.
docker-compose.yaml
version: "3.8"
services:
agent:
image: gcr.io/datadoghq/agent:7
environment:
DD_API_KEY: "" # TODO: set your api key
volumes:
- ./conf.yaml:/conf.d/mysql.d/conf.yaml
- /sys/fs/cgroup/:/host/sys/fs/cgroup:ro
- /proc/:/host/proc/:ro
- /var/run/docker.sock:/var/run/docker.sock:ro
networks:
- default
depends_on:
- mysql
mysql:
image: bitnami/mysql
ports:
- 3306:3306
restart: always
environment:
ALLOW_EMPTY_PASSWORD: yes
volumes:
- ./init.sql:/docker-entrypoint-initdb.d/init.sql
networks:
- default
conf.yaml
init_config:
instances:
- server: mysql
user: datadog
pass: "password"
port: "3306"
options:
replication: false
galera_cluster: true
extra_status_metrics: true
extra_innodb_metrics: true
extra_performance_metrics: true
schema_size_metrics: false
disable_innodb_metrics: false
init.sql
use localhost (This is going to fail)
CREATE USER 'datadog'@'localhost' IDENTIFIED BY 'password';
GRANT REPLICATION CLIENT ON *.* TO 'datadog'@'localhost';
ALTER USER 'datadog'@'localhost'
WITH MAX_USER_CONNECTIONS 5;
GRANT PROCESS ON * . * TO 'datadog'@'localhost';
use <IP_ADDRESS> according to MySQL Localhost Error - Localhost VS <IP_ADDRESS> (This is going to fail)
CREATE USER <EMAIL_ADDRESS> IDENTIFIED BY 'password';
GRANT REPLICATION CLIENT ON *.* TO <EMAIL_ADDRESS>;
ALTER USER <EMAIL_ADDRESS> WITH MAX_USER_CONNECTIONS 5;
GRANT PROCESS ON * . * TO <EMAIL_ADDRESS>;
use % (This is going to succeed)
CREATE USER 'datadog'@'%' IDENTIFIED BY 'password';
GRANT REPLICATION CLIENT ON *.* TO 'datadog'@'%';
ALTER USER 'datadog'@'%'
WITH MAX_USER_CONNECTIONS 5;
GRANT PROCESS ON * . * TO 'datadog'@'%';
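The reason the % form succeeds comes down to MySQL's account host matching: 'datadog'@'localhost' only matches connections originating on the database host itself, while '%' is a wildcard for any host — and the Agent container connects over the network. A rough Python sketch of the matching behavior (simplified; real MySQL host matching also supports patterns and IP masks):

```python
def account_matches(account_host: str, client_host: str) -> bool:
    """Very simplified MySQL account-host matching: '%' matches any host."""
    return account_host == "%" or account_host == client_host

# The Agent container connects from its own address, not from "localhost"
# as seen by the MySQL server, so the 'localhost' account never matches:
assert not account_matches("localhost", "172.18.0.3")
assert account_matches("%", "172.18.0.3")
```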
Additional Notes
Review checklist (to be filled by reviewers)
[ ] Feature or bugfix MUST have appropriate tests (unit, integration, e2e)
[ ] PR title must be written as a CHANGELOG entry (see why)
[ ] Files changes must correspond to the primary purpose of the PR as described in the title (small unrelated changes should have their own PR)
[ ] PR must have changelog/ and integration/ labels attached
@ruthnaebeck @DataDog/agent-integrations @DataDog/database-monitoring
I appreciate your review.
Could you merge this PR since I've got the approval?
Or should I fix something?
Thanks for the assistance @kei6u!
|
2025-04-01T04:10:20.039112
| 2020-06-26T15:57:51
|
646358325
|
{
"authors": [
"ChristineTChen",
"soitbros"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13939",
"repo": "DataDog/integrations-core",
"url": "https://github.com/DataDog/integrations-core/pull/7001"
}
|
gharchive/pull-request
|
Conflicting Docs
It looks like on https://app.datadoghq.com/monitors#create/process we state that you use conf.d/process.yaml, and now I'm not sure which one is correct. I assume this one is, but that means we need to update the app documentation to reflect this correctly.
What does this PR do?
Motivation
Additional Notes
Review checklist (to be filled by reviewers)
[ ] Feature or bugfix MUST have appropriate tests (unit, integration, e2e)
[ ] PR title must be written as a CHANGELOG entry (see why)
[ ] Files changes must correspond to the primary purpose of the PR as described in the title (small unrelated changes should have their own PR)
[ ] PR must have changelog/ and integration/ labels attached
Hi @soitbros, can you share a screenshot of what you're seeing and expecting?
Hi Christine,
Yep, please see here on the app:
The documentation on https://app.datadoghq.com/monitors#create/process conflicts with the documentation on this page. I believe only one of them is the correct one.
Hi @soitbros, I see the discrepancy now. The format conf.d/process.yaml is a legacy one, whereas integrations are now in the format conf.d/process.d/conf.yaml. I believe the monitors documentation would need to be updated.
Closing this PR as the docs fix is not in the scope of this repo. Thanks for bringing this to our attention!
|
2025-04-01T04:10:20.047046
| 2024-01-17T14:27:38
|
2086311748
|
{
"authors": [
"bouwkast",
"cbeauchesne"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13940",
"repo": "DataDog/system-tests",
"url": "https://github.com/DataDog/system-tests/pull/2031"
}
|
gharchive/pull-request
|
Enable the new array encoding for .NET Tracer for OpenTelemetry tags
Motivation
The .NET Tracer is moving toward the newer array encoding for OpenTelemetry spans, so this skips the legacy test and enables the newer one for supported tracers.
Changes
Marked legacy test as irrelevant
Unskipped new test for versions of the tracer that have the feature.
Workflow
⚠️ Create your PR as draft ⚠️
Work on your PR until the CI passes (if something not related to your task is failing, you can ignore it)
Mark it as ready for review
Test logic is modified? -> Get a review from RFC owner. We're working on refining the codeowners file quickly.
The framework is modified, or there is non-obvious usage of it -> get a review from the R&P team
:rocket: Once your PR is reviewed, you can merge it!
🛟 #apm-shared-testing 🛟
Reviewer checklist
[ ] Relevant labels (run-parametric-scenario, run-profiling-scenario...) are present
[ ] No system-tests internal is modified. Otherwise, I have the approval from R&P team
[ ] CI is green, or failing jobs are not related to this change (and you are 100% sure about this statement)
[ ] A docker base image is modified?
[ ] the relevant build-XXX-image label is present
[ ] To R&P team: locally build and push the image to hub.docker.com
[ ] A scenario is added (or removed)?
[ ] Get a review from R&P team
[ ] Once merged, add (or remove) it in the system-test-dashboard nightly
Hi @bouwkast , do you still plan to work on this ?
Hi @bouwkast , do you still plan to work on this ?
Looks like it has been done in a different PR, closing.
|
2025-04-01T04:10:20.078128
| 2024-01-22T16:07:17
|
2094249720
|
{
"authors": [
"Samfrancisartuza",
"iannesbitt"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13945",
"repo": "DataONEorg/mnlite",
"url": "https://github.com/DataONEorg/mnlite/issues/59"
}
|
gharchive/issue
|
Debug error system
Some coding are not found in program
@Samfrancisartuza I appreciate the report. Can you be more specific? It seems like you are saying that there is not sufficient test coverage for mnlite, which is true. If so, I can work on expanding the tests to cover more of the code base.
Closing as stale.
|
2025-04-01T04:10:20.080547
| 2024-11-21T16:49:37
|
2680188076
|
{
"authors": [
"NithinKumaraNT",
"jrosskopf"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13946",
"repo": "DataZooDE/flapi",
"url": "https://github.com/DataZooDE/flapi/issues/2"
}
|
gharchive/issue
|
Not able to run the latest docker image
The new docker build has an error at runtime :
/app/flapi: error while loading shared libraries: libubsan.so.1: cannot open shared object file: No such file or directory
To reproduce the error :
docker run -it ghcr.io/datazoode/flapi:7bed7cfe7881bdc81a5a2051f6727d2b2c658b95 bash
Hey, unfortunately there still seems to be a glitch with the GH Actions build:
latest != 7bed7cfe7881bdc81a5a2051f6727d2b2c658b95
So can you please use the image https://github.com/DataZooDE/flapi/pkgs/container/flapi/306431664?tag=latest for the moment? There I have fixed the missing GCC ubsan library already.
Works now
|
2025-04-01T04:10:20.082913
| 2020-10-09T03:45:15
|
717829078
|
{
"authors": [
"etagwerker",
"ngan"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13947",
"repo": "DatabaseCleaner/database_cleaner",
"url": "https://github.com/DatabaseCleaner/database_cleaner/issues/665"
}
|
gharchive/issue
|
Ruby 2.7 deprecations
We're using Ruby 2.7 and are getting warnings from this gem:
database_cleaner-core-2.0.0.beta2/lib/database_cleaner/cleaners.rb:41: warning: Using the last argument as keyword parameters is deprecated; maybe ** should be added to the call
database_cleaner-core-2.0.0.beta2/lib/database_cleaner/cleaner.rb:28: warning: The called method `initialize' is defined here
I'd be more than happy to put up a PR...is there a minimum version of Ruby that this database_cleaner v2 plans to support?
@ngan That would be great. We are currently maintaining Ruby 2.5 and higher: https://github.com/DatabaseCleaner/database_cleaner/blob/master/.travis.yml
@etagwerker here ya go: https://github.com/DatabaseCleaner/database_cleaner/pull/667
|
2025-04-01T04:10:20.091826
| 2021-06-08T16:41:15
|
915243406
|
{
"authors": [
"DavHau",
"PierreR"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13948",
"repo": "DavHau/mach-nix",
"url": "https://github.com/DavHau/mach-nix/issues/284"
}
|
gharchive/issue
|
Document the usage of fetchPypiSdist
Hi,
It is not clear when I should use fetchPypiSdist over buildPythonPackage. If the Python package is on PyPI and I don't need much customization, should I always prefer fetchPypiSdist?
I would love some guidance on that point.
Cheers,
Hey, fetchPypiSdist is not an alternative to buildPythonPackage; rather, it can be used to fetch the src which is used inside buildPythonPackage. I will review the docs in the coming days and see how to make this clearer.
|
2025-04-01T04:10:20.094098
| 2020-08-13T06:59:00
|
678202522
|
{
"authors": [
"DavHau",
"InLaw"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13949",
"repo": "DavHau/mach-nix",
"url": "https://github.com/DavHau/mach-nix/issues/75"
}
|
gharchive/issue
|
python3.7-pyspark-3.0.0/bin/spark-shell: line 60: /bin/spark-submit: No such file or directory
Installation of the package
pyspark
error:
pyspark
File "/nix/store/3qi9pcs9vqv0y9x5c4zirizx2vygazrc-python3.7-pyspark-3.0.0/bin/find_spark_home.py", line 2
export PATH='/nix/store/r94aa2gj4drkhfvkm2p4ab6cblb6kxlq-python3-3.7.6/bin:/nix/store/3qi9pcs9vqv0y9x5c4zirizx2vygazrc-python3.7-pyspark-3.0.0/bin'${PATH:+':'}$PATH
^
SyntaxError: invalid syntax
/nix/store/3qi9pcs9vqv0y9x5c4zirizx2vygazrc-python3.7-pyspark-3.0.0/bin/pyspark: line 24: /bin/load-spark-env.sh: No such file or directory
/nix/store/3qi9pcs9vqv0y9x5c4zirizx2vygazrc-python3.7-pyspark-3.0.0/bin/pyspark: line 68: /bin/spark-submit: No such file or directory
and spark-shell
File "/nix/store/3qi9pcs9vqv0y9x5c4zirizx2vygazrc-python3.7-pyspark-3.0.0/bin/find_spark_home.py", line 2
export PATH='/nix/store/r94aa2gj4drkhfvkm2p4ab6cblb6kxlq-python3-3.7.6/bin:/nix/store/3qi9pcs9vqv0y9x5c4zirizx2vygazrc-python3.7-pyspark-3.0.0/bin'${PATH:+':'}$PATH
^
SyntaxError: invalid syntax
/nix/store/3qi9pcs9vqv0y9x5c4zirizx2vygazrc-python3.7-pyspark-3.0.0/bin/spark-shell: line 60: /bin/spark-submit: No such file or directory
The package exists in nixpkgs and has the same problem there.
The fix should be made in nixpkgs. Could you please report the issue there?
done
|
2025-04-01T04:10:20.105943
| 2022-06-17T12:03:44
|
1274948529
|
{
"authors": [
"vasilissoti"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13950",
"repo": "DaveMcEwan/svdata",
"url": "https://github.com/DaveMcEwan/svdata/pull/36"
}
|
gharchive/pull-request
|
Expression resolver
Currently the resolver doesn't support function calls (including data objects), but only self-contained explicit declarations on the RHS of the ConstantExpression. At the moment only arithmetic operations located within binary operations are supported for both signedness and datatype.
Capability to derive the final signedness and datatype has been added.
Placing the PR in progress in order to change the literal integer datatype in logic, as well as to ensure that there is adequate differentiation in the analysis of the following literals:
Unbased unsized literal, e.g. '1
Unsized based literal, e.g. 'b101
Simple decimal, e.g. -3
Sized based literal, e.g. 3'b101
This is crucial prior to proceeding with adding range as an entry
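The four literal categories above could be told apart with a classifier along these lines (Python sketch, hypothetical — not part of svdata, and the regexes are simplified relative to the full SystemVerilog grammar):

```python
import re

def classify_literal(lit: str) -> str:
    """Classify a SystemVerilog integer literal into one of four categories."""
    if re.fullmatch(r"'[01xXzZ]", lit):
        return "unbased unsized"          # e.g. '1
    if re.fullmatch(r"'[sS]?[bodhBODH][0-9a-fA-FxXzZ_]+", lit):
        return "unsized based"            # e.g. 'b101
    if re.fullmatch(r"-?\d+", lit):
        return "simple decimal"           # e.g. -3
    if re.fullmatch(r"\d+'[sS]?[bodhBODH][0-9a-fA-FxXzZ_]+", lit):
        return "sized based"              # e.g. 3'b101
    return "other"
```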
The new commit should ensure an accurate signedness and datatype for any of the above categories. Test cases will be made in order to verify that rigorously.
The only pending task for now:
Update the current implementations for strings within resolver_datatype and resolver_signedness in order to compensate for a possible misuse of a string e.g. operations on one or more string operands that lead to a non string type.
The new string implementation will be added in a future PR.
2 - Bugfixes to be made.
Fixed
I had missed a required bugfix, now it is ready to be merged.
|
2025-04-01T04:10:20.115691
| 2022-07-22T10:07:40
|
1314808497
|
{
"authors": [
"DavidAnson",
"filiphagan"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13951",
"repo": "DavidAnson/markdownlint-cli2-action",
"url": "https://github.com/DavidAnson/markdownlint-cli2-action/issues/10"
}
|
gharchive/issue
|
Command not working as expected
After adding command: config to my workflow, markdownlint-cli2 doesn't find any Markdown files in the directory.
Current setup in ci.yml:
- name: Lint all Markdown files
uses: DavidAnson/markdownlint-cli2-action@0820d56e6c53c3cc344ec2ef9c09fc74041aeb1c
with:
command: config
globs: |
"./.github/workflows/config/.markdownlint.yml"
"**/*.md"
Current output:
> Run DavidAnson/markdownlint-cli2-action@0820d56e6c53c3cc344ec2ef9c09fc74041aeb1c
markdownlint-cli2-config v0.4.0 (markdownlint v0.25.1)
Finding: "**/*.md"
Linting: 0 file(s)
Summary: 0 error(s)
Expected output:
> Run DavidAnson/markdownlint-cli2-action@0820d56e6c53c3cc344ec2ef9c09fc74041aeb1c
markdownlint-cli2-config v0.4.0 (markdownlint v0.25.1)
Finding: "**/*.md"
Linting: 25 file(s)
Summary: 0 error(s)
Previous setup returning expected output:
- name: Copy markdown config file to root
run: |
cp ./.github/workflows/config/.markdownlint.yml .markdownlint.yml
- name: Lint all Markdown files
uses: DavidAnson/markdownlint-cli2-action@v5
with:
globs: "**/*.md"
You need to use a release that includes a commit I made yesterday to add this support - and no such release exists yet. I normally use a "next" branch to prevent GitHub from closing issues prematurely, but don't on this project. I expect to publish a new release soon.
In the description above I added that commit SHA (uses: DavidAnson/markdownlint-cli2-action@0820d56e6c53c3cc344ec2ef9c09fc74041aeb1c)
Sorry, I didn't notice that!
Since you saw my commit, you also saw that I added testing for this and it works as expected there. From what I can tell reviewing your notes, you've done everything reasonably. However, my suspicion is that you are running into a YAML quirk - though you didn't include the one piece of output that would tell for sure.
In the previous scenario, you had a single glob expression which you provided on one line in quotes. I think that YAML removes those quotes so that the glob expression passed to the tool is correct. However, in the new scenario, you need to pass multiple glob expressions and use the pipe syntax. In this case, we can see that YAML is not removing the quote characters because they show up in the output. Your files do not have a quote character, so that glob does not match anything. I think that if you remove the quotes in the new scenario this will work as expected.
Please give that a try and let me know! Thank you.
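Based on the explanation above, a hedged sketch of the workflow step with the quote characters removed (same commit SHA and paths as in the report; this is an illustration, not a verified fix):

```yaml
- name: Lint all Markdown files
  uses: DavidAnson/markdownlint-cli2-action@0820d56e6c53c3cc344ec2ef9c09fc74041aeb1c
  with:
    command: config
    globs: |
      ./.github/workflows/config/.markdownlint.yml
      **/*.md
```

With the pipe (`|`) block scalar, each line is passed through literally, so any quote characters become part of the glob itself.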
David, thank you for your support. You were right, it was caused by these quotes, my bad. Thank you for developing this feature
|
2025-04-01T04:10:20.118971
| 2022-05-02T17:51:34
|
1223166785
|
{
"authors": [
"alexec",
"greyscaled",
"thisislawatts",
"wences-dc-uba-ar"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13952",
"repo": "DavidAnson/markdownlint",
"url": "https://github.com/DavidAnson/markdownlint/issues/521"
}
|
gharchive/issue
|
Headings must be title-case
I've a lot of documentation where the heading are a mixture of the following:
Sentence-case
Title-case
Mixture (e.g. "About Settings and Config")
Like genders in German, it is impossible to codify simple rules for "mixture".
Instead, I want all heading to be title-case. Problem solved.
This sounds great @alexec, exactly what I was looking for in a tool to lint Markdown.
Awesome. I’ll leave this issue open as I think it should be a core rule.
I've been working on one that allows configuring both sentence case and title case over here: https://github.com/greyscaled/markdownlint-rule-title-case-style
Still in an early stage, but actively being refined.
In case this is ever implemented, please don't make it enabled by default.
Thanks.
|
2025-04-01T04:10:20.132413
| 2024-10-28T19:12:00
|
2619313905
|
{
"authors": [
"DavidT3",
"astrophysics-megan"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13953",
"repo": "DavidT3/XGA",
"url": "https://github.com/DavidT3/XGA/issues/1232"
}
|
gharchive/issue
|
Started prototyping spectral map measurements for clusters (and other extended objects)
There is no branch yet, but I have started prototyping spectral property measurements in two spatial dimensions, which will be included in XGA to produce things like temperature, metallicity, and entropy maps.
Started off with simple square pixels, as considerably easier to mess around with than abstract polygons. The SAS selection language can easily be used for this.
I think in the final implementation, when there will probably be an option for square pixels, I may not generalise them to polygons, because I suspect that would be considerably slower than just using the box region.
When it gets to the property map classes though, they should be generalised as polygons.
My test case is Abell 3667, which has a very interesting thermal structure, and has had temperature maps measured before. I will include updates (and the prototyping notebook) in this issue when possible.
Could Jeremy’s region definition program be adapted to Xmm?
First pass didn't go well:
Was far too ambitious in terms of number of pixel regions - also the grid is meant to be evenly spaced in RA and DEC and for some reason there is overlap in DEC (significant) in the boxes specified as the 'pixels'. I almost certainly screwed up something wrt projection of a spherical coordinate system, because I always do.
I appear to have knitted a scarf
Not my greatest plot, but it is a start:
And one with coords:
So the prototype does function somewhat, not that I was expecting the spectrum generation and XSPEC fitting to be the hard parts here.
My main worry right now is the seemingly messed up spacing in RA:
(these are meant to be square)
Probably me screwing up some projection thing as I mentioned earlier - but does mean that in the RA direction this map's pixels overlap significantly, so that above image should not be trusted.
The other caveats of this prototype map is that I'm not sure the background is actually working for these fits, and I am not accounting for the PSF effects (i.e. cross-arf, that is gonna be a fun one for something like this).
I also reduced the detmap resolution used to weight ARFs to make it run faster, but I'm not feeling too bad for that, as the small region size of each spectrum likely means the ARF doesn't have much spatial variation anyway.
I also had the fear earlier that the sky coordinate systems for XMM detectors across different observations/instruments might not have the same rotation. So 'pixels' with coverage from multiple observations/instruments might not have spectra with the same region.
As another indication of the RA woes, here are some debug images for the spectra generated (for the same ObsID-inst in this case, for demonstrative purposes) - these images represent the exact spatial selection used for the spectra. When I set the cross-hairs for all of them to match the same RA-DEC coords, we can see that the dec binning is working perfectly:
But the RA alignment is way off:
In the RA direction each pixel will be overlapping by almost half...
But then an equally spaced grid overlaid by DS9 looks like this:
HOW DO RA-DEC MAPPING TO OTHER COORDINATE SYSTEMS ALWAYS MANAGE TO CONFUSE ME THIS MUCH?!
|
2025-04-01T04:10:20.154080
| 2022-03-28T12:29:25
|
1183368137
|
{
"authors": [
"joeldavidw",
"nichellekoh"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13954",
"repo": "DeFiCh/jellyfish",
"url": "https://github.com/DeFiCh/jellyfish/pull/1275"
}
|
gharchive/pull-request
|
feat(status-api): main app setup
What this PR does / why we need it:
Which issue(s) does this PR fixes?:
Part of #1270
Additional comments?:
@nichellekoh I recommend you create another PR with the above suggested change
@fuxingloh A separate PR has been created with the changes https://github.com/DeFiCh/jellyfish/pull/1292
|
2025-04-01T04:10:20.173464
| 2017-01-10T16:43:54
|
199872443
|
{
"authors": [
"Dead2",
"silviucpp",
"ztlpn"
],
"license": "Zlib",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13955",
"repo": "Dead2/zlib-ng",
"url": "https://github.com/Dead2/zlib-ng/issues/81"
}
|
gharchive/issue
|
Invalid gzip stream produced for compression level = 1
Hi. I've noticed that the algorithm for compression level 1 sometimes produces invalid gzip stream. I've created a fork containing the test case: https://github.com/ztlpn/zlib-ng. It consists of a file with input data and tweaked buffer sizes in minigzip.c.
Steps to reproduce:
./configure && make
cat test.in | ./minigzip -c -1 | gzip -d > /dev/null
gzip then fails with the following message: gzip: stdin: invalid compressed data--format violated
The output of ./configure:
Checking for shared library support...
Building shared library libz.so.1.2.8.zlib-ng with gcc-5.
Checking for off64_t... Yes.
Checking for fseeko... Yes.
Checking for strerror... Yes.
Checking for unistd.h... Yes.
Checking for stdarg.h... Yes.
Checking for ANSI C compliant compiler... Yes.
Checking for attribute(visibility(hidden)) support... Yes.
Checking for attribute(visibility(internal)) support... Yes.
Checking for __builtin_ctzl ... Yes.
Checking for SSE2 intrinsics ... Yes.
Checking for PCLMULQDQ intrinsics ... Yes.
ARCH: x86_64
Using arch directory: arch/x86
@Dead2 Have you been able to reproduce?
I have the same problem in my app.
I have unfortunately been swamped with work lately, although I am slowly making headway with the backlog. I can honestly say that not a day goes by without me thinking about zlib-ng.
That said, I suggest you check out the code from Intel's fork of zlib, since that is where the deflate-quick (level 1 of zlib-ng) comes from. We have made only very minor changes to that part of the code, partially because it is quite a bit harder to understand and modify than the stock zlib code. I also see that they have made 3 commits that likely have not been imported to zlib-ng yet, so if one of those fixes this problem then I would be very interested in knowing so.
And if the problem still exists in intel's fork, then opening an issue there with a test-case such as the one you made for zlib-ng is probably the most likely to result in a fix, and they will probably appreciate having a reproducible test-case such as you have made. They have had several reports of such problems, but often without good test-cases, making it nearly impossible to find the bug(s), and this might be one of the bugs they have not managed to nail down yet.
I am sorry I can not be of more help right now, but thank you for taking the time to contribute your problem report.
Hello,
I tested now my use case with intel fork and seems fine. I can't replicate my problem there.
Silviu
Ok, thanks for the reply! I will try to investigate some more. Maybe I will be able to zero in on the root cause just by comparing patches even without understanding much about zlib internals.
As for now, I think it would be advisable to switch deflate-quick off by default as the simplest fix.
I'd also suggest testing the patches in https://github.com/jtkukunas/zlib/pull/14
If those fix the problem (would be great to know which of the commits as well), then I'd be inclined to pull those even though Intel has not had time to review them yet.
Just to clarify, remember to test without their last commit "temporarily remove deflate_quick from default config". That commit disables deflate_quick completely and would invalidate any testing :)
Ok, I've tested with the Intel fork. The problem is present there, introduced in https://github.com/jtkukunas/zlib/commit/d948170e11e4eaa68cce189e5ea10a4d41e01437 (the commit introducing deflate_quick strategy).
The patches in jtkukunas/zlib#14 seem to help.
I have merged those (and a lot of other backlogged commits) to the hacknslash4 branch, would you please test whether the problem is fixed now?
More of the backlogged upstream zlib commits will be merged soon hopefully.
hacknslash5 has been merged into the develop branch, closing
|
2025-04-01T04:10:20.187065
| 2022-10-04T00:17:19
|
1395495575
|
{
"authors": [
"Deadpikle"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13956",
"repo": "Deadpikle/pathfinder-bible-experience-prep-quiz-master",
"url": "https://github.com/Deadpikle/pathfinder-bible-experience-prep-quiz-master/issues/96"
}
|
gharchive/issue
|
Add Flag Reason when flagging questions
It may be helpful for whoever is reviewing flagged questions to have a comment identifying what may have been wrong. If we don’t want a free form text field, maybe a dropdown could be offered indicating most common problems? Possible Dropdown Values: Confusing Wording, Incorrect Points, Spelling/Typos, Incorrect Verse Reference, Other
Closed in 4be51bc6b57d107abc6a486b68d5e798db6b729f
|
2025-04-01T04:10:20.192287
| 2017-09-30T08:38:32
|
261829238
|
{
"authors": [
"DeanThompson",
"sunzhy",
"zplzpl"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13957",
"repo": "DeanThompson/ginpprof",
"url": "https://github.com/DeanThompson/ginpprof/issues/3"
}
|
gharchive/issue
|
Access /debug/pprof/ will trigger the download
Accessing /debug/pprof/ triggers a file download. Why does it happen like that?
Can not re-produce.
When debugging, you cannot use compression middleware such as router.Use(gzip.Gzip(gzip.DefaultCompression)).
|
2025-04-01T04:10:20.195019
| 2022-08-02T14:09:30
|
1325918266
|
{
"authors": [
"Sabrinavigil",
"lauthieb"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13958",
"repo": "Decathlon/vitamin-design",
"url": "https://github.com/Decathlon/vitamin-design/issues/36"
}
|
gharchive/issue
|
[Card] Remove badge for "more items"
Duplicates ❌
[X] I have searched the existing issues
Which Figma library is concerned?
Not related to one Figma library
Current behavior 😯
There is a badge as counter for "more items"
Expected behavior 🤔
Context 🔦
No response
Screenshots / Videos 📸
No response
@Sabrinavigil, can you please reupload your screenshot? It seems broken.
Thanks
Done ✅
|
2025-04-01T04:10:20.199513
| 2022-01-06T15:49:20
|
1095446049
|
{
"authors": [
"GaspardMathon",
"lauthieb"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13959",
"repo": "Decathlon/vitamin-web",
"url": "https://github.com/Decathlon/vitamin-web/pull/865"
}
|
gharchive/pull-request
|
refactor(@vtmn/css-rating): improve accessibility with keyboard + screen reader
Pull Request checklist
Description
The previous version of the rating component wasn't intuitive/accessible with keyboard + screen reader.
The component has been reworked to be accessible
Does this introduce a breaking change?
No (no class-name changes)
Some aria props should be removed
The class vtmn-rating--disabled can be removed and replaced by aria-disabled="true"
❤️ Thank you!
@GaspardMathon We also have a problem with :hover state:
It works with values upper than the current value but not with before
as we still have issues in this PR (discussed together), I convert again to draft.
@GaspardMathon We also have a problem with :hover state: It works with values upper than the current value but not with before.
Resolved in last commit
|
2025-04-01T04:10:20.201662
| 2024-11-20T15:42:30
|
2676328485
|
{
"authors": [
"CrasCris",
"nkinnaird",
"petrsmid"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13960",
"repo": "Deci-AI/super-gradients",
"url": "https://github.com/Deci-AI/super-gradients/issues/2067"
}
|
gharchive/issue
|
Licensing for commercial use - how to contact you?
💡 Your Question
Hello,
There are multiple LICENSE files in your repository and it is unclear to me which license applies. Two files state that if we want to use your code commercially we should contact you. However, your web page does not exist anymore and we cannot find any way how to contact you. Could you please advise us?
Thank you,
Petr
Versions
No response
It may be on standby because NVIDIA bought Deci-AI.
That's definitely the reason. I hope that Nvidia fully open sources this project, but not sure, so good to leave this question up in case anyone ever comes back to answer it.
|
2025-04-01T04:10:20.206234
| 2023-02-14T14:27:49
|
1584264350
|
{
"authors": [
"rnllv"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13961",
"repo": "DeepBlueCLtd/LegacyMan",
"url": "https://github.com/DeepBlueCLtd/LegacyMan/pull/109"
}
|
gharchive/pull-request
|
101 parse non standard countries
@IanMayo, this is the base patch to start processing classes in non-standard countries. The actual class structure is the same as for standard countries, so we'll need to refactor it; I'll take care of that as part of issue #106.
Also, I'll take up additional parsing as part of #107 and #108 once this PR is reviewed and merged.
Parser tests for classes of non-standard countries would be covered as part of #108
|
2025-04-01T04:10:20.207932
| 2021-07-06T22:35:05
|
938311031
|
{
"authors": [
"braceal"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13962",
"repo": "DeepDriveMD/DeepDriveMD-pipeline",
"url": "https://github.com/DeepDriveMD/DeepDriveMD-pipeline/issues/37"
}
|
gharchive/issue
|
DeepDriveMD API Discussion
Space for discussion related to the DeepDriveMD API.
For detailed discussion on the current code status see #38
Features we would like to support:
Different streaming technologies can be substituted by linking with a different module/library
Streaming or non-streaming approaches can be used by linking with a different module/library
Streaming is merged to main.
|
2025-04-01T04:10:20.414279
| 2023-03-15T11:04:28
|
1625296430
|
{
"authors": [
"priscavdsluis"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13963",
"repo": "Deltares/HYDROLIB-core",
"url": "https://github.com/Deltares/HYDROLIB-core/pull/468"
}
|
gharchive/pull-request
|
#450: Update MeshKernel to 2.0.2
This version contains Linux support
Closed PR and branch. Will continue in different branch.
|
2025-04-01T04:10:20.421876
| 2023-11-21T15:57:11
|
2035822849
|
{
"authors": [
"Manangka"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13964",
"repo": "Deltares/imod-python",
"url": "https://github.com/Deltares/imod-python/issues/658"
}
|
gharchive/issue
|
Tie up loose ends in partitioning story
In GitLab by @luitjansl on Nov 21, 2023, 16:57
We have implemented partitioning models into smaller submodels that can be solved in parallel.
However there are several loose ends that we still need to take care of.
The partitioning still needs to be fixed for the hfb package, the lake package and the uzf package
We still need to compute the auxiliary variables cdist and anglex for unstructured grids
Postprocessing step: the exchange flux should still be merged into the balance files
Transport is still untested for partitioning, but parallel flow + transport is still under development on the Modflow side, so it's not urgent.
In GitLab by @JoerivanEngelen on Nov 22, 2023, 10:26
Relevant for the computation of aux variables for unstructured grids is this comment: https://gitlab.com/deltares/imod/imod-python/-/issues/479#note_1506589478
Here's the code how Modflow 6 computes geometric properties:
https://github.com/MODFLOW-USGS/modflow6/blob/develop/src/Model/ModelUtilities/DisvGeom.f90
Original code to compute geometric properties (where angledx is not computed entirely correctly)
def set_geometric_properties(df: pd.DataFrame, grid: xu.Ugrid2d) -> None:
    """
    Compute the geometric properties for every connection: the length in the
    first cell (cl1), the length in the second cell (cl2), and the width of the
    connection (hwva).
    """
    face_centroids = grid.centroids
    edge_coordinates = grid.edge_node_coordinates[df["edge_index"]]
    centroid_i = face_centroids[df["i"]]
    centroid_j = face_centroids[df["j"]]
    U = np.diff(edge_coordinates, axis=1)[:, 0]
    Vi = centroid_i - edge_coordinates[:, 0]
    Vj = centroid_j - edge_coordinates[:, 0]
    length = np.linalg.norm(U, axis=1)
    anglex = np.arctan2(U[:, 1], U[:, 0])  # np.atan2 is not available in NumPy < 2.0
    anglex[anglex < 0] += 2 * np.pi
    df["cl1"] = np.abs(np.cross(U, Vi)) / length
    df["cl2"] = np.abs(np.cross(U, Vj)) / length
    df["hwva"] = length
    df["anglex"] = anglex
    return
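The cl1/cl2 terms above are the perpendicular distances from each cell centroid to the shared edge, computed as |U × V| / |U|. A minimal scalar sketch of that formula in pure Python (the helper name is hypothetical, for illustration only):

```python
def point_line_distance(p, a, b):
    """Distance from 2-D point p to the infinite line through a and b."""
    # Edge vector U and centroid vector V, as in the vectorized code above.
    ux, uy = b[0] - a[0], b[1] - a[1]
    vx, vy = p[0] - a[0], p[1] - a[1]
    cross = ux * vy - uy * vx            # z-component of U x V
    length = (ux * ux + uy * uy) ** 0.5  # |U|
    return abs(cross) / length

# A centroid at (0.5, 1.0) sits one unit above the edge from (0, 0) to (1, 0):
print(point_line_distance((0.5, 1.0), (0.0, 0.0), (1.0, 0.0)))  # 1.0
```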
In GitLab by @JoerivanEngelen on Nov 22, 2023, 12:37
added #659 as child task
In GitLab by @JoerivanEngelen on Nov 22, 2023, 12:37
added #660 as child task
In GitLab by @JoerivanEngelen on Nov 22, 2023, 12:39
added #663 as child task
In GitLab by @JoerivanEngelen on Nov 22, 2023, 12:38
added #662 as child task
In GitLab by @JoerivanEngelen on Nov 22, 2023, 12:39
added #664 as child task
In GitLab by @JoerivanEngelen on Nov 22, 2023, 12:45
marked this issue as related to #324
In GitLab by @JoerivanEngelen on Nov 22, 2023, 12:45
marked this issue as related to #565
In GitLab by @JoerivanEngelen on Nov 22, 2023, 12:50
added #548 as child task
In GitLab by @JoerivanEngelen on Dec 7, 2023, 15:10
marked this issue as related to #683
|
2025-04-01T04:10:20.454384
| 2015-10-23T11:55:34
|
113005362
|
{
"authors": [
"DenHeadless",
"belkevich"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13965",
"repo": "DenHeadless/DTCollectionViewManager",
"url": "https://github.com/DenHeadless/DTCollectionViewManager/pull/15"
}
|
gharchive/pull-request
|
Updated for Xcode 7
Fixed example podfile
Allowed non-https traffic for Xcode > 7
Updated Swift to modern syntax
Thanks, @belkevich =)
|
2025-04-01T04:10:20.456149
| 2024-06-20T13:31:55
|
2364465263
|
{
"authors": [
"bio-la",
"giuliaelgarcia"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13966",
"repo": "DendrouLab/panpipes",
"url": "https://github.com/DendrouLab/panpipes/pull/288"
}
|
gharchive/pull-request
|
Threads documentation
Added a spreadsheet with each task and the threads required, and added a comment on each yaml.md informing the user to check the spreadsheet for more information.
Hi @giuliaelgarcia, the reference to the file in each workflow is fine, however this is not how the xls is displayed (you can see for yourself that the action building the docs is not displaying the item in the index) , can you please see this
https://stackoverflow.com/questions/18845306/sphinx-include-xlsx-data-into-rst
|
2025-04-01T04:10:20.460772
| 2022-10-22T12:02:26
|
1419261406
|
{
"authors": [
"1995parham",
"Aatmaj-Zephyr",
"DenverCoder1",
"HamiltonMultimedia",
"Nirzak"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13967",
"repo": "DenverCoder1/github-readme-streak-stats",
"url": "https://github.com/DenverCoder1/github-readme-streak-stats/issues/342"
}
|
gharchive/issue
|
Contributions are now showing only from Jan 1
Old
New (bug)
Same here
Same here
Mine shows from May 10th suddenly.
My contributions date back to 2017.
I thought I was trippin.
I'm seeing the issue as well, but unfortunately I can't reproduce the issue locally.
When running on localhost everything appears correct.
Nothing has changed recently in the code, so it probably has to do with the GitHub API being inconsistent, as it frequently is.
Seems like the issue resolved itself as I expected it would.
I can only make it as accurate as the GitHub API allows.
|
2025-04-01T04:10:20.463427
| 2021-12-01T09:10:37
|
1068149018
|
{
"authors": [
"DenverCoder1",
"frelle"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13968",
"repo": "DenverCoder1/unicode-formatter",
"url": "https://github.com/DenverCoder1/unicode-formatter/issues/28"
}
|
gharchive/issue
|
special characters (french...)
Hi,
Do you know how can I integrate special characters (for example, for french, éèàà ) ?
thank you
The unicode fonts only support certain symbols so you won't find variations that include those.
You can however try using Combining Characters to add accents to any symbol.
Eg:
◌̀ ◌́
These links will allow you to copy the accent symbol to your clipboard:
https://unicode-explorer.com/c/0300
https://unicode-explorer.com/c/0301
Then they can be pasted after any character to make it appear above.
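As a quick illustration of the combining-character approach (a hedged sketch using only the Python standard library):

```python
import unicodedata

# "e" followed by U+0301 (combining acute accent) renders as "é".
decomposed = "e\u0301"

# NFC normalization fuses the pair into the single precomposed
# character U+00E9 where one exists.
composed = unicodedata.normalize("NFC", decomposed)
print(composed)               # é
print(composed == "\u00e9")   # True
```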
|
2025-04-01T04:10:20.516120
| 2022-05-11T06:03:32
|
1232055861
|
{
"authors": [
"smajidian",
"wrengs"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13969",
"repo": "DessimozLab/read2tree",
"url": "https://github.com/DessimozLab/read2tree/issues/1"
}
|
gharchive/issue
|
Installation: module yaml has no attribute FullLoader
Dear Authors,
With great interest I have read your article on BioRxiv and was looking forward for an attempt myself on PacBio HiFi data of 4 different species.
Following your detailed instructions, I have tried installing read2tree.
Since it is mentioned that read2tree was built and tested using python 3.5.1, I first created a python3.5 environment within conda.
Next I installed all mentioned modules.
Upon attempting to test run read2tree as instructed, I received the following error:
AttributeError: module 'yaml' has no attribute 'FullLoader'
The FullLoader attribute was added to yaml version 5.1 ( https://pyyaml.org/wiki/PyYAML#history )
Thus I attempted installation of version 5.1, however got the error that my existing python installation was not compatible.
Specifications:
pyyaml==5.1 -> python[version='>=2.7,<2.8.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0']
I am wondering if you could provide some more detailed information of the versions of the installed modules.
Additionally, the module filelock is not mentioned as an actual installation, but is mentioned to be needed.
Any further suggestions would be most welcome.
Thanks a lot in advance!
Kind regards,
Willem
Dear Willem
Thanks for your interest. I'm sorry for the inconvenience. I was able to reproduce the error with python 3.5 on linux. We'll update the readme soon. I just created a new conda env in linux with python 3.8 or 3.9 and ran read2tree successfully on the test dataset. Could you possibly consider using python 3.8 or 3.9?
Thanks for the comment on filelock which is installed by setup.py. As it can be installed by conda too, we'll update installation section in readme.
Please let us know whether it works for you or not.
Regards,
Sina
Dear Sina,
Thank you kindly for the quick reply!
I will have a go with python3.8 or 3.9 and keep you updated.
All the best,
Willem
Dear Willem,
My pleasure, I'm glad to hear that. So, I'm closing the issue but feel free to inform us if you have any other issues or questions.
Regards, Sina.
|
2025-04-01T04:10:20.568037
| 2020-12-19T14:31:43
|
771391220
|
{
"authors": [
"Dev-CasperTheGhost",
"Tegnio6882"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13970",
"repo": "Dev-CasperTheGhost/ghostybot",
"url": "https://github.com/Dev-CasperTheGhost/ghostybot/pull/128"
}
|
gharchive/pull-request
|
Updated 'randomcolor' command
Removed randomcolor dependency as it is no longer used
Added preview for generated color in 'randomcolor' command
Forgot about adding new api to the docs..
Forgot about adding new api to the docs..
don't worry bout it
|
2025-04-01T04:10:20.598256
| 2018-02-16T15:37:33
|
297825177
|
{
"authors": [
"AlekseyMartynov",
"jtsom"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13971",
"repo": "DevExpress/DevExtreme.AspNet.Data",
"url": "https://github.com/DevExpress/DevExtreme.AspNet.Data/issues/200"
}
|
gharchive/issue
|
Announcement: 1.4.0-rc1 release candidate is available
Resolved issues:
#170 / https://devexpress.com/issue=T595036
Numerous issues related to SUM / AVG summaries: #179, #182, #184, #199
New:
jQuery-free NPM package for Angular apps:
https://www.npmjs.com/package/devextreme-aspnet-data-nojquery
loadMethod option (requested in #161)
Packages:
https://www.nuget.org/packages/DevExtreme.AspNet.Data/1.4.0-rc1
npm i<EMAIL_ADDRESS>npm i<EMAIL_ADDRESS>
We would appreciate any feedback.
After installing the RC version, I'm now seeing:
core.js:1448 ERROR RangeError: Maximum call stack size exceeded
at extend (extend.js:23)
at Object.Deferred._promise.promise (deferred.js:84)
at new Deferred (deferred.js:86)
at new exports.Deferred (deferred.js:159)
at inheritor._loadImpl (custom_store.js:221)
at inheritor.eval [as _loadImpl] (class.js:16)
at inheritor.load [as _loadFunc] (abstract_store.js:55)
at invokeUserLoad (custom_store.js:71)
at inheritor._loadImpl (custom_store.js:225)
at inheritor.eval [as _loadImpl] (class.js:16)
If I comment out the from the .html, it doesn't appear, so it looks like there's something going on.
Version 1.4.0 published.
And the error is still there:
webpack-internal:///./node_modules/@angular/core/esm5/core.js:1664 ERROR RangeError: Maximum call stack size exceeded
at extend (webpack-internal:///./node_modules/devextreme/core/utils/extend.js:23)
at Object.Deferred._promise.promise (webpack-internal:///./node_modules/devextreme/core/utils/deferred.js:84)
at new Deferred (webpack-internal:///./node_modules/devextreme/core/utils/deferred.js:86)
at new exports.Deferred (webpack-internal:///./node_modules/devextreme/core/utils/deferred.js:159)
at inheritor._loadImpl (webpack-internal:///./node_modules/devextreme/data/custom_store.js:221)
at inheritor.eval [as _loadImpl] (webpack-internal:///./node_modules/devextreme/core/class.js:16)
at inheritor.load [as _loadFunc] (webpack-internal:///./node_modules/devextreme-aspnet-data-nojquery/node_modules/devextreme/data/abstract_store.js:55)
at invokeUserLoad (webpack-internal:///./node_modules/devextreme/data/custom_store.js:71)
at inheritor._loadImpl (webpack-internal:///./node_modules/devextreme/data/custom_store.js:225)
at inheritor.eval [as _loadImpl] (webpack-internal:///./node_modules/devextreme/core/class.js:16)
with
"devextreme": "17.2.6",
"devextreme-angular": "^17.2.6",
"devextreme-aspnet-data-nojquery": "1.4.0",
|
2025-04-01T04:10:20.602023
| 2016-11-02T11:05:32
|
186771251
|
{
"authors": [
"GhEllie",
"achopijocoder",
"kvet"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13972",
"repo": "DevExpress/devextreme-angular",
"url": "https://github.com/DevExpress/devextreme-angular/issues/192"
}
|
gharchive/issue
|
datetime format in grid
Hello ,
I have a column with datetime format in my grid like below :
dataType: 'date',
format: 'dd/MM/yyyy HH:mm'
The grid is showing all dates +1 hour ahead. For example, I have "2016-07-06 10:20:00" and the grid is showing "2016-07-06 11:20:00".
I think that's a bug in the grid.
Hi,
I could not reproduce the issue on my side. Date formatting is performed by Globalize, so this issue may be not a DevExtreme bug.
If you are not sure that this is a Globalize bug, can you provide a simple example where we can reproduce the issue?
@GhEllie That is the effect of the local time. The data arrives to the grid with the hour "10:20:00" but when is parsed to be shown is converted to local time. Maybe you should use UTC time or handle the hour in your rest api.
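The local-time shift described above can be sketched as follows (Python is used here purely for illustration; the grid itself runs JavaScript, and the exact offset depends on the viewer's timezone):

```python
from datetime import datetime, timezone, timedelta

# A timestamp stored as UTC...
utc = datetime(2016, 7, 6, 10, 20, tzinfo=timezone.utc)

# ...rendered in a UTC+1 zone gains an hour, matching the report above.
local = utc.astimezone(timezone(timedelta(hours=1)))
print(utc.hour)    # 10
print(local.hour)  # 11
```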
|
2025-04-01T04:10:20.609590
| 2020-11-11T03:35:24
|
740429753
|
{
"authors": [
"Farfurix",
"aleks-pro",
"suryap666"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13973",
"repo": "DevExpress/testcafe",
"url": "https://github.com/DevExpress/testcafe/issues/5694"
}
|
gharchive/issue
|
HTML rendering issue
Then submit button is rendered in unexpected place
When I open a browser it looks like shown below
When I open the same page using TestCafe
"testcafe": "^1.9.4"
@suryap666
Hello,
It is difficult to find the cause of the issue without a runnable example. Could you please share some project or a public URL?
@Farfurix I agree with you; I thought the same about how to share the page. According to company policy, I should not share the application URL, username, password, etc. But I am willing to attach the HTML page to this issue; I am not sure how to. Please suggest the standard to follow.
Hello @suryap666,
I'm not sure we will be able to reproduce this issue with an HTML page, but you can pack it into a .zip archive and send to<EMAIL_ADDRESS>It's not necessary to provide access to your entire application URL. A simple example created based on your web site code should be sufficient.
I do agree with you: the HTML page will not be helpful. When I try to save it locally, it renders completely differently due to the lack of the full CSS. My application is monolithic code, and as a tester I haven't worked much on the application code, so I cannot spare much time to build a sample website. @aleks-pro Thanks for your support. I will close this issue for now. Since we are moving from a big monolithic application to a single-page application using Vue.js, I am sure this issue will not happen; if it does, I will log a bug with a sample public-facing website.
|
2025-04-01T04:10:20.617697
| 2022-06-13T06:49:03
|
1268984257
|
{
"authors": [
"Aleksey28",
"bugnano"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13974",
"repo": "DevExpress/testcafe",
"url": "https://github.com/DevExpress/testcafe/issues/7083"
}
|
gharchive/issue
|
Element fails to scroll if the CSS property is set to flex-direction: column-reverse
What is your Scenario?
The scenario is a chat application.
On top of the chat, there is a link with id showMore, that is used to show
older messages.
Right after that link, there are all the chat messages.
Now, I want to test what happens when the user clicks the #showMore element.
As there are many messages, and we have the last message into view, the test
code needs to scroll up, until the #showMore element is visible.
What is the Current behavior?
When I try to interact with the #showMore element with testcafe functions, the test fails with the error
The element that matches the specified selector is not visible.
What is the Expected behavior?
The #showMore element is scrolled into view, and interacted with.
What is your public website URL? (or attach your complete example)
I made a complete example in this repository, and the README.md file has all the instructions needed to run the test.
https://github.com/talkjs/testcafe_bug
What is your TestCafe test code?
The test code is in the same repository
https://github.com/talkjs/testcafe_bug
Your complete configuration file
No response
Your complete test report
No response
Screenshots
No response
Steps to Reproduce
Git clone https://github.com/talkjs/testcafe_bug
Follow the instructions in the README.md file
Profit!
TestCafe version
1.19.0
Node.js version
v16.15.1
Command-line arguments
npx testcafe chromium tests/
Browser name(s) and version(s)
No response
Platform(s) and version(s)
Linux Ubuntu 20.04
Other
No response
Hi @bugnano,
Thank you for the example. I managed to reproduce this issue. We'll update this thread once we have news.
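One plausible factor (an assumption on my part, not TestCafe's confirmed root cause) is that in a `flex-direction: column-reverse` scroll container, `scrollTop` is 0 at the bottom and grows negative toward the top, which breaks scroll math that assumes a `[0, max]` range. A minimal sketch of normalizing such values:

```javascript
// In a column-reverse scroll container, scrollTop ranges from
// -(scrollHeight - clientHeight) at the top to 0 at the bottom. Automation
// code assuming 0 <= scrollTop <= scrollHeight - clientHeight may conclude
// the target can never be scrolled into view. One way to normalize:
function normalizedScrollTop(scrollTop, scrollHeight, clientHeight, reversed) {
  const max = scrollHeight - clientHeight;
  // In reversed containers scrollTop is in [-max, 0]; shift it to [0, max].
  return reversed ? scrollTop + max : scrollTop;
}

console.log(normalizedScrollTop(-300, 1000, 200, true)); // 500
```

A practical workaround in a test, until a fix lands, is to scroll the element into view manually (for example via a ClientFunction that calls `element.scrollIntoView()`) before interacting with it.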
|
2025-04-01T04:10:20.619457
| 2022-08-08T07:57:42
|
1331481164
|
{
"authors": [
"DevSteph0"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13975",
"repo": "DevSteph0/prework-study-guide",
"url": "https://github.com/DevSteph0/prework-study-guide/pull/10"
}
|
gharchive/pull-request
|
Merge pull request #9 from DevSteph0/feature/starter-code
Merge pull request #8 from DevSteph0/main
This PR updates the CSS style sheet and debugs the HTML.
|
2025-04-01T04:10:20.636549
| 2023-08-30T08:44:27
|
1873220967
|
{
"authors": [
"Developers-Adgitm",
"KryptoKizzie",
"palakbansal8810"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13976",
"repo": "Developers-Adgitm/Projects",
"url": "https://github.com/Developers-Adgitm/Projects/issues/13"
}
|
gharchive/issue
|
Scientific Calculator using Python
Scientific Calculator in Python
Features
[x] GUI Based
[x] Keyboard Support
Go ahead, create a pull request.
Please assign it to me
|
2025-04-01T04:10:20.642490
| 2018-08-21T11:37:56
|
352500458
|
{
"authors": [
"koficodes",
"paynegreen"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13977",
"repo": "DevlessTeam/DV-PHP-CORE",
"url": "https://github.com/DevlessTeam/DV-PHP-CORE/issues/101"
}
|
gharchive/issue
|
getUserWhere($key, $value) doesn't fetch records in the extended user table
@paynegreen Kindly add more description, please. :wink:
The getUserWhere($key, $value) rule only fetches the records from the users' table and doesn't fetch records from the user_profile (extended) table on the DevLess module.
Added a method getUserUsingExtraParams(["location" => "foo"]) to retrieve users using extra params.
Issue fixed.
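The intended behavior of the new method can be sketched generically (a hypothetical illustration in JavaScript; the data shapes are assumptions for this example, not the actual DevLess PHP implementation):

```javascript
// Hypothetical sketch: fetch users whose extended-profile ("extra") fields
// match all of the given key/value pairs -- e.g. { location: "foo" }.
function getUserUsingExtraParams(users, extraParams) {
  const entries = Object.entries(extraParams);
  return users.filter((user) =>
    entries.every(([key, value]) => (user.profile || {})[key] === value)
  );
}

const users = [
  { id: 1, profile: { location: 'foo' } },
  { id: 2, profile: { location: 'bar' } },
];
console.log(getUserUsingExtraParams(users, { location: 'foo' }).length); // 1
```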
|
2025-04-01T04:10:20.653408
| 2018-04-13T12:41:24
|
314092075
|
{
"authors": [
"Brouilles",
"Tzelon"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13978",
"repo": "Dezeiraud/bsdiff-nodejs",
"url": "https://github.com/Dezeiraud/bsdiff-nodejs/issues/2"
}
|
gharchive/issue
|
Add progression events
Hey,
Thanks for your work on this project.
I wonder how I can add an event to show how much time left to the end of the process.
Thanks
Tzelon
Hello,
Currently this is not possible. But this is a good idea, probably for a future version.
Thanks
Is that demand tinkering with the C code?
For a real data yes.
Thanks.
I'm hoping to see that implemented in the near version :)
Work In Progress 133610050f35e7f8446d238a9b179acd6e1772b8
Now available with the version 2.0.0
Better performance
Async
Get percentage with callback
|
2025-04-01T04:10:20.656100
| 2020-05-02T10:25:23
|
611154770
|
{
"authors": [
"Dhruv-Techapps",
"dharmesh-hemaram",
"ivan1d"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13979",
"repo": "Dhruv-Techapps/auto-click-auto-fill",
"url": "https://github.com/Dhruv-Techapps/auto-click-auto-fill/issues/106"
}
|
gharchive/issue
|
Delete button and list of tasks
It is very easy to press the massive delete button by mistake, and in doing so delete a task; double-pressing it deletes two tasks this way.
In addition, the tasks should have a multi-line drop-down, so it would be easy to see all the tasks created and select accordingly.
Hi @ivan1d ,
We would be happy to bring in the new changes suggested by you. But can you provide more details, with a screenshot?
Refer #118 - this feature is already implemented and available in Beta version
|
2025-04-01T04:10:20.670949
| 2017-06-27T13:48:00
|
238857131
|
{
"authors": [
"uglyog",
"yuqitao"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13980",
"repo": "DiUS/pact-workshop-jvm",
"url": "https://github.com/DiUS/pact-workshop-jvm/issues/2"
}
|
gharchive/issue
|
pactPublish error when I am on step13
yu@ubuntu:~/pact-workshop-jvm$ ./gradlew consumer:pactPublish
:consumer:pactPublish
Publishing Our Little Consumer-Our Provider.json ... FAILED! 401 Unauthorized - Unknown error
:consumer:pactPublish FAILED
FAILURE: Build failed with an exception.
What went wrong:
One or more of the pact files were rejected by the pact broker
Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 6.734 secs
Do I need to do this in the gradle.properties file:
pactBrokerUser=pactBrokerUser
pactBrokerPassword=pactBrokerPassword
After doing this, there is still an error like:
yu@ubuntu:~/pact-workshop-jvm$ ./gradlew consumer:pactPublish
FAILURE: Build failed with an exception.
Where:
Build file '/home/yu/pact-workshop-jvm/providers/springboot-provider/build.gradle' line: 46
What went wrong:
A problem occurred evaluating project ':providers:springboot-provider'.
Unauthorized
Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Total time: 6.391 secs
I know I must sign up on the website: https://pact.dius.com.au/register
Can I deploy my own service as a broker like this?
You can use your own broker, you don't need to signup for one. You can run one using docker, just search on docker hub.
You will have to setup the appropriate values in https://github.com/DiUS/pact-workshop-jvm/blob/master/providers/springboot-provider/build.gradle#L46 and https://github.com/DiUS/pact-workshop-jvm/blob/master/providers/dropwizard-provider/src/test/groovy/au/com/dius/pactworkshop/dropwizardprovider/PactVerificationTest.groovy#L19 to match the pact broker you are running
I know, thank you very much.
|
2025-04-01T04:10:20.674086
| 2024-11-07T15:03:12
|
2641300115
|
{
"authors": [
"DominicOram",
"rtuck99"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13981",
"repo": "DiamondLightSource/mx-bluesky",
"url": "https://github.com/DiamondLightSource/mx-bluesky/pull/625"
}
|
gharchive/pull-request
|
595 write to ispyb sample status after collection
Fixes #595
Requires #606
The blSampleStatus field in ispyb will now be updated on successful robot load to LOADED, when the load_centre_collect plan is used.
In the event of the pin not being found or no diffraction, blSampleStatus will be updated to ERROR - sample to indicate an issue with the sample
If any other error occurs, blSampleStatus will be updated to ERROR - beamline to indicate a general problem with the beamline.
On normal completion the blSampleStatus will be left at LOADED
I have also just been told about #636. I think it probably makes sense to assume the new endpoints for this issue.
Code has already been updated for the new endpoints
|
2025-04-01T04:10:20.733323
| 2017-10-09T07:25:38
|
263804426
|
{
"authors": [
"jimver04"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13982",
"repo": "DigiArt-project/WordpressUnity3DEditor",
"url": "https://github.com/DigiArt-project/WordpressUnity3DEditor/issues/157"
}
|
gharchive/issue
|
Right click on doors
Can we implement a right click on a 3D object in the 3D editor? It should pop up a window of properties that can be edited. We need it for the door category, which will show a dropdown with other doors existing in all scenes.
Implementation steps
[x] Make the right click implementation (Dimitris)
[x] Make the 2D gui to display above the 3d object (a panel with a dropdown and a field to place the id of the door ) (Tasos)
[x] Query the wp database to find all the doors existing in the game (Stathis)
a. Get all jsons for the scenes that belong to the game
b. Extract door information from the scenes and return it to javascript
[x] Save in the json the information for the target door (dimitrio)
Now the door info is saved and retrieved from json. I merged tasos GUI with my changes.
I have a bug on the JavaScript side. If you first click a door and change the values, and then click a second door, the values are passed from the second door to the first door. I will fix it. Something is happening with the JavaScript.
Minor bug was fixed in 293b866
Stathi, now we are waiting for you to make the PHP that:
Gets all jsons for the scenes that belong to the game
Gets the information about the doors . I have modified the json format so that the door fields are also saved in the json like this:
"mydoora_1507541772" : {
"position" : [0,0,-21.738358259867],
"rotation" : [0,0,0],
"quaternion" : [0,0,0,1],
"scale" : [1,1,1],
"fnPath" : "/Models/",
"assetid" : "1513",
"fnObj" : "d3e613c6816493e83f38dc0ce7ed0c91_objmydoora.txt",
"fnObjID" : "1516",
"categoryName" : "Door",
"categoryID" : "523",
"diffImage" : "http://<IP_ADDRESS>:8080/digiart-project_Jan17/wpunity_asset3d/mydoora/attachment/1514/",
"diffImageID" : "1514",
"image1id" : "",
"doorName_source" : "1111111",
"doorName_target" : "doorGreen",
"sceneName_target" : "SecondScene",
"fnMtl" : "ae78c2bf26a1471a648044d8e1cac14d_materialmydoora.txt",
"fnMtlID" : "1515",
"children" : {
}
},
I want you to get me all the
"doorName_source" : "1111111",
"doorName_target" : "doorGreen",
"sceneName_target" : "SecondScene",
Existing in the scene where the categoryName == "Door"
It is done without Ajax. The door information is fixed throughout the scene, and it is fetched with standard PHP, transformed into a JavaScript variable with json_encode from PHP. :)
echo "var doorsAll=".json_encode($doorsAllInfo).";";
|
2025-04-01T04:10:20.737555
| 2017-07-06T19:37:14
|
241063143
|
{
"authors": [
"VictorTagayun",
"samkristoff"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13983",
"repo": "Digilent/openscope-mz",
"url": "https://github.com/Digilent/openscope-mz/issues/5"
}
|
gharchive/issue
|
Unable to set gain at 5v/div
Reported in 1.4.0
Seems to be working now.... Not sure what version of WaveformsLive it is...
Fixed in firmware version 1.37.0. Fix does not depend on a specific version of WaveForms Live.
|
2025-04-01T04:10:21.006802
| 2024-04-16T08:23:34
|
2245440522
|
{
"authors": [
"AutomationDimension"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13984",
"repo": "DimensionDev/status",
"url": "https://github.com/DimensionDev/status/issues/3057"
}
|
gharchive/issue
|
⚠️ API - Lens has degraded performance
In 29f3259, API - Lens ($LENS_API_URL) experienced degraded performance:
HTTP code: 503
Response time: 344 ms
Resolved: API - Lens performance has improved in 3d1c11a after 6 minutes.
|
2025-04-01T04:10:21.025921
| 2016-02-22T08:49:25
|
135354568
|
{
"authors": [
"bajiat",
"jykae"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13985",
"repo": "Dipor/platform",
"url": "https://github.com/Dipor/platform/issues/19"
}
|
gharchive/issue
|
Move repository and set up Waffle
Move the contents of this repository to Digipalvelutehdas. Invite all team members. Set up Dipor team in Digipalvelutehdas organization. Set up a Waffle board.
Created repo https://github.com/Digipalvelutehdas/dipor-dashboard and added @bajiat and @brylie as admin.
This issue was moved to Digipalvelutehdas/dipor-dashboard#1
This issue was moved to Digipalvelutehdas/dipor-dashboard#2
|
2025-04-01T04:10:21.043763
| 2021-01-16T08:25:11
|
787380531
|
{
"authors": [
"DirtyHarryLYL",
"Pursue26"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13986",
"repo": "DirtyHarryLYL/HAKE-Action-Torch",
"url": "https://github.com/DirtyHarryLYL/HAKE-Action-Torch/issues/28"
}
|
gharchive/issue
|
GT DETECTION
For the anno_ bbox.mat on the HICO-DET datasets, when a HOI action GT label (e.g, ride horse) is given from an image, and it is based on a GT human box (named H1) and a GT object box (named O1).
BUT, there may also be another HOI action (e.g, sit on horse), and it is also based on H1 (named H2) and O1 (named O2), but the detection coordinates may be slightly different from the H1, O1. This phenomenon usually exists in GT anno_bbox.mat file on the HICO-DET datasets.
I want to know whether your GT DETECTION regards the above H-O pairs as two samples (i.e., H1-O1 and H2-O2) or one sample (i.e., H1-O1 or H2-O2) . Then these samples are fed into your prediction network.
Thank you.
A: You see it as two pairs (samples).
I tested your gt_hoi_py2.zip file on my own model, and obtained the accuracy of 43+ (Full) (mean Rec all is 1.000).
By the way, you can visualize GT pairs, and you will find that there are a large number of almost overlapping boxes. If you simply de-duplicate them, the mAP may be improved by 4%+ (meanwhile the mean Rec is 0.98+).
I also tested the file (test_HICO_finetuned_v3.pkl) provided by DRG paper on my own model [ What is the filtering threshold of humans and objects in your test code? ], and got the mAP of 28+ on the Full condition, but the mAP on the detector pre-trained with MS-COCO is ordinary (< 20 mAP). What's the matter?
Thank you.
Since the labeling process of HICO-DET, one person may have multi-box with a slight difference. In our GT test, we did not fuse these "jittered'' boxes but directly input them into the model and run the evaluation, i.e., as two samples.
get.
I also want to ask about your HICO-DET Detector from DRG, do you use the test_HICO_finetuned_v3.pkl file provided by DRG paper for performance evaluation?
We have reorganized the file into our format (the one provided in our repo), but the boxes and threshold are the same as DRG.
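The de-duplication idea mentioned above can be sketched with a simple greedy IoU filter (a hedged illustration only; the 0.9 threshold and the greedy keep-first strategy are assumptions, not the authors' method):

```javascript
// Greedy de-duplication of near-identical ("jittered") GT boxes by IoU.
// Boxes are [x1, y1, x2, y2]; the 0.9 threshold is an assumption.
function iou(a, b) {
  const ix = Math.max(0, Math.min(a[2], b[2]) - Math.max(a[0], b[0]));
  const iy = Math.max(0, Math.min(a[3], b[3]) - Math.max(a[1], b[1]));
  const inter = ix * iy;
  const area = (box) => (box[2] - box[0]) * (box[3] - box[1]);
  return inter / (area(a) + area(b) - inter);
}

function dedupe(boxes, threshold = 0.9) {
  const kept = [];
  for (const box of boxes) {
    // Keep a box only if it does not heavily overlap an already-kept box.
    if (!kept.some((k) => iou(k, box) >= threshold)) kept.push(box);
  }
  return kept;
}

console.log(dedupe([[0, 0, 10, 10], [0, 0, 10, 9.5], [20, 20, 30, 30]]).length); // 2
```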
|
2025-04-01T04:10:21.048565
| 2023-09-22T17:43:32
|
1909296487
|
{
"authors": [
"KhounborineII",
"Larsonog"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13987",
"repo": "DiscoTrayStudios/hendrix-today",
"url": "https://github.com/DiscoTrayStudios/hendrix-today/pull/139"
}
|
gharchive/pull-request
|
Added Report Event Icon and Functionality
An Icon now appears on each Event Dialog allowing the User to directly email the Hendrix Today team to report errors on the specific event.
Works perfectly!!!!
|
2025-04-01T04:10:21.067845
| 2021-02-24T20:31:56
|
815826532
|
{
"authors": [
"Morta1985",
"cement-head",
"dzackgarza",
"jerome-diver",
"joren485",
"timnolte",
"toniToniDev"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13990",
"repo": "DisplayLink/evdi",
"url": "https://github.com/DisplayLink/evdi/issues/258"
}
|
gharchive/issue
|
kernel-5.11.1 break evdi-1.7 again
Archlinux
kernel:
linux-5.11.1
and linux-zen-5.11.1
there is no dkms evdi module in the tree, because the last kernel update breaks the build of the evdi-1.7 module (every recent evdi module release also fails to build, and therefore doesn't work)
This looks like a duplicate of #249.
yes, thank you
Also seeing this issue on Arch as of this/last week. The patch mentioned in the linked thread seems to be working for the moment:
https://github.com/DisplayLink/evdi/issues/249#issuecomment-780332761
This new patch for evdi 1.7.2 freezes Arch Linux and the Cinnamon desktop when I switch off the screen. Powering off and rebooting takes very long, but it works more or less.
Yep, I'm also seeing all kinds of issues after applying that patch....various wake-from-suspend and startup problems, and (maybe unrelated) somehow having certain BIOS settings reset periodically.
Seeing this issue on the latest Pop!_OS as well.
Yep... Pop!_OS is affected, and my previously working docking station stopped working for the displays...
Fix is in master, needs to be built for Ubuntu: https://bugs.launchpad.net/ubuntu/+source/xorg-server/+bug/1931547
|
2025-04-01T04:10:21.094015
| 2020-08-26T16:20:32
|
686451138
|
{
"authors": [
"fsavina",
"mkucmus"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13991",
"repo": "DivanteLtd/shopware-pwa",
"url": "https://github.com/DivanteLtd/shopware-pwa/issues/1055"
}
|
gharchive/issue
|
[BUG] error 500 on loadAssociations in views/ProductView.vue
Describe the bug
The call to loadAssociations in ProductView is returning a 500 error "Can not find association by name group"
the issue is no longer present
|
2025-04-01T04:10:21.096293
| 2018-11-15T22:42:00
|
381370885
|
{
"authors": [
"danrcoull",
"filrak"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13992",
"repo": "DivanteLtd/vue-storefront",
"url": "https://github.com/DivanteLtd/vue-storefront/issues/1996"
}
|
gharchive/issue
|
CMS Extension Issue
Current behavior
An issue with the CMS extension: getters are triggered before the data is returned and have no access to the state, so they just throw exceptions.
"Error in render: "TypeError: state.cmsPage is undefined"". Also, even when the data returns, the catch stating "install the module" is always triggered, even when that module is installed. What's even stranger is that the JSON exists and is readable, but it still always falls into the catch.
Expected behavior
Ajax runs, displays the content and caches the state and if there is no online connection gives a message rather than throwing an exception internal server error.
@danrcoull afaik the extension is currently being rewritten to a module with gql support
@anqaka should know the details ;)
please see https://github.com/DivanteLtd/vue-storefront/issues/1987
|
2025-04-01T04:10:21.103588
| 2020-01-10T14:29:41
|
548110251
|
{
"authors": [
"andrzejewsky",
"benjick"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13993",
"repo": "DivanteLtd/vue-storefront",
"url": "https://github.com/DivanteLtd/vue-storefront/pull/3979"
}
|
gharchive/pull-request
|
OutputCaching: Support for x-vs-store-code
Short Description and Why It's Useful
Currently it's not possible to use the x-vs-store-code header and outputCaching because the caching only looks at req.url. This fixes that.
Which Environment This Relates To
Check your case. In case of any doubts please read about Release Cycle
[x] Test version (https://test.storefrontcloud.io) - this is a new feature or improvement for Vue Storefront. I've created branch from develop branch and want to merge it back to develop
[ ] RC version (https://next.storefrontcloud.io) - this is a stabilisation fix for Release Candidate of Vue Storefront. I've created branch from release branch and want to merge it back to release
[ ] Stable version (https://demo.storefrontcloud.io) - this is an important fix for current stable version. I've created branch from hotfix or master branch and want to merge it back to hotfix
Upgrade Notes and Changelog
[x] No upgrade steps required (100% backward compatibility and no breaking changes)
[ ] I've updated the Upgrade notes and Changelog on how to port existing Vue Storefront sites with this new feature
IMPORTANT NOTICE - Remember to update CHANGELOG.md with description of your change
Contribution and Currently Important Rules Acceptance
[x] I read and followed contribution rules
Not sure what changelog to put it in
@benjick just put Added OutputCaching: Support for x-vs-store-code - @benjick (#3979) 😃
Thanks, didn't see the "[1.12.0-rc1] - UNRELEASED" at first, was probably looking in the wrong branch! 👍
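The fix described in this PR can be sketched as follows (a hypothetical illustration of varying a cache key on the header; the key format is an assumption, not the actual vue-storefront code):

```javascript
// Hypothetical sketch of a cache key that varies on the x-vs-store-code
// header as well as the URL, so per-store responses don't collide in the
// output cache when caching looks only at req.url.
function outputCacheKey(req) {
  const storeCode = (req.headers && req.headers['x-vs-store-code']) || '';
  return storeCode ? `page:${storeCode}:${req.url}` : `page:${req.url}`;
}

console.log(outputCacheKey({ url: '/c/shirts', headers: { 'x-vs-store-code': 'de' } }));
// page:de:/c/shirts
```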
|
2025-04-01T04:10:21.107492
| 2022-02-28T11:50:06
|
1153997212
|
{
"authors": [
"1HazArd1"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13994",
"repo": "Diversion2k22/Samarpan",
"url": "https://github.com/Diversion2k22/Samarpan/pull/19"
}
|
gharchive/pull-request
|
Implemented FIRESTORE for donor's information collection
Description
Implemented firestore for donor's data collection into the firebase
Fixes #16
Type of change
[ ] Bug fix (non-breaking change which fixes an issue)
[x] New feature (non-breaking change which adds functionality)
[ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
[ ] This change requires a documentation update
Checklist:
[x] My code follows the style guidelines of this project
[x] I have performed a self-review of my own code
[x] I have commented my code, particularly in hard-to-understand areas
[ ] I have made corresponding changes to the documentation
[x] My changes generate no new warnings
[ ] I have added tests that prove my fix is effective or that my feature works
[ ] New and existing unit tests pass locally with my changes
[ ] Any dependent changes have been merged and published in downstream modules
fixes #17
|
2025-04-01T04:10:21.121700
| 2015-03-09T15:45:43
|
60363149
|
{
"authors": [
"DFurnes",
"aaronschachter",
"weerd"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13995",
"repo": "DoSomething/dosomething",
"url": "https://github.com/DoSomething/dosomething/pull/4137"
}
|
gharchive/pull-request
|
Reportback final cleanup
Fixes #4107
Removes old code no longer in use now that Reportback v2 has launched.
@aaronschachter @DFurnes
:+1: from me!
Looks good except I think encoding the apostrophe makes things less readable (and not sure why we'd need to encode it)
So, it's a right apostrophe, not a normal one, and it's always best to encode those. While not as readable, it's in general better and actually makes it more obvious that it's a special apostrophe character. It's not even available on the keyboard. It's a special HTML entity.
Dang, you fancy. Alright. :+1:
|
2025-04-01T04:10:21.146610
| 2014-06-09T14:45:02
|
35290304
|
{
"authors": [
"angaither",
"barryclark",
"marahml"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13996",
"repo": "DoSomething/phoenix",
"url": "https://github.com/DoSomething/phoenix/issues/2449"
}
|
gharchive/issue
|
Campaign Close - Pre-signup button adds user to MailChimp
Blocked by #2463
[x] New pre-signup form rendered in campaign header
[x] Pre-signup form submit
[x] Write a presignup record
[ ] Subscribe user to Mailchimp
[x] Upon submit, display the copy "Sweet, we'll send you an email as soon as [campaign_title] re-opens. In the meantime, find a campaign you can do right now."
@mikefantini @marahml is this still something we should have open?
HAI @angaither @mikefantini. Yes, would love to revisit this. The 'notify me' promise that we offer users on closed campaigns doesn't actually ever happen.
I'd make more sense for the experience to be:
BEST FOR UX: Button pushes to an existing campaign that's related OR just to /campaigns
BEST FOR CONVERSION: Change the copy from "Notify Me" to something around "join our email list & hear about other campaigns" (but obvi much shorter)
Let me know your thoughts!
|
2025-04-01T04:10:21.150590
| 2015-08-24T19:11:58
|
102869771
|
{
"authors": [
"angaither",
"mikefantini",
"namimody",
"sbsmith86",
"sergii-tkachenko"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13997",
"repo": "DoSomething/phoenix",
"url": "https://github.com/DoSomething/phoenix/issues/4892"
}
|
gharchive/issue
|
Thor Tumblr Problem Share Lags
Still showing a "load" graphic after about 2 minutes.
this is most likely related to #4891
Works:
Still a problem seen on Thor on multiple browsers (Chrome, Firefox)
@sheyd Cron seems to work. Any other ideas? Could it be the job itself?
@namimody This is related to the same thing we are seeing in #5482. Tumblr is lagging because it can't find the images. It can't find the images because the url where the images are located is being redirected to a url with the country prefix in it, resulting in an incorrect URL. I don't think we have assigned the work out to anyone to fix it.
See the discussion here:
https://github.com/DoSomething/phoenix/issues/5482#issuecomment-148789978
/cc @sheyd @sergii-tkachenko
I think we are ok with this one too, @namimody re-open if this is not true
All good on this @angaither!
|
2025-04-01T04:10:21.205834
| 2016-08-08T02:31:39
|
169832600
|
{
"authors": [
"TyrionGraphiste",
"acburst",
"eugeneilyin"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:13999",
"repo": "Dogfalo/materialize",
"url": "https://github.com/Dogfalo/materialize/issues/3475"
}
|
gharchive/issue
|
materialize.css font-weight: 14px
Please check bin/materialize.css line(6944)
.side-nav .userView .name,
.side-nav .userView .email {
font-weight: 14px;
line-height: 24px;
}
Is font-weight can be 14px? Seems like typo (not a font-weight or not 14px)
You're right: it should be font-size: 14px; or font-weight: bold;, for example.
Which is the correct variant?
Fixed in 0828914
|
2025-04-01T04:10:21.209498
| 2017-03-28T20:37:21
|
217685029
|
{
"authors": [
"markstalker",
"tomscholz"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14000",
"repo": "Dogfalo/materialize",
"url": "https://github.com/Dogfalo/materialize/issues/4429"
}
|
gharchive/issue
|
Non disable components (in materialize.scss)
When I disable any 'import' in materialize.scss and then compile the file, my tabs stop working (they don't appear in full width, and colors become default).
If I revert to the default .css file, the problem persists.
Why did you do that?
I don't need all functionality of the materialize. Moreover, I want to make my own design for forms, because I think that it will be easier to delete the code than to overlap it.
Some imports are required to let any component work. I created a gist here how your materialize.scss could look like. Then you can delete anything except for:
components/_mixins.scss
components/_color.scss
components/_variables.scss
components/_normalize.scss
components/_tabs.scss
Does it work? If yes, then please close this issue
No. Doesn't work. Even if I don't delete the import, but simply compile materialize.scss
@markstalker Try this repo: https://github.com/tomscholz/materialize-sass-demo
Thank you so much!
|
2025-04-01T04:10:21.216901
| 2017-08-19T23:34:33
|
251456082
|
{
"authors": [
"DanielRuf",
"Dogfalo",
"HC-Isma"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14001",
"repo": "Dogfalo/materialize",
"url": "https://github.com/Dogfalo/materialize/issues/5130"
}
|
gharchive/issue
|
Tab swipeable div contracts when inserting an image
Help please... !!
Tab swipeable does not accept to insert an image, at the moment of inserting the div it is contracted hiding everything.
I'm with version 0.99.0
Please make a codepen illustrating your issue
1).-
Test 1
Test 2
Title
First Line
Second Line
grade
Test 2
2) .-
Test 1
Test 2
Test 2
Test 1
Test2
hello
Please read the CONTRIBUTING.md and use the Codepen link
`
Test 1
Test2
hello
`
<ul id="tabs-swipe-demo" class="tabs">
<li class="tab col s6 blue">
<a href="#test-swipe-1">Test 1</a>
</li>
<li class="tab col s6"><a class="active" href="#test-swipe-2">Test2</a></li>
</ul>
<div id="test-swipe-1" class="col s12 blue test-swipe">
<img src="/images/cars/toyota.jpg" alt="" class=" circle">
</div>
<div id="test-swipe-2" class="col s12 red test-swipe">
hello
</div>
This is not a codepen.
|
2025-04-01T04:10:21.221766
| 2015-08-11T03:05:48
|
100215537
|
{
"authors": [
"brunocalou",
"tomscholz"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14002",
"repo": "Dogfalo/materialize",
"url": "https://github.com/Dogfalo/materialize/pull/1887"
}
|
gharchive/pull-request
|
Decentralized icon on the nav bar for small screens
The search bar icon gets decentralized when the screen is small.
I tested this behavior on
Google Chrome 44.0.2403.130 (64-bit) - elementary OS 0.2.1
Firefox 39.0.3 - elementary OS 0.2.1
Google Chrome 44.0.2403.133 - Android 4.4.4
The problem is that on small screens, the line-height: 1 of the material-icons class is dominant. So, adding line-height: inherit to the icon inside the navbar solves the problem.
Working Codepen: http://codepen.io/brunocalou/pen/RPdYGR
Resize the page to see the bug
The same behavior is observed on any icon on the nav bar. So, changing the nav .nav-wrapper i instead of adding it just to the nav .input-field label i solves the problem.
Working Codepen: http://codepen.io/brunocalou/pen/waObxZ
Resize the page to see the bug
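The fix described above, as a stylesheet fragment (the selector comes straight from the comment; in materialize itself this would live in the navbar SCSS):

```css
/* Let any icon in the navbar inherit the navbar's line-height instead of
   the material-icons default of line-height: 1 */
nav .nav-wrapper i {
  line-height: inherit;
}
```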
This seems fixed in the current release
|
2025-04-01T04:10:21.269360
| 2023-03-01T21:24:15
|
1605686793
|
{
"authors": [
"Dolu89",
"RandyMcMillan",
"rabble"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14003",
"repo": "Dolu89/nostr-proxy",
"url": "https://github.com/Dolu89/nostr-proxy/issues/19"
}
|
gharchive/issue
|
Build fails, tries to load local npm lib
Followed the readme instructions and everything built except for this:
npm install
npm ERR! code EUNSUPPORTEDPROTOCOL
npm ERR! Unsupported URL Type "link:": link:@noble/hashes/sha256
npm ERR! /Users/rabble/.npm/_logs/2023-03-01T21_22_05_923Z-debug-0.log
pnpm install
Lockfile is up to date, resolution step is skipped
Already up to date
> WARN Local dependency not found at /Users/rabble/code/experiments/nostr-proxy/@noble/hashes/sha256
Done in 1.1s
What is your version of nodejs?
Also, what is this npm install at the beginning? This project uses Pnpm instead of npm or yarn
try "pnpm rb"
rebuild
|
2025-04-01T04:10:21.271604
| 2019-08-01T11:52:22
|
475638772
|
{
"authors": [
"brettpenzer123",
"kaenganxt",
"matteo-mlr"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14004",
"repo": "DominicMaas/Tango",
"url": "https://github.com/DominicMaas/Tango/issues/116"
}
|
gharchive/issue
|
Syncing issues, while playing with two players.
Hi there, thank you for this great mod!
I just started using it but noticed a problem where my teammate and I don't have the same date and therefore also not the same population in game, so we basically have two very different games running. One person is already allowed to build a specific building and the other one is not. Is there a way to fix this issue?
Best regards
Matteo Mueller
Hi @matteo-mlr. I would suggest that you join the discord server: https://discordapp.com/invite/RjACPhd as these issues you have mentioned above are on our list to fix or to implement if they do not already exist. It would be a good way to keep up with what is being added to the game.
Closing this as the problems depend on many factors already covered in other issues. The problems are mostly caused because certain features are not yet implemented. (Still thanks for your report)
|
2025-04-01T04:10:21.296239
| 2024-05-20T02:07:22
|
2304928219
|
{
"authors": [
"Dooy",
"superultra01"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14005",
"repo": "Dooy/chatgpt-web-midjourney-proxy",
"url": "https://github.com/Dooy/chatgpt-web-midjourney-proxy/issues/352"
}
|
gharchive/issue
|
When gpt4o / 4o-0513 answers news-type questions, it prints the search information, and even though the question is asked in Chinese, the answer comes back in English
Version: 2.17.7
Question: latest news
Answer:
search("latest news May 20 2024")mclick(["4", "5", "7", "15", "19"])
Here are some of the latest updates as of May 20, 2024:
International News:
A helicopter crash involving Iranian President Ebrahim Raisi has been reported. The incident occurred near the Azerbaijan border during a dam inauguration project. Investigations are ongoing to determine the cause and the extent of injuries sustained by those on board【6†source】.
In Ukraine, Russian attacks on the Kharkiv region have resulted in at least 11 fatalities. The conflict continues to escalate, impacting civilian areas and causing significant casualties【6†source】.
US Politics:
The 2024 US presidential election is heating up with debates being scheduled between President Joe Biden and former President Donald Trump. The first debate is set to take place on June 27 in Atlanta, Georgia. Additionally, RFK Jr. is actively working to secure ballot access for his campaign【7†source】.
The Supreme Court has recently cleared the way for Louisiana to implement a congressional map with two districts designed to enhance representation for Black voters. This ruling is seen as a potential boost for Democratic candidates in the upcoming elections【7†source】.
Education:
In India, the Central Board of Secondary Education (CBSE) is expected to release the results for Class 10 and 12 examinations shortly after May 20, 2024. Around 39 lakh students are awaiting these results, which will be available on several official websites such as cbseresults.nic.in and digilocker.gov.in【8†source】.
Sports:
Manchester City has clinched their fourth Premier League title, continuing their dominance in English football. Despite Arsenal's victory, Manchester City's lead in the standings secured their championship【6†source】.
The 2024 Indian Premier League (IPL) is in full swing, with live updates and scores available on the official IPL website. Fans are eagerly following the matches as the tournament progresses towards the finals【8†source】.
For more detailed updates and to follow these stories, you can visit news platforms like Al Jazeera, Politico, and Hindustan Times.
Please answer in Chinese
Is it expected that this search("latest news May 20 2024")mclick(["4", "5", "7", "15", "19"]) gets displayed?
Not sure
Hmm, I can't reproduce that behavior now; it answers according to the official cutoff every time. I tried several news-type questions and it's the same. Did you already fix it?
Yes, I fixed it?
Do you believe that? If only I were that good ^_^
It really is fine now. Before, when I asked about the latest on the Xiaomi car, it would always do something like search("latest news May 20 2024")mclick(["4", "5", "7", "15", "19"]); now it just answers based on the 2023-10 cutoff. There were many similar cases. Stop pretending, you are the god of repair.
|
2025-04-01T04:10:21.316833
| 2024-05-17T15:53:27
|
2303076151
|
{
"authors": [
"DosMike",
"nosoop"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14006",
"repo": "DosMike/TF2-PvP-OptIn",
"url": "https://github.com/DosMike/TF2-PvP-OptIn/pull/20"
}
|
gharchive/pull-request
|
Update gamedata for TF2 version 8835751 (2024-04-22)
Closes #19.
Untested; you'll probably want to verify the changes yourself (CC @NNinja1255 for testing). Note the following:
On Linux, some function calls may have been optimized to use custom calling conventions. The detours will not appear to work in those cases. I have not identified which ones are still called.
On Windows, CObjectSentrygun::ValidTargetPlayer() has been inlined into ::FindTarget(). I've opted to remove the signature here. The plugin seems to support missing signatures, so it'll just be missing functionality.
I haven't updated the annotations to match the signatures.
Notes on finding signatures:
CHeadlessHatmanAttack::IsPotentiallyChaseable via referenced ConVar tf_halloween_bot_quit_range
CMerasmusAttack::IsPotentiallyChaseable via referenced ConVar tf_merasmus_chase_range
CEyeballBoss::FindClosestVisibleVictim via referenced ConVar tf_eyeball_boss_debug_orientation
CObjectSentrygun::FoundTarget via unique string Building_Sentrygun.AlertTarget
Yea the plugin should definitely be able to handle missing sigs. Thank you again for updating my signatures 💙
|
2025-04-01T04:10:21.319093
| 2024-10-01T11:01:43
|
2558889878
|
{
"authors": [
"JimBobSquarePants",
"bjornhellander",
"kasperk81",
"stefannikolei",
"symbiogenesis"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14007",
"repo": "DotNetAnalyzers/StyleCopAnalyzers",
"url": "https://github.com/DotNetAnalyzers/StyleCopAnalyzers/issues/3893"
}
|
gharchive/issue
|
SA1515 on collection expression?
SA1515 triggers on
public int[] Blob =
[
// first item is sentinel
-42
];
Looks like a duplicate of #3766. This has already been fixed but not yet released.
Almost all of those items in the 1.2 milestone are from 2015. I don't think they are blockers.
Would make sense to get an update before .NET 9
Yeah @sharwell, an updated beta release would be lovely.
Anyone I can bribe to get a release shipped? Happy to sponsor if that helps.
|
2025-04-01T04:10:21.339762
| 2023-08-29T16:07:33
|
1871988562
|
{
"authors": [
"DrFaust92",
"Jeoffreybauvin",
"PedroMartinSteenstrup"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14008",
"repo": "DrFaust92/terraform-provider-airflow",
"url": "https://github.com/DrFaust92/terraform-provider-airflow/issues/34"
}
|
gharchive/issue
|
How this provider reacts with secrets ?
Hi,
I'm trying to debug why my terraform apply is not idempotent :
# airflow_connection.airflow_connection["np6_api_key_display"] will be updated in-place
~ resource "airflow_connection" "airflow_connection" {
~ extra = jsonencode(
~ {
~ api_key = "***" -> "myapikey"
}
)
id = "np6_api_key_display"
# (3 unchanged attributes hidden)
}
I saw this : https://github.com/apache/airflow/blob/3b3650e87d093d57a97b8701834c568f67327ab4/airflow/utils/log/secrets_masker.py#L59
How this provider reacts with this kind of secret ?
The provider does nothing explicit about secrets. The above diff is a change made on the Airflow server side (this is how the code you've sent relates to it).
We can make a change to explicitly suppress the diff for those keys. wdyt?
@DrFaust92 I think it's a good idea :).
@DrFaust92 if you maybe have a decent idea of the scope of the change (beyond that issue), I'd be happy to try to contribute here
|
2025-04-01T04:10:21.354201
| 2018-01-23T02:46:13
|
290690067
|
{
"authors": [
"DrSleep",
"RaphaelHighness",
"minnieyao",
"sunbin1205"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14009",
"repo": "DrSleep/tensorflow-deeplab-resnet",
"url": "https://github.com/DrSleep/tensorflow-deeplab-resnet/issues/163"
}
|
gharchive/issue
|
result about run inference.py
Running inference.py on a test picture, the segmentation result shows no complete outline of the object, just a lot of color spots. How can I solve this? Looking forward to your reply!!
@DrSleep
Hi @DrSleep,
I also met this problem when I ran inference.py. I retrained the model using the following setup; because the original batch size ran out of memory, I changed it to 4
BATCH_SIZE = 4
DATA_DIRECTORY = '/PASCAL/SemanticImg'
DATA_LIST_PATH = './dataset/train.txt'
IGNORE_LABEL = 255
INPUT_SIZE = '321,321'
LEARNING_RATE = 2.5e-4
MOMENTUM = 0.9
NUM_CLASSES = 21
NUM_STEPS = 20001
POWER = 0.9
RANDOM_SEED = 1234
RESTORE_FROM = './deeplab_resnet.ckpt'
SAVE_NUM_IMAGES = 2
SAVE_PRED_EVERY = 1000
SNAPSHOT_DIR = './snapshots/'
WEIGHT_DECAY = 0.0005
the train.txt is as follows:
/JPEGImages/2007_000032.jpg /SegmentationClass/2007_000032.png
/JPEGImages/2007_000039.jpg /SegmentationClass/2007_000039.png
/JPEGImages/2007_000063.jpg /SegmentationClass/2007_000063.png
/JPEGImages/2007_000068.jpg /SegmentationClass/2007_000068.png
/JPEGImages/2007_000121.jpg /SegmentationClass/2007_000121.png
/JPEGImages/2007_000170.jpg /SegmentationClass/2007_000170.png
/JPEGImages/2007_000241.jpg /SegmentationClass/2007_000241.png
/JPEGImages/2007_000243.jpg /SegmentationClass/2007_000243.png
/JPEGImages/2007_000250.jpg /SegmentationClass/2007_000250.png
/JPEGImages/2007_000256.jpg /SegmentationClass/2007_000256.png
/JPEGImages/2007_000333.jpg /SegmentationClass/2007_000333.png
/JPEGImages/2007_000363.jpg /SegmentationClass/2007_000363.png
/JPEGImages/2007_000364.jpg /SegmentationClass/2007_000364.png
/JPEGImages/2007_000392.jpg /SegmentationClass/2007_000392.png
/JPEGImages/2007_000480.jpg /SegmentationClass/2007_000480.png
/JPEGImages/2007_000504.jpg /SegmentationClass/2007_000504.png
/JPEGImages/2007_000515.jpg /SegmentationClass/2007_000515.png
/JPEGImages/2007_000528.jpg /SegmentationClass/2007_000528.png
/JPEGImages/2007_000549.jpg /SegmentationClass/2007_000549.png
/JPEGImages/2007_000584.jpg /SegmentationClass/2007_000584.png
/JPEGImages/2007_000645.jpg /SegmentationClass/2007_000645.png
/JPEGImages/2007_000648.jpg /SegmentationClass/2007_000648.png
/JPEGImages/2007_000713.jpg /SegmentationClass/2007_000713.png
/JPEGImages/2007_000720.jpg /SegmentationClass/2007_000720.png
/JPEGImages/2007_000733.jpg /SegmentationClass/2007_000733.png
/JPEGImages/2007_000738.jpg /SegmentationClass/2007_000738.png
/JPEGImages/2007_000768.jpg /SegmentationClass/2007_000768.png
.
.
.
The total number of training images is 1464 , the name of the images is from the VOC2012 trian list.
The logs is showing as follow:
Restored model parameters from ./deeplab_resnet.ckpt
The checkpoint has been created.
step 0 loss = 1.268, (16.858 sec/step)
step 1 loss = 3.971, (1.734 sec/step)
step 2 loss = 1.308, (1.339 sec/step)
step 3 loss = 2.991, (1.329 sec/step)
step 4 loss = 1.252, (1.346 sec/step)
step 5 loss = 1.344, (1.335 sec/step)
step 6 loss = 8.126, (1.331 sec/step)
step 7 loss = 4.652, (1.339 sec/step)
step 8 loss = 5.097, (1.339 sec/step)
step 9 loss = 1.318, (1.334 sec/step)
step 10 loss = 1.769, (1.353 sec/step)
.
.
.
step 19990 loss = 1.191, (1.419 sec/step)
step 19991 loss = 1.183, (1.425 sec/step)
step 19992 loss = 1.197, (1.424 sec/step)
step 19993 loss = 1.183, (1.422 sec/step)
step 19994 loss = 1.184, (1.408 sec/step)
step 19995 loss = 1.192, (1.416 sec/step)
step 19996 loss = 1.183, (1.419 sec/step)
step 19997 loss = 1.183, (1.414 sec/step)
step 19998 loss = 1.183, (1.420 sec/step)
step 19999 loss = 1.183, (1.437 sec/step)
The checkpoint has been created.
step 20000 loss = 1.183, (12.276 sec/step)
It looks normal right?
but when I run inference.py the results are all background, even though I use a training image.
Also if I use the deeplab_resnet.ckpt for inference, the output has both background and the segmented image.
Do you have any suggestion about this error?
Thank you!
@sunbin1205
E tensorflow/stream_executor/cuda/cuda_diagnostics.cc:303] kernel version 384.90.0 does not match DSO version 387.26.0 -- cannot find working devices in this configuration
Something with the drivers I assume. Can't help with that one.
@minnieyao
You are restoring from the already pre-trained model, hence the learning rate might be too high. Try to restore from the init model and check the progress in tensorboard, should be alright
Hi, @minnieyao.
I got the same problem. Did you solve it?
I restored from the deeplab_resnet_init.ckpt
|
2025-04-01T04:10:21.363707
| 2020-10-02T05:47:46
|
713364305
|
{
"authors": [
"DragaDoncila",
"codecov-commenter"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14010",
"repo": "DragaDoncila/napari-sentinel-to-zarr",
"url": "https://github.com/DragaDoncila/napari-sentinel-to-zarr/pull/11"
}
|
gharchive/pull-request
|
Transposing x,y coords correctly for interpolated ims
Fixes #8
Codecov Report
Merging #11 into master will decrease coverage by 0.12%.
The diff coverage is 0.00%.
@@ Coverage Diff @@
## master #11 +/- ##
==========================================
- Coverage 57.87% 57.75% -0.13%
==========================================
Files 5 5
Lines 330 329 -1
==========================================
- Hits 191 190 -1
Misses 139 139
Impacted Files | Coverage Δ
--- | ---
sentinel_to_zarr/sentinel_to_zarr.py | 34.61% <0.00%> (ø)
sentinel_to_zarr/napari_sentinel_to_zarr.py | 95.45% <0.00%> (-0.05%) :arrow_down:
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f0dd500...5a9f80b. Read the comment docs.
|
2025-04-01T04:10:21.387868
| 2019-04-04T06:53:43
|
429111404
|
{
"authors": [
"Dragory",
"T3NED"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14013",
"repo": "Dragory/modmailbot",
"url": "https://github.com/Dragory/modmailbot/issues/260"
}
|
gharchive/issue
|
Changing Mod Mail Message
Hello, just a quick one. Where could I locate the message that responds to the user when they DM the bot? (where it is replied to the user)
If I understood correctly which message you're referring to, that'd be the responseMessage config option.
Yes, I know that. I would like to check if the user has any roles on the main guild. And if they don’t then return and send a message instead of opening a new thread
What line and what file can I find where this message is sent?
That'd be here:
https://github.com/Dragory/modmailbot/blob/2a20d9fdaa34a3c3c2ac9b70bc068d511458470f/src/data/threads.js#L151
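A possible shape for that check, as a sketch (not the actual modmailbot/Eris API — names and the member shape are assumptions): look the user up on the main guild before opening a thread and bail out with a custom reply if they have no roles.

```python
# Hypothetical helper: `member` mirrors what the bot would look up on the
# main guild — None when the user is not on the guild at all, otherwise a
# dict with a "roles" list of assigned role IDs (@everyone is implicit,
# so an empty list means no real roles).
def should_open_thread(member):
    if member is None:
        return False  # user is not on the main guild at all
    return len(member["roles"]) > 0

# In the DM handler, one could then send a custom message and return early
# instead of creating the thread when this returns False.
```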
|
2025-04-01T04:10:21.389411
| 2021-07-05T08:12:05
|
936818072
|
{
"authors": [
"Ashuto7h",
"PrathmeshMutke"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14014",
"repo": "Dragsters/Nutrihelp",
"url": "https://github.com/Dragsters/Nutrihelp/issues/68"
}
|
gharchive/issue
|
[Feature Request] Update on Readme.md
Is your feature request related to a problem? Please describe.
I want to add certain characters and text lines in it
LGMSOC21
Go for it. Can you give more details so that I can decide the level of this issue?
|
2025-04-01T04:10:21.399951
| 2019-04-21T22:33:14
|
435554081
|
{
"authors": [
"DreamingRaven"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14015",
"repo": "DreamingRaven/nemesyst",
"url": "https://github.com/DreamingRaven/nemesyst/issues/74"
}
|
gharchive/issue
|
Example update: add example saving + reusing keras model
Is your feature request related to a problem? Please describe.
It may be difficult for those new to Nemesyst to understand how to pickle/sequentialise a model and then reuse it / reload the same model for further training.
Describe the solution you'd like
sequentialisation in example,
unpack of sequentialised model in example,
saving of predicted results in example so it does not output anything into current directory.
Describe alternatives you've considered
Better documentation, but I do believe we need both.
Additional context
This should likely be a bigger push with a holistic upgrade to MongoDB to support GridFS, which is the likely future of this framework.
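The sequentialise/store/unpack round-trip asked for here can be sketched with the stdlib alone (a plain dict stands in for a Keras model — this is the pattern, not the real Keras or Nemesyst API; the bytes are what would go into e.g. MongoDB GridFS):

```python
import pickle

# Stand-in for a trained model; the real example would use a Keras model.
model = {"weights": [0.1, 0.2], "epochs_trained": 5}

blob = pickle.dumps(model)        # sequentialise -> bytes suitable for GridFS
restored = pickle.loads(blob)     # unpack the same model later
restored["epochs_trained"] += 5   # ...and continue training where it left off
```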
Will be changed; due to #85 these will be PyTorch examples.
|
2025-04-01T04:10:21.412193
| 2018-07-15T11:16:24
|
341313104
|
{
"authors": [
"takahirom"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14016",
"repo": "DroidKaigi/conference-app-2018",
"url": "https://github.com/DroidKaigi/conference-app-2018/issues/677"
}
|
gharchive/issue
|
Could not find dependency
Overview (Required)
I bought a new MacBook Pro. I can't find the dependency. Probably this is related with https://github.com/DroidKaigi/conference-app-2018/pull/676 🤔
Could not find android.arch.persistence.room:compiler:1.0.0.
Searched in the following locations:
file:/Users/takahirom/Library/Android/sdk/extras/m2repository/android/arch/persistence/room/compiler/1.0.0/compiler-1.0.0.pom
file:/Users/takahirom/Library/Android/sdk/extras/m2repository/android/arch/persistence/room/compiler/1.0.0/compiler-1.0.0.jar
file:/Users/takahirom/Library/Android/sdk/extras/google/m2repository/android/arch/persistence/room/compiler/1.0.0/compiler-1.0.0.pom
file:/Users/takahirom/Library/Android/sdk/extras/google/m2repository/android/arch/persistence/room/compiler/1.0.0/compiler-1.0.0.jar
file:/Users/takahirom/Library/Android/sdk/extras/android/m2repository/android/arch/persistence/room/compiler/1.0.0/compiler-1.0.0.pom
file:/Users/takahirom/Library/Android/sdk/extras/android/m2repository/android/arch/persistence/room/compiler/1.0.0/compiler-1.0.0.jar
https://repo.maven.apache.org/maven2/android/arch/persistence/room/compiler/1.0.0/compiler-1.0.0.pom
https://repo.maven.apache.org/maven2/android/arch/persistence/room/compiler/1.0.0/compiler-1.0.0.jar
Required by:
project :app
I fixed it.
But I don't know why 🤦♂️
I tried this. Please refer if you can not build
Refresh
Set variant
Sync project with Gradle Files
|
2025-04-01T04:10:21.445377
| 2024-11-12T21:04:32
|
2653325783
|
{
"authors": [
"damianh"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14017",
"repo": "DuendeSoftware/foss",
"url": "https://github.com/DuendeSoftware/foss/issues/38"
}
|
gharchive/issue
|
Consider deprecating Base64Url
@joegoldman2 wrote:
.NET 9 adds support for Base64 Url (https://github.com/dotnet/runtime/issues/1658). It's also compatible with .NET Standard 2.0. The class Base64Url should probably be deprecated in favor of the built-in support in the BCL.
https://github.com/IdentityModel/IdentityModel/issues/580
If we were to adopt this we would need to add a net9.0 target to IdentityModel and have some conditional package references.
Ref https://github.com/dotnet/runtime/issues/1658#issuecomment-2221062478 .
Package references:
netstandard2.0 : System.Text.Json.8.0.5 and System.BCL.Memory.9.0.0
net8.0: System.BCL.Memory.9.0.0
net9.0: none
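For reference, the URL-safe encoding that the new BCL type implements can be sketched in Python's stdlib (illustration only; the issue itself concerns the .NET packages above): Base64Url swaps '+'/'/' for '-'/'_' and typically drops '=' padding.

```python
import base64

raw = b"\xfb\xef\xff"
# standard base64 of these bytes would be b'++//'; urlsafe maps 62/63 to -/_
encoded = base64.urlsafe_b64encode(raw).rstrip(b"=")
# decoding needs the padding restored to a multiple of 4 characters
padded = encoded + b"=" * (-len(encoded) % 4)
decoded = base64.urlsafe_b64decode(padded)
```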
|
2025-04-01T04:10:21.497353
| 2022-09-07T03:42:32
|
1364046321
|
{
"authors": [
"dustyroom-studio",
"plebon"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14018",
"repo": "Dustyroom/flat-kit-doc",
"url": "https://github.com/Dustyroom/flat-kit-doc/issues/87"
}
|
gharchive/issue
|
Flat Kit Fog render features stops working when changing quality settings (URP)
In URP, when changing quality settings at runtime, the Flat Kit Fog render feature stops working.
Steps to reproduce the behavior:
Open the Wanderer Demo scene in with Unity 2021.3.7f1
Change the Render pipeline asset to use Flatkit example URP settings as per documentation
Create a Copy of the "[FlatKit] Example Settings URP" config and change a setting (i.e. Shadow Cascade count -> 3)
Set this new config as the medium/Balanced "Render Pipeline Asset"
Play the sample scene
Change the Quality Level from the Project Settings to Medium/Balanced
See that the Fog no longer works. It also does not work when switching back to High Quality.
Expected behavior
The Fog setting should work if the other quality setting also has the same render features.
Unity details:
Flat Kit version 3.0.1
Unity 2021.3.7f1
Dev platform: WindowsEditor
Target platform: StandaloneWindows64
URP installed: True, version 12.1.7
Render pipeline: UniversalPipeline
Color space: Linear
Quality config: [FlatKit] Example Settings URP (Med)
Graphics config: [FlatKit] Example Settings URP (Med)
Additional context
I have verified the issue on a Development and Release Build in my main project. If I change the quality option in a gameplay scene while the fog is rendering it no longer works for the rest of the time in game. If I quit the game completely and come back it will work again.
I can change the quality from the main menu (just UI without the fog render feature) and it will work when running the gameplay scene.
This should be fixed in v. 3.0.5, which should appear on the Asset Store in a few hours.
Please re-open this ticked if the issue persists.
|
2025-04-01T04:10:21.508243
| 2022-02-28T21:12:31
|
1154555313
|
{
"authors": [
"scala-steward"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14019",
"repo": "Dwolla/mysql-init-custom-resource",
"url": "https://github.com/Dwolla/mysql-init-custom-resource/pull/44"
}
|
gharchive/pull-request
|
Update secretsmanager, sts to 2.17.139
Updates
software.amazon.awssdk:secretsmanager
software.amazon.awssdk:sts
from 2.17.138 to 2.17.139.
I'll automatically update this PR to resolve conflicts as long as you don't change it yourself.
If you'd like to skip this version, you can just close this PR. If you have any feedback, just mention me in the comments below.
Configure Scala Steward for your repository with a .scala-steward.conf file.
Have a fantastic day writing Scala!
Ignore future updates
Add this to your .scala-steward.conf file to ignore future updates of this dependency:
updates.ignore = [ { groupId = "software.amazon.awssdk" } ]
labels: library-update, early-semver-patch, semver-spec-patch, commit-count:1
Superseded by #45.
|
2025-04-01T04:10:21.519641
| 2024-02-05T02:39:43
|
2117549066
|
{
"authors": [
"DynamicsNinja",
"emibanana"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14020",
"repo": "DynamicsNinja/FlowExecutionHistory",
"url": "https://github.com/DynamicsNinja/FlowExecutionHistory/issues/17"
}
|
gharchive/issue
|
App is broken
[Write any error info to resolve easier]
System.IndexOutOfRangeException:
Index was outside the bounds of the array.
Flow Execution History
- System.Collections.Generic.List`1.Add(T item)
- Fic.XTB.FlowExecutionHistory.Services.FlowClient.GetFlowRunsFromApi(Flow flow, String status, DateTimeOffset dateFrom, Boolean includeTriggerOutputs)
- Fic.XTB.FlowExecutionHistory.Services.FlowClient.GetFlowRuns(Flow flow, String status, DateTimeOffset dateFrom, DateTimeOffset dateTo, Int32 durationThreshold, Boolean includeTriggerOutputs)
- Fic.XTB.FlowExecutionHistory.FlowExecutionHistory.<>c__DisplayClass26_0.<GetFlowRuns>b__3(Flow f)
- System.Threading.Tasks.Parallel.<>c__DisplayClass17_0`1.<ForWorker>b__1()
- System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
- System.Threading.Tasks.Task.<>c__DisplayClass176_0.<ExecuteSelfReplicating>b__0(Object <p0>)
XrmToolBox Version: 1.2023.12.68
FlowExecutionHistory Version: 1.2024.1.4
DB Version: 9.2.24012.218
Deployment: Online
I don't know much about the details; it happened when I opened the app today. If I select any of the available flows (after connecting to the Power Automate API), doing any other action will cause an error.
By any action I mean: selecting or deselecting a flow, changing the date, etc.
Do you know if it worked for you before or if you just started using the tool?
I tried on my instance that needs to connect to flow API, and everything is working fine. There is probably some other issue present in your case too.
After reviewing this issue, it seems that there hasn't been any new feedback or activity for some time. In light of this, I'm going to close this issue for now. If the problem resurfaces or if you have any further information to add, please feel free to reopen it or create a new issue, and I'll be happy to assist you further.
Hi, this issues still keeps happening randomly, see below log:
at Fic.XTB.FlowExecutionHistory.Services.FlowClient.<>c__DisplayClass19_0.b__0(FlowRunsCache x)
at System.Linq.Enumerable.FirstOrDefault[TSource](IEnumerable`1 source, Func`2 predicate)
at Fic.XTB.FlowExecutionHistory.Services.FlowClient.GetFlowRunsFromCache(String flowId, String status, DateTimeOffset dateFrom)
at Fic.XTB.FlowExecutionHistory.Services.FlowClient.GetFlowRuns(Flow flow, String status, DateTimeOffset dateFrom, DateTimeOffset dateTo, Int32 durationThreshold, Boolean includeTriggerOutputs)
at Fic.XTB.FlowExecutionHistory.FlowExecutionHistory.<>c__DisplayClass26_0.b__3(Flow f)
at System.Threading.Tasks.Parallel.<>c__DisplayClass17_0`1.b__1()
at System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
at System.Threading.Tasks.Task.<>c__DisplayClass176_0.b__0(Object )
During this error, nothing specific was set, i open the app, connect to power automate api, succeeded, and then when i select my flow, errors occured
|
2025-04-01T04:10:21.615560
| 2022-08-30T16:21:14
|
1356015434
|
{
"authors": [
"Athanasius"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14021",
"repo": "EDCD/EDDN",
"url": "https://github.com/EDCD/EDDN/pull/195"
}
|
gharchive/pull-request
|
schemas: fcmaterials for both Journal and CAPI-sourced data
This is an initial implementation of an fcmaterials/1 schema for Journal source data only.
Closes #193
These new schemas check out as working in an end-to-end test with the matching PR in EDMC and acceptance by dev.eddn.edcd.io, so I'm now going to merge this.
|
2025-04-01T04:10:21.624676
| 2022-04-19T16:50:43
|
1208615561
|
{
"authors": [
"CiceroFonsecaOSG",
"SezarGantousOSG",
"aubeOSG"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14022",
"repo": "EEBOS/SCROWL",
"url": "https://github.com/EEBOS/SCROWL/pull/36"
}
|
gharchive/pull-request
|
[sc-627] - Chore/Playwright
Short summary
Playwright was installed to manage the E2E tests
It will run all tests added to the course-authoring/tests/ folder
Jest will ignore this folder, this way we can have unit tests in the component folder and the main E2E tests within the course-authoring/tests/ folder
*An unrelated key error caught by Jest was fixed in this PR.
Test steps
Run yarn in the root of the project to install all the packages and dependencies
Still in the root, run yarn test:tool:course, this will run all unit and E2E tests
To run the tests in a more granular way, head to the course-authoring/ folder and run:
yarn test:e2e to run E2E Playwright tests only
yarn test:e2e:debug to run Playwright in debug mode, a different window is going to open which allows you to debug each line of your tests
@CiceroFonsecaOSG before review, can you get all status checks to pass. Thanks.
@CiceroFonsecaOSG before review, can you get all status checks to pass. Thanks.
Sure thing, working on that.
@CiceroFonsecaOSG @aubeOSG
You know what, the more I think about it, I've never seen regression testing being part of integration testing. I would move that logic into a separate workflow altogether so it is not run on PRs.
Automated E2E testing usually happens on nightly builds or when pushing to the main branch, something like that. Also, I think we need to build the app first and then test it on different OSs, because I don't think it's testing this currently, right?
We will probably need to use something like this: https://www.npmjs.com/package/xvfb-maybe
and have playwright start the executable... I am sure we can find examples of others.
|
2025-04-01T04:10:21.636360
| 2024-04-19T10:23:03
|
2252623671
|
{
"authors": [
"paolini78",
"sebastian-luna-valero"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14023",
"repo": "EGI-Federation/documentation",
"url": "https://github.com/EGI-Federation/documentation/issues/650"
}
|
gharchive/issue
|
documentation for creating pool accounts on WNs
There is a need to document how to create pool accounts on WNs.
for example there is this old documentation that still refers to the good old YAIM:
https://twiki.cern.ch/twiki/bin/view/LCG/YaimGuide301#users_conf
In the case of other middleware products, the creation of pool accounts could be automatic when enabling a VO, and the documentation could already exist, but it is worth checking whether that is the case and adding a link to it.
In case it helps:
https://www.nordugrid.org/arc/arc7/admins/reference.html#mapping-block
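While the documentation is being written, the following minimal Python sketch shows the kind of helper a site could use to generate YAIM-style users.conf lines for a block of pool accounts. The line format, UID/GID ranges, and VO name below are all assumptions for illustration, not taken from any official source:

```python
# Generate YAIM-style users.conf lines for a block of pool accounts.
# Assumed format: UID:LOGIN:GID:GROUP:VO:: (trailing flag left empty).
# All numeric ranges and names below are illustrative only.

def pool_account_lines(vo, prefix, count, base_uid, gid, group):
    """Return one users.conf-style line per pool account."""
    lines = []
    for i in range(1, count + 1):
        login = f"{prefix}{i:03d}"   # e.g. dteam001, dteam002, ...
        uid = base_uid + i - 1       # consecutive UIDs from base_uid
        lines.append(f"{uid}:{login}:{gid}:{group}:{vo}::")
    return lines

if __name__ == "__main__":
    for line in pool_account_lines("dteam", "dteam", 5, 50000, 5000, "dteam"):
        print(line)
```

The actual account creation (useradd, home directories, grid-mapfile entries) would still need to follow the middleware's own procedure; this only produces the account list.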
|
2025-04-01T04:10:21.638636
| 2020-10-06T09:09:22
|
715487961
|
{
"authors": [
"enolfc",
"gwarf"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14024",
"repo": "EGI-Foundation/documentation",
"url": "https://github.com/EGI-Foundation/documentation/pull/73"
}
|
gharchive/pull-request
|
Improve contributing guide
Improve contributing guide, adding some explicit documentation on using git and GitHub CLI.
Should we move this to the actual page instead of linking?
I've disabled prettier for now; in fact it's more to be used on users' computers to re-write the content properly than to be used as a linter, as the --check output is not very convenient/meaningful: it can only show that files have not gone through prettier.
This should still be enough to be used to check code, but will only be enabled when most/all content will have been properly updated.
I've also started to add a place with some markdown tips; it would be good to have us add things that make sense and that we easily forget.
I feel the style of docs should be highlighted before all the PR details (there is only the comment on valid MD and 80 lines)
|
2025-04-01T04:10:21.642785
| 2024-06-24T08:08:44
|
2369538415
|
{
"authors": [
"DaniBodor",
"psomhorst"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14025",
"repo": "EIT-ALIVE/eitprocessing",
"url": "https://github.com/EIT-ALIVE/eitprocessing/issues/254"
}
|
gharchive/issue
|
update documentation building workflow to mkdocs
I wanted to create a workflow that checks whether the documentation can be built without errors. This issue might be a duplicate of #252 though.
blocked by #233
This has not been implemented with the new documentation building workflow. I created a new PR #318 to deal with this.
I believe this was resolved in #318
|
2025-04-01T04:10:21.654881
| 2023-10-07T18:23:08
|
1931453635
|
{
"authors": [
"Chenzkiii",
"EMERALD0874"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14026",
"repo": "EMERALD0874/Steam-Deck-Themes",
"url": "https://github.com/EMERALD0874/Steam-Deck-Themes/issues/65"
}
|
gharchive/issue
|
New Fonts for SteamDeck?
Hi, I am a fan of your themes on Decky Loader. May I ask you to add more fonts? ☺️
There is a custom font option. No more fonts are planned for Fonts.
|
2025-04-01T04:10:21.662644
| 2023-04-10T05:34:54
|
1660264102
|
{
"authors": [
"TEAM4-0",
"codecov-commenter"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14027",
"repo": "EMMC-ASBL/otelib",
"url": "https://github.com/EMMC-ASBL/otelib/pull/106"
}
|
gharchive/pull-request
|
[Auto-generated] Check & update dependencies (pyproject.toml)
Update dependencies (pyproject.toml)
Automatically created PR based on the 'CI - Check pyproject.toml dependencies' workflow in SINTEF/ci-cd.
Codecov Report
Patch and project coverage have no change.
Comparison is base (51e0add) 90.48% compared to head (ca2d937) 90.48%.
:mega: This organization is not using Codecov’s GitHub App Integration. We recommend you install it so Codecov can continue to function properly for your repositories. Learn more
Additional details and impacted files
@@ Coverage Diff @@
## ci/dependency-updates #106 +/- ##
======================================================
Coverage 90.48% 90.48%
======================================================
Files 25 25
Lines 389 389
======================================================
Hits 352 352
Misses 37 37
Flag
Coverage Δ
linux
90.48% <ø> (ø)
windows
?
Flags with carried forward coverage won't be shown. Click here to find out more.
Help us with your feedback. Take ten seconds to tell us how you rate us. Have a feature suggestion? Share it here.
:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Do you have feedback about the report comment? Let us know in this issue.
|
2025-04-01T04:10:21.664156
| 2022-01-05T19:42:28
|
1094686214
|
{
"authors": [
"kailash360"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14028",
"repo": "EOS-uiux-Solutions/user-story",
"url": "https://github.com/EOS-uiux-Solutions/user-story/issues/111"
}
|
gharchive/issue
|
Add "Created At" field in story description
More info at https://userstory.eosdesignsystem.com/story/60e6df9180886200152e34b9
Can I work on this issue?
PR #113 has been created for this issue.
|
2025-04-01T04:10:21.673509
| 2020-12-23T17:18:00
|
773943215
|
{
"authors": [
"linhuang-blockone"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14029",
"repo": "EOSIO/eos",
"url": "https://github.com/EOSIO/eos/pull/9828"
}
|
gharchive/pull-request
|
Fix packed transaction version conversion -- Release 2.1.x
Change Description
https://blockone.atlassian.net/browse/EPE-579
In the Jungle testnet, LIBs were not advancing in a mix of Rel. 2.0.x and Rel. 2.1.x nodes.
This was due to a 2.0.x validation failure of blocks produced by 2.1.x, which in turn was caused by
incorrect conversion between packed_transaction_v0 and packed_transaction_v1.
The solution is to make sure a conversion does not strip any data.
New test cases are added.
Change Type
Select ONE:
[ ] Documentation
[x] Stability bug fix
[ ] Other
[ ] Other - special case
Testing Changes
Select ANY that apply:
[x] New Tests
[ ] Existing Tests
[ ] Test Framework
[ ] CI System
[ ] Other
Consensus Changes
[ ] Consensus Changes
API Changes
[ ] API Changes
Documentation Additions
[ ] Documentation Additions
Thanks @heifner and @b1bart!
|
2025-04-01T04:10:21.676335
| 2019-04-12T00:38:54
|
432338117
|
{
"authors": [
"kj4ezj"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14030",
"repo": "EOSIO/eosio.cdt",
"url": "https://github.com/EOSIO/eosio.cdt/pull/495"
}
|
gharchive/pull-request
|
Buildkite Integration Test Beta
Change Description
Beta-testing changes described in pull request 494 on release/1.6.x branch.
API Changes
[ ] API Changes
None.
Documentation Additions
[ ] Documentation Additions
None.
Closing in favor of pull request 499.
|
2025-04-01T04:10:21.753764
| 2019-10-15T13:53:43
|
507260988
|
{
"authors": [
"katjaweigel"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14031",
"repo": "ESMValGroup/ESMValTool",
"url": "https://github.com/ESMValGroup/ESMValTool/issues/1380"
}
|
gharchive/issue
|
CVDP overwrites data when run on multiple models
If run on multiple models the CVDP output looks ok on the first glance (as many *.nc files as models, as many panels in each figure as models), but all of these data sets and panels contain the same data (most probably only from one of these models).
Recipe: recipe_cvdp.yml
The reason the data is overwritten is that, at the point where cvdp_wrapper.py writes the master namelist for the *.ncl program, the model name is not part of the path.
This causes the namelist_byvar to point to all data sets (with *) and not to one specific one, so only one of them (the last or the first, I guess) is read in by the *.ncl routines.
Changing:

```python
content.append("{0} | {1} | {2} | {3}\n".format(
    attributes[0]["alias"], ppath, attributes[0]["start_year"],
    attributes[0]["end_year"]))
```

to

```python
content.append("{0} | {1}{0} | {2} | {3}\n".format(
    attributes[0]["alias"], ppath, attributes[0]["start_year"],
    attributes[0]["end_year"]))
```

solves that.
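To see why the extra {0} matters, here is a standalone illustration using hypothetical values (the alias, path, and years below are made up; only the string-formatting behaviour is the point). With the alias appended to the path, each data set gets a unique entry instead of a shared path that every model's glob matches:

```python
# Illustration of the two format strings from the fix above.
# All values are hypothetical; only the formatting behaviour matters.
alias = "CMIP5_MPI-ESM-LR"
ppath = "/work/preproc/"       # the path alone is shared by all models
start_year, end_year = 1979, 2005

# Original: path column identical for every model -> entries collide
before = "{0} | {1} | {2} | {3}".format(alias, ppath, start_year, end_year)
# Fixed: alias appended to the path -> one unique entry per model
after = "{0} | {1}{0} | {2} | {3}".format(alias, ppath, start_year, end_year)

print(before)
print(after)
```

Run on two different aliases, the "before" version produces the same path column twice, while the "after" version produces two distinct paths, which is what the *.ncl routines need.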
|
2025-04-01T04:10:21.757910
| 2021-03-24T10:41:41
|
839587981
|
{
"authors": [
"bouweandela",
"valeriupredoi",
"zklaus"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14032",
"repo": "ESMValGroup/ESMValTool",
"url": "https://github.com/ESMValGroup/ESMValTool/issues/2100"
}
|
gharchive/issue
|
Drop ecmwf-api-client conda package from esmvalgroup conda channel
In commit 192b3f6b0c914f37b643a5e4d49e63976f44908f we added our own conda package due to a perceived reluctance of upgrading the conda-forge package (cf conda-forge/ecmwf-api-client-feedstock#5).
The conda-forge package is now regularly updated and even seems to be ahead of our own package.
Maybe it is time to drop our own package? This would also help with the move of esmvaltool itself to conda-forge.
I moved the package to the deprecated label, if it's still needed, it can be installed with conda install -c esmvalgroup/label/deprecated ecmwf-api-client. If all the installation tests work fine tonight, we can also remove the recipe file ecmwf_api_client_meta.yaml from this repo.
Cheers. Will have a look at the yamale thing.
The conda build succeeded, so ecmwf_api_client_meta.yaml can be removed.
yes, that's great, but it still takes ~4h to build it. I am trying to make it faster in #2096; so far I've managed to get it down to ~40 min (on my laptop, though; on Circle and GA it still takes ages, but I have just set the conda package priority to strict in the latest GA test, fingers crossed that'll do the trick)
BTW the conda build itself was never the problem, it would not fail; it was failing on Circle because it would go over 60 min with no output. Running it with --debug ensures constant output to the screen, and it goes on and finishes every time, after 3-4h, but it finishes fine.
YUSSS! I built it in 42min on GA
|
2025-04-01T04:10:21.761130
| 2018-02-23T11:44:17
|
299685415
|
{
"authors": [
"nicoddemus",
"prusse-martin"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14033",
"repo": "ESSS/pytest-replay",
"url": "https://github.com/ESSS/pytest-replay/pull/3"
}
|
gharchive/pull-request
|
Use plain text files to record node ids and add flag to replay them
Unfortunately due to the limitations of bat file sizes on Windows we
need to use another solution; in the end seems better anyway because
now you can take files generated in one platform and use that in
another platform.
What happens if xdist is used when replay a recorded file?
pytest --replay=.pytest-replay-gw1.txt -n 2
It will run only the tests contained in the file, in parallel (which kind of defeats the purpose, but it works).
|
2025-04-01T04:10:21.787518
| 2024-05-01T09:25:27
|
2273212943
|
{
"authors": [
"Elsa-is-My-Muse",
"krealyt"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0004.json.gz:14034",
"repo": "EVerest/everest-core",
"url": "https://github.com/EVerest/everest-core/issues/663"
}
|
gharchive/issue
|
python3 -m pip install --upgrade pip setuptools wheel jstyleson jsonschema error.
Describe the bug
EVerest Domain
Other
Affected EVerest Module
No response
To Reproduce
No response
Anything else?
No response
I would install the missing package manually.
Checking whether something in the install process can be improved.
Does anyone have a quick additional idea / opinion on this?
|