| added (string; 2025-04-01 04:05:38 to 2025-04-01 07:14:06) | created (timestamp[us]; 2001-10-09 16:19:16 to 2025-01-01 03:51:31) | id (string; length 4 to 10) | metadata (dict) | source (string; 2 classes) | text (string; length 0 to 1.61M) |
|---|---|---|---|---|---|
2025-04-01T06:37:38.984061
| 2024-01-31T19:24:49
|
2110806386
|
{
"authors": [
"rbouwer"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2950",
"repo": "UBC-MDS/PyXplor",
"url": "https://github.com/UBC-MDS/PyXplor/issues/115"
}
|
gharchive/issue
|
Peer Review Feedback
Tests
Check test framework - potential compatibility issues with non-Mac users (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1915907137 - 1). Tests passed for Joey (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916139577)
Look into Marco's experience with Windows: (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1917919802 - 3, 4) @phchen5
Files to Delete
remove pyxplor.py (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1915907137 - 2) (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916139577 - 1) (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916277295 - 2) (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1917919802 - 1) @iris0614 (DELETE: pyxplor.py from src AND from tests)
Vignette @rbouwer
Break the main vignette into smaller ones: a quick-look/basic-usage section, plus a longer in-depth vignette covering the lengthy explorations that are possible with the package, under a separate tab. That said, the current breakdown is definitely helpful. (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1915907137 - 3)
It would be better to have more context about the dataset; a more detailed introduction explaining the importance and applications of EDA in data science would also help users understand the relevance of the examples. (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916139577 - 2)
If possible, include interactive elements or widgets in the tutorial for a hands-on experience. These interactive elements can make the learning process more engaging and effective, and help users better understand the capabilities and use of the package. (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916139577 - 3)
Function Improvements
plot_categorical @rbouwer (Nice to have - OPTIONAL)
For the categorical bar plots, they could be ordered by largest to smallest to make it easier to visualise the categories. They are ordered in the facetted plots, only the first plot in the docs for categorical is not. (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1915907137 - 4)
It might be beneficial to reconsider the use of colour in the Distribution of Categorical Variables. The current approach assigns colours to bars based on their count ranking, which could potentially confuse users. For example, in your example.ipynb, pickup_borough is displayed in green for Bronx, while dropoff_borough is in red, solely due to count variations for the same variable. Given that each bar already has a clear label, the additional colour coding might not be necessary and could be removed. (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916277295 - 3)
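Both suggestions above boil down to sorting the bars by count and using one neutral colour instead of a count-based colour ranking. A minimal sketch of what that could look like (the function name and colour choice are illustrative, not PyXplor's actual API):

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend, safe for scripts/CI
import matplotlib.pyplot as plt

def plot_categorical_sorted(series: pd.Series):
    """Bar plot of category counts, ordered largest to smallest,
    drawn in a single neutral colour (no count-based colour coding)."""
    counts = series.value_counts()  # value_counts sorts descending by default
    fig, ax = plt.subplots()
    ax.bar(counts.index.astype(str), counts.to_numpy(), color="steelblue")
    ax.set_xlabel(series.name or "category")
    ax.set_ylabel("count")
    return counts, fig
```

Since each bar carries its own label, the single colour avoids the confusion the reviewer describes where the same borough is coloured differently across plots.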
plot_numeric @iris0614 (Nice to have - OPTIONAL)
You can enhance the EDA experience by offering scatterplots between the target variable (if numerical) and the numerical explanatory variables, catering to users who want to visualize the relationships before creating models for predictions. (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916277295 - 4)
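A hedged sketch of what such a helper could look like (names and layout are illustrative, not a proposal for PyXplor's actual signature):

```python
import numpy as np
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt

def scatter_vs_target(df: pd.DataFrame, target: str):
    """One scatterplot per numeric explanatory variable against a numeric
    target, previewing the relationships before any modelling."""
    numeric = [c for c in df.select_dtypes("number").columns if c != target]
    n = max(len(numeric), 1)
    fig, axes = plt.subplots(1, n, figsize=(4 * n, 4))
    for ax, col in zip(np.atleast_1d(axes), numeric):
        ax.scatter(df[col], df[target], s=10, alpha=0.6)
        ax.set_xlabel(col)
        ax.set_ylabel(target)
    return numeric, fig
```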
Ordinal? (For later)
Consider EDA for ordinal variables. "passengers" appears to be visualized as an ordinal variable rather than a categorical one, since the natural order of passenger counts is maintained instead of ranking by count as with the other categorical variables (see example.ipynb). (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916277295 - 5)
Badges
Not all badges are there; consider adding the 2 missing ones: continuous integration and test coverage, and Python versions supported (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1915907137 - 5) (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916139577 - 4) (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1916277295 - 1) @arturoboquin
Repo Improvements
Although there is a clickable button that leads you to the documentation, I would suggest adding a link in the "About" section of the repo so that less experienced users don't struggle as much when trying to find it. (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1917919802 - 2) @arturoboquin
README
As of today, the installation instructions' title is "Installation (developers)". As a regular user, that made me think that I was looking into the wrong section and I searched for the "Installation (mortals)" section. As I didn't find one, I assumed that you chose this title as the package is still in development. However, I would remove the "(developers)" in the final version or create a regular user section and move the developer's instructions to the ReadTheDocs full documentation website. (https://github.com/UBC-MDS/software-review-2024/issues/9#issuecomment-1917919802 - 5) @phchen5
any additional improvements (log in this issue)
COMMIT MESSAGE FORMATTING
fix: Feedback addressed by ...
|
2025-04-01T06:37:39.028025
| 2020-10-23T17:33:12
|
728394787
|
{
"authors": [
"atefehsz",
"zzzDavid"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2952",
"repo": "UCLA-VAST/FlexCNN",
"url": "https://github.com/UCLA-VAST/FlexCNN/issues/14"
}
|
gharchive/issue
|
Software Emulation / Onboard Running Stuck after Enqueued task
First of all, thank you for the awesome work.
I am trying to run the FlexCNN workflow and repeat the experiments in the paper. I have encountered some difficulties and I think I could use some help.
Environment
Ubuntu 16.04 LTS
Vivado Design Suite 2018.3
SDx 2018.3
Instruction generation and DSE are done the same as the documentation says, no changes made.
Description
I could not finish running the entire workflow on AWS.F1 instance. SDAccel hw_emu works but sw_emu produces the same error.
The host program output during sw_emu is as follows:
$ ./pose_prj.exe binary_container_1.xclbin
Your working PATH is: /home/niansong.zhang/FlexCNN
cin_size: 15728384
bias_size: 27792
weight_size: 553952
Loading instructions...
Layer num: 95
Preparing data...
Loading input...
Loading weight...
Loading bias...
Loading output...
Found Platform
Platform Name: Xilinx
device number: 1
device name: xilinx_aws-vu9p-f1-04261818_dynamic_5_0
INFO: Importing binary_container_1.xclbin
Loading: 'binary_container_1.xclbin'
Kernel launched!
Enqueued task!
The workflow could not finish running after ~5 hours. However, hw_emu finishes running and passes result check (which is done in the host program).
Makefile
My makefile for sw_emu is: (same as the repo, only changed tool paths)
#
# this file was created by a computer. trust it.
#
# compiler tools
XILINX_VIVADO_HLS ?= $(XILINX_SDX)/Vivado_HLS
SDX_CXX ?= $(XILINX_SDX)/bin/xcpp
XOCC ?= $(XILINX_SDX)/bin/xocc
RM = rm -f
RMDIR = rm -rf
SDX_PLATFORM = xilinx_aws-vu9p-f1-04261818_dynamic_5_0
# SDX_PLATFORM = xilinx_vcu1525_xdma_201830_1
XILINX_XRT = /opt/xilinx/xrt
# host compiler global settings
CXXFLAGS += -DSDX_PLATFORM=$(SDX_PLATFORM) -D__USE_XOPEN2K8 -I/home/Xilinx/SDx/2018.3/runtime/include/1_2/ -I/home/Xilinx/SDx//2018.3/include/ -O2 -Wall -c -fmessage-length=0 -std=c++14 -I/opt/xilinx/xrt/include/
LDFLAGS += -L/opt/xilinx/xrt/lib/ -lxilinxopencl -lpthread -lrt -lstdc++ -L/home/Xilinx/SDx/2018.3/runtime/lib/x86_64
# kernel compiler global settings
XOCC_OPTS = -t sw_emu --save-temps --report system --max_memory_ports top_kernel --platform $(SDX_PLATFORM) -O3 --sp top_kernel_1.m_axi_gmem1:bank0 --sp top_kernel_1.m_axi_gmem2:bank1 --sp top_kernel_1.m_axi_gcontrol:bank0
#
# OpenCL kernel files
#
BINARY_CONTAINERS += binary_container_1.xclbin
BUILD_SUBDIRS += binary_container_1
BINARY_CONTAINER_1_OBJS += binary_container_1/top_kernel.xo
ALL_KERNEL_OBJS += binary_container_1/top_kernel.xo
ALL_MESSAGE_FILES = $(subst .xo,.mdb,$(ALL_KERNEL_OBJS)) $(subst .xclbin,.mdb,$(BINARY_CONTAINERS))
#
# host files
#
HOST_OBJECTS += src/cnn_sw.o
HOST_OBJECTS += src/host.o
HOST_OBJECTS += src/xcl2.o
HOST_EXE = pose_prj.exe
BUILD_SUBDIRS += src/
#
# primary build targets
#
.PHONY: all clean
all: $(BINARY_CONTAINERS) $(HOST_EXE)
clean:
-$(RM) $(BINARY_CONTAINERS) $(ALL_KERNEL_OBJS) $(ALL_MESSAGE_FILES) $(HOST_EXE) $(HOST_OBJECTS)
-$(RM) *.xclbin.sh
-$(RMDIR) $(BUILD_SUBDIRS)
-$(RMDIR) _xocc*
-$(RMDIR) .Xil
.PHONY: incremental
incremental: all
nothing:
#
# binary container: binary_container_1.xclbin
#
binary_container_1/top_kernel.xo: ../src/hw_kernel.cpp ../src/pose.h /home/Xilinx/Vivado/2018.3/include/hls_stream.h /home/Xilinx/Vivado/2018.3/include/ap_int.h /home/Xilinx/Vivado/2018.3/include/ap_fixed.h
@mkdir -p $(@D)
-@$(RM) $@
$(XOCC) $(XOCC_OPTS) -c -k top_kernel --max_memory_ports top_kernel --messageDb $(subst .xo,.mdb,$@) -I"$(<D)" --xp misc:solution_name=_xocc_compile_binary_container_1_top_kernel -o"$@" "$<" --kernel_frequency 310
binary_container_1.xclbin: $(BINARY_CONTAINER_1_OBJS)
-@echo $(XOCC) $(XOCC_OPTS) -l --nk top_kernel:1 --messageDb $(subst .xclbin,.mdb,$@) --xp misc:solution_name=_xocc_link_binary_container_1 --remote_ip_cache /home/niansong.zhang/workspace/ipcache -o"$@" $(+) > binary_container_1.xclbin.sh
$(XOCC) $(XOCC_OPTS) -l --nk top_kernel:1 --messageDb $(subst .xclbin,.mdb,$@) --xp misc:solution_name=_xocc_link_binary_container_1 --remote_ip_cache /home/niansong.zhang/workspace/ip_cache -o"$@" $(+) --kernel_frequency 310
#
# host rules
#
src/cnn_sw.o: ../src/cnn_sw.cpp ../src/pose.h
@mkdir -p $(@D)
$(SDX_CXX) $(CXXFLAGS) -DSDX_PLATFORM=$(SDX_PLATFORM) -D__USE_XOPEN2K8 -I/home/Xilinx/SDx/2018.3/runtime/include/1_2/ -I/home/Xilinx/Vivado/2018.3/include/ -O2 -Wall -c -fmessage-length=0 -o "$@" "$<"
src/host.o: ../src/host.cpp ../src/xcl2.hpp ../src/pose.h
@mkdir -p $(@D)
$(SDX_CXX) $(CXXFLAGS) -DSDX_PLATFORM=$(SDX_PLATFORM) -D__USE_XOPEN2K8 -I/home/Xilinx/SDx/2018.3/runtime/include/1_2/ -I/home/Xilinx/Vivado/2018.3/include/ -O2 -Wall -c -fmessage-length=0 -o "$@" "$<"
src/xcl2.o: ../src/xcl2.cpp ../src/xcl2.hpp
@mkdir -p $(@D)
$(SDX_CXX) $(CXXFLAGS) -DSDX_PLATFORM=$(SDX_PLATFORM) -D__USE_XOPEN2K8 -I/home/Xilinx/SDx/2018.3/runtime/include/1_2/ -I/home/Xilinx/Vivado/2018.3/include/ -O2 -Wall -c -fmessage-length=0 -o "$@" "$<"
$(HOST_EXE): $(HOST_OBJECTS)
$(SDX_CXX) -o "$@" $(+) $(LDFLAGS) -lxilinxopencl -lpthread -lrt -lstdc++ -L/home/Xilinx/SDx/2018.3/runtime/lib/x86_64
I would really appreciate any help or insights. Thank you for looking into this problem.
TL;DR: sw_emu and the hardware test are stuck after the task is enqueued.
Could you please run the code with libsacc/config/openpose.insts so that we can find out whether the problem is in the code or in the instructions?
If that works, add "#define DEBUG_layer" to "util.h", run the software emulation, and see at which layer the code gets stuck. Note that when you add this, hardware emulation doesn't run; you should comment it out for hardware emulation.
Thank you for the quick response!
I changed the instructions to libsacc/config/openpose.insts and recompiled, but software emulation still gets stuck.
Using openpose instructions and enabling DEBUG_layer:
Passed85
Passed86
Passed87
Software emulation of compute unit(s) exited unexpectedly
It was running fine until layer 88. I'll check what layer it is and see if I can get some clue.
A random guess: could it have something to do with the pooling layer? I am using the SDx_project code, and just noticed that pooling is not connected in the engine module (SDx_project/src/hw_kernel.cpp).
The problem here is that there is no layer 88. There are instructions for 87 layers in openpose.insts. Change the LAYER_NUM in params.h to 87 and the problem should go away.
The OpenPose network does not use the pooling layer; that is why it is commented out in SDx_project/src/hw_kernel.cpp. If your network uses a pooling layer, just uncomment the pool module.
Thank you, that solved my problem. Thank you very much for the timely help!
|
2025-04-01T06:37:39.030599
| 2023-05-09T00:21:04
|
1701115184
|
{
"authors": [
"markmatney"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2953",
"repo": "UCLALibrary/prl-harvester",
"url": "https://github.com/UCLALibrary/prl-harvester/pull/62"
}
|
gharchive/pull-request
|
Name intermediate results of Future composition, and other clarifications
The primary aim of this PR is to make it easier to comprehend the Future composition paradigm that I have used extensively throughout the codebase, by giving names to things that were previously anonymous.
I've also corrected some typos and made minor no-op changes for concision (hopefully not at the expense of clarity).
I'm open to all feedback as far as what you all think works, or doesn't.
Also, I'd like to pretend the branch name is "naming things".
I guess the tests ran slowly enough on that particular PR build that they triggered a timeout error. The timeout value on those tests is 10 seconds. My rationale for keeping those was that they provide additional specification of how the component under test should behave, although I am not opposed to removing them if they're going to fail intermittently when run in GHA.
|
2025-04-01T06:37:39.035509
| 2015-02-27T01:14:45
|
59174171
|
{
"authors": [
"amazingcaleb",
"mliou"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2954",
"repo": "UCSB-CS56-Projects/cs56-games-minesweeper",
"url": "https://github.com/UCSB-CS56-Projects/cs56-games-minesweeper/pull/16"
}
|
gharchive/pull-request
|
Fixed issues and added W15_lab06.md
I addressed issues #8, #9 and #13 and added W15_lab06.md as required by lab06
For #8, I didn't add the pictures for the flag and mine; I just made the colors nicer and removed the numbers.
For #9, I added dialog boxes that pop up when you win or lose.
Lab06 procedure changed; make a new pull request to the amazingcaleb branch
|
2025-04-01T06:37:39.076953
| 2024-02-19T15:53:05
|
2142685388
|
{
"authors": [
"daywiss",
"gsteenkamp89"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2955",
"repo": "UMAprotocol/snapshot",
"url": "https://github.com/UMAprotocol/snapshot/pull/127"
}
|
gharchive/pull-request
|
feat: osnap execution warning message
motivation
We want to show oSnap users a warning if their safe is misconfigured and their transaction will not auto execute.
We call an endpoint at https://osnap.uma.xyz/api/sapce-config to inspect the on-chain settings for a given safe.
In Proposals list (already proposed)
In transaction Builder (before proposal)
seeing an issue on space: https://snapshot-77xkgjvgj-uma.vercel.app/#/umadev.eth/create
not sure if this is us or snapshot
prd here: https://github.com/snapshot-labs/snapshot/pull/4567
|
2025-04-01T06:37:39.084869
| 2024-02-28T20:21:36
|
2159803339
|
{
"authors": [
"Krastanov",
"rniffenegger"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2956",
"repo": "UMassQIS/UMassQISWebsite",
"url": "https://github.com/UMassQIS/UMassQISWebsite/issues/7"
}
|
gharchive/issue
|
automated recompilation of the website every week (for updating pages like Publications that crawl external services)
A new paper posted on https://arxiv.org/a/niffenegger_r_1.html
is not showing up on https://quantumdraft.umass-amherst.org/publications/.
The papers update only when a new version of the website is compiled. I will set up a recurrent job to do that once a week when it is closer to done. When I merge #5, the paper should be visible.
it is now complete (see the .buildkite and .github/deploy configs)
|
2025-04-01T06:37:39.144104
| 2021-11-11T13:34:43
|
1050986321
|
{
"authors": [
"JanellC",
"aprematta"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2957",
"repo": "US-EPA-CAMD/easey-ui",
"url": "https://github.com/US-EPA-CAMD/easey-ui/issues/2283"
}
|
gharchive/issue
|
Retest Only: CAMPD Right Menu - EPA 508 issue
Finding
Page/Screen: Right Menu
Finding: The right menu contains 3 subtopics, each of which has both a button and a link. The buttons and links have identical names but different functions: the buttons expand and collapse the corresponding submenus, while the links lead to an external information page. These names should be updated to clarify the function of each.
WCAG 2.0 Standard(s): 2.4.4
Context
The tech team is working on the right menu component. The new menu component will be implemented with ticket Refactor CAMPD UI to use easey-design-system components #1572. This ticket should be retested with ticket #1572
Retest complete: there are no more expandable buttons, only links. The issue is fixed.
|
2025-04-01T06:37:39.148309
| 2022-10-18T02:52:24
|
1412490531
|
{
"authors": [
"JanellC",
"jwhitehead77"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2958",
"repo": "US-EPA-CAMD/easey-ui",
"url": "https://github.com/US-EPA-CAMD/easey-ui/issues/4392"
}
|
gharchive/issue
|
Load Test History Report Definition into database
Need to define & load the report definition...
Example Reports
https://teams.microsoft.com/l/file/CC8EC58F-2014-49DD-9E85-BEC9D39E270C?tenantId=88b378b3-6748-4867-acf9-76aacbeca6a7&fileType=pdf&objectUrl=https%3A%2F%2Fusepa.sharepoint.com%2Fsites%2FCAMDCVPTeam%2FShared Documents%2FTheEmissioners%2FECMPS Reports%2FMonitor Plan<EMAIL_ADDRESS>
Example of test history report:
https://usepa.sharepoint.com/sites/CAMDCVPTeam/Shared Documents/Forms/AllItems.aspx?FolderCTID=0x012000B4ABB0EF9635994FA680705355892410&id=%2Fsites%2FCAMDCVPTeam%2FShared Documents%2FECMPS 2.0%2FECMPS Reports%2FTest History.pdf&parent=%2Fsites%2FCAMDCVPTeam%2FShared Documents%2FECMPS 2.0%2FECMPS Reports
Need clarification from Chris W on the priority of this report
|
2025-04-01T06:37:39.151507
| 2023-10-03T14:40:02
|
1924302438
|
{
"authors": [
"acollad1",
"jwhitehead77",
"mosesdeeCVP"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2959",
"repo": "US-EPA-CAMD/easey-ui",
"url": "https://github.com/US-EPA-CAMD/easey-ui/issues/5731"
}
|
gharchive/issue
|
Bug: Evaluate Critical Error Adjusted Value
Issue: When a user logs in, imports a historical file, and then evaluates, the user receives critical errors regarding the adjusted value
Steps to Recreate:
Log in
Import from historical Q4 2022 Limestone
Evaluate the file
User gets critical errors regarding adjusted value
Link to evaluation report:
https://ecmps-tst.app.cloud.gov/reports?reportCode=EM_EVAL&facilityId=298&monitorPlanId=MDC-D4D7F122FD8F488F8B09A87DF926020E&year=2022&quarter=4
#5731, #5732 and #5759 are all the same problem where adjustedHourlyValue was not being read in on import. These have all been addressed with changes from #5759
Private Zenhub Image
Evaluated without Critical Error
|
2025-04-01T06:37:39.171560
| 2022-02-02T00:17:07
|
1121335960
|
{
"authors": [
"WesIngwersen",
"catherinebirney"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2960",
"repo": "USEPA/stateior",
"url": "https://github.com/USEPA/stateior/issues/9"
}
|
gharchive/issue
|
TwoRegion_Summary_DomesticUse rds missing for 2012 and 2013
The two rds files are missing
https://edap-ord-data-commons.s3.amazonaws.com/index.html?prefix=stateio/
The json files are present, but not the rds files
Uploaded
|
2025-04-01T06:37:39.194744
| 2022-03-01T20:06:22
|
1155763182
|
{
"authors": [
"jds485"
],
"license": "CC0-1.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2961",
"repo": "USGS-R/regional-hydrologic-forcings-ml",
"url": "https://github.com/USGS-R/regional-hydrologic-forcings-ml/issues/64"
}
|
gharchive/issue
|
Make a target for the SOHL land cover data
The SOHL land cover data is not being automatically recognized by the function that scans ScienceBase. We'll need to add this data to the pipeline manually. That could be done in a couple of ways:
After the table of ScienceBase links is fetched, create a target that adds a row for the SOHL data. I think this works best for use in our current workflow.
Create a separate target that fetches the SOHL data.
From @ajsekell: this happened because item_list_children() has a default max limit of 20 items. Increasing that limit resulted in retrieving the SOHL land cover item in the table of ScienceBase links.
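For illustration, the same truncation pitfall exists when querying the ScienceBase catalog endpoint directly; here is a hedged Python sketch (the actual fix was simply raising item_list_children()'s limit in R, and the parentId below is a placeholder):

```python
def child_ids(response_json: dict) -> list:
    """Extract child item IDs from a ScienceBase catalog items response.

    The catalog items endpoint pages its results, so a small default page
    size can silently truncate the child list -- the same pitfall as
    item_list_children()'s default limit of 20. Passing an explicit `max`
    query parameter avoids the truncation.
    """
    return [item["id"] for item in response_json.get("items", [])]

# Query parameters one would send to the catalog items endpoint;
# the parentId value is a placeholder, not a real item ID.
params = {"parentId": "<sciencebase-item-id>", "max": 100, "format": "json"}
```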
I'm going to link this issue with the ScienceBase PR #54
Addressed in #75
|
2025-04-01T06:37:39.196301
| 2023-10-14T06:40:47
|
1943027242
|
{
"authors": [
"USKhokhar"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2962",
"repo": "USKhokhar/fehrist",
"url": "https://github.com/USKhokhar/fehrist/issues/4"
}
|
gharchive/issue
|
Train To Pakistan
Book Title?
Train To Pakistan
Author?
Khushwant Singh
Genre of Book?
Fiction/Historical Fiction
#6
|
2025-04-01T06:37:39.216940
| 2024-09-26T21:39:39
|
2551574146
|
{
"authors": [
"amstilp"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2963",
"repo": "UW-GAC/django-anvil-consortium-manager",
"url": "https://github.com/UW-GAC/django-anvil-consortium-manager/issues/519"
}
|
gharchive/issue
|
Move account link verification email template(s) into the account adapter instead of hardcoding them
Allow the account adapter to specify the email template that should be used to send a verification email, instead of hardcoding it into the view.
This requires adding an "account_verification_email_template" setting to the AccountAdapter, and updating the AccountLink view to use the template from the adapter instead of the hard-coded template. The default template in the adapter should be set to the template currently being used.
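A rough sketch of the proposed shape, in plain Python (the attribute name comes from this issue; the class layout, default path, and helper are assumptions, not the package's actual code):

```python
class AccountAdapter:
    """Sketch of the proposed design: the verification-email template is an
    adapter attribute, with the currently hard-coded template as the default."""
    account_verification_email_template = (
        "anvil_consortium_manager/account_verification_email.html"
    )

def get_verification_template(adapter: AccountAdapter) -> str:
    """What the AccountLink view would do: read the template from the
    adapter instead of using a hard-coded string."""
    return adapter.account_verification_email_template

class CustomAdapter(AccountAdapter):
    """A downstream project overriding the default template."""
    account_verification_email_template = "my_app/custom_verification_email.html"
```

The point of the design is that downstream projects only subclass the adapter; the view never needs to change.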
Closed by #526
|
2025-04-01T06:37:39.219020
| 2016-01-24T19:41:05
|
128419724
|
{
"authors": [
"jhamman",
"tbohn"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2964",
"repo": "UW-Hydro/VIC",
"url": "https://github.com/UW-Hydro/VIC/pull/364"
}
|
gharchive/pull-request
|
Fix/issue_209
Count the number of output vars in each file rather than requiring the user to specify this integer directly.
fixes #209
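The idea can be sketched in a few lines (Python rather than the project's C, and the OUTFILE/OUTVAR keywords are an assumption about the spec format):

```python
def count_output_vars(lines):
    """Count OUTVAR entries per OUTFILE block in a VIC-style output spec.

    Assumed format: 'OUTFILE <name> ...' starts a block, each 'OUTVAR <var>'
    adds one variable to the current block, and '#' lines are comments.
    This replaces the user-supplied nvars integer with a computed count.
    """
    counts = {}
    current = None
    for ln in lines:
        parts = ln.split()
        if not parts or parts[0].startswith("#"):
            continue
        if parts[0] == "OUTFILE":
            current = parts[1]
            counts[current] = 0
        elif parts[0] == "OUTVAR" and current is not None:
            counts[current] += 1
    return counts
```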
This could also be done with forcing vars... just sayin'
@tbohn - do you have any comments on the implementation here? I'm not super familiar with text-file parsing in C, so comments would be appreciated.
I'll put together another PR for the remaining issues in #366.
The text-parsing looks OK. The docs still mention nvars though...
Thanks @tbohn. I've updated the docs; I'll squash, then merge once the Travis tests pass.
|
2025-04-01T06:37:39.229074
| 2017-03-09T05:03:27
|
212936197
|
{
"authors": [
"benjaminwinger",
"chrishajduk84"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2965",
"repo": "UWARG/computer-vision",
"url": "https://github.com/UWARG/computer-vision/pull/79"
}
|
gharchive/pull-request
|
PixelObjectList implementation - Duplicates by visual identification
This first PR only looks at the visual portion of the analysis.
Note there are some preliminary GPS functions and colour functions for now. They are a result of planning. They are included in this PR, but are not in a functional state. Please ignore these mostly empty functions.
This PR should have been completed a while ago....here it is.
Issues fixed with respect to your comments.
Two things that are not in this PR and will be worked on in a future one:
Settings Singleton (Should this be for the whole warg-cv suite? or just for my module? I'm thinking the whole thing)
Option for pre-computed data (for contours), I'm still not quite sure how I will do this exactly. Once again, this is a separate PR.
GPS code works in unit tests; I just need a proper camera calibration (alpha values) and to test it in the field or with videos.
@benjaminwinger If you get the chance, can you review this PR?
Sorry, I've been a little busy (and sick).
Working on it.
Sorry no worries. I'm going to try and get a live version of this running this weekend though...
Also, regarding the duplicate comments from yesterday, the first time I accidentally closed the page and they disappeared, then my graphics driver crashed and I had to reboot my computer.
Apparently my comments weren't lost after all, though they weren't showing up before I submitted the rewritten ones.
|
2025-04-01T06:37:39.230413
| 2016-01-17T02:01:10
|
127069007
|
{
"authors": [
"ccqi",
"divad12"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2966",
"repo": "UWFlow/rmc",
"url": "https://github.com/UWFlow/rmc/issues/267"
}
|
gharchive/issue
|
Setup Slack channel for developers and maintainers
Slack seems to be the collaboration app to use right now. It'd be great if someone could set up a channel as a common place for Flow collaboration.
I just set up a channel! Give me an email and I'll send you an invite.
|
2025-04-01T06:37:39.238323
| 2016-10-25T02:07:48
|
185000714
|
{
"authors": [
"codedragon",
"komashu"
],
"license": "unlicense",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2967",
"repo": "UWPCE-PythonCert/IntroPython2016",
"url": "https://github.com/UWPCE-PythonCert/IntroPython2016/pull/77"
}
|
gharchive/pull-request
|
Catching up with some stuff
Not finished with all of it.
Off to a good start. Mailroom is very nice.
|
2025-04-01T06:37:39.266993
| 2024-11-13T16:34:03
|
2656072879
|
{
"authors": [
"Ujstor",
"regenrek"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2968",
"repo": "Ujstor/self-hosting-infrastructure-cluster",
"url": "https://github.com/Ujstor/self-hosting-infrastructure-cluster/issues/4"
}
|
gharchive/issue
|
S3 key backup
In your diagram you sketch that keys will get backed up to S3, but I don't see any configuration or credentials needed for an S3 account to save backups there.
What am I missing?
S3 backup is for Terraform state and created SSH keys for Hetzner. There is Terraform code for creating buckets, DynamoDB entries, and a bucket for SSH key backup. It assumes that you have AWS CLI installed and you are logged in with valid credentials
SSH backup will create a bucket and back up the keys that are in the coolify_hetzner_infra/.ssh directory during Terraform apply
|
2025-04-01T06:37:39.351097
| 2022-08-31T07:44:28
|
1356938243
|
{
"authors": [
"UnamSanctam",
"masterjek"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2971",
"repo": "UnamSanctam/UnamWebPanel",
"url": "https://github.com/UnamSanctam/UnamWebPanel/issues/186"
}
|
gharchive/issue
|
Duplicate Entries
I checked the miner on my computer and noticed that two identical entries were created in the web panel.
I don't think this is a critical issue. But I wonder what is the reason?
Either they are two miners you have running with different build IDs, or it wasn't able to read the already existing entry in your database so it created a new one on the connection.
Understood. Thanks for the quick response.
|
2025-04-01T06:37:39.390127
| 2021-07-28T12:09:06
|
954788719
|
{
"authors": [
"avilaton",
"tomas-muller"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2974",
"repo": "UniTime/unitime",
"url": "https://github.com/UniTime/unitime/issues/92"
}
|
gharchive/issue
|
Docker compose for demo
Hi,
I have been reading about UniTime and trying to wrap my head around it, to see if we could use it to help high schools create and manage their timetables.
I was able to get set up locally using a docker-compose demo I found at https://github.com/vlatka-sinisa/docker-unitime, and I thought it worth mentioning here: having that docker-compose file in this repository could be a great way to help newcomers set up UniTime locally. It is also a great way to document installation, since Dockerfiles tend to contain all the instructions needed to set up the system.
If you think this is a good idea, I could create a pull request for it, maybe you have some guidance on how it would best be done.
Thanks
It is not my intention to have this docker-compose file serve as a way to deploy unitime to production but to set up the demo environment, which once coded as a set of docker containers, should end up being as simple as docker-compose up. I have learned that the easier we make it for beginners to set up a demo, the greater the chances of them contributing to the project.
If you had a chance to read through the files in that repository you will see they describe two containers, one for the DB and the other for the server, and in under 20 lines install all dependencies to get ready to run it locally.
Perhaps placing them inside the documentation folder is an option.
In any case, thanks for the project; it is impressive! Let me know if you are ok with adding these files to the documentation folder; I can do it as a PR.
Yes, I have looked at the files in the vlatka-sinisa/docker-unitime repo. I am a bit concerned about the absolute paths (like /usr/local/tomcat) -- would it work with Docker on macOS or Windows? Most of the people who contact us struggling to install UniTime are actually Windows users.
I can see having it under something like /Documentation/Docker-Example.
Oh that is the magic of docker, that path is internal to the container, for example https://github.com/vlatka-sinisa/docker-unitime/blob/master/docker/tomcat8/Dockerfile#L12 is mkdir /usr/local/tomcat/data but that happens inside a container. It is as if you got a fresh clean server and you can do that sort of thing inside of it, without affecting the external system.
With that docker setup, all a person needs to have on their machine is docker and docker-compose.
This is also of great help for development, since you can reproduce an environment in seconds, try something out, and if you need, recreate it also within seconds.
I tried running unitime to see if I could help with frontend, or to see if there was a REST api I could use, for example.
A simple docker installation is now available under Documentation/Docker and available starting with UniTime 4.8.126.
|
2025-04-01T06:37:39.437922
| 2023-05-15T07:48:16
|
1709516977
|
{
"authors": [
"vimwitch",
"vivianjeng"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2975",
"repo": "Unirep/Unirep",
"url": "https://github.com/Unirep/Unirep/issues/456"
}
|
gharchive/issue
|
[Audit medium severity] Potential Attestation Loss
The Unirep protocol allows attesters to assert that changes should be made to the owners of particular epoch keys. Over the course of the epoch, these changes are recorded in the attester's epochTree, which is a re-usable incremental merkle tree. When epochs expire, users are expected to apply the changes recorded in the epochTree by performing a state transition. To do so, however, the epoch tree root must be saved along with the state tree root in the history tree. In the Unirep contract, this is done when transitioning to a new epoch, but only if an update has been made to the stateTree. Therefore, if no state tree update has been made, the attestations are discarded.
Location where the epochTree is reset
function updateEpochIfNeeded(
uint160 attesterId
) public returns (uint48 epoch) {
...
if (attester.stateTree.numberOfLeaves > 0) {
uint256 historyTreeLeaf = PoseidonT3.hash(
[attester.stateTree.root, attester.epochTree.root]
);
uint256 root = IncrementalBinaryTree.insert(
attester.historyTree,
historyTreeLeaf
);
attester.historyTreeRoots[root] = true;
ReusableMerkleTree.reset(attester.stateTree);
attester.epochTreeRoots[fromEpoch] = attester.epochTree.root;
emit HistoryTreeLeaf(attesterId, historyTreeLeaf);
}
ReusableMerkleTree.reset(attester.epochTree);
emit EpochEnded(epoch - 1, attesterId);
attester.currentEpoch = epoch;
}
Impact
As the state tree is only changed when a user is added by the attester or when a user transitions state, it is possible for attestations to be lost. In particular, assuming no users are updated, it is possible for current users of the protocol to collude to discard undesirable updates to the state (particularly if there are a small number of users). Since this epoch is essentially discarded and ignored, these users may then transition from the previous epoch to the next epoch (i.e. users may skip e to transition from e-1 to e+1).
Developer Response
It is expected that the attester will validate an epoch key before performing an attestation. As long as this is done properly, then the epoch tree will be empty if the state tree is empty.
I think we can add a revert in attest:
if there are no leaves in the current state tree, the contract reverts the attestation.
Yes we could, it would add a cold SLOAD (a few thousand gas). I think a warning is sufficient for this, attesters should be aware of the epoch keys they're attesting to (e.g. validate that the epoch key exists in the state tree).
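For illustration, the optional guard discussed above might look roughly like this inside attest (a sketch, not the actual Unirep contract code; the storage read is the cold SLOAD mentioned above):

```solidity
// Hypothetical guard: reject attestations while the state tree is empty,
// since they would otherwise be silently discarded at the epoch transition.
require(
    attester.stateTree.numberOfLeaves > 0,
    "Unirep: state tree is empty"
);
```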
|
2025-04-01T06:37:39.453291
| 2023-02-17T09:43:38
|
1589050125
|
{
"authors": [
"alisevych"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2976",
"repo": "UnitTestBot/UTBotJava",
"url": "https://github.com/UnitTestBot/UTBotJava/issues/1809"
}
|
gharchive/issue
|
Go. Assertions for structures should be accurate
Description
Now tests generated for Go structures have assertNotEquals for the whole structure.
Actual and expected structures have only one different field.
If some other field is different in actual and expected structures, the test will pass too.
Context
for example:
utbot-go/go-samples/simple/supported_types_go_ut_test.go see TestStructWithNanByUtGoFuzzer
Actual behavior
assert.NotEqual(t, Structure{int: -1, int8: 1, int16: 32767, int32: -1, int64: -1, uint: 18446744073709551615, uint8: 0, uint16: 1, uint32: 0, uint64: 18446744073709551615, uintptr: 18446744073709551615, float32: 0.02308184, float64: math.NaN(), complex64: complex(float32(0.02308184), float32(0.02308184)), complex128: complex(0.9412491794821144, 0.9412491794821144), byte: 0, rune: -1, string: "", bool: false}, actualVal)
Expected behavior
assert.NotEqual(t, math.NaN(), actualVal.float64)
Environment
IntelliJ IDEA 2022.1 - 2022.2 Ultimate/Community
GoLand 2022.2
Can we identify which fields have changed from the initial ones?
Then we can make assertNotEquals for them only.
Other possible solutions are:
also add assertEquals for other fields of the structure? Like the following:
assert.NotEqual(t, math.NaN(), actualVal.float64)
assert.Equal(t, -1, actualVal.int)
assert.Equal(t, 1, actualVal.int8)
...
Or it would be better to use one assertEquals for the whole structure? Like:
assert.Equal(t, Structure{int: -1, int8: 1, int16: 32767, int32: -1, int64: -1, uint: 18446744073709551615, uint8: 0, uint16: 1, uint32: 0, uint64: 18446744073709551615, uintptr: 18446744073709551615, float32: 0.7815346, float64: 0.3332183994766498, complex64: complex(float32(0.7815346), float32(0.7815346)), complex128: complex(0.3332183994766498, 0.3332183994766498), byte: 0, rune: -1, string: "", bool: false}, actualVal)
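One way to answer "which fields have changed?" is a small field-diff helper; here is a sketch (the Structure below is a simplified stand-in for the generated struct, not the real one):

```go
package main

import (
	"fmt"
	"math"
)

// Structure is a simplified stand-in for the generated struct in the issue.
type Structure struct {
	Int     int
	Float64 float64
}

// fieldDiffs reports the names of fields that differ between two values,
// so a generator could emit assertions only for the changed fields.
func fieldDiffs(expected, actual Structure) []string {
	var diffs []string
	if expected.Int != actual.Int {
		diffs = append(diffs, "Int")
	}
	// NaN never compares equal to itself, so treat two NaNs as "unchanged".
	float64Changed := expected.Float64 != actual.Float64 &&
		!(math.IsNaN(expected.Float64) && math.IsNaN(actual.Float64))
	if float64Changed {
		diffs = append(diffs, "Float64")
	}
	return diffs
}

func main() {
	initial := Structure{Int: -1, Float64: math.NaN()}
	actual := Structure{Int: -1, Float64: 0.5}
	fmt.Println(fieldDiffs(initial, actual)) // prints "[Float64]"
}
```

Emitting one assertNotEquals (or assertEquals) per entry of such a diff would make the generated tests accurate without asserting on unrelated fields.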
During discussion of the issue @Markoutte suggested an assertion approach, that can be useful for all languages:
#1881
|
2025-04-01T06:37:39.495002
| 2021-05-02T18:44:05
|
874014048
|
{
"authors": [
"Pierre-Demessence",
"SimplyJpk"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2977",
"repo": "Unity-Developer-Community/UDC-Bot",
"url": "https://github.com/Unity-Developer-Community/UDC-Bot/issues/16"
}
|
gharchive/issue
|
Bot Raid Protection
Need to remake https://github.com/Sirush/UDHBot/pull/112 in this repo
I'll implement it using the same logic as Sirush#112, but I'll make this into a proper service with a command that can be used to enable it manually, with some sort of cooldown before it automatically turns back off.
Do we still need this? Does wick do this? If we do still need something, was the original solution to just kick any new joins after X number of people join at the same time acceptable?
Think I got most of the way with this, but stopped. I'll see if this was complete and try testing it sometime in the next couple days.
Wick has this feature but only on premium.
So that might still be good to have, especially since you already did some work and it would be a shame to waste it!
|
2025-04-01T06:37:39.537450
| 2021-08-02T22:17:25
|
958550802
|
{
"authors": [
"MFatihMAR",
"SamuelBellomo",
"becksebenius-unity",
"mattwalsh-unity"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2978",
"repo": "Unity-Technologies/com.unity.multiplayer.mlapi",
"url": "https://github.com/Unity-Technologies/com.unity.multiplayer.mlapi/pull/1007"
}
|
gharchive/pull-request
|
chore!: replace MLAPI namespace with Unity.Netcode
Part of our upcoming MLAPI rebranding: we need to change our namespaces around the codebase.
I'm wondering if using "Unity.NGO" instead of "Unity.Netcode" would be better to differentiate it from DOTS netcode.
Changes LGTM assuming there's alignment on the namespace. I don't have a problem with it, although tools is using Unity.Multiplayer and we may want to change that too to follow suit (I'll follow up on this).
This will need some really good messaging so internal devs all know what to expect, and any outstanding PRs know to merge backwards even if there's no conflicts.
Yeah, @becksebenius-unity I think you're right, I think we'd have to all move under Unity.Netcode
For Boss Room, I'd lean toward keeping with our "BossRoom" namespace, since it's all user side code. It'd make sense to not mix namespaces for this.
|
2025-04-01T06:37:39.570258
| 2023-09-08T10:12:54
|
1887368921
|
{
"authors": [
"MartinMUU",
"SundermannC",
"ThiBruUU",
"felixrieg"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2979",
"repo": "Universal-Variability-Language/uvl-lsp",
"url": "https://github.com/Universal-Variability-Language/uvl-lsp/issues/88"
}
|
gharchive/issue
|
Error implicating imported features on core features
While testing constraints, I encountered the following issue, but it only occurs in this particular configuration. See screenshots.
Important: If other features are used on both import1-side and the other side, taut will not occur.
It's the same for me. Looks really weird.
Maybe solved by #94
Fixed by #94
|
2025-04-01T06:37:39.595275
| 2022-07-18T06:32:05
|
1307468778
|
{
"authors": [
"michal-milkowski",
"t-little"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2980",
"repo": "UniversalRobots/RTDE_Python_Client_Library",
"url": "https://github.com/UniversalRobots/RTDE_Python_Client_Library/issues/5"
}
|
gharchive/issue
|
Problem running example_control_loop.py
I've attached a photo of the error. It occurs from the rtde_config.ConfigFile(config_filename) on line 43 of the program.
Any suggestions or pointers in the right direction are appreciated.
It's likely because you're running example_control_loop.py from the main folder instead of the examples folder.
Version 2.7.2 also fixes a minor issue in the control loop example.
|
2025-04-01T06:37:39.601741
| 2022-05-21T19:42:34
|
1244066962
|
{
"authors": [
"LeoVaris",
"mluukkai",
"vaahtokarkki"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2981",
"repo": "UniversityOfHelsinkiCS/oodikone",
"url": "https://github.com/UniversityOfHelsinkiCS/oodikone/issues/3700"
}
|
gharchive/issue
|
[student page - hops] a graduated student has only 27% in hops
014807781
lots of similar cases, e.g. 011547400
It has now magically changed to 100%: https://oodikone.helsinki.fi/students/014807781 ...
Both seem fine
|
2025-04-01T06:37:39.603264
| 2022-10-18T15:22:15
|
1413420900
|
{
"authors": [
"nicole-dmass"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2982",
"repo": "UniversityOfSaintThomas/UST-EASY",
"url": "https://github.com/UniversityOfSaintThomas/UST-EASY/pull/86"
}
|
gharchive/pull-request
|
Term program interaction update
Critical Changes
Term and Program selection updates: after a term is selected, only programs with related Intended Programs Terms with the selected term appear as options
Interaction updates: Interactions created from Application Registration now have additional information (in particular, Term and Program information), interactions are now created for any additional applications started after application registration
actually good to go now!
|
2025-04-01T06:37:39.617915
| 2023-11-20T17:40:29
|
2002696992
|
{
"authors": [
"jlubawy",
"kwasniew"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2983",
"repo": "Unleash/unleash-proxy-client-swift",
"url": "https://github.com/Unleash/unleash-proxy-client-swift/pull/73"
}
|
gharchive/pull-request
|
feat: init context
About the changes
Ability to pass initial context fields when initializing the client
Design considerations:
even though having a strongly typed context would be a nicer solution overall, it's a bigger change that I'd like to handle together with the updateContext refactoring to have a symmetrical API. We can try this breaking change in version 2.0 of this client. For now I'd like to avoid increasing the API surface area and adding 2 types of init/updateContext
since updateContext accepts [String: String] I am thinking about exposing the same Stringly typed context into the init method
I extracted calculateContext method that will apply to both updateContext and init to split flat context [String, String] into standard context fields and user defined properties. But from the usage perspective it's just one Stringly typed map
appName and environments are always taken from the explicit fields (for backwards compatibility) and everything else can be overwritten with context
Important files
Discussion points
This seems reasonable to me, and solves the same problem as #71, only request I have would be to make the Context fields public to allow the following use-case.
Today, if I want to "clear" certain fields (e.g. userId/sessionId on log out), but keep other existing values (e.g. appVersion), I would need to keep a separate copy of properties outside to keep track of values, since Context.properties is not public today. That's why I made this change: https://github.com/Unleash/unleash-proxy-client-swift/pull/71/files#diff-4196a132b84d6fcbe509066c27b6f6ccd247f880ae7e81826c98355c50256889R2
var newContext = client.context
keys.forEach({ newContext.properties.removeValue(forKey: $0.rawValue) })
return client.updateContext(newContext)
|
2025-04-01T06:37:39.621155
| 2023-10-18T13:30:27
|
1949765393
|
{
"authors": [
"kwasniew",
"sighphyre"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2984",
"repo": "Unleash/unleash-proxy",
"url": "https://github.com/Unleash/unleash-proxy/pull/155"
}
|
gharchive/pull-request
|
feat: default session id
About the changes
What problem are we solving?
enabled info and variant info in the same feature have to be consistent when stickiness is set to default
also we want to have consistency between parent and child features
Solution:
if the user didn't provide a sessionId, generate one on the fly
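The on-the-fly generation could be as simple as the following (a sketch, not the proxy's actual code; the exact id format is an assumption):

```javascript
// Sketch: ensure a sessionId exists so that default stickiness resolves
// consistently for enabled/variant info and parent/child features
// within one evaluation.
function withDefaultSessionId(context) {
  if (context.sessionId) {
    return context;
  }
  // Math.random() is enough here: the id only feeds the rollout hash.
  return { ...context, sessionId: String(Math.floor(Math.random() * 1e9)) };
}
```

An existing sessionId is always preserved; only missing ones are filled in.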
Important files
Discussion points
Is Math.random() good enough?
I think so. In the end we only need it to decide rollout here. Should be good enough for 0.1% splits after our hashing
Yeah agreed, I don't think it matters that much in this context
|
2025-04-01T06:37:39.626812
| 2024-05-09T21:28:20
|
2288493395
|
{
"authors": [
"Mygod"
],
"license": "Unlicense",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2985",
"repo": "UnownHash/Golbat",
"url": "https://github.com/UnownHash/Golbat/pull/230"
}
|
gharchive/pull-request
|
Properly preserve gym mega evolutions
Finished mega evolutions do not get cleared so we need to preserve when they finish as well.
Screenshot of ReactMap for some additional context.
|
2025-04-01T06:37:39.639346
| 2022-05-20T06:36:24
|
1242691769
|
{
"authors": [
"Araraura",
"Ununoctium117"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2986",
"repo": "Ununoctium117/fumosite",
"url": "https://github.com/Ununoctium117/fumosite/pull/28"
}
|
gharchive/pull-request
|
Add Mysterious Sword Master Youmu (from LostWord)
New sale https://twitter.com/gift_news/status/1527532221623836672
Thanks, merged with a slight name change
|
2025-04-01T06:37:39.664498
| 2024-08-22T00:38:59
|
2479475082
|
{
"authors": [
"NamreenSyed"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2987",
"repo": "UofT-DSI/shell",
"url": "https://github.com/UofT-DSI/shell/pull/104"
}
|
gharchive/pull-request
|
Assignment1
What changes are you trying to make? (e.g. Adding or removing code, refactoring existing code, adding reports)
What did you learn from the changes you have made?
Was there another approach you were thinking about making? If so, what approach(es) were you thinking of?
Were there any challenges? If so, what issue(s) did you face? How did you overcome it?
How were these changes tested?
A reference to a related issue in your repository (if applicable)
Checklist
[ ] I can confirm that my changes are working as intended
Will this be reviewed and then merged?
|
2025-04-01T06:37:39.666858
| 2022-03-24T14:13:50
|
1179578577
|
{
"authors": [
"CatherineZM",
"Felix-Deng"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2988",
"repo": "UofT-VEEP/VEEP-Website",
"url": "https://github.com/UofT-VEEP/VEEP-Website/issues/36"
}
|
gharchive/issue
|
Home Page (Functions)
Summary
This is the sub issue under Home Page to improve the functionalities of the Home Page of the website.
For Sub-Issue
Change Proposal
Change project types into Card and Button
Propose and add changes to the website to improve usability and interaction
Deadline of the task: TBD
Additional Info
Closing issue to combine it with UI
|
2025-04-01T06:37:39.674236
| 2016-02-20T09:26:33
|
135052625
|
{
"authors": [
"andre-d",
"k0nsl",
"k3d3"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2989",
"repo": "Upload/Up1",
"url": "https://github.com/Upload/Up1/issues/51"
}
|
gharchive/issue
|
Replace highlight.js with prism.js?
Would there be any interest in replacing highlight.js with prism.js?
Refs.
https://github.com/PrismJS/prism
PS: seems like I haven't followed this project for sometime now, as it was still written in Golang when I used it 😂
The reason we use highlight is the language auto detection. If we can replace that portion I would look into replacing it with one of many different highlight engines.
PS: seems like I haven't followed this project for sometime now, as it was still written in Golang when I used it 😂
It actually still is written in Golang, however we've added a second Node server for the time being. Both are currently maintained and we don't have any plans for deprecation at the moment.
The advantage of the Go server is that it is dependency free beyond Go itself
@andre-d:
Yes, that's the issue I stumbled upon when I tried to replace it myself. I couldn't be bothered with it and reverted back to the latest release of highlight.js :)
@k3d3:
Thanks for the clarification.
|
2025-04-01T06:37:39.701287
| 2016-02-02T12:50:47
|
130673992
|
{
"authors": [
"Urigo",
"kamilkisiela"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2990",
"repo": "Urigo/angular-meteor",
"url": "https://github.com/Urigo/angular-meteor/issues/1178"
}
|
gharchive/issue
|
Release Angular1Meteor through Npm
Expose Angular1Meteor through Npm as default.
Also use it in an Atmosphere package to be compatible with Meteor versions before 1.3
The only things that could be exposed on npm are angular-meteor-data and angular-meteor-auth.
Compilers stay in Atmosphere.
Names in npm:
angular-meteor-data as angular-meteor
angular-meteor-auth stays the same
What you all think about using webpack?
I prepared an example how it would be look like.
https://github.com/kamilkisiela/angular-meteor/tree/v1.4.x-npm/packages/angular-meteor-data
About name of the main package.
At the moment angular-meteor has two dependencies:
angular-meteor-data
angular-templates
But angular-templates is a compiler, so it cannot be published as an npm package.
That's the reason for my proposal of keeping angular-meteor-data as angular-meteor in npm.
It allows us to include angular-templates in the main angular-meteor package in the future.
@kamilkisiela sounds great.
It's also aligned with Angular2-Meteor where the process will be:
npm install angular2
npm install angular2-meteor
meteor add angular2-compilers
so in Angular1-Meteor it might look like:
npm install angular
npm install angular-meteor
meteor add angular-compilers // which will consist of `angular-html-templates` (now called angular-templates) and `pbastowski:angular-babel` (until we will get everything inside the official `ecmascript`)
we also are using Webpack there so Webpack sounds great.
@Urigo May I do something in that direction so we can already work on it in v1.4?
v1.4.x branch is rapidly updated so should I still wait? Or may I prepare PR so everybody could switch into angular1-meteor with build process (webpack)?
@Urigo And I think we should use some code standards to keep the code clean and also to prevent silly mistakes like using undefined variables etc. My proposal is to use ESLint with Airbnb's rules with a few changes.
We're using eslint in angular2-now and it works great :)
@kamilkisiela I think that this weekend we will close a beta for 1.3.6 and then use your change in 1.3.7.
So let's wait for Sunday for the Npm branch.
About the code cleanup I think that's a great suggestion.
Can you open it as a separate issue to track?
also, do you think we could connect it to Bithound?
Ok
Bithound supports ESLint, so yeah, it is possible
I'm impatient, so I pulled down the 1.3.6 branch and added a build process (webpack) with a linting utility (eslint).
You can see it here:
https://github.com/kamilkisiela/angular-meteor/tree/v1.3.7/packages/angular-meteor-data
Testing
To run tests in watch mode, so velocity can receive changes and webpack can rebuild the output file:
npm run watch
npm run test:local
To run tests in CI mode:
npm test
Building
Outputs not minified bundle:
npm run build:dist
Outputs minified bundle:
npm run build:prod
Both at once:
npm run build
I came up with an idea.
Names of modules and services are used as strings, so to avoid silly mistakes like typos etc., we can use something like this:
import { Mixer, module as mixerModule } from './modules/mixer';
export const module = 'angular-meteor.reactive';
export const Reactive = '$$Reactive';
angular.module(module, [
  mixerModule
])

.service(Reactive, [
  Mixer,
  function($$Mixer) { /* ... */ }
]);
done.
Thank you @kamilkisiela for the huge help and great work!
|
2025-04-01T06:37:39.703733
| 2015-08-20T23:49:09
|
102261991
|
{
"authors": [
"RichardLitt",
"simonv3"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2991",
"repo": "Urigo/angular-meteor",
"url": "https://github.com/Urigo/angular-meteor/pull/604"
}
|
gharchive/pull-request
|
Initial tweaks to website
What's Changed
Bring some things above the fold
Add link to submit
Move around front page elements
Add Open Source Meta Tags (cc @RichardLitt)
What's Next
Going to explore some new designs for the visual thingies
Clean up some of the text on certain sections of the documentation. Paragraph width, etc.
:+1:
|
2025-04-01T06:37:39.735280
| 2023-07-26T08:52:37
|
1821895650
|
{
"authors": [
"bitfabrikken",
"sebastian-godja",
"uc-leo",
"userCTest"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2992",
"repo": "Usercentrics/react-native-sdk",
"url": "https://github.com/Usercentrics/react-native-sdk/issues/86"
}
|
gharchive/issue
|
[BUG] [IOS] Cannot build, error: 'guard' body must not fall through, consider using a 'return' or 'throw' to exit the scope
Describe the bug
When trying to build a project in xcode, the error occurs, and the build fails.
The error is:
'guard' body must not fall through, consider using a 'return' or 'throw' to exit the scope
To Reproduce
Steps to reproduce the behavior:
Build
See error
Expected behavior
Build without failing.
Additional context
macOS 12.6
Xcode 14.0.1
"@usercentrics/react-native-sdk": "^2.8.2"
"react-native": "0.68.6"
Hey @bitfabrikken ,
do you think you could provide us with the stack trace/build log so we can see where the error is coming from?
Cheers,
Rui
@userCTest It's happening in UsercentricsAnalyticsEventType+Int.swift, line 8, col 9.
@userCTest podfile included in case it helps
require_relative '../node_modules/react-native/scripts/react_native_pods'
require_relative '../node_modules/@react-native-community/cli-platform-ios/native_modules'
platform :ios, '12.0'
install! 'cocoapods', :deterministic_uuids => false
target 'asdf' do
config = use_native_modules!
use_frameworks! :linkage => :static
$RNFirebaseAsStaticFramework = true
$RNFirebaseAnalyticsWithoutAdIdSupport=true #added to not use ad-ids in analytics
flags = get_default_flags()
use_react_native!(
:path => config[:reactNativePath],
:hermes_enabled => flags[:hermes_enabled],
:fabric_enabled => flags[:fabric_enabled],
:app_path => "#{Pod::Config.instance.installation_root}/.."
)
post_install do |installer|
react_native_post_install(installer)
__apply_Xcode_12_5_M1_post_install_workaround(installer)
end
end
I had the same problem without using the new Track API and I managed to fix it
You can use patch-package to make the patch
```diff
--- a/UsercentricsAnalyticsEventType+Int.swift
+++ b/UsercentricsAnalyticsEventType+Int.swift
@@ -2,9 +2,10 @@ import Usercentrics
 public extension UsercentricsAnalyticsEventType {
-    static func initialize(from value: Int) -> UsercentricsAnalyticsEventType {
+    static func initialize(from value: Int) -> UsercentricsAnalyticsEventType? {
         guard let eventType = UsercentricsAnalyticsEventType.values().get(index: Int32(value)) else {
             assert(false)
+            return nil
         }
         return eventType
     }
--- a/RNUsercentricsModule.swift
+++ b/RNUsercentricsModule.swift
@@ -188,7 +188,8 @@ class RNUsercentricsModule: NSObject, RCTBridgeModule {
     }

     @objc func track(_ event: Int) -> Void {
-        usercentricsManager.track(event: UsercentricsAnalyticsEventType.initialize(from: event))
+        guard let usercentricsAnalyticsEventType = UsercentricsAnalyticsEventType.initialize(from: event) else { return }
+        usercentricsManager.track(event: usercentricsAnalyticsEventType)
     }

     @objc func reset() -> Void {
```
Everything seems to work, but maybe they'll come up with a better solution for this
@bitfabrikken
I had the same problem without using the new Track API and I managed to fix it
You can use patch-package to make the patch
UsercentricsAnalyticsEventType+Int.swift

```swift
import Usercentrics

public extension UsercentricsAnalyticsEventType {
    static func initialize(from value: Int) -> UsercentricsAnalyticsEventType? {
        guard let eventType = UsercentricsAnalyticsEventType.values().get(index: Int32(value)) else {
            assert(false)
            return nil
        }
        return eventType
    }
}
```

RNUsercentricsModule.swift, line 190

```swift
@objc func track(_ event: Int) -> Void {
    guard let usercentricsAnalyticsEventType = UsercentricsAnalyticsEventType.initialize(from: event) else { return }
    usercentricsManager.track(event: usercentricsAnalyticsEventType)
}
```
Everything seems to work, but maybe they'll come up with a better solution for this
@sebastian-godja thx for the info!
@bitfabrikken thanks for creating the support ticket. If that's ok with you, will carry the discussion on the ticket?
@userCTest I am using "react-native": "0.72.3"

```swift
import Usercentrics

public extension UsercentricsAnalyticsEventType {
    static func initialize(from value: Int) -> UsercentricsAnalyticsEventType? {
        guard let eventType = UsercentricsAnalyticsEventType.values().get(index: Int32(value)) else {
            assert(false)
            return nil
        }
        return eventType
    }
}
```
Thanks for this, it allows me to build further.
But then it stops on some errors:
Undefined symbol: _SKAdNetworkCoarseConversionValueHigh
Undefined symbol: _SKAdNetworkCoarseConversionValueLow
Undefined symbol: _SKAdNetworkCoarseConversionValueMedium
Undefined symbol: _SKStoreProductParameterAdNetworkSourceIdentifier
glad to know @bitfabrikken!! And thanks @sebastian-godja for the workaround! :)
Anyway, as I said before, we will still be looking into this, so hopefully, some updates will follow soon.
Cheers,
Rui
fixes in https://github.com/Usercentrics/react-native-sdk/pull/88
|
2025-04-01T06:37:39.744225
| 2023-12-19T11:46:31
|
2048496667
|
{
"authors": [
"qubixes"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2993",
"repo": "UtrechtUniversity/iBridges",
"url": "https://github.com/UtrechtUniversity/iBridges/issues/16"
}
|
gharchive/issue
|
Discuss how to handle exception/None for default resource
I am currently throwing an error when the default resource is not set. But it might be handier to return `None`, since that might be easier to test for in the data uploads and downloads. For the up- and downloads we would have to check whether the default resource exists. The session just sets it, but currently you can also set irods_default_resource to 'bogus' and the property would be set without throwing an error ...
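A sketch of the None-returning variant (class and attribute names here are hypothetical, not iBridges' actual API):

```python
# Sketch: expose the default resource as a property that yields None when
# unset, instead of raising an exception.
class Session:
    def __init__(self, default_resource=None):
        # No validation here; callers must check that the resource exists.
        self._default_resource = default_resource

    @property
    def default_resource(self):
        """Return the configured default resource, or None when not set."""
        return self._default_resource
```

Callers can then write `if session.default_resource is None: ...` before an upload or download instead of wrapping every access in try/except.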
Feel free to change and if you are happy to merge.
Originally posted by @chStaiger in https://github.com/UtrechtUniversity/iBridges/issues/14#issuecomment-1859994562
I don't think this is currently a problem anymore, reopen if wrong.
|
2025-04-01T06:37:39.789032
| 2021-08-02T15:20:49
|
958253759
|
{
"authors": [
"AntonDueck",
"Pythocrates",
"collani-bosch"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2994",
"repo": "VDA5050/VDA5050",
"url": "https://github.com/VDA5050/VDA5050/issues/51"
}
|
gharchive/issue
|
Pause and resume actions or not?
Context
The definition of the startPause action [6.8.1] states: "... Actions can continue...", whereas the definition of the stopPause action states: "... Movement and all other actions will be resumed (if any)..."
The "finished" column in the table in [6.8.2] suggests that actions should be paused (startPause: "... All actions will be paused..." and stopPause: "... All paused actions will be resumed...").
Questions
Is it safe to assume that actions should be paused and that the first quote is a mistake?
If used in a preemptive fashion as described in Issue #8 (which makes perfect sense to me - if I want to pause something, I must preempt it), do we maybe need some additional flag indicating whether or not to resume any preempted actions?
If you have atomic actions you may delay the action result until this action is finished, and then mark the pause action as finished. Or you reject the pause action with an error.
So I would prefer the first way.
@AntonDueck : I agree; if an already running action cannot be paused, the state of 'startPause' is RUNNING until the running action is finished. Rejecting 'startPause' would be unfortunate, because it depends on timing whether 'startPause' is accepted or not.
You have to inspect the action result (of all actions) after your instant action is finished to see what has been paused and which actions have finished.
|
2025-04-01T06:37:39.792960
| 2021-06-28T17:53:31
|
931819115
|
{
"authors": [
"moontrip"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2995",
"repo": "VEuPathDB/web-components",
"url": "https://github.com/VEuPathDB/web-components/pull/161"
}
|
gharchive/pull-request
|
handling date for scatter plot etc
This PR involves several pieces of work concerning the scatter plot and its viz (at web-eda)
Detailed description and some screenshots can be found at the corresponding viz
Note: these involve significant changes from the previous scatter plot (XYPlot) component, thus merging with the plot tidy-up work by Bob may need to be done carefully
Updated with tidy-up works in conjunction with the corresponding viz part. Details can be found at the corresponding PR
I presume that Bob agreed with this PR as he approved corresponding viz PR :)
|
2025-04-01T06:37:39.804647
| 2021-06-10T17:12:25
|
917585628
|
{
"authors": [
"bobular",
"d-callan",
"dmfalke",
"moontrip",
"nkittur-uga"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2996",
"repo": "VEuPathDB/web-eda",
"url": "https://github.com/VEuPathDB/web-eda/issues/172"
}
|
gharchive/issue
|
plot thumbnails to reflect config set when expanded
When I've minimized a visualization (I was looking at scatter) and gone back to where I can add another, it's reasonable that a user might expect the configurations they set when expanded/full screen to still apply.
So if a user chooses log scale y axis in histogram, the next freshly made histogram should have log scale on by default?
Some settings (like log y scale) could be sticky, but others, like bin width, should not.
Currently, a user can copy/clone a visualisation if they wanted to preserve some settings. I tried this with histogram. You can set a custom log-scale and bin width, then clone the viz, then change the variable in the new viz, the log scale persists, but the bin width goes back to default, which is IMO the desired behaviour.
I mean when I configure a viz and then minimize it, the thumbnail version should probably keep the configuration. I didn't mean to apply the configuration to a new viz. Sorry I wasn't clear.
That's OK.
The behaviour you describe should be what happens! There may be a bug.
Could you give more details please?
Did a quick test. The scatter plot config (the plot type radio buttons) is
honoured in the thumbnail versions, so I don't see the problem you see -
yet!
Right. It may be that what I'm asking for isn't very easy, but the client-side controls don't seem to persist, like if I click something off in the legend, for example. I'm not sure a user has any reason to know why some things persist and others don't. It'll probably be particularly noticeable for something like switching axes, because for some plot types I think client-side switching makes sense, while for others a new request should be made to the data service.
Or maybe it's even specific to the plotly legend; I haven't played around with it too much yet, to be honest.
We are not persisting the result of legend interactions. I'm sure this is something we can address.
Oh I see. Won't it be nasty persisting internal plotly things? That's why we've disabled virtually all of the interactivity features.
implementation hint: plotly has a callback to capture entire state of a plot (ask @dmfalke for more)
post phase 1 - roll our own legend will make this work redundant (will need to redo it)
Tested in GEMS1 -> Bar plot of "Study timepoint" with "age group" as overlay. When I switch off an age group in legend and minimize the plot, when I open it again, the state persists (that age group is still off). Passes QA.
@nkittur-uga thank you for your tests! 👍
|
2025-04-01T06:37:39.813045
| 2022-12-21T10:30:45
|
1506076147
|
{
"authors": [
"Bochlin",
"kraigher",
"wrightsg"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2997",
"repo": "VHDL-LS/rust_hdl_vscode",
"url": "https://github.com/VHDL-LS/rust_hdl_vscode/issues/55"
}
|
gharchive/issue
|
Allow specifying vhdl_ls version to use
When selecting embedded for the language server, the plugin will always use the latest vhdl_ls release.
This can break stuff without actually changing anything. In my case, I'm running the VHDL LS plugin on Ubuntu 18.04. The latest vhdl_ls release 0.22 now requires glibc version > 2.27. Ubuntu 18.04 only comes with glibc version 2.27 so I cannot use the VHDL LS plugin anymore.
It would be nice to be able to specify an explicit vhdl_ls version with the embedded option.
It would probably be better if we built vhdl_ls using a somewhat older docker image so that it does not require such a new glibc. Currently it is built using ubuntu-latest.
I did some investigation and GitHub is about to deprecate the ubuntu-18.04 runner, so it does not seem attractive to use it. You could always build vhdl_ls yourself. I believe the vscode plugin supports pointing out binaries you build yourself.
Using an old version of vhdl_ls is not good since a lot of new functionality is being added right now.
Yes you can use your own compiled language server, just point it out with the vhdlls.languageServerUserPath option.
Another option is to build the binaries without any libc dependency using the musl target. That would reach the most possible systems. If someone makes a PR of that it probably would be approved.
Yes cargo install just adds the binary. Cargo has no concept of a data folder. When building locally you should just check out the code and run cargo build --release and point to the binary in the target/release folder and it will find the libraries.
Anyway, we should just build a musl version that does not depend on glibc and avoid this complexity for users such as you.
Note that it is not recommended to mix vhdl_libraries folder with a vhdl_ls binary when they are not from the same commit. The standard.vhd package is tightly coupled with the binary and has recently changed as the vhdl_ls analysis became smarter.
@Bochlin I am adding x86_64-unknown-linux-musl to the github release targets with a plan to phase out x86_64-unknown-linux-gnu. Is it possible to make a new rust_hdl_vscode release that uses the musl binary?
@Bochlin https://github.com/VHDL-LS/rust_hdl/releases/tag/v0.24.0 now includes musl builds which do not have any glibc dependency on Linux
So rust_hdl_vscode should switch from linux-gnu to linux-musl zip folder on linux.
Published 0.4.0 which uses the musl build instead.
@Bochlin excellent
@wrightsg could you try if it solved your original problem?
I can confirm that the extension works again on Ubuntu 18.04 with version v0.4.0.
Thank you!
|
2025-04-01T06:37:39.815132
| 2021-02-18T17:12:50
|
811296578
|
{
"authors": [
"maqzi"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2998",
"repo": "VIDA-NYU/openclean-core",
"url": "https://github.com/VIDA-NYU/openclean-core/pull/110"
}
|
gharchive/pull-request
|
Documentation
adds the data provenance section
updates notebook links
adds a knn and token signature subsection
fixes minor formatting issues
It might be necessary to update the refdata dependency in docs/requirements.txt to >=0.2.0(?)
good catch!
|
2025-04-01T06:37:39.817731
| 2024-04-07T23:01:54
|
2229997681
|
{
"authors": [
"VK2BEA",
"tomverbeure"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:2999",
"repo": "VK2BEA/HP8753-Companion",
"url": "https://github.com/VK2BEA/HP8753-Companion/pull/11"
}
|
gharchive/pull-request
|
Add command line to not display splash screen.
Title says it all. :-)
When you're debugging a startup issue, the splash screen gets old real quick!
Tom
already added with --debug 2 (or greater)
|
2025-04-01T06:37:39.836861
| 2023-01-13T06:33:05
|
1531781699
|
{
"authors": [
"Danil42Russia",
"KorDum"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3000",
"repo": "VKCOM/modulite-phpstan",
"url": "https://github.com/VKCOM/modulite-phpstan/issues/18"
}
|
gharchive/issue
|
Allow symfony/yaml:^6
Hello!
Can you allow Symfony Yaml package version as ^5.4 || ^6 please?
Problem 1
- Root composer.json requires vkcom/modulite-phpstan * -> satisfiable by vkcom/modulite-phpstan[v1.0.0].
- vkcom/modulite-phpstan v1.0.0 requires symfony/yaml ^5.4 -> found symfony/yaml[v5.4.0, ..., v5.4.17] but the package is fixed to v6.2.2 (lock file version) by a partial update and that version does not match. Make sure you list it as an argument for the update command.
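The requested change would be a one-line constraint update in this package's composer.json (a sketch of the relevant fragment only, not the full file):

```json
{
    "require": {
        "symfony/yaml": "^5.4 || ^6.0"
    }
}
```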
There is pull request
https://github.com/VKCOM/modulite-phpstan/pull/12
@KorDum, hi!
We fixed the symfony/yaml version problem in #42. Shall we close your issue?
|
2025-04-01T06:37:39.960679
| 2022-03-20T09:21:34
|
1174484392
|
{
"authors": [
"ValentinH",
"raed667"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3005",
"repo": "ValentinH/react-easy-crop",
"url": "https://github.com/ValentinH/react-easy-crop/pull/366"
}
|
gharchive/pull-request
|
[CI] Add Quality Gate Action
Add a "quality-gate" GitHub Action that runs build, lint then unit test.
The action can run on push, PR, or manually.
I added a badge to the README but I can't see if it will work properly until it runs on the main repo.
Additional changes: I had to add a condition to the release workflow so that it wouldn't fail on my fork.
Next steps:
Remove the duplicated workflows from CircleCI
Create a Github workflow for Cypress
Add better integration for test results, linting, coverage reporting, etc .. (maybe something like https://danger.systems/js/ )
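For illustration, such a quality-gate workflow might look roughly like this (the script names and the yarn choice are assumptions about the project's setup, not taken from this PR):

```yaml
name: quality-gate
on: [push, pull_request, workflow_dispatch]

jobs:
  quality-gate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - run: yarn install --frozen-lockfile
      - run: yarn build   # assumed script names
      - run: yarn lint
      - run: yarn test
```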
Thanks for this!
|
2025-04-01T06:37:39.969581
| 2024-05-14T17:29:41
|
2296008217
|
{
"authors": [
"pchrapka"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3006",
"repo": "ValidereInc/grafana-fargate",
"url": "https://github.com/ValidereInc/grafana-fargate/pull/12"
}
|
gharchive/pull-request
|
fix(CHB-2989): update engine version on instance
Resolves Jira ticket
Context
Missed a spot where engine version is specified
Housekeeping Checklist
[ ] Linked the PR to a Jira ticket?
[ ] Linked the Jira ticket to the PR?
[ ] Checked Draft/vs Ready to Review?
[ ] Tagged the reviewers if Ready for Review?
On CH backend we use
lifecycle {
ignore_changes = [
engine_version,
]
}
to allow aurora to auto update without breaking our deployments.
thanks!!
|
2025-04-01T06:37:40.129990
| 2020-08-12T07:28:33
|
677463636
|
{
"authors": [
"ProNoobLi",
"Vandermode"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3007",
"repo": "Vandermode/ELD",
"url": "https://github.com/Vandermode/ELD/issues/9"
}
|
gharchive/issue
|
Artifacts in the synthetic low light clean image
Hi, I got artifacts when generating synthetic low-light clean images. According to your paper, the fake low-light clean image = long-exposure image / ratio. But this operation (large integers are divided by the ratio, then the floats are converted back to integers) squeezes the range of the values, which loses accuracy and produces a "non-continuous step" pattern in the image, which feels like an HDR image displayed on an 8-bit screen. The result is as follows:
original long exposure image
synthesize low light clean image after auto-brightness for imshow
original low light noisy image
synthesize noise image based on the "non-continious" low light clean image
How do you fix the artifacts?
while actually such operation(large integers are divided by ratio then make float to integers)
You shouldn't convert the float to int in this step
Hi, but anyway the photon-electron map converted from the low-light clean raw consists of integers, right?
My way:
long-exposed-clean raw(integer) -> synthetic-low-light-clean(float)-> photon-electrons map(integer)-> poisson noisy photon-electrons map(integer) ->poisson noisy raw(integer)
It's not practical to keep float numbers in the step of generating the Poisson noisy photon-electron map
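For illustration, here is a minimal pure-Python sketch of the pipeline being discussed, keeping the scaled clean signal as a float (no early rounding) and only producing integers when the Poisson sample is drawn. The sampler, the example values, and the ratio are all assumptions for the sketch, not the authors' code:

```python
import math
import random

def poisson_sample(lam: float, rng: random.Random) -> int:
    """Draw one Poisson-distributed count with mean lam (Knuth's method;
    fine for a sketch, slow for large lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= threshold:
            return k - 1

def synthesize_noisy_raw(long_exposure, ratio, rng):
    """Scale long-exposure raw values down by `ratio` and add shot noise.
    The division result stays a float; integers appear only after sampling."""
    noisy = []
    for value in long_exposure:
        mean_electrons = value / ratio  # keep as float, do not round here
        noisy.append(poisson_sample(mean_electrons, rng))
    return noisy

rng = random.Random(0)
clean_raw = [400, 800, 1600]  # hypothetical long-exposure values
noisy_raw = synthesize_noisy_raw(clean_raw, ratio=100.0, rng=rng)
```

Rounding `mean_electrons` to an integer before sampling is exactly the kind of early quantization that introduces the step artifacts described above.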
|
2025-04-01T06:37:40.161611
| 2023-12-11T09:11:26
|
2035184802
|
{
"authors": [
"op2786",
"psifertex"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3008",
"repo": "Vector35/binaryninja-api",
"url": "https://github.com/Vector35/binaryninja-api/issues/4818"
}
|
gharchive/issue
|
Multiple BN window corrupts Binary Ninja menu
Version and Platform (required):
Binary Ninja Version: 3.6.4712-dev Personal, 599e2ad7
OS: macos
OS Version: 14.1
CPU Architecture: arm64
Bug Description:
When we have multiple BN instances open, the Binary Ninja menu has some duplicated entries.
Steps To Reproduce:
Open multiple BN instances
Click Binary Ninja menu
See there are multiple entries
Screenshots:
This is a Qt bug and a duplicate of https://github.com/Vector35/binaryninja-api/issues/2630
|
2025-04-01T06:37:40.162900
| 2017-12-14T01:34:02
|
281953463
|
{
"authors": [
"bpotchik",
"plafosse"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3009",
"repo": "Vector35/binaryninja-api",
"url": "https://github.com/Vector35/binaryninja-api/issues/889"
}
|
gharchive/issue
|
Case labels missing from all views except HLIL
Unlike the assembly view, the IL views do not show the case labels for resolved jump tables.
The case labels in asm view were broken in 2.4.2900 with commit https://github.com/Vector35/binaryninja/commit/039aee205de3625a3b4c5bc99891fff2c5a932fc
|
2025-04-01T06:37:40.164595
| 2021-04-19T17:36:32
|
861584716
|
{
"authors": [
"marpie"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3010",
"repo": "Vector35/binaryninja-api",
"url": "https://github.com/Vector35/binaryninja-api/pull/2380"
}
|
gharchive/pull-request
|
Fix wrong CorePluginABIVersion, plugin_abi_version, and plugin_abi_minimum_version
The current Rust lib incorrectly defines the functions needed to register ABI versions.
As I'm not familiar with rustgen I cannot provide the right fix to generate uitypes.h and retrieve the variables BN_MINIMUM_UI_ABI_VERSION and BN_CURRENT_UI_ABI_VERSION.
With this fix the included plugins build and run with the latest BinaryNinja dev release.
Ah yeah, you are right. If you want, you can close this PR and integrate the changes yourself; I'm not at my dev station right now and cannot make the changes until later this week.
|
2025-04-01T06:37:40.267666
| 2024-10-07T05:52:52
|
2569396481
|
{
"authors": [
"Shizu-ka",
"jeromehardaway",
"jonulak"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3011",
"repo": "Vets-Who-Code/VetsAI",
"url": "https://github.com/Vets-Who-Code/VetsAI/issues/2"
}
|
gharchive/issue
|
Add Unit Tests for Text Extraction Functions
Testing Framework: Use a testing framework like unittest or pytest to create unit tests for the text extraction functions.
• Coverage: Write tests for various scenarios, including:
• Valid PDF and DOCX files.
• Empty documents.
• Documents with non-text content (images, charts).
• Files with special characters and different encodings.
• Continuous Integration: Integrate the tests into the CI pipeline to ensure they run automatically on code changes.
Relevant Code Sections:
• extract_text_from_pdf(file)
• extract_text_from_word(file)
Acceptance Criteria:
Unit tests should be created using unittest or pytest.
Tests should cover all outlined scenarios (valid files, empty files, non-text content, special characters, etc.).
Tests should be integrated with the CI pipeline and run automatically on every code push.
Code coverage should increase with the addition of these tests.
Technical Considerations:
• Use unittest or pytest for testing.
• Mock file handling where necessary to simulate different scenarios.
• Ensure that test dependencies are properly configured in the CI pipeline.
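A minimal sketch of what such tests could look like with the standard-library unittest module. The extractor below is a hypothetical stand-in (the real extract_text_from_pdf would parse actual PDF structure); it is only here so the test file is self-contained:

```python
import io
import unittest

def extract_text_from_pdf(file) -> str:
    """Hypothetical stand-in for the real extractor so these tests can run."""
    return file.read().decode("utf-8", errors="ignore")

class ExtractTextTests(unittest.TestCase):
    def test_valid_document_returns_text(self):
        self.assertEqual(extract_text_from_pdf(io.BytesIO(b"hello resume")),
                         "hello resume")

    def test_empty_document_returns_empty_string(self):
        self.assertEqual(extract_text_from_pdf(io.BytesIO(b"")), "")

    def test_special_characters_survive(self):
        payload = "na\u00efve caf\u00e9".encode("utf-8")
        self.assertEqual(extract_text_from_pdf(io.BytesIO(payload)),
                         "na\u00efve caf\u00e9")
```

Run with `python -m unittest` locally; in CI the same command can be a workflow step so the tests run automatically on every push.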
will work on this, assign this to me
@Shizu-ka I am going to give first right to refusal to my troops then I will assign to you if no one takes it.
I can work on this if it's still available
@jonulak go for it.
Unit tests are looking good, but I'll need access to GitHub Actions to implement automated tests.
|
2025-04-01T06:37:40.339778
| 2023-03-02T09:45:57
|
1606472256
|
{
"authors": [
"Haleygo",
"f41gh7"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3012",
"repo": "VictoriaMetrics/operator",
"url": "https://github.com/VictoriaMetrics/operator/issues/599"
}
|
gharchive/issue
|
why not create client for VMAlertmanagerConfig and VMAUTH
https://github.com/VictoriaMetrics/operator/issues/481 did the work of using code-generator to generate clients for the VM CRDs. But types like VMAlertmanagerConfig and VMAuth don't have the "// +genclient" tag needed to create a client.
I am wondering why. Can they be added now?
Must be fixed at v0.31.0 version
|
2025-04-01T06:37:40.360195
| 2018-01-09T13:15:36
|
287084810
|
{
"authors": [
"MacDisein",
"TAKeanice",
"josh64x2"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3013",
"repo": "ViennaRSS/vienna-rss",
"url": "https://github.com/ViennaRSS/vienna-rss/issues/1060"
}
|
gharchive/issue
|
Tab bar should be always visible
It is really annoying that the tab bar appears/disappears if you open/close an article.
The tab bar should be always visible to avoid reorganization of the layout. Each time you open an article the tab bar appears and the other controls are moved down - and the opposite way if you close the last article tab.
That's true, it is quite annoying
This will be solved when I replace the tab bar.
Solved, great implementation
|
2025-04-01T06:37:40.372110
| 2024-09-29T20:40:37
|
2555146045
|
{
"authors": [
"BinaryBhakti",
"Vignesh025"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3014",
"repo": "VigneshDevHub/CampX",
"url": "https://github.com/VigneshDevHub/CampX/issues/30"
}
|
gharchive/issue
|
issue #4
Description
This update proposes a significant enhancement to the user interface (UI), focusing on multiple aspects of visual and functional improvements. The main areas of change include the overall layout, background gradients, typography (fonts, text sizing, and placement), and box designs. The aim is to modernize the interface, improve clarity, and create a more engaging and cohesive user experience that aligns with the CampX brand.
Current Behavior
Layout: The current layout, while functional, lacks structure and modern aesthetics. Spacing, alignment, and organization of elements could be improved for better readability and flow.
Background: The existing background is static and lacks depth. It does not fully capture the essence of CampX's adventurous and nature-focused branding.
Fonts & Text Sizing: The current typography is basic and inconsistent. It lacks a clear visual hierarchy, making it harder for users to navigate the page intuitively.
Text Placement: Text placement is not optimized for readability, leading to an inconsistent and cluttered appearance.
Box Designs: Input fields and form elements are functional but lack the modern styling that would enhance the overall user experience.
Proposed Behavior
Layout: The new layout introduces a cleaner, more structured design that enhances user navigation and overall visual appeal. Elements are more balanced with improved spacing and alignment to guide the user's attention smoothly.
Background: Gradients have been added to the background, bringing a sense of depth and dynamism. The use of soft gradients adds a modern and vibrant feel that better reflects CampX's adventurous and outdoor-centric identity.
Fonts & Text Sizing: The updated typography uses more contemporary and readable fonts, with clearly defined sizes that establish a stronger visual hierarchy. This helps in guiding users through the page efficiently.
Text Placement: Text placement has been optimized to ensure clarity and readability. Proper spacing around the text allows for a more professional and polished look.
Box Designs: Form fields and buttons have been redesigned to appear more modern, with cleaner lines, better spacing, and intuitive styling. The new design makes interactive elements like input fields, buttons, and validation messages more user-friendly and visually engaging.
Screenshots
Additional Context
This update is informed by modern UI/UX principles and user feedback, highlighting the need for a more intuitive, engaging, and visually appealing interface. The introduction of gradients, updated fonts, and restructured layout ensures the design remains contemporary while aligning with the core themes of the CampX brand. These changes are not just aesthetic but aim to create a seamless and enjoyable user experience.
Impact
Usability: The clearer layout, modern typography, and improved text placement enhance readability and ease of navigation, making the login and registration process smoother.
Visual Appeal: The addition of gradients, refined text sizing, and polished form elements elevates the overall aesthetic of the UI, making a strong first impression on users.
Accessibility: By ensuring better readability, contrast, and responsiveness, the redesign caters to a wider range of users, including those with visual impairments.
Performance: A more streamlined design with modern elements can improve loading times, especially on mobile devices, leading to a more efficient and responsive experience.
Related Issues
This proposal complements ongoing discussions about UI consistency across the CampX platform, ensuring that design standards and visual appeal are maintained throughout. Additionally, it addresses user feedback regarding the need for better accessibility and visual coherence across devices.
@BinaryBhakti , Thank you for the new issue! Everything looks great, but I have a few suggestions. It would be ideal to have a background that is not completely dark or white—perhaps a lighter background with some subtle dark shades. I’d also like to see some padding around the campground picture for better spacing. Lastly, please ensure there's a carousel feature for displaying multiple campground images. Looking forward to seeing these changes!
Do you have a logo from which I can extract colors, so that I can keep the color theme consistent with it?
I don't have a logo at the moment, and honestly, I'm not great with UI. So, I'd prefer avoiding any green. Black and white shades would be ideal for the color theme. Thanks!
If this doesn't match your requirements then I will be happy to change the design according to your needs.
@BinaryBhakti , I'm happy with this. Make sure you sync your fork before making a pull request. Thank you!
Ok
You haven't assigned me this issue. Could you assign me?
@BinaryBhakti ,I've assigned it to you!
|
2025-04-01T06:37:40.379489
| 2018-11-04T04:26:52
|
377119858
|
{
"authors": [
"Shigma",
"batracos"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3016",
"repo": "ViktorQvarfordt/Sublime-WolframLanguage",
"url": "https://github.com/ViktorQvarfordt/Sublime-WolframLanguage/issues/21"
}
|
gharchive/issue
|
What is nb_code_styles for?
See here. nb_code_styles is assigned but never used. What is it for?
@chere005 did you add it? I don't remember writing it.
Aha. I did add it but its reference has been removed.
It's meant for "notebook cell style", which may be displayed in an .nb file like this:
(* ::Input:: *)
1 + 2
Then the cell below will be displayed as an input cell.
This pattern seems to have little significance. Maybe it's time to remove it.
|
2025-04-01T06:37:40.380728
| 2021-04-08T21:24:31
|
853934071
|
{
"authors": [
"Txori",
"VilleKrumlinde"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3017",
"repo": "VilleKrumlinde/zgameeditor",
"url": "https://github.com/VilleKrumlinde/zgameeditor/pull/4"
}
|
gharchive/pull-request
|
Replaced ifend with endif
As Delphi complains a bit about that:
Legacy '$IFEND' directive found. Consider changing to '$ENDIF' or enable $LEGACYIFEND at line 39 (39:3)
Then ZDesigner compiles perfectly.
Thanks! I'll change to $endif everywhere. I had forgotten about this
|
2025-04-01T06:37:40.387650
| 2014-11-17T11:16:59
|
49076293
|
{
"authors": [
"Celc",
"VincentGarreau",
"errogaht",
"ignaty",
"lorenmh",
"watadarkstar"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3018",
"repo": "VincentGarreau/particles.js",
"url": "https://github.com/VincentGarreau/particles.js/issues/19"
}
|
gharchive/issue
|
Lags on Firefox
Hello!
In Chrome it works well, but of course it lags in Firefox: it is very slow and laggy.
Hi @VincentGarreau I'm willing to fix this issue if you can mentor me through this bug. Any idea why its lagging on Firefox or what sections of the code I should look into that might be causing these performance issues?
@VincentGarreau you could start ;)
@VincentGarreau if 'distanceParticles' is commented out, then the lag goes away
This might not be solvable because Firefox has some serious issues with lineTo. It's 21x slower than chrome: http://jsperf.com/draw-lines
I tried making a minimal proof of concept drawing 300 lines (no animation, no alpha transparency) and the performance degrades quickly with canvas size, it's completely unusable on rMBP. Mozilla also seem to be aware lineTo is slow: https://bugzilla.mozilla.org/show_bug.cgi?id=1001954
My small demo page drawing 300 lines: http://jsfiddle.net/4ry8pdpb/1/, in my actual project it's closer to 600 lines that's rendered effortlessly by chrome.
Thanks for your detailed explanation @Celc! I noticed that Firefox has serious issues with lineTo, and your explanation enlightened me.
|
2025-04-01T06:37:40.391623
| 2024-12-03T20:33:50
|
2715925179
|
{
"authors": [
"TimonGisler",
"Vinzent03"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3019",
"repo": "Vinzent03/obsidian-git",
"url": "https://github.com/Vinzent03/obsidian-git/pull/822"
}
|
gharchive/pull-request
|
[bug fix] Allow committing of files larger than 100 MB if handled by LFS
The plugin refuses to add files bigger than 100 MB if the remote is GitHub, even if a file is handled by LFS.
I added some logic that checks if a file is (or will be) handled by LFS, before blocking the commit.
Detailed Problem Description
Commit Where Check for Too Big Files Was Added
I have not yet implemented IsomorphicGit, but I am happy to do that if I get feedback on whether my change is being considered.
The changes were only tested on my Windows 11 machine (not on Linux or macOS).
This is my first ever PR to an open source project, so any feedback is welcome. I would be very happy if this actually were merged. :D
Thanks for the fixes. I just noticed that you are currently passing the vault_path to the git lfs check, but it needs the repo relative path, which is currently not stored in those objects. So it needs a bit more of restructure, which I will add to this pr in the next days.
|
2025-04-01T06:37:40.394205
| 2023-11-17T17:00:19
|
1999568670
|
{
"authors": [
"AsangaColney",
"Viren070"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3020",
"repo": "Viren070/Emulator-Manager",
"url": "https://github.com/Viren070/Emulator-Manager/issues/12"
}
|
gharchive/issue
|
Windows Defender detects it as a virus and deleted it
Is this a false alarm?
Yes. I think it's because I don't sign the executable files, so Windows Defender just marks it as a virus since it's from an unknown publisher. It is safe to run; it is only a false alarm.
|
2025-04-01T06:37:40.416568
| 2021-11-24T20:20:25
|
1062862197
|
{
"authors": [
"NS-BOBBY-C"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3021",
"repo": "ViroCommunity/viro",
"url": "https://github.com/ViroCommunity/viro/issues/62"
}
|
gharchive/issue
|
GVR only showing Viro360Image on left eye
Requirements:
Please go through this checklist before opening a new issue
[x] Review the documentation
[x] Search for existing issues in: viromedia/viro & ViroCommunity/viro
[x] Use the latest ViroReact release
Environment
Please provide the following information about your environment:
Development OS: Mac
Device OS & Version: Android 10
Version: "@viro-community/react-viro": "^2.21.1", "react-native": "0.66.2",
Device(s): Pixel XL
Description
I'm only seeing the image on one side of the screen. There is a small blue sparkle which I couldn't make out for the right eye. Zooming in with a screenshot didn't help either.
Reproducible Demo
export default function Screen() {
return (
<ViroVRSceneNavigator
initialScene={{
scene: MyStartScene,
}}
/>
);
}
export const MyStartScene = () => {
return (
<ViroScene>
<Viro360Image source={require('../../../../assets/images/grid.jpeg')} />
</ViroScene>
);
};
I did see something about stereoMode having issues with values other than 'None' - is it possible that the stereoMode is not being set down through react --> android?
https://forum.unity.com/threads/google-vr-unity-2019-2-1-lwrp-only-showing-left-eye-right-eye-blank.735389/
Reproduced here
|
2025-04-01T06:37:40.428101
| 2016-10-05T20:23:33
|
181256190
|
{
"authors": [
"csandfeld",
"iainbrighton"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3022",
"repo": "VirtualEngine/Lability",
"url": "https://github.com/VirtualEngine/Lability/issues/153"
}
|
gharchive/issue
|
Add Lability Module Caching cmdlets
Now that Lability is downloading and caching PowerShell and DSC resource modules, we could do with some additional cmdlets to manage these (currently I have to manipulate the cache directly in the filesystem).
Get-LabModule(Cache) - Returns cached modules
Remove-LabModule(Cache) - Deletes a cached module
Clear-LabModuleCache - Empties all cached modules
@csandfeld If we think that Lability is the correct place, we could add the following too?:
Install-LabModule - Registers all required cached modules defined in a Lability configuration
Used to enable compiling MOFs on the Lability host
Should default to CurrentUser scope
Uninstall-LabModule - Unregisters all modules defined in a Lability configuration
Should default to CurrentUser scope
Might be better just deleting all user-scoped modules?!
Not sure it gets you closer to a decision, but see my comment in #147 :-)
|
2025-04-01T06:37:40.434099
| 2022-04-03T08:36:01
|
1190861193
|
{
"authors": [
"claudio-ebel",
"zkrolikowski-vl"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3023",
"repo": "VirtusLab/pandas-stubs",
"url": "https://github.com/VirtusLab/pandas-stubs/issues/162"
}
|
gharchive/issue
|
"DatetimeIndex" has no attribute "strftime"
Minimal working example
# ––– file strftime_bug.py
import pandas as pd
days: pd.DatetimeIndex = pd.date_range('2020-1-1', periods=3)
print([day for day in days.strftime("%d")])
Behaviour
mypy emits error:
$ mypy strftime_bug.py
strftime_bug.py:5: error: "DatetimeIndex" has no attribute "strftime"
Found 1 error in 1 file (checked 1 source file)
execution works:
$ python strftime_bug.py
['01', '02', '03']
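As a plain-Python illustration of the formatting the snippet relies on (a sketch using stdlib datetime objects standing in for the pandas DatetimeIndex, not the pandas API itself):

```python
from datetime import date, timedelta

# Three consecutive days, analogous to pd.date_range('2020-1-1', periods=3)
days = [date(2020, 1, 1) + timedelta(days=i) for i in range(3)]

# strftime("%d") renders the zero-padded day of month for each date
print([day.strftime("%d") for day in days])  # → ['01', '02', '03']
```

The runtime behaviour is the same either way; only the type stubs were missing the `strftime` attribute.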
Versions
$ python --version
Python 3.10.4
$ mypy --version
mypy 0.942
$ pip freeze | grep stub
pandas-stubs==<IP_ADDRESS>
pandas-stubs has moved to a new repository and will now be managed alongside pandas itself: https://github.com/pandas-dev/pandas-stubs
You might try using the newest version pip install pandas-stubs==<IP_ADDRESS>626 which comes from that repository. If it doesn't work please considering opening an issue in the new repository.
|
2025-04-01T06:37:40.436294
| 2020-05-28T15:23:29
|
626599877
|
{
"authors": [
"plusvic",
"vincent-guesnard"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3024",
"repo": "VirusTotal/yara",
"url": "https://github.com/VirusTotal/yara/issues/1291"
}
|
gharchive/issue
|
[ADDING] New company which use your solution
Hello,
Javier Ramirez from VirusTotal asked us to open an issue about this... well, that's not an issue at all.
Thanks for all the work you did and do; it's an amazing job.
Our company has been using your solution for 3 years - we would be glad to be included in the company list: www.touchweb.fr / TouchWeb - as you proposed in the README.
Thanks for all,
Kind regards,
Vincent
Added in 9a3b4e3d7d246df9cadd43420d24195a7a7ea758. Thank you very much for your feedback!
|
2025-04-01T06:37:40.445150
| 2021-03-10T00:14:48
|
826839300
|
{
"authors": [
"Vishal-raj-1",
"harshgupta20",
"kumarishalini6"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3025",
"repo": "Vishal-raj-1/Awesome-JavaScript-Projects",
"url": "https://github.com/Vishal-raj-1/Awesome-JavaScript-Projects/issues/275"
}
|
gharchive/issue
|
bug: the search bar is not working.
Search bar:
while searching something like "project" etc., nothing happens.
Can you assign this to me under GSSoC'21?
@kumarishalini6 Yeah Sure !! You can implement that feature
Any update ?
Reply within 2 days or else i have to close this issue.
|
2025-04-01T06:37:40.454186
| 2022-06-29T15:10:26
|
1288875207
|
{
"authors": [
"Wangyf1998",
"junchen14"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3026",
"repo": "Vision-CAIR/VisualGPT",
"url": "https://github.com/Vision-CAIR/VisualGPT/issues/12"
}
|
gharchive/issue
|
About memory overflow error during training
Hi! Thanks for the code. When I train the model to batch 42, I get the following error using 4 GTX 2080 Ti GPUs:
"CUDA: out of memory, tried to allocate..."
I set the batch size to 10 and it still occurs. Is it purely a hardware problem? What device did you use to train the model?
I noticed that you said your code doesn't support multi-GPU training a year ago - is it still not supported?
Thank you!
hi, it is because the code does not support multi-GPU training well. Sorry for the inconvenience!
|
2025-04-01T06:37:40.456602
| 2021-05-05T14:06:11
|
876489276
|
{
"authors": [
"TouringBubble"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3027",
"repo": "VitaHEX-Games/HexFlow-Launcher",
"url": "https://github.com/VitaHEX-Games/HexFlow-Launcher/issues/29"
}
|
gharchive/issue
|
Existing PSP/PS1 titles not shown
I'm currently running HexFlow on my Vita (1000) using my existing Sony memory card. So, I've got titles that were installed from PSN before jailbreak on the card and in bubbles shown on the home screen.
My issue is that the existing bubbles for PSN downloaded titles are not showing in HexFlow.
The titles show as available in the BubbleManager app, but don't show as created, as they already existed prior to the install.
Is there a way to get HexFlow to check for existing bubbles for PSP/PS1 titles?
Additional Info: Reinstalling a title from PSN store does not change the issue. Still doesn't show. Tested with FF Origins reinstall.
|
2025-04-01T06:37:40.493497
| 2021-07-06T08:47:42
|
937664432
|
{
"authors": [
"iantearle"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3028",
"repo": "VoidVolker/MagickSlicer",
"url": "https://github.com/VoidVolker/MagickSlicer/issues/14"
}
|
gharchive/issue
|
Slicing appears off by 1px
Normally not really visible, I think this is only exaggerated on the diagonal lines we have on the image itself.
Have tried multiple options but every set of images appears to result in the same thing happening. Any ideas?
This is in use with OpenSeadragon, and the effect is only visible when zoomed in at least 5 times.
Further to this, even the "duomo" image from OSD is suffering from this same effect, and it appears to be down to the overlap 1px rule. I believe MagickSlicer doesn't yet support this. There's a solution albeit, using a different library, see my ticket on OSD https://github.com/openseadragon/openseadragon/issues/2004
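The 1px-overlap rule mentioned above can be sketched in a few lines: each tile starts `tile - overlap` pixels after the previous one, so neighbouring tiles share a strip of pixels and no seam appears when the viewer stitches them. This is an illustrative sketch of the general technique, not MagickSlicer's or OpenSeadragon's actual code.

```python
def tile_ranges(size, tile, overlap=1):
    """Yield (start, end) pixel ranges (end exclusive) for tiles of
    `tile` px where neighbours share `overlap` px."""
    step = tile - overlap
    start = 0
    while start < size:
        end = min(start + tile, size)
        yield (start, end)
        if end == size:
            break
        start += step

# A 10px strip cut into 4px tiles with 1px overlap:
print(list(tile_ranges(10, 4, overlap=1)))  # → [(0, 4), (3, 7), (6, 10)]
```

With `overlap=0` the tiles abut exactly, which is where a 1px rendering seam can show up on zoom.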
|
2025-04-01T06:37:40.505216
| 2024-03-23T21:25:00
|
2204053733
|
{
"authors": [
"VonHeikemen",
"jamesonBradfield"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3029",
"repo": "VonHeikemen/fine-cmdline.nvim",
"url": "https://github.com/VonHeikemen/fine-cmdline.nvim/issues/32"
}
|
gharchive/issue
|
finecmdline not loading when syntax errors in config (how do yall solve not having commandline accessible when you have an error in your config?)
I am a complete noob to Neovim scripting and am not sure if this is because I am using it wrong, but it is maddening when I go to save a file and get Not an editor command: FineCmdline! I'm pretty sure this is just a limitation of the idea of a nicer command-line plugin. Do I have any option other than opening a new terminal and using nano? How do you solve this problem?
I recommend mapping the enter key to execute the command FineCmdline, and leave : as is. That way you can always use the default Neovim command line.
Another thing you can try is calling the lua function directly.
vim.keymap.set('n', '<Enter>', '<cmd>lua require("fine-cmdline").open()<cr>')
If the lua function doesn't work either, then the issue is in the way you are loading the plugin.
Also worth mentioning: you can open Neovim without a config using the command nvim --clean in your terminal.
|
2025-04-01T06:37:40.507753
| 2022-12-19T09:18:59
|
1502587318
|
{
"authors": [
"AyalaBu",
"rachelbt",
"yinonov"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3030",
"repo": "Vonage/vivid-3",
"url": "https://github.com/Vonage/vivid-3/issues/920"
}
|
gharchive/issue
|
[avatar]: proposal to add an additional appearance
In addition to outline + filled, we should consider adding duotone for a more delicate and subtle design.
@yinonov , @AyalaBu WDYS?
why? was there a request?
There will be soon.
Tamir spoke to me about adding this variant
is this aligned with the team designing the chat?
anyway, there's no formal request, let's discuss when there is
|
2025-04-01T06:37:40.532584
| 2021-04-25T15:06:43
|
867034207
|
{
"authors": [
"AMiller42",
"Lyxal"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3031",
"repo": "Vyxal/Vyxal",
"url": "https://github.com/Vyxal/Vyxal/issues/28"
}
|
gharchive/issue
|
Vectorised print commands not disabling implicit output
When using the v command to vectorise print commands, the print commands do not disable implicit output.
Example: Try it Online!
well dang.
Try it Online!
It looks like this is actually a different issue. If you use O to disable the implicit output, nothing gets printed. Try it Online! If you then add an extra , after the vectorised print, it prints the same unexpected output, so it looks like the vectorised printing is actually modifying the list somehow. Try it Online!
This is a bigger issue than just vectorised printing.
|
2025-04-01T06:37:40.555680
| 2024-02-09T01:45:15
|
2126341221
|
{
"authors": [
"SlayerOrnstein"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3033",
"repo": "WFCD/warframestat_client",
"url": "https://github.com/WFCD/warframestat_client/pull/15"
}
|
gharchive/pull-request
|
chore: prepare to publish to pub.dev
Status
IN DEVELOPMENT
Description
Prepare project to be published to pub.dev
Type of Change
[ ] ✨ New feature (non-breaking change which adds functionality)
[ ] 🛠️ Bug fix (non-breaking change which fixes an issue)
[ ] ❌ Breaking change (fix or feature that would cause existing functionality to change)
[ ] 🧹 Code refactor
[x] ✅ Build configuration change
[x] 📝 Documentation
[x] 🗑️ Chore
Still a few things left but the pana package that does the checklist needs this part in order to update the score
:tada: This PR is included in version 3.6.1 :tada:
The release is available on:
GitHub release
v3.6.1
Your semantic-release bot :package::rocket:
|
2025-04-01T06:37:40.623398
| 2016-06-05T16:02:00
|
158561107
|
{
"authors": [
"CLAassistant",
"jpbernius"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3034",
"repo": "WINFOspace/Blog",
"url": "https://github.com/WINFOspace/Blog/pull/3"
}
|
gharchive/pull-request
|
Add a Gitter chat badge to README.md
Adds Gitter Badge from #2, but correctly placed.
Resolves #2.
This manually fixes Problem in #2 with @gitter-badger as addressed in gitterHQ/readme-badger#4 and gitterHQ/readme-badger#22.
Thank you for your submission, we really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution. 1 out of 2 committers have signed the CLA. :white_check_mark: jpbernius :x: gitter-badger
|
2025-04-01T06:37:40.636317
| 2021-06-29T12:28:23
|
932594763
|
{
"authors": [
"WJCHumble",
"edanweis"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3035",
"repo": "WJCHumble/vite-plugin-vue2-css-vars",
"url": "https://github.com/WJCHumble/vite-plugin-vue2-css-vars/issues/2"
}
|
gharchive/issue
|
Cannot read property 'content' of undefined for components with no style tags.
When components or dependencies have no style tags in them, I get error:
[vite] Internal server error: Cannot read property 'content' of undefined
@edanweis It was fixed in 0.1.9.
That was fast. @WJCHumble but now I am getting:
Internal server error: TypeError: Cannot read property 'spaces' of undefined
in two of my components.
@edanweis Can you provide a reproduction? I am not sure what the problem is.
I discovered it was a missing dependency, but when using vite-plugin-vue2-css-vars with pnpm, the missing dependency error was not handled or shown in console. I am working through other problems and will create more issues as I go. Thanks Wu
|
2025-04-01T06:37:40.741349
| 2020-09-06T14:50:29
|
694408810
|
{
"authors": [
"RossellaFer",
"ann-kilzer",
"tuttiq"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3036",
"repo": "WWCodeTokyo/speak-her-db",
"url": "https://github.com/WWCodeTokyo/speak-her-db/issues/119"
}
|
gharchive/issue
|
Encourage bios to be written in 3rd person.
Is your feature request related to a problem? Please describe.
I notice a lot of the bios are written in the 1st person (I ...). Let's encourage users to submit them in the third person (she/they)
Describe the solution you'd like
[ ] Change the hint on the speaker bio box to a persistent-hint.
[ ] Review the bios in Airtable and rewrite them in 3rd person.
Describe alternatives you've considered
[ ] Could also write a vuelidate custom validator to check for use of "I" or わたし.
Additional context
Please solve each of these as separate PRs / tasks.
In hindsight, our site is not a conference CFP form, it's just a database of profiles, so I don't see why bios in first person are a big problem (people write their LinkedIn bios in first person and it seems acceptable?).
If we put too many small restrictions, it just makes the form harder to fill and doesn't bring much value (I don't think bios in first person are a huge decrease in the quality of entries...). We're just adding the risk of more people dropping the nomination process halfway through because it's too "mendokusai".
Honestly, if everyone is okay and no feelings are harmed, I'd vote to just drop this feature and keep the form simple.
I would even go further and open a ticket to remove the "mandatory" condition from the "Japanese name" as well, since some foreigners may prefer to not have their names translated (or just don't know how to translate it / how to input Japanese characters).
What do you think?
I agree with this, since it generates an error maybe it would discourage people from completing the form.
However, the persistent hint is a good idea, but this is only a guideline instead of validation. It's a minor change but I will make another PR for this
Yeah, perhaps we are making another challenging form to complete. 😩
Thanks @RossellaFer for your hard work on this ticket nonetheless.
Thank you both for your work on this! ❤️
|
2025-04-01T06:37:40.754529
| 2019-05-28T12:45:04
|
449248331
|
{
"authors": [
"PontusLindberg",
"Smurf-IV",
"Wagnerp"
],
"license": "BSD-3-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3037",
"repo": "Wagnerp/Krypton-NET-5.470",
"url": "https://github.com/Wagnerp/Krypton-NET-5.470/issues/190"
}
|
gharchive/issue
|
[BUG] Scaling of Krypton Forms with multiple monitors
Scaling of Krypton forms does not work when dragging them between monitors with different scaling.
When you use multiple monitors and they have different scaling (for example a full HD laptop with 100% and a 4K screen with 150%), Krypton forms are not scaled according to the monitor settings when dragging them from one monitor to the other. Regular Windows Forms are scaled depending on the settings for each monitor.
I use the 4.6 version.
Any feedback would be greatly appreciated.
Just Checking: @PontusLindberg Have you set the following in your app.config
<System.Windows.Forms.ApplicationConfigurationSection>
<add key="DpiAwareness" value="PerMonitorV2" />
</System.Windows.Forms.ApplicationConfigurationSection>
And in the App.Manifest:
<!-- Windows 10 -->
<supportedOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}" />
etc....
<!-- Indicates that the application is DPI-aware and will not be automatically scaled by Windows at higher
DPIs. Windows Presentation Foundation (WPF) applications are automatically DPI-aware and do not need
to opt in. Windows Forms applications targeting .NET Framework 4.6 that opt into this setting, should
also set the 'EnableWindowsFormsHighDpiAutoResizing' setting to 'true' in their app.config. -->
<application xmlns="urn:schemas-microsoft-com:asm.v3">
<windowsSettings>
<dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true</dpiAware>
</windowsSettings>
</application>
May be related to #73
Thanks @Smurf-IV, I will check and get back to you.
@Smurf-IV , the Krypton forms we use are in dlls which are 'add-ons' to an Autodesk application. As long as we stick to regular Windows forms, those are scaled correctly when dragged between monitors. When we change to Krypton, the scaling when dragging between monitors stop working. When we change back to Windows forms, it works fine again.
Have a look at this: https://docs.microsoft.com/en-us/dotnet/framework/winforms/high-dpi-support-in-windows-forms I thought that the app.config configuration was only supported in 4.7 and higher?
EDIT: Yes high DPI awareness support only exists in .NET 4.7 or higher: _Starting with the .NET Framework 4.7, Windows Forms includes enhancements for common high DPI and dynamic DPI scenarios. _
Aha, so we need newer Krypton dlls?
Yes, 5.470 has this configuration built into it, but you'll also may need to re-target your projects to .NET 4.7 or newer.
Removed bug label, as .NET Framework 4.7 or higher is required for high DPI scaling.
|
2025-04-01T06:37:40.800039
| 2023-03-12T14:32:13
|
1620386873
|
{
"authors": [
"Neti-Sade",
"bkrem"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3038",
"repo": "WalletConnect/web-examples",
"url": "https://github.com/WalletConnect/web-examples/issues/132"
}
|
gharchive/issue
|
@json-rpc-tools/utils is deprecated
As you can see here, @json-rpc-tools/utils package is deprecated.
Is there an alternative way to construct my response in the wallet, for example, in
web-examples/wallets/react-wallet-v2/src/utils/EIP155RequestHandlerUtil.ts
all the responses are built with the formatJsonRpcResult method from this package.
Hi @Neti-Sade,
I believe the package was deprecated in order to migrate the functionality into a different package: @walletconnect/jsonrpc-utils
You should be able to use the same functions as before when importing from that package instead.
|
2025-04-01T06:37:40.809446
| 2021-12-24T16:07:37
|
1088440486
|
{
"authors": [
"Epsylon42",
"LastSymbol0",
"Wandalen",
"dmvict"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3039",
"repo": "Wandalen/game_chess",
"url": "https://github.com/Wandalen/game_chess/issues/45"
}
|
gharchive/issue
|
Highlight cells
Task consists of next steps:
Implement a function that creates cell entities of provided color.
The next samples can be helpful:
How to draw board from sprites.
After closing the issue, consider closing related task:
#46.
#47.
Useful links:
Bevy entities and components.
Bevy resources.
Hey, I'll take this task
How to add new sprite from a function?
commands.spawn_bundle( SpriteBundle { ... });
Or, give me details, please.
fn f1( x : i32 )
{
/* we want to add new sprite from here */
}
You need to have a Commands instance.
Here is an example:
use bevy_ecs::world::World;
struct Position {
x: f32,
y: f32,
}
let mut world = World::new();
let entity = world.spawn()
.insert(Position { x: 0.0, y: 0.0 })
.id();
let position = world.entity(entity).get::<Position>().unwrap();
assert_eq!(position.x, 0.0);
I'll take it.
Team 0xADDC0DE
Assingned.
|
2025-04-01T06:37:40.813803
| 2021-12-25T04:37:02
|
1088545050
|
{
"authors": [
"Wandalen",
"dmvict",
"ihor-tarasov"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3040",
"repo": "Wandalen/game_chess",
"url": "https://github.com/Wandalen/game_chess/issues/76"
}
|
gharchive/issue
|
Board margins
Tweak the camera to get expected results.
Task consists of next steps:
Investigate how the camera projection works.
Add offsets in the camera projection.
Implement system that draws margins around board.
Add system to the Bevy main app.
Feature:
The board should not go outside the window nor touch the edges of the window.
To avoid that, introduce a drawing parameter: the number of cells to leave as a gap between the board and the window edge.
The next samples can be helpful:
How to draw board from sprites.
How to draw sprite from image.
Taken
Taken
Please add the name of your team.
Babrochky
Assigned.
|
2025-04-01T06:37:40.817859
| 2022-11-24T04:44:09
|
1462750333
|
{
"authors": [
"Wansmer",
"alphatroya"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3041",
"repo": "Wansmer/treesj",
"url": "https://github.com/Wansmer/treesj/issues/9"
}
|
gharchive/issue
|
Add an option to add trailing comma during split
For Go language a trailing comma is required when items place each on the own line, for example
func a(b int, c int, d int)
func a(
b int,
c int,
d int, // <-comma here is required
)
The same applied for arrays and dictionaries.
I can't find an option for this - can you point me to it if I missed it?
Yes, you can configure it. Option 'last_separator' - https://github.com/Wansmer/treesj#nodes-configuration
|
2025-04-01T06:37:40.868072
| 2023-09-11T02:36:43
|
1889535342
|
{
"authors": [
"apepkuss",
"juntao"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3042",
"repo": "WasmEdge/wasmedge-rust-sdk",
"url": "https://github.com/WasmEdge/wasmedge-rust-sdk/pull/64"
}
|
gharchive/pull-request
|
chore(rust-sys): update build script
This PR updates the version of WasmEdge to 0.13.4 in the build script.
Hello, I am a code review bot on flows.network.
It could take a few minutes for me to analyze this PR. Relax, grab a cup of coffee and check back later. Thanks!
@L-jasmine Could you please help review this PR? Thanks a lot!
|
2025-04-01T06:37:40.872076
| 2015-03-16T16:45:42
|
62140232
|
{
"authors": [
"Waxolunist",
"dahmian"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3043",
"repo": "Waxolunist/metalsmith-writemetadata",
"url": "https://github.com/Waxolunist/metalsmith-writemetadata/issues/1"
}
|
gharchive/issue
|
Contents Buffer
Thanks for writing this plugin! I have a question about contents. I noticed that metalsmith outputs the "contents" as a buffer, so if I use this plugin, "contents" is not available to the JSON output. What if I want to use contents in the JSON file? Is there a way to do this without modifying metalsmith-writemetadata to output a string instead of a buffer?
No it is not possible without modifying it. I can have a look, so maybe there is an option.
If you have problems using the buffer you can have a look into my site: http://christian.sterzl.info or https://github.com/Waxolunist/christian.sterzl.info
Is this solution what you expected?
Yes! That works just like I would expect.
Cool. I had a small error. You should use 0.4.4.
|
2025-04-01T06:37:41.243970
| 2020-03-25T11:08:07
|
587617365
|
{
"authors": [
"MattiaPontonioKineton",
"g-zachar",
"mattiapontonio"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3044",
"repo": "WebPlatformForEmbedded/Lightning",
"url": "https://github.com/WebPlatformForEmbedded/Lightning/issues/139"
}
|
gharchive/issue
|
Entering a state does not construct the state class
Using the debugger I noticed that the $enter method is called but the constructor is never called.
My goal is to add the event listener before entering a state but only in the context of this state.
Hi @MattiaPontonioKineton
That's correct, classes to describe state of a component are not instantiated.
You should think of states as code that modifies or sets the behavior of component. States shouldn't have their own properties but rather modify the component's ones (that's why you have full access to component's context by default). Of course, there are reasonable cases where it would make sense for state to encapsulate some sort of data of its own, but lightning's state machine implementation goes into different direction.
So, addressing your issue, my suggestion is to use component's constructor to initialise your state, or state's $enter method if it should be delegated.
Best regards
@g-zachar,
thanks for the clarification.
As a side note, I want to point out that this causes inconsistency in VS Code type checking: using the class keyword here may be flagged as an error.
It's a sort of downgrade of the JavaScript features.
Anyway, if it leads to better code it should be followed.
|
2025-04-01T06:37:41.247002
| 2023-12-04T04:35:52
|
2023007141
|
{
"authors": [
"Srabutdotcom",
"WebReflection"
],
"license": "ISC",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3045",
"repo": "WebReflection/linkedom",
"url": "https://github.com/WebReflection/linkedom/issues/251"
}
|
gharchive/issue
|
Creating script element resulting unexpected opening tag.
I found an issue when creating a script element:
the script element's toString() results in '\x3Cscript />' instead of <script />.
And when create a nested element that contains script element will send unexpected result to client i.e
<script type="module" src="index.js"></head><body class="antialiased" /></html></script>
Kindly need you to review.
Thanks & regards
self closing tags don't exist in HTML
|
2025-04-01T06:37:41.251026
| 2023-07-04T05:58:18
|
1787204513
|
{
"authors": [
"Panquesito7",
"mkubdev"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3046",
"repo": "WebXDAO/WebXDAO.github.io",
"url": "https://github.com/WebXDAO/WebXDAO.github.io/issues/471"
}
|
gharchive/issue
|
[OTHER] Shouldn't the teams page be team?
What would you like to share?
This is not a major issue at all; however, I think it would be more consistent and better to name the page "team", as we're referring to the whole WebXDAO team, not teams (even though we have multiple teams or sub-teams, I don't think that counts).
In case it's going to be updated, a redirect should be added, IMO.
What do you think about this? Thanks. 🙂
Additional information
No response
Checklist
[X] I have read the Contributing Guidelines
[X] I have checked the existing issues
[ ] I am willing to work on this issue (optional)
[ ] I am a GSSoC'23 contributor
Well spotted!
Thanks! Working on this. 🙂
|
2025-04-01T06:37:41.256833
| 2021-04-21T15:10:24
|
863994942
|
{
"authors": [
"MaxwellRebo",
"NescobarAlopLop"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3047",
"repo": "Webhose/webhoseio-python",
"url": "https://github.com/Webhose/webhoseio-python/issues/6"
}
|
gharchive/issue
|
http?
https://github.com/Webhose/webhoseio-python/blob/3c5ff4616a79b44c61df4a5b0191f9212666c915/webhoseio/__init__.py#L17
Shouldn't this use https?
@MaxwellRebo was thinking the exact same thing
https://github.com/Webhose/webhoseio-python/pull/5
|
2025-04-01T06:37:41.423361
| 2023-02-05T11:32:41
|
1571393888
|
{
"authors": [
"WiIIiam278",
"alexdev03",
"iVillager"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3048",
"repo": "WiIIiam278/HuskHomes2",
"url": "https://github.com/WiIIiam278/HuskHomes2/pull/308"
}
|
gharchive/pull-request
|
Redis Library changed from Jedis to Lettuce + Reload with plugin managers
Lettuce is a Redis library with better performance than Jedis. Lettuce also has a built-in system for async commands that is very useful. I use a pool of n connections, so you know exactly how many connections you are using and it's easy to debug.
In the plugin main I added a filter for already present permissions so the plugin can still load and skip those permissions. This is related to the reload process.
Thanks also to @Emibergo02 for helping me with the redis classes.
Tested in production and everything is working.
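The pool-of-n-connections idea can be sketched roughly like this (a minimal illustrative sketch in Python, not the actual Lettuce or plugin API; `factory` and the class name are made up for the example):

```python
import queue

class ConnectionPool:
    """Fixed-size pool: create n connections up front, hand them out,
    and require callers to return them when done."""

    def __init__(self, factory, size):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())

    def acquire(self):
        # Blocks when all connections are checked out, which naturally
        # caps concurrent use at the pool size.
        return self._pool.get()

    def release(self, conn):
        self._pool.put(conn)

pool = ConnectionPool(lambda: object(), size=2)
conn = pool.acquire()
# ... use the connection ...
pool.release(conn)
```

Because the pool size is fixed, it is always clear how many connections are in flight, which is the debugging benefit described above.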
I added a commit with the requested fixes.
I'm still kind of thinking the Lettuce stuff would be better moved into its own library before merging
Any news on this @WiIIiam278?
Closing as stale (since this targets a previous major version) -- feel free to rebase and PR again on the latest with feedback in mind :)
|
2025-04-01T06:37:41.693251
| 2016-06-01T18:40:14
|
157978024
|
{
"authors": [
"Dirrk",
"trevorriles"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3049",
"repo": "William-Yeh/ansible-oracle-java",
"url": "https://github.com/William-Yeh/ansible-oracle-java/pull/30"
}
|
gharchive/pull-request
|
Always regenerate variables for jdk name and url
This allows us to use this role to install multiple versions of java.
I use this role to provision CI systems that require multiple versions of Java. Attempting to install to different versions of Java resulted in urls that don't exist like:
http://download.oracle.com/otn-pub/java/jdk/8u51-b16/jdk-7u80-linux-x64.tar.gz
Which obviously gets some versions mixed up.
There might be a better way to solve this, but this fixes the problem for myself.
Maybe I still need the generic part, I will investigate.
Hi @William-Yeh, it looks like the failed build on travis-ci didn't even run. Can you try re-running it? Aside from that what else do you need to get this merged?
Bump
closing due to lack of response.
|
2025-04-01T06:37:41.700423
| 2015-08-28T02:37:56
|
103635245
|
{
"authors": [
"ivanvenosdel"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3050",
"repo": "WimpyAnalytics/pynt-of-django",
"url": "https://github.com/WimpyAnalytics/pynt-of-django/issues/16"
}
|
gharchive/issue
|
Test tox flush arg seems to be ignored
It seems to assume that pynt supports bools from the command line, but it does not.
Really, the feature should just be eliminated, as it's simply an alias - as opposed to the other commands, which generally allow people to work without having to care whether they activated the venv or cd'd into a lower-level project dir, or both.
|
2025-04-01T06:37:41.711700
| 2015-03-17T14:54:14
|
62419272
|
{
"authors": [
"MatthiasWinzeler",
"mwrock",
"sneal",
"tmm1"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3051",
"repo": "WinRb/WinRM",
"url": "https://github.com/WinRb/WinRM/pull/130"
}
|
gharchive/pull-request
|
UTF-8 as default codepage
I recently had trouble when echoing strings to stdout/stderr on windows systems (over WinRM) that contained non-ascii chars, especially umlauts (occur in latin1) and other utf8 characters: They were always messed up in the winrm response.
I found that the gem specifies codepage 437 by default when opening the shell (https://github.com/WinRb/WinRM/blob/ec9140063bc47f92b52f3a84bde1e4b0800ed911/lib/winrm/winrm_service.rb#L99) so that only characters from this codepage (which contains only the ASCII set and some other chars) are displayed correctly.
When passing the option :codepage => 65001 (which corresponds to utf-8 according to this index: https://msdn.microsoft.com/en-us/library/dd317756(VS.85).aspx) to open_shell, windows successfully receives the utf8 chars.
But when getting the command output, I get the error Encoding::CompatibilityError: incompatible encoding regexp match (UTF-8 regexp with ASCII-8BIT string). If I add force_encoding('utf-8') when getting the command output, the error is gone and the utf-8 chars are received correctly on the client (on both windows and linux clients).
In order to support the whole unicode character set in this Gem, this PR changes the default codepage for WinRM shell to utf-8.
Maybe this change is too radical by introducing a new default and will break dependent gems or programs...
But I couldn't find another solution - adding force_encoding('utf-8') would break the response when using another codepage as utf-8 for the shell. Nevertheless, I think utf-8 as default seems appropriate nowadays. The spec suite is passing on both windows and linux.
A less intrusive alternative (completely backward-compatible): Introduce a utf-8 => true option that automatically sets the codepage to 65001 and forces the encoding to utf-8 on the output (but leaves the default & other codepages alone) - but I think that would be less elegant.
Spec is included and tested on windows 8.1 & ubuntu linux (ruby 1.9.3) against a windows 2012r2 server.
Could this be merged?
@zenchild @pmorton Is there a reason you can think of where this would cause issues? It seems pretty sensible to default to UTF-8.
This may become a pressing issue. I'm currently trying to set up packer templates for Windows Nano server and it's using 65001 and complains when using 437. This will be an issue with the Go package as well.
@mwrock Interesting about Nano, but that sounds like a good change. My main worry is backwards compatibility. I'm pretty sure it'll be a safe change, but I was hoping for some additional opinions since I may have missed something.
I just tested against nano, win2012R2, win 8.1 and win 2008R2 endpoints. All were successful. I don't think we need to go any further back than 2008R2. I feel pretty comfortable with this change. @tmm1: would you mind fixing up merge conflicts with master?
@sneal any objection to me fixing the merge conflict here and bumping to 1.3.5.dev?
Go for it! I think we've done our due diligence on this one.
cool. done. If you could push a new dev gem to ruby gems whenever you have a chance that would be awesome.
@mwrock Done
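The mojibake described in this thread is easy to reproduce outside WinRM: UTF-8 bytes read back under code page 437 split each multi-byte character into unrelated glyphs. A minimal Python sketch of the mismatch (illustrative only; the gem itself is Ruby):

```python
# "ü" as emitted by a shell running code page 65001 (UTF-8)
raw = "ü".encode("utf-8")  # two bytes: b'\xc3\xbc'

# A client that assumes code page 437 mis-decodes the byte pair
# into two unrelated box-drawing glyphs -- the "messed up" output
mojibake = raw.decode("cp437")
assert mojibake == "├╝"

# Decoding with the matching code page recovers the character
assert raw.decode("utf-8") == "ü"
```

This is why only switching both the shell codepage and the client-side decoding together (as the PR does) fixes the round trip.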
|
2025-04-01T06:37:41.739739
| 2024-06-24T14:16:56
|
2370375916
|
{
"authors": [
"WittmannF",
"andrebelem"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3052",
"repo": "WittmannF/jupyter-translate",
"url": "https://github.com/WittmannF/jupyter-translate/pull/7"
}
|
gharchive/pull-request
|
Update jupyter_translate.py
Hello @WittmannF,
Nice library! I'm suggesting a few changes to the regex and a two-step open scheme for JSON files to handle codec issues.
If you agree, just merge it.
Cheers
Andre
Thanks @andrebelem! I'll take a look. I completely forgot about this project. Great to see it is still working. Are you interested in being one of the admins?
Hello Fernando. It's a very useful script !
You can reach me (in portuguese!) by @.***
Cheers
Andre
|
2025-04-01T06:37:41.742042
| 2018-02-03T14:53:49
|
294116720
|
{
"authors": [
"eddelbuettel",
"evanmiller"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3053",
"repo": "WizardMac/librdata",
"url": "https://github.com/WizardMac/librdata/pull/2"
}
|
gharchive/pull-request
|
Beginnings of a more general Makefile
This is not ideal yet, but a little better. I guess we may have to bite the bullet and list and filter the src/ files to have a proper library dependency. Then we may want to properly (with soname, major, minor?) build a library, either dynamic or (it is small, after all) static.
I think I correctly fast-forwarded to have my two commits after yours.
Thanks. It builds a dynamic library on Mac, do you want to add a similar logic for Linux? (I'm happy to merge as-is, since it's already an improvement.)
I am a fan of many small increments so I'd merge now too.
Getting the reader working is more important for me :)
But we can build a simple dynamic library. It'll take me a few lines and we have to -fPIC and all that. Happy to do that, esp. if we have the macOS side working already.
|
2025-04-01T06:37:41.767399
| 2016-11-21T23:23:46
|
190865095
|
{
"authors": [
"RazerM",
"TWAC",
"WoLpH",
"keenondrums",
"vitidev"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3055",
"repo": "WoLpH/portalocker",
"url": "https://github.com/WoLpH/portalocker/issues/31"
}
|
gharchive/issue
|
Does portalocker support python 3.5.2 win?
Windows 7 x64, python 3.5.2
code from example, 'somefile' exists and writeable
file = open('somefile', 'r+')
portalocker.lock(file, portalocker.LOCK_EX)
crashed with message
Process finished with exit code -1073740777 (0xC0000417)
stack overflow/stack exhaustion
on line 33 in portalocker.py
msvcrt.locking(file_.fileno(), mode, -1)
But that is not all. Rewrite code to
portalocker.lock(file, portalocker.LOCK_SH)
and script fail with message
File "C:\Python\lib\site-packages\portalocker\portalocker.py", line 14, in lock
mode = msvcrt.LK_RLOCK
AttributeError: module 'msvcrt' has no attribute 'LK_RLOCK'
+1 for Win 10
+1 for python 3.6
Though it seemed like a simple AttributeError which could be fixed with simple
getattr(msvcrt, 'LK_RLOCK', msvcrt.LK_RLCK)
it goes deeper for me.
Having applied the fix above I stumbled upon another issue: my python interpreter started failing.
The issue is:
msvcrt.locking(file_.fileno(), mode, -1) (https://github.com/WoLpH/portalocker/blob/develop/portalocker/portalocker.py#L33)
If I put a random number instead of -1 it starts working.
@keenondrums
msvcrt.locking(fd, mode, nbytes) where nbytes -"the locked region of the file extends from the current file position for nbytes bytes, and may continue beyond the end of the file"
We need to lock whole file.
@vitidev I see. I just pointed out that if we need to lock the whole file, -1 is not the answer.
These lines:
savepos = file_.tell()
if savepos:
file_.seek(0)
try:
msvcrt.locking(file_.fileno(), mode, -1)
need to be replaced with something like this:
savepos = file_.tell()
file_.seek(0, os.SEEK_END)
size = file_.tell()
file_.seek(0)
try:
msvcrt.locking(file_.fileno(), mode, size)
I see no indication from the documentation that -1 is a valid value for the third argument. The msvcrt.locking causes Python to silently exit, which must be a Python bug, though.
After updating 3.5.2 to 3.5.3 and still getting the crash, I verified that 2.7 and 3.4 don't cause a crash then filed a bug: http://bugs.python.org/issue29392
The msvcrt module uses _locking which says "It is possible to lock bytes past end of file".
Perhaps one can simply lock as much as possible, <PHONE_NUMBER>? Any larger value gives OverflowError: Python int too large to convert to C long.
@techtonik added -1, what do you think?
It looks like the bug was fixed so that's good :)
http://bugs.python.org/issue29392
Still... doesn't help much with the bug though. Not sure I can help much here guys (no windows) but I'll help with anything I can offer.
Second bug "AttributeError: module 'msvcrt' has no attribute 'LK_RLOCK'" not fixed in 1.1.0
Oops, Github automatically closed this. I've reopened.
If fix in line 14 mode = msvcrt.LK_RLOCK -> mode = msvcrt.LK_RLCK
I did not find LK_RLOCK in any python documentation and I think it's a typo
Fixed on develop, I'm releasing a new version today :)
The new release works perfect for me on Python 2.x and 3.x on Windows, OS X and Linux
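The whole-file fix proposed earlier in this thread — measuring the file size before locking instead of passing -1 — can be factored into a small helper. A portable sketch (the actual msvcrt.locking call is Windows-only and omitted here; the helper name is hypothetical):

```python
import os

def whole_file_nbytes(file_):
    """Byte count to pass as nbytes to msvcrt.locking so the lock
    region spans the whole file (there is no 'whole file' flag)."""
    savepos = file_.tell()       # remember the caller's position
    file_.seek(0, os.SEEK_END)   # jump to the end to measure size
    size = file_.tell()
    file_.seek(savepos)          # restore the caller's position
    return size
```

On Windows this result would then be used as `msvcrt.locking(file_.fileno(), mode, whole_file_nbytes(file_))`, avoiding both the -1 crash and the OverflowError from oversized constants.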
|
2025-04-01T06:37:41.769585
| 2017-07-23T18:49:28
|
244928579
|
{
"authors": [
"NicoHood",
"WoLpH"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3056",
"repo": "WoLpH/python-utils",
"url": "https://github.com/WoLpH/python-utils/issues/3"
}
|
gharchive/issue
|
GPG signatures for source validation
Hi,
you know what to do :)
I will try to get the python progressbar2 into [community] of archlinux. Could you please tell me what is compatible and what is not with the old/other version? I want to simply replace the package instead of creating another one.
Also correct:
https://python-utils.readthedocs.io/en/latest/
Done, should be a fully signed release now :)
Let me know if you have any issues
|
2025-04-01T06:37:41.802187
| 2024-11-14T17:09:09
|
2659490369
|
{
"authors": [
"joanaBrit",
"stepsen89"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3057",
"repo": "Women-Coding-Community/wcc-frontend",
"url": "https://github.com/Women-Coding-Community/wcc-frontend/issues/44"
}
|
gharchive/issue
|
Tabs (reusable)
Tab - Make a reusable component for the tab to allow navigating between sections.
View:
I would split this - one is a hero and one is a title.
The navigation (tabs) is also used on the event container (Upcoming events, Past events)
If the reusable component can already display different titles in the font size and colour we want then just use that, and rewrite this ticket to be the tabs
@stepsen89 I got a bit confused — what do you consider a hero? I thought it would be the first part that you see on the page, like an image and title. But in this case the hero you're describing is the blue background, right? And I guess the hero section is the first part of the page, which is more than one element.
|
2025-04-01T06:37:41.807822
| 2018-06-26T21:45:22
|
336002370
|
{
"authors": [
"Chase-Reid",
"Ziriax"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3058",
"repo": "WonderMediaProductions/Maya2glTF",
"url": "https://github.com/WonderMediaProductions/Maya2glTF/issues/33"
}
|
gharchive/issue
|
Maya2GLTF PBR Material - Environment maps not being exported
Hello!
I am having an issue with the maya PBR material. While applying maps to my car body, I have applied the following texture:
...to the Environment fields outlined below:
I export the object out of maya, but unfortunately, when I import my object into the threejs editor my environment map is empty.
Am I using the environment properties incorrectly or is this a bug?
glTF 2.0 files do not specify the environment map. I have just put it in the Maya shader for previewing.
So it is not a bug, it is by design, because I cannot put this in a standard glTF file.
Typically you specify an environment in your render engine of choice, separate from the 3D models you are importing. It will be the same for all models.
That being said, as soon as Maya2glTF will support the KHR lights extension, environment maps will be exported as part of the lights I guess. Obviously the render engines must also support this extension.
Thank you for the reply. I just did a test GLTF export from the ThreeJS editor to debug this issue and I see what you are talking about. Good to know! Feel free to close this issue :)
Thanks for reporting! Getting user feedback is important. Closing this now then :-)
|
2025-04-01T06:37:41.850060
| 2016-07-16T07:05:08
|
165915042
|
{
"authors": [
"GaryJones",
"jrfnl",
"lkraav"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3059",
"repo": "WordPress-Coding-Standards/WordPress-Coding-Standards",
"url": "https://github.com/WordPress-Coding-Standards/WordPress-Coding-Standards/issues/606"
}
|
gharchive/issue
|
Add sniff for long conditional end comments.
From the handbook:
Furthermore, if you have a really long block, consider whether it can be broken into two or more shorter blocks or functions. If you consider such a long block unavoidable, please put a short comment at the end so people can tell at glance what that ending brace ends – typically this is appropriate for a logic block, longer than about 35 rows, but any code that’s not intuitively obvious can be commented.
Ref: https://make.wordpress.org/core/handbook/best-practices/coding-standards/php/#brace-style
This rule is so far not covered, but there is a sniff available for this upstream.
The finer details of a PR for this depend on the upstream PR https://github.com/squizlabs/PHP_CodeSniffer/pull/1074 and therefore can only be added once the upstream PR has been merged and the minimum PHPCS version required for WPCS is upped to match.
If it is desired to have this functionality available before that point in time, a copy of the - adjusted - upstream sniff could be used for the time being and altered to extend the upstream class at a later point in time.
Commit in feature branch which implements the functionality based on upstream: ~~https://github.com/WordPress-Coding-Standards/WordPress-Coding-Standards/commit/6774ab0f4f756cbaedee078713098443c7a13885~~ https://github.com/WordPress-Coding-Standards/WordPress-Coding-Standards/commit/c38a6c095d47229a64328b52efd1d216d39f1945
[Edit]: Travis will - of course - fail for this branch as long as the upstream PR has not been merged yet.
Personally, I can't stand these redundant comments, but the upstream patch makes sense to improve it.
Personally, I can't stand these redundant comments
True that, but it is in the handbook, which is why I suggest for WPCS to cover it. People can always turn it off for individual projects. And at least WP only suggests it for > 35 lines.
(which would make the condition a prime candidate for refactoring anyway)
I have found reasons not to be that harsh against clarity comments at the end of blocks, esp. on nested stuff, where just seeing 3 closing blocks with a short comment allows quick understanding of what you're seeing in complex situations. Just that manually maintaining the comment content accuracy is annoying.
Just that manually maintaining the comment content accuracy is annoying.
@lkraav The sniff actually contains a fixer, so that can be handled for you ;-)
FYI: looks like there's some movement upstream - the PR for this has been merged. Still, there isn't a released PHPCS version which contains it atm which we could set as a minimum version, so we'd still need to bridge this with an extended class for now or wait until it is contained in a released version and the WPCS minimum required PHPCS version has caught up.
Opinions ?
For the record, how would one disable this in their phpcs.xml file?
For the record, how would one disable this in their phpcs.xml file?
<exclude name="Squiz.Commenting.LongConditionClosingComment" />
And if a different line limit or end comment is preferred, you can overrule the settings by adding the following in phpcs.xml (with different values for the properties):
<rule ref="Squiz.Commenting.LongConditionClosingComment">
<properties>
<property name="lineLimit" value="35" />
<property name="commentFormat" value="// End %s()." />
</properties>
</rule>
|
2025-04-01T06:37:41.853706
| 2013-10-11T18:21:38
|
20885773
|
{
"authors": [
"GaryJones",
"jrfnl",
"westonruter"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3060",
"repo": "WordPress-Coding-Standards/WordPress-Coding-Standards",
"url": "https://github.com/WordPress-Coding-Standards/WordPress-Coding-Standards/issues/91"
}
|
gharchive/issue
|
Flag bypassing of Settings API
The only way I can think of to automate checking of this is to look for instances of add_menu_page or add_submenu_page, and for the callback argument, and then to attempt to find that callback function and check to see if it outputs any fields, and if it does, warn that they should use settings API.
However, this seems to be better checked with PHPUnit. A unit test could be run which executes the admin page callback for each admin page and checks to see if the settings API is ever invoked during the execution of the function.
Not Using the Settings API #
Instead of handling the output of settings pages and storage yourself, use the WordPress Settings API as it handles a lot of the heavy lifting for you including added security.
Make sure to also validate and sanitize submitted values from users using the sanitize callback in the register_setting call.
Is this meant to be for VIP only, or is it something everyone could benefit from?
I think checking that register_setting() is always called with sanitize_callback set in the $args would be a good addition for everyone, this is covered by #126.
Checking the callback functions passed to add_(sub)menu_page() is something which IMHO cannot easily be done in a reliable manner with PHPCS.
Closing as VIP issues are no longer relevant here.
|
2025-04-01T06:37:42.050712
| 2021-06-09T06:37:25
|
915840652
|
{
"authors": [
"obulat",
"zackkrida"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:3061",
"repo": "WordPress/openverse-catalog",
"url": "https://github.com/WordPress/openverse-catalog/pull/98"
}
|
gharchive/pull-request
|
Run ci on main push only
Fixes #96
This PR restricts GitHub actions to run on push only to the main branch to make sure that the linting and testing CI actions are not run twice.
I had used 'master' branch by mistake, and had to rename the branch because of that. Renaming a PR branch closes that PR, apparently.
Signed-off-by: Olga Bulat<EMAIL_ADDRESS>
FYI there is a $default-branch variable that can be used in GitHub actions, so that 'main' or 'master' doesn't have to be hardcoded.
Oh, that's great to know!
|