| id | text | source | created | added | metadata |
|---|---|---|---|---|---|
203864945 | href not working
Hi, I've been using this plugin for a while and it's been great for what I needed, but now I need to make an area link to a website, and it doesn't seem to work. Is there a way to make the URL work properly?
<area href="http://www.google.com/" alt="google" data-mapid="9" shape="poly" coords="393,594,471,520,482,512,472,488,457,480,441,476,424,475,407,475,392,478,383,482,368,482,356,482,348,486,339,490,332,495,327,510,329,526,341,546,352,564,366,582,384,594" class="{fillColor:'04606A',strokeColor: '0392B7'}">
You could try adding an onclick to the area tag.
<area onclick="window.open('http://www.google.com', '_self');" href="#" alt="google" data-mapid="9" shape="poly" coords="393,594,471,520,482,512,472,488,457,480,441,476,424,475,407,475,392,478,383,482,368,482,356,482,348,486,339,490,332,495,327,510,329,526,341,546,352,564,366,582,384,594" class="{fillColor:'04606A',strokeColor: '0392B7'}">
| gharchive/issue | 2017-01-29T11:40:16 | 2025-04-01T06:45:10.797640 | {
"authors": [
"chimovski",
"nswilhelm"
],
"repo": "nswilhelm/jquery-map-trifecta",
"url": "https://github.com/nswilhelm/jquery-map-trifecta/issues/4",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
340542714 | Shadow DOM update breaks vertical alignment in Chrome/Safari
The "Shadow DOM" commit (c3091f0c681b6aa5a8da61f414bf4bf58be3147b) has broken vertical alignment in Chrome (Mac/Win) and Safari (Mac). Here's a quick CodePen to demonstrate the issue:
https://codepen.io/anon/pen/gjprpx
Earlier versions of github-buttons did not present this issue. To be more specific, a star button with count used to line up vertically with a "License: MIT" icon on a few of my documentation sites, but now they do not:
https://jhildenbiddle.github.io/docsify-themeable/
https://jhildenbiddle.github.io/class-change/
It appears as though Chrome/Safari are applying a user-agent style to shadow DOM nodes, causing the misalignment. End users cannot adjust these values, but perhaps an adjustment can be made in the github-buttons CSS or JS.
Weird, it fixes your codepen example, but does not fix the other example you have.
Now it's aligned properly on all your examples.
Finally got this fixed under all kinds of conditions. It turned out that the vertical-align: bottom I had initially was not the problem. The previous attempt of changing vertical-align to a different value broke other cases where font-size or line-height is different.
The real fix is that the shadow root element needs to have display: inline-block and overflow: hidden (or any value other than visible):
The baseline of an 'inline-block' is the baseline of its last line box in the normal flow, unless it has either no in-flow line boxes or if its 'overflow' property has a computed value other than 'visible', in which case the baseline is the bottom margin edge.
Excellent! Thank you for taking care of this so quickly. Very much appreciated! :)
| gharchive/issue | 2018-07-12T08:42:55 | 2025-04-01T06:45:10.826964 | {
"authors": [
"jhildenbiddle",
"ntkme"
],
"repo": "ntkme/github-buttons",
"url": "https://github.com/ntkme/github-buttons/issues/41",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
2325345 | fixed rescue call in Rakefile to assign LoadError object
The diff pretty much explains it. STDERR.puts e.message fails with undefined method `message' for nil:NilClass because e isn't assigned in the rescue clause. Changed it to rescue LoadError => e
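For illustration only (the actual fix is in the Ruby Rakefile, which the thread doesn't reproduce in full), the same class of bug sketched in Python terms: the exception object must be explicitly bound in the handler before its message can be used.

```python
# Hypothetical Python analogue of the Rakefile bug: Ruby's `rescue LoadError`
# without `=> e` leaves e unassigned, much like an except clause that never
# binds the exception. Binding it explicitly fixes the crash.
def try_require():
    try:
        import definitely_missing_module  # stands in for the failing require
    except ImportError as e:              # like `rescue LoadError => e`
        return str(e)

print(try_require())
```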
Uhm, I missed that pull request somehow, sorry for the late merge.
Anyway, can you add yourself to the contributors in readme and send me another pull request with it?
done and done.
This issue was moved to gmailgem/gmail#45
| gharchive/issue | 2011-11-23T01:07:29 | 2025-04-01T06:45:10.908219 | {
"authors": [
"alexgenco",
"johnnyshields",
"nu7hatch"
],
"repo": "nu7hatch/gmail",
"url": "https://github.com/nu7hatch/gmail/issues/45",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1917280117 | Support inline svg elements
Inline svg elements in .nue files get replaced with <nue-island>
<svg style="display:none;">
<symbol id="collapse" viewBox="0 0 16 16">
<polygon points="11.62 3.81 7.43 8 11.62 12.19 10.09 13.71 4.38 8 10.09 2.29 11.62 3.81" />
</symbol>
</svg>
in a .nue file is compiled to
<svg style="display:none;">
<nue-island id="collapse" viewBox="0 0 16 16">
<nue-island points="11.62 3.81 7.43 8 11.62 12.19 10.09 13.71 4.38 8 10.09 2.29 11.62 3.81" />
</nue-island>
</svg>
We should add the svg elements to the standard element list.
SVG container elements
SVG graphics elements
Good call. Will fix. Thanks!
| gharchive/issue | 2023-09-28T11:16:44 | 2025-04-01T06:45:10.963189 | {
"authors": [
"nurges",
"tipiirai"
],
"repo": "nuejs/nuejs",
"url": "https://github.com/nuejs/nuejs/issues/60",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2402114793 | 🛑 Secret Site BIS is down
In 8d6fefe, Secret Site BIS ($SECRET_SITE) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Secret Site BIS is back up in 419eeff after 22 minutes.
| gharchive/issue | 2024-07-11T02:50:26 | 2025-04-01T06:45:10.965378 | {
"authors": [
"nueve9"
],
"repo": "nueve9/uptime",
"url": "https://github.com/nueve9/uptime/issues/1461",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1837763948 | 🛑 Università e ricerca is down
In 699e460, Università e ricerca (https://www.mur.gov.it/it) was down:
HTTP code: 403
Response time: 664 ms
Resolved: Università e ricerca is back up in 4eae4af.
| gharchive/issue | 2023-08-05T13:32:36 | 2025-04-01T06:45:10.967990 | {
"authors": [
"nuke86"
],
"repo": "nuke86/ransomPing",
"url": "https://github.com/nuke86/ransomPing/issues/3640",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1705563977 | 🛑 Università e ricerca is down
In 5907cd0, Università e ricerca (https://www.mur.gov.it/it) was down:
HTTP code: 403
Response time: 640 ms
Resolved: Università e ricerca is back up in f2df0e7.
| gharchive/issue | 2023-05-11T10:51:29 | 2025-04-01T06:45:10.970408 | {
"authors": [
"nuke86"
],
"repo": "nuke86/ransomPing",
"url": "https://github.com/nuke86/ransomPing/issues/860",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1206431477 | Issue with injected language
Hey, I have noticed that if you have the cursor inside a string such as (Lua file example)
vim.cmd 'stopinsert!'
running gcc leads to an incorrectly commented-out line:
" vim.cmd 'stopinsert!'
I am not quite sure if I would expect it to turn into
vim.cmd '" stopinsert!'
or simply do nothing (perhaps it should comment out the injected code, as that would make sense within multiline blocks). But the current behaviour is surely wrong.
(thanks for great plugin!)
This is kinda similar to #135
Could you try to recreate the same with vim.cmd[[stopinsert!]] or vim.cmd('stopinsert!')?
Both work fine; that is, each comments out the entire line with --.
As I suspected, the thing with gcc is that it takes the range from the current cursor position to the end of the line and uses that to detect the language.
So with vim.cmd 'stopinsert!', gcc was only able to detect Vim and used " as the commentstring. But with vim.cmd('stopinsert!'), the last char, i.e. ), is Lua, so gcc then used --.
This situation, two languages on the same line, will always be ambiguous. And I don't even know whether this is something that needs to be fixed or not.
I am closing this for now. As I don't know a solution that'll work in every case. If anyone figures this out feel free to reply or even raise a PR :)
| gharchive/issue | 2022-04-17T16:57:01 | 2025-04-01T06:45:10.982625 | {
"authors": [
"gegoune",
"numToStr"
],
"repo": "numToStr/Comment.nvim",
"url": "https://github.com/numToStr/Comment.nvim/issues/144",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
273975749 | Compilation time of a list of booleans
If I create a list of booleans, the compilation of the function takes a long time, and in particular the scaling is very bad. This problem does not occur if I use integers or floats.
For this code
from numba import jit
import numpy as np

@jit(nopython=True)
def foo(a, b):
    valid = [
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0),
        (a - 1 >= 0) and (b - 1 >= 0)
    ]
foo(1, 1)
the compilation time is
| elements | time |
|---|---|
| 21 | 2.7s |
| 22 | 5.1s |
| 23 | 10s |
| ... | ... |
python: 3.6.3
numba: 0.35.0
Creating an empty array and setting the elements afterward reduces the compilation time significantly (stackexchange).
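The linked workaround can be sketched as follows. This is an illustrative reconstruction, not the poster's exact code: under @jit(nopython=True) the container would typically be np.empty(n, dtype=np.bool_), but a plain list is used here so the sketch runs without Numba or NumPy installed.

```python
# Preallocate the container and fill it element by element instead of
# building one large literal list of booleans, which is what blows up
# Numba's dataflow analysis.
def foo(a, b):
    n = 23
    valid = [False] * n                   # preallocated container
    cond = (a - 1 >= 0) and (b - 1 >= 0)  # the repeated condition
    for i in range(n):
        valid[i] = cond
    return valid

print(foo(1, 1))  # 23 copies of True
```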
Thanks for the report, I can reproduce. The generated machine code is about what I'd expect, but the path to get there involves huge amounts of IR containing identical blocks with differing labels (as expected given the source). Seems like most of the compilation time is spent in dataflow analysis dealing with the list items via recursive calls to BlockInfo.request_outgoing.
I think I have a similar issue.
When I am trying to compile my code - here, it takes about 220 seconds and uses 2200 MB of RAM.
If I generate the next batch of functions, I don't think it is feasible to compile the code. Is there a workaround?
As of mainline for the upcoming Numba 0.55, this still doesn't scale brilliantly but the compilation time doesn't seem too bad any more.
30 elements - 2.2s
60 elements - 3.6s
120 elements - 9.7s
| gharchive/issue | 2017-11-14T23:04:55 | 2025-04-01T06:45:10.988844 | {
"authors": [
"Melisius",
"TheIdealis",
"stuartarchibald"
],
"repo": "numba/numba",
"url": "https://github.com/numba/numba/issues/2611",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
726526592 | Fix typed.Dict and typed.List crashing on parametrized types
Fixes #6401 as described here
Thanks for the PR! Once it's no longer [WIP], we can queue it up for review.
Thanks very much for the PR, I've queued it for review.
| gharchive/pull-request | 2020-10-21T14:05:44 | 2025-04-01T06:45:10.990476 | {
"authors": [
"asodeur",
"gmarkall"
],
"repo": "numba/numba",
"url": "https://github.com/numba/numba/pull/6402",
"license": "bsd-2-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
503079492 | Problems with latest rocm environment
Hello,
I tried to run some of the examples with the ROCm 2.9 environment, unsuccessfully. I just saw that the repository hasn't been updated since last year. Is it correct that the current state of Numba is incompatible with the latest ROCm stack? Which recent version of ROCm is compatible with Numba?
numba -s shows:
System info:
--------------------------------------------------------------------------------
__Time Stamp__
2019-10-06 09:57:48.722056
__Hardware Information__
Machine : x86_64
CPU Name : znver1
CPU count : 32
CFS restrictions : None
CPU Features :
64bit adx aes avx avx2 bmi bmi2 clflushopt clzero cmov cx16 f16c fma fsgsbase
lzcnt mmx movbe mwaitx pclmul popcnt prfchw rdrnd rdseed sahf sha sse sse2 sse3
sse4.1 sse4.2 sse4a ssse3 xsave xsavec xsaveopt xsaves
__OS Information__
Platform : Linux-5.0.0-31-generic-x86_64-with-debian-stretch-sid
Release : 5.0.0-31-generic
System Name : Linux
Version : #33~18.04.1-Ubuntu SMP Tue Oct 1 10:20:39 UTC 2019
OS specific info : debianstretch/sid
glibc info : glibc 2.10
__Python Information__
Python Compiler : GCC 7.3.0
Python Implementation : CPython
Python Version : 3.7.4
Python Locale : en_US UTF-8
__LLVM information__
LLVM version : 8.0.0
__CUDA Information__
CUDA driver library cannot be found or no CUDA enabled devices are present.
Error class: <class 'numba.cuda.cudadrv.error.CudaSupportError'>
__ROC Information__
ROC available : True
Available Toolchains : librocmlite library, ROC command line tools
Found 3 HSA Agents:
Agent id : 0
vendor: CPU
name: AMD Ryzen Threadripper 2950X 16-Core Processor
type: CPU
Agent id : 1
vendor: AMD
name: gfx906
type: GPU
Agent id : 2
vendor: AMD
name: gfx906
type: GPU
Found 2 discrete GPU(s) : gfx906, gfx906
__SVML Information__
SVML state, config.USING_SVML : False
SVML library found and loaded : False
llvmlite using SVML patched LLVM : True
SVML operational : False
__Threading Layer Information__
TBB Threading layer available : False
+--> Disabled due to : Unknown import problem.
OpenMP Threading layer available : True
Workqueue Threading layer available : True
__Numba Environment Variable Information__
None set.
__Conda Information__
conda_build_version : 3.18.8
conda_env_version : 4.7.10
platform : linux-64
python_version : 3.7.3.final.0
root_writable : True
__Current Conda Env__
_libgcc_mutex 0.1 main
blas 1.0 mkl
ca-certificates 2019.8.28 0
certifi 2019.9.11 py37_0
intel-openmp 2019.4 243
libedit 3.1.20181209 hc058e9b_0
libffi 3.2.1 hd88cf55_4
libgcc-ng 9.1.0 hdf63c60_0
libgfortran-ng 7.3.0 hdf63c60_0
libstdcxx-ng 9.1.0 hdf63c60_0
llvmlite 0.30.0rc1 py37hf484d3e_0 numba
mkl 2019.4 243
mkl-service 2.3.0 py37he904b0f_0
mkl_fft 1.0.14 py37ha843d7b_0
mkl_random 1.1.0 py37hd6b4f25_0
ncurses 6.1 he6710b0_1
numba 0.46.0rc1 np116py37hf484d3e_0 numba
numpy 1.16.4 py37h7e9f1db_0
numpy-base 1.16.4 py37hde5b4d6_0
openssl 1.1.1d h7b6447c_2
pip 19.2.3 py37_0
python 3.7.4 h265db76_1
readline 7.0 h7b6447c_5
roctools 0.0.0 hf484d3e_1 numba
setuptools 41.2.0 py37_0
six 1.12.0 py37_0
sqlite 3.30.0 h7b6447c_0
tk 8.6.8 hbc83047_0
wheel 0.33.6 py37_0
xz 5.2.4 h14c3975_4
zlib 1.2.11 h7b6447c_3
--------------------------------------------------------------------------------
If requested, please copy and paste the information between
the dashed (----) lines, or from a given specific section as
appropriate.
=============================================================
IMPORTANT: Please ensure that you are happy with sharing the
contents of the information present, any information that you
wish to keep private you should remove before sharing.
=============================================================
Running the Matrix Multiplication example from https://numba.pydata.org/numba-doc/dev/roc/examples.html#matrix-multiplication results in:
# numba test.py
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
warning: Linking two modules of different data layouts: '' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-A5' whereas '<string>' is 'e-p:64:64-p1:64:64-p2:32:32-p3:32:32-p4:64:64-p5:32:32-p6:32:32-i64:64-v16:16-v24:32-v32:32-v48:64-v96:128-v192:256-v256:256-v512:512-v1024:1024-v2048:2048-n32:64-S32-A5'
warning: Linking two modules of different target triples: ' is 'amdgcn-amd-amdhsa-amdgizcl' whereas '<string>' is 'amdgcn--amdhsa'
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
'gfx906' is not a recognized processor for this target (ignoring processor)
LLVM ERROR: Attempting to emit S_LOAD_DWORDX2_IMM_si instruction but the Feature_isGCN predicate(s) are not met
rocminfo shows:
=====================
HSA System Attributes
=====================
Runtime Version: 1.1
System Timestamp Freq.: 1000.000000MHz
Sig. Max Wait Duration: 18446744073709551615 (0xFFFFFFFFFFFFFFFF) (timestamp count)
Machine Model: LARGE
System Endianness: LITTLE
==========
HSA Agents
==========
*******
Agent 1
*******
Name: AMD Ryzen Threadripper 2950X 16-Core Processor
Marketing Name: AMD Ryzen Threadripper 2950X 16-Core Processor
Vendor Name: CPU
Feature: None specified
Profile: FULL_PROFILE
Float Round Mode: NEAR
Max Queue Number: 0(0x0)
Queue Min Size: 0(0x0)
Queue Max Size: 0(0x0)
Queue Type: MULTI
Node: 0
Device Type: CPU
Cache Info:
L1: 32768(0x8000) KB
Chip ID: 0(0x0)
Cacheline Size: 64(0x40)
Max Clock Freq. (MHz): 3500
BDFID: 0
Internal Node ID: 0
Compute Unit: 32
SIMDs per CU: 0
Shader Engines: 0
Shader Arrs. per Eng.: 0
WatchPts on Addr. Ranges:1
Features: None
Pool Info:
Pool 1
Segment: GLOBAL; FLAGS: KERNARG, FINE GRAINED
Size: 65879904(0x3ed3f60) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Acessible by all: TRUE
Pool 2
Segment: GLOBAL; FLAGS: COARSE GRAINED
Size: 65879904(0x3ed3f60) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Acessible by all: TRUE
ISA Info:
N/A
*******
Agent 2
*******
Name: gfx906
Marketing Name: Vega 20
Vendor Name: AMD
Feature: KERNEL_DISPATCH
Profile: BASE_PROFILE
Float Round Mode: NEAR
Max Queue Number: 128(0x80)
Queue Min Size: 4096(0x1000)
Queue Max Size: 131072(0x20000)
Queue Type: MULTI
Node: 1
Device Type: GPU
Cache Info:
L1: 16(0x10) KB
Chip ID: 26287(0x66af)
Cacheline Size: 64(0x40)
Max Clock Freq. (MHz): 1801
BDFID: 2560
Internal Node ID: 1
Compute Unit: 60
SIMDs per CU: 4
Shader Engines: 4
Shader Arrs. per Eng.: 1
WatchPts on Addr. Ranges:4
Features: KERNEL_DISPATCH
Fast F16 Operation: FALSE
Wavefront Size: 64(0x40)
Workgroup Max Size: 1024(0x400)
Workgroup Max Size per Dimension:
x 1024(0x400)
y 1024(0x400)
z 1024(0x400)
Max Waves Per CU: 40(0x28)
Max Work-item Per CU: 2560(0xa00)
Grid Max Size: 4294967295(0xffffffff)
Grid Max Size per Dimension:
x 4294967295(0xffffffff)
y 4294967295(0xffffffff)
z 4294967295(0xffffffff)
Max fbarriers/Workgrp: 32
Pool Info:
Pool 1
Segment: GLOBAL; FLAGS: COARSE GRAINED
Size: 16760832(0xffc000) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Acessible by all: FALSE
Pool 2
Segment: GROUP
Size: 64(0x40) KB
Allocatable: FALSE
Alloc Granule: 0KB
Alloc Alignment: 0KB
Acessible by all: FALSE
ISA Info:
ISA 1
Name: amdgcn-amd-amdhsa--gfx906
Machine Models: HSA_MACHINE_MODEL_LARGE
Profiles: HSA_PROFILE_BASE
Default Rounding Mode: NEAR
Default Rounding Mode: NEAR
Fast f16: TRUE
Workgroup Max Size: 1024(0x400)
Workgroup Max Size per Dimension:
x 1024(0x400)
y 1024(0x400)
z 1024(0x400)
Grid Max Size: 4294967295(0xffffffff)
Grid Max Size per Dimension:
x 4294967295(0xffffffff)
y 4294967295(0xffffffff)
z 4294967295(0xffffffff)
FBarrier Max Size: 32
*******
Agent 3
*******
Name: gfx906
Marketing Name: Vega 20
Vendor Name: AMD
Feature: KERNEL_DISPATCH
Profile: BASE_PROFILE
Float Round Mode: NEAR
Max Queue Number: 128(0x80)
Queue Min Size: 4096(0x1000)
Queue Max Size: 131072(0x20000)
Queue Type: MULTI
Node: 2
Device Type: GPU
Cache Info:
L1: 16(0x10) KB
Chip ID: 26287(0x66af)
Cacheline Size: 64(0x40)
Max Clock Freq. (MHz): 1801
BDFID: 17152
Internal Node ID: 2
Compute Unit: 60
SIMDs per CU: 4
Shader Engines: 4
Shader Arrs. per Eng.: 1
WatchPts on Addr. Ranges:4
Features: KERNEL_DISPATCH
Fast F16 Operation: FALSE
Wavefront Size: 64(0x40)
Workgroup Max Size: 1024(0x400)
Workgroup Max Size per Dimension:
x 1024(0x400)
y 1024(0x400)
z 1024(0x400)
Max Waves Per CU: 40(0x28)
Max Work-item Per CU: 2560(0xa00)
Grid Max Size: 4294967295(0xffffffff)
Grid Max Size per Dimension:
x 4294967295(0xffffffff)
y 4294967295(0xffffffff)
z 4294967295(0xffffffff)
Max fbarriers/Workgrp: 32
Pool Info:
Pool 1
Segment: GLOBAL; FLAGS: COARSE GRAINED
Size: 16760832(0xffc000) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Acessible by all: FALSE
Pool 2
Segment: GROUP
Size: 64(0x40) KB
Allocatable: FALSE
Alloc Granule: 0KB
Alloc Alignment: 0KB
Acessible by all: FALSE
ISA Info:
ISA 1
Name: amdgcn-amd-amdhsa--gfx906
Machine Models: HSA_MACHINE_MODEL_LARGE
Profiles: HSA_PROFILE_BASE
Default Rounding Mode: NEAR
Default Rounding Mode: NEAR
Fast f16: TRUE
Workgroup Max Size: 1024(0x400)
Workgroup Max Size per Dimension:
x 1024(0x400)
y 1024(0x400)
z 1024(0x400)
Grid Max Size: 4294967295(0xffffffff)
Grid Max Size per Dimension:
x 4294967295(0xffffffff)
y 4294967295(0xffffffff)
z 4294967295(0xffffffff)
FBarrier Max Size: 32
*** Done ***
Having the same issue
Same issue here with ROCm 3.5.1 and numba 0.50.1.
It seems this repo is not actively maintained. I switched to ROCm TensorFlow for my projects to make use of my Vega VIIs. I couldn't get Numba working with the latest ROCm stacks. Too bad, because I think it's far more intuitive to parallelize workloads in Python with Numba. If someone has a working config, please post it here.
| gharchive/issue | 2019-10-06T10:11:44 | 2025-04-01T06:45:10.999819 | {
"authors": [
"PipOlt",
"bawaji94",
"kburns"
],
"repo": "numba/roctools",
"url": "https://github.com/numba/roctools/issues/5",
"license": "BSD-2-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
377875697 | display of sets in variable preview is sometimes too long
I have a feeling I've recently done the opposite of what I'm about to propose: when the string rendering of a set is too long, the preview should instead say something like "set of N items", similar to how lists are displayed.
I think I have resolved this with pull request: https://github.com/numbas/editor/pull/443
Thanks! I've merged it in.
For future reference: it makes my life easier if you create a separate branch for each pull request. That way, if you have more than one pull request open at once, I can merge them separately.
Okay, I will make sure to branch in the future.
| gharchive/issue | 2018-11-06T14:39:28 | 2025-04-01T06:45:11.006072 | {
"authors": [
"JoshuaCapel",
"christianp"
],
"repo": "numbas/editor",
"url": "https://github.com/numbas/editor/issues/439",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
398728250 | Prometheus Metrics
Publish some meaningful metrics.
Working in this branch: https://github.com/number101010/kubernetes-grafana-controller/tree/issue-19
Currently I don't like that the type names come from Prometheus. Should they originate from the syncers? From the API objects themselves?
| gharchive/issue | 2019-01-14T02:15:10 | 2025-04-01T06:45:11.046392 | {
"authors": [
"number101010"
],
"repo": "number101010/kubernetes-grafana-controller",
"url": "https://github.com/number101010/kubernetes-grafana-controller/issues/19",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
714190736 | prepare named arguments
Hi,
This PR proposes changing argument names where they don't match the parents'. In PHP 8, this could cause unexpected errors if named arguments are used against a contract and the implementation has changed the names. (Here is the full rationale: https://psalm.dev/docs/running_psalm/issues/ParamNameMismatch/)
This is reported by Psalm. If this is merged, I have a few other Psalm issue fixes that I will submit in a follow-up PR.
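The hazard may be easier to follow with a concrete sketch. Python keyword arguments behave like PHP 8 named arguments in this respect; the class and method names below are invented for illustration.

```python
# If an implementation renames a parameter declared by its parent,
# callers that pass the argument by keyword against the parent's
# signature break at the renamed override.
class Base:
    def render(self, exception):
        return f"base: {exception}"

class Impl(Base):
    def render(self, e):  # parameter renamed from `exception`
        return f"impl: {e}"

print(Base().render(exception="boom"))   # fine
try:
    Impl().render(exception="boom")      # keyword no longer matches
except TypeError as err:
    print("TypeError:", err)
```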
Personally I'm not a fan of the shortened variable names; however, it may just be easier to say we don't support named parameters for this (using @no-named-arguments).
Although, looking at the base interface that we extend, most of these methods seem to have been deprecated. 🤨 So we might need to work on implementing the new interfaces.
The entire collision package is internal and is not meant to be extended or used directly by a developer.
Thank you for the contribution tho.
| gharchive/pull-request | 2020-10-03T22:49:45 | 2025-04-01T06:45:11.163229 | {
"authors": [
"nunomaduro",
"orklah",
"owenvoke"
],
"repo": "nunomaduro/collision",
"url": "https://github.com/nunomaduro/collision/pull/158",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1015476658 | isinstance(t, *Tensor) broken
File lib/python3.8/site-packages/torchvision/transforms/functional.py exposed a bug with isinstance. It has this code:
img = img.permute((2, 0, 1)).contiguous()
if isinstance(img, torch.ByteTensor):
    return img.to(dtype=default_float_dtype).div(255)
else:
    return img
It always returns false with Torchy, and then it crashes.
Simple repro:
import torch

x = torch.zeros([4, 3], dtype=torch.uint8)
print(isinstance(x, torch.ByteTensor))
print(x.dtype)
I've no idea where ByteTensor is defined or how zeros() ends up creating a ByteTensor object.
Found the code in torch/csrc/tensor/python_tensor.cpp:
static PyObject* Tensor_instancecheck(PyObject* _self, PyObject* arg) {
  HANDLE_TH_ERRORS
  auto self = (PyTensorType*)_self;
  if (THPVariable_Check(arg)) {
    const auto& var = THPVariable_Unpack(arg);
    if (legacyExtractDispatchKey(var.key_set()) == self->get_dispatch_key() &&
        var.scalar_type() == static_cast<ScalarType>(self->scalar_type)) {
      Py_RETURN_TRUE;
    }
  }
  Py_RETURN_FALSE;
  END_HANDLE_TH_ERRORS
}
Note the check for the dispatch key, which makes some sense as PyTorch has different subclasses for CPU & CUDA tensors.
Gah!
Longer term maybe we want to remove Lazy from the comparison in legacyExtractDispatchKey (c10/core/DispatchKeySet.h)?
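The effect of that comparison can be illustrated with a small, self-contained Python sketch (the class and attribute names here are invented, not PyTorch's real API): isinstance succeeds only when both the dispatch key and the scalar type match, so a Lazy-keyed tensor with the right dtype still fails.

```python
class TensorTypeMeta(type):
    # Mirrors the logic of Tensor_instancecheck: both the dispatch key
    # and the scalar type must match for isinstance() to return True.
    def __instancecheck__(cls, obj):
        return (getattr(obj, "dispatch_key", None) == cls.dispatch_key
                and getattr(obj, "dtype", None) == cls.dtype)

class ByteTensor(metaclass=TensorTypeMeta):
    dispatch_key = "CPU"
    dtype = "uint8"

class LazyUint8:
    dispatch_key = "Lazy"   # a lazy wrapper carries a different key
    dtype = "uint8"

class CpuUint8:
    dispatch_key = "CPU"
    dtype = "uint8"

print(isinstance(LazyUint8(), ByteTensor))  # False: key mismatch despite dtype
print(isinstance(CpuUint8(), ByteTensor))   # True: key and dtype both match
```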
| gharchive/issue | 2021-10-04T18:09:37 | 2025-04-01T06:45:11.166263 | {
"authors": [
"nunoplopes"
],
"repo": "nunoplopes/torchy",
"url": "https://github.com/nunoplopes/torchy/issues/8",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2247671076 | Ungraded Exercise 2 Level 2
I'm a bit stuck at the flatMap part since I cannot use the get() and toString() methods. Is there any alternative way?
public Str map(Function<String,String> mapper) {
    Optional newopt = this.opt.map(mapper);
    return new Str(newopt);
}

public Str flatMap(Function<String,Str> mapper) {
    Optional newopt = this.opt.map(mapper);
    Optional newstr = newopt.map(x -> .....
    return new Str(newopt);
}
tysm guys.
Once it returns the Str from the function passed into flatMap(), you can use a map on the new Str returned from that function.
Alright Thanks guys, solved it.
| gharchive/issue | 2024-04-17T08:14:16 | 2025-04-01T06:45:11.171623 | {
"authors": [
"Bryanngu03",
"Jeremythong2002"
],
"repo": "nus-cs2030/2324-s2",
"url": "https://github.com/nus-cs2030/2324-s2/issues/605",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2273292002 | need help with Anonymous class
Does this work for Anonymous class?
Does it produce the same output? 🤔
But should work.
Yes, it gives the same output.
Now let’s say you call decompose more than needed
finals is over you can close this issue
| gharchive/issue | 2024-05-01T10:27:17 | 2025-04-01T06:45:11.178791 | {
"authors": [
"Jett-Tan",
"ZHD1987E"
],
"repo": "nus-cs2030/2324-s2",
"url": "https://github.com/nus-cs2030/2324-s2/issues/826",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1371454433 | [CS2103T-W09-3] LinkedOn
LinkedOn is a desktop app for managing teams optimized for use via a Command line interface (CLI). As an employee, you will be able to easily view which members are in the teams you are on. As an employer, you can edit the team information and team members in the team. With our app, team management would be easier than ever.
Codecov Report
Merging #66 (2ed37d9) into master (77a32bf) will not change coverage.
The diff coverage is n/a.
:exclamation: Current head 2ed37d9 differs from pull request most recent head 7a8b6c2. Consider uploading reports for the commit 7a8b6c2 to get more accurate results
@@ Coverage Diff @@
## master #66 +/- ##
=========================================
Coverage 72.15% 72.15%
Complexity 399 399
=========================================
Files 70 70
Lines 1232 1232
Branches 125 125
=========================================
Hits 889 889
Misses 311 311
Partials 32 32
| gharchive/pull-request | 2022-09-13T13:24:10 | 2025-04-01T06:45:11.188214 | {
"authors": [
"JonathanWiguna",
"codecov-commenter"
],
"repo": "nus-cs2103-AY2223S1/tp",
"url": "https://github.com/nus-cs2103-AY2223S1/tp/pull/66",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1376395663 | [CS2103-F13-4] Healthcare Xpress
Healthcare Xpress helps medical administrators manage the details and schedules of the patients and nurses involved in home visits. It is optimized for CLI users so that frequent tasks can be done faster by typing in commands.
Codecov Report
Merging #94 (471f6bd) into master (77a32bf) will not change coverage.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #94 +/- ##
=========================================
Coverage 72.15% 72.15%
Complexity 399 399
=========================================
Files 70 70
Lines 1232 1232
Branches 125 125
=========================================
Hits 889 889
Misses 311 311
Partials 32 32
| gharchive/pull-request | 2022-09-16T19:39:32 | 2025-04-01T06:45:11.191987 | {
"authors": [
"codecov-commenter",
"xhphoong"
],
"repo": "nus-cs2103-AY2223S1/tp",
"url": "https://github.com/nus-cs2103-AY2223S1/tp/pull/94",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
359048370 | [W5.11][T16-1]Nur Tristy Erna Binte Mohammed Harlan
Added enhancement: changed find to match names case-insensitively.
User guide updated.
Relevant JUnit tests and I/O tests updated.
simple clean and easy to understand code, perhaps you could use extract method for refactoring 👍
Noted with thanks. Will try to code it more efficiently.
| gharchive/pull-request | 2018-09-11T13:29:25 | 2025-04-01T06:45:11.222733 | {
"authors": [
"tristyxxnana",
"zhicaizack"
],
"repo": "nusCS2113-AY1819S1/addressbook-level2",
"url": "https://github.com/nusCS2113-AY1819S1/addressbook-level2/pull/107",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1091994428 | colour of table is slightly wrong
I'm not seeing this issue either.
@fdncred - it's suspiciously the same as the indicator, which makes me wonder if that's coming from a missing ansi reset from reedline or something
ya, good point. we may have ansi-leak there in reedline.
it sure looks like ansi-leak but this code looks like it's calling reset.
https://github.com/nushell/reedline/blob/a2682b50f949245b5933471992b8094e1b3ae478/src/painter.rs#L250
| gharchive/issue | 2022-01-02T10:46:25 | 2025-04-01T06:45:11.224955 | {
"authors": [
"fdncred",
"jntrnr"
],
"repo": "nushell/engine-q",
"url": "https://github.com/nushell/engine-q/issues/647",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1618313219 | How to get remaining time to live of a key that has a timeout ?
From: https://github.com/nutsdb/nutsdb/discussions/287
The current NutsDB in-memory version does not have a method to return the remaining TTL. I wrote a demo you can refer to; later versions could consider supporting an API that can be called directly.
package main

import (
    "fmt"
    "log"
    "time"

    "github.com/nutsdb/nutsdb/inmemory"
)

var (
    db     *inmemory.DB
    bucket string
)

func init() {
    bucket = "bucket1"
    opts := inmemory.DefaultOptions
    db, _ = inmemory.Open(opts)
}

func main() {
    put()
    time.Sleep(10 * time.Second)
    ttl, _ := GetTTL(bucket, []byte("key1"))
    fmt.Println(ttl) // 90s
}

func put() {
    key := []byte("key1")
    val := []byte("val1")
    err := db.Put(bucket, key, val, 100)
    if err != nil {
        log.Fatal(err)
    }
}

func GetTTL(bucket string, key []byte) (int64, error) {
    entry, err := db.Get(bucket, key)
    if err != nil {
        return -2, err
    }
    ttl := entry.Meta.TTL
    // a permanent key (no TTL) returns -1
    if ttl == 0 {
        return -1, nil
    }
    timestamp := entry.Meta.Timestamp
    nowUnix := time.Now().Unix()
    remainingTime := int64(timestamp) + int64(ttl) - nowUnix
    if remainingTime <= 0 {
        return -2, nil
    }
    return remainingTime, nil
}
| gharchive/issue | 2023-03-10T03:50:30 | 2025-04-01T06:45:11.261247 | {
"authors": [
"xujiajun"
],
"repo": "nutsdb/nutsdb",
"url": "https://github.com/nutsdb/nutsdb/issues/304",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
448601029 | Nuxt Build Error
I got an error
Property '$recaptcha' does not exist on type 'Page'.
I have added the dependency and added it to modules in nuxt.config.ts.
package.json
...
"dependencies": {
  ...
  "@nuxtjs/recaptcha": "^0.4.1",
  ...
},
...
nuxt.config.ts
...
modules: [
  // Doc: https://axios.nuxtjs.org/usage
  '@nuxtjs/axios',
  '@nuxtjs/pwa',
  '@nuxtjs/recaptcha',
],
recaptcha: {
  hideBadge: process.env.NODE_ENV !== 'development',
  siteKey: <MY_SITE_KEY>,
  version: 3
},
...
and what I have been doing in the Page is something like this
async mounted() {
  await this.$recaptcha.init();
}

async submitForm() {
  try {
    const token = await this.$recaptcha.execute('login');
  } catch (error) {
    console.error(error)
  }
}
And I got errors on those lines with $recaptcha when I ran nuxt build. I am using @nuxt/cli v2.6.1. Plus, in development mode, it runs perfectly. Can you please help?
I just built example/v3 and I get no errors. Do you get any errors while building?
Yeah, i got no errors while building the example.
Anyway, that problem was my bad. I couldn't access this.$recaptcha in that submitForm method, maybe because I couldn't get the context. So I solved it by using a store action; there I could get the context and access this.$recaptcha perfectly. Finally, my project builds without errors. Perhaps it will be better that way.
| gharchive/issue | 2019-05-26T20:02:19 | 2025-04-01T06:45:11.313740 | {
"authors": [
"kevinalexandersurjadi",
"mvrlin"
],
"repo": "nuxt-community/recaptcha-module",
"url": "https://github.com/nuxt-community/recaptcha-module/issues/17",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1640082714 | useAsyncQuery only work in server process
Environment
"@nuxtjs/apollo": "^5.0.0-alpha.5",
"@vue/apollo-composable": "^4.0.0-beta.4",
"apollo-cache-inmemory": "^1.6.6",
"apollo-link-context": "^1.0.20",
"apollo-link-http": "^1.5.17",
"nuxt": "^3.3.1",
"vue": "^3.2.47"
Describe the bug
In a simple query in a Nuxt 3 project, in app.vue, we wrote this query:
const { data } = await useAsyncQuery(inbox_seen_graphQL_Query)
console.log(data)
In process.server, data is filled, but in process.client, data is empty. When it tries to render, the data shows at first and then disappears a moment later.
Expected behaviour
show data on render
Reproduction
No response
Additional context
No response
Logs
server:
RefImpl { 01:10:11
__v_isShallow: false,
dep: undefined,
__v_isRef: true,
_rawValue: { inbox: { inbox: [Array] } },
_value: { inbox: { inbox: [Array] } }
}
client:
RefImpl {__v_isShallow: false, dep: undefined, __v_isRef: true, _rawValue: null, _value: null}
Are you able to provide a reproduction repo?
@iamandrewluca I am having the same/similar issue, where it does not actually call graphql with useAsyncQuery. Just on the second time around.
This is what I am calling from my nuxt pina store:
import { defineStore } from 'pinia'

export const useMainStore = defineStore('main', {
  state: () => ({
    countries: null as null | Ref<unknown>,
  }),
  actions: {
    async test() {
      const query = gql`
        query getCountries {
          country(code: "BR") {
            name
            native
            capital
            emoji
            currency
            languages {
              code
              name
            }
          }
        }
      `
      const variables = { limit: 5 }
      const { data } = await useAsyncQuery(query, variables);
      this.countries = data;
    },
  },
})
Someone seems to have the same issue here:
https://stackoverflow.com/questions/75587477/why-my-useasyncquery-does-not-call-up-my-api
Any help is greatly appreciated.
I think in your use case useAsyncQuery is used wrong.
useAsyncQuery should be used in a script/setup context; you are using it inside a Pinia action.
@iamandrewluca can you elaborate? The docs are not very clear on when you should or shouldn't use useAsyncQuery, and it's unclear to me why you wouldn't use it in this context.
useAsyncQuery is a hook composable; it needs to be called only in setup for the best functionality. It cannot be called wherever you want.
E.g.
<script setup>
const response = await useAsyncQuery(YourDocument)
</script>
<script>
export default defineComponent({
  async setup() {
    const response = await useAsyncQuery(YourDocument)
  }
})
</script>
@iamandrewluca thanks for the response, I'm probably not familiar enough with vue to fully understand this, but all of our code is in setup so I'm not even sure what "other places" would mean in this context. We are using useAsyncQuery in another composable that is then included in our setup, is that ok?
| gharchive/issue | 2023-03-24T21:44:50 | 2025-04-01T06:45:11.320969 | {
"authors": [
"iamandrewluca",
"lisaschumann",
"ssyberg",
"wlodevelopteam"
],
"repo": "nuxt-modules/apollo",
"url": "https://github.com/nuxt-modules/apollo/issues/497",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
728557337 | Any way to provide app context to link resolver for nuxt-i18n compat?
When writing the linkResolver function, we must provide our full understanding of our app's routing. This works fine in simple cases.
However, some of us use nuxt-i18n to manage language-aware routing, which does some complicated things like discovering routes, merging them with custom language-specific paths, and siloing routes by language code. We can provide it with a full list of the languages we support, the default, etc.
It can be time-consuming and error prone to have to duplicate the final routing logic of nuxt-i18n when writing our linkResolver function. It would be significantly easier if we had access to nuxt-i18n's helpers, which are provided to components via global mixins and as part of Nuxt's context.app.
I believe this is the same issue that was mentioned here: https://github.com/nuxt-community/prismic-module/issues/23
I'm not super familiar with any of these libraries yet, so please forgive my lack of understanding here. Does anyone have suggestions on how to accomplish this? Perhaps we could amend how Prismic Nuxt loads the linkResolver before passing it to Prismic Vue - if it provided the plugin file with access to context and then expected it to return a constructed linkResolver function which makes use of the helpers it pulled from context.app?
Hey @zrisher, I'm really sorry for the delay, might have slipped through my notifications 🙏
I understand the issue here and this is great feedback! I definitely agree that it's inconvenient and error-prone to maintain a routing strategy for Prismic separate from the Nuxt i18n one, and I'm afraid there's currently no way to overcome that.
As pointed out in https://github.com/nuxt-community/prismic-module/issues/23#issuecomment-487290764 you should have access to doc.lang in order to perform routing in your link resolver. An alternative method to resolving links if through the apiOptions.routes object, more info about it here: https://prismic.nuxtjs.org/configuration#new-routes-resolver
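For reference, a hypothetical link resolver along those lines. The locale codes, prefixes, and document types below are illustrative assumptions, not this module's API:

```javascript
// Hypothetical sketch: route documents by doc.lang, mimicking a
// "prefix everything except the default locale" i18n strategy.
// All names here (locales, doc types, paths) are illustrative.
const defaultLocale = 'en-us'

function linkResolver(doc) {
  const prefix = doc.lang === defaultLocale ? '' : `/${doc.lang}`
  if (doc.type === 'page') return `${prefix}/${doc.uid}`
  return prefix || '/'
}

console.log(linkResolver({ type: 'page', uid: 'about', lang: 'en-us' })) // "/about"
console.log(linkResolver({ type: 'page', uid: 'about', lang: 'fr-fr' })) // "/fr-fr/about"
```

The pain point in the issue remains, of course: this duplicates whatever strategy nuxt-i18n computes, which is exactly why access to context.app would help.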
We are planning to work on a major version (v. 2) of this module in Q1 2021 next year and I'll definitely look for a way to grant access to Nuxt context inside the link resolver and HTML serializer :)
I know this issue/feature request has been hanging in the balance for a while now and I'm sorry about that.
Nowadays, our preferred way of resolving documents URL at Prismic is now achieved through the Route Resolver: https://prismic.io/docs/route-resolver. It is not programmatic, it is just a rule object sent to the API so that document URLs don't need to be resolved at runtime anymore.
So I'm left a bit clueless as to how we could tighten integration with the i18n module here now 🤔
Thank you @zrisher for being a long-time contributor by the way, appreciate it ☺️
Hi @lihbr, it is my pleasure to provide my feedback where I can on your excellent set of tools. Thanks for always being responsive.
It has been far too long since I looked at this particular use case to be able to say what this problem looks like now. Nuxt has changed quite a bit, and I don't have any current projects with it where I'm using an i18n module.
I think we can safely close this until someone else comes along with an understanding of something lacking in the current functionality.
Thank you!
| gharchive/issue | 2020-10-23T22:27:59 | 2025-04-01T06:45:11.329731 | {
"authors": [
"lihbr",
"zrisher"
],
"repo": "nuxt-modules/prismic",
"url": "https://github.com/nuxt-modules/prismic/issues/107",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1994910480 | fix: correct endpoint for for devtools
https://github.com/nuxt/devtools/issues/505
Thanks for the PR - this should be addressed in #754!
| gharchive/pull-request | 2023-11-15T14:38:24 | 2025-04-01T06:45:11.331113 | {
"authors": [
"antfu",
"ineshbose"
],
"repo": "nuxt-modules/tailwindcss",
"url": "https://github.com/nuxt-modules/tailwindcss/pull/762",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1700597541 | Add support for other frameworks
Other frameworks have the same problems, which has been the reason for creating these devtools. Maybe it is possible to make the nuxt-devtools modular so that other frameworks can adopt them.
The whole Nuxt DevTools is built on top of Nuxt's conventions; some degree of coupling is needed to make something useful. For us, we don't have enough bandwidth to support other frameworks, but we see there are already community forks/ports made possible: https://github.com/webfansplz/vite-plugin-vue-devtools
Thanks for bringing this up.
Thanks for the fast answer.
I opened an issue in the fork/port: https://github.com/webfansplz/vite-plugin-vue-devtools/issues/6
| gharchive/issue | 2023-05-08T16:58:04 | 2025-04-01T06:45:11.338846 | {
"authors": [
"Jak2k",
"antfu"
],
"repo": "nuxt/devtools",
"url": "https://github.com/nuxt/devtools/issues/223",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
432821879 | I'm getting WS error in the browser (during & after) the building process
I get this error while Nuxt is building, and it also persists after the build finishes. If I reload the page after the build is done, it disappears.
Hi. Are you using nuxt with custom server (express, etc) or using nuxt dev?
Hey @pi0 I'm using express. Here I made a repo https://github.com/chanlito/nuxt-loading-bug showing the issue.
Unfortunately, this is expected as setting up WebSocket for a custom server requires additional code in the project. (Related issue #9) Loading-screen has a fallback with normal HTTP requests so normal progress should work for you.
Maybe it would be a better idea to hide [WS] errors.
Hi, any idea what we can do to avoid this error? After updating to nuxt@2.6.2, it still happens.
Could an example of the additional code be put here? :D It would be very useful.
I got the same issue, I don't see how hiding errors would fix it. What additional code would be needed?
@LOUKOUKOU Are you using nuxt 2.9?
"nuxt": "^2.4.0",
I'm also using
"@nuxt/loading-screen": "0.3.0"
Please upgrade to 2.9. New version of loading-screen uses SSE which should resolve your issues.
So it doesn't throw the errors now, but it stays on the loading screen.
ohhhhhhhhhhhhhhhhhhhhh!
Make sure to do this:
let config = require('./nuxt.config.js')
config.dev = false
Otherwise it thinks you're in dev but didn't build anything, so it has nothing to load.
| gharchive/issue | 2019-04-13T08:08:07 | 2025-04-01T06:45:11.358047 | {
"authors": [
"LOUKOUKOU",
"chanlito",
"pi0",
"summercn"
],
"repo": "nuxt/loading-screen",
"url": "https://github.com/nuxt/loading-screen/issues/12",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
252203160 | App freezing after middleware
Hi, I have this auth middleware:
export default function ({store, redirect, route}) {
  if (route.path === '/auth/verify' || route.path === '/auth/signup' || route.path === '/auth/password') {
    return
  } else if (!store.getters.loggedUser && !store.getters.bridgeCrossed) {
    return redirect('/bridge')
  }
}
So, /bridge is the page where I check that the user is logged in and redirect towards. But for some paths/routes I don't need or want to do this, as above. I don't know why, but when I load the app directly on these routes after some $router redirect action (a simple nuxt-link), the app freezes. When I navigate to these pages from others, it works fine... I guess this is because of this return, but how can I do it otherwise?
Thanks for help.
The return isn't needed if the middleware doesn't have anything to do:
export default function ({store, redirect, route}) {
  const authorizedRoutes = ['/auth/verify', '/auth/signup', '/auth/password']
  if (!store.getters.loggedUser && !store.getters.bridgeCrossed && authorizedRoutes.indexOf(route.path) === -1) {
    return redirect('/bridge')
  }
}
| gharchive/issue | 2017-08-23T08:37:02 | 2025-04-01T06:45:11.360477 | {
"authors": [
"lukasborawski",
"paulgv"
],
"repo": "nuxt/nuxt.js",
"url": "https://github.com/nuxt/nuxt.js/issues/1452",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
284565103 | How to create plugin for SMS Gateway API
Hi, I need help creating a plugin for an SMS gateway API.
https://www.npmjs.com/package/sms-gateway-nodejs
or
https://www.npmjs.com/package/smsgateway
I tried to create one, but I got an error that the fs dependency was not found, even though I have installed it.
Show us your code
@besnikh
I'm really not sure it's correct
import Vue from 'vue'
import SmsGateway from 'smsgateway'

Vue.use(SmsGateway)

export default ({app}, inject) => {
  app.smsGateway = new SmsGateway('....@gmail.com', 'password') // for smsgateway.me
}
| gharchive/issue | 2017-12-26T14:42:28 | 2025-04-01T06:45:11.364018 | {
"authors": [
"Imanullah",
"besnikh"
],
"repo": "nuxt/nuxt.js",
"url": "https://github.com/nuxt/nuxt.js/issues/2462",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
593617015 | Pages 404 on page refresh using Github Pages
Version
v2.11.0
Reproduction link
https://github.com/jbratcher/nuxt-gh-pages-404-reproduction
Steps to reproduce
Demo Link
Navigate to About Page
Refresh Page
What is expected ?
Page is refreshed
What is actually happening?
Page give 404 error
Additional comments?
I have a project using Nuxt and Github Pages and noticed when I refreshed static pages other than the home route, they would give a 404 error.
A similar but not exactly the same issue may be #2240
This issue can be fixed by setting generate.fallback to true. However, I'm really not sure why/how this works. As I understand it, generate.fallback is used to control what is rendered when the page isn't found.
In nuxt.config.js
export default {
  generate: {
    fallback: true
  },
}
The fix branch contains this code for the fix.
Any guidance on what is happening here? Thanks!
This bug report is available on Nuxt community (#c10479)
This is intended. When generating pages, you have to tell your platform/web server (GitHub Pages in your case) whether you want to handle 404 errors "through your own application". This is usually not wanted when building non-SPAs but highly recommended for SPAs (to avoid the mentioned reload/404 issue).
When enabled, generate.fallback creates a 404.html which is essentially an SPA fallback page. GitHub pages picks it up automatically and redirects all "not found URLs" there.
However, generated URLs (with mode: universal) should not need a redirect. In your case, the capitalization of About.vue is causing trouble here. Lowercase it and it should work fine 👍🏻
For the future, it might be an idea to unify pages no matter how they are capitalized (e.g. About.vue should behave the same as about.vue or aBoUt.vue), as long as no drawbacks are present (cc @pi0).
Closing here 😋
| gharchive/issue | 2020-04-03T20:17:02 | 2025-04-01T06:45:11.370353 | {
"authors": [
"jbratcher",
"manniL"
],
"repo": "nuxt/nuxt.js",
"url": "https://github.com/nuxt/nuxt.js/issues/7172",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
202420238 | make it more compatible with other frameworks
Make it usable with async/await.
Current coverage is 100% (diff: 100%)
Merging #157 into master will not change coverage
@@ master #157 diff @@
====================================
Files 11 11
Lines 383 383
Methods 0 0
Messages 0 0
Branches 0 0
====================================
Hits 383 383
Misses 0 0
Partials 0 0
Powered by Codecov. Last update e016b5d...d17b12e
Thanks @Mirodil
| gharchive/pull-request | 2017-01-22T23:35:59 | 2025-04-01T06:45:11.373341 | {
"authors": [
"Atinux",
"Mirodil",
"codecov-io"
],
"repo": "nuxt/nuxt.js",
"url": "https://github.com/nuxt/nuxt.js/pull/157",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1623641472 | Could not load E:/\u79C1\u5355/xxx/pages/car/screening/index.vue?macro=true&vue&type=script&setup=true&lang.ts (imported by pages/car/screening/index.vue?macro=true): ENOENT: no such file or directory, open 'E:/\u79C1\u5355/xxx/pages/car/screening/index.vue'
[vite:vue] Could not load E:/\u79C1\u5355/rvtimes_nuxt3_h5/pages/car/screening/index.vue?macro=true&vue&type=script&setup=true&lang.ts (imported by pages/car/screening/index.vue?macro=true): ENOENT: no such file or directory, open 'E:/\u79C1\u5355/rvtimes_nuxt3_h5/pages/car/screening/index.vue'
ERROR Could not load E:/\u79C1\u5355/rvtimes_nuxt3_h5/pages/car/screening/index.vue?macro=true&vue&type=script&setup=true&lang.ts (imported by pages/car/screening/index.vue?macro=true): ENOENT: no such file or directory, open 'E:/\u79C1\u5355/rvtimes_nuxt3_h5/pages/car/screening/index.vue'
at Object.openSync (node:fs:594:3)
at Object.readFileSync (node:fs:462:35)
at getDescriptor (/E:/%E7%A7%81%E5%8D%95/rvtimes_nuxt3_h5/node_modules/.pnpm/@vitejs+plugin-vue@4.0.0_vite@4.1.4+vue@3.2.47/node_modules/@vitejs/plugin-vue/dist/index.mjs:86:10)
at Object.load (/E:/%E7%A7%81%E5%8D%95/rvtimes_nuxt3_h5/node_modules/.pnpm/@vitejs+plugin-vue@4.0.0_vite@4.1.4+vue@3.2.47/node_modules/@vitejs/plugin-vue/dist/index.mjs:2663:28)
at /E:/%E7%A7%81%E5%8D%95/rvtimes_nuxt3_h5/node_modules/.pnpm/rollup@3.19.1/node_modules/rollup/dist/es/shared/node-entry.js:24343:40
at async PluginDriver.hookFirstAndGetPlugin (/E:/%E7%A7%81%E5%8D%95/rvtimes_nuxt3_h5/node_modules/.pnpm/rollup@3.19.1/node_modules/rollup/dist/es/shared/node-entry.js:24243:28)
at async /E:/%E7%A7%81%E5%8D%95/rvtimes_nuxt3_h5/node_modules/.pnpm/rollup@3.19.1/node_modules/rollup/dist/es/shared/node-entry.js:23531:75
at async Queue.work (/E:/%E7%A7%81%E5%8D%95/rvtimes_nuxt3_h5/node_modules/.pnpm/rollup@3.19.1/node_modules/rollup/dist/es/shared/node-entry.js:24453:32)
I have this problem too. Did you solve it?
| gharchive/issue | 2023-03-14T14:49:15 | 2025-04-01T06:45:11.379348 | {
"authors": [
"TracyG7",
"evanlong0926"
],
"repo": "nuxt/nuxt",
"url": "https://github.com/nuxt/nuxt/issues/19678",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2412959112 | The regex caused the auto-imports to fail.
Environment
Operating System: Windows_NT
Node Version: v20.13.1
Nuxt Version: 3.12.3
CLI Version: 3.12.0
Nitro Version: 2.9.7
Package Manager: pnpm@8.9.2
Builder: -
User Config: devtools, app, runtimeConfig, ssr, experimental, pwa, sourcemap, typescript, modules, vite, primevue, unocss, imports, googleFonts, turnstile, supabase, css, postcss
Runtime Modules: @nuxtjs/turnstile@0.9.3, @unocss/nuxt@0.61.5, nuxt-primevue@3.0.0, @nuxtjs/google-fonts@3.2.0, @nuxtjs/supabase@1.3.5, @pinia/nuxt@0.5.1, @pinia-plugin-persistedstate/nuxt@1.2.1, @vueuse/nuxt@10.11.0, @vite-pwa/nuxt@0.9.1
Build Modules: -
Reproduction
https://stackblitz.com/edit/github-c1luxa?file=app.vue
Describe the bug
const regex =
/^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-zA-Z0-9-]+(?:\.[a-zA-Z0-9-]+)*$/; //
Once I use this regex anywhere in the script tag, all subsequent auto-imports stop working.
Seriously, I often encounter this issue, especially when I write some content or comments within the template tags. Even though it fully complies with Vue's syntax, it causes compilation to fail until I either delete the content, change some words, or rewrite it differently. I don't understand the internal parsing logic of Nuxt or Vue3. This issue doesn't always happen, but it's quite common.
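One plausible explanation, which is a guess at the failure mode rather than something confirmed from the Nuxt source: the regex literal contains a backtick, which can confuse a scanner that strips template literals before looking for identifiers to auto-import. A toy illustration:

```javascript
// Toy illustration of the suspected failure mode; this is NOT the real
// auto-import scanner, just a naive one that toggles "inside a template
// literal" on every backtick. The single backtick inside the regex
// character class makes everything after it invisible to such a scanner.
const source = 'const regex = /^[a-z`~-]+$/; useFoo(); useBar();'

function naiveVisibleCode(src) {
  let inTemplate = false
  let out = ''
  for (const ch of src) {
    if (ch === '`') { inTemplate = !inTemplate; continue }
    if (!inTemplate) out += ch
  }
  return out
}

console.log(naiveVisibleCode(source).includes('useFoo')) // false: the scan never sees it
```

If that is the cause, it would explain why changing a few words or rewriting the content "fixes" it: any edit that removes or balances the stray backtick restores the scan.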
Additional context
No response
Logs
No response
let's track in https://github.com/unjs/unimport/issues/346.
| gharchive/issue | 2024-07-17T08:11:23 | 2025-04-01T06:45:11.385051 | {
"authors": [
"Ice-Hazymoon",
"danielroe"
],
"repo": "nuxt/nuxt",
"url": "https://github.com/nuxt/nuxt/issues/28190",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2446210880 | Why are many hooks in the custom routing module in the nuxt official documentation not using arrow functions correctly
Environment
https://nuxtjs.org.cn/docs/guide/going-further/custom-routing#using-approuteroptions
Reproduction
https://nuxtjs.org.cn/docs/guide/going-further/custom-routing#using-approuteroptions
Describe the bug
Additional context
No response
Logs
No response
Hello :wave: this may seem weird, but it is perfectly valid syntax. These are not arrow functions but normal function declarations with quotes due to -
| gharchive/issue | 2024-08-03T07:22:33 | 2025-04-01T06:45:11.388487 | {
"authors": [
"huang-julien",
"jiejie-color"
],
"repo": "nuxt/nuxt",
"url": "https://github.com/nuxt/nuxt/issues/28395",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1758872388 | feat(nuxt): allow configuring a mixin to keep $route injection synced
🔗 Linked issue
resolves https://github.com/nuxt/nuxt/issues/20674
❓ Type of change
[ ] 📖 Documentation (updates to the documentation, readme or JSdoc annotations)
[ ] 🐞 Bug fix (a non-breaking change that fixes an issue)
[x] 👌 Enhancement (improving an existing functionality like performance)
[ ] ✨ New feature (a non-breaking change that adds functionality)
[ ] 🧹 Chore (updates to the build process or auxiliary tools and libraries)
[ ] ⚠️ Breaking change (fix or feature that would cause existing functionality to change)
📚 Description
This adds an experimental option to sync $route template object with the Nuxt-managed route.
📝 Checklist
[ ] I have linked an issue or discussion.
[ ] I have updated the documentation accordingly.
Not sure that this should be an experimental feature, since it actually gives the same behavior with and without vue-router.
Also, I think that $route and useRoute() should only change once the next page is resolved.
I would consider this a bug fix, IMO. Happy to discuss it if you see it from another POV.
Agreed. However, personally I would advise against using this kind of injection in the template as much as possible, as it makes the 'dependencies' of the component less explicit.
Yet the main issue I have is that I'm not sure of the impact on performance. We might instead be able to transform and auto-execute useRoute based on usage. What do you think?
@antfu Would love your thoughts on whether it's safe to access _ctx._.provides in this way.
@antfu Very up for a different implementation here, but wanted to get this in for 3.7 release.
| gharchive/pull-request | 2023-06-15T13:42:47 | 2025-04-01T06:45:11.394915 | {
"authors": [
"Atinux",
"danielroe"
],
"repo": "nuxt/nuxt",
"url": "https://github.com/nuxt/nuxt/pull/21585",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2529116930 | How can I add data-* attributes to the generated <script> tag
📚 What are you trying to do?
I need to implement OneTrustSDK and that thing needs a data-domain-script attribute on the <script>-Tag. How can I get this data-attribute to render?
🔍 What have you tried?
I tried adding it in the scripts.globals key:
scripts: {
  globals: {
    oneTrust: {
      src: 'https://XXX.my.onetrust.eu/cdn/cookies/scripttemplates/otSDKStub.js',
      'data-domain-script': 'XXXXXX-XXXXXXX-XXXXX',
    }
  },
I also tried adding it in my composable:
export function useScriptOneTrust<T extends OneTrustApi>(options?: OneTrustInput) {
  return useRegistryScript<T, typeof OneTrustOptions>('oneTrust', options => ({
    scriptInput: {
      'src': 'https://XXX.my.onetrust.eu/cdn/cookies/scripttemplates/otSDKStub.js', // can't be bundled
      'data-domain-script': options.domainScript,
    },
  }))
}
The only thing I can see being rendered is a preload link which fails because it needs a <script>-Tag with a data-domain-script attribute:
<link rel="preload" as="script" href="https://XXX.my.onetrust.eu/cdn/cookies/scripttemplates/otSDKStub.js" crossorigin="anonymous" fetchpriority="low" data-hid="1817094">
I can see some data-cf-beacon attribute being set here, so I assumed it should be possible: https://github.com/nuxt/scripts/blob/main/src/runtime/registry/cloudflare-web-analytics.ts#L44
No response
Solved. I messed up passing the options properly. Works as expected and documented (it won't render a script tag by default by design).
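For reference, a minimal sketch of the option threading that was the problem here. The helper below is hypothetical (it only mirrors the registry snippet quoted above); the src is the elided URL from the snippets, and the forwarding of data-* keys onto the rendered <script> tag is the library behaviour this entry relies on:

```typescript
// Hypothetical helper mirroring the registry pattern above: it builds the
// scriptInput object that a composable would hand to useRegistryScript.
// Treat this as a sketch, not the library's API.
interface OneTrustInput {
  domainScript: string
}

function buildOneTrustScriptInput(options: OneTrustInput) {
  return {
    src: 'https://XXX.my.onetrust.eu/cdn/cookies/scripttemplates/otSDKStub.js', // elided URL from above
    'data-domain-script': options.domainScript,
  }
}
```

If the data-domain-script key goes missing anywhere along this chain (the mistake in this case), only the preload link renders.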
| gharchive/issue | 2024-09-16T18:00:39 | 2025-04-01T06:45:11.405406 | {
"authors": [
"bernhardberger"
],
"repo": "nuxt/scripts",
"url": "https://github.com/nuxt/scripts/issues/268",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2061057473 | Sorting only works on current page, but not the whole array of entries.
Environment
OS: WSL ubuntu
Node: 18+
Nuxt: 3.9
Version
v3.9
Reproduction
https://stackblitz.com/edit/nuxt-ui-fwppqm?file=app.vue
Description
If pagination is included in the data table, sorting only works for the current page, not for all the entries.
On the first page the titles go from D to Q ascending.
On the second page it goes from A to V ascending.
So basically each page is sorted individually. As far as I saw, it's only working correctly with the ids. I'm not sure if that's intended or not.
You can also check this out in the docs' example: https://ui.nuxt.com/getting-started/examples#table
Additional context
I have my own project where the same is happening with dates. The first pages gets sorted from earliest to latest, and the second page starts with the earliest again.
Logs
No response
@omgzsa I've updated the example to sort on the backend using v-model:sort and sort-mode="manual".
Hi, I have the same issue as the OP, but I'm not consuming an API that accepts query params, so I got all the data in the server side and use it to populate the table.
So I was wondering if there is a way to sort not only the current page of the table in a "preset" data set.
Thanks in advance.
Similar issue as @aldoea; any way to sort the entire dataset?
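For the client-side case raised above (all rows already loaded), the fix is to sort the whole dataset first and only then slice out the current page. A framework-agnostic sketch; the row shape and helper names are made up for illustration:

```typescript
interface Row { id: number; title: string }

// Sort the ENTIRE dataset (here by title), not just the visible page...
function sortByTitle(rows: Row[], direction: 'asc' | 'desc' = 'asc'): Row[] {
  const sign = direction === 'asc' ? 1 : -1
  return [...rows].sort((a, b) => sign * a.title.localeCompare(b.title))
}

// ...then slice the current page out of the already-sorted result.
function paginate(rows: Row[], page: number, pageSize: number): Row[] {
  return rows.slice((page - 1) * pageSize, page * pageSize)
}
```

With this ordering, page 2 continues where page 1 left off instead of each page being sorted independently.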
| gharchive/issue | 2023-12-31T13:17:00 | 2025-04-01T06:45:11.412399 | {
"authors": [
"aldoea",
"benjamincanac",
"omgzsa",
"safejace"
],
"repo": "nuxt/ui",
"url": "https://github.com/nuxt/ui/issues/1177",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1890498815 | Number Input does not contain minimum and maximum variables.
The number input does not expose minimum and maximum options. How can I set them?
Have you tried setting the min and max props on your <UInput>? Not all attributes are defined as props, since there are lots of them (https://developer.mozilla.org/en-US/docs/Web/HTML/Element/input#attributes) and they don't change the logic of the component. But all of them can be used, as there is a v-bind="$attrs" on the actual <input> element.
Sorry, my bad. :(
min and max are just regular attributes; there is no need to redefine them explicitly through props
| gharchive/issue | 2023-09-11T13:20:11 | 2025-04-01T06:45:11.415028 | {
"authors": [
"AndrewBogdanovTSS",
"benjamincanac",
"xtrmus"
],
"repo": "nuxt/ui",
"url": "https://github.com/nuxt/ui/issues/658",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1799821690 | Example Documentation Updates
Contributes to #1026
/merge
| gharchive/pull-request | 2023-07-11T22:05:37 | 2025-04-01T06:45:11.417602 | {
"authors": [
"cwharris",
"mdemoret-nv"
],
"repo": "nv-morpheus/Morpheus",
"url": "https://github.com/nv-morpheus/Morpheus/pull/1038",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1394814939 | mikm25-hw10
Link: https://esotemp.vse.cz/~mikm25/cv10
| gharchive/pull-request | 2022-10-03T14:15:26 | 2025-04-01T06:45:11.544467 | {
"authors": [
"marek-mikula",
"nvbach91"
],
"repo": "nvbach91/4IZ268-2022-2023-ZS",
"url": "https://github.com/nvbach91/4IZ268-2022-2023-ZS/pull/13",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1394664390 | 🛑 PufferAI HPC Portal is down
In fc5eab7, PufferAI HPC Portal (https://hpc.lab.novaglobal.com.sg) was down:
HTTP code: 0
Response time: 0 ms
Resolved: PufferAI HPC Portal is back up in d4dd883.
| gharchive/issue | 2022-10-03T12:37:51 | 2025-04-01T06:45:11.546903 | {
"authors": [
"nvgsg"
],
"repo": "nvgsg/lab-upptime",
"url": "https://github.com/nvgsg/lab-upptime/issues/1971",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1330727887 | 🛑 Kubeflow ML Platform is down
In 18b13a0, Kubeflow ML Platform (https://kubeflow.lab.novaglobal.com.sg) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Kubeflow ML Platform is back up in 9794117.
| gharchive/issue | 2022-08-06T11:28:02 | 2025-04-01T06:45:11.549425 | {
"authors": [
"nvgsg"
],
"repo": "nvgsg/lab-upptime",
"url": "https://github.com/nvgsg/lab-upptime/issues/314",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1505213368 | 🛑 Kubeflow ML Platform is down
In 3eee428, Kubeflow ML Platform (https://kubeflow.lab.novaglobal.com.sg) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Kubeflow ML Platform is back up in 8546db5.
| gharchive/issue | 2022-12-20T20:25:17 | 2025-04-01T06:45:11.552037 | {
"authors": [
"nvgsg"
],
"repo": "nvgsg/lab-upptime",
"url": "https://github.com/nvgsg/lab-upptime/issues/4099",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1551463040 | 🛑 NVIDIA AI Enterprise Hub is down
In c9c5daa, NVIDIA AI Enterprise Hub (https://nvaie.lab.novaglobal.com.sg) was down:
HTTP code: 0
Response time: 0 ms
Resolved: NVIDIA AI Enterprise Hub is back up in 81fc549.
| gharchive/issue | 2023-01-20T21:18:53 | 2025-04-01T06:45:11.554504 | {
"authors": [
"nvgsg"
],
"repo": "nvgsg/lab-upptime",
"url": "https://github.com/nvgsg/lab-upptime/issues/5847",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1037732276 | Added lookbehind functionality
Lookbehind is like lookahead but in the other direction. This can be useful if you want to select the function name from the function body with a custom query. If lookahead and lookbehind are both configured as true, lookahead is dominant. As a further improvement, I suggest that lookahead and lookbehind should be assignable to each keymap individually, because this would allow bindings like Vim's built-in f for searching forward and F for searching backward.
Thank you again for this nice addition @fabiocaruso !
| gharchive/pull-request | 2021-10-27T18:31:06 | 2025-04-01T06:45:11.568411 | {
"authors": [
"fabiocaruso",
"theHamsta"
],
"repo": "nvim-treesitter/nvim-treesitter-textobjects",
"url": "https://github.com/nvim-treesitter/nvim-treesitter-textobjects/pull/132",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2678370112 | Is NormalTangentMirrorTest rendered incorrectly?
rendered by vk_gltf_renderer
model is taken from glTF-Sample-Assets
and reference render
same problem with NormalTangentTest
reference
Strong normal maps in ray tracing can produce visible artifacts, especially on flat surfaces, due to the fundamental differences between ray tracing and rasterization approaches.
When light hits a real curved surface (as shown in the top diagram), the rays bounce off naturally, following the physical shape. The challenge with normal maps on flat surfaces (bottom diagram) is that they tell the light to bounce as if it were hitting a curve. In ray tracing, this can cause reflected rays to intersect back into the surface itself - something that wouldn't happen in reality. These invalid self-intersections show up as black spots in the final render.
This issue is specific to ray tracing because it accurately simulates light paths and geometry intersections. Rasterization with cube maps handles things differently - instead of calculating actual ray bounces, it simply looks up colors from the environment map based on the normal direction. That's why normal-mapped reflections can look convincing in rasterized renders, even though they're physically inaccurate.
While rasterization with strong normal maps produces nice visual results, creating the same look with ray tracing remains a challenge due to its physical basis. This illustrates the common trade-off between physical accuracy and artistic control in different rendering approaches.
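The invalid bounce described above can be detected numerically: reflect the incident direction about the shading (normal-mapped) normal and check whether the result points into the geometric surface. An illustrative sketch only, not the renderer's actual code:

```typescript
type Vec3 = [number, number, number]

const dot = (a: Vec3, b: Vec3): number => a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

// Reflect incident direction i (pointing toward the surface) about normal n.
const reflect = (i: Vec3, n: Vec3): Vec3 => {
  const d = 2 * dot(i, n)
  return [i[0] - d * n[0], i[1] - d * n[1], i[2] - d * n[2]]
}

// A reflected ray that dips below the geometric surface would immediately
// hit the surface it started from: the black spots described above.
function selfIntersects(incident: Vec3, shadingNormal: Vec3, geometricNormal: Vec3): boolean {
  return dot(reflect(incident, shadingNormal), geometricNormal) <= 0
}
```

With an unperturbed normal the reflection stays above the surface; tilt the shading normal far enough (a strong normal map at oblique incidence) and the reflected ray goes below it.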
| gharchive/issue | 2024-11-21T07:31:42 | 2025-04-01T06:45:11.575566 | {
"authors": [
"mklefrancois",
"tigrazone"
],
"repo": "nvpro-samples/nvpro_core",
"url": "https://github.com/nvpro-samples/nvpro_core/issues/76",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1741378317 | 1489 Segmentation fault "${@}"
Hei Guys,
I am working on a Raspberry Pi 3 Model B+ :-
Linux raspberrypi 6.1.21-v7+ #1642 SMP Mon Apr 3 17:20:52 BST 2023 armv7l GNU/Linux(new setup)
Installed the UUU-Tool which is mentioned like here:-
https://snapcraft.io/install/universal-update-utility/raspbian
Run first time UUU :-
ERROR: ld.so: object '/usr/lib/arm-linux-gnueabihf/libarmmem-${PLATFORM}.so' from /etc/ld.so.preload cannot be preloaded (cannot open shared object file): ignored.
uuu (Universal Update Utility) for nxp imx chips -- libuuu_1.5.21-0-g1f42172
uuu [-d -m -v -V] <bootloader|cmdlists|cmd>
.....
Added # at /etc/ld.so.preload to clear the error
Installed:-
sudo apt-get install libusb-1.0-0-dev libbz2-dev libzstd-dev
Find the device with -lsusb
uuu (Universal Update Utility) for nxp imx chips -- libuuu_1.5.21-0-g1f42172
Connected Known USB Devices
Path Chip Pro Vid Pid BcdVersion
==================================================
1:112 MX8MM SDP: 0x1FC9 0x0134 0x0101
Run:-
snap connect universal-update-utility:removable-media
Run:-
uuu uuu.auto
uuu (Universal Update Utility) for nxp imx chips -- libuuu_1.5.21-0-g1f42172
Success 0 Failure 0
1:112 1/ 1 [ ] SDP: boot -f imx-boot.bin
/snap/universal-update-utility/501/bin/universal-update-utility-launch: line 64: 1489 Segmentation fault "${@}"
Any advice? I can flash the image without any problems from our Windows system. The Raspberry Pi 4 Model B can't even find the MX8MM with lsusb, but that's a problem with the Raspberry Pi, I guess...
hei guys,
i looked at the ERROR: ld.so: object '/usr/lib/arm-linux-gnueabihf/libarmmem-${PLATFORM}.so' from /etc/ld.so.preload cannot be preloaded (cannot open shared object file): ignored.
Changed the /etc/ld.so.preload to:-
usr/lib/arm-linux-gnueabihf/libarmmen-v7l.so
Because the variable $PLATFORM is empty:
cat /proc/cpuinfo | grep "model name"
model name : ARMv7 Processor rev 4 (v7l)
model name : ARMv7 Processor rev 4 (v7l)
model name : ARMv7 Processor rev 4 (v7l)
model name : ARMv7 Processor rev 4 (v7l)
run:
uuu uuu.auto
ERROR: ld.so: object '/usr/lib/arm-linux-gnueabihf/libarmmem-v7l.so' from /etc/ld.so.preload cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/usr/lib/arm-linux-gnueabihf/libarmmem-v7l.so' from /etc/ld.so.preload cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/usr/lib/arm-linux-gnueabihf/libarmmem-v7l.so' from /etc/ld.so.preload cannot be preloaded (cannot open shared object file): ignored.
ERROR: ld.so: object '/usr/lib/arm-linux-gnueabihf/libarmmem-v7l.so' from /etc/ld.so.preload cannot be preloaded (cannot open shared object file): ignored.
uuu (Universal Update Utility) for nxp imx chips -- libuuu_1.5.21-0-g1f42172
Success 0 Failure 0
1:113 2/ 3 [ ] SDPV: write -f imx-boot.bin -skipspl
/snap/universal-update-utility/501/bin/universal-update-utility-launch: line 64: 5596 Segmentation fault "${@}"
Thanks in advance
Chris
Maybe you can try our prebuild image: https://github.com/nxp-imx/mfgtools/releases/download/master/uuu_armv7
The pre-built image link is 404.
I wound up here because I'm seeing a segmentation fault too. I'm using the current source built for armv7l:
Wait for Known USB Device Appear...
>Start Cmd:SDPS: boot -f /root/imx93-11x11-lpddr4x-evk/u-boot-spl.bin
New USB Device Attached at 2:1
Segmentation fault
The legal team is currently reviewing the snap delivery, so I can't update the snap until the review finishes.
Thank you for your reply. As mentioned in the first post, I am using a Raspberry Pi 3 Model B. I am now using the current commit. However, when I try to flash the image, I get stuck.
uuu -d -v handtmann_v1.0.0_serie/uuu.auto
uuu (Universal Update Utility) for nxp imx chips -- libuuu_1.5.191-1-g5d77a61
Build in config:
Pctl Chip Vid Pid BcdVersion Serial_No
==================================================
SDPS: MX8QXP 0x1fc9 0x012f [0x0002..0xffff]
SDPS: MX8QM 0x1fc9 0x0129 [0x0002..0xffff]
SDPS: MX8DXL 0x1fc9 0x0147
SDPS: MX28 0x15a2 0x004f
SDPS: MX815 0x1fc9 0x013e
SDPS: MX865 0x1fc9 0x0146
SDPS: MX8ULP 0x1fc9 0x014a
SDPS: MX8ULP 0x1fc9 0x014b
SDPS: MX93 0x1fc9 0x014e
SDPS: MX91 0x1fc9 0x0159
SDPS: MX95 0x1fc9 0x015d
SDPS: MX95 0x1fc9 0x015c
SDP: MX7D 0x15a2 0x0076
SDP: MX6Q 0x15a2 0x0054
SDP: MX6D 0x15a2 0x0061
SDP: MX6SL 0x15a2 0x0063
SDP: MX6SX 0x15a2 0x0071
SDP: MX6UL 0x15a2 0x007d
SDP: MX6ULL 0x15a2 0x0080
SDP: MX6SLL 0x1fc9 0x0128
SDP: MX7ULP 0x1fc9 0x0126
SDP: MXRT106X 0x1fc9 0x0135
SDP: MX8MM 0x1fc9 0x0134
SDP: MX8MQ 0x1fc9 0x012b
SDPU: SPL 0x0525 0xb4a4 [0x0000..0x04ff]
SDPV: SPL1 0x0525 0xb4a4 [0x0500..0x9998]
SDPV: SPL1 0x1fc9 0x0151 [0x0500..0x9998]
SDPU: SPL 0x0525 0xb4a4 [0x9999..0x9999]
SDPU: SPL 0x3016 0x1001 [0x0000..0x04ff]
SDPV: SPL1 0x3016 0x1001 [0x0500..0x9998]
FBK: 0x066f 0x9afe
FBK: 0x066f 0x9bff
FBK: 0x1fc9 0x0153
FB: 0x0525 0xa4a5
FB: 0x18d1 0x0d02
FB: 0x3016 0x0001
FB: 0x1fc9 0x0152
FB: 0x0483 0x0afb
Wait for Known USB Device Appear...
New USB Device Attached at 1:1332-
1:1332->Start Cmd:SDP: boot -f imx-boot.bin
100%1:1332->Okay (0.558s)
New USB Device Attached at 1:1332-
1:1332->Start Cmd:SDPV: delay 1000
1:1332->Okay (1.001s)
1:1332->Start Cmd:SDPV: write -f imx-boot.bin -skipspl
100%1:1332->Okay (1.415s)
1:1332->Start Cmd:SDPV: jump
100%1:1332->Okay (2.792s)
New USB Device Attached at 1:1332-150F1209DABC1514
1:1332-150F1209DABC1514>Start Cmd:FB: ucmd setenv fastboot_dev mmc
1:1332-150F1209DABC1514>Okay (0.05s)
1:1332-150F1209DABC1514>Start Cmd:FB: ucmd setenv mmcdev 2
1:1332-150F1209DABC1514>Okay (0.005s)
1:1332-150F1209DABC1514>Start Cmd:FB: ucmd mmc dev 2
1:1332-150F1209DABC1514>Okay (0.055s)
1:1332-150F1209DABC1514>Start Cmd:FB: flash -raw2sparse all handtmann.img
Originally, uuu was designed for 64-bit PC hosts only and simply uses mmap for big files. Maybe handtmann.img is too big for a 32-bit system.
| gharchive/issue | 2023-06-05T09:32:23 | 2025-04-01T06:45:11.632487 | {
"authors": [
"Chris1452",
"lionel",
"nxpfrankli"
],
"repo": "nxp-imx/mfgtools",
"url": "https://github.com/nxp-imx/mfgtools/issues/375",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
1636209428 | 🛑 Harmony Bot Website is down
In 92fa54e, Harmony Bot Website ($HARMONY_WEB) was down:
HTTP code: 0
Response time: 0 ms
Resolved: Harmony Bot Website is back up in 76e9b47.
| gharchive/issue | 2023-03-22T17:28:29 | 2025-04-01T06:45:11.636664 | {
"authors": [
"nxvvvv"
],
"repo": "nxvvvv/uptime",
"url": "https://github.com/nxvvvv/uptime/issues/16200",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
30626688 | Configuration
There are several options to gpg that it would be really useful to use, but I would like users to be able to set them as they please.
gpg (quite correctly) only reads from a single config file, so having simplekey use a "new default" file and let the user's options override isn't an option.
I'm tempted to create a set of options for simplekey, and add function to check that all of those options at least have values. If not, issue a warning or fail, with the suggestion of running ./simplekey configure to edit the file automatically to add those options.
The problem with this option is that it couples tightly to the gpg2 options format, but gpgconf doesn't manage a number of the options I'd like to be using.
I think that a strongly opinionated tool should use reasonable defaults, which work well on most cases (under certain conditions and assumptions). But it should also allow the users to change them easily, if they want to.
| gharchive/issue | 2014-04-01T18:52:41 | 2025-04-01T06:45:11.638414 | {
"authors": [
"dashohoxha",
"nyarly"
],
"repo": "nyarly/simplekey",
"url": "https://github.com/nyarly/simplekey/issues/9",
"license": "unlicense",
"license_type": "permissive",
"license_source": "bigquery"
} |
1918188733 | Using Nylas Python v3 SDK, create a code sample to update a user grant
Sup! I can work on this issue.
Sure, let me assign. @wiseaidev
@wiseaidev - any updates on this ticket?
@wiseaidev - just a few minor updates on the PR, we can merge after those are addressed 👍
| gharchive/issue | 2023-09-28T20:01:28 | 2025-04-01T06:45:11.652014 | {
"authors": [
"relaxedtomato",
"wiseaidev"
],
"repo": "nylas-samples/nylas-hacktoberfest-2023",
"url": "https://github.com/nylas-samples/nylas-hacktoberfest-2023/issues/50",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
223136531 | Update _launcherPath to use correct folder & file
Looks like this got missed during a recent update.
Should fix: #3440
@jstejada do these get looked at? Seems you have a lot of people making issue and PR contributions, but no feedback at all, not even acknowledgement from Nylas.
@lxalln Was this still an issue on Nylas Mail Lives? I know your PR never made it in.
@dweremeichik As far as I'm aware this was still an issue. Although I never got around to running NML as a daily driver, so wasn't too concerned with it starting up on boot.
| gharchive/pull-request | 2017-04-20T17:09:02 | 2025-04-01T06:45:11.654211 | {
"authors": [
"dweremeichik",
"lxalln"
],
"repo": "nylas/nylas-mail",
"url": "https://github.com/nylas/nylas-mail/pull/3443",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
546764619 | Feature/styles extend
Added draggableContainer & mask to customStyles which enables their extension (composition).
I think it's not necessary to add styles on the mask & draggable container, and most other developers won't need this, so I won't merge this pull request. However, thank you for taking the time to improve this lib. I'll merge this code if other developers need to extend this feature.
| gharchive/pull-request | 2020-01-08T10:12:33 | 2025-04-01T06:45:11.676569 | {
"authors": [
"nysamnang",
"veksi"
],
"repo": "nysamnang/react-native-raw-bottom-sheet",
"url": "https://github.com/nysamnang/react-native-raw-bottom-sheet/pull/45",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
210315053 | Add push_db
When developing a site locally, I usually push to staging all the time. Do you have a script for pushing to remote already, or should I send you a pull request?
The workflow I use for that is I have Forge set up to auto-deploy when pushed to the staging or develop branch of my Git repo.
So I don't really need a script to do that; if you want to then pull the db from local to staging, you could either set up the config on the staging server to pull the db from your local dev, or yeah, you could make a push_db script analogous to the pull_db script that would do the opposite.
Pull requests would be awesome if you do decide to make a push_db script; it should be quite simple to do by modifying the pull_db script.
If you do, you might also want to make an analogous push_assets script as well.
Has anything happened with this? I may be misunderstanding how these scripts should be utilised, but our current setup is production branch connected to master db. All changes to the database are made on the master, then the scripts pulls the db and assets down locally to my local env. Now we have a staging env as well, so ideally I'd like to pull the master db and assets down to this env, or push them up from my local to it as it's essentially the same. Is this currently possible?
@shornuk you should try using a workflow as described here:
https://nystudio107.com/blog/database-asset-syncing-between-environments-in-craft-cms
You absolutely can pull the db & assets to your staging environment; just set up a .env.sh for that environment as well.
@sjelfull did you ever work up a workflow for pushing_db & push_assets? 😬
Naah, I did not. Someday 😬
Anyone think this would work for pushing the local database to the remote database? I tried to plug into their scripts and variables as much as possible. This has been tested in pieces, but not as a whole. I'm about 99% sure it should work, but 1% of me still thinks it will crash everything : P
push_db.sh
#!/bin/bash
# Get the directory of the currently executing script
DIR="$(dirname "${BASH_SOURCE[0]}")"
# Include files
INCLUDE_FILES=(
"common/defaults.sh"
".env.sh"
"common/common_env.sh"
"common/common_db.sh"
)
for INCLUDE_FILE in "${INCLUDE_FILES[@]}"; do
if [[ ! -f "${DIR}/${INCLUDE_FILE}" ]] ; then
echo "File ${DIR}/${INCLUDE_FILE} is missing, aborting."
exit 1
fi
source "${DIR}/${INCLUDE_FILE}"
done
# Backup local database:
./backup_db.sh
# Set the backup db file name, parent directory path, and full path
BACKUP_DB_DIR_PATH="${LOCAL_BACKUPS_PATH}${LOCAL_DB_NAME}/${DB_BACKUP_SUBDIR}/"
BACKUP_DB_NAME="$(basename $(ls -t $BACKUP_DB_DIR_PATH*.sql* | head -1))";
BACKUP_DB_PATH="${BACKUP_DB_DIR_PATH}${BACKUP_DB_NAME}";
# Copy local backup to server:
echo "*** Copying $BACKUP_DB_PATH to $REMOTE_BACKUPS_PATH$BACKUP_DB_NAME";
scp "$BACKUP_DB_PATH" "$REMOTE_SSH_LOGIN:$REMOTE_BACKUPS_PATH$BACKUP_DB_NAME";
# Log into server and restore local backup
ssh -t $REMOTE_SSH_LOGIN << EOF
"${REMOTE_ROOT_PATH}scripts/restore_db.sh" "$REMOTE_BACKUPS_PATH$BACKUP_DB_NAME";
EOF
# Normal exit
exit 0
| gharchive/issue | 2017-02-26T12:29:50 | 2025-04-01T06:45:11.683059 | {
"authors": [
"callaginn",
"chasegiunta",
"khalwat",
"shornuk",
"sjelfull"
],
"repo": "nystudio107/craft-scripts",
"url": "https://github.com/nystudio107/craft-scripts/issues/6",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
925503243 | Design file structure
As a Developer
I need to decide file structure
So that developers can start developing
Assumptions:
decide file structure
decide file name (e.g., model.py, status.py, etc)
write file description
Acceptance Criteria:
Given folder and files
When developers read each file
Then understand each file role and start writing code in the appropriate file
Just a comment to take note of meeting discussion:
In this story, we should have set up empty files (such as models.py) so we could start branching with these files and developing.
| gharchive/issue | 2021-06-20T02:04:39 | 2025-04-01T06:45:11.686914 | {
"authors": [
"jtsen",
"shihokuni"
],
"repo": "nyu-devops-2021-summer-promotions/promotions",
"url": "https://github.com/nyu-devops-2021-summer-promotions/promotions/issues/6",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1046658143 | Checklist could be different than 2 entries
https://github.com/nyxnor/onionservice/blob/d429606e546b65321ce8b7191cd23702f8871dbb/onionservice-tui#L70
Issue
It seems (I use whiptail instead of dialog) that the "2" representing two entries is hardcoded in this line of code. However, someone may have only one or several entries.
Describe the solution you'd like
For the above reasons, I propose to replace "2" with "$i". To make sure that the list does not get too long and fits on a little LCD display, the entire code could look like this:
if [ $i -gt 11 ]; then i=11; fi
CHOICE_SERVICE="$(dialog --clear --backtitle "${BACKTITLE}" --title "${TITLE}" --"${DIALOG_TYPE}" "${MENU}" \
"$((i+8))" 80 ${i} ${SERVICE_LIST} 2>&1 >/dev/tty)"
Additional remarks
I didn't submit a pull request yet because I don't use dialog, and I'm not entirely sure it reacts like whiptail.
On dialog, one or more services works fine
1 service:
3 services:
LCD display
Yes, the max will be pinned, so this change would be accepted; if possible, please also consider the AUTH_* entries and any other relevant option when pinning/fixing the lines for client names (why 11?).
[ "${i}" -gt 11 ] && i=11
# shellcheck disable=SC2086
CHOICE_SERVICE="$(dialog --clear --backtitle "${BACKTITLE}" --title "${TITLE}" --"${DIALOG_TYPE}" "${MENU}" \
"$((i+8))" 80 ${i} ${SERVICE_LIST} 2>&1 >/dev/tty)"
11 because this is the maximum number I can squeeze into my PiTFT Plus 480x320 3.5".
| gharchive/issue | 2021-11-07T04:30:52 | 2025-04-01T06:45:11.693370 | {
"authors": [
"nyxnor",
"radio24"
],
"repo": "nyxnor/onionservice",
"url": "https://github.com/nyxnor/onionservice/issues/3",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1032481254 | SSL connection error
@o1lab,
I have been trying to connect via the CLI to an external DB server and got the following error.
According to the error, the connection seems to be attempted properly, but an additional SSL parameter is missing.
Could you please advise?
Generating REST APIs at the speed of your thought..
Cache init failed during database reading
Error: UNKNOWN_CODE_PLEASE_REPORT: SSL connection is required. Please specify SSL options and retry.
at Handshake.Sequence._packetToError (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\protocol\sequences\Sequence.js:47:14)
at Handshake.ErrorPacket (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\protocol\sequences\Handshake.js:123:18)
at Protocol._parsePacket (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\protocol\Protocol.js:291:23)
at Parser._parsePacket (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\protocol\Parser.js:433:10)
at Parser.write (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\protocol\Parser.js:43:10)
at Protocol.write (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\protocol\Protocol.js:38:16)
at Socket. (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\Connection.js:88:28)
at Socket. (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\Connection.js:526:10)
at Socket.emit (events.js:310:20)
at addChunk (_stream_readable.js:286:12)
--------------------
at Protocol._enqueue (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\protocol\Protocol.js:144:48)
at Protocol.handshake (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\protocol\Protocol.js:51:23)
at PoolConnection.connect (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\Connection.js:116:18)
at Pool.getConnection (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\Pool.js:48:16)
at Pool.query (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\node_modules\mysql\lib\Pool.js:202:8)
at Xsql.dbCacheInitAsync (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\lib\xsql.js:31:15)
at Xsql.init (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\lib\xsql.js:23:10)
at Xapi.init (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\lib\xapi.js:35:16)
at startXmysql (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\bin\index.js:42:12)
at start (C:\Users\Stas\AppData\Roaming\npm\node_modules\xmysql\bin\index.js:89:5) {
code: 'UNKNOWN_CODE_PLEASE_REPORT',
errno: 9002,
sqlMessage: 'SSL connection is required. Please specify SSL options and retry.\u0000',
sqlState: '28000',
fatal: true
} undefined
xmysql doesn't have an option to set the SSL mode. Is this something that can be configured?
What would be the workaround for this?
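One possible direction (untested sketch): the underlying node mysql driver accepts an ssl key in its connection options, so a patched xmysql could merge such a block into the config it passes to mysql.createPool. Only the ssl key itself comes from the driver's documented options; the helper and config shape below are hypothetical:

```typescript
// Hypothetical helper: merge an SSL block into the connection config before
// it reaches mysql.createPool(). The `ssl` option is part of the node
// `mysql` driver's connection options; everything else here is a sketch.
interface DbConfig {
  host: string
  user: string
  password: string
  database: string
  ssl?: { rejectUnauthorized: boolean }
}

function withSsl(config: DbConfig): DbConfig {
  return { ...config, ssl: { rejectUnauthorized: true } }
}
```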
| gharchive/issue | 2021-10-21T13:12:14 | 2025-04-01T06:45:11.735017 | {
"authors": [
"Totti10as",
"shyam9813"
],
"repo": "o1lab/xmysql",
"url": "https://github.com/o1lab/xmysql/issues/40",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
233818296 | edit VM is aligned to right
The content of the Edit VM dialog is aligned to the right, while the VM detail is aligned to the left.
Related to https://github.com/oVirt/ovirt-web-ui/issues/196
Closing, will be fixed within #196
| gharchive/issue | 2017-06-06T08:38:40 | 2025-04-01T06:45:11.760704 | {
"authors": [
"jelkosz",
"mareklibra"
],
"repo": "oVirt/ovirt-web-ui",
"url": "https://github.com/oVirt/ovirt-web-ui/issues/215",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
2375597522 | [PUPIL-805] Refactoring some areas of the pupil journey
Description
Music year: 1966
This branch refactors a few areas of the pupil browse journey.
I'll add comments throughout the PR for code review.
How to test
Go to https://deploy-preview-2541--oak-web-application.netlify.thenational.academy/pupils/years
Check that the browse journey is working as expected
Checklist
[x] Added or updated tests where appropriate
[x] Manually tested across browsers / devices
[x] Considered impact on accessibility
[ ] Design sign-off
[ ] Approved by product owner
[x] Does this PR update a package with a breaking change
:tada: This PR is included in version 1.475.3 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket:
| gharchive/pull-request | 2024-06-26T14:59:37 | 2025-04-01T06:45:11.771173 | {
"authors": [
"BTitterington",
"oak-machine-user"
],
"repo": "oaknational/Oak-Web-Application",
"url": "https://github.com/oaknational/Oak-Web-Application/pull/2541",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1973580526 | [VL] Allow users to set bloom filter configurations
Description
Gluten should allow users to configure these bloom filter params, as they may impact performance.
https://github.com/facebookincubator/velox/blob/73d4279a14744cf4d038d3a967a49dcddbad9d39/velox/core/QueryConfig.h#L629
If no one would like to pick this up, I would like to work on it, as the current default value of MaxNumBits is too small compared to the Spark default value; we saw about a 20s difference in our TPC-DS run.
| gharchive/issue | 2023-11-02T06:16:12 | 2025-04-01T06:45:11.780723 | {
"authors": [
"zhli1142015",
"zhouyuan"
],
"repo": "oap-project/gluten",
"url": "https://github.com/oap-project/gluten/issues/3594",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1472874231 | Ensure access to EJBs is controlled
Several EJBs are not properly annotated yet, so they don't follow the required authentication/authorization model. Authentication is required to access EJBs and, once authenticated, a user can access all the exposed methods of the EJB.
In order to implement this model, EJBs must be annotated with @PermitAll and @SecurityDomain(value = "other"), see for example BondTradeServiceBean.
In addition, we must ensure that all Business delegates call services using SecurityUtil.run/SecurityUtil.runEx, so credentials are passed to the server. BondTradeBusinessDelegate can be checked to see an illustration of correct calls.
Environment:
Version: 1.0.6
It is now done, this ticket will be part of the v2.0.0.
| gharchive/issue | 2022-12-02T13:31:56 | 2025-04-01T06:45:11.878680 | {
"authors": [
"oasuncion"
],
"repo": "oasuncion/tradista",
"url": "https://github.com/oasuncion/tradista/issues/76",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
310681417 | vk.com me() function returns 500
Fiddle:
https://jsfiddle.net/d7nfc85m/79/
In 'network' tab:
URL:
Request URL: https://oauth.io/auth/vk/me
Request Method: GET
Status Code: 500 Internal Server Error
Remote Address: 52.72.148.11:443
Referrer Policy: no-referrer-when-downgrade
Response Header:
Access-Control-Allow-Methods: GET, POST, PUT, PATCH, DELETE
Access-Control-Allow-Origin: https://fiddle.jshell.net
Connection: keep-alive
Content-Length: 111
Content-Type: application/json; charset=utf-8
Date: Tue, 03 Apr 2018 03:04:39 GMT
Server: nginx/1.6.2
Request Header:
Accept: */*
Accept-Encoding: gzip, deflate, br
Accept-Language: ja,en-US;q=0.9,en;q=0.8
Connection: keep-alive
Host: oauth.io
oauthio: k=HwAr2OtSxRgEEnO2-JnYjsuA3tc&access_token=xxxxxxxxxxxx
Origin: https://fiddle.jshell.net
Referer: https://fiddle.jshell.net/_display/
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/65.0.3325.162 Safari/537.36
Response:
{"status":"error","code":500,"message":"Cannot read property '0' of undefined","data":{"code":"InternalError"}}
Fixed
| gharchive/issue | 2018-04-03T03:26:49 | 2025-04-01T06:45:11.891961 | {
"authors": [
"nethsix"
],
"repo": "oauth-io/oauthd",
"url": "https://github.com/oauth-io/oauthd/issues/210",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
788360490 | Error when leaving tabpage
To reproduce
Run Vim with vim-hitspop.
:tabnew.
Run an arbitrary search command to show hitspop's popup.
:tabclose.
Error occurs.
Error detected while processing WinEnter Autocommands for "*"..function hitspop#main:
line 13:
E716: Key not present in Dictionary: "line, opts.col] != [coord.line, coord.col]"
After :tabclose, hitspop's popup is closed, but its callback function doesn't seem to be called. Why?
| gharchive/issue | 2021-01-18T15:29:55 | 2025-04-01T06:45:11.920366 | {
"authors": [
"obcat"
],
"repo": "obcat/vim-hitspop",
"url": "https://github.com/obcat/vim-hitspop/issues/9",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2483095545 | Add a step for building the normalizer to the Maven plugin
Details
@yegor256 wrote a Maven Plugin for the whole pipeline
The pipeline is in a Dockerfile
normalizer should be built in that pipeline
Subtasks
[x] get a link to the Dockerfile from @yegor256
[ ] add steps to build normalizer to the Dockerfile
@deemp here it is: https://github.com/objectionary/hone-maven-plugin/issues/19
I believe this issue is superseded by https://github.com/objectionary/hone-maven-plugin/issues/19
@deemp we can close this ticket, I believe
| gharchive/issue | 2024-08-23T12:58:02 | 2025-04-01T06:45:11.972006 | {
"authors": [
"deemp",
"yegor256"
],
"repo": "objectionary/normalizer",
"url": "https://github.com/objectionary/normalizer/issues/476",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2442760389 | feat(#344): Identify Problem With INVOKESPECIAL Instruction
In this PR I identified problem related to INVOKESPECIAL instruction decompilation.
I added one more integration test to reveal the problem and added one more puzzle to solve the problem in the future.
Related to #344.
PR-Codex overview
This PR adds a new Playground class, tests it, and refactors the InvokespecialHandler class.
Detailed summary
Added Playground class with isAvailable method
Added test for Playground class in Main.java
Refactored InvokespecialHandler class in handlers package
The following files were skipped due to too many changes: src/test/resources/xmir/disassembled/WebProperties$Resources$Chain$Strategy$Content.xmir
✨ Ask PR-Codex anything about this PR by commenting with /codex {your question}
@rultor merge
@rultor merge
@volodya-lombrozo OK, I'll try to merge now. You can check the progress of the merge here
@rultor merge
@volodya-lombrozo Done! FYI, the full log is here (took me 8min)
@volodya-lombrozo Thanks for the contribution! You've earned +5 points for this: +30 as a basis; -7 for too many hits-of-code (251 >= 100); -15 for the lack of code review; -10 for too few (2) comments; +7 to give you at least something. Please, keep them coming. Your running balance is +274.
| gharchive/pull-request | 2024-08-01T15:26:44 | 2025-04-01T06:45:11.978057 | {
"authors": [
"0crat",
"rultor",
"volodya-lombrozo"
],
"repo": "objectionary/opeo-maven-plugin",
"url": "https://github.com/objectionary/opeo-maven-plugin/pull/360",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2453180543 | The Carbon TearSheet component
Tasks
[x] #667
[x] container: responsive
[x] overlay
[x] title + description
[x] #668
[x] influencer
[x] #674
[x] #675
[x] #682
Carbon was removed from Way (for now at least).
| gharchive/issue | 2024-08-07T10:50:53 | 2025-04-01T06:45:11.981850 | {
"authors": [
"marcioendo"
],
"repo": "objectos/objectos.way",
"url": "https://github.com/objectos/objectos.way/issues/666",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
372972484 | humeroulnar joint
Add new term:
abnormal humeroulnar joint morphology (a child of abnormal forelimb joint morphology MP:0030807, abnormal synovial joint morphology MP:0030804 and abnormal elbow joint morphology MP:0013945)
DEF: any structural anomaly of the simple hinge-type synovial joint between the trochlea on the medial aspect of the distal end of the humerus and the trochlear notch on the proximal ulna
SYN: abnormal humero-ulnar joint morphology
SYN: abnormal ulnohumeral joint morphology
SYN: abnormal ulno-humeral joint morphology
SYN: abnormal articulatio humeroulnaris morphology
SOURCE: https://www.kenhub.com/en/library/anatomy/elbow-joint
Added new term:
abnormal humeroulnar joint morphology MP:0030897
| gharchive/issue | 2018-10-23T12:24:50 | 2025-04-01T06:45:12.000795 | {
"authors": [
"anna-anagnostop"
],
"repo": "obophenotype/mammalian-phenotype-ontology",
"url": "https://github.com/obophenotype/mammalian-phenotype-ontology/issues/2929",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1594118093 | dependabot parse error
now it is showing an error :+1:
https://github.com/obs-nebula/function-five/commit/139700a62336e9b27452392d3f13e0e05f353709
| gharchive/issue | 2023-02-21T21:24:25 | 2025-04-01T06:45:12.002260 | {
"authors": [
"helio-frota"
],
"repo": "obs-nebula/function-five",
"url": "https://github.com/obs-nebula/function-five/issues/31",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1835121975 | Importer creates files with invalid characters
Received a report via email that / is replaced by : in filenames.
/ is not allowed in filenames. Could you get more info from them? Which export format are they importing? Can they open the file to check how it's written out in the export file?
Any updates? There is not enough information to do anything.
Let's reopen when there's more details.
| gharchive/issue | 2023-08-03T13:57:35 | 2025-04-01T06:45:12.028522 | {
"authors": [
"ericaxu",
"joethei",
"lishid"
],
"repo": "obsidianmd/obsidian-importer",
"url": "https://github.com/obsidianmd/obsidian-importer/issues/41",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1998299161 | CLI: update utils.js to show to cd into project before running
What does this do?
Shows a message after the '📱 Run your project:' sentence telling the user to cd into the project before running.
Why did you do this?
To make it clear that, before running, one needs to cd into the project.
Who/what does this impact?
No impact, only changes the message shown on CLI
How did you test this?
I don't know how to test it, as I would need to deploy the package on npm. I'd appreciate it if a maintainer showed me how.
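As a minimal sketch of what the added hint might look like in utils.js — the function name, package-manager commands, and exact wording here are assumptions, not the template's actual code:

```javascript
// Hypothetical sketch of the CLI hint in utils.js: after scaffolding,
// remind the user to cd into the new project before running it.
function runInstructions(projectName) {
  return [
    `📱 Run your project:`,
    `   cd ${projectName}   # move into the project first`,
    `   pnpm ios  -- or -- pnpm android`,
  ].join('\n');
}

console.log(runInstructions('my-app'));
```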
Thank you @onurusluca for your contribution
fix #242
| gharchive/pull-request | 2023-11-17T05:20:10 | 2025-04-01T06:45:12.117931 | {
"authors": [
"onurusluca",
"yjose"
],
"repo": "obytes/react-native-template-obytes",
"url": "https://github.com/obytes/react-native-template-obytes/pull/247",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1883872173 | [Bug]: show trace result set download sql data only one line of data
ODC version
4.2.0
OB version
oracle_3.2.3.2
What happened?
select * from emp; -- emp empty table
Execute show trace in the sql window; download the result set
What did you expect to happen?
All data in the result set
How can we reproduce it (as minimally and precisely as possible)?
No response
Anything else we need to know?
No response
Cloud
No response
Show trace's result set is not meant to be downloadable currently.
| gharchive/issue | 2023-09-06T12:05:09 | 2025-04-01T06:45:12.318557 | {
"authors": [
"LuckyPickleZZ",
"runzi389205"
],
"repo": "oceanbase/odc",
"url": "https://github.com/oceanbase/odc/issues/212",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1931613330 | [Bug]: When all recognition rules are selected, the disabled rules will be recognized together
ODC version
odc422
OB version
not have
What happened?
What did you expect to happen?
When selecting all, disabled recognition rules should not be included
How can we reproduce it (as minimally and precisely as possible)?
Select all identification rules
Anything else we need to know?
No response
Cloud
No response
pass
| gharchive/issue | 2023-10-08T03:01:18 | 2025-04-01T06:45:12.322102 | {
"authors": [
"sl01388797"
],
"repo": "oceanbase/odc",
"url": "https://github.com/oceanbase/odc/issues/433",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2007550024 | [Feature]: mysql and ob mysql mode support gis datatype
Is your feature request related to a problem?
no
Describe the solution you'd like
MySQL supports GIS data types. Support creating, viewing, and managing tables containing GIS field types.
Also support processing result sets containing GIS field types:
GEOMETRY
POINT
LINESTRING
POLYGON
MULTIPOINT
MULTILINESTRING
MULTIPOLYGON
GEOMETRYCOLLECTION
Additional context
No response
duplicate:https://github.com/oceanbase/odc/issues/851
| gharchive/issue | 2023-11-23T06:18:04 | 2025-04-01T06:45:12.325690 | {
"authors": [
"Jane201510",
"PeachThinking"
],
"repo": "oceanbase/odc",
"url": "https://github.com/oceanbase/odc/issues/899",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1134879250 | [Bug]: Type Error SlashOptionBase, required set to false
What happened?
A bug happened! Type error: the build fails when using required options on SlashOptionBase
Reproduction
Upgrade the project to the latest discordx package.
Go to any slash command, add a required option, and notice the error pops up.
Package
discordx
Version
Stable
Relevant log output
(property) required: true
Type 'true' is not assignable to type 'false'.ts(2322)
slash.d.ts(11, 5): The expected type comes from property 'required' which is declared here on type 'SlashOptionParams'
Code of Conduct
[X] I agree to follow this project's Code of Conduct
Please see #510
It's now reverted back to boolean! 👍
| gharchive/issue | 2022-02-13T00:39:34 | 2025-04-01T06:45:12.334684 | {
"authors": [
"ReevMich",
"oceanroleplay"
],
"repo": "oceanroleplay/discord.ts",
"url": "https://github.com/oceanroleplay/discord.ts/issues/513",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
105197538 | Does not compile with the latest GHCJS
The error I'm getting is
[2 of 5] Compiling VirtualDom.Prim ( src/VirtualDom/Prim.hs, dist/dist-sandbox-686dbbbb/build/VirtualDom/Prim.js_o )
src/VirtualDom/Prim.hs:28:9:
Not in scope: type constructor or class ‘ToJSString’
Perhaps you meant ‘JSString’ (imported from GHCJS.Types)
src/VirtualDom/Prim.hs:29:29: Not in scope: ‘toJSString’
src/VirtualDom/Prim.hs:55:5:
Not in scope: ‘fromJSString’
Perhaps you meant ‘fromString’ (imported from Data.String)
src/VirtualDom/Prim.hs:88:41:
Not in scope: type constructor or class ‘JSArray’
src/VirtualDom/Prim.hs:92:41:
Not in scope: type constructor or class ‘JSArray’
src/VirtualDom/Prim.hs:94:32:
Not in scope: type constructor or class ‘JSArray’
src/VirtualDom/Prim.hs:127:50:
Not in scope: type constructor or class ‘JSFun’
src/VirtualDom/Prim.hs:135:51: Not in scope: ‘syncCallback1’
src/VirtualDom/Prim.hs:135:65:
Not in scope: data constructor ‘AlwaysRetain’
src/VirtualDom/Prim.hs:139:54:
Not in scope: type constructor or class ‘JSFun’
src/VirtualDom/Prim.hs:148:55: Not in scope: ‘syncCallback2’
src/VirtualDom/Prim.hs:148:69:
Not in scope: data constructor ‘AlwaysRetain’
The definitions were removed in https://github.com/ghcjs/ghcjs-base/commit/919c5acbe958f03e2ea8a54c5e784b0d1f7c0010 — I couldn't find a migration guide, though.
You might be better off moving back over to https://github.com/ghcjs/ghcjs-vdom - the improved-base branch has recent activity. I say that because I'm not actively using virtual-dom anymore, so I'm not sure when I can get around to fixing this.
Thanks for letting me know.
| gharchive/issue | 2015-09-07T11:15:49 | 2025-04-01T06:45:12.345087 | {
"authors": [
"int-index",
"ocharles"
],
"repo": "ocharles/virtual-dom",
"url": "https://github.com/ocharles/virtual-dom/issues/6",
"license": "bsd-3-clause",
"license_type": "permissive",
"license_source": "bigquery"
} |
2490249314 | Idempotency Issue with Exadata Infrastructure
I created an Exadata Infrastructure using the AzAPI Infra call. When I re-ran my Terraform I got an "Exadata Infrastructure name cannot be reused within the same resource group. Please use a different name" error message.
To test, follow these steps:
Clone https://github.com/sihbher/avm-res-oracledatabase-cloudexadatainfrastructure/tree/main/
Run Terraform apply on this example
Re-run Terraform apply on the example
Error message
Plan: 0 to add, 4 to change, 0 to destroy.
azurerm_resource_group.this: Modifying... [id=/subscriptions/ef2c2154-2d60-433c-a2d8-b947e184c3a7/resourceGroups/rg-bg1x]
azurerm_resource_group.this: Modifications complete after 1s [id=/subscriptions/ef2c2154-2d60-433c-a2d8-b947e184c3a7/resourceGroups/rg-bg1x]
module.avm_odaa_infra.azapi_resource.odaa_infra: Modifying... [id=/subscriptions/ef2c2154-2d60-433c-a2d8-b947e184c3a7/resourceGroups/rg-bg1x/providers/Oracle.Database/cloudExadataInfrastructures/odaa-infra-ig01j]
module.odaa_vnet.azapi_resource.vnet: Modifying... [id=/subscriptions/ef2c2154-2d60-433c-a2d8-b947e184c3a7/resourceGroups/rg-bg1x/providers/Microsoft.Network/virtualNetworks/vnet-odaa]module.odaa_vnet.azapi_resource.vnet: Modifications complete after 4s [id=/subscriptions/ef2c2154-2d60-433c-a2d8-b947e184c3a7/resourceGroups/rg-bg1x/providers/Microsoft.Network/virtualNetworks/vnet-odaa]
╷
│ Error: Failed to create/update resource
│
│ with module.avm_odaa_infra.azapi_resource.odaa_infra,
│ on ..\..\..\avm-res-oracle-database-cloudexadatainfrastructure-fork\main.tf line 2, in resource "azapi_resource" "odaa_infra":
│ 2: resource "azapi_resource" "odaa_infra" {
│
│ creating/updating Resource: (ResourceId
│ "/subscriptions/ef2c2154-2d60-433c-a2d8-b947e184c3a7/resourceGroups/rg-bg1x/providers/Oracle.Database/cloudExadataInfrastructures/odaa-infra-ig01j"
│ / Api Version "2023-09-01"): PUT
│ https://management.azure.com/subscriptions/ef2c2154-2d60-433c-a2d8-b947e184c3a7/resourceGroups/rg-bg1x/providers/Oracle.Database/cloudExadataInfrastructures/odaa-infra-ig01j
│ --------------------------------------------------------------------------------
│ RESPONSE 400: 400 Bad Request
│ ERROR CODE: 400
│ --------------------------------------------------------------------------------
│ {
│ "error": {
│ "code": "400",
│ "message": "Exadata Infrastructure name cannot be reused within the same resource group. Please use a different name"
│ }
│ }
│ --------------------------------------------------------------------------------
│
@terrymandin We are not able to reproduce the issue on our end. would you pls share your environment configuration? We can take a look into that.
@chanstev , thanks for looking into this. We will re-test on our side.
@chanstev, we have confirmed that this is still an issue:
@terrymandin As discussed, this is related to 2 behaviours as described below. We will log 2 separate issues so each can be described more clearly and followed up accordingly.
Updating properties of Exadata Infra from the Azure side is not supported at the moment, neither via the Azure Portal nor via Terraform AzAPI. However, it can be done via interfaces on the OCI side. We will follow up on this as a feature request.
Due to the above nature, the related metadata (e.g. Azure tagging) has to be updated via an azapi_update_resource call instead of an azapi_resource call. You can find the sample of azapi_update_resource here. We will log a separate issue for supporting Azure tagging to follow up.
| gharchive/issue | 2024-08-27T20:17:44 | 2025-04-01T06:45:12.352204 | {
"authors": [
"chanstev",
"terrymandin"
],
"repo": "oci-landing-zones/terraform-oci-multicloud-azure",
"url": "https://github.com/oci-landing-zones/terraform-oci-multicloud-azure/issues/33",
"license": "UPL-1.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1919650801 | fix: show deprecation warning in help when using deprecated alias
Show standard deprecation warning in help when using a deprecated alias
@W-14208399@
Fixes https://github.com/oclif/core/issues/800
QA: I see the warning at the top of the help. Looks good.
| gharchive/pull-request | 2023-09-29T17:17:36 | 2025-04-01T06:45:12.353892 | {
"authors": [
"mdonnalley",
"mshanemc"
],
"repo": "oclif/core",
"url": "https://github.com/oclif/core/pull/801",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2178625287 | OCTOPUS-625:added playbook roles folder
This is to add the playbook and roles folders for the task which we created.
In your next PR, please fix up:
Error: conflicting action statements: hosts, vars_files
| gharchive/pull-request | 2024-03-11T09:15:45 | 2025-04-01T06:45:12.372976 | {
"authors": [
"pkenchap",
"prb112"
],
"repo": "ocp-power-automation/ocp4-upi-multiarch-compute",
"url": "https://github.com/ocp-power-automation/ocp4-upi-multiarch-compute/pull/15",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
635442582 | Request adding support for proxy environments
A lot of companies have limited access from their environments to the internet. Most of them are using web proxies to allow controlled access from systems.
It would help a lot to have such support in your automation.
Setting a proxy would be needed for:
yum (yum.conf)
RHSM (rhsm.conf)
openshift-installer config file
bastion server environment variables (in RHEL 8 it is done by the file /etc/profile.d/http_proxy.sh )
git command
The no_proxy setting is also important, and the domain used by the user must get added to it. Otherwise the installation procedure might get stuck at a point where it shouldn't go through a proxy, but does so because of the missing no_proxy entry.
Thanks a lot for raising the feature request @torwen1 . We'll take a look into the feature request and get back
| gharchive/issue | 2020-06-09T13:40:41 | 2025-04-01T06:45:12.375712 | {
"authors": [
"bpradipt",
"torwen1"
],
"repo": "ocp-power-automation/ocp4-upi-powervm",
"url": "https://github.com/ocp-power-automation/ocp4-upi-powervm/issues/29",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
29320184 | Unauthenticated login causes error trying to retrieve repositories
Using the new
[OCTUser userWithRawLogin:@"login" server:[OCTServer dotComServer]];
then calling
RACSignal *request = [self.client fetchUserRepositories];
[request subscribeNext:^(OCTRepository *repository) {}]
to pull the repositories will cause an error at
- [OCTClient - (RACSignal *)enqueueUserRequestWithMethod:(NSString *)method relativePath:(NSString *)relativePath parameters:(NSDictionary *)parameters resultClass:(Class)resultClass]
since the line
path = [NSString stringWithFormat:@"users/%@%@", self.user.login, relativePath];
has self.user.login returning nil, while the rawLogin property contains the appropriate username.
Not sure if this is related to this bug or not. But it seems to me that in the OCTClient+Search.m file, braces should be added after "if (orderBy.length > 0)" to include all related calls
- (RACSignal *)searchRepositoriesWithQuery:(NSString *)query orderBy:(NSString *)orderBy ascending:(BOOL)ascending {
NSParameterAssert(query.length > 0);
NSMutableDictionary *parameters = [NSMutableDictionary dictionary];
parameters[@"q"] = query;
if (orderBy.length > 0)
{ // <--- here
parameters[@"sort"] = orderBy;
parameters[@"order"] = ascending ? @"asc" : @"desc";
NSMutableURLRequest *request = [self requestWithMethod:@"GET" path:@"/search/repositories" parameters:parameters notMatchingEtag:nil];
[request addValue:@"application/vnd.github.v3.text-match+json" forHTTPHeaderField:@"Accept"];
} // <--- and here
return [[self enqueueRequest:request resultClass:OCTRepositoriesSearchResult.class fetchAllPages:NO] oct_parsedResults];
}
So anyone know how to actually get OctoKit to work?
Sigh, ended up just using a token and the authenticated client vs unauthenticated one.
| gharchive/issue | 2014-03-13T01:56:03 | 2025-04-01T06:45:12.420381 | {
"authors": [
"NashBean",
"danielgalasko",
"joeljfischer"
],
"repo": "octokit/octokit.objc",
"url": "https://github.com/octokit/octokit.objc/issues/173",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
227133742 | Vue Language Server Request textDocument/hover failed (v0.6.8)
I'm having this error repetitively. Seems like the last updates don't fix it.
[Error - 3:05:16 PM] Request textDocument/hover failed.
Message: Request textDocument/hover failed with message: Cannot read property 'parameters' of undefined
Code: -32603
/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/node_modules/vetur-vls/lib/parser/htmlScanner.js:46
this.len = source.length;
^
TypeError: Cannot read property 'length' of undefined
at new MultiLineStream (/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/node_modules/vetur-vls/lib/parser/htmlScanner.js:46:30)
at Object.createScanner (/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/node_modules/vetur-vls/lib/parser/htmlScanner.js:188:22)
at Object.getDocumentRegions (/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/modes/embeddedSupport.js:9:35)
at /home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/modes/languageModes.js:11:125
at Object.get (/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/languageModelCache.js:30:33)
at /home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/modes/javascriptMode.js:16:43
at Object.get (/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/languageModelCache.js:30:33)
at Object.getScriptKind (/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/modes/javascriptMode.js:74:33)
at Object.getScriptKind (/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/node_modules/typescript/lib/typescript.js:71209:31)
at HostCache.createEntry (/home/***/.vscode/extensions/octref.vetur-0.6.8/client/server/node_modules/typescript/lib/typescript.js:85685:36)
[Info - 3:14:41 PM] Connection to server got closed. Server will restart.
vetur initialized
[Error - 3:24:31 PM] Request textDocument/hover failed.
Message: Request textDocument/hover failed with message: Cannot read property 'parameters' of undefined
Code: -32603
Did you do this?
Ohh! My project is JavaScript only. Do I need to include *.ts anyway?
The file you are opening is under app/**/*.vue, I suppose?
The problem might be you do not have the correct typing dependency installed, and that's causing the signatureHelp failure.
npm list or yarn list output plz.
Same here, vetur v0.6.8 and vscode v1.12.1, do not have tsconfig.json nor jsconfig.json
npm list: https://pastebin.com/raw/8DGMYdPp
This is the output for npm list.
testvue@1.0.0-alpha.2 /home/****/code/testvue
├── animate.css@3.5.2 extraneous
├─┬ autoprefixer@6.7.7
│ ├─┬ browserslist@1.7.7
│ │ └── electron-to-chromium@1.3.8
│ ├── caniuse-db@1.0.30000659
│ ├── normalize-range@0.1.2
│ ├── num2fraction@1.2.2
│ ├─┬ postcss@5.2.17
│ │ ├── js-base64@2.1.9
│ │ ├── source-map@0.5.6
│ │ └── supports-color@3.2.3
│ └── postcss-value-parser@3.3.0
├─┬ axios@0.16.1
│ └── follow-redirects@1.2.3
├─┬ babel-core@6.24.1
│ ├─┬ babel-code-frame@6.22.0
│ │ ├── esutils@2.0.2
│ │ └── js-tokens@3.0.1
│ ├─┬ babel-generator@6.24.1
│ │ ├─┬ detect-indent@4.0.0
│ │ │ └─┬ repeating@2.0.1
│ │ │ └── is-finite@1.0.2
│ │ ├── jsesc@1.3.0
│ │ ├── source-map@0.5.6
│ │ └── trim-right@1.0.1
│ ├── babel-helpers@6.24.1
│ ├── babel-messages@6.23.0
│ ├─┬ babel-runtime@6.23.0
│ │ └── regenerator-runtime@0.10.3
│ ├── babel-template@6.24.1
│ ├── babel-traverse@6.24.1
│ ├─┬ babel-types@6.24.1
│ │ ├── esutils@2.0.2
│ │ └── to-fast-properties@1.0.2
│ ├── babylon@6.17.0
│ ├── convert-source-map@1.5.0
│ ├─┬ debug@2.6.4
│ │ └── ms@0.7.3
│ ├── json5@0.5.1
│ ├── lodash@4.17.4
│ ├─┬ minimatch@3.0.3
│ │ └─┬ brace-expansion@1.1.7
│ │ ├── balanced-match@0.4.2
│ │ └── concat-map@0.0.1
│ ├── path-is-absolute@1.0.1
│ ├── private@0.1.7
│ ├── slash@1.0.0
│ └── source-map@0.5.6
├─┬ babel-loader@6.4.1
│ ├─┬ find-cache-dir@0.1.1
│ │ ├── commondir@1.0.1
│ │ └── pkg-dir@1.0.0
│ ├─┬ loader-utils@0.2.17
│ │ ├── big.js@3.1.3
│ │ └── emojis-list@2.1.0
│ ├─┬ mkdirp@0.5.1
│ │ └── minimist@0.0.8
│ └── object-assign@4.1.1
├── babel-plugin-transform-runtime@6.23.0
├─┬ babel-preset-env@1.4.0
│ ├── babel-plugin-check-es2015-constants@6.22.0
│ ├── babel-plugin-syntax-trailing-function-commas@6.22.0
│ ├─┬ babel-plugin-transform-async-to-generator@6.24.1
│ │ ├── babel-helper-remap-async-to-generator@6.24.1
│ │ └── babel-plugin-syntax-async-functions@6.13.0
│ ├── babel-plugin-transform-es2015-arrow-functions@6.22.0
│ ├── babel-plugin-transform-es2015-block-scoped-functions@6.22.0
│ ├── babel-plugin-transform-es2015-block-scoping@6.24.1
│ ├─┬ babel-plugin-transform-es2015-classes@6.24.1
│ │ ├── babel-helper-define-map@6.24.1
│ │ ├── babel-helper-function-name@6.24.1
│ │ ├── babel-helper-optimise-call-expression@6.24.1
│ │ └── babel-helper-replace-supers@6.24.1
│ ├── babel-plugin-transform-es2015-computed-properties@6.24.1
│ ├── babel-plugin-transform-es2015-destructuring@6.23.0
│ ├── babel-plugin-transform-es2015-duplicate-keys@6.24.1
│ ├── babel-plugin-transform-es2015-for-of@6.23.0
│ ├── babel-plugin-transform-es2015-function-name@6.24.1
│ ├── babel-plugin-transform-es2015-literals@6.22.0
│ ├── babel-plugin-transform-es2015-modules-amd@6.24.1
│ ├─┬ babel-plugin-transform-es2015-modules-commonjs@6.24.1
│ │ └── babel-plugin-transform-strict-mode@6.24.1
│ ├─┬ babel-plugin-transform-es2015-modules-systemjs@6.24.1
│ │ └── babel-helper-hoist-variables@6.24.1
│ ├── babel-plugin-transform-es2015-modules-umd@6.24.1
│ ├── babel-plugin-transform-es2015-object-super@6.24.1
│ ├─┬ babel-plugin-transform-es2015-parameters@6.24.1
│ │ ├── babel-helper-call-delegate@6.24.1
│ │ └── babel-helper-get-function-arity@6.24.1
│ ├── babel-plugin-transform-es2015-shorthand-properties@6.24.1
│ ├── babel-plugin-transform-es2015-spread@6.22.0
│ ├─┬ babel-plugin-transform-es2015-sticky-regex@6.24.1
│ │ └── babel-helper-regex@6.24.1
│ ├── babel-plugin-transform-es2015-template-literals@6.22.0
│ ├── babel-plugin-transform-es2015-typeof-symbol@6.23.0
│ ├─┬ babel-plugin-transform-es2015-unicode-regex@6.24.1
│ │ └─┬ regexpu-core@2.0.0
│ │ ├── regenerate@1.3.2
│ │ ├── regjsgen@0.2.0
│ │ └─┬ regjsparser@0.1.5
│ │ └── jsesc@0.5.0
│ ├─┬ babel-plugin-transform-exponentiation-operator@6.24.1
│ │ ├─┬ babel-helper-builder-binary-assignment-operator-visitor@6.24.1
│ │ │ └── babel-helper-explode-assignable-expression@6.24.1
│ │ └── babel-plugin-syntax-exponentiation-operator@6.13.0
│ ├─┬ babel-plugin-transform-regenerator@6.24.1
│ │ └── regenerator-transform@0.9.11
│ └─┬ invariant@2.2.2
│ └── loose-envify@1.3.1
├─┬ babel-preset-stage-2@6.24.1
│ ├── babel-plugin-syntax-dynamic-import@6.18.0
│ ├─┬ babel-plugin-transform-class-properties@6.24.1
│ │ └── babel-plugin-syntax-class-properties@6.13.0
│ ├─┬ babel-plugin-transform-decorators@6.24.1
│ │ ├─┬ babel-helper-explode-class@6.24.1
│ │ │ └── babel-helper-bindify-decorators@6.24.1
│ │ └── babel-plugin-syntax-decorators@6.13.0
│ └─┬ babel-preset-stage-3@6.24.1
│ ├─┬ babel-plugin-transform-async-generator-functions@6.24.1
│ │ └── babel-plugin-syntax-async-generators@6.13.0
│ └─┬ babel-plugin-transform-object-rest-spread@6.23.0
│ └── babel-plugin-syntax-object-rest-spread@6.13.0
├─┬ babel-register@6.24.1
│ ├── core-js@2.4.1
│ ├─┬ home-or-tmp@2.0.0
│ │ ├── os-homedir@1.0.2
│ │ └── os-tmpdir@1.0.2
│ └─┬ source-map-support@0.4.14
│ └── source-map@0.5.6
├─┬ chalk@1.1.3
│ ├── ansi-styles@2.2.1
│ ├── escape-string-regexp@1.0.5
│ ├─┬ has-ansi@2.0.0
│ │ └── ansi-regex@2.1.1
│ ├── strip-ansi@3.0.1
│ └── supports-color@2.0.0
├── connect-history-api-fallback@1.3.0
├─┬ copy-webpack-plugin@4.0.1
│ ├── bluebird@2.11.0
│ ├─┬ fs-extra@0.26.7
│ │ ├── graceful-fs@4.1.11
│ │ ├── jsonfile@2.4.0
│ │ └── klaw@1.3.1
│ ├─┬ glob@6.0.4
│ │ ├─┬ inflight@1.0.6
│ │ │ └── wrappy@1.0.2
│ │ ├── inherits@2.0.3
│ │ └── once@1.4.0
│ ├─┬ is-glob@3.1.0
│ │ └── is-extglob@2.1.1
│ └── node-dir@0.1.16
├─┬ css-loader@0.28.0
│ ├─┬ css-selector-tokenizer@0.7.0
│ │ ├── cssesc@0.1.0
│ │ ├── fastparse@1.1.1
│ │ └── regexpu-core@1.0.0
│ ├─┬ cssnano@3.10.0
│ │ ├── decamelize@1.2.0
│ │ ├── defined@1.0.0
│ │ ├─┬ postcss-calc@5.3.1
│ │ │ ├── postcss-message-helpers@2.0.0
│ │ │ └─┬ reduce-css-calc@1.3.0
│ │ │ ├── math-expression-evaluator@1.2.16
│ │ │ └── reduce-function-call@1.0.2
│ │ ├─┬ postcss-colormin@2.2.2
│ │ │ └─┬ colormin@1.1.2
│ │ │ ├─┬ color@0.11.4
│ │ │ │ ├── clone@1.0.2
│ │ │ │ ├─┬ color-convert@1.9.0
│ │ │ │ │ └── color-name@1.1.2
│ │ │ │ └── color-string@0.3.0
│ │ │ └── css-color-names@0.0.4
│ │ ├── postcss-convert-values@2.6.1
│ │ ├── postcss-discard-comments@2.0.4
│ │ ├── postcss-discard-duplicates@2.1.0
│ │ ├── postcss-discard-empty@2.1.0
│ │ ├── postcss-discard-overridden@0.1.1
│ │ ├─┬ postcss-discard-unused@2.2.3
│ │ │ └── uniqs@2.0.0
│ │ ├─┬ postcss-filter-plugins@2.0.2
│ │ │ └─┬ uniqid@4.1.1
│ │ │ └── macaddress@0.2.8
│ │ ├── postcss-merge-idents@2.1.7
│ │ ├── postcss-merge-longhand@2.0.2
│ │ ├─┬ postcss-merge-rules@2.1.2
│ │ │ ├─┬ caniuse-api@1.6.1
│ │ │ │ ├── lodash.memoize@4.1.2
│ │ │ │ └── lodash.uniq@4.5.0
│ │ │ └── vendors@1.0.1
│ │ ├── postcss-minify-font-values@1.0.5
│ │ ├── postcss-minify-gradients@1.0.5
│ │ ├─┬ postcss-minify-params@1.2.2
│ │ │ └── alphanum-sort@1.0.2
│ │ ├── postcss-minify-selectors@2.1.1
│ │ ├── postcss-normalize-charset@1.1.1
│ │ ├─┬ postcss-normalize-url@3.0.8
│ │ │ ├── is-absolute-url@2.1.0
│ │ │ └─┬ normalize-url@1.9.1
│ │ │ ├── prepend-http@1.0.4
│ │ │ ├─┬ query-string@4.3.4
│ │ │ │ └── strict-uri-encode@1.1.0
│ │ │ └─┬ sort-keys@1.1.2
│ │ │ └── is-plain-obj@1.1.0
│ │ ├── postcss-ordered-values@2.2.3
│ │ ├── postcss-reduce-idents@2.4.0
│ │ ├── postcss-reduce-initial@1.0.1
│ │ ├── postcss-reduce-transforms@1.0.4
│ │ ├─┬ postcss-svgo@2.1.6
│ │ │ ├─┬ is-svg@2.1.0
│ │ │ │ └── html-comment-regex@1.1.1
│ │ │ └─┬ svgo@0.7.2
│ │ │ ├─┬ coa@1.0.1
│ │ │ │ └── q@1.5.0
│ │ │ ├── colors@1.1.2
│ │ │ ├─┬ csso@2.3.2
│ │ │ │ ├── clap@1.1.3
│ │ │ │ └── source-map@0.5.6
│ │ │ ├── sax@1.2.2
│ │ │ └── whet.extend@0.9.9
│ │ ├── postcss-unique-selectors@2.0.2
│ │ └── postcss-zindex@2.2.0
│ ├── loader-utils@1.1.0
│ ├── lodash.camelcase@4.3.0
│ ├── postcss-modules-extract-imports@1.0.1
│ ├─┬ postcss-modules-local-by-default@1.1.1
│ │ └─┬ css-selector-tokenizer@0.6.0
│ │ └── regexpu-core@1.0.0
│ ├─┬ postcss-modules-scope@1.0.2
│ │ └─┬ css-selector-tokenizer@0.6.0
│ │ └── regexpu-core@1.0.0
│ ├─┬ postcss-modules-values@1.2.2
│ │ └── icss-replace-symbols@1.0.2
│ └── source-list-map@0.1.8
├─┬ d3-geo-projection@2.1.0
│ ├─┬ commander@2.9.0
│ │ └── graceful-readlink@1.0.1
│ ├── d3-array@1.2.0
│ └── d3-geo@1.6.3
├─┬ eslint@3.19.0
│ ├─┬ concat-stream@1.6.0
│ │ └── typedarray@0.0.6
│ ├─┬ doctrine@2.0.0
│ │ ├── esutils@2.0.2
│ │ └── isarray@1.0.0
│ ├─┬ escope@3.6.0
│ │ ├─┬ es6-map@0.1.5
│ │ │ ├── d@1.0.0
│ │ │ ├── es5-ext@0.10.15
│ │ │ ├── es6-iterator@2.0.1
│ │ │ ├── es6-set@0.1.5
│ │ │ ├── es6-symbol@3.1.1
│ │ │ └── event-emitter@0.3.5
│ │ ├── es6-weak-map@2.0.2
│ │ ├─┬ esrecurse@4.1.0
│ │ │ └── estraverse@4.1.1
│ │ └── estraverse@4.2.0
│ ├─┬ espree@3.4.2
│ │ ├── acorn@5.0.3
│ │ └─┬ acorn-jsx@3.0.1
│ │ └── acorn@3.3.0
│ ├─┬ esquery@1.0.0
│ │ └── estraverse@4.2.0
│ ├── estraverse@4.2.0
│ ├── esutils@2.0.2
│ ├─┬ file-entry-cache@2.0.0
│ │ └─┬ flat-cache@1.2.2
│ │ ├── circular-json@0.3.1
│ │ ├─┬ del@2.2.2
│ │ │ ├─┬ globby@5.0.0
│ │ │ │ ├─┬ array-union@1.0.2
│ │ │ │ │ └── array-uniq@1.0.3
│ │ │ │ └── glob@7.1.1
│ │ │ ├── is-path-cwd@1.0.0
│ │ │ └─┬ is-path-in-cwd@1.0.0
│ │ │ └── is-path-inside@1.0.0
│ │ └── write@0.2.1
│ ├─┬ glob@7.1.1
│ │ └── fs.realpath@1.0.0
│ ├── globals@9.17.0
│ ├── ignore@3.2.7
│ ├── imurmurhash@0.1.4
│ ├─┬ inquirer@0.12.0
│ │ ├── ansi-escapes@1.4.0
│ │ ├─┬ cli-cursor@1.0.2
│ │ │ └─┬ restore-cursor@1.0.1
│ │ │ ├── exit-hook@1.1.1
│ │ │ └── onetime@1.1.0
│ │ ├── cli-width@2.1.0
│ │ ├── figures@1.7.0
│ │ ├─┬ readline2@1.0.1
│ │ │ ├── code-point-at@1.1.0
│ │ │ ├─┬ is-fullwidth-code-point@1.0.0
│ │ │ │ └── number-is-nan@1.0.1
│ │ │ └── mute-stream@0.0.5
│ │ ├── run-async@0.1.0
│ │ ├── rx-lite@3.1.2
│ │ ├── string-width@1.0.2
│ │ └── through@2.3.8
│ ├─┬ is-my-json-valid@2.16.0
│ │ ├── generate-function@2.0.0
│ │ ├─┬ generate-object-property@1.2.0
│ │ │ └── is-property@1.0.2
│ │ └── jsonpointer@4.0.1
│ ├─┬ is-resolvable@1.0.0
│ │ └── tryit@1.0.3
│ ├─┬ js-yaml@3.7.0
│ │ ├─┬ argparse@1.0.9
│ │ │ └── sprintf-js@1.0.3
│ │ └── esprima@2.7.3
│ ├─┬ json-stable-stringify@1.0.1
│ │ └── jsonify@0.0.0
│ ├─┬ levn@0.3.0
│ │ ├── prelude-ls@1.1.2
│ │ └── type-check@0.3.2
│ ├── natural-compare@1.4.0
│ ├─┬ optionator@0.8.2
│ │ ├── deep-is@0.1.3
│ │ ├── fast-levenshtein@2.0.6
│ │ └── wordwrap@1.0.0
│ ├── path-is-inside@1.0.2
│ ├── pluralize@1.2.1
│ ├── progress@1.1.8
│ ├─┬ require-uncached@1.0.3
│ │ ├─┬ caller-path@0.1.0
│ │ │ └── callsites@0.2.0
│ │ └── resolve-from@1.0.1
│ ├── strip-bom@3.0.0
│ ├── strip-json-comments@2.0.1
│ ├─┬ table@3.8.3
│ │ ├── slice-ansi@0.0.4
│ │ └─┬ string-width@2.0.0
│ │ └── is-fullwidth-code-point@2.0.0
│ ├── text-table@0.2.0
│ └── user-home@2.0.0
├── eslint-plugin-async-await@0.0.0
├─┬ eslint-plugin-html@2.0.1
│ └─┬ htmlparser2@3.9.2
│ ├── domelementtype@1.3.0
│ ├── domhandler@2.3.0
│ ├─┬ domutils@1.5.1
│ │ └─┬ dom-serializer@0.1.0
│ │ └── domelementtype@1.1.3
│ └── entities@1.1.1
├─┬ eslint-plugin-import@2.2.0
│ ├── builtin-modules@1.1.1
│ ├── contains-path@0.1.0
│ ├─┬ doctrine@1.5.0
│ │ └── esutils@2.0.2
│ ├─┬ eslint-import-resolver-node@0.2.3
│ │ └─┬ resolve@1.3.3
│ │ └── path-parse@1.0.5
│ ├─┬ eslint-module-utils@2.0.0
│ │ └─┬ debug@2.2.0
│ │ └── ms@0.7.1
│ ├─┬ has@1.0.1
│ │ └── function-bind@1.1.0
│ ├── lodash.cond@4.5.2
│ └─┬ pkg-up@1.0.0
│ └─┬ find-up@1.1.2
│ └── path-exists@2.1.0
├── eventsource-polyfill@0.9.6
├─┬ express@4.15.2
│ ├─┬ accepts@1.3.3
│ │ ├─┬ mime-types@2.1.15
│ │ │ └── mime-db@1.27.0
│ │ └── negotiator@0.6.1
│ ├── array-flatten@1.1.1
│ ├── content-disposition@0.5.2
│ ├── content-type@1.0.2
│ ├── cookie@0.3.1
│ ├── cookie-signature@1.0.6
│ ├─┬ debug@2.6.1
│ │ └── ms@0.7.2
│ ├── depd@1.1.0
│ ├── encodeurl@1.0.1
│ ├── escape-html@1.0.3
│ ├── etag@1.8.0
│ ├─┬ finalhandler@1.0.2
│ │ └── unpipe@1.0.0
│ ├── fresh@0.5.0
│ ├── merge-descriptors@1.0.1
│ ├── methods@1.1.2
│ ├─┬ on-finished@2.3.0
│ │ └── ee-first@1.1.1
│ ├── parseurl@1.3.1
│ ├── path-to-regexp@0.1.7
│ ├─┬ proxy-addr@1.1.4
│ │ ├── forwarded@0.1.0
│ │ └── ipaddr.js@1.3.0
│ ├── qs@6.4.0
│ ├── range-parser@1.2.0
│ ├─┬ send@0.15.1
│ │ ├── debug@2.6.1
│ │ ├── destroy@1.0.4
│ │ ├── http-errors@1.6.1
│ │ └── ms@0.7.2
│ ├── serve-static@1.12.1
│ ├── setprototypeof@1.0.3
│ ├── statuses@1.3.1
│ ├─┬ type-is@1.6.15
│ │ └── media-typer@0.3.0
│ ├── utils-merge@1.0.0
│ └── vary@1.1.1
├─┬ extract-text-webpack-plugin@2.1.0
│ ├─┬ ajv@4.11.7
│ │ └── co@4.6.0
│ ├── async@2.3.0
│ ├── loader-utils@1.1.0
│ └─┬ webpack-sources@0.1.5
│ └── source-map@0.5.6
├─┬ file-loader@0.11.1
│ └── loader-utils@1.1.0
├─┬ foundation-sites@6.3.1
│ └── what-input@4.1.1
├─┬ friendly-errors-webpack-plugin@1.6.1
│ ├─┬ error-stack-parser@2.0.0
│ │ └── stackframe@1.0.2
│ └── string-length@1.0.1
├── fuse.js@3.0.0
├─┬ html-webpack-plugin@2.28.0
│ ├── bluebird@3.5.0
│ ├─┬ html-minifier@3.4.3
│ │ ├─┬ camel-case@3.0.0
│ │ │ ├─┬ no-case@2.3.1
│ │ │ │ └── lower-case@1.1.4
│ │ │ └── upper-case@1.1.3
│ │ ├─┬ clean-css@4.0.12
│ │ │ └── source-map@0.5.6
│ │ ├─┬ ncname@1.0.0
│ │ │ └── xml-char-classes@1.0.0
│ │ ├── param-case@2.1.1
│ │ └── relateurl@0.2.7
│ ├─┬ pretty-error@2.1.0
│ │ ├─┬ renderkid@2.0.1
│ │ │ ├─┬ css-select@1.2.0
│ │ │ │ ├── boolbase@1.0.0
│ │ │ │ ├── css-what@2.1.0
│ │ │ │ └── nth-check@1.0.1
│ │ │ ├─┬ dom-converter@0.1.4
│ │ │ │ └── utila@0.3.3
│ │ │ ├─┬ htmlparser2@3.3.0
│ │ │ │ ├── domhandler@2.1.0
│ │ │ │ ├── domutils@1.1.6
│ │ │ │ └─┬ readable-stream@1.0.34
│ │ │ │ ├── isarray@0.0.1
│ │ │ │ └── string_decoder@0.10.31
│ │ │ └── utila@0.3.3
│ │ └── utila@0.4.0
│ └── toposort@1.0.3
├─┬ http-proxy-middleware@0.17.4
│ ├─┬ http-proxy@1.16.2
│ │ ├── eventemitter3@1.2.0
│ │ └── requires-port@1.0.0
│ └─┬ micromatch@2.3.11
│ ├─┬ arr-diff@2.0.0
│ │ └── arr-flatten@1.0.3
│ ├── array-unique@0.2.1
│ ├─┬ braces@1.8.5
│ │ ├─┬ expand-range@1.8.2
│ │ │ └─┬ fill-range@2.2.3
│ │ │ ├── is-number@2.1.0
│ │ │ ├── isobject@2.1.0
│ │ │ ├── randomatic@1.1.6
│ │ │ └── repeat-string@1.6.1
│ │ ├── preserve@0.2.0
│ │ └── repeat-element@1.1.2
│ ├─┬ expand-brackets@0.1.5
│ │ └── is-posix-bracket@0.1.1
│ ├─┬ extglob@0.3.2
│ │ └── is-extglob@1.0.0
│ ├── filename-regex@2.0.0
│ ├── is-extglob@1.0.0
│ ├── is-glob@2.0.1
│ ├─┬ kind-of@3.1.0
│ │ └── is-buffer@1.1.5
│ ├─┬ normalize-path@2.1.1
│ │ └── remove-trailing-separator@1.0.1
│ ├─┬ object.omit@2.0.1
│ │ └── is-extendable@0.1.1
│ ├─┬ parse-glob@3.0.4
│ │ ├─┬ glob-base@0.3.0
│ │ │ └─┬ is-glob@2.0.1
│ │ │ └── is-extglob@1.0.0
│ │ ├── is-dotfile@1.0.2
│ │ ├── is-extglob@1.0.0
│ │ └── is-glob@2.0.1
│ └─┬ regex-cache@0.4.3
│ ├── is-equal-shallow@0.1.3
│ └── is-primitive@2.0.0
├── install@0.8.8 extraneous
├── jquery@3.2.1
├─┬ mapbox-gl@0.36.0
│ ├── @mapbox/gl-matrix@0.0.1
│ ├── @mapbox/shelf-pack@3.0.0
│ ├── @mapbox/unitbezier@0.0.0
│ ├── @mapbox/whoots-js@3.0.0
│ ├─┬ brfs@1.4.3
│ │ ├─┬ quote-stream@1.0.2
│ │ │ └── buffer-equal@0.0.1
│ │ └─┬ static-module@1.3.1
│ │ ├─┬ duplexer2@0.0.2
│ │ │ └─┬ readable-stream@1.1.14
│ │ │ ├── isarray@0.0.1
│ │ │ └── string_decoder@0.10.31
│ │ ├─┬ escodegen@1.3.3
│ │ │ ├── esprima@1.1.1
│ │ │ ├── estraverse@1.5.1
│ │ │ └── esutils@1.0.0
│ │ ├─┬ falafel@1.2.0
│ │ │ ├── acorn@1.2.2
│ │ │ ├── foreach@2.0.5
│ │ │ ├── isarray@0.0.1
│ │ │ └── object-keys@1.0.11
│ │ ├── object-inspect@0.4.0
│ │ ├─┬ quote-stream@0.0.0
│ │ │ └── minimist@0.0.8
│ │ ├─┬ readable-stream@1.0.34
│ │ │ ├── isarray@0.0.1
│ │ │ └── string_decoder@0.10.31
│ │ ├── shallow-copy@0.0.1
│ │ ├─┬ static-eval@0.2.4
│ │ │ └─┬ escodegen@0.0.28
│ │ │ ├── esprima@1.0.4
│ │ │ └── estraverse@1.3.2
│ │ └─┬ through2@0.4.2
│ │ └─┬ xtend@2.1.2
│ │ └── object-keys@0.4.0
│ ├─┬ bubleify@0.7.0
│ │ └─┬ buble@0.15.2
│ │ ├── acorn@3.3.0
│ │ ├─┬ acorn-object-spread@1.0.0
│ │ │ └── acorn@3.3.0
│ │ └── magic-string@0.14.0
│ ├── earcut@2.1.1
│ ├─┬ geojson-rewind@0.1.0
│ │ ├─┬ concat-stream@1.2.1
│ │ │ └─┬ bops@0.0.6
│ │ │ ├── base64-js@0.0.2
│ │ │ └── to-utf8@0.0.1
│ │ ├─┬ geojson-area@0.1.0
│ │ │ └── wgs84@0.0.0
│ │ └── minimist@0.0.5
│ ├── geojson-vt@2.4.0
│ ├── grid-index@1.0.0
│ ├── mapbox-gl-supported@1.2.0
│ ├─┬ package-json-versionify@1.0.4
│ │ └── browserify-package-json@1.0.1
│ ├─┬ pbf@1.3.7
│ │ ├── ieee754@1.1.8
│ │ └─┬ resolve-protobuf-schema@2.0.0
│ │ └── protocol-buffers-schema@2.2.0
│ ├── point-geometry@0.0.0
│ ├── quickselect@1.0.0
│ ├─┬ supercluster@2.3.0
│ │ └── kdbush@1.0.1
│ ├── tinyqueue@1.2.2
│ ├─┬ unassertify@2.0.4
│ │ ├── acorn@4.0.11
│ │ ├─┬ escodegen@1.8.1
│ │ │ ├── esprima@2.7.3
│ │ │ ├── estraverse@1.9.3
│ │ │ ├── esutils@2.0.2
│ │ │ └── source-map@0.2.0
│ │ ├─┬ multi-stage-sourcemap@0.2.1
│ │ │ └─┬ source-map@0.1.43
│ │ │ └── amdefine@1.0.1
│ │ └─┬ unassert@1.5.1
│ │ ├── acorn@4.0.11
│ │ ├─┬ call-matcher@1.0.1
│ │ │ └── estraverse@4.2.0
│ │ ├── deep-equal@1.0.1
│ │ ├── espurify@1.7.0
│ │ ├── estraverse@4.2.0
│ │ └── esutils@2.0.2
│ ├─┬ unflowify@1.0.1
│ │ └─┬ flow-remove-types@1.2.0
│ │ └── vlq@0.2.2
│ ├── vector-tile@1.3.0
│ ├── vt-pbf@2.1.2
│ └── webworkify@1.4.0
├── mezr@0.6.1
├─┬ node-sass@4.5.2
│ ├── async-foreach@0.1.3
│ ├─┬ cross-spawn@3.0.1
│ │ └─┬ which@1.2.14
│ │ └── isexe@2.0.0
│ ├── gaze@1.1.2
│ ├── get-stdin@4.0.1
│ ├── glob@7.1.1
│ ├── in-publish@2.0.0
│ ├── lodash.assign@4.2.0
│ ├── lodash.clonedeep@4.5.0
│ ├── lodash.mergewith@4.6.0
│ ├─┬ meow@3.7.0
│ │ ├─┬ camelcase-keys@2.1.0
│ │ │ └── camelcase@2.1.1
│ │ ├─┬ loud-rejection@1.6.0
│ │ │ ├─┬ currently-unhandled@0.4.1
│ │ │ │ └── array-find-index@1.0.2
│ │ │ └── signal-exit@3.0.2
│ │ ├── map-obj@1.0.1
│ │ ├── minimist@1.2.0
│ │ ├─┬ normalize-package-data@2.3.8
│ │ │ ├── hosted-git-info@2.4.2
│ │ │ ├── is-builtin-module@1.0.0
│ │ │ └─┬ validate-npm-package-license@3.0.1
│ │ │ ├─┬ spdx-correct@1.0.2
│ │ │ │ └── spdx-license-ids@1.2.2
│ │ │ └── spdx-expression-parse@1.0.4
│ │ ├─┬ read-pkg-up@1.0.1
│ │ │ └─┬ read-pkg@1.1.0
│ │ │ ├─┬ load-json-file@1.1.0
│ │ │ │ └─┬ strip-bom@2.0.0
│ │ │ │ └── is-utf8@0.2.1
│ │ │ └── path-type@1.1.0
│ │ ├─┬ redent@1.0.0
│ │ │ ├── indent-string@2.1.0
│ │ │ └── strip-indent@1.0.1
│ │ └── trim-newlines@1.0.0
│ ├── nan@2.6.2
│ ├─┬ node-gyp@3.6.1
│ │ ├── fstream@1.0.11
│ │ ├── glob@7.1.1
│ │ ├─┬ nopt@3.0.6
│ │ │ └── abbrev@1.1.0
│ │ ├── osenv@0.1.4
│ │ └─┬ tar@2.2.1
│ │ └── block-stream@0.0.9
│ ├─┬ npmlog@4.0.2
│ │ ├─┬ are-we-there-yet@1.1.4
│ │ │ └── delegates@1.0.0
│ │ ├── console-control-strings@1.1.0
│ │ ├─┬ gauge@2.7.4
│ │ │ ├── aproba@1.1.1
│ │ │ ├── has-unicode@2.0.1
│ │ │ └── wide-align@1.1.0
│ │ └── set-blocking@2.0.0
│ ├─┬ request@2.81.0
│ │ ├── aws-sign2@0.6.0
│ │ ├── aws4@1.6.0
│ │ ├── caseless@0.12.0
│ │ ├─┬ combined-stream@1.0.5
│ │ │ └── delayed-stream@1.0.0
│ │ ├── extend@3.0.1
│ │ ├── forever-agent@0.6.1
│ │ ├─┬ form-data@2.1.4
│ │ │ └── asynckit@0.4.0
│ │ ├─┬ har-validator@4.2.1
│ │ │ └── har-schema@1.0.5
│ │ ├─┬ hawk@3.1.3
│ │ │ ├── boom@2.10.1
│ │ │ ├── cryptiles@2.0.5
│ │ │ ├── hoek@2.16.3
│ │ │ └── sntp@1.0.9
│ │ ├─┬ http-signature@1.1.1
│ │ │ ├── assert-plus@0.2.0
│ │ │ ├─┬ jsprim@1.4.0
│ │ │ │ ├── assert-plus@1.0.0
│ │ │ │ ├── extsprintf@1.0.2
│ │ │ │ ├── json-schema@0.2.3
│ │ │ │ └── verror@1.3.6
│ │ │ └─┬ sshpk@1.13.0
│ │ │ ├── asn1@0.2.3
│ │ │ ├── assert-plus@1.0.0
│ │ │ ├── bcrypt-pbkdf@1.0.1
│ │ │ ├─┬ dashdash@1.14.1
│ │ │ │ └── assert-plus@1.0.0
│ │ │ ├── ecc-jsbn@0.1.1
│ │ │ ├─┬ getpass@0.1.7
│ │ │ │ └── assert-plus@1.0.0
│ │ │ ├── jodid25519@1.0.2
│ │ │ ├── jsbn@0.1.1
│ │ │ └── tweetnacl@0.14.5
│ │ ├── is-typedarray@1.0.0
│ │ ├── isstream@0.1.2
│ │ ├── json-stringify-safe@5.0.1
│ │ ├── oauth-sign@0.8.2
│ │ ├── performance-now@0.2.0
│ │ ├── safe-buffer@5.0.1
│ │ ├── stringstream@0.0.5
│ │ ├── tough-cookie@2.3.2
│ │ ├── tunnel-agent@0.6.0
│ │ └── uuid@3.0.1
│ ├─┬ sass-graph@2.2.2
│ │ ├── glob@7.1.1
│ │ ├─┬ scss-tokenizer@0.2.1
│ │ │ └── source-map@0.4.4
│ │ └─┬ yargs@6.6.0
│ │ ├── camelcase@3.0.0
│ │ └── cliui@3.2.0
│ └── stdout-stream@1.4.0
├── npm@4.5.0 extraneous
├─┬ opn@4.0.2
│ └─┬ pinkie-promise@2.0.1
│ └── pinkie@2.0.4
├─┬ optimize-css-assets-webpack-plugin@1.3.1
│ └── underscore@1.8.3
├─┬ ora@1.2.0
│ ├─┬ cli-cursor@2.1.0
│ │ └─┬ restore-cursor@2.0.0
│ │ └─┬ onetime@2.0.1
│ │ └── mimic-fn@1.1.0
│ ├── cli-spinners@1.0.0
│ └── log-symbols@1.0.2
├── polylabel@1.0.2
├── pyrsmk-w@1.7.0
├─┬ rimraf@2.6.1
│ └── glob@7.1.1
├─┬ sass-lint@1.10.2
│ ├─┬ eslint@2.13.1
│ │ ├─┬ concat-stream@1.4.10
│ │ │ └─┬ readable-stream@1.1.14
│ │ │ ├── isarray@0.0.1
│ │ │ └── string_decoder@0.10.31
│ │ ├── doctrine@1.5.0
│ │ ├── estraverse@4.2.0
│ │ ├── esutils@2.0.2
│ │ ├── file-entry-cache@1.3.1
│ │ ├── shelljs@0.6.1
│ │ └── strip-json-comments@1.0.4
│ ├── front-matter@2.1.0
│ ├── fs-extra@1.0.0
│ ├── glob@7.1.1
│ ├─┬ globule@1.1.0
│ │ ├── glob@7.1.1
│ │ └── lodash@4.16.6
│ ├─┬ gonzales-pe@3.4.7
│ │ └── minimist@1.1.3
│ ├── lodash.capitalize@4.2.1
│ ├── lodash.kebabcase@4.1.1
│ ├── merge@1.2.0
│ └─┬ util@0.10.3
│ └── inherits@2.0.1
├─┬ sass-loader@6.0.3
│ ├─┬ clone-deep@0.2.4
│ │ ├─┬ for-own@0.1.5
│ │ │ └── for-in@1.0.2
│ │ ├─┬ is-plain-object@2.0.1
│ │ │ └── isobject@1.0.2
│ │ ├── lazy-cache@1.0.4
│ │ └─┬ shallow-clone@0.1.2
│ │ ├── kind-of@2.0.1
│ │ ├── lazy-cache@0.2.7
│ │ └─┬ mixin-object@2.0.1
│ │ └── for-in@0.1.8
│ ├── loader-utils@1.1.0
│ ├── lodash.tail@4.1.1
│ └── pify@2.3.0
├── semver@5.3.0
├─┬ shapefile@0.6.2
│ ├── array-source@0.0.3
│ ├─┬ path-source@0.1.2
│ │ └── file-source@0.6.1
│ ├── slice-source@0.4.1
│ ├── stream-source@0.3.4
│ └── text-encoding@0.6.1
├─┬ shelljs@0.7.7
│ ├── glob@7.1.1
│ ├── interpret@1.0.3
│ └── rechoir@0.6.2
├─┬ through2@2.0.3
│ ├─┬ readable-stream@2.2.9
│ │ ├── buffer-shims@1.0.0
│ │ ├── core-util-is@1.0.2
│ │ ├── process-nextick-args@1.0.7
│ │ ├── string_decoder@1.0.0
│ │ └── util-deprecate@1.0.2
│ └── xtend@4.0.1
├── topojson-client@3.0.0
├── topojson-server@3.0.0
├── traverse@0.6.6
├── tween.js@16.6.0
├─┬ url-loader@0.5.8
│ ├── loader-utils@1.1.0
│ └── mime@1.3.4
├── vue@2.2.6
├─┬ vue-loader@11.3.4
│ ├─┬ consolidate@0.14.5
│ │ └── bluebird@3.5.0
│ ├── hash-sum@1.0.2
│ ├─┬ js-beautify@1.6.12
│ │ ├─┬ config-chain@1.1.11
│ │ │ ├── ini@1.3.4
│ │ │ └── proto-list@1.2.4
│ │ └─┬ editorconfig@0.13.2
│ │ ├── bluebird@3.5.0
│ │ ├── lru-cache@3.2.0
│ │ └── sigmund@1.0.1
│ ├── loader-utils@1.1.0
│ ├─┬ lru-cache@4.0.2
│ │ ├── pseudomap@1.0.2
│ │ └── yallist@2.1.2
│ ├─┬ postcss-load-config@1.2.0
│ │ ├─┬ cosmiconfig@2.1.1
│ │ │ ├─┬ parse-json@2.2.0
│ │ │ │ └─┬ error-ex@1.3.1
│ │ │ │ └── is-arrayish@0.2.1
│ │ │ └── require-from-string@1.2.1
│ │ ├── postcss-load-options@1.2.0
│ │ └── postcss-load-plugins@2.3.0
│ ├─┬ postcss-selector-parser@2.2.3
│ │ ├── flatten@1.0.2
│ │ ├── indexes-of@1.0.1
│ │ └── uniq@1.0.1
│ ├── source-map@0.5.6
│ ├── vue-hot-reload-api@2.1.0
│ └── vue-template-es2015-compiler@1.5.2
├── vue-material@0.7.1
├── vue-router@2.5.1
├─┬ vue-style-loader@2.0.5
│ └── loader-utils@1.1.0
├─┬ vue-template-compiler@2.2.6
│ ├── de-indent@1.0.2
│ └── he@1.1.1
├── vuex@2.3.1
├─┬ webpack@2.4.1
│ ├── acorn@5.0.3
│ ├─┬ acorn-dynamic-import@2.0.2
│ │ └── acorn@4.0.11
│ ├── ajv-keywords@1.5.1
│ ├── enhanced-resolve@3.1.0
│ ├── json-loader@0.5.4
│ ├── loader-runner@2.3.0
│ ├─┬ memory-fs@0.4.1
│ │ └─┬ errno@0.1.4
│ │ └── prr@0.0.0
│ ├─┬ node-libs-browser@2.0.0
│ │ ├── assert@1.4.1
│ │ ├─┬ browserify-zlib@0.1.4
│ │ │ └── pako@0.2.9
│ │ ├─┬ buffer@4.9.1
│ │ │ └── base64-js@1.2.0
│ │ ├─┬ console-browserify@1.1.0
│ │ │ └── date-now@0.1.4
│ │ ├── constants-browserify@1.0.0
│ │ ├─┬ crypto-browserify@3.11.0
│ │ │ ├─┬ browserify-cipher@1.0.0
│ │ │ │ ├─┬ browserify-aes@1.0.6
│ │ │ │ │ └── buffer-xor@1.0.3
│ │ │ │ ├─┬ browserify-des@1.0.0
│ │ │ │ │ └── des.js@1.0.0
│ │ │ │ └── evp_bytestokey@1.0.0
│ │ │ ├─┬ browserify-sign@4.0.4
│ │ │ │ ├── bn.js@4.11.6
│ │ │ │ ├── browserify-rsa@4.0.1
│ │ │ │ ├─┬ elliptic@6.4.0
│ │ │ │ │ ├── brorand@1.1.0
│ │ │ │ │ ├── hash.js@1.0.3
│ │ │ │ │ ├── hmac-drbg@1.0.1
│ │ │ │ │ ├── minimalistic-assert@1.0.0
│ │ │ │ │ └── minimalistic-crypto-utils@1.0.1
│ │ │ │ └─┬ parse-asn1@5.1.0
│ │ │ │ └── asn1.js@4.9.1
│ │ │ ├── create-ecdh@4.0.0
│ │ │ ├─┬ create-hash@1.1.2
│ │ │ │ ├── cipher-base@1.0.3
│ │ │ │ ├── ripemd160@1.0.1
│ │ │ │ └── sha.js@2.4.8
│ │ │ ├── create-hmac@1.1.4
│ │ │ ├─┬ diffie-hellman@5.0.2
│ │ │ │ └── miller-rabin@4.0.0
│ │ │ ├── pbkdf2@3.0.9
│ │ │ ├── public-encrypt@4.0.0
│ │ │ └── randombytes@2.0.3
│ │ ├── domain-browser@1.1.7
│ │ ├── events@1.1.1
│ │ ├── https-browserify@0.0.1
│ │ ├── os-browserify@0.2.1
│ │ ├── path-browserify@0.0.0
│ │ ├── process@0.11.9
│ │ ├── punycode@1.4.1
│ │ ├── querystring-es3@0.2.1
│ │ ├── stream-browserify@2.0.1
│ │ ├─┬ stream-http@2.7.0
│ │ │ ├── builtin-status-codes@3.0.0
│ │ │ └── to-arraybuffer@1.0.1
│ │ ├── string_decoder@0.10.31
│ │ ├─┬ timers-browserify@2.0.2
│ │ │ └── setimmediate@1.0.5
│ │ ├── tty-browserify@0.0.0
│ │ ├─┬ url@0.11.0
│ │ │ └── punycode@1.3.2
│ │ └─┬ vm-browserify@0.0.4
│ │ └── indexof@0.0.1
│ ├── source-map@0.5.6
│ ├─┬ supports-color@3.2.3
│ │ └── has-flag@1.0.0
│ ├── tapable@0.2.6
│ ├─┬ uglify-js@2.8.22
│ │ ├── source-map@0.5.6
│ │ ├── uglify-to-browserify@1.0.2
│ │ └─┬ yargs@3.10.0
│ │ ├── camelcase@1.2.1
│ │ ├─┬ cliui@2.1.0
│ │ │ ├─┬ center-align@0.1.3
│ │ │ │ └─┬ align-text@0.1.4
│ │ │ │ └── longest@1.0.1
│ │ │ ├── right-align@0.1.3
│ │ │ └── wordwrap@0.0.2
│ │ └── window-size@0.1.0
│ ├─┬ watchpack@1.3.1
│ │ └─┬ chokidar@1.6.1
│ │ ├─┬ anymatch@1.3.0
│ │ │ └── arrify@1.0.1
│ │ ├── async-each@1.0.1
│ │ ├── UNMET OPTIONAL DEPENDENCY fsevents@^1.0.0
│ │ ├─┬ glob-parent@2.0.0
│ │ │ └─┬ is-glob@2.0.1
│ │ │ └── is-extglob@1.0.0
│ │ ├─┬ is-binary-path@1.0.1
│ │ │ └── binary-extensions@1.8.0
│ │ ├─┬ is-glob@2.0.1
│ │ │ └── is-extglob@1.0.0
│ │ └─┬ readdirp@2.1.0
│ │ └── set-immediate-shim@1.0.1
│ ├─┬ webpack-sources@0.2.3
│ │ └── source-list-map@1.1.1
│ └─┬ yargs@6.6.0
│ ├── camelcase@3.0.0
│ ├─┬ cliui@3.2.0
│ │ └── wrap-ansi@2.1.0
│ ├── get-caller-file@1.0.2
│ ├─┬ os-locale@1.4.0
│ │ └─┬ lcid@1.0.0
│ │ └── invert-kv@1.0.0
│ ├── require-directory@2.1.1
│ ├── require-main-filename@1.0.1
│ ├── which-module@1.0.0
│ ├── y18n@3.2.1
│ └─┬ yargs-parser@4.2.1
│ └── camelcase@3.0.0
├─┬ webpack-bundle-analyzer@2.4.0
│ ├── acorn@5.0.3
│ ├── ejs@2.5.6
│ ├── filesize@3.5.6
│ ├─┬ gzip-size@3.0.0
│ │ └── duplexer@0.1.1
│ └── opener@1.4.3
├── webpack-dev-middleware@1.10.2
├─┬ webpack-hot-middleware@2.18.0
│ ├── ansi-html@0.0.7
│ ├── html-entities@1.2.1
│ └── querystring@0.2.0
└── webpack-merge@4.1.0
OK, it's not a dependency problem.
If you have opened this project previously with Vetur, it has a cache of language files. Can you try renaming the project (or moving it to a different path) and see if it still crashes?
Any way to remove the cache folder only?
@rafaelpimpa Does that solve the crash?
There is no easy way to do that. You can rename -> reopen in VSCode -> name it back -> open again. I'll try to find a workaround...
So far so good :D I'm still testing..
mmm nope
vetur initialized
[Error - 5:20:13 PM] Request textDocument/hover failed.
Message: Request textDocument/hover failed with message: Cannot read property 'parameters' of undefined
Code: -32603
rename -> reopen in VSCode -> name it back -> open again
Did this, but the error persists 😢
[Error - 17:20:42] Request textDocument/hover failed.
Message: Request textDocument/hover failed with message: Cannot read property 'parameters' of undefined
Code: -32603
[Error - 17:21:09] Request textDocument/hover failed.
Message: Request textDocument/hover failed with message: Cannot read property 'parameters' of undefined
Code: -32603
[Error - 17:21:15] Request textDocument/hover failed.
Message: Request textDocument/hover failed with message: Cannot read property 'parameters' of undefined
Code: -32603
[Error - 17:21:39] Request textDocument/hover failed.
Message: Request textDocument/hover failed with message: Cannot read property 'parameters' of undefined
Code: -32603
@cubodehelio If this is just a toy project, do you mind sharing with me on GitHub? I can't repro it using vue's webpack template.
Mine is Buefy
Sorry, it's not a toy project, but I will try to start a new one tomorrow, just as I did this one, to see if I can reproduce it; in that case I will share it with you. Thank you!
Maybe @rafaelpimpa can help with this. @rafaelpimpa, does your project include a jsconfig.json or tsconfig.json? As far as I understand, this is necessary.
@octref
Reproducible Steps
Hover the cursor over 'data2' in 'this.data2 = 1'
<template>
</template>
<script>
export default {
  watch: {
    data1 (val) {
      this.data2 = 1
    },
    data2 (val) {}
  }
}
</script>
@jing2si Thanks, that would be super helpful for debugging.
I always got these when command/control-hovering, especially when the file is being edited but not yet saved and not ready for compiling.
I think this extension is compiling the edited files while they have errors (like incomplete expressions).
Does 0.6.11 still have the problem?
It does.
This is fixed by swallowing the error when no definition is found for the file. Will be published soon.
Also, when Vue 2.4 releases with new type definitions, this.data2 will actually become typed.
Stay tuned 😉
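For readers skimming the thread, the fix described above (swallowing the error when no definition is found for the file) boils down to a guard around the hover lookup. The following is a conceptual sketch only; HoverInfo and safeHover are hypothetical names for illustration, not Vetur's actual code:

```typescript
// Conceptual sketch of the fix: if looking up hover information throws
// (e.g. "Cannot read property 'parameters' of undefined") or finds
// nothing, answer the request with "no hover info" instead of failing it.
// HoverInfo and safeHover are hypothetical names, not Vetur's real API.
type HoverInfo = { contents: string };

function safeHover(lookup: () => HoverInfo | undefined): HoverInfo | null {
  try {
    return lookup() ?? null; // nothing found: no hover, no failure
  } catch {
    return null; // swallow the error instead of failing textDocument/hover
  }
}
```

The design choice is simply that a hover request should degrade to "no result" rather than surface an internal exception to the editor.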
Kinda off-topic, but where do you see upcoming Vue features? :D
@octref Any update on this? I still experience the issue described by @jing2si, except Vetur now swallows the error. If just one imported component modifies a watched variable, the extension crashes, making it unusable.
Vue 2.4 has been released so can this be fixed now?
@nextgensparx I cannot repro the problem any more. Please open a new issue with repro steps.
@octref I created a new issue #751
| gharchive/issue | 2017-05-08T18:31:30 | 2025-04-01T06:45:12.442177 | {
"authors": [
"cubodehelio",
"jing2si",
"nextgensparx",
"octref",
"rafaelpimpa",
"seancheung"
],
"repo": "octref/vetur",
"url": "https://github.com/octref/vetur/issues/191",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1770891657 | 🛑 GUAU (guau.umet.edu.ar) is down
In 808675f, GUAU (guau.umet.edu.ar) (https://guau.umet.edu.ar/api/health) was down:
HTTP code: 0
Response time: 0 ms
Resolved: GUAU (guau.umet.edu.ar) is back up in 1bec715.
| gharchive/issue | 2023-06-23T06:55:09 | 2025-04-01T06:45:12.451906 | {
"authors": [
"apps-suterh"
],
"repo": "octubre-softlab/octubre-upptime",
"url": "https://github.com/octubre-softlab/octubre-upptime/issues/1650",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2132454110 | Define columns in custom 2D turbulence definition
Feature request
Use Case
We need to clearly define what the columns represent, and if necessary add a place in the schema to record the bin_center if required (i.e. if the bins are not evenly distributed in a known way)
@Dash16PM to distribute paper describing this method
Updated definitions as best we can in #61
| gharchive/issue | 2024-02-13T14:26:52 | 2025-04-01T06:45:12.453339 | {
"authors": [
"thclark"
],
"repo": "octue/power-curve-schema",
"url": "https://github.com/octue/power-curve-schema/issues/59",
"license": "CC0-1.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2348255816 | Check the repo file
A link check for the repo file would have avoided https://github.com/ocaml/opam-repository/pull/26065
Docs on the repo file: https://opam.ocaml.org/doc/Manual.html#repo
| gharchive/issue | 2024-06-12T09:17:18 | 2025-04-01T06:45:12.456484 | {
"authors": [
"kit-ty-kate",
"shonfeder"
],
"repo": "ocurrent/opam-repo-ci",
"url": "https://github.com/ocurrent/opam-repo-ci/issues/324",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
216255775 | Knowledge base configuration misses motivation behind the options and why are they required
In Czech (translated to English):
The knowledge base configuration is described only as a list of configuration items;
it is not entirely clear why some of the options are needed.
This issue was moved to odalic/sti#388
| gharchive/issue | 2017-03-23T01:07:15 | 2025-04-01T06:45:12.459620 | {
"authors": [
"brodecva"
],
"repo": "odalic/odalic-ui",
"url": "https://github.com/odalic/odalic-ui/issues/373",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
939713098 | Support public key
Hi
First of all, thanks for your work. It is a great project.
But when I use this library, I need not only the address but also the public key.
So I made this PR to support the public key too.
Please review it; I hope this commit can be merged.
Thanks for the pull-request.
I think this is a good change.
When it receives a pull request from a human, CI doesn't run (#255).
So I ran yarn test locally and the command succeeded.
Released https://github.com/odanado/aws-kms-provider/releases/tag/v0.3.3
Thanks for the fast response!
| gharchive/pull-request | 2021-07-08T10:28:44 | 2025-04-01T06:45:12.462469 | {
"authors": [
"odanado",
"riemannulus"
],
"repo": "odanado/aws-kms-provider",
"url": "https://github.com/odanado/aws-kms-provider/pull/254",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
65738762 | drive pull taking too much memory - linux kill it (out of memory) and freeze the terminal
I am trying to pull a folder (12 files - total 399GB) from my Google Drive to my local disk in Linux using your program (thank you for this). This folder was a shared one that I had added to my Google Drive, so I could pull it using "drive".
However, after 1 hour and 2 hour, the program is "terminated by signal 9". Checking what could cause it I found (with dmesg) this:
Out of memory: Kill process 3729 (sshd) score 1 or sacrifice child
(Besides killing the program, it also caused screen session be killed. I captured the message of "terminating by signal 9" in the nohup log after 2 or 3 screen session crash with no clue)
Even after the system killing the program, the amount of memory allocated by the program (almost all 32 GB that the machine has) was held and the only solution to reestablish the system resources was to reboot the system.
System info:
Linux 3.2.0-4-686-pae #1 SMP Debian 3.2.57-3 i686 GNU/Linux
quad-core
32 GB memory
Please, any idea what could be causing this program behavior?
Thank you for reporting this @lintzyli and welcome to drive. This is quite unfortunate but definitely an interesting edge case.
Does it help to pull one massive file at a time?
Btw drive allows for you do stat as well as list on files on the remote. You can find this information in the README, just in case you haven't already looked at it.
Thank you for you messages @odeke-em. I have tried list, but I have not used stat yet (I will).
In case it helps, I was downloading this dataset, which is hosted at Google Drive:
http://www.yli-corpus.org/mediaeval-2014-placing-task-dataset
Yes, when I pull one big file at time, drive is able to finish OK.
Can't really reproduce or debug this. Going to close this for now. Let's re-open if it returns. Thank you @lintzyli for reporting it.
| gharchive/issue | 2015-04-01T17:37:08 | 2025-04-01T06:45:12.470049 | {
"authors": [
"lintzyli",
"odeke-em"
],
"repo": "odeke-em/drive",
"url": "https://github.com/odeke-em/drive/issues/142",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
227010222 | Fix item buy time
fixes #437
Do you have a sample match ID that you tested this against?
https://www.opendota.com/matches/2762021523/overview
Before and with fix
Fix also affects the buy time of all items in general.
I think the use of the purchase_log is much more reasonable than the first_purchase_time.
Is there any reason for use first_purchase_time? Perhaps I do not see the whole picture as a whole?
It's just simpler since it doesn't require iterating over the purchase log.
| gharchive/pull-request | 2017-05-08T10:57:56 | 2025-04-01T06:45:12.689776 | {
"authors": [
"howardchung",
"pinkiesky"
],
"repo": "odota/ui",
"url": "https://github.com/odota/ui/pull/949",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1242500905 | feat: resource apply changes only if different
This PR addresses #342.
Changes:
updated the implementation in the service layer
reworked the unit tests for the impacted methods
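The "apply changes only if different" behavior can be sketched as a simple pre-check before writing to the store. This is a conceptual illustration only (the real change is in Go, in datastore/service.go; the helper name here is hypothetical):

```typescript
// Conceptual sketch of "apply changes only if different": skip the
// store update when the incoming resource matches what already exists.
// needsUpdate is a hypothetical helper, not part of the real codebase;
// JSON serialization stands in for a proper structural comparison.
function needsUpdate<T>(existing: T | undefined, incoming: T): boolean {
  if (existing === undefined) return true; // new resource: always apply
  return JSON.stringify(existing) !== JSON.stringify(incoming);
}
```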
Pull Request Test Coverage Report for Build 2355607561
60 of 64 (93.75%) changed or added relevant lines in 1 file are covered.
No unchanged relevant lines lost coverage.
Overall coverage increased (+0.1%) to 75.137%
Changes Missing Coverage
File                   Covered Lines   Changed/Added Lines   %
datastore/service.go   60              64                    93.75%

Totals
Change from base Build 2308712360: 0.1%
Covered Lines: 6316
Relevant Lines: 8406
💛 - Coveralls
| gharchive/pull-request | 2022-05-20T01:43:16 | 2025-04-01T06:45:12.699097 | {
"authors": [
"coveralls",
"irainia"
],
"repo": "odpf/optimus",
"url": "https://github.com/odpf/optimus/pull/351",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1085704522 | Switch react container base image to UBI
I propose to switch the react UI container base image to a UBI image
supported base image
tested/updated for security vulnerabilities
consistent with image used for base egeria
hosted on quay.io so will avoid dockerhub pull request limitations (but note that either side could have outages)
The likely base is https://catalog.redhat.com/software/containers/ubi8/nodejs-14-minimal/6065b8e1b92fbda3a4c65d91
See also https://github.com/odpi/egeria-ui/issues/289 for the other UI
| gharchive/issue | 2021-12-21T10:58:22 | 2025-04-01T06:45:12.701871 | {
"authors": [
"planetf1"
],
"repo": "odpi/egeria-react-ui",
"url": "https://github.com/odpi/egeria-react-ui/issues/311",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
461961802 | findEntitiesByProperty does not work with local graph repository
findEntitiesByProperty does not work with local graph repository. The entity can be found with Gremlin in the graph.
I run into this issue while testing DataEngine with the local graph. It worked with Atlas repository & in memory repository.
2019-06-28 12:37:29.899 DEBUG 3694 --- [io-5500-exec-10] .o.o.a.r.g.r.GraphOMRSMetadataCollection : findEntitiesByProperty: continuing with search for entity type SchemaAttribute because it is a subtype of SchemaAttribute
2019-06-28 12:37:29.900 DEBUG 3694 --- [io-5500-exec-10] o.o.o.a.r.g.r.GraphOMRSMetadataStore : findEntitiesByProperty primitive match property has key qualifiedName value (host_(engine))=INFOSVR::(data_file_folder)=/::(data_file_folder)=data::(data_file_folder)=files::(data_file_folder)=CocoPharma::(data_file)=Employee-Employee.csv::(data_file_record)=Employee-Employee::(data_file_field)=ROLE(host_(engine))=INFOSVR::(transformation_project)=dstage1::(dsjob)=file_to_EMPLOYEE::(stage)=CompDir_EmployeeCSVDemo1
2019-06-28 12:37:29.901 DEBUG 3694 --- [io-5500-exec-10] o.o.o.a.r.g.r.GraphOMRSMetadataStore : findEntitiesByProperty primitive match property has property criterion [HasStep([ReferenceablexqualifiedName.textRegex(.(host_(engine))=INFOSVR::(data_file_folder)=/::(data_file_folder)=data::(data_file_folder)=files::(data_file_folder)=CocoPharma::(data_file)=Employee-Employee.csv::(data_file_record)=Employee-Employee::(data_file_field)=ROLE(host_(engine))=INFOSVR::(transformation_project)=dstage1::(dsjob)=file_to_EMPLOYEE::(stage)=CompDir_EmployeeCSVDemo1.)])]
2019-06-28 12:37:29.902 DEBUG 3694 --- [io-5500-exec-10] o.o.o.a.r.g.r.GraphOMRSMetadataStore : findEntitiesByProperty traversal looks like this --> [GraphStep(vertex,[]), HasStep([~label.eq(Entity), vetypeName.eq(SchemaAttribute), veentityIsProxy.eq(false)]), OrStep([[HasStep([ReferenceablexqualifiedName.textRegex(.(host_(engine))=INFOSVR::(data_file_folder)=/::(data_file_folder)=data::(data_file_folder)=files::(data_file_folder)=CocoPharma::(data_file)=Employee-Employee.csv::(data_file_record)=Employee-Employee::(data_file_field)=ROLE(host_(engine))=INFOSVR::(transformation_project)=dstage1::(dsjob)=file_to_EMPLOYEE::(stage)=CompDir_EmployeeCSVDemo1.)])]])]
2019-06-28 12:37:29.905 DEBUG 3694 --- [io-5500-exec-10] o.j.g.transaction.StandardJanusGraphTx : Guava vertex cache size: requested=20000 effective=20000 (min=100)
2019-06-28 12:37:29.906 DEBUG 3694 --- [io-5500-exec-10] o.j.g.t.vertexcache.GuavaVertexCache : Created dirty vertex map with initial size 32
2019-06-28 12:37:29.906 DEBUG 3694 --- [io-5500-exec-10] o.j.g.t.vertexcache.GuavaVertexCache : Created vertex cache with max size 20000
2019-06-28 12:37:29.907 DEBUG 3694 --- [io-5500-exec-10] o.j.g.t.JanusGraphBlueprintsGraph : Created new thread-bound transaction standardjanusgraphtx[0x67dd1a79]
2019-06-28 12:37:30.246 DEBUG 3694 --- [nsumer.outTopic] o.a.e.t.k.KafkaOpenMetadataEventConsumer : Found records: 0
2019-06-28 12:37:30.246 DEBUG 3694 --- [ineage.outTopic] o.a.e.t.k.KafkaOpenMetadataEventConsumer : Found records: 0
2019-06-28 12:37:30.246 DEBUG 3694 --- [Engine.outTopic] o.a.e.t.k.KafkaOpenMetadataEventConsumer : Found records: 0
2019-06-28 12:37:30.246 DEBUG 3694 --- [hort1.OMRSTopic] o.a.e.t.k.KafkaOpenMetadataEventConsumer : Found records: 0
2019-06-28 12:37:30.246 DEBUG 3694 --- [atform.outTopic] o.a.e.t.k.KafkaOpenMetadataEventConsumer : Found records: 0
2019-06-28 12:37:30.246 DEBUG 3694 --- [latform.inTopic] o.a.e.t.k.KafkaOpenMetadataEventConsumer : Found records: 0
2019-06-28 12:37:30.246 DEBUG 3694 --- [fficer.outTopic] o.a.e.t.k.KafkaOpenMetadataEventConsumer : Found records: 0
2019-06-28 12:37:30.246 DEBUG 3694 --- [onView.outTopic] o.a.e.t.k.KafkaOpenMetadataEventConsumer : Found records: 0
2019-06-28 12:37:30.323 INFO 3694 --- [io-5500-exec-10] .o.o.a.r.g.r.GraphOMRSMetadataCollection : findEntitiesByProperty: for type SchemaAttribute found no entities
findEntitiesByProperty_schemaAttribute.log
Hi @popa-raluca - Can you please describe the process by which the entity was added to the repository? I am wondering if it is a proxy or a full (local) entity.
Hi @grahamwallis - I ran into the issue while trying to run Data Engine with the local graph. The entities are not proxies; they are created using the createEntity method (which uses metadataCollection.addEntity) from RepositoryHandler.
In the tests done last week, I saw that a SoftwareServerCapability entity can be retrieved with findEntitiesByProperty, but SchemaType(TabularSchemaType) and SchemaAttribute entities are not retrieved. I'm only searching based on the qualifiedName, not sure if this is helpful.
If you want, we can have a call to discuss more details.
@popa-raluca - thanks Raluca - I will try to reproduce it.
@grahamwallis - Hi Graham, I tested with some simple qualifiedNames and it works.
@grahamwallis Hi Graham, I did some more tests and I now have another issue, that might be related to this one - if I have 2 entities in the graph with the qualifiedNames TAX and TAXP, then searching for the qualifiedName TAX will return both entities
@popa-raluca - Hi Raluca, There are two (separate) problems here. One is that the findEntitiesByProperty() method extends the search string to make it match a substring of the actual property value; the second problem is special characters in the regex syntax.
Regarding the first problem I have made a code change to the graph repo that will only do substring matches for findXXXByPropertyValue() - because this is a broad search-style interface. For the findXXXByProperty() or findEntitiesByProperty() interfaces - which are by their nature more specific, precise retrievals, I have removed the extension of the regex string; so that the regex string will be looking for a fully matching property value (or classification name). This will prevent you getting back more than you wanted - in the example of TAX and TAXP.
The other problem is about the regex syntax - and particularly the role of special characters in the syntax, such as dots, plus signs, stars, parentheses, brackets, etc. Having discussed this with Mandy we want to continue to expose the full regex syntax because it will provide the flexibility to perform appropriate matches; the disadvantage is that the search string (at least at the OMRS interface) needs to be regex-aware. My proposal for addressing this is to add documentation about the search interface, and to make sure that both the in-memory and graph repositories are consistent in their support. In the meantime, to perform a match of a property value that contains any of the special characters, escape the special characters so that they are included as part of the pattern. For example, to find an entity with qualifiedName="outer(inner)" (and avoid any other matches) the regex string would be "outer\(inner\)". Note that if you issue this string in a REST command from Postman for example you need to double-escape the parentheses, i.e. "outer\\(inner\\)".
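To make the escaping idea concrete, here is a small sketch (in JavaScript for illustration, though Egeria itself is Java - the idea is the same in any regex engine): escape each metacharacter in the literal value, then anchor the pattern so it matches the whole property value rather than a substring.

```javascript
// Escape regex metacharacters so the pattern matches the literal string.
function escapeRegex(literal) {
  return literal.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

// An exact-match search anchors the escaped literal with ^ and $.
function exactMatch(propertyValue, searchLiteral) {
  return new RegExp('^' + escapeRegex(searchLiteral) + '$').test(propertyValue);
}

console.log(exactMatch('outer(inner)', 'outer(inner)')); // true
console.log(exactMatch('TAXP', 'TAX'));                  // false: whole-value match, no substring hit
```

With anchoring in place, searching for TAX no longer also returns TAXP, which is the behavior described above for findEntitiesByProperty.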
I hope this helps. I will document this properly, and will also spend some time testing and thinking about the findXXXByPropertyValue methods - especially as they take the search string as a URI parameter and the additional HTML escaping gets awkward.
I will push my "Issue1194" PR soon and will add the documentation and further update here.
Fixed by #1522
| gharchive/issue | 2019-06-28T10:11:21 | 2025-04-01T06:45:12.719418 | {
"authors": [
"grahamwallis",
"popa-raluca"
],
"repo": "odpi/egeria",
"url": "https://github.com/odpi/egeria/issues/1194",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1436410183 | Version 4 build
Is there an existing issue for this?
[X] I have searched the existing issues
Please describe the new behavior that that will improve Egeria
Establish new build process for Egeria version 4
Alternatives
No response
Any Further Information?
No response
Would you be prepared to be assigned this issue to work on?
[X] I can work on this
I'm working through getting the v4 pipelines/code working fully.
One thing to note -- this branch needs to frequently receive merges from main. In order to simplify the git log I have permitted myself to do direct pushes to the repo. This is needed to do a 'git rebase'. This effectively keeps the v4 specific changes floating on top of the other repo changes, which will keep the history synced with 'main'.
When we start work on v4 fully we can push the 4.0pre branch changes (normal merge) back to main to continue with development, and avoid having a confusing history with loads of extra merges.
This is specific to the 4.0 branch whilst in the early stages.
As per slack post
The Egeria v4 pipeline is now working
builds from egeria-release-4.0pre
delta - specific fixes to gradle build and github actions only
Uses gradle (only)
Uses Java 17 LTS (temurin)
FVT tests all good
running codeQL & sonatype lift
Maven Snapshots signed & published to maven central (4.0-SNAPSHOT)
Many ‘intermediate’ pom-only artifacts are removed (ie org.odpi.egeria:access-services) as their need was only driven by maven’s structure
Docker images published (4.0-SNAPSHOT)
Will rebase from main periodically.
This is part of a bigger set of steps to get to gradle/java 17 for release 4 in 2023. There is still more to do on verification. Maven removal will be left until closer to the time.
(The last fix for docker is just being merged)
I’ll do another iteration in a few weeks. After that (or perhaps Jan) we can review progress/changes in a community or dev call.
Closing this as the initial setup to allow us to refine work on v4 and fix up the gradle build is now in place
| gharchive/issue | 2022-11-04T17:27:05 | 2025-04-01T06:45:12.725497 | {
"authors": [
"planetf1"
],
"repo": "odpi/egeria",
"url": "https://github.com/odpi/egeria/issues/7090",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1219583334 | 🛑 urbamonde.org is down
In 7e863bf, urbamonde.org (https://www.urbamonde.org) was down:
HTTP code: 429
Response time: 431 ms
Resolved: urbamonde.org is back up in ac1abca.
| gharchive/issue | 2022-04-29T01:57:25 | 2025-04-01T06:45:12.729981 | {
"authors": [
"ntopulos"
],
"repo": "odqo/upptime",
"url": "https://github.com/odqo/upptime/issues/39",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
790245376 | Allow ontology as input parameter
Validating the mappings according to the ontology is especially relevant when sources are complex and there are many mapping rules.
This will not be added in the short term
| gharchive/issue | 2021-01-20T19:16:42 | 2025-04-01T06:45:12.735334 | {
"authors": [
"ArenasGuerreroJulian"
],
"repo": "oeg-upm/Morph-KGC",
"url": "https://github.com/oeg-upm/Morph-KGC/issues/8",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
108403896 | Allow message on custom validators to be set via options
For all the validators except custom ones, the message is set on the options object. It would be handy if this were available for custom validators as well:
var Validations = buildValidations({
hello: [
validator(function(value) {
return false;
}, { message: 'Sorry, this greeting will never pass' }),
validator('length', {
is: 25,
message: 'The greeting should be 25 characters long'
})
]
});
The function passed could return true or false, with the message still passed in via the options and used in the false case. This would help where people use quick boolean helpers for the function.
I'm not sure I follow. Could you not do something like this
var Validations = buildValidations({
hello: [
validator(function(value, options) {
return options.message;
}, { message: 'Sorry, this greeting will never pass' }),
]
});
or even
var Validations = buildValidations({
hello: [
validator(function(value) {
return 'Sorry, this greeting will never pass';
}),
]
});
You're absolutely right, that's the way you can pass things from the options, or just return the string directly. What I was thinking is that you could use a utility function, maybe something you've used elsewhere that relates to the property you're validating. As an example I'll say isValidSkuFormat, which returns a boolean.
The validation would look something like this:
import SkuHelpers from '../utils/sku-helpers';
var Validations = buildValidations({
sku: validator(SkuHelpers.isValidSkuFormat, { message: 'Please use a valid sku' })
});
This keeps the helper reusable and the implementation concise, without having to wrap it in a function to set things up. The true/false logic is kept separate from the message, in a similar style to the other validators. I could see enough people having these single-parameter helpers with a true/false return for this to be usable. You're right though: you can access the options argument and pull this through in a way that is organized, with only a few more lines.
Just a suggestion, in any case thanks for this project. Everything has been really a pleasure to use, and is versatile in combining with custom functions.
I think you might want to try creating a custom validator. That should solve your use-case while also giving you the ability to inject services like an ember-data store.
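To illustrate the shape of such a validator's logic, here is a minimal standalone sketch of a `validate(value, options)` function that returns `true` when valid and a message string otherwise (the same contract ember-cp-validations custom validators use; the sku format and names here are purely hypothetical):

```javascript
// Hypothetical sku check: three uppercase letters, a dash, four digits.
// Returns true when valid, otherwise the message from options (or a default).
function validateSku(value, options) {
  const valid = /^[A-Z]{3}-\d{4}$/.test(value || '');
  return valid ? true : (options.message || 'Please use a valid sku');
}

console.log(validateSku('ABC-1234', {}));                 // true
console.log(validateSku('nope', { message: 'bad sku' })); // 'bad sku'
```

In a custom validator this function body would live in the validator's `validate` hook, which also gives you access to injected services like the ember-data store.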
Sounds good, thanks for the feedback.
| gharchive/issue | 2015-09-25T20:38:52 | 2025-04-01T06:45:12.789178 | {
"authors": [
"chadian",
"offirgolan"
],
"repo": "offirgolan/ember-cp-validations",
"url": "https://github.com/offirgolan/ember-cp-validations/issues/45",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
105677269 | Changelog.
Fixes #14.
Lgtm
Bummer that we declared 1.0.0 so early. Our 1.0.1 release breaks semantic versioning with the confirmation validator change. We should beware of this in the future.
@blimmer we can always release 2.0 and 3.0 :P
Yeah, bumping the number is not a big deal, but following semantic versioning 1.0.1 should have been 2.0.0 because of breaking api changes.
@blimmer let's consider it a "bugfix" then, and do better going forward.
:+1:
| gharchive/pull-request | 2015-09-09T20:23:41 | 2025-04-01T06:45:12.792089 | {
"authors": [
"blimmer",
"stefanpenner"
],
"repo": "offirgolan/ember-cp-validations",
"url": "https://github.com/offirgolan/ember-cp-validations/pull/15",
"license": "BSD-3-Clause",
"license_type": "permissive",
"license_source": "github-api"
} |
92200173 | Tab Switcher Skips Last Tab
I just upgraded to 1.2.0 this morning and it feels a little "derpy." I think I've narrowed it down to the switcher skipping the last used tab.
In the above image I was editing PING.php, and switched to NETDOWN.php. Hitting the hotkey and bringing up the switcher skips immediately to index.php. I can press the hotkey again to cycle all the way around, and it will once again skip over PING.php. In short, the last used tab is not selectable. I am unsure of what could cause this. This bug is not present in previous version.
Hi Trcx,
So you're saying when you hit alt-], it starts on the 3rd item? (It's supposed to be the 2nd, with the 1st being the current tab.) I'm unable to reproduce this, either on Mac OS or a Win 8.1 VM, both using Atom 1.0. You could try v1.2.1, but I don't think anything there would change this.
Does it still happen after reloading Atom (window:reload)? If so, then what if you move aside your config directory (i.e. start clean), and only install tab-switcher?
That is correct. I remapped the hotkeys to the standard tab switch keys (ctrl tab, ctrl shift tab), but I doubt that'd cause any issues.
Regardless it seems to have been resolved this morning when I updated to 1.2.1, not sure what changed but I'm just happy to have it working as expected. Thanks for your help.
In the event that this is reopened in the future here are a list of installed packages:
Remote-FTP @ 0.7.1
Aligner-php @ 1.0.0
Atom-autocomplete-php @ 0.1.10
Color-picker @ 2.0.7
File-icons @ 1.5.8
HIghlight-line @ 0.11.0
Minimap @ 4.10.1
Quick-query @ 0.4.0
| gharchive/issue | 2015-06-30T21:08:39 | 2025-04-01T06:45:12.804484 | {
"authors": [
"Trcx528",
"oggy"
],
"repo": "oggy/tab-switcher",
"url": "https://github.com/oggy/tab-switcher/issues/18",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
2018215026 | Update the game over message.
Update the game over message so people know how to play again!
cklodes
| gharchive/pull-request | 2023-11-30T09:20:53 | 2025-04-01T06:45:12.805425 | {
"authors": [
"ogolaSospeter"
],
"repo": "ogolaSospeter/skills-review-pull-requests",
"url": "https://github.com/ogolaSospeter/skills-review-pull-requests/pull/1",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
105192993 | crashed: Nexus 7 (2012) (grouper)
prj: https://play.google.com/store/apps/details?id=com.dyadchenko.ss4km
os: Android 4.4
java.lang.IllegalArgumentException
at com.google.android.gles_jni.EGLImpl._eglCreateContext(Native Method)
at com.google.android.gles_jni.EGLImpl.eglCreateContext(EGLImpl.java:54)
at com.android.godot.GodotView$ContextFactory.createContext(GodotView.java:397)
at android.opengl.GLSurfaceView$EglHelper.start(GLSurfaceView.java:1030)
at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1401)
at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1240)
Hmm I have a Nexus 7 2012 and never had any issues with it, but will give it a test
I think this bug might be related to OUYA crash bug (https://github.com/okamstudio/godot/issues/1608) since both these devices have Tegra 3 cpu.
@ret80 Can you test the solution described in https://github.com/godotengine/godot/issues/1608#issuecomment-158626872?
We believe this issue might be fixed by #2862. Feel free to reopen (or ask me to reopen) if it's not the case.
| gharchive/issue | 2015-09-07T10:40:15 | 2025-04-01T06:45:12.885185 | {
"authors": [
"akien-mga",
"kubecz3k",
"reduz",
"ret80"
],
"repo": "okamstudio/godot",
"url": "https://github.com/okamstudio/godot/issues/2443",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
164140130 | preserveCase creates loophole where duplicate tags can still be created
When using preserveCase: true (which is preferred in my instance) the duplicate-tag check fails. While it still blocks creating tags with identical spelling and casing, it no longer treats tags that differ only in case as duplicates. This is problematic as now people can do this:
Tag, tag, tAg, taG
This would be great.
Hmm, this is an interesting issue because if you want to preserveCase, technically Tag is different from taG, so it should not be considered a duplicate in my mind (because you're being case-sensitive). I could see this being a problem in your instance though. The question is more: how would we implement this in a clean way that works both ways, because I still think the current implementation makes sense.
I don't necessarily think we should change the way it works currently as I can see that being somewhat of a breaking change. If we changed the way it works it would have to wait until a 2.0 release. I'm open to suggestions of how to handle this though.
@okcoker What if you just added an additional option for allowDuplicates? Basically, in addition to true/false, you could also set it to 'strict', and then this would simply run it through a RegExp instead of a simple .indexOf()
You can see an example of the difference here: http://codepen.io/ndimatteo/pen/KrvyGX
I imagine this would be pretty easy to add in around line #498: https://github.com/okcoker/taggle.js/blob/master/src/taggle.js#L498
I think if this ever were a thing, I'd rather you pass the regular expression as the actual option. But I think you can use the beforeTagAdd option to do your regex check anyway.
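For reference, the kind of check being discussed is a case-insensitive duplicate test that could be run from a hook like the beforeTagAdd option mentioned above (this is a sketch only - the exact hook name and signature should be checked against the taggle.js docs):

```javascript
// Case-insensitive duplicate test: 'Tag', 'tAg', and 'taG' all collide with 'tag',
// while whole-string comparison keeps 'TAX' and 'TAXP' distinct.
function isDuplicate(existingTags, candidate) {
  const needle = candidate.toLowerCase();
  return existingTags.some(function (tag) {
    return tag.toLowerCase() === needle;
  });
}

console.log(isDuplicate(['Tag', 'other'], 'taG')); // true
console.log(isDuplicate(['TAX'], 'TAXP'));         // false
```

Returning false from the before-add hook when this check fires would reject the tag without changing the library's existing allowDuplicates behavior.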
| gharchive/issue | 2016-07-06T18:14:49 | 2025-04-01T06:45:12.892468 | {
"authors": [
"ndimatteo",
"okcoker",
"webchaz"
],
"repo": "okcoker/taggle.js",
"url": "https://github.com/okcoker/taggle.js/issues/60",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |