column        dtype    stats
Unnamed: 0    int64    min 0, max 832k
id            float64  min 2.49B, max 32.1B
type          string   1 distinct value
created_at    string   length 19–19
repo          string   length 7–112
repo_url      string   length 36–141
action        string   3 distinct values
title         string   length 1–744
labels        string   length 4–574
body          string   length 9–211k
index         string   10 distinct values
text_combine  string   length 96–211k
label         string   2 distinct values
text          string   length 96–188k
binary_label  int64    min 0, max 1
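The schema above pairs a string class column (`index`, with values such as "process" and "non_process") with an integer target (`binary_label`, 0 or 1). A minimal sketch of how such records might be filtered and sanity-checked — the two records below are invented for illustration, not real dataset rows:

```python
# Hypothetical records following the schema above (field values invented).
records = [
    {"type": "IssuesEvent", "action": "closed",
     "index": "non_process", "binary_label": 0},
    {"type": "IssuesEvent", "action": "opened",
     "index": "process", "binary_label": 1},
]

# In the samples below, binary_label mirrors the index field:
# 1 for "process", 0 for "non_process".
def is_consistent(rec):
    return rec["binary_label"] == (1 if rec["index"] == "process" else 0)

process_rows = [r for r in records if r["binary_label"] == 1]
assert all(is_consistent(r) for r in records)
print(len(process_rows))  # → 1
```

This consistency check is an assumption drawn from the sample rows shown below, not a documented invariant of the dataset.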
---
Unnamed: 0: 9,170
id: 3,025,066,613
type: IssuesEvent
created_at: 2015-08-03 04:37:53
repo: servo/servo
repo_url: https://api.github.com/repos/servo/servo
action: closed
title: Intermittent timeout in /css21_dev/html4/absolute-replaced-height-012.htm
labels: A-testing I-intermittent P-linux
body:
``` 0:31.85 TEST_START: Thread-TestrunnerManager-8 /css21_dev/html4/absolute-replaced-height-012.htm 0:47.89 TEST_END: Thread-TestrunnerManager-8 TIMEOUT, expected FAIL ```
label: 1.0
text_combine:
Intermittent timeout in /css21_dev/html4/absolute-replaced-height-012.htm - ``` 0:31.85 TEST_START: Thread-TestrunnerManager-8 /css21_dev/html4/absolute-replaced-height-012.htm 0:47.89 TEST_END: Thread-TestrunnerManager-8 TIMEOUT, expected FAIL ```
index: non_process
text:
intermittent timeout in dev absolute replaced height htm test start thread testrunnermanager dev absolute replaced height htm test end thread testrunnermanager timeout expected fail
binary_label: 0
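Comparing the `text_combine` and `text` fields in these rows suggests that `text` is a cleaned variant: lowercased, with URLs, digits, and punctuation stripped and whitespace collapsed. A hedged sketch of such a transformation — this regex pipeline is a guess at the dataset's preprocessing, not its actual code:

```python
import re

def clean(text_combine: str) -> str:
    """Guess at the text_combine -> text cleaning step (illustrative only)."""
    t = text_combine.lower()
    t = re.sub(r"https?://\S+", " ", t)    # drop URLs
    t = re.sub(r"[^a-z\s]", " ", t)        # drop digits, punctuation, symbols
    return re.sub(r"\s+", " ", t).strip()  # collapse whitespace

print(clean("Remove scikit-learn pin - see https://github.com/dask/dask-ml/pull/910"))
# → remove scikit learn pin see
```

Note one visible mismatch with this guess: some sample `text` values retain emoji (e.g. "🛑 processing failed" below), which this sketch would strip.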
---
Unnamed: 0: 74,059
id: 14,172,527,390
type: IssuesEvent
created_at: 2020-11-12 17:03:39
repo: MicrosoftDocs/azure-devops-docs
repo_url: https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
action: closed
title: Git Credential Manager Deprecated
labels: devops-code-git/tech devops/prod doc-enhancement
body:
> [Install the Git Credential Manager](https://docs.microsoft.com/en-us/azure/devops/repos/git/set-up-credential-managers?view=azure-devops#install-the-git-credential-manager) >Windows > >Download and run the latest Git for Windows installer, which includes the Git Credential Manager for Windows. Make sure to enable the Git Credential Manager installation option. Git Credential Manager has been [deprecated](https://github.com/Microsoft/Git-Credential-Manager-for-Windows#notice-this-project-is-no-longer-being-maintained-warning). Doc needs to be updated to suggest Git Credential Manager Core. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 6c2f260d-9321-d669-172d-01289c233263 * Version Independent ID: 14e9ae18-df9d-7555-902c-3e8b4c2b2f22 * Content: [Connect to your Git repos using credential managers - Azure Repos](https://docs.microsoft.com/en-us/azure/devops/repos/git/set-up-credential-managers?view=azure-devops#windows) * Content Source: [docs/repos/git/set-up-credential-managers.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/repos/git/set-up-credential-managers.md) * Product: **devops** * Technology: **devops-code-git** * GitHub Login: @vtbassmatt * Microsoft Alias: **macoope**
label: 1.0
text_combine:
Git Credential Manager Deprecated - > [Install the Git Credential Manager](https://docs.microsoft.com/en-us/azure/devops/repos/git/set-up-credential-managers?view=azure-devops#install-the-git-credential-manager) >Windows > >Download and run the latest Git for Windows installer, which includes the Git Credential Manager for Windows. Make sure to enable the Git Credential Manager installation option. Git Credential Manager has been [deprecated](https://github.com/Microsoft/Git-Credential-Manager-for-Windows#notice-this-project-is-no-longer-being-maintained-warning). Doc needs to be updated to suggest Git Credential Manager Core. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 6c2f260d-9321-d669-172d-01289c233263 * Version Independent ID: 14e9ae18-df9d-7555-902c-3e8b4c2b2f22 * Content: [Connect to your Git repos using credential managers - Azure Repos](https://docs.microsoft.com/en-us/azure/devops/repos/git/set-up-credential-managers?view=azure-devops#windows) * Content Source: [docs/repos/git/set-up-credential-managers.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/repos/git/set-up-credential-managers.md) * Product: **devops** * Technology: **devops-code-git** * GitHub Login: @vtbassmatt * Microsoft Alias: **macoope**
index: non_process
text:
git credential manager deprecated windows download and run the latest git for windows installer which includes the git credential manager for windows make sure to enable the git credential manager installation option git credential manager has been doc needs to be updated to suggest git credential manager core document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops code git github login vtbassmatt microsoft alias macoope
binary_label: 0
---
Unnamed: 0: 560,887
id: 16,605,477,853
type: IssuesEvent
created_at: 2021-06-02 02:50:01
repo: eclipse-ee4j/glassfish
repo_url: https://api.github.com/repos/eclipse-ee4j/glassfish
action: closed
title: Glassfish grizzly thread consuming large amounts of CPU time in NIO poll operation
labels: Component: grizzly-kernel ERR: Assignee Priority: Major Stale Type: Bug
body:
We recently noticed an issue where our Glassifish server, after running successfully for several hours, would suddenly peg one of the CPUs at 100%. Our application becomes unresponsive during this time. After restarting, the problem will eventually happen again (usually after several hours). I ran this command to see what the threads were doing: asadmin generate-jvm-report --type=thread In the resulting output, one thread looked highly suspicious (consuming orders of magnitude more CPU time than any other thread): Thread Execution Information: Thread "Grizzly-kernel-thread(1)" thread-id: 27 thread-state: RUNNABLE Running in native at: sun.nio.ch.WindowsSelectorImpl$SubSelector.poll0(Native Method) at: sun.nio.ch.WindowsSelectorImpl$SubSelector.poll(WindowsSelectorImpl.java:273) at: sun.nio.ch.WindowsSelectorImpl$SubSelector.access$400(WindowsSelectorImpl.java:255) at: sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:136) at: sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:69) at: sun.nio.ch.SelectorImpl.select(SelectorImpl.java:80) at: com.sun.grizzly.TCPSelectorHandler.select(TCPSelectorHandler.java:513) at: com.sun.grizzly.SelectorHandlerRunner.doSelect(SelectorHandlerRunner.java:190) at: com.sun.grizzly.SelectorHandlerRunner.run(SelectorHandlerRunner.java:132) at: java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) at: java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) at: java.lang.Thread.run(Thread.java:662) Thread Synchronization Statistics: Number of times this thread was blocked (to enter/reenter a Monitor): 4,520 Number of times this thread waited for a notification (i.e. it was in WAITING or TIMED_WAITING state): 0 Total CPU time for this thread: 2,753 seconds 703,125,000 nanoseconds. User-level CPU time for this thread: 2,753 seconds 703,125,000 nanoseconds. Object Monitors currently held or requested by this thread: [] Ownable Synchronizers (e.g. 
ReentrantLock and ReentrantReadWriteLock) held by this thread: [] #### Environment JDK version 1.6.0_37, Windows Server 2008 R2 Enterprise #### Affected Versions [3.1.2.2]
label: 1.0
text_combine:
Glassfish grizzly thread consuming large amounts of CPU time in NIO poll operation - We recently noticed an issue where our Glassifish server, after running successfully for several hours, would suddenly peg one of the CPUs at 100%. Our application becomes unresponsive during this time. After restarting, the problem will eventually happen again (usually after several hours). I ran this command to see what the threads were doing: asadmin generate-jvm-report --type=thread In the resulting output, one thread looked highly suspicious (consuming orders of magnitude more CPU time than any other thread): Thread Execution Information: Thread "Grizzly-kernel-thread(1)" thread-id: 27 thread-state: RUNNABLE Running in native at: sun.nio.ch.WindowsSelectorImpl$SubSelector.poll0(Native Method) at: sun.nio.ch.WindowsSelectorImpl$SubSelector.poll(WindowsSelectorImpl.java:273) at: sun.nio.ch.WindowsSelectorImpl$SubSelector.access$400(WindowsSelectorImpl.java:255) at: sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:136) at: sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:69) at: sun.nio.ch.SelectorImpl.select(SelectorImpl.java:80) at: com.sun.grizzly.TCPSelectorHandler.select(TCPSelectorHandler.java:513) at: com.sun.grizzly.SelectorHandlerRunner.doSelect(SelectorHandlerRunner.java:190) at: com.sun.grizzly.SelectorHandlerRunner.run(SelectorHandlerRunner.java:132) at: java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) at: java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) at: java.lang.Thread.run(Thread.java:662) Thread Synchronization Statistics: Number of times this thread was blocked (to enter/reenter a Monitor): 4,520 Number of times this thread waited for a notification (i.e. it was in WAITING or TIMED_WAITING state): 0 Total CPU time for this thread: 2,753 seconds 703,125,000 nanoseconds. User-level CPU time for this thread: 2,753 seconds 703,125,000 nanoseconds. 
Object Monitors currently held or requested by this thread: [] Ownable Synchronizers (e.g. ReentrantLock and ReentrantReadWriteLock) held by this thread: [] #### Environment JDK version 1.6.0_37, Windows Server 2008 R2 Enterprise #### Affected Versions [3.1.2.2]
index: non_process
text:
glassfish grizzly thread consuming large amounts of cpu time in nio poll operation we recently noticed an issue where our glassifish server after running successfully for several hours would suddenly peg one of the cpus at our application becomes unresponsive during this time after restarting the problem will eventually happen again usually after several hours i ran this command to see what the threads were doing asadmin generate jvm report type thread in the resulting output one thread looked highly suspicious consuming orders of magnitude more cpu time than any other thread thread execution information thread grizzly kernel thread thread id thread state runnable running in native at sun nio ch windowsselectorimpl subselector native method at sun nio ch windowsselectorimpl subselector poll windowsselectorimpl java at sun nio ch windowsselectorimpl subselector access windowsselectorimpl java at sun nio ch windowsselectorimpl doselect windowsselectorimpl java at sun nio ch selectorimpl lockanddoselect selectorimpl java at sun nio ch selectorimpl select selectorimpl java at com sun grizzly tcpselectorhandler select tcpselectorhandler java at com sun grizzly selectorhandlerrunner doselect selectorhandlerrunner java at com sun grizzly selectorhandlerrunner run selectorhandlerrunner java at java util concurrent threadpoolexecutor worker runtask threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java thread synchronization statistics number of times this thread was blocked to enter reenter a monitor number of times this thread waited for a notification i e it was in waiting or timed waiting state total cpu time for this thread seconds nanoseconds user level cpu time for this thread seconds nanoseconds object monitors currently held or requested by this thread ownable synchronizers e g reentrantlock and reentrantreadwritelock held by this thread environment jdk version windows server 
enterprise affected versions
binary_label: 0
---
Unnamed: 0: 205,943
id: 15,700,804,385
type: IssuesEvent
created_at: 2021-03-26 10:20:08
repo: LiskHQ/lisk-sdk
repo_url: https://api.github.com/repos/LiskHQ/lisk-sdk
action: closed
title: Update framework integration tests to use getBlockProcessEnv testing util
labels: framework type: test
body:
### Description Update framework integration tests to use created testing utilities. ### Motivation All testing utilities should be used in the framework to test the usability of the functions ### Acceptance Criteria - Update `getBlockProcessingEnv` testing util to take delegate slots into consideration and also use the correct passphrase for the specific delegate - Framework integration tests should use `getBlockProcessingEnv` and other framework testing utils for integration tests - Cleanup unwanted integration utils/functions
label: 1.0
text_combine:
Update framework integration tests to use getBlockProcessEnv testing util - ### Description Update framework integration tests to use created testing utilities. ### Motivation All testing utilities should be used in the framework to test the usability of the functions ### Acceptance Criteria - Update `getBlockProcessingEnv` testing util to take delegate slots into consideration and also use the correct passphrase for the specific delegate - Framework integration tests should use `getBlockProcessingEnv` and other framework testing utils for integration tests - Cleanup unwanted integration utils/functions
index: non_process
text:
update framework integration tests to use getblockprocessenv testing util description update framework integration tests to use created testing utilities motivation all testing utilities should be used in the framework to test the usability of the functions acceptance criteria update getblockprocessingenv testing util to take delegate slots into consideration and also use the correct passphrase for the specific delegate framework integration tests should use getblockprocessingenv and other framework testing utils for integration tests cleanup unwanted integration utils functions
binary_label: 0
---
Unnamed: 0: 17,526
id: 23,338,276,795
type: IssuesEvent
created_at: 2022-08-09 11:59:38
repo: pystatgen/sgkit
repo_url: https://api.github.com/repos/pystatgen/sgkit
action: closed
title: Remove scikit-learn pin
labels: process + tools upstream
body:
When dask-ml has a new release that includes https://github.com/dask/dask-ml/pull/910 (version number after v2022.1.22) then we can remove the scikit-learn pin added in 7ce14d8f47b85772c66addb9e7d59b0e550b4fb3.
label: 1.0
text_combine:
Remove scikit-learn pin - When dask-ml has a new release that includes https://github.com/dask/dask-ml/pull/910 (version number after v2022.1.22) then we can remove the scikit-learn pin added in 7ce14d8f47b85772c66addb9e7d59b0e550b4fb3.
index: process
text:
remove scikit learn pin when dask ml has a new release that includes version number after then we can remove the scikit learn pin added in
binary_label: 1
---
Unnamed: 0: 139,531
id: 5,378,003,210
type: IssuesEvent
created_at: 2017-02-23 13:52:40
repo: CCAFS/MARLO
repo_url: https://api.github.com/repos/CCAFS/MARLO
action: closed
title: If funding source is W1/W2 type, donor list should be CGIAR SMO
labels: Priority - High Type - Bug
body:
When I select W1/W2 as the funding window, the only donor option should be - what? Should it be CGIAR SMO? Can we have a consistent "donor" for the W1/W2? I don't see what the best option should be right now. Pascale Nov 17:Suggest "CGIAR Fund" or "CGIAR" AW (Dec 7): Currently listed as CGIAR Consortium Office. I am okay with this, but will not change the status until others weigh in. Hector Jan 9: Marking this issue as fixed. PS Feb 17: reopening this issue, which at least for PIM is not fixed. I create a W1/W2 funding source and I still get the whole list of donors instead of only CGIAR und.
label: 1.0
text_combine:
If funding source is W1/W2 type, donor list should be CGIAR SMO - When I select W1/W2 as the funding window, the only donor option should be - what? Should it be CGIAR SMO? Can we have a consistent "donor" for the W1/W2? I don't see what the best option should be right now. Pascale Nov 17:Suggest "CGIAR Fund" or "CGIAR" AW (Dec 7): Currently listed as CGIAR Consortium Office. I am okay with this, but will not change the status until others weigh in. Hector Jan 9: Marking this issue as fixed. PS Feb 17: reopening this issue, which at least for PIM is not fixed. I create a W1/W2 funding source and I still get the whole list of donors instead of only CGIAR und.
index: non_process
text:
if funding source is type donor list should be cgiar smo when i select as the funding window the only donor option should be what should it be cgiar smo can we have a consistent donor for the i don t see what the best option should be right now pascale nov suggest cgiar fund or cgiar aw dec currently listed as cgiar consortium office i am okay with this but will not change the status until others weigh in hector jan marking this issue as fixed ps feb reopening this issue which at least for pim is not fixed i create a funding source and i still get the whole list of donors instead of only cgiar und
binary_label: 0
---
Unnamed: 0: 2,431
id: 5,205,349,855
type: IssuesEvent
created_at: 2017-01-24 17:42:33
repo: jlm2017/jlm-video-subtitles
repo_url: https://api.github.com/repos/jlm2017/jlm-video-subtitles
action: closed
title: [Subtitles] [FR] MÉLENCHON - « IL FAUT SORTIR DU NUCLÉAIRE ET PASSER AU 100% RENOUVELABLE»
labels: Language: French Process: [6] Approved
body:
# Video title MÉLENCHON - « IL FAUT SORTIR DU NUCLÉAIRE ET PASSER AU 100% RENOUVELABLE» # URL https://www.youtube.com/watch?v=Rudtj9ZJONo # Youtube subtitles language Français # Duration 7:37 # Subtitles URL https://www.youtube.com/timedtext_editor?lang=fr&v=Rudtj9ZJONo&tab=captions&action_mde_edit_form=1&bl=vmp&ref=player&ui=hd
label: 1.0
text_combine:
[Subtitles] [FR] MÉLENCHON - « IL FAUT SORTIR DU NUCLÉAIRE ET PASSER AU 100% RENOUVELABLE» - # Video title MÉLENCHON - « IL FAUT SORTIR DU NUCLÉAIRE ET PASSER AU 100% RENOUVELABLE» # URL https://www.youtube.com/watch?v=Rudtj9ZJONo # Youtube subtitles language Français # Duration 7:37 # Subtitles URL https://www.youtube.com/timedtext_editor?lang=fr&v=Rudtj9ZJONo&tab=captions&action_mde_edit_form=1&bl=vmp&ref=player&ui=hd
index: process
text:
mélenchon « il faut sortir du nucléaire et passer au renouvelable» video title mélenchon « il faut sortir du nucléaire et passer au renouvelable» url youtube subtitles language français duration subtitles url
binary_label: 1
---
Unnamed: 0: 15,690
id: 19,848,019,186
type: IssuesEvent
created_at: 2022-01-21 09:07:39
repo: ooi-data/CE06ISSP-SP001-02-DOSTAJ000-recovered_cspp-dosta_abcdjm_cspp_instrument_recovered
repo_url: https://api.github.com/repos/ooi-data/CE06ISSP-SP001-02-DOSTAJ000-recovered_cspp-dosta_abcdjm_cspp_instrument_recovered
action: opened
title: 🛑 Processing failed: ValueError
labels: process
body:
## Overview `ValueError` found in `processing_task` task during run ended on 2022-01-21T09:07:38.927833. ## Details Flow name: `CE06ISSP-SP001-02-DOSTAJ000-recovered_cspp-dosta_abcdjm_cspp_instrument_recovered` Task name: `processing_task` Error type: `ValueError` Error message: not enough values to unpack (expected 3, got 0) <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing final_path = finalize_data_stream( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream append_to_zarr(mod_ds, final_store, enc, logger=logger) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr _append_zarr(store, mod_ds) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr existing_arr.append(var_data.values) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 519, in values return _as_array_or_item(self._data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 259, in _as_array_or_item data = np.asarray(data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 1541, in __array__ x = self.compute() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 288, in compute (result,) = compute(self, traverse=False, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 571, in compute results = schedule(dsk, keys, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py", line 79, in get results = get_async( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 507, in get_async raise_exception(exc, tb) File 
"/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 315, in reraise raise exc File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task result = _execute_task(task, data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task return func(*(_execute_task(a, cache) for a in args)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 116, in getter c = np.asarray(c) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 357, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 551, in __array__ self._ensure_cached() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 548, in _ensure_cached self.array = NumpyIndexingAdapter(np.asarray(self.array)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 521, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 70, in __array__ return self.func(self.array) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 137, in _apply_mask data = np.asarray(data, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/zarr.py", line 73, in __getitem__ return array[key.tuple] File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 673, in __getitem__ return self.get_basic_selection(selection, 
fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 798, in get_basic_selection return self._get_basic_selection_nd(selection=selection, out=out, File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 841, in _get_basic_selection_nd return self._get_selection(indexer=indexer, out=out, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1135, in _get_selection lchunk_coords, lchunk_selection, lout_selection = zip(*indexer) ValueError: not enough values to unpack (expected 3, got 0) ``` </details>
label: 1.0
text_combine:
🛑 Processing failed: ValueError - ## Overview `ValueError` found in `processing_task` task during run ended on 2022-01-21T09:07:38.927833. ## Details Flow name: `CE06ISSP-SP001-02-DOSTAJ000-recovered_cspp-dosta_abcdjm_cspp_instrument_recovered` Task name: `processing_task` Error type: `ValueError` Error message: not enough values to unpack (expected 3, got 0) <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing final_path = finalize_data_stream( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream append_to_zarr(mod_ds, final_store, enc, logger=logger) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr _append_zarr(store, mod_ds) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr existing_arr.append(var_data.values) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 519, in values return _as_array_or_item(self._data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 259, in _as_array_or_item data = np.asarray(data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 1541, in __array__ x = self.compute() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 288, in compute (result,) = compute(self, traverse=False, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 571, in compute results = schedule(dsk, keys, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py", line 79, in get results = get_async( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 507, in get_async 
raise_exception(exc, tb) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 315, in reraise raise exc File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task result = _execute_task(task, data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task return func(*(_execute_task(a, cache) for a in args)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 116, in getter c = np.asarray(c) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 357, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 551, in __array__ self._ensure_cached() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 548, in _ensure_cached self.array = NumpyIndexingAdapter(np.asarray(self.array)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 521, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 70, in __array__ return self.func(self.array) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 137, in _apply_mask data = np.asarray(data, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/zarr.py", line 73, in __getitem__ return array[key.tuple] File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 673, in __getitem__ return 
self.get_basic_selection(selection, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 798, in get_basic_selection return self._get_basic_selection_nd(selection=selection, out=out, File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 841, in _get_basic_selection_nd return self._get_selection(indexer=indexer, out=out, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1135, in _get_selection lchunk_coords, lchunk_selection, lout_selection = zip(*indexer) ValueError: not enough values to unpack (expected 3, got 0) ``` </details>
index: process
text:
🛑 processing failed valueerror overview valueerror found in processing task task during run ended on details flow name recovered cspp dosta abcdjm cspp instrument recovered task name processing task error type valueerror error message not enough values to unpack expected got traceback traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize data stream file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize data stream append to zarr mod ds final store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr append zarr store mod ds file srv conda envs notebook lib site packages ooi harvester processor utils py line in append zarr existing arr append var data values file srv conda envs notebook lib site packages xarray core variable py line in values return as array or item self data file srv conda envs notebook lib site packages xarray core variable py line in as array or item data np asarray data file srv conda envs notebook lib site packages dask array core py line in array x self compute file srv conda envs notebook lib site packages dask base py line in compute result compute self traverse false kwargs file srv conda envs notebook lib site packages dask base py line in compute results schedule dsk keys kwargs file srv conda envs notebook lib site packages dask threaded py line in get results get async file srv conda envs notebook lib site packages dask local py line in get async raise exception exc tb file srv conda envs notebook lib site packages dask local py line in reraise raise exc file srv conda envs notebook lib site packages dask local py line in execute task result execute task task data file srv conda envs notebook lib site packages dask core py line in execute task return func execute task a cache for a in args file srv conda envs notebook lib site 
packages dask array core py line in getter c np asarray c file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray self array dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array self ensure cached file srv conda envs notebook lib site packages xarray core indexing py line in ensure cached self array numpyindexingadapter np asarray self array file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray self array dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray array dtype none file srv conda envs notebook lib site packages xarray coding variables py line in array return self func self array file srv conda envs notebook lib site packages xarray coding variables py line in apply mask data np asarray data dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray array dtype none file srv conda envs notebook lib site packages xarray backends zarr py line in getitem return array file srv conda envs notebook lib site packages zarr core py line in getitem return self get basic selection selection fields fields file srv conda envs notebook lib site packages zarr core py line in get basic selection return self get basic selection nd selection selection out out file srv conda envs notebook lib site packages zarr core py line in get basic selection nd return self get selection indexer indexer out out fields fields file srv conda envs notebook lib site packages zarr core py line in get selection lchunk coords lchunk selection lout selection zip indexer valueerror not enough values to unpack expected got
binary_label: 1
---
Unnamed: 0: 20,555
id: 15,687,134,909
type: IssuesEvent
created_at: 2021-03-25 13:21:04
repo: SanderMertens/flecs
repo_url: https://api.github.com/repos/SanderMertens/flecs
action: closed
title: Update API & query parser to new entity-relation terminology/notation
labels: deprecates enhancement usability
body:
**Describe the problem you are trying to solve.** With the upcoming new query implementation (#295) the primary use case for traits will be as a way to store relationships between entities. The terminology used by the current API does not make this obvious or intuitive. Additionally, there are a few inconsistencies between the C and C++ API that need to be addressed (the C API uses the reverse order of specifying traits as the C++ API). **Describe the solution you'd like** Modify the API & query parser to use terminology/notation as described here: https://github.com/SanderMertens/flecs/discussions/344
label: True
text_combine:
Update API & query parser to new entity-relation terminology/notation - **Describe the problem you are trying to solve.** With the upcoming new query implementation (#295) the primary use case for traits will be as a way to store relationships between entities. The terminology used by the current API does not make this obvious or intuitive. Additionally, there are a few inconsistencies between the C and C++ API that need to be addressed (the C API uses the reverse order of specifying traits as the C++ API). **Describe the solution you'd like** Modify the API & query parser to use terminology/notation as described here: https://github.com/SanderMertens/flecs/discussions/344
non_process
update api query parser to new entity relation terminology notation describe the problem you are trying to solve with the upcoming new query implementation the primary use case for traits will be as a way to store relationships between entities the terminology used by the current api does not make this obvious or intuitive additionally there are a few inconsistencies between the c and c api that need to be addressed the c api uses the reverse order of specifying traits as the c api describe the solution you d like modify the api query parser to use terminology notation as described here
0
172,836
6,516,774,390
IssuesEvent
2017-08-27 14:13:48
Polpetta/SecurityAndRiskManagementNotes
https://api.github.com/repos/Polpetta/SecurityAndRiskManagementNotes
closed
Missing slide
enhancement in progress priority:medium type:content
**Description** ![image](https://user-images.githubusercontent.com/6976484/29555848-35c9e19e-8724-11e7-9a3c-199e44c699c3.png) [BC]A_slide_is_missing(Sorry_for_the_underscores_but_my_spacebar_gave_out_xD) Shall_we_add_it_? **Other issues**
1.0
Missing slide - **Description** ![image](https://user-images.githubusercontent.com/6976484/29555848-35c9e19e-8724-11e7-9a3c-199e44c699c3.png) [BC]A_slide_is_missing(Sorry_for_the_underscores_but_my_spacebar_gave_out_xD) Shall_we_add_it_? **Other issues**
non_process
missing slide description a slide is missing sorry for the underscores but my spacebar gave out xd shall we add it other issues
0
13,533
16,066,012,108
IssuesEvent
2021-04-23 19:13:34
unicode-org/icu4x
https://api.github.com/repos/unicode-org/icu4x
closed
Extend CONTRIBUTING.md to explicitly state that the PR author can remove automatically added reviewers
C-process S-tiny T-docs-tests
Since GitHub is auto-adding reviewers, and we expect PR author to wait for all reviewers, the result is that sometimes github adds a reviewer and the PR author doesn't think they need that person to review it (for example because patch is trivial and got r+ already from someone else). It would be good to document that the PR author has the authority to remove pending reviewers if they consider the PR to be sufficiently reviewed.
1.0
Extend CONTRIBUTING.md to explicitly state that the PR author can remove automatically added reviewers - Since GitHub is auto-adding reviewers, and we expect PR author to wait for all reviewers, the result is that sometimes github adds a reviewer and the PR author doesn't think they need that person to review it (for example because patch is trivial and got r+ already from someone else). It would be good to document that the PR author has the authority to remove pending reviewers if they consider the PR to be sufficiently reviewed.
process
extend contributing md to explicitly state that the pr author can remove automatically added reviewers since github is auto adding reviewers and we expect pr author to wait for all reviewers the result is that sometimes github adds a reviewer and the pr author doesn t think they need that person to review it for example because patch is trivial and got r already from someone else it would be good to document that the pr author has the authority to remove pending reviewers if they consider the pr to be sufficiently reviewed
1
17,839
23,776,644,538
IssuesEvent
2022-09-01 21:43:58
Azure/azure-sdk-tools
https://api.github.com/repos/Azure/azure-sdk-tools
opened
SDK Review Step: Cadl
Engagement Experience WS: Process Tools & Automation
The purpose of this Epic is to define the gaps in the SDK Review process inside the Azure SDK Release process affected by Cadl. First off, how is Cadl affecting SDK? Is it?
1.0
SDK Review Step: Cadl - The purpose of this Epic is to define the gaps in the SDK Review process inside the Azure SDK Release process affected by Cadl. First off, how is Cadl affecting SDK? Is it?
process
sdk review step cadl the purpose of this epic is to define the gaps in the sdk review process inside the azure sdk release process affected by cadl first off how is cadl affecting sdk is it
1
366,288
10,819,332,859
IssuesEvent
2019-11-08 14:11:09
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
github.com - design is broken
browser-firefox-mobile engine-gecko priority-critical
<!-- @browser: Firefox Mobile 68.0 --> <!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 --> <!-- @reported_with: mobile-reporter --> **URL**: https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen **Browser / Version**: Firefox Mobile 68.0 **Operating System**: Android **Tested Another Browser**: No **Problem type**: Design is broken **Description**: Issue search bar doesn't show in mobile view **Steps to Reproduce**: I wanted to search if any issues had been reported for the website Twitter at the Webcompat Github. The search bar doesn't display in the mobile view but it does in desktop view. [![Screenshot Description](https://webcompat.com/uploads/2019/10/20d1727d-34c5-44d9-ba7a-a0c1fffed203-thumb.jpeg)](https://webcompat.com/uploads/2019/10/20d1727d-34c5-44d9-ba7a-a0c1fffed203.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20191024175008</li><li>channel: beta</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> <p>Console Messages:</p> <pre> ['[console.log([Nano] Nano Defender Activated :: github.com) moz-extension://56303829-27b3-44c3-94f4-8d12edf3f335/content/core.js:43:24]', '[console.log([Nano] Excluded :: uBO-Extra) moz-extension://56303829-27b3-44c3-94f4-8d12edf3f335/content/rules-common.js:295:28]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." 
{file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." 
{file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]'] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
github.com - design is broken - <!-- @browser: Firefox Mobile 68.0 --> <!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 --> <!-- @reported_with: mobile-reporter --> **URL**: https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen **Browser / Version**: Firefox Mobile 68.0 **Operating System**: Android **Tested Another Browser**: No **Problem type**: Design is broken **Description**: Issue search bar doesn't show in mobile view **Steps to Reproduce**: I wanted to search if any issues had been reported for the website Twitter at the Webcompat Github. The search bar doesn't display in the mobile view but it does in desktop view. [![Screenshot Description](https://webcompat.com/uploads/2019/10/20d1727d-34c5-44d9-ba7a-a0c1fffed203-thumb.jpeg)](https://webcompat.com/uploads/2019/10/20d1727d-34c5-44d9-ba7a-a0c1fffed203.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20191024175008</li><li>channel: beta</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> <p>Console Messages:</p> <pre> ['[console.log([Nano] Nano Defender Activated :: github.com) moz-extension://56303829-27b3-44c3-94f4-8d12edf3f335/content/core.js:43:24]', '[console.log([Nano] Excluded :: uBO-Extra) moz-extension://56303829-27b3-44c3-94f4-8d12edf3f335/content/rules-common.js:295:28]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." 
{file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." {file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]', '[JavaScript Error: "Content Security Policy: The pages settings blocked the loading of a resource at inline (script-src)." 
{file: "https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen" line: 1}]'] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_process
github com design is broken url browser version firefox mobile operating system android tested another browser no problem type design is broken description issue search bar doesn t show in mobile view steps to reproduce i wanted to search if any issues had been reported for the website twitter at the webcompat github the search bar doesn t display in the mobile view but it does in desktop view browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel beta hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false console messages nano defender activated github com moz extension content core js excluded ubo extra moz extension content rules common js from with ❤️
0
7,302
10,443,146,483
IssuesEvent
2019-09-18 14:21:59
threefoldtech/0-core
https://api.github.com/repos/threefoldtech/0-core
closed
0-OS auto-update
process_wontfix state_planned type_feature
The goal is to be able to update 0-core without the need of a reboot. Idea how to get there was: - make 0-core to NOT be the pid 1 of the system - create a new pid 1 that is just responsible to keep 0-core running (@maxux started working on https://github.com/maxux/0-init) - design an upgrade procedure where 0-OS just download the new version 0-core and restart it.
1.0
0-OS auto-update - The goal is to be able to update 0-core without the need of a reboot. Idea how to get there was: - make 0-core to NOT be the pid 1 of the system - create a new pid 1 that is just responsible to keep 0-core running (@maxux started working on https://github.com/maxux/0-init) - design an upgrade procedure where 0-OS just download the new version 0-core and restart it.
process
os auto update the goal is to be able to update core without the need of a reboot idea how to get there was make core to not be the pid of the system create a new pid that is just responsible to keep core running maxux started working on design an upgrade procedure where os just download the new version core and restart it
1
63,625
17,791,817,498
IssuesEvent
2021-08-31 17:04:44
gwaldron/osgearth
https://api.github.com/repos/gwaldron/osgearth
closed
Regression: Artifacts in draped geometry
defect
246ce0f14a71ea18b89d50e203cdd78b260b8e07 introduced rendering artifacts in `feature_draped_polygons.earth`. Before: ![image](https://user-images.githubusercontent.com/326618/131525665-ca4b3782-d24f-4f1e-b3fc-c8f985aee619.png) After: ![image](https://user-images.githubusercontent.com/326618/131525537-878b3c17-5434-410d-9ae8-931b01f5bcee.png)
1.0
Regression: Artifacts in draped geometry - 246ce0f14a71ea18b89d50e203cdd78b260b8e07 introduced rendering artifacts in `feature_draped_polygons.earth`. Before: ![image](https://user-images.githubusercontent.com/326618/131525665-ca4b3782-d24f-4f1e-b3fc-c8f985aee619.png) After: ![image](https://user-images.githubusercontent.com/326618/131525537-878b3c17-5434-410d-9ae8-931b01f5bcee.png)
non_process
regression artifacts in draped geometry introduced rendering artifacts in feature draped polygons earth before after
0
772,199
27,110,778,638
IssuesEvent
2023-02-15 15:10:45
codersforcauses/wadl
https://api.github.com/repos/codersforcauses/wadl
closed
Create manage divisions page
backend frontend priority::high stage 2 difficulty:extreme
## Prerequisite #172 ## Basic Information - [x] Dynamic routing depending on the level - [x] Ability to add new divisions - [x] Ability to choose a venue for that division - [x] Ability to add teams to a division - [x] modal should pop up - [x] Teams should have a colour chip depending on the priority that the team has for that venue - [x] Add divisions/ teams to firebase tournament structure - [x] Load divisions / teams from firebase if already exists - [x] Only update divisions in firebase if they have changes - [x] Allocate byes to all odd number of teams in a division This issue requires a few people including someone who knows backend
1.0
Create manage divisions page - ## Prerequisite #172 ## Basic Information - [x] Dynamic routing depending on the level - [x] Ability to add new divisions - [x] Ability to choose a venue for that division - [x] Ability to add teams to a division - [x] modal should pop up - [x] Teams should have a colour chip depending on the priority that the team has for that venue - [x] Add divisions/ teams to firebase tournament structure - [x] Load divisions / teams from firebase if already exists - [x] Only update divisions in firebase if they have changes - [x] Allocate byes to all odd number of teams in a division This issue requires a few people including someone who knows backend
non_process
create manage divisions page prerequisite basic information dynamic routing depending on the level ability to add new divisions ability to choose a venue for that division ability to add teams to a division modal should pop up teams should have a colour chip depending on the priority that the team has for that venue add divisions teams to firebase tournament structure load divisions teams from firebase if already exists only update divisions in firebase if they have changes allocate byes to all odd number of teams in a division this issue requires a few people including someone who knows backend
0
167,297
13,018,879,773
IssuesEvent
2020-07-26 19:34:01
Lassal/dbdiff
https://api.github.com/repos/Lassal/dbdiff
closed
Integrated tests for DBModelFS
Integration Tests
Test all public methods and outputs of the class br.lassal.dbvcs.tatubola.fs.DBModelFS The tests should cover all naming and filesystem structure used. Serialization don't need to be tested here.
1.0
Integrated tests for DBModelFS - Test all public methods and outputs of the class br.lassal.dbvcs.tatubola.fs.DBModelFS The tests should cover all naming and filesystem structure used. Serialization don't need to be tested here.
non_process
integrated tests for dbmodelfs test all public methods and outputs of the class br lassal dbvcs tatubola fs dbmodelfs the tests should cover all naming and filesystem structure used serialization don t need to be tested here
0
9,558
12,517,663,483
IssuesEvent
2020-06-03 11:34:32
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
Metabase rounding 17 digit strings
Priority:P2 Querying/Processor Type:Bug
Reported on Discourse here: http://discourse.metabase.com/t/bug-metabase-rounding-17-digit-strings/748 > I've got a very strange bug/feature appearing in one of my tables where a certain bit of user data (the Steam ID) is having its last digit rounded to the nearest 10. So 765611980024149**61** becomes 765611980024149**60** Was originally reported a long time ago, but seems like a recent comment shows it's still an issue.
1.0
Metabase rounding 17 digit strings - Reported on Discourse here: http://discourse.metabase.com/t/bug-metabase-rounding-17-digit-strings/748 > I've got a very strange bug/feature appearing in one of my tables where a certain bit of user data (the Steam ID) is having its last digit rounded to the nearest 10. So 765611980024149**61** becomes 765611980024149**60** Was originally reported a long time ago, but seems like a recent comment shows it's still an issue.
process
metabase rounding digit strings reported on discourse here i ve got a very strange bug feature appearing in one of my tables where a certain bit of user data the steam id is having it s last digit rounded to the nearest so becomes was originally reported a long time ago but seems like a recent comment shows it s still an issue
1
61,781
6,758,000,976
IssuesEvent
2017-10-24 12:54:50
ComputationalRadiationPhysics/alpaka
https://api.github.com/repos/ComputationalRadiationPhysics/alpaka
opened
travis CI test case takes too long
testing
One of your tests takes too long and will not finish. https://travis-ci.org/ComputationalRadiationPhysics/alpaka/jobs/291978888 @BenjaminW3 Can we shrink the tested examples?
1.0
travis CI test case takes too long - One of your tests takes too long and will not finish. https://travis-ci.org/ComputationalRadiationPhysics/alpaka/jobs/291978888 @BenjaminW3 Can we shrink the tested examples?
non_process
travis ci test case takes too long one of your tests takes too long and will not finish can we shrink the tested examples
0
6,519
9,605,319,755
IssuesEvent
2019-05-10 23:25:20
NuGet/Home
https://api.github.com/repos/NuGet/Home
opened
CI Preview Build Number doesn't match NuGet.exe Preview number
Area: Release Process
Build process generated a preview2 nuget.exe which was put on Nuget.org. However, I was releasing preview3. There is a variable in the build process that is not referenced anywhere else which is incrementing based on something irrelevant. Solution Just hard-code the preview# into the build script to match the VS Preview number.
1.0
CI Preview Build Number doesn't match NuGet.exe Preview number - Build process generated a preview2 nuget.exe which was put on Nuget.org. However, I was releasing preview3. There is a variable in the build process that is not referenced anywhere else which is incrementing based on something irrelevant. Solution Just hard-code the preview# into the build script to match the VS Preview number.
process
ci preview build number doesn t match nuget exe preview number build process generated a nuget exe which was put on nuget org however i was releasing there is a variable in the build process that is not referenced anywhere else which is incrementing based on something irrelevant solution just hard code the preview into the build script to match the vs preview number
1
14,855
18,250,298,950
IssuesEvent
2021-10-02 04:43:25
dtcenter/MET
https://api.github.com/repos/dtcenter/MET
closed
Create new Gen-Ens-Prod tool for ensemble product generation.
type: new feature priority: blocker reporting: DTC NOAA R2O requestor: METplus Team required: FOR DEVELOPMENT RELEASE MET: PreProcessing Tools (Grid) MET: Ensemble Verification
## Describe the New Feature ## Create a new tool for ensemble product generation named Gen-Ens-Prod. This tool requires a configuration file and should contain the ensemble product generation currently performed for the fields in the "ens" dictionary of Ensemble-Stat. Consider renaming the "ens" dictionary to "data" to be consistent with the conventions of the Grid-Diag tool. This tool does not process observations, but must support climatology mean and standard deviation data to support the use of climatological distribution percentile thresholds (e.g. >CDP75). ### Acceptance Testing ### Ensure that other MET tools can read the NetCDF output created by this tool. ### Time Estimate ### 2 weeks. ### Sub-Issues ### Consider breaking the new feature down into sub-issues. Sub-issues will likely be required. Not sure what they are yet. - [ ] *Add a checkbox for each sub-issue here.* ### Relevant Deadlines ### *List relevant project deadlines here or state NONE.* ### Funding Source ### Split 2793541, 2700041, 2799991 ## Define the Metadata ## ### Assignee ### - [x] Select **engineer(s)** or **no engineer** required: @JohnHalleyGotway - [x] Select **scientist(s)** or **no scientist** required: @j-opatz Can also solicit feedback from @mpm-meto, @jwolff-ncar, and @michelleharrold. ### Labels ### - [x] Select **component(s)** - [x] Select **priority** - [x] Select **requestor(s)** ### Projects and Milestone ### - [x] Select **Repository** and/or **Organization** level **Project(s)** or add **alert: NEED PROJECT ASSIGNMENT** label - [x] Select **Milestone** as the next official version or **Future Versions** ## Define Related Issue(s) ## Consider the impact to the other METplus components. 
- [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose) - [x] dtcenter/METplus#1180 - [x] No downstream impacts on METdatadb, METplotpy, METcalcpy, METviewer, or METexpress. ## New Feature Checklist ## See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details. - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding source**. - [ ] Fork this repository or create a branch of **develop**. Branch name: `feature_<Issue Number>_<Description>` - [ ] Complete the development and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update unit tests. - [ ] Add/update documentation. - [ ] Push local changes to GitHub. - [ ] Submit a pull request to merge into **develop**. Pull request: `feature <Issue Number> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)** and **Linked issues** Select: **Repository** level development cycle **Project** for the next official release Select: **Milestone** as the next official version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. - [ ] Close this issue.
1.0
Create new Gen-Ens-Prod tool for ensemble product generation. - ## Describe the New Feature ## Create a new tool for ensemble product generation named Gen-Ens-Prod. This tool requires a configuration file and should contain the ensemble product generation currently performed for the fields in the "ens" dictionary of Ensemble-Stat. Consider renaming the "ens" dictionary to "data" to be consistent with the conventions of the Grid-Diag tool. This tool does not process observations, but must support climatology mean and standard deviation data to support the use of climatological distribution percentile thresholds (e.g. >CDP75). ### Acceptance Testing ### Ensure that other MET tools can read the NetCDF output created by this tool. ### Time Estimate ### 2 weeks. ### Sub-Issues ### Consider breaking the new feature down into sub-issues. Sub-issues will likely be required. Not sure what they are yet. - [ ] *Add a checkbox for each sub-issue here.* ### Relevant Deadlines ### *List relevant project deadlines here or state NONE.* ### Funding Source ### Split 2793541, 2700041, 2799991 ## Define the Metadata ## ### Assignee ### - [x] Select **engineer(s)** or **no engineer** required: @JohnHalleyGotway - [x] Select **scientist(s)** or **no scientist** required: @j-opatz Can also solicit feedback from @mpm-meto, @jwolff-ncar, and @michelleharrold. ### Labels ### - [x] Select **component(s)** - [x] Select **priority** - [x] Select **requestor(s)** ### Projects and Milestone ### - [x] Select **Repository** and/or **Organization** level **Project(s)** or add **alert: NEED PROJECT ASSIGNMENT** label - [x] Select **Milestone** as the next official version or **Future Versions** ## Define Related Issue(s) ## Consider the impact to the other METplus components. 
- [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose) - [x] dtcenter/METplus#1180 - [x] No downstream impacts on METdatadb, METplotpy, METcalcpy, METviewer, or METexpress. ## New Feature Checklist ## See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details. - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding source**. - [ ] Fork this repository or create a branch of **develop**. Branch name: `feature_<Issue Number>_<Description>` - [ ] Complete the development and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update unit tests. - [ ] Add/update documentation. - [ ] Push local changes to GitHub. - [ ] Submit a pull request to merge into **develop**. Pull request: `feature <Issue Number> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)** and **Linked issues** Select: **Repository** level development cycle **Project** for the next official release Select: **Milestone** as the next official version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. - [ ] Close this issue.
process
create new gen ens prod tool for ensemble product generation describe the new feature create a new tool for ensemble product generation named gen ens prod this tool requires a configuration file and should contain the ensemble product generation currently performed for the fields in the ens dictionary of ensemble stat consider renaming the ens dictionary to data to be consistent with the conventions of the grid diag tool this tool does not process observations but must support climatology mean and standard deviation data to support the use of climatological distribution percentile thresholds e g acceptance testing ensure that other met tools can read the netcdf output created by this tool time estimate weeks sub issues consider breaking the new feature down into sub issues sub issues will likely be required not sure what they are yet add a checkbox for each sub issue here relevant deadlines list relevant project deadlines here or state none funding source split define the metadata assignee select engineer s or no engineer required johnhalleygotway select scientist s or no scientist required j opatz can also solicit feedback from mpm meto jwolff ncar and michelleharrold labels select component s select priority select requestor s projects and milestone select repository and or organization level project s or add alert need project assignment label select milestone as the next official version or future versions define related issue s consider the impact to the other metplus components dtcenter metplus no downstream impacts on metdatadb metplotpy metcalcpy metviewer or metexpress new feature checklist see the for details complete the issue definition above including the time estimate and funding source fork this repository or create a branch of develop branch name feature complete the development and test your changes add update log messages for easier debugging add update unit tests add update documentation push local changes to github submit a pull request to merge 
into develop pull request feature define the pull request metadata as permissions allow select reviewer s and linked issues select repository level development cycle project for the next official release select milestone as the next official version iterate until the reviewer s accept and merge your changes delete your fork or branch close this issue
1
28,721
5,533,697,013
IssuesEvent
2017-03-21 13:57:08
joshcummingsdesign/grizzly-wp
https://api.github.com/repos/joshcummingsdesign/grizzly-wp
closed
Ubermenu Nav Class Issue
documentation
Dropdowns show on page load. To fix this go to Ubermenu > General > Misc. and check "Disable Menu Item Class Filtering". This should be added to seed data. Add to the documentation that if filtering is needed, you should uncheck this option, but it may interfere with Ubermenu functionality. Also we may want to remove the "nav" class from header.php as it is not needed.
1.0
Ubermenu Nav Class Issue - Dropdowns show on page load. To fix this go to Ubermenu > General > Misc. and check "Disable Menu Item Class Filtering". This should be added to seed data. Add to the documentation that if filtering is needed, you should uncheck this option, but it may interfere with Ubermenu functionality. Also we may want to remove the "nav" class from header.php as it is not needed.
non_process
ubermenu nav class issue dropdowns show on page load to fix this go to ubermenu general misc and check disable menu item class filtering this should be added to seed data add to the documentation that if filtering is needed you should uncheck this option but it may interfere with ubermenu functionality also we may want to remove the nav class from header php as it is not needed
0
17,231
22,919,495,955
IssuesEvent
2022-07-17 12:58:38
nodejs/security-wg
https://api.github.com/repos/nodejs/security-wg
closed
Node+Interactive Security WG Agenda
process
Let's use this issue to track all the suggestions and recommended agenda for Security WG in upcoming Collab Summit in Node+Interactive Montreal. -- Update 13/12/2019 Participants: @lirantal @ChALkeR @vdeturckheim Working Items and Discussions: 1. Bug Bounty Criteria - agreed and confirmed on current guidelines that we've set via https://github.com/nodejs/security-wg/pull/603 , https://github.com/nodejs/security-wg/issues/593 and https://github.com/nodejs/security-wg/issues/525 2. Buckets and Prioritization - we are thinking that the current bucket order helps and a step in a good direction but we want to further improve it. Right now we have 2 buckets: `Low priority <100 d/l weekly` and `High priority >100 d/l weekly` and current stats are 30 vs 36 respectively. So it helps, but we'd like to also make sure we don't starve the reports in >100 that are affecting a good part of the ecosystem (i.e: >100000 downloads). To be discussed in https://github.com/nodejs/security-wg/issues/604 3. Membership activity / On-boarding Program - discussed that many triage group members are existing but not all are actively participating which doesn't improve our state of items in the queue for triage. Related to this topic, we discussed perhaps creating an on-boarding program where we would have members rotating for a 3-6 months on and off the triage team, since we the work can be burning out for most people, and also have a program where members start off with low priority buckets to help them grow with experience and into the "way of things" and then transition to one of the other higher priority buckets.
1.0
Node+Interactive Security WG Agenda - Let's use this issue to track all the suggestions and recommended agenda for Security WG in upcoming Collab Summit in Node+Interactive Montreal. -- Update 13/12/2019 Participants: @lirantal @ChALkeR @vdeturckheim Working Items and Discussions: 1. Bug Bounty Criteria - agreed and confirmed on current guidelines that we've set via https://github.com/nodejs/security-wg/pull/603 , https://github.com/nodejs/security-wg/issues/593 and https://github.com/nodejs/security-wg/issues/525 2. Buckets and Prioritization - we are thinking that the current bucket order helps and a step in a good direction but we want to further improve it. Right now we have 2 buckets: `Low priority <100 d/l weekly` and `High priority >100 d/l weekly` and current stats are 30 vs 36 respectively. So it helps, but we'd like to also make sure we don't starve the reports in >100 that are affecting a good part of the ecosystem (i.e: >100000 downloads). To be discussed in https://github.com/nodejs/security-wg/issues/604 3. Membership activity / On-boarding Program - discussed that many triage group members are existing but not all are actively participating which doesn't improve our state of items in the queue for triage. Related to this topic, we discussed perhaps creating an on-boarding program where we would have members rotating for a 3-6 months on and off the triage team, since we the work can be burning out for most people, and also have a program where members start off with low priority buckets to help them grow with experience and into the "way of things" and then transition to one of the other higher priority buckets.
process
node interactive security wg agenda let s use this issue to track all the suggestions and recommended agenda for security wg in upcoming collab summit in node interactive montreal update participants lirantal chalker vdeturckheim working items and discussions bug bounty criteria agreed and confirmed on current guidelines that we ve set via and buckets and prioritization we are thinking that the current bucket order helps and a step in a good direction but we want to further improve it right now we have buckets low priority d l weekly and current stats are vs respectively so it helps but we d like to also make sure we don t starve the reports in that are affecting a good part of the ecosystem i e downloads to be discussed in membership activity on boarding program discussed that many triage group members are existing but not all are actively participating which doesn t improve our state of items in the queue for triage related to this topic we discussed perhaps creating an on boarding program where we would have members rotating for a months on and off the triage team since we the work can be burning out for most people and also have a program where members start off with low priority buckets to help them grow with experience and into the way of things and then transition to one of the other higher priority buckets
1
9,808
12,820,886,333
IssuesEvent
2020-07-06 06:57:01
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
No or() in yml conditions
Pri2 devops-cicd-process/tech devops/prod
Is there no or(arg1, arg2) for conditions in yml? I did this: l195 `condition: and(succeeded(), or(eq(variables['Queue.Ssp'], 'ON'), ne(variables['Build.Reason'], 'PullRequest'))` and it gives me: /azure-pipelines.yml: (Line: 195, Col: 3, Idx: 7008) - (Line: 195, Col: 3, Idx: 7008): While scanning a simple key, could not find expected ':'. Queue.Ssp is currently undefined. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3f151218-9a11-0078-e038-f96198a76143 * Version Independent ID: 09c4d032-62f3-d97c-79d7-6fbfd89910e9 * Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/conditions.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
No or() in yml conditions - Is there no or(arg1, arg2) for conditions in yml? I did this: l195 `condition: and(succeeded(), or(eq(variables['Queue.Ssp'], 'ON'), ne(variables['Build.Reason'], 'PullRequest'))` and it gives me: /azure-pipelines.yml: (Line: 195, Col: 3, Idx: 7008) - (Line: 195, Col: 3, Idx: 7008): While scanning a simple key, could not find expected ':'. Queue.Ssp is currently undefined. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3f151218-9a11-0078-e038-f96198a76143 * Version Independent ID: 09c4d032-62f3-d97c-79d7-6fbfd89910e9 * Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/conditions.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
no or in yml conditions is there no or for conditions in yml i did this condition and succeeded or eq variables on ne variables pullrequest and it gives me azure pipelines yml line col idx line col idx while scanning a simple key could not find expected queue ssp is currently undefined document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
28,550
13,746,088,239
IssuesEvent
2020-10-06 04:47:57
eclipse/deeplearning4j
https://api.github.com/repos/eclipse/deeplearning4j
closed
Review Deep Compression for implementation potential regarding deep nets
Performance
Review recent paper http://arxiv.org/pdf/1510.00149v5.pdf to determine how to leverage the compression techniques in order to improve memory and computation efficiencies for deep networks Consider alternative approach in this paper: http://arxiv.org/pdf/1511.06606v5.pdf
True
Review Deep Compression for implementation potential regarding deep nets - Review recent paper http://arxiv.org/pdf/1510.00149v5.pdf to determine how to leverage the compression techniques in order to improve memory and computation efficiencies for deep networks Consider alternative approach in this paper: http://arxiv.org/pdf/1511.06606v5.pdf
non_process
review deep compression for implementation potential regarding deep nets review recent paper to determine how to leverage the compression techniques in order to improve memory and computation efficiencies for deep networks consider alternative approach in this paper
0
17,965
23,974,715,571
IssuesEvent
2022-09-13 10:35:52
OpenDataScotland/the_od_bods
https://api.github.com/repos/OpenDataScotland/the_od_bods
closed
Fix NLS Licensing Treatment
bug good first issue data processing back end
Having removed licencing treatment in the NLS Scraper https://github.com/OpenDataScotland/the_od_bods/commit/56c70c9b63b68fc00324be633f7a086af862cda0 has knock on effects in end results. Correct this such that processing is done in merge_data.py but NLS scraper needs to return 1 licence as string. It's currently a list and some assets have multiple licences. Might want to consider why an asset SHOULD have more than 1 licence. Consider also existing PR https://github.com/OpenDataScotland/the_od_bods/pull/128 https://opendatascotland.slack.com/archives/C02HEHDL8AY/p1662122527467149?thread_ts=1662122284.974809&cid=C02HEHDL8AY ![image](https://user-images.githubusercontent.com/47697803/188674083-3aaae543-bf7e-469d-aeda-5f2528325199.png)
1.0
Fix NLS Licensing Treatment - Having removed licencing treatment in the NLS Scraper https://github.com/OpenDataScotland/the_od_bods/commit/56c70c9b63b68fc00324be633f7a086af862cda0 has knock on effects in end results. Correct this such that processing is done in merge_data.py but NLS scraper needs to return 1 licence as string. It's currently a list and some assets have multiple licences. Might want to consider why an asset SHOULD have more than 1 licence. Consider also existing PR https://github.com/OpenDataScotland/the_od_bods/pull/128 https://opendatascotland.slack.com/archives/C02HEHDL8AY/p1662122527467149?thread_ts=1662122284.974809&cid=C02HEHDL8AY ![image](https://user-images.githubusercontent.com/47697803/188674083-3aaae543-bf7e-469d-aeda-5f2528325199.png)
process
fix nls licensing treatment having removed licencing treatment in the nls scraper has knock on effects in end results correct this such that processing is done in merge data py but nls scraper needs to return licence as string it s currently a list and some assets have multiple licences might want to consider why an asset should have more than licence consider also existing pr
1
993
3,460,678,680
IssuesEvent
2015-12-19 11:08:15
osresearch/vst
https://api.github.com/repos/osresearch/vst
closed
Clipping is created after serial port
processing
`clip` is created after serial port early return, so it will cause null pointer exception if a valid serial port is found.
1.0
Clipping is created after serial port - `clip` is created after serial port early return, so it will cause null pointer exception if a valid serial port is found.
process
clipping is created after serial port clip is created after serial port early return so it will cause null pointer exception if a valid serial port is found
1
9,611
12,550,941,548
IssuesEvent
2020-06-06 13:01:29
Jeffail/benthos
https://api.github.com/repos/Jeffail/benthos
closed
sort with uniq values in array
enhancement processors
I have an array with duplicate values [1,2,3,1,10,1]. How can I sort an array with unique values? And get [1,2,3,10] The only way to do this is with awk, are there any other options?
1.0
sort with uniq values in array - I have an array with duplicate values [1,2,3,1,10,1]. How can I sort an array with unique values? And get [1,2,3,10] The only way to do this is with awk, are there any other options?
process
sort with uniq values in array i have an array with duplicate values how can i sort an array with unique values and get the only way to do this is with awk are there any other options
1
494,910
14,268,349,239
IssuesEvent
2020-11-20 22:16:03
kubernetes/minikube
https://api.github.com/repos/kubernetes/minikube
closed
convert dockerNetworkInspect to a struct
area/networking kind/cleanup priority/important-longterm
currently we use comma to get the info we need https://github.com/medyagh/minikube/blob/a84cbed217c45c34e13bce14e1c144bcde7ee168/pkg/drivers/kic/oci/network_create.go#L139 ``` cmd := exec.Command(Docker, "network", "inspect", name, "--format", `{{(index .IPAM.Config 0).Subnet}},{{(index .IPAM.Config 0).Gateway}},{{(index .Options "com.docker.network.driver.mtu")}}`) ``` one better approach is using something like this ``` docker network inspect minikube --format='{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{(index .IPAM.Config 0).Subnet}}","Gateway": "{{(index .IPAM.Config 0).Gateway}}",MTU: "{{(index .Options "com.docker.network.driver.mtu")}}",{{$first := true}} "ContainerIPs": [{{range $k,$v := .Containers }} {{if $first}} {{$first = false}}{{else}},{{end}}"{{$v.IPv4Address}}",{{end}}]}' ``` that will output a json ``` {"Name": "minikube","Driver": "bridge","Subnet": "192.168.49.0/24","Gateway": "192.168.49.1",MTU: "1500", "ContainerIPs": [ "192.168.49.3/24", ,"192.168.49.4/24", ,"192.168.49.2/24",]} ``` but the problem is making sure the dateway and subnet get parsed into netinfo struct so we get more information as a struct, the challenges are making sure it doesn't throw an exception in wide variety of platforms and exceptional cases. when we do json unmarshal
1.0
convert dockerNetworkInspect to a struct - currently we use comma to get the info we need https://github.com/medyagh/minikube/blob/a84cbed217c45c34e13bce14e1c144bcde7ee168/pkg/drivers/kic/oci/network_create.go#L139 ``` cmd := exec.Command(Docker, "network", "inspect", name, "--format", `{{(index .IPAM.Config 0).Subnet}},{{(index .IPAM.Config 0).Gateway}},{{(index .Options "com.docker.network.driver.mtu")}}`) ``` one better approach is using something like this ``` docker network inspect minikube --format='{"Name": "{{.Name}}","Driver": "{{.Driver}}","Subnet": "{{(index .IPAM.Config 0).Subnet}}","Gateway": "{{(index .IPAM.Config 0).Gateway}}",MTU: "{{(index .Options "com.docker.network.driver.mtu")}}",{{$first := true}} "ContainerIPs": [{{range $k,$v := .Containers }} {{if $first}} {{$first = false}}{{else}},{{end}}"{{$v.IPv4Address}}",{{end}}]}' ``` that will output a json ``` {"Name": "minikube","Driver": "bridge","Subnet": "192.168.49.0/24","Gateway": "192.168.49.1",MTU: "1500", "ContainerIPs": [ "192.168.49.3/24", ,"192.168.49.4/24", ,"192.168.49.2/24",]} ``` but the problem is making sure the dateway and subnet get parsed into netinfo struct so we get more information as a struct, the challenges are making sure it doesn't throw an exception in wide variety of platforms and exceptional cases. when we do json unmarshal
non_process
convert dockernetworkinspect to a struct currently we use comma to get the info we need cmd exec command docker network inspect name format index ipam config subnet index ipam config gateway index options com docker network driver mtu one better approach is using something like this docker network inspect minikube format name name driver driver subnet index ipam config subnet gateway index ipam config gateway mtu index options com docker network driver mtu first true containerips that will output a json name minikube driver bridge subnet gateway mtu containerips but the problem is making sure the dateway and subnet get parsed into netinfo struct so we get more information as a struct the challenges are making sure it doesn t throw an exception in wide variety of platforms and exceptional cases when we do json unmarshal
0
729,226
25,114,135,492
IssuesEvent
2022-11-08 23:40:00
E3SM-Project/zstash
https://api.github.com/repos/E3SM-Project/zstash
closed
Improve Globus transfer concurrency
Globus High priority
With --non-blocking option, zstash submits a Globus transfer and immediately creates a subsequent tarball. Zstash does not wait until the transfer completes to start creating a subsequent tarball. On machines where zstash tarballs are created faster than they are transferred to a remote endpoint, zstash will create multiple Globus transfers that are put in a queue by the Globus service. Zstash should submit transfers to transfer as many zstash tarballs as have been created at the moment. On machines where it takes more time to create a tarball than transfer it, each Globus transfer will have one file. On machines where it takes less time to create a tarball than transfer it, the first transfer will have one file, but the number of tarballs in subsequent transfers will grow finding dynamically the most optimal number of tarballs per transfer.
1.0
Improve Globus transfer concurrency - With --non-blocking option, zstash submits a Globus transfer and immediately creates a subsequent tarball. Zstash does not wait until the transfer completes to start creating a subsequent tarball. On machines where zstash tarballs are created faster than they are transferred to a remote endpoint, zstash will create multiple Globus transfers that are put in a queue by the Globus service. Zstash should submit transfers to transfer as many zstash tarballs as have been created at the moment. On machines where it takes more time to create a tarball than transfer it, each Globus transfer will have one file. On machines where it takes less time to create a tarball than transfer it, the first transfer will have one file, but the number of tarballs in subsequent transfers will grow finding dynamically the most optimal number of tarballs per transfer.
non_process
improve globus transfer concurrency with non blocking option zstash submits a globus transfer and immediately creates a subsequent tarball zstash does not wait until the transfer completes to start creating a subsequent tarball on machines where zstash tarballs are created faster than they are transferred to a remote endpoint zstash will create multiple globus transfers that are put in a queue by the globus service zstash should submit transfers to transfer as many zstash tarballs as have been created at the moment on machines where it takes more time to create a tarball than transfer it each globus transfer will have one file on machines where it takes less time to create a tarball than transfer it the first transfer will have one file but the number of tarballs in subsequent transfers will grow finding dynamically the most optimal number of tarballs per transfer
0
263,849
23,085,227,498
IssuesEvent
2022-07-26 10:46:22
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
In-proc event listener listening to native runtime event source events crashes on Mono Linux
disabled-test os-linux os-mac-os-x area-Tracing-mono
In PR https://github.com/dotnet/runtime/pull/66765 I tried to use an in-proc `EventListener` to listen to a thread pool event. On Mono, this appears to work fine on Windows but is crashing at the following stack on Linux. Also crashed on OSX in the PR's CI. The test is in `System.Threading.ThreadPool` tests named 'CooperativeBlockingCanCreateThreadsFaster'. For now, I'll disable the test on Mono in that PR. ``` ================================================================= Managed Stacktrace: ================================================================= at <unknown> <0xffffffff> at System.Runtime.InteropServices.GCHandle:InternalGet <0x00079> at System.Runtime.InteropServices.GCHandle:get_Target <0x00047> at HandleManager:FromHandle <0x000ab> at System.Threading.WaitSubsystem:SetEvent <0x0002b> at System.Threading.EventWaitHandle:Set <0x0004b> at System.Diagnostics.Tracing.EventPipeEventDispatcher:StopDispatchTask <0x0009b> at System.Diagnostics.Tracing.EventPipeEventDispatcher:CommitDispatchConfiguration <0x0007b> at System.Diagnostics.Tracing.EventPipeEventDispatcher:RemoveEventListener <0x000cb> at System.Diagnostics.Tracing.EventListener:RemoveReferencesToListenerInEventSources <0x002eb> at System.Diagnostics.Tracing.EventListener:Dispose <0x0010f> at <>c:<CooperativeBlockingCanCreateThreadsFaster>b__38_0 <0x0052f> at System.Object:runtime_invoke_void__this__ <0x00091> at <unknown> <0xffffffff> at System.Reflection.RuntimeMethodInfo:InternalInvoke <0x000b9> at System.Reflection.RuntimeMethodInfo:InvokeWorker <0x0009f> at System.Reflection.RuntimeMethodInfo:Invoke <0x001d3> at System.Reflection.MethodBase:Invoke <0x00049> at Microsoft.DotNet.RemoteExecutor.Program:Main <0x00313> at <Module>:runtime_invoke_int_object <0x00091> ================================================================= System.Threading.ThreadPools.Tests.ThreadPoolTests.CooperativeBlockingCanCreateThreadsFaster [FAIL] Exit code was 134 but it should have been 42 Expected: True Actual: 
False Stack Trace: /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs(239,0): at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose(Boolean disposing) /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs(57,0): at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose() /home/user/runtime/src/libraries/System.Threading.ThreadPool/tests/ThreadPoolTests.cs(914,0): at System.Threading.ThreadPools.Tests.ThreadPoolTests.CooperativeBlockingCanCreateThreadsFaster() /home/user/runtime/src/mono/System.Private.CoreLib/src/System/Reflection/RuntimeMethodInfo.Mono.cs(386,0): at System.Reflection.RuntimeMethodInfo.InvokeWorker(Object obj, BindingFlags invokeAttr, Span`1 parameters) Finished: System.Threading.ThreadPool.Tests ```
1.0
In-proc event listener listening to native runtime event source events crashes on Mono Linux - In PR https://github.com/dotnet/runtime/pull/66765 I tried to use an in-proc `EventListener` to listen to a thread pool event. On Mono, this appears to work fine on Windows but is crashing at the following stack on Linux. Also crashed on OSX in the PR's CI. The test is in `System.Threading.ThreadPool` tests named 'CooperativeBlockingCanCreateThreadsFaster'. For now, I'll disable the test on Mono in that PR. ``` ================================================================= Managed Stacktrace: ================================================================= at <unknown> <0xffffffff> at System.Runtime.InteropServices.GCHandle:InternalGet <0x00079> at System.Runtime.InteropServices.GCHandle:get_Target <0x00047> at HandleManager:FromHandle <0x000ab> at System.Threading.WaitSubsystem:SetEvent <0x0002b> at System.Threading.EventWaitHandle:Set <0x0004b> at System.Diagnostics.Tracing.EventPipeEventDispatcher:StopDispatchTask <0x0009b> at System.Diagnostics.Tracing.EventPipeEventDispatcher:CommitDispatchConfiguration <0x0007b> at System.Diagnostics.Tracing.EventPipeEventDispatcher:RemoveEventListener <0x000cb> at System.Diagnostics.Tracing.EventListener:RemoveReferencesToListenerInEventSources <0x002eb> at System.Diagnostics.Tracing.EventListener:Dispose <0x0010f> at <>c:<CooperativeBlockingCanCreateThreadsFaster>b__38_0 <0x0052f> at System.Object:runtime_invoke_void__this__ <0x00091> at <unknown> <0xffffffff> at System.Reflection.RuntimeMethodInfo:InternalInvoke <0x000b9> at System.Reflection.RuntimeMethodInfo:InvokeWorker <0x0009f> at System.Reflection.RuntimeMethodInfo:Invoke <0x001d3> at System.Reflection.MethodBase:Invoke <0x00049> at Microsoft.DotNet.RemoteExecutor.Program:Main <0x00313> at <Module>:runtime_invoke_int_object <0x00091> ================================================================= 
System.Threading.ThreadPools.Tests.ThreadPoolTests.CooperativeBlockingCanCreateThreadsFaster [FAIL] Exit code was 134 but it should have been 42 Expected: True Actual: False Stack Trace: /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs(239,0): at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose(Boolean disposing) /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs(57,0): at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose() /home/user/runtime/src/libraries/System.Threading.ThreadPool/tests/ThreadPoolTests.cs(914,0): at System.Threading.ThreadPools.Tests.ThreadPoolTests.CooperativeBlockingCanCreateThreadsFaster() /home/user/runtime/src/mono/System.Private.CoreLib/src/System/Reflection/RuntimeMethodInfo.Mono.cs(386,0): at System.Reflection.RuntimeMethodInfo.InvokeWorker(Object obj, BindingFlags invokeAttr, Span`1 parameters) Finished: System.Threading.ThreadPool.Tests ```
non_process
in proc event listener listening to native runtime event source events crashes on mono linux in pr i tried to use an in proc eventlistener to listen to a thread pool event on mono this appears to work fine on windows but is crashing at the following stack on linux also crashed on osx in the pr s ci the test is in system threading threadpool tests named cooperativeblockingcancreatethreadsfaster for now i ll disable the test on mono in that pr managed stacktrace at at system runtime interopservices gchandle internalget at system runtime interopservices gchandle get target at handlemanager fromhandle at system threading waitsubsystem setevent at system threading eventwaithandle set at system diagnostics tracing eventpipeeventdispatcher stopdispatchtask at system diagnostics tracing eventpipeeventdispatcher commitdispatchconfiguration at system diagnostics tracing eventpipeeventdispatcher removeeventlistener at system diagnostics tracing eventlistener removereferencestolistenerineventsources at system diagnostics tracing eventlistener dispose at c b at system object runtime invoke void this at at system reflection runtimemethodinfo internalinvoke at system reflection runtimemethodinfo invokeworker at system reflection runtimemethodinfo invoke at system reflection methodbase invoke at microsoft dotnet remoteexecutor program main at runtime invoke int object system threading threadpools tests threadpooltests cooperativeblockingcancreatethreadsfaster exit code was but it should have been expected true actual false stack trace src microsoft dotnet remoteexecutor src remoteinvokehandle cs at microsoft dotnet remoteexecutor remoteinvokehandle dispose boolean disposing src microsoft dotnet remoteexecutor src remoteinvokehandle cs at microsoft dotnet remoteexecutor remoteinvokehandle dispose home user runtime src libraries system threading threadpool tests threadpooltests cs at system threading threadpools tests threadpooltests cooperativeblockingcancreatethreadsfaster home 
user runtime src mono system private corelib src system reflection runtimemethodinfo mono cs at system reflection runtimemethodinfo invokeworker object obj bindingflags invokeattr span parameters finished system threading threadpool tests
0
70,140
7,177,938,049
IssuesEvent
2018-01-31 15:09:13
percival-detector/percivalui
https://api.github.com/repos/percival-detector/percivalui
closed
Remove "Apply Setpoint" command from the Web GUI
ELETTRA tested: PASS GUI only...
APPLY SETPOINT: remove from GUI and leave as script, for the moment. The operation "Apply setpoint", similarly to "Scan setpoints" applies to all channels. However, "Apply setpoint" does not use steps and delays… therefore it is extremely risky to use it and we probably won't ever use it… Unless we find a case in which it turns out to be useful, the function might be fully discarded.
1.0
Remove "Apply Setpoint" command from the Web GUI - APPLY SETPOINT: remove from GUI and leave as script, for the moment. The operation "Apply setpoint", similarly to "Scan setpoints" applies to all channels. However, "Apply setpoint" does not use steps and delays… therefore it is extremely risky to use it and we probably won't ever use it… Unless we find a case in which it turns out to be useful, the function might be fully discarded.
non_process
remove apply setpoint command from the web gui apply setpoint remove from gui and leave as script for the moment the operation apply setpoint similarly to scan setpoints applies to all channels however apply setpoint does not use steps and delays… therefore it is extremely risky to use it and we probably won t ever use it… unless we find a case in which it turns out to be useful the function might be fully discarded
0
9,930
12,967,080,065
IssuesEvent
2020-07-21 02:13:50
pingcap/tidb
https://api.github.com/repos/pingcap/tidb
closed
Inconsistent implementation of datetime casting
component/coprocessor type/bug
## Bug Report Please answer these questions before submitting your issue. Thanks! 1. What did you do? ```sql drop table if exists t; create table t (a varchar(100)); insert into t values ('2010-02-12t12:23:34'); select count(*) from t where cast(a as datetime) = cast('2010-02-12t12:23:34' as datetime) or (cast(a as datetime) is null and cast('2010-02-12t12:23:34' as datetime) is null); ``` Related PR: https://github.com/tikv/tikv/blob/6ef8f2454a6039c014ad0945ee7aaea673ab1d53/components/tidb_query/src/codec/mysql/duration.rs#L749 2. What did you expect to see? ``` +----------+ | count(*) | +----------+ | 1 | +----------+ ``` In addition, tidb is also incompatible with mysql. ```txt # on mysql 8.0.17 > select cast('2010-02-12t12:23:34' as datetime) +-----------------------------------------+ | cast('2010-02-12t12:23:34' as datetime) | +-----------------------------------------+ | 2010-02-12 00:00:00 | +-----------------------------------------+ # on tidb master > select cast('2010-02-12t12:23:34' as datetime) +-----------------------------------------+ | cast('2010-02-12t12:23:34' as datetime) | +-----------------------------------------+ | 2010-02-12 12:23:34 | +-----------------------------------------+ ``` 3. What did you see instead? ``` +----------+ | count(*) | +----------+ | 0 | +----------+ ``` 4. What version of TiDB are you using (`tidb-server -V` or run `select tidb_version();` on TiDB)? 
``` > select tidb_version(); +-------------------------------------------------------------------+ | tidb_version() | +-------------------------------------------------------------------+ | Release Version: v4.0.0-beta-253-g0f1974e | | Git Commit Hash: 0f1974ebee02b4fec1499f0df8951fea684ba755 | | Git Branch: HEAD | | UTC Build Time: 2020-02-28 06:49:34 | | GoVersion: go1.13.7 | | Race Enabled: false | | TiKV Min Version: v3.0.0-60965b006877ca7234adaced7890d7b029ed1306 | | Check Table Before Drop: false | +-------------------------------------------------------------------+ ```
1.0
Inconsistent implementation of datetime casting - ## Bug Report Please answer these questions before submitting your issue. Thanks! 1. What did you do? ```sql drop table if exists t; create table t (a varchar(100)); insert into t values ('2010-02-12t12:23:34'); select count(*) from t where cast(a as datetime) = cast('2010-02-12t12:23:34' as datetime) or (cast(a as datetime) is null and cast('2010-02-12t12:23:34' as datetime) is null); ``` Related PR: https://github.com/tikv/tikv/blob/6ef8f2454a6039c014ad0945ee7aaea673ab1d53/components/tidb_query/src/codec/mysql/duration.rs#L749 2. What did you expect to see? ``` +----------+ | count(*) | +----------+ | 1 | +----------+ ``` In addition, tidb is also incompatible with mysql. ```txt # on mysql 8.0.17 > select cast('2010-02-12t12:23:34' as datetime) +-----------------------------------------+ | cast('2010-02-12t12:23:34' as datetime) | +-----------------------------------------+ | 2010-02-12 00:00:00 | +-----------------------------------------+ # on tidb master > select cast('2010-02-12t12:23:34' as datetime) +-----------------------------------------+ | cast('2010-02-12t12:23:34' as datetime) | +-----------------------------------------+ | 2010-02-12 12:23:34 | +-----------------------------------------+ ``` 3. What did you see instead? ``` +----------+ | count(*) | +----------+ | 0 | +----------+ ``` 4. What version of TiDB are you using (`tidb-server -V` or run `select tidb_version();` on TiDB)? 
``` > select tidb_version(); +-------------------------------------------------------------------+ | tidb_version() | +-------------------------------------------------------------------+ | Release Version: v4.0.0-beta-253-g0f1974e | | Git Commit Hash: 0f1974ebee02b4fec1499f0df8951fea684ba755 | | Git Branch: HEAD | | UTC Build Time: 2020-02-28 06:49:34 | | GoVersion: go1.13.7 | | Race Enabled: false | | TiKV Min Version: v3.0.0-60965b006877ca7234adaced7890d7b029ed1306 | | Check Table Before Drop: false | +-------------------------------------------------------------------+ ```
process
inconsistent implementation of datetime casting bug report please answer these questions before submitting your issue thanks what did you do sql drop table if exists t create table t a varchar insert into t values select count from t where cast a as datetime cast as datetime or cast a as datetime is null and cast as datetime is null related pr what did you expect to see count in addition tidb is also incompatible with mysql txt on mysql select cast as datetime cast as datetime on tidb master select cast as datetime cast as datetime what did you see instead count what version of tidb are you using tidb server v or run select tidb version on tidb select tidb version tidb version release version beta git commit hash git branch head utc build time goversion race enabled false tikv min version check table before drop false
1
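The parsing disagreement in the TiDB record above can be reproduced outside the database. The sketch below is plain Python, not TiDB or TiKV code: `datetime.fromisoformat` happens to accept any single separator character (Python 3.7+), mirroring the lenient TiDB behaviour in the report, while keeping only the leading date mirrors the reported MySQL 8 result.

```python
from datetime import datetime

def parse_lenient(s):
    # TiDB-like behaviour from the report: a lowercase 't' (or any single
    # character) between date and time is accepted as a separator.
    return datetime.fromisoformat(s)

def parse_date_only(s):
    # MySQL-8-like behaviour from the report: the trailing "t12:23:34" is
    # dropped and only the leading date survives, i.e. midnight.
    return datetime.strptime(s[:10], "%Y-%m-%d")

s = "2010-02-12t12:23:34"
print(parse_lenient(s))    # 2010-02-12 12:23:34
print(parse_date_only(s))  # 2010-02-12 00:00:00
```

The two results differ exactly as in the record's MySQL-vs-TiDB comparison, which is why the `count(*)` query returns different answers depending on which convention the cast follows.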
3,999
6,216,888,933
IssuesEvent
2017-07-08 09:06:05
prometheus/prometheus
https://api.github.com/repos/prometheus/prometheus
opened
Make sure SD mechanisms are resilient to failure
component/service discovery
There have been a few reports of all/many targets being removed when a SD failed in some way. We should keep using the old target information when a SD fails. Confirm this is what we're doing inside Prometheus, and within each SD mechanism (e.g. SD mechanisms should hard fail rather than returning partial results).
1.0
Make sure SD mechanisms are resilient to failure - There have been a few reports of all/many targets being removed when a SD failed in some way. We should keep using the old target information when a SD fails. Confirm this is what we're doing inside Prometheus, and within each SD mechanism (e.g. SD mechanisms should hard fail rather than returning partial results).
non_process
make sure sd mechanisms are resilient to failure there have been a few reports of all many targets being removed when a sd failed in some way we should keep using the old target information when a sd fails confirm this is what we re doing inside prometheus and within each sd mechanism e g sd mechanisms should hard fail rather than returning partial results
0
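The fallback behaviour the Prometheus issue asks for ("keep using the old target information when a SD fails") can be sketched as a caching wrapper around a discovery call. This is an illustrative Python sketch, not Prometheus's actual Go service-discovery code; the class and method names are invented.

```python
class ResilientDiscovery:
    """Serve the last-known-good target list when discovery fails.

    A minimal sketch of the requested behaviour: the wrapped callable
    should hard-fail (raise) rather than return partial results, so the
    wrapper can tell "backend down" apart from "targets really gone".
    """

    def __init__(self, discover):
        self._discover = discover  # callable returning an iterable of targets
        self._last_good = []

    def targets(self):
        try:
            result = list(self._discover())
        except Exception:
            # Hard failure: fall back to the old targets instead of
            # wiping them all out.
            return self._last_good
        self._last_good = result
        return self._last_good
```

A flaky backend then leaves the scrape targets unchanged instead of removing all of them, which is the failure mode the issue reports.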
21,071
28,017,404,363
IssuesEvent
2023-03-28 00:36:35
nephio-project/sig-release
https://api.github.com/repos/nephio-project/sig-release
opened
Document Contributor development processes and environments set up
area/process-mgmt sig/release
Document the procedure to : 1. Create development and build environment 2. Build and test code 3. PR process and guidelines
1.0
Document Contributor development processes and environments set up - Document the procedure to : 1. Create development and build environment 2. Build and test code 3. PR process and guidelines
process
document contributor development processes and environments set up document the procedure to create development and build environment build and test code pr process and guidelines
1
62,595
6,800,967,859
IssuesEvent
2017-11-02 15:28:54
ckeditor/ckeditor-dev
https://api.github.com/repos/ckeditor/ckeditor-dev
closed
Test /tests/core/ckeditor/basepathglobal should work for custom aliases
status:confirmed type:failingtest type:task
## Are you reporting a feature request or a bug? Task ## Provide detailed reproduction steps (if any) If there is no `sub.ckeditor.dev` specified in the local environment `/etc/hosts`, the test `/tests/core/ckeditor/basepathglobal` fails on Safari. The test could be more flexible and work with custom aliases instead of using the fixed `sub.ckeditor.dev`. See [comment](https://github.com/ckeditor/ckeditor-dev/pull/1082#pullrequestreview-71501634).
1.0
Test /tests/core/ckeditor/basepathglobal should work for custom aliases - ## Are you reporting a feature request or a bug? Task ## Provide detailed reproduction steps (if any) If there is no `sub.ckeditor.dev` specified in the local environment `/etc/hosts`, the test `/tests/core/ckeditor/basepathglobal` fails on Safari. The test could be more flexible and work with custom aliases instead of using the fixed `sub.ckeditor.dev`. See [comment](https://github.com/ckeditor/ckeditor-dev/pull/1082#pullrequestreview-71501634).
non_process
test tests core ckeditor basepathglobal should work for custom aliases are you reporting a feature request or a bug task provide detailed reproduction steps if any if there is no sub ckeditor dev specified in the local environment etc hosts test tests core ckeditor basepathglobal fails on safari it could be more flexible and work for custom aliases instead of using fixed sub ckeditor dev see
0
204,368
23,239,536,284
IssuesEvent
2022-08-03 14:30:46
turkdevops/angular
https://api.github.com/repos/turkdevops/angular
closed
CVE-2020-7656 (Medium) detected in jquery-1.4.4.min.js, jquery-1.8.1.min.js - autoclosed
security vulnerability
## CVE-2020-7656 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-1.4.4.min.js</b>, <b>jquery-1.8.1.min.js</b></p></summary> <p> <details><summary><b>jquery-1.4.4.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js</a></p> <p>Path to dependency file: /node_modules/selenium-webdriver/lib/test/data/droppableItems.html</p> <p>Path to vulnerable library: /node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js,/node_modules/protractor/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js,/node_modules/webdriver-js-extender/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.4.4.min.js** (Vulnerable Library) </details> <details><summary><b>jquery-1.8.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js</a></p> <p>Path to dependency file: /node_modules/redeyed/examples/browser/index.html</p> <p>Path to vulnerable library: /node_modules/redeyed/examples/browser/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.8.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/turkdevops/angular/commit/c6aca37f442da8c55a02d7c53ccc58100ab004f3">c6aca37f442da8c55a02d7c53ccc58100ab004f3</a></p> <p>Found in base branch: <b>labs/router</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jquery prior to 1.9.0 allows 
Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed. <p>Publish Date: 2020-05-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-q4m3-2j7h-f7xw">https://github.com/advisories/GHSA-q4m3-2j7h-f7xw</a></p> <p>Release Date: 2020-05-28</p> <p>Fix Resolution: jquery - 1.9.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7656 (Medium) detected in jquery-1.4.4.min.js, jquery-1.8.1.min.js - autoclosed - ## CVE-2020-7656 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-1.4.4.min.js</b>, <b>jquery-1.8.1.min.js</b></p></summary> <p> <details><summary><b>jquery-1.4.4.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js</a></p> <p>Path to dependency file: /node_modules/selenium-webdriver/lib/test/data/droppableItems.html</p> <p>Path to vulnerable library: /node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js,/node_modules/protractor/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js,/node_modules/webdriver-js-extender/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.4.4.min.js** (Vulnerable Library) </details> <details><summary><b>jquery-1.8.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js</a></p> <p>Path to dependency file: /node_modules/redeyed/examples/browser/index.html</p> <p>Path to vulnerable library: /node_modules/redeyed/examples/browser/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.8.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/turkdevops/angular/commit/c6aca37f442da8c55a02d7c53ccc58100ab004f3">c6aca37f442da8c55a02d7c53ccc58100ab004f3</a></p> <p>Found in base branch: <b>labs/router</b></p> </p> </details> <p></p> <details><summary><img 
src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed. <p>Publish Date: 2020-05-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-q4m3-2j7h-f7xw">https://github.com/advisories/GHSA-q4m3-2j7h-f7xw</a></p> <p>Release Date: 2020-05-28</p> <p>Fix Resolution: jquery - 1.9.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in jquery min js jquery min js autoclosed cve medium severity vulnerability vulnerable libraries jquery min js jquery min js jquery min js javascript library for dom operations library home page a href path to dependency file node modules selenium webdriver lib test data droppableitems html path to vulnerable library node modules selenium webdriver lib test data js jquery min js node modules protractor node modules selenium webdriver lib test data js jquery min js node modules webdriver js extender node modules selenium webdriver lib test data js jquery min js dependency hierarchy x jquery min js vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file node modules redeyed examples browser index html path to vulnerable library node modules redeyed examples browser index html dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch labs router vulnerability details jquery prior to allows cross site scripting attacks via the load method the load method fails to recognize and remove html tags that contain a whitespace character i e which results in the enclosed script logic to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with mend
0
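The weakness described in the CVE record, a filter that misses `<script >` tags containing whitespace, is easy to demonstrate with a pair of regexes. This is a Python analogue of the failure class, not jQuery's actual JavaScript fix.

```python
import re

# A naive filter that only matches "<script>...</script>" exactly:
naive = re.compile(r"<script>.*?</script>", re.I | re.S)

# A tolerant filter that allows whitespace inside the tags, which is the
# gap the 1.9.0 fix closes:
tolerant = re.compile(r"<script\s*>.*?</script\s*>", re.I | re.S)

payload = 'before<script >alert(1)</script >after'
print(naive.sub("", payload))     # the script survives the naive filter
print(tolerant.sub("", payload))  # beforeafter
```

The single space inside `</script >` is enough to defeat the strict match, so the enclosed script logic reaches the document untouched.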
112,362
4,531,066,769
IssuesEvent
2016-09-08 00:16:14
theCrag/website
https://api.github.com/repos/theCrag/website
closed
Messed up maps/location position
0: Priority Hotfix 2: Bug fix
Hey, since today the map/location at a crag is wrongly positioned. It is behind the area/route list. As an example: ![unbenannt-1](https://cloud.githubusercontent.com/assets/3040689/18299825/4f81a0ae-74c4-11e6-8f68-cf13f47cd4b2.jpg)
1.0
Messed up maps/location position - Hey, since today the map/location at a crag is wrongly positioned. It is behind the area/route list. As an example: ![unbenannt-1](https://cloud.githubusercontent.com/assets/3040689/18299825/4f81a0ae-74c4-11e6-8f68-cf13f47cd4b2.jpg)
non_process
messed up maps location position hey since today the map location at a crag wrong positioned it is behind the area route list as an example
0
4,103
7,050,691,185
IssuesEvent
2018-01-03 08:05:32
openvstorage/framework
https://api.github.com/repos/openvstorage/framework
closed
create vpools error
process_wontfix
hi i met a question when i create vpools how can i solve this problem thanks the message log Dec 23 15:46:15 localhost gunicorn: 2017-12-23 08:46:15 52700 +0100 - localhost - 2832/140098847882608 - hybrids/storagedriver.py - fetch_statistics - 270 - ERROR - Error loading statistics_node from test-pool2k8aRVaFCTAB2GrBL: Python argument types in Dec 23 15:46:15 localhost gunicorn: StorageRouterClient.statistics_node(StorageRouterClient, str) Dec 23 15:46:15 localhost gunicorn: did not match C++ signature: Dec 23 15:46:15 localhost gunicorn: statistics_node(volumedriverfs::PythonClient {lvalue}, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > node_id, bool reset=False) Dec 23 15:46:15 localhost gunicorn: 2017-12-23 08:46:15 54100 +0100 - localhost - 2832/140098847882608 - hybrids/storagedriver.py - fetch_statistics - 271 - ERROR - Error loading statistics_node from test-vpoolk8aRVaFCTAB2GrBL: Python argument types in Dec 23 15:46:15 localhost gunicorn: StorageRouterClient.statistics_node(StorageRouterClient, str) Dec 23 15:46:15 localhost gunicorn: did not match C++ signature: Dec 23 15:46:15 localhost gunicorn: statistics_node(volumedriverfs::PythonClient {lvalue}, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > node_id, bool reset=False) Dec 23 15:46:16 localhost arakoon: 2017-12-23 15:46:16 226047 +0800 - localhost.localdomain - 1202/0 - arakoon - 5411 - info - 192.168.3.217:client_service:session=0 connection=192.168.3.217:client_service_1780 socket_address=ADDR_INET 192.168.3.217,57491
1.0
create vpools error - hi i met a question when i create vpools how can i solve this problem thanks the message log Dec 23 15:46:15 localhost gunicorn: 2017-12-23 08:46:15 52700 +0100 - localhost - 2832/140098847882608 - hybrids/storagedriver.py - fetch_statistics - 270 - ERROR - Error loading statistics_node from test-pool2k8aRVaFCTAB2GrBL: Python argument types in Dec 23 15:46:15 localhost gunicorn: StorageRouterClient.statistics_node(StorageRouterClient, str) Dec 23 15:46:15 localhost gunicorn: did not match C++ signature: Dec 23 15:46:15 localhost gunicorn: statistics_node(volumedriverfs::PythonClient {lvalue}, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > node_id, bool reset=False) Dec 23 15:46:15 localhost gunicorn: 2017-12-23 08:46:15 54100 +0100 - localhost - 2832/140098847882608 - hybrids/storagedriver.py - fetch_statistics - 271 - ERROR - Error loading statistics_node from test-vpoolk8aRVaFCTAB2GrBL: Python argument types in Dec 23 15:46:15 localhost gunicorn: StorageRouterClient.statistics_node(StorageRouterClient, str) Dec 23 15:46:15 localhost gunicorn: did not match C++ signature: Dec 23 15:46:15 localhost gunicorn: statistics_node(volumedriverfs::PythonClient {lvalue}, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > node_id, bool reset=False) Dec 23 15:46:16 localhost arakoon: 2017-12-23 15:46:16 226047 +0800 - localhost.localdomain - 1202/0 - arakoon - 5411 - info - 192.168.3.217:client_service:session=0 connection=192.168.3.217:client_service_1780 socket_address=ADDR_INET 192.168.3.217,57491
process
create vpools error hi i met a question when i create vpools how can i solve this problem thanks the message log dec localhost gunicorn localhost hybrids storagedriver py fetch statistics error error loading statistics node from test python argument types in dec localhost gunicorn storagerouterclient statistics node storagerouterclient str dec localhost gunicorn did not match c signature dec localhost gunicorn statistics node volumedriverfs pythonclient lvalue std basic string std allocator node id bool reset false dec localhost gunicorn localhost hybrids storagedriver py fetch statistics error error loading statistics node from test python argument types in dec localhost gunicorn storagerouterclient statistics node storagerouterclient str dec localhost gunicorn did not match c signature dec localhost gunicorn statistics node volumedriverfs pythonclient lvalue std basic string std allocator node id bool reset false dec localhost arakoon localhost localdomain arakoon info client service session connection client service socket address addr inet
1
9,979
13,022,794,944
IssuesEvent
2020-07-27 08:55:23
deepset-ai/haystack
https://api.github.com/repos/deepset-ai/haystack
closed
Writing HTML files to the document store
preprocessing question
Hello. If I have one HTML file containing the text from which I want to perform Question Answering, how should I proceed to write it to the document store? Can this solution be extended to the case where I have the information in multiple HTML files? Thank you so much.
1.0
Writing HTML files to the document store - Hello. If I have one HTML file containing the text from which I want to perform Question Answering, how should I proceed to write it to the document store? Can this solution be extended to the case where I have the information in multiple HTML files? Thank you so much.
process
writing html files to the document store hello if i have one html file containing the text from which i want to perform question answering how should i proceed to write them in the document store can this solution be extended to the case where i have the information in multiple html files thank you so much
1
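For the Haystack question above, one hedged approach is to extract the visible text from each HTML file and build one document dict per file; the same loop extends naturally to many files. The sketch uses only the standard library. The `name`/`text` field names are assumptions, so check the `write_documents` signature of the Haystack version in use.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script>/<style> contents."""

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

def html_to_document(html, name):
    # Build one plain dict per file; pass a list of such dicts to the
    # document store's write method (field names assumed, see lead-in).
    parser = TextExtractor()
    parser.feed(html)
    parser.close()
    return {"name": name, "text": " ".join(parser.chunks)}
```

For multiple files, map `html_to_document` over the directory listing and write the resulting list in one call.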
22,316
30,872,628,969
IssuesEvent
2023-08-03 12:25:30
benthosdev/benthos
https://api.github.com/repos/benthosdev/benthos
closed
subprocess executed without any input
question processors
Looks like subprocess processor executed immediatelly without any input. Doesn't matter if it's inside of `input` or `pileline` section. I'm trying to use this version, just installed with `curl -Lsf https://sh.benthos.dev | bash`: ``` Version: (devel) Date: 2022-12-21T18:56:52Z ``` ``` Operating System: Arch Linux Kernel: Linux 6.1.9-arch1-1 ``` config is pretty simple: ``` input: stdin: codec: lines processors: - mapping: root = content().uppercase() pipeline: processors: - subprocess: name: echo logger: level: ALL format: logfmt add_timestamp: true timestamp_name: time ``` I run `benthos -c config.yaml`, don't provide any input, just start the process. Immediatelly logs appear: ``` ➜ scribe git:(develop) ✗ benthos -c config.yaml INFO[2023-02-09T17:01:46+02:00] Running main config from specified file @service=benthos path=config.yaml INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Launching a benthos instance, use CTRL+C to close @service=benthos INFO[2023-02-09T17:01:46+02:00] Listening for HTTP requests at: http://0.0.0.0:4195 @service=benthos WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" 
path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 ... ``` What do I do wrong?
1.0
subprocess executed without any input - Looks like subprocess processor executed immediatelly without any input. Doesn't matter if it's inside of `input` or `pileline` section. I'm trying to use this version, just installed with `curl -Lsf https://sh.benthos.dev | bash`: ``` Version: (devel) Date: 2022-12-21T18:56:52Z ``` ``` Operating System: Arch Linux Kernel: Linux 6.1.9-arch1-1 ``` config is pretty simple: ``` input: stdin: codec: lines processors: - mapping: root = content().uppercase() pipeline: processors: - subprocess: name: echo logger: level: ALL format: logfmt add_timestamp: true timestamp_name: time ``` I run `benthos -c config.yaml`, don't provide any input, just start the process. Immediatelly logs appear: ``` ➜ scribe git:(develop) ✗ benthos -c config.yaml INFO[2023-02-09T17:01:46+02:00] Running main config from specified file @service=benthos path=config.yaml INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Launching a benthos instance, use CTRL+C to close @service=benthos INFO[2023-02-09T17:01:46+02:00] Listening for HTTP requests at: http://0.0.0.0:4195 @service=benthos WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 WARN[2023-02-09T17:01:46+02:00] Subprocess exited 
@service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 INFO[2023-02-09T17:01:46+02:00] Subprocess started @service=benthos label="" path=root.pipeline.processors.0 WARN[2023-02-09T17:01:46+02:00] Subprocess exited @service=benthos label="" path=root.pipeline.processors.0 ERRO[2023-02-09T17:01:46+02:00] Failed to read subprocess output: read |0: file already closed @service=benthos label="" path=root.pipeline.processors.0 ... ``` What do I do wrong?
process
subprocess executed without any input looks like subprocess processor executed immediatelly without any input doesn t matter if it s inside of input or pileline section i m trying to use this version just installed with curl lsf bash version devel date operating system arch linux kernel linux config is pretty simple input stdin codec lines processors mapping root content uppercase pipeline processors subprocess name echo logger level all format logfmt add timestamp true timestamp name time i run benthos c config yaml don t provide any input just start the process immediatelly logs appear ➜ scribe git develop ✗ benthos c config yaml info running main config from specified file service benthos path config yaml info subprocess started service benthos label path root pipeline processors info launching a benthos instance use ctrl c to close service benthos info listening for http requests at service benthos warn subprocess exited service benthos label path root pipeline processors erro failed to read subprocess output read file already closed service benthos label path root pipeline processors info subprocess started service benthos label path root pipeline processors warn subprocess exited service benthos label path root pipeline processors erro failed to read subprocess output read file already closed service benthos label path root pipeline processors info subprocess started service benthos label path root pipeline processors warn subprocess exited service benthos label path root pipeline processors erro failed to read subprocess output read file already closed service benthos label path root pipeline processors info subprocess started service benthos label path root pipeline processors warn subprocess exited service benthos label path root pipeline processors erro failed to read subprocess output read file already closed service benthos label path root pipeline processors info subprocess started service benthos label path root pipeline processors warn subprocess 
exited service benthos label path root pipeline processors erro failed to read subprocess output read file already closed service benthos label path root pipeline processors what do i do wrong
1
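The repeating "Subprocess started" / "Subprocess exited" lines in the Benthos record are consistent with `echo` exiting as soon as it is spawned: a streaming subprocess processor generally expects a long-lived program that reads one line per message from stdin and writes one line to stdout. The Python sketch below is not Benthos internals; it just shows that contract with a child process that stays alive between messages.

```python
import subprocess
import sys

# A long-lived line filter: read a line, write a line, repeat until
# stdin closes. `echo` prints its arguments once and exits, which is
# why the processor keeps restarting it.
CHILD_SRC = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    sys.stdout.write(line.upper())\n"
    "    sys.stdout.flush()\n"
)

def run_filter(lines):
    """Feed lines through the long-lived child, one message at a time."""
    proc = subprocess.Popen(
        [sys.executable, "-c", CHILD_SRC],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    out = []
    for line in lines:
        proc.stdin.write(line + "\n")
        proc.stdin.flush()
        out.append(proc.stdout.readline().rstrip("\n"))
    proc.stdin.close()
    proc.wait()
    return out

print(run_filter(["hello", "benthos"]))  # ['HELLO', 'BENTHOS']
```

Swapping the one-shot `echo` for a program shaped like `CHILD_SRC` would keep the subprocess alive across messages instead of triggering the restart loop in the logs.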
15,212
19,060,085,284
IssuesEvent
2021-11-26 06:03:45
rdoddanavar/hpr-sim
https://api.github.com/repos/rdoddanavar/hpr-sim
closed
RNG Control via Input File
pre-processing
- Specify distribution type - Validate, convert distribution fields and parameters
1.0
RNG Control via Input File - - Specify distribution type - Validate, convert distribution fields and parameters
process
rng control via input file specify distribution type validate convert distribution fields and parameters
1
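The validate-and-convert step the hpr-sim issue describes (specify a distribution type, then validate and convert its fields) can be sketched as a small schema check. Every field name and distribution below is a hypothetical placeholder, not hpr-sim's actual input format.

```python
# Hypothetical schema: allowed distribution types and their required
# parameters (not hpr-sim's real input specification).
DIST_PARAMS = {
    "normal": ("loc", "scale"),
    "uniform": ("low", "high"),
}

def validate_dist(spec):
    """Validate a distribution spec dict and convert parameters to float."""
    kind = spec.get("dist")
    if kind not in DIST_PARAMS:
        raise ValueError(f"unknown distribution type: {kind!r}")
    out = {"dist": kind}
    for param in DIST_PARAMS[kind]:
        if param not in spec:
            raise ValueError(f"{kind!r} requires parameter {param!r}")
        out[param] = float(spec[param])  # convert, failing loudly on junk
    return out
```

For example, `validate_dist({"dist": "uniform", "low": "0", "high": 1})` returns the spec with both bounds coerced to floats, while an unknown type raises immediately.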
16,743
21,911,845,706
IssuesEvent
2022-05-21 06:57:31
aiidateam/aiida-core
https://api.github.com/repos/aiidateam/aiida-core
closed
Process functions allow non-Data arguments to be passed as input
type/bug priority/important topic/engine topic/processes
MWE: ```python In [1]: from aiida import engine, orm In [2]: @engine.calcfunction ...: def test_kwargs(**kwargs): ...: for value in kwargs.values(): ...: assert isinstance(value, orm.Data) ...: ...: In [3]: test_kwargs(**{'a': orm.Int(1)}) Out[3]: {} In [4]: test_kwargs(**{'a': orm.Int(1), 'b': 1}) Report: [487|test_kwargs|on_except]: Traceback (most recent call last): File "/home/sph/.virtualenvs/aiida_dev/lib/python3.9/site-packages/plumpy/process_states.py", line 231, in execute result = self.run_fn(*self.args, **self.kwargs) File "/home/sph/code/aiida/env/dev/aiida-core/aiida/engine/processes/functions.py", line 395, in run result = self._func(*args, **kwargs) File "<ipython-input-2-5fb70fcb7652>", line 4, in test_kwargs assert isinstance(value, orm.Data) AssertionError ``` As you can see, the second invocation of the `test_kwargs` calcfunction passes a normal `int` as an argument, but the input validation does not catch it and simply executes the function where the kwargs contains the normal `int` value. In contrast, if we don't use `**kwargs` but explicitly define the arguments, the validation _does_ work: ```python In [5]: @engine.calcfunction ...: def test_explicit(a, b): ...: assert isinstance(a, orm.Data) ...: assert isinstance(b, orm.Data) In [6]: test_explicit(a=orm.Int(1), b=5) ValueError: Error occurred validating port 'inputs.b': value 'b' is not of the right type. Got '<class 'int'>', expected '(<class 'aiida.orm.nodes.data.data.Data'>,)' ``` Originally reported on the mailing list: https://groups.google.com/g/aiidausers/c/BmDOzte4vWQ/m/v7PZUTywBAAJ
1.0
Process functions allow non-Data arguments to be passed as input - MWE: ```python In [1]: from aiida import engine, orm In [2]: @engine.calcfunction ...: def test_kwargs(**kwargs): ...: for value in kwargs.values(): ...: assert isinstance(value, orm.Data) ...: ...: In [3]: test_kwargs(**{'a': orm.Int(1)}) Out[3]: {} In [4]: test_kwargs(**{'a': orm.Int(1), 'b': 1}) Report: [487|test_kwargs|on_except]: Traceback (most recent call last): File "/home/sph/.virtualenvs/aiida_dev/lib/python3.9/site-packages/plumpy/process_states.py", line 231, in execute result = self.run_fn(*self.args, **self.kwargs) File "/home/sph/code/aiida/env/dev/aiida-core/aiida/engine/processes/functions.py", line 395, in run result = self._func(*args, **kwargs) File "<ipython-input-2-5fb70fcb7652>", line 4, in test_kwargs assert isinstance(value, orm.Data) AssertionError ``` As you can see, the second invocation of the `test_kwargs` calcfunction passes a normal `int` as an argument, but the input validation does not catch it and simply executes the function where the kwargs contains the normal `int` value. In contrast, if we don't use `**kwargs` but explicitly define the arguments, the validation _does_ work: ```python In [5]: @engine.calcfunction ...: def test_explicit(a, b): ...: assert isinstance(a, orm.Data) ...: assert isinstance(b, orm.Data) In [6]: test_explicit(a=orm.Int(1), b=5) ValueError: Error occurred validating port 'inputs.b': value 'b' is not of the right type. Got '<class 'int'>', expected '(<class 'aiida.orm.nodes.data.data.Data'>,)' ``` Originally reported on the mailing list: https://groups.google.com/g/aiidausers/c/BmDOzte4vWQ/m/v7PZUTywBAAJ
process
process functions allow non data arguments to be passed as input mwe python in from aiida import engine orm in engine calcfunction def test kwargs kwargs for value in kwargs values assert isinstance value orm data in test kwargs a orm int out in test kwargs a orm int b report traceback most recent call last file home sph virtualenvs aiida dev lib site packages plumpy process states py line in execute result self run fn self args self kwargs file home sph code aiida env dev aiida core aiida engine processes functions py line in run result self func args kwargs file line in test kwargs assert isinstance value orm data assertionerror as you can see the second invocation of the test kwargs calcfunction passes a normal int as an argument but the input validation does not catch it and simply executes the function where the kwargs contains the normal int value in contrast if we don t use kwargs but explicitly define the arguments the validation does work python in engine calcfunction def test explicit a b assert isinstance a orm data assert isinstance b orm data in test explicit a orm int b valueerror error occurred validating port inputs b value b is not of the right type got expected originally reported on the mailing list
1
4,955
7,801,730,570
IssuesEvent
2018-06-10 01:52:20
uccser/verto
https://api.github.com/repos/uccser/verto
closed
Add optional parameter 'type' to 'boxed-text' tag
feature processor implementation
Could you please add an optional `type` parameter to the `boxed-text` processor, to allow optional styling in the HTML template. ## Example ``` {boxed-text type="quote"} Text for the quote here. {boxed-text end} ``` would render as ```html <div class="boxed-text boxed-text-quote"> <p>Text for the quote here.</p> </div> ``` ## Details - This should function the same as the `panel` parameter of the same name, though this one is optional. - This allows styled block text without a heading; it makes more sense to do this than make the `panel` title optional. After this change the only difference between `boxed-text` and `panel` is that a `panel` requires a title as it can be collapsed.
1.0
Add optional parameter 'type' to 'boxed-text' tag - Could you please add an optional `type` parameter to the `boxed-text` processor, to allow optional styling in the HTML template. ## Example ``` {boxed-text type="quote"} Text for the quote here. {boxed-text end} ``` would render as ```html <div class="boxed-text boxed-text-quote"> <p>Text for the quote here.</p> </div> ``` ## Details - This should function the same as the `panel` parameter of the same name, though this one is optional. - This allows styled block text without a heading; it makes more sense to do this than make the `panel` title optional. After this change the only difference between `boxed-text` and `panel` is that a `panel` requires a title as it can be collapsed.
process
add optional parameter type to boxed text tag could you please add an optional type parameter to the boxed text processor to allow optional styling in the html template example boxed text type quote text for the quote here boxed text end would render as html text for the quote here details this should function the same as the panel parameter of the same name though this one is optional this allows styled block text without a heading it makes more sense to do this than make the panel title optional after this change the only difference between boxed text and panel is that a panel requires a title as it can be collapsed
1
3,247
2,610,059,079
IssuesEvent
2015-02-26 18:17:26
chrsmith/jsjsj122
https://api.github.com/repos/chrsmith/jsjsj122
opened
Which hospital in Huangyan is professional for treating infertility
auto-migrated Priority-Medium Type-Defect
``` Which hospital in Huangyan is professional for treating infertility [Taizhou Wuzhou Reproductive Hospital] 24-hour health consultation hotline: 0576-88066933 (QQ 800080609) (WeChat ID tzwzszyy). Hospital address: No. 229 Fengnan Road, Jiaojiang District, Taizhou (next to the Fengnan roundabout). Bus routes: take bus 104, 108, 118 or 198, or the Jiaojiang-Jinqing bus, directly to the Fengnan neighborhood; or take bus 107, 105, 109, 112, 901 or 902 to Xingxing Square, get off, and walk to the hospital. Treatment items: impotence, premature ejaculation, prostatitis, prostatic hyperplasia, balanitis, weak sperm, azoospermia, phimosis, varicocele, gonorrhea, etc. Taizhou Wuzhou Reproductive Hospital is the largest andrology hospital in Taizhou, with authoritative experts offering free online consultation, complete professional andrology examination and treatment equipment, and fees strictly in accordance with national standards. Cutting-edge medical equipment, in step with the world. Authoritative experts, a model of professionalism. Humanized service, with everything centered on the patient. For men's health, choose Taizhou Wuzhou Reproductive Hospital, professional andrology for men. ``` ----- Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 6:52
1.0
Which hospital in Huangyan is professional for treating infertility - ``` Which hospital in Huangyan is professional for treating infertility [Taizhou Wuzhou Reproductive Hospital] 24-hour health consultation hotline: 0576-88066933 (QQ 800080609) (WeChat ID tzwzszyy). Hospital address: No. 229 Fengnan Road, Jiaojiang District, Taizhou (next to the Fengnan roundabout). Bus routes: take bus 104, 108, 118 or 198, or the Jiaojiang-Jinqing bus, directly to the Fengnan neighborhood; or take bus 107, 105, 109, 112, 901 or 902 to Xingxing Square, get off, and walk to the hospital. Treatment items: impotence, premature ejaculation, prostatitis, prostatic hyperplasia, balanitis, weak sperm, azoospermia, phimosis, varicocele, gonorrhea, etc. Taizhou Wuzhou Reproductive Hospital is the largest andrology hospital in Taizhou, with authoritative experts offering free online consultation, complete professional andrology examination and treatment equipment, and fees strictly in accordance with national standards. Cutting-edge medical equipment, in step with the world. Authoritative experts, a model of professionalism. Humanized service, with everything centered on the patient. For men's health, choose Taizhou Wuzhou Reproductive Hospital, professional andrology for men. ``` ----- Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 6:52
non_process
which hospital in huangyan is professional for treating infertility which hospital in huangyan is professional for treating infertility taizhou wuzhou reproductive hospital hour health consultation hotline qq wechat id tzwzszyy hospital address no fengnan road jiaojiang district taizhou next to the fengnan roundabout bus routes take bus or the jiaojiang jinqing bus directly to the fengnan neighborhood or take bus to xingxing square get off and walk to the hospital treatment items impotence premature ejaculation prostatitis prostatic hyperplasia balanitis weak sperm azoospermia phimosis varicocele gonorrhea etc taizhou wuzhou reproductive hospital is the largest andrology hospital in taizhou with authoritative experts offering free online consultation complete professional andrology examination and treatment equipment and fees strictly in accordance with national standards cutting edge medical equipment in step with the world authoritative experts a model of professionalism humanized service with everything centered on the patient for men s health choose taizhou wuzhou reproductive hospital professional andrology for men original issue reported on code google com by poweragr gmail com on may at
0
21,136
28,106,568,774
IssuesEvent
2023-03-31 01:33:13
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Tracking issue for refactoring Bazel's BUILD files to a more granular structure
P2 type: process team-Bazel stale
This is a tracking bug for work on Bazel's own BUILD files. The goal of this work is to establish 1 BUILD file per Java package in Bazel's own source code. This has a number of advantages for performance, especially when run with a remote executor. In general the focus is on targets in //src/main/java/com/google/devtools/build/lib:*
1.0
Tracking issue for refactoring Bazel's BUILD files to a more granular structure - This is a tracking bug for work on Bazel's own BUILD files. The goal of this work is to establish 1 BUILD file per Java package in Bazel's own source code. This has a number of advantages for performance, especially when run with a remote executor. In general the focus is on targets in //src/main/java/com/google/devtools/build/lib:*
process
tracking issue for refactoring bazel s build files to a more granular structure this is a tracking bug for work on bazel s own build files the goal of this work is to establish build file per java package in bazel s own source code this has a number of advantages for performance especially when run with a remote executor in general the focus is on targets in src main java com google devtools build lib
1
7,677
10,762,130,517
IssuesEvent
2019-10-31 22:37:16
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
obsolete: interspecies quorum sensing (GO:0052097) & intraspecies quorum sensing (GO:0052100)
multi-species process obsoletion
Of the 47 annotations to '**quorum sensing involved in interaction with host**', 33 are experimental - 31 IMP from Mony et al. 2014 - 1 IMP + 1 IDA from Fujiya et al. 2007 Looking quickly at these 2 papers plus a couple 2006 reviews they led me to, I think that it is specious to try to separate out `interspecies quorum sensing (GO:0052097)` & `intraspecies quorum sensing (GO:0052100)`. Other than in a laboratory monoculture, an organism that is putting out these signals and detecting them, is probably doing it in the context of multiple organisms and it's really the same basic system whether it's detecting its own signal or that of another organism. Thus, I propose to obsolete these two terms, which have no annotations anyway, since I do not think there is a different process going on between these two things, rather just some differences in specific interactions: - interspecies quorum sensing (GO:0052097) - intraspecies quorum sensing (GO:0052100) ---------- **References:** Mony et al. 2014. Genome-wide dissection of the quorum sensing signalling pathway in Trypanosoma brucei. Nature. 2505(7485):681-685. PMID:24336212 (31 IMP annotations) Fujiya et al. 2007. The Bacillus subtilis quorum-sensing molecule CSF contributes to intestinal homeostasis via OCTN2, a host cell membrane transporter. Cell Host Microbe. 1(4):299-308. PMID:18005709 (1 IMP annotation, 1 IDA annotation) Bassler BL, Losick R. 2006. Bacterially speaking. Cell. 125(2):237-46. Review. PMID:16630813. Camilli A, Bassler BL. 2006. Bacterial small-molecule signaling pathways. Science. 311(5764):1113-6. Review. PMID:16497924 _Originally posted by @krchristie in https://github.com/geneontology/go-ontology/issues/17730#issuecomment-524072591_
1.0
obsolete: interspecies quorum sensing (GO:0052097) & intraspecies quorum sensing (GO:0052100) - Of the 47 annotations to '**quorum sensing involved in interaction with host**', 33 are experimental - 31 IMP from Mony et al. 2014 - 1 IMP + 1 IDA from Fujiya et al. 2007 Looking quickly at these 2 papers plus a couple 2006 reviews they led me to, I think that it is specious to try to separate out `interspecies quorum sensing (GO:0052097)` & `intraspecies quorum sensing (GO:0052100)`. Other than in a laboratory monoculture, an organism that is putting out these signals and detecting them, is probably doing it in the context of multiple organisms and it's really the same basic system whether it's detecting its own signal or that of another organism. Thus, I propose to obsolete these two terms, which have no annotations anyway, since I do not think there is a different process going on between these two things, rather just some differences in specific interactions: - interspecies quorum sensing (GO:0052097) - intraspecies quorum sensing (GO:0052100) ---------- **References:** Mony et al. 2014. Genome-wide dissection of the quorum sensing signalling pathway in Trypanosoma brucei. Nature. 2505(7485):681-685. PMID:24336212 (31 IMP annotations) Fujiya et al. 2007. The Bacillus subtilis quorum-sensing molecule CSF contributes to intestinal homeostasis via OCTN2, a host cell membrane transporter. Cell Host Microbe. 1(4):299-308. PMID:18005709 (1 IMP annotation, 1 IDA annotation) Bassler BL, Losick R. 2006. Bacterially speaking. Cell. 125(2):237-46. Review. PMID:16630813. Camilli A, Bassler BL. 2006. Bacterial small-molecule signaling pathways. Science. 311(5764):1113-6. Review. PMID:16497924 _Originally posted by @krchristie in https://github.com/geneontology/go-ontology/issues/17730#issuecomment-524072591_
process
obsolete interspecies quorum sensing go intraspecies quorum sensing go of the annotations to quorum sensing involved in interaction with host are experimental imp from mony et al imp ida from fujiya et al looking quickly at these papers plus a couple reviews they led me to i think that it is specious to try to separate out interspecies quorum sensing go intraspecies quorum sensing go other than in a laboratory monoculture an organism that is putting out these signals and detecting them is probably doing it in the context of multiple organisms and it s really the same basic system whether it s detecting its own signal or that of another organism thus i propose to obsolete these two terms which have no annotations anyway since i do not think there is a different process going on between these two things rather just some differences in specific interactions interspecies quorum sensing go intraspecies quorum sensing go references mony et al genome wide dissection of the quorum sensing signalling pathway in trypanosoma brucei nature pmid imp annotations fujiya et al the bacillus subtilis quorum sensing molecule csf contributes to intestinal homeostasis via a host cell membrane transporter cell host microbe pmid imp annotation ida annotation bassler bl losick r bacterially speaking cell review pmid camilli a bassler bl bacterial small molecule signaling pathways science review pmid originally posted by krchristie in
1
347,337
10,428,690,383
IssuesEvent
2019-09-16 23:34:17
USDAForestService/fs-open-forest-platform
https://api.github.com/repos/USDAForestService/fs-open-forest-platform
closed
As a forest service administrator I do not want Christmas tree permits active for purchase outside of the season so that people aren't confused about when they can get a Christmas tree.
Christmas Trees high priority
## Notes * What are things we should consider when making this story * The purpose of this story is to inactivate the production site to the public facing user while the site is under construction. ## Acceptance Criteria - [x] The public cannot access information about Christmas trees while the site is under construction. - [x] The public cannot access the buy permit button while the site is under construction. - [x] The public is informed that "The site is under construction in preparation for the coming holiday season". - [x] We have an understanding of why the BUG story 813 fixes were not successful. ## Tasks - [x] A final test on 9/3 to verify Circle recycle jobs are good to go. For now they are still disabled until the final test is done. - [x] PO approved ## Definition of Done - [ ] Pull requests meet technical definition of done - [ ] Compare finished design with mockup - [ ] Usability tested
1.0
As a forest service administrator I do not want Christmas tree permits active for purchase outside of the season so that people aren't confused about when they can get a Christmas tree. - ## Notes * What are things we should consider when making this story * The purpose of this story is to inactivate the production site to the public facing user while the site is under construction. ## Acceptance Criteria - [x] The public cannot access information about Christmas trees while the site is under construction. - [x] The public cannot access the buy permit button while the site is under construction. - [x] The public is informed that "The site is under construction in preparation for the coming holiday season". - [x] We have an understanding of why the BUG story 813 fixes were not successful. ## Tasks - [x] A final test on 9/3 to verify Circle recycle jobs are good to go. For now they are still disabled until the final test is done. - [x] PO approved ## Definition of Done - [ ] Pull requests meet technical definition of done - [ ] Compare finished design with mockup - [ ] Usability tested
non_process
as a forest service administrator i do not want christmas tree permits active for purchase outside of the season so that people aren t confused about when they can get a christmas tree notes what are things we should consider when making this story the purpose of this story is to inactivate the production site to the public facing user while the site is under construction acceptance criteria the public cannot access information about christmas trees while the site is under construction the public cannot access the buy permit button while the site is under construction the public is informed that the site is under construction in preparation for the coming holiday season we have an understanding of why the bug story fixes were not successful tasks a final test on to verify circle recycle jobs are good to go for now they are still disabled until the final test is done po approved definition of done pull requests meet technical definition of done compare finished design with mockup usability tested
0
18,096
24,121,952,938
IssuesEvent
2022-09-20 19:34:42
neuropsychology/NeuroKit
https://api.github.com/repos/neuropsychology/NeuroKit
closed
signal_psd() pesky warning
wontfix signal processing :chart_with_upwards_trend: inactive 👻
When using `signal_psd()` with Welch method I pretty much always get the following warning, which is a bit obscure. We should somehow improve that `The duration of recording is too short to support a sufficiently long window for high frequency resolution. Consider using a longer recording or increasing the min_frequency`
1.0
signal_psd() pesky warning - When using `signal_psd()` with Welch method I pretty much always get the following warning, which is a bit obscure. We should somehow improve that `The duration of recording is too short to support a sufficiently long window for high frequency resolution. Consider using a longer recording or increasing the min_frequency`
process
signal psd pesky warning when using signal psd with welch method i pretty much always get the following warning which is a bit obscure we should somehow improve that the duration of recording is too short to support a sufficiently long window for high frequency resolution consider using a longer recording or increasing the min frequency
1
5,749
8,596,649,308
IssuesEvent
2018-11-15 16:29:01
elBjarnacho/BikingThroughStockholm
https://api.github.com/repos/elBjarnacho/BikingThroughStockholm
closed
Convert Fisheye footage to sphere
Image processing
Use the FFmpeg remap filter to map the fisheye video to a sphere, so that it can be viewed in a 360° player
1.0
Convert Fisheye footage to sphere - Use the FFmpeg remap filter to map the fisheye video to a sphere, so that it can be viewed in a 360° player
process
convert fisheye footage to sphere use the ffmpeg remap filter to map the fisheye video to a sphere so that it can be viewed in a player
1
37,659
8,345,632,526
IssuesEvent
2018-10-01 04:13:18
TEAMMATES/teammates
https://api.github.com/repos/TEAMMATES/teammates
closed
Logic.java: Clean up unused methods.
a-CodeQuality c.Task d.Contributors t-Java
There are several unused methods (or methods only used in tests) in `Logic.java`. See if we can remove them as well as their corresponding methods `*Db`.
1.0
Logic.java: Clean up unused methods. - There are several unused methods (or methods only used in tests) in `Logic.java`. See if we can remove them as well as their corresponding methods `*Db`.
non_process
logic java clean up unused methods there are several unused methods or methods only used in tests in logic java see if we can remove them as well as their corresponding methods db
0
126,003
26,767,055,457
IssuesEvent
2023-01-31 11:24:44
S-Man42/GCWizard
https://api.github.com/repos/S-Man42/GCWizard
opened
Refactoring: Get rid of "WrapperForMaskInputTextBlakeks"
code smell
It was originally used to create a "mask input", e.g. for the textfields like playfair crypto input or enigma plugboard. The mask adds the space after every two characters which is very useful. Meanwhile it is used for many non-masked textfields instead of a simple TextInputFormatter a) Check if there's an alternative b) If not, at least remove it everywhere, where a mask is not necessary and a normal TextInputFormatter could be taken.
1.0
Refactoring: Get rid of "WrapperForMaskInputTextBlakeks" - It was originally used to create a "mask input", e.g. for the textfields like playfair crypto input or enigma plugboard. The mask adds the space after every two characters which is very useful. Meanwhile it is used for many non-masked textfields instead of a simple TextInputFormatter a) Check if there's an alternative b) If not, at least remove it everywhere, where a mask is not necessary and a normal TextInputFormatter could be taken.
non_process
refactoring get rid of wrapperformaskinputtextblakeks it was originally used to create a mask input e g for the textfields like playfair crypto input or enigma plugboard the mask adds the space after every two characters which is very useful meanwhile it is used for many non masked textfields instead of a simple textinputformatter a check if there s an alternative b if not at least remove it everywhere where a mask is not necessary and a normal textinputformatter could be taken
0
75,856
9,334,629,451
IssuesEvent
2019-03-28 16:43:44
GSA/pra.gov
https://api.github.com/repos/GSA/pra.gov
opened
Re-word step 4 of process diagram to include public's action
content design process diagram
The word "post" is an inaccurate description of what the public does in step 4.
1.0
Re-word step 4 of process diagram to include public's action - The word "post" is an inaccurate description of what the public does in step 4.
non_process
re word step of process diagram to include public s action the word post is an inaccurate description of what the public does in step
0
14,282
17,260,742,297
IssuesEvent
2021-07-22 07:13:44
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[Mobile] Verification step > An option to be provided to correct the email in verification step if participant mistakenly signs up with incorrect email
Android Feature request P3 Process: Fixed Process: Tested dev iOS
Steps: 1. Navigate to signup 2. Enter incorrect email 3. Enter rest of data and click on submit 4. Navigated to verification step 5. Observe **Is your feature request related to a problem? Please describe.** After following above steps, no option or guide for the participants to correct the email in verification step. User has to navigate back to signup page and re-enter all the fields again. **Describe the solution you'd like** Instead of back arrow, an option with hyperlink 'Entered the wrong email? Signup again!' can be provided which should navigate to signup page. So participant can re-enter with valid data and proceed further **Describe alternatives you've considered** An optional text field in the verification step page to correct the email with previously entered password. So user can just correct the email without re-entering any of the password fields.
2.0
[Mobile] Verification step > An option to be provided to correct the email in verification step if participant mistakenly signs up with incorrect email - Steps: 1. Navigate to signup 2. Enter incorrect email 3. Enter rest of data and click on submit 4. Navigated to verification step 5. Observe **Is your feature request related to a problem? Please describe.** After following above steps, no option or guide for the participants to correct the email in verification step. User has to navigate back to signup page and re-enter all the fields again. **Describe the solution you'd like** Instead of back arrow, an option with hyperlink 'Entered the wrong email? Signup again!' can be provided which should navigate to signup page. So participant can re-enter with valid data and proceed further **Describe alternatives you've considered** An optional text field in the verification step page to correct the email with previously entered password. So user can just correct the email without re-entering any of the password fields.
process
verification step an option to be provided to correct the email in verification step if participant mistakenly signs up with incorrect email steps navigate to signup enter incorrect email enter rest of data and click on submit navigated to verification step observe is your feature request related to a problem please describe after following above steps no option or guide for the participants to correct the email in verification step user has to navigate back to signup page and re enter all the fields again describe the solution you d like instead of back arrow an option with hyperlink entered the wrong email signup again can be provided which should navigate to signup page so participant can re enter with valid data and proceed further describe alternatives you ve considered an optional text field in the verification step page to correct the email with previously entered password so user can just correct the email without re entering any of the password fields
1
15,159
18,910,615,975
IssuesEvent
2021-11-16 13:47:01
RobertCraigie/prisma-client-py
https://api.github.com/repos/RobertCraigie/prisma-client-py
closed
Global aliases cause confusing error message
bug/2-confirmed kind/bug process/candidate
<!-- Thanks for helping us improve Prisma Client Python! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by enabling additional logging output. See https://prisma-client-py.readthedocs.io/en/latest/logging/ for how to enable additional logging output. --> ## Bug description <!-- A clear and concise description of what the bug is. --> When a field overlaps with a global alias used for query building, a confusing error message is shown ## How to reproduce <!-- Steps to reproduce the behavior: 1. Go to '...' 2. Change '....' 3. Run '....' 4. See error --> ```py import asyncio from prisma import Client async def main() -> None: client = Client() await client.connect() user = await client.user.create( data={ 'name': 'Robert', 'order_by': 'age', } ) print(user) if __name__ == '__main__': asyncio.run(main()) ``` Running the above script raises the following error: ``` prisma.errors.MissingRequiredValueError: Failed to validate the query: `Unable to match input value to any allowed input type for the field. Parse errors: [Query parsing/validation error at `Mutation.createOneUser.data.UserCreateInput.order_by`: A value is required but not set., Query parsing/validation error at `Mutation.createOneUser.data.UserUncheckedCreateInput.order_by`: A value is required but not set.]` at `Mutation.createOneUser.data` ``` ## Expected behavior <!-- A clear and concise description of what you expected to happen. --> This should be a valid query, however supporting this would require a massive refactor of our query builder. The part of this issue that is considered a bug is the confusing error message, we should disallow generating a client that will result in an invalid internal query being generated. ## Prisma information <!-- Your Prisma schema, Prisma Client Python queries, ... Do not include your database credentials when sharing your Prisma schema! 
--> ```prisma datasource db { provider = "sqlite" url = "file:tmp.db" } generator db { provider = "prisma-client-py" interface = "asyncio" recursive_type_depth = -1 } model User { id String @id @default(cuid()) name String order_by String } ```
1.0
Global aliases cause confusing error message - <!-- Thanks for helping us improve Prisma Client Python! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by enabling additional logging output. See https://prisma-client-py.readthedocs.io/en/latest/logging/ for how to enable additional logging output. --> ## Bug description <!-- A clear and concise description of what the bug is. --> When a field overlaps with a global alias used for query building, a confusing error message is shown ## How to reproduce <!-- Steps to reproduce the behavior: 1. Go to '...' 2. Change '....' 3. Run '....' 4. See error --> ```py import asyncio from prisma import Client async def main() -> None: client = Client() await client.connect() user = await client.user.create( data={ 'name': 'Robert', 'order_by': 'age', } ) print(user) if __name__ == '__main__': asyncio.run(main()) ``` Running the above script raises the following error: ``` prisma.errors.MissingRequiredValueError: Failed to validate the query: `Unable to match input value to any allowed input type for the field. Parse errors: [Query parsing/validation error at `Mutation.createOneUser.data.UserCreateInput.order_by`: A value is required but not set., Query parsing/validation error at `Mutation.createOneUser.data.UserUncheckedCreateInput.order_by`: A value is required but not set.]` at `Mutation.createOneUser.data` ``` ## Expected behavior <!-- A clear and concise description of what you expected to happen. --> This should be a valid query, however supporting this would require a massive refactor of our query builder. The part of this issue that is considered a bug is the confusing error message, we should disallow generating a client that will result in an invalid internal query being generated. ## Prisma information <!-- Your Prisma schema, Prisma Client Python queries, ... Do not include your database credentials when sharing your Prisma schema! 
--> ```prisma datasource db { provider = "sqlite" url = "file:tmp.db" } generator db { provider = "prisma-client-py" interface = "asyncio" recursive_type_depth = -1 } model User { id String @id @default(cuid()) name String order_by String } ```
process
global aliases cause confusing error message thanks for helping us improve prisma client python 🙏 please follow the sections in the template and provide as much information as possible about your problem e g by enabling additional logging output see for how to enable additional logging output bug description when a field overlaps with a global alias used for query building a confusing error message is shown how to reproduce steps to reproduce the behavior go to change run see error py import asyncio from prisma import client async def main none client client await client connect user await client user create data name robert order by age print user if name main asyncio run main running the above script raises the following error prisma errors missingrequiredvalueerror failed to validate the query unable to match input value to any allowed input type for the field parse errors at mutation createoneuser data expected behavior this should be a valid query however supporting this would require a massive refactor of our query builder the part of this issue that is considered a bug is the confusing error message we should disallow generating a client that will result in an invalid internal query being generated prisma information your prisma schema prisma client python queries do not include your database credentials when sharing your prisma schema prisma datasource db provider sqlite url file tmp db generator db provider prisma client py interface asyncio recursive type depth model user id string id default cuid name string order by string
1
1,709
4,350,589,289
IssuesEvent
2016-07-31 10:39:59
pwittchen/ReactiveWiFi
https://api.github.com/repos/pwittchen/ReactiveWiFi
closed
Release 0.1.1
release process
**Initial release notes**: - bumped RxJava to v. 1.1.8 - bumped RxAndroid to v. 1.2.1 - bumped Gradle Build Tools to 2.1.2 **Things to do**: - [x] bump library version to 0.1.1 - [x] upload Archives to Maven Central Repository - [x] close and release artifact on Nexus - [x] update `CHANGELOG.md` after Maven Sync - [x] update download section in `README.md` after Maven Sync - [x] create new GitHub release
1.0
Release 0.1.1 - **Initial release notes**: - bumped RxJava to v. 1.1.8 - bumped RxAndroid to v. 1.2.1 - bumped Gradle Build Tools to 2.1.2 **Things to do**: - [x] bump library version to 0.1.1 - [x] upload Archives to Maven Central Repository - [x] close and release artifact on Nexus - [x] update `CHANGELOG.md` after Maven Sync - [x] update download section in `README.md` after Maven Sync - [x] create new GitHub release
process
release initial release notes bumped rxjava to v bumped rxandroid to v bumped gradle build tools to things to do bump library version to upload archives to maven central repository close and release artifact on nexus update changelog md after maven sync update download section in readme md after maven sync create new github release
1
353,068
25,099,531,413
IssuesEvent
2022-11-08 12:41:14
equinor/ert
https://api.github.com/repos/equinor/ert
closed
Possible misplacement of docs around TEMPLATE config keyword
documentation
In [working around ensemble config](#4075), we figured out that we could remove the `OUTPUT_FORMAT` option - but in the docs, that one was tied to the `TEMPLATE` keyword. Now I'm left with the impression that the info text about the template keyword should perhaps be somewhere other than where it is right now, but I don't understand enough of the code / usage of the config to confidently place it myself, so I'm creating this issue instead.
1.0
Possible misplacement of docs around TEMPLATE config keyword - In [working around ensemble config](#4075), we figured out that we could remove the `OUTPUT_FORMAT` option - but in the docs, that one was tied to the `TEMPLATE` keyword. Now I'm left with the impression that the info text about the template keyword should perhaps be somewhere other than where it is right now, but I don't understand enough of the code / usage of the config to confidently place it myself, so I'm creating this issue instead.
non_process
possible misplacement of docs around template config keyword in we figured out that we could remove the output format option but in the docs that one was tied to the template keyword now i m left with the impression that the info text about the template keyword should perhaps be somewhere else than it is right now but i don t understand enough of the code usage of the config to confidently place it myself so i m creating this issue instead
0
4,620
7,464,929,550
IssuesEvent
2018-04-02 00:00:59
jtablesaw/tablesaw
https://api.github.com/repos/jtablesaw/tablesaw
closed
Eliminate deprecation warnings
process
A number of the libraries we rely on have changed their APIs recently, including OpenCSV, RoaringBitmaps, FastUtil, and commons.lang3. The current list of warnings is below: /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/Table.java Warning:Warning:line (1,119)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/BooleanColumn.java Warning:Warning:line (151)java: entrySet() in it.unimi.dsi.fastutil.bytes.Byte2IntMap has been deprecated Warning:Warning:line (424)java: add(java.lang.Boolean) in it.unimi.dsi.fastutil.booleans.BooleanSet has been deprecated Warning:Warning:line (430)java: contains(java.lang.Object) in it.unimi.dsi.fastutil.bytes.AbstractByteCollection has been deprecated Warning:Warning:line (501)java: next() in it.unimi.dsi.fastutil.bytes.ByteIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/table/TemporaryView.java Warning:Warning:line (119)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated Warning:Warning:line (149)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated Warning:Warning:line (316)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/CategoryColumn.java Warning:Warning:line (205)java: entrySet() in it.unimi.dsi.fastutil.ints.Int2IntMap has been deprecated Warning:Warning:line (635)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/mapping/StringMapUtils.java Warning:Warning:line (168)java: getLevenshteinDistance(java.lang.CharSequence,java.lang.CharSequence) in org.apache.commons.lang3.StringUtils has been deprecated 
/Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/ShortColumn.java Warning:Warning:line (139)java: next() in it.unimi.dsi.fastutil.shorts.ShortIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/LongColumn.java Warning:Warning:line (142)java: next() in it.unimi.dsi.fastutil.longs.LongIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/DateColumn.java Warning:Warning:line (764)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/TimeColumn.java Warning:Warning:line (558)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/DateTimeColumn.java Warning:Warning:line (818)java: next() in it.unimi.dsi.fastutil.longs.LongIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/util/DictionaryMap.java Warning:Warning:line (52)java: remove(java.lang.Object) in it.unimi.dsi.fastutil.objects.Object2IntMap has been deprecated Warning:Warning:line (56)java: remove(java.lang.Object) in it.unimi.dsi.fastutil.objects.Object2IntMap has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/util/BitmapBackedSelection.java Warning:Warning:line (93)java: add(int,int) in org.roaringbitmap.RoaringBitmap has been deprecated Warning:Warning:line (139)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/io/csv/CsvCombiner.java Warning:Warning:line (28)java: CSVWriter(java.io.Writer,char) in com.opencsv.CSVWriter has been deprecated Warning:Warning:line (45)java: CSVReader(java.io.Reader,char) in 
com.opencsv.CSVReader has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/io/csv/CsvReader.java Warning:Warning:line (280)java: CSVReader(java.io.Reader,char,char) in com.opencsv.CSVReader has been deprecated Warning:Warning:line (426)java: CSVReader(java.io.Reader,char,char,int) in com.opencsv.CSVReader has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/test/java/tech/tablesaw/examples/ObservationDataTest.java Warning:Warning:line (18)java: org.apache.commons.lang3.RandomStringUtils in org.apache.commons.lang3 has been deprecated Warning:Warning:line (164)java: org.apache.commons.lang3.RandomStringUtils in org.apache.commons.lang3 has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/test/java/tech/tablesaw/filters/TimeDependentFilteringTest.java Warning:Warning:line (18)java: org.apache.commons.lang3.RandomStringUtils in org.apache.commons.lang3 has been deprecated Warning:Warning:line (169)java: org.apache.commons.lang3.RandomStringUtils in org.apache.commons.lang3 has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/test/java/tech/tablesaw/api/DoubleColumnTest.java Warning:Warning:line (178)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/test/java/tech/tablesaw/api/FloatColumnTest.java Warning:Warning:line (182)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated
1.0
Eliminate deprecation warnings - A number of the libraries we rely on have changed their APIs recently, including OpenCSV, RoaringBitmaps, FastUtil, and commons.lang3. The current list of warnings is below: /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/Table.java Warning:Warning:line (1,119)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/BooleanColumn.java Warning:Warning:line (151)java: entrySet() in it.unimi.dsi.fastutil.bytes.Byte2IntMap has been deprecated Warning:Warning:line (424)java: add(java.lang.Boolean) in it.unimi.dsi.fastutil.booleans.BooleanSet has been deprecated Warning:Warning:line (430)java: contains(java.lang.Object) in it.unimi.dsi.fastutil.bytes.AbstractByteCollection has been deprecated Warning:Warning:line (501)java: next() in it.unimi.dsi.fastutil.bytes.ByteIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/table/TemporaryView.java Warning:Warning:line (119)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated Warning:Warning:line (149)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated Warning:Warning:line (316)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/CategoryColumn.java Warning:Warning:line (205)java: entrySet() in it.unimi.dsi.fastutil.ints.Int2IntMap has been deprecated Warning:Warning:line (635)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/mapping/StringMapUtils.java Warning:Warning:line (168)java: getLevenshteinDistance(java.lang.CharSequence,java.lang.CharSequence) in org.apache.commons.lang3.StringUtils has been deprecated 
/Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/ShortColumn.java Warning:Warning:line (139)java: next() in it.unimi.dsi.fastutil.shorts.ShortIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/LongColumn.java Warning:Warning:line (142)java: next() in it.unimi.dsi.fastutil.longs.LongIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/DateColumn.java Warning:Warning:line (764)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/TimeColumn.java Warning:Warning:line (558)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/api/DateTimeColumn.java Warning:Warning:line (818)java: next() in it.unimi.dsi.fastutil.longs.LongIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/util/DictionaryMap.java Warning:Warning:line (52)java: remove(java.lang.Object) in it.unimi.dsi.fastutil.objects.Object2IntMap has been deprecated Warning:Warning:line (56)java: remove(java.lang.Object) in it.unimi.dsi.fastutil.objects.Object2IntMap has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/util/BitmapBackedSelection.java Warning:Warning:line (93)java: add(int,int) in org.roaringbitmap.RoaringBitmap has been deprecated Warning:Warning:line (139)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/io/csv/CsvCombiner.java Warning:Warning:line (28)java: CSVWriter(java.io.Writer,char) in com.opencsv.CSVWriter has been deprecated Warning:Warning:line (45)java: CSVReader(java.io.Reader,char) in 
com.opencsv.CSVReader has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/main/java/tech/tablesaw/io/csv/CsvReader.java Warning:Warning:line (280)java: CSVReader(java.io.Reader,char,char) in com.opencsv.CSVReader has been deprecated Warning:Warning:line (426)java: CSVReader(java.io.Reader,char,char,int) in com.opencsv.CSVReader has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/test/java/tech/tablesaw/examples/ObservationDataTest.java Warning:Warning:line (18)java: org.apache.commons.lang3.RandomStringUtils in org.apache.commons.lang3 has been deprecated Warning:Warning:line (164)java: org.apache.commons.lang3.RandomStringUtils in org.apache.commons.lang3 has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/test/java/tech/tablesaw/filters/TimeDependentFilteringTest.java Warning:Warning:line (18)java: org.apache.commons.lang3.RandomStringUtils in org.apache.commons.lang3 has been deprecated Warning:Warning:line (169)java: org.apache.commons.lang3.RandomStringUtils in org.apache.commons.lang3 has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/test/java/tech/tablesaw/api/DoubleColumnTest.java Warning:Warning:line (178)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated /Users/larrywhite/IdeaProjects/jtablesaw/tablesaw/core/src/test/java/tech/tablesaw/api/FloatColumnTest.java Warning:Warning:line (182)java: next() in it.unimi.dsi.fastutil.ints.IntIterator has been deprecated
process
eliminate deprecation warnings a number of the libraries we rely on have changed their apis recently including opencsv roaringbitmaps fastutil and commons the current list of warnings is below users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw api table java warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw api booleancolumn java warning warning line java entryset in it unimi dsi fastutil bytes has been deprecated warning warning line java add java lang boolean in it unimi dsi fastutil booleans booleanset has been deprecated warning warning line java contains java lang object in it unimi dsi fastutil bytes abstractbytecollection has been deprecated warning warning line java next in it unimi dsi fastutil bytes byteiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw table temporaryview java warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw api categorycolumn java warning warning line java entryset in it unimi dsi fastutil ints has been deprecated warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw mapping stringmaputils java warning warning line java getlevenshteindistance java lang charsequence java lang charsequence in org apache commons stringutils has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw api shortcolumn java warning warning line java next in it unimi dsi fastutil 
shorts shortiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw api longcolumn java warning warning line java next in it unimi dsi fastutil longs longiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw api datecolumn java warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw api timecolumn java warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw api datetimecolumn java warning warning line java next in it unimi dsi fastutil longs longiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw util dictionarymap java warning warning line java remove java lang object in it unimi dsi fastutil objects has been deprecated warning warning line java remove java lang object in it unimi dsi fastutil objects has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw util bitmapbackedselection java warning warning line java add int int in org roaringbitmap roaringbitmap has been deprecated warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw io csv csvcombiner java warning warning line java csvwriter java io writer char in com opencsv csvwriter has been deprecated warning warning line java csvreader java io reader char in com opencsv csvreader has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src main java tech tablesaw io csv csvreader java warning warning line java csvreader java io reader char char in com opencsv csvreader has been deprecated warning warning line java csvreader java 
io reader char char int in com opencsv csvreader has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src test java tech tablesaw examples observationdatatest java warning warning line java org apache commons randomstringutils in org apache commons has been deprecated warning warning line java org apache commons randomstringutils in org apache commons has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src test java tech tablesaw filters timedependentfilteringtest java warning warning line java org apache commons randomstringutils in org apache commons has been deprecated warning warning line java org apache commons randomstringutils in org apache commons has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src test java tech tablesaw api doublecolumntest java warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated users larrywhite ideaprojects jtablesaw tablesaw core src test java tech tablesaw api floatcolumntest java warning warning line java next in it unimi dsi fastutil ints intiterator has been deprecated
1
396,205
27,107,148,032
IssuesEvent
2023-02-15 12:57:47
telerik/kendo-angular
https://api.github.com/repos/telerik/kendo-angular
closed
Notifications are stacked in inactive tabs
Documentation pkg:notification Team2
**Describe the bug** Showing and hiding animated Notifications in **setInterval** works as expected. But when the tab became inactive this behavior is changed causing the Notification to stack: **To Reproduce** 1. Open the demo, Notifications should be toggled every 3sec. 2. Open another tab and wait a few seconds (15, 20 sec). 3. Return to the tab where the Notification demo is. Notice how the Notifications are stacked and try to catch up. https://stackblitz.com/edit/angular-gymzzb-dgorck It looks like the animation doesn't work as expected in combination with inactive tabs. To improve the performance, browsers by default set[ low priority execution](https://stackoverflow.com/questions/5927284/how-can-i-make-setinterval-also-work-when-a-tab-is-inactive-in-chrome) to inactive tabs which can affect timers. Also [the `requestAnimationFrame` is paused](https://stackoverflow.com/questions/15871942/how-do-browsers-pause-change-javascript-when-tab-or-window-is-not-active/16033979#16033979) when the tab is inactive. **Workarounds** Remove the Notification animation: https://stackblitz.com/edit/angular-gymzzb-urij43 Another approach is to clear the interval depending on the active state of the tab (using [visibilitychange](https://developer.mozilla.org/en-US/docs/Web/API/Document/visibilitychange_event) event and [document.visibilityState](https://developer.mozilla.org/en-US/docs/Web/API/Document/visibilityState) property): https://stackblitz.com/edit/angular-gymzzb-hvywzq
1.0
Notifications are stacked in inactive tabs - **Describe the bug** Showing and hiding animated Notifications in **setInterval** works as expected. But when the tab became inactive this behavior is changed causing the Notification to stack: **To Reproduce** 1. Open the demo, Notifications should be toggled every 3sec. 2. Open another tab and wait a few seconds (15, 20 sec). 3. Return to the tab where the Notification demo is. Notice how the Notifications are stacked and try to catch up. https://stackblitz.com/edit/angular-gymzzb-dgorck It looks like the animation doesn't work as expected in combination with inactive tabs. To improve the performance, browsers by default set[ low priority execution](https://stackoverflow.com/questions/5927284/how-can-i-make-setinterval-also-work-when-a-tab-is-inactive-in-chrome) to inactive tabs which can affect timers. Also [the `requestAnimationFrame` is paused](https://stackoverflow.com/questions/15871942/how-do-browsers-pause-change-javascript-when-tab-or-window-is-not-active/16033979#16033979) when the tab is inactive. **Workarounds** Remove the Notification animation: https://stackblitz.com/edit/angular-gymzzb-urij43 Another approach is to clear the interval depending on the active state of the tab (using [visibilitychange](https://developer.mozilla.org/en-US/docs/Web/API/Document/visibilitychange_event) event and [document.visibilityState](https://developer.mozilla.org/en-US/docs/Web/API/Document/visibilityState) property): https://stackblitz.com/edit/angular-gymzzb-hvywzq
non_process
notifications are stacked in inactive tabs describe the bug showing and hiding animated notifications in setinterval works as expected but when the tab became inactive this behavior is changed causing the notification to stack to reproduce open the demo notifications should be toggled every open another tab and wait a few seconds sec return to the tab where the notification demo is notice how the notifications are stacked and try to catch up it looks like the animation doesn t work as expected in combination with inactive tabs to improve the performance browsers by default set to inactive tabs which can affect timers also when the tab is inactive workarounds remove the notification animation another approach is to clear the interval depending on the active state of the tab using event and property
0
10,841
13,623,272,932
IssuesEvent
2020-09-24 05:57:49
google/ground-android
https://api.github.com/repos/google/ground-android
closed
[Testing] Add instrumented test for adding a Feature
priority: p1 type: process
Espresso tests or similar that run the app and perform key flows. We could either mock out the Firebase API, or perform actual E2E tests against a live instance; for now would prefer whichever gets us some protection against regressions sooner. Goal is to get one UI test working ("happy path") for adding a feature.
1.0
[Testing] Add instrumented test for adding a Feature - Espresso tests or similar that run the app and perform key flows. We could either mock out the Firebase API, or perform actual E2E tests against a live instance; for now would prefer whichever gets us some protection against regressions sooner. Goal is to get one UI test working ("happy path") for adding a feature.
process
add instrumented test for adding a feature espresso tests or similar that run the app and perform key flows we could either mock out the firebase api or perform actual tests against a live instance for now would prefer whichever gets us some protection against regressions sooner goal is to get one ui test working happy path for adding a feature
1
64,554
12,476,304,070
IssuesEvent
2020-05-29 13:15:23
Genuitec/CodeTogether
https://api.github.com/repos/Genuitec/CodeTogether
closed
Introduction video available for Hosts
eclipse enhancement intellij vscode
**Describe the solution you'd like** To help new users understand how to use CodeTogether, we should create a video and integrate it into the welcome text that shows the core of how CodeTogether works, including briefly covering security concerns, how to invite participants, and a brief show of the capabilities available. This should be made available for optional viewing -- by no means should the video automatically start or be bundled into the product. It can be shown on an embedded or external browser if thumbnail clicked.
1.0
Introduction video available for Hosts - **Describe the solution you'd like** To help new users understand how to use CodeTogether, we should create a video and integrate it into the welcome text that shows the core of how CodeTogether works, including briefly covering security concerns, how to invite participants, and a brief show of the capabilities available. This should be made available for optional viewing -- by no means should the video automatically start or be bundled into the product. It can be shown on an embedded or external browser if thumbnail clicked.
non_process
introduction video available for hosts describe the solution you d like to help new users understand how to use codetogether we should create a video and integrate it into the welcome text that shows the core of how codetogether works including briefly covering security concerns how to invite participants and a brief show of the capabilities available this should be made available for optional viewing by no means should the video automatically start or be bundled into the product it can be shown on an embedded or external browser if thumbnail clicked
0
6,867
10,000,119,954
IssuesEvent
2019-07-12 12:36:55
TorXakis/TorXakis
https://api.github.com/repos/TorXakis/TorXakis
closed
stack lock files
development-process
Should we put the stack lock files under version control? Or should we make TorXakis such that lock files are not needed: higher quality of dependencies... For more info see: https://docs.haskellstack.org/en/stable/lock_files/ Our lock file currently contains ``` # This file was autogenerated by Stack. # You should not edit this file by hand. # For more information, please see the documentation at: # https://docs.haskellstack.org/en/stable/lock_files packages: - completed: cabal-file: size: 2957 sha256: 86721de77129c198aa3f033c79c0f7aaebce16615e6e4ea73cefb3d9771e5a94 name: text-via-sockets version: 0.1.0.0 git: https://github.com/TorXakis/text-via-sockets.git pantry-tree: size: 682 sha256: 5e30ecba223e12674b7b37e668af407bbdebeea117fd3a60306a42dd636056c0 commit: e3228cd0407ec0d7991a544e154ea2def184fcae original: git: https://github.com/TorXakis/text-via-sockets.git commit: e3228cd0407ec0d7991a544e154ea2def184fcae snapshots: - completed: size: 527836 url: https://raw.githubusercontent.com/commercialhaskell/stackage-snapshots/master/lts/11/22.yaml sha256: 341870ac98d8a9f8f77c4adf2e9e0b22063e264a7fbeb4c85b7af5f380dac60e original: lts-11.22 ```
1.0
stack lock files - Should we put the stack lock files under version control? Or should we make TorXakis such that lock files are not needed: higher quality of dependencies... For more info see: https://docs.haskellstack.org/en/stable/lock_files/ Our lock file currently contains ``` # This file was autogenerated by Stack. # You should not edit this file by hand. # For more information, please see the documentation at: # https://docs.haskellstack.org/en/stable/lock_files packages: - completed: cabal-file: size: 2957 sha256: 86721de77129c198aa3f033c79c0f7aaebce16615e6e4ea73cefb3d9771e5a94 name: text-via-sockets version: 0.1.0.0 git: https://github.com/TorXakis/text-via-sockets.git pantry-tree: size: 682 sha256: 5e30ecba223e12674b7b37e668af407bbdebeea117fd3a60306a42dd636056c0 commit: e3228cd0407ec0d7991a544e154ea2def184fcae original: git: https://github.com/TorXakis/text-via-sockets.git commit: e3228cd0407ec0d7991a544e154ea2def184fcae snapshots: - completed: size: 527836 url: https://raw.githubusercontent.com/commercialhaskell/stackage-snapshots/master/lts/11/22.yaml sha256: 341870ac98d8a9f8f77c4adf2e9e0b22063e264a7fbeb4c85b7af5f380dac60e original: lts-11.22 ```
process
stack lock files should we put the stack lock files under version control or should we make torxakis such that lock files are not needed higher quality of dependencies for more info see our lock file currently contains this file was autogenerated by stack you should not edit this file by hand for more information please see the documentation at packages completed cabal file size name text via sockets version git pantry tree size commit original git commit snapshots completed size url original lts
1
141,647
12,974,774,595
IssuesEvent
2020-07-21 15:55:45
chef/automate
https://api.github.com/repos/chef/automate
reopened
confirm saml with azure ad works and add to docs
auth-team documentation
## Overview Recently, @kenmacleod documented [how to setup saml with azure ad](https://github.com/chef-cft/chef-examples/pull/23). Let's try it out and confirm that it works and then add it to the [supported list](https://automate.chef.io/docs/saml/#supported-identity-management-systems) along with any information needed for others to set it up.
1.0
confirm saml with azure ad works and add to docs - ## Overview Recently, @kenmacleod documented [how to setup saml with azure ad](https://github.com/chef-cft/chef-examples/pull/23). Let's try it out and confirm that it works and then add it to the [supported list](https://automate.chef.io/docs/saml/#supported-identity-management-systems) along with any information needed for others to set it up.
non_process
confirm saml with azure ad works and add to docs overview recently kenmacleod documented let s try it out and confirm that it works and then add it to the along with any information needed for others to set it up
0
177,993
21,509,229,386
IssuesEvent
2022-04-28 01:18:30
szb512/cypress
https://api.github.com/repos/szb512/cypress
closed
CVE-2018-19826 (Medium) detected in node-sass-v4.11.0 - autoclosed
bug security vulnerability
## CVE-2018-19826 - Medium Severity Vulnerability <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sassv4.11.0</b></p></summary> <p> <p>:rainbow: Node.js bindings to libsass</p> <p>Library home page: <a href=https://github.com/sass/node-sass.git>https://github.com/sass/node-sass.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/szb512/cypress/commit/29dcad339d37f2169e5a640bf8d0d1438f7c18c2">29dcad339d37f2169e5a640bf8d0d1438f7c18c2</a></p> </p> </details> </p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Library Source Files (125)</summary> <p></p> <p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p> <p> - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/expand.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/color_maps.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/sass_util.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/utf8/unchecked.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/output.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/sass_values.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/util.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/emitter.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/lexer.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/test/test_node.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/plugins.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/include/sass/base.h - 
/cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/position.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/subset_map.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/operation.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/error_handling.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/custom_importer_bridge.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/contrib/plugin.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/functions.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/test/test_superselector.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/eval.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/utf8_string.hpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_context_wrapper.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/error_handling.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/node.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/parser.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/subset_map.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/emitter.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/listize.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/ast.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass_functions.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/output.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/check_nesting.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp - 
/cypress/packages/runner/node_modules/node-sass/src/libsass/src/functions.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/cssize.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/prelexer.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/paths.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/inspect.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/color.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/test/test_unification.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/values.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/sass_util.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/source_map.hpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/list.h - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/check_nesting.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/json.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/units.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/units.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/context.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/utf8/checked.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/listize.hpp - /cypress/packages/runner/node_modules/node-sass/src/sass_types/string.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/prelexer.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/context.hpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/boolean.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/include/sass2scss.h - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/eval.cpp - 
/cypress/packages/runner/node_modules/node-sass/src/libsass/src/expand.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/factory.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/operators.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/boolean.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/source_map.cpp - /cypress/packages/runner/node_modules/node-sass/src/sass_types/value.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/utf8_string.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/callback_bridge.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/file.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/sass.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/node.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/environment.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/extend.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/sass_context.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/operators.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/constants.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/sass.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/parser.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/constants.cpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/list.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/cssize.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/include/sass/functions.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/util.cpp - 
/cypress/packages/reporter/node_modules/node-sass/src/custom_function_bridge.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/custom_importer_bridge.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/bind.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/inspect.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass_functions.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/backtrace.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/extend.cpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/sass_value_wrapper.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/debugger.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/cencode.c - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/base64vlq.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/number.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/color.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/c99func.c - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/position.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass_values.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/include/sass/values.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/test/test_subset_map.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass2scss.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/null.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/ast.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/include/sass/context.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/to_c.cpp - 
/cypress/packages/runner/node_modules/node-sass/src/libsass/src/to_value.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/color_maps.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_context_wrapper.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/script/test-leaks.pl - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/lexer.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/to_c.hpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/map.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/to_value.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/b64/encode.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/file.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/environment.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/plugins.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/binding.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass_context.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/debug.hpp </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In inspect.cpp in LibSass 3.5.5, a high memory footprint caused by an endless loop (containing a Sass::Inspect::operator()(Sass::String_Quoted*) stack frame) may cause a Denial of Service via crafted sass input files with stray '&' or '/' characters. 
<p>Publish Date: 2018-12-03 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19826>CVE-2018-19826</a></p> </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-19826 (Medium) detected in node-sass-v4.11.0 - autoclosed - ## CVE-2018-19826 - Medium Severity Vulnerability <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sassv4.11.0</b></p></summary> <p> <p>:rainbow: Node.js bindings to libsass</p> <p>Library home page: <a href=https://github.com/sass/node-sass.git>https://github.com/sass/node-sass.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/szb512/cypress/commit/29dcad339d37f2169e5a640bf8d0d1438f7c18c2">29dcad339d37f2169e5a640bf8d0d1438f7c18c2</a></p> </p> </details> </p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Library Source Files (125)</summary> <p></p> <p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p> <p> - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/expand.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/color_maps.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/sass_util.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/utf8/unchecked.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/output.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/sass_values.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/util.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/emitter.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/lexer.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/test/test_node.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/plugins.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/include/sass/base.h - 
/cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/position.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/subset_map.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/operation.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/error_handling.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/custom_importer_bridge.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/contrib/plugin.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/functions.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/test/test_superselector.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/eval.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/utf8_string.hpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_context_wrapper.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/error_handling.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/node.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/parser.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/subset_map.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/emitter.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/listize.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/ast.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass_functions.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/output.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/check_nesting.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp - 
/cypress/packages/runner/node_modules/node-sass/src/libsass/src/functions.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/cssize.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/prelexer.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/paths.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/inspect.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/color.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/test/test_unification.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/values.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/sass_util.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/source_map.hpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/list.h - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/check_nesting.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/json.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/units.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/units.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/context.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/utf8/checked.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/listize.hpp - /cypress/packages/runner/node_modules/node-sass/src/sass_types/string.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/prelexer.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/context.hpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/boolean.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/include/sass2scss.h - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/eval.cpp - 
/cypress/packages/runner/node_modules/node-sass/src/libsass/src/expand.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/factory.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/operators.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/boolean.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/source_map.cpp - /cypress/packages/runner/node_modules/node-sass/src/sass_types/value.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/utf8_string.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/callback_bridge.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/file.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/sass.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/node.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/environment.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/extend.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/sass_context.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/operators.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/constants.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/sass.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/parser.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/constants.cpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/list.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/cssize.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/include/sass/functions.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/util.cpp - 
/cypress/packages/reporter/node_modules/node-sass/src/custom_function_bridge.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/custom_importer_bridge.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/bind.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/inspect.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass_functions.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/backtrace.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/extend.cpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/sass_value_wrapper.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/debugger.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/cencode.c - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/base64vlq.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/number.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/color.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/c99func.c - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/position.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass_values.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/include/sass/values.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/test/test_subset_map.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass2scss.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_types/null.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/ast.cpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/include/sass/context.h - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/to_c.cpp - 
/cypress/packages/runner/node_modules/node-sass/src/libsass/src/to_value.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/color_maps.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/sass_context_wrapper.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/script/test-leaks.pl - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/lexer.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/to_c.hpp - /cypress/packages/reporter/node_modules/node-sass/src/sass_types/map.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/to_value.cpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/b64/encode.h - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/file.hpp - /cypress/packages/reporter/node_modules/node-sass/src/libsass/src/environment.hpp - /cypress/packages/runner/node_modules/node-sass/src/libsass/src/plugins.hpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/binding.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/sass_context.cpp - /cypress/packages/desktop-gui/node_modules/node-sass/src/libsass/src/debug.hpp </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In inspect.cpp in LibSass 3.5.5, a high memory footprint caused by an endless loop (containing a Sass::Inspect::operator()(Sass::String_Quoted*) stack frame) may cause a Denial of Service via crafted sass input files with stray '&' or '/' characters. 
<p>Publish Date: 2018-12-03 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19826>CVE-2018-19826</a></p> </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in node sass autoclosed cve medium severity vulnerability vulnerable library node rainbow node js bindings to libsass library home page a href found in head commit a href library source files the source files were matched to this source library based on a best effort match source libraries are selected from a list of probable public libraries cypress packages reporter node modules node sass src libsass src expand hpp cypress packages runner node modules node sass src libsass src color maps cpp cypress packages runner node modules node sass src libsass src sass util hpp cypress packages reporter node modules node sass src libsass src unchecked h cypress packages reporter node modules node sass src libsass src output hpp cypress packages reporter node modules node sass src libsass src sass values hpp cypress packages runner node modules node sass src libsass src util hpp cypress packages runner node modules node sass src libsass src emitter hpp cypress packages desktop gui node modules node sass src libsass src lexer cpp cypress packages runner node modules node sass src libsass test test node cpp cypress packages reporter node modules node sass src libsass src plugins cpp cypress packages runner node modules node sass src libsass include sass base h cypress packages desktop gui node modules node sass src libsass src position hpp cypress packages desktop gui node modules node sass src libsass src subset map hpp cypress packages runner node modules node sass src libsass src operation hpp cypress packages desktop gui node modules node sass src libsass src remove placeholders cpp cypress packages reporter node modules node sass src libsass src error handling hpp cypress packages desktop gui node modules node sass src custom importer bridge cpp cypress packages runner node modules node sass src libsass contrib plugin cpp cypress packages runner node modules node sass src libsass src functions hpp cypress packages runner node modules node sass src 
libsass test test superselector cpp cypress packages reporter node modules node sass src libsass src eval hpp cypress packages reporter node modules node sass src libsass src string hpp cypress packages reporter node modules node sass src sass context wrapper h cypress packages reporter node modules node sass src libsass src error handling cpp cypress packages runner node modules node sass src libsass src node cpp cypress packages desktop gui node modules node sass src libsass src parser cpp cypress packages runner node modules node sass src libsass src subset map cpp cypress packages desktop gui node modules node sass src libsass src emitter cpp cypress packages reporter node modules node sass src libsass src listize cpp cypress packages runner node modules node sass src libsass src ast hpp cypress packages desktop gui node modules node sass src libsass src sass functions hpp cypress packages runner node modules node sass src libsass src memory sharedptr cpp cypress packages reporter node modules node sass src libsass src output cpp cypress packages reporter node modules node sass src libsass src check nesting cpp cypress packages runner node modules node sass src libsass src ast def macros hpp cypress packages runner node modules node sass src libsass src functions cpp cypress packages runner node modules node sass src libsass src cssize hpp cypress packages reporter node modules node sass src libsass src prelexer cpp cypress packages runner node modules node sass src libsass src paths hpp cypress packages desktop gui node modules node sass src libsass src ast fwd decl hpp cypress packages runner node modules node sass src libsass src inspect hpp cypress packages desktop gui node modules node sass src sass types color cpp cypress packages reporter node modules node sass src libsass test test unification cpp cypress packages runner node modules node sass src libsass src values cpp cypress packages reporter node modules node sass src libsass src sass util cpp 
cypress packages runner node modules node sass src libsass src source map hpp cypress packages reporter node modules node sass src sass types list h cypress packages runner node modules node sass src libsass src check nesting hpp cypress packages runner node modules node sass src libsass src json cpp cypress packages reporter node modules node sass src libsass src units cpp cypress packages runner node modules node sass src libsass src units hpp cypress packages reporter node modules node sass src libsass src context cpp cypress packages runner node modules node sass src libsass src checked h cypress packages reporter node modules node sass src libsass src listize hpp cypress packages runner node modules node sass src sass types string cpp cypress packages runner node modules node sass src libsass src prelexer hpp cypress packages desktop gui node modules node sass src libsass src context hpp cypress packages reporter node modules node sass src sass types boolean h cypress packages desktop gui node modules node sass src libsass include h cypress packages runner node modules node sass src libsass src eval cpp cypress packages runner node modules node sass src libsass src expand cpp cypress packages desktop gui node modules node sass src sass types factory cpp cypress packages runner node modules node sass src libsass src operators cpp cypress packages desktop gui node modules node sass src sass types boolean cpp cypress packages reporter node modules node sass src libsass src source map cpp cypress packages runner node modules node sass src sass types value h cypress packages desktop gui node modules node sass src libsass src string cpp cypress packages desktop gui node modules node sass src callback bridge h cypress packages reporter node modules node sass src libsass src file cpp cypress packages reporter node modules node sass src libsass src sass cpp cypress packages runner node modules node sass src libsass src node hpp cypress packages reporter node modules 
node sass src libsass src environment cpp cypress packages runner node modules node sass src libsass src extend hpp cypress packages runner node modules node sass src libsass src sass context hpp cypress packages runner node modules node sass src libsass src operators hpp cypress packages desktop gui node modules node sass src libsass src constants hpp cypress packages runner node modules node sass src libsass src sass hpp cypress packages runner node modules node sass src libsass src ast fwd decl cpp cypress packages desktop gui node modules node sass src libsass src parser hpp cypress packages runner node modules node sass src libsass src constants cpp cypress packages reporter node modules node sass src sass types list cpp cypress packages desktop gui node modules node sass src libsass src cssize cpp cypress packages runner node modules node sass src libsass include sass functions h cypress packages reporter node modules node sass src libsass src util cpp cypress packages reporter node modules node sass src custom function bridge cpp cypress packages desktop gui node modules node sass src custom importer bridge h cypress packages reporter node modules node sass src libsass src bind cpp cypress packages reporter node modules node sass src libsass src inspect cpp cypress packages desktop gui node modules node sass src libsass src sass functions cpp cypress packages desktop gui node modules node sass src libsass src backtrace cpp cypress packages reporter node modules node sass src libsass src extend cpp cypress packages reporter node modules node sass src sass types sass value wrapper h cypress packages desktop gui node modules node sass src libsass src debugger hpp cypress packages reporter node modules node sass src libsass src cencode c cypress packages reporter node modules node sass src libsass src cpp cypress packages desktop gui node modules node sass src sass types number cpp cypress packages desktop gui node modules node sass src sass types color h 
cypress packages desktop gui node modules node sass src libsass src c cypress packages desktop gui node modules node sass src libsass src position cpp cypress packages runner node modules node sass src libsass src remove placeholders hpp cypress packages desktop gui node modules node sass src libsass src sass values cpp cypress packages runner node modules node sass src libsass include sass values h cypress packages reporter node modules node sass src libsass test test subset map cpp cypress packages desktop gui node modules node sass src libsass src cpp cypress packages desktop gui node modules node sass src sass types null cpp cypress packages runner node modules node sass src libsass src ast cpp cypress packages runner node modules node sass src libsass include sass context h cypress packages desktop gui node modules node sass src libsass src to c cpp cypress packages runner node modules node sass src libsass src to value hpp cypress packages runner node modules node sass src libsass src color maps hpp cypress packages desktop gui node modules node sass src sass context wrapper cpp cypress packages reporter node modules node sass src libsass script test leaks pl cypress packages reporter node modules node sass src libsass src lexer hpp cypress packages reporter node modules node sass src libsass src memory sharedptr hpp cypress packages desktop gui node modules node sass src libsass src to c hpp cypress packages reporter node modules node sass src sass types map cpp cypress packages desktop gui node modules node sass src libsass src to value cpp cypress packages reporter node modules node sass src libsass src encode h cypress packages reporter node modules node sass src libsass src file hpp cypress packages reporter node modules node sass src libsass src environment hpp cypress packages runner node modules node sass src libsass src plugins hpp cypress packages desktop gui node modules node sass src binding cpp cypress packages desktop gui node modules node sass 
src libsass src sass context cpp cypress packages desktop gui node modules node sass src libsass src debug hpp vulnerability details in inspect cpp in libsass a high memory footprint caused by an endless loop containing a sass inspect operator sass string quoted stack frame may cause a denial of service via crafted sass input files with stray or characters publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href step up your open source security game with whitesource
0
3,350
6,486,694,260
IssuesEvent
2017-08-19 22:19:00
Great-Hill-Corporation/quickBlocks
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
closed
ethName binary cache
status-inprocess tools-ethName type-enhancement
ethName should store its data in a binary cache so it can be used by applications (getBlocks, etc, and monitors) to name accounts when exporting using the 'dispalyName' function
1.0
ethName binary cache - ethName should store its data in a binary cache so it can be used by applications (getBlocks, etc, and monitors) to name accounts when exporting using the 'dispalyName' function
process
ethname binary cache ethname should store its data in a binary cache so it can be used by applications getblocks etc and monitors to name accounts when exporting using the dispalyname function
1
8,316
11,485,801,095
IssuesEvent
2020-02-11 08:36:04
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
Move 'response to other organism' under interspecies interaction
multi-species process
This comes from the request here: https://github.com/geneontology/go-site/issues/1249 gorule-0000015 checks that if there is a value in the 'interacting taxon' column of the GAF file, the GO term annotated must be 'GO:0044419 : interspecies interaction between organisms' or a child. This causes problems 'response to other organism', which is not under 'interspecies interaction between organisms'. We will modify the ontology to fix that, since there are no terms that describe 'response to the same organism'. Thanks, Pascale
1.0
Move 'response to other organism' under interspecies interaction - This comes from the request here: https://github.com/geneontology/go-site/issues/1249 gorule-0000015 checks that if there is a value in the 'interacting taxon' column of the GAF file, the GO term annotated must be 'GO:0044419 : interspecies interaction between organisms' or a child. This causes problems 'response to other organism', which is not under 'interspecies interaction between organisms'. We will modify the ontology to fix that, since there are no terms that describe 'response to the same organism'. Thanks, Pascale
process
move response to other organism under interspecies interaction this comes from the request here gorule checks that if there is a value in the interacting taxon column of the gaf file the go term annotated must be go interspecies interaction between organisms or a child this causes problems response to other organism which is not under interspecies interaction between organisms we will modify the ontology to fix that since there are no terms that describe response to the same organism thanks pascale
1
128,598
5,071,871,918
IssuesEvent
2016-12-26 16:59:23
AlbatrossAvionics/Alba-2017
https://api.github.com/repos/AlbatrossAvionics/Alba-2017
opened
Creating the Android app
high priority
Create the Android app to be mounted on the airframe. It connects to the mbed via two-way BlueTooth communication. The main features are as follows. - [ ] Acquire and store various sensor values inside the Android device - [ ] Also send the values read by the avionics sensors over BlueTooth and store them, so the SD logger can be replaced and retired. - [ ] Use it as a display screen that shows sensor values to the pilot. - [ ] Use the camera feature to record flight video. - [ ] Also integrate the roll-alarm feature here; even if the roll alarm cannot be retired for volume reasons, sound both. - [ ] From the mbed, send String-type strings to operate the Android. SQLite is used for data storage.
1.0
Creating the Android app - Create the Android app to be mounted on the airframe. It connects to the mbed via two-way BlueTooth communication. The main features are as follows. - [ ] Acquire and store various sensor values inside the Android device - [ ] Also send the values read by the avionics sensors over BlueTooth and store them, so the SD logger can be replaced and retired. - [ ] Use it as a display screen that shows sensor values to the pilot. - [ ] Use the camera feature to record flight video. - [ ] Also integrate the roll-alarm feature here; even if the roll alarm cannot be retired for volume reasons, sound both. - [ ] From the mbed, send String-type strings to operate the Android. SQLite is used for data storage.
non_process
creating the android app create the android app to be mounted on the airframe it connects to the mbed via two way bluetooth communication the main features are as follows acquire and store various sensor values inside the android device also send the values read by the avionics sensors over bluetooth and store them so the sd logger can be replaced and retired use it as a display screen that shows sensor values to the pilot use the camera feature to record flight video also integrate the roll alarm feature here even if the roll alarm cannot be retired for volume reasons sound both from the mbed send string type strings to operate the android sqlite is used for data storage
0
12,390
14,908,753,025
IssuesEvent
2021-01-22 06:37:09
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
PM> Search bar text correction
Bug P2 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
**Describe the bug** search tab text to should be 'Search by study ID or name' instead of - Search BY Study ID or Name
3.0
PM> Search bar text correction - **Describe the bug** search tab text to should be 'Search by study ID or name' instead of - Search BY Study ID or Name
process
pm search bar text correction describe the bug search tab text to should be search by study id or name instead of search by study id or name
1
15,384
19,567,860,736
IssuesEvent
2022-01-04 04:59:22
alexrp/system-terminal
https://api.github.com/repos/alexrp/system-terminal
opened
Enable support for killing the entire process tree of a `ChildProcess`
type: feature state: blocked area: processes
We currently ignore the value of `entireProcessTree` since the `System.Diagnostics.Process` implementation appears to be broken on Windows. https://github.com/alexrp/system-terminal/blob/91e3a7ad8c80bb9db6fc25cb6dc3810e734d05a1/src/core/Processes/ChildProcess.cs#L169-L185 See: https://github.com/dotnet/runtime/issues/63328
1.0
Enable support for killing the entire process tree of a `ChildProcess` - We currently ignore the value of `entireProcessTree` since the `System.Diagnostics.Process` implementation appears to be broken on Windows. https://github.com/alexrp/system-terminal/blob/91e3a7ad8c80bb9db6fc25cb6dc3810e734d05a1/src/core/Processes/ChildProcess.cs#L169-L185 See: https://github.com/dotnet/runtime/issues/63328
process
enable support for killing the entire process tree of a childprocess we currently ignore the value of entireprocesstree since the system diagnostics process implementation appears to be broken on windows see
1
407,091
11,906,377,045
IssuesEvent
2020-03-30 20:14:48
internetarchive/openlibrary
https://api.github.com/repos/internetarchive/openlibrary
closed
Full re-index of solr data on prod
Lead: @cdrini Module: Docker Module: Solr Priority: 2 State: Work In Progress Type: Epic Type: Feature
This will be an important step into having a more reliable solr environment. Being able to locally create an *identical* solr environment will get rid of a lot of confusion. It would also allow us a path to move forward on #178 and #599 , since we can spin up a new solr, re-index it with the new settings, and then swap it with the old solr without any downtime. ## Subtasks - [x] #1055 Create docker image for solr - [x] Determine data on production solr - [x] Why are there `type: subject`? This looks like it's used for `/search/subjects`, so these needed to be included. - [x] Why are there `type: edition`? This looks like residuals of dead code for `/search/editions` (which [does appear](https://openlibrary.org/search/editions?q=Nosotros) to work for the measly ~3.5K editions stored in solr) - [x] Why isn't there any stats related data? `/solr/process_stats.py` looks like dead code. - [x] Ensure dev's config file is the same as prod's. -> Copied from prod into solrbuilder, so they _will_ be identical. - [ ] Create test solr on server.openjournal.foundation #2222 - [ ] Create Docker-based solr for production use - [x] Create solr environment on prod somewhere - [ ] Pause both solrupdaters - [ ] Copy OJF solr data to new prod environment - [ ] Link production to new solr endpoint - [ ] Destroy old solr endpoint ## Notes/Comments - I believe solr is storing viewage statistics as well as just works/authors themselves - [x] @mekarpeles Can you run this query on production solr: `NOT(type:work) AND NOT(type:author)`?
1.0
Full re-index of solr data on prod - This will be an important step into having a more reliable solr environment. Being able to locally create an *identical* solr environment will get rid of a lot of confusion. It would also allow us a path to move forward on #178 and #599 , since we can spin up a new solr, re-index it with the new settings, and then swap it with the old solr without any downtime. ## Subtasks - [x] #1055 Create docker image for solr - [x] Determine data on production solr - [x] Why are there `type: subject`? This looks like it's used for `/search/subjects`, so these needed to be included. - [x] Why are there `type: edition`? This looks like residuals of dead code for `/search/editions` (which [does appear](https://openlibrary.org/search/editions?q=Nosotros) to work for the measly ~3.5K editions stored in solr) - [x] Why isn't there any stats related data? `/solr/process_stats.py` looks like dead code. - [x] Ensure dev's config file is the same as prod's. -> Copied from prod into solrbuilder, so they _will_ be identical. - [ ] Create test solr on server.openjournal.foundation #2222 - [ ] Create Docker-based solr for production use - [x] Create solr environment on prod somewhere - [ ] Pause both solrupdaters - [ ] Copy OJF solr data to new prod environment - [ ] Link production to new solr endpoint - [ ] Destroy old solr endpoint ## Notes/Comments - I believe solr is storing viewage statistics as well as just works/authors themselves - [x] @mekarpeles Can you run this query on production solr: `NOT(type:work) AND NOT(type:author)`?
non_process
full re index of solr data on prod this will be an important step into having a more reliable solr environment being able to locally create an identical solr environment will get rid of a lot of confusion it would also allow us a path to move forward on and since we can spin up a new solr re index it with the new settings and then swap it with the old solr without any downtime subtasks create docker image for solr determine data on production solr why are there type subject this looks like it s used for search subjects so these needed to be included why are there type edition this looks like residuals of dead code for search editions which to work for the measly editions stored in solr why isn t there any stats related data solr process stats py looks like dead code ensure dev s config file is the same as prod s copied from prod into solrbuilder so they will be identical create test solr on server openjournal foundation create docker based solr for production use create solr environment on prod somewhere pause both solrupdaters copy ojf solr data to new prod environment link production to new solr endpoint destroy old solr endpoint notes comments i believe solr is storing viewage statistics as well as just works authors themselves mekarpeles can you run this query on production solr not type work and not type author
0
257,542
27,563,798,650
IssuesEvent
2023-03-08 01:07:17
billmcchesney1/page.js
https://api.github.com/repos/billmcchesney1/page.js
opened
CVE-2022-0144 (High) detected in shelljs-0.3.0.tgz
security vulnerability
## CVE-2022-0144 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>shelljs-0.3.0.tgz</b></p></summary> <p>Portable Unix shell commands for Node.js</p> <p>Library home page: <a href="https://registry.npmjs.org/shelljs/-/shelljs-0.3.0.tgz">https://registry.npmjs.org/shelljs/-/shelljs-0.3.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/shelljs/package.json</p> <p> Dependency Hierarchy: - jshint-2.12.0.tgz (Root Library) - :x: **shelljs-0.3.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> shelljs is vulnerable to Improper Privilege Management <p>Publish Date: 2022-01-11 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-0144>CVE-2022-0144</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-01-11</p> <p>Fix Resolution (shelljs): 0.8.5</p> <p>Direct dependency fix Resolution (jshint): 2.13.4</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue
True
CVE-2022-0144 (High) detected in shelljs-0.3.0.tgz - ## CVE-2022-0144 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>shelljs-0.3.0.tgz</b></p></summary> <p>Portable Unix shell commands for Node.js</p> <p>Library home page: <a href="https://registry.npmjs.org/shelljs/-/shelljs-0.3.0.tgz">https://registry.npmjs.org/shelljs/-/shelljs-0.3.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/shelljs/package.json</p> <p> Dependency Hierarchy: - jshint-2.12.0.tgz (Root Library) - :x: **shelljs-0.3.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> shelljs is vulnerable to Improper Privilege Management <p>Publish Date: 2022-01-11 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-0144>CVE-2022-0144</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-01-11</p> <p>Fix Resolution (shelljs): 0.8.5</p> <p>Direct dependency fix Resolution (jshint): 2.13.4</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue
non_process
cve high detected in shelljs tgz cve high severity vulnerability vulnerable library shelljs tgz portable unix shell commands for node js library home page a href path to dependency file package json path to vulnerable library node modules shelljs package json dependency hierarchy jshint tgz root library x shelljs tgz vulnerable library found in base branch master vulnerability details shelljs is vulnerable to improper privilege management publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution shelljs direct dependency fix resolution jshint rescue worker helmet automatic remediation is available for this issue
0
43,508
17,616,502,332
IssuesEvent
2021-08-18 10:21:40
hashicorp/terraform-provider-azurerm
https://api.github.com/repos/hashicorp/terraform-provider-azurerm
closed
Feature request: Add resource for managing Event Hubs Geo-Recovery Pairing
enhancement service/event-hubs
<!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Description <!--- Please leave a helpful description of the feature request here. ---> As a developer, I want to be able to manage Event Hubs Geo-Recovery Pairings using terraform so that I don't have to automate it using PowerShell/cli. APIs available here - https://docs.microsoft.com/en-us/rest/api/eventhub/disasterrecoveryconfigs ### New or Affected Resource(s) <!--- Please list the new or affected resources and data sources. 
---> * azurerm_eventhub_geopairing ### Potential Terraform Configuration <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl resource "azurerm_eventhub_namespace" "primary_namespace" { name = "my-hub-namespace-primary" resource_group_name = "my-eventhubs-group-east" location = "australia east" sku = "Standard" } resource "azurerm_eventhub_namespace" "secondary_namespace" { name = "my-hub-namespace-secondary" resource_group_name = "my-eventhubs-group-southeast" location = "australia southeast" sku = "Standard" } resource "azurerm_eventhub_geopairing" "geopairing" { alias = "my-hub-alias" resource_group_name = "my-eventhubs-group-east" primary_namespace = "${azurerm_eventhub_namespace.primary_namespace.name}" secondary_namespace = "${azurerm_eventhub_namespace.secondary_namespace.id}" # this needs to be an ID if in a different resource group } ``` ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example: * https://azure.microsoft.com/en-us/roadmap/virtual-network-service-endpoint-for-azure-cosmos-db/ ---> * N/A
1.0
Feature request: Add resource for managing Event Hubs Geo-Recovery Pairing - <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Description <!--- Please leave a helpful description of the feature request here. ---> As a developer, I want to be able to manage Event Hubs Geo-Recovery Pairings using terraform so that I don't have to automate it using PowerShell/cli. APIs available here - https://docs.microsoft.com/en-us/rest/api/eventhub/disasterrecoveryconfigs ### New or Affected Resource(s) <!--- Please list the new or affected resources and data sources. 
---> * azurerm_eventhub_geopairing ### Potential Terraform Configuration <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl resource "azurerm_eventhub_namespace" "primary_namespace" { name = "my-hub-namespace-primary" resource_group_name = "my-eventhubs-group-east" location = "australia east" sku = "Standard" } resource "azurerm_eventhub_namespace" "secondary_namespace" { name = "my-hub-namespace-secondary" resource_group_name = "my-eventhubs-group-southeast" location = "australia southeast" sku = "Standard" } resource "azurerm_eventhub_geopairing" "geopairing" { alias = "my-hub-alias" resource_group_name = "my-eventhubs-group-east" primary_namespace = "${azurerm_eventhub_namespace.primary_namespace.name}" secondary_namespace = "${azurerm_eventhub_namespace.secondary_namespace.id}" # this needs to be an ID if in a different resource group } ``` ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example: * https://azure.microsoft.com/en-us/roadmap/virtual-network-service-endpoint-for-azure-cosmos-db/ ---> * N/A
non_process
feature request add resource for managing event hubs geo recovery pairing community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description as a developer i want to be able to manage event hubs geo recovery pairings using terraform so that i don t have to automate it using powershell cli apis available here new or affected resource s azurerm eventhub geopairing potential terraform configuration hcl resource azurerm eventhub namespace primary namespace name my hub namespace primary resource group name my eventhubs group east location australia east sku standard resource azurerm eventhub namespace secondary namespace name my hub namespace secondary resource group name my eventhubs group southeast location australia southeast sku standard resource azurerm eventhub geopairing geopairing alias my hub alias resource group name my eventhubs group east primary namespace azurerm eventhub namespace primary namespace name secondary namespace azurerm eventhub namespace secondary namespace id this needs to be an id if in a different resource group references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor blog posts or documentation for example n a
0
57,392
14,144,528,343
IssuesEvent
2020-11-10 16:33:30
idonthaveafifaaddiction/ember-css-modules
https://api.github.com/repos/idonthaveafifaaddiction/ember-css-modules
opened
CVE-2018-3721 (Medium) detected in lodash-3.10.1.tgz
security vulnerability
## CVE-2018-3721 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-3.10.1.tgz</b></p></summary> <p>The modern build of lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p> <p>Path to dependency file: ember-css-modules/test-packages/old-app/node_modules/lodash/package.json</p> <p>Path to vulnerable library: ember-css-modules/test-packages/old-app/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - ember-cli-2.16.2.tgz (Root Library) - ember-try-0.2.23.tgz - cli-table2-0.2.0.tgz - :x: **lodash-3.10.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/idonthaveafifaaddiction/ember-css-modules/commit/b6388238f8785f04287241e9c02a302d3198c674">b6388238f8785f04287241e9c02a302d3198c674</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> lodash node module before 4.17.5 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via defaultsDeep, merge, and mergeWith functions, which allows a malicious user to modify the prototype of "Object" via __proto__, causing the addition or modification of an existing property that will exist on all objects. 
<p>Publish Date: 2018-06-07 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-3721>CVE-2018-3721</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-3721">https://nvd.nist.gov/vuln/detail/CVE-2018-3721</a></p> <p>Release Date: 2018-06-07</p> <p>Fix Resolution: 4.17.5</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"3.10.1","isTransitiveDependency":true,"dependencyTree":"ember-cli:2.16.2;ember-try:0.2.23;cli-table2:0.2.0;lodash:3.10.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.17.5"}],"vulnerabilityIdentifier":"CVE-2018-3721","vulnerabilityDetails":"lodash node module before 4.17.5 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via defaultsDeep, merge, and mergeWith functions, which allows a malicious user to modify the prototype of \"Object\" via __proto__, causing the addition or modification of an existing property that will exist on all 
objects.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-3721","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2018-3721 (Medium) detected in lodash-3.10.1.tgz - ## CVE-2018-3721 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-3.10.1.tgz</b></p></summary> <p>The modern build of lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p> <p>Path to dependency file: ember-css-modules/test-packages/old-app/node_modules/lodash/package.json</p> <p>Path to vulnerable library: ember-css-modules/test-packages/old-app/node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - ember-cli-2.16.2.tgz (Root Library) - ember-try-0.2.23.tgz - cli-table2-0.2.0.tgz - :x: **lodash-3.10.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/idonthaveafifaaddiction/ember-css-modules/commit/b6388238f8785f04287241e9c02a302d3198c674">b6388238f8785f04287241e9c02a302d3198c674</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> lodash node module before 4.17.5 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via defaultsDeep, merge, and mergeWith functions, which allows a malicious user to modify the prototype of "Object" via __proto__, causing the addition or modification of an existing property that will exist on all objects. 
<p>Publish Date: 2018-06-07 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-3721>CVE-2018-3721</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-3721">https://nvd.nist.gov/vuln/detail/CVE-2018-3721</a></p> <p>Release Date: 2018-06-07</p> <p>Fix Resolution: 4.17.5</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"3.10.1","isTransitiveDependency":true,"dependencyTree":"ember-cli:2.16.2;ember-try:0.2.23;cli-table2:0.2.0;lodash:3.10.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.17.5"}],"vulnerabilityIdentifier":"CVE-2018-3721","vulnerabilityDetails":"lodash node module before 4.17.5 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via defaultsDeep, merge, and mergeWith functions, which allows a malicious user to modify the prototype of \"Object\" via __proto__, causing the addition or modification of an existing property that will exist on all 
objects.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-3721","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in lodash tgz cve medium severity vulnerability vulnerable library lodash tgz the modern build of lodash modular utilities library home page a href path to dependency file ember css modules test packages old app node modules lodash package json path to vulnerable library ember css modules test packages old app node modules lodash package json dependency hierarchy ember cli tgz root library ember try tgz cli tgz x lodash tgz vulnerable library found in head commit a href found in base branch master vulnerability details lodash node module before suffers from a modification of assumed immutable data maid vulnerability via defaultsdeep merge and mergewith functions which allows a malicious user to modify the prototype of object via proto causing the addition or modification of an existing property that will exist on all objects publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails lodash node module before suffers from a modification of assumed immutable data maid vulnerability via defaultsdeep merge and mergewith functions which allows a malicious user to modify the prototype of object via proto causing the addition or modification of an existing property that will exist on all objects vulnerabilityurl
0
21,344
29,144,877,851
IssuesEvent
2023-05-18 01:20:01
googleapis/google-api-go-client
https://api.github.com/repos/googleapis/google-api-go-client
closed
chore(ci): Fix AutoApprove job to work on discogen PRs
type: process
The AutoApprove workflow hasn't work since the client generation job switched branches/stacks. We should fix this so that we don't need to manually handle green regen PRs.
1.0
chore(ci): Fix AutoApprove job to work on discogen PRs - The AutoApprove workflow hasn't work since the client generation job switched branches/stacks. We should fix this so that we don't need to manually handle green regen PRs.
process
chore ci fix autoapprove job to work on discogen prs the autoapprove workflow hasn t work since the client generation job switched branches stacks we should fix this so that we don t need to manually handle green regen prs
1
455,227
13,113,725,160
IssuesEvent
2020-08-05 06:13:36
wso2/product-apim
https://api.github.com/repos/wso2/product-apim
opened
Changing the API Level Throttling Policy for API Products is not effective
Priority/Normal Type/Bug
### Description: After changing the API Level Throttling policy for API Products for the first time after creating it, the latter policies that it gets changed to does not become effective. So if we need to change the API Level Throttling policy in API Products after first creation, we have to delete the product and recreate it with the new API Level throttling policy that we need to change to. ### Steps to reproduce: 1. Create 2 advanced level throttling policies from the Admin Portal, say policy A and policy B. 2. Create and API with operational level throttling selected and policy "Unlimited". 3. Create an API Product that includes one or more resources of this API. 4. Go into the "Resources" page of the API Product and select the throttling policy level as "API Level" and select policy A as the throttling policy. Then save. 5. Invoke the API Product and observe that throttling happens as expected. 6. Then change the API Level throttling policy to policy B in the API Product. 7. Try invoking again. The new throttling policy is not effective. It does not get throttled out as expected. <img width="1470" alt="Screenshot 2020-08-05 at 11 42 12" src="https://user-images.githubusercontent.com/8557410/89378135-d5860480-d710-11ea-849d-9e9f53f6ad9c.png">
1.0
Changing the API Level Throttling Policy for API Products is not effective - ### Description: After changing the API Level Throttling policy for API Products for the first time after creating it, the latter policies that it gets changed to does not become effective. So if we need to change the API Level Throttling policy in API Products after first creation, we have to delete the product and recreate it with the new API Level throttling policy that we need to change to. ### Steps to reproduce: 1. Create 2 advanced level throttling policies from the Admin Portal, say policy A and policy B. 2. Create and API with operational level throttling selected and policy "Unlimited". 3. Create an API Product that includes one or more resources of this API. 4. Go into the "Resources" page of the API Product and select the throttling policy level as "API Level" and select policy A as the throttling policy. Then save. 5. Invoke the API Product and observe that throttling happens as expected. 6. Then change the API Level throttling policy to policy B in the API Product. 7. Try invoking again. The new throttling policy is not effective. It does not get throttled out as expected. <img width="1470" alt="Screenshot 2020-08-05 at 11 42 12" src="https://user-images.githubusercontent.com/8557410/89378135-d5860480-d710-11ea-849d-9e9f53f6ad9c.png">
non_process
changing the api level throttling policy for api products is not effective description after changing the api level throttling policy for api products for the first time after creating it the latter policies that it gets changed to does not become effective so if we need to change the api level throttling policy in api products after first creation we have to delete the product and recreate it with the new api level throttling policy that we need to change to steps to reproduce create advanced level throttling policies from the admin portal say policy a and policy b create and api with operational level throttling selected and policy unlimited create an api product that includes one or more resources of this api go into the resources page of the api product and select the throttling policy level as api level and select policy a as the throttling policy then save invoke the api product and observe that throttling happens as expected then change the api level throttling policy to policy b in the api product try invoking again the new throttling policy is not effective it does not get throttled out as expected img width alt screenshot at src
0
322,464
23,908,459,728
IssuesEvent
2022-09-09 05:11:15
casdoor/casdoor-wechat-miniprogram-example
https://api.github.com/repos/casdoor/casdoor-wechat-miniprogram-example
closed
Improve docs and README
documentation
1. Add this repo link to docs: https://casdoor.org/docs/integration/wechat_miniprogram 2. Copy the full docs from: https://casdoor.org/docs/integration/wechat_miniprogram to this repo's README
1.0
Improve docs and README - 1. Add this repo link to docs: https://casdoor.org/docs/integration/wechat_miniprogram 2. Copy the full docs from: https://casdoor.org/docs/integration/wechat_miniprogram to this repo's README
non_process
improve docs and readme add this repo link to docs copy the full docs from to this repo s readme
0
460,613
13,213,636,189
IssuesEvent
2020-08-16 13:50:22
rism-ch/verovio
https://api.github.com/repos/rism-ch/verovio
closed
space + clef + layer interaction
enhancement low priority
In this example: <img width="277" alt="Screen Shot 2020-08-16 at 12 00 31 AM" src="https://user-images.githubusercontent.com/3487289/90328741-a2831280-df53-11ea-88d0-94870224a813.png"> The dotted-quarter note is displayed incorrectly on the bass staff. It should instead be a G3 on the treble staff. MEI data: ```xml <?xml version="1.0" encoding="UTF-8"?> <?xml-model href="https://music-encoding.org/schema/4.0.0/mei-all.rng" type="application/xml" schematypens="http://relaxng.org/ns/structure/1.0"?> <?xml-model href="https://music-encoding.org/schema/4.0.0/mei-all.rng" type="application/xml" schematypens="http://purl.oclc.org/dsdl/schematron"?> <mei xmlns="http://www.music-encoding.org/ns/mei" meiversion="4.0.0"> <meiHead> <fileDesc> <titleStmt> <title /> </titleStmt> <pubStmt /> </fileDesc> <encodingDesc> <appInfo> <application isodate="2020-08-16T00:00:09" version="3.0.0-dev-1769075"> <name>Verovio</name> <p>Transcoded from Humdrum</p> </application> </appInfo> </encodingDesc> <workList> <work> <title /> </work> </workList> </meiHead> <music> <body> <mdiv xml:id="mdiv-0000000823140314"> <score xml:id="score-0000000310451684"> <scoreDef xml:id="scoredef-0000000298237118"> <staffGrp xml:id="staffgrp-0000001780873375"> <staffDef xml:id="staffdef-0000000208127384" n="1" lines="5"> <clef xml:id="clef-L2F1" shape="F" line="4" /> <meterSig xml:id="metersig-L3F1" count="2" unit="4" /> </staffDef> </staffGrp> </scoreDef> <section xml:id="section-L1F1"> <measure xml:id="measure-L1" n="1"> <staff xml:id="staff-0000001061287784" n="1"> <layer xml:id="layer-L1F1N1" n="1"> <note xml:id="note-L5F1" dur="8" oct="2" pname="g" accid.ges="n" /> <clef xml:id="clef-L6F1" shape="G" line="2" /> <beam xml:id="beam-L8F1-L10F1"> <note xml:id="note-L8F1" dur="8" oct="3" pname="b" accid.ges="n" /> <note xml:id="note-L9F1" dur="8" oct="4" pname="c" accid.ges="n" /> <note xml:id="note-L10F1" dur="8" oct="4" pname="d" accid.ges="n" /> </beam> </layer> <layer xml:id="layer-L8F2N2" n="2"> <space xml:id="space-0000001126558200" dur="8" /> <note xml:id="note-L8F2" dots="1" dur="4" oct="3" pname="g" accid.ges="n" /> </layer> </staff> </measure> </section> </score> </mdiv> </body> </music> </mei> ```
1.0
space + clef + layer interaction - In this example: <img width="277" alt="Screen Shot 2020-08-16 at 12 00 31 AM" src="https://user-images.githubusercontent.com/3487289/90328741-a2831280-df53-11ea-88d0-94870224a813.png"> The dotted-quarter note is displayed incorrectly on the bass staff. It should instead be a G3 on the treble staff. MEI data: ```xml <?xml version="1.0" encoding="UTF-8"?> <?xml-model href="https://music-encoding.org/schema/4.0.0/mei-all.rng" type="application/xml" schematypens="http://relaxng.org/ns/structure/1.0"?> <?xml-model href="https://music-encoding.org/schema/4.0.0/mei-all.rng" type="application/xml" schematypens="http://purl.oclc.org/dsdl/schematron"?> <mei xmlns="http://www.music-encoding.org/ns/mei" meiversion="4.0.0"> <meiHead> <fileDesc> <titleStmt> <title /> </titleStmt> <pubStmt /> </fileDesc> <encodingDesc> <appInfo> <application isodate="2020-08-16T00:00:09" version="3.0.0-dev-1769075"> <name>Verovio</name> <p>Transcoded from Humdrum</p> </application> </appInfo> </encodingDesc> <workList> <work> <title /> </work> </workList> </meiHead> <music> <body> <mdiv xml:id="mdiv-0000000823140314"> <score xml:id="score-0000000310451684"> <scoreDef xml:id="scoredef-0000000298237118"> <staffGrp xml:id="staffgrp-0000001780873375"> <staffDef xml:id="staffdef-0000000208127384" n="1" lines="5"> <clef xml:id="clef-L2F1" shape="F" line="4" /> <meterSig xml:id="metersig-L3F1" count="2" unit="4" /> </staffDef> </staffGrp> </scoreDef> <section xml:id="section-L1F1"> <measure xml:id="measure-L1" n="1"> <staff xml:id="staff-0000001061287784" n="1"> <layer xml:id="layer-L1F1N1" n="1"> <note xml:id="note-L5F1" dur="8" oct="2" pname="g" accid.ges="n" /> <clef xml:id="clef-L6F1" shape="G" line="2" /> <beam xml:id="beam-L8F1-L10F1"> <note xml:id="note-L8F1" dur="8" oct="3" pname="b" accid.ges="n" /> <note xml:id="note-L9F1" dur="8" oct="4" pname="c" accid.ges="n" /> <note xml:id="note-L10F1" dur="8" oct="4" pname="d" accid.ges="n" /> </beam> </layer> <layer xml:id="layer-L8F2N2" n="2"> <space xml:id="space-0000001126558200" dur="8" /> <note xml:id="note-L8F2" dots="1" dur="4" oct="3" pname="g" accid.ges="n" /> </layer> </staff> </measure> </section> </score> </mdiv> </body> </music> </mei> ```
non_process
space clef layer interaction in this example img width alt screen shot at am src the dotted quarter note is displayed incorrectly on the bass staff it should instead be a on the treble staff mei data xml xml model href type application xml schematypens xml model href type application xml schematypens verovio transcoded from humdrum
0
35,077
7,548,582,406
IssuesEvent
2018-04-18 11:44:16
primefaces/primeng
https://api.github.com/repos/primefaces/primeng
closed
Listbox readonly property doesn't work
defect
### There is no guarantee in receiving an immediate response in GitHub Issue Tracker, If you'd like to secure our response, you may consider *PrimeNG PRO Support* where support is provided within 4 business hours **I'm submitting a ...** (check one with "x") ``` [x] bug report => Search github for a similar issue or PR before submitting [ ] feature request => Please check if request is not on the roadmap already https://github.com/primefaces/primeng/wiki/Roadmap [ ] support request => Please do not submit support request here, instead see http://forum.primefaces.org/viewforum.php?f=35 ``` The **readonly** property of the listbox control does not work when trying to set to false. Tried the following: ``` readonly="!editableSw" [readonly]="!editableSw" [readonly]="false" ``` * **Angular version:** 5.0.0 <!-- Check whether this is still an issue in the most recent Angular version --> * **PrimeNG version:** 5.2.0 <!-- Check whether this is still an issue in the most recent Angular version -->
1.0
Listbox readonly property doesn't work - ### There is no guarantee in receiving an immediate response in GitHub Issue Tracker, If you'd like to secure our response, you may consider *PrimeNG PRO Support* where support is provided within 4 business hours **I'm submitting a ...** (check one with "x") ``` [x] bug report => Search github for a similar issue or PR before submitting [ ] feature request => Please check if request is not on the roadmap already https://github.com/primefaces/primeng/wiki/Roadmap [ ] support request => Please do not submit support request here, instead see http://forum.primefaces.org/viewforum.php?f=35 ``` The **readonly** property of the listbox control does not work when trying to set to false. Tried the following: ``` readonly="!editableSw" [readonly]="!editableSw" [readonly]="false" ``` * **Angular version:** 5.0.0 <!-- Check whether this is still an issue in the most recent Angular version --> * **PrimeNG version:** 5.2.0 <!-- Check whether this is still an issue in the most recent Angular version -->
non_process
listbox readonly property doesn t work there is no guarantee in receiving an immediate response in github issue tracker if you d like to secure our response you may consider primeng pro support where support is provided within business hours i m submitting a check one with x bug report search github for a similar issue or pr before submitting feature request please check if request is not on the roadmap already support request please do not submit support request here instead see the readonly property of the listbox control does not work when trying to set to false tried the following readonly editablesw editablesw false angular version primeng version
0
9,282
3,267,295,782
IssuesEvent
2015-10-23 02:09:09
boostorg/hana
https://api.github.com/repos/boostorg/hana
closed
Provide actual links to FP concepts
documentation enhancement
The documentation of FP concepts should provide actual links to external resources explaining them. - [x] Functor - [x] Applicative - [x] Monad - [x] Comonad - [x] MonadPlus
1.0
Provide actual links to FP concepts - The documentation of FP concepts should provide actual links to external resources explaining them. - [x] Functor - [x] Applicative - [x] Monad - [x] Comonad - [x] MonadPlus
non_process
provide actual links to fp concepts the documentation of fp concepts should provide actual links to external resources explaining them functor applicative monad comonad monadplus
0
22,158
30,700,097,337
IssuesEvent
2023-07-26 22:14:53
redpanda-data/documentation
https://api.github.com/repos/redpanda-data/documentation
opened
Use the headings "Suggested reading" and "Suggested videos" for blog, videos, other suggested content
content gap usability improvement P3 Internal Doc Process backport
### Issue description When appropriate, all topics should end with a "Suggested reading" (links to blogs, etc.) and/or "Suggested video" headings. When making this update, please add the links in the doc https://docs.google.com/document/d/1Xh45SFEq3dIA1_gGMgblZtSIjm35eNt0uodPbARLtFo/edit to the appropriate contexts. ### Updates to existing documentation These updates will hit dozens of pages in our library. The only page that appears to have different wording is in the [Introduction to Redpanda, which has "Suggested links"](https://docs.redpanda.com/docs/get-started/intro-to-events/#suggested-links)--this should be divided up as requested above. ### Link to Redpanda Slack conversation n/a
1.0
Use the headings "Suggested reading" and "Suggested videos" for blog, videos, other suggested content - ### Issue description When appropriate, all topics should end with a "Suggested reading" (links to blogs, etc.) and/or "Suggested video" headings. When making this update, please add the links in the doc https://docs.google.com/document/d/1Xh45SFEq3dIA1_gGMgblZtSIjm35eNt0uodPbARLtFo/edit to the appropriate contexts. ### Updates to existing documentation These updates will hit dozens of pages in our library. The only page that appears to have different wording is in the [Introduction to Redpanda, which has "Suggested links"](https://docs.redpanda.com/docs/get-started/intro-to-events/#suggested-links)--this should be divided up as requested above. ### Link to Redpanda Slack conversation n/a
process
use the headings suggested reading and suggested videos for blog videos other suggested content issue description when appropriate all topics should end with a suggested reading links to blogs etc and or suggested video headings when making this update please add the links in the doc to the appropriate contexts updates to existing documentation these updates will hit dozens of pages in our library the only page that appears to have different wording is in the should be divided up as requested above link to redpanda slack conversation n a
1
631,104
20,144,211,738
IssuesEvent
2022-02-09 04:44:48
wso2/product-apim
https://api.github.com/repos/wso2/product-apim
opened
Pass Authentication call(HTTPS) of the Choreo analytics via a proxy service.
Type/Improvement Priority/Normal
### Describe your problem(s) Currently, the authentication endpoint for the Choreo analytics cannot pass via a proxy and this will be a limitation when using proxies in the environment. ### Describe your solution N/A ### How will you implement it N/A
1.0
Pass Authentication call(HTTPS) of the Choreo analytics via a proxy service. - ### Describe your problem(s) Currently, the authentication endpoint for the Choreo analytics cannot pass via a proxy and this will be a limitation when using proxies in the environment. ### Describe your solution N/A ### How will you implement it N/A
non_process
pass authentication call https of the choreo analytics via a proxy service describe your problem s currently the authentication endpoint for the choreo analytics cannot pass via a proxy and this will be a limitation when using proxies in the environment describe your solution n a how will you implement it n a
0
12,032
14,738,612,097
IssuesEvent
2021-01-07 05:15:34
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Updated Account List with Aging Issues
anc-process anp-1 ant-parent/primary ant-support
In GitLab by @kdjstudios on Jun 28, 2018, 09:57 Hello Team, We are requesting an updated list with all Internal accounts that having aging issues. Just like with our external client lists we provided please show what the aging issue is and a brief explanation of what will be needed to correct it for each account. NOTE: I believe in the past these types of scenarios have lead to false accounts on previous lists; there are probably others that I have forgotten too. - Accounts with Draft invoices. - Accounts with Pending charges and payments. - Site's in the middle of processing a billing cycle.
1.0
Updated Account List with Aging Issues - In GitLab by @kdjstudios on Jun 28, 2018, 09:57 Hello Team, We are requesting an updated list with all Internal accounts that having aging issues. Just like with our external client lists we provided please show what the aging issue is and a brief explanation of what will be needed to correct it for each account. NOTE: I believe in the past these types of scenarios have lead to false accounts on previous lists; there are probably others that I have forgotten too. - Accounts with Draft invoices. - Accounts with Pending charges and payments. - Site's in the middle of processing a billing cycle.
process
updated account list with aging issues in gitlab by kdjstudios on jun hello team we are requesting an updated list with all internal accounts that having aging issues just like with our external client lists we provided please show what the aging issue is and a brief explanation of what will be needed to correct it for each account note i believe in the past these types of scenarios have lead to false accounts on previous lists there are probably others that i have forgotten too accounts with draft invoices accounts with pending charges and payments site s in the middle of processing a billing cycle
1
270,509
28,962,278,117
IssuesEvent
2023-05-10 04:19:27
nidhi7598/external_curl_AOSP10_r33
https://api.github.com/repos/nidhi7598/external_curl_AOSP10_r33
opened
CVE-2023-27534 (High) detected in curlcurl-7_64_1
Mend: dependency security vulnerability
## CVE-2023-27534 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>curlcurl-7_64_1</b></p></summary> <p> <p>A command line tool and library for transferring data with URL syntax, supporting HTTP, HTTPS, FTP, FTPS, GOPHER, TFTP, SCP, SFTP, SMB, TELNET, DICT, LDAP, LDAPS, FILE, IMAP, SMTP, POP3, RTSP and RTMP. libcurl offers a myriad of powerful features</p> <p>Library home page: <a href=https://github.com/curl/curl.git>https://github.com/curl/curl.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/nidhi7598/external_curl_AOSP10_r33/commit/481a49fc7dbc30e43cd670ab40fa6cca41715464">481a49fc7dbc30e43cd670ab40fa6cca41715464</a></p> <p>Found in base branch: <b>main</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/lib/curl_path.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> A path traversal vulnerability exists in curl <8.0.0 SFTP implementation causes the tilde (~) character to be wrongly replaced when used as a prefix in the first path element, in addition to its intended use as the first element to indicate a path relative to the user's home directory. Attackers can exploit this flaw to bypass filtering or execute arbitrary code by crafting a path like /~2/foo while accessing a server with a specific user. <p>Publish Date: 2023-03-30 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-27534>CVE-2023-27534</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://seclists.org/oss-sec/2023/q1/175">https://seclists.org/oss-sec/2023/q1/175</a></p> <p>Release Date: 2023-03-03</p> <p>Fix Resolution: curl-8_0_0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2023-27534 (High) detected in curlcurl-7_64_1 - ## CVE-2023-27534 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>curlcurl-7_64_1</b></p></summary> <p> <p>A command line tool and library for transferring data with URL syntax, supporting HTTP, HTTPS, FTP, FTPS, GOPHER, TFTP, SCP, SFTP, SMB, TELNET, DICT, LDAP, LDAPS, FILE, IMAP, SMTP, POP3, RTSP and RTMP. libcurl offers a myriad of powerful features</p> <p>Library home page: <a href=https://github.com/curl/curl.git>https://github.com/curl/curl.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/nidhi7598/external_curl_AOSP10_r33/commit/481a49fc7dbc30e43cd670ab40fa6cca41715464">481a49fc7dbc30e43cd670ab40fa6cca41715464</a></p> <p>Found in base branch: <b>main</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/lib/curl_path.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> A path traversal vulnerability exists in curl <8.0.0 SFTP implementation causes the tilde (~) character to be wrongly replaced when used as a prefix in the first path element, in addition to its intended use as the first element to indicate a path relative to the user's home directory. Attackers can exploit this flaw to bypass filtering or execute arbitrary code by crafting a path like /~2/foo while accessing a server with a specific user. <p>Publish Date: 2023-03-30 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-27534>CVE-2023-27534</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://seclists.org/oss-sec/2023/q1/175">https://seclists.org/oss-sec/2023/q1/175</a></p> <p>Release Date: 2023-03-03</p> <p>Fix Resolution: curl-8_0_0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in curlcurl cve high severity vulnerability vulnerable library curlcurl a command line tool and library for transferring data with url syntax supporting http https ftp ftps gopher tftp scp sftp smb telnet dict ldap ldaps file imap smtp rtsp and rtmp libcurl offers a myriad of powerful features library home page a href found in head commit a href found in base branch main vulnerable source files lib curl path c vulnerability details a path traversal vulnerability exists in curl sftp implementation causes the tilde character to be wrongly replaced when used as a prefix in the first path element in addition to its intended use as the first element to indicate a path relative to the user s home directory attackers can exploit this flaw to bypass filtering or execute arbitrary code by crafting a path like foo while accessing a server with a specific user publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution curl step up your open source security game with mend
0
19,543
5,903,213,843
IssuesEvent
2017-05-19 05:40:38
dickschoeller/gedbrowser
https://api.github.com/repos/dickschoeller/gedbrowser
closed
Menu should be a fragment
code smell in progress
:hankey: Currently, each page implements its own identical toolbar. Because of this, we have the following problems: - The renderers expose control methods that should be a in separate object. This is both a poor separation of concerns and requires an unnecessary facade. - Each template has a block of code that id very similar but not identical. These have to be kept in sync. - It is a non-standard way to do things when there is a perfectly good standard.
1.0
Menu should be a fragment - :hankey: Currently, each page implements its own identical toolbar. Because of this, we have the following problems: - The renderers expose control methods that should be a in separate object. This is both a poor separation of concerns and requires an unnecessary facade. - Each template has a block of code that id very similar but not identical. These have to be kept in sync. - It is a non-standard way to do things when there is a perfectly good standard.
non_process
menu should be a fragment hankey currently each page implements its own identical toolbar because of this we have the following problems the renderers expose control methods that should be a in separate object this is both a poor separation of concerns and requires an unnecessary facade each template has a block of code that id very similar but not identical these have to be kept in sync it is a non standard way to do things when there is a perfectly good standard
0
18,020
24,032,777,902
IssuesEvent
2022-09-15 16:18:40
googleapis/java-apigee-registry
https://api.github.com/repos/googleapis/java-apigee-registry
opened
Your .repo-metadata.json file has a problem 🤒
type: process repo-metadata: lint
You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'apigee-registry' invalid in .repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
1.0
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'apigee-registry' invalid in .repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
process
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 api shortname apigee registry invalid in repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
1
21,714
30,214,936,728
IssuesEvent
2023-07-05 14:59:30
GoogleCloudPlatform/python-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
opened
Retire Media Translation API samples
priority: p2 type: process
Due to the [announced] discontinue of the Media Translation API, the code samples for the API, located under [ /media-translation/snippets][folder], have to be removed. [announced]: https://cloud.google.com/translate/media/docs/deprecations [folder]: https://github.com/GoogleCloudPlatform/python-docs-samples/tree/bd7cf36bfcd738c06e861c7d936a0a8e3276d46e/media-translation/snippets
1.0
Retire Media Translation API samples - Due to the [announced] discontinue of the Media Translation API, the code samples for the API, located under [ /media-translation/snippets][folder], have to be removed. [announced]: https://cloud.google.com/translate/media/docs/deprecations [folder]: https://github.com/GoogleCloudPlatform/python-docs-samples/tree/bd7cf36bfcd738c06e861c7d936a0a8e3276d46e/media-translation/snippets
process
retire media translation api samples due to the discontinue of the media translation api the code samples for the api located under have to be removed
1
76,276
21,320,999,488
IssuesEvent
2022-04-17 04:14:57
goharbor/harbor
https://api.github.com/repos/goharbor/harbor
closed
Investigate distroless for building Harbor's images.
area/build kind/spike staled
Investigate the eligibility to build Harbor's images on top of distroless: https://github.com/GoogleContainerTools/distroless To see if we can achieve the goals: 1) Mitigate CVEs 2) Reduce the size We should also measure the impact to debugability as it doesn't contain shell, we may figure out best practice for debugging if we adopt it.
1.0
Investigate distroless for building Harbor's images. - Investigate the eligibility to build Harbor's images on top of distroless: https://github.com/GoogleContainerTools/distroless To see if we can achieve the goals: 1) Mitigate CVEs 2) Reduce the size We should also measure the impact to debugability as it doesn't contain shell, we may figure out best practice for debugging if we adopt it.
non_process
investigate distroless for building harbor s images investigate the eligibility to build harbor s images on top of distroless to see if we can achieve the goals mitigate cves reduce the size we should also measure the impact to debugability as it doesn t contain shell we may figure out best practice for debugging if we adopt it
0
57,728
24,197,076,613
IssuesEvent
2022-09-24 03:02:49
hashicorp/terraform-provider-azurerm
https://api.github.com/repos/hashicorp/terraform-provider-azurerm
closed
Create Azure Cognitive Service with Customer Managed Key
enhancement service/cognitive-services
### Is there an existing issue for this? - [X] I have searched the existing issues ### Community Note <!--- Please keep this note for the community ---> * Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Description Similar to #14426 Currently, users are unable to specify a customer managed key as part of the creation of 'azurerm_cognitive_account'. The only option is to create the key after the deployment using 'azurerm_cognitive_account_customer_managed_key'. Some organizations employ Azure Policies that block any deployment without a customer managed key; and since the above approach is not atomic, their cognitive account deployments fail. We want the option to create the cognitive account with Customer Managed Key at the same time. Azure started supporting User Managed Identity for cognitive accounts, which gives the ability to have the storage account created with User Managed Identity and Customer Managed Keys. Some users have already successfully tested the use of the Azure API to create a storage account with User managed identity and CMK; and it works as expected. ### New or Affected Resource(s)/Data Source(s) azurerm_cognitive_account ### Potential Terraform Configuration ```hcl resource "azurerm_cognitive_account" "example" { name = "examplestor" resource_group_name = azurerm_resource_group.example.name location = azurerm_resource_group.example.location account_tier = "Standard" identity { type = "UserAssigned" } } resource "azurerm_cognitive_account_customer_managed_key" "example" { cognitive_account_id = azurerm_cognitive_account.example.id key_vault_id = azurerm_key_vault.example.id key_name = azurerm_key_vault_key.example.name uami_id = azurerm_cognitive_account.example.identity.id } ``` ### References None.
2.0
Create Azure Cognitive Service with Customer Managed Key - ### Is there an existing issue for this? - [X] I have searched the existing issues ### Community Note <!--- Please keep this note for the community ---> * Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Description Similar to #14426 Currently, users are unable to specify a customer managed key as part of the creation of 'azurerm_cognitive_account'. The only option is to create the key after the deployment using 'azurerm_cognitive_account_customer_managed_key'. Some organizations employ Azure Policies that block any deployment without a customer managed key; and since the above approach is not atomic, their cognitive account deployments fail. We want the option to create the cognitive account with Customer Managed Key at the same time. Azure started supporting User Managed Identity for cognitive accounts, which gives the ability to have the storage account created with User Managed Identity and Customer Managed Keys. Some users have already successfully tested the use of the Azure API to create a storage account with User managed identity and CMK; and it works as expected. 
### New or Affected Resource(s)/Data Source(s) azurerm_cognitive_account ### Potential Terraform Configuration ```hcl resource "azurerm_cognitive_account" "example" { name = "examplestor" resource_group_name = azurerm_resource_group.example.name location = azurerm_resource_group.example.location account_tier = "Standard" identity { type = "UserAssigned" } } resource "azurerm_cognitive_account_customer_managed_key" "example" { cognitive_account_id = azurerm_cognitive_account.example.id key_vault_id = azurerm_key_vault.example.id key_name = azurerm_key_vault_key.example.name uami_id = azurerm_cognitive_account.example.identity.id } ``` ### References None.
non_process
create azure cognitive service with customer managed key is there an existing issue for this i have searched the existing issues community note please vote on this issue by adding a thumbsup to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description similar to currently users are unable to specify a customer managed key as part of the creation of azurerm cognitive account the only option is to create the key after the deployment using azurerm cognitive account customer managed key some organizations employ azure policies that block any deployment without a customer managed key and since the above approach is not atomic their cognitive account deployments fail we want the option to create the cognitive account with customer managed key at the same time azure started supporting user managed identity for cognitive accounts which gives the ability to have the storage account created with user managed identity and customer managed keys some users have already successfully tested the use of the azure api to create a storage account with user managed identity and cmk and it works as expected new or affected resource s data source s azurerm cognitive account potential terraform configuration hcl resource azurerm cognitive account example name examplestor resource group name azurerm resource group example name location azurerm resource group example location account tier standard identity type userassigned resource azurerm cognitive account customer managed key example cognitive account id azurerm cognitive account example id key vault id azurerm key vault example id key name azurerm key vault key example name uami id azurerm cognitive account example identity id references none
0
28,615
8,194,626,348
IssuesEvent
2018-08-31 00:40:54
ApolloAuto/apollo
https://api.github.com/repos/ApolloAuto/apollo
closed
Issue running offline demo
Module: Build Type: Help wanted
Hi there! I am new to Apollo. I was following the steps on running the offline demo at https://github.com/ApolloAuto/apollo/tree/master/docs/demo_guide. Everything looks fine until the final step. I was able to play the demo_2.0.bag loop in my terminal and was able to start Dreamview in my Mozilla. But no vehicle was shown and I got an error msg in my terminal: [ERROR] [1535613845.882991023]: Client [/localization] wants topic /apollo/sensor/gnss/corrected_imu to have datatype/md5sum [pb_msgs/CorrectedImu/81aef4a818ce273a8af85a440ccdb0f7], but our version has [pb_msgs/Imu/bdef0ba51869607ed95736d41e80c1f5]. Dropping connection. Has anyone got any idea what is the issue? Thanks a lot. Best, Kai
1.0
Issue running offline demo - Hi there! I am new to Apollo. I was following the steps on running the offline demo at https://github.com/ApolloAuto/apollo/tree/master/docs/demo_guide. Everything looks fine until the final step. I was able to play the demo_2.0.bag loop in my terminal and was able to start Dreamview in my Mozilla. But no vehicle was shown and I got an error msg in my terminal: [ERROR] [1535613845.882991023]: Client [/localization] wants topic /apollo/sensor/gnss/corrected_imu to have datatype/md5sum [pb_msgs/CorrectedImu/81aef4a818ce273a8af85a440ccdb0f7], but our version has [pb_msgs/Imu/bdef0ba51869607ed95736d41e80c1f5]. Dropping connection. Has anyone got any idea what is the issue? Thanks a lot. Best, Kai
non_process
issue running offline demo hi there i am new to apollo i was following the steps on running the offline demo at everything looks fine until the final step i was able to play the demo bag loop in my terminal and was able to start dreamview in my mozilla but no vehicle was shown and i got an error msg in my terminal client wants topic apollo sensor gnss corrected imu to have datatype but our version has dropping connection has anyone got any idea what is the issue thanks a lot best kai
0
44,329
18,018,303,158
IssuesEvent
2021-09-16 16:08:03
hashicorp/terraform-provider-aws
https://api.github.com/repos/hashicorp/terraform-provider-aws
closed
aws_db_subnet_group.default: Error creating DB Subnet Group: DBSubnetGroupDoesNotCoverEnoughAZs: DB Subnet Group doesn't meet availability zone coverage requirement
service/rds
I have a problem terraform * aws_db_subnet_group.default: 1 error occurred: * aws_db_subnet_group.default: Error creating DB Subnet Group: DBSubnetGroupDoesNotCoverEnoughAZs: DB Subnet Group doesn't meet availability zone coverage requirement. Please add subnets to cover at least 2 availability zones. Current coverage: 1 status code: 400, request id: 23eec270-9ffb-4695-9486-b7482c64396e Version terraform v0.11.14
1.0
aws_db_subnet_group.default: Error creating DB Subnet Group: DBSubnetGroupDoesNotCoverEnoughAZs: DB Subnet Group doesn't meet availability zone coverage requirement - I have a problem terraform * aws_db_subnet_group.default: 1 error occurred: * aws_db_subnet_group.default: Error creating DB Subnet Group: DBSubnetGroupDoesNotCoverEnoughAZs: DB Subnet Group doesn't meet availability zone coverage requirement. Please add subnets to cover at least 2 availability zones. Current coverage: 1 status code: 400, request id: 23eec270-9ffb-4695-9486-b7482c64396e Version terraform v0.11.14
non_process
aws db subnet group default error creating db subnet group dbsubnetgroupdoesnotcoverenoughazs db subnet group doesn t meet availability zone coverage requirement i have a problem terraform aws db subnet group default error occurred aws db subnet group default error creating db subnet group dbsubnetgroupdoesnotcoverenoughazs db subnet group doesn t meet availability zone coverage requirement please add subnets to cover at least availability zones current coverage status code request id version terraform
0
79,829
23,049,027,043
IssuesEvent
2022-07-24 10:51:12
Traben-0/Entity_Texture_Features
https://api.github.com/repos/Traben-0/Entity_Texture_Features
closed
Ender chest texture broken (Fabric 1.19 latest version)
bug workaround-known fixed in dev build
Simple, ender chest is broken, if need more info because cannot replicate let me know
1.0
Ender chest texture broken (Fabric 1.19 latest version) - Simple, ender chest is broken, if need more info because cannot replicate let me know
non_process
ender chest texture broken fabric latest version simple ender chest is broken if need more info because cannot replicate let me know
0
664,550
22,274,396,246
IssuesEvent
2022-06-10 15:11:28
Ijwu/Archipelago.HollowKnight
https://api.github.com/repos/Ijwu/Archipelago.HollowKnight
closed
Change font, if possible, on sign in screen to increase accessibility
enhancement good first issue low priority cosmetic
The current font is awful in terms of distinguishing capital letters from lowercase.
1.0
Change font, if possible, on sign in screen to increase accessibility - The current font is awful in terms of distinguishing capital letters from lowercase.
non_process
change font if possible on sign in screen to increase accessibility the current font is awful in terms of distinguishing capital letters from lowercase
0
177,176
6,575,169,939
IssuesEvent
2017-09-11 15:14:46
foundersandcoders/open-tourism-platform
https://api.github.com/repos/foundersandcoders/open-tourism-platform
closed
Authenticating the access token
2 priority discussion
How are we securing the routes? Should they be accessible only by client's logged in with oauth, or also by people directly logged into the platform. This question affects what we do next. The module has an authenticate function that returns the token. However this will presumably fail if a request is made that has a JWT in a cookie and no access token?
1.0
Authenticating the access token - How are we securing the routes? Should they be accessible only by client's logged in with oauth, or also by people directly logged into the platform. This question affects what we do next. The module has an authenticate function that returns the token. However this will presumably fail if a request is made that has a JWT in a cookie and no access token?
non_process
authenticating the access token how are we securing the routes should they be accessible only by client s logged in with oauth or also by people directly logged into the platform this question affects what we do next the module has an authenticate function that returns the token however this will presumably fail if a request is made that has a jwt in a cookie and no access token
0
182,424
14,121,824,498
IssuesEvent
2020-11-09 03:12:46
biopython/biopython
https://api.github.com/repos/biopython/biopython
opened
Changes to TravisCI / Move CI to Github Actions
Style Testing
TravisCI sent out an email this week announcing changes to their plans ([link](https://blog.travis-ci.com/2020-11-02-travis-ci-new-billing)). The post is quite dense and it's hard to understand if this affects us but there seem to be new limitations to open-source projects. Also, we use the travis-ci.org portal, which will be closed down at the end of this year ([link](https://docs.travis-ci.com/user/migrate/open-source-repository-migration#frequently-asked-questions)) so we might take this opportunity to think about updating our CI infrastructure a little. Currently, we use TravisCI and Appveyor to test on Windows and Linux, and it seems we don't test on MacOS. Moving to Github Actions would let us test on all three major OSes and a bunch of Python versions including PyPy. It would also allow us to run up to 256 jobs in parallel, and in my experience, it seems those jobs run faster than on Travis/Appveyor. To kickstart this discussion and test drive a possible transition to Github Actions, I started a branch (purposedly out of date with the current master by a few commits) implementing the style checking, packaging, and simple testing steps. You can see the branch [here](https://github.com/JoaoRodrigues/biopython/tree/gh_actions_tests) and the file I have been working on [here](https://github.com/JoaoRodrigues/biopython/blob/gh_actions_tests/.github/workflows/basic.yml). Thoughts? Suggestions?
1.0
Changes to TravisCI / Move CI to Github Actions - TravisCI sent out an email this week announcing changes to their plans ([link](https://blog.travis-ci.com/2020-11-02-travis-ci-new-billing)). The post is quite dense and it's hard to understand if this affects us but there seem to be new limitations to open-source projects. Also, we use the travis-ci.org portal, which will be closed down at the end of this year ([link](https://docs.travis-ci.com/user/migrate/open-source-repository-migration#frequently-asked-questions)) so we might take this opportunity to think about updating our CI infrastructure a little. Currently, we use TravisCI and Appveyor to test on Windows and Linux, and it seems we don't test on MacOS. Moving to Github Actions would let us test on all three major OSes and a bunch of Python versions including PyPy. It would also allow us to run up to 256 jobs in parallel, and in my experience, it seems those jobs run faster than on Travis/Appveyor. To kickstart this discussion and test drive a possible transition to Github Actions, I started a branch (purposedly out of date with the current master by a few commits) implementing the style checking, packaging, and simple testing steps. You can see the branch [here](https://github.com/JoaoRodrigues/biopython/tree/gh_actions_tests) and the file I have been working on [here](https://github.com/JoaoRodrigues/biopython/blob/gh_actions_tests/.github/workflows/basic.yml). Thoughts? Suggestions?
non_process
changes to travisci move ci to github actions travisci sent out an email this week announcing changes to their plans the post is quite dense and it s hard to understand if this affects us but there seem to be new limitations to open source projects also we use the travis ci org portal which will be closed down at the end of this year so we might take this opportunity to think about updating our ci infrastructure a little currently we use travisci and appveyor to test on windows and linux and it seems we don t test on macos moving to github actions would let us test on all three major oses and a bunch of python versions including pypy it would also allow us to run up to jobs in parallel and in my experience it seems those jobs run faster than on travis appveyor to kickstart this discussion and test drive a possible transition to github actions i started a branch purposedly out of date with the current master by a few commits implementing the style checking packaging and simple testing steps you can see the branch and the file i have been working on thoughts suggestions
0
11,276
14,076,846,769
IssuesEvent
2020-11-04 11:05:12
prisma/prisma-engines
https://api.github.com/repos/prisma/prisma-engines
opened
Add support for AWS Graviton
engines/other kind/feature process/candidate team/engines
More and more people are starting to explore the ARM offering of AWS. Hence we would like to support it.
1.0
Add support for AWS Graviton - More and more people are starting to explore the ARM offering of AWS. Hence we would like to support it.
process
add support for aws graviton more and more people are starting to explore the arm offering of aws hence we would like to support it
1
1,412
3,972,295,901
IssuesEvent
2016-05-04 14:55:24
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Topicset @type attribute warning in move-meta-entries
dita standard preprocess
Version: DITA-OT 2.2.4 Transformation: HTML5 customization Platform: OS X Since upgrading my plugin to 2.2.4, I see warnings in my build log for topicsets: > file:/Users/staylor/Dev/dita_source_files/m_a_common_topicsets.ditamap:12:69: [DOTX019W][WARN]: The type attribute on a topicref was set to 'topicset', but the topicref references a 'concept' topic. This may cause your links to sort incorrectly in the output. Note that the type attribute cascades in maps, so the value 'topicset' may come from an ancestor topicref. My map doesn't specify `@type="topicset"` anywhere. I'm reasonably sure this is an error in xsl/preprocess/mappullImpl.xsl: ``` <xsl:template match="*" mode="mappull:verify-type-value"> <xsl:param name="type" as="xs:string"/> <!-- Specified type on the topicref --> <xsl:param name="actual-class" as="xs:string"/> <!-- Class value on the target element --> <xsl:param name="actual-name" as="xs:string"/> <!-- Name of the target element --> <xsl:param name="WORKDIR" as="xs:string"> <xsl:apply-templates select="/processing-instruction('workdir-uri')[1]" mode="get-work-dir"/> </xsl:param> <xsl:choose> <!-- The type is correct; concept typed as concept, newtype defined as newtype --> <xsl:when test="$type=$actual-name"/> <!-- If the actual class contains the specified type; reference can be called topic, specializedReference can be called reference --> <xsl:when test="contains($actual-class,concat(' ',$type,'/',$type,' '))"> <!-- commented out for bug:1771123 start --> <!--xsl:apply-templates select="." mode="ditamsg:type-mismatch-info"> <xsl:with-param name="type" select="$type"/> <xsl:with-param name="actual-name" select="$actual-name"/> </xsl:apply-templates--> <!-- commented out for bug:1771123 end --> </xsl:when> <!-- Otherwise: incorrect type is specified --> <xsl:otherwise> <xsl:apply-templates select="." 
mode="ditamsg:type-mismatch-warning"> <xsl:with-param name="type" select="$type"/> <xsl:with-param name="actual-name" select="$actual-name"/> </xsl:apply-templates> </xsl:otherwise> </xsl:choose> </xsl:template> ``` It looks to me like when you use a `<topicsetref>`, the processing sees the target element name as topicset and that results in a mismatch. Wondering if the fix is simply to ignore this check when the target is a topicset.
1.0
Topicset @type attribute warning in move-meta-entries - Version: DITA-OT 2.2.4 Transformation: HTML5 customization Platform: OS X Since upgrading my plugin to 2.2.4, I see warnings in my build log for topicsets: > file:/Users/staylor/Dev/dita_source_files/m_a_common_topicsets.ditamap:12:69: [DOTX019W][WARN]: The type attribute on a topicref was set to 'topicset', but the topicref references a 'concept' topic. This may cause your links to sort incorrectly in the output. Note that the type attribute cascades in maps, so the value 'topicset' may come from an ancestor topicref. My map doesn't specify `@type="topicset"` anywhere. I'm reasonably sure this is an error in xsl/preprocess/mappullImpl.xsl: ``` <xsl:template match="*" mode="mappull:verify-type-value"> <xsl:param name="type" as="xs:string"/> <!-- Specified type on the topicref --> <xsl:param name="actual-class" as="xs:string"/> <!-- Class value on the target element --> <xsl:param name="actual-name" as="xs:string"/> <!-- Name of the target element --> <xsl:param name="WORKDIR" as="xs:string"> <xsl:apply-templates select="/processing-instruction('workdir-uri')[1]" mode="get-work-dir"/> </xsl:param> <xsl:choose> <!-- The type is correct; concept typed as concept, newtype defined as newtype --> <xsl:when test="$type=$actual-name"/> <!-- If the actual class contains the specified type; reference can be called topic, specializedReference can be called reference --> <xsl:when test="contains($actual-class,concat(' ',$type,'/',$type,' '))"> <!-- commented out for bug:1771123 start --> <!--xsl:apply-templates select="." mode="ditamsg:type-mismatch-info"> <xsl:with-param name="type" select="$type"/> <xsl:with-param name="actual-name" select="$actual-name"/> </xsl:apply-templates--> <!-- commented out for bug:1771123 end --> </xsl:when> <!-- Otherwise: incorrect type is specified --> <xsl:otherwise> <xsl:apply-templates select="." 
mode="ditamsg:type-mismatch-warning"> <xsl:with-param name="type" select="$type"/> <xsl:with-param name="actual-name" select="$actual-name"/> </xsl:apply-templates> </xsl:otherwise> </xsl:choose> </xsl:template> ``` It looks to me like when you use a `<topicsetref>`, the processing sees the target element name as topicset and that results in a mismatch. Wondering if the fix is simply to ignore this check when the target is a topicset.
process
topicset type attribute warning in move meta entries version dita ot transformation customization platform os x since upgrading my plugin to i see warnings in my build log for topicsets file users staylor dev dita source files m a common topicsets ditamap the type attribute on a topicref was set to topicset but the topicref references a concept topic this may cause your links to sort incorrectly in the output note that the type attribute cascades in maps so the value topicset may come from an ancestor topicref my map doesn t specify type topicset anywhere i m reasonably sure this is an error in xsl preprocess mappullimpl xsl if the actual class contains the specified type reference can be called topic specializedreference can be called reference it looks to me like when you use a the processing sees the target element name as topicset and that results in a mismatch wondering if the fix is simply to ignore this check when the target is a topicset
1
438,733
12,643,931,730
IssuesEvent
2020-06-16 10:40:11
eclipse/dirigible
https://api.github.com/repos/eclipse/dirigible
closed
[API] SOAP
API efforts-low enhancement priority-medium
Transfer the SOAP API from 2.x to 4.x (https://github.com/eclipse/dirigible/wiki/api-v4-guidelines). **Module:**``net/v3/soap`` Module Identifier: **net/v4/soap** Module Source: https://github.com/dirigiblelabs/api-net/blob/master/net/v4/soap.js Module Mirror: https://github.com/eclipse/dirigible/blob/master/api/api-javascript/api-net/src/main/resources/net/v4/soap.js Facade Source: none Documentation: http://www.dirigible.io/api/net_soap.html Samples: https://www.dirigible.io/samples/simple_soap_client.html, https://www.dirigible.io/samples/simple_soap_server.html Tests: none
1.0
[API] SOAP - Transfer the SOAP API from 2.x to 4.x (https://github.com/eclipse/dirigible/wiki/api-v4-guidelines). **Module:**``net/v3/soap`` Module Identifier: **net/v4/soap** Module Source: https://github.com/dirigiblelabs/api-net/blob/master/net/v4/soap.js Module Mirror: https://github.com/eclipse/dirigible/blob/master/api/api-javascript/api-net/src/main/resources/net/v4/soap.js Facade Source: none Documentation: http://www.dirigible.io/api/net_soap.html Samples: https://www.dirigible.io/samples/simple_soap_client.html, https://www.dirigible.io/samples/simple_soap_server.html Tests: none
non_process
soap transfer the soap api from x to x module net soap module identifier net soap module source module mirror facade source none documentation samples tests none
0
68,798
17,405,349,192
IssuesEvent
2021-08-03 04:36:27
4awpawz/trio
https://api.github.com/repos/4awpawz/trio
opened
lib/generator/cleanPublic.js is no longer called because lib/tasks/file-watcher.js calls lib/utils/triggerOneOffBuild whenever a file is deleted.
enhancement generator related incremental build related
triggerOneOffBuild triggers Trio's build cycle as if trio b was run which rebuilds the public folder so calling cleanPublic is no longer required. Remove are archive the code.
1.0
lib/generator/cleanPublic.js is no longer called because lib/tasks/file-watcher.js calls lib/utils/triggerOneOffBuild whenever a file is deleted. - triggerOneOffBuild triggers Trio's build cycle as if trio b was run which rebuilds the public folder so calling cleanPublic is no longer required. Remove are archive the code.
non_process
lib generator cleanpublic js is no longer called because lib tasks file watcher js calls lib utils triggeroneoffbuild whenever a file is deleted triggeroneoffbuild triggers trio s build cycle as if trio b was run which rebuilds the public folder so calling cleanpublic is no longer required remove are archive the code
0
6,914
10,062,817,412
IssuesEvent
2019-07-23 02:53:11
dotnet/docfx
https://api.github.com/repos/dotnet/docfx
closed
Resolve xref for SDP
Area-SchemaProcessor v3
Only set Href while resolving xref for SDP for now, it needs to set additional xref properties as well
1.0
Resolve xref for SDP - Only set Href while resolving xref for SDP for now, it needs to set additional xref properties as well
process
resolve xref for sdp only set href while resolving xref for sdp for now it needs to set additional xref properties as well
1
59,660
14,439,365,377
IssuesEvent
2020-12-07 14:18:03
NixOS/nixpkgs
https://api.github.com/repos/NixOS/nixpkgs
opened
Vulnerability roundup 97: monero-0.17.1.5: 1 advisory [5.5]
1.severity: security
[search](https://search.nix.gsc.io/?q=monero&i=fosho&repos=NixOS-nixpkgs), [files](https://github.com/NixOS/nixpkgs/search?utf8=%E2%9C%93&q=monero+in%3Apath&type=Code) * [ ] [CVE-2020-6861](https://nvd.nist.gov/vuln/detail/CVE-2020-6861) CVSSv3=5.5 (nixos-unstable) Scanned versions: nixos-unstable: 83cbad92d73. Cc @ehmry Cc @rnhmjoj
True
Vulnerability roundup 97: monero-0.17.1.5: 1 advisory [5.5] - [search](https://search.nix.gsc.io/?q=monero&i=fosho&repos=NixOS-nixpkgs), [files](https://github.com/NixOS/nixpkgs/search?utf8=%E2%9C%93&q=monero+in%3Apath&type=Code) * [ ] [CVE-2020-6861](https://nvd.nist.gov/vuln/detail/CVE-2020-6861) CVSSv3=5.5 (nixos-unstable) Scanned versions: nixos-unstable: 83cbad92d73. Cc @ehmry Cc @rnhmjoj
non_process
vulnerability roundup monero advisory nixos unstable scanned versions nixos unstable cc ehmry cc rnhmjoj
0