Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
235,624
| 7,741,164,499
|
IssuesEvent
|
2018-05-29 03:45:48
|
SmartlyDressedGames/Unturned-4.x-Community
|
https://api.github.com/repos/SmartlyDressedGames/Unturned-4.x-Community
|
closed
|
Damage
|
Priority: High Status: To-Do Type: Feature
|
- [x] Hitmarker handlers
- [x] Hit direction indicator
- [x] Hit additive animations based on direction
- [x] Fix item collision
- [ ] First person flinch
- [ ] Pain flash
|
1.0
|
Damage - - [x] Hitmarker handlers
- [x] Hit direction indicator
- [x] Hit additive animations based on direction
- [x] Fix item collision
- [ ] First person flinch
- [ ] Pain flash
|
non_process
|
damage hitmarker handlers hit direction indicator hit additive animations based on direction fix item collision first person flinch pain flash
| 0
|
10,245
| 13,101,216,594
|
IssuesEvent
|
2020-08-04 03:02:05
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
Let Coprocessor completely work in the batch mode
|
component/coprocessor status/proposal
|
## Feature Request
### Is your feature request related to a problem?
Currently, the batch executors of coprocessor don't support in a server-side streaming request. If such a request was received, we would fall back to using the old execution framework. The old execution share exactly the same logic with the batch one which complicates our codebase and inhibits our coprocessor maximize its performance.
https://github.com/tikv/tikv/blob/66d2566c166e74bedca8b3172f6a2a95bf01efd4/src/coprocessor/dag/mod.rs#L65
### Describe the feature you'd like:
The remove of the old execution framework consists of 3 steps:
1. Remove the fallback check in the `DagHandlerBuilder`.
2. Overwrite the default implementation in `RequestHandler::handle_streaming_request` for `BatchDAGHandler`.
3. Pass the integration tests.
### Teachability, Documentation, Adoption, Migration Strategy:
Here is a WIP pull request to refer: https://github.com/tikv/tikv/pull/5945
|
1.0
|
Let Coprocessor completely work in the batch mode - ## Feature Request
### Is your feature request related to a problem?
Currently, the batch executors of coprocessor don't support in a server-side streaming request. If such a request was received, we would fall back to using the old execution framework. The old execution share exactly the same logic with the batch one which complicates our codebase and inhibits our coprocessor maximize its performance.
https://github.com/tikv/tikv/blob/66d2566c166e74bedca8b3172f6a2a95bf01efd4/src/coprocessor/dag/mod.rs#L65
### Describe the feature you'd like:
The remove of the old execution framework consists of 3 steps:
1. Remove the fallback check in the `DagHandlerBuilder`.
2. Overwrite the default implementation in `RequestHandler::handle_streaming_request` for `BatchDAGHandler`.
3. Pass the integration tests.
### Teachability, Documentation, Adoption, Migration Strategy:
Here is a WIP pull request to refer: https://github.com/tikv/tikv/pull/5945
|
process
|
let coprocessor completely work in the batch mode feature request is your feature request related to a problem currently the batch executors of coprocessor don t support in a server side streaming request if such a request was received we would fall back to using the old execution framework the old execution share exactly the same logic with the batch one which complicates our codebase and inhibits our coprocessor maximize its performance describe the feature you d like the remove of the old execution framework consists of steps remove the fallback check in the daghandlerbuilder overwrite the default implementation in requesthandler handle streaming request for batchdaghandler pass the integration tests teachability documentation adoption migration strategy here is a wip pull request to refer
| 1
|
19,017
| 25,015,977,877
|
IssuesEvent
|
2022-11-03 18:50:59
|
python/cpython
|
https://api.github.com/repos/python/cpython
|
closed
|
test_concurrent_futures leaks many dangling threads on FreeBSD
|
tests 3.10 OS-freebsd expert-multiprocessing
|
BPO | [43845](https://bugs.python.org/issue43845)
--- | :---
Nosy | @vstinner, @shihai1991
<sup>*Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.*</sup>
<details><summary>Show more details</summary><p>
GitHub fields:
```python
assignee = None
closed_at = None
created_at = <Date 2021-04-14.11:57:40.679>
labels = ['tests', '3.10']
title = 'test_concurrent_futures leaks many dangling threads on FreeBSD'
updated_at = <Date 2021-04-16.13:44:56.214>
user = 'https://github.com/vstinner'
```
bugs.python.org fields:
```python
activity = <Date 2021-04-16.13:44:56.214>
actor = 'shihai1991'
assignee = 'none'
closed = False
closed_date = None
closer = None
components = ['Tests']
creation = <Date 2021-04-14.11:57:40.679>
creator = 'vstinner'
dependencies = []
files = []
hgrepos = []
issue_num = 43845
keywords = []
message_count = 2.0
messages = ['391070', '391071']
nosy_count = 2.0
nosy_names = ['vstinner', 'shihai1991']
pr_nums = []
priority = 'normal'
resolution = None
stage = None
status = 'open'
superseder = None
type = None
url = 'https://bugs.python.org/issue43845'
versions = ['Python 3.10']
```
</p></details>
|
1.0
|
test_concurrent_futures leaks many dangling threads on FreeBSD - BPO | [43845](https://bugs.python.org/issue43845)
--- | :---
Nosy | @vstinner, @shihai1991
<sup>*Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.*</sup>
<details><summary>Show more details</summary><p>
GitHub fields:
```python
assignee = None
closed_at = None
created_at = <Date 2021-04-14.11:57:40.679>
labels = ['tests', '3.10']
title = 'test_concurrent_futures leaks many dangling threads on FreeBSD'
updated_at = <Date 2021-04-16.13:44:56.214>
user = 'https://github.com/vstinner'
```
bugs.python.org fields:
```python
activity = <Date 2021-04-16.13:44:56.214>
actor = 'shihai1991'
assignee = 'none'
closed = False
closed_date = None
closer = None
components = ['Tests']
creation = <Date 2021-04-14.11:57:40.679>
creator = 'vstinner'
dependencies = []
files = []
hgrepos = []
issue_num = 43845
keywords = []
message_count = 2.0
messages = ['391070', '391071']
nosy_count = 2.0
nosy_names = ['vstinner', 'shihai1991']
pr_nums = []
priority = 'normal'
resolution = None
stage = None
status = 'open'
superseder = None
type = None
url = 'https://bugs.python.org/issue43845'
versions = ['Python 3.10']
```
</p></details>
|
process
|
test concurrent futures leaks many dangling threads on freebsd bpo nosy vstinner note these values reflect the state of the issue at the time it was migrated and might not reflect the current state show more details github fields python assignee none closed at none created at labels title test concurrent futures leaks many dangling threads on freebsd updated at user bugs python org fields python activity actor assignee none closed false closed date none closer none components creation creator vstinner dependencies files hgrepos issue num keywords message count messages nosy count nosy names pr nums priority normal resolution none stage none status open superseder none type none url versions
| 1
|
21,155
| 28,131,486,007
|
IssuesEvent
|
2023-04-01 00:03:07
|
metallb/metallb
|
https://api.github.com/repos/metallb/metallb
|
closed
|
Move the developer forum to CNCF infrastructure
|
process lifecycle-stale
|
## General
As a part of moving MetalLB under the CNCF, we can get access to infrastructure for facilitating community meetings. This includes access to Zoom.
Moving to Zoom makes sense for the following reasons:
- Using CNCF infrastructure for community meetings makes more sense than using infrastructure belonging to a specific company.
- Meet doesn't seem to allow starting a meeting before an organizer is present. This is counterproductive for us.
- Zoom usually offers better call quality than Meet.
Besides moving to Zoom, we should reconsider the way we publish meeting times and meeting notes. Lastly, we probably want to start recording community meetings.
## TODO
- [ ] Check with CNCF what is required for getting Zoom access.
- [ ] Check if there is a well-defined way for CNCF projects to publish meeting calendars. If so - adopt it. If not, find an alternative to the current Kinvolk-owned Google Calendar.
- [ ] Schedule a recurring Zoom meeting for the fortnightly MetalLB developer meetings and publish it in the appropriate place (see above).
- [ ] Create a (hopefully automated) process for recording meetings and sharing the recordings publicly.
|
1.0
|
Move the developer forum to CNCF infrastructure - ## General
As a part of moving MetalLB under the CNCF, we can get access to infrastructure for facilitating community meetings. This includes access to Zoom.
Moving to Zoom makes sense for the following reasons:
- Using CNCF infrastructure for community meetings makes more sense than using infrastructure belonging to a specific company.
- Meet doesn't seem to allow starting a meeting before an organizer is present. This is counterproductive for us.
- Zoom usually offers better call quality than Meet.
Besides moving to Zoom, we should reconsider the way we publish meeting times and meeting notes. Lastly, we probably want to start recording community meetings.
## TODO
- [ ] Check with CNCF what is required for getting Zoom access.
- [ ] Check if there is a well-defined way for CNCF projects to publish meeting calendars. If so - adopt it. If not, find an alternative to the current Kinvolk-owned Google Calendar.
- [ ] Schedule a recurring Zoom meeting for the fortnightly MetalLB developer meetings and publish it in the appropriate place (see above).
- [ ] Create a (hopefully automated) process for recording meetings and sharing the recordings publicly.
|
process
|
move the developer forum to cncf infrastructure general as a part of moving metallb under the cncf we can get access to infrastructure for facilitating community meetings this includes access to zoom moving to zoom makes sense for the following reasons using cncf infrastructure for community meetings makes more sense than using infrastructure belonging to a specific company meet doesn t seem to allow starting a meeting before an organizer is present this is counterproductive for us zoom usually offers better call quality than meet besides moving to zoom we should reconsider the way we publish meeting times and meeting notes lastly we probably want to start recording community meetings todo check with cncf what is required for getting zoom access check if there is a well defined way for cncf projects to publish meeting calendars if so adopt it if not find an alternative to the current kinvolk owned google calendar schedule a recurring zoom meeting for the fortnightly metallb developer meetings and publish it in the appropriate place see above create a hopefully automated process for recording meetings and sharing the recordings publicly
| 1
|
71,732
| 23,778,392,459
|
IssuesEvent
|
2022-09-02 00:04:28
|
CorfuDB/CorfuDB
|
https://api.github.com/repos/CorfuDB/CorfuDB
|
closed
|
Unhandled Trimmed Exception
|
defect
|
## Overview
From corfu users' perspective, they may not be aware of that a simple API call (e.g. get, put, remove, etc) would throw Trimmed Exceptions. The wrapper class (e.g. SMRMap$CORFUSMR.java) inherits existing interfaces (e.g. Map) and therefore it is hard to change the signatures of APIs to throw checked exceptions. Nevertheless, it is possible to add some sample codes to illustrate the issue.
|
1.0
|
Unhandled Trimmed Exception - ## Overview
From corfu users' perspective, they may not be aware of that a simple API call (e.g. get, put, remove, etc) would throw Trimmed Exceptions. The wrapper class (e.g. SMRMap$CORFUSMR.java) inherits existing interfaces (e.g. Map) and therefore it is hard to change the signatures of APIs to throw checked exceptions. Nevertheless, it is possible to add some sample codes to illustrate the issue.
|
non_process
|
unhandled trimmed exception overview from corfu users perspective they may not be aware of that a simple api call e g get put remove etc would throw trimmed exceptions the wrapper class e g smrmap corfusmr java inherits existing interfaces e g map and therefore it is hard to change the signatures of apis to throw checked exceptions nevertheless it is possible to add some sample codes to illustrate the issue
| 0
|
21,130
| 28,102,294,582
|
IssuesEvent
|
2023-03-30 20:32:12
|
StormSurgeLive/asgs
|
https://api.github.com/repos/StormSurgeLive/asgs
|
opened
|
`gnuplot` fails to build on `desktop` architecture
|
bug verify suggested workaround postprocessing installation
|
For some odd reason, `gnuplot` is failing to build on my `desktop` architecture. In the `config.log`, the error message is
```
configure:8042: error: texdir is not given and there is no kpsexpand, please tell where to install
```
Not sure why this is happening, but I don't think that any of our built-in postprocessing scripts use `gnuplot` at the moment, so the suggested workaround is to disable building `gnuplot`.
|
1.0
|
`gnuplot` fails to build on `desktop` architecture - For some odd reason, `gnuplot` is failing to build on my `desktop` architecture. In the `config.log`, the error message is
```
configure:8042: error: texdir is not given and there is no kpsexpand, please tell where to install
```
Not sure why this is happening, but I don't think that any of our built-in postprocessing scripts use `gnuplot` at the moment, so the suggested workaround is to disable building `gnuplot`.
|
process
|
gnuplot fails to build on desktop architecture for some odd reason gnuplot is failing to build on my desktop architecture in the config log the error message is configure error texdir is not given and there is no kpsexpand please tell where to install not sure why this is happening but i don t think that any of our built in postprocessing scripts use gnuplot at the moment so the suggested workaround is to disable building gnuplot
| 1
|
7,499
| 10,584,518,328
|
IssuesEvent
|
2019-10-08 15:36:15
|
code4romania/expert-consultation-api
|
https://api.github.com/repos/code4romania/expert-consultation-api
|
closed
|
[Document processing] Implement loading of pdf files
|
document processing documents enhancement help wanted java spring
|
As part of the loading of files in the platform, a service to load and read pdf file contents needs to be developer.
The service should be using a Java PDF library like https://pdfbox.apache.org/
The service should be able to receive the path to a PDF file and return the contents of the file.
|
1.0
|
[Document processing] Implement loading of pdf files - As part of the loading of files in the platform, a service to load and read pdf file contents needs to be developer.
The service should be using a Java PDF library like https://pdfbox.apache.org/
The service should be able to receive the path to a PDF file and return the contents of the file.
|
process
|
implement loading of pdf files as part of the loading of files in the platform a service to load and read pdf file contents needs to be developer the service should be using a java pdf library like the service should be able to receive the path to a pdf file and return the contents of the file
| 1
|
156,445
| 5,969,460,331
|
IssuesEvent
|
2017-05-30 20:21:40
|
craftercms/craftercms
|
https://api.github.com/repos/craftercms/craftercms
|
reopened
|
[studio] Copy of folder containing components fails due to exceptions:
|
bug Priority: High
|
# steps to reproduce
1. in components create a folder structure
```
test
+- a
+-component.xml
+-b
+-c
```
2. using right click on sidebar, copy "a" folder. in dialog, observe both folder and component are selected.
3. paste to c folder, observe issue
# logs
```
[ERROR] 2017-05-23 17:22:45,779 [http-nio-18080-exec-6] [util.ContentUtils] | Error while coverting stream to XML
org.dom4j.DocumentException: Error on line 1 of document : Content is not allowed in prolog. Nested exception: Content is not allowed in prolog.
at org.dom4j.io.SAXReader.read(SAXReader.java:482)
at org.dom4j.io.SAXReader.read(SAXReader.java:365)
at org.craftercms.studio.impl.v1.util.ContentUtils.convertStreamToXml(ContentUtils.java:181)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:513)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:483)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.pasteItems(ClipboardServiceImpl.java:119)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.paste(ClipboardServiceImpl.java:89)
at org.craftercms.studio.api.v1.service.clipboard.ClipboardService$paste$1.call(Unknown Source)
at scripts.api.impl.clipboard.SpringClipboardServices.paste(SpringClipboardServices.groovy:43)
at scripts.api.impl.clipboard.SpringClipboardServices$paste$1.call(Unknown Source)
at scripts.api.ClipboardServices.paste(ClipboardServices.groovy:31)
at scripts.api.ClipboardServices$paste$3.call(Unknown Source)
at paste-item_get.run(paste-item.get.groovy:11)
at groovy.util.GroovyScriptEngine.run(GroovyScriptEngine.java:605)
at org.craftercms.engine.scripting.impl.GroovyScript.execute(GroovyScript.java:55)
at org.craftercms.engine.controller.rest.RestScriptsController.executeScript(RestScriptsController.java:163)
at org.craftercms.engine.controller.rest.RestScriptsController.handleRequestInternal(RestScriptsController.java:97)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:174)
at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:50)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.craftercms.studio.impl.v1.web.security.access.StudioAuthenticationTokenProcessingFilter.doFilter(StudioAuthenticationTokenProcessingFilter.java:75)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.SiteContextResolvingFilter.doFilter(SiteContextResolvingFilter.java:46)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.ExceptionHandlingFilter.doFilter(ExceptionHandlingFilter.java:56)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.commons.http.RequestContextBindingFilter.doFilter(RequestContextBindingFilter.java:79)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.studio.impl.v1.web.filter.MultiReadHttpServletRequestWrapperFilter.doFilter(MultiReadHttpServletRequestWrapperFilter.java:32)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:784)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:802)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1410)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Nested exception:
org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLScanner.reportFatalError(Unknown Source)
at org.apache.xerces.impl.XMLDocumentScannerImpl$PrologDispatcher.dispatch(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
at org.dom4j.io.SAXReader.read(SAXReader.java:465)
at org.dom4j.io.SAXReader.read(SAXReader.java:365)
at org.craftercms.studio.impl.v1.util.ContentUtils.convertStreamToXml(ContentUtils.java:181)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:513)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:483)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.pasteItems(ClipboardServiceImpl.java:119)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.paste(ClipboardServiceImpl.java:89)
at org.craftercms.studio.api.v1.service.clipboard.ClipboardService$paste$1.call(Unknown Source)
at scripts.api.impl.clipboard.SpringClipboardServices.paste(SpringClipboardServices.groovy:43)
at scripts.api.impl.clipboard.SpringClipboardServices$paste$1.call(Unknown Source)
at scripts.api.ClipboardServices.paste(ClipboardServices.groovy:31)
at scripts.api.ClipboardServices$paste$3.call(Unknown Source)
at paste-item_get.run(paste-item.get.groovy:11)
at groovy.util.GroovyScriptEngine.run(GroovyScriptEngine.java:605)
at org.craftercms.engine.scripting.impl.GroovyScript.execute(GroovyScript.java:55)
at org.craftercms.engine.controller.rest.RestScriptsController.executeScript(RestScriptsController.java:163)
at org.craftercms.engine.controller.rest.RestScriptsController.handleRequestInternal(RestScriptsController.java:97)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:174)
at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:50)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.craftercms.studio.impl.v1.web.security.access.StudioAuthenticationTokenProcessingFilter.doFilter(StudioAuthenticationTokenProcessingFilter.java:75)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.SiteContextResolvingFilter.doFilter(SiteContextResolvingFilter.java:46)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.ExceptionHandlingFilter.doFilter(ExceptionHandlingFilter.java:56)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.commons.http.RequestContextBindingFilter.doFilter(RequestContextBindingFilter.java:79)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.studio.impl.v1.web.filter.MultiReadHttpServletRequestWrapperFilter.doFilter(MultiReadHttpServletRequestWrapperFilter.java:32)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:784)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:802)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1410)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
[ERROR] 2017-05-23 17:22:45,780 [http-nio-18080-exec-6] [clipboard.ClipboardServiceImpl] | Paste operation failed for item {0} to dest path `{1}, isCut: {2}
java.lang.NullPointerException
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.getContentIds(ContentServiceImpl.java:954)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:514)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:483)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.pasteItems(ClipboardServiceImpl.java:119)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.paste(ClipboardServiceImpl.java:89)
at org.craftercms.studio.api.v1.service.clipboard.ClipboardService$paste$1.call(Unknown Source)
at scripts.api.impl.clipboard.SpringClipboardServices.paste(SpringClipboardServices.groovy:43)
at scripts.api.impl.clipboard.SpringClipboardServices$paste$1.call(Unknown Source)
at scripts.api.ClipboardServices.paste(ClipboardServices.groovy:31)
at scripts.api.ClipboardServices$paste$3.call(Unknown Source)
at paste-item_get.run(paste-item.get.groovy:11)
at groovy.util.GroovyScriptEngine.run(GroovyScriptEngine.java:605)
at org.craftercms.engine.scripting.impl.GroovyScript.execute(GroovyScript.java:55)
at org.craftercms.engine.controller.rest.RestScriptsController.executeScript(RestScriptsController.java:163)
at org.craftercms.engine.controller.rest.RestScriptsController.handleRequestInternal(RestScriptsController.java:97)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:174)
at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:50)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.craftercms.studio.impl.v1.web.security.access.StudioAuthenticationTokenProcessingFilter.doFilter(StudioAuthenticationTokenProcessingFilter.java:75)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.SiteContextResolvingFilter.doFilter(SiteContextResolvingFilter.java:46)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.ExceptionHandlingFilter.doFilter(ExceptionHandlingFilter.java:56)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.commons.http.RequestContextBindingFilter.doFilter(RequestContextBindingFilter.java:79)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.studio.impl.v1.web.filter.MultiReadHttpServletRequestWrapperFilter.doFilter(MultiReadHttpServletRequestWrapperFilter.java:32)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:784)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:802)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1410)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)```
|
1.0
|
[studio] Copy of folder containing components fails due to exceptions: - # steps to reproduce
1. in components create a folder structure
```
test
+- a
+-component.xml
+-b
+-c
```
2. using right click on sidebar, copy "a" folder. in dialog, observe both folder and component are selected.
3. paste to c folder, observe issue
# logs
```
[ERROR] 2017-05-23 17:22:45,779 [http-nio-18080-exec-6] [util.ContentUtils] | Error while coverting stream to XML
org.dom4j.DocumentException: Error on line 1 of document : Content is not allowed in prolog. Nested exception: Content is not allowed in prolog.
at org.dom4j.io.SAXReader.read(SAXReader.java:482)
at org.dom4j.io.SAXReader.read(SAXReader.java:365)
at org.craftercms.studio.impl.v1.util.ContentUtils.convertStreamToXml(ContentUtils.java:181)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:513)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:483)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.pasteItems(ClipboardServiceImpl.java:119)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.paste(ClipboardServiceImpl.java:89)
at org.craftercms.studio.api.v1.service.clipboard.ClipboardService$paste$1.call(Unknown Source)
at scripts.api.impl.clipboard.SpringClipboardServices.paste(SpringClipboardServices.groovy:43)
at scripts.api.impl.clipboard.SpringClipboardServices$paste$1.call(Unknown Source)
at scripts.api.ClipboardServices.paste(ClipboardServices.groovy:31)
at scripts.api.ClipboardServices$paste$3.call(Unknown Source)
at paste-item_get.run(paste-item.get.groovy:11)
at groovy.util.GroovyScriptEngine.run(GroovyScriptEngine.java:605)
at org.craftercms.engine.scripting.impl.GroovyScript.execute(GroovyScript.java:55)
at org.craftercms.engine.controller.rest.RestScriptsController.executeScript(RestScriptsController.java:163)
at org.craftercms.engine.controller.rest.RestScriptsController.handleRequestInternal(RestScriptsController.java:97)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:174)
at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:50)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.craftercms.studio.impl.v1.web.security.access.StudioAuthenticationTokenProcessingFilter.doFilter(StudioAuthenticationTokenProcessingFilter.java:75)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.SiteContextResolvingFilter.doFilter(SiteContextResolvingFilter.java:46)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.ExceptionHandlingFilter.doFilter(ExceptionHandlingFilter.java:56)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.commons.http.RequestContextBindingFilter.doFilter(RequestContextBindingFilter.java:79)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.studio.impl.v1.web.filter.MultiReadHttpServletRequestWrapperFilter.doFilter(MultiReadHttpServletRequestWrapperFilter.java:32)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:784)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:802)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1410)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Nested exception:
org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLScanner.reportFatalError(Unknown Source)
at org.apache.xerces.impl.XMLDocumentScannerImpl$PrologDispatcher.dispatch(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
at org.dom4j.io.SAXReader.read(SAXReader.java:465)
at org.dom4j.io.SAXReader.read(SAXReader.java:365)
at org.craftercms.studio.impl.v1.util.ContentUtils.convertStreamToXml(ContentUtils.java:181)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:513)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:483)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.pasteItems(ClipboardServiceImpl.java:119)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.paste(ClipboardServiceImpl.java:89)
at org.craftercms.studio.api.v1.service.clipboard.ClipboardService$paste$1.call(Unknown Source)
at scripts.api.impl.clipboard.SpringClipboardServices.paste(SpringClipboardServices.groovy:43)
at scripts.api.impl.clipboard.SpringClipboardServices$paste$1.call(Unknown Source)
at scripts.api.ClipboardServices.paste(ClipboardServices.groovy:31)
at scripts.api.ClipboardServices$paste$3.call(Unknown Source)
at paste-item_get.run(paste-item.get.groovy:11)
at groovy.util.GroovyScriptEngine.run(GroovyScriptEngine.java:605)
at org.craftercms.engine.scripting.impl.GroovyScript.execute(GroovyScript.java:55)
at org.craftercms.engine.controller.rest.RestScriptsController.executeScript(RestScriptsController.java:163)
at org.craftercms.engine.controller.rest.RestScriptsController.handleRequestInternal(RestScriptsController.java:97)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:174)
at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:50)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.craftercms.studio.impl.v1.web.security.access.StudioAuthenticationTokenProcessingFilter.doFilter(StudioAuthenticationTokenProcessingFilter.java:75)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.SiteContextResolvingFilter.doFilter(SiteContextResolvingFilter.java:46)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.ExceptionHandlingFilter.doFilter(ExceptionHandlingFilter.java:56)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.commons.http.RequestContextBindingFilter.doFilter(RequestContextBindingFilter.java:79)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.studio.impl.v1.web.filter.MultiReadHttpServletRequestWrapperFilter.doFilter(MultiReadHttpServletRequestWrapperFilter.java:32)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:784)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:802)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1410)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
[ERROR] 2017-05-23 17:22:45,780 [http-nio-18080-exec-6] [clipboard.ClipboardServiceImpl] | Paste operation failed for item {0} to dest path `{1}, isCut: {2}
java.lang.NullPointerException
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.getContentIds(ContentServiceImpl.java:954)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:514)
at org.craftercms.studio.impl.v1.service.content.ContentServiceImpl.copyContent(ContentServiceImpl.java:483)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.pasteItems(ClipboardServiceImpl.java:119)
at org.craftercms.studio.impl.v1.service.clipboard.ClipboardServiceImpl.paste(ClipboardServiceImpl.java:89)
at org.craftercms.studio.api.v1.service.clipboard.ClipboardService$paste$1.call(Unknown Source)
at scripts.api.impl.clipboard.SpringClipboardServices.paste(SpringClipboardServices.groovy:43)
at scripts.api.impl.clipboard.SpringClipboardServices$paste$1.call(Unknown Source)
at scripts.api.ClipboardServices.paste(ClipboardServices.groovy:31)
at scripts.api.ClipboardServices$paste$3.call(Unknown Source)
at paste-item_get.run(paste-item.get.groovy:11)
at groovy.util.GroovyScriptEngine.run(GroovyScriptEngine.java:605)
at org.craftercms.engine.scripting.impl.GroovyScript.execute(GroovyScript.java:55)
at org.craftercms.engine.controller.rest.RestScriptsController.executeScript(RestScriptsController.java:163)
at org.craftercms.engine.controller.rest.RestScriptsController.handleRequestInternal(RestScriptsController.java:97)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:174)
at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:50)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.craftercms.studio.impl.v1.web.security.access.StudioAuthenticationTokenProcessingFilter.doFilter(StudioAuthenticationTokenProcessingFilter.java:75)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.SiteContextResolvingFilter.doFilter(SiteContextResolvingFilter.java:46)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.engine.servlet.filter.ExceptionHandlingFilter.doFilter(ExceptionHandlingFilter.java:56)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:262)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.commons.http.RequestContextBindingFilter.doFilter(RequestContextBindingFilter.java:79)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.craftercms.studio.impl.v1.web.filter.MultiReadHttpServletRequestWrapperFilter.doFilter(MultiReadHttpServletRequestWrapperFilter.java:32)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:784)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:802)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1410)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)```
|
non_process
|
copy of folder containing components fails due to exceptions steps to reproduce in components create a folder structure test a component xml b c using right click on sidebar copy a folder in dialog observe both folder and component are selected paste to c folder observe issue logs error while coverting stream to xml org documentexception error on line of document content is not allowed in prolog nested exception content is not allowed in prolog at org io saxreader read saxreader java at org io saxreader read saxreader java at org craftercms studio impl util contentutils convertstreamtoxml contentutils java at org craftercms studio impl service content contentserviceimpl copycontent contentserviceimpl java at org craftercms studio impl service content contentserviceimpl copycontent contentserviceimpl java at org craftercms studio impl service clipboard clipboardserviceimpl pasteitems clipboardserviceimpl java at org craftercms studio impl service clipboard clipboardserviceimpl paste clipboardserviceimpl java at org craftercms studio api service clipboard clipboardservice paste call unknown source at scripts api impl clipboard springclipboardservices paste springclipboardservices groovy at scripts api impl clipboard springclipboardservices paste call unknown source at scripts api clipboardservices paste clipboardservices groovy at scripts api clipboardservices paste call unknown source at paste item get run paste item get groovy at groovy util groovyscriptengine run groovyscriptengine java at org craftercms engine scripting impl groovyscript execute groovyscript java at org craftercms engine controller rest restscriptscontroller executescript restscriptscontroller java at org craftercms engine controller rest restscriptscontroller handlerequestinternal restscriptscontroller java at org springframework web servlet mvc abstractcontroller handlerequest abstractcontroller java at org springframework web servlet mvc simplecontrollerhandleradapter handle simplecontrollerhandleradapter java at org springframework web servlet dispatcherservlet dodispatch dispatcherservlet java at org springframework web servlet dispatcherservlet doservice dispatcherservlet java at org springframework web servlet frameworkservlet processrequest frameworkservlet java at org springframework web servlet frameworkservlet doget frameworkservlet java at javax servlet http httpservlet service httpservlet java at org springframework web servlet frameworkservlet service frameworkservlet java at javax servlet http httpservlet service httpservlet java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org apache tomcat websocket server wsfilter dofilter wsfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web access intercept filtersecurityinterceptor invoke filtersecurityinterceptor java at org springframework security web access intercept filtersecurityinterceptor dofilter filtersecurityinterceptor java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web access exceptiontranslationfilter dofilter exceptiontranslationfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web authentication anonymousauthenticationfilter dofilter anonymousauthenticationfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web servletapi securitycontextholderawarerequestfilter dofilter securitycontextholderawarerequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org craftercms studio impl web security access studioauthenticationtokenprocessingfilter dofilter studioauthenticationtokenprocessingfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web header headerwriterfilter dofilterinternal headerwriterfilter java at org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web context request async webasyncmanagerintegrationfilter dofilterinternal webasyncmanagerintegrationfilter java at org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web context securitycontextpersistencefilter dofilter securitycontextpersistencefilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web filterchainproxy dofilterinternal filterchainproxy java at org springframework security web filterchainproxy dofilter filterchainproxy java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms engine servlet filter sitecontextresolvingfilter dofilter sitecontextresolvingfilter java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms engine servlet filter exceptionhandlingfilter dofilter exceptionhandlingfilter java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms commons http requestcontextbindingfilter dofilter requestcontextbindingfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms studio impl web filter multireadhttpservletrequestwrapperfilter dofilter multireadhttpservletrequestwrapperfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org apache catalina core standardwrappervalve invoke standardwrappervalve java at org apache catalina core standardcontextvalve invoke standardcontextvalve java at org apache catalina authenticator authenticatorbase invoke authenticatorbase java at org apache catalina core standardhostvalve invoke standardhostvalve java at org apache catalina valves errorreportvalve invoke errorreportvalve java at org apache catalina valves abstractaccesslogvalve invoke abstractaccesslogvalve java at org apache catalina core standardenginevalve invoke standardenginevalve java at org apache catalina connector coyoteadapter service coyoteadapter java at org apache coyote service java at org apache coyote abstractprocessorlight process abstractprocessorlight java at org apache coyote abstractprotocol connectionhandler process abstractprotocol java at org apache tomcat util net nioendpoint socketprocessor dorun nioendpoint java at org apache tomcat util net socketprocessorbase run socketprocessorbase java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at org apache tomcat util threads taskthread wrappingrunnable run taskthread java at java lang thread run thread java nested exception org xml sax saxparseexception linenumber columnnumber content is not allowed in prolog at org apache xerces util errorhandlerwrapper createsaxparseexception unknown source at org apache xerces util errorhandlerwrapper fatalerror unknown source at org apache xerces impl xmlerrorreporter reporterror unknown source at org apache xerces impl xmlerrorreporter reporterror unknown source at org apache xerces impl xmlscanner reportfatalerror unknown source at org apache xerces impl xmldocumentscannerimpl prologdispatcher dispatch unknown source at org apache xerces impl xmldocumentfragmentscannerimpl scandocument unknown source at org apache xerces parsers parse unknown source at org apache xerces parsers parse unknown source at org apache xerces parsers xmlparser parse unknown source at org apache xerces parsers abstractsaxparser parse unknown source at org apache xerces jaxp saxparserimpl jaxpsaxparser parse unknown source at org io saxreader read saxreader java at org io saxreader read saxreader java at org craftercms studio impl util contentutils convertstreamtoxml contentutils java at org craftercms studio impl service content contentserviceimpl copycontent contentserviceimpl java at org craftercms studio impl service content contentserviceimpl copycontent contentserviceimpl java at org craftercms studio impl service clipboard clipboardserviceimpl pasteitems clipboardserviceimpl java at org craftercms studio impl service clipboard clipboardserviceimpl paste clipboardserviceimpl java at org craftercms studio api service clipboard clipboardservice paste call unknown source at scripts api impl clipboard springclipboardservices paste springclipboardservices groovy at scripts api impl clipboard springclipboardservices paste call unknown source at scripts api clipboardservices paste clipboardservices groovy at scripts api clipboardservices paste call unknown source at paste item get run paste item get groovy at groovy util groovyscriptengine run groovyscriptengine java at org craftercms engine scripting impl groovyscript execute groovyscript java at org craftercms engine controller rest restscriptscontroller executescript restscriptscontroller java at org craftercms engine controller rest restscriptscontroller handlerequestinternal restscriptscontroller java at org springframework web servlet mvc abstractcontroller handlerequest abstractcontroller java at org springframework web servlet mvc simplecontrollerhandleradapter handle simplecontrollerhandleradapter java at org springframework web servlet dispatcherservlet dodispatch dispatcherservlet java at org springframework web servlet dispatcherservlet doservice dispatcherservlet java at org springframework web servlet frameworkservlet processrequest frameworkservlet java at org springframework web servlet frameworkservlet doget frameworkservlet java at javax servlet http httpservlet service httpservlet java at org springframework web servlet frameworkservlet service frameworkservlet java at javax servlet http httpservlet service httpservlet java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org apache tomcat websocket server wsfilter dofilter wsfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web access intercept filtersecurityinterceptor invoke filtersecurityinterceptor java at org springframework security web access intercept filtersecurityinterceptor dofilter filtersecurityinterceptor java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web access exceptiontranslationfilter dofilter exceptiontranslationfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web authentication anonymousauthenticationfilter dofilter anonymousauthenticationfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web servletapi securitycontextholderawarerequestfilter dofilter securitycontextholderawarerequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org craftercms studio impl web security access studioauthenticationtokenprocessingfilter dofilter studioauthenticationtokenprocessingfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web header headerwriterfilter dofilterinternal headerwriterfilter java at org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web context request async webasyncmanagerintegrationfilter dofilterinternal webasyncmanagerintegrationfilter java at org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web context securitycontextpersistencefilter dofilter securitycontextpersistencefilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web filterchainproxy dofilterinternal filterchainproxy java at org springframework security web filterchainproxy dofilter filterchainproxy java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms engine servlet filter sitecontextresolvingfilter dofilter sitecontextresolvingfilter java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms engine servlet filter exceptionhandlingfilter dofilter exceptionhandlingfilter java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms commons http requestcontextbindingfilter dofilter requestcontextbindingfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms studio impl web filter multireadhttpservletrequestwrapperfilter dofilter multireadhttpservletrequestwrapperfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org apache catalina core standardwrappervalve invoke standardwrappervalve java at org apache catalina core standardcontextvalve invoke standardcontextvalve java at org apache catalina authenticator authenticatorbase invoke authenticatorbase java at org apache catalina core standardhostvalve invoke standardhostvalve java at org apache catalina valves errorreportvalve invoke errorreportvalve java at org apache catalina valves abstractaccesslogvalve invoke abstractaccesslogvalve java at org apache catalina core standardenginevalve invoke standardenginevalve java at org apache catalina connector coyoteadapter service coyoteadapter java at org apache coyote service java at org apache coyote abstractprocessorlight process abstractprocessorlight java at org apache coyote abstractprotocol connectionhandler process abstractprotocol java at org apache tomcat util net nioendpoint socketprocessor dorun nioendpoint java at org apache tomcat util net socketprocessorbase run socketprocessorbase java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at org apache tomcat util threads taskthread wrappingrunnable run taskthread java at java lang thread run thread java paste operation failed for item to dest path iscut java lang nullpointerexception at org craftercms studio impl service content contentserviceimpl getcontentids contentserviceimpl java at org craftercms studio impl service content contentserviceimpl copycontent contentserviceimpl java at org craftercms studio impl service content contentserviceimpl copycontent contentserviceimpl java at org craftercms studio impl service clipboard clipboardserviceimpl pasteitems clipboardserviceimpl java at org craftercms studio impl service clipboard clipboardserviceimpl paste clipboardserviceimpl java at org craftercms studio api service clipboard clipboardservice paste call unknown source at scripts api impl clipboard springclipboardservices paste springclipboardservices groovy at scripts api impl clipboard springclipboardservices paste call unknown source at scripts api clipboardservices paste clipboardservices groovy at scripts api clipboardservices paste call unknown source at paste item get run paste item get groovy at groovy util groovyscriptengine run groovyscriptengine java at org craftercms engine scripting impl groovyscript execute groovyscript java at org craftercms engine controller rest restscriptscontroller executescript restscriptscontroller java at org craftercms engine controller rest restscriptscontroller handlerequestinternal restscriptscontroller java at org springframework web servlet mvc abstractcontroller handlerequest abstractcontroller java at org springframework web servlet mvc simplecontrollerhandleradapter handle simplecontrollerhandleradapter java at org springframework web servlet dispatcherservlet dodispatch dispatcherservlet java at org springframework web servlet dispatcherservlet doservice dispatcherservlet java at org springframework web servlet frameworkservlet processrequest frameworkservlet java at org springframework web servlet frameworkservlet doget frameworkservlet java at javax servlet http httpservlet service httpservlet java at org springframework web servlet frameworkservlet service frameworkservlet java at javax servlet http httpservlet service httpservlet java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org apache tomcat websocket server wsfilter dofilter wsfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web access intercept filtersecurityinterceptor invoke filtersecurityinterceptor java at org springframework security web access intercept filtersecurityinterceptor dofilter filtersecurityinterceptor java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web access exceptiontranslationfilter dofilter exceptiontranslationfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web authentication anonymousauthenticationfilter dofilter anonymousauthenticationfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web servletapi securitycontextholderawarerequestfilter dofilter securitycontextholderawarerequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org craftercms studio impl web security access studioauthenticationtokenprocessingfilter dofilter studioauthenticationtokenprocessingfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web header headerwriterfilter dofilterinternal headerwriterfilter java at org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web context request async webasyncmanagerintegrationfilter dofilterinternal webasyncmanagerintegrationfilter java at org springframework web filter onceperrequestfilter dofilter onceperrequestfilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web context securitycontextpersistencefilter dofilter securitycontextpersistencefilter java at org springframework security web filterchainproxy virtualfilterchain dofilter filterchainproxy java at org springframework security web filterchainproxy dofilterinternal filterchainproxy java at org springframework security web filterchainproxy dofilter filterchainproxy java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms engine servlet filter sitecontextresolvingfilter dofilter sitecontextresolvingfilter java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms engine servlet filter exceptionhandlingfilter dofilter exceptionhandlingfilter java at org springframework web filter delegatingfilterproxy invokedelegate delegatingfilterproxy java at org springframework web filter delegatingfilterproxy dofilter delegatingfilterproxy java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms commons http requestcontextbindingfilter dofilter requestcontextbindingfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org craftercms studio impl web filter multireadhttpservletrequestwrapperfilter dofilter multireadhttpservletrequestwrapperfilter java at org apache catalina core applicationfilterchain internaldofilter applicationfilterchain java at org apache catalina core applicationfilterchain dofilter applicationfilterchain java at org apache catalina core standardwrappervalve invoke standardwrappervalve java at org apache catalina core standardcontextvalve invoke standardcontextvalve java at org apache catalina authenticator authenticatorbase invoke authenticatorbase java at org apache catalina core standardhostvalve invoke standardhostvalve java at org apache catalina valves errorreportvalve invoke errorreportvalve java at org apache catalina valves abstractaccesslogvalve invoke abstractaccesslogvalve java at org apache catalina core standardenginevalve invoke standardenginevalve java at org apache catalina connector coyoteadapter service coyoteadapter java at org apache coyote service java at org apache coyote abstractprocessorlight process abstractprocessorlight java at org apache coyote abstractprotocol connectionhandler process abstractprotocol java at org apache tomcat util net nioendpoint socketprocessor dorun nioendpoint java at org apache tomcat util net socketprocessorbase run socketprocessorbase java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at org apache tomcat util threads taskthread wrappingrunnable run taskthread java at java lang thread run thread java
| 0
|
81,156
| 30,731,423,759
|
IssuesEvent
|
2023-07-28 02:10:42
|
zed-industries/community
|
https://api.github.com/repos/zed-industries/community
|
opened
|
Can't open a context menu on selected text
|
defect triage admin read
|
### Check for existing issues
- [X] Completed
### Describe the bug / provide steps to reproduce it
Trying to open a context menu while having text selected doesn't provide a menu of text-related actions (such as the one from TextEdit shown below): it only deselects the text and opens the same menu that shows up upon right- or control-clicking on any location within the buffer.
**To reproduce:**
1. Select some text in a buffer;
2. Open up the context menu (by either right-clicking or control-clicking on the selected text).
### Environment
Zed: v0.97.4 (preview)
OS: macOS 13.5.0
Memory: 8 GiB
Architecture: x86_64
**AND**
Zed: v0.96.3 (stable)
OS: macOS 13.5.0
Memory: 8 GiB
Architecture: x86_64
### If applicable, add mockups / screenshots to help explain present your vision of the feature
<img width="262" alt="image" src="https://github.com/zed-industries/community/assets/83471846/bfc80363-3b08-4325-80de-6268fdea784a">
### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue.
If you only need the most recent lines, you can run the `zed: open log` command palette action to see the last 1000.
_No response_
|
1.0
|
Can't open a context menu on selected text - ### Check for existing issues
- [X] Completed
### Describe the bug / provide steps to reproduce it
Trying to open a context menu while having text selected doesn't provide a menu of text-related actions (such as the one from TextEdit shown below): it only deselects the text and opens the same menu that shows up upon right- or control-clicking on any location within the buffer.
**To reproduce:**
1. Select some text in a buffer;
2. Open up the context menu (by either right-clicking or control-clicking on the selected text).
### Environment
Zed: v0.97.4 (preview)
OS: macOS 13.5.0
Memory: 8 GiB
Architecture: x86_64
**AND**
Zed: v0.96.3 (stable)
OS: macOS 13.5.0
Memory: 8 GiB
Architecture: x86_64
### If applicable, add mockups / screenshots to help explain present your vision of the feature
<img width="262" alt="image" src="https://github.com/zed-industries/community/assets/83471846/bfc80363-3b08-4325-80de-6268fdea784a">
### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue.
If you only need the most recent lines, you can run the `zed: open log` command palette action to see the last 1000.
_No response_
|
non_process
|
can t open a context menu on selected text check for existing issues completed describe the bug provide steps to reproduce it trying to open a context menu while having text selected doesn t provide a menu of text related actions such as the one from textedit shown below it only deselects the text and opens the same menu that shows up upon right or control clicking on any location within the buffer to reproduce select some text in a buffer open up the context menu by either right clicking or control clicking on the selected text environment zed preview os macos memory gib architecture and zed stable os macos memory gib architecture if applicable add mockups screenshots to help explain present your vision of the feature img width alt image src if applicable attach your library logs zed zed log file to this issue if you only need the most recent lines you can run the zed open log command palette action to see the last no response
| 0
|
32,556
| 13,878,860,063
|
IssuesEvent
|
2020-10-17 11:45:33
|
microsoft/vscode-cpptools
|
https://api.github.com/repos/microsoft/vscode-cpptools
|
closed
|
Linter support for CppCoreGuidelines
|
Feature Request Language Service help wanted more votes needed
|
New VS version (not code) has some support for these [guidelines](https://github.com/isocpp/CppCoreGuidelines) and clang-tidy does have them too. So it would be great to see a linter based on cpp core guidelines that'll mark/report relevant warnings/errors inside code. I'll be fine with something that runs explicitly, rather than on typing (to reduce the workload on plugin).
Thanks 😸
|
1.0
|
Linter support for CppCoreGuidelines - New VS version (not code) has some support for these [guidelines](https://github.com/isocpp/CppCoreGuidelines) and clang-tidy does have them too. So it would be great to see a linter based on cpp core guidelines that'll mark/report relevant warnings/errors inside code. I'll be fine with something that runs explicitly, rather than on typing (to reduce the workload on plugin).
Thanks 😸
|
non_process
|
linter support for cppcoreguidelines new vs version not code has some support for these and clang tidy does have them too so it would be great to see a linter based on cpp core guidelines that ll mark report relevant warnings errors inside code i ll be fine with something that runs explicitly rather than on typing to reduce the workload on plugin thanks 😸
| 0
|
286,415
| 21,575,428,766
|
IssuesEvent
|
2022-05-02 13:17:49
|
csaf-poc/csaf_distribution
|
https://api.github.com/repos/csaf-poc/csaf_distribution
|
closed
|
Provider Setup
|
documentation csaf_provider
|
I found a few issues with the [documentation of the provider setup](https://github.com/csaf-poc/csaf_distribution/blob/main/docs/provider-setup.md):
- https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L10 is missing an `e`
- It is unclear what is checked with https://github.com/csaf-poc/csaf_distribution/blob/main/docs/provider-setup.md?plain=1#L10-L13
- From the description before https://github.com/csaf-poc/csaf_distribution/blob/main/docs/provider-setup.md?plain=1#L22-L50 it is not clear that fcgiwrap.conf should be modified.
- https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L52: **Add** should be bold.
- https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L56 should at least contain an additional
```
# Other config
# ...
```
to show that there is additional stuff in the file.
- As the directory `/usr/lib/cgi-bin/` may not exist, there should be a step to create it and set the correct permissions / ownership.
- The binary, when compiled does not end with `.go`. Either it should be renamed or https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L74 should be changed.
- `web` should be in the config show [here](https://github.com/csaf-poc/csaf_distribution/blob/main/docs/provider-setup.md?plain=1#L80-L85).
- https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L88 should also link to #32
|
1.0
|
Provider Setup - I found a few issues with the [documentation of the provider setup](https://github.com/csaf-poc/csaf_distribution/blob/main/docs/provider-setup.md):
- https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L10 is missing an `e`
- It is unclear what is checked with https://github.com/csaf-poc/csaf_distribution/blob/main/docs/provider-setup.md?plain=1#L10-L13
- From the description before https://github.com/csaf-poc/csaf_distribution/blob/main/docs/provider-setup.md?plain=1#L22-L50 it is not clear that fcgiwrap.conf should be modified.
- https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L52: **Add** should be bold.
- https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L56 should at least contain an additional
```
# Other config
# ...
```
to show that there is additional stuff in the file.
- As the directory `/usr/lib/cgi-bin/` may not exist, there should be a step to create it and set the correct permissions / ownership.
- The binary, when compiled does not end with `.go`. Either it should be renamed or https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L74 should be changed.
- `web` should be in the config show [here](https://github.com/csaf-poc/csaf_distribution/blob/main/docs/provider-setup.md?plain=1#L80-L85).
- https://github.com/csaf-poc/csaf_distribution/blob/c57de75dac699afd94297a67c954a8e26ca7b7d2/docs/provider-setup.md?plain=1#L88 should also link to #32
|
non_process
|
provider setup i found a few issues with the is missing an e it is unclear what is checked with from the description before it is not clear that fcgiwrap conf should be modified add should be bold should at least contain an additional other config to show that there is additional stuff in the file as the directory usr lib cgi bin may not exist there should be a step to create it and set the correct permissions ownership the binary when compiled does not end with go either it should be renamed or should be changed web should be in the config show should also link to
| 0
|
4,348
| 7,252,654,001
|
IssuesEvent
|
2018-02-16 00:01:15
|
amigosdapoli/donation-system
|
https://api.github.com/repos/amigosdapoli/donation-system
|
closed
|
Evaluate API endpoint to receive webhook integration from gateway for recurring donation
|
admin-processes
|
Maxipago mentioned they have a webhook service to let us know when recurring donations are being charged. It would be a way to implement support for recurring data without the need of accessing the report API which charges by api call
|
1.0
|
Evaluate API endpoint to receive webhook integration from gateway for recurring donation - Maxipago mentioned they have a webhook service to let us know when recurring donations are being charged. It would be a way to implement support for recurring data without the need of accessing the report API which charges by api call
|
process
|
evaluate api endpoint to receive webhook integration from gateway for recurring donation maxipago mentioned they have a webhook service to let us know when recurring donations are being charged it would be a way to implement support for recurring data without the need of accessing the report api which charges by api call
| 1
|
42,062
| 5,416,524,715
|
IssuesEvent
|
2017-03-02 00:49:27
|
SchizoDuckie/DuckieTV
|
https://api.github.com/repos/SchizoDuckie/DuckieTV
|
closed
|
Syntax errors
|
3 - Done bug Testing Completed
|
**What build of DuckieTV are you using (Standlone / Chrome Extension (New Tab / Browser Action))**
...
Standalone
**What version of DuckieTV are you using (Stable 1.1.x / Nightly yyyymmddHHMM)**
...
used
201702270010
also checked
201702280030
**What is your Operating System (Windows, Mac, Linux, Android)**
...
linux
**Describe the problem you are having and steps to reproduce if available**
...
syntax in file setup used incorrectly, causes errors
**Attach any DuckieTV statistics or Developer Console logs if available**
...
lines 37
if [[ $DEPS1 = 'OK'] && [$DEPS2 = 'OK' ]]; then
corrected
if [[ ($DEPS1 = 'OK' ) && ( $DEPS2 = 'OK' ) ]]; then
line 41
if [[ $DEPS1 = 'NOK']]; then
corrected
if [[ $DEPS1 = 'NOK' ]]; then
line 44
if [[ $DEPS2 = 'NOK']]; then
corrected
if [[ $DEPS2 = 'NOK' ]]; then
suggest installing running https://github.com/koalaman/shellcheck
|
1.0
|
Syntax errors - **What build of DuckieTV are you using (Standlone / Chrome Extension (New Tab / Browser Action))**
...
Standalone
**What version of DuckieTV are you using (Stable 1.1.x / Nightly yyyymmddHHMM)**
...
used
201702270010
also checked
201702280030
**What is your Operating System (Windows, Mac, Linux, Android)**
...
linux
**Describe the problem you are having and steps to reproduce if available**
...
syntax in file setup used incorrectly, causes errors
**Attach any DuckieTV statistics or Developer Console logs if available**
...
lines 37
if [[ $DEPS1 = 'OK'] && [$DEPS2 = 'OK' ]]; then
corrected
if [[ ($DEPS1 = 'OK' ) && ( $DEPS2 = 'OK' ) ]]; then
line 41
if [[ $DEPS1 = 'NOK']]; then
corrected
if [[ $DEPS1 = 'NOK' ]]; then
line 44
if [[ $DEPS2 = 'NOK']]; then
corrected
if [[ $DEPS2 = 'NOK' ]]; then
suggest installing running https://github.com/koalaman/shellcheck
|
non_process
|
syntax errors what build of duckietv are you using standlone chrome extension new tab browser action standalone what version of duckietv are you using stable x nightly yyyymmddhhmm used also checked what is your operating system windows mac linux android linux describe the problem you are having and steps to reproduce if available syntax in file setup used incorrectly causes errors attach any duckietv statistics or developer console logs if available lines if then corrected if then line if then corrected if then line if then corrected if then suggest installing running
| 0
|
60,975
| 14,939,097,196
|
IssuesEvent
|
2021-01-25 16:32:22
|
EIDSS/EIDSS7
|
https://api.github.com/repos/EIDSS/EIDSS7
|
closed
|
HPS08: Unable to edit a reopened human disease report
|
Build 98.0 GAT environment/deployment bug
|
**Summary**
When one user reopened a case, the other user was unable to edit the case because it saved as read-only except for the "report status" field.
**To Reproduce**
Steps to reproduce the behavior:
1. Log in as marinasmith
2. select human-->disease report
3. Enter information for "disease", "report status", "first name", and "last name"
4. Open record for Edward Talbot in edit mode
5. Change report status to "in process"
6. Review case and submit
7. Return to dashboard and logout
8. Login as russelpeters and verify new report
**Expected behavior**
When the record is opened in edit mode, the information is supposed to be editable/modifiable
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Additional details:**
- Build:
- Script title (enter ad hoc if not script-based): HPS08
**Issue severity (Optional)**
Severity (critical, major, minor, low): Major
**Additional context**
Add any other context about the problem here.
|
1.0
|
HPS08: Unable to edit a reopened human disease report - **Summary**
When one user reopened a case, the other user was unable to edit the case because it saved as read-only except for the "report status" field.
**To Reproduce**
Steps to reproduce the behavior:
1. Log in as marinasmith
2. select human-->disease report
3. Enter information for "disease", "report status", "first name", and "last name"
4. Open record for Edward Talbot in edit mode
5. Change report status to "in process"
6. Review case and submit
7. Return to dashboard and logout
8. Login as russelpeters and verify new report
**Expected behavior**
When the record is opened in edit mode, the information is supposed to be editable/modifiable
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Additional details:**
- Build:
- Script title (enter ad hoc if not script-based): HPS08
**Issue severity (Optional)**
Severity (critical, major, minor, low): Major
**Additional context**
Add any other context about the problem here.
|
non_process
|
unable to edit a reopened human disease report summary when one user reopened a case the other user was unable to edit the case because it saved as read only except for the report status field to reproduce steps to reproduce the behavior log in as marinasmith select human disease report enter information for disease report status first name and last name open record for edward talbot in edit mode change report status to in process review case and submit return to dashboard and logout login as russelpeters and verify new report expected behavior when the record is opened in edit mode the information is supposed to be editable modifiable screenshots if applicable add screenshots to help explain your problem additional details build script title enter ad hoc if not script based issue severity optional severity critical major minor low major additional context add any other context about the problem here
| 0
|
5,866
| 8,686,621,283
|
IssuesEvent
|
2018-12-03 11:20:45
|
linnovate/root
|
https://api.github.com/repos/linnovate/root
|
closed
|
push notifications dont work
|
2.0.6 Not Reproducible Process bug bug
|
in a task that i was assigned to, a user changed something in it while i was in another tab, and i didnt see any notification pop up about the change
|
1.0
|
push notifications dont work - in a task that i was assigned to, a user changed something in it while i was in another tab, and i didnt see any notification pop up about the change
|
process
|
push notifications dont work in a task that i was assigned to a user changed something in it while i was in another tab and i didnt see any notification pop up about the change
| 1
|
187,007
| 6,744,277,855
|
IssuesEvent
|
2017-10-20 15:08:21
|
vincentrk/quadrodoodle
|
https://api.github.com/repos/vincentrk/quadrodoodle
|
closed
|
Begin implementing Yaw control
|
high priority
|
- [ ] Look into p controller, cascading controller
- [ ] Figure out and implement new message type for updating p values
- [ ] Quad
- [ ] PC terminal
- [ ] Look at sensor data
- [ ] Figure out drift
- [ ] Figure out calculations/formulas needed
- [ ] Implement
- [ ] Quad
- [ ] PC terminal
|
1.0
|
Begin implementing Yaw control - - [ ] Look into p controller, cascading controller
- [ ] Figure out and implement new message type for updating p values
- [ ] Quad
- [ ] PC terminal
- [ ] Look at sensor data
- [ ] Figure out drift
- [ ] Figure out calculations/formulas needed
- [ ] Implement
- [ ] Quad
- [ ] PC terminal
|
non_process
|
begin implementing yaw control look into p controller cascading controller figure out and implement new message type for updating p values quad pc terminal look at sensor data figure out drift figure out calculations formulas needed implement quad pc terminal
| 0
|
18,852
| 24,766,296,119
|
IssuesEvent
|
2022-10-22 15:13:29
|
NEARWEEK/CORE
|
https://api.github.com/repos/NEARWEEK/CORE
|
opened
|
Move Trello Content Board to Github
|
enhancement Process
|
## 🎉 Subtasks
- [ ] Clean up content board & discontinue it
- [ ] Move relevant workflow onto Github
- [ ] Figure out NW content creation & marketing flow on Github with @b4ltasar & @cudam321
## 🤼♂️ Reviewer
@Kisgus
## 🔗 Work doc(s) / inspirational links
[Trello Board](https://trello.com/b/ckGxMaBX/nearweek-marketing-production)
|
1.0
|
Move Trello Content Board to Github - ## 🎉 Subtasks
- [ ] Clean up content board & discontinue it
- [ ] Move relevant workflow onto Github
- [ ] Figure out NW content creation & marketing flow on Github with @b4ltasar & @cudam321
## 🤼♂️ Reviewer
@Kisgus
## 🔗 Work doc(s) / inspirational links
[Trello Board](https://trello.com/b/ckGxMaBX/nearweek-marketing-production)
|
process
|
move trello content board to github 🎉 subtasks clean up content board discontinue it move relevant workflow onto github figure out nw content creation marketing flow on github with 🤼♂️ reviewer kisgus 🔗 work doc s inspirational links
| 1
|
11,164
| 13,957,694,143
|
IssuesEvent
|
2020-10-24 08:11:10
|
alexanderkotsev/geoportal
|
https://api.github.com/repos/alexanderkotsev/geoportal
|
opened
|
DK: Missing resources in Geoportal
|
DK - Denmark Geoportal Harvesting process
|
Collected from the Geoportal Workshop online survey answers:
A harvesting was done from our national catalogue Friday 10th of Dec. First, we were surprised to see that
the number of downloadable data set had decreased. However, after a closer look at our data set atom feed
we noticed that there was no value in "length" in each entry in the data set feeds. The Geoportal Browser
reported that data has no "DATA_DOWNLOAD_LINK_IS_AVAILABLE". We have so far concluded that this
single information "length" in the data set atom feed has a vital meaning in the Geoportal. Can you confirm
that our conclusion is correct? In addition, if that is correct we think the chances for a successful harvesting
is depending too much on a single tag/information deep down in data set atom feed.
|
1.0
|
DK: Missing resources in Geoportal - Collected from the Geoportal Workshop online survey answers:
A harvesting was done from our national catalogue Friday 10th of Dec. First, we were surprised to see that
the number of downloadable data set had decreased. However, after a closer look at our data set atom feed
we noticed that there was no value in "length" in each entry in the data set feeds. The Geoportal Browser
reported that data has no "DATA_DOWNLOAD_LINK_IS_AVAILABLE". We have so far concluded that this
single information "length" in the data set atom feed has a vital meaning in the Geoportal. Can you confirm
that our conclusion is correct? In addition, if that is correct we think the chances for a successful harvesting
is depending too much on a single tag/information deep down in data set atom feed.
|
process
|
dk missing resources in geoportal collected from the geoportal workshop online survey answers a harvesting was done from our national catalogue friday of dec first we were surprised to see that the number of downloadable data set had decreased however after a closer look at our data set atom feed we noticed that there was no value in quot length quot in each entry in the data set feeds the geoportal browser reported that data has no quot data download link is available quot we have so far concluded that this single information quot length quot in the data set atom feed has a vital meaning in the geoportal can you confirm that our conclusion is correct in addition if that is correct we think the chances for a successful harvesting is depending too much on a single tag information deep down in data set atom feed
| 1
|
24,919
| 17,909,100,938
|
IssuesEvent
|
2021-09-09 01:00:14
|
eslint/eslint
|
https://api.github.com/repos/eslint/eslint
|
closed
|
remove the old cla checker in all eslint repos
|
infrastructure
|
<!--
ESLint adheres to the Open JS Foundation Code of Conduct:
https://eslint.org/conduct
-->
recently we are starting to [use the new cla](https://github.com/eslint/eslint/discussions/14943), but old checker has not been removed in some repos. e.g. https://github.com/eslint/espree/pull/514

|
1.0
|
remove the old cla checker in all eslint repos - <!--
ESLint adheres to the Open JS Foundation Code of Conduct:
https://eslint.org/conduct
-->
recently we are starting to [use the new cla](https://github.com/eslint/eslint/discussions/14943), but old checker has not been removed in some repos. e.g. https://github.com/eslint/espree/pull/514

|
non_process
|
remove the old cla checker in all eslint repos eslint adheres to the open js foundation code of conduct recently we are starting to but old checker has not been removed in some repos e g
| 0
|
785,473
| 27,615,226,743
|
IssuesEvent
|
2023-03-09 18:48:30
|
woocommerce/woocommerce-blocks
|
https://api.github.com/repos/woocommerce/woocommerce-blocks
|
closed
|
Third party tax calculation not displaying after changing shipping method
|
type: bug priority: high
|
## Describe the bug
When using a third-party tax calculation plugin (AvaTax), they are not displayed on Cart and Checkout blocks after changing shipping methods. A page refresh will properly display them, and they get successfully recalculated on shortcode versions.
## To reproduce
Steps to reproduce the behavior:
1. Install Avatax and enable taxes
2. Create several shipping methods
3. Add items to the cart
4. Verify taxes display on the cart and checkout blocks but disappear after changing shipping methods.
## Expected behavior
Taxes should be recalculated and displayed after changing shipping methods. Checking out should include taxes.
## Screen recording
https://user-images.githubusercontent.com/17236129/224076426-cd37942f-2d3f-47b9-81df-aaade2a3a817.mov
## Additional context
Since on-page refresh and on the legacy cart and checkout taxes are correct, we need to investigate if this problem needs fixing on our side.
**EDIT: Applying a coupon also produces the same results. It seems to be derived from `$cart->calculate_totals()`**
|
1.0
|
Third party tax calculation not displaying after changing shipping method - ## Describe the bug
When using a third-party tax calculation plugin (AvaTax), they are not displayed on Cart and Checkout blocks after changing shipping methods. A page refresh will properly display them, and they get successfully recalculated on shortcode versions.
## To reproduce
Steps to reproduce the behavior:
1. Install Avatax and enable taxes
2. Create several shipping methods
3. Add items to the cart
4. Verify taxes display on the cart and checkout blocks but disappear after changing shipping methods.
## Expected behavior
Taxes should be recalculated and displayed after changing shipping methods. Checking out should include taxes.
## Screen recording
https://user-images.githubusercontent.com/17236129/224076426-cd37942f-2d3f-47b9-81df-aaade2a3a817.mov
## Additional context
Since on-page refresh and on the legacy cart and checkout taxes are correct, we need to investigate if this problem needs fixing on our side.
**EDIT: Applying a coupon also produces the same results. It seems to be derived from `$cart->calculate_totals()`**
|
non_process
|
third party tax calculation not displaying after changing shipping method describe the bug when using a third party tax calculation plugin avatax they are not displayed on cart and checkout blocks after changing shipping methods a page refresh will properly display them and they get successfully recalculated on shortcode versions to reproduce steps to reproduce the behavior install avatax and enable taxes create several shipping methods add items to the cart verify taxes display on the cart and checkout blocks but disappear after changing shipping methods expected behavior taxes should be recalculated and displayed after changing shipping methods checking out should include taxes screen recording additional context since on page refresh and on the legacy cart and checkout taxes are correct we need to investigate if this problem needs fixing on our side edit applying a coupon also produces the same results it seems to be derived from cart calculate totals
| 0
|
19,899
| 3,786,700,575
|
IssuesEvent
|
2016-03-21 05:41:43
|
rancher/rancher
|
https://api.github.com/repos/rancher/rancher
|
closed
|
Docker Engine section in Add Host for Packet is on same line as Add Label
|
area/host area/ui kind/bug status/to-test
|
Version 0.50.1
Steps;
1. Go to Host Page
2. Click on Add Host
3. Select Packet
Results: The Docker Engine section is messed up and showing up on same line as add label

Expected:
Should show up on next line like all other add pages.
|
1.0
|
Docker Engine section in Add Host for Packet is on same line as Add Label - Version 0.50.1
Steps;
1. Go to Host Page
2. Click on Add Host
3. Select Packet
Results: The Docker Engine section is messed up and showing up on same line as add label

Expected:
Should show up on next line like all other add pages.
|
non_process
|
docker engine section in add host for packet is on same line as add label version steps go to host page click on add host select packet results the docker engine section is messed up and showing up on same line as add label expected should show up on next line like all other add pages
| 0
|
3,284
| 6,377,421,417
|
IssuesEvent
|
2017-08-02 09:59:11
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
Internal domain function can be used to cause segfaults
|
domain process
|
<!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: master
* **Platform**: all
* **Subsystem**: process / domain
<!-- Enter your issue details below this comment. -->
The following snippet will cause a segmentation fault on master:
```js
// This is an evil array
const array = [0];
Object.defineProperty(array, '0', {
get() {
throw new Error();
}
});
// Trick the environment into thinking it is inside a domain
process._setupDomainUse(array, [])[0] = 1;
// This call will try to use the pretended domain and segfault
require('crypto').randomBytes(1024, () => { });
// The process will segfault above so this never gets printed
console.log('Still working');
```
This is caused by using `env->domain_array->Get(0)` instead of the safe variant of the `Get` function. This is not limited to `randomBytes`, there is a number of files with similar code.
The priority of this issue is very low as it uses undocumented internal functions to intentionally cause a segmentation fault. I am documenting this for the sake of completeness.
|
1.0
|
Internal domain function can be used to cause segfaults - <!--
Thank you for reporting an issue.
This issue tracker is for bugs and issues found within Node.js core.
If you require more general support please file an issue on our help
repo. https://github.com/nodejs/help
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
* **Version**: master
* **Platform**: all
* **Subsystem**: process / domain
<!-- Enter your issue details below this comment. -->
The following snippet will cause a segmentation fault on master:
```js
// This is an evil array
const array = [0];
Object.defineProperty(array, '0', {
get() {
throw new Error();
}
});
// Trick the environment into thinking it is inside a domain
process._setupDomainUse(array, [])[0] = 1;
// This call will try to use the pretended domain and segfault
require('crypto').randomBytes(1024, () => { });
// The process will segfault above so this never gets printed
console.log('Still working');
```
This is caused by using `env->domain_array->Get(0)` instead of the safe variant of the `Get` function. This is not limited to `randomBytes`, there is a number of files with similar code.
The priority of this issue is very low as it uses undocumented internal functions to intentionally cause a segmentation fault. I am documenting this for the sake of completeness.
|
process
|
internal domain function can be used to cause segfaults thank you for reporting an issue this issue tracker is for bugs and issues found within node js core if you require more general support please file an issue on our help repo please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version master platform all subsystem process domain the following snippet will cause a segmentation fault on master js this is an evil array const array object defineproperty array get throw new error trick the environment into thinking it is inside a domain process setupdomainuse array this call will try to use the pretended domain and segfault require crypto randombytes the process will segfault above so this never gets printed console log still working this is caused by using env domain array get instead of the safe variant of the get function this is not limited to randombytes there is a number of files with similar code the priority of this issue is very low as it uses undocumented internal functions to intentionally cause a segmentation fault i am documenting this for the sake of completeness
| 1
|
195,119
| 22,288,267,868
|
IssuesEvent
|
2022-06-12 01:04:28
|
SmartBear/readyapi4j
|
https://api.github.com/repos/SmartBear/readyapi4j
|
closed
|
CVE-2013-7285 (High) detected in xstream-1.3.1.jar - autoclosed
|
security vulnerability
|
## CVE-2013-7285 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.3.1.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /modules/cucumber/modules/cucumber4oas/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar</p>
<p>
Dependency Hierarchy:
- readyapi4j-maven-plugin-1.0.0-SNAPSHOT.jar (Root Library)
- readyapi4j-facade-1.0.0-SNAPSHOT.jar
- readyapi4j-local-1.0.0-SNAPSHOT.jar
- soapui-testserver-api-5.5.0.jar
- soapui-5.5.0.jar
- :x: **xstream-1.3.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/SmartBear/readyapi4j/commit/2616e3393c26f490cd18ae49306a09616a7b066f">2616e3393c26f490cd18ae49306a09616a7b066f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Xstream API versions up to 1.4.6 and version 1.4.10, if the security framework has not been initialized, may allow a remote attacker to run arbitrary shell commands by manipulating the processed input stream when unmarshaling XML or any supported format. e.g. JSON.
<p>Publish Date: 2019-05-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-7285>CVE-2013-7285</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-7285">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-7285</a></p>
<p>Release Date: 2019-05-15</p>
<p>Fix Resolution: 1.4.7,1.4.11</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.thoughtworks.xstream","packageName":"xstream","packageVersion":"1.3.1","packageFilePaths":["/modules/cucumber/modules/cucumber4oas/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.smartbear.readyapi:readyapi4j-maven-plugin:1.0.0-SNAPSHOT;com.smartbear.readyapi:readyapi4j-facade:1.0.0-SNAPSHOT;com.smartbear.readyapi:readyapi4j-local:1.0.0-SNAPSHOT;com.smartbear.soapui:soapui-testserver-api:5.5.0;com.smartbear.soapui:soapui:5.5.0;com.thoughtworks.xstream:xstream:1.3.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.4.7,1.4.11","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2013-7285","vulnerabilityDetails":"Xstream API versions up to 1.4.6 and version 1.4.10, if the security framework has not been initialized, may allow a remote attacker to run arbitrary shell commands by manipulating the processed input stream when unmarshaling XML or any supported format. e.g. JSON.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-7285","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2013-7285 (High) detected in xstream-1.3.1.jar - autoclosed - ## CVE-2013-7285 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.3.1.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /modules/cucumber/modules/cucumber4oas/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar,/home/wss-scanner/.m2/repository/thoughtworks/xstream/1.3.1/xstream-1.3.1.jar</p>
<p>
Dependency Hierarchy:
- readyapi4j-maven-plugin-1.0.0-SNAPSHOT.jar (Root Library)
- readyapi4j-facade-1.0.0-SNAPSHOT.jar
- readyapi4j-local-1.0.0-SNAPSHOT.jar
- soapui-testserver-api-5.5.0.jar
- soapui-5.5.0.jar
- :x: **xstream-1.3.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/SmartBear/readyapi4j/commit/2616e3393c26f490cd18ae49306a09616a7b066f">2616e3393c26f490cd18ae49306a09616a7b066f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Xstream API versions up to 1.4.6 and version 1.4.10, if the security framework has not been initialized, may allow a remote attacker to run arbitrary shell commands by manipulating the processed input stream when unmarshaling XML or any supported format. e.g. JSON.
<p>Publish Date: 2019-05-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-7285>CVE-2013-7285</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-7285">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-7285</a></p>
<p>Release Date: 2019-05-15</p>
<p>Fix Resolution: 1.4.7,1.4.11</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.thoughtworks.xstream","packageName":"xstream","packageVersion":"1.3.1","packageFilePaths":["/modules/cucumber/modules/cucumber4oas/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.smartbear.readyapi:readyapi4j-maven-plugin:1.0.0-SNAPSHOT;com.smartbear.readyapi:readyapi4j-facade:1.0.0-SNAPSHOT;com.smartbear.readyapi:readyapi4j-local:1.0.0-SNAPSHOT;com.smartbear.soapui:soapui-testserver-api:5.5.0;com.smartbear.soapui:soapui:5.5.0;com.thoughtworks.xstream:xstream:1.3.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.4.7,1.4.11","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2013-7285","vulnerabilityDetails":"Xstream API versions up to 1.4.6 and version 1.4.10, if the security framework has not been initialized, may allow a remote attacker to run arbitrary shell commands by manipulating the processed input stream when unmarshaling XML or any supported format. e.g. JSON.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-7285","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve high detected in xstream jar autoclosed cve high severity vulnerability vulnerable library xstream jar path to dependency file modules cucumber modules pom xml path to vulnerable library home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar home wss scanner repository thoughtworks xstream xstream jar dependency hierarchy maven plugin snapshot jar root library facade snapshot jar local snapshot jar soapui testserver api jar soapui jar x xstream jar vulnerable library found in head commit a href found in base branch master vulnerability details xstream api versions up to and version if the security framework has not been initialized may allow a remote attacker to run arbitrary shell commands by manipulating the processed input stream when unmarshaling xml or any supported format e g json publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree com smartbear readyapi maven plugin snapshot com smartbear readyapi facade snapshot com smartbear readyapi local snapshot com smartbear soapui soapui testserver api com smartbear soapui soapui com thoughtworks xstream xstream isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails xstream api versions up to and version if the security framework has not been initialized may allow a remote attacker to run arbitrary shell commands by manipulating the processed input stream when unmarshaling xml or any supported format e g json vulnerabilityurl
| 0
|
5,011
| 7,843,330,554
|
IssuesEvent
|
2018-06-19 05:20:16
|
onemoongit/CAU
|
https://api.github.com/repos/onemoongit/CAU
|
closed
|
NGP_process
|
NGP process
|
1. 벽 만들기 ( 보이지 않는 벽 ) , master는 보이고 slave는 안 보이도록
2. 벽 통과할 수 없는 기능 추가
3. 발사체 추가(소리)
4. 소리 튕기게 하기 ( 입사각 반사각 ) - > RECT의 중심을 기준으로 어디에 부딪혔는지 확인
5. 탈출구 만들기
6. 통신
|
1.0
|
NGP_process - 1. 벽 만들기 ( 보이지 않는 벽 ) , master는 보이고 slave는 안 보이도록
2. 벽 통과할 수 없는 기능 추가
3. 발사체 추가(소리)
4. 소리 튕기게 하기 ( 입사각 반사각 ) - > RECT의 중심을 기준으로 어디에 부딪혔는지 확인
5. 탈출구 만들기
6. 통신
|
process
|
ngp process 벽 만들기 보이지 않는 벽 master는 보이고 slave는 안 보이도록 벽 통과할 수 없는 기능 추가 발사체 추가 소리 소리 튕기게 하기 입사각 반사각 rect의 중심을 기준으로 어디에 부딪혔는지 확인 탈출구 만들기 통신
| 1
|
2,875
| 5,832,289,715
|
IssuesEvent
|
2017-05-08 21:26:18
|
ContaoMonitoring/monitoring
|
https://api.github.com/repos/ContaoMonitoring/monitoring
|
closed
|
Measurement of response time
|
Feature ⚙ - Processed
|
Measure the duration until the response is back at the server
Should be done here: https://github.com/ContaoMonitoring/monitoring/blob/master/system/modules/Monitoring/classes/Monitoring.php#L161
An additional diagramm would be fine.
|
1.0
|
Measurement of response time - Measure the duration until the response is back at the server
Should be done here: https://github.com/ContaoMonitoring/monitoring/blob/master/system/modules/Monitoring/classes/Monitoring.php#L161
An additional diagramm would be fine.
|
process
|
measurement of response time measure the duration until the response is back at the server should be done here an additional diagramm would be fine
| 1
|
8,813
| 11,924,384,142
|
IssuesEvent
|
2020-04-01 09:26:21
|
prisma/prisma-client-js
|
https://api.github.com/repos/prisma/prisma-client-js
|
closed
|
Invalid include query results in unexpected output
|
bug/1-repro-available kind/bug process/candidate
|
This issue is related to https://github.com/prisma/prisma-client-js/issues/607
Schema:
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model api_keys {
allowed_ips String[]
created_at DateTime
created_by_id Int?
hidden Boolean @default(false)
id Int @default(autoincrement()) @id
key String
updated_at DateTime
user_id Int? @unique
@@index([key], name: "index_api_keys_on_key")
}
```
When running the following **invalid** query I'm getting some really weird terminal output before the helpful intended runtime error message is shown:
```ts
import { PrismaClient } from '@prisma/client'
// or const { PrismaClient } = require('@prisma/client')
const prisma = new PrismaClient()
prisma.api_keys.findMany({ first: 1, include: { user: true } }).then(x => {
console.log(x)
prisma.disconnect()
})
```

_scrolling_

_scrolling_

_scrolling_

|
1.0
|
Invalid include query results in unexpected output - This issue is related to https://github.com/prisma/prisma-client-js/issues/607
Schema:
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model api_keys {
allowed_ips String[]
created_at DateTime
created_by_id Int?
hidden Boolean @default(false)
id Int @default(autoincrement()) @id
key String
updated_at DateTime
user_id Int? @unique
@@index([key], name: "index_api_keys_on_key")
}
```
When running the following **invalid** query I'm getting some really weird terminal output before the helpful intended runtime error message is shown:
```ts
import { PrismaClient } from '@prisma/client'
// or const { PrismaClient } = require('@prisma/client')
const prisma = new PrismaClient()
prisma.api_keys.findMany({ first: 1, include: { user: true } }).then(x => {
console.log(x)
prisma.disconnect()
})
```

_scrolling_

_scrolling_

_scrolling_

|
process
|
invalid include query results in unexpected output this issue is related to schema prisma generator client provider prisma client js datasource db provider postgresql url env database url model api keys allowed ips string created at datetime created by id int hidden boolean default false id int default autoincrement id key string updated at datetime user id int unique index name index api keys on key when running the following invalid query i m getting some really weird terminal output before the helpful intended runtime error message is shown ts import prismaclient from prisma client or const prismaclient require prisma client const prisma new prismaclient prisma api keys findmany first include user true then x console log x prisma disconnect scrolling scrolling scrolling
| 1
|
336,939
| 24,519,803,024
|
IssuesEvent
|
2022-10-11 08:39:37
|
GDATASoftwareAG/vaas
|
https://api.github.com/repos/GDATASoftwareAG/vaas
|
opened
|
UseShed & UseCache Flags
|
documentation
|
In the VerdictRequest protocol you can specify two optional flags:
use_shed & use_cache
These are debug flags and should be documented and provided as example in all SDKs.
|
1.0
|
UseShed & UseCache Flags - In the VerdictRequest protocol you can specify two optional flags:
use_shed & use_cache
These are debug flags and should be documented and provided as example in all SDKs.
|
non_process
|
useshed usecache flags in the verdictrequest protocol you can specify two optional flags use shed use cache these are debug flags and should be documented and provided as example in all sdks
| 0
|
10,547
| 13,327,678,160
|
IssuesEvent
|
2020-08-27 13:30:49
|
aiidateam/aiida-core
|
https://api.github.com/repos/aiidateam/aiida-core
|
closed
|
get job status from scheduler
|
priority/nice-to-have topic/calc-jobs topic/processes topic/schedulers type/accepted feature
|
Once a job has been dropped from the queue, it would be great if AiiDA would query the scheduler for the job status rather than assuming that the job completed correctly (e.g. to detect whether the walltime was exceeded, the job was cancelled via a scheduler command, ...).
When AiiDA thinks a job is finished, it could simply run something like (add --format=State to just show the state)
```
$ sacct --parsable --jobs=11566621
JobID|JobName|Partition|Account|AllocCPUS|State|ExitCode|
11566621|aiida-140|normal|s888|192|TIMEOUT|0:0|
11566621.batch|batch||s888|24|CANCELLED|0:15|
11566621.extern|extern||s888|192|COMPLETED|0:0|
11566621.0|pw.x||s888|96|FAILED|1:0|
```
The slurm plugin already has commands that use sacct, e.g. here:
https://github.com/aiidateam/aiida_core/blob/6e9711046753332933f982971db1d7ac7e7ade58/aiida/scheduler/plugins/slurm.py#L225
@sphuber suggest to implement this in the form of a builtin generic scheduler output parser that is called before the actual calculation parser.
> Ideally we would reserved some exit codes for scheduler specific errors, that will be available to the calculation parser, which can then decide what to do, if they want to simply return the scheduler error code or do some parsing before
|
1.0
|
get job status from scheduler - Once a job has been dropped from the queue, it would be great if AiiDA would query the scheduler for the job status rather than assuming that the job completed correctly (e.g. to detect whether the walltime was exceeded, the job was cancelled via a scheduler command, ...).
When AiiDA thinks a job is finished, it could simply run something like (add --format=State to just show the state)
```
$ sacct --parsable --jobs=11566621
JobID|JobName|Partition|Account|AllocCPUS|State|ExitCode|
11566621|aiida-140|normal|s888|192|TIMEOUT|0:0|
11566621.batch|batch||s888|24|CANCELLED|0:15|
11566621.extern|extern||s888|192|COMPLETED|0:0|
11566621.0|pw.x||s888|96|FAILED|1:0|
```
The slurm plugin already has commands that use sacct, e.g. here:
https://github.com/aiidateam/aiida_core/blob/6e9711046753332933f982971db1d7ac7e7ade58/aiida/scheduler/plugins/slurm.py#L225
@sphuber suggest to implement this in the form of a builtin generic scheduler output parser that is called before the actual calculation parser.
> Ideally we would reserved some exit codes for scheduler specific errors, that will be available to the calculation parser, which can then decide what to do, if they want to simply return the scheduler error code or do some parsing before
|
process
|
get job status from scheduler once a job has been dropped from the queue it would be great if aiida would query the scheduler for the job status rather than assuming that the job completed correctly e g to detect whether the walltime was exceeded the job was cancelled via a scheduler command when aiida thinks a job is finished it could simply run something like add format state to just show the state sacct parsable jobs jobid jobname partition account alloccpus state exitcode aiida normal timeout batch batch cancelled extern extern completed pw x failed the slurm plugin already has commands that use sacct e g here sphuber suggest to implement this in the form of a builtin generic scheduler output parser that is called before the actual calculation parser ideally we would reserved some exit codes for scheduler specific errors that will be available to the calculation parser which can then decide what to do if they want to simply return the scheduler error code or do some parsing before
| 1
|
134,407
| 19,185,508,804
|
IssuesEvent
|
2021-12-05 05:26:21
|
PostHog/posthog
|
https://api.github.com/repos/PostHog/posthog
|
closed
|
Dashboards color schemes
|
enhancement design team-core-experience
|
When working on the designs for #1659, @lottiecoxon helped us create different color scheme proposals for the main dashboards 🙌 ([full design here](https://www.figma.com/file/gtW3o3VKYvf04ws8cZM3Nr/PH-Retention-optimisations?node-id=492%3A0)). Opening this issue to discuss the different proposals.
### Current color scheme

### Proposal 1

### Proposal 2

### Proposal 3

### Proposal 4

|
1.0
|
Dashboards color schemes - When working on the designs for #1659, @lottiecoxon helped us create different color scheme proposals for the main dashboards 🙌 ([full design here](https://www.figma.com/file/gtW3o3VKYvf04ws8cZM3Nr/PH-Retention-optimisations?node-id=492%3A0)). Opening this issue to discuss the different proposals.
### Current color scheme

### Proposal 1

### Proposal 2

### Proposal 3

### Proposal 4

|
non_process
|
dashboards color schemes when working on the designs for lottiecoxon helped us create different color scheme proposals for the main dashboards 🙌 opening this issue to discuss the different proposals current color scheme proposal proposal proposal proposal
| 0
|
14,850
| 18,244,728,103
|
IssuesEvent
|
2021-10-01 16:49:49
|
NixOS/nixpkgs
|
https://api.github.com/repos/NixOS/nixpkgs
|
opened
|
21.11 Feature Freeze
|
6.topic: release process
|
It's that time again!
Let's clarify any blocking concerns for the 21.11 Release, which will be cut on the 26th of November.
Nix/nix-cli ecosystem: @edolstra @grahamc @nbp @Profpatsch
Mobile: @samueldr
Nixos Modules / internals : @Infinisil @Ericson2314 @alyssais
Nixos tests: @tfc
Marketing: @garbas
Docs: @ryantm
Release: @tomberek
C: @matthewbauer
Emacs: @adisbladis
Vim/Neovim: @jonringer @softinio @teto
Erlang: @gleber @NixOS/beam
Go: @kalbasit @Mic92 @zowoq
Haskell: @NixOS/haskell @cdepillabout @sternenseemann @maralorn @expipiplus1
Python: @FRidh @DavHau
Perl: @stigtsp
Php: @NixOS/php @aanderse @etu @globin @ma27 @talyz
Ruby: @marsam
Rust: @zowoq @Mic92 @andir @LnL7
Dhall: @Gabriel439 @ehmry
R: @jbedo @bcdarwin
Darwin: @NixOS/darwin-maintainers @toonn
Bazel: @mboes
Blockchains @mmahut @RaghavSood
Podman: @NixOS/podman
Docker: @roberth @utdemir
Gnome: @jtojnar @NixOS/gnome
Qt / KDE: @ttuegel @NixOS/qt-kde
Cinnamon: @mkg20001
Postgres: @thoughtpolice
Everyone else: @NixOS/nixpkgs-committers @NixOS/release-engineers
No issue is too big or small, but let's remember that we are all working on donated time here, so let's triage those issues that can be realistically addressed by release time. Thanks everyone!
|
1.0
|
21.11 Feature Freeze - It's that time again!
Let's clarify any blocking concerns for the 21.11 Release, which will be cut on the 26th of November.
Nix/nix-cli ecosystem: @edolstra @grahamc @nbp @Profpatsch
Mobile: @samueldr
Nixos Modules / internals : @Infinisil @Ericson2314 @alyssais
Nixos tests: @tfc
Marketing: @garbas
Docs: @ryantm
Release: @tomberek
C: @matthewbauer
Emacs: @adisbladis
Vim/Neovim: @jonringer @softinio @teto
Erlang: @gleber @NixOS/beam
Go: @kalbasit @Mic92 @zowoq
Haskell: @NixOS/haskell @cdepillabout @sternenseemann @maralorn @expipiplus1
Python: @FRidh @DavHau
Perl: @stigtsp
Php: @NixOS/php @aanderse @etu @globin @ma27 @talyz
Ruby: @marsam
Rust: @zowoq @Mic92 @andir @LnL7
Dhall: @Gabriel439 @ehmry
R: @jbedo @bcdarwin
Darwin: @NixOS/darwin-maintainers @toonn
Bazel: @mboes
Blockchains @mmahut @RaghavSood
Podman: @NixOS/podman
Docker: @roberth @utdemir
Gnome: @jtojnar @NixOS/gnome
Qt / KDE: @ttuegel @NixOS/qt-kde
Cinnamon: @mkg20001
Postgres: @thoughtpolice
Everyone else: @NixOS/nixpkgs-committers @NixOS/release-engineers
No issue is too big or small, but let's remember that we are all working on donated time here, so let's triage those issues that can be realistically addressed by release time. Thanks everyone!
|
process
|
feature freeze it s that time again let s clarify any blocking concerns for the release which will be cut on the of november nix nix cli ecosystem edolstra grahamc nbp profpatsch mobile samueldr nixos modules internals infinisil alyssais nixos tests tfc marketing garbas docs ryantm release tomberek c matthewbauer emacs adisbladis vim neovim jonringer softinio teto erlang gleber nixos beam go kalbasit zowoq haskell nixos haskell cdepillabout sternenseemann maralorn python fridh davhau perl stigtsp php nixos php aanderse etu globin talyz ruby marsam rust zowoq andir dhall ehmry r jbedo bcdarwin darwin nixos darwin maintainers toonn bazel mboes blockchains mmahut raghavsood podman nixos podman docker roberth utdemir gnome jtojnar nixos gnome qt kde ttuegel nixos qt kde cinnamon postgres thoughtpolice everyone else nixos nixpkgs committers nixos release engineers no issue is too big or small but let s remember that we are all working on donated time here so let s triage those issues that can be realistically addressed by release time thanks everyone
| 1
|
287,069
| 8,798,685,639
|
IssuesEvent
|
2018-12-24 09:20:15
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.kia.com - site is not usable
|
browser-firefox-mobile priority-normal
|
<!-- @browser: Firefox Mobile 65.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:65.0) Gecko/65.0 Firefox/65.0 -->
<!-- @reported_with: mobile-reporter -->
**URL**: https://www.kia.com/us/en/home
**Browser / Version**: Firefox Mobile 65.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: Website doesn't load
**Steps to Reproduce**:
Website doesn't load, remains blank
[](https://webcompat.com/uploads/2018/12/ec700430-3ca8-43ff-bcd7-34693683cd48.jpeg)
<details>
<summary>Browser Configuration</summary>
<ul>
<li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20181217180946</li><li>tracking content blocked: true (strict)</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: true</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: beta</li>
</ul>
<p>Console Messages:</p>
<pre>
[u'[JavaScript Warning: "The resource at https://maps.google.com/maps/api/js?client=gme-hyundaimotorcompany1&v=3®ion=kr&language=en was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://www.youtube.com/iframe_api was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://www.googleadservices.com/pagead/conversion.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://maps.google.com/maps/api/js?client=gme-hyundaimotorcompany1&v=3®ion=kr&language=en was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://tag.contactatonce.com/tag/tag.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://maps.google.com/maps/api/js?client=gme-hyundaimotorcompany1&v=3®ion=kr&language=en." {file: "https://www.kia.com/us/en/home" line: 1841}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://tag.contactatonce.com/tag/tag.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Error: "ReferenceError: google is not defined" {file: "https://www.kia.com/us/k4/scripts/vendor.js" line: 19}]\n@https://www.kia.com/us/k4/scripts/vendor.js:19:11956\n', u'[JavaScript Error: "TypeError: THREE is undefined" {file: "https://www.kia.com/us/k4/scripts/k4-libs.js" line: 9}]\n@https://www.kia.com/us/k4/scripts/k4-libs.js:9:12160\n', u'[JavaScript Warning: "The resource at https://www.youtube.com/iframe_api was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.youtube.com/iframe_api." {file: "https://www.kia.com/us/en/home" line: 1868}]', u'[JavaScript Warning: "The resource at https://www.googleadservices.com/pagead/conversion.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.googleadservices.com/pagead/conversion.js." {file: "https://www.kia.com/us/en/home" line: 1880}]', u'[JavaScript Error: "Error: [$injector:modulerr] Failed to instantiate module kia4 due to:\n[$injector:modulerr] Failed to instantiate module ngSanitize due to:\n[$injector:nomod] Module \'ngSanitize\' is not available! You either misspelled the module name or forgot to load it. If registering a module ensure that you specify the dependencies as the second argument.\nhttp://errors.angularjs.org/1.3.15/$injector/nomod?p0=ngSanitize\nd/<@https://www.kia.com/us/k4/scripts/vendor.js:12:12551\nka/</</<@https://www.kia.com/us/k4/scripts/vendor.js:12:20897\nb@https://www.kia.com/us/k4/scripts/vendor.js:12:20469\nka/</<@https://www.kia.com/us/k4/scripts/vendor.js:12:20781\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29215\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29232\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nSa@https://www.kia.com/us/k4/scripts/vendor.js:12:30753\nh@https://www.kia.com/us/k4/scripts/vendor.js:12:18582\n_@https://www.kia.com/us/k4/scripts/vendor.js:12:18892\n$@https://www.kia.com/us/k4/scripts/vendor.js:12:18141\n@https://www.kia.com/us/k4/scripts/vendor.js:16:16490\nk@https://www.kia.com/us/k4/scripts/vendor.js:2:10435\nfireWith@https://www.kia.com/us/k4/scripts/vendor.js:2:11252\nready@https://www.kia.com/us/k4/scripts/vendor.js:2:13045\ng@https://www.kia.com/us/k4/scripts/vendor.js:1:7876\n\nhttp://errors.angularjs.org/1.3.15/$injector/modulerr?p0=ngSanitize&p1=%5B%24injector%3Anomod%5D%20Module%20\'ngSanitize\'%20is%20not%20available!%20You%20either%20misspelled%20the%20module%20name%20or%20forgot%20to%20load%20it.%20If%20registering%20a%20module%20ensure%20that%20you%20specify%20the%20dependencies%20as%20the%20second%20argument.%0Ahttp%3A%2F%2Ferrors.angularjs.org%2F1.3.15%2F%24injector%2Fnomod%3Fp0%3DngSanitize%0Ad%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12551%0Aka%2F%3C%2F%3C%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20897%0Ab%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20469%0Aka%2F%3C%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20781%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29215%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29232%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0ASa%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A30753%0Ah%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18582%0A_%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18892%0A%24%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18141%0A%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A16%3A16490%0Ak%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A10435%0AfireWith%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A11252%0Aready%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A13045%0Ag%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A1%3A7876%0A\nd/<@https://www.kia.com/us/k4/scripts/vendor.js:12:12551\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29491\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29232\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nSa@https://www.kia.com/us/k4/scripts/vendor.js:12:30753\nh@https://www.kia.com/us/k4/scripts/vendor.js:12:18582\n_@https://www.kia.com/us/k4/scripts/vendor.js:12:18892\n$@https://www.kia.com/us/k4/scripts/vendor.js:12:18141\n@https://www.kia.com/us/k4/scripts/vendor.js:16:16490\nk@https://www.kia.com/us/k4/scripts/vendor.js:2:10435\nfireWith@https://www.kia.com/us/k4/scripts/vendor.js:2:11252\nready@https://www.kia.com/us/k4/scripts/vendor.js:2:13045\ng@https://www.kia.com/us/k4/scripts/vendor.js:1:7876\n\nhttp://errors.angularjs.org/1.3.15/$injector/modulerr?p0=kia4&p1=%5B%24injector%3Amodulerr%5D%20Failed%20to%20instantiate%20module%20ngSanitize%20due%20to%3A%0A%5B%24injector%3Anomod%5D%20Module%20\'ngSanitize\'%20is%20not%20available!%20You%20either%20misspelled%20the%20module%20name%20or%20forgot%20to%20load%20it.%20If%20registering%20a%20module%20ensure%20that%20you%20specify%20the%20dependencies%20as%20the%20second%20argument.%0Ahttp%3A%2F%2Ferrors.angularjs.org%2F1.3.15%2F%24injector%2Fnomod%3Fp0%3DngSanitize%0Ad%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12551%0Aka%2F%3C%2F%3C%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20897%0Ab%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20469%0Aka%2F%3C%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20781%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29215%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29232%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0ASa%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A30753%0Ah%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18582%0A_%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18892%0A%24%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18141%0A%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A16%3A16490%0Ak%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A10435%0AfireWith%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A11252%0Aready%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A13045%0Ag%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A1%3A7876%0A%0Ahttp%3A%2F%2Ferrors.angularjs.org%2F1.3.15%2F%24injector%2Fmodulerr%3Fp0%3DngSanitize%26p1%3D%255B%2524injector%253Anomod%255D%2520Module%2520\'ngSanitize\'%2520is%2520not%2520available!%2520You%2520either%2520misspelled%2520the%2520module%2520name%2520or%2520forgot%2520to%2520load%2520it.%2520If%2520registering%2520a%2520module%2520ensure%2520that%2520you%2520specify%2520the%2520dependencies%2520as%2520the%2520second%2520argument.%250Ahttp%253A%252F%252Ferrors.angularjs.org%252F1.3.15%252F%2524injector%252Fnomod%253Fp0%253DngSanitize%250Ad%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A12551%250Aka%252F%253C%252F%253C%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A20897%250Ab%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A20469%250Aka%252F%253C%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A20781%250An%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A29215%250Af%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A12934%250An%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A29063%250An%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A29232%250Af%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A12934%250An%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A29063%250ASa%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A30753%250Ah%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A18582%250A_%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A18892%250A%2524%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A18141%250A%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A16%253A16490%250Ak%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A2%253A10435%250AfireWith%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A2%253A11252%250Aready%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A2%253A13045%250Ag%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A1%253A7876%250A%0Ad%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12551%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29491%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29232%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0ASa%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A30753%0Ah%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18582%0A_%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18892%0A%24%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18141%0A%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A16%3A16490%0Ak%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A10435%0AfireWith%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A11252%0Aready%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A13045%0Ag%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A1%3A7876%0A" {file: "https://www.kia.com/us/k4/scripts/vendor.js" line: 12}]\nd/<@https://www.kia.com/us/k4/scripts/vendor.js:12:12551\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29491\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nSa@https://www.kia.com/us/k4/scripts/vendor.js:12:30753\nh@https://www.kia.com/us/k4/scripts/vendor.js:12:18582\n_@https://www.kia.com/us/k4/scripts/vendor.js:12:18892\n$@https://www.kia.com/us/k4/scripts/vendor.js:12:18141\n@https://www.kia.com/us/k4/scripts/vendor.js:16:16490\nk@https://www.kia.com/us/k4/scripts/vendor.js:2:10435\nfireWith@https://www.kia.com/us/k4/scripts/vendor.js:2:11252\nready@https://www.kia.com/us/k4/scripts/vendor.js:2:13045\ng@https://www.kia.com/us/k4/scripts/vendor.js:1:7876\n', u'[JavaScript Warning: "The resource at https://static.ads-twitter.com/uwt.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://connect.facebook.net/en_US/fbevents.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://d.turn.com/r/dd/id/L21rdC8yNTcvY2lkLzI0OTE3MTg2L3QvMA/dpuid/L21rdC82ODUvcGlkLzc2NzM0OTg1L3QvMA/kv/ was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://bat.bing.com/bat.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://www.googleadservices.com/pagead/conversion_async.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://www.googleadservices.com/pagead/conversion_async.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://d.turn.com/r/dft/id/L21rdC82ODUvcGlkLzc2NzM0OTg1L3QvMA was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://live.kia.carlabs.com/logcookies. (Reason: CORS header Access-Control-Allow-Origin missing)."]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://live.kia.carlabs.com/logcookies. (Reason: CORS request did not succeed)."]', u'[JavaScript Error: "TypeError: NetworkError when attempting to fetch resource."]', u'[JavaScript Warning: "Loading failed for the <script> with source https://static.ads-twitter.com/uwt.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://connect.facebook.net/en_US/fbevents.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://d.turn.com/r/dd/id/L21rdC8yNTcvY2lkLzI0OTE3MTg2L3QvMA/dpuid/L21rdC82ODUvcGlkLzc2NzM0OTg1L3QvMA/kv/." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://bat.bing.com/bat.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.googleadservices.com/pagead/conversion_async.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.googleadservices.com/pagead/conversion_async.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://d.turn.com/r/dft/id/L21rdC82ODUvcGlkLzc2NzM0OTg1L3QvMA." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://vt.myvisualiq.net/2/OGnQJy4nQG7dThqXAxTNoA%3D%3D/vt-74.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://d2oh4tlt9mrke9.cloudfront.net/Record/js/sessioncam.recorder.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "The resource at https://www.googletagmanager.com/gtag/js?id=DC-4235921 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://fls.doubleclick.net/json?spot=4235921&src=&var=s_3_Integrate_DFA_get_0&host=integrate.112.2o7.net%2Fdfa_echo%3Fvar%3Ds_3_Integrate_DFA_get_0%26AQE%3D1%26A2S%3D1&ord=1432172696337 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://dpm.demdex.net/id?d_visid_ver=2.4.0&d_fieldgroup=MC&d_rtbd=json&d_ver=2&d_orgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&d_nsid=0&d_mid=76383219896027699053678177471538551322&ts=1545544897940 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://dpm.demdex.net/id?d_visid_ver=2.4.0&d_fieldgroup=AAM&d_rtbd=json&d_ver=2&d_orgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&d_nsid=0&d_mid=76383219896027699053678177471538551322&ts=1545544897941 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.googletagmanager.com/gtag/js?id=DC-4235921." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://fls.doubleclick.net/json?spot=4235921&src=&var=s_3_Integrate_DFA_get_0&host=integrate.112.2o7.net%2Fdfa_echo%3Fvar%3Ds_3_Integrate_DFA_get_0%26AQE%3D1%26A2S%3D1&ord=1432172696337." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://dpm.demdex.net/id?d_visid_ver=2.4.0&d_fieldgroup=MC&d_rtbd=json&d_ver=2&d_orgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&d_nsid=0&d_mid=76383219896027699053678177471538551322&ts=1545544897940. (Reason: CORS request did not succeed)."]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://dpm.demdex.net/id?d_visid_ver=2.4.0&d_fieldgroup=AAM&d_rtbd=json&d_ver=2&d_orgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&d_nsid=0&d_mid=76383219896027699053678177471538551322&ts=1545544897941. (Reason: CORS request did not succeed)."]', u'[JavaScript Warning: "The resource at https://www.google-analytics.com/analytics.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://hisnakiamotors.d2.sc.omtrdc.net/b/ss/hkmkiatier1prod/1/JS-2.4.0/s69342046975041?AQB=1&ndh=1&pf=1&t=22%2F11%2F2018%2023%3A1%3A39%206%20420&sdid=4416767A79572AA3-2650F29B6C4170C3&mid=76383219896027699053678177471538551322&ce=UTF-8&g=https%3A%2F%2Fwww.kia.com%2Fus%2Fen%2Fhome&c.&language=en&page_name=home&layout_type=mobile&tealium_library_version=4.44.0&site_section=home&fire_floodlight=false&adobe_analytics_rsid=hkmkiatier1prod&page_url=https%3A%2F%2Fwww.kia.com%2Fus%2Fen%2Fhome&document_title=Cars%2C%20SUVs%2C%20Hybrids%2C%20Minivans%20%26%20Crossovers%20%7C%20Kia&page_load_time=34&utag_profile=kiatier1&utag_version=ut4.44.201811061502&utag_environment=prod&utag_event=view&utag_root_domain=kia.com&tier=t1&time_part=10%3A01%20pm%7Csaturday&visit_number=1&v_api=visitorapi%20present&mcid=76383219896027699053678177471538551322&.c&cc=USD&ch=home&events=event59&h1=home&s=414x795&c=24&j=1.6&v=N&k=Y&bw=414&bh=739&mcorgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&AQE=1 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]']
</pre>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.kia.com - site is not usable - <!-- @browser: Firefox Mobile 65.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:65.0) Gecko/65.0 Firefox/65.0 -->
<!-- @reported_with: mobile-reporter -->
**URL**: https://www.kia.com/us/en/home
**Browser / Version**: Firefox Mobile 65.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: Website doesn't load
**Steps to Reproduce**:
Website doesn't load, remains blank
[](https://webcompat.com/uploads/2018/12/ec700430-3ca8-43ff-bcd7-34693683cd48.jpeg)
<details>
<summary>Browser Configuration</summary>
<ul>
<li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20181217180946</li><li>tracking content blocked: true (strict)</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: true</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: beta</li>
</ul>
<p>Console Messages:</p>
<pre>
[u'[JavaScript Warning: "The resource at https://maps.google.com/maps/api/js?client=gme-hyundaimotorcompany1&v=3®ion=kr&language=en was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://www.youtube.com/iframe_api was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://www.googleadservices.com/pagead/conversion.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://maps.google.com/maps/api/js?client=gme-hyundaimotorcompany1&v=3®ion=kr&language=en was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://tag.contactatonce.com/tag/tag.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://maps.google.com/maps/api/js?client=gme-hyundaimotorcompany1&v=3®ion=kr&language=en." {file: "https://www.kia.com/us/en/home" line: 1841}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://tag.contactatonce.com/tag/tag.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Error: "ReferenceError: google is not defined" {file: "https://www.kia.com/us/k4/scripts/vendor.js" line: 19}]\n@https://www.kia.com/us/k4/scripts/vendor.js:19:11956\n', u'[JavaScript Error: "TypeError: THREE is undefined" {file: "https://www.kia.com/us/k4/scripts/k4-libs.js" line: 9}]\n@https://www.kia.com/us/k4/scripts/k4-libs.js:9:12160\n', u'[JavaScript Warning: "The resource at https://www.youtube.com/iframe_api was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.youtube.com/iframe_api." {file: "https://www.kia.com/us/en/home" line: 1868}]', u'[JavaScript Warning: "The resource at https://www.googleadservices.com/pagead/conversion.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.googleadservices.com/pagead/conversion.js." {file: "https://www.kia.com/us/en/home" line: 1880}]', u'[JavaScript Error: "Error: [$injector:modulerr] Failed to instantiate module kia4 due to:\n[$injector:modulerr] Failed to instantiate module ngSanitize due to:\n[$injector:nomod] Module \'ngSanitize\' is not available! You either misspelled the module name or forgot to load it. If registering a module ensure that you specify the dependencies as the second argument.\nhttp://errors.angularjs.org/1.3.15/$injector/nomod?p0=ngSanitize\nd/<@https://www.kia.com/us/k4/scripts/vendor.js:12:12551\nka/</</<@https://www.kia.com/us/k4/scripts/vendor.js:12:20897\nb@https://www.kia.com/us/k4/scripts/vendor.js:12:20469\nka/</<@https://www.kia.com/us/k4/scripts/vendor.js:12:20781\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29215\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29232\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nSa@https://www.kia.com/us/k4/scripts/vendor.js:12:30753\nh@https://www.kia.com/us/k4/scripts/vendor.js:12:18582\n_@https://www.kia.com/us/k4/scripts/vendor.js:12:18892\n$@https://www.kia.com/us/k4/scripts/vendor.js:12:18141\n@https://www.kia.com/us/k4/scripts/vendor.js:16:16490\nk@https://www.kia.com/us/k4/scripts/vendor.js:2:10435\nfireWith@https://www.kia.com/us/k4/scripts/vendor.js:2:11252\nready@https://www.kia.com/us/k4/scripts/vendor.js:2:13045\ng@https://www.kia.com/us/k4/scripts/vendor.js:1:7876\n\nhttp://errors.angularjs.org/1.3.15/$injector/modulerr?p0=ngSanitize&p1=%5B%24injector%3Anomod%5D%20Module%20\'ngSanitize\'%20is%20not%20available!%20You%20either%20misspelled%20the%20module%20name%20or%20forgot%20to%20load%20it.%20If%20registering%20a%20module%20ensure%20that%20you%20specify%20the%20dependencies%20as%20the%20second%20argument.%0Ahttp%3A%2F%2Ferrors.angularjs.org%2F1.3.15%2F%24injector%2Fnomod%3Fp0%3DngSanitize%0Ad%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12551%0Aka%2F%3C%2F%3C%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20897%0Ab%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20469%0Aka%2F%3C%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20781%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29215%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29232%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0ASa%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A30753%0Ah%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18582%0A_%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18892%0A%24%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18141%0A%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A16%3A16490%0Ak%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A10435%0AfireWith%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A11252%0Aready%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A13045%0Ag%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A1%3A7876%0A\nd/<@https://www.kia.com/us/k4/scripts/vendor.js:12:12551\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29491\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29232\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nSa@https://www.kia.com/us/k4/scripts/vendor.js:12:30753\nh@https://www.kia.com/us/k4/scripts/vendor.js:12:18582\n_@https://www.kia.com/us/k4/scripts/vendor.js:12:18892\n$@https://www.kia.com/us/k4/scripts/vendor.js:12:18141\n@https://www.kia.com/us/k4/scripts/vendor.js:16:16490\nk@https://www.kia.com/us/k4/scripts/vendor.js:2:10435\nfireWith@https://www.kia.com/us/k4/scripts/vendor.js:2:11252\nready@https://www.kia.com/us/k4/scripts/vendor.js:2:13045\ng@https://www.kia.com/us/k4/scripts/vendor.js:1:7876\n\nhttp://errors.angularjs.org/1.3.15/$injector/modulerr?p0=kia4&p1=%5B%24injector%3Amodulerr%5D%20Failed%20to%20instantiate%20module%20ngSanitize%20due%20to%3A%0A%5B%24injector%3Anomod%5D%20Module%20\'ngSanitize\'%20is%20not%20available!%20You%20either%20misspelled%20the%20module%20name%20or%20forgot%20to%20load%20it.%20If%20registering%20a%20module%20ensure%20that%20you%20specify%20the%20dependencies%20as%20the%20second%20argument.%0Ahttp%3A%2F%2Ferrors.angularjs.org%2F1.3.15%2F%24injector%2Fnomod%3Fp0%3DngSanitize%0Ad%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12551%0Aka%2F%3C%2F%3C%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20897%0Ab%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20469%0Aka%2F%3C%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A20781%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29215%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29232%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0ASa%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A30753%0Ah%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18582%0A_%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18892%0A%24%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18141%0A%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A16%3A16490%0Ak%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A10435%0AfireWith%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A11252%0Aready%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A13045%0Ag%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A1%3A7876%0A%0Ahttp%3A%2F%2Ferrors.angularjs.org%2F1.3.15%2F%24injector%2Fmodulerr%3Fp0%3DngSanitize%26p1%3D%255B%2524injector%253Anomod%255D%2520Module%2520\'ngSanitize\'%2520is%2520not%2520available!%2520You%2520either%2520misspelled%2520the%2520module%2520name%2520or%2520forgot%2520to%2520load%2520it.%2520If%2520registering%2520a%2520module%2520ensure%2520that%2520you%2520specify%2520the%2520dependencies%2520as%2520the%2520second%2520argument.%250Ahttp%253A%252F%252Ferrors.angularjs.org%252F1.3.15%252F%2524injector%252Fnomod%253Fp0%253DngSanitize%250Ad%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A12551%250Aka%252F%253C%252F%253C%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A20897%250Ab%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A20469%250Aka%252F%253C%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A20781%250An%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A29215%250Af%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A12934%250An%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A29063%250An%252F%253C%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A29232%250Af%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A12934%250An%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A29063%250ASa%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A30753%250Ah%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A18582%250A_%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A18892%250A%2524%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A12%253A18141%250A%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A16%253A16490%250Ak%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A2%253A10435%250AfireWith%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A2%253A11252%250Aready%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A2%253A13045%250Ag%2540https%253A%252F%252Fwww.kia.com%252Fus%252Fk4%252Fscripts%252Fvendor.js%253A1%253A7876%250A%0Ad%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12551%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29491%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0An%2F%3C%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29232%0Af%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A12934%0An%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A29063%0ASa%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A30753%0Ah%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18582%0A_%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18892%0A%24%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A12%3A18141%0A%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A16%3A16490%0Ak%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A10435%0AfireWith%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A11252%0Aready%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A2%3A13045%0Ag%40https%3A%2F%2Fwww.kia.com%2Fus%2Fk4%2Fscripts%2Fvendor.js%3A1%3A7876%0A" {file: "https://www.kia.com/us/k4/scripts/vendor.js" line: 12}]\nd/<@https://www.kia.com/us/k4/scripts/vendor.js:12:12551\nn/<@https://www.kia.com/us/k4/scripts/vendor.js:12:29491\nf@https://www.kia.com/us/k4/scripts/vendor.js:12:12934\nn@https://www.kia.com/us/k4/scripts/vendor.js:12:29063\nSa@https://www.kia.com/us/k4/scripts/vendor.js:12:30753\nh@https://www.kia.com/us/k4/scripts/vendor.js:12:18582\n_@https://www.kia.com/us/k4/scripts/vendor.js:12:18892\n$@https://www.kia.com/us/k4/scripts/vendor.js:12:18141\n@https://www.kia.com/us/k4/scripts/vendor.js:16:16490\nk@https://www.kia.com/us/k4/scripts/vendor.js:2:10435\nfireWith@https://www.kia.com/us/k4/scripts/vendor.js:2:11252\nready@https://www.kia.com/us/k4/scripts/vendor.js:2:13045\ng@https://www.kia.com/us/k4/scripts/vendor.js:1:7876\n', u'[JavaScript Warning: "The resource at https://static.ads-twitter.com/uwt.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://connect.facebook.net/en_US/fbevents.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://d.turn.com/r/dd/id/L21rdC8yNTcvY2lkLzI0OTE3MTg2L3QvMA/dpuid/L21rdC82ODUvcGlkLzc2NzM0OTg1L3QvMA/kv/ was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://bat.bing.com/bat.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://www.googleadservices.com/pagead/conversion_async.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://www.googleadservices.com/pagead/conversion_async.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://d.turn.com/r/dft/id/L21rdC82ODUvcGlkLzc2NzM0OTg1L3QvMA was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://live.kia.carlabs.com/logcookies. (Reason: CORS header Access-Control-Allow-Origin missing)."]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://live.kia.carlabs.com/logcookies. (Reason: CORS request did not succeed)."]', u'[JavaScript Error: "TypeError: NetworkError when attempting to fetch resource."]', u'[JavaScript Warning: "Loading failed for the <script> with source https://static.ads-twitter.com/uwt.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://connect.facebook.net/en_US/fbevents.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://d.turn.com/r/dd/id/L21rdC8yNTcvY2lkLzI0OTE3MTg2L3QvMA/dpuid/L21rdC82ODUvcGlkLzc2NzM0OTg1L3QvMA/kv/." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://bat.bing.com/bat.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.googleadservices.com/pagead/conversion_async.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.googleadservices.com/pagead/conversion_async.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://d.turn.com/r/dft/id/L21rdC82ODUvcGlkLzc2NzM0OTg1L3QvMA." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://vt.myvisualiq.net/2/OGnQJy4nQG7dThqXAxTNoA%3D%3D/vt-74.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://d2oh4tlt9mrke9.cloudfront.net/Record/js/sessioncam.recorder.js." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "The resource at https://www.googletagmanager.com/gtag/js?id=DC-4235921 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://fls.doubleclick.net/json?spot=4235921&src=&var=s_3_Integrate_DFA_get_0&host=integrate.112.2o7.net%2Fdfa_echo%3Fvar%3Ds_3_Integrate_DFA_get_0%26AQE%3D1%26A2S%3D1&ord=1432172696337 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://dpm.demdex.net/id?d_visid_ver=2.4.0&d_fieldgroup=MC&d_rtbd=json&d_ver=2&d_orgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&d_nsid=0&d_mid=76383219896027699053678177471538551322&ts=1545544897940 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://dpm.demdex.net/id?d_visid_ver=2.4.0&d_fieldgroup=AAM&d_rtbd=json&d_ver=2&d_orgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&d_nsid=0&d_mid=76383219896027699053678177471538551322&ts=1545544897941 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://www.googletagmanager.com/gtag/js?id=DC-4235921." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Loading failed for the <script> with source https://fls.doubleclick.net/json?spot=4235921&src=&var=s_3_Integrate_DFA_get_0&host=integrate.112.2o7.net%2Fdfa_echo%3Fvar%3Ds_3_Integrate_DFA_get_0%26AQE%3D1%26A2S%3D1&ord=1432172696337." {file: "https://www.kia.com/us/en/home" line: 1}]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://dpm.demdex.net/id?d_visid_ver=2.4.0&d_fieldgroup=MC&d_rtbd=json&d_ver=2&d_orgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&d_nsid=0&d_mid=76383219896027699053678177471538551322&ts=1545544897940. (Reason: CORS request did not succeed)."]', u'[JavaScript Warning: "Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://dpm.demdex.net/id?d_visid_ver=2.4.0&d_fieldgroup=AAM&d_rtbd=json&d_ver=2&d_orgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&d_nsid=0&d_mid=76383219896027699053678177471538551322&ts=1545544897941. (Reason: CORS request did not succeed)."]', u'[JavaScript Warning: "The resource at https://www.google-analytics.com/analytics.js was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]', u'[JavaScript Warning: "The resource at https://hisnakiamotors.d2.sc.omtrdc.net/b/ss/hkmkiatier1prod/1/JS-2.4.0/s69342046975041?AQB=1&ndh=1&pf=1&t=22%2F11%2F2018%2023%3A1%3A39%206%20420&sdid=4416767A79572AA3-2650F29B6C4170C3&mid=76383219896027699053678177471538551322&ce=UTF-8&g=https%3A%2F%2Fwww.kia.com%2Fus%2Fen%2Fhome&c.&language=en&page_name=home&layout_type=mobile&tealium_library_version=4.44.0&site_section=home&fire_floodlight=false&adobe_analytics_rsid=hkmkiatier1prod&page_url=https%3A%2F%2Fwww.kia.com%2Fus%2Fen%2Fhome&document_title=Cars%2C%20SUVs%2C%20Hybrids%2C%20Minivans%20%26%20Crossovers%20%7C%20Kia&page_load_time=34&utag_profile=kiatier1&utag_version=ut4.44.201811061502&utag_environment=prod&utag_event=view&utag_root_domain=kia.com&tier=t1&time_part=10%3A01%20pm%7Csaturday&visit_number=1&v_api=visitorapi%20present&mcid=76383219896027699053678177471538551322&.c&cc=USD&ch=home&events=event59&h1=home&s=414x795&c=24&j=1.6&v=N&k=Y&bw=414&bh=739&mcorgid=5288FC7C5A0DB1AD0A495DAA%40AdobeOrg&AQE=1 was blocked because content blocking is enabled." {file: "https://www.kia.com/us/en/home" line: 0}]']
</pre>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
site is not usable url browser version firefox mobile operating system android tested another browser yes problem type site is not usable description website doesn t load steps to reproduce website doesn t load remains blank browser configuration mixed active content blocked false image mem shared true buildid tracking content blocked true strict gfx webrender blob images true hastouchscreen true mixed passive content blocked false gfx webrender enabled false gfx webrender all false channel beta console messages u u u u u u u n u n u u u u u failed to instantiate module due to n failed to instantiate module ngsanitize due to n module ngsanitize is not available you either misspelled the module name or forgot to load it if registering a module ensure that you specify the dependencies as the second argument n file line nd u u u u u u u u u u u u u u u u u u u u u u u u u u u u u from with ❤️
| 0
|
816,810
| 30,613,351,612
|
IssuesEvent
|
2023-07-23 22:07:36
|
openmsupply/open-msupply
|
https://api.github.com/repos/openmsupply/open-msupply
|
closed
|
Report printing busy screen
|
programs Priority: Should have
|
Printing a programs report with lots of data can take a while. Currently, there is no indication that the printing is ongoing and you even can start the printing process multiple times.

- [x] Add loading modal when printing
- [x] Should not be possible to print other reports while a report is printing
- [ ] ~Being able to abort the request would be nice (if its not too difficult to implement)~
|
1.0
|
Report printing busy screen - Printing a programs report with lots of data can take a while. Currently, there is no indication that the printing is ongoing and you even can start the printing process multiple times.

- [x] Add loading modal when printing
- [x] Should not be possible to print other reports while a report is printing
- [ ] ~Being able to abort the request would be nice (if its not too difficult to implement)~
|
non_process
|
report printing busy screen printing a programs report with lots of data can take a while currently there is no indication that the printing is ongoing and you even can start the printing process multiple times add loading modal when printing should not be possible to print other reports while a report is printing being able to abort the request would be nice if its not too difficult to implement
| 0
|
51,840
| 13,651,454,125
|
IssuesEvent
|
2020-09-27 01:18:13
|
ignatandrei/testData
|
https://api.github.com/repos/ignatandrei/testData
|
closed
|
CVE-2011-4969 (Medium) detected in jquery-1.4.4.min.js
|
security vulnerability
|
## CVE-2011-4969 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.4.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js</a></p>
<p>Path to dependency file: testData/src/saveBox/node_modules/selenium-webdriver/lib/test/data/selectableItems.html</p>
<p>Path to vulnerable library: testData/src/saveBox/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.4.4.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ignatandrei/testData/commit/1d6e271d4be63f8313319214d19ed924c533c7d9">1d6e271d4be63f8313319214d19ed924c533c7d9</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Cross-site scripting (XSS) vulnerability in jQuery before 1.6.3, when using location.hash to select elements, allows remote attackers to inject arbitrary web script or HTML via a crafted tag.
<p>Publish Date: 2013-03-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2011-4969>CVE-2011-4969</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2011-4969">https://nvd.nist.gov/vuln/detail/CVE-2011-4969</a></p>
<p>Release Date: 2013-03-08</p>
<p>Fix Resolution: 1.6.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2011-4969 (Medium) detected in jquery-1.4.4.min.js - ## CVE-2011-4969 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.4.4.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.4.4/jquery.min.js</a></p>
<p>Path to dependency file: testData/src/saveBox/node_modules/selenium-webdriver/lib/test/data/selectableItems.html</p>
<p>Path to vulnerable library: testData/src/saveBox/node_modules/selenium-webdriver/lib/test/data/js/jquery-1.4.4.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.4.4.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ignatandrei/testData/commit/1d6e271d4be63f8313319214d19ed924c533c7d9">1d6e271d4be63f8313319214d19ed924c533c7d9</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Cross-site scripting (XSS) vulnerability in jQuery before 1.6.3, when using location.hash to select elements, allows remote attackers to inject arbitrary web script or HTML via a crafted tag.
<p>Publish Date: 2013-03-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2011-4969>CVE-2011-4969</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2011-4969">https://nvd.nist.gov/vuln/detail/CVE-2011-4969</a></p>
<p>Release Date: 2013-03-08</p>
<p>Fix Resolution: 1.6.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file testdata src savebox node modules selenium webdriver lib test data selectableitems html path to vulnerable library testdata src savebox node modules selenium webdriver lib test data js jquery min js dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details cross site scripting xss vulnerability in jquery before when using location hash to select elements allows remote attackers to inject arbitrary web script or html via a crafted tag publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
12,686
| 15,051,588,015
|
IssuesEvent
|
2021-02-03 14:17:05
|
pystatgen/sgkit
|
https://api.github.com/repos/pystatgen/sgkit
|
closed
|
Fix errors when using NumPy 1.20.0
|
process + tools
|
MyPy checks are failing with the new version of NumPy (1.20.0): https://github.com/pystatgen/sgkit/runs/1818875764?check_suite_focus=true
|
1.0
|
Fix errors when using NumPy 1.20.0 - MyPy checks are failing with the new version of NumPy (1.20.0): https://github.com/pystatgen/sgkit/runs/1818875764?check_suite_focus=true
|
process
|
fix errors when using numpy mypy checks are failing with the new version of numpy
| 1
|
108,667
| 9,320,932,088
|
IssuesEvent
|
2019-03-27 01:32:00
|
twatter-soen341/Twatter
|
https://api.github.com/repos/twatter-soen341/Twatter
|
closed
|
Acceptance Test for Search User Story #145
|
Acceptance Test Critical
|
## Acceptance Test
| User Story | `#145 ` |
| --: | :--|
| **Test Priority** | CRITICAL |
| **Test Title** | User can search for a Twat |
| **Description** | The search can provide suggested twats related to the search as well as showing results that match the search. |
| **Tester Name** | Alexandria Paggabao |
## Test Result
| Step | Test Step | Test Data | Expected Results | Acutal Results | Status | Notes |
| :--: | :-- | :-- | :-- | :-- | :--: | :-- |
| 1 | Click on the search icon found in the header | | The search icon expands to display the textarea | | | |
| 2 | Type something to search | | A dropdown menu appears with suggested results | | | |
| 3 | Press enter | | User is redirected to a search result page that displays all matching results| | | |
| 4 | Click on the search icon found in the header | | The search icon expands to display the textarea | | | |
| 5 | Type something to search | | A dropdown menu appears with suggested results | | | |
| 6 | Click on a suggested Twat result from the dropdown menu | | User should be redirected to the user profile of the user that shared the twat.| | | |
|
1.0
|
Acceptance Test for Search User Story #145 - ## Acceptance Test
| User Story | `#145 ` |
| --: | :--|
| **Test Priority** | CRITICAL |
| **Test Title** | User can search for a Twat |
| **Description** | The search can provide suggested twats related to the search as well as showing results that match the search. |
| **Tester Name** | Alexandria Paggabao |
## Test Result
| Step | Test Step | Test Data | Expected Results | Acutal Results | Status | Notes |
| :--: | :-- | :-- | :-- | :-- | :--: | :-- |
| 1 | Click on the search icon found in the header | | The search icon expands to display the textarea | | | |
| 2 | Type something to search | | A dropdown menu appears with suggested results | | | |
| 3 | Press enter | | User is redirected to a search result page that displays all matching results| | | |
| 4 | Click on the search icon found in the header | | The search icon expands to display the textarea | | | |
| 5 | Type something to search | | A dropdown menu appears with suggested results | | | |
| 6 | Click on a suggested Twat result from the dropdown menu | | User should be redirected to the user profile of the user that shared the twat.| | | |
|
non_process
|
acceptance test for search user story acceptance test user story test priority critical test title user can search for a twat description the search can provide suggested twats related to the search as well as showing results that match the search tester name alexandria paggabao test result step test step test data expected results acutal results status notes click on the search icon found in the header the search icon expands to display the textarea type something to search a dropdown menu appears with suggested results press enter user is redirected to a search result page that displays all matching results click on the search icon found in the header the search icon expands to display the textarea type something to search a dropdown menu appears with suggested results click on a suggested twat result from the dropdown menu user should be redirected to the user profile of the user that shared the twat
| 0
|
658,332
| 21,885,011,892
|
IssuesEvent
|
2022-05-19 17:42:44
|
monarch-initiative/mondo
|
https://api.github.com/repos/monarch-initiative/mondo
|
closed
|
Add RGD disease xrefs/equivalencies
|
low priority
|
I think these are essentially re-ID'd from the CTD-OMIM file, but not 100% certain of what has been added/changed since that file was generated.
ftp://ftp.rgd.mcw.edu/pub/ontology/disease/RDO.obo
ftp://ftp.rgd.mcw.edu/pub/ontology/disease/
|
1.0
|
Add RGD disease xrefs/equivalencies - I think these are essentially re-ID'd from the CTD-OMIM file, but not 100% certain of what has been added/changed since that file was generated.
ftp://ftp.rgd.mcw.edu/pub/ontology/disease/RDO.obo
ftp://ftp.rgd.mcw.edu/pub/ontology/disease/
|
non_process
|
add rgd disease xrefs equivalencies i think these are essentially re id d from the ctd omim file but not certain of what has been added changed since that file was generated ftp ftp rgd mcw edu pub ontology disease rdo obo ftp ftp rgd mcw edu pub ontology disease
| 0
|
15,594
| 19,720,409,104
|
IssuesEvent
|
2022-01-13 14:53:15
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
closed
|
global saturation - color balance rgb
|
scope: image processing
|
When increasing the global saturation using **color balance rgb** some colors are darkened / getting black.
Is this by design?
* darktable version : 3.9.0
* OS : Win10

|
1.0
|
global saturation - color balance rgb - When increasing the global saturation using **color balance rgb** some colors are darkened / getting black.
Is this by design?
* darktable version : 3.9.0
* OS : Win10

|
process
|
global saturation color balance rgb when increasing the global saturation using color balance rgb some colors are darkened getting black is this by design darktable version os
| 1
|
15,949
| 20,168,932,674
|
IssuesEvent
|
2022-02-10 08:35:55
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
opened
|
BigQuery incorrectly aliasing, which can make the query fail
|
Type:Bug Priority:P1 Querying/Processor Database/BigQuery Querying/Nested Queries .Regression
|
**Describe the bug**
BigQuery incorrectly aliasing, which can make the query fail.
**To Reproduce**
1. Admin > Data Model > **BigQuery** Sample > Products - rename to "Products Renamed"
2. Question > BigQuery Sample > Orders
3. Join "Products Renamed"
4. Custom Column `1 + 1` as "CC" (just to trigger nested query)
5. Filter "Products Renamed".Category=`Doohickey`

<details><summary>Full stacktrace</summary>
```
2022-02-10 09:33:17,223 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 43,
:started_at #t "2022-02-10T09:33:15.661477+01:00[Europe/Copenhagen]",
:via
[{:status :failed,
:class com.google.cloud.bigquery.BigQueryException,
:error "Unrecognized name: `Products Renamed__category` at [2:1960]",
:stacktrace
["com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:115)"
"com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.queryRpc(HttpBigQueryRpc.java:637)"
"com.google.cloud.bigquery.BigQueryImpl$34.call(BigQueryImpl.java:1255)"
"com.google.cloud.bigquery.BigQueryImpl$34.call(BigQueryImpl.java:1252)"
"com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)"
"com.google.cloud.RetryHelper.run(RetryHelper.java:76)"
"com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)"
"com.google.cloud.bigquery.BigQueryImpl.queryRpc(BigQueryImpl.java:1251)"
"com.google.cloud.bigquery.BigQueryImpl.query(BigQueryImpl.java:1239)"
"--> driver.bigquery_cloud_sdk$execute_bigquery$fn__89607.invoke(bigquery_cloud_sdk.clj:184)"]}
{:status :failed,
:class java.util.concurrent.ExecutionException,
:error "com.google.cloud.bigquery.BigQueryException: Unrecognized name: `Products Renamed__category` at [2:1960]",
:stacktrace
["java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)"
"java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)"
"clojure.core$deref_future.invokeStatic(core.clj:2304)"
"clojure.core$future_call$reify__8477.deref(core.clj:6976)"
"clojure.core$deref.invokeStatic(core.clj:2324)"
"clojure.core$deref.invoke(core.clj:2310)"
"--> driver.bigquery_cloud_sdk$execute_bigquery.invokeStatic(bigquery_cloud_sdk.clj:172)"
"driver.bigquery_cloud_sdk$execute_bigquery.invoke(bigquery_cloud_sdk.clj:168)"
"driver.bigquery_cloud_sdk$execute_bigquery_on_db.invokeStatic(bigquery_cloud_sdk.clj:210)"
"driver.bigquery_cloud_sdk$execute_bigquery_on_db.invoke(bigquery_cloud_sdk.clj:208)"
"driver.bigquery_cloud_sdk$process_native_STAR_$thunk__89708.invoke(bigquery_cloud_sdk.clj:270)"
"driver.bigquery_cloud_sdk$process_native_STAR_.invokeStatic(bigquery_cloud_sdk.clj:278)"
"driver.bigquery_cloud_sdk$process_native_STAR_.invoke(bigquery_cloud_sdk.clj:262)"
"driver.bigquery_cloud_sdk$fn__89714.invokeStatic(bigquery_cloud_sdk.clj:298)"
"driver.bigquery_cloud_sdk$fn__89714.invoke(bigquery_cloud_sdk.clj:290)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__49659.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__50405.invoke(check_features.clj:42)"
"query_processor.middleware.limit$limit$fn__48032.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__50788.invoke(cache.clj:204)"
"query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__51821.invoke(optimize_temporal_filters.clj:204)"
"query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__51865.invoke(validate_temporal_bucketing.clj:50)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__49716.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__39821.invoke(wrap_value_literals.clj:161)"
"query_processor.middleware.annotate$add_column_info$fn__44587.invoke(annotate.clj:659)"
"query_processor.middleware.permissions$check_query_permissions$fn__46329.invoke(permissions.clj:108)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__51010.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46725.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.visualization_settings$update_viz_settings$fn__46663.invoke(visualization_settings.clj:63)"
"query_processor.middleware.fix_bad_references$fix_bad_references_middleware$fn__50975.invoke(fix_bad_references.clj:91)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__47610.invoke(resolve_joined_fields.clj:111)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__51591.invoke(resolve_joins.clj:176)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__51133.invoke(add_implicit_joins.clj:202)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47629.invoke(large_int_id.clj:59)"
"query_processor.middleware.format_rows$format_rows$fn__51185.invoke(format_rows.clj:74)"
"query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__46997.invoke(add_default_temporal_unit.clj:23)"
"query_processor.middleware.desugar$desugar$fn__46636.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__39550.invoke(binning.clj:229)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__45971.invoke(resolve_fields.clj:34)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__50342.invoke(add_dimension_projections.clj:487)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__50636.invoke(add_implicit_clauses.clj:164)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__48017.invoke(upgrade_field_literals.clj:117)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__47374.invoke(add_source_metadata.clj:125)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__50887.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__48974.invoke(auto_bucket_datetimes.clj:147)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__45952.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__48628.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46025.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__52249.invoke(expand_macros.clj:184)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__48407.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__51200.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__50647$fn__50652.invoke(resolve_database_and_driver.clj:35)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__50647.invoke(resolve_database_and_driver.clj:34)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46571.invoke(fetch_source_query.clj:286)"
"query_processor.middleware.store$initialize_store$fn__46762$fn__46763.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__46762.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__50982.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__50989.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__48353.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49645.invoke(results_metadata.clj:82)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__48371.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__50923.invoke(process_userland_query.clj:146)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__51280.invoke(catch_exceptions.clj:169)"
"query_processor.reducible$async_qp$qp_STAR___43323$thunk__43324.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___43323.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___43332$fn__43335.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___43332.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:245)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:241)"
"query_processor$fn__52297$process_query_and_save_execution_BANG___52306$fn__52309.invoke(query_processor.clj:256)"
"query_processor$fn__52297$process_query_and_save_execution_BANG___52306.invoke(query_processor.clj:249)"
"query_processor$fn__52341$process_query_and_save_with_max_results_constraints_BANG___52350$fn__52353.invoke(query_processor.clj:268)"
"query_processor$fn__52341$process_query_and_save_with_max_results_constraints_BANG___52350.invoke(query_processor.clj:261)"
"api.dataset$run_query_async$fn__65276.invoke(dataset.clj:69)"
"query_processor.streaming$streaming_response_STAR_$fn__38459$fn__38460.invoke(streaming.clj:162)"
"query_processor.streaming$streaming_response_STAR_$fn__38459.invoke(streaming.clj:161)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$task__26889.invoke(streaming_response.clj:84)"]}
{:status :failed,
:class clojure.lang.ExceptionInfo,
:error "Error executing query",
:stacktrace
["--> driver.bigquery_cloud_sdk$throw_invalid_query.invokeStatic(bigquery_cloud_sdk.clj:164)"
"driver.bigquery_cloud_sdk$throw_invalid_query.invoke(bigquery_cloud_sdk.clj:163)"
"driver.bigquery_cloud_sdk$execute_bigquery.invokeStatic(bigquery_cloud_sdk.clj:206)"
"driver.bigquery_cloud_sdk$execute_bigquery.invoke(bigquery_cloud_sdk.clj:168)"
"driver.bigquery_cloud_sdk$execute_bigquery_on_db.invokeStatic(bigquery_cloud_sdk.clj:210)"
"driver.bigquery_cloud_sdk$execute_bigquery_on_db.invoke(bigquery_cloud_sdk.clj:208)"
"driver.bigquery_cloud_sdk$process_native_STAR_$thunk__89708.invoke(bigquery_cloud_sdk.clj:270)"
"driver.bigquery_cloud_sdk$process_native_STAR_.invokeStatic(bigquery_cloud_sdk.clj:278)"
"driver.bigquery_cloud_sdk$process_native_STAR_.invoke(bigquery_cloud_sdk.clj:262)"
"driver.bigquery_cloud_sdk$fn__89714.invokeStatic(bigquery_cloud_sdk.clj:298)"
"driver.bigquery_cloud_sdk$fn__89714.invoke(bigquery_cloud_sdk.clj:290)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__49659.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__50405.invoke(check_features.clj:42)"
"query_processor.middleware.limit$limit$fn__48032.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__50788.invoke(cache.clj:204)"
"query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__51821.invoke(optimize_temporal_filters.clj:204)"
"query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__51865.invoke(validate_temporal_bucketing.clj:50)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__49716.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__39821.invoke(wrap_value_literals.clj:161)"
"query_processor.middleware.annotate$add_column_info$fn__44587.invoke(annotate.clj:659)"
"query_processor.middleware.permissions$check_query_permissions$fn__46329.invoke(permissions.clj:108)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__51010.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46725.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.visualization_settings$update_viz_settings$fn__46663.invoke(visualization_settings.clj:63)"
"query_processor.middleware.fix_bad_references$fix_bad_references_middleware$fn__50975.invoke(fix_bad_references.clj:91)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__47610.invoke(resolve_joined_fields.clj:111)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__51591.invoke(resolve_joins.clj:176)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__51133.invoke(add_implicit_joins.clj:202)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47629.invoke(large_int_id.clj:59)"
"query_processor.middleware.format_rows$format_rows$fn__51185.invoke(format_rows.clj:74)"
"query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__46997.invoke(add_default_temporal_unit.clj:23)"
"query_processor.middleware.desugar$desugar$fn__46636.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__39550.invoke(binning.clj:229)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__45971.invoke(resolve_fields.clj:34)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__50342.invoke(add_dimension_projections.clj:487)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__50636.invoke(add_implicit_clauses.clj:164)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__48017.invoke(upgrade_field_literals.clj:117)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__47374.invoke(add_source_metadata.clj:125)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__50887.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__48974.invoke(auto_bucket_datetimes.clj:147)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__45952.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__48628.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46025.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__52249.invoke(expand_macros.clj:184)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__48407.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__51200.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__50647$fn__50652.invoke(resolve_database_and_driver.clj:35)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__50647.invoke(resolve_database_and_driver.clj:34)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46571.invoke(fetch_source_query.clj:286)"
"query_processor.middleware.store$initialize_store$fn__46762$fn__46763.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__46762.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__50982.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__50989.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__48353.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49645.invoke(results_metadata.clj:82)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__48371.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__50923.invoke(process_userland_query.clj:146)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__51280.invoke(catch_exceptions.clj:169)"
"query_processor.reducible$async_qp$qp_STAR___43323$thunk__43324.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___43323.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___43332$fn__43335.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___43332.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:245)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:241)"
"query_processor$fn__52297$process_query_and_save_execution_BANG___52306$fn__52309.invoke(query_processor.clj:256)"
"query_processor$fn__52297$process_query_and_save_execution_BANG___52306.invoke(query_processor.clj:249)"
"query_processor$fn__52341$process_query_and_save_with_max_results_constraints_BANG___52350$fn__52353.invoke(query_processor.clj:268)"
"query_processor$fn__52341$process_query_and_save_with_max_results_constraints_BANG___52350.invoke(query_processor.clj:261)"
"api.dataset$run_query_async$fn__65276.invoke(dataset.clj:69)"
"query_processor.streaming$streaming_response_STAR_$fn__38459$fn__38460.invoke(streaming.clj:162)"
"query_processor.streaming$streaming_response_STAR_$fn__38459.invoke(streaming.clj:161)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$task__26889.invoke(streaming_response.clj:84)"],
:error_type :invalid-query,
:ex-data
{:type :invalid-query,
:sql
"-- Metabase:: userID: 1 queryType: MBQL queryHash: d33e9441d379c580b218903cbfa36e917fa23a05c40b74c96503004b8205b9c2\nSELECT `id` AS `id`, `user_id` AS `user_id`, `product_id` AS `product_id`, `subtotal` AS `subtotal`, `tax` AS `tax`, `total` AS `total`, `discount` AS `discount`, `created_at` AS `created_at`, `quantity` AS `quantity`, `CC` AS `CC`, `Products Renamed__id` AS `Products_Renamed__id_4408b1be`, `Products Renamed__ean` AS `Products_Renamed__ean_b48361a7`, `Products Renamed__title` AS `Products_Renamed__title_325dad00`, `Products Renamed__category` AS `Products_Renamed__category_1f39b42c`, `Products Renamed__vendor` AS `Products_Renamed__vendor_cb8bd978`, `Products Renamed__price` AS `Products_Renamed__price_1ab6323c`, `Products Renamed__rating` AS `Products_Renamed__rating_3eb4d2e2`, `Products Renamed__created_at` AS `Products_Renamed__created_at_8de4c02f` FROM (SELECT `v3_sample_dataset.orders`.`id` AS `id`, `v3_sample_dataset.orders`.`user_id` AS `user_id`, `v3_sample_dataset.orders`.`product_id` AS `product_id`, `v3_sample_dataset.orders`.`subtotal` AS `subtotal`, `v3_sample_dataset.orders`.`tax` AS `tax`, `v3_sample_dataset.orders`.`total` AS `total`, `v3_sample_dataset.orders`.`discount` AS `discount`, `v3_sample_dataset.orders`.`created_at` AS `created_at`, `v3_sample_dataset.orders`.`quantity` AS `quantity`, (1 + 1) AS `CC`, `Products Renamed`.`id` AS `Products_Renamed__id_4408b1be`, `Products Renamed`.`ean` AS `Products_Renamed__ean_b48361a7`, `Products Renamed`.`title` AS `Products_Renamed__title_325dad00`, `Products Renamed`.`category` AS `Products_Renamed__category_1f39b42c`, `Products Renamed`.`vendor` AS `Products_Renamed__vendor_cb8bd978`, `Products Renamed`.`price` AS `Products_Renamed__price_1ab6323c`, `Products Renamed`.`rating` AS `Products_Renamed__rating_3eb4d2e2`, `Products Renamed`.`created_at` AS `Products_Renamed__created_at_8de4c02f` FROM `v3_sample_dataset.orders` LEFT JOIN `v3_sample_dataset.products` `Products Renamed` ON `v3_sample_dataset.orders`.`product_id` = `Products Renamed`.`id`) `source` WHERE `Products Renamed__category` = ? LIMIT 2000",
:parameters ("Doohickey")}}],
:error_type :invalid-query,
:json_query
{:type "query",
:query
{:source-table 6554,
:joins
[{:fields "all",
:source-table 6552,
:condition ["=" ["field" 38264 nil] ["field" 38247 {:join-alias "Products Renamed"}]],
:alias "Products Renamed"}],
:filter ["=" ["field" 38241 {:join-alias "Products Renamed"}] "Doohickey"],
:expressions {:CC ["+" 1 1]}},
:database 43,
:parameters [],
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:native
{:query
"SELECT `id` AS `id`, `user_id` AS `user_id`, `product_id` AS `product_id`, `subtotal` AS `subtotal`, `tax` AS `tax`, `total` AS `total`, `discount` AS `discount`, `created_at` AS `created_at`, `quantity` AS `quantity`, `CC` AS `CC`, `Products Renamed__id` AS `Products_Renamed__id_4408b1be`, `Products Renamed__ean` AS `Products_Renamed__ean_b48361a7`, `Products Renamed__title` AS `Products_Renamed__title_325dad00`, `Products Renamed__category` AS `Products_Renamed__category_1f39b42c`, `Products Renamed__vendor` AS `Products_Renamed__vendor_cb8bd978`, `Products Renamed__price` AS `Products_Renamed__price_1ab6323c`, `Products Renamed__rating` AS `Products_Renamed__rating_3eb4d2e2`, `Products Renamed__created_at` AS `Products_Renamed__created_at_8de4c02f` FROM (SELECT `v3_sample_dataset.orders`.`id` AS `id`, `v3_sample_dataset.orders`.`user_id` AS `user_id`, `v3_sample_dataset.orders`.`product_id` AS `product_id`, `v3_sample_dataset.orders`.`subtotal` AS `subtotal`, `v3_sample_dataset.orders`.`tax` AS `tax`, `v3_sample_dataset.orders`.`total` AS `total`, `v3_sample_dataset.orders`.`discount` AS `discount`, `v3_sample_dataset.orders`.`created_at` AS `created_at`, `v3_sample_dataset.orders`.`quantity` AS `quantity`, (1 + 1) AS `CC`, `Products Renamed`.`id` AS `Products_Renamed__id_4408b1be`, `Products Renamed`.`ean` AS `Products_Renamed__ean_b48361a7`, `Products Renamed`.`title` AS `Products_Renamed__title_325dad00`, `Products Renamed`.`category` AS `Products_Renamed__category_1f39b42c`, `Products Renamed`.`vendor` AS `Products_Renamed__vendor_cb8bd978`, `Products Renamed`.`price` AS `Products_Renamed__price_1ab6323c`, `Products Renamed`.`rating` AS `Products_Renamed__rating_3eb4d2e2`, `Products Renamed`.`created_at` AS `Products_Renamed__created_at_8de4c02f` FROM `v3_sample_dataset.orders` LEFT JOIN `v3_sample_dataset.products` `Products Renamed` ON `v3_sample_dataset.orders`.`product_id` = `Products Renamed`.`id`) `source` WHERE `Products Renamed__category` = ? LIMIT 2000",
:params ("Doohickey"),
:table-name "orders",
:mbql? true},
:status :failed,
:class com.google.api.client.googleapis.json.GoogleJsonResponseException,
:stacktrace
["com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)"
"com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)"
"com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)"
"com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)"
"com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)"
"com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)"
"com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)"
"com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)"
"com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.queryRpc(HttpBigQueryRpc.java:635)"
"com.google.cloud.bigquery.BigQueryImpl$34.call(BigQueryImpl.java:1255)"
"com.google.cloud.bigquery.BigQueryImpl$34.call(BigQueryImpl.java:1252)"
"com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)"
"com.google.cloud.RetryHelper.run(RetryHelper.java:76)"
"com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)"
"com.google.cloud.bigquery.BigQueryImpl.queryRpc(BigQueryImpl.java:1251)"
"com.google.cloud.bigquery.BigQueryImpl.query(BigQueryImpl.java:1239)"
"--> driver.bigquery_cloud_sdk$execute_bigquery$fn__89607.invoke(bigquery_cloud_sdk.clj:184)"],
:card_id nil,
:context :ad-hoc,
:error
"400 Bad Request\nPOST https://www.googleapis.com/bigquery/v2/projects/metabase-bigquery-driver/queries\n{\n \"code\" : 400,\n \"errors\" : [ {\n \"domain\" : \"global\",\n \"location\" : \"q\",\n \"locationType\" : \"parameter\",\n \"message\" : \"Unrecognized name: `Products Renamed__category` at [2:1960]\",\n \"reason\" : \"invalidQuery\"\n } ],\n \"message\" : \"Unrecognized name: `Products Renamed__category` at [2:1960]\",\n \"status\" : \"INVALID_ARGUMENT\"\n}",
:row_count 0,
:running_time 0,
:preprocessed
{:type :query,
:query
{:source-table 6554,
:filter
[:=
[:field 38241 {:join-alias "Products Renamed"}]
[:value
"Doohickey"
{:base_type :type/Text,
:effective_type :type/Text,
:coercion_strategy nil,
:semantic_type :type/Category,
:database_type "STRING",
:name "category"}]],
:expressions {:CC [:+ 1 1]},
:fields
[[:field 38269 nil]
[:field 38267 nil]
[:field 38264 nil]
[:field 38265 nil]
[:field 38261 nil]
[:field 38263 nil]
[:field 38268 nil]
[:field 38266 {:temporal-unit :default}]
[:field 38262 nil]
[:expression "CC"]
[:field 38247 {:join-alias "Products Renamed"}]
[:field 38243 {:join-alias "Products Renamed"}]
[:field 38240 {:join-alias "Products Renamed"}]
[:field 38241 {:join-alias "Products Renamed"}]
[:field 38244 {:join-alias "Products Renamed"}]
[:field 38246 {:join-alias "Products Renamed"}]
[:field 38242 {:join-alias "Products Renamed"}]
[:field 38245 {:temporal-unit :default, :join-alias "Products Renamed"}]],
:joins
[{:strategy :left-join,
:fields
[[:field 38247 {:join-alias "Products Renamed"}]
[:field 38243 {:join-alias "Products Renamed"}]
[:field 38240 {:join-alias "Products Renamed"}]
[:field 38241 {:join-alias "Products Renamed"}]
[:field 38244 {:join-alias "Products Renamed"}]
[:field 38246 {:join-alias "Products Renamed"}]
[:field 38242 {:join-alias "Products Renamed"}]
[:field 38245 {:temporal-unit :default, :join-alias "Products Renamed"}]],
:source-table 6552,
:condition [:= [:field 38264 nil] [:field 38247 {:join-alias "Products Renamed"}]],
:alias "Products Renamed"}],
:limit 2000},
:database 43,
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true},
:info
{:executed-by 1,
:context :ad-hoc,
:nested? false,
:query-hash
[-45, 62, -108, 65, -45, 121, -59, -128, -78, 24, -112, 60, -65, -93, 110, -111, 127, -94, 58, 5, -60, 11, 116, -55,
101, 3, 0, 75, -126, 5, -71, -62]},
:constraints {:max-results 10000, :max-results-bare-rows 2000}},
:data {:rows [], :cols []}}
2022-02-10 09:33:17,237 DEBUG middleware.log :: POST /api/dataset 202 [ASYNC: completed] 1.6 s (22 DB calls) App DB connections: 0/15 Jetty threads: 3/50 (3 idle, 0 queued) (111 total active threads) Queries in flight: 0 (0 queued)
```
</details>
**Information about your Metabase Installation:**
0.42.0 - works on 0.41.6
|
1.0
|
BigQuery incorrectly aliasing, which can make the query fail - **Describe the bug**
BigQuery incorrectly aliasing, which can make the query fail.
**To Reproduce**
1. Admin > Data Model > **BigQuery** Sample > Products - rename to "Products Renamed"
2. Question > BigQuery Sample > Orders
3. Join "Products Renamed"
4. Custom Column `1 + 1` as "CC" (just to trigger nested query)
5. Filter "Products Renamed".Category=`Doohickey`

<details><summary>Full stacktrace</summary>
```
2022-02-10 09:33:17,223 ERROR middleware.catch-exceptions :: Error processing query: null
{:database_id 43,
:started_at #t "2022-02-10T09:33:15.661477+01:00[Europe/Copenhagen]",
:via
[{:status :failed,
:class com.google.cloud.bigquery.BigQueryException,
:error "Unrecognized name: `Products Renamed__category` at [2:1960]",
:stacktrace
["com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:115)"
"com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.queryRpc(HttpBigQueryRpc.java:637)"
"com.google.cloud.bigquery.BigQueryImpl$34.call(BigQueryImpl.java:1255)"
"com.google.cloud.bigquery.BigQueryImpl$34.call(BigQueryImpl.java:1252)"
"com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)"
"com.google.cloud.RetryHelper.run(RetryHelper.java:76)"
"com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)"
"com.google.cloud.bigquery.BigQueryImpl.queryRpc(BigQueryImpl.java:1251)"
"com.google.cloud.bigquery.BigQueryImpl.query(BigQueryImpl.java:1239)"
"--> driver.bigquery_cloud_sdk$execute_bigquery$fn__89607.invoke(bigquery_cloud_sdk.clj:184)"]}
{:status :failed,
:class java.util.concurrent.ExecutionException,
:error "com.google.cloud.bigquery.BigQueryException: Unrecognized name: `Products Renamed__category` at [2:1960]",
:stacktrace
["java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)"
"java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)"
"clojure.core$deref_future.invokeStatic(core.clj:2304)"
"clojure.core$future_call$reify__8477.deref(core.clj:6976)"
"clojure.core$deref.invokeStatic(core.clj:2324)"
"clojure.core$deref.invoke(core.clj:2310)"
"--> driver.bigquery_cloud_sdk$execute_bigquery.invokeStatic(bigquery_cloud_sdk.clj:172)"
"driver.bigquery_cloud_sdk$execute_bigquery.invoke(bigquery_cloud_sdk.clj:168)"
"driver.bigquery_cloud_sdk$execute_bigquery_on_db.invokeStatic(bigquery_cloud_sdk.clj:210)"
"driver.bigquery_cloud_sdk$execute_bigquery_on_db.invoke(bigquery_cloud_sdk.clj:208)"
"driver.bigquery_cloud_sdk$process_native_STAR_$thunk__89708.invoke(bigquery_cloud_sdk.clj:270)"
"driver.bigquery_cloud_sdk$process_native_STAR_.invokeStatic(bigquery_cloud_sdk.clj:278)"
"driver.bigquery_cloud_sdk$process_native_STAR_.invoke(bigquery_cloud_sdk.clj:262)"
"driver.bigquery_cloud_sdk$fn__89714.invokeStatic(bigquery_cloud_sdk.clj:298)"
"driver.bigquery_cloud_sdk$fn__89714.invoke(bigquery_cloud_sdk.clj:290)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__49659.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__50405.invoke(check_features.clj:42)"
"query_processor.middleware.limit$limit$fn__48032.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__50788.invoke(cache.clj:204)"
"query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__51821.invoke(optimize_temporal_filters.clj:204)"
"query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__51865.invoke(validate_temporal_bucketing.clj:50)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__49716.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__39821.invoke(wrap_value_literals.clj:161)"
"query_processor.middleware.annotate$add_column_info$fn__44587.invoke(annotate.clj:659)"
"query_processor.middleware.permissions$check_query_permissions$fn__46329.invoke(permissions.clj:108)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__51010.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46725.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.visualization_settings$update_viz_settings$fn__46663.invoke(visualization_settings.clj:63)"
"query_processor.middleware.fix_bad_references$fix_bad_references_middleware$fn__50975.invoke(fix_bad_references.clj:91)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__47610.invoke(resolve_joined_fields.clj:111)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__51591.invoke(resolve_joins.clj:176)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__51133.invoke(add_implicit_joins.clj:202)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47629.invoke(large_int_id.clj:59)"
"query_processor.middleware.format_rows$format_rows$fn__51185.invoke(format_rows.clj:74)"
"query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__46997.invoke(add_default_temporal_unit.clj:23)"
"query_processor.middleware.desugar$desugar$fn__46636.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__39550.invoke(binning.clj:229)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__45971.invoke(resolve_fields.clj:34)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__50342.invoke(add_dimension_projections.clj:487)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__50636.invoke(add_implicit_clauses.clj:164)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__48017.invoke(upgrade_field_literals.clj:117)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__47374.invoke(add_source_metadata.clj:125)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__50887.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__48974.invoke(auto_bucket_datetimes.clj:147)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__45952.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__48628.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46025.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__52249.invoke(expand_macros.clj:184)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__48407.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__51200.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__50647$fn__50652.invoke(resolve_database_and_driver.clj:35)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__50647.invoke(resolve_database_and_driver.clj:34)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46571.invoke(fetch_source_query.clj:286)"
"query_processor.middleware.store$initialize_store$fn__46762$fn__46763.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__46762.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__50982.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__50989.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__48353.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49645.invoke(results_metadata.clj:82)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__48371.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__50923.invoke(process_userland_query.clj:146)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__51280.invoke(catch_exceptions.clj:169)"
"query_processor.reducible$async_qp$qp_STAR___43323$thunk__43324.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___43323.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___43332$fn__43335.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___43332.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:245)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:241)"
"query_processor$fn__52297$process_query_and_save_execution_BANG___52306$fn__52309.invoke(query_processor.clj:256)"
"query_processor$fn__52297$process_query_and_save_execution_BANG___52306.invoke(query_processor.clj:249)"
"query_processor$fn__52341$process_query_and_save_with_max_results_constraints_BANG___52350$fn__52353.invoke(query_processor.clj:268)"
"query_processor$fn__52341$process_query_and_save_with_max_results_constraints_BANG___52350.invoke(query_processor.clj:261)"
"api.dataset$run_query_async$fn__65276.invoke(dataset.clj:69)"
"query_processor.streaming$streaming_response_STAR_$fn__38459$fn__38460.invoke(streaming.clj:162)"
"query_processor.streaming$streaming_response_STAR_$fn__38459.invoke(streaming.clj:161)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$task__26889.invoke(streaming_response.clj:84)"]}
{:status :failed,
:class clojure.lang.ExceptionInfo,
:error "Error executing query",
:stacktrace
["--> driver.bigquery_cloud_sdk$throw_invalid_query.invokeStatic(bigquery_cloud_sdk.clj:164)"
"driver.bigquery_cloud_sdk$throw_invalid_query.invoke(bigquery_cloud_sdk.clj:163)"
"driver.bigquery_cloud_sdk$execute_bigquery.invokeStatic(bigquery_cloud_sdk.clj:206)"
"driver.bigquery_cloud_sdk$execute_bigquery.invoke(bigquery_cloud_sdk.clj:168)"
"driver.bigquery_cloud_sdk$execute_bigquery_on_db.invokeStatic(bigquery_cloud_sdk.clj:210)"
"driver.bigquery_cloud_sdk$execute_bigquery_on_db.invoke(bigquery_cloud_sdk.clj:208)"
"driver.bigquery_cloud_sdk$process_native_STAR_$thunk__89708.invoke(bigquery_cloud_sdk.clj:270)"
"driver.bigquery_cloud_sdk$process_native_STAR_.invokeStatic(bigquery_cloud_sdk.clj:278)"
"driver.bigquery_cloud_sdk$process_native_STAR_.invoke(bigquery_cloud_sdk.clj:262)"
"driver.bigquery_cloud_sdk$fn__89714.invokeStatic(bigquery_cloud_sdk.clj:298)"
"driver.bigquery_cloud_sdk$fn__89714.invoke(bigquery_cloud_sdk.clj:290)"
"query_processor.context$executef.invokeStatic(context.clj:59)"
"query_processor.context$executef.invoke(context.clj:48)"
"query_processor.context.default$default_runf.invokeStatic(default.clj:68)"
"query_processor.context.default$default_runf.invoke(default.clj:66)"
"query_processor.context$runf.invokeStatic(context.clj:45)"
"query_processor.context$runf.invoke(context.clj:39)"
"query_processor.reducible$pivot.invokeStatic(reducible.clj:34)"
"query_processor.reducible$pivot.invoke(reducible.clj:31)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__49659.invoke(mbql_to_native.clj:25)"
"query_processor.middleware.check_features$check_features$fn__50405.invoke(check_features.clj:42)"
"query_processor.middleware.limit$limit$fn__48032.invoke(limit.clj:37)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__50788.invoke(cache.clj:204)"
"query_processor.middleware.optimize_temporal_filters$optimize_temporal_filters$fn__51821.invoke(optimize_temporal_filters.clj:204)"
"query_processor.middleware.validate_temporal_bucketing$validate_temporal_bucketing$fn__51865.invoke(validate_temporal_bucketing.clj:50)"
"query_processor.middleware.auto_parse_filter_values$auto_parse_filter_values$fn__49716.invoke(auto_parse_filter_values.clj:43)"
"query_processor.middleware.wrap_value_literals$wrap_value_literals$fn__39821.invoke(wrap_value_literals.clj:161)"
"query_processor.middleware.annotate$add_column_info$fn__44587.invoke(annotate.clj:659)"
"query_processor.middleware.permissions$check_query_permissions$fn__46329.invoke(permissions.clj:108)"
"query_processor.middleware.pre_alias_aggregations$pre_alias_aggregations$fn__51010.invoke(pre_alias_aggregations.clj:40)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__46725.invoke(cumulative_aggregations.clj:60)"
"query_processor.middleware.visualization_settings$update_viz_settings$fn__46663.invoke(visualization_settings.clj:63)"
"query_processor.middleware.fix_bad_references$fix_bad_references_middleware$fn__50975.invoke(fix_bad_references.clj:91)"
"query_processor.middleware.resolve_joined_fields$resolve_joined_fields$fn__47610.invoke(resolve_joined_fields.clj:111)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__51591.invoke(resolve_joins.clj:176)"
"query_processor.middleware.add_implicit_joins$add_implicit_joins$fn__51133.invoke(add_implicit_joins.clj:202)"
"query_processor.middleware.large_int_id$convert_id_to_string$fn__47629.invoke(large_int_id.clj:59)"
"query_processor.middleware.format_rows$format_rows$fn__51185.invoke(format_rows.clj:74)"
"query_processor.middleware.add_default_temporal_unit$add_default_temporal_unit$fn__46997.invoke(add_default_temporal_unit.clj:23)"
"query_processor.middleware.desugar$desugar$fn__46636.invoke(desugar.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__39550.invoke(binning.clj:229)"
"query_processor.middleware.resolve_fields$resolve_fields$fn__45971.invoke(resolve_fields.clj:34)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__50342.invoke(add_dimension_projections.clj:487)"
"query_processor.middleware.add_implicit_clauses$add_implicit_clauses$fn__50636.invoke(add_implicit_clauses.clj:164)"
"query_processor.middleware.upgrade_field_literals$upgrade_field_literals$fn__48017.invoke(upgrade_field_literals.clj:117)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__47374.invoke(add_source_metadata.clj:125)"
"query_processor.middleware.reconcile_breakout_and_order_by_bucketing$reconcile_breakout_and_order_by_bucketing$fn__50887.invoke(reconcile_breakout_and_order_by_bucketing.clj:100)"
"query_processor.middleware.auto_bucket_datetimes$auto_bucket_datetimes$fn__48974.invoke(auto_bucket_datetimes.clj:147)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__45952.invoke(resolve_source_table.clj:45)"
"query_processor.middleware.parameters$substitute_parameters$fn__48628.invoke(parameters.clj:111)"
"query_processor.middleware.resolve_referenced$resolve_referenced_card_resources$fn__46025.invoke(resolve_referenced.clj:79)"
"query_processor.middleware.expand_macros$expand_macros$fn__52249.invoke(expand_macros.clj:184)"
"query_processor.middleware.add_timezone_info$add_timezone_info$fn__48407.invoke(add_timezone_info.clj:15)"
"query_processor.middleware.splice_params_in_response$splice_params_in_response$fn__51200.invoke(splice_params_in_response.clj:32)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__50647$fn__50652.invoke(resolve_database_and_driver.clj:35)"
"driver$do_with_driver.invokeStatic(driver.clj:60)"
"driver$do_with_driver.invoke(driver.clj:56)"
"query_processor.middleware.resolve_database_and_driver$resolve_database_and_driver$fn__50647.invoke(resolve_database_and_driver.clj:34)"
"query_processor.middleware.fetch_source_query$resolve_card_id_source_tables$fn__46571.invoke(fetch_source_query.clj:286)"
"query_processor.middleware.store$initialize_store$fn__46762$fn__46763.invoke(store.clj:11)"
"query_processor.store$do_with_store.invokeStatic(store.clj:44)"
"query_processor.store$do_with_store.invoke(store.clj:38)"
"query_processor.middleware.store$initialize_store$fn__46762.invoke(store.clj:10)"
"query_processor.middleware.validate$validate_query$fn__50982.invoke(validate.clj:10)"
"query_processor.middleware.normalize_query$normalize$fn__50989.invoke(normalize_query.clj:22)"
"query_processor.middleware.add_rows_truncated$add_rows_truncated$fn__48353.invoke(add_rows_truncated.clj:35)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__49645.invoke(results_metadata.clj:82)"
"query_processor.middleware.constraints$add_default_userland_constraints$fn__48371.invoke(constraints.clj:42)"
"query_processor.middleware.process_userland_query$process_userland_query$fn__50923.invoke(process_userland_query.clj:146)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__51280.invoke(catch_exceptions.clj:169)"
"query_processor.reducible$async_qp$qp_STAR___43323$thunk__43324.invoke(reducible.clj:103)"
"query_processor.reducible$async_qp$qp_STAR___43323.invoke(reducible.clj:109)"
"query_processor.reducible$sync_qp$qp_STAR___43332$fn__43335.invoke(reducible.clj:135)"
"query_processor.reducible$sync_qp$qp_STAR___43332.invoke(reducible.clj:134)"
"query_processor$process_userland_query.invokeStatic(query_processor.clj:245)"
"query_processor$process_userland_query.doInvoke(query_processor.clj:241)"
"query_processor$fn__52297$process_query_and_save_execution_BANG___52306$fn__52309.invoke(query_processor.clj:256)"
"query_processor$fn__52297$process_query_and_save_execution_BANG___52306.invoke(query_processor.clj:249)"
"query_processor$fn__52341$process_query_and_save_with_max_results_constraints_BANG___52350$fn__52353.invoke(query_processor.clj:268)"
"query_processor$fn__52341$process_query_and_save_with_max_results_constraints_BANG___52350.invoke(query_processor.clj:261)"
"api.dataset$run_query_async$fn__65276.invoke(dataset.clj:69)"
"query_processor.streaming$streaming_response_STAR_$fn__38459$fn__38460.invoke(streaming.clj:162)"
"query_processor.streaming$streaming_response_STAR_$fn__38459.invoke(streaming.clj:161)"
"async.streaming_response$do_f_STAR_.invokeStatic(streaming_response.clj:65)"
"async.streaming_response$do_f_STAR_.invoke(streaming_response.clj:63)"
"async.streaming_response$do_f_async$task__26889.invoke(streaming_response.clj:84)"],
:error_type :invalid-query,
:ex-data
{:type :invalid-query,
:sql
"-- Metabase:: userID: 1 queryType: MBQL queryHash: d33e9441d379c580b218903cbfa36e917fa23a05c40b74c96503004b8205b9c2\nSELECT `id` AS `id`, `user_id` AS `user_id`, `product_id` AS `product_id`, `subtotal` AS `subtotal`, `tax` AS `tax`, `total` AS `total`, `discount` AS `discount`, `created_at` AS `created_at`, `quantity` AS `quantity`, `CC` AS `CC`, `Products Renamed__id` AS `Products_Renamed__id_4408b1be`, `Products Renamed__ean` AS `Products_Renamed__ean_b48361a7`, `Products Renamed__title` AS `Products_Renamed__title_325dad00`, `Products Renamed__category` AS `Products_Renamed__category_1f39b42c`, `Products Renamed__vendor` AS `Products_Renamed__vendor_cb8bd978`, `Products Renamed__price` AS `Products_Renamed__price_1ab6323c`, `Products Renamed__rating` AS `Products_Renamed__rating_3eb4d2e2`, `Products Renamed__created_at` AS `Products_Renamed__created_at_8de4c02f` FROM (SELECT `v3_sample_dataset.orders`.`id` AS `id`, `v3_sample_dataset.orders`.`user_id` AS `user_id`, `v3_sample_dataset.orders`.`product_id` AS `product_id`, `v3_sample_dataset.orders`.`subtotal` AS `subtotal`, `v3_sample_dataset.orders`.`tax` AS `tax`, `v3_sample_dataset.orders`.`total` AS `total`, `v3_sample_dataset.orders`.`discount` AS `discount`, `v3_sample_dataset.orders`.`created_at` AS `created_at`, `v3_sample_dataset.orders`.`quantity` AS `quantity`, (1 + 1) AS `CC`, `Products Renamed`.`id` AS `Products_Renamed__id_4408b1be`, `Products Renamed`.`ean` AS `Products_Renamed__ean_b48361a7`, `Products Renamed`.`title` AS `Products_Renamed__title_325dad00`, `Products Renamed`.`category` AS `Products_Renamed__category_1f39b42c`, `Products Renamed`.`vendor` AS `Products_Renamed__vendor_cb8bd978`, `Products Renamed`.`price` AS `Products_Renamed__price_1ab6323c`, `Products Renamed`.`rating` AS `Products_Renamed__rating_3eb4d2e2`, `Products Renamed`.`created_at` AS `Products_Renamed__created_at_8de4c02f` FROM `v3_sample_dataset.orders` LEFT JOIN `v3_sample_dataset.products` `Products Renamed` ON `v3_sample_dataset.orders`.`product_id` = `Products Renamed`.`id`) `source` WHERE `Products Renamed__category` = ? LIMIT 2000",
:parameters ("Doohickey")}}],
:error_type :invalid-query,
:json_query
{:type "query",
:query
{:source-table 6554,
:joins
[{:fields "all",
:source-table 6552,
:condition ["=" ["field" 38264 nil] ["field" 38247 {:join-alias "Products Renamed"}]],
:alias "Products Renamed"}],
:filter ["=" ["field" 38241 {:join-alias "Products Renamed"}] "Doohickey"],
:expressions {:CC ["+" 1 1]}},
:database 43,
:parameters [],
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true}},
:native
{:query
"SELECT `id` AS `id`, `user_id` AS `user_id`, `product_id` AS `product_id`, `subtotal` AS `subtotal`, `tax` AS `tax`, `total` AS `total`, `discount` AS `discount`, `created_at` AS `created_at`, `quantity` AS `quantity`, `CC` AS `CC`, `Products Renamed__id` AS `Products_Renamed__id_4408b1be`, `Products Renamed__ean` AS `Products_Renamed__ean_b48361a7`, `Products Renamed__title` AS `Products_Renamed__title_325dad00`, `Products Renamed__category` AS `Products_Renamed__category_1f39b42c`, `Products Renamed__vendor` AS `Products_Renamed__vendor_cb8bd978`, `Products Renamed__price` AS `Products_Renamed__price_1ab6323c`, `Products Renamed__rating` AS `Products_Renamed__rating_3eb4d2e2`, `Products Renamed__created_at` AS `Products_Renamed__created_at_8de4c02f` FROM (SELECT `v3_sample_dataset.orders`.`id` AS `id`, `v3_sample_dataset.orders`.`user_id` AS `user_id`, `v3_sample_dataset.orders`.`product_id` AS `product_id`, `v3_sample_dataset.orders`.`subtotal` AS `subtotal`, `v3_sample_dataset.orders`.`tax` AS `tax`, `v3_sample_dataset.orders`.`total` AS `total`, `v3_sample_dataset.orders`.`discount` AS `discount`, `v3_sample_dataset.orders`.`created_at` AS `created_at`, `v3_sample_dataset.orders`.`quantity` AS `quantity`, (1 + 1) AS `CC`, `Products Renamed`.`id` AS `Products_Renamed__id_4408b1be`, `Products Renamed`.`ean` AS `Products_Renamed__ean_b48361a7`, `Products Renamed`.`title` AS `Products_Renamed__title_325dad00`, `Products Renamed`.`category` AS `Products_Renamed__category_1f39b42c`, `Products Renamed`.`vendor` AS `Products_Renamed__vendor_cb8bd978`, `Products Renamed`.`price` AS `Products_Renamed__price_1ab6323c`, `Products Renamed`.`rating` AS `Products_Renamed__rating_3eb4d2e2`, `Products Renamed`.`created_at` AS `Products_Renamed__created_at_8de4c02f` FROM `v3_sample_dataset.orders` LEFT JOIN `v3_sample_dataset.products` `Products Renamed` ON `v3_sample_dataset.orders`.`product_id` = `Products Renamed`.`id`) `source` WHERE `Products Renamed__category` = ? LIMIT 2000",
:params ("Doohickey"),
:table-name "orders",
:mbql? true},
:status :failed,
:class com.google.api.client.googleapis.json.GoogleJsonResponseException,
:stacktrace
["com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)"
"com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)"
"com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)"
"com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)"
"com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)"
"com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)"
"com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)"
"com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)"
"com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.queryRpc(HttpBigQueryRpc.java:635)"
"com.google.cloud.bigquery.BigQueryImpl$34.call(BigQueryImpl.java:1255)"
"com.google.cloud.bigquery.BigQueryImpl$34.call(BigQueryImpl.java:1252)"
"com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)"
"com.google.cloud.RetryHelper.run(RetryHelper.java:76)"
"com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)"
"com.google.cloud.bigquery.BigQueryImpl.queryRpc(BigQueryImpl.java:1251)"
"com.google.cloud.bigquery.BigQueryImpl.query(BigQueryImpl.java:1239)"
"--> driver.bigquery_cloud_sdk$execute_bigquery$fn__89607.invoke(bigquery_cloud_sdk.clj:184)"],
:card_id nil,
:context :ad-hoc,
:error
"400 Bad Request\nPOST https://www.googleapis.com/bigquery/v2/projects/metabase-bigquery-driver/queries\n{\n \"code\" : 400,\n \"errors\" : [ {\n \"domain\" : \"global\",\n \"location\" : \"q\",\n \"locationType\" : \"parameter\",\n \"message\" : \"Unrecognized name: `Products Renamed__category` at [2:1960]\",\n \"reason\" : \"invalidQuery\"\n } ],\n \"message\" : \"Unrecognized name: `Products Renamed__category` at [2:1960]\",\n \"status\" : \"INVALID_ARGUMENT\"\n}",
:row_count 0,
:running_time 0,
:preprocessed
{:type :query,
:query
{:source-table 6554,
:filter
[:=
[:field 38241 {:join-alias "Products Renamed"}]
[:value
"Doohickey"
{:base_type :type/Text,
:effective_type :type/Text,
:coercion_strategy nil,
:semantic_type :type/Category,
:database_type "STRING",
:name "category"}]],
:expressions {:CC [:+ 1 1]},
:fields
[[:field 38269 nil]
[:field 38267 nil]
[:field 38264 nil]
[:field 38265 nil]
[:field 38261 nil]
[:field 38263 nil]
[:field 38268 nil]
[:field 38266 {:temporal-unit :default}]
[:field 38262 nil]
[:expression "CC"]
[:field 38247 {:join-alias "Products Renamed"}]
[:field 38243 {:join-alias "Products Renamed"}]
[:field 38240 {:join-alias "Products Renamed"}]
[:field 38241 {:join-alias "Products Renamed"}]
[:field 38244 {:join-alias "Products Renamed"}]
[:field 38246 {:join-alias "Products Renamed"}]
[:field 38242 {:join-alias "Products Renamed"}]
[:field 38245 {:temporal-unit :default, :join-alias "Products Renamed"}]],
:joins
[{:strategy :left-join,
:fields
[[:field 38247 {:join-alias "Products Renamed"}]
[:field 38243 {:join-alias "Products Renamed"}]
[:field 38240 {:join-alias "Products Renamed"}]
[:field 38241 {:join-alias "Products Renamed"}]
[:field 38244 {:join-alias "Products Renamed"}]
[:field 38246 {:join-alias "Products Renamed"}]
[:field 38242 {:join-alias "Products Renamed"}]
[:field 38245 {:temporal-unit :default, :join-alias "Products Renamed"}]],
:source-table 6552,
:condition [:= [:field 38264 nil] [:field 38247 {:join-alias "Products Renamed"}]],
:alias "Products Renamed"}],
:limit 2000},
:database 43,
:middleware {:js-int-to-string? true, :add-default-userland-constraints? true},
:info
{:executed-by 1,
:context :ad-hoc,
:nested? false,
:query-hash
[-45, 62, -108, 65, -45, 121, -59, -128, -78, 24, -112, 60, -65, -93, 110, -111, 127, -94, 58, 5, -60, 11, 116, -55,
101, 3, 0, 75, -126, 5, -71, -62]},
:constraints {:max-results 10000, :max-results-bare-rows 2000}},
:data {:rows [], :cols []}}
2022-02-10 09:33:17,237 DEBUG middleware.log :: POST /api/dataset 202 [ASYNC: completed] 1.6 s (22 DB calls) App DB connections: 0/15 Jetty threads: 3/50 (3 idle, 0 queued) (111 total active threads) Queries in flight: 0 (0 queued)
```
</details>
**Information about your Metabase Installation:**
0.42.0 - works on 0.41.6
|
process
|
bigquery incorrectly aliasing which can make the query fail describe the bug bigquery incorrectly aliasing which can make the query fail to reproduce admin data model bigquery sample products rename to products renamed question bigquery sample orders join products renamed custom column as cc just to trigger nested query filter products renamed category doohickey full stacktrace error middleware catch exceptions error processing query null database id started at t via status failed class com google cloud bigquery bigqueryexception error unrecognized name products renamed category at stacktrace com google cloud bigquery spi httpbigqueryrpc translate httpbigqueryrpc java com google cloud bigquery spi httpbigqueryrpc queryrpc httpbigqueryrpc java com google cloud bigquery bigqueryimpl call bigqueryimpl java com google cloud bigquery bigqueryimpl call bigqueryimpl java com google api gax retrying directretryingexecutor submit directretryingexecutor java com google cloud retryhelper run retryhelper java com google cloud retryhelper runwithretries retryhelper java com google cloud bigquery bigqueryimpl queryrpc bigqueryimpl java com google cloud bigquery bigqueryimpl query bigqueryimpl java driver bigquery cloud sdk execute bigquery fn invoke bigquery cloud sdk clj status failed class java util concurrent executionexception error com google cloud bigquery bigqueryexception unrecognized name products renamed category at stacktrace java base java util concurrent futuretask report futuretask java java base java util concurrent futuretask get futuretask java clojure core deref future invokestatic core clj clojure core future call reify deref core clj clojure core deref invokestatic core clj clojure core deref invoke core clj driver bigquery cloud sdk execute bigquery invokestatic bigquery cloud sdk clj driver bigquery cloud sdk execute bigquery invoke bigquery cloud sdk clj driver bigquery cloud sdk execute bigquery on db invokestatic bigquery cloud sdk clj driver bigquery cloud sdk execute bigquery on db invoke bigquery cloud sdk clj driver bigquery cloud sdk process native star thunk invoke bigquery cloud sdk clj driver bigquery cloud sdk process native star invokestatic bigquery cloud sdk clj driver bigquery cloud sdk process native star invoke bigquery cloud sdk clj driver bigquery cloud sdk fn invokestatic bigquery cloud sdk clj driver bigquery cloud sdk fn invoke bigquery cloud sdk clj query processor context executef invokestatic context clj query processor context executef invoke context clj query processor context default default runf invokestatic default clj query processor context default default runf invoke default clj query processor context runf invokestatic context clj query processor context runf invoke context clj query processor reducible pivot invokestatic reducible clj query processor reducible pivot invoke reducible clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware limit limit fn invoke limit clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware optimize temporal filters optimize temporal filters fn invoke optimize temporal filters clj query processor middleware validate temporal bucketing validate temporal bucketing fn invoke validate temporal bucketing clj query processor middleware auto parse filter values auto parse filter values fn invoke auto parse filter values clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware visualization settings update viz settings fn invoke visualization settings clj query processor middleware fix bad references fix bad references middleware fn invoke fix bad references clj query processor middleware resolve joined fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware add default temporal unit add default temporal unit fn invoke add default temporal unit clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware upgrade field literals upgrade field literals fn invoke upgrade field literals clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset run query async fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async task invoke streaming response clj status failed class clojure lang exceptioninfo error error executing query stacktrace driver bigquery cloud sdk throw invalid query invokestatic bigquery cloud sdk clj driver bigquery cloud sdk throw invalid query invoke bigquery cloud sdk clj driver bigquery cloud sdk execute bigquery invokestatic bigquery cloud sdk clj driver bigquery cloud sdk execute bigquery invoke bigquery cloud sdk clj driver bigquery cloud sdk execute bigquery on db invokestatic bigquery cloud sdk clj driver bigquery cloud sdk execute bigquery on db invoke bigquery cloud sdk clj driver bigquery cloud sdk process native star thunk invoke bigquery cloud sdk clj driver bigquery cloud sdk process native star invokestatic bigquery cloud sdk clj driver bigquery cloud sdk process native star invoke bigquery cloud sdk clj driver bigquery cloud sdk fn invokestatic bigquery cloud sdk clj driver bigquery cloud sdk fn invoke bigquery cloud sdk clj query processor context executef invokestatic context clj query processor context executef invoke context clj query processor context default default runf invokestatic default clj query processor context default default runf invoke default clj query processor context runf invokestatic context clj query processor context runf invoke context clj query processor reducible pivot invokestatic reducible clj query processor reducible pivot invoke reducible clj query processor middleware mbql to native mbql gt native fn invoke mbql to native clj query processor middleware check features check features fn invoke check features clj query processor middleware limit limit fn invoke limit clj query processor middleware cache maybe return cached results fn invoke cache clj query processor middleware optimize temporal filters optimize temporal filters fn invoke optimize temporal filters clj query processor middleware validate temporal bucketing validate temporal bucketing fn invoke validate temporal bucketing clj query processor middleware auto parse filter values auto parse filter values fn invoke auto parse filter values clj query processor middleware wrap value literals wrap value literals fn invoke wrap value literals clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware permissions check query permissions fn invoke permissions clj query processor middleware pre alias aggregations pre alias aggregations fn invoke pre alias aggregations clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware visualization settings update viz settings fn invoke visualization settings clj query processor middleware fix bad references fix bad references middleware fn invoke fix bad references clj query processor middleware resolve joined fields resolve joined fields fn invoke resolve joined fields clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware add implicit joins add implicit joins fn invoke add implicit joins clj query processor middleware large int id convert id to string fn invoke large int id clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware add default temporal unit add default temporal unit fn invoke add default temporal unit clj query processor middleware desugar desugar fn invoke desugar clj query processor middleware binning update binning strategy fn invoke binning clj query processor middleware resolve fields resolve fields fn invoke resolve fields clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add implicit clauses add implicit clauses fn invoke add implicit clauses clj query processor middleware upgrade field literals upgrade field literals fn invoke upgrade field literals clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware reconcile breakout and order by bucketing reconcile breakout and order by bucketing fn invoke reconcile breakout and order by bucketing clj query processor middleware auto bucket datetimes auto bucket datetimes fn invoke auto bucket datetimes clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware parameters substitute parameters fn invoke parameters clj query processor middleware resolve referenced resolve referenced card resources fn invoke resolve referenced clj query processor middleware expand macros expand macros fn invoke expand macros clj query processor middleware add timezone info add timezone info fn invoke add timezone info clj query processor middleware splice params in response splice params in response fn invoke splice params in response clj query processor middleware resolve database and driver resolve database and driver fn fn invoke resolve database and driver clj driver do with driver invokestatic driver clj driver do with driver invoke driver clj query processor middleware resolve database and driver resolve database and driver fn invoke resolve database and driver clj query processor middleware fetch source query resolve card id source tables fn invoke fetch source query clj query processor middleware store initialize store fn fn invoke store clj query processor store do with store invokestatic store clj query processor store do with store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware validate validate query fn invoke validate clj query processor middleware normalize query normalize fn invoke normalize query clj query processor middleware add rows truncated add rows truncated fn invoke add rows truncated clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware constraints add default userland constraints fn invoke constraints clj query processor middleware process userland query process userland query fn invoke process userland query clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor reducible async qp qp star thunk invoke reducible clj query processor reducible async qp qp star invoke reducible clj query processor reducible sync qp qp star fn invoke reducible clj query processor reducible sync qp qp star invoke reducible clj query processor process userland query invokestatic query processor clj query processor process userland query doinvoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj query processor fn process query and save with max results constraints bang fn invoke query processor clj query processor fn process query and save with max results constraints bang invoke query processor clj api dataset run query async fn invoke dataset clj query processor streaming streaming response star fn fn invoke streaming clj query processor streaming streaming response star fn invoke streaming clj async streaming response do f star invokestatic streaming response clj async streaming response do f star invoke streaming response clj async streaming response do f async task invoke streaming response clj error type invalid query ex data type invalid query sql metabase userid querytype mbql queryhash nselect id as id user id as user id product id as product id subtotal as subtotal tax as tax total as total discount as discount created at as created at quantity as quantity cc as cc products renamed id as products renamed id products renamed ean as products renamed ean products renamed title as products renamed title products renamed category as products renamed category products renamed vendor as products renamed vendor products renamed price as products renamed price products renamed rating as products renamed rating products renamed created at as products renamed created at from select sample dataset orders id as id sample dataset orders user id as user id sample dataset orders product id as product id sample dataset orders subtotal as subtotal sample dataset orders tax as tax sample dataset orders total as total sample dataset orders discount as discount sample dataset orders created at as created at sample dataset orders quantity as quantity as cc products renamed id as products renamed id products renamed ean as products renamed ean products renamed title as products renamed title products renamed category as products renamed category products renamed vendor as products renamed vendor products renamed price as products renamed price products renamed rating as products renamed rating products renamed created at as products renamed created at from sample dataset orders left join sample dataset products products renamed on sample dataset orders product id products renamed id source where products renamed category limit parameters doohickey error type invalid query json query type query query source table joins fields all source table condition alias products renamed filter doohickey expressions cc database parameters middleware js int to string true add default userland constraints true native query select id as id user id as user id product id as product id subtotal as subtotal tax as tax total as total discount as discount created at as created at quantity as quantity cc as cc products renamed id as products renamed id products renamed ean as products renamed ean products renamed title as products renamed title products renamed category as products renamed category products renamed vendor as products renamed vendor products renamed price as products renamed price products renamed rating as products renamed rating products renamed created at as products renamed created at from select sample dataset orders id as id sample dataset orders user id as user id sample dataset orders product id as product id sample dataset orders subtotal as subtotal sample dataset orders tax as tax sample dataset orders total as total sample dataset orders discount as discount sample dataset orders created at as created at sample dataset orders quantity as quantity as cc products renamed id as products renamed id products renamed ean as products renamed ean products renamed title as products renamed title products renamed category as products renamed category products renamed vendor as products renamed vendor products renamed price as products renamed price products renamed rating as products renamed rating products renamed created at as products renamed created at from sample dataset orders left join sample dataset products products renamed on sample dataset orders product id products renamed id source where products renamed category limit params doohickey table name orders mbql true status failed class com google api client googleapis json googlejsonresponseexception stacktrace com google api client googleapis json googlejsonresponseexception from googlejsonresponseexception java com google api client googleapis services json abstractgooglejsonclientrequest newexceptiononerror abstractgooglejsonclientrequest java com google api client googleapis services json abstractgooglejsonclientrequest newexceptiononerror abstractgooglejsonclientrequest java com google api client googleapis services abstractgoogleclientrequest interceptresponse abstractgoogleclientrequest java com google api client http httprequest execute httprequest java com google api client googleapis services abstractgoogleclientrequest executeunparsed abstractgoogleclientrequest java com google api client googleapis services abstractgoogleclientrequest executeunparsed abstractgoogleclientrequest java com google api client googleapis services abstractgoogleclientrequest execute abstractgoogleclientrequest java com google cloud bigquery spi httpbigqueryrpc queryrpc httpbigqueryrpc java com google cloud bigquery bigqueryimpl call bigqueryimpl java com google cloud bigquery bigqueryimpl call bigqueryimpl java com google api gax retrying directretryingexecutor submit directretryingexecutor java com google cloud retryhelper run retryhelper java com google cloud retryhelper runwithretries retryhelper java com google cloud bigquery bigqueryimpl queryrpc bigqueryimpl java com google cloud bigquery bigqueryimpl query bigqueryimpl java driver bigquery cloud sdk execute bigquery fn invoke bigquery cloud sdk clj card id nil context ad hoc error bad request npost code n errors n reason invalidquery n n message unrecognized name products renamed category at n status invalid argument n row count running time preprocessed type query query source table filter value doohickey base type type text effective type type text coercion strategy nil semantic type type category database type string name category expressions cc fields joins strategy left join fields source table condition alias products renamed limit database middleware js int to string true add default userland constraints true info executed by context ad hoc nested false query hash constraints max results max results bare rows data rows cols debug middleware log post api dataset s db calls app db connections jetty threads idle queued total active threads queries in flight queued information about your metabase installation works on
| 1
|
141,074
| 5,428,890,519
|
IssuesEvent
|
2017-03-03 16:58:42
|
SciSpike/yaktor-issues
|
https://api.github.com/repos/SciSpike/yaktor-issues
|
closed
|
Enhance mongo configuration & mongoose initializer to support replica sets & mongos
|
platform:nodejs priority:high status:reviewNeeded team:core type:enhancement
|
The current config options for mongo only allow the specification of a single host & port. In order to use mongo replica sets and `mongos` config servers, you need to be able to specify multiple `host:port` entries.
|
1.0
|
Enhance mongo configuration & mongoose initializer to support replica sets & mongos - The current config options for mongo only allow the specification of a single host & port. In order to use mongo replica sets and `mongos` config servers, you need to be able to specify multiple `host:port` entries.
|
non_process
|
enhance mongo configuration mongoose initializer to support replica sets mongos the current config options for mongo only allow the specification of a single host port in order to use mongo replica sets and mongos config servers you need to be able to specify multiple host port entries
| 0
|
19,823
| 26,211,121,741
|
IssuesEvent
|
2023-01-04 06:35:07
|
vesoft-inc/nebula
|
https://api.github.com/repos/vesoft-inc/nebula
|
reopened
|
Mixed usage of ngql and cypher statements
|
type/bug severity/none process/fixed affects/none
|
**Please check the FAQ documentation before raising an issue**
<!-- Please check the [FAQ](https://docs.nebula-graph.com.cn/master/20.appendix/0.FAQ/) documentation and old issues before raising an issue in case someone has asked the same question that you are asking. -->
**Describe the bug (__required__)**
https://github.com/vesoft-inc/nebula/pull/3506 forbids the mixed usage of ngql and cypher statements, but it forgets to handle the standalone return statement.
See the following example:
```
go from "Tony Parker" over like yield id($$) as vid | return $-.vid
```
This query should report a syntax error according to our previous design, but in fact, this query can be executed successfully before because a standalone return statement is treated as a `YieldSentence` in our implementation. So a standalone statement is actually a ngql statement not a cypher statement in our implementation. This strange implementation causes a bug https://github.com/vesoft-inc/nebula/issues/5113.
Pr https://github.com/vesoft-inc/nebula/pull/5116 forbids the mixed usage of ngql and return statement, but it causes incompatibility issues. After a discussion with @MuYiYong @HarrisChu , we decided to be consistent with previous behaviors.
<!-- A clear and concise description of what the bug is. -->
**Your Environments (__required__)**
* OS: `uname -a`
* Compiler: `g++ --version` or `clang++ --version`
* CPU: `lscpu`
* Commit id (e.g. `a3ffc7d8`)
**How To Reproduce(__required__)**
Steps to reproduce the behavior:
1. Step 1
2. Step 2
3. Step 3
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**Additional context**
<!-- Provide logs and configs, or any other context to trace the problem. -->
|
1.0
|
Mixed usage of ngql and cypher statements - **Please check the FAQ documentation before raising an issue**
<!-- Please check the [FAQ](https://docs.nebula-graph.com.cn/master/20.appendix/0.FAQ/) documentation and old issues before raising an issue in case someone has asked the same question that you are asking. -->
**Describe the bug (__required__)**
https://github.com/vesoft-inc/nebula/pull/3506 forbids the mixed usage of ngql and cypher statements, but it forgets to handle the standalone return statement.
See the following example:
```
go from "Tony Parker" over like yield id($$) as vid | return $-.vid
```
This query should report a syntax error according to our previous design, but in fact, this query can be executed successfully before because a standalone return statement is treated as a `YieldSentence` in our implementation. So a standalone statement is actually a ngql statement not a cypher statement in our implementation. This strange implementation causes a bug https://github.com/vesoft-inc/nebula/issues/5113.
Pr https://github.com/vesoft-inc/nebula/pull/5116 forbids the mixed usage of ngql and return statement, but it causes incompatibility issues. After a discussion with @MuYiYong @HarrisChu , we decided to be consistent with previous behaviors.
<!-- A clear and concise description of what the bug is. -->
**Your Environments (__required__)**
* OS: `uname -a`
* Compiler: `g++ --version` or `clang++ --version`
* CPU: `lscpu`
* Commit id (e.g. `a3ffc7d8`)
**How To Reproduce(__required__)**
Steps to reproduce the behavior:
1. Step 1
2. Step 2
3. Step 3
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**Additional context**
<!-- Provide logs and configs, or any other context to trace the problem. -->
|
process
|
mixed usage of ngql and cypher statements please check the faq documentation before raising an issue describe the bug required forbids the mixed usage of ngql and cypher statements but it forgets to handle the standalone return statement see the following example go from tony parker over like yield id as vid return vid this query should report a syntax error according to our previous design but in fact this query can be executed successfully before because a standalone return statement is treated as a yieldsentence in our implementation so a standalone statement is actually a ngql statement not a cypher statement in our implementation this strange implementation causes a bug pr forbids the mixed usage of ngql and return statement but it causes incompatibility issues after a discussion with muyiyong harrischu we decided to be consistent with previous behaviors your environments required os uname a compiler g version or clang version cpu lscpu commit id e g how to reproduce required steps to reproduce the behavior step step step expected behavior additional context
| 1
|
5,383
| 8,211,401,040
|
IssuesEvent
|
2018-09-04 13:44:17
|
openvstorage/framework
|
https://api.github.com/repos/openvstorage/framework
|
closed
|
mds_safety timeout received after errors
|
process_wontfix type_bug
|
# Problem
I ran a manually mds_checkup on Nuvolat and ran into a timeout problem.
Restarting the ovs-workers did the trick to start the mds_checkup again.
After x amount mds checks i see following error messages and mds_checkup stopped with processing the other vdisks. (back to the timeout)
Before restarting the ovs-workers:
```
2018-06-29 05:19:12 54700 -0400 - NY1SRV0001 - 24159/140671605868288 - lib/ensure single - 273 - INFO - Ensure single CHAINED mode - ID 1530263952_0oU9afCDVn - New task ovs.mds.ensure_safety with params {'vdisk_guid': '001326a2-8c1b-4c7f-9023-214451a231b7'} scheduled for execution
2018-06-29 05:24:13 07400 -0400 - NY1SRV0001 - 24159/140671605868288 - lib/ensure single - 274 - ERROR - Ensure single CHAINED mode - ID 1530263952_0oU9afCDVn - Could not start task ovs.mds.ensure_safety with params {'vdisk_guid': '001326a2-8c1b-4c7f-9023-214451a231b7'}, within expected time (300s). Removed it from queue
2018-06-29 05:24:13 07400 -0400 - NY1SRV0001 - 24159/140671605868288 - lib/mds - 275 - ERROR - Ensure safety for vDisk tgt-hprm-test10 with guid 001326a2-8c1b-4c7f-9023-214451a231b7 failed
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/mdsservice.py", line 410, in mds_checkup
MDSServiceController.ensure_safety(vdisk_guid=vdisk.guid)
File "/usr/lib/python2.7/dist-packages/celery/local.py", line 188, in __call__
return self._get_current_object()(*a, **kw)
File "/usr/lib/python2.7/dist-packages/celery/app/task.py", line 420, in __call__
return self.run(*args, **kwargs)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 491, in new_function
timeout))
EnsureSingleTimeoutReached: Ensure single CHAINED mode - ID 1530263952_0oU9afCDVn - Task ovs.mds.ensure_safety could not be started within timeout of 300s
```
Error after restarting the ovs-workers:
```
2018-06-29 05:49:51 08800 -0400 - NY1SRV0001 - 28998/139845377779456 - lib/mds - 21631 - ERROR - Ensure safety for vDisk pcd_ad07b096-d25f-4c81-9ce0-b054bfdda036 with guid 50bb37a2-52e5-4a1f-810f-11afe7b1b497 failed
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/mdsservice.py", line 410, in mds_checkup
MDSServiceController.ensure_safety(vdisk_guid=vdisk.guid)
File "/usr/lib/python2.7/dist-packages/celery/local.py", line 188, in __call__
return self._get_current_object()(*a, **kw)
File "/usr/lib/python2.7/dist-packages/celery/app/task.py", line 420, in __call__
return self.run(*args, **kwargs)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 469, in new_function
append=False)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 189, in update_value
if vals[0] is not None:
IndexError: list index out of range
2018-06-29 05:49:51 53000 -0400 - NY1SRV0001 - 28998/139845377779456 - lib/mds - 21632 - ERROR - Ensure safety for vDisk ARRDSM05-flat_20ff989d-373a-499b-b6f1-d8fa6866ceb2 with guid 50e096c2-abbd-4111-92fe-942d3de50f69 failed
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/mdsservice.py", line 410, in mds_checkup
MDSServiceController.ensure_safety(vdisk_guid=vdisk.guid)
File "/usr/lib/python2.7/dist-packages/celery/local.py", line 188, in __call__
return self._get_current_object()(*a, **kw)
File "/usr/lib/python2.7/dist-packages/celery/app/task.py", line 420, in __call__
return self.run(*args, **kwargs)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 391, in new_function
append=True)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 188, in update_value
vals = list(persistent_client.get_multi([key], must_exist=False))
File "/usr/lib/python2.7/dist-packages/ovs_extensions/storage/persistent/pyrakoonstore.py", line 66, in get_multi
for item in self._client.get_multi(keys, must_exist=must_exist):
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/client.py", line 86, in get_multi
keys):
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/client.py", line 214, in _try
return_value = method(*args, **kwargs)
File "<update_argspec>", line 5, in multiGetOption
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 160, in wrapped
return fun(*args, **kwargs)
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 143, in wrapped
return fun(*new_args)
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 511, in multiGetOption
return self._client.multi_get_option(keys, consistency = consistency_)
File "<update_argspec>", line 5, in multi_get_option
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/client/utils.py", line 99, in wrapped
return self._process(message) #pylint: disable=W0212
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 1125, in _process
connection.read)
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/utils.py", line 342, in read_blocking
value = read_fun(request.count)
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 1360, in read
reads, _, _ = select.select([self._socket], [], [], self._timeout)
error: (4, 'Interrupted system call')
2018-06-29 05:49:51 58300 -0400 - NY1SRV0001 - 28998/139845377779456 - lib/ensure single - 21633 - INFO - Ensure single CHAINED mode - ID 1530265791_FETekoYLzv - New task ovs.mds.ensure_safety with params {'vdisk_guid': '50e2ca0a-4185-4a11-96cf-955c8b3c764d'} scheduled for execution
2018-06-29 05:54:52 10000 -0400 - NY1SRV0001 - 28998/139845377779456 - lib/mds - 21635 - ERROR - Ensure safety for vDisk pcd_774ef4e7-10ae-4509-97ef-1776fc25249f with guid 50e2ca0a-4185-4a11-96cf-955c8b3c764d failed
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/mdsservice.py", line 410, in mds_checkup
MDSServiceController.ensure_safety(vdisk_guid=vdisk.guid)
File "/usr/lib/python2.7/dist-packages/celery/local.py", line 188, in __call__
return self._get_current_object()(*a, **kw)
File "/usr/lib/python2.7/dist-packages/celery/app/task.py", line 420, in __call__
return self.run(*args, **kwargs)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 491, in new_function
timeout))
EnsureSingleTimeoutReached: Ensure single CHAINED mode - ID 1530265791_FETekoYLzv - Task ovs.mds.ensure_safety could not be started within timeout of 300s
```
# Versions
```
ii openvstorage 2.9.14-1 amd64 OpenvStorage
ii openvstorage-backend 1.9.2-1 amd64 OpenvStorage Backend plugin
ii openvstorage-backend-core 1.9.2-1 amd64 OpenvStorage Backend plugin core
ii openvstorage-backend-webapps 1.9.2-1 amd64 OpenvStorage Backend plugin Web Applications
ii openvstorage-core 2.9.14-1 amd64 OpenvStorage core
ii openvstorage-extensions 0.1.2-1 amd64 Extensions for Open vStorage
ii openvstorage-health-check 3.4.10-1 amd64 Open vStorage HealthCheck
ii openvstorage-sdm 1.9.1-1 amd64 Open vStorage Backend ASD Manager
ii openvstorage-webapps 2.9.14-1 amd64 OpenvStorage Web Applications
```
|
1.0
|
mds_safety timeout received after errors - # Problem
I ran a manually mds_checkup on Nuvolat and ran into a timeout problem.
Restarting the ovs-workers did the trick to start the mds_checkup again.
After x amount mds checks i see following error messages and mds_checkup stopped with processing the other vdisks. (back to the timeout)
Before restarting the ovs-workers:
```
2018-06-29 05:19:12 54700 -0400 - NY1SRV0001 - 24159/140671605868288 - lib/ensure single - 273 - INFO - Ensure single CHAINED mode - ID 1530263952_0oU9afCDVn - New task ovs.mds.ensure_safety with params {'vdisk_guid': '001326a2-8c1b-4c7f-9023-214451a231b7'} scheduled for execution
2018-06-29 05:24:13 07400 -0400 - NY1SRV0001 - 24159/140671605868288 - lib/ensure single - 274 - ERROR - Ensure single CHAINED mode - ID 1530263952_0oU9afCDVn - Could not start task ovs.mds.ensure_safety with params {'vdisk_guid': '001326a2-8c1b-4c7f-9023-214451a231b7'}, within expected time (300s). Removed it from queue
2018-06-29 05:24:13 07400 -0400 - NY1SRV0001 - 24159/140671605868288 - lib/mds - 275 - ERROR - Ensure safety for vDisk tgt-hprm-test10 with guid 001326a2-8c1b-4c7f-9023-214451a231b7 failed
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/mdsservice.py", line 410, in mds_checkup
MDSServiceController.ensure_safety(vdisk_guid=vdisk.guid)
File "/usr/lib/python2.7/dist-packages/celery/local.py", line 188, in __call__
return self._get_current_object()(*a, **kw)
File "/usr/lib/python2.7/dist-packages/celery/app/task.py", line 420, in __call__
return self.run(*args, **kwargs)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 491, in new_function
timeout))
EnsureSingleTimeoutReached: Ensure single CHAINED mode - ID 1530263952_0oU9afCDVn - Task ovs.mds.ensure_safety could not be started within timeout of 300s
```
Error after restarting the ovs-workers:
```
2018-06-29 05:49:51 08800 -0400 - NY1SRV0001 - 28998/139845377779456 - lib/mds - 21631 - ERROR - Ensure safety for vDisk pcd_ad07b096-d25f-4c81-9ce0-b054bfdda036 with guid 50bb37a2-52e5-4a1f-810f-11afe7b1b497 failed
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/mdsservice.py", line 410, in mds_checkup
MDSServiceController.ensure_safety(vdisk_guid=vdisk.guid)
File "/usr/lib/python2.7/dist-packages/celery/local.py", line 188, in __call__
return self._get_current_object()(*a, **kw)
File "/usr/lib/python2.7/dist-packages/celery/app/task.py", line 420, in __call__
return self.run(*args, **kwargs)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 469, in new_function
append=False)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 189, in update_value
if vals[0] is not None:
IndexError: list index out of range
2018-06-29 05:49:51 53000 -0400 - NY1SRV0001 - 28998/139845377779456 - lib/mds - 21632 - ERROR - Ensure safety for vDisk ARRDSM05-flat_20ff989d-373a-499b-b6f1-d8fa6866ceb2 with guid 50e096c2-abbd-4111-92fe-942d3de50f69 failed
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/mdsservice.py", line 410, in mds_checkup
MDSServiceController.ensure_safety(vdisk_guid=vdisk.guid)
File "/usr/lib/python2.7/dist-packages/celery/local.py", line 188, in __call__
return self._get_current_object()(*a, **kw)
File "/usr/lib/python2.7/dist-packages/celery/app/task.py", line 420, in __call__
return self.run(*args, **kwargs)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 391, in new_function
append=True)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 188, in update_value
vals = list(persistent_client.get_multi([key], must_exist=False))
File "/usr/lib/python2.7/dist-packages/ovs_extensions/storage/persistent/pyrakoonstore.py", line 66, in get_multi
for item in self._client.get_multi(keys, must_exist=must_exist):
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/client.py", line 86, in get_multi
keys):
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/client.py", line 214, in _try
return_value = method(*args, **kwargs)
File "<update_argspec>", line 5, in multiGetOption
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 160, in wrapped
return fun(*args, **kwargs)
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 143, in wrapped
return fun(*new_args)
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 511, in multiGetOption
return self._client.multi_get_option(keys, consistency = consistency_)
File "<update_argspec>", line 5, in multi_get_option
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/client/utils.py", line 99, in wrapped
return self._process(message) #pylint: disable=W0212
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 1125, in _process
connection.read)
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/utils.py", line 342, in read_blocking
value = read_fun(request.count)
File "/usr/lib/python2.7/dist-packages/ovs_extensions/db/arakoon/pyrakoon/pyrakoon/compat.py", line 1360, in read
reads, _, _ = select.select([self._socket], [], [], self._timeout)
error: (4, 'Interrupted system call')
2018-06-29 05:49:51 58300 -0400 - NY1SRV0001 - 28998/139845377779456 - lib/ensure single - 21633 - INFO - Ensure single CHAINED mode - ID 1530265791_FETekoYLzv - New task ovs.mds.ensure_safety with params {'vdisk_guid': '50e2ca0a-4185-4a11-96cf-955c8b3c764d'} scheduled for execution
2018-06-29 05:54:52 10000 -0400 - NY1SRV0001 - 28998/139845377779456 - lib/mds - 21635 - ERROR - Ensure safety for vDisk pcd_774ef4e7-10ae-4509-97ef-1776fc25249f with guid 50e2ca0a-4185-4a11-96cf-955c8b3c764d failed
Traceback (most recent call last):
File "/opt/OpenvStorage/ovs/lib/mdsservice.py", line 410, in mds_checkup
MDSServiceController.ensure_safety(vdisk_guid=vdisk.guid)
File "/usr/lib/python2.7/dist-packages/celery/local.py", line 188, in __call__
return self._get_current_object()(*a, **kw)
File "/usr/lib/python2.7/dist-packages/celery/app/task.py", line 420, in __call__
return self.run(*args, **kwargs)
File "/opt/OpenvStorage/ovs/lib/helpers/decorators.py", line 491, in new_function
timeout))
EnsureSingleTimeoutReached: Ensure single CHAINED mode - ID 1530265791_FETekoYLzv - Task ovs.mds.ensure_safety could not be started within timeout of 300s
```
# Versions
```
ii openvstorage 2.9.14-1 amd64 OpenvStorage
ii openvstorage-backend 1.9.2-1 amd64 OpenvStorage Backend plugin
ii openvstorage-backend-core 1.9.2-1 amd64 OpenvStorage Backend plugin core
ii openvstorage-backend-webapps 1.9.2-1 amd64 OpenvStorage Backend plugin Web Applications
ii openvstorage-core 2.9.14-1 amd64 OpenvStorage core
ii openvstorage-extensions 0.1.2-1 amd64 Extensions for Open vStorage
ii openvstorage-health-check 3.4.10-1 amd64 Open vStorage HealthCheck
ii openvstorage-sdm 1.9.1-1 amd64 Open vStorage Backend ASD Manager
ii openvstorage-webapps 2.9.14-1 amd64 OpenvStorage Web Applications
```
|
process
|
mds safety timeout received after errors problem i ran a manually mds checkup on nuvolat and ran into a timeout problem restarting the ovs workers did the trick to start the mds checkup again after x amount mds checks i see following error messages and mds checkup stopped with processing the other vdisks back to the timeout before restarting the ovs workers lib ensure single info ensure single chained mode id new task ovs mds ensure safety with params vdisk guid scheduled for execution lib ensure single error ensure single chained mode id could not start task ovs mds ensure safety with params vdisk guid within expected time removed it from queue lib mds error ensure safety for vdisk tgt hprm with guid failed traceback most recent call last file opt openvstorage ovs lib mdsservice py line in mds checkup mdsservicecontroller ensure safety vdisk guid vdisk guid file usr lib dist packages celery local py line in call return self get current object a kw file usr lib dist packages celery app task py line in call return self run args kwargs file opt openvstorage ovs lib helpers decorators py line in new function timeout ensuresingletimeoutreached ensure single chained mode id task ovs mds ensure safety could not be started within timeout of error after restarting the ovs workers lib mds error ensure safety for vdisk pcd with guid failed traceback most recent call last file opt openvstorage ovs lib mdsservice py line in mds checkup mdsservicecontroller ensure safety vdisk guid vdisk guid file usr lib dist packages celery local py line in call return self get current object a kw file usr lib dist packages celery app task py line in call return self run args kwargs file opt openvstorage ovs lib helpers decorators py line in new function append false file opt openvstorage ovs lib helpers decorators py line in update value if vals is not none indexerror list index out of range lib mds error ensure safety for vdisk flat with guid abbd failed traceback most recent call last file opt openvstorage ovs lib mdsservice py line in mds checkup mdsservicecontroller ensure safety vdisk guid vdisk guid file usr lib dist packages celery local py line in call return self get current object a kw file usr lib dist packages celery app task py line in call return self run args kwargs file opt openvstorage ovs lib helpers decorators py line in new function append true file opt openvstorage ovs lib helpers decorators py line in update value vals list persistent client get multi must exist false file usr lib dist packages ovs extensions storage persistent pyrakoonstore py line in get multi for item in self client get multi keys must exist must exist file usr lib dist packages ovs extensions db arakoon pyrakoon client py line in get multi keys file usr lib dist packages ovs extensions db arakoon pyrakoon client py line in try return value method args kwargs file line in multigetoption file usr lib dist packages ovs extensions db arakoon pyrakoon pyrakoon compat py line in wrapped return fun args kwargs file usr lib dist packages ovs extensions db arakoon pyrakoon pyrakoon compat py line in wrapped return fun new args file usr lib dist packages ovs extensions db arakoon pyrakoon pyrakoon compat py line in multigetoption return self client multi get option keys consistency consistency file line in multi get option file usr lib dist packages ovs extensions db arakoon pyrakoon pyrakoon client utils py line in wrapped return self process message pylint disable file usr lib dist packages ovs extensions db arakoon pyrakoon pyrakoon compat py line in process connection read file usr lib dist packages ovs extensions db arakoon pyrakoon pyrakoon utils py line in read blocking value read fun request count file usr lib dist packages ovs extensions db arakoon pyrakoon pyrakoon compat py line in read reads select select self timeout error interrupted system call lib ensure single info ensure single chained mode id fetekoylzv new task ovs mds ensure safety with params vdisk guid scheduled for execution lib mds error ensure safety for vdisk pcd with guid failed traceback most recent call last file opt openvstorage ovs lib mdsservice py line in mds checkup mdsservicecontroller ensure safety vdisk guid vdisk guid file usr lib dist packages celery local py line in call return self get current object a kw file usr lib dist packages celery app task py line in call return self run args kwargs file opt openvstorage ovs lib helpers decorators py line in new function timeout ensuresingletimeoutreached ensure single chained mode id fetekoylzv task ovs mds ensure safety could not be started within timeout of versions ii openvstorage openvstorage ii openvstorage backend openvstorage backend plugin ii openvstorage backend core openvstorage backend plugin core ii openvstorage backend webapps openvstorage backend plugin web applications ii openvstorage core openvstorage core ii openvstorage extensions extensions for open vstorage ii openvstorage health check open vstorage healthcheck ii openvstorage sdm open vstorage backend asd manager ii openvstorage webapps openvstorage web applications
| 1
|
17,544
| 24,199,445,412
|
IssuesEvent
|
2022-09-24 10:48:59
|
smilligan93/SR5-FoundryVTT
|
https://api.github.com/repos/smilligan93/SR5-FoundryVTT
|
closed
|
compatibility foundryvtt v10
|
incompatibility
|
I guess now it's a must.
Could you know when we could use your very marvellous system on Foundry vtt v10?
|
True
|
compatibility foundryvtt v10 - I guess now it's a must.
Could you know when we could use your very marvellous system on Foundry vtt v10?
|
non_process
|
compatibility foundryvtt i guess now it s a must could you know when we could use your very marvellous system on foundry vtt
| 0
|
188,757
| 15,170,011,519
|
IssuesEvent
|
2021-02-12 22:16:49
|
openssl/openssl
|
https://api.github.com/repos/openssl/openssl
|
opened
|
There is no documentation for CMS_compress.
|
issue: documentation
|
Even though the man page for CMS_uncompress refers to it.
<!--
Thank you for taking the time to report a documentation issue.
Please remember to tell us which OpenSSL version you are using and then
briefly describe the documentation error and where you encountered it
(e.g., in which manual page). If you are missing the documentation for a
certain command or API function, please tell us its name.
-->
|
1.0
|
There is no documentation for CMS_compress. - Even though the man page for CMS_uncompress refers to it.
<!--
Thank you for taking the time to report a documentation issue.
Please remember to tell us which OpenSSL version you are using and then
briefly describe the documentation error and where you encountered it
(e.g., in which manual page). If you are missing the documentation for a
certain command or API function, please tell us its name.
-->
|
non_process
|
there is no documentation for cms compress even though the man page for cms uncompress refers to it thank you for taking the time to report a documentation issue please remember to tell us which openssl version you are using and then briefly describe the documentation error and where you encountered it e g in which manual page if you are missing the documentation for a certain command or api function please tell us its name
| 0
|
243,876
| 7,868,181,262
|
IssuesEvent
|
2018-06-23 18:10:49
|
swarm-robotics/fordyca
|
https://api.github.com/repos/swarm-robotics/fordyca
|
closed
|
refactor/332-add-class-constants-to-collectors
|
Priority: Low Status: Completed Type: Refactor
|
Instead of typing fsm::distance or whatever for both registering and accessing
collectors, a class constant should be used instead, in addition to the
std::type_index map used for parameter repositories, so that there is as little
chance for typos as possible.
|
1.0
|
refactor/332-add-class-constants-to-collectors - Instead of typing fsm::distance or whatever for both registering and accessing
collectors, a class constant should be used instead, in addition to the
std::type_index map used for parameter repositories, so that there is as little
chance for typos as possible.
|
non_process
|
refactor add class constants to collectors instead of typing fsm distance or whatever for both registering and accessing collectors a class constant should be used instead in addition to the std type index map used for parameter repositories so that there is as little chance for typos as possible
| 0
|
22,139
| 30,683,635,452
|
IssuesEvent
|
2023-07-26 10:48:43
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Calling python3 runbook from another runbook in azure automation
|
automation/svc triaged assigned-to-author doc-enhancement process-automation/subsvc Pri2
|
**Feedback: Please add instructions with illustrations on python parent and child runbooks.**
One of the use case from a customer:
Customer have a function (named download_file()) which is in a python3 runbook named get_secret.py. They need to use it in an another python3 runbook called register.py.
All these runbooks are created under one of the automation accounts in Azure.
They would like to know how to call/invoke/import the function/script (get_secret.py) from register.py
Tried as per the usual method given below, but it didn't work.
In register.py
! /usr/bin/env python3
import get_secret
get_secret.download_file()
Also, imported the get_secret.py script into "Python packages" under Shared Resources in Automation Account, that too also not worked out.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 23c183d0-5012-e2e1-5562-69135b3f6509
* Version Independent ID: 7f36ff87-e24a-7442-8d42-f621f5391814
* Content: [Create modular runbooks in Azure Automation](https://docs.microsoft.com/en-us/azure/automation/automation-child-runbooks)
* Content Source: [articles/automation/automation-child-runbooks.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/automation/automation-child-runbooks.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @MGoedtel
* Microsoft Alias: **magoedte**
|
1.0
|
Calling python3 runbook from another runbook in azure automation -
**Feedback: Please add instructions with illustrations on python parent and child runbooks.**
One of the use case from a customer:
Customer have a function (named download_file()) which is in a python3 runbook named get_secret.py. They need to use it in an another python3 runbook called register.py.
All these runbooks are created under one of the automation accounts in Azure.
They would like to know how to call/invoke/import the function/script (get_secret.py) from register.py
Tried as per the usual method given below, but it didn't work.
In register.py
! /usr/bin/env python3
import get_secret
get_secret.download_file()
Also, imported the get_secret.py script into "Python packages" under Shared Resources in Automation Account, that too also not worked out.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 23c183d0-5012-e2e1-5562-69135b3f6509
* Version Independent ID: 7f36ff87-e24a-7442-8d42-f621f5391814
* Content: [Create modular runbooks in Azure Automation](https://docs.microsoft.com/en-us/azure/automation/automation-child-runbooks)
* Content Source: [articles/automation/automation-child-runbooks.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/automation/automation-child-runbooks.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @MGoedtel
* Microsoft Alias: **magoedte**
|
process
|
calling runbook from another runbook in azure automation feedback please add instructions with illustrations on python parent and child runbooks one of the use case from a customer customer have a function named download file which is in a runbook named get secret py they need to use it in an another runbook called register py all these runbooks are created under one of the automation accounts in azure they would like to know how to call invoke import the function script get secret py from register py tried as per the usual method given below but it didn t work in register py usr bin env import get secret get secret download file also imported the get secret py script into python packages under shared resources in automation account that too also not worked out document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login mgoedtel microsoft alias magoedte
| 1
|
311,281
| 26,779,819,973
|
IssuesEvent
|
2023-01-31 20:09:31
|
saltstack/salt
|
https://api.github.com/repos/saltstack/salt
|
opened
|
[TEST FAILURE] file.comment `test_issue_62121` fails with Python 3.11
|
Test-Failure
|
There are some regex specific mentions in the py3.11 changelog https://docs.python.org/3/whatsnew/3.11.html
and lots of notes of `.. changed in 3.11` on https://docs.python.org/3/library/re.html
```python
tests/pytests/functional/states/file/test_comment.py:35 (test_issue_62121)
file = <LoadedMod module='tests.pytests.functional.conftest.loaded.file'>
source = PosixPath('/tmp/file.txt')
def test_issue_62121(file, source):
"""
Test file.comment when the comment character is
later in the line, after the text
"""
regex = r"^port\s*=.+"
reg_cmp = re.compile(regex, re.MULTILINE)
cmt_regex = r"^#port\s*=.+"
cmt_cmp = re.compile(cmt_regex, re.MULTILINE)
with salt.utils.files.fopen(str(source)) as _fp:
assert reg_cmp.findall(_fp.read())
file.comment(name=str(source), regex=regex)
with salt.utils.files.fopen(str(source)) as _fp:
> assert not reg_cmp.findall(_fp.read())
E AssertionError: assert not ['port = 5432 # (change requires restart)']
E + where ['port = 5432 # (change requires restart)'] = <built-in method findall of re.Pattern object at 0x7f8490691e40>('things = stuff\nport = 5432 # (change requires restart)\n# commented = something\nmoar = things\n')
E + where <built-in method findall of re.Pattern object at 0x7f8490691e40> = re.compile('^port\\s*=.+', re.MULTILINE).findall
E + and 'things = stuff\nport = 5432 # (change requires restart)\n# commented = something\nmoar = things\n' = <built-in method read of _io.TextIOWrapper object at 0x7f84964e0d40>()
E + where <built-in method read of _io.TextIOWrapper object at 0x7f84964e0d40> = <_io.TextIOWrapper name='/tmp/file.txt' mode='r' encoding='utf-8'>.read
tests/pytests/functional/states/file/test_comment.py:52: AssertionError
```
|
1.0
|
[TEST FAILURE] file.comment `test_issue_62121` fails with Python 3.11 - There are some regex specific mentions in the py3.11 changelog https://docs.python.org/3/whatsnew/3.11.html
and lots of notes of `.. changed in 3.11` on https://docs.python.org/3/library/re.html
```python
tests/pytests/functional/states/file/test_comment.py:35 (test_issue_62121)
file = <LoadedMod module='tests.pytests.functional.conftest.loaded.file'>
source = PosixPath('/tmp/file.txt')
def test_issue_62121(file, source):
"""
Test file.comment when the comment character is
later in the line, after the text
"""
regex = r"^port\s*=.+"
reg_cmp = re.compile(regex, re.MULTILINE)
cmt_regex = r"^#port\s*=.+"
cmt_cmp = re.compile(cmt_regex, re.MULTILINE)
with salt.utils.files.fopen(str(source)) as _fp:
assert reg_cmp.findall(_fp.read())
file.comment(name=str(source), regex=regex)
with salt.utils.files.fopen(str(source)) as _fp:
> assert not reg_cmp.findall(_fp.read())
E AssertionError: assert not ['port = 5432 # (change requires restart)']
E + where ['port = 5432 # (change requires restart)'] = <built-in method findall of re.Pattern object at 0x7f8490691e40>('things = stuff\nport = 5432 # (change requires restart)\n# commented = something\nmoar = things\n')
E + where <built-in method findall of re.Pattern object at 0x7f8490691e40> = re.compile('^port\\s*=.+', re.MULTILINE).findall
E + and 'things = stuff\nport = 5432 # (change requires restart)\n# commented = something\nmoar = things\n' = <built-in method read of _io.TextIOWrapper object at 0x7f84964e0d40>()
E + where <built-in method read of _io.TextIOWrapper object at 0x7f84964e0d40> = <_io.TextIOWrapper name='/tmp/file.txt' mode='r' encoding='utf-8'>.read
tests/pytests/functional/states/file/test_comment.py:52: AssertionError
```
|
non_process
|
file comment test issue fails with python there are some regex specific mentions in the changelog and lots of notes of changed in on python tests pytests functional states file test comment py test issue file source posixpath tmp file txt def test issue file source test file comment when the comment character is later in the line after the text regex r port s reg cmp re compile regex re multiline cmt regex r port s cmt cmp re compile cmt regex re multiline with salt utils files fopen str source as fp assert reg cmp findall fp read file comment name str source regex regex with salt utils files fopen str source as fp assert not reg cmp findall fp read e assertionerror assert not e where things stuff nport change requires restart n commented something nmoar things n e where re compile port s re multiline findall e and things stuff nport change requires restart n commented something nmoar things n e where read tests pytests functional states file test comment py assertionerror
| 0
|
11,915
| 18,507,021,323
|
IssuesEvent
|
2021-10-19 19:56:57
|
NASA-PDS/pds-api
|
https://api.github.com/repos/NASA-PDS/pds-api
|
closed
|
As an API user, I want to specify whether I get the latest or all versions of a product
|
requirement Epic B12.1 p.must-have d.running-late proj.registry+api
|
## 🌬 Motivation
PDS labels form references with collections and other labels in a bundle through two mechanisms: one is by specifying a full logical identifier + version identifier, or "lidvid", such as:
```xml
<Bundle_Member_Entry>
<lidvid_reference>urn:nasa:pds:insight_documents:document_mission::2.0</lidvid_reference>
<member_status>Primary</member_status>
</Bundle_Member_Entry>
```
Another is with a reference just to the logical identifier, or "lid"; for example:
```xml
<Bundle_Member_Entry>
<lid_reference>urn:nasa:pds:ladee_mission:xml_schema_collection</lid_reference>
<member_status>Primary</member_status>
</Bundle_Member_Entry>
```
In the former, a single referenced collection is indicated; in the latter, there's a choice: do we want _all versions_ of the collection or _just the latest_? When searching for collections within a bundle, it would be great to be able to add a search parameter that gives that choice to the client.
## 🕵️ Additional Details
The use case for this is in [registry version of the PDS Deep Archive](https://github.com/NASA-PDS/pds-deep-archive/issues/7). When examining "lid-only" references, the PDS Deep Archive can generate two kinds of Archive Information Packages and Submission Information Packages:
- One with _all versions_ of the referenced collections
- Or one with just the _latest version_
A command-line parameter, `--include-latest-collection-only`, turns on the second behavior.
Because the API service (and the ElasticSearch behind it) is in an ideal position to do this distinction, it should provide it as a feature. This would also reduce the over-the-wire data transferred from the API—which can be problematic for huge bundles (think insight_cameras).
Of course, if it already does support this, please close this issue and tell me how! 😊
## Acceptance Criteria
See sub-tickets to this Epic for specific acceptance criteria addressing these two use cases
|
1.0
|
As an API user, I want to specify whether I get the latest or all versions of a product - ## 🌬 Motivation
PDS labels form references with collections and other labels in a bundle through two mechanisms: one is by specifying a full logical identifier + version identifier, or "lidvid", such as:
```xml
<Bundle_Member_Entry>
<lidvid_reference>urn:nasa:pds:insight_documents:document_mission::2.0</lidvid_reference>
<member_status>Primary</member_status>
</Bundle_Member_Entry>
```
Another is with a reference just to the logical identifier, or "lid"; for example:
```xml
<Bundle_Member_Entry>
<lid_reference>urn:nasa:pds:ladee_mission:xml_schema_collection</lid_reference>
<member_status>Primary</member_status>
</Bundle_Member_Entry>
```
In the former, a single referenced collection is indicated; in the latter, there's a choice: do we want _all versions_ of the collection or _just the latest_? When searching for collections within a bundle, it would be great to be able to add a search parameter that gives that choice to the client.
## 🕵️ Additional Details
The use case for this is in [registry version of the PDS Deep Archive](https://github.com/NASA-PDS/pds-deep-archive/issues/7). When examining "lid-only" references, the PDS Deep Archive can generate two kinds of Archive Information Packages and Submission Information Packages:
- One with _all versions_ of the referenced collections
- Or one with just the _latest version_
A command-line parameter, `--include-latest-collection-only`, turns on the second behavior.
Because the API service (and the ElasticSearch behind it) is in an ideal position to do this distinction, it should provide it as a feature. This would also reduce the over-the-wire data transferred from the API—which can be problematic for huge bundles (think insight_cameras).
Of course, if it already does support this, please close this issue and tell me how! 😊
## Acceptance Criteria
See sub-tickets to this Epic for specific acceptance criteria addressing these two use cases
|
non_process
|
as an api user i want to specify whether i get the latest or all versions of a product 🌬 motivation pds labels form references with collections and other labels in a bundle through two mechanisms one is by specifying a full logical identifier version identifier or lidvid such as xml urn nasa pds insight documents document mission primary another is with a reference just to the logical identifier or lid for example xml urn nasa pds ladee mission xml schema collection primary in the former a single referenced collection is indicated in the latter there s a choice do we want all versions of the collection or just the latest when searching for collections within a bundle it would be great to be able to add a search parameter that gives that choice to the client 🕵️ additional details the use case for this is in when examining lid only references the pds deep archive can generate two kinds of archive information packages and submission information packages one with all versions of the referenced collections or one with just the latest version a command line parameter include latest collection only turns on the second behavior because the api service and the elasticsearch behind it is in an ideal position to do this distinction it should provide it as a feature this would also reduce the over the wire data transferred from the api—which can be problematic for huge bundles think insight cameras of course if it already does support this please close this issue and tell me how 😊 acceptance criteria see sub tickets to this epic for specific acceptance criteria addressing these two use cases
| 0
|
27,847
| 22,443,556,141
|
IssuesEvent
|
2022-06-21 04:14:05
|
IBM-Cloud/terraform-provider-ibm
|
https://api.github.com/repos/IBM-Cloud/terraform-provider-ibm
|
closed
|
ibm_is_instance should not not suppress change and force new on boot_volume.0.snapshot change
|
service/VPC Infrastructure
|
<!---
Please note the following potential times when an issue might be in Terraform core:
* [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues
* [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues
* [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues
* [Registry](https://registry.terraform.io/) issues
* Spans resources across multiple providers
If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead.
--->
<!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform CLI and Terraform IBM Provider Version
<!--- Please run `terraform -v` to show the Terraform core version and provider version(s). If you are not running the latest version of Terraform or the provider, please upgrade because your issue may have already been fixed. [Terraform documentation on provider versioning](https://www.terraform.io/docs/configuration/providers.html#provider-versions). --->
### Affected Resource(s)
<!--- Please list the affected resources and data sources. --->
* ibm_is_instance
### Terraform Configuration Files
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
Please include all Terraform configurations required to reproduce the bug. Bug reports without a functional reproduction may be closed without investigation.
```hcl
resource "ibm_is_instance" "ins" {
name = "${var.prefix}-vsi"
zone = "${var.zone}"
profile = "bx2-4x16"
keys = ["${var.keyid}]
primary_network_interface {
subnet = var.subnetid
}
boot_volume {
snapshot = data.ibm_is_snapshot.this.id
}
vpc = var.vpcid
}
```
### Debug Output
<!---
Please provide a link to a GitHub Gist containing the complete debug output. Please do NOT paste the debug output in the issue; just paste a link to the Gist.
To obtain the debug output, see the [Terraform documentation on debugging](https://www.terraform.io/docs/internals/debugging.html).
--->
### Panic Output
<!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. --->
### Expected Behavior
Should force new
<!--- What should have happened? --->
### Actual Behavior
Suppressing the changes
<!--- What actually happened? --->
### Steps to Reproduce
<!--- Please list the steps required to reproduce the issue. --->
1. `terraform apply`
### Important Factoids
<!--- Are there anything atypical about your accounts that we should know? For example: Running in EC2 Classic? --->
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor documentation? For example:
--->
* #0000
|
1.0
|
ibm_is_instance should not not suppress change and force new on boot_volume.0.snapshot change - <!---
Please note the following potential times when an issue might be in Terraform core:
* [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues
* [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues
* [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues
* [Registry](https://registry.terraform.io/) issues
* Spans resources across multiple providers
If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead.
--->
<!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Terraform CLI and Terraform IBM Provider Version
<!--- Please run `terraform -v` to show the Terraform core version and provider version(s). If you are not running the latest version of Terraform or the provider, please upgrade because your issue may have already been fixed. [Terraform documentation on provider versioning](https://www.terraform.io/docs/configuration/providers.html#provider-versions). --->
### Affected Resource(s)
<!--- Please list the affected resources and data sources. --->
* ibm_is_instance
### Terraform Configuration Files
<!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code --->
Please include all Terraform configurations required to reproduce the bug. Bug reports without a functional reproduction may be closed without investigation.
```hcl
resource "ibm_is_instance" "ins" {
name = "${var.prefix}-vsi"
zone = "${var.zone}"
profile = "bx2-4x16"
keys = ["${var.keyid}]
primary_network_interface {
subnet = var.subnetid
}
boot_volume {
snapshot = data.ibm_is_snapshot.this.id
}
vpc = var.vpcid
}
```
### Debug Output
<!---
Please provide a link to a GitHub Gist containing the complete debug output. Please do NOT paste the debug output in the issue; just paste a link to the Gist.
To obtain the debug output, see the [Terraform documentation on debugging](https://www.terraform.io/docs/internals/debugging.html).
--->
### Panic Output
<!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. --->
### Expected Behavior
Should force new
<!--- What should have happened? --->
### Actual Behavior
Suppressing the changes
<!--- What actually happened? --->
### Steps to Reproduce
<!--- Please list the steps required to reproduce the issue. --->
1. `terraform apply`
### Important Factoids
<!--- Are there anything atypical about your accounts that we should know? For example: Running in EC2 Classic? --->
### References
<!---
Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests
Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor documentation? For example:
--->
* #0000
|
non_process
|
ibm is instance should not not suppress change and force new on boot volume snapshot change please note the following potential times when an issue might be in terraform core or resource ordering issues and issues issues issues spans resources across multiple providers if you are running into one of these scenarios we recommend opening an issue in the instead community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform cli and terraform ibm provider version affected resource s ibm is instance terraform configuration files please include all terraform configurations required to reproduce the bug bug reports without a functional reproduction may be closed without investigation hcl resource ibm is instance ins name var prefix vsi zone var zone profile keys primary network interface subnet var subnetid boot volume snapshot data ibm is snapshot this id vpc var vpcid debug output please provide a link to a github gist containing the complete debug output please do not paste the debug output in the issue just paste a link to the gist to obtain the debug output see the panic output expected behavior should force new actual behavior suppressing the changes steps to reproduce terraform apply important factoids references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor documentation for example
| 0
|
799,798
| 28,314,240,156
|
IssuesEvent
|
2023-04-10 18:07:17
|
googleapis/repo-automation-bots
|
https://api.github.com/repos/googleapis/repo-automation-bots
|
closed
|
Policy bot not working as expected
|
type: bug priority: p2 policybot
|
We recently enabled Policy Bot on some of our team's repos and it doesn't seem to be working very well.
1) The bot keeps raising an issue on the Python Connector [#638](https://github.com/GoogleCloudPlatform/cloud-sql-python-connector/issues/638) for not having branch protection enabled ... but the repo does have branch protection enabled from what i can tell. If I close the issue, the bot opens a new one the next day 😢
2) The bot was configured on multiple of our repos a week ago but has not run yet on them. Is there a way to tell why this is happening? Two of the repos that have not seen any issues or PRs from policy bot yet are [cloud-sql-go-connector](https://github.com/GoogleCloudPlatform/cloud-sql-go-connector) and the [cloud-sql-jdbc-socket-factory](https://github.com/GoogleCloudPlatform/cloud-sql-jdbc-socket-factory)
Please let me know if there are ways to fix the two issues mentioned. Is the policy bot is actively maintained and recommended to be used?
Any info would be greatly appreciated, thanks so much :)
|
1.0
|
Policy bot not working as expected - We recently enabled Policy Bot on some of our team's repos and it doesn't seem to be working very well.
1) The bot keeps raising an issue on the Python Connector [#638](https://github.com/GoogleCloudPlatform/cloud-sql-python-connector/issues/638) for not having branch protection enabled ... but the repo does have branch protection enabled from what i can tell. If I close the issue, the bot opens a new one the next day 😢
2) The bot was configured on multiple of our repos a week ago but has not run yet on them. Is there a way to tell why this is happening? Two of the repos that have not seen any issues or PRs from policy bot yet are [cloud-sql-go-connector](https://github.com/GoogleCloudPlatform/cloud-sql-go-connector) and the [cloud-sql-jdbc-socket-factory](https://github.com/GoogleCloudPlatform/cloud-sql-jdbc-socket-factory)
Please let me know if there are ways to fix the two issues mentioned. Is the policy bot is actively maintained and recommended to be used?
Any info would be greatly appreciated, thanks so much :)
|
non_process
|
policy bot not working as expected we recently enabled policy bot on some of our team s repos and it doesn t seem to be working very well the bot keeps raising an issue on the python connector for not having branch protection enabled but the repo does have branch protection enabled from what i can tell if i close the issue the bot opens a new one the next day 😢 the bot was configured on multiple of our repos a week ago but has not run yet on them is there a way to tell why this is happening two of the repos that have not seen any issues or prs from policy bot yet are and the please let me know if there are ways to fix the two issues mentioned is the policy bot is actively maintained and recommended to be used any info would be greatly appreciated thanks so much
| 0
|
11,094
| 9,216,814,042
|
IssuesEvent
|
2019-03-11 09:12:41
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Invalid local file path
|
cognitive-services/svc cxp product-question triaged
|
Where should I put the image? I always get an invalid file path
---
#### Document details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5fdd9884-2d15-e480-e057-dfb72d89f2fb
* Version Independent ID: 37f24981-566c-5d2d-87cc-4878e8699342
* Content: [Quickstart: Analyze a local image - REST, C# - Azure Cognitive Services](https://docs.microsoft.com/en-sg/azure/cognitive-services/Computer-vision/QuickStarts/CSharp-analyze#feedback)
* Content Source: [articles/cognitive-services/Computer-vision/QuickStarts/CSharp-analyze.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cognitive-services/Computer-vision/QuickStarts/CSharp-analyze.md)
* Service: **cognitive-services**
* GitHub Login: @PatrickFarley
* Microsoft Alias: **pafarley**
|
1.0
|
Invalid local file path - Where should I put the image? I always get an invalid file path
---
#### Document details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5fdd9884-2d15-e480-e057-dfb72d89f2fb
* Version Independent ID: 37f24981-566c-5d2d-87cc-4878e8699342
* Content: [Quickstart: Analyze a local image - REST, C# - Azure Cognitive Services](https://docs.microsoft.com/en-sg/azure/cognitive-services/Computer-vision/QuickStarts/CSharp-analyze#feedback)
* Content Source: [articles/cognitive-services/Computer-vision/QuickStarts/CSharp-analyze.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cognitive-services/Computer-vision/QuickStarts/CSharp-analyze.md)
* Service: **cognitive-services**
* GitHub Login: @PatrickFarley
* Microsoft Alias: **pafarley**
|
non_process
|
invalid local file path where should i put the image i always get an invalid file path document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service cognitive services github login patrickfarley microsoft alias pafarley
| 0
|
21,348
| 29,173,465,214
|
IssuesEvent
|
2023-05-19 05:28:39
|
james77777778/keras-aug
|
https://api.github.com/repos/james77777778/keras-aug
|
closed
|
Support ragged `segmentation_masks`
|
enhancement preprocessing augmentation
|
- [x] RandomAffine
- [x] RandomCropAndResize
- [x] RandomCrop
- [x] RandomFlip
- [x] RandomResize (polish api)
- [x] RandomRotate
- [x] RandomZoomAndCrop (add support)
- [x] CenterCrop (polish api)
- [x] PadIfNeeded (polish api)
- [x] Resize (polish api)
|
1.0
|
Support ragged `segmentation_masks` - - [x] RandomAffine
- [x] RandomCropAndResize
- [x] RandomCrop
- [x] RandomFlip
- [x] RandomResize (polish api)
- [x] RandomRotate
- [x] RandomZoomAndCrop (add support)
- [x] CenterCrop (polish api)
- [x] PadIfNeeded (polish api)
- [x] Resize (polish api)
|
process
|
support ragged segmentation masks randomaffine randomcropandresize randomcrop randomflip randomresize polish api randomrotate randomzoomandcrop add support centercrop polish api padifneeded polish api resize polish api
| 1
|
3,259
| 6,336,926,157
|
IssuesEvent
|
2017-07-26 22:18:19
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
Please do not return an error code when reading an empty log file
|
change log-processing
|
I use GoAccess to parse the logs generated by Apache on Debian. Logrotate is part of the picture with the default configuration, which means that every day a new, empty log file is created, and the old one is renamed.
I run GoAccess in a cron task every hour, so every day it will parse an empty file. The Cron line looks like that.
@hourly goaccess --load-from-disk --keep-db-files -f /var/log/apache2/jekyll_vhosts_access.log -o /srv/www/default/goaccess/hacking.html
So every day, GoAccess reads an empty log file at some point. Since it returns an error code (`1`), Cron sends me a notification email. That's the usual and expected behavior from Cron when a job fails.
Does it sound OK to return `0` when reading an empty log file ? Currently I have a site with very low traffic, the log file can remain empty the whole day, so Cron spams me every hour, that's a pain...
I know that this topic is fresh on your mind, I've seen some issues related, so you must know what's best to do.
Cheers !
|
1.0
|
Please do not return an error code when reading an empty log file - I use GoAccess to parse the logs generated by Apache on Debian. Logrotate is part of the picture with the default configuration, which means that every day a new, empty log file is created, and the old one is renamed.
I run GoAccess in a cron task every hour, so every day it will parse an empty file. The Cron line looks like that.
@hourly goaccess --load-from-disk --keep-db-files -f /var/log/apache2/jekyll_vhosts_access.log -o /srv/www/default/goaccess/hacking.html
So every day, GoAccess reads an empty log file at some point. Since it returns an error code (`1`), Cron sends me a notification email. That's the usual and expected behavior from Cron when a job fails.
Does it sound OK to return `0` when reading an empty log file ? Currently I have a site with very low traffic, the log file can remain empty the whole day, so Cron spams me every hour, that's a pain...
I know that this topic is fresh on your mind, I've seen some issues related, so you must know what's best to do.
Cheers !
|
process
|
please do not return an error code when reading an empty log file i use goaccess to parse the logs generated by apache on debian logrotate is part of the picture with the default configuration which means that every day a new empty log file is created and the old one is renamed i run goaccess in a cron task every hour so every day it will parse an empty file the cron line looks like that hourly goaccess load from disk keep db files f var log jekyll vhosts access log o srv www default goaccess hacking html so every day goaccess reads an empty log file at some point since it returns an error code cron sends me a notification email that s the usual and expected behavior from cron when a job fails does it sound ok to return when reading an empty log file currently i have a site with very low traffic the log file can remain empty the whole day so cron spams me every hour that s a pain i know that this topic is fresh on your mind i ve seen some issues related so you must know what s best to do cheers
| 1
|
206,060
| 15,707,147,676
|
IssuesEvent
|
2021-03-26 18:26:36
|
ValveSoftware/halflife
|
https://api.github.com/repos/ValveSoftware/halflife
|
closed
|
Can't See My Steam-Only CS 1.6 Server On Internet List
|
Need Retest
|
I have hosted a Counter Strike 1.6 server in Singapore yesterday and it's not showing up in the official internet server list. I'm from Sri Lanka and I have a ping lower than 90 milliseconds every time in my server so it's geographically close to my location. Since many other servers with ping 100 - 150 are showing up in my server list including no-Steam and my server just isn't there no matter how much I reload it.
It's been 24 hours since my server has gone online. Why is this? Just bought this game yesterday and the amount of bugs I'm experiencing is just insane, feels I just paid my money for nothing.
|
1.0
|
Can't See My Steam-Only CS 1.6 Server On Internet List - I have hosted a Counter Strike 1.6 server in Singapore yesterday and it's not showing up in the official internet server list. I'm from Sri Lanka and I have a ping lower than 90 milliseconds every time in my server so it's geographically close to my location. Since many other servers with ping 100 - 150 are showing up in my server list including no-Steam and my server just isn't there no matter how much I reload it.
It's been 24 hours since my server has gone online. Why is this? Just bought this game yesterday and the amount of bugs I'm experiencing is just insane, feels I just paid my money for nothing.
|
non_process
|
can t see my steam only cs server on internet list i have hosted a counter strike server in singapore yesterday and it s not showing up in the official internet server list i m from sri lanka and i have a ping lower than milliseconds every time in my server so it s geographically close to my location since many other servers with ping are showing up in my server list including no steam and my server just isn t there no matter how much i reload it it s been hours since my server has gone online why is this just bought this game yesterday and the amount of bugs i m experiencing is just insane feels i just paid my money for nothing
| 0
|
6,680
| 9,797,687,249
|
IssuesEvent
|
2019-06-11 10:31:55
|
EthVM/EthVM
|
https://api.github.com/repos/EthVM/EthVM
|
closed
|
Encoding issue with Postgres / Timescale on ERC20 detector
|
bug priority:high project:processing
|
* **I'm submitting a ...**
- [ ] feature request
- [X] bug report
* **Bug Report**
The detected exception related to encoding issues:
```
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:560)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:321)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: java.sql.BatchUpdateException: Batch entry 12 INSERT INTO "erc20_metadata" ("address","name","symbol","decimals","total_supply") VALUES ('0x8be2847c17c8bf000d6e1326c63c8729d3ebe940','MySuperDuperTokenMySuperDuperToken','MMMMMM',3,'1000000000') ON CONFLICT ("address") DO UPDATE SET "name"=EXCLUDED."name","symbol"=EXCLUDED."symbol","decimals"=EXCLUDED."decimals","total_supply"=EXCLUDED."total_supply" was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:87)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:538)
... 10 more
Caused by: java.sql.SQLException: java.sql.BatchUpdateException: Batch entry 12 INSERT INTO "erc20_metadata" ("address","name","symbol","decimals","total_supply") VALUES ('0x8be2847c17c8bf000d6e1326c63c8729d3ebe940','MySuperDuperTokenMySuperDuperToken','MMMMMM',3,'1000000000') ON CONFLICT ("address") DO UPDATE SET "name"=EXCLUDED."name","symbol"=EXCLUDED."symbol","decimals"=EXCLUDED."decimals","total_supply"=EXCLUDED."total_supply" was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
... 12 more
```
Below you can see a picture:

|
1.0
|
Encoding issue with Postgres / Timescale on ERC20 detector - * **I'm submitting a ...**
- [ ] feature request
- [X] bug report
* **Bug Report**
The detected exception related to encoding issues:
```
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:560)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:321)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: java.sql.BatchUpdateException: Batch entry 12 INSERT INTO "erc20_metadata" ("address","name","symbol","decimals","total_supply") VALUES ('0x8be2847c17c8bf000d6e1326c63c8729d3ebe940','MySuperDuperTokenMySuperDuperToken','MMMMMM',3,'1000000000') ON CONFLICT ("address") DO UPDATE SET "name"=EXCLUDED."name","symbol"=EXCLUDED."symbol","decimals"=EXCLUDED."decimals","total_supply"=EXCLUDED."total_supply" was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:87)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:538)
... 10 more
Caused by: java.sql.SQLException: java.sql.BatchUpdateException: Batch entry 12 INSERT INTO "erc20_metadata" ("address","name","symbol","decimals","total_supply") VALUES ('0x8be2847c17c8bf000d6e1326c63c8729d3ebe940','MySuperDuperTokenMySuperDuperToken','MMMMMM',3,'1000000000') ON CONFLICT ("address") DO UPDATE SET "name"=EXCLUDED."name","symbol"=EXCLUDED."symbol","decimals"=EXCLUDED."decimals","total_supply"=EXCLUDED."total_supply" was aborted: ERROR: invalid byte sequence for encoding "UTF8": 0x00 Call getNextException to see other errors in the batch.
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
... 12 more
```
Below you can see a picture:

|
process
|
encoding issue with postgres timescale on detector i m submitting a feature request bug report bug report the detected exception related to encoding issues org apache kafka connect errors connectexception exiting workersinktask due to unrecoverable exception at org apache kafka connect runtime workersinktask delivermessages workersinktask java at org apache kafka connect runtime workersinktask poll workersinktask java at org apache kafka connect runtime workersinktask iteration workersinktask java at org apache kafka connect runtime workersinktask execute workersinktask java at org apache kafka connect runtime workertask dorun workertask java at org apache kafka connect runtime workertask run workertask java at java util concurrent executors runnableadapter call executors java at java util concurrent futuretask run futuretask java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java caused by org apache kafka connect errors connectexception java sql sqlexception java sql batchupdateexception batch entry insert into metadata address name symbol decimals total supply values mysuperdupertokenmysuperdupertoken mmmmmm on conflict address do update set name excluded name symbol excluded symbol decimals excluded decimals total supply excluded total supply was aborted error invalid byte sequence for encoding call getnextexception to see other errors in the batch org postgresql util psqlexception error invalid byte sequence for encoding org postgresql util psqlexception error invalid byte sequence for encoding at io confluent connect jdbc sink jdbcsinktask put jdbcsinktask java at org apache kafka connect runtime workersinktask delivermessages workersinktask java more caused by java sql sqlexception java sql batchupdateexception batch entry insert into metadata address name symbol decimals total supply values mysuperdupertokenmysuperdupertoken mmmmmm on conflict address do update set name excluded name symbol excluded symbol decimals excluded decimals total supply excluded total supply was aborted error invalid byte sequence for encoding call getnextexception to see other errors in the batch org postgresql util psqlexception error invalid byte sequence for encoding org postgresql util psqlexception error invalid byte sequence for encoding more below you can see a picture
| 1
|
10,195
| 13,056,042,798
|
IssuesEvent
|
2020-07-30 03:28:52
|
bridgetownrb/bridgetown
|
https://api.github.com/repos/bridgetownrb/bridgetown
|
opened
|
feat: Typing for core classes in Ruby 3
|
process
|
Ruby 3 will introduce a form of type checking, using the RBS type signature language.
More info here: https://developer.squareup.com/blog/the-state-of-ruby-3-typing/
I think we have an wonderful opportunity before us to get the core classes of Bridgetown ready for Ruby 3 typing so that folks developing plugins and other extensions to Bridgetown will get code completion and other useful info in their IDEs, and we'll get a small measure of greater confidence in our code quality.
Don't misunderstand me — I am *hardly* an advocate for strict typing. I absolutely love Ruby's dynamic typing and don't want to do anything to move away from that center. But adding type signatures for things like `Site`, `Document`, `Renderer`, etc. should require minimal (if any) alterations to existing code, while at the same time allowing us to typecheck those classes and provide downstream DX enhancements.
I welcome your thoughts and feedback.
|
1.0
|
feat: Typing for core classes in Ruby 3 - Ruby 3 will introduce a form of type checking, using the RBS type signature language.
More info here: https://developer.squareup.com/blog/the-state-of-ruby-3-typing/
I think we have an wonderful opportunity before us to get the core classes of Bridgetown ready for Ruby 3 typing so that folks developing plugins and other extensions to Bridgetown will get code completion and other useful info in their IDEs, and we'll get a small measure of greater confidence in our code quality.
Don't misunderstand me — I am *hardly* an advocate for strict typing. I absolutely love Ruby's dynamic typing and don't want to do anything to move away from that center. But adding type signatures for things like `Site`, `Document`, `Renderer`, etc. should require minimal (if any) alterations to existing code, while at the same time allowing us to typecheck those classes and provide downstream DX enhancements.
I welcome your thoughts and feedback.
|
process
|
feat typing for core classes in ruby ruby will introduce a form of type checking using the rbs type signature language more info here i think we have an wonderful opportunity before us to get the core classes of bridgetown ready for ruby typing so that folks developing plugins and other extensions to bridgetown will get code completion and other useful info in their ides and we ll get a small measure of greater confidence in our code quality don t misunderstand me — i am hardly an advocate for strict typing i absolutely love ruby s dynamic typing and don t want to do anything to move away from that center but adding type signatures for things like site document renderer etc should require minimal if any alterations to existing code while at the same time allowing us to typecheck those classes and provide downstream dx enhancements i welcome your thoughts and feedback
| 1
|
7,847
| 11,015,565,576
|
IssuesEvent
|
2019-12-05 02:05:15
|
GoogleCloudPlatform/python-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
|
closed
|
Run tests only for APIs with changes
|
testing type: process
|
Opening a PR should only result in test runs for APIs with changes.
google-cloud-python has a script to do something similar.
|
1.0
|
Run tests only for APIs with changes - Opening a PR should only result in test runs for APIs with changes.
google-cloud-python has a script to do something similar.
|
process
|
run tests only for apis with changes opening a pr should only result in test runs for apis with changes google cloud python has a script to do something similar
| 1
|
142,142
| 5,459,808,841
|
IssuesEvent
|
2017-03-09 02:06:32
|
ilios/frontend
|
https://api.github.com/repos/ilios/frontend
|
closed
|
broccoli typescript warnings on npm install
|
low priority
|
```bash
npm install
npm WARN broccoli-tslinter@2.0.1 requires a peer of typescript@^2.0.0 but none was installed.
npm WARN broccoli-tslinter@2.0.1 requires a peer of tslint@^4.0.2 but none was installed.
npm WARN broccoli-typescript-compiler@0.6.2 requires a peer of typescript@^1.6.2 || ^1.7.0 || ^1.8.0 || ^1.9.0-dev || ^2.0.0 || ^2.0.0-dev || ^2.1.0-dev || next but none was installed.
```
get rid of those.
|
1.0
|
broccoli typescript warnings on npm install - ```bash
npm install
npm WARN broccoli-tslinter@2.0.1 requires a peer of typescript@^2.0.0 but none was installed.
npm WARN broccoli-tslinter@2.0.1 requires a peer of tslint@^4.0.2 but none was installed.
npm WARN broccoli-typescript-compiler@0.6.2 requires a peer of typescript@^1.6.2 || ^1.7.0 || ^1.8.0 || ^1.9.0-dev || ^2.0.0 || ^2.0.0-dev || ^2.1.0-dev || next but none was installed.
```
get rid of those.
|
non_process
|
broccoli typescript warnings on npm install bash npm install npm warn broccoli tslinter requires a peer of typescript but none was installed npm warn broccoli tslinter requires a peer of tslint but none was installed npm warn broccoli typescript compiler requires a peer of typescript dev dev dev next but none was installed get rid of those
| 0
|
17,524
| 6,469,324,577
|
IssuesEvent
|
2017-08-17 05:23:02
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
--enable-static fails due to multiple definition
|
build tracing v7.x v8.x
|
* **Version**: v7.10.0, v8.0.0
* **Platform**: Linux x64 (Ubuntu 17.04)
* **Subsystem**: Build?
```
./configure --enable-static
make
```
ends up
```
/tmp/out/Release/obj.target/node/src/tracing/node_trace_writer.o: In function `node::tracing::NodeTraceWriter::FlushPrivate()':
node_trace_writer.cc:(.text+0x16e0): multiple definition of `node::tracing::NodeTraceWriter::FlushPrivate()'
/tmp/out/Release/obj.target/node/src/tracing/node_trace_writer.o:node_trace_writer.cc:(.text+0x16e0): first defined here
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o: In function `node::tracing::TraceEventHelper::SetCurrentPlatform(v8::Platform*)':
trace_event.cc:(.text+0x0): multiple definition of `node::tracing::TraceEventHelper::SetCurrentPlatform(v8::Platform*)'
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o:trace_event.cc:(.text+0x0): first defined here
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o:(.bss+0x0): multiple definition of `node::tracing::platform_'
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o:(.bss+0x0): first defined here
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o: In function `node::tracing::TraceEventHelper::GetCurrentPlatform()':
trace_event.cc:(.text+0x10): multiple definition of `node::tracing::TraceEventHelper::GetCurrentPlatform()'
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o:trace_event.cc:(.text+0x10): first defined here
collect2: error: ld returned 1 exit status
cctest.target.mk:218: recipe for target '/tmp/out/Release/cctest' failed
make[1]: *** [/tmp/out/Release/cctest] Error 1
rm 8cc54113df3f034cebe6283bb3a0080714e3f522.intermediate
Makefile:76: recipe for target 'node' failed
make: *** [node] Error 2
```
for me.
With v7.4.0, things are working.
|
1.0
|
--enable-static fails due to multiple definition - * **Version**: v7.10.0, v8.0.0
* **Platform**: Linux x64 (Ubuntu 17.04)
* **Subsystem**: Build?
```
./configure --enable-static
make
```
ends up
```
/tmp/out/Release/obj.target/node/src/tracing/node_trace_writer.o: In function `node::tracing::NodeTraceWriter::FlushPrivate()':
node_trace_writer.cc:(.text+0x16e0): multiple definition of `node::tracing::NodeTraceWriter::FlushPrivate()'
/tmp/out/Release/obj.target/node/src/tracing/node_trace_writer.o:node_trace_writer.cc:(.text+0x16e0): first defined here
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o: In function `node::tracing::TraceEventHelper::SetCurrentPlatform(v8::Platform*)':
trace_event.cc:(.text+0x0): multiple definition of `node::tracing::TraceEventHelper::SetCurrentPlatform(v8::Platform*)'
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o:trace_event.cc:(.text+0x0): first defined here
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o:(.bss+0x0): multiple definition of `node::tracing::platform_'
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o:(.bss+0x0): first defined here
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o: In function `node::tracing::TraceEventHelper::GetCurrentPlatform()':
trace_event.cc:(.text+0x10): multiple definition of `node::tracing::TraceEventHelper::GetCurrentPlatform()'
/tmp/out/Release/obj.target/node/src/tracing/trace_event.o:trace_event.cc:(.text+0x10): first defined here
collect2: error: ld returned 1 exit status
cctest.target.mk:218: recipe for target '/tmp/out/Release/cctest' failed
make[1]: *** [/tmp/out/Release/cctest] Error 1
rm 8cc54113df3f034cebe6283bb3a0080714e3f522.intermediate
Makefile:76: recipe for target 'node' failed
make: *** [node] Error 2
```
for me.
With v7.4.0, things are working.
|
non_process
|
enable static fails due to multiple definition version platform linux ubuntu subsystem build configure enable static make ends up tmp out release obj target node src tracing node trace writer o in function node tracing nodetracewriter flushprivate node trace writer cc text multiple definition of node tracing nodetracewriter flushprivate tmp out release obj target node src tracing node trace writer o node trace writer cc text first defined here tmp out release obj target node src tracing trace event o in function node tracing traceeventhelper setcurrentplatform platform trace event cc text multiple definition of node tracing traceeventhelper setcurrentplatform platform tmp out release obj target node src tracing trace event o trace event cc text first defined here tmp out release obj target node src tracing trace event o bss multiple definition of node tracing platform tmp out release obj target node src tracing trace event o bss first defined here tmp out release obj target node src tracing trace event o in function node tracing traceeventhelper getcurrentplatform trace event cc text multiple definition of node tracing traceeventhelper getcurrentplatform tmp out release obj target node src tracing trace event o trace event cc text first defined here error ld returned exit status cctest target mk recipe for target tmp out release cctest failed make error rm intermediate makefile recipe for target node failed make error for me with things are working
| 0
|
17,056
| 2,972,401,364
|
IssuesEvent
|
2015-07-14 13:37:22
|
mabe02/lanterna
|
https://api.github.com/repos/mabe02/lanterna
|
closed
|
Remove traces of Container interface from Window
|
auto-migrated Priority-Medium Type-Defect
|
```
Window used to implement Container interface, but is no longer. We should
remove and clean-up the code in Window class to remove the traces of
implementing Container, by exposing the content pane.
```
Original issue reported on code.google.com by `mab...@gmail.com` on 16 Sep 2012 at 9:07
|
1.0
|
Remove traces of Container interface from Window - ```
Window used to implement Container interface, but is no longer. We should
remove and clean-up the code in Window class to remove the traces of
implementing Container, by exposing the content pane.
```
Original issue reported on code.google.com by `mab...@gmail.com` on 16 Sep 2012 at 9:07
|
non_process
|
remove traces of container interface from window window used to implement container interface but is no longer we should remove and clean up the code in window class to remove the traces of implementing container by exposing the content pane original issue reported on code google com by mab gmail com on sep at
| 0
|
4,598
| 5,206,395,414
|
IssuesEvent
|
2017-01-24 20:30:24
|
jamesmundia/gaffablaze
|
https://api.github.com/repos/jamesmundia/gaffablaze
|
opened
|
Alanning Roles for Admin access data
|
db EPIC SaaS / Production Security UX
|
For example the DOCs or a club admin might need elevated access to team data when in production
|
True
|
Alanning Roles for Admin access data - For example the DOCs or a club admin might need elevated access to team data when in production
|
non_process
|
alanning roles for admin access data for example the docs or a club admin might need elevated access to team data when in production
| 0
|
8,088
| 11,257,750,843
|
IssuesEvent
|
2020-01-13 00:55:28
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
java.lang.NoSuchMethodError error with metabase 0.33.2
|
Database/BigQuery Priority:P1 Querying/Processor Type:Bug
|
**Describe the bug**
We have recently upgraded to metabse 0.33.2 from 0.31.2.
After the upgrade we have observed that our dashboards and questions are showing following error frequently-
":error "'java.util.List com.google.common.base.Splitter.splitToList(java.lang.CharSequence)'"," error.
To fix the error we have to restart the metabse services. But the error again shows up after couple of hours.
**Logs**
Sep 16 05:36:19 metabase-vm-dup java[13330]: :class java.lang.NoSuchMethodError,
Sep 16 05:36:19 metabase-vm-dup java[13330]: :error "'java.util.List com.google.common.base.Splitter.splitToList(java.lang.CharSequence)'",
Sep 16 05:36:19 metabase-vm-dup java[13330]: :stacktrace
Sep 16 05:36:19 metabase-vm-dup java[13330]: ("com.google.api.client.http.UriTemplate.expand(UriTemplate.java:305)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.http.UriTemplate.expand(UriTemplate.java:262)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequestUrl(AbstractGoogleClientRequest.java:346)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequest(AbstractGoogleClientRequest.java:381)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:499)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:549)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "--> driver.google$execute_no_auto_retry.invokeStatic(google.clj:36)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.google$execute_no_auto_retry.invoke(google.clj:32)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.google$execute$fn__1128.invoke(google.clj:50)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:429)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:433)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:433)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.google$execute.invokeStatic(google.clj:49)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.google$execute.invoke(google.clj:43)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$execute_bigquery.invokeStatic(bigquery.clj:174)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$execute_bigquery.invoke(bigquery.clj:163)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$execute_bigquery.invokeStatic(bigquery.clj:165)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$execute_bigquery.invoke(bigquery.clj:163)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$process_native_STAR_$fn__1432.invoke(bigquery.clj:259)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:429)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:433)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$process_native_STAR_.invokeStatic(bigquery.clj:258)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$process_native_STAR_.invoke(bigquery.clj:254)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$eval1626$fn__1628.invoke(bigquery.clj:443)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor$fn__43955$execute_query__43960$fn__43961.invoke(query_processor.clj:70)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor$fn__43955$execute_query__43960.invoke(query_processor.clj:64)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__34096.invoke(mbql_to_native.clj:38)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.annotate$result_rows_maps__GT_vectors$fn__36340.invoke(annotate.clj:540)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.annotate$add_column_info$fn__36246.invoke(annotate.clj:484)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__37267.invoke(cumulative_aggregations.clj:57)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.resolve_joins$resolve_joins$fn__41065.invoke(resolve_joins.clj:184)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.limit$limit$fn__38003.invoke(limit.clj:19)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__43824.invoke(results_metadata.clj:87)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.format_rows$format_rows$fn__37991.invoke(format_rows.clj:26)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.add_dimension_projections$add_remapping$fn__34855.invoke(add_dimension_projections.clj:232)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__35474.invoke(add_source_metadata.clj:107)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__41115.invoke(resolve_source_table.clj:46)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.add_row_count_and_status$add_row_count_and_status$fn__35334.invoke(add_row_count_and_status.clj:16)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.driver_specific$process_query_in_context$fn__37478.invoke(driver_specific.clj:12)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.resolve_driver$resolve_driver$fn__40729.invoke(resolve_driver.clj:22)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.bind_effective_timezone$bind_effective_timezone$fn__36669$fn__36670.invoke(bind_effective_timezone.clj:9)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util.date$call_with_effective_timezone.invokeStatic(date.clj:88)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util.date$call_with_effective_timezone.invoke(date.clj:77)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.bind_effective_timezone$bind_effective_timezone$fn__36669.invoke(bind_effective_timezone.clj:8)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.store$initialize_store$fn__43849$fn__43850.invoke(store.clj:11)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.store$do_with_store.invokeStatic(store.clj:46)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.store$do_with_store.invoke(store.clj:40)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.store$initialize_store$fn__43849.invoke(store.clj:10)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.async$async__GT_sync$fn__34007.invoke(async.clj:23)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.async_wait$runnable$fn__36397.invoke(async_wait.clj:89)"),
Sep 16 05:36:19 metabase-vm-dup java[13330]: :query
Sep 16 05:36:19 metabase-vm-dup java[13330]: {:constraints {:max-results 10000, :max-results-bare-rows 2000},
Sep 16 05:36:19 metabase-vm-dup java[13330]: :type :native,
Sep 16 05:36:19 metabase-vm-dup java[13330]: :middleware {:userland-query? true},
Sep 16 05:36:19 metabase-vm-dup java[13330]: :native
Sep 16 05:36:19 metabase-vm-dup java[13330]: {:template-tags
Sep 16 05:36:19 metabase-vm-dup java[13330]: {"Customer_base"
Sep 16 05:36:19 metabase-vm-dup java[13330]: {:id "03cb1f70-ae23-74ce-5896-7522fff6c6b0",
**To Reproduce**
Steps to reproduce the behavior:
1. Start metabase application.
2. Let it run for couple of hours and then the error shows up for all dashboards and questions.
**Information about your Metabase Installation:**
- Your operating system: Ubuntu 18.04
- Your databases: BigQuery
- Metabase version: 0.33.2
- Metabase hosting environment: Jar-file on Ubuntu (GCP Compute Engine)
- Metabase internal database: MySQL
**Severity**
It is impacting all our metabase users. Including Production users.
**Additional context**
This error was not observed in metabase 0.31.2 version.
|
1.0
|
java.lang.NoSuchMethodError error with metabase 0.33.2 - **Describe the bug**
We have recently upgraded to metabse 0.33.2 from 0.31.2.
After the upgrade we have observed that our dashboards and questions are showing following error frequently-
":error "'java.util.List com.google.common.base.Splitter.splitToList(java.lang.CharSequence)'"," error.
To fix the error we have to restart the metabse services. But the error again shows up after couple of hours.
**Logs**
Sep 16 05:36:19 metabase-vm-dup java[13330]: :class java.lang.NoSuchMethodError,
Sep 16 05:36:19 metabase-vm-dup java[13330]: :error "'java.util.List com.google.common.base.Splitter.splitToList(java.lang.CharSequence)'",
Sep 16 05:36:19 metabase-vm-dup java[13330]: :stacktrace
Sep 16 05:36:19 metabase-vm-dup java[13330]: ("com.google.api.client.http.UriTemplate.expand(UriTemplate.java:305)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.http.UriTemplate.expand(UriTemplate.java:262)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequestUrl(AbstractGoogleClientRequest.java:346)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequest(AbstractGoogleClientRequest.java:381)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:499)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:549)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "--> driver.google$execute_no_auto_retry.invokeStatic(google.clj:36)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.google$execute_no_auto_retry.invoke(google.clj:32)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.google$execute$fn__1128.invoke(google.clj:50)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:429)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:433)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:433)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.google$execute.invokeStatic(google.clj:49)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.google$execute.invoke(google.clj:43)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$execute_bigquery.invokeStatic(bigquery.clj:174)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$execute_bigquery.invoke(bigquery.clj:163)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$execute_bigquery.invokeStatic(bigquery.clj:165)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$execute_bigquery.invoke(bigquery.clj:163)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$process_native_STAR_$fn__1432.invoke(bigquery.clj:259)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:429)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invokeStatic(util.clj:433)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util$do_with_auto_retries.invoke(util.clj:421)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$process_native_STAR_.invokeStatic(bigquery.clj:258)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$process_native_STAR_.invoke(bigquery.clj:254)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "driver.bigquery$eval1626$fn__1628.invoke(bigquery.clj:443)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor$fn__43955$execute_query__43960$fn__43961.invoke(query_processor.clj:70)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor$fn__43955$execute_query__43960.invoke(query_processor.clj:64)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.mbql_to_native$mbql__GT_native$fn__34096.invoke(mbql_to_native.clj:38)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.annotate$result_rows_maps__GT_vectors$fn__36340.invoke(annotate.clj:540)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.annotate$add_column_info$fn__36246.invoke(annotate.clj:484)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__37267.invoke(cumulative_aggregations.clj:57)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.resolve_joins$resolve_joins$fn__41065.invoke(resolve_joins.clj:184)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.limit$limit$fn__38003.invoke(limit.clj:19)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__43824.invoke(results_metadata.clj:87)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.format_rows$format_rows$fn__37991.invoke(format_rows.clj:26)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.add_dimension_projections$add_remapping$fn__34855.invoke(add_dimension_projections.clj:232)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__35474.invoke(add_source_metadata.clj:107)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.resolve_source_table$resolve_source_tables$fn__41115.invoke(resolve_source_table.clj:46)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.add_row_count_and_status$add_row_count_and_status$fn__35334.invoke(add_row_count_and_status.clj:16)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.driver_specific$process_query_in_context$fn__37478.invoke(driver_specific.clj:12)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.resolve_driver$resolve_driver$fn__40729.invoke(resolve_driver.clj:22)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.bind_effective_timezone$bind_effective_timezone$fn__36669$fn__36670.invoke(bind_effective_timezone.clj:9)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util.date$call_with_effective_timezone.invokeStatic(date.clj:88)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "util.date$call_with_effective_timezone.invoke(date.clj:77)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.bind_effective_timezone$bind_effective_timezone$fn__36669.invoke(bind_effective_timezone.clj:8)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.store$initialize_store$fn__43849$fn__43850.invoke(store.clj:11)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.store$do_with_store.invokeStatic(store.clj:46)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.store$do_with_store.invoke(store.clj:40)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.store$initialize_store$fn__43849.invoke(store.clj:10)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.async$async__GT_sync$fn__34007.invoke(async.clj:23)"
Sep 16 05:36:19 metabase-vm-dup java[13330]: "query_processor.middleware.async_wait$runnable$fn__36397.invoke(async_wait.clj:89)"),
Sep 16 05:36:19 metabase-vm-dup java[13330]: :query
Sep 16 05:36:19 metabase-vm-dup java[13330]: {:constraints {:max-results 10000, :max-results-bare-rows 2000},
Sep 16 05:36:19 metabase-vm-dup java[13330]: :type :native,
Sep 16 05:36:19 metabase-vm-dup java[13330]: :middleware {:userland-query? true},
Sep 16 05:36:19 metabase-vm-dup java[13330]: :native
Sep 16 05:36:19 metabase-vm-dup java[13330]: {:template-tags
Sep 16 05:36:19 metabase-vm-dup java[13330]: {"Customer_base"
Sep 16 05:36:19 metabase-vm-dup java[13330]: {:id "03cb1f70-ae23-74ce-5896-7522fff6c6b0",
**To Reproduce**
Steps to reproduce the behavior:
1. Start metabase application.
2. Let it run for couple of hours and then the error shows up for all dashboards and questions.
**Information about your Metabase Installation:**
- Your operating system: Ubuntu 18.04
- Your databases: BigQuery
- Metabase version: 0.33.2
- Metabase hosting environment: Jar-file on Ubuntu (GCP Compute Engine)
- Metabase internal database: MySQL
**Severity**
It is impacting all our metabase users. Including Production users.
**Additional context**
This error was not observed in metabase 0.31.2 version.
|
process
|
java lang nosuchmethoderror error with metabase describe the bug we have recently upgraded to metabse from after the upgrade we have observed that our dashboards and questions are showing following error frequently error java util list com google common base splitter splittolist java lang charsequence error to fix the error we have to restart the metabse services but the error again shows up after couple of hours logs sep metabase vm dup java class java lang nosuchmethoderror sep metabase vm dup java error java util list com google common base splitter splittolist java lang charsequence sep metabase vm dup java stacktrace sep metabase vm dup java com google api client http uritemplate expand uritemplate java sep metabase vm dup java com google api client http uritemplate expand uritemplate java sep metabase vm dup java com google api client googleapis services abstractgoogleclientrequest buildhttprequesturl abstractgoogleclientrequest java sep metabase vm dup java com google api client googleapis services abstractgoogleclientrequest buildhttprequest abstractgoogleclientrequest java sep metabase vm dup java com google api client googleapis services abstractgoogleclientrequest executeunparsed abstractgoogleclientrequest java sep metabase vm dup java com google api client googleapis services abstractgoogleclientrequest executeunparsed abstractgoogleclientrequest java sep metabase vm dup java com google api client googleapis services abstractgoogleclientrequest execute abstractgoogleclientrequest java sep metabase vm dup java driver google execute no auto retry invokestatic google clj sep metabase vm dup java driver google execute no auto retry invoke google clj sep metabase vm dup java driver google execute fn invoke google clj sep metabase vm dup java util do with auto retries invokestatic util clj sep metabase vm dup java util do with auto retries invoke util clj sep metabase vm dup java util do with auto retries invokestatic util clj sep metabase vm dup java util do with auto retries invoke util clj sep metabase vm dup java util do with auto retries invokestatic util clj sep metabase vm dup java util do with auto retries invoke util clj sep metabase vm dup java driver google execute invokestatic google clj sep metabase vm dup java driver google execute invoke google clj sep metabase vm dup java driver bigquery execute bigquery invokestatic bigquery clj sep metabase vm dup java driver bigquery execute bigquery invoke bigquery clj sep metabase vm dup java driver bigquery execute bigquery invokestatic bigquery clj sep metabase vm dup java driver bigquery execute bigquery invoke bigquery clj sep metabase vm dup java driver bigquery process native star fn invoke bigquery clj sep metabase vm dup java util do with auto retries invokestatic util clj sep metabase vm dup java util do with auto retries invoke util clj sep metabase vm dup java util do with auto retries invokestatic util clj sep metabase vm dup java util do with auto retries invoke util clj sep metabase vm dup java driver bigquery process native star invokestatic bigquery clj sep metabase vm dup java driver bigquery process native star invoke bigquery clj sep metabase vm dup java driver bigquery fn invoke bigquery clj sep metabase vm dup java query processor fn execute query fn invoke query processor clj sep metabase vm dup java query processor fn execute query invoke query processor clj sep metabase vm dup java query processor middleware mbql to native mbql gt native fn invoke mbql to native clj sep metabase vm dup java query processor middleware annotate result rows maps gt vectors fn invoke annotate clj sep metabase vm dup java query processor middleware annotate add column info fn invoke annotate clj sep metabase vm dup java query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj sep metabase vm dup java query processor middleware resolve joins resolve joins fn invoke resolve joins clj sep metabase vm dup java query processor middleware limit limit fn invoke limit clj sep metabase vm dup java query processor middleware results metadata record and return metadata bang fn invoke results metadata clj sep metabase vm dup java query processor middleware format rows format rows fn invoke format rows clj sep metabase vm dup java query processor middleware add dimension projections add remapping fn invoke add dimension projections clj sep metabase vm dup java query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj sep metabase vm dup java query processor middleware resolve source table resolve source tables fn invoke resolve source table clj sep metabase vm dup java query processor middleware add row count and status add row count and status fn invoke add row count and status clj sep metabase vm dup java query processor middleware driver specific process query in context fn invoke driver specific clj sep metabase vm dup java query processor middleware resolve driver resolve driver fn invoke resolve driver clj sep metabase vm dup java query processor middleware bind effective timezone bind effective timezone fn fn invoke bind effective timezone clj sep metabase vm dup java util date call with effective timezone invokestatic date clj sep metabase vm dup java util date call with effective timezone invoke date clj sep metabase vm dup java query processor middleware bind effective timezone bind effective timezone fn invoke bind effective timezone clj sep metabase vm dup java query processor middleware store initialize store fn fn invoke store clj sep metabase vm dup java query processor store do with store invokestatic store clj sep metabase vm dup java query processor store do with store invoke store clj sep metabase vm dup java query processor middleware store initialize store fn invoke store clj sep metabase vm dup java query processor middleware async async gt sync fn invoke async clj sep metabase vm dup java query processor middleware async wait runnable fn invoke async wait clj sep metabase vm dup java query sep metabase vm dup java constraints max results max results bare rows sep metabase vm dup java type native sep metabase vm dup java middleware userland query true sep metabase vm dup java native sep metabase vm dup java template tags sep metabase vm dup java customer base sep metabase vm dup java id to reproduce steps to reproduce the behavior start metabase application let it run for couple of hours and then the error shows up for all dashboards and questions information about your metabase installation your operating system ubuntu your databases bigquery metabase version metabase hosting environment jar file on ubuntu gcp compute engine metabase internal database mysql severity it is impacting all our metabase users including production users additional context this error was not observed in metabase version
| 1
|
484,222
| 13,936,485,467
|
IssuesEvent
|
2020-10-22 13:00:18
|
ihhub/fheroes2
|
https://api.github.com/repos/ihhub/fheroes2
|
closed
|
Arena choice window, the letter "K" is missing in the word "Knowledge"
|
bug good first issue low priority ui
|
In fheroes2, with the option "heroes: in Arena can choose any of primary skills" activated, in the Arena choice window, the letter "K" is missing in the word "Knowledge" under the skill icon.
In fheroes2:

|
1.0
|
Arena choice window, the letter "K" is missing in the word "Knowledge" - In fheroes2, with the option "heroes: in Arena can choose any of primary skills" activated, in the Arena choice window, the letter "K" is missing in the word "Knowledge" under the skill icon.
In fheroes2:

|
non_process
|
arena choice window the letter k is missing in the word knowledge in with the option heroes in arena can choose any of primary skills activated in the arena choice window the letter k is missing in the word knowledge under the skill icon in
| 0
|
13,226
| 15,691,845,271
|
IssuesEvent
|
2021-03-25 18:22:12
|
GoogleCloudPlatform/php-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/php-docs-samples
|
closed
|
longRunningRecognize and pollUntilComplete doesn't work with really long files
|
:rotating_light: api: speech priority: p2 samples type: bug type: process
|
I'm using this code https://github.com/GoogleCloudPlatform/php-docs-samples/blob/master/speech/src/transcribe_async_gcs.php to upload and transcribe a 1 hour audio file, and the `$operation->pollUntilComplete();` function ends but both `$operation->operationSucceeded()` and `$operation->operationFailed()` are `false`, and `$operation->getError()` is empty.
Tested it in multiple platforms, always using the options to transcribe the file from a Google Cloud Storage bucket. Same code for other audio lengths works correctly.
|
1.0
|
longRunningRecognize and pollUntilComplete doesn't work with really long files - I'm using this code https://github.com/GoogleCloudPlatform/php-docs-samples/blob/master/speech/src/transcribe_async_gcs.php to upload and transcribe a 1 hour audio file, and the `$operation->pollUntilComplete();` function ends but both `$operation->operationSucceeded()` and `$operation->operationFailed()` are `false`, and `$operation->getError()` is empty.
Tested it in multiple platforms, always using the options to transcribe the file from a Google Cloud Storage bucket. Same code for other audio lengths works correctly.
|
process
|
longrunningrecognize and polluntilcomplete doesn t work with really long files i m using this code to upload and transcribe a hour audio file and the operation polluntilcomplete function ends but both operation operationsucceeded and operation operationfailed are false and operation geterror is empty tested it in multiple platforms always using the options to transcribe the file from a google cloud storage bucket same code for other audio lengths works correctly
| 1
|
508,431
| 14,700,115,316
|
IssuesEvent
|
2021-01-04 09:42:06
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.pinterest.co.uk - see bug description
|
browser-firefox engine-gecko ml-needsdiagnosis-false priority-normal
|
<!-- @browser: Firefox 84.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:84.0) Gecko/20100101 Firefox/84.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/64797 -->
**URL**: https://www.pinterest.co.uk/
**Browser / Version**: Firefox 84.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Internet Explorer
**Problem type**: Something else
**Description**: Page does not load, plain white screen. I have no problem in Microsoft Edge, but cannot use Imagus, and this enhances the site enormously, I also sometimes cannot see emails either which is very annoying
**Steps to Reproduce**:
Works ok.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.pinterest.co.uk - see bug description - <!-- @browser: Firefox 84.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:84.0) Gecko/20100101 Firefox/84.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/64797 -->
**URL**: https://www.pinterest.co.uk/
**Browser / Version**: Firefox 84.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Internet Explorer
**Problem type**: Something else
**Description**: Page does not load, plain white screen. I have no problem in Microsoft Edge, but cannot use Imagus, and this enhances the site enormously, I also sometimes cannot see emails either which is very annoying
**Steps to Reproduce**:
Works ok.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
see bug description url browser version firefox operating system windows tested another browser yes internet explorer problem type something else description page does not load plain white screen i have no problem in microsoft edge but cannot use imagus and this enhances the site enormously i also sometimes cannot see emails either which is very annoying steps to reproduce works ok browser configuration none from with ❤️
| 0
|
321,512
| 27,535,621,416
|
IssuesEvent
|
2023-03-07 03:04:55
|
mumble-voip/mumble
|
https://api.github.com/repos/mumble-voip/mumble
|
closed
|
Mumble doesn't reconnect after sleep mode
|
client windows needs-more-input priority/P3 - Somewhat important needs-ckeck-with-latest-version stale-no-response
|
So, the title says itself. If you put your OS into a sleep mode and then go back, mumble never reconnects. Of course I have all these "reconnect automatically" option enabled. I guess that bug started with 1.2.3 beta, because I remember times when mumble always reconnected after sleep/suspend mode.
_This ticket has been migrated from sourceforge. It is thus missing some details like original creator etc.
The original is at https://sourceforge.net/p/mumble/bugs/996/ ._
|
1.0
|
Mumble doesn't reconnect after sleep mode - So, the title says itself. If you put your OS into a sleep mode and then go back, mumble never reconnects. Of course I have all these "reconnect automatically" option enabled. I guess that bug started with 1.2.3 beta, because I remember times when mumble always reconnected after sleep/suspend mode.
_This ticket has been migrated from sourceforge. It is thus missing some details like original creator etc.
The original is at https://sourceforge.net/p/mumble/bugs/996/ ._
|
non_process
|
mumble doesn t reconnect after sleep mode so the title says itself if you put your os into a sleep mode and then go back mumble never reconnects of course i have all these reconnect automatically option enabled i guess that bug started with beta because i remember times when mumble always reconnected after sleep suspend mode this ticket has been migrated from sourceforge it is thus missing some details like original creator etc the original is at
| 0
|
13,284
| 15,762,555,293
|
IssuesEvent
|
2021-03-31 11:13:13
|
CATcher-org/CATcher
|
https://api.github.com/repos/CATcher-org/CATcher
|
closed
|
Enforcing Prettier on commit and push
|
aspect-Process
|
Currently the `pretty-quick` setting does not automatically restage formatted files.
It is possible for devs to push a commit without applying prettier suggestions.
Let's
* add the `--staged` flag to automatically restage and commit formatting changes
* add `prettier --check` in the pre-push hook
|
1.0
|
Enforcing Prettier on commit and push - Currently the `pretty-quick` setting does not automatically restage formatted files.
It is possible for devs to push a commit without applying prettier suggestions.
Let's
* add the `--staged` flag to automatically restage and commit formatting changes
* add `prettier --check` in the pre-push hook
|
process
|
enforcing prettier on commit and push currently the pretty quick setting does not automatically restage formatted files it is possible for devs to push a commit without applying prettier suggestions let s add the staged flag to automatically restage and commit formatting changes add prettier check in the pre push hook
| 1
|
622,765
| 19,656,347,485
|
IssuesEvent
|
2022-01-10 12:56:25
|
ls1intum/Artemis
|
https://api.github.com/repos/ls1intum/Artemis
|
closed
|
Possible Race Condition in "Next Assessment"
|
bug good first issue priority:high
|
### Describe the bug
We've recently run into the bug that two teaching assistants had to assess the same solution which lead to issues regarding the grading. After talking to @krusche the idea is that this behavior might be caused by a not handled race condition in the "Assess next solution" functionality. As you can see in the picture, you can see less dates than assessments.
As this state could only be resolved by manually editing the Database, I've marked this bug as "priority:high".
**EDIT: @krusche gave me the information that the lock mechanism for complaints has been reworked in the recent past; maybe this mechanism can be used for the assess next solution feature as well**
#### To Reproduce
<!-- Steps to reproduce the behavior: -->
As this might be caused by a race condition, it's quite hard to reproduce.
#### Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
A Submission shall be assessed by only one teaching assistant
#### Screenshots
<!-- If applicable, add screenshots to help explain your problem. -->

### Environment
Artemis 4.12.1 (Gitlab/Jenkins)
|
1.0
|
Possible Race Condition in "Next Assessment" - ### Describe the bug
We've recently run into the bug that two teaching assistants had to assess the same solution which lead to issues regarding the grading. After talking to @krusche the idea is that this behavior might be caused by a not handled race condition in the "Assess next solution" functionality. As you can see in the picture, you can see less dates than assessments.
As this state could only be resolved by manually editing the Database, I've marked this bug as "priority:high".
**EDIT: @krusche gave me the information that the lock mechanism for complaints has been reworked in the recent past; maybe this mechanism can be used for the assess next solution feature as well**
#### To Reproduce
<!-- Steps to reproduce the behavior: -->
As this might be caused by a race condition, it's quite hard to reproduce.
#### Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
A Submission shall be assessed by only one teaching assistant
#### Screenshots
<!-- If applicable, add screenshots to help explain your problem. -->

### Environment
Artemis 4.12.1 (Gitlab/Jenkins)
|
non_process
|
possible race condition in next assessment describe the bug we ve recently run into the bug that two teaching assistants had to assess the same solution which lead to issues regarding the grading after talking to krusche the idea is that this behavior might be caused by a not handled race condition in the assess next solution functionality as you can see in the picture you can see less dates than assessments as this state could only be resolved by manually editing the database i ve marked this bug as priority high edit krusche gave me the information that the lock mechanism for complaints has been reworked in the recent past maybe this mechanism can be used for the assess next solution feature as well to reproduce as this might be caused by a race condition it s quite hard to reproduce expected behavior a submission shall be assessed by only one teaching assistant screenshots environment artemis gitlab jenkins
| 0
|
21,496
| 29,661,776,438
|
IssuesEvent
|
2023-06-10 08:41:28
|
goravel/goravel
|
https://api.github.com/repos/goravel/goravel
|
closed
|
✨ [Feature] Use hibiken/asynq replace RichardKnop/machinery
|
enhancement processing
|
### Before feedback (在反馈之前)
- [X] There are no features that I want to submit in Issues (当前 Issues 中没有我将要提交的新功能)
### Describe Feature (描述功能)
https://github.com/RichardKnop/machinery library has not been maintained for a long time, and recently there are feedbacks that the queue cannot be stopped after being started. I don’t know if it is the reason for this library. It is recommended to replace it with https://github.com/hibiken/asynq, which is being actively maintained.
|
1.0
|
✨ [Feature] Use hibiken/asynq replace RichardKnop/machinery - ### Before feedback (在反馈之前)
- [X] There are no features that I want to submit in Issues (当前 Issues 中没有我将要提交的新功能)
### Describe Feature (描述功能)
https://github.com/RichardKnop/machinery library has not been maintained for a long time, and recently there are feedbacks that the queue cannot be stopped after being started. I don’t know if it is the reason for this library. It is recommended to replace it with https://github.com/hibiken/asynq, which is being actively maintained.
|
process
|
✨ use hibiken asynq replace richardknop machinery before feedback 在反馈之前 there are no features that i want to submit in issues 当前 issues 中没有我将要提交的新功能 describe feature 描述功能 library has not been maintained for a long time and recently there are feedbacks that the queue cannot be stopped after being started i don’t know if it is the reason for this library it is recommended to replace it with which is being actively maintained
| 1
|
12,798
| 15,180,809,528
|
IssuesEvent
|
2021-02-15 01:20:27
|
Geonovum/disgeo-arch
|
https://api.github.com/repos/Geonovum/disgeo-arch
|
closed
|
4.2 Functionele lagen in de inrichting
|
In Behandeling In behandeling - voorstel processen e.d. Processen Functies Componenten
|
Waarin wordt er niet aangesloten bij de 5 lagen architectuur van Common ground en de bijbehorende principes?
https://commonground.nl/file/download/54476935/Common%20Ground%20Infographic.pdf Genoemde functionele lagen zijn geen lagen maar kolommen. Leidt tot verwarring met andere lagenmodellen. Voegt ook niets toe om dat lagen te noemen.
Registreren lijkt ontkoppeld van ontsluiten/verstrekken. Als de keten kantelt, en data-gedreven werkt, dan is er "maar 1 database" (logisch gezien), waar alle processen in samenhang op werken. (waar nodig in samenhang bijwerken en waar gewenst in samenhang verstrekken). De inwinnende partij is verantwoordelijk voor de correctheid van de data, die verstrekt wordt - en wordt geacht zelf die data op orde te maken. Dan ligt de correctheid van de data ook bij de juiste partij, de partij met kennis, en die partij is zich bewust/sensitief van hoe de data gebruikt wordt. Kortom: inwinnen voor gebruik.
Belangrijk is dat er 1 partij verantwoordelijk is voor de registratie, ongeacht of het gaat om registreren of gebruik. Dus een principe dat je inricht/optimaliseert voor gebruik. De gebruiksprocessen zijn vaak veel heftiger dan de registratieprocessen.
Er kan een aparte database van gemaakt worden, bv. een real-time replica, desnoods een aparte database met deels een apart technisch datamodel, maar niet een apart informatiemodel. De informatie moet in de keten gelijk zijn, met dezelfde betekenis en ontsloten zodra dit kan. Ook m.b.t. wanneer dit kan, beschikt alleen de inwinnende partij (over het algemeen) over de juiste kennis om dit in te schatten/te bepalen. Als je dus aparte technische datamodellen maakt, hou het informatiemodel deel dan 1 op 1 hetzelfde (!, dan kan dit wel, maar met behoud van consistentie).
Het gaat niet om informatieproducten, of informatie op maat. Die onderwerpen zitten in een verlengstuk op de bron. Het gaat over de data in de bron. Dus de informatie waarmee de informatieproducten het moeten doen. Immers, alleen deze informatie uit de bron is er, er is verder niks (ook al wil je meer, er is niet meer, ook al wil je informatie met een andere betekenis, die is er niet, want het is niet zo ingewonnen).
|
2.0
|
4.2 Functionele lagen in de inrichting - Waarin wordt er niet aangesloten bij de 5 lagen architectuur van Common ground en de bijbehorende principes?
https://commonground.nl/file/download/54476935/Common%20Ground%20Infographic.pdf Genoemde functionele lagen zijn geen lagen maar kolommen. Leidt tot verwarring met andere lagenmodellen. Voegt ook niets toe om dat lagen te noemen.
Registreren lijkt ontkoppeld van ontsluiten/verstrekken. Als de keten kantelt, en data-gedreven werkt, dan is er "maar 1 database" (logisch gezien), waar alle processen in samenhang op werken. (waar nodig in samenhang bijwerken en waar gewenst in samenhang verstrekken). De inwinnende partij is verantwoordelijk voor de correctheid van de data, die verstrekt wordt - en wordt geacht zelf die data op orde te maken. Dan ligt de correctheid van de data ook bij de juiste partij, de partij met kennis, en die partij is zich bewust/sensitief van hoe de data gebruikt wordt. Kortom: inwinnen voor gebruik.
Belangrijk is dat er 1 partij verantwoordelijk is voor de registratie, ongeacht of het gaat om registreren of gebruik. Dus een principe dat je inricht/optimaliseert voor gebruik. De gebruiksprocessen zijn vaak veel heftiger dan de registratieprocessen.
Er kan een aparte database van gemaakt worden, bv. een real-time replica, desnoods een aparte database met deels een apart technisch datamodel, maar niet een apart informatiemodel. De informatie moet in de keten gelijk zijn, met dezelfde betekenis en ontsloten zodra dit kan. Ook m.b.t. wanneer dit kan, beschikt alleen de inwinnende partij (over het algemeen) over de juiste kennis om dit in te schatten/te bepalen. Als je dus aparte technische datamodellen maakt, hou het informatiemodel deel dan 1 op 1 hetzelfde (!, dan kan dit wel, maar met behoud van consistentie).
Het gaat niet om informatieproducten, of informatie op maat. Die onderwerpen zitten in een verlengstuk op de bron. Het gaat over de data in de bron. Dus de informatie waarmee de informatieproducten het moeten doen. Immers, alleen deze informatie uit de bron is er, er is verder niks (ook al wil je meer, er is niet meer, ook al wil je informatie met een andere betekenis, die is er niet, want het is niet zo ingewonnen).
|
process
|
functionele lagen in de inrichting waarin wordt er niet aangesloten bij de lagen architectuur van common ground en de bijbehorende principes genoemde functionele lagen zijn geen lagen maar kolommen leidt tot verwarring met andere lagenmodellen voegt ook niets toe om dat lagen te noemen registreren lijkt ontkoppeld van ontsluiten verstrekken als de keten kantelt en data gedreven werkt dan is er maar database logisch gezien waar alle processen in samenhang op werken waar nodig in samenhang bijwerken en waar gewenst in samenhang verstrekken de inwinnende partij is verantwoordelijk voor de correctheid van de data die verstrekt wordt en wordt geacht zelf die data op orde te maken dan ligt de correctheid van de data ook bij de juiste partij de partij met kennis en die partij is zich bewust sensitief van hoe de data gebruikt wordt kortom inwinnen voor gebruik belangrijk is dat er partij verantwoordelijk is voor de registratie ongeacht of het gaat om registreren of gebruik dus een principe dat je inricht optimaliseert voor gebruik de gebruiksprocessen zijn vaak veel heftiger dan de registratieprocessen er kan een aparte database van gemaakt worden bv een real time replica desnoods een aparte database met deels een apart technisch datamodel maar niet een apart informatiemodel de informatie moet in de keten gelijk zijn met dezelfde betekenis en ontsloten zodra dit kan ook m b t wanneer dit kan beschikt alleen de inwinnende partij over het algemeen over de juiste kennis om dit in te schatten te bepalen als je dus aparte technische datamodellen maakt hou het informatiemodel deel dan op hetzelfde dan kan dit wel maar met behoud van consistentie het gaat niet om informatieproducten of informatie op maat die onderwerpen zitten in een verlengstuk op de bron het gaat over de data in de bron dus de informatie waarmee de informatieproducten het moeten doen immers alleen deze informatie uit de bron is er er is verder niks ook al wil je meer er is niet meer ook al wil je informatie met een andere betekenis die is er niet want het is niet zo ingewonnen
| 1
|
147,653
| 11,800,349,558
|
IssuesEvent
|
2020-03-18 17:23:23
|
rancher/rancher
|
https://api.github.com/repos/rancher/rancher
|
closed
|
When display k3s node args and environment variables, use ******** for omitted values
|
[zube]: To Test area/import-k3s
|
See: https://github.com/rancher/k3s/issues/1524
This issue is just to make sure we check this out in the Rancher UI and it looks good
This is waiting for a *K3s RC* of 1.17.4
|
1.0
|
When display k3s node args and environment variables, use ******** for omitted values - See: https://github.com/rancher/k3s/issues/1524
This issue is just to make sure we check this out in the Rancher UI and it looks good
This is waiting for a *K3s RC* of 1.17.4
|
non_process
|
when display node args and environment variables use for omitted values see this issue is just to make sure we check this out in the rancher ui and it looks good this is waiting for a rc of
| 0
|
14,340
| 17,368,464,808
|
IssuesEvent
|
2021-07-30 10:36:17
|
osstotalsoft/nbb
|
https://api.github.com/repos/osstotalsoft/nbb
|
closed
|
process manager - duplicate definitions
|
good first issue process manager
|
One can call services.AddProcessManagerDefinition multiple times or can have multiple definitions for the same type. Now, the runtime dies with
`"Message of type XXX could not be processed due to the following exception System.InvalidOperationException: Sequence contains more than one element at System.Linq.ThrowHelper.ThrowMoreThanOneElementException()
at System.Linq.Enumerable.SingleOrDefault[TSource](IEnumerable1 source)
at NBB.ProcessManager.Runtime.ProcessExecutionCoordinator.Invoke"`
Desired behaviour: similar with a DI container's Build process, the program should check for duplicate definitions after startup is complete (maybe also introduce a build function) and throw a more programmer friendly exception like "multiple definitions have been loaded for the same.. ".
However, it is to be discussed if multiple definitions for the same event should be supported, eg maybe I want to have a fork coming out of the same event. Example: an user is created, I want to start an email sending process and also start a flow where I check the person against an fraud internal database.
|
1.0
|
process manager - duplicate definitions - One can call services.AddProcessManagerDefinition multiple times or can have multiple definitions for the same type. Now, the runtime dies with
`"Message of type XXX could not be processed due to the following exception System.InvalidOperationException: Sequence contains more than one element at System.Linq.ThrowHelper.ThrowMoreThanOneElementException()
at System.Linq.Enumerable.SingleOrDefault[TSource](IEnumerable1 source)
at NBB.ProcessManager.Runtime.ProcessExecutionCoordinator.Invoke"`
Desired behaviour: similar with a DI container's Build process, the program should check for duplicate definitions after startup is complete (maybe also introduce a build function) and throw a more programmer friendly exception like "multiple definitions have been loaded for the same.. ".
However, it is to be discussed if multiple definitions for the same event should be supported, eg maybe I want to have a fork coming out of the same event. Example: an user is created, I want to start an email sending process and also start a flow where I check the person against an fraud internal database.
|
process
|
process manager duplicate definitions one can call services addprocessmanagerdefinition multiple times or can have multiple definitions for the same type now the runtime dies with message of type xxx could not be processed due to the following exception system invalidoperationexception sequence contains more than one element at system linq throwhelper throwmorethanoneelementexception at system linq enumerable singleordefault source at nbb processmanager runtime processexecutioncoordinator invoke desired behaviour similar with a di container s build process the program should check for duplicate definitions after startup is complete maybe also introduce a build function and throw a more programmer friendly exception like multiple definitions have been loaded for the same however it is to be discussed if multiple definitions for the same event should be supported eg maybe i want to have a fork coming out of the same event example an user is created i want to start an email sending process and also start a flow where i check the person against an fraud internal database
| 1
|
9,115
| 12,195,298,266
|
IssuesEvent
|
2020-04-29 17:08:26
|
kubeflow/testing
|
https://api.github.com/repos/kubeflow/testing
|
closed
|
Create a management cluster for GCP blueprints
|
kind/feature kind/process platform/gcp priority/p2
|
We should create a management cluster to run and deploy GCP blueprints as well as other test infrastructure.
|
1.0
|
Create a management cluster for GCP blueprints - We should create a management cluster to run and deploy GCP blueprints as well as other test infrastructure.
|
process
|
create a management cluster for gcp blueprints we should create a management cluster to run and deploy gcp blueprints as well as other test infrastructure
| 1
|
107,483
| 16,761,595,520
|
IssuesEvent
|
2021-06-13 22:26:09
|
gms-ws-demo/nibrs
|
https://api.github.com/repos/gms-ws-demo/nibrs
|
closed
|
CVE-2021-29425 (Medium) detected in commons-io-2.5.jar, commons-io-2.6.jar - autoclosed
|
security vulnerability
|
## CVE-2021-29425 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>commons-io-2.5.jar</b>, <b>commons-io-2.6.jar</b></p></summary>
<p>
<details><summary><b>commons-io-2.5.jar</b></p></summary>
<p>The Apache Commons IO library contains utility classes, stream implementations, file filters,
file comparators, endian transformation classes, and much more.</p>
<p>Library home page: <a href="http://commons.apache.org/proper/commons-io/">http://commons.apache.org/proper/commons-io/</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-fbi-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-io/commons-io/2.5/commons-io-2.5.jar,nibrs/tools/nibrs-fbi-service/target/nibrs-fbi-service-1.0.0/WEB-INF/lib/commons-io-2.5.jar</p>
<p>
Dependency Hierarchy:
- :x: **commons-io-2.5.jar** (Vulnerable Library)
</details>
<details><summary><b>commons-io-2.6.jar</b></p></summary>
<p>The Apache Commons IO library contains utility classes, stream implementations, file filters,
file comparators, endian transformation classes, and much more.</p>
<p>Library home page: <a href="http://commons.apache.org/proper/commons-io/">http://commons.apache.org/proper/commons-io/</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-validate-common/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,nibrs/web/nibrs-web/target/nibrs-web/WEB-INF/lib/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,canner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar</p>
<p>
Dependency Hierarchy:
- tika-parsers-1.18.jar (Root Library)
- :x: **commons-io-2.6.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/gms-ws-demo/nibrs/commit/9fb1c19bd26c2113d1961640de126a33eacdc946">9fb1c19bd26c2113d1961640de126a33eacdc946</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Commons IO before 2.7, When invoking the method FileNameUtils.normalize with an improper input string, like "//../foo", or "\\..\foo", the result would be the same value, thus possibly providing access to files in the parent directory, but not further above (thus "limited" path traversal), if the calling code would use the result to construct a path value.
<p>Publish Date: 2021-04-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-29425>CVE-2021-29425</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-29425">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-29425</a></p>
<p>Release Date: 2021-04-13</p>
<p>Fix Resolution: commons-io:commons-io:2.7</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"commons-io","packageName":"commons-io","packageVersion":"2.5","packageFilePaths":["/tools/nibrs-fbi-service/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"commons-io:commons-io:2.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"commons-io:commons-io:2.7"},{"packageType":"Java","groupId":"commons-io","packageName":"commons-io","packageVersion":"2.6","packageFilePaths":["/tools/nibrs-validate-common/pom.xml","/web/nibrs-web/pom.xml","/tools/nibrs-xmlfile/pom.xml","/tools/nibrs-flatfile/pom.xml","/tools/nibrs-summary-report-common/pom.xml","/tools/nibrs-route/pom.xml","/tools/nibrs-validation/pom.xml","/tools/nibrs-summary-report/pom.xml","/tools/nibrs-staging-data-common/pom.xml","/tools/nibrs-staging-data/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.tika:tika-parsers:1.18;commons-io:commons-io:2.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"commons-io:commons-io:2.7"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-29425","vulnerabilityDetails":"In Apache Commons IO before 2.7, When invoking the method FileNameUtils.normalize with an improper input string, like \"//../foo\", or \"\\\\..\\foo\", the result would be the same value, thus possibly providing access to files in the parent directory, but not further above (thus \"limited\" path traversal), if the calling code would use the result to construct a path value.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-29425","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2021-29425 (Medium) detected in commons-io-2.5.jar, commons-io-2.6.jar - autoclosed - ## CVE-2021-29425 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>commons-io-2.5.jar</b>, <b>commons-io-2.6.jar</b></p></summary>
<p>
<details><summary><b>commons-io-2.5.jar</b></p></summary>
<p>The Apache Commons IO library contains utility classes, stream implementations, file filters,
file comparators, endian transformation classes, and much more.</p>
<p>Library home page: <a href="http://commons.apache.org/proper/commons-io/">http://commons.apache.org/proper/commons-io/</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-fbi-service/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-io/commons-io/2.5/commons-io-2.5.jar,nibrs/tools/nibrs-fbi-service/target/nibrs-fbi-service-1.0.0/WEB-INF/lib/commons-io-2.5.jar</p>
<p>
Dependency Hierarchy:
- :x: **commons-io-2.5.jar** (Vulnerable Library)
</details>
<details><summary><b>commons-io-2.6.jar</b></p></summary>
<p>The Apache Commons IO library contains utility classes, stream implementations, file filters,
file comparators, endian transformation classes, and much more.</p>
<p>Library home page: <a href="http://commons.apache.org/proper/commons-io/">http://commons.apache.org/proper/commons-io/</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-validate-common/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,nibrs/web/nibrs-web/target/nibrs-web/WEB-INF/lib/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,/home/wss-scanner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar,canner/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar</p>
<p>
Dependency Hierarchy:
- tika-parsers-1.18.jar (Root Library)
- :x: **commons-io-2.6.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/gms-ws-demo/nibrs/commit/9fb1c19bd26c2113d1961640de126a33eacdc946">9fb1c19bd26c2113d1961640de126a33eacdc946</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Commons IO before 2.7, When invoking the method FileNameUtils.normalize with an improper input string, like "//../foo", or "\\..\foo", the result would be the same value, thus possibly providing access to files in the parent directory, but not further above (thus "limited" path traversal), if the calling code would use the result to construct a path value.
<p>Publish Date: 2021-04-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-29425>CVE-2021-29425</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-29425">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-29425</a></p>
<p>Release Date: 2021-04-13</p>
<p>Fix Resolution: commons-io:commons-io:2.7</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"commons-io","packageName":"commons-io","packageVersion":"2.5","packageFilePaths":["/tools/nibrs-fbi-service/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"commons-io:commons-io:2.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"commons-io:commons-io:2.7"},{"packageType":"Java","groupId":"commons-io","packageName":"commons-io","packageVersion":"2.6","packageFilePaths":["/tools/nibrs-validate-common/pom.xml","/web/nibrs-web/pom.xml","/tools/nibrs-xmlfile/pom.xml","/tools/nibrs-flatfile/pom.xml","/tools/nibrs-summary-report-common/pom.xml","/tools/nibrs-route/pom.xml","/tools/nibrs-validation/pom.xml","/tools/nibrs-summary-report/pom.xml","/tools/nibrs-staging-data-common/pom.xml","/tools/nibrs-staging-data/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.tika:tika-parsers:1.18;commons-io:commons-io:2.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"commons-io:commons-io:2.7"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-29425","vulnerabilityDetails":"In Apache Commons IO before 2.7, When invoking the method FileNameUtils.normalize with an improper input string, like \"//../foo\", or \"\\\\..\\foo\", the result would be the same value, thus possibly providing access to files in the parent directory, but not further above (thus \"limited\" path traversal), if the calling code would use the result to construct a path value.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-29425","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in commons io jar commons io jar autoclosed cve medium severity vulnerability vulnerable libraries commons io jar commons io jar commons io jar the apache commons io library contains utility classes stream implementations file filters file comparators endian transformation classes and much more library home page a href path to dependency file nibrs tools nibrs fbi service pom xml path to vulnerable library home wss scanner repository commons io commons io commons io jar nibrs tools nibrs fbi service target nibrs fbi service web inf lib commons io jar dependency hierarchy x commons io jar vulnerable library commons io jar the apache commons io library contains utility classes stream implementations file filters file comparators endian transformation classes and much more library home page a href path to dependency file nibrs tools nibrs validate common pom xml path to vulnerable library home wss scanner repository commons io commons io commons io jar home wss scanner repository commons io commons io commons io jar home wss scanner repository commons io commons io commons io jar home wss scanner repository commons io commons io commons io jar nibrs web nibrs web target nibrs web web inf lib commons io jar home wss scanner repository commons io commons io commons io jar home wss scanner repository commons io commons io commons io jar home wss scanner repository commons io commons io commons io jar home wss scanner repository commons io commons io commons io jar home wss scanner repository commons io commons io commons io jar home wss scanner repository commons io commons io commons io jar canner repository commons io commons io commons io jar dependency hierarchy tika parsers jar root library x commons io jar vulnerable library found in head commit a href found in base branch master vulnerability details in apache commons io before when invoking the method filenameutils normalize with an improper input string like foo or foo the result would be the same value thus possibly providing access to files in the parent directory but not further above thus limited path traversal if the calling code would use the result to construct a path value publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution commons io commons io isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree commons io commons io isminimumfixversionavailable true minimumfixversion commons io commons io packagetype java groupid commons io packagename commons io packageversion packagefilepaths istransitivedependency true dependencytree org apache tika tika parsers commons io commons io isminimumfixversionavailable true minimumfixversion commons io commons io basebranches vulnerabilityidentifier cve vulnerabilitydetails in apache commons io before when invoking the method filenameutils normalize with an improper input string like foo or foo the result would be the same value thus possibly providing access to files in the parent directory but not further above thus limited path traversal if the calling code would use the result to construct a path value vulnerabilityurl
| 0
|
17,673
| 23,502,689,089
|
IssuesEvent
|
2022-08-18 09:47:48
|
apache/arrow-rs
|
https://api.github.com/repos/apache/arrow-rs
|
closed
|
PrimitiveArrayReader Creates Incorrect PrimitiveArray for Unsigned Types
|
bug development-process
|
**Describe the bug**
<!--
A clear and concise description of what the bug is.
-->
PrimitiveArrayReader creates the arrow array based on the parquet physical type, this is despite creating the correct DataType
**To Reproduce**
<!--
Steps to reproduce the behavior:
-->
Read an unsigned primitive array, try to downcast it to a unsigned array, get an error
**Expected behavior**
<!--
A clear and concise description of what you expected to happen.
-->
**Additional context**
<!--
Add any other context about the problem here.
-->
This is currently masked by https://github.com/apache/arrow-rs/issues/2487
|
1.0
|
PrimitiveArrayReader Creates Incorrect PrimitiveArray for Unsigned Types - **Describe the bug**
<!--
A clear and concise description of what the bug is.
-->
PrimitiveArrayReader creates the arrow array based on the parquet physical type, this is despite creating the correct DataType
**To Reproduce**
<!--
Steps to reproduce the behavior:
-->
Read an unsigned primitive array, try to downcast it to a unsigned array, get an error
**Expected behavior**
<!--
A clear and concise description of what you expected to happen.
-->
**Additional context**
<!--
Add any other context about the problem here.
-->
This is currently masked by https://github.com/apache/arrow-rs/issues/2487
|
process
|
primitivearrayreader creates incorrect primitivearray for unsigned types describe the bug a clear and concise description of what the bug is primitivearrayreader creates the arrow array based on the parquet physical type this is despite creating the correct datatype to reproduce steps to reproduce the behavior read an unsigned primitive array try to downcast it to a unsigned array get an error expected behavior a clear and concise description of what you expected to happen additional context add any other context about the problem here this is currently masked by
| 1
|
32,933
| 8,971,527,532
|
IssuesEvent
|
2019-01-29 16:06:25
|
avast-tl/retdec
|
https://api.github.com/repos/avast-tl/retdec
|
closed
|
CMake rules for pelib contain an unitialized variable
|
C-build-system C-pelib bug
|
File `deps/pelib/CMakeLists.txt` contains the following piece of code:
```
44 # Force rebuild if switch happened.
45 # Seems like this is not needed on Linux, and not working on Windows :-(
46 BUILD_ALWAYS ${CHANGED}
```
However, the `CHANGED` variable is defined later on line
```
57 check_if_variable_changed(PELIB_LOCAL_DIR CHANGED)
```
Questions:
* Can you please verify that we actually want to use an uninitialized variable there?
* Is that `BUILD_ALWAYS` part necessary? According to the comment above, it is not needed on Linux and does not work on Windows. Is it for macOS then?
|
1.0
|
CMake rules for pelib contain an unitialized variable - File `deps/pelib/CMakeLists.txt` contains the following piece of code:
```
44 # Force rebuild if switch happened.
45 # Seems like this is not needed on Linux, and not working on Windows :-(
46 BUILD_ALWAYS ${CHANGED}
```
However, the `CHANGED` variable is defined later on line
```
57 check_if_variable_changed(PELIB_LOCAL_DIR CHANGED)
```
Questions:
* Can you please verify that we actually want to use an uninitialized variable there?
* Is that `BUILD_ALWAYS` part necessary? According to the comment above, it is not needed on Linux and does not work on Windows. Is it for macOS then?
|
non_process
|
cmake rules for pelib contain an unitialized variable file deps pelib cmakelists txt contains the following piece of code force rebuild if switch happened seems like this is not needed on linux and not working on windows build always changed however the changed variable is defined later on line check if variable changed pelib local dir changed questions can you please verify that we actually want to use an uninitialized variable there is that build always part necessary according to the comment above it is not needed on linux and does not work on windows is it for macos then
| 0
|
2,933
| 5,918,855,359
|
IssuesEvent
|
2017-05-22 16:15:51
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
Topics with processing-role="resource-only" generate XHTML
|
bug P1 preprocess
|
Reported to me by a user (not sure which OT version), but I replicated with 2.1.1. If a map has nothing but key definitions (that is, nothing but references with `processing-role="resource-only"`), XHTML output still generates result files for each of the topics. Resource-only topics should be available to the transform for whatever reason, but should not be generating output.
To reproduce: in `hierarchy.ditamap`, add `processing-role="resource-only"` to every `<topicref>`. Build to XHTML; all topics are still generated. PDF output fails, I assume because no real content goes in the PDF. I tested this test both with the original `<topicref>` (add `@processing-role` manually) and with `<keydef>` (pick up default `@processing-role` and add `@keys`).
If I set the first topic in the map to `@processing-role="normal"`, but leave the others as `resource-only`, XHTML still generates all topics. PDF correctly includes only the one normal topic.
Sample set:
[2077.zip](https://github.com/dita-ot/dita-ot/files/1005790/2077.zip)
|
1.0
|
Topics with processing-role="resource-only" generate XHTML - Reported to me by a user (not sure which OT version), but I replicated with 2.1.1. If a map has nothing but key definitions (that is, nothing but references with `processing-role="resource-only"`), XHTML output still generates result files for each of the topics. Resource-only topics should be available to the transform for whatever reason, but should not be generating output.
To reproduce: in `hierarchy.ditamap`, add `processing-role="resource-only"` to every `<topicref>`. Build to XHTML; all topics are still generated. PDF output fails, I assume because no real content goes in the PDF. I tested this test both with the original `<topicref>` (add `@processing-role` manually) and with `<keydef>` (pick up default `@processing-role` and add `@keys`).
If I set the first topic in the map to `@processing-role="normal"`, but leave the others as `resource-only`, XHTML still generates all topics. PDF correctly includes only the one normal topic.
Sample set:
[2077.zip](https://github.com/dita-ot/dita-ot/files/1005790/2077.zip)
|
process
|
topics with processing role resource only generate xhtml reported to me by a user not sure which ot version but i replicated with if a map has nothing but key definitions that is nothing but references with processing role resource only xhtml output still generates result files for each of the topics resource only topics should be available to the transform for whatever reason but should not be generating output to reproduce in hierarchy ditamap add processing role resource only to every build to xhtml all topics are still generated pdf output fails i assume because no real content goes in the pdf i tested this test both with the original add processing role manually and with pick up default processing role and add keys if i set the first topic in the map to processing role normal but leave the others as resource only xhtml still generates all topics pdf correctly includes only the one normal topic sample set
| 1
|
4,527
| 7,371,463,549
|
IssuesEvent
|
2018-03-13 11:51:18
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Broken Link
|
active-directory cxp doc-bug in-process triaged
|
It seems as though the link in Safari (Mac OS) is broken. Is an updated link available?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 7d9cfb97-d0f3-3cc1-a460-ff4e05530b92
* Version Independent ID: 3db82b30-1d14-1613-e473-13310e855ab2
* Content: [Azure AD Connect: Seamless Single Sign-On - quick start | Microsoft Docs](https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-sso-quick-start)
* Content Source: [articles/active-directory/connect/active-directory-aadconnect-sso-quick-start.md](https://github.com/Microsoft/azure-docs/blob/master/articles/active-directory/connect/active-directory-aadconnect-sso-quick-start.md)
* Service: **active-directory**
* GitHub Login: @swkrish
* Microsoft Alias: **billmath**
|
1.0
|
Broken Link - It seems as though the link in Safari (Mac OS) is broken. Is an updated link available?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 7d9cfb97-d0f3-3cc1-a460-ff4e05530b92
* Version Independent ID: 3db82b30-1d14-1613-e473-13310e855ab2
* Content: [Azure AD Connect: Seamless Single Sign-On - quick start | Microsoft Docs](https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-sso-quick-start)
* Content Source: [articles/active-directory/connect/active-directory-aadconnect-sso-quick-start.md](https://github.com/Microsoft/azure-docs/blob/master/articles/active-directory/connect/active-directory-aadconnect-sso-quick-start.md)
* Service: **active-directory**
* GitHub Login: @swkrish
* Microsoft Alias: **billmath**
|
process
|
broken link it seems as though the link in safari mac os is broken is an updated link available document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service active directory github login swkrish microsoft alias billmath
| 1
|
346
| 2,793,289,936
|
IssuesEvent
|
2015-05-11 09:56:17
|
ecodistrict/IDSSDashboard
|
https://api.github.com/repos/ecodistrict/IDSSDashboard
|
closed
|
It would be useful to be able to include new alternatives/measures developed by the user, that are not yet included in the database of alternatives.
|
enhancement form feedback 09102014 process step: develop alternatives
|
It would be useful to be able to include new alternatives/measures developed by the user, that are not yet included in the database of alternatives.
|
1.0
|
It would be useful to be able to include new alternatives/measures developed by the user, that are not yet included in the database of alternatives. - It would be useful to be able to include new alternatives/measures developed by the user, that are not yet included in the database of alternatives.
|
process
|
it would be useful to be able to include new alternatives measures developed by the user that are not yet included in the database of alternatives it would be useful to be able to include new alternatives measures developed by the user that are not yet included in the database of alternatives
| 1
|
2,615
| 2,607,932,059
|
IssuesEvent
|
2015-02-26 00:27:09
|
chrsmithdemos/minify
|
https://api.github.com/repos/chrsmithdemos/minify
|
closed
|
must-revalidate currently breaks caching in webkit
|
auto-migrated Priority-Medium Release-2.1.2 Type-Defect
|
```
See: http://mrclay.org/index.php/2009/02/24/safari-4-beta-cache-
controlmust-revalidate-bug/
For now, best to remove must-revalidate from ConditionalGet. It was
originally added to force Opera to revalidate its cache (which it wasn't
even with max-age=0).
```
-----
Original issue reported on code.google.com by `mrclay....@gmail.com` on 30 Jun 2009 at 5:36
|
1.0
|
must-revalidate currently breaks caching in webkit - ```
See: http://mrclay.org/index.php/2009/02/24/safari-4-beta-cache-
controlmust-revalidate-bug/
For now, best to remove must-revalidate from ConditionalGet. It was
originally added to force Opera to revalidate its cache (which it wasn't
even with max-age=0).
```
-----
Original issue reported on code.google.com by `mrclay....@gmail.com` on 30 Jun 2009 at 5:36
|
non_process
|
must revalidate currently breaks caching in webkit see controlmust revalidate bug for now best to remove must revalidate from conditionalget it was originally added to force opera to revalidate its cache which it wasn t even with max age original issue reported on code google com by mrclay gmail com on jun at
| 0
|
22,116
| 6,229,408,619
|
IssuesEvent
|
2017-07-11 03:45:44
|
XceedBoucherS/TestImport5
|
https://api.github.com/repos/XceedBoucherS/TestImport5
|
closed
|
DateTimePicker: Date cannot be entered by keyboard
|
CodePlex
|
<b>Philvx[CodePlex]</b> <br />After entering the first number the Cursor is automatically moved to the beginning and it is impossible to enter any more numbers.The attached videos shows this behavior. After clicking the part of the date I wanted to change I entered a single number
which you cannot see unfortunately. But at least you can see the cursor jumping to the front immediately upon my key press making further input impossible.I am using version 1.8 (NuGet package) of the Extended WPF Toolkit. This problem did not exist in version
1.7.
|
1.0
|
DateTimePicker: Date cannot be entered by keyboard - <b>Philvx[CodePlex]</b> <br />After entering the first number the Cursor is automatically moved to the beginning and it is impossible to enter any more numbers.The attached videos shows this behavior. After clicking the part of the date I wanted to change I entered a single number
which you cannot see unfortunately. But at least you can see the cursor jumping to the front immediately upon my key press making further input impossible.I am using version 1.8 (NuGet package) of the Extended WPF Toolkit. This problem did not exist in version
1.7.
|
non_process
|
datetimepicker date cannot be entered by keyboard philvx after entering the first number the cursor is automatically moved to the beginning and it is impossible to enter any more numbers the attached videos shows this behavior after clicking the part of the date i wanted to change i entered a single number which you cannot see unfortunately but at least you can see the cursor jumping to the front immediately upon my key press making further input impossible i am using version nuget package of the extended wpf toolkit this problem did not exist in version
| 0
|
296,200
| 25,535,925,899
|
IssuesEvent
|
2022-11-29 11:59:24
|
Dart-Code/Dart-Code
|
https://api.github.com/repos/Dart-Code/Dart-Code
|
closed
|
⌘-click on link in test output doesn't go to location in file.
|
is enhancement in flutter in testing
|
## Description
When I run a test and it fails I get output that includes the absolute path to the file and the line number and character of the failure, and while VSCode detects the `file:///` URI as a link, and I can ⌘-click on it, it either fails and gives me an error dialog that says something like "Unable to open 'menu_bar_test.dart:205:7'" because it doesn't know about a file with a filename that ends in ":205:7" (unsurprisingly), or (if I click on the other link in the output that is only the filename followed by "line 205", it doesn't go anywhere (because it's already got that file open).
This issue also appears to happen on Linux, so I don't think it's macOS specific.
## To Reproduce
Steps to reproduce the behavior:
1. Run a failing test.
2. In the "inline" failed test result at the test location in the file, ⌘-click on either link for the file location where the test failure occurred.
3. See the failure dialog appear, or that the cursor doesn't move.
## Expected behavior
I expected the cursor to jump to the referenced test file location where the error occurred.
## Screenshots
Here's what the error output from the test looks like:
<img width="716" alt="Screen Shot 2022-08-04 at 11 18 49 AM" src="https://user-images.githubusercontent.com/8867023/182924456-7f0a5b61-2ab5-44d2-8715-04ddf11e1e5a.png">
<details>
<summary>flutter doctor -v</summary>
<pre>
[!] Flutter (Channel menu_bar_iv, 3.1.0-0.0.pre.2094, on macOS 12.5 21G72 darwin-arm, locale en)
• Flutter version 3.1.0-0.0.pre.2094 on channel menu_bar_iv at /Users/gspencer/code/flutter
! Upstream repository git@github.com:gspencergoog/flutter.git is not a standard remote.
Set environment variable "FLUTTER_GIT_URL" to git@github.com:gspencergoog/flutter.git to dismiss this error.
• Framework revision 1f76314471 (21 minutes ago), 2022-08-04 11:06:24 -0700
• Engine revision 51296a62d9
• Dart version 2.19.0 (build 2.19.0-58.0.dev)
• DevTools version 2.16.0
• If those were intentional, you can disregard the above warnings; however it is recommended to use "git" directly to
perform update checks and upgrades.
[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
• Android SDK at /Users/gspencer/Library/Android/sdk
• Platform android-33, build-tools 33.0.0
• ANDROID_HOME = /Users/gspencer/Library/Android/sdk
• Java binary at: /Applications/Android Studio.app/Contents/jre/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 11.0.12+0-b1504.28-7817840)
• All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 13.4.1)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Build 13F100
• CocoaPods version 1.11.3
[✓] Chrome - develop for the web
• Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome
[✓] VS Code (version 1.69.2)
• VS Code at /Applications/Visual Studio Code.app/Contents
• Flutter extension version 3.46.0
</pre>
</details>
- Target device (if the issue relates to Flutter debugging): macOS desktop
|
1.0
|
⌘-click on link in test output doesn't go to location in file. - ## Description
When I run a test and it fails I get output that includes the absolute path to the file and the line number and character of the failure, and while VSCode detects the `file:///` URI as a link, and I can ⌘-click on it, it either fails and gives me an error dialog that says something like "Unable to open 'menu_bar_test.dart:205:7'" because it doesn't know about a file with a filename that ends in ":205:7" (unsurprisingly), or (if I click on the other link in the output that is only the filename followed by "line 205", it doesn't go anywhere (because it's already got that file open).
This issue also appears to happen on Linux, so I don't think it's macOS specific.
## To Reproduce
Steps to reproduce the behavior:
1. Run a failing test.
2. In the "inline" failed test result at the test location in the file, ⌘-click on either link for the file location where the test failure occurred.
3. See the failure dialog appear, or that the cursor doesn't move.
## Expected behavior
I expected the cursor to jump to the referenced test file location where the error occurred.
## Screenshots
Here's what the error output from the test looks like:
<img width="716" alt="Screen Shot 2022-08-04 at 11 18 49 AM" src="https://user-images.githubusercontent.com/8867023/182924456-7f0a5b61-2ab5-44d2-8715-04ddf11e1e5a.png">
<details>
<summary>flutter doctor -v</summary>
<pre>
[!] Flutter (Channel menu_bar_iv, 3.1.0-0.0.pre.2094, on macOS 12.5 21G72 darwin-arm, locale en)
• Flutter version 3.1.0-0.0.pre.2094 on channel menu_bar_iv at /Users/gspencer/code/flutter
! Upstream repository git@github.com:gspencergoog/flutter.git is not a standard remote.
Set environment variable "FLUTTER_GIT_URL" to git@github.com:gspencergoog/flutter.git to dismiss this error.
• Framework revision 1f76314471 (21 minutes ago), 2022-08-04 11:06:24 -0700
• Engine revision 51296a62d9
• Dart version 2.19.0 (build 2.19.0-58.0.dev)
• DevTools version 2.16.0
• If those were intentional, you can disregard the above warnings; however it is recommended to use "git" directly to
perform update checks and upgrades.
[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.0)
• Android SDK at /Users/gspencer/Library/Android/sdk
• Platform android-33, build-tools 33.0.0
• ANDROID_HOME = /Users/gspencer/Library/Android/sdk
• Java binary at: /Applications/Android Studio.app/Contents/jre/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 11.0.12+0-b1504.28-7817840)
• All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 13.4.1)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Build 13F100
• CocoaPods version 1.11.3
[✓] Chrome - develop for the web
• Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome
[✓] VS Code (version 1.69.2)
• VS Code at /Applications/Visual Studio Code.app/Contents
• Flutter extension version 3.46.0
</pre>
</details>
- Target device (if the issue relates to Flutter debugging): macOS desktop
|
non_process
|
⌘ click on link in test output doesn t go to location in file description when i run a test and it fails i get output that includes the absolute path to the file and the line number and character of the failure and while vscode detects the file uri as a link and i can ⌘ click on it it either fails and gives me an error dialog that says something like unable to open menu bar test dart because it doesn t know about a file with a filename that ends in unsurprisingly or if i click on the other link in the output that is only the filename followed by line it doesn t go anywhere because it s already got that file open this issue also appears to happen on linux so i don t think it s macos specific to reproduce steps to reproduce the behavior run a failing test in the inline failed test result at the test location in the file ⌘ click on either link for the file location where the test failure occurred see the failure dialog appear or that the cursor doesn t move expected behavior i expected the cursor to jump to the referenced test file location where the error occurred screenshots here s what the error output from the test looks like img width alt screen shot at am src flutter doctor v flutter channel menu bar iv pre on macos darwin arm locale en • flutter version pre on channel menu bar iv at users gspencer code flutter upstream repository git github com gspencergoog flutter git is not a standard remote set environment variable flutter git url to git github com gspencergoog flutter git to dismiss this error • framework revision minutes ago • engine revision • dart version build dev • devtools version • if those were intentional you can disregard the above warnings however it is recommended to use git directly to perform update checks and upgrades android toolchain develop for android devices android sdk version • android sdk at users gspencer library android sdk • platform android build tools • android home users gspencer library android sdk • java binary at applications android studio app contents jre contents home bin java • java version openjdk runtime environment build • all android licenses accepted xcode develop for ios and macos xcode • xcode at applications xcode app contents developer • build • cocoapods version chrome develop for the web • chrome at applications google chrome app contents macos google chrome vs code version • vs code at applications visual studio code app contents • flutter extension version target device if the issue relates to flutter debugging macos desktop
| 0
|
330,198
| 10,036,673,844
|
IssuesEvent
|
2019-07-18 11:17:02
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.newsnow.co.uk - see bug description
|
browser-fenix engine-gecko priority-normal
|
<!-- @browser: Firefox Mobile 68.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.newsnow.co.uk/h/Sport/Football/Premier+League/Manchester+United/Top+Sources
**Browser / Version**: Firefox Mobile 68.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Something else
**Description**: links opening in the same tab
**Steps to Reproduce**:
Links to different sites open in new tabs on both desktop and mobile Firefox browsers but Firefox preview is opeing them in the same window/tab
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.newsnow.co.uk - see bug description - <!-- @browser: Firefox Mobile 68.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.newsnow.co.uk/h/Sport/Football/Premier+League/Manchester+United/Top+Sources
**Browser / Version**: Firefox Mobile 68.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Something else
**Description**: links opening in the same tab
**Steps to Reproduce**:
Links to different sites open in new tabs on both desktop and mobile Firefox browsers but Firefox preview is opeing them in the same window/tab
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
see bug description url browser version firefox mobile operating system android tested another browser yes problem type something else description links opening in the same tab steps to reproduce links to different sites open in new tabs on both desktop and mobile firefox browsers but firefox preview is opeing them in the same window tab browser configuration none from with ❤️
| 0
|
456,875
| 13,151,021,526
|
IssuesEvent
|
2020-08-09 14:42:16
|
chrisjsewell/docutils
|
https://api.github.com/repos/chrisjsewell/docutils
|
closed
|
rst-tools.el [SF:patches:10]
|
closed patches priority-5
|
author: sakito
created: 2004-04-11 22:30:30
assigned: felixwiemann
SF_url: https://sourceforge.net/p/docutils/patches/10
Editor support rst-latex.el
---
commenter: goodger
posted: 2004-04-13 22:00:39
title: #10 rst-tools.el
- **assigned_to**: nobody --> goodger
---
commenter: goodger
posted: 2004-04-13 22:00:39
title: #10 rst-tools.el
Logged In: YES
user\_id=7733
There is no file attached. Did you intend to attach a file?
---
commenter: sakito
posted: 2004-04-15 00:19:31
title: #10 rst-tools.el
Logged In: YES
user\_id=488282
Sorry.attache check miss.
This file merge rst-html.el
This file is rst2html.py and rst2latex.py support.
other tools not yet.but I'm under creation.
---
commenter: sakito
posted: 2004-04-15 00:19:31
title: #10 rst-tools.el
attachments:
- https://sourceforge.net/p/docutils/patches/_discuss/thread/a1c1be8e/1b86/attachment/rst-tools.el
---
commenter: sakito
posted: 2004-04-15 00:19:31
title: #10 rst-tools.el
- **summary**: rst-latex.el --> rst-tools.el
---
commenter: felixwiemann
posted: 2005-05-29 08:29:33
title: #10 rst-tools.el
Logged In: YES
user\_id=1014490
Thank you for your contribution and sorry for the delayed
response.
I created a patch based on the rst-tools.el you provided and
filed it at
<https://sourceforge.net/tracker/index.php?func=detail&aid=1210674&group\_id=38414&atid=422032>.
Closing this tracker item.
---
commenter: felixwiemann
posted: 2005-05-29 08:29:33
title: #10 rst-tools.el
- **assigned_to**: goodger --> felixwiemann
- **status**: open --> closed
|
1.0
|
rst-tools.el [SF:patches:10] -
author: sakito
created: 2004-04-11 22:30:30
assigned: felixwiemann
SF_url: https://sourceforge.net/p/docutils/patches/10
Editor support rst-latex.el
---
commenter: goodger
posted: 2004-04-13 22:00:39
title: #10 rst-tools.el
- **assigned_to**: nobody --> goodger
---
commenter: goodger
posted: 2004-04-13 22:00:39
title: #10 rst-tools.el
Logged In: YES
user\_id=7733
There is no file attached. Did you intend to attach a file?
---
commenter: sakito
posted: 2004-04-15 00:19:31
title: #10 rst-tools.el
Logged In: YES
user\_id=488282
Sorry.attache check miss.
This file merge rst-html.el
This file is rst2html.py and rst2latex.py support.
other tools not yet.but I'm under creation.
---
commenter: sakito
posted: 2004-04-15 00:19:31
title: #10 rst-tools.el
attachments:
- https://sourceforge.net/p/docutils/patches/_discuss/thread/a1c1be8e/1b86/attachment/rst-tools.el
---
commenter: sakito
posted: 2004-04-15 00:19:31
title: #10 rst-tools.el
- **summary**: rst-latex.el --> rst-tools.el
---
commenter: felixwiemann
posted: 2005-05-29 08:29:33
title: #10 rst-tools.el
Logged In: YES
user\_id=1014490
Thank you for your contribution and sorry for the delayed
response.
I created a patch based on the rst-tools.el you provided and
filed it at
<https://sourceforge.net/tracker/index.php?func=detail&aid=1210674&group\_id=38414&atid=422032>.
Closing this tracker item.
---
commenter: felixwiemann
posted: 2005-05-29 08:29:33
title: #10 rst-tools.el
- **assigned_to**: goodger --> felixwiemann
- **status**: open --> closed
|
non_process
|
rst tools el author sakito created assigned felixwiemann sf url editor support rst latex el commenter goodger posted title rst tools el assigned to nobody goodger commenter goodger posted title rst tools el logged in yes user id there is no file attached did you intend to attach a file commenter sakito posted title rst tools el logged in yes user id sorry attache check miss this file merge rst html el this file is py and py support other tools not yet but i m under creation commenter sakito posted title rst tools el attachments commenter sakito posted title rst tools el summary rst latex el rst tools el commenter felixwiemann posted title rst tools el logged in yes user id thank you for your contribution and sorry for the delayed response i created a patch based on the rst tools el you provided and filed it at closing this tracker item commenter felixwiemann posted title rst tools el assigned to goodger felixwiemann status open closed
| 0
|
636,958
| 20,614,740,745
|
IssuesEvent
|
2022-03-07 12:07:45
|
thoth-station/core
|
https://api.github.com/repos/thoth-station/core
|
reopened
|
Automate update of dependencies + release of image for ps-* repos.
|
kind/feature priority/important-soon lifecycle/rotten
|
**Is your feature request related to a problem? Please describe.**
I want bots to take care of all stacks and repos with ps-* stacks, so that I can focus on my project and not worry about dependencies and images.
**High-level Goals**
- [ ] ps-* repos are completely maintained by bots.
**Describe the solution you'd like**
Automatic release once a new update from the bot is added.
**Describe alternatives you've considered**
Manual trigger of the release.
**Additional context**
**Acceptance Criteria**
- [ ] new image automatically released once the PR with updates from the bots is merged.
|
1.0
|
Automate update of dependencies + release of image for ps-* repos. - **Is your feature request related to a problem? Please describe.**
I want bots to take care of all stacks and repos with ps-* stacks, so that I can focus on my project and not worry about dependencies and images.
**High-level Goals**
- [ ] ps-* repos are completely maintained by bots.
**Describe the solution you'd like**
Automatic release once a new update from the bot is added.
**Describe alternatives you've considered**
Manual trigger of the release.
**Additional context**
**Acceptance Criteria**
- [ ] new image automatically released once the PR with updates from the bots is merged.
|
non_process
|
automate update of dependencies release of image for ps repos is your feature request related to a problem please describe i want bots to take care of all stacks and repos with ps stacks so that i can focus on my project and not worry about dependencies and images high level goals ps repos are completely maintained by bots describe the solution you d like automatic release once a new update from the bot is added describe alternatives you ve considered manual trigger of the release additional context acceptance criteria new image automatically released once the pr with updates from the bots is merged
| 0
|
19,312
| 25,466,761,973
|
IssuesEvent
|
2022-11-25 05:44:20
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[IDP] [PM] Entered value in sign in screen is not getting displayed in the following scenario
|
Bug P2 Participant manager Process: Fixed Process: Tested QA Process: Tested dev Auth server
|
**Steps:**
1. Enter email and password of non -organizational admin
2. Click on eye icon present in the password field (Password will be decrypted)
3. Click on 'Sign in' button and Verify
**AR:** Entered values in sign in screen are not getting displayed in the following scenario
**ER:** Entered values in sign in screen should get displayed in the following scenario
[screen-capture (11).webm](https://user-images.githubusercontent.com/86007179/199964888-c66bc07e-3145-4d7a-b622-7e380267d5cb.webm)
|
3.0
|
[IDP] [PM] Entered value in sign in screen is not getting displayed in the following scenario -
**Steps:**
1. Enter email and password of non -organizational admin
2. Click on eye icon present in the password field (Password will be decrypted)
3. Click on 'Sign in' button and Verify
**AR:** Entered values in sign in screen are not getting displayed in the following scenario
**ER:** Entered values in sign in screen should get displayed in the following scenario
[screen-capture (11).webm](https://user-images.githubusercontent.com/86007179/199964888-c66bc07e-3145-4d7a-b622-7e380267d5cb.webm)
|
process
|
entered value in sign in screen is not getting displayed in the following scenario steps enter email and password of non organizational admin click on eye icon present in the password field password will be decrypted click on sign in button and verify ar entered values in sign in screen are not getting displayed in the following scenario er entered values in sign in screen should get displayed in the following scenario
| 1
|
36,354
| 12,404,382,887
|
IssuesEvent
|
2020-05-21 15:26:45
|
jgeraigery/beaker-notebook
|
https://api.github.com/repos/jgeraigery/beaker-notebook
|
opened
|
WS-2019-0425 (Medium) detected in mocha-1.11.0.tgz
|
security vulnerability
|
## WS-2019-0425 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mocha-1.11.0.tgz</b></p></summary>
<p>simple, flexible, fun test framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/mocha/-/mocha-1.11.0.tgz">https://registry.npmjs.org/mocha/-/mocha-1.11.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-ua_20200521151458_RSRDSO/archiveExtraction_BOEWOR/20200521151458/ws-scm_depth_0/beaker-notebook/data/allDeps/src/vendor/bower_components/q/package.json</p>
<p>Path to vulnerable library: /tmp/ws-ua_20200521151458_RSRDSO/archiveExtraction_BOEWOR/20200521151458/ws-scm_depth_0/beaker-notebook/data/allDeps/src/vendor/bower_components/q/node_modules/mocha/package.json</p>
<p>
Dependency Hierarchy:
- promises-aplus-tests-1.3.2.tgz (Root Library)
- :x: **mocha-1.11.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/beaker-notebook/commit/e74341acf643e87bd21b092c7a9e9f6bb96fa7c4">e74341acf643e87bd21b092c7a9e9f6bb96fa7c4</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Mocha is vulnerable to ReDoS attack. If the stack trace in utils.js begins with a large error message, and full-trace is not enabled, utils.stackTraceFilter() will take exponential run time.
<p>Publish Date: 2019-01-24
<p>URL: <a href=https://github.com/mochajs/mocha/commit/1a43d8b11a64e4e85fe2a61aed91c259bbbac559>WS-2019-0425</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="v6.0.0">v6.0.0</a></p>
<p>Release Date: 2020-05-07</p>
<p>Fix Resolution: https://github.com/mochajs/mocha/commit/1a43d8b11a64e4e85fe2a61aed91c259bbbac559</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"mocha","packageVersion":"1.11.0","isTransitiveDependency":true,"dependencyTree":"promises-aplus-tests:1.3.2;mocha:1.11.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"https://github.com/mochajs/mocha/commit/1a43d8b11a64e4e85fe2a61aed91c259bbbac559"}],"vulnerabilityIdentifier":"WS-2019-0425","vulnerabilityDetails":"Mocha is vulnerable to ReDoS attack. If the stack trace in utils.js begins with a large error message, and full-trace is not enabled, utils.stackTraceFilter() will take exponential run time.","vulnerabilityUrl":"https://github.com/mochajs/mocha/commit/1a43d8b11a64e4e85fe2a61aed91c259bbbac559","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
WS-2019-0425 (Medium) detected in mocha-1.11.0.tgz - ## WS-2019-0425 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mocha-1.11.0.tgz</b></p></summary>
<p>simple, flexible, fun test framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/mocha/-/mocha-1.11.0.tgz">https://registry.npmjs.org/mocha/-/mocha-1.11.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-ua_20200521151458_RSRDSO/archiveExtraction_BOEWOR/20200521151458/ws-scm_depth_0/beaker-notebook/data/allDeps/src/vendor/bower_components/q/package.json</p>
<p>Path to vulnerable library: /tmp/ws-ua_20200521151458_RSRDSO/archiveExtraction_BOEWOR/20200521151458/ws-scm_depth_0/beaker-notebook/data/allDeps/src/vendor/bower_components/q/node_modules/mocha/package.json</p>
<p>
Dependency Hierarchy:
- promises-aplus-tests-1.3.2.tgz (Root Library)
- :x: **mocha-1.11.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/beaker-notebook/commit/e74341acf643e87bd21b092c7a9e9f6bb96fa7c4">e74341acf643e87bd21b092c7a9e9f6bb96fa7c4</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Mocha is vulnerable to ReDoS attack. If the stack trace in utils.js begins with a large error message, and full-trace is not enabled, utils.stackTraceFilter() will take exponential run time.
<p>Publish Date: 2019-01-24
<p>URL: <a href=https://github.com/mochajs/mocha/commit/1a43d8b11a64e4e85fe2a61aed91c259bbbac559>WS-2019-0425</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="v6.0.0">v6.0.0</a></p>
<p>Release Date: 2020-05-07</p>
<p>Fix Resolution: https://github.com/mochajs/mocha/commit/1a43d8b11a64e4e85fe2a61aed91c259bbbac559</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"mocha","packageVersion":"1.11.0","isTransitiveDependency":true,"dependencyTree":"promises-aplus-tests:1.3.2;mocha:1.11.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"https://github.com/mochajs/mocha/commit/1a43d8b11a64e4e85fe2a61aed91c259bbbac559"}],"vulnerabilityIdentifier":"WS-2019-0425","vulnerabilityDetails":"Mocha is vulnerable to ReDoS attack. If the stack trace in utils.js begins with a large error message, and full-trace is not enabled, utils.stackTraceFilter() will take exponential run time.","vulnerabilityUrl":"https://github.com/mochajs/mocha/commit/1a43d8b11a64e4e85fe2a61aed91c259bbbac559","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
ws medium detected in mocha tgz ws medium severity vulnerability vulnerable library mocha tgz simple flexible fun test framework library home page a href path to dependency file tmp ws ua rsrdso archiveextraction boewor ws scm depth beaker notebook data alldeps src vendor bower components q package json path to vulnerable library tmp ws ua rsrdso archiveextraction boewor ws scm depth beaker notebook data alldeps src vendor bower components q node modules mocha package json dependency hierarchy promises aplus tests tgz root library x mocha tgz vulnerable library found in head commit a href vulnerability details mocha is vulnerable to redos attack if the stack trace in utils js begins with a large error message and full trace is not enabled utils stacktracefilter will take exponential run time publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier ws vulnerabilitydetails mocha is vulnerable to redos attack if the stack trace in utils js begins with a large error message and full trace is not enabled utils stacktracefilter will take exponential run time vulnerabilityurl
| 0
|
9,464
| 12,441,185,045
|
IssuesEvent
|
2020-05-26 13:17:02
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Column names in queries don't get de-CLOBed
|
.Help Wanted .Limitation Database/H2 Querying/Processor
|
Whats a `stringdecode`? This might be a problem with H2 itself.

|
1.0
|
Column names in queries don't get de-CLOBed - Whats a `stringdecode`? This might be a problem with H2 itself.

|
process
|
column names in queries don t get de clobed whats a stringdecode this might be a problem with itself
| 1
|
87,805
| 8,122,773,452
|
IssuesEvent
|
2018-08-16 12:47:11
|
IBMStreams/streamsx.rabbitmq
|
https://api.github.com/repos/IBMStreams/streamsx.rabbitmq
|
closed
|
Update of RabbitMQ Java client
|
Pipeline: Done (Dev/Test/Doc) enhancement prio: middle
|
Currently, the RabbitMQ toolkit uses the Java client in version v3.6.2.
According to this page here, https://www.rabbitmq.com/releases/rabbitmq-java-client/, the
current version of the Java client is v3.6.14. Maybe it is time to update the client?
|
1.0
|
Update of RabbitMQ Java client - Currently, the RabbitMQ toolkit uses the Java client in version v3.6.2.
According to this page here, https://www.rabbitmq.com/releases/rabbitmq-java-client/, the
current version of the Java client is v3.6.14. Maybe it is time to update the client?
|
non_process
|
update of rabbitmq java client currently the rabbitmq toolkit uses the java client in version according to this page here the current version of the java client is maybe it is time to update the client
| 0
|
10,362
| 13,184,335,247
|
IssuesEvent
|
2020-08-12 19:12:32
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
BigQuery column of ARRAY type in results causes a casting exception
|
.Backend Database/BigQuery Difficulty:Easy Priority:P2 Querying/Processor Type:Bug
|
**Describe the bug**
BigQuery queries that return a column of ARRAY type trigger a backend exception: `java.util.ArrayList cannot be cast to java.lang.String`
**Logs**
`Jun 28 09:43:54 WARN metabase.query-processor.middleware.process-userland-query :: Query failure {:status :failed,
:class java.lang.ClassCastException,
:error "java.util.ArrayList cannot be cast to java.lang.String",
:stacktrace
("--> driver.bigquery$fn__576.invokeStatic(bigquery.clj:215)"
"driver.bigquery$fn__576.invoke(bigquery.clj:215)"
"driver.bigquery$post_process_native$iter__610__614$fn__615$iter__640__644$fn__645.invoke(bigquery.clj:255)"
"query_processor.middleware.annotate$native_cols$iter__33058__33062$fn__33063$fn__33064$iter__33065__33069$fn__33070.invoke(annotate.clj:50)"
"driver.common$values__GT_base_type.invokeStatic(common.clj:237)"
"driver.common$values__GT_base_type.invoke(common.clj:231)"
"query_processor.middleware.annotate$native_cols$iter__33058__33062$fn__33063$fn__33064.invoke(annotate.clj:49)"
"query_processor.middleware.annotate$native_cols$iter__33058__33062$fn__33063.invoke(annotate.clj:44)"
"query_processor.middleware.annotate$native_cols.invokeStatic(annotate.clj:44)"
"query_processor.middleware.annotate$native_cols.invoke(annotate.clj:39)"
"query_processor.middleware.annotate$add_native_column_info.invokeStatic(annotate.clj:58)"
"query_processor.middleware.annotate$add_native_column_info.invoke(annotate.clj:54)"
"query_processor.middleware.annotate$fn__33527$add_column_info_STAR___33532$fn__33539.invoke(annotate.clj:345)"
"query_processor.middleware.annotate$fn__33527$add_column_info_STAR___33532.invoke(annotate.clj:340)"
"query_processor.middleware.annotate$add_column_info$fn__33555.invoke(annotate.clj:363)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__35249.invoke(cumulative_aggregations.clj:57)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__39124.invoke(resolve_joins.clj:184)"
"query_processor.middleware.limit$limit$fn__36107.invoke(limit.clj:19)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__41885.invoke(results_metadata.clj:86)"
"query_processor.middleware.format_rows$format_rows$fn__36095.invoke(format_rows.clj:26)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__32190.invoke(add_dimension_projections.clj:234)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__33789.invoke(add_source_metadata.clj:127)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__39168.invoke(resolve_source_table.clj:48)"
"query_processor.middleware.add_row_count_and_status$add_row_count_and_status$fn__32952.invoke(add_row_count_and_status.clj:16)"
"query_processor.middleware.driver_specific$process_query_in_context$fn__35460.invoke(driver_specific.clj:12)"
"query_processor.middleware.resolve_driver$resolve_driver$fn__38788.invoke(resolve_driver.clj:23)"
"query_processor.middleware.bind_effective_timezone$bind_effective_timezone$fn__34649$fn__34650.invoke(bind_effective_timezone.clj:9)"
"util.date$call_with_effective_timezone.invokeStatic(date.clj:88)"
"util.date$call_with_effective_timezone.invoke(date.clj:77)"
"query_processor.middleware.bind_effective_timezone$bind_effective_timezone$fn__34649.invoke(bind_effective_timezone.clj:8)"
"query_processor.middleware.store$initialize_store$fn__41910$fn__41911.invoke(store.clj:11)"
"query_processor.store$do_with_new_store.invokeStatic(store.clj:41)"
"query_processor.store$do_with_new_store.invoke(store.clj:37)"
"query_processor.middleware.store$initialize_store$fn__41910.invoke(store.clj:10)"
"query_processor.middleware.async$async__GT_sync$fn__34250.invoke(async.clj:23)"
"query_processor.middleware.async_wait$runnable$fn__34375.invoke(async_wait.clj:89)"),
:query
{:type "native",
:native {:query "select [1, 2]", :template-tags {}},
:parameters [],
:async? true,
:middleware {:add-default-userland-constraints? true, :userland-query? true},
:info
{:executed-by 1,
:context :ad-hoc,
:card-id nil,
:nested? false,
:query-hash [20, 66, 5, -36, -46, 10, -105, -101, -68, 63, -102, -44, 10, 49, -98, 86, 115, -94, -122, 82, 52, 45, -17, 41, 48, 2, 15, 51, 124, 62, 103, 98]},
:constraints {:max-results 10000, :max-results-bare-rows 2000}}}`
**To Reproduce**
Steps to reproduce the behavior:
1. Select a BigQuery connection
2. Run a native query that returns an ARRAY: `select [1, 2]`
3. Casting error shows in the results area
**Expected behavior**
Results of the successful query should display.
**Information about your Metabase Installation:**
- Your browser and the version: Safari 13
- Your operating system: macOS 10.14.5
- Your databases: BigQuery
- Metabase version: 0.32.9 (bug experienced with 0.32.8 as well)
- Metabase hosting environment: Docker
- Metabase internal database: Postgres
**Severity**
I'm comfortable working around this or simply using the BigQuery UI, although I imagine it'd be a blocker for less technical users.
|
1.0
|
BigQuery column of ARRAY type in results causes a casting exception - **Describe the bug**
BigQuery queries that return a column of ARRAY type trigger a backend exception: `java.util.ArrayList cannot be cast to java.lang.String`
**Logs**
`Jun 28 09:43:54 WARN metabase.query-processor.middleware.process-userland-query :: Query failure {:status :failed,
:class java.lang.ClassCastException,
:error "java.util.ArrayList cannot be cast to java.lang.String",
:stacktrace
("--> driver.bigquery$fn__576.invokeStatic(bigquery.clj:215)"
"driver.bigquery$fn__576.invoke(bigquery.clj:215)"
"driver.bigquery$post_process_native$iter__610__614$fn__615$iter__640__644$fn__645.invoke(bigquery.clj:255)"
"query_processor.middleware.annotate$native_cols$iter__33058__33062$fn__33063$fn__33064$iter__33065__33069$fn__33070.invoke(annotate.clj:50)"
"driver.common$values__GT_base_type.invokeStatic(common.clj:237)"
"driver.common$values__GT_base_type.invoke(common.clj:231)"
"query_processor.middleware.annotate$native_cols$iter__33058__33062$fn__33063$fn__33064.invoke(annotate.clj:49)"
"query_processor.middleware.annotate$native_cols$iter__33058__33062$fn__33063.invoke(annotate.clj:44)"
"query_processor.middleware.annotate$native_cols.invokeStatic(annotate.clj:44)"
"query_processor.middleware.annotate$native_cols.invoke(annotate.clj:39)"
"query_processor.middleware.annotate$add_native_column_info.invokeStatic(annotate.clj:58)"
"query_processor.middleware.annotate$add_native_column_info.invoke(annotate.clj:54)"
"query_processor.middleware.annotate$fn__33527$add_column_info_STAR___33532$fn__33539.invoke(annotate.clj:345)"
"query_processor.middleware.annotate$fn__33527$add_column_info_STAR___33532.invoke(annotate.clj:340)"
"query_processor.middleware.annotate$add_column_info$fn__33555.invoke(annotate.clj:363)"
"query_processor.middleware.cumulative_aggregations$handle_cumulative_aggregations$fn__35249.invoke(cumulative_aggregations.clj:57)"
"query_processor.middleware.resolve_joins$resolve_joins$fn__39124.invoke(resolve_joins.clj:184)"
"query_processor.middleware.limit$limit$fn__36107.invoke(limit.clj:19)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__41885.invoke(results_metadata.clj:86)"
"query_processor.middleware.format_rows$format_rows$fn__36095.invoke(format_rows.clj:26)"
"query_processor.middleware.add_dimension_projections$add_remapping$fn__32190.invoke(add_dimension_projections.clj:234)"
"query_processor.middleware.add_source_metadata$add_source_metadata_for_source_queries$fn__33789.invoke(add_source_metadata.clj:127)"
"query_processor.middleware.resolve_source_table$resolve_source_tables$fn__39168.invoke(resolve_source_table.clj:48)"
"query_processor.middleware.add_row_count_and_status$add_row_count_and_status$fn__32952.invoke(add_row_count_and_status.clj:16)"
"query_processor.middleware.driver_specific$process_query_in_context$fn__35460.invoke(driver_specific.clj:12)"
"query_processor.middleware.resolve_driver$resolve_driver$fn__38788.invoke(resolve_driver.clj:23)"
"query_processor.middleware.bind_effective_timezone$bind_effective_timezone$fn__34649$fn__34650.invoke(bind_effective_timezone.clj:9)"
"util.date$call_with_effective_timezone.invokeStatic(date.clj:88)"
"util.date$call_with_effective_timezone.invoke(date.clj:77)"
"query_processor.middleware.bind_effective_timezone$bind_effective_timezone$fn__34649.invoke(bind_effective_timezone.clj:8)"
"query_processor.middleware.store$initialize_store$fn__41910$fn__41911.invoke(store.clj:11)"
"query_processor.store$do_with_new_store.invokeStatic(store.clj:41)"
"query_processor.store$do_with_new_store.invoke(store.clj:37)"
"query_processor.middleware.store$initialize_store$fn__41910.invoke(store.clj:10)"
"query_processor.middleware.async$async__GT_sync$fn__34250.invoke(async.clj:23)"
"query_processor.middleware.async_wait$runnable$fn__34375.invoke(async_wait.clj:89)"),
:query
{:type "native",
:native {:query "select [1, 2]", :template-tags {}},
:parameters [],
:async? true,
:middleware {:add-default-userland-constraints? true, :userland-query? true},
:info
{:executed-by 1,
:context :ad-hoc,
:card-id nil,
:nested? false,
:query-hash [20, 66, 5, -36, -46, 10, -105, -101, -68, 63, -102, -44, 10, 49, -98, 86, 115, -94, -122, 82, 52, 45, -17, 41, 48, 2, 15, 51, 124, 62, 103, 98]},
:constraints {:max-results 10000, :max-results-bare-rows 2000}}}`
**To Reproduce**
Steps to reproduce the behavior:
1. Select a BigQuery connection
2. Run a native query that returns an ARRAY: `select [1, 2]`
3. Casting error shows in the results area
**Expected behavior**
Results of the successful query should display.
**Information about your Metabase Installation:**
- Your browser and the version: Safari 13
- Your operating system: macOS 10.14.5
- Your databases: BigQuery
- Metabase version: 0.32.9 (bug experienced with 0.32.8 as well)
- Metabase hosting environment: Docker
- Metabase internal database: Postgres
**Severity**
I'm comfortable working around this or simply using the BigQuery UI, although I imagine it'd be a blocker for less technical users.
|
process
|
bigquery column of array type in results causes a casting exception describe the bug bigquery queries that return a column of array type trigger a backend exception java util arraylist cannot be cast to java lang string logs jun warn metabase query processor middleware process userland query query failure status failed class java lang classcastexception error java util arraylist cannot be cast to java lang string stacktrace driver bigquery fn invokestatic bigquery clj driver bigquery fn invoke bigquery clj driver bigquery post process native iter fn iter fn invoke bigquery clj query processor middleware annotate native cols iter fn fn iter fn invoke annotate clj driver common values gt base type invokestatic common clj driver common values gt base type invoke common clj query processor middleware annotate native cols iter fn fn invoke annotate clj query processor middleware annotate native cols iter fn invoke annotate clj query processor middleware annotate native cols invokestatic annotate clj query processor middleware annotate native cols invoke annotate clj query processor middleware annotate add native column info invokestatic annotate clj query processor middleware annotate add native column info invoke annotate clj query processor middleware annotate fn add column info star fn invoke annotate clj query processor middleware annotate fn add column info star invoke annotate clj query processor middleware annotate add column info fn invoke annotate clj query processor middleware cumulative aggregations handle cumulative aggregations fn invoke cumulative aggregations clj query processor middleware resolve joins resolve joins fn invoke resolve joins clj query processor middleware limit limit fn invoke limit clj query processor middleware results metadata record and return metadata bang fn invoke results metadata clj query processor middleware format rows format rows fn invoke format rows clj query processor middleware add dimension projections add remapping fn invoke add dimension projections clj query processor middleware add source metadata add source metadata for source queries fn invoke add source metadata clj query processor middleware resolve source table resolve source tables fn invoke resolve source table clj query processor middleware add row count and status add row count and status fn invoke add row count and status clj query processor middleware driver specific process query in context fn invoke driver specific clj query processor middleware resolve driver resolve driver fn invoke resolve driver clj query processor middleware bind effective timezone bind effective timezone fn fn invoke bind effective timezone clj util date call with effective timezone invokestatic date clj util date call with effective timezone invoke date clj query processor middleware bind effective timezone bind effective timezone fn invoke bind effective timezone clj query processor middleware store initialize store fn fn invoke store clj query processor store do with new store invokestatic store clj query processor store do with new store invoke store clj query processor middleware store initialize store fn invoke store clj query processor middleware async async gt sync fn invoke async clj query processor middleware async wait runnable fn invoke async wait clj query type native native query select template tags parameters async true middleware add default userland constraints true userland query true info executed by context ad hoc card id nil nested false query hash constraints max results max results bare rows to reproduce steps to reproduce the behavior select a bigquery connection run a native query that returns an array select casting error shows in the results area expected behavior results of the successful query should display information about your metabase installation your browser and the version safari your operating system macos your databases bigquery metabase version bug experienced with as well metabase hosting environment docker metabase internal database postgres severity i m comfortable working around this or simply using the bigquery ui although i imagine it d be a blocker for less technical users
| 1
|
122,573
| 10,227,363,153
|
IssuesEvent
|
2019-08-16 20:32:58
|
dexpenses/dexpenses-extract
|
https://api.github.com/repos/dexpenses/dexpenses-extract
|
opened
|
Implement test receipt normal/hmue-vockeroth-bad-print-debit
|
enhancement test-data
|
Receipt to implement:

|
1.0
|
Implement test receipt normal/hmue-vockeroth-bad-print-debit - Receipt to implement:

|
non_process
|
implement test receipt normal hmue vockeroth bad print debit receipt to implement normal hmue vockeroth bad print debit
| 0
|
18,620
| 24,579,489,040
|
IssuesEvent
|
2022-10-13 14:38:47
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Consent API] Data sharing Consent Artifacts are not getting created
|
Bug Blocker P0 Process: Fixed Process: Tested QA Process: Tested dev
|
Data sharing Consent Artifacts are not getting created when participant has provided data sharing permission
|
3.0
|
[Consent API] Data sharing Consent Artifacts are not getting created - Data sharing Consent Artifacts are not getting created when participant has provided data sharing permission
|
process
|
data sharing consent artifacts are not getting created data sharing consent artifacts are not getting created when participant has provided data sharing permission
| 1
|
63,796
| 14,656,784,315
|
IssuesEvent
|
2020-12-28 14:11:29
|
fu1771695yongxie/react-bootstrap
|
https://api.github.com/repos/fu1771695yongxie/react-bootstrap
|
opened
|
CVE-2019-8331 (Medium) detected in bootstrap-3.3.5.min.js
|
security vulnerability
|
## CVE-2019-8331 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.5.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js</a></p>
<p>Path to dependency file: react-bootstrap/www/node_modules/autocomplete.js/test/playground_jquery.html</p>
<p>Path to vulnerable library: react-bootstrap/www/node_modules/autocomplete.js/test/playground_jquery.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.5.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/react-bootstrap/commit/7965fbc9111ff1b9a9d215719a8f3b485f4170c3">7965fbc9111ff1b9a9d215719a8f3b485f4170c3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute.
<p>Publish Date: 2019-02-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/twbs/bootstrap/pull/28236">https://github.com/twbs/bootstrap/pull/28236</a></p>
<p>Release Date: 2019-02-20</p>
<p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-8331 (Medium) detected in bootstrap-3.3.5.min.js - ## CVE-2019-8331 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.5.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.5/js/bootstrap.min.js</a></p>
<p>Path to dependency file: react-bootstrap/www/node_modules/autocomplete.js/test/playground_jquery.html</p>
<p>Path to vulnerable library: react-bootstrap/www/node_modules/autocomplete.js/test/playground_jquery.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.5.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/react-bootstrap/commit/7965fbc9111ff1b9a9d215719a8f3b485f4170c3">7965fbc9111ff1b9a9d215719a8f3b485f4170c3</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute.
<p>Publish Date: 2019-02-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/twbs/bootstrap/pull/28236">https://github.com/twbs/bootstrap/pull/28236</a></p>
<p>Release Date: 2019-02-20</p>
<p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file react bootstrap www node modules autocomplete js test playground jquery html path to vulnerable library react bootstrap www node modules autocomplete js test playground jquery html dependency hierarchy x bootstrap min js vulnerable library found in head commit a href found in base branch master vulnerability details in bootstrap before and x before xss is possible in the tooltip or popover data template attribute publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap bootstrap sass step up your open source security game with whitesource
| 0
|
96,631
| 3,971,347,118
|
IssuesEvent
|
2016-05-04 11:28:52
|
OCHA-DAP/hdx-ckan
|
https://api.github.com/repos/OCHA-DAP/hdx-ckan
|
closed
|
Resource Edit: display of long filenames
|
Priority-Low
|

Can we break lines at something other than spaces? Could we allow line breaks at "_" or "-" as well?
And while we're at it, could we add a bit more space between lines:

|
1.0
|
Resource Edit: display of long filenames - 
Can we break lines at something other than spaces? Could we allow line breaks at "_" or "-" as well?
And while we're at it, could we add a bit more space between lines:

|
non_process
|
resource edit display of long filenames can we break lines at something other than spaces could we allow line breaks at or as well and while we re at it could we add a bit more space between lines
| 0
|
8,856
| 11,955,860,757
|
IssuesEvent
|
2020-04-04 07:16:25
|
utopia-rise/godot-kotlin
|
https://api.github.com/repos/utopia-rise/godot-kotlin
|
closed
|
Implement godot-annotation-processor
|
merge tools:annotation-processor
|
Implement a annotation processor like here: https://github.com/utopia-rise/godot-kotlin/tree/master/tools/godot-annotation-processor
The annotation processor should listen to the annotation's defined here: #68 , #70 , #69.
It should gather the descriptor classes in these lists:
- classDescriptors
- propertyDescriptors
- propertyDescriptors (for signal annotations)
- funcDescriptors
It should gather those informations using the mpApt project: https://github.com/Foso/MpApt
But should NOT pass any MpApt specific informations to the godot-entry-generator so it does not need any dependency to MpApt!
|
1.0
|
Implement godot-annotation-processor - Implement a annotation processor like here: https://github.com/utopia-rise/godot-kotlin/tree/master/tools/godot-annotation-processor
The annotation processor should listen to the annotation's defined here: #68 , #70 , #69.
It should gather the descriptor classes in these lists:
- classDescriptors
- propertyDescriptors
- propertyDescriptors (for signal annotations)
- funcDescriptors
It should gather those informations using the mpApt project: https://github.com/Foso/MpApt
But should NOT pass any MpApt specific informations to the godot-entry-generator so it does not need any dependency to MpApt!
|
process
|
implement godot annotation processor implement a annotation processor like here the annotation processor should listen to the annotation s defined here it should gather the descriptor classes in these lists classdescriptors propertydescriptors propertydescriptors for signal annotations funcdescriptors it should gather those informations using the mpapt project but should not pass any mpapt specific informations to the godot entry generator so it does not need any dependency to mpapt
| 1
|
12,166
| 14,741,579,795
|
IssuesEvent
|
2021-01-07 10:50:30
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
SA Billing - Toronto - Invalid Late Fees
|
anc-process anp-1 ant-bug has attachment
|
In GitLab by @kdjstudios on Jan 25, 2019, 08:10
**Submitted by:** "Denise Joseph" <denise.joseph@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-01-15-37793
**Server:** Internal
**Client/Site:** Toronto
**Account:** NA
**Issue:**
Please find attached the late fees list updated with comments for your review.
The stage was applied to the affected accounts.
Please let me know if you need anything further.
[Toronto+1-15-19+_4_.xlsx](/uploads/f34cdd9c18395a556ce669ca9f186089/Toronto+1-15-19+_4_.xlsx)
|
1.0
|
SA Billing - Toronto - Invalid Late Fees - In GitLab by @kdjstudios on Jan 25, 2019, 08:10
**Submitted by:** "Denise Joseph" <denise.joseph@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-01-15-37793
**Server:** Internal
**Client/Site:** Toronto
**Account:** NA
**Issue:**
Please find attached the late fees list updated with comments for your review.
The stage was applied to the affected accounts.
Please let me know if you need anything further.
[Toronto+1-15-19+_4_.xlsx](/uploads/f34cdd9c18395a556ce669ca9f186089/Toronto+1-15-19+_4_.xlsx)
|
process
|
sa billing toronto invalid late fees in gitlab by kdjstudios on jan submitted by denise joseph helpdesk server internal client site toronto account na issue please find attached the late fees list updated with comments for your review the stage was applied to the affected accounts please let me know if you need anything further uploads toronto xlsx
| 1
|
470,900
| 13,548,956,676
|
IssuesEvent
|
2020-09-17 07:27:40
|
ballerina-platform/ballerina-lang
|
https://api.github.com/repos/ballerina-platform/ballerina-lang
|
closed
|
Move `grpc` to another github repo
|
Points/9 Priority/High SwanLakeDump Team/StandardLibs Type/Task
|
**Description:**
This is part of the standard library migration effort.
|
1.0
|
Move `grpc` to another github repo - **Description:**
This is part of the standard library migration effort.
|
non_process
|
move grpc to another github repo description this is part of the standard library migration effort
| 0
|
6,841
| 9,984,677,938
|
IssuesEvent
|
2019-07-10 14:58:30
|
CATcher-org/CATcher
|
https://api.github.com/repos/CATcher-org/CATcher
|
closed
|
Feature: Phase Determination by reading file from public_data repo
|
aspect-Process category.Enhancement
|
As part of the Epic #101 ,
Determine the app's phase by reading a `phase.json` file from the `public_data` repo.
|
1.0
|
Feature: Phase Determination by reading file from public_data repo - As part of the Epic #101 ,
Determine the app's phase by reading a `phase.json` file from the `public_data` repo.
|
process
|
feature phase determination by reading file from public data repo as part of the epic determine the app s phase by reading a phase json file from the public data repo
| 1
|
61,973
| 8,564,410,745
|
IssuesEvent
|
2018-11-09 16:39:46
|
WeblateOrg/weblate
|
https://api.github.com/repos/WeblateOrg/weblate
|
closed
|
What is the best method of installation for production?
|
documentation good first issue hacktoberfest help wanted
|
In documentation it is not really clear - is virtualenv method of installing Weblate recommended in general or only for those who want to play around with it temporarily?
> This is recommended method if you don’t want to dig into details.
So "don't dig into details" - means you want to have it "guaranteed to just work" or "get it running no matter what, quickly" ?
Confusing that documentation suggests installing into /tmp/weblate which looks like not a good idea for something non-temporary.
I've been using Weblate for some years under Debian with pip, however if you stop upgrading Debian to next major release, after some time python libraries drift in version comparing to what's required by Weblate. and you no longer able to upgrade Weblate. Maybe some OS is better suited, is it Debian or CentOS?
What is the best case environment for Weblate?
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/64307150-what-is-the-best-method-of-installation-for-production?utm_campaign=plugin&utm_content=tracker%2F253393&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F253393&utm_medium=issues&utm_source=github).
</bountysource-plugin>
|
1.0
|
What is the best method of installation for production? - In documentation it is not really clear - is virtualenv method of installing Weblate recommended in general or only for those who want to play around with it temporarily?
> This is recommended method if you don’t want to dig into details.
So "don't dig into details" - means you want to have it "guaranteed to just work" or "get it running no matter what, quickly" ?
Confusing that documentation suggests installing into /tmp/weblate which looks like not a good idea for something non-temporary.
I've been using Weblate for some years under Debian with pip, however if you stop upgrading Debian to next major release, after some time python libraries drift in version comparing to what's required by Weblate. and you no longer able to upgrade Weblate. Maybe some OS is better suited, is it Debian or CentOS?
What is the best case environment for Weblate?
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/64307150-what-is-the-best-method-of-installation-for-production?utm_campaign=plugin&utm_content=tracker%2F253393&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F253393&utm_medium=issues&utm_source=github).
</bountysource-plugin>
|
non_process
|
what is the best method of installation for production in documentation it is not really clear is virtualenv method of installing weblate recommended in general or only for those who want to play around with it temporarily this is recommended method if you don’t want to dig into details so don t dig into details means you want to have it guaranteed to just work or get it running no matter what quickly confusing that documentation suggests installing into tmp weblate which looks like not a good idea for something non temporary i ve been using weblate for some years under debian with pip however if you stop upgrading debian to next major release after some time python libraries drift in version comparing to what s required by weblate and you no longer able to upgrade weblate maybe some os is better suited is it debian or centos what is the best case environment for weblate want to back this issue we accept bounties via
| 0
|
17,959
| 4,217,901,444
|
IssuesEvent
|
2016-06-30 14:30:02
|
meew0/discordrb
|
https://api.github.com/repos/meew0/discordrb
|
closed
|
Commands could be assigned to multiple symbols
|
commands documentation
|
For example, aliases; shorthands for a long command symbol; or just a different name. `CommandContainer#command` could optionally take an array in place of the single symbol we have right now and the command is assigned to each of the symbols in the array. What do you think?
|
1.0
|
Commands could be assigned to multiple symbols - For example, aliases; shorthands for a long command symbol; or just a different name. `CommandContainer#command` could optionally take an array in place of the single symbol we have right now and the command is assigned to each of the symbols in the array. What do you think?
|
non_process
|
commands could be assigned to multiple symbols for example aliases shorthands for a long command symbol or just a different name commandcontainer command could optionally take an array in place of the single symbol we have right now and the command is assigned to each of the symbols in the array what do you think
| 0
|
16,076
| 20,249,011,485
|
IssuesEvent
|
2022-02-14 16:10:31
|
Bone008/orbiteye
|
https://api.github.com/repos/Bone008/orbiteye
|
opened
|
Combine SATCAT and UCS dataset into JSON
|
data processing
|
See [this spreadsheet](https://docs.google.com/spreadsheets/d/1bWLvvn11pkFoWONQY0W00fq5BEsncXgFrQVKWN93NWM/edit?usp=sharing) for info on how to combine each row and what processing is required.
Final project should be a single JSON file which is a list of objects.
|
1.0
|
Combine SATCAT and UCS dataset into JSON - See [this spreadsheet](https://docs.google.com/spreadsheets/d/1bWLvvn11pkFoWONQY0W00fq5BEsncXgFrQVKWN93NWM/edit?usp=sharing) for info on how to combine each row and what processing is required.
Final project should be a single JSON file which is a list of objects.
|
process
|
combine satcat and ucs dataset into json see for info on how to combine each row and what processing is required final project should be a single json file which is a list of objects
| 1
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.