| Column | Dtype | Stats (min – max / distinct) |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 – 19 |
| repo | stringlengths | 4 – 112 |
| repo_url | stringlengths | 33 – 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 – 1.02k |
| labels | stringlengths | 4 – 1.54k |
| body | stringlengths | 1 – 262k |
| index | stringclasses | 17 values |
| text_combine | stringlengths | 95 – 262k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 – 252k |
| binary_label | int64 | 0 – 1 |

The records below list their field values in this column order, one field per line.
140,130
18,895,235,951
IssuesEvent
2021-11-15 17:08:30
bgoonz/searchAwesome
https://api.github.com/repos/bgoonz/searchAwesome
closed
CVE-2020-7774 (High) detected in y18n-3.2.1.tgz, y18n-4.0.0.tgz
security vulnerability
## CVE-2020-7774 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>y18n-3.2.1.tgz</b>, <b>y18n-4.0.0.tgz</b></p></summary> <p> <details><summary><b>y18n-3.2.1.tgz</b></p></summary> <p>the bare-bones internationalization library used by yargs</p> <p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz">https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz</a></p> <p>Path to dependency file: searchAwesome/clones/awesome-stacks/package.json</p> <p>Path to vulnerable library: searchAwesome/clones/awesome-stacks/node_modules/y18n/package.json</p> <p> Dependency Hierarchy: - gatsby-cli-2.5.7.tgz (Root Library) - yargs-12.0.5.tgz - :x: **y18n-3.2.1.tgz** (Vulnerable Library) </details> <details><summary><b>y18n-4.0.0.tgz</b></p></summary> <p>the bare-bones internationalization library used by yargs</p> <p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz">https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz</a></p> <p>Path to dependency file: searchAwesome/clones/awesome-wpo/website/package.json</p> <p>Path to vulnerable library: searchAwesome/clones/awesome-wpo/website/node_modules/y18n/package.json,searchAwesome/clones/awesome-stacks/node_modules/cacache/node_modules/y18n/package.json</p> <p> Dependency Hierarchy: - gatsby-2.1.19.tgz (Root Library) - terser-webpack-plugin-1.2.3.tgz - cacache-11.3.2.tgz - :x: **y18n-4.0.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/bgoonz/searchAwesome/commit/cb1b8421c464b43b24d4816929e575612a00cd49">cb1b8421c464b43b24d4816929e575612a00cd49</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package y18n before 3.2.2, 4.0.1 and 
5.0.5. PoC by po6ix: const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true <p>Publish Date: 2020-11-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1654">https://www.npmjs.com/advisories/1654</a></p> <p>Release Date: 2020-11-17</p> <p>Fix Resolution: 3.2.2, 4.0.1, 5.0.5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7774 (High) detected in y18n-3.2.1.tgz, y18n-4.0.0.tgz - ## CVE-2020-7774 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>y18n-3.2.1.tgz</b>, <b>y18n-4.0.0.tgz</b></p></summary> <p> <details><summary><b>y18n-3.2.1.tgz</b></p></summary> <p>the bare-bones internationalization library used by yargs</p> <p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz">https://registry.npmjs.org/y18n/-/y18n-3.2.1.tgz</a></p> <p>Path to dependency file: searchAwesome/clones/awesome-stacks/package.json</p> <p>Path to vulnerable library: searchAwesome/clones/awesome-stacks/node_modules/y18n/package.json</p> <p> Dependency Hierarchy: - gatsby-cli-2.5.7.tgz (Root Library) - yargs-12.0.5.tgz - :x: **y18n-3.2.1.tgz** (Vulnerable Library) </details> <details><summary><b>y18n-4.0.0.tgz</b></p></summary> <p>the bare-bones internationalization library used by yargs</p> <p>Library home page: <a href="https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz">https://registry.npmjs.org/y18n/-/y18n-4.0.0.tgz</a></p> <p>Path to dependency file: searchAwesome/clones/awesome-wpo/website/package.json</p> <p>Path to vulnerable library: searchAwesome/clones/awesome-wpo/website/node_modules/y18n/package.json,searchAwesome/clones/awesome-stacks/node_modules/cacache/node_modules/y18n/package.json</p> <p> Dependency Hierarchy: - gatsby-2.1.19.tgz (Root Library) - terser-webpack-plugin-1.2.3.tgz - cacache-11.3.2.tgz - :x: **y18n-4.0.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/bgoonz/searchAwesome/commit/cb1b8421c464b43b24d4816929e575612a00cd49">cb1b8421c464b43b24d4816929e575612a00cd49</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability 
Details</summary> <p> This affects the package y18n before 3.2.2, 4.0.1 and 5.0.5. PoC by po6ix: const y18n = require('y18n')(); y18n.setLocale('__proto__'); y18n.updateLocale({polluted: true}); console.log(polluted); // true <p>Publish Date: 2020-11-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7774>CVE-2020-7774</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1654">https://www.npmjs.com/advisories/1654</a></p> <p>Release Date: 2020-11-17</p> <p>Fix Resolution: 3.2.2, 4.0.1, 5.0.5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve high detected in tgz tgz cve high severity vulnerability vulnerable libraries tgz tgz tgz the bare bones internationalization library used by yargs library home page a href path to dependency file searchawesome clones awesome stacks package json path to vulnerable library searchawesome clones awesome stacks node modules package json dependency hierarchy gatsby cli tgz root library yargs tgz x tgz vulnerable library tgz the bare bones internationalization library used by yargs library home page a href path to dependency file searchawesome clones awesome wpo website package json path to vulnerable library searchawesome clones awesome wpo website node modules package json searchawesome clones awesome stacks node modules cacache node modules package json dependency hierarchy gatsby tgz root library terser webpack plugin tgz cacache tgz x tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects the package before and poc by const require setlocale proto updatelocale polluted true console log polluted true publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
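The CVE-2020-7774 record above includes a PoC that pollutes `Object.prototype` through y18n's locale cache. The following self-contained sketch reproduces the same pattern without the real y18n package: the function and variable names (`cache`, `updateLocale`) are illustrative stand-ins for y18n's internals, chosen only to show how an attacker-controlled key of `'__proto__'` reaches the shared prototype.

```javascript
// Minimal illustration of the prototype-pollution pattern behind CVE-2020-7774.
// `cache` stands in for y18n's per-locale cache; `updateLocale` mirrors the
// vulnerable pattern of indexing that cache with an unfiltered locale name.
const cache = {};

function updateLocale(locale, updates) {
  // When locale === '__proto__', `cache[locale]` resolves to Object.prototype,
  // so the writes below land on every plain object in the process.
  cache[locale] = cache[locale] || {};
  for (const key of Object.keys(updates)) {
    cache[locale][key] = updates[key];
  }
}

updateLocale('__proto__', { polluted: true });
console.log({}.polluted); // prints true: a fresh object inherits the injected key
```

The fix shipped in y18n 3.2.2 / 4.0.1 / 5.0.5 amounts to refusing `__proto__` as a cache key; using a prototype-less store (`Object.create(null)`) or a `Map` closes the same hole.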
37,684
6,626,474,070
IssuesEvent
2017-09-22 19:43:12
Optum/ChaoSlingr
https://api.github.com/repos/Optum/ChaoSlingr
closed
Figure out flow for outside contributions
documentation
Figure out how submitting request from outside contributors works
1.0
Figure out flow for outside contributions - Figure out how submitting request from outside contributors works
non_test
figure out flow for outside contributions figure out how submitting request from outside contributors works
0
337,336
30,247,489,684
IssuesEvent
2023-07-06 17:39:29
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
reopened
Fix indexing_slicing_joining_mutating_ops.test_torch_transpose
PyTorch Frontend Sub Task Failing Test
| | | |---|---| |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5478319447"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5467750133/jobs/9954465931"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5478319447"><img src=https://img.shields.io/badge/-failure-red></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5478319447"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5467750133/jobs/9954465931"><img src=https://img.shields.io/badge/-success-success></a>
1.0
Fix indexing_slicing_joining_mutating_ops.test_torch_transpose - | | | |---|---| |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5478319447"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5467750133/jobs/9954465931"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5478319447"><img src=https://img.shields.io/badge/-failure-red></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5478319447"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5467750133/jobs/9954465931"><img src=https://img.shields.io/badge/-success-success></a>
test
fix indexing slicing joining mutating ops test torch transpose jax a href src numpy a href src tensorflow a href src torch a href src paddle a href src
1
205,779
16,008,933,276
IssuesEvent
2021-04-20 08:09:22
ptarmiganlabs/butler-sos
https://api.github.com/repos/ptarmiganlabs/butler-sos
closed
Document dependency on InfluxDB 1.x
documentation
**Describe the bug** While InfluxDB 2.x is a mighty fine database, it is not entirely backwards compatible with v1.x (which Butler still uses). Document this dependency and stress the need to use for example InfluxDB 1.8.4 (which is latest as of this writing).
1.0
Document dependency on InfluxDB 1.x - **Describe the bug** While InfluxDB 2.x is a mighty fine database, it is not entirely backwards compatible with v1.x (which Butler still uses). Document this dependency and stress the need to use for example InfluxDB 1.8.4 (which is latest as of this writing).
non_test
document dependency on influxdb x describe the bug while influxdb x is a mighty fine database it is not entirely backwards compatible with x which butler still uses document this dependency and stress the need to use for example influxdb which is latest as of this writing
0
53,832
13,262,361,571
IssuesEvent
2020-08-20 21:40:15
icecube-trac/tix4
https://api.github.com/repos/icecube-trac/tix4
closed
On wimpsim_reader (Trac #2156)
Migrated from Trac combo simulation defect
Found when running this script: ```text from I3Tray import * from icecube import icetray, dataclasses, dataio from icecube import simclasses, wimpsim_reader, phys_services from icecube.wimpsim_reader import WimpSimReaderEarth tray = I3Tray() tray.AddService("I3SPRNGRandomServiceFactory","Random", NStreams = 2, Seed = 42, StreamNum = 1, InstallServiceAs = "I3RandomService") tray.AddSegment(WimpSimReaderEarth,"EarthWimpsim-reader", Infile = '/data/user/grenzi/data/EarthWimpData/Earth/we-m50-ch11-earth.010010.000100.dat', #GCDFileName = '/data/user/grenzi/data/EarthWimpData/gcd/GeoCalibDetectorStatus_IC86.55697_V2.i3.gz' StartMJD=55555, EndMJD=55666, ) def prettyprint(frame): icetray.logging.log_info("=====================") icetray.logging.log_info(str(frame["I3EventHeader"])) icetray.logging.log_info(str(frame["WIMP_params"])) icetray.logging.log_info(str(frame["I3MCTree"])) tray.AddModule(prettyprint, "print", Streams = [icetray.I3Frame.Physics, icetray.I3Frame.DAQ]) tray.AddModule( 'I3Writer', 'EventWriter', Filename = "/data/user/grenzi/data/EarthWimpData/Earth/we-m50-ch11-earth.010010.000100.i3.bz2", #Streams = [icetray.I3Frame.Physics, icetray.I3Frame.DAQ], DropOrphanStreams = [icetray.I3Frame.DAQ] ) tray.Execute() del tray ``` The segments in this submodule: http://code.icecube.wisc.edu/projects/icecube/browser/IceCube/projects/wimpsim-reader/trunk/python/wimpsimreader.py Are set with not acceptable values of ```InjectionRadius``` Both: ```text 29 InjectionRadius = 0*I3Units.meter , #default 0*I3Units.meter ``` ```text 74 InjectionRadius = 0*I3Units.meter , #default 0*I3Units.meter ``` While in /IceCube/projects/wimpsim-reader/trunk/private/wimpsim-reader/I3WimpSimReader.cxx we have: ```text 180 if (radius_<=0.) 
181 log_fatal("Injection radius must be positive and not zero"); ``` Hence, running this segments with whatever the inputs gives the error: ```text RuntimeError: Injection radius must be positive and not zero (in virtual void I3WimpSimReader::Configure()) ``` Also, it is commented that default is 0, but the log message when running is ```text InjectionRadius Description : If >0, events will be injected in cylinder with zmin, zmax height Default : nan Configured : 0.0 ``` <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/2156">https://code.icecube.wisc.edu/projects/icecube/ticket/2156</a>, reported by grenziand owned by mjl5147</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:15:23", "_ts": "1550067323910946", "description": "Found when running this script:\n\n{{{\nfrom I3Tray import *\nfrom icecube import icetray, dataclasses, dataio\nfrom icecube import simclasses, wimpsim_reader, phys_services\nfrom icecube.wimpsim_reader import WimpSimReaderEarth\n\ntray = I3Tray()\n\ntray.AddService(\"I3SPRNGRandomServiceFactory\",\"Random\",\n NStreams = 2,\n Seed = 42,\n StreamNum = 1,\n InstallServiceAs = \"I3RandomService\")\n\ntray.AddSegment(WimpSimReaderEarth,\"EarthWimpsim-reader\",\n Infile = '/data/user/grenzi/data/EarthWimpData/Earth/we-m50-ch11-earth.010010.000100.dat',\n #GCDFileName = '/data/user/grenzi/data/EarthWimpData/gcd/GeoCalibDetectorStatus_IC86.55697_V2.i3.gz'\n StartMJD=55555,\n EndMJD=55666,\n )\n\ndef prettyprint(frame):\n icetray.logging.log_info(\"=====================\")\n icetray.logging.log_info(str(frame[\"I3EventHeader\"]))\n icetray.logging.log_info(str(frame[\"WIMP_params\"]))\n icetray.logging.log_info(str(frame[\"I3MCTree\"]))\ntray.AddModule(prettyprint, \"print\",\n Streams = [icetray.I3Frame.Physics, icetray.I3Frame.DAQ])\n\ntray.AddModule( 'I3Writer', 'EventWriter',\n Filename = 
\"/data/user/grenzi/data/EarthWimpData/Earth/we-m50-ch11-earth.010010.000100.i3.bz2\",\n #Streams = [icetray.I3Frame.Physics, icetray.I3Frame.DAQ],\n DropOrphanStreams = [icetray.I3Frame.DAQ]\n )\n\ntray.Execute()\n\ndel tray\n\n}}}\n\nThe segments in this submodule: http://code.icecube.wisc.edu/projects/icecube/browser/IceCube/projects/wimpsim-reader/trunk/python/wimpsimreader.py\n\nAre set with not acceptable values of {{{InjectionRadius}}}\n\nBoth:\n\n{{{\n29\t InjectionRadius = 0*I3Units.meter , #default 0*I3Units.meter\n}}}\n\n{{{\n74\t InjectionRadius = 0*I3Units.meter , #default 0*I3Units.meter\n}}}\n\nWhile in /IceCube/projects/wimpsim-reader/trunk/private/wimpsim-reader/I3WimpSimReader.cxx we have:\n\n{{{\n180\t if (radius_<=0.)\n181\t log_fatal(\"Injection radius must be positive and not zero\");\n}}}\n\nHence, running this segments with whatever the inputs gives the error:\n\n{{{\nRuntimeError: Injection radius must be positive and not zero (in virtual void I3WimpSimReader::Configure())\n}}}\n\nAlso, it is commented that default is 0, but the log message when running is \n{{{\nInjectionRadius\n Description : If >0, events will be injected in cylinder with zmin, zmax height\n Default : nan\n Configured : 0.0\n}}} \n\n", "reporter": "grenzi", "cc": "", "resolution": "fixed", "time": "2018-05-17T15:55:29", "component": "combo simulation", "summary": "On wimpsim_reader", "priority": "normal", "keywords": "", "milestone": "", "owner": "mjl5147", "type": "defect" } ``` </p> </details>
1.0
On wimpsim_reader (Trac #2156) - Found when running this script: ```text from I3Tray import * from icecube import icetray, dataclasses, dataio from icecube import simclasses, wimpsim_reader, phys_services from icecube.wimpsim_reader import WimpSimReaderEarth tray = I3Tray() tray.AddService("I3SPRNGRandomServiceFactory","Random", NStreams = 2, Seed = 42, StreamNum = 1, InstallServiceAs = "I3RandomService") tray.AddSegment(WimpSimReaderEarth,"EarthWimpsim-reader", Infile = '/data/user/grenzi/data/EarthWimpData/Earth/we-m50-ch11-earth.010010.000100.dat', #GCDFileName = '/data/user/grenzi/data/EarthWimpData/gcd/GeoCalibDetectorStatus_IC86.55697_V2.i3.gz' StartMJD=55555, EndMJD=55666, ) def prettyprint(frame): icetray.logging.log_info("=====================") icetray.logging.log_info(str(frame["I3EventHeader"])) icetray.logging.log_info(str(frame["WIMP_params"])) icetray.logging.log_info(str(frame["I3MCTree"])) tray.AddModule(prettyprint, "print", Streams = [icetray.I3Frame.Physics, icetray.I3Frame.DAQ]) tray.AddModule( 'I3Writer', 'EventWriter', Filename = "/data/user/grenzi/data/EarthWimpData/Earth/we-m50-ch11-earth.010010.000100.i3.bz2", #Streams = [icetray.I3Frame.Physics, icetray.I3Frame.DAQ], DropOrphanStreams = [icetray.I3Frame.DAQ] ) tray.Execute() del tray ``` The segments in this submodule: http://code.icecube.wisc.edu/projects/icecube/browser/IceCube/projects/wimpsim-reader/trunk/python/wimpsimreader.py Are set with not acceptable values of ```InjectionRadius``` Both: ```text 29 InjectionRadius = 0*I3Units.meter , #default 0*I3Units.meter ``` ```text 74 InjectionRadius = 0*I3Units.meter , #default 0*I3Units.meter ``` While in /IceCube/projects/wimpsim-reader/trunk/private/wimpsim-reader/I3WimpSimReader.cxx we have: ```text 180 if (radius_<=0.) 
181 log_fatal("Injection radius must be positive and not zero"); ``` Hence, running this segments with whatever the inputs gives the error: ```text RuntimeError: Injection radius must be positive and not zero (in virtual void I3WimpSimReader::Configure()) ``` Also, it is commented that default is 0, but the log message when running is ```text InjectionRadius Description : If >0, events will be injected in cylinder with zmin, zmax height Default : nan Configured : 0.0 ``` <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/2156">https://code.icecube.wisc.edu/projects/icecube/ticket/2156</a>, reported by grenziand owned by mjl5147</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:15:23", "_ts": "1550067323910946", "description": "Found when running this script:\n\n{{{\nfrom I3Tray import *\nfrom icecube import icetray, dataclasses, dataio\nfrom icecube import simclasses, wimpsim_reader, phys_services\nfrom icecube.wimpsim_reader import WimpSimReaderEarth\n\ntray = I3Tray()\n\ntray.AddService(\"I3SPRNGRandomServiceFactory\",\"Random\",\n NStreams = 2,\n Seed = 42,\n StreamNum = 1,\n InstallServiceAs = \"I3RandomService\")\n\ntray.AddSegment(WimpSimReaderEarth,\"EarthWimpsim-reader\",\n Infile = '/data/user/grenzi/data/EarthWimpData/Earth/we-m50-ch11-earth.010010.000100.dat',\n #GCDFileName = '/data/user/grenzi/data/EarthWimpData/gcd/GeoCalibDetectorStatus_IC86.55697_V2.i3.gz'\n StartMJD=55555,\n EndMJD=55666,\n )\n\ndef prettyprint(frame):\n icetray.logging.log_info(\"=====================\")\n icetray.logging.log_info(str(frame[\"I3EventHeader\"]))\n icetray.logging.log_info(str(frame[\"WIMP_params\"]))\n icetray.logging.log_info(str(frame[\"I3MCTree\"]))\ntray.AddModule(prettyprint, \"print\",\n Streams = [icetray.I3Frame.Physics, icetray.I3Frame.DAQ])\n\ntray.AddModule( 'I3Writer', 'EventWriter',\n Filename = 
\"/data/user/grenzi/data/EarthWimpData/Earth/we-m50-ch11-earth.010010.000100.i3.bz2\",\n #Streams = [icetray.I3Frame.Physics, icetray.I3Frame.DAQ],\n DropOrphanStreams = [icetray.I3Frame.DAQ]\n )\n\ntray.Execute()\n\ndel tray\n\n}}}\n\nThe segments in this submodule: http://code.icecube.wisc.edu/projects/icecube/browser/IceCube/projects/wimpsim-reader/trunk/python/wimpsimreader.py\n\nAre set with not acceptable values of {{{InjectionRadius}}}\n\nBoth:\n\n{{{\n29\t InjectionRadius = 0*I3Units.meter , #default 0*I3Units.meter\n}}}\n\n{{{\n74\t InjectionRadius = 0*I3Units.meter , #default 0*I3Units.meter\n}}}\n\nWhile in /IceCube/projects/wimpsim-reader/trunk/private/wimpsim-reader/I3WimpSimReader.cxx we have:\n\n{{{\n180\t if (radius_<=0.)\n181\t log_fatal(\"Injection radius must be positive and not zero\");\n}}}\n\nHence, running this segments with whatever the inputs gives the error:\n\n{{{\nRuntimeError: Injection radius must be positive and not zero (in virtual void I3WimpSimReader::Configure())\n}}}\n\nAlso, it is commented that default is 0, but the log message when running is \n{{{\nInjectionRadius\n Description : If >0, events will be injected in cylinder with zmin, zmax height\n Default : nan\n Configured : 0.0\n}}} \n\n", "reporter": "grenzi", "cc": "", "resolution": "fixed", "time": "2018-05-17T15:55:29", "component": "combo simulation", "summary": "On wimpsim_reader", "priority": "normal", "keywords": "", "milestone": "", "owner": "mjl5147", "type": "defect" } ``` </p> </details>
non_test
on wimpsim reader trac found when running this script text from import from icecube import icetray dataclasses dataio from icecube import simclasses wimpsim reader phys services from icecube wimpsim reader import wimpsimreaderearth tray tray addservice random nstreams seed streamnum installserviceas tray addsegment wimpsimreaderearth earthwimpsim reader infile data user grenzi data earthwimpdata earth we earth dat gcdfilename data user grenzi data earthwimpdata gcd geocalibdetectorstatus gz startmjd endmjd def prettyprint frame icetray logging log info icetray logging log info str frame icetray logging log info str frame icetray logging log info str frame tray addmodule prettyprint print streams tray addmodule eventwriter filename data user grenzi data earthwimpdata earth we earth streams droporphanstreams tray execute del tray the segments in this submodule are set with not acceptable values of injectionradius both text injectionradius meter default meter text injectionradius meter default meter while in icecube projects wimpsim reader trunk private wimpsim reader cxx we have text if radius log fatal injection radius must be positive and not zero hence running this segments with whatever the inputs gives the error text runtimeerror injection radius must be positive and not zero in virtual void configure also it is commented that default is but the log message when running is text injectionradius description if events will be injected in cylinder with zmin zmax height default nan configured migrated from json status closed changetime ts description found when running this script n n nfrom import nfrom icecube import icetray dataclasses dataio nfrom icecube import simclasses wimpsim reader phys services nfrom icecube wimpsim reader import wimpsimreaderearth n ntray n ntray addservice random n nstreams n seed n streamnum n installserviceas n ntray addsegment wimpsimreaderearth earthwimpsim reader n infile data user grenzi data earthwimpdata earth we earth dat n 
gcdfilename data user grenzi data earthwimpdata gcd geocalibdetectorstatus gz n startmjd n endmjd n n ndef prettyprint frame n icetray logging log info n icetray logging log info str frame n icetray logging log info str frame n icetray logging log info str frame ntray addmodule prettyprint print n streams n ntray addmodule eventwriter n filename data user grenzi data earthwimpdata earth we earth n streams n droporphanstreams n n ntray execute n ndel tray n n n nthe segments in this submodule set with not acceptable values of injectionradius n nboth n n t injectionradius meter default meter n n n t injectionradius meter default meter n n nwhile in icecube projects wimpsim reader trunk private wimpsim reader cxx we have n n t if radius events will be injected in cylinder with zmin zmax height n default nan n configured n n n reporter grenzi cc resolution fixed time component combo simulation summary on wimpsim reader priority normal keywords milestone owner type defect
0
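The wimpsim_reader ticket above boils down to a default that can never pass its own validation: the Python tray segments default `InjectionRadius` to `0 * I3Units.meter`, while `I3WimpSimReader::Configure()` calls `log_fatal` on any `radius_ <= 0`. A minimal sketch of that mismatch (hypothetical `configure` name; not the actual IceCube API):

```javascript
// Sketch of the defect: a configuration default that always trips validation.
// `configure` stands in for I3WimpSimReader::Configure(); the default of 0
// mirrors the segment's `InjectionRadius = 0*I3Units.meter`.
function configure({ injectionRadius = 0 } = {}) {
  // mirrors the C++ check: if (radius_ <= 0.) log_fatal(...)
  if (injectionRadius <= 0) {
    throw new Error('Injection radius must be positive and not zero');
  }
  return injectionRadius;
}

configure({ injectionRadius: 1200 }); // an explicit positive value is accepted

try {
  configure(); // the default of 0 always fails, whatever the other inputs
} catch (e) {
  console.log(e.message); // prints the same fatal message the ticket reports
}
```

The usual remedies are either a sensible positive default or making the parameter required, so the error surfaces at configuration time with a clear message instead of a default silently guaranteeing failure.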
323,302
27,714,499,132
IssuesEvent
2023-03-14 16:06:28
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
opened
Fix math.test_tensorflow_add_n
TensorFlow Frontend Sub Task Failing Test
| | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4415955275/jobs/7739604896" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |torch|None |numpy|None |jax|None <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_tensorflow/test_math.py::test_tensorflow_add_n[cpu-ivy.functional.backends.tensorflow-False-False]</summary> 2023-03-14T13:11:15.8387508Z E AttributeError: module 'ivy.functional.backends.tensorflow' has no attribute 'add_n' 2023-03-14T13:11:15.8388123Z E Falsifying example: test_tensorflow_add_n( 2023-03-14T13:11:15.8388708Z E dtype_and_x=(['float64'], [array(-1.)]), 2023-03-14T13:11:15.8389209Z E test_flags=FrontendFunctionTestFlags( 2023-03-14T13:11:15.8389690Z E num_positional_args=0, 2023-03-14T13:11:15.8390102Z E with_out=False, 2023-03-14T13:11:15.8390516Z E inplace=False, 2023-03-14T13:11:15.8390932Z E as_variable=[False], 2023-03-14T13:11:15.8391346Z E native_arrays=[False], 2023-03-14T13:11:15.8391741Z E ), 2023-03-14T13:11:15.8392333Z E fn_tree='ivy.functional.frontends.tensorflow.math.add_n', 2023-03-14T13:11:15.8393723Z E on_device='cpu', 2023-03-14T13:11:15.8394123Z E frontend='tensorflow', 2023-03-14T13:11:15.8394428Z E ) 2023-03-14T13:11:15.8394700Z E 2023-03-14T13:11:15.8395409Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCIDQAAAnAAM=') as a decorator on your test case </details>
1.0
Fix math.test_tensorflow_add_n - | | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4415955275/jobs/7739604896" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |torch|None |numpy|None |jax|None <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_tensorflow/test_math.py::test_tensorflow_add_n[cpu-ivy.functional.backends.tensorflow-False-False]</summary> 2023-03-14T13:11:15.8387508Z E AttributeError: module 'ivy.functional.backends.tensorflow' has no attribute 'add_n' 2023-03-14T13:11:15.8388123Z E Falsifying example: test_tensorflow_add_n( 2023-03-14T13:11:15.8388708Z E dtype_and_x=(['float64'], [array(-1.)]), 2023-03-14T13:11:15.8389209Z E test_flags=FrontendFunctionTestFlags( 2023-03-14T13:11:15.8389690Z E num_positional_args=0, 2023-03-14T13:11:15.8390102Z E with_out=False, 2023-03-14T13:11:15.8390516Z E inplace=False, 2023-03-14T13:11:15.8390932Z E as_variable=[False], 2023-03-14T13:11:15.8391346Z E native_arrays=[False], 2023-03-14T13:11:15.8391741Z E ), 2023-03-14T13:11:15.8392333Z E fn_tree='ivy.functional.frontends.tensorflow.math.add_n', 2023-03-14T13:11:15.8393723Z E on_device='cpu', 2023-03-14T13:11:15.8394123Z E frontend='tensorflow', 2023-03-14T13:11:15.8394428Z E ) 2023-03-14T13:11:15.8394700Z E 2023-03-14T13:11:15.8395409Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCIDQAAAnAAM=') as a decorator on your test case </details>
test
fix math test tensorflow add n tensorflow img src torch none numpy none jax none failed ivy tests test ivy test frontends test tensorflow test math py test tensorflow add n e attributeerror module ivy functional backends tensorflow has no attribute add n e falsifying example test tensorflow add n e dtype and x e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e fn tree ivy functional frontends tensorflow math add n e on device cpu e frontend tensorflow e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case
1
68,096
7,087,693,496
IssuesEvent
2018-01-11 18:44:18
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
opened
`OperationServiceImpl_invokeOnPartitionsTest`
Team: Core Team: QuSP Type: Test-Failure
`com.hazelcast.spi.impl.operationservice.impl.OperationServiceImpl_invokeOnPartitionsTest.testLongRunning` ```java com.hazelcast.nio.serialization.HazelcastSerializationException: Problem while reading DataSerializable, namespace: 0, ID: 0, class: 'com.hazelcast.spi.impl.operationservice.impl.OperationServiceImpl_invokeOnPartitionsTest$SlowOperationFactoryImpl$1', exception: com.hazelcast.spi.impl.operationservice.impl.OperationServiceImpl_invokeOnPartitionsTest$SlowOperationFactoryImpl$1.<init>() ``` https://hazelcast-l337.ci.cloudbees.com/job/new-lab-fast-pr/12761/testReport/junit/com.hazelcast.spi.impl.operationservice.impl/OperationServiceImpl_invokeOnPartitionsTest/testLongRunning/
1.0
`OperationServiceImpl_invokeOnPartitionsTest` - `com.hazelcast.spi.impl.operationservice.impl.OperationServiceImpl_invokeOnPartitionsTest.testLongRunning` ```java com.hazelcast.nio.serialization.HazelcastSerializationException: Problem while reading DataSerializable, namespace: 0, ID: 0, class: 'com.hazelcast.spi.impl.operationservice.impl.OperationServiceImpl_invokeOnPartitionsTest$SlowOperationFactoryImpl$1', exception: com.hazelcast.spi.impl.operationservice.impl.OperationServiceImpl_invokeOnPartitionsTest$SlowOperationFactoryImpl$1.<init>() ``` https://hazelcast-l337.ci.cloudbees.com/job/new-lab-fast-pr/12761/testReport/junit/com.hazelcast.spi.impl.operationservice.impl/OperationServiceImpl_invokeOnPartitionsTest/testLongRunning/
test
operationserviceimpl invokeonpartitionstest com hazelcast spi impl operationservice impl operationserviceimpl invokeonpartitionstest testlongrunning java com hazelcast nio serialization hazelcastserializationexception problem while reading dataserializable namespace id class com hazelcast spi impl operationservice impl operationserviceimpl invokeonpartitionstest slowoperationfactoryimpl exception com hazelcast spi impl operationservice impl operationserviceimpl invokeonpartitionstest slowoperationfactoryimpl
1
365,775
10,797,591,855
IssuesEvent
2019-11-06 08:13:59
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
subtech.g.hatena.ne.jp - Page content is not displayed
browser-firefox engine-gecko priority-important severity-important
<!-- @browser: Firefox 72.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:72.0) Gecko/20100101 Firefox/72.0 --> <!-- @reported_with: desktop-reporter --> **URL**: http://subtech.g.hatena.ne.jp/ **Browser / Version**: Firefox 72.0 **Operating System**: Windows 10 **Tested Another Browser**: Yes **Problem type**: Design is broken **Description**: Elements with style "zoom: 0" are invisible **Steps to Reproduce**: Some body texts, that should be displayed below headings, are not displayed. The site has a CSS rule "zoom: 0". In IE, Edge, and Chrome, "zoom: 0" seems to be ignored. In Firefox 72, "zoom: 0" makes the element invisible. Firefox 72 supports CSS zoom property via transform property https://bugzilla.mozilla.org/show_bug.cgi?id=1589766 . [![Screenshot Description](https://webcompat.com/uploads/2019/10/bcbaf893-3f91-40cc-b52e-21d359dfdbcc-thumb.jpeg)](https://webcompat.com/uploads/2019/10/bcbaf893-3f91-40cc-b52e-21d359dfdbcc.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20191027212548</li><li>channel: nightly</li><li>hasTouchScreen: false</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> <p>Console Messages:</p> <pre> [{'level': 'warn', 'log': ['This page uses the non standard property zoom. 
Consider using calc() in the relevant property values, or using transform along with transform-origin: 0 0.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['The script from https://www.otsuka.co.jp/soy/entertainment/ichigobp/ichigobp.js was loaded even though its MIME type (text/html) is not a valid JavaScript MIME type.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['Loading failed for the <script> with source http://www.otsuka.co.jp/soy/entertainment/ichigobp/ichigobp.js.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '701:1'}, {'level': 'error', 'log': ["SyntaxError: expected expression, got ';'"], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '886:31'}, {'level': 'warn', 'log': ['Request to access cookie or storage on http://stats.g.doubleclick.net/dc.js was blocked because it came from a tracker and content blocking is enabled.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['Request to access cookie or storage on https://stats.g.doubleclick.net/dc.js was blocked because it came from a tracker and content blocking is enabled.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['Use of Mutation Events is deprecated. Use MutationObserver instead.'], 'uri': 'moz-extension://72fbe179-9ca6-4837-be8e-94e0ddd8bca7/content/widget_embedder.js', 'pos': '106:21'}, {'level': 'warn', 'log': ['onmozfullscreenchange is deprecated.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['onmozfullscreenerror is deprecated.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
subtech.g.hatena.ne.jp - Page content is not displayed - <!-- @browser: Firefox 72.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:72.0) Gecko/20100101 Firefox/72.0 --> <!-- @reported_with: desktop-reporter --> **URL**: http://subtech.g.hatena.ne.jp/ **Browser / Version**: Firefox 72.0 **Operating System**: Windows 10 **Tested Another Browser**: Yes **Problem type**: Design is broken **Description**: Elements with style "zoom: 0" are invisible **Steps to Reproduce**: Some body texts, that should be displayed below headings, are not displayed. The site has a CSS rule "zoom: 0". In IE, Edge, and Chrome, "zoom: 0" seems to be ignored. In Firefox 72, "zoom: 0" makes the element invisible. Firefox 72 supports CSS zoom property via transform property https://bugzilla.mozilla.org/show_bug.cgi?id=1589766 . [![Screenshot Description](https://webcompat.com/uploads/2019/10/bcbaf893-3f91-40cc-b52e-21d359dfdbcc-thumb.jpeg)](https://webcompat.com/uploads/2019/10/bcbaf893-3f91-40cc-b52e-21d359dfdbcc.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20191027212548</li><li>channel: nightly</li><li>hasTouchScreen: false</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> <p>Console Messages:</p> <pre> [{'level': 'warn', 'log': ['This page uses the non standard property zoom. 
Consider using calc() in the relevant property values, or using transform along with transform-origin: 0 0.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['The script from https://www.otsuka.co.jp/soy/entertainment/ichigobp/ichigobp.js was loaded even though its MIME type (text/html) is not a valid JavaScript MIME type.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['Loading failed for the <script> with source http://www.otsuka.co.jp/soy/entertainment/ichigobp/ichigobp.js.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '701:1'}, {'level': 'error', 'log': ["SyntaxError: expected expression, got ';'"], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '886:31'}, {'level': 'warn', 'log': ['Request to access cookie or storage on http://stats.g.doubleclick.net/dc.js was blocked because it came from a tracker and content blocking is enabled.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['Request to access cookie or storage on https://stats.g.doubleclick.net/dc.js was blocked because it came from a tracker and content blocking is enabled.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['Use of Mutation Events is deprecated. Use MutationObserver instead.'], 'uri': 'moz-extension://72fbe179-9ca6-4837-be8e-94e0ddd8bca7/content/widget_embedder.js', 'pos': '106:21'}, {'level': 'warn', 'log': ['onmozfullscreenchange is deprecated.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}, {'level': 'warn', 'log': ['onmozfullscreenerror is deprecated.'], 'uri': 'http://subtech.g.hatena.ne.jp/', 'pos': '0:0'}] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_test
subtech g hatena ne jp page content is not displayed url browser version firefox operating system windows tested another browser yes problem type design is broken description elements with style zoom are invisible steps to reproduce some body texts that should be displayed below headings are not displayed the site has a css rule zoom in ie edge and chrome zoom seems to be ignored in firefox zoom makes the element invisible firefox supports css zoom property via transform property browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel nightly hastouchscreen false mixed active content blocked false mixed passive content blocked false tracking content blocked false console messages uri pos level warn log uri pos level warn log uri pos level error log uri pos level warn log uri pos level warn log uri pos level warn log uri moz extension content widget embedder js pos level warn log uri pos level warn log uri pos from with ❤️
0
299,274
25,892,569,683
IssuesEvent
2022-12-14 19:15:54
rancher/dashboard
https://api.github.com/repos/rancher/dashboard
closed
"Deploy Workload" should print an error when creating namespace containing an underscore
kind/bug [zube]: To Test internal QA/XS kind/enhancement team/area2 size/2 ember area/form-validation JIRA
**What kind of request is this (question/bug/enhancement/feature request):** Bug **Steps to reproduce (least amount of steps as possible):** 1. Head to the `Cluster` -> `Project` -> `Workloads` view. 2. Click `Deploy` 3. Attempt to deploy a workload. * Name: test * Dock Image: Anything * Namespace: Select `Add to a new namespace`. For namespace, add something that contains an underscore, such as `namespace_test` * Hit `Launch` **Result:** Rancher will return a red error box at the bottom with no error: ![image](https://user-images.githubusercontent.com/142752/88213550-bb76fd00-cc0d-11ea-9bac-0276a2114b8a.png) **Other details that may be helpful:** If I create a namespace another way, Rancher warns me that underscores are not allowed in namespaces. 1. Head to the `Cluster` -> `Project` view (not workload) 2. Click `Add Namespace` 3. Attempt to add a namespace * Name: add something that contains an underscore, such as `namespace_test` 4. Hit `Create` Rancher will print the error `"Name" contains an invalid character: _` ![image](https://user-images.githubusercontent.com/142752/88213831-1c063a00-cc0e-11ea-9c08-9ce3fc672e28.png) **Environment information** - Rancher version: v2.4.5 - Rancher User interface: v2.4.28 - Installation option (single install/HA): HA <!-- If the reported issue is regarding a created cluster, please provide requested info below --> **Cluster information** - Cluster type (Hosted/Infrastructure Provider/Custom/Imported): Custom - Machine type (cloud/VM/metal) and specifications (CPU/memory): VMware VMs. 
Specs n/a - Kubernetes version (use `kubectl version`): ``` Client Version: version.Info{Major:"1", Minor:"17", GitVersion:"v1.17.9", GitCommit:"4fb7ed12476d57b8437ada90b4f93b17ffaeed99", GitTreeState:"clean", BuildDate:"2020-07-15T16:18:16Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"17", GitVersion:"v1.17.9", GitCommit:"4fb7ed12476d57b8437ada90b4f93b17ffaeed99", GitTreeState:"clean", BuildDate:"2020-07-15T16:10:45Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"} ``` - Docker version (use `docker version`): ``` Client: Version: 18.09.9 API version: 1.39 Go version: go1.11.13 Git commit: 039a7df9ba Built: Wed Sep 4 16:57:28 2019 OS/Arch: linux/amd64 Experimental: false Server: Docker Engine - Community Engine: Version: 18.09.9 API version: 1.39 (minimum version 1.12) Go version: go1.11.13 Git commit: 039a7df Built: Wed Sep 4 16:19:38 2019 OS/Arch: linux/amd64 Experimental: false ``` gz#14010
1.0
"Deploy Workload" should print an error when creating namespace containing an underscore - **What kind of request is this (question/bug/enhancement/feature request):** Bug **Steps to reproduce (least amount of steps as possible):** 1. Head to the `Cluster` -> `Project` -> `Workloads` view. 2. Click `Deploy` 3. Attempt to deploy a workload. * Name: test * Dock Image: Anything * Namespace: Select `Add to a new namespace`. For namespace, add something that contains an underscore, such as `namespace_test` * Hit `Launch` **Result:** Rancher will return a red error box at the bottom with no error: ![image](https://user-images.githubusercontent.com/142752/88213550-bb76fd00-cc0d-11ea-9bac-0276a2114b8a.png) **Other details that may be helpful:** If I create a namespace another way, Rancher warns me that underscores are not allowed in namespaces. 1. Head to the `Cluster` -> `Project` view (not workload) 2. Click `Add Namespace` 3. Attempt to add a namespace * Name: add something that contains an underscore, such as `namespace_test` 4. Hit `Create` Rancher will print the error `"Name" contains an invalid character: _` ![image](https://user-images.githubusercontent.com/142752/88213831-1c063a00-cc0e-11ea-9c08-9ce3fc672e28.png) **Environment information** - Rancher version: v2.4.5 - Rancher User interface: v2.4.28 - Installation option (single install/HA): HA <!-- If the reported issue is regarding a created cluster, please provide requested info below --> **Cluster information** - Cluster type (Hosted/Infrastructure Provider/Custom/Imported): Custom - Machine type (cloud/VM/metal) and specifications (CPU/memory): VMware VMs. 
Specs n/a - Kubernetes version (use `kubectl version`): ``` Client Version: version.Info{Major:"1", Minor:"17", GitVersion:"v1.17.9", GitCommit:"4fb7ed12476d57b8437ada90b4f93b17ffaeed99", GitTreeState:"clean", BuildDate:"2020-07-15T16:18:16Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"17", GitVersion:"v1.17.9", GitCommit:"4fb7ed12476d57b8437ada90b4f93b17ffaeed99", GitTreeState:"clean", BuildDate:"2020-07-15T16:10:45Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"} ``` - Docker version (use `docker version`): ``` Client: Version: 18.09.9 API version: 1.39 Go version: go1.11.13 Git commit: 039a7df9ba Built: Wed Sep 4 16:57:28 2019 OS/Arch: linux/amd64 Experimental: false Server: Docker Engine - Community Engine: Version: 18.09.9 API version: 1.39 (minimum version 1.12) Go version: go1.11.13 Git commit: 039a7df Built: Wed Sep 4 16:19:38 2019 OS/Arch: linux/amd64 Experimental: false ``` gz#14010
test
deploy workload should print an error when creating namespace containing an underscore what kind of request is this question bug enhancement feature request bug steps to reproduce least amount of steps as possible head to the cluster project workloads view click deploy attempt to deploy a workload name test dock image anything namespace select add to a new namespace for namespace add something that contains an underscore such as namespace test hit launch result rancher will return a red error box at the bottom with no error other details that may be helpful if i create a namespace another way rancher warns me that underscores are not allowed in namespaces head to the cluster project view not workload click add namespace attempt to add a namespace name add something that contains an underscore such as namespace test hit create rancher will print the error name contains an invalid character environment information rancher version rancher user interface installation option single install ha ha if the reported issue is regarding a created cluster please provide requested info below cluster information cluster type hosted infrastructure provider custom imported custom machine type cloud vm metal and specifications cpu memory vmware vms specs n a kubernetes version use kubectl version client version version info major minor gitversion gitcommit gittreestate clean builddate goversion compiler gc platform linux server version version info major minor gitversion gitcommit gittreestate clean builddate goversion compiler gc platform linux docker version use docker version client version api version go version git commit built wed sep os arch linux experimental false server docker engine community engine version api version minimum version go version git commit built wed sep os arch linux experimental false gz
1
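The Rancher record above hinges on Kubernetes rejecting `namespace_test`: namespace names must be RFC 1123 DNS labels (lowercase alphanumerics and `-`, starting and ending with an alphanumeric, at most 63 characters). A minimal sketch of that check follows — the regex mirrors the Kubernetes validation rule, but this is an illustration, not Rancher's or Kubernetes' actual code:

```python
import re

# RFC 1123 DNS label: lowercase alphanumerics and '-', must start and
# end with an alphanumeric, and be at most 63 characters long.
DNS1123_LABEL = re.compile(r"^[a-z0-9]([-a-z0-9]*[a-z0-9])?$")

def valid_namespace(name: str) -> bool:
    """Return True if `name` is a valid Kubernetes namespace name."""
    return len(name) <= 63 and bool(DNS1123_LABEL.match(name))

assert valid_namespace("namespace-test")
assert not valid_namespace("namespace_test")   # underscore rejected
assert not valid_namespace("Namespace")        # uppercase rejected
```

Validating against this rule client-side, as the `Add Namespace` form already does, is what the bug report asks the `Deploy Workload` form to do as well.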
211,229
16,191,162,415
IssuesEvent
2021-05-04 08:41:28
lutraconsulting/qgis-mergin-plugin-manual-tests
https://api.github.com/repos/lutraconsulting/qgis-mergin-plugin-manual-tests
opened
Mergin plugin test plan
test plan
## Test plan for mergin plugin | Test environment | Value | |---|---| | Mergin Version: | 2021.5.1 | | Mergin URL: <> | public.cloudmergin.com | | QGIS Version: | 3.16 LTE | | Mergin plugin Version: | 2021.2.1 | | Date of Execution: | 4.5.2021 | --- ### Test Cases - [X] ( #1 ) TC 01: Installation - [X] ( #2 ) TC 02: Configuration - [X] ( #2 ) TC 03: New project - [X] ( #4 ) TC 04: Project management - [X] ( #5 ) TC 05: Project permissions - [X] ( #6 ) TC 06: Project tree --- | Test Execution Outcome | | |---|---| | Issues Created During Testing: | https://github.com/lutraconsulting/qgis-mergin-plugin/issues/231 | | https://github.com/lutraconsulting/qgis-mergin-plugin/issues/229 | | | https://github.com/lutraconsulting/qgis-mergin-plugin/issues/228 | **Success** / **Bugs Created** (erase one)
1.0
Mergin plugin test plan - ## Test plan for mergin plugin | Test environment | Value | |---|---| | Mergin Version: | 2021.5.1 | | Mergin URL: <> | public.cloudmergin.com | | QGIS Version: | 3.16 LTE | | Mergin plugin Version: | 2021.2.1 | | Date of Execution: | 4.5.2021 | --- ### Test Cases - [X] ( #1 ) TC 01: Installation - [X] ( #2 ) TC 02: Configuration - [X] ( #2 ) TC 03: New project - [X] ( #4 ) TC 04: Project management - [X] ( #5 ) TC 05: Project permissions - [X] ( #6 ) TC 06: Project tree --- | Test Execution Outcome | | |---|---| | Issues Created During Testing: | https://github.com/lutraconsulting/qgis-mergin-plugin/issues/231 | | https://github.com/lutraconsulting/qgis-mergin-plugin/issues/229 | | | https://github.com/lutraconsulting/qgis-mergin-plugin/issues/228 | **Success** / **Bugs Created** (erase one)
test
mergin plugin test plan test plan for mergin plugin test environment value mergin version mergin url public cloudmergin com qgis version lte mergin plugin version date of execution test cases tc installation tc configuration tc new project tc project management tc project permissions tc project tree test execution outcome issues created during testing success bugs created erase one
1
293,768
22,087,724,733
IssuesEvent
2022-06-01 01:35:59
supabase/gotrue
https://api.github.com/repos/supabase/gotrue
closed
Enable Discussions
documentation
I don't know what "JAM projects" means, was hoping to ask in the discussions... but it doesn't look like it's set up. Would you consider enabling discussions for these kinds of questions?
1.0
Enable Discussions - I don't know what "JAM projects" means, was hoping to ask in the discussions... but it doesn't look like it's set up. Would you consider enabling discussions for these kinds of questions?
non_test
enable discussions i don t know what jam projects means was hoping to ask in the discussions but it doesn t look like it s set up would you consider enabling discussions for these kinds of questions
0
207,246
15,798,782,111
IssuesEvent
2021-04-02 19:25:59
hashicorp/terraform-provider-aws
https://api.github.com/repos/hashicorp/terraform-provider-aws
closed
Panic at Terraform 0.15-beta: can't use ElementIterator on unknown value
bug crash prerelease-tf-testing terraform-0.15 tests upstream-terraform
<!--- Please note the following potential times when an issue might be in Terraform core: * [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues * [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues * [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues * [Registry](https://registry.terraform.io/) issues * Spans resources across multiple providers If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead. ---> <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform CLI and Terraform AWS Provider Version Terraform v0.15.0-beta2 AWS provider v3.32.0 ### Affected Resources/Data Sources <!--- Please list the affected resources and data sources. 
---> * ds/aws_elasticsearch_domain * r/aws_appsync_datasource * r/aws_elasticsearch_domain_policy * r/aws_kinesis_firehose_delivery_stream * r/aws_opsworks_application * r/aws_opsworks_custom_layer * r/aws_opsworks_ganglia_layer * r/aws_opsworks_haproxy_layer * r/aws_opsworks_instance * r/aws_opsworks_java_app_layer * r/aws_opsworks_memcached_layer * r/aws_opsworks_mysql_layer * r/aws_opsworks_nodejs_app_layer * r/aws_opsworks_permission * r/aws_opsworks_php_app_layer * r/aws_opsworks_rails_app_layer * r/aws_opsworks_rds_db_instance * r/aws_opsworks_static_web_layer ### Failing tests 1. TestAccAwsAppsyncDatasource_ElasticsearchConfig_Region 1. TestAccAwsAppsyncDatasource_Type_Elasticsearch 1. TestAccAWSDataElasticsearchDomain_basic 1. TestAccAWSElasticSearchDomainPolicy_basic 1. TestAccAWSKinesisFirehoseDeliveryStream_ElasticsearchConfigEndpointUpdates 1. TestAccAWSKinesisFirehoseDeliveryStream_ElasticsearchConfigUpdates 1. TestAccAWSKinesisFirehoseDeliveryStream_ElasticsearchWithVpcConfigUpdates 1. TestAccAWSOpsworksApplication_basic 1. TestAccAWSOpsworksCustomLayer_basic 1. TestAccAWSOpsworksCustomLayer_noVPC 1. TestAccAWSOpsworksCustomLayer_tags 1. TestAccAWSOpsworksGangliaLayer_basic 1. TestAccAWSOpsworksGangliaLayer_tags 1. TestAccAWSOpsworksHAProxyLayer_basic 1. TestAccAWSOpsworksHAProxyLayer_tags 1. TestAccAWSOpsworksInstance_basic 1. TestAccAWSOpsworksInstance_UpdateHostNameForceNew 1. TestAccAWSOpsworksJavaAppLayer_basic 1. TestAccAWSOpsworksJavaAppLayer_tags 1. TestAccAWSOpsworksMemcachedLayer_basic 1. TestAccAWSOpsworksMemcachedLayer_tags 1. TestAccAWSOpsworksMysqlLayer_basic 1. TestAccAWSOpsworksMysqlLayer_tags 1. TestAccAWSOpsworksNodejsAppLayer_basic 1. TestAccAWSOpsworksNodejsAppLayer_tags 1. TestAccAWSOpsworksPermission_basic 1. TestAccAWSOpsworksPermission_Self 1. TestAccAWSOpsworksPhpAppLayer_basic 1. TestAccAWSOpsworksPhpAppLayer_tags 1. TestAccAWSOpsworksRailsAppLayer_basic 1. TestAccAWSOpsworksRailsAppLayer_tags 1. 
TestAccAWSOpsworksRdsDbInstance_basic 1. TestAccAWSOpsworksStaticWebLayer_basic 1. TestAccAWSOpsworksStaticWebLayer_tags ### Terraform Configuration Files See individual tests. ### Panic Output ``` data_source_aws_elasticsearch_domain_test.go:17: Step 1/1 error: Error running pre-apply refresh: exit status 2 panic: can't use ElementIterator on unknown value goroutine 120 [running]: github.com/zclconf/go-cty/cty.Value.ElementIterator(0x2c020b8, 0xc001896e40, 0x23a6120, 0x3caae20, 0xc00069a2a0, 0x19) /go/pkg/mod/github.com/zclconf/go-cty@v1.8.1/cty/value_ops.go:1113 +0x13b github.com/hashicorp/terraform/terraform.getValMarks(0xc00124c330, 0x2c02128, 0xc001897370, 0x23cfa00, 0xc002a1adb0, 0x0, 0x0, 0x0, 0x4010100, 0x0, ...) /home/circleci/project/project/terraform/evaluate.go:992 +0x6c5 github.com/hashicorp/terraform/terraform.markProviderSensitiveAttributes(0xc00124c330, 0x2c02128, 0xc001897370, 0x23cfa00, 0xc002a1adb0, 0x2c02128, 0xc001897370, 0x23cfa00, 0xc002a1adb0) /home/circleci/project/project/terraform/evaluate.go:959 +0x6e github.com/hashicorp/terraform/terraform.(*evaluationStateData).GetResource(0xc000456fc0, 0x4d, 0xc000058708, 0x18, 0xc0001a69c8, 0x4, 0xc0000581e0, 0x18, 0x3d, 0x11, ...) /home/circleci/project/project/terraform/evaluate.go:762 +0xbd5 github.com/hashicorp/terraform/lang.(*Scope).evalContext(0xc001eb1f90, 0xc0017dc3f0, 0x1, 0x1, 0x0, 0x0, 0x0, 0x0, 0x0, 0x30) /home/circleci/project/project/lang/eval.go:360 +0x206d github.com/hashicorp/terraform/lang.(*Scope).EvalContext(...) /home/circleci/project/project/lang/eval.go:238 github.com/hashicorp/terraform/lang.(*Scope).EvalBlock(0xc001eb1f90, 0x2c00bb8, 0xc002997470, 0xc0017221b0, 0x1, 0x1, 0x0, 0x0, 0x0, 0x0, ...) /home/circleci/project/project/lang/eval.go:51 +0xf3 github.com/hashicorp/terraform/terraform.(*BuiltinEvalContext).EvaluateBlock(0xc001144b60, 0x2c00e58, 0xc002997470, 0xc0017221b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) 
/home/circleci/project/project/terraform/eval_context_builtin.go:273 +0x1ad github.com/hashicorp/terraform/terraform.(*NodeAbstractResourceInstance).planDataSource(0xc000452000, 0x2c37770, 0xc001144b60, 0x0, 0x1, 0x1, 0x0, 0x0, 0x0) /home/circleci/project/project/terraform/node_resource_abstract_instance.go:1350 +0x515 github.com/hashicorp/terraform/terraform.(*NodePlannableResourceInstance).dataResourceExecute(0xc0018964e0, 0x2c37770, 0xc001144b60, 0xc000000001, 0xc0007a3118, 0xc002565c80) /home/circleci/project/project/terraform/node_resource_plan_instance.go:73 +0x478 github.com/hashicorp/terraform/terraform.(*NodePlannableResourceInstance).Execute(0xc0018964e0, 0x2c37770, 0xc001144b60, 0xc000180002, 0xc002565d18, 0x40bb05, 0x2418a80) /home/circleci/project/project/terraform/node_resource_plan_instance.go:43 +0x10d github.com/hashicorp/terraform/terraform.(*ContextGraphWalker).Execute(0xc00039fc80, 0x2c37770, 0xc001144b60, 0x7f1682310508, 0xc0018964e0, 0x0, 0x0, 0x0) /home/circleci/project/project/terraform/graph_walk_context.go:127 +0xbf github.com/hashicorp/terraform/terraform.(*Graph).walk.func1(0x2745d00, 0xc0018964e0, 0x0, 0x0, 0x0) /home/circleci/project/project/terraform/graph.go:59 +0xba2 github.com/hashicorp/terraform/dag.(*Walker).walkVertex(0xc001149ec0, 0x2745d00, 0xc0018964e0, 0xc001267900) /home/circleci/project/project/dag/walk.go:381 +0x288 created by github.com/hashicorp/terraform/dag.(*Walker).Update /home/circleci/project/project/dag/walk.go:304 +0x1246 --- FAIL: TestAccAWSDataElasticsearchDomain_basic (3.37s) ``` ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor documentation? For example: ---> * hashicorp/terraform#28180
2.0
Panic at Terraform 0.15-beta: can't use ElementIterator on unknown value - <!--- Please note the following potential times when an issue might be in Terraform core: * [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues * [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues * [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues * [Registry](https://registry.terraform.io/) issues * Spans resources across multiple providers If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead. ---> <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform CLI and Terraform AWS Provider Version Terraform v0.15.0-beta2 AWS provider v3.32.0 ### Affected Resources/Data Sources <!--- Please list the affected resources and data sources. 
---> * ds/aws_elasticsearch_domain * r/aws_appsync_datasource * r/aws_elasticsearch_domain_policy * r/aws_kinesis_firehose_delivery_stream * r/aws_opsworks_application * r/aws_opsworks_custom_layer * r/aws_opsworks_ganglia_layer * r/aws_opsworks_haproxy_layer * r/aws_opsworks_instance * r/aws_opsworks_java_app_layer * r/aws_opsworks_memcached_layer * r/aws_opsworks_mysql_layer * r/aws_opsworks_nodejs_app_layer * r/aws_opsworks_permission * r/aws_opsworks_php_app_layer * r/aws_opsworks_rails_app_layer * r/aws_opsworks_rds_db_instance * r/aws_opsworks_static_web_layer ### Failing tests 1. TestAccAwsAppsyncDatasource_ElasticsearchConfig_Region 1. TestAccAwsAppsyncDatasource_Type_Elasticsearch 1. TestAccAWSDataElasticsearchDomain_basic 1. TestAccAWSElasticSearchDomainPolicy_basic 1. TestAccAWSKinesisFirehoseDeliveryStream_ElasticsearchConfigEndpointUpdates 1. TestAccAWSKinesisFirehoseDeliveryStream_ElasticsearchConfigUpdates 1. TestAccAWSKinesisFirehoseDeliveryStream_ElasticsearchWithVpcConfigUpdates 1. TestAccAWSOpsworksApplication_basic 1. TestAccAWSOpsworksCustomLayer_basic 1. TestAccAWSOpsworksCustomLayer_noVPC 1. TestAccAWSOpsworksCustomLayer_tags 1. TestAccAWSOpsworksGangliaLayer_basic 1. TestAccAWSOpsworksGangliaLayer_tags 1. TestAccAWSOpsworksHAProxyLayer_basic 1. TestAccAWSOpsworksHAProxyLayer_tags 1. TestAccAWSOpsworksInstance_basic 1. TestAccAWSOpsworksInstance_UpdateHostNameForceNew 1. TestAccAWSOpsworksJavaAppLayer_basic 1. TestAccAWSOpsworksJavaAppLayer_tags 1. TestAccAWSOpsworksMemcachedLayer_basic 1. TestAccAWSOpsworksMemcachedLayer_tags 1. TestAccAWSOpsworksMysqlLayer_basic 1. TestAccAWSOpsworksMysqlLayer_tags 1. TestAccAWSOpsworksNodejsAppLayer_basic 1. TestAccAWSOpsworksNodejsAppLayer_tags 1. TestAccAWSOpsworksPermission_basic 1. TestAccAWSOpsworksPermission_Self 1. TestAccAWSOpsworksPhpAppLayer_basic 1. TestAccAWSOpsworksPhpAppLayer_tags 1. TestAccAWSOpsworksRailsAppLayer_basic 1. TestAccAWSOpsworksRailsAppLayer_tags 1. 
TestAccAWSOpsworksRdsDbInstance_basic 1. TestAccAWSOpsworksStaticWebLayer_basic 1. TestAccAWSOpsworksStaticWebLayer_tags ### Terraform Configuration Files See individual tests. ### Panic Output ``` data_source_aws_elasticsearch_domain_test.go:17: Step 1/1 error: Error running pre-apply refresh: exit status 2 panic: can't use ElementIterator on unknown value goroutine 120 [running]: github.com/zclconf/go-cty/cty.Value.ElementIterator(0x2c020b8, 0xc001896e40, 0x23a6120, 0x3caae20, 0xc00069a2a0, 0x19) /go/pkg/mod/github.com/zclconf/go-cty@v1.8.1/cty/value_ops.go:1113 +0x13b github.com/hashicorp/terraform/terraform.getValMarks(0xc00124c330, 0x2c02128, 0xc001897370, 0x23cfa00, 0xc002a1adb0, 0x0, 0x0, 0x0, 0x4010100, 0x0, ...) /home/circleci/project/project/terraform/evaluate.go:992 +0x6c5 github.com/hashicorp/terraform/terraform.markProviderSensitiveAttributes(0xc00124c330, 0x2c02128, 0xc001897370, 0x23cfa00, 0xc002a1adb0, 0x2c02128, 0xc001897370, 0x23cfa00, 0xc002a1adb0) /home/circleci/project/project/terraform/evaluate.go:959 +0x6e github.com/hashicorp/terraform/terraform.(*evaluationStateData).GetResource(0xc000456fc0, 0x4d, 0xc000058708, 0x18, 0xc0001a69c8, 0x4, 0xc0000581e0, 0x18, 0x3d, 0x11, ...) /home/circleci/project/project/terraform/evaluate.go:762 +0xbd5 github.com/hashicorp/terraform/lang.(*Scope).evalContext(0xc001eb1f90, 0xc0017dc3f0, 0x1, 0x1, 0x0, 0x0, 0x0, 0x0, 0x0, 0x30) /home/circleci/project/project/lang/eval.go:360 +0x206d github.com/hashicorp/terraform/lang.(*Scope).EvalContext(...) /home/circleci/project/project/lang/eval.go:238 github.com/hashicorp/terraform/lang.(*Scope).EvalBlock(0xc001eb1f90, 0x2c00bb8, 0xc002997470, 0xc0017221b0, 0x1, 0x1, 0x0, 0x0, 0x0, 0x0, ...) /home/circleci/project/project/lang/eval.go:51 +0xf3 github.com/hashicorp/terraform/terraform.(*BuiltinEvalContext).EvaluateBlock(0xc001144b60, 0x2c00e58, 0xc002997470, 0xc0017221b0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) 
/home/circleci/project/project/terraform/eval_context_builtin.go:273 +0x1ad github.com/hashicorp/terraform/terraform.(*NodeAbstractResourceInstance).planDataSource(0xc000452000, 0x2c37770, 0xc001144b60, 0x0, 0x1, 0x1, 0x0, 0x0, 0x0) /home/circleci/project/project/terraform/node_resource_abstract_instance.go:1350 +0x515 github.com/hashicorp/terraform/terraform.(*NodePlannableResourceInstance).dataResourceExecute(0xc0018964e0, 0x2c37770, 0xc001144b60, 0xc000000001, 0xc0007a3118, 0xc002565c80) /home/circleci/project/project/terraform/node_resource_plan_instance.go:73 +0x478 github.com/hashicorp/terraform/terraform.(*NodePlannableResourceInstance).Execute(0xc0018964e0, 0x2c37770, 0xc001144b60, 0xc000180002, 0xc002565d18, 0x40bb05, 0x2418a80) /home/circleci/project/project/terraform/node_resource_plan_instance.go:43 +0x10d github.com/hashicorp/terraform/terraform.(*ContextGraphWalker).Execute(0xc00039fc80, 0x2c37770, 0xc001144b60, 0x7f1682310508, 0xc0018964e0, 0x0, 0x0, 0x0) /home/circleci/project/project/terraform/graph_walk_context.go:127 +0xbf github.com/hashicorp/terraform/terraform.(*Graph).walk.func1(0x2745d00, 0xc0018964e0, 0x0, 0x0, 0x0) /home/circleci/project/project/terraform/graph.go:59 +0xba2 github.com/hashicorp/terraform/dag.(*Walker).walkVertex(0xc001149ec0, 0x2745d00, 0xc0018964e0, 0xc001267900) /home/circleci/project/project/dag/walk.go:381 +0x288 created by github.com/hashicorp/terraform/dag.(*Walker).Update /home/circleci/project/project/dag/walk.go:304 +0x1246 --- FAIL: TestAccAWSDataElasticsearchDomain_basic (3.37s) ``` ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor documentation? For example: ---> * hashicorp/terraform#28180
test
panic at terraform beta can t use elementiterator on unknown value please note the following potential times when an issue might be in terraform core or resource ordering issues and issues issues issues spans resources across multiple providers if you are running into one of these scenarios we recommend opening an issue in the instead community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform cli and terraform aws provider version terraform aws provider affected resources data sources ds aws elasticsearch domain r aws appsync datasource r aws elasticsearch domain policy r aws kinesis firehose delivery stream r aws opsworks application r aws opsworks custom layer r aws opsworks ganglia layer r aws opsworks haproxy layer r aws opsworks instance r aws opsworks java app layer r aws opsworks memcached layer r aws opsworks mysql layer r aws opsworks nodejs app layer r aws opsworks permission r aws opsworks php app layer r aws opsworks rails app layer r aws opsworks rds db instance r aws opsworks static web layer failing tests testaccawsappsyncdatasource elasticsearchconfig region testaccawsappsyncdatasource type elasticsearch testaccawsdataelasticsearchdomain basic testaccawselasticsearchdomainpolicy basic testaccawskinesisfirehosedeliverystream elasticsearchconfigendpointupdates testaccawskinesisfirehosedeliverystream elasticsearchconfigupdates testaccawskinesisfirehosedeliverystream elasticsearchwithvpcconfigupdates testaccawsopsworksapplication basic testaccawsopsworkscustomlayer basic testaccawsopsworkscustomlayer novpc testaccawsopsworkscustomlayer tags testaccawsopsworksganglialayer basic 
testaccawsopsworksganglialayer tags testaccawsopsworkshaproxylayer basic testaccawsopsworkshaproxylayer tags testaccawsopsworksinstance basic testaccawsopsworksinstance updatehostnameforcenew testaccawsopsworksjavaapplayer basic testaccawsopsworksjavaapplayer tags testaccawsopsworksmemcachedlayer basic testaccawsopsworksmemcachedlayer tags testaccawsopsworksmysqllayer basic testaccawsopsworksmysqllayer tags testaccawsopsworksnodejsapplayer basic testaccawsopsworksnodejsapplayer tags testaccawsopsworkspermission basic testaccawsopsworkspermission self testaccawsopsworksphpapplayer basic testaccawsopsworksphpapplayer tags testaccawsopsworksrailsapplayer basic testaccawsopsworksrailsapplayer tags testaccawsopsworksrdsdbinstance basic testaccawsopsworksstaticweblayer basic testaccawsopsworksstaticweblayer tags terraform configuration files see individual tests panic output data source aws elasticsearch domain test go step error error running pre apply refresh exit status panic can t use elementiterator on unknown value goroutine github com zclconf go cty cty value elementiterator go pkg mod github com zclconf go cty cty value ops go github com hashicorp terraform terraform getvalmarks home circleci project project terraform evaluate go github com hashicorp terraform terraform markprovidersensitiveattributes home circleci project project terraform evaluate go github com hashicorp terraform terraform evaluationstatedata getresource home circleci project project terraform evaluate go github com hashicorp terraform lang scope evalcontext home circleci project project lang eval go github com hashicorp terraform lang scope evalcontext home circleci project project lang eval go github com hashicorp terraform lang scope evalblock home circleci project project lang eval go github com hashicorp terraform terraform builtinevalcontext evaluateblock home circleci project project terraform eval context builtin go github com hashicorp terraform terraform nodeabstractresourceinstance 
plandatasource home circleci project project terraform node resource abstract instance go github com hashicorp terraform terraform nodeplannableresourceinstance dataresourceexecute home circleci project project terraform node resource plan instance go github com hashicorp terraform terraform nodeplannableresourceinstance execute home circleci project project terraform node resource plan instance go github com hashicorp terraform terraform contextgraphwalker execute home circleci project project terraform graph walk context go github com hashicorp terraform terraform graph walk home circleci project project terraform graph go github com hashicorp terraform dag walker walkvertex home circleci project project dag walk go created by github com hashicorp terraform dag walker update home circleci project project dag walk go fail testaccawsdataelasticsearchdomain basic references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor documentation for example hashicorp terraform
1
266,996
23,272,596,095
IssuesEvent
2022-08-05 01:58:14
bytedeck/bytedeck
https://api.github.com/repos/bytedeck/bytedeck
closed
Pages not found and PermissionDenied exception in tests
testing devops
This has been going on as far back as I can check old test runs. Note that I was not getting these missing pages or permission errors locally until I deleted and recreated my venv, so it's possible there is some problem with a version of a module we are using in our requirements.txt ? This is the oldest run that still has data, and it's showing the same thing (Click run test suite to view it) https://github.com/bytedeck/bytedeck/runs/6438655994?check_suite_focus=true ![image](https://user-images.githubusercontent.com/10604391/174507108-aa89df49-2945-4834-9888-9f9197003cba.png)
1.0
Pages not found and PermissionDenied exception in tests - This has been going on as far back as I can check old test runs. Note that I was not getting these missing pages or permission errors locally until I deleted and recreated my venv, so it's possible there is some problem with a version of a module we are using in our requirements.txt ? This is the oldest run that still has data, and it's showing the same thing (Click run test suite to view it) https://github.com/bytedeck/bytedeck/runs/6438655994?check_suite_focus=true ![image](https://user-images.githubusercontent.com/10604391/174507108-aa89df49-2945-4834-9888-9f9197003cba.png)
test
pages not found and permissiondenied exception in tests this has been going on as far back as i can check old test runs note that i was not getting these missing pages or permission errors locally until i deleted and recreated my venv so it s possible there is some problem with a version of a module we are using in our requirements txt this is the oldest run that still has data and it s showing the same thing click run test suite to view it
1
262,804
22,960,919,320
IssuesEvent
2022-07-19 15:18:25
iotaledger/explorer
https://api.github.com/repos/iotaledger/explorer
opened
[Task]: Update Visualizer info labels
network:testnet network:shimmer context:visualizer
### Task description On Shimmer Visualiser the stats still show M as in Message in MPS / CMPS. We should update it to B as in Block for stardust Networks. ![image](https://user-images.githubusercontent.com/6864498/179786564-1f955c9b-ebf1-4384-a149-036464fbd8bb.png) ### Requirements N/A ### Acceptance criteria N/A ### Creation checklist - [X] I have assigned this task to the correct people - [X] I have added the most appropriate labels - [X] I have linked the correct milestone and/or project
1.0
[Task]: Update Visualizer info labels - ### Task description On Shimmer Visualiser the stats still show M as in Message in MPS / CMPS. We should update it to B as in Block for stardust Networks. ![image](https://user-images.githubusercontent.com/6864498/179786564-1f955c9b-ebf1-4384-a149-036464fbd8bb.png) ### Requirements N/A ### Acceptance criteria N/A ### Creation checklist - [X] I have assigned this task to the correct people - [X] I have added the most appropriate labels - [X] I have linked the correct milestone and/or project
test
update visualizer info labels task description on shimmer visualiser the stats still show m as in message in mps cmps we should update it to b as in block for stardust networks requirements n a acceptance criteria n a creation checklist i have assigned this task to the correct people i have added the most appropriate labels i have linked the correct milestone and or project
1
74,762
7,440,498,548
IssuesEvent
2018-03-27 10:16:30
kettanaito/react-advanced-form
https://api.github.com/repos/kettanaito/react-advanced-form
closed
Integration: Set fast type speed and see form serialization
tests
## Environment * **react-advanaced-form:** 1.0.7 ## What See issue's title. ## Why Selenium tests show that sometimes a form serialized with only the first characters entered into the field. That needs to be investigated.
1.0
Integration: Set fast type speed and see form serialization - ## Environment * **react-advanaced-form:** 1.0.7 ## What See issue's title. ## Why Selenium tests show that sometimes a form serialized with only the first characters entered into the field. That needs to be investigated.
test
integration set fast type speed and see form serialization environment react advanaced form what see issue s title why selenium tests show that sometimes a form serialized with only the first characters entered into the field that needs to be investigated
1
155,889
12,281,106,513
IssuesEvent
2020-05-08 15:15:54
d-r-q/qbit
https://api.github.com/repos/d-r-q/qbit
opened
Add child to parent tree test
api choose enhancement refactoring research tests
Requires API change to allow user to persist several entities in single call Add test for factoring of tree of ```kotlin data class ChildToParent(val id: Long?, parent: ChildToParent?) ```
1.0
Add child to parent tree test - Requires API change to allow user to persist several entities in single call Add test for factoring of tree of ```kotlin data class ChildToParent(val id: Long?, parent: ChildToParent?) ```
test
add child to parent tree test requires api change to allow user to persist several entities in single call add test for factoring of tree of kotlin data class childtoparent val id long parent childtoparent
1
174,662
27,705,171,907
IssuesEvent
2023-03-14 10:42:40
owncloud/client
https://api.github.com/repos/owncloud/client
closed
Display the private shares as a list
Enhancement Design & UX feature:sharing
Follow up of https://github.com/owncloud/client/pull/4310#issuecomment-345999910 Maybe we could consider delimiting each entry (share) using a list with alternate colors (like the one in "Sync Protocol/Not Synced") to get rid of the rectangle that encloses the checkboxes now. I remember a mockup from some time ago: <p align="center"> <img src="https://user-images.githubusercontent.com/2644445/33070389-222fa6ec-ceb8-11e7-9bb4-52cb343fc21d.png"/> </p> Additionally we could think of a way (icons, multi-line...) to avoid widening the dialog on such long names e.g. remote shares (currently displays the full federated share id + the string "`(remote)`") ## Screenshot for reference: <p align="center"> <img src="https://user-images.githubusercontent.com/2644445/33069989-af6f5c34-ceb6-11e7-93b5-29359822499e.png"/> </p>
1.0
Display the private shares as a list - Follow up of https://github.com/owncloud/client/pull/4310#issuecomment-345999910 Maybe we could consider delimiting each entry (share) using a list with alternate colors (like the one in "Sync Protocol/Not Synced") to get rid of the rectangle that encloses the checkboxes now. I remember a mockup from some time ago: <p align="center"> <img src="https://user-images.githubusercontent.com/2644445/33070389-222fa6ec-ceb8-11e7-9bb4-52cb343fc21d.png"/> </p> Additionally we could think of a way (icons, multi-line...) to avoid widening the dialog on such long names e.g. remote shares (currently displays the full federated share id + the string "`(remote)`") ## Screenshot for reference: <p align="center"> <img src="https://user-images.githubusercontent.com/2644445/33069989-af6f5c34-ceb6-11e7-93b5-29359822499e.png"/> </p>
non_test
display the private shares as a list follow up of maybe we could consider delimiting each entry share using a list with alternate colors like the one in sync protocol not synced to get rid of the rectangle that encloses the checkboxes now i remember a mockup from some time ago img src additionally we could think of a way icons multi line to avoid widening the dialog on such long names e g remote shares currently displays the full federated share id the string remote screenshot for reference img src
0
356,376
25,176,175,337
IssuesEvent
2022-11-11 09:27:24
ruihan00/pe
https://api.github.com/repos/ruihan00/pe
opened
Invalid example command in UG
severity.Medium type.DocumentationBug
The example command in things to note, first point `add n/NAME` is not a valid command that is quoted as a example. This might cause confusion in the users ![Screenshot 2022-11-11 at 5.26.19 PM.png](https://raw.githubusercontent.com/ruihan00/pe/main/files/4d5b1141-f724-4049-9907-175868a1e602.png) ![Screenshot 2022-11-11 at 5.25.50 PM.png](https://raw.githubusercontent.com/ruihan00/pe/main/files/2f965ed1-6dbc-458d-84f4-7605d1c91315.png) <!--session: 1668153581208-180acfcd-9de4-4da8-be78-2bc776d3dd6e--> <!--Version: Web v3.4.4-->
1.0
Invalid example command in UG - The example command in things to note, first point `add n/NAME` is not a valid command that is quoted as a example. This might cause confusion in the users ![Screenshot 2022-11-11 at 5.26.19 PM.png](https://raw.githubusercontent.com/ruihan00/pe/main/files/4d5b1141-f724-4049-9907-175868a1e602.png) ![Screenshot 2022-11-11 at 5.25.50 PM.png](https://raw.githubusercontent.com/ruihan00/pe/main/files/2f965ed1-6dbc-458d-84f4-7605d1c91315.png) <!--session: 1668153581208-180acfcd-9de4-4da8-be78-2bc776d3dd6e--> <!--Version: Web v3.4.4-->
non_test
invalid example command in ug the example command in things to note first point add n name is not a valid command that is quoted as a example this might cause confusion in the users
0
145,992
11,717,283,987
IssuesEvent
2020-03-09 17:00:03
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
Helm 3 should not need to start Tiller
[zube]: To Test kind/bug-qa
<!-- Please search for existing issues first, then read https://rancher.com/docs/rancher/v2.x/en/contributing/#bugs-issues-or-questions to see what we expect in an issue For security issues, please email security@rancher.com instead of posting a public issue in GitHub. You may (but are not required to) use the GPG key located on Keybase. --> **What kind of request is this (question/bug/enhancement/feature request):** bug **Steps to reproduce (least amount of steps as possible):** 1. Deploy rancher server. Tail logs 2. Add a cluster 3. Add a helm_v3 catalog. I used `https://argoproj.github.io/argo-helm` 4. Deploy an app from the helm_v3 catalog. **Result:** In the rancher server logs, it shows: ``` [INFO] Installing chart using helm version: helm_v3 [main] 2020/03/06 18:22:25 Starting Tiller v2.16+unreleased (tls=false) ``` Helm 3 does not use Tiller so shouldn't need to start it. **Environment information** - Rancher version (`rancher/rancher`/`rancher/server` image tag or shown bottom left in the UI): `master-2373-head Rancher version beba5247e` - Installation option (single install/HA): single <!-- If the reported issue is regarding a created cluster, please provide requested info below --> **Cluster information** - Cluster type (Hosted/Infrastructure Provider/Custom/Imported): Linode - Machine type (cloud/VM/metal) and specifications (CPU/memory): 3 nodes, all roles, each 4gb 2cpu - Kubernetes version (use `kubectl version`): `v1.17.3`
1.0
Helm 3 should not need to start Tiller - <!-- Please search for existing issues first, then read https://rancher.com/docs/rancher/v2.x/en/contributing/#bugs-issues-or-questions to see what we expect in an issue For security issues, please email security@rancher.com instead of posting a public issue in GitHub. You may (but are not required to) use the GPG key located on Keybase. --> **What kind of request is this (question/bug/enhancement/feature request):** bug **Steps to reproduce (least amount of steps as possible):** 1. Deploy rancher server. Tail logs 2. Add a cluster 3. Add a helm_v3 catalog. I used `https://argoproj.github.io/argo-helm` 4. Deploy an app from the helm_v3 catalog. **Result:** In the rancher server logs, it shows: ``` [INFO] Installing chart using helm version: helm_v3 [main] 2020/03/06 18:22:25 Starting Tiller v2.16+unreleased (tls=false) ``` Helm 3 does not use Tiller so shouldn't need to start it. **Environment information** - Rancher version (`rancher/rancher`/`rancher/server` image tag or shown bottom left in the UI): `master-2373-head Rancher version beba5247e` - Installation option (single install/HA): single <!-- If the reported issue is regarding a created cluster, please provide requested info below --> **Cluster information** - Cluster type (Hosted/Infrastructure Provider/Custom/Imported): Linode - Machine type (cloud/VM/metal) and specifications (CPU/memory): 3 nodes, all roles, each 4gb 2cpu - Kubernetes version (use `kubectl version`): `v1.17.3`
test
helm should not need to start tiller please search for existing issues first then read to see what we expect in an issue for security issues please email security rancher com instead of posting a public issue in github you may but are not required to use the gpg key located on keybase what kind of request is this question bug enhancement feature request bug steps to reproduce least amount of steps as possible deploy rancher server tail logs add a cluster add a helm catalog i used deploy an app from the helm catalog result in the rancher server logs it shows installing chart using helm version helm starting tiller unreleased tls false helm does not use tiller so shouldn t need to start it environment information rancher version rancher rancher rancher server image tag or shown bottom left in the ui master head rancher version installation option single install ha single if the reported issue is regarding a created cluster please provide requested info below cluster information cluster type hosted infrastructure provider custom imported linode machine type cloud vm metal and specifications cpu memory nodes all roles each kubernetes version use kubectl version
1
247,081
20,956,328,027
IssuesEvent
2022-03-27 06:17:04
Uuvana-Studios/longvinter-windows-client
https://api.github.com/repos/Uuvana-Studios/longvinter-windows-client
closed
Game keeps disconnecting
bug Not Tested
**Describe the bug** The game keeps disconnecting at random moments and then it takes some time to reconnect. **To Reproduce** Steps to reproduce the behavior: 1. Open the game 2. Play for a while 3. Random disconnect 4. Reconnect again after its loading for a few min 5. Steps 2-4 repeat **Expected behavior** Have the game not disconnect **Desktop (please complete the following information):** - OS: Windows - Game Version: 1.0.1 - Steam Version: most recent, idk which version
1.0
Game keeps disconnecting - **Describe the bug** The game keeps disconnecting at random moments and then it takes some time to reconnect. **To Reproduce** Steps to reproduce the behavior: 1. Open the game 2. Play for a while 3. Random disconnect 4. Reconnect again after its loading for a few min 5. Steps 2-4 repeat **Expected behavior** Have the game not disconnect **Desktop (please complete the following information):** - OS: Windows - Game Version: 1.0.1 - Steam Version: most recent, idk which version
test
game keeps disconnecting describe the bug the game keeps disconnecting at random moments and then it takes some time to reconnect to reproduce steps to reproduce the behavior open the game play for a while random disconnect reconnect again after its loading for a few min steps repeat expected behavior have the game not disconnect desktop please complete the following information os windows game version steam version most recent idk which version
1
185,424
21,789,130,835
IssuesEvent
2022-05-14 16:11:11
TreyM-WSS/concord
https://api.github.com/repos/TreyM-WSS/concord
closed
CVE-2018-1109 (High) detected in braces-1.8.5.tgz - autoclosed
security vulnerability
## CVE-2018-1109 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>braces-1.8.5.tgz</b></p></summary> <p>Fastest brace expansion for node.js, with the most complete support for the Bash 4.3 braces specification.</p> <p>Library home page: <a href="https://registry.npmjs.org/braces/-/braces-1.8.5.tgz">https://registry.npmjs.org/braces/-/braces-1.8.5.tgz</a></p> <p>Path to dependency file: concord/console2/package.json</p> <p>Path to vulnerable library: concord/console2/node_modules/babel-cli/node_modules/braces/package.json</p> <p> Dependency Hierarchy: - babel-cli-6.26.0.tgz (Root Library) - chokidar-1.7.0.tgz - anymatch-1.3.2.tgz - micromatch-2.3.11.tgz - :x: **braces-1.8.5.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Braces before 1.4.2 and 2.17.2 is vulnerable to ReDoS. It used a regular expression (^\{(,+(?:(\{,+\})*),*|,*(?:(\{,+\})*),+)\}) in order to detects empty braces. This can cause an impact of about 10 seconds matching time for data 50K characters long. <p>Publish Date: 2020-07-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1109>CVE-2018-1109</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1547272">https://bugzilla.redhat.com/show_bug.cgi?id=1547272</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: 2.3.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"braces","packageVersion":"1.8.5","isTransitiveDependency":true,"dependencyTree":"babel-cli:6.26.0;chokidar:1.7.0;anymatch:1.3.2;micromatch:2.3.11;braces:1.8.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.3.1"}],"vulnerabilityIdentifier":"CVE-2018-1109","vulnerabilityDetails":"Braces before 1.4.2 and 2.17.2 is vulnerable to ReDoS. It used a regular expression (^\\{(,+(?:(\\{,+\\})*),*|,*(?:(\\{,+\\})*),+)\\}) in order to detects empty braces. This can cause an impact of about 10 seconds matching time for data 50K characters long.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1109","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2018-1109 (High) detected in braces-1.8.5.tgz - autoclosed - ## CVE-2018-1109 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>braces-1.8.5.tgz</b></p></summary> <p>Fastest brace expansion for node.js, with the most complete support for the Bash 4.3 braces specification.</p> <p>Library home page: <a href="https://registry.npmjs.org/braces/-/braces-1.8.5.tgz">https://registry.npmjs.org/braces/-/braces-1.8.5.tgz</a></p> <p>Path to dependency file: concord/console2/package.json</p> <p>Path to vulnerable library: concord/console2/node_modules/babel-cli/node_modules/braces/package.json</p> <p> Dependency Hierarchy: - babel-cli-6.26.0.tgz (Root Library) - chokidar-1.7.0.tgz - anymatch-1.3.2.tgz - micromatch-2.3.11.tgz - :x: **braces-1.8.5.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Braces before 1.4.2 and 2.17.2 is vulnerable to ReDoS. It used a regular expression (^\{(,+(?:(\{,+\})*),*|,*(?:(\{,+\})*),+)\}) in order to detects empty braces. This can cause an impact of about 10 seconds matching time for data 50K characters long. 
<p>Publish Date: 2020-07-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1109>CVE-2018-1109</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1547272">https://bugzilla.redhat.com/show_bug.cgi?id=1547272</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: 2.3.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"braces","packageVersion":"1.8.5","isTransitiveDependency":true,"dependencyTree":"babel-cli:6.26.0;chokidar:1.7.0;anymatch:1.3.2;micromatch:2.3.11;braces:1.8.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.3.1"}],"vulnerabilityIdentifier":"CVE-2018-1109","vulnerabilityDetails":"Braces before 1.4.2 and 2.17.2 is vulnerable to ReDoS. It used a regular expression (^\\{(,+(?:(\\{,+\\})*),*|,*(?:(\\{,+\\})*),+)\\}) in order to detects empty braces. 
This can cause an impact of about 10 seconds matching time for data 50K characters long.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-1109","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_test
cve high detected in braces tgz autoclosed cve high severity vulnerability vulnerable library braces tgz fastest brace expansion for node js with the most complete support for the bash braces specification library home page a href path to dependency file concord package json path to vulnerable library concord node modules babel cli node modules braces package json dependency hierarchy babel cli tgz root library chokidar tgz anymatch tgz micromatch tgz x braces tgz vulnerable library vulnerability details braces before and is vulnerable to redos it used a regular expression in order to detects empty braces this can cause an impact of about seconds matching time for data characters long publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails braces before and is vulnerable to redos it used a regular expression in order to detects empty braces this can cause an impact of about seconds matching time for data characters long vulnerabilityurl
0
83,372
3,634,226,945
IssuesEvent
2016-02-11 17:13:33
Sonarr/Sonarr
https://api.github.com/repos/Sonarr/Sonarr
closed
Add rel="noreferrer" to all external links
enhancement priority:low
Ideally this is a helper or something else that makes it seamless to add to links automatically.
1.0
Add rel="noreferrer" to all external links - Ideally this is a helper or something else that makes it seamless to add to links automatically.
non_test
add rel noreferrer to all external links ideally this is a helper or something else that makes it seamless to add to links automatically
0
277,064
24,046,106,817
IssuesEvent
2022-09-16 08:29:01
zephyrproject-rtos/test_results
https://api.github.com/repos/zephyrproject-rtos/test_results
opened
IPv4.ARP.003 : icmp.v4: udp.v4: ARP Request with Hardware type field set to unassigned type. fail
area: Tests area: maxpro
Describe the bug An ARP Request message is sent to the DUT, but its Hardware address space (or type) field is set to an unassigned value (42). A check of that field is the first test performed by the pseudo code on page 5 of RFC 826. Therefore no reply should be sent nor any updates made to the DUT's translation table. To verify that the DUT did not update the translation table, an IPv4 packet with its source address set to the value in the ARP Request's Protocol Sender Address field is sent to the DUT. test is Fail on Zephyr3.2.0 on qemu_x86 References RFC 826: page 5 Results b'FAIL: icmp.v4 DUT did not send an IP datagram response.' b'FAIL: udp.v4 DUT did not send an IP datagram response.' Environment (please complete the following information): OS: (e.g. Linux ) Toolchain (e.g Zephyr SDK) Commit SHA or Version used: Zephyr3.2-rc1
1.0
IPv4.ARP.003 : icmp.v4: udp.v4: ARP Request with Hardware type field set to unassigned type. fail - Describe the bug An ARP Request message is sent to the DUT, but its Hardware address space (or type) field is set to an unassigned value (42). A check of that field is the first test performed by the pseudo code on page 5 of RFC 826. Therefore no reply should be sent nor any updates made to the DUT's translation table. To verify that the DUT did not update the translation table, an IPv4 packet with its source address set to the value in the ARP Request's Protocol Sender Address field is sent to the DUT. test is Fail on Zephyr3.2.0 on qemu_x86 References RFC 826: page 5 Results b'FAIL: icmp.v4 DUT did not send an IP datagram response.' b'FAIL: udp.v4 DUT did not send an IP datagram response.' Environment (please complete the following information): OS: (e.g. Linux ) Toolchain (e.g Zephyr SDK) Commit SHA or Version used: Zephyr3.2-rc1
test
arp icmp udp arp request with hardware type field set to unassigned type fail describe the bug an arp request message is sent to the dut but its hardware address space or type field is set to an unassigned value a check of that field is the first test performed by the pseudo code on page of rfc therefore no reply should be sent nor any updates made to the dut s translation table to verify that the dut did not update the translation table an packet with its source address set to the value in the arp request s protocol sender address field is sent to the dut test is fail on on qemu references rfc page results b fail icmp dut did not send an ip datagram response b fail udp dut did not send an ip datagram response environment please complete the following information os e g linux toolchain e g zephyr sdk commit sha or version used
1
690,282
23,653,211,498
IssuesEvent
2022-08-26 08:45:10
trustwallet/wallet-core
https://api.github.com/repos/trustwallet/wallet-core
closed
Manage boost by `tools/install-dependencies` script
enhancement priority:medium size:medium
Make sure we have same boost version across macOS / Linux / Android / iOS, Wasm could be different for now; This should also solve the issue when you use Android Studio + M1 MacBook
1.0
Manage boost by `tools/install-dependencies` script - Make sure we have same boost version across macOS / Linux / Android / iOS, Wasm could be different for now; This should also solve the issue when you use Android Studio + M1 MacBook
non_test
manage boost by tools install dependencies script make sure we have same boost version across macos linux android ios wasm could be different for now this should also solve the issue when you use android studio macbook
0
52,888
6,662,983,331
IssuesEvent
2017-10-02 14:55:03
status-im/status-react
https://api.github.com/repos/status-im/status-react
opened
Wallet - design review - send transaction
design wallet
![image](https://user-images.githubusercontent.com/11790366/31083526-9931c8be-a79a-11e7-979f-7587cf2c74ae.png) ![image](https://user-images.githubusercontent.com/11790366/31083531-9d9d4900-a79a-11e7-8aa9-afd4a98b4006.png) ![image](https://user-images.githubusercontent.com/11790366/31083537-a1da310e-a79a-11e7-8209-8afb6e844184.png) ![image](https://user-images.githubusercontent.com/11790366/31083542-a62edae8-a79a-11e7-8ce9-c6c1ad068c3e.png) ![image](https://user-images.githubusercontent.com/11790366/31083553-aae2d6c0-a79a-11e7-8a66-fb4e31cf66a5.png) ![image](https://user-images.githubusercontent.com/11790366/31083569-b3cf7540-a79a-11e7-9796-12778d414fdf.png) ![image](https://user-images.githubusercontent.com/11790366/31083598-c2f1b114-a79a-11e7-8ef0-710dec121012.png) ![image](https://user-images.githubusercontent.com/11790366/31083613-c8cea20e-a79a-11e7-9f98-600cfe289380.png) ![image](https://user-images.githubusercontent.com/11790366/31083619-cd51a376-a79a-11e7-9578-3cea9f8c2830.png)
1.0
Wallet - design review - send transaction - ![image](https://user-images.githubusercontent.com/11790366/31083526-9931c8be-a79a-11e7-979f-7587cf2c74ae.png) ![image](https://user-images.githubusercontent.com/11790366/31083531-9d9d4900-a79a-11e7-8aa9-afd4a98b4006.png) ![image](https://user-images.githubusercontent.com/11790366/31083537-a1da310e-a79a-11e7-8209-8afb6e844184.png) ![image](https://user-images.githubusercontent.com/11790366/31083542-a62edae8-a79a-11e7-8ce9-c6c1ad068c3e.png) ![image](https://user-images.githubusercontent.com/11790366/31083553-aae2d6c0-a79a-11e7-8a66-fb4e31cf66a5.png) ![image](https://user-images.githubusercontent.com/11790366/31083569-b3cf7540-a79a-11e7-9796-12778d414fdf.png) ![image](https://user-images.githubusercontent.com/11790366/31083598-c2f1b114-a79a-11e7-8ef0-710dec121012.png) ![image](https://user-images.githubusercontent.com/11790366/31083613-c8cea20e-a79a-11e7-9f98-600cfe289380.png) ![image](https://user-images.githubusercontent.com/11790366/31083619-cd51a376-a79a-11e7-9578-3cea9f8c2830.png)
non_test
wallet design review send transaction
0
4,715
11,609,850,572
IssuesEvent
2020-02-26 01:15:30
hkamran80/game-launcher
https://api.github.com/repos/hkamran80/game-launcher
opened
Download Covers Endpoint
Architecture Enhancement Enhancement
We need an endpoint to download covers, so you don't have to run `backend.py` every time you want to update your covers. Proposal: `/update/covers`
1.0
Download Covers Endpoint - We need an endpoint to download covers, so you don't have to run `backend.py` every time you want to update your covers. Proposal: `/update/covers`
non_test
download covers endpoint we need an endpoint to download covers so you don t have to run backend py every time you want to update your covers proposal update covers
0
5,913
2,986,361,888
IssuesEvent
2015-07-20 00:53:59
nilearn/nilearn
https://api.github.com/repos/nilearn/nilearn
closed
Running examples needs a more recent scikit-learn version than advertised in README.rst
Documentation
- scikit-learn 0.10 is required by nilearn/version.py and the nilearn tests pass with 0.10 on Travis - README.rst says 0.12.1 is required (looking at git history this is probably because at one point at least one example needed 0.12.1 to run) - to be able to run all the examples, scikit-learn 0.14.1 is required My question is: - do we just accept that running the examples may need a more recent version of scikit-learn than just using nilearn and add a note about that in README.rst? - Should we try to modify the examples so that they run with a scikit-learn version older than 0.14.1? For completeness, here is some summary of the errors with scikit-learn 0.11, 0.12.1 and 0.13.1 (0.10 is not available through conda, hence it is absent from the summary): ``` doc_sklearn_0.11.log- File "plot_nifti_simple.py", line 44, in <module> doc_sklearn_0.11.log:AttributeError: 'FastICA' object has no attribute 'fit_transform' doc_sklearn_0.11.log- File "plot_haxby_simple.py", line 60, in <module> doc_sklearn_0.11.log:TypeError: __init__() got an unexpected keyword argument 'n_folds' doc_sklearn_0.11.log- File "plot_ica_resting_state.py", line 36, in <module> doc_sklearn_0.11.log:AttributeError: 'FastICA' object has no attribute 'fit_transform' doc_sklearn_0.11.log- File "plot_haxby_searchlight.py", line 60, in <module> doc_sklearn_0.11.log:TypeError: __init__() got an unexpected keyword argument 'n_folds' doc_sklearn_0.11.log- File "plot_haxby_full_analysis.py", line 42, in <module> doc_sklearn_0.11.log:ImportError: No module named dummy doc_sklearn_0.11.log- File "plot_simulated_data.py", line 114, in <module> doc_sklearn_0.11.log:TypeError: __init__() got an unexpected keyword argument 'l1_ratio' doc_sklearn_0.11.log- File "plot_haxby_different_estimators.py", line 49, in <module> doc_sklearn_0.11.log:TypeError: __init__() got an unexpected keyword argument 'scoring' doc_sklearn_0.11.log- File "plot_miyawaki_reconstruction.py", line 214, in <module> 
doc_sklearn_0.11.log:ImportError: cannot import name accuracy_score ---------------------------------------- doc_sklearn_0.12.1.log- File "plot_haxby_simple.py", line 60, in <module> doc_sklearn_0.12.1.log:TypeError: __init__() got an unexpected keyword argument 'n_folds' doc_sklearn_0.12.1.log- File "plot_haxby_searchlight.py", line 60, in <module> doc_sklearn_0.12.1.log:TypeError: __init__() got an unexpected keyword argument 'n_folds' doc_sklearn_0.12.1.log- File "plot_haxby_full_analysis.py", line 42, in <module> doc_sklearn_0.12.1.log:ImportError: No module named dummy doc_sklearn_0.12.1.log- File "plot_simulated_data.py", line 114, in <module> doc_sklearn_0.12.1.log:TypeError: __init__() got an unexpected keyword argument 'l1_ratio' doc_sklearn_0.12.1.log- File "plot_haxby_different_estimators.py", line 49, in <module> doc_sklearn_0.12.1.log:TypeError: __init__() got an unexpected keyword argument 'scoring' doc_sklearn_0.12.1.log- File "plot_miyawaki_reconstruction.py", line 214, in <module> doc_sklearn_0.12.1.log:ImportError: cannot import name accuracy_score ---------------------------------------- doc_sklearn_0.13.1.log- File "plot_haxby_full_analysis.py", line 73, in <module> doc_sklearn_0.13.1.log:TypeError: cross_val_score() got an unexpected keyword argument 'scoring' doc_sklearn_0.13.1.log- File "plot_haxby_different_estimators.py", line 49, in <module> doc_sklearn_0.13.1.log:TypeError: __init__() got an unexpected keyword argument 'scoring' ```
1.0
Running examples needs a more recent scikit-learn version than advertised in README.rst - - scikit-learn 0.10 is required by nilearn/version.py and the nilearn tests pass with 0.10 on Travis - README.rst says 0.12.1 is required (looking at git history this is probably because at one point at least one example needed 0.12.1 to run) - to be able to run all the examples, scikit-learn 0.14.1 is required My question is: - do we just accept that running the examples may need a more recent version of scikit-learn than just using nilearn and add a note about that in README.rst? - Should we try to modify the examples so that they run with a scikit-learn version older than 0.14.1? For completeness, here is some summary of the errors with scikit-learn 0.11, 0.12.1 and 0.13.1 (0.10 is not available through conda, hence it is absent from the summary): ``` doc_sklearn_0.11.log- File "plot_nifti_simple.py", line 44, in <module> doc_sklearn_0.11.log:AttributeError: 'FastICA' object has no attribute 'fit_transform' doc_sklearn_0.11.log- File "plot_haxby_simple.py", line 60, in <module> doc_sklearn_0.11.log:TypeError: __init__() got an unexpected keyword argument 'n_folds' doc_sklearn_0.11.log- File "plot_ica_resting_state.py", line 36, in <module> doc_sklearn_0.11.log:AttributeError: 'FastICA' object has no attribute 'fit_transform' doc_sklearn_0.11.log- File "plot_haxby_searchlight.py", line 60, in <module> doc_sklearn_0.11.log:TypeError: __init__() got an unexpected keyword argument 'n_folds' doc_sklearn_0.11.log- File "plot_haxby_full_analysis.py", line 42, in <module> doc_sklearn_0.11.log:ImportError: No module named dummy doc_sklearn_0.11.log- File "plot_simulated_data.py", line 114, in <module> doc_sklearn_0.11.log:TypeError: __init__() got an unexpected keyword argument 'l1_ratio' doc_sklearn_0.11.log- File "plot_haxby_different_estimators.py", line 49, in <module> doc_sklearn_0.11.log:TypeError: __init__() got an unexpected keyword argument 'scoring' doc_sklearn_0.11.log- 
File "plot_miyawaki_reconstruction.py", line 214, in <module> doc_sklearn_0.11.log:ImportError: cannot import name accuracy_score ---------------------------------------- doc_sklearn_0.12.1.log- File "plot_haxby_simple.py", line 60, in <module> doc_sklearn_0.12.1.log:TypeError: __init__() got an unexpected keyword argument 'n_folds' doc_sklearn_0.12.1.log- File "plot_haxby_searchlight.py", line 60, in <module> doc_sklearn_0.12.1.log:TypeError: __init__() got an unexpected keyword argument 'n_folds' doc_sklearn_0.12.1.log- File "plot_haxby_full_analysis.py", line 42, in <module> doc_sklearn_0.12.1.log:ImportError: No module named dummy doc_sklearn_0.12.1.log- File "plot_simulated_data.py", line 114, in <module> doc_sklearn_0.12.1.log:TypeError: __init__() got an unexpected keyword argument 'l1_ratio' doc_sklearn_0.12.1.log- File "plot_haxby_different_estimators.py", line 49, in <module> doc_sklearn_0.12.1.log:TypeError: __init__() got an unexpected keyword argument 'scoring' doc_sklearn_0.12.1.log- File "plot_miyawaki_reconstruction.py", line 214, in <module> doc_sklearn_0.12.1.log:ImportError: cannot import name accuracy_score ---------------------------------------- doc_sklearn_0.13.1.log- File "plot_haxby_full_analysis.py", line 73, in <module> doc_sklearn_0.13.1.log:TypeError: cross_val_score() got an unexpected keyword argument 'scoring' doc_sklearn_0.13.1.log- File "plot_haxby_different_estimators.py", line 49, in <module> doc_sklearn_0.13.1.log:TypeError: __init__() got an unexpected keyword argument 'scoring' ```
non_test
running examples needs a more recent scikit learn version than advertised in readme rst scikit learn is required by nilearn version py and the nilearn tests pass with on travis readme rst says is required looking at git history this is probably because at one point at least one example needed to run to be able to run all the examples scikit learn is required my question is do we just accept that running the examples may need a more recent version of scikit learn than just using nilearn and add a note about that in readme rst should we try to modify the examples so that they run with a scikit learn version older than for completeness here is some summary of the errors with scikit learn and is not available through conda hence it is absent from the summary doc sklearn log file plot nifti simple py line in doc sklearn log attributeerror fastica object has no attribute fit transform doc sklearn log file plot haxby simple py line in doc sklearn log typeerror init got an unexpected keyword argument n folds doc sklearn log file plot ica resting state py line in doc sklearn log attributeerror fastica object has no attribute fit transform doc sklearn log file plot haxby searchlight py line in doc sklearn log typeerror init got an unexpected keyword argument n folds doc sklearn log file plot haxby full analysis py line in doc sklearn log importerror no module named dummy doc sklearn log file plot simulated data py line in doc sklearn log typeerror init got an unexpected keyword argument ratio doc sklearn log file plot haxby different estimators py line in doc sklearn log typeerror init got an unexpected keyword argument scoring doc sklearn log file plot miyawaki reconstruction py line in doc sklearn log importerror cannot import name accuracy score doc sklearn log file plot haxby simple py line in doc sklearn log typeerror init got an unexpected keyword argument n folds doc sklearn log file plot haxby searchlight py line in doc sklearn log typeerror init got an unexpected 
keyword argument n folds doc sklearn log file plot haxby full analysis py line in doc sklearn log importerror no module named dummy doc sklearn log file plot simulated data py line in doc sklearn log typeerror init got an unexpected keyword argument ratio doc sklearn log file plot haxby different estimators py line in doc sklearn log typeerror init got an unexpected keyword argument scoring doc sklearn log file plot miyawaki reconstruction py line in doc sklearn log importerror cannot import name accuracy score doc sklearn log file plot haxby full analysis py line in doc sklearn log typeerror cross val score got an unexpected keyword argument scoring doc sklearn log file plot haxby different estimators py line in doc sklearn log typeerror init got an unexpected keyword argument scoring
0
246,340
20,834,484,780
IssuesEvent
2022-03-20 00:50:33
metaplex-foundation/metaplex
https://api.github.com/repos/metaplex-foundation/metaplex
closed
[Bug]: ts-node ./candy-machine-cli.ts upload ./assets --env devnet -k <keypair-path> zsh: parse error near `\n'
needs tests bug
### Which package is this bug report for? candy machine cli ### Issue description solve ts-node ./candy-machine-cli.ts upload ./assets --env devnet -k <keypair-path> zsh: parse error near `\n' ### Command ```shell ts-node ./candy-machine-cli.ts upload ./assets --env devnet -k <keypair-path> zsh: parse error near `\n' ``` ### Relevant log output ```shell ts-node ./candy-machine-cli.ts upload ./assets --env devnet -k <keypair-path> zsh: parse error near `\n' ``` ### Operating system mac ### Priority this issue should have High (immediate attention needed) ### Check the Docs First - [X] I have checked the docs and it didn't solve my issue
1.0
[Bug]: ts-node ./candy-machine-cli.ts upload ./assets --env devnet -k <keypair-path> zsh: parse error near `\n' - ### Which package is this bug report for? candy machine cli ### Issue description solve ts-node ./candy-machine-cli.ts upload ./assets --env devnet -k <keypair-path> zsh: parse error near `\n' ### Command ```shell ts-node ./candy-machine-cli.ts upload ./assets --env devnet -k <keypair-path> zsh: parse error near `\n' ``` ### Relevant log output ```shell ts-node ./candy-machine-cli.ts upload ./assets --env devnet -k <keypair-path> zsh: parse error near `\n' ``` ### Operating system mac ### Priority this issue should have High (immediate attention needed) ### Check the Docs First - [X] I have checked the docs and it didn't solve my issue
test
ts node candy machine cli ts upload assets env devnet k zsh parse error near n which package is this bug report for candy machine cli issue description solve ts node candy machine cli ts upload assets env devnet k zsh parse error near n command shell ts node candy machine cli ts upload assets env devnet k zsh parse error near n relevant log output shell ts node candy machine cli ts upload assets env devnet k zsh parse error near n operating system mac priority this issue should have high immediate attention needed check the docs first i have checked the docs and it didn t solve my issue
1
186,538
21,944,189,706
IssuesEvent
2022-05-23 21:40:40
CMSgov/cms-carts-seds
https://api.github.com/repos/CMSgov/cms-carts-seds
closed
SHF - cms-carts-seds - main - MEDIUM - Instance i-0cd9e25dff7e97a2f is vulnerable to CVE-2021-43056
security-hub main
************************************************************** __This issue was generated from Security Hub data and is managed through automation.__ Please do not edit the title or body of this issue, or remove the security-hub tag. All other edits/comments are welcome. Finding Id: inspector/us-east-1/519095364708/7b0676120713c898345b0c9f716982990d7dbb87 ************************************************************** ## Type of Issue: - [x] Security Hub Finding ## Title: Instance i-0cd9e25dff7e97a2f is vulnerable to CVE-2021-43056 ## Id: inspector/us-east-1/519095364708/7b0676120713c898345b0c9f716982990d7dbb87 (You may use this ID to lookup this finding's details in Security Hub) ## Description An issue was discovered in the Linux kernel for powerpc before 5.14.15. It allows a malicious KVM guest to crash the host, when the host is running on Power8, due to an arch/powerpc/kvm/book3s_hv_rmhandlers.S implementation bug in the handling of the SRR1 register values. ## Remediation undefined ## AC: - The security hub finding is resolved or suppressed, indicated by a Workflow Status of Resolved or Suppressed.
True
SHF - cms-carts-seds - main - MEDIUM - Instance i-0cd9e25dff7e97a2f is vulnerable to CVE-2021-43056 - ************************************************************** __This issue was generated from Security Hub data and is managed through automation.__ Please do not edit the title or body of this issue, or remove the security-hub tag. All other edits/comments are welcome. Finding Id: inspector/us-east-1/519095364708/7b0676120713c898345b0c9f716982990d7dbb87 ************************************************************** ## Type of Issue: - [x] Security Hub Finding ## Title: Instance i-0cd9e25dff7e97a2f is vulnerable to CVE-2021-43056 ## Id: inspector/us-east-1/519095364708/7b0676120713c898345b0c9f716982990d7dbb87 (You may use this ID to lookup this finding's details in Security Hub) ## Description An issue was discovered in the Linux kernel for powerpc before 5.14.15. It allows a malicious KVM guest to crash the host, when the host is running on Power8, due to an arch/powerpc/kvm/book3s_hv_rmhandlers.S implementation bug in the handling of the SRR1 register values. ## Remediation undefined ## AC: - The security hub finding is resolved or suppressed, indicated by a Workflow Status of Resolved or Suppressed.
non_test
shf cms carts seds main medium instance i is vulnerable to cve this issue was generated from security hub data and is managed through automation please do not edit the title or body of this issue or remove the security hub tag all other edits comments are welcome finding id inspector us east type of issue security hub finding title instance i is vulnerable to cve id inspector us east you may use this id to lookup this finding s details in security hub description an issue was discovered in the linux kernel for powerpc before it allows a malicious kvm guest to crash the host when the host is running on due to an arch powerpc kvm hv rmhandlers s implementation bug in the handling of the register values remediation undefined ac the security hub finding is resolved or suppressed indicated by a workflow status of resolved or suppressed
0
318,295
27,297,081,787
IssuesEvent
2023-02-23 21:21:44
nucleus-security/Test-repo
https://api.github.com/repos/nucleus-security/Test-repo
closed
Nucleus - [High] - 440059
Test
Source: QUALYS Finding Description: CentOS has released security update for kernel security update to fix the vulnerabilities. Affected Products: centos 6 Impact: N/A Target(s): Asset name: 192.168.56.103 Solution: To resolve this issue, upgrade to the latest packages which contain a patch. Refer to CentOS advisory centos 6 (https://lists.centos.org/pipermail/centos-announce/2015-August/021327.html) for updates and patch information. Patch: Following are links for downloading patches to fix the vulnerabilities: CESA-2015:1623: centos 6 (https://lists.centos.org/pipermail/centos-announce/2015-August/021327.html) References: QID:440059 CVE:CVE-2015-5364, CVE-2015-5366 Category:Local PCI Flagged:no Vendor References:CESA-2015:1623 centos 6 Bugtraq IDs:75510 Severity: High Date Discovered: 2022-11-12 08:04:44 Nucleus Notification Rules Triggered: Rule GitHub Project Name: 6716 Please see Nucleus for more information on these vulnerabilities:https://192.168.56.101/nucleus/public/app/index.html#vuln/201000007/NDQwMDU5/UVVBTFlT/VnVsbg--/false/MjAxMDAwMDA3/c3VtbWFyeQ--/false
1.0
Nucleus - [High] - 440059 - Source: QUALYS Finding Description: CentOS has released security update for kernel security update to fix the vulnerabilities. Affected Products: centos 6 Impact: N/A Target(s): Asset name: 192.168.56.103 Solution: To resolve this issue, upgrade to the latest packages which contain a patch. Refer to CentOS advisory centos 6 (https://lists.centos.org/pipermail/centos-announce/2015-August/021327.html) for updates and patch information. Patch: Following are links for downloading patches to fix the vulnerabilities: CESA-2015:1623: centos 6 (https://lists.centos.org/pipermail/centos-announce/2015-August/021327.html) References: QID:440059 CVE:CVE-2015-5364, CVE-2015-5366 Category:Local PCI Flagged:no Vendor References:CESA-2015:1623 centos 6 Bugtraq IDs:75510 Severity: High Date Discovered: 2022-11-12 08:04:44 Nucleus Notification Rules Triggered: Rule GitHub Project Name: 6716 Please see Nucleus for more information on these vulnerabilities:https://192.168.56.101/nucleus/public/app/index.html#vuln/201000007/NDQwMDU5/UVVBTFlT/VnVsbg--/false/MjAxMDAwMDA3/c3VtbWFyeQ--/false
test
nucleus source qualys finding description centos has released security update for kernel security update to fix the vulnerabilities affected products centos impact n a target s asset name solution to resolve this issue upgrade to the latest packages which contain a patch refer to centos advisory centos for updates and patch information patch following are links for downloading patches to fix the vulnerabilities cesa centos references qid cve cve cve category local pci flagged no vendor references cesa centos bugtraq ids severity high date discovered nucleus notification rules triggered rule github project name please see nucleus for more information on these vulnerabilities
1
322,727
27,627,831,186
IssuesEvent
2023-03-10 08:31:20
dart-lang/co19
https://api.github.com/repos/dart-lang/co19
closed
co19/Language/Statements/Switch/syntax_t01.dart expectation should be updated
bad-test
This case https://github.com/dart-lang/co19/blob/07279694bd8d95a8e5aea258624fdfb3feb01f3f/Language/Statements/Switch/syntax_t01.dart#L33, when opted into patterns, will only match `value` against both `true` and `false`, which will always be `false`. Prior to patterns it would have matched against the expression `true && false` (ie: `false`). So the expectation here https://github.com/dart-lang/co19/blob/07279694bd8d95a8e5aea258624fdfb3feb01f3f/Language/Statements/Switch/syntax_t01.dart#L85 should be `-1`.
1.0
co19/Language/Statements/Switch/syntax_t01.dart expectation should be updated - This case https://github.com/dart-lang/co19/blob/07279694bd8d95a8e5aea258624fdfb3feb01f3f/Language/Statements/Switch/syntax_t01.dart#L33, when opted into patterns, will only match `value` against both `true` and `false`, which will always be `false`. Prior to patterns it would have matched against the expression `true && false` (ie: `false`). So the expectation here https://github.com/dart-lang/co19/blob/07279694bd8d95a8e5aea258624fdfb3feb01f3f/Language/Statements/Switch/syntax_t01.dart#L85 should be `-1`.
test
language statements switch syntax dart expectation should be updated this case when opted into patterns will only match value against both true and false which will always be false prior to patterns it would have matched against the expression true false ie false so the expectation here should be
1
692,063
23,721,805,739
IssuesEvent
2022-08-30 15:55:23
ballerina-platform/ballerina-dev-website
https://api.github.com/repos/ballerina-platform/ballerina-dev-website
closed
Change the color of the footer buttons
Priority/Highest Type/Task WebsiteRewrite Area/CommonPages
## Description Need to change the color of the 2 subscribe buttons on the footer to match the color shade added in https://github.com/ballerina-platform/ballerina-dev-website/issues/5104. ## Related website/documentation area > Add/Uncomment the relevant area label out of the following. <!--Area/BBEs--> <!--Area/HomePageSamples--> <!--Area/LearnPages--> <!--Area/CommonPages--> <!--Area/Backend--> <!--Area/UIUX--> <!--Area/Workflows--> <!--Area/Blog--> ## Describe your task(s) > A detailed description of the task. ## Related issue(s) (optional) > Any related issues such as sub tasks and issues reported in other repositories (e.g., component repositories), similar problems, etc. ## Suggested label(s) (optional) > Optional comma-separated list of suggested labels. Non committers can’t assign labels to issues, and thereby, this will help issue creators who are not a committer to suggest possible labels. ## Suggested assignee(s) (optional) > Optional comma-separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, and thereby, this will help issue creators who are not a committer to suggest possible assignees.
1.0
Change the color of the footer buttons - ## Description Need to change the color of the 2 subscribe buttons on the footer to match the color shade added in https://github.com/ballerina-platform/ballerina-dev-website/issues/5104. ## Related website/documentation area > Add/Uncomment the relevant area label out of the following. <!--Area/BBEs--> <!--Area/HomePageSamples--> <!--Area/LearnPages--> <!--Area/CommonPages--> <!--Area/Backend--> <!--Area/UIUX--> <!--Area/Workflows--> <!--Area/Blog--> ## Describe your task(s) > A detailed description of the task. ## Related issue(s) (optional) > Any related issues such as sub tasks and issues reported in other repositories (e.g., component repositories), similar problems, etc. ## Suggested label(s) (optional) > Optional comma-separated list of suggested labels. Non committers can’t assign labels to issues, and thereby, this will help issue creators who are not a committer to suggest possible labels. ## Suggested assignee(s) (optional) > Optional comma-separated list of suggested team members who should attend the issue. Non committers can’t assign issues to assignees, and thereby, this will help issue creators who are not a committer to suggest possible assignees.
non_test
change the color of the footer buttons description need to change the color of the subscribe buttons on the footer to match the color shade added in related website documentation area add uncomment the relevant area label out of the following describe your task s a detailed description of the task related issue s optional any related issues such as sub tasks and issues reported in other repositories e g component repositories similar problems etc suggested label s optional optional comma separated list of suggested labels non committers can’t assign labels to issues and thereby this will help issue creators who are not a committer to suggest possible labels suggested assignee s optional optional comma separated list of suggested team members who should attend the issue non committers can’t assign issues to assignees and thereby this will help issue creators who are not a committer to suggest possible assignees
0
320,633
23,816,722,689
IssuesEvent
2022-09-05 07:31:34
takuma-ru/vue-swipe-modal
https://api.github.com/repos/takuma-ru/vue-swipe-modal
closed
:sparkles: vue2.x用ドキュメントの作成
documentation enhancement
### Want to Achieve vue2.x用ドキュメントの作成 ### ToDo - [ ] vue2.x用ドキュメントの作成 ### Branch
1.0
:sparkles: vue2.x用ドキュメントの作成 - ### Want to Achieve vue2.x用ドキュメントの作成 ### ToDo - [ ] vue2.x用ドキュメントの作成 ### Branch
non_test
sparkles x用ドキュメントの作成 want to achieve x用ドキュメントの作成 todo x用ドキュメントの作成 branch
0
333,322
24,370,790,284
IssuesEvent
2022-10-03 19:05:45
KDAB/cxx-qt
https://api.github.com/repos/KDAB/cxx-qt
opened
Update the book for new API changes
documentation good first issue
We need to go through all the pages of the book and ensure they are up to date. Also consider if additional pages need to be added on new topics or internals that are relevant.
1.0
Update the book for new API changes - We need to go through all the pages of the book and ensure they are up to date. Also consider if additional pages need to be added on new topics or internals that are relevant.
non_test
update the book for new api changes we need to go through all the pages of the book and ensure they are up to date also consider if additional pages need to be added on new topics or internals that are relevant
0
195,769
14,762,350,872
IssuesEvent
2021-01-09 03:10:07
PyTorchLightning/pytorch-lightning-bolts
https://api.github.com/repos/PyTorchLightning/pytorch-lightning-bolts
opened
Redundant output from `test_logistic_regression_model`
bug / fix tests / CI
## 🐛 Bug There are sometimes so many redundant lines in the tests output which makes it hard to review PRs: ``` ... tests/models/test_classic_ml.py::test_logistic_regression_model tests/models/test_classic_ml.py::test_logistic_regression_model tests/models/test_classic_ml.py::test_logistic_regression_model ... ``` This behaviour happens in https://github.com/PyTorchLightning/pytorch-lightning-bolts/pull/403/checks?check_run_id=1669189373 while it doesn't in https://github.com/PyTorchLightning/pytorch-lightning-bolts/pull/475/checks?check_run_id=1669579907 ### To Reproduce Will be investigated ### Expected behavior Only one line output: ``` tests/models/test_classic_ml.py::test_logistic_regression_model ``` ### Environment - PyTorch Version (e.g., 1.0): - OS (e.g., Linux): - How you installed PyTorch (`conda`, `pip`, source): - Build command you used (if compiling from source): - Python version: - CUDA/cuDNN version: - GPU models and configuration: - Any other relevant information: ### Additional context <!-- Add any other context about the problem here. -->
1.0
Redundant output from `test_logistic_regression_model` - ## 🐛 Bug There are sometimes so many redundant lines in the tests output which makes it hard to review PRs: ``` ... tests/models/test_classic_ml.py::test_logistic_regression_model tests/models/test_classic_ml.py::test_logistic_regression_model tests/models/test_classic_ml.py::test_logistic_regression_model ... ``` This behaviour happens in https://github.com/PyTorchLightning/pytorch-lightning-bolts/pull/403/checks?check_run_id=1669189373 while it doesn't in https://github.com/PyTorchLightning/pytorch-lightning-bolts/pull/475/checks?check_run_id=1669579907 ### To Reproduce Will be investigated ### Expected behavior Only one line output: ``` tests/models/test_classic_ml.py::test_logistic_regression_model ``` ### Environment - PyTorch Version (e.g., 1.0): - OS (e.g., Linux): - How you installed PyTorch (`conda`, `pip`, source): - Build command you used (if compiling from source): - Python version: - CUDA/cuDNN version: - GPU models and configuration: - Any other relevant information: ### Additional context <!-- Add any other context about the problem here. -->
test
redundant output from test logistic regression model 🐛 bug there are sometimes so many redundant lines in the tests output which makes it hard to review prs tests models test classic ml py test logistic regression model tests models test classic ml py test logistic regression model tests models test classic ml py test logistic regression model this behaviour happens in while it doesn t in to reproduce will be investigated expected behavior only one line output tests models test classic ml py test logistic regression model environment pytorch version e g os e g linux how you installed pytorch conda pip source build command you used if compiling from source python version cuda cudnn version gpu models and configuration any other relevant information additional context
1
212,575
7,238,691,452
IssuesEvent
2018-02-13 15:21:45
FocusCompany/backend
https://api.github.com/repos/FocusCompany/backend
closed
Sockets shouldn't deserialize Protobuf objects
Priority: low enhancement
Currently, Protobuf object unwrapping is done in the socket function. While protobuf is deserializing the envelope, the socket is not listening for anything. We should move the message reading bits into another actor
1.0
Sockets shouldn't deserialize Protobuf objects - Currently, Protobuf object unwrapping is done in the socket function. While protobuf is deserializing the envelope, the socket is not listening for anything. We should move the message reading bits into another actor
non_test
sockets shouldn t deserialize protobuf objects currently protobuf object unwrapping is done in the socket function while protobuf is deserializing the envelope the socket is not listening for anything we should move the message reading bits into another actor
0
23,457
4,019,032,255
IssuesEvent
2016-05-16 13:30:37
socia-platform/htwplus
https://api.github.com/repos/socia-platform/htwplus
closed
Account Deletion not working
bug ready to test
You can not delete an account because of a foreign key constraint. `Caused by: org.postgresql.util.PSQLException: FEHLER: Aktualisieren oder Löschen in Tabelle „account“ verletzt Fremdschlüssel-Constraint „fk_gkqik7byxub0htwmyfilwa1gi“ von Tabelle „folder“ Detail: Auf Schlüssel (id)=(97) wird noch aus Tabelle „folder“ verwiesen. at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2270) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1998) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:570) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:420) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.jdbc2.AbstractJdbc2Statement.executeUpdate(AbstractJdbc2Statement.java:366) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at com.zaxxer.hikari.proxy.PreparedStatementProxy.executeUpdate(PreparedStatementProxy.java:61) ~[HikariCP-2.3.7.jar:na] at com.zaxxer.hikari.proxy.PreparedStatementJavassistProxy.executeUpdate(PreparedStatementJavassistProxy.java) ~[HikariCP-2.3.7.jar:na] at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.executeUpdate(ResultSetReturnImpl.java:208) ~[hibernate-core-4.3.10.Final.jar:4.3.10.Final] at org.hibernate.engine.jdbc.batch.internal.NonBatchingBatch.addToBatch(NonBatchingBatch.java:62) ~[hibernate-core-4.3.10.Final.jar:4.3.10.Final] `
1.0
Account Deletion not working - You can not delete an account because of a foreign key constraint. `Caused by: org.postgresql.util.PSQLException: FEHLER: Aktualisieren oder Löschen in Tabelle „account“ verletzt Fremdschlüssel-Constraint „fk_gkqik7byxub0htwmyfilwa1gi“ von Tabelle „folder“ Detail: Auf Schlüssel (id)=(97) wird noch aus Tabelle „folder“ verwiesen. at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2270) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1998) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:570) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:420) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at org.postgresql.jdbc2.AbstractJdbc2Statement.executeUpdate(AbstractJdbc2Statement.java:366) ~[postgresql-9.4-1201-jdbc41.jar:9.4] at com.zaxxer.hikari.proxy.PreparedStatementProxy.executeUpdate(PreparedStatementProxy.java:61) ~[HikariCP-2.3.7.jar:na] at com.zaxxer.hikari.proxy.PreparedStatementJavassistProxy.executeUpdate(PreparedStatementJavassistProxy.java) ~[HikariCP-2.3.7.jar:na] at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.executeUpdate(ResultSetReturnImpl.java:208) ~[hibernate-core-4.3.10.Final.jar:4.3.10.Final] at org.hibernate.engine.jdbc.batch.internal.NonBatchingBatch.addToBatch(NonBatchingBatch.java:62) ~[hibernate-core-4.3.10.Final.jar:4.3.10.Final] `
test
account deletion not working you can not delete an account because of a foreign key constraint caused by org postgresql util psqlexception fehler aktualisieren oder löschen in tabelle „account“ verletzt fremdschlüssel constraint „fk “ von tabelle „folder“ detail auf schlüssel id wird noch aus tabelle „folder“ verwiesen at org postgresql core queryexecutorimpl receiveerrorresponse queryexecutorimpl java at org postgresql core queryexecutorimpl processresults queryexecutorimpl java at org postgresql core queryexecutorimpl execute queryexecutorimpl java at org postgresql execute java at org postgresql executewithflags java at org postgresql executeupdate java at com zaxxer hikari proxy preparedstatementproxy executeupdate preparedstatementproxy java at com zaxxer hikari proxy preparedstatementjavassistproxy executeupdate preparedstatementjavassistproxy java at org hibernate engine jdbc internal resultsetreturnimpl executeupdate resultsetreturnimpl java at org hibernate engine jdbc batch internal nonbatchingbatch addtobatch nonbatchingbatch java
1
9,167
3,024,984,642
IssuesEvent
2015-08-03 03:23:29
piwik/piwik
https://api.github.com/repos/piwik/piwik
opened
Load Testing System
c: Tests & QA
It is not currently possible to test Piwik under high load. The current approach is to use xhprof while tracking one or many requests and while archiving. Since archiving happens once per hour and there is never more than once 'archiving process' running at once (other core:archive processes should continue the existing run), using xhprof alone is ok. Tracker performance, on the other hand is not so easily gauged. Sure, using xhprof on a single request or multiple requests will provide some useful information and allow you to optimize, but it says nothing about how the tracker will behave under high levels of concurrency. Simulating high load requires many concurrent tracking requests and special monitoring. # Enter Locust Locust ([http://locust.io/](http://locust.io/)) is a load testing framework that can simulate millions of concurrent users. It also looks very easy to use. Though it requires writing python code. (I'm ok w/ that, not sure if everyone else is.) We can create a system that uses EC2 & locust to automate load testing the tracker. Then after the tracker is tested, we can automatically launch archiving w/ xhprof using the data that was just tracked. This can provide is with performance regression testing under high load, but more importantly, we can test the effect plugins have on tracker & archiving performance automatically. By comparing the results of a test when a plugin is enabled to the test for core, we can see just how much overhead a plugin adds, and maybe even find ways to boost performance. # The System * Plugins should be able to create a custom locust profile describing test visits and test setup. For example, Goals would make sure goals are defined for the test websites being tracked, and define visits that triggered conversions. 
An example locust file w/ Piwik visit data might look like: ``` class VisitWithConversionTasks(TaskSet): @task def shortVisitWithConversion(self): self.client.get("?idsite=1&..."); # action 1 self.client.get("..."); # action 2 # ... @task(5) # occurs 5 times less than above def manualGoalConversion(self): self.client.get("?idsite=1&idgoal=1&..."); # ... class VisitWithConversion(Locust): weight = 1 task_set = VisitWithConversionTasks class VisitWithMultipleConversion(TaskSet): # ... class VisitWithMultipleConversion(Locust): weight = 3 # happens 3 times less frequently than visit with conversion task_set = VisitWithMultipleConversionTasks ``` * A new command `tests:load [--plugin=...]` should be created that will start up an EC2 instance w/ Piwik installed, then a EC2 instance w/ locust installed. Later, if it's needed to run tests against millions of concurrent visits, perhaps clusters can be created. [Note: Perhaps puppet files or something else can be used to specify different Piwik instance configurations. For example, normal w/ no redis cache, normal w/ redis cache, w/ queued tracking installed, etc.] * The command will run the load test and save metric data to the filesystem, then use xhprof on a core:archive run and save the profile. * If possible, the command should get database performance metrics. Something that can be used to pinpoint problematic queries or tell if caches aren't being effective.
1.0
Load Testing System - It is not currently possible to test Piwik under high load. The current approach is to use xhprof while tracking one or many requests and while archiving. Since archiving happens once per hour and there is never more than one 'archiving process' running at once (other core:archive processes should continue the existing run), using xhprof alone is ok. Tracker performance, on the other hand, is not so easily gauged. Sure, using xhprof on a single request or multiple requests will provide some useful information and allow you to optimize, but it says nothing about how the tracker will behave under high levels of concurrency. Simulating high load requires many concurrent tracking requests and special monitoring. # Enter Locust Locust ([http://locust.io/](http://locust.io/)) is a load testing framework that can simulate millions of concurrent users. It also looks very easy to use, though it requires writing Python code. (I'm ok w/ that, not sure if everyone else is.) We can create a system that uses EC2 & locust to automate load testing the tracker. Then after the tracker is tested, we can automatically launch archiving w/ xhprof using the data that was just tracked. This can provide us with performance regression testing under high load, but more importantly, we can test the effect plugins have on tracker & archiving performance automatically. By comparing the results of a test when a plugin is enabled to the test for core, we can see just how much overhead a plugin adds, and maybe even find ways to boost performance. # The System * Plugins should be able to create a custom locust profile describing test visits and test setup. For example, Goals would make sure goals are defined for the test websites being tracked, and define visits that triggered conversions. 
An example locust file w/ Piwik visit data might look like: ``` class VisitWithConversionTasks(TaskSet): @task def shortVisitWithConversion(self): self.client.get("?idsite=1&..."); # action 1 self.client.get("..."); # action 2 # ... @task(5) # occurs 5 times less than above def manualGoalConversion(self): self.client.get("?idsite=1&idgoal=1&..."); # ... class VisitWithConversion(Locust): weight = 1 task_set = VisitWithConversionTasks class VisitWithMultipleConversion(TaskSet): # ... class VisitWithMultipleConversion(Locust): weight = 3 # happens 3 times less frequently than visit with conversion task_set = VisitWithMultipleConversionTasks ``` * A new command `tests:load [--plugin=...]` should be created that will start up an EC2 instance w/ Piwik installed, then a EC2 instance w/ locust installed. Later, if it's needed to run tests against millions of concurrent visits, perhaps clusters can be created. [Note: Perhaps puppet files or something else can be used to specify different Piwik instance configurations. For example, normal w/ no redis cache, normal w/ redis cache, w/ queued tracking installed, etc.] * The command will run the load test and save metric data to the filesystem, then use xhprof on a core:archive run and save the profile. * If possible, the command should get database performance metrics. Something that can be used to pinpoint problematic queries or tell if caches aren't being effective.
test
load testing system it is not currently possible to test piwik under high load the current approach is to use xhprof while tracking one or many requests and while archiving since archiving happens once per hour and there is never more than one archiving process running at once other core archive processes should continue the existing run using xhprof alone is ok tracker performance on the other hand is not so easily gauged sure using xhprof on a single request or multiple requests will provide some useful information and allow you to optimize but it says nothing about how the tracker will behave under high levels of concurrency simulating high load requires many concurrent tracking requests and special monitoring enter locust locust is a load testing framework that can simulate millions of concurrent users it also looks very easy to use though it requires writing python code i m ok w that not sure if everyone else is we can create a system that uses locust to automate load testing the tracker then after the tracker is tested we can automatically launch archiving w xhprof using the data that was just tracked this can provide us with performance regression testing under high load but more importantly we can test the effect plugins have on tracker archiving performance automatically by comparing the results of a test when a plugin is enabled to the test for core we can see just how much overhead a plugin adds and maybe even find ways to boost performance the system plugins should be able to create a custom locust profile describing test visits and test setup for example goals would make sure goals are defined for the test websites being tracked and define visits that triggered conversions an example locust file w piwik visit data might look like class visitwithconversiontasks taskset task def shortvisitwithconversion self self client get idsite action self client get action task occurs times less than above def manualgoalconversion self self client get idsite idgoal 
class visitwithconversion locust weight task set visitwithconversiontasks class visitwithmultipleconversion taskset class visitwithmultipleconversion locust weight happens times less frequently than visit with conversion task set visitwithmultipleconversiontasks a new command tests load should be created that will start up an instance w piwik installed then a instance w locust installed later if it s needed to run tests against millions of concurrent visits perhaps clusters can be created the command will run the load test and save metric data to the filesystem then use xhprof on a core archive run and save the profile if possible the command should get database performance metrics something that can be used to pinpoint problematic queries or tell if caches aren t being effective
1
27,119
4,281,472,882
IssuesEvent
2016-07-15 03:11:43
knieper/crltmich
https://api.github.com/repos/knieper/crltmich
closed
2 fields not working in grant report reminder email
Grants ready for testing
In the reminder email that goes out to grant recipients to remind them of a report due soon or overdue, two fields (grant title and report due date) are not merging into the email that goes out. See example below. Missing fields are indicated with *****. ________________________________________ Dear Colleague, In 2015, you received funding from the Lecturers' Professional Development Fund for the project titled: [field_grant_application_title]***** Grantee(s): Susan Crabb Our records indicate that your project has concluded. CRLT requires grantees to submit final reports no later than 3 months after a project’s conclusion. Please remember to submit your final grant report by *****, using CRLT's Kerberos Password protected report website: https://crlt.umich.edu/mygrants Once you have logged in and selected the appropriate project, you will be asked to recap the goals of your project; detail the achievements of the project (including the number of students and/or courses affected by the project); discuss any plans for continuation and dissemination of the project; and provide advice to your U-M colleagues considering similar work. All reports will be published on CRLT's website to assist your U-M faculty colleagues in improving teaching and learning across campus: http://crlt.umich.edu/viewgrantreports. If you have any questions, please e-mail us (crltgrants@umich.edu). We hope your project has been successful, and we look forward to receiving your report. Best regards, Center for Research on Learning and Teaching (CRLT)
1.0
2 fields not working in grant report reminder email - In the reminder email that goes out to grant recipients to remind them of a report due soon or overdue, two fields (grant title and report due date) are not merging into the email that goes out. See example below. Missing fields are indicated with *****. ________________________________________ Dear Colleague, In 2015, you received funding from the Lecturers' Professional Development Fund for the project titled: [field_grant_application_title]***** Grantee(s): Susan Crabb Our records indicate that your project has concluded. CRLT requires grantees to submit final reports no later than 3 months after a project’s conclusion. Please remember to submit your final grant report by *****, using CRLT's Kerberos Password protected report website: https://crlt.umich.edu/mygrants Once you have logged in and selected the appropriate project, you will be asked to recap the goals of your project; detail the achievements of the project (including the number of students and/or courses affected by the project); discuss any plans for continuation and dissemination of the project; and provide advice to your U-M colleagues considering similar work. All reports will be published on CRLT's website to assist your U-M faculty colleagues in improving teaching and learning across campus: http://crlt.umich.edu/viewgrantreports. If you have any questions, please e-mail us (crltgrants@umich.edu). We hope your project has been successful, and we look forward to receiving your report. Best regards, Center for Research on Learning and Teaching (CRLT)
test
fields not working in grant report reminder email in the reminder email that goes out to grant recipients to remind them of a report due soon or overdue two fields grant title and report due date are not merging into the email that goes out see example below missing fields are indicated with dear colleague in you received funding from the lecturers professional development fund for the project titled grantee s susan crabb our records indicate that your project has concluded crlt requires grantees to submit final reports no later than months after a project’s conclusion please remember to submit your final grant report by using crlt s kerberos password protected report website once you have logged in and selected the appropriate project you will be asked to recap the goals of your project detail the achievements of the project including the number of students and or courses affected by the project discuss any plans for continuation and dissemination of the project and provide advice to your u m colleagues considering similar work all reports will be published on crlt s website to assist your u m faculty colleagues in improving teaching and learning across campus if you have any questions please e mail us crltgrants umich edu we hope your project has been successful and we look forward to receiving your report best regards center for research on learning and teaching crlt
1
199,391
15,765,842,288
IssuesEvent
2021-03-31 14:30:48
snowplow/snowplow-javascript-tracker
https://api.github.com/repos/snowplow/snowplow-javascript-tracker
closed
Test Snowplow setup in BrightTag
category:tag_management type:documentation
Note that at the moment, Snowplow is not integrated in BrightTag. However, we know that it's possible to integrate the tags manually, because the guys at Burberry have done it. I have contacts with the folks at BrightTag I can share. Output of this phase of work should be a tagging guide to integrating Snowplow with BrightTag
1.0
Test Snowplow setup in BrightTag - Note that at the moment, Snowplow is not integrated in BrightTag. However, we know that it's possible to integrate the tags manually, because the guys at Burberry have done it. I have contacts with the folks at BrightTag I can share. Output of this phase of work should be a tagging guide to integrating Snowplow with BrightTag
non_test
test snowplow setup in brighttag note that at the moment snowplow is not integrated in brighttag however we know that it s possible to integrate the tags manually because the guys at burberry have done it i have contacts with the folks at brighttag i can share output of this phase of work should be a tagging guide to integrating snowplow with brighttag
0
765,339
26,842,094,894
IssuesEvent
2023-02-03 01:57:19
vmware/singleton
https://api.github.com/repos/vmware/singleton
closed
[REQUIREMENT][Go Client] Add Get API for Multiple Components Translations
area/go-client kind/feature priority/high
**Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] **Describe the solution you'd like** A clear and concise description of what you want to happen. **Describe alternatives you've considered** A clear and concise description of any alternative solution or feature you've considered. **Additional context** Add any other context or screenshots about the feature request here.
1.0
[REQUIREMENT][Go Client] Add Get API for Multiple Components Translations - **Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] **Describe the solution you'd like** A clear and concise description of what you want to happen. **Describe alternatives you've considered** A clear and concise description of any alternative solution or feature you've considered. **Additional context** Add any other context or screenshots about the feature request here.
non_test
add get api for multiple components translations is your feature request related to a problem please describe a clear and concise description of what the problem is ex i m always frustrated when describe the solution you d like a clear and concise description of what you want to happen describe alternatives you ve considered a clear and concise description of any alternative solution or feature you ve considered additional context add any other context or screenshots about the feature request here
0
123,926
26,356,975,322
IssuesEvent
2023-01-11 10:26:06
haproxy/haproxy
https://api.github.com/repos/haproxy/haproxy
closed
src/ssl_ocsp.c: null pointer dereference suspected by coverity
type: code-report
### Tool Name and Version coverity ### Code Report ```plain *** CID 1502454: Null pointer dereferences (REVERSE_INULL) /src/ssl_ocsp.c: 1435 in cli_release_update_ocsp_response() 1429 1430 static void cli_release_update_ocsp_response(struct appctx *appctx) 1431 { 1432 struct ocsp_cli_ctx *ctx = appctx->svcctx; 1433 struct httpclient *hc = ctx->hc; 1434 >>> CID 1502454: Null pointer dereferences (REVERSE_INULL) >>> Null-checking "ctx" suggests that it may be null, but it has already been dereferenced on all paths leading to the check. 1435 if (ctx) 1436 X509_free(ctx->ocsp_issuer); 1437 1438 /* Everything possible was printed on the CLI, we can destroy the client */ 1439 httpclient_stop_and_destroy(hc); 1440 ``` ### Additional Information _No response_ ### Output of `haproxy -vv` ```plain no ```
1.0
src/ssl_ocsp.c: null pointer dereference suspected by coverity - ### Tool Name and Version coverity ### Code Report ```plain *** CID 1502454: Null pointer dereferences (REVERSE_INULL) /src/ssl_ocsp.c: 1435 in cli_release_update_ocsp_response() 1429 1430 static void cli_release_update_ocsp_response(struct appctx *appctx) 1431 { 1432 struct ocsp_cli_ctx *ctx = appctx->svcctx; 1433 struct httpclient *hc = ctx->hc; 1434 >>> CID 1502454: Null pointer dereferences (REVERSE_INULL) >>> Null-checking "ctx" suggests that it may be null, but it has already been dereferenced on all paths leading to the check. 1435 if (ctx) 1436 X509_free(ctx->ocsp_issuer); 1437 1438 /* Everything possible was printed on the CLI, we can destroy the client */ 1439 httpclient_stop_and_destroy(hc); 1440 ``` ### Additional Information _No response_ ### Output of `haproxy -vv` ```plain no ```
non_test
src ssl ocsp c null pointer dereference suspected by coverity tool name and version coverity code report plain cid null pointer dereferences reverse inull src ssl ocsp c in cli release update ocsp response static void cli release update ocsp response struct appctx appctx struct ocsp cli ctx ctx appctx svcctx struct httpclient hc ctx hc cid null pointer dereferences reverse inull null checking ctx suggests that it may be null but it has already been dereferenced on all paths leading to the check if ctx free ctx ocsp issuer everything possible was printed on the cli we can destroy the client httpclient stop and destroy hc additional information no response output of haproxy vv plain no
0
306,403
26,465,329,168
IssuesEvent
2023-01-16 22:34:30
cicirello/Chips-n-Salsa
https://api.github.com/repos/cicirello/Chips-n-Salsa
closed
Refactor test cases in RestartSchedulesTests
testing refactor
## Summary Refactor test cases in RestartSchedulesTests based on RefactorFirst scan.
1.0
Refactor test cases in RestartSchedulesTests - ## Summary Refactor test cases in RestartSchedulesTests based on RefactorFirst scan.
test
refactor test cases in restartschedulestests summary refactor test cases in restartschedulestests based on refactorfirst scan
1
65,262
27,034,834,528
IssuesEvent
2023-02-12 16:53:29
LLEB-ME/gouv.fa
https://api.github.com/repos/LLEB-ME/gouv.fa
opened
Migrate from Zola to Velo
service maintenance T3: NORMAL
I think literally everyone realises the troubles we've had with Zola. I'm currently working on Velo, which is what this site is planned to move to. We can do a lot of the work prior to v0.1.0 since a lot of the things here do not need to change. Things that do are: - [ ] Convert config.toml to config.json The file structure is pretty simple— I'll attach the current documentation here. Feedback is welcome if more required/optional attributes are needed native to the tool. An additional config file for allowing this site to be rendered at both farer.group and gouv.fa would be nice— the `extends` element would be how we do this. - [ ] Convert templates to [Nunjucks](https://mozilla.github.io/nunjucks/) Velo uses the Nunjucks templating engine, which is VERY similar to Tera, Liquid, et al. It should be a very low effort change. Some things to keep in mind are that I haven't decided what additional filters and functions will come with Velo. Two that are relatively safe to assume are `get_section(pathtosectionfromcontentfolder)` and `get_page(pathtopagefromcontentfolder)`. - [ ] Update `ansible.yml` and/or `gouv-deploy.yml` to be ready for Velo I plan to make a Docker container via GitHub Packages which can make this easier. The standard build command is `velo build`. Tying in other config files is done with `velo build -c PATHTOCONFIGFILE` I can answer any questions as we make progress on migration. Velo should have v0.1.0 sometime in March, ideally. Ref: [velo/docs/configuration.md](https://github.com/LLEB-ME/gouv.fa/files/10716632/configuration.md)
1.0
Migrate from Zola to Velo - I think literally everyone realises the troubles we've had with Zola. I'm currently working on Velo, which is what this site is planned to move to. We can do a lot of the work prior to v0.1.0 since a lot of the things here do not need to change. Things that do are: - [ ] Convert config.toml to config.json The file structure is pretty simple— I'll attach the current documentation here. Feedback is welcome if more required/optional attributes are needed native to the tool. An additional config file for allowing this site to be rendered at both farer.group and gouv.fa would be nice— the `extends` element would be how we do this. - [ ] Convert templates to [Nunjucks](https://mozilla.github.io/nunjucks/) Velo uses the Nunjucks templating engine, which is VERY similar to Tera, Liquid, et al. It should be a very low effort change. Some things to keep in mind are that I haven't decided what additional filters and functions will come with Velo. Two that are relatively safe to assume are `get_section(pathtosectionfromcontentfolder)` and `get_page(pathtopagefromcontentfolder)`. - [ ] Update `ansible.yml` and/or `gouv-deploy.yml` to be ready for Velo I plan to make a Docker container via GitHub Packages which can make this easier. The standard build command is `velo build`. Tying in other config files is done with `velo build -c PATHTOCONFIGFILE` I can answer any questions as we make progress on migration. Velo should have v0.1.0 sometime in March, ideally. Ref: [velo/docs/configuration.md](https://github.com/LLEB-ME/gouv.fa/files/10716632/configuration.md)
non_test
migrate from zola to velo i think literally everyone realises the troubles we ve had with zola i m currently working on velo which is what this site is planned to move to we can do a lot of the work prior to since a lot of the things here do not need to change things that do are convert config toml to config json the file structure is pretty simple— i ll attach the current documentation here feedback is welcome if more required optional attributes are needed native to the tool an additional config file for allowing this site to be rendered at both farer group and gouv fa would be nice— the extends element would be how we do this convert templates to velo uses the nunjucks templating engine which is very similar to tera liquid et al it should be a very low effort change some things to keep in mind are that i haven t decided what additional filters and functions will come with velo two that are relatively safe to assume are get section pathtosectionfromcontentfolder and get page pathtopagefromcontentfolder update ansible yml and or gouv deploy yml to be ready for velo i plan to make a docker container via github packages which can make this easier the standard build command is velo build tying in other config files is done with velo build c pathtoconfigfile i can answer any questions as we make progress on migration velo should have sometime in march ideally ref
0
87,649
15,788,939,950
IssuesEvent
2021-04-01 21:40:59
MicrosoftDocs/sql-docs
https://api.github.com/repos/MicrosoftDocs/sql-docs
closed
Shared Secret Generation
Pri2 assigned-to-author doc-bug security/tech sql/prod
You need examples (or a link to another docs article) of how to generate shared secrets correctly in Windows Server (and possibly Linux), both On-Prem and in the cloud. Database Admins do not have the requisite experience to assume that they can create a C# program or call a method to do this unless you were to give a T-SQL function or PowerShell cmdlet example. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 0d903c17-112d-c463-0af8-e1b6799df66f * Version Independent ID: 008ccdde-b62c-92a1-98fd-377ad97a769d * Content: [Create Identical Symmetric Keys on Two Servers - SQL Server](https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/create-identical-symmetric-keys-on-two-servers?view=sql-server-ver15#feedback) * Content Source: [docs/relational-databases/security/encryption/create-identical-symmetric-keys-on-two-servers.md](https://github.com/MicrosoftDocs/sql-docs/blob/live/docs/relational-databases/security/encryption/create-identical-symmetric-keys-on-two-servers.md) * Product: **sql** * Technology: **security** * GitHub Login: @jaszymas * Microsoft Alias: **jaszymas**
True
Shared Secret Generation - You need examples (or a link to another docs article) of how to generate shared secrets correctly in Windows Server (and possibly Linux), both On-Prem and in the cloud. Database Admins do not have the requisite experience to assume that they can create a C# program or call a method to do this unless you were to give a T-SQL function or PowerShell cmdlet example. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 0d903c17-112d-c463-0af8-e1b6799df66f * Version Independent ID: 008ccdde-b62c-92a1-98fd-377ad97a769d * Content: [Create Identical Symmetric Keys on Two Servers - SQL Server](https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/create-identical-symmetric-keys-on-two-servers?view=sql-server-ver15#feedback) * Content Source: [docs/relational-databases/security/encryption/create-identical-symmetric-keys-on-two-servers.md](https://github.com/MicrosoftDocs/sql-docs/blob/live/docs/relational-databases/security/encryption/create-identical-symmetric-keys-on-two-servers.md) * Product: **sql** * Technology: **security** * GitHub Login: @jaszymas * Microsoft Alias: **jaszymas**
non_test
shared secret generation you need examples or a link to another docs article of how to generate shared secrets correctly in windows server and possibly linux both on prem and in the cloud database admins do not have the requisite experience to assume that they can create a c program or call a method to do this unless you were to give a t sql function or powershell cmdlet example document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product sql technology security github login jaszymas microsoft alias jaszymas
0
80,207
15,365,973,619
IssuesEvent
2021-03-02 00:34:12
AUSoftAndreas/tetris
https://api.github.com/repos/AUSoftAndreas/tetris
closed
[CODE] Create lib/models/game.dart
Code
Use lib/ideas/classes.drawio for reference Every method can just look like this: ```Dart void someFunction() { log('Shape: someFunction'); } ```
1.0
[CODE] Create lib/models/game.dart - Use lib/ideas/classes.drawio for reference Every method can just look like this: ```Dart void someFunction() { log('Shape: someFunction'); } ```
non_test
create lib models game dart use lib ideas classes drawio for reference every method can just look like this dart void somefunction log shape somefunction
0
113,332
9,636,358,368
IssuesEvent
2019-05-16 05:37:50
exportarts/ngx-prismic
https://api.github.com/repos/exportarts/ngx-prismic
opened
MappingFunction should be extended to support multiple OperatorFunctions
enhancement tests
https://github.com/exportarts/ngx-prismic/blob/3ce4512cb313c1f12e90b6c04c5af60d381bfa6f/lib/prismic-client/src/services/prismic.service.ts#L66 The current implementation only accepts a function for [rxjs#map](https://www.learnrxjs.io/operators/transformation/map.html). It should be possible to provide multiple `OperatorFunction`s to support more complex piping operations.
1.0
MappingFunction should be extended to support multiple OperatorFunctions - https://github.com/exportarts/ngx-prismic/blob/3ce4512cb313c1f12e90b6c04c5af60d381bfa6f/lib/prismic-client/src/services/prismic.service.ts#L66 The current implementation only accepts a function for [rxjs#map](https://www.learnrxjs.io/operators/transformation/map.html). It should be possible to provide multiple `OperatorFunction`s to support more complex piping operations.
test
mappingfunction should be extended to support multiple operatorfunctions the current implementation only accepts a function for it should be possible to provide multiple operatorfunction s to support more complex piping operations
1
29,407
4,501,210,417
IssuesEvent
2016-09-01 08:36:50
mattbearman/lime
https://api.github.com/repos/mattbearman/lime
closed
BugMuncher Feedback Report
bug BugMuncher problem test
## Details ## **Submitted:** April 18, 2016 10:24 **Category:** problem **Sender Email:** robalbert67@gmail.com **Website:** BugMuncher App **URL:** https://app.bugmuncher.com/profiles **Operating System:** Windows 10 **Browser:** Chrome 49.0.2623.112 **Browser Size:** 1366 x 643 **User Agent:** Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.112 Safari/537.36 **Description:** test bugmuncher ## Screenshot ## ![Screenshot](http://app.bugmuncher.com/api/v1/screenshots/db08c1348a8a1905f1cf14b30f2703370dbfea7f.png) ## Custom Data ## **user_id:** 1320 **account_id:** 887 **plan:** BugMuncher Corporate ## Browser Plugins ## Shockwave Flash ## Events ## **method:** GET **url:** https://app.bugmuncher.com/user/dashboard **timestamp:** Mon Apr 18 2016 20:21:09 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **content:** Feedback Button Clicked **timestamp:** Mon Apr 18 2016 20:21:51 GMT+1000 (AUS Eastern Standard Time) **type:** bugmuncher --- **method:** GET **url:** https://app.bugmuncher.com/user/dashboard?tour=part_1 **timestamp:** Mon Apr 18 2016 20:22:00 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/profiles?tour=part_2 **timestamp:** Mon Apr 18 2016 20:22:18 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/profiles/new?tour=part_3 **timestamp:** Mon Apr 18 2016 20:22:38 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account/subscription/edit **timestamp:** Mon Apr 18 2016 20:23:20 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account **timestamp:** Mon Apr 18 2016 20:23:49 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account/edit **timestamp:** Mon Apr 18 2016 20:23:55 GMT+1000 (AUS 
Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account **timestamp:** Mon Apr 18 2016 20:24:00 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account/edit **timestamp:** Mon Apr 18 2016 20:24:04 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/user/dashboard **timestamp:** Mon Apr 18 2016 20:24:11 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/profiles **timestamp:** Mon Apr 18 2016 20:24:18 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **content:** Feedback Button Clicked **timestamp:** Mon Apr 18 2016 20:24:25 GMT+1000 (AUS Eastern Standard Time) **type:** bugmuncher --- **type:** bugmuncher **content:** Feedback Report Submitted **timestamp:** Mon Apr 18 2016 20:24:45 GMT+1000 (AUS Eastern Standard Time) ---
1.0
BugMuncher Feedback Report - ## Details ## **Submitted:** April 18, 2016 10:24 **Category:** problem **Sender Email:** robalbert67@gmail.com **Website:** BugMuncher App **URL:** https://app.bugmuncher.com/profiles **Operating System:** Windows 10 **Browser:** Chrome 49.0.2623.112 **Browser Size:** 1366 x 643 **User Agent:** Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.112 Safari/537.36 **Description:** test bugmuncher ## Screenshot ## ![Screenshot](http://app.bugmuncher.com/api/v1/screenshots/db08c1348a8a1905f1cf14b30f2703370dbfea7f.png) ## Custom Data ## **user_id:** 1320 **account_id:** 887 **plan:** BugMuncher Corporate ## Browser Plugins ## Shockwave Flash ## Events ## **method:** GET **url:** https://app.bugmuncher.com/user/dashboard **timestamp:** Mon Apr 18 2016 20:21:09 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **content:** Feedback Button Clicked **timestamp:** Mon Apr 18 2016 20:21:51 GMT+1000 (AUS Eastern Standard Time) **type:** bugmuncher --- **method:** GET **url:** https://app.bugmuncher.com/user/dashboard?tour=part_1 **timestamp:** Mon Apr 18 2016 20:22:00 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/profiles?tour=part_2 **timestamp:** Mon Apr 18 2016 20:22:18 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/profiles/new?tour=part_3 **timestamp:** Mon Apr 18 2016 20:22:38 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account/subscription/edit **timestamp:** Mon Apr 18 2016 20:23:20 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account **timestamp:** Mon Apr 18 2016 20:23:49 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account/edit **timestamp:** Mon Apr 18 
2016 20:23:55 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account **timestamp:** Mon Apr 18 2016 20:24:00 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/account/edit **timestamp:** Mon Apr 18 2016 20:24:04 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/user/dashboard **timestamp:** Mon Apr 18 2016 20:24:11 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **method:** GET **url:** https://app.bugmuncher.com/profiles **timestamp:** Mon Apr 18 2016 20:24:18 GMT+1000 (AUS Eastern Standard Time) **type:** page_load --- **content:** Feedback Button Clicked **timestamp:** Mon Apr 18 2016 20:24:25 GMT+1000 (AUS Eastern Standard Time) **type:** bugmuncher --- **type:** bugmuncher **content:** Feedback Report Submitted **timestamp:** Mon Apr 18 2016 20:24:45 GMT+1000 (AUS Eastern Standard Time) ---
test
bugmuncher feedback report details submitted april category problem sender email gmail com website bugmuncher app url operating system windows browser chrome browser size x user agent mozilla windows nt applewebkit khtml like gecko chrome safari description test bugmuncher screenshot custom data user id account id plan bugmuncher corporate browser plugins shockwave flash events method get url timestamp mon apr gmt aus eastern standard time type page load content feedback button clicked timestamp mon apr gmt aus eastern standard time type bugmuncher method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load method get url timestamp mon apr gmt aus eastern standard time type page load content feedback button clicked timestamp mon apr gmt aus eastern standard time type bugmuncher type bugmuncher content feedback report submitted timestamp mon apr gmt aus eastern standard time
1
38,634
5,194,420,987
IssuesEvent
2017-01-23 03:35:11
openshift/origin
https://api.github.com/repos/openshift/origin
closed
deployment failed: caused by a config change
component/deployments kind/test-flake priority/P2
https://ci.openshift.redhat.com/jenkins/job/test_pull_requests_origin_conformance/5805/consoleFull#103359503156cbb9a5e4b02b88ae8c2f77 ``` [Fail] deploymentconfigs when run iteratively [It] should only deploy the last deployment [Conformance] /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:120 [Fail] deploymentconfigs when run iteratively [It] should immediately start a new deployment [Conformance] /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:167 ``` ``` should only deploy the last deployment [Conformance] /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:131 Expected error: <*errors.errorString | 0xc8209a87b0>: { s: "deployment failed: caused by a config change", } deployment failed: caused by a config change not to have occurred /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:120 ``` ``` should immediately start a new deployment [Conformance] [It] /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:190 Expected error: <*errors.errorString | 0xc8200ce130>: { s: "timed out waiting for the condition", } timed out waiting for the condition not to have occurred /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:167 ```
1.0
deployment failed: caused by a config change - https://ci.openshift.redhat.com/jenkins/job/test_pull_requests_origin_conformance/5805/consoleFull#103359503156cbb9a5e4b02b88ae8c2f77 ``` [Fail] deploymentconfigs when run iteratively [It] should only deploy the last deployment [Conformance] /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:120 [Fail] deploymentconfigs when run iteratively [It] should immediately start a new deployment [Conformance] /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:167 ``` ``` should only deploy the last deployment [Conformance] /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:131 Expected error: <*errors.errorString | 0xc8209a87b0>: { s: "deployment failed: caused by a config change", } deployment failed: caused by a config change not to have occurred /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:120 ``` ``` should immediately start a new deployment [Conformance] [It] /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:190 Expected error: <*errors.errorString | 0xc8200ce130>: { s: "timed out waiting for the condition", } timed out waiting for the condition not to have occurred /data/src/github.com/openshift/origin/test/extended/deployments/deployments.go:167 ```
test
deployment failed caused by a config change deploymentconfigs when run iteratively should only deploy the last deployment data src github com openshift origin test extended deployments deployments go deploymentconfigs when run iteratively should immediately start a new deployment data src github com openshift origin test extended deployments deployments go should only deploy the last deployment data src github com openshift origin test extended deployments deployments go expected error s deployment failed caused by a config change deployment failed caused by a config change not to have occurred data src github com openshift origin test extended deployments deployments go should immediately start a new deployment data src github com openshift origin test extended deployments deployments go expected error s timed out waiting for the condition timed out waiting for the condition not to have occurred data src github com openshift origin test extended deployments deployments go
1
50,886
6,130,830,889
IssuesEvent
2017-06-24 09:32:32
rust-lang/rust
https://api.github.com/repos/rust-lang/rust
closed
Inconsistency with “at least one non-builtin trait is required for an object type”
E-needstest
Auto traits are alone allowed as object types in some cases and not others. The following two lines are fine: (Using Send as an object type) ```rust type Y = Send; fn main() { let x = &1 as &Send; } ``` The following line has an error: ```rust type Z = for<'x> Send; //~ error[E0224]: at least one non-builtin trait is required for an object type ``` The second error seems to speak against the first two examples. Tested using `rustc 1.14.0-nightly (3f4408347 2016-10-27)`
1.0
Inconsistency with “at least one non-builtin trait is required for an object type” - Auto traits are alone allowed as object types in some cases and not others. The following two lines are fine: (Using Send as an object type) ```rust type Y = Send; fn main() { let x = &1 as &Send; } ``` The following line has an error: ```rust type Z = for<'x> Send; //~ error[E0224]: at least one non-builtin trait is required for an object type ``` The second error seems to speak against the first two examples. Tested using `rustc 1.14.0-nightly (3f4408347 2016-10-27)`
test
inconsistency with “at least one non builtin trait is required for an object type” auto traits are alone allowed as object types in some cases and not others the following two lines are fine using send as an object type rust type y send fn main let x as send the following line has an error rust type z for send error at least one non builtin trait is required for an object type the second error seems to speak against the first two examples tested using rustc nightly
1
140,086
18,893,691,481
IssuesEvent
2021-11-15 15:41:41
Zolyn/vuepress-plugin-waline
https://api.github.com/repos/Zolyn/vuepress-plugin-waline
closed
CVE-2021-32803 (High) detected in tar-6.1.0.tgz
security vulnerability
## CVE-2021-32803 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-6.1.0.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-6.1.0.tgz">https://registry.npmjs.org/tar/-/tar-6.1.0.tgz</a></p> <p>Path to dependency file: vuepress-plugin-waline/package.json</p> <p>Path to vulnerable library: vuepress-plugin-waline/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - minivaline-5.1.7.tgz (Root Library) - snyk-1.618.0.tgz - :x: **tar-6.1.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2. <p>Publish Date: 2021-08-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p> <p>Release Date: 2021-08-03</p> <p>Fix Resolution: tar - 3.2.3, 4.4.15, 5.0.7, 6.1.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-32803 (High) detected in tar-6.1.0.tgz - ## CVE-2021-32803 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-6.1.0.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-6.1.0.tgz">https://registry.npmjs.org/tar/-/tar-6.1.0.tgz</a></p> <p>Path to dependency file: vuepress-plugin-waline/package.json</p> <p>Path to vulnerable library: vuepress-plugin-waline/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - minivaline-5.1.7.tgz (Root Library) - snyk-1.618.0.tgz - :x: **tar-6.1.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2. <p>Publish Date: 2021-08-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p> <p>Release Date: 2021-08-03</p> <p>Fix Resolution: tar - 3.2.3, 4.4.15, 5.0.7, 6.1.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href path to dependency file vuepress plugin waline package json path to vulnerable library vuepress plugin waline node modules tar package json dependency hierarchy minivaline tgz root library snyk tgz x tar tgz vulnerable library found in base branch main vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite vulnerability via insufficient symlink protection node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory this order of operations resulted in the directory being created and added to the node tar directory cache when a directory is present in the directory cache subsequent calls to mkdir for that directory are skipped however this is also where node tar checks for symlinks occur by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite this issue was addressed in releases and publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on 
scores click a href suggested fix type upgrade version origin a href release date fix resolution tar step up your open source security game with whitesource
0
32,339
12,103,490,084
IssuesEvent
2020-04-20 18:29:24
wrbejar/JavaVulnerableLab
https://api.github.com/repos/wrbejar/JavaVulnerableLab
opened
CVE-2015-7501 (High) detected in commons-collections-3.2.1.jar
security vulnerability
## CVE-2015-7501 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-collections-3.2.1.jar</b></p></summary> <p>Types that extend and augment the Java Collections Framework.</p> <p>Path to vulnerable library: _depth_0/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/commons-collections-3.2.1.jar,_depth_0/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/commons-collections-3.2.1.jar</p> <p> Dependency Hierarchy: - :x: **commons-collections-3.2.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/wrbejar/JavaVulnerableLab/commit/2ed76cff8a50599d42c46efd42f5e92c03e05850">2ed76cff8a50599d42c46efd42f5e92c03e05850</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Red Hat JBoss A-MQ 6.x; BPM Suite (BPMS) 6.x; BRMS 6.x and 5.x; Data Grid (JDG) 6.x; Data Virtualization (JDV) 6.x and 5.x; Enterprise Application Platform 6.x, 5.x, and 4.3.x; Fuse 6.x; Fuse Service Works (FSW) 6.x; Operations Network (JBoss ON) 3.x; Portal 6.x; SOA Platform (SOA-P) 5.x; Web Server (JWS) 3.x; Red Hat OpenShift/xPAAS 3.x; and Red Hat Subscription Asset Manager 1.3 allow remote attackers to execute arbitrary commands via a crafted serialized Java object, related to the Apache Commons Collections (ACC) library. 
<p>Publish Date: 2017-11-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-7501>CVE-2015-7501</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1279330">https://bugzilla.redhat.com/show_bug.cgi?id=1279330</a></p> <p>Release Date: 2017-11-09</p> <p>Fix Resolution: commons-collections:commons-collections:3.2.2;org.apache.commons:commons-collections4:4.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"commons-collections","packageName":"commons-collections","packageVersion":"3.2.1","isTransitiveDependency":false,"dependencyTree":"commons-collections:commons-collections:3.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"commons-collections:commons-collections:3.2.2;org.apache.commons:commons-collections4:4.1"}],"vulnerabilityIdentifier":"CVE-2015-7501","vulnerabilityDetails":"Red Hat JBoss A-MQ 6.x; BPM Suite (BPMS) 6.x; BRMS 6.x and 5.x; Data Grid (JDG) 6.x; Data Virtualization (JDV) 6.x and 5.x; Enterprise Application Platform 6.x, 5.x, and 4.3.x; Fuse 6.x; Fuse Service Works (FSW) 6.x; Operations Network (JBoss ON) 3.x; Portal 6.x; 
SOA Platform (SOA-P) 5.x; Web Server (JWS) 3.x; Red Hat OpenShift/xPAAS 3.x; and Red Hat Subscription Asset Manager 1.3 allow remote attackers to execute arbitrary commands via a crafted serialized Java object, related to the Apache Commons Collections (ACC) library.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-7501","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2015-7501 (High) detected in commons-collections-3.2.1.jar - ## CVE-2015-7501 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-collections-3.2.1.jar</b></p></summary> <p>Types that extend and augment the Java Collections Framework.</p> <p>Path to vulnerable library: _depth_0/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/commons-collections-3.2.1.jar,_depth_0/JavaVulnerableLab/target/JavaVulnerableLab/WEB-INF/lib/commons-collections-3.2.1.jar</p> <p> Dependency Hierarchy: - :x: **commons-collections-3.2.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/wrbejar/JavaVulnerableLab/commit/2ed76cff8a50599d42c46efd42f5e92c03e05850">2ed76cff8a50599d42c46efd42f5e92c03e05850</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Red Hat JBoss A-MQ 6.x; BPM Suite (BPMS) 6.x; BRMS 6.x and 5.x; Data Grid (JDG) 6.x; Data Virtualization (JDV) 6.x and 5.x; Enterprise Application Platform 6.x, 5.x, and 4.3.x; Fuse 6.x; Fuse Service Works (FSW) 6.x; Operations Network (JBoss ON) 3.x; Portal 6.x; SOA Platform (SOA-P) 5.x; Web Server (JWS) 3.x; Red Hat OpenShift/xPAAS 3.x; and Red Hat Subscription Asset Manager 1.3 allow remote attackers to execute arbitrary commands via a crafted serialized Java object, related to the Apache Commons Collections (ACC) library. 
<p>Publish Date: 2017-11-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-7501>CVE-2015-7501</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1279330">https://bugzilla.redhat.com/show_bug.cgi?id=1279330</a></p> <p>Release Date: 2017-11-09</p> <p>Fix Resolution: commons-collections:commons-collections:3.2.2;org.apache.commons:commons-collections4:4.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"commons-collections","packageName":"commons-collections","packageVersion":"3.2.1","isTransitiveDependency":false,"dependencyTree":"commons-collections:commons-collections:3.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"commons-collections:commons-collections:3.2.2;org.apache.commons:commons-collections4:4.1"}],"vulnerabilityIdentifier":"CVE-2015-7501","vulnerabilityDetails":"Red Hat JBoss A-MQ 6.x; BPM Suite (BPMS) 6.x; BRMS 6.x and 5.x; Data Grid (JDG) 6.x; Data Virtualization (JDV) 6.x and 5.x; Enterprise Application Platform 6.x, 5.x, and 4.3.x; Fuse 6.x; Fuse Service Works (FSW) 6.x; Operations Network (JBoss ON) 3.x; Portal 6.x; 
SOA Platform (SOA-P) 5.x; Web Server (JWS) 3.x; Red Hat OpenShift/xPAAS 3.x; and Red Hat Subscription Asset Manager 1.3 allow remote attackers to execute arbitrary commands via a crafted serialized Java object, related to the Apache Commons Collections (ACC) library.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-7501","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_test
cve high detected in commons collections jar cve high severity vulnerability vulnerable library commons collections jar types that extend and augment the java collections framework path to vulnerable library depth javavulnerablelab target javavulnerablelab web inf lib commons collections jar depth javavulnerablelab target javavulnerablelab web inf lib commons collections jar dependency hierarchy x commons collections jar vulnerable library found in head commit a href vulnerability details red hat jboss a mq x bpm suite bpms x brms x and x data grid jdg x data virtualization jdv x and x enterprise application platform x x and x fuse x fuse service works fsw x operations network jboss on x portal x soa platform soa p x web server jws x red hat openshift xpaas x and red hat subscription asset manager allow remote attackers to execute arbitrary commands via a crafted serialized java object related to the apache commons collections acc library publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution commons collections commons collections org apache commons commons isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails red hat jboss a mq x bpm suite bpms x brms x and x data grid jdg x data virtualization jdv x and x enterprise application platform x x and x fuse x fuse service works fsw x operations network jboss on x portal x soa platform soa p x web server jws x red hat openshift xpaas x and red hat subscription asset manager allow remote attackers to execute arbitrary commands via a crafted serialized java object related to the apache commons collections 
acc library vulnerabilityurl
0
40,824
5,317,211,253
IssuesEvent
2017-02-13 21:55:32
ampproject/amphtml
https://api.github.com/repos/ampproject/amphtml
opened
Error Logging test flake
P1: High Priority Related to: Flaky Tests
``` Chrome 56.0.2924 (Linux 0.0.0) Logging rethrowAsync should rethrow a single error FAILED AssertionError: expected 'intended _reported_' to equal 'intended' at Context.<anonymous> (/tmp/ee8881a36732455386f0642b0f77b860.browserify:150303:32 <- /home/travis/build/ampproject/amphtml/test/functional/test-log.js:551:31) ```
1.0
Error Logging test flake - ``` Chrome 56.0.2924 (Linux 0.0.0) Logging rethrowAsync should rethrow a single error FAILED AssertionError: expected 'intended _reported_' to equal 'intended' at Context.<anonymous> (/tmp/ee8881a36732455386f0642b0f77b860.browserify:150303:32 <- /home/travis/build/ampproject/amphtml/test/functional/test-log.js:551:31) ```
test
error logging test flake chrome linux logging rethrowasync should rethrow a single error failed assertionerror expected intended reported to equal intended at context tmp browserify home travis build ampproject amphtml test functional test log js
1
53,349
6,717,551,365
IssuesEvent
2017-10-14 22:47:43
pjhampton/Gramophone
https://api.github.com/repos/pjhampton/Gramophone
opened
Style Gallery
Design Hacktoberfest
The **gallery post format** needs to be styled with basic bootstrap styles. You can get the dummy data from: https://codex.wordpress.org/Theme_Unit_Test ![screen shot 2017-10-14 at 10 28 53 pm](https://user-images.githubusercontent.com/8960296/31579953-fc72fdaa-b139-11e7-9890-c51438512bb7.png) If you are interested, please comment on the issue below 😄 👇
1.0
Style Gallery - The **gallery post format** needs to be styled with basic bootstrap styles. You can get the dummy data from: https://codex.wordpress.org/Theme_Unit_Test ![screen shot 2017-10-14 at 10 28 53 pm](https://user-images.githubusercontent.com/8960296/31579953-fc72fdaa-b139-11e7-9890-c51438512bb7.png) If you are interested, please comment on the issue below 😄 👇
non_test
style gallery the gallery post format needs to be styled with basic bootstrap styles you can get the dummy data from if you are interested please comment on the issue below 😄 👇
0
38,850
12,603,324,785
IssuesEvent
2020-06-11 13:18:34
jgeraigery/spark
https://api.github.com/repos/jgeraigery/spark
opened
CVE-2016-10735 (Medium) detected in bootstrap-2.1.0.min.js
security vulnerability
## CVE-2016-10735 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-2.1.0.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.1.0/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.1.0/bootstrap.min.js</a></p> <p>Path to vulnerable library: bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-2.1.0.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/jgeraigery/spark/commit/731da3b446d6f17beaa776a756bcc2ffc1397f43">731da3b446d6f17beaa776a756bcc2ffc1397f43</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041. <p>Publish Date: 2019-01-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735>CVE-2016-10735</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/commit/2ba83171807bdec8ce5235042e6abfc6906a1d09">https://github.com/twbs/bootstrap/commit/2ba83171807bdec8ce5235042e6abfc6906a1d09</a></p> <p>Release Date: 2019-01-09</p> <p>Fix Resolution: v3.4.0,v4.0.0-beta.2</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"2.1.0","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:2.1.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v3.4.0,v4.0.0-beta.2"}],"vulnerabilityIdentifier":"CVE-2016-10735","vulnerabilityDetails":"In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2016-10735 (Medium) detected in bootstrap-2.1.0.min.js - ## CVE-2016-10735 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-2.1.0.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.1.0/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/2.1.0/bootstrap.min.js</a></p> <p>Path to vulnerable library: bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-2.1.0.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/jgeraigery/spark/commit/731da3b446d6f17beaa776a756bcc2ffc1397f43">731da3b446d6f17beaa776a756bcc2ffc1397f43</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041. <p>Publish Date: 2019-01-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735>CVE-2016-10735</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/commit/2ba83171807bdec8ce5235042e6abfc6906a1d09">https://github.com/twbs/bootstrap/commit/2ba83171807bdec8ce5235042e6abfc6906a1d09</a></p> <p>Release Date: 2019-01-09</p> <p>Fix Resolution: v3.4.0,v4.0.0-beta.2</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"2.1.0","isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:2.1.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v3.4.0,v4.0.0-beta.2"}],"vulnerabilityIdentifier":"CVE-2016-10735","vulnerabilityDetails":"In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_test
cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library bootstrap min js dependency hierarchy x bootstrap min js vulnerable library found in head commit a href vulnerability details in bootstrap x before and x beta before beta xss is possible in the data target attribute a different vulnerability than cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution beta isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails in bootstrap x before and x beta before beta xss is possible in the data target attribute a different vulnerability than cve vulnerabilityurl
0
105,388
23,043,224,832
IssuesEvent
2022-07-23 13:34:35
nmrih/source-game
https://api.github.com/repos/nmrih/source-game
reopened
[public-1.10.0] Zombies can close doors
Status: Completed Type: Code Priority: Minimal
Zombies should not be able to use doors if the door is open (fully or partially). It leads to scenarios where zombies can pile up behind a door and immediately close it as it starts to open, making it impossible to go thru it.
1.0
[public-1.10.0] Zombies can close doors - Zombies should not be able to use doors if the door is open (fully or partially). It leads to scenarios where zombies can pile up behind a door and immediately close it as it starts to open, making it impossible to go thru it.
non_test
zombies can close doors zombies should not be able to use doors if the door is open fully or partially it leads to scenarios where zombies can pile up behind a door and immediately close it as it starts to open making it impossible to go thru it
0
139,565
11,273,907,428
IssuesEvent
2020-01-14 17:26:22
apache/pulsar
https://api.github.com/repos/apache/pulsar
closed
[go client] add go related tests and checks in jenkins
component/go component/test triage/week-35 type/task
**Is your feature request related to a problem? Please describe.** At present, Jenkins does not check the unit tests and code specifications related to go. When we modify the unit test of go or add new code, we can only check it locally. For some code formats and specifications, we have no way. Do a good job of specification and unification **Describe the solution you'd like** Regarding the check of the go code format, the more mature tool in the community is [gometalinter](https://github.com/alecthomas/gometalinter) **Additional context** check.sh ``` #!/bin/bash # The script does automatic checking on a Go package and its sub-packages, including: # 1. gofmt (http://golang.org/cmd/gofmt/) # 2. golint (https://github.com/golang/lint) # 3. go vet (http://golang.org/cmd/vet) # 4. gosimple (https://github.com/dominikh/go-simple) # 5. unconvert (https://github.com/mdempsky/unconvert) # # gometalinter (github.com/alecthomas/gometalinter) is used to run each static # checker. set -ex # Make sure gometalinter is installed and $GOPATH/bin is in your path. # $ go get -v github.com/alecthomas/gometalinter" # $ gometalinter --install" if [ ! -x "$(type -p gometalinter.v2)" ]; then exit 1 fi linter_targets=$(go list ./...) # Automatic checks test -z "$(gometalinter.v2 -j 4 --disable-all \ --enable=gofmt \ --enable=golint \ --enable=vet \ --enable=gosimple \ --enable=unconvert \ --deadline=10m $linter_targets 2>&1 | grep -v 'ALL_CAPS\|OP_' 2>&1 | tee /dev/stderr)" GO111MODULE=on go test $linter_targets ```
1.0
[go client] add go related tests and checks in jenkins - **Is your feature request related to a problem? Please describe.** At present, Jenkins does not check the unit tests and code specifications related to go. When we modify the unit test of go or add new code, we can only check it locally. For some code formats and specifications, we have no way. Do a good job of specification and unification **Describe the solution you'd like** Regarding the check of the go code format, the more mature tool in the community is [gometalinter](https://github.com/alecthomas/gometalinter) **Additional context** check.sh ``` #!/bin/bash # The script does automatic checking on a Go package and its sub-packages, including: # 1. gofmt (http://golang.org/cmd/gofmt/) # 2. golint (https://github.com/golang/lint) # 3. go vet (http://golang.org/cmd/vet) # 4. gosimple (https://github.com/dominikh/go-simple) # 5. unconvert (https://github.com/mdempsky/unconvert) # # gometalinter (github.com/alecthomas/gometalinter) is used to run each static # checker. set -ex # Make sure gometalinter is installed and $GOPATH/bin is in your path. # $ go get -v github.com/alecthomas/gometalinter" # $ gometalinter --install" if [ ! -x "$(type -p gometalinter.v2)" ]; then exit 1 fi linter_targets=$(go list ./...) # Automatic checks test -z "$(gometalinter.v2 -j 4 --disable-all \ --enable=gofmt \ --enable=golint \ --enable=vet \ --enable=gosimple \ --enable=unconvert \ --deadline=10m $linter_targets 2>&1 | grep -v 'ALL_CAPS\|OP_' 2>&1 | tee /dev/stderr)" GO111MODULE=on go test $linter_targets ```
test
add go related tests and checks in jenkins is your feature request related to a problem please describe at present jenkins does not check the unit tests and code specifications related to go when we modify the unit test of go or add new code we can only check it locally for some code formats and specifications we have no way do a good job of specification and unification describe the solution you d like regarding the check of the go code format the more mature tool in the community is additional context check sh bin bash the script does automatic checking on a go package and its sub packages including gofmt golint go vet gosimple unconvert gometalinter github com alecthomas gometalinter is used to run each static checker set ex make sure gometalinter is installed and gopath bin is in your path go get v github com alecthomas gometalinter gometalinter install if then exit fi linter targets go list automatic checks test z gometalinter j disable all enable gofmt enable golint enable vet enable gosimple enable unconvert deadline linter targets grep v all caps op tee dev stderr on go test linter targets
1
178,167
13,766,764,550
IssuesEvent
2020-10-07 14:56:34
mozilla-mobile/firefox-ios
https://api.github.com/repos/mozilla-mobile/firefox-ios
opened
[XCUITests] Update XCUITest scheme for iOS14
Test-Automation
Similarly to what we did to fix the[ Smoketest for iOS14,](https://github.com/mozilla-mobile/firefox-ios/issues/6969) we need to do the same for the general test suite. There are some test failures due to similar issues on Bitrise, see [logs](https://addons-testing.bitrise.io/builds/c9e9f069dd5ea030/testreport/7e19af96-e170-4e2d-8b8f-e3b398dafb0e/testsuite/0/testcases?status=failed) Test affected: LibraryTestsIphone.testLibraryShortcutHomePage() NavigationTest.testLongPressLinkOptionsPrivateMode() NavigationTest.testLongPressOnAddressBar() NavigationTest.testShareLink() NavigationTest.testShareLinkPrivateMode() NavigationTest.testWriteToChildPopupTab() PhotonActionSheetTest.testSendToDeviceFromPageOptionsMenu() PhotonActionSheetTest.testShareOptionIsShown() PocketTest.testPocketEnabledByDefault() PocketTest.testTapOnMore() ReaderViewTest.testMarkAsReadAndUnreadFromReadingList() ScreenGraphTest.testChainedActionPerf1() SearchTests.testCopyPasteComplete() SearchTests.testPromptPresence() ThirdPartySearchTest.testCustomEngineFromIncorrectTemplate() ThirdPartySearchTest.testCustomSearchEngineAsDefault() ThirdPartySearchTest.testCustomSearchEngineDeletion() ThirdPartySearchTest.testCustomSearchEngines()
1.0
[XCUITests] Update XCUITest scheme for iOS14 - Similarly to what we did to fix the[ Smoketest for iOS14,](https://github.com/mozilla-mobile/firefox-ios/issues/6969) we need to do the same for the general test suite. There are some test failures due to similar issues on Bitrise, see [logs](https://addons-testing.bitrise.io/builds/c9e9f069dd5ea030/testreport/7e19af96-e170-4e2d-8b8f-e3b398dafb0e/testsuite/0/testcases?status=failed) Test affected: LibraryTestsIphone.testLibraryShortcutHomePage() NavigationTest.testLongPressLinkOptionsPrivateMode() NavigationTest.testLongPressOnAddressBar() NavigationTest.testShareLink() NavigationTest.testShareLinkPrivateMode() NavigationTest.testWriteToChildPopupTab() PhotonActionSheetTest.testSendToDeviceFromPageOptionsMenu() PhotonActionSheetTest.testShareOptionIsShown() PocketTest.testPocketEnabledByDefault() PocketTest.testTapOnMore() ReaderViewTest.testMarkAsReadAndUnreadFromReadingList() ScreenGraphTest.testChainedActionPerf1() SearchTests.testCopyPasteComplete() SearchTests.testPromptPresence() ThirdPartySearchTest.testCustomEngineFromIncorrectTemplate() ThirdPartySearchTest.testCustomSearchEngineAsDefault() ThirdPartySearchTest.testCustomSearchEngineDeletion() ThirdPartySearchTest.testCustomSearchEngines()
test
update xcuitest scheme for similarly to what we did to fix the we need to do the same for the general test suite there are some test failures due to similar issues on bitrise see test affected librarytestsiphone testlibraryshortcuthomepage navigationtest testlongpresslinkoptionsprivatemode navigationtest testlongpressonaddressbar navigationtest testsharelink navigationtest testsharelinkprivatemode navigationtest testwritetochildpopuptab photonactionsheettest testsendtodevicefrompageoptionsmenu photonactionsheettest testshareoptionisshown pockettest testpocketenabledbydefault pockettest testtaponmore readerviewtest testmarkasreadandunreadfromreadinglist screengraphtest searchtests testcopypastecomplete searchtests testpromptpresence thirdpartysearchtest testcustomenginefromincorrecttemplate thirdpartysearchtest testcustomsearchengineasdefault thirdpartysearchtest testcustomsearchenginedeletion thirdpartysearchtest testcustomsearchengines
1
34,363
4,918,323,912
IssuesEvent
2016-11-24 08:27:20
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
opened
github.com/cockroachdb/cockroach/vendor/github.com/coreos/etcd/contrib/recipes: (unknown) failed under stress
Robot test-failure
SHA: https://github.com/cockroachdb/cockroach/commits/b54490b2cf70c155ec2b7af5133276ffe24dc02c Parameters: ``` COCKROACH_PROPOSER_EVALUATED_KV=true TAGS=stress GOFLAGS= ``` Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=58189&tab=buildLog ``` go list -tags 'stress' -f 'go test -v -tags '\''stress'\'' -ldflags '\'''\'' -i -c {{.ImportPath}} -o {{.Dir}}/stress.test && (cd {{.Dir}} && if [ -f stress.test ]; then stress -maxtime 15m -maxfails 1 -stderr ./stress.test -test.run '\''.'\'' -test.timeout 30m -test.v; fi)' github.com/cockroachdb/cockroach/vendor/github.com/coreos/etcd/contrib/recipes | /bin/bash vendor/github.com/coreos/etcd/clientv3/client.go:27:2: cannot find package "github.com/grpc-ecosystem/go-grpc-prometheus" in any of: /go/src/github.com/cockroachdb/cockroach/vendor/github.com/grpc-ecosystem/go-grpc-prometheus (vendor tree) /usr/local/go/src/github.com/grpc-ecosystem/go-grpc-prometheus (from $GOROOT) /go/src/github.com/grpc-ecosystem/go-grpc-prometheus (from $GOPATH) Makefile:138: recipe for target 'stress' failed make: *** [stress] Error 1 ```
1.0
github.com/cockroachdb/cockroach/vendor/github.com/coreos/etcd/contrib/recipes: (unknown) failed under stress - SHA: https://github.com/cockroachdb/cockroach/commits/b54490b2cf70c155ec2b7af5133276ffe24dc02c Parameters: ``` COCKROACH_PROPOSER_EVALUATED_KV=true TAGS=stress GOFLAGS= ``` Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=58189&tab=buildLog ``` go list -tags 'stress' -f 'go test -v -tags '\''stress'\'' -ldflags '\'''\'' -i -c {{.ImportPath}} -o {{.Dir}}/stress.test && (cd {{.Dir}} && if [ -f stress.test ]; then stress -maxtime 15m -maxfails 1 -stderr ./stress.test -test.run '\''.'\'' -test.timeout 30m -test.v; fi)' github.com/cockroachdb/cockroach/vendor/github.com/coreos/etcd/contrib/recipes | /bin/bash vendor/github.com/coreos/etcd/clientv3/client.go:27:2: cannot find package "github.com/grpc-ecosystem/go-grpc-prometheus" in any of: /go/src/github.com/cockroachdb/cockroach/vendor/github.com/grpc-ecosystem/go-grpc-prometheus (vendor tree) /usr/local/go/src/github.com/grpc-ecosystem/go-grpc-prometheus (from $GOROOT) /go/src/github.com/grpc-ecosystem/go-grpc-prometheus (from $GOPATH) Makefile:138: recipe for target 'stress' failed make: *** [stress] Error 1 ```
test
github com cockroachdb cockroach vendor github com coreos etcd contrib recipes unknown failed under stress sha parameters cockroach proposer evaluated kv true tags stress goflags stress build found a failed test go list tags stress f go test v tags stress ldflags i c importpath o dir stress test cd dir if then stress maxtime maxfails stderr stress test test run test timeout test v fi github com cockroachdb cockroach vendor github com coreos etcd contrib recipes bin bash vendor github com coreos etcd client go cannot find package github com grpc ecosystem go grpc prometheus in any of go src github com cockroachdb cockroach vendor github com grpc ecosystem go grpc prometheus vendor tree usr local go src github com grpc ecosystem go grpc prometheus from goroot go src github com grpc ecosystem go grpc prometheus from gopath makefile recipe for target stress failed make error
1
187,408
14,427,754,904
IssuesEvent
2020-12-06 06:00:40
kalexmills/github-vet-tests-dec2020
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
closed
lucmichalski/vmx-kubernetes: utils/kubernetes/contrib/mesos/pkg/scheduler/plugin_test.go; 9 LoC
fresh test tiny
Found a possible issue in [lucmichalski/vmx-kubernetes](https://www.github.com/lucmichalski/vmx-kubernetes) at [utils/kubernetes/contrib/mesos/pkg/scheduler/plugin_test.go](https://github.com/lucmichalski/vmx-kubernetes/blob/8b1261f16d5261d36047ba981079a070a508e975/utils/kubernetes/contrib/mesos/pkg/scheduler/plugin_test.go#L180-L188) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to otherPod at line 184 may start a goroutine [Click here to see the code in its original context.](https://github.com/lucmichalski/vmx-kubernetes/blob/8b1261f16d5261d36047ba981079a070a508e975/utils/kubernetes/contrib/mesos/pkg/scheduler/plugin_test.go#L180-L188) <details> <summary>Click here to show the 9 line(s) of Go which triggered the analyzer.</summary> ```go for i, otherPod := range lw.list.Items { if otherPod.Name == pod.Name { lw.list.Items = append(lw.list.Items[:i], lw.list.Items[i+1:]...) if notify { lw.fakeWatcher.Delete(&otherPod) } return } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 8b1261f16d5261d36047ba981079a070a508e975
1.0
lucmichalski/vmx-kubernetes: utils/kubernetes/contrib/mesos/pkg/scheduler/plugin_test.go; 9 LoC - Found a possible issue in [lucmichalski/vmx-kubernetes](https://www.github.com/lucmichalski/vmx-kubernetes) at [utils/kubernetes/contrib/mesos/pkg/scheduler/plugin_test.go](https://github.com/lucmichalski/vmx-kubernetes/blob/8b1261f16d5261d36047ba981079a070a508e975/utils/kubernetes/contrib/mesos/pkg/scheduler/plugin_test.go#L180-L188) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to otherPod at line 184 may start a goroutine [Click here to see the code in its original context.](https://github.com/lucmichalski/vmx-kubernetes/blob/8b1261f16d5261d36047ba981079a070a508e975/utils/kubernetes/contrib/mesos/pkg/scheduler/plugin_test.go#L180-L188) <details> <summary>Click here to show the 9 line(s) of Go which triggered the analyzer.</summary> ```go for i, otherPod := range lw.list.Items { if otherPod.Name == pod.Name { lw.list.Items = append(lw.list.Items[:i], lw.list.Items[i+1:]...) if notify { lw.fakeWatcher.Delete(&otherPod) } return } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 8b1261f16d5261d36047ba981079a070a508e975
test
lucmichalski vmx kubernetes utils kubernetes contrib mesos pkg scheduler plugin test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to otherpod at line may start a goroutine click here to show the line s of go which triggered the analyzer go for i otherpod range lw list items if otherpod name pod name lw list items append lw list items lw list items if notify lw fakewatcher delete otherpod return leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
123,492
10,270,505,494
IssuesEvent
2019-08-23 11:49:55
legion-platform/legion
https://api.github.com/repos/legion-platform/legion
opened
Terraform module for K8S setup at EKS
860: K8S to PaaS CI/CD/tests epic:972- EKS
As a DevOps engineer I'd like to have terraform modules for Legion K8S dependencies setup at EKS cluster so I can deploy Legion components
1.0
Terraform module for K8S setup at EKS - As a DevOps engineer I'd like to have terraform modules for Legion K8S dependencies setup at EKS cluster so I can deploy Legion components
test
terraform module for setup at eks as a devops engineer i d like to have terraform modules for legion dependencies setup at eks cluster so i can deploy legion components
1
219,105
17,064,351,158
IssuesEvent
2021-07-07 04:32:12
topcoder-platform/tc-auth-lib
https://api.github.com/repos/topcoder-platform/tc-auth-lib
closed
Getting "Unknown error from provider" while trying to generate the V2 Token using Postman
Functional Internal_Testing P1 Sign up
![bandicam 2021-06-30 09-59-21-797](https://user-images.githubusercontent.com/42398485/123902111-de677080-d989-11eb-8909-a87d7775c3a6.jpg) **Response**: 401 Unauthorized ``` { "error": "Unknown error from provider", "error_description": "Unknown error from provider" } ``` cc @ThomasKranitsas @lakshmiathreya
1.0
Getting "Unknown error from provider" while trying to generate the V2 Token using Postman - ![bandicam 2021-06-30 09-59-21-797](https://user-images.githubusercontent.com/42398485/123902111-de677080-d989-11eb-8909-a87d7775c3a6.jpg) **Response**: 401 Unauthorized ``` { "error": "Unknown error from provider", "error_description": "Unknown error from provider" } ``` cc @ThomasKranitsas @lakshmiathreya
test
getting unknown error from provider while trying to generate the token using postman response unauthorized error unknown error from provider error description unknown error from provider cc thomaskranitsas lakshmiathreya
1
278,983
8,655,567,059
IssuesEvent
2018-11-27 16:14:33
ansible/awx
https://api.github.com/repos/ansible/awx
closed
In workflow Visualizer, Auditors cannot see Jobs on the Node selection view
component:api component:ui flag:qe priority:medium state:needs_devel type:bug
##### ISSUE TYPE - Bug Report ##### COMPONENT NAME - API - UI ##### SUMMARY In workflow Visualizer, Auditors cannot see Jobs in the Node selection view ##### ENVIRONMENT * AWX version: 3.4.0-devel ##### STEPS TO REPRODUCE 1. Create a system auditor attached to an org 2. Ensure that the org has a workflow with at least 1 JT or WFJT. 3. As the system auditor, open the workflow visualizer and attempt to view the Jobs List in the Node selection screen Note: Inventory and Project Nodes are available. Jobs are not listed ##### ADDITIONAL INFORMATION ![screen shot 2018-11-26 at 7 50 23 pm](https://user-images.githubusercontent.com/12446869/49051220-ec8ef800-f1b4-11e8-8b18-161c34a5cc20.png)
1.0
In workflow Visualizer, Auditors cannot see Jobs on the Node selection view - ##### ISSUE TYPE - Bug Report ##### COMPONENT NAME - API - UI ##### SUMMARY In workflow Visualizer, Auditors cannot see Jobs in the Node selection view ##### ENVIRONMENT * AWX version: 3.4.0-devel ##### STEPS TO REPRODUCE 1. Create a system auditor attached to an org 2. Ensure that the org has a workflow with at least 1 JT or WFJT. 3. As the system auditor, open the workflow visualizer and attempt to view the Jobs List in the Node selection screen Note: Inventory and Project Nodes are available. Jobs are not listed ##### ADDITIONAL INFORMATION ![screen shot 2018-11-26 at 7 50 23 pm](https://user-images.githubusercontent.com/12446869/49051220-ec8ef800-f1b4-11e8-8b18-161c34a5cc20.png)
non_test
in workflow visualizer auditors cannot see jobs on the node selection view issue type bug report component name api ui summary in workflow visualizer auditors cannot see jobs in the node selection view environment awx version devel steps to reproduce create a system auditor attached to an org ensure that the org has a workflow with at least jt or wfjt as the system auditor open the workflow visualizer and attempt to view the jobs list in the node selection screen note inventory and project nodes are available jobs are not listed additional information
0
293,397
25,289,518,462
IssuesEvent
2022-11-16 22:29:41
adafruit/circuitpython
https://api.github.com/repos/adafruit/circuitpython
closed
Creating an audio sample array corrupts flash memory
bug needs retest audio rp2
### CircuitPython version ```python adafruit-circuitpython-raspberry_pi_pico-en_US-7.2.3 ``` ### Code/REPL ```python def PlayTone(inst, length, pitch, mixer, volume, effect): tone_frequency = int(440 * (pitch ** 1.059463094359)) tone_length = int(length // tone_frequency) sine_wave = array.array("h", [0] * tone_length) for i in range(tone_length): sine_wave[i] = int((1 + math.sin(math.pi * 2 * i / tone_length)) * (volume / 7) * (2 ** 15 - 1)) sine_wave_sample = RawSample(sine_wave, sample_rate=22050) while True: mixer.voice[0].play(sine_wave_sample, loop=True) time.sleep(length) mixer.stop_voice() ``` ### Behavior REPL disconnects and flash memory corrupts. ### Description Every time I save my code trying to play an audio sample from an array I get corrupted sounds coming out of the DAC and am unable to save or copy anything from the board. Eventually it just disconnects completely. ### Additional information Using flask_nuke.uf2 and reinstalling CircuitPython fixes the issue until caused again.
1.0
Creating an audio sample array corrupts flash memory - ### CircuitPython version ```python adafruit-circuitpython-raspberry_pi_pico-en_US-7.2.3 ``` ### Code/REPL ```python def PlayTone(inst, length, pitch, mixer, volume, effect): tone_frequency = int(440 * (pitch ** 1.059463094359)) tone_length = int(length // tone_frequency) sine_wave = array.array("h", [0] * tone_length) for i in range(tone_length): sine_wave[i] = int((1 + math.sin(math.pi * 2 * i / tone_length)) * (volume / 7) * (2 ** 15 - 1)) sine_wave_sample = RawSample(sine_wave, sample_rate=22050) while True: mixer.voice[0].play(sine_wave_sample, loop=True) time.sleep(length) mixer.stop_voice() ``` ### Behavior REPL disconnects and flash memory corrupts. ### Description Every time I save my code trying to play an audio sample from an array I get corrupted sounds coming out of the DAC and am unable to save or copy anything from the board. Eventually it just disconnects completely. ### Additional information Using flask_nuke.uf2 and reinstalling CircuitPython fixes the issue until caused again.
test
creating an audio sample array corrupts flash memory circuitpython version python adafruit circuitpython raspberry pi pico en us code repl python def playtone inst length pitch mixer volume effect tone frequency int pitch tone length int length tone frequency sine wave array array h tone length for i in range tone length sine wave int math sin math pi i tone length volume sine wave sample rawsample sine wave sample rate while true mixer voice play sine wave sample loop true time sleep length mixer stop voice behavior repl disconnects and flash memory corrupts description every time i save my code trying to play an audio sample from an array i get corrupted sounds coming out of the dac and am unable to save or copy anything from the board eventually it just disconnects completely additional information using flask nuke and reinstalling circuitpython fixes the issue until caused again
1
120,554
15,779,582,395
IssuesEvent
2021-04-01 08:56:02
microsoft/vscode
https://api.github.com/repos/microsoft/vscode
reopened
Open Recent doesn't open the same custom editor
*as-designed
Testing #117279 1. Close all editors 2. Open a file with the Hex editors 3. Close all editors 4. <kbd>Ctrl P</kbd> to open the file again 🐛 File opens with text editor.
1.0
Open Recent doesn't open the same custom editor - Testing #117279 1. Close all editors 2. Open a file with the Hex editors 3. Close all editors 4. <kbd>Ctrl P</kbd> to open the file again 🐛 File opens with text editor.
non_test
open recent doesn t open the same custom editor testing close all editors open a file with the hex editors close all editors ctrl p to open the file again 🐛 file opens with text editor
0
166,461
12,957,219,697
IssuesEvent
2020-07-20 09:24:32
MLH-Fellowship/react-native
https://api.github.com/repos/MLH-Fellowship/react-native
closed
Implement the new Alert Screen
RNTester level1
Contributes to #14 ## Use cases ### Alert with one button - [x] Show an alert with the message - "External USB drive detected!" with a neutral button `Okay`. ### Alert with two buttons [ ] Show an alert with the message - "Your subscription has expired!" with a positive button `Renew` and a negative button `Ignore` ### Alert with three buttons - [x] Show an alert with the message - "Do you want to save your changes?" with a positive button `Yes`, a negative button `No` and a neutral button `Cancel`. ### Alert with more than three buttons [Only available on iOS] - [x] Can't think of any real-world use case. Could just show `Button One`, `Button Two`, `Button Three`, and so on. ## e2e tests Write detox tests for each of the above alerts and test the following - - [x] Alert is visible - [x] Alert can be dismissed by pressing any one of the buttons - [x] Callback handler is called appropriately ## Suggestions - Style prop (Cancellable or not) - Android alerts dismissible by tap outside alert window
1.0
Implement the new Alert Screen - Contributes to #14 ## Use cases ### Alert with one button - [x] Show an alert with the message - "External USB drive detected!" with a neutral button `Okay`. ### Alert with two buttons [ ] Show an alert with the message - "Your subscription has expired!" with a positive button `Renew` and a negative button `Ignore` ### Alert with three buttons - [x] Show an alert with the message - "Do you want to save your changes?" with a positive button `Yes`, a negative button `No` and a neutral button `Cancel`. ### Alert with more than three buttons [Only available on iOS] - [x] Can't think of any real-world use case. Could just show `Button One`, `Button Two`, `Button Three`, and so on. ## e2e tests Write detox tests for each of the above alerts and test the following - - [x] Alert is visible - [x] Alert can be dismissed by pressing any one of the buttons - [x] Callback handler is called appropriately ## Suggestions - Style prop (Cancellable or not) - Android alerts dismissible by tap outside alert window
test
implement the new alert screen contributes to use cases alert with one button show an alert with the message external usb drive detected with a neutral button okay alert with two buttons show an alert with the message your subscription has expired with a positive button renew and a negative button ignore alert with three buttons show an alert with the message do you want to save your changes with a positive button yes a negative button no and a neutral button cancel alert with more than three buttons can t think of any real world use case could just show button one button two button three and so on tests write detox tests for each of the above alerts and test the following alert is visible alert can be dismissed by pressing any one of the buttons callback handler is called appropriately suggestions style prop cancellable or not android alerts dismissible by tap outside alert window
1
273,380
29,820,292,256
IssuesEvent
2023-06-17 01:22:30
pazhanivel07/frameworks_base_2021-0970
https://api.github.com/repos/pazhanivel07/frameworks_base_2021-0970
closed
CVE-2023-21109 (Medium) detected in multiple libraries - autoclosed
Mend: dependency security vulnerability
## CVE-2023-21109 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>baseandroid-10.0.0_r44</b>, <b>baseandroid-10.0.0_r44</b>, <b>baseandroid-10.0.0_r44</b></p></summary> <p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> In multiple places of AccessibilityService, there is a possible way to hide the app from the user due to a logic error in the code. This could lead to local escalation of privilege with no additional execution privileges needed. User interaction is not needed for exploitation.Product: AndroidVersions: Android-11 Android-12 Android-12L Android-13Android ID: A-261589597 <p>Publish Date: 2023-05-15 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-21109>CVE-2023-21109</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://android.googlesource.com/platform/frameworks/base/+/2c1f16db893680b0db29ffa222652fea3e5b87e0">https://android.googlesource.com/platform/frameworks/base/+/2c1f16db893680b0db29ffa222652fea3e5b87e0</a></p> <p>Release Date: 2023-05-15</p> <p>Fix Resolution: android-13.0.0_r49</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2023-21109 (Medium) detected in multiple libraries - autoclosed - ## CVE-2023-21109 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>baseandroid-10.0.0_r44</b>, <b>baseandroid-10.0.0_r44</b>, <b>baseandroid-10.0.0_r44</b></p></summary> <p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> In multiple places of AccessibilityService, there is a possible way to hide the app from the user due to a logic error in the code. This could lead to local escalation of privilege with no additional execution privileges needed. User interaction is not needed for exploitation.Product: AndroidVersions: Android-11 Android-12 Android-12L Android-13Android ID: A-261589597 <p>Publish Date: 2023-05-15 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-21109>CVE-2023-21109</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://android.googlesource.com/platform/frameworks/base/+/2c1f16db893680b0db29ffa222652fea3e5b87e0">https://android.googlesource.com/platform/frameworks/base/+/2c1f16db893680b0db29ffa222652fea3e5b87e0</a></p> <p>Release Date: 2023-05-15</p> <p>Fix Resolution: android-13.0.0_r49</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in multiple libraries autoclosed cve medium severity vulnerability vulnerable libraries baseandroid baseandroid baseandroid vulnerability details in multiple places of accessibilityservice there is a possible way to hide the app from the user due to a logic error in the code this could lead to local escalation of privilege with no additional execution privileges needed user interaction is not needed for exploitation product androidversions android android android android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android step up your open source security game with mend
0
97,999
20,598,384,736
IssuesEvent
2022-03-05 21:51:59
rust-lang/rust
https://api.github.com/repos/rust-lang/rust
closed
Possible performance loss with f32 arithmetic
A-LLVM I-slow A-codegen T-compiler C-bug A-floating-point
I've tried, out of curiosity, a floating point arithmetic test and found quite a big difference between C++ and Rust. The code used in rust ```rust pub struct Stats { x: f32, y: f32, z: f32 } pub fn sum(a: &Stats, b: &Stats) -> Stats { Stats { x: a.x + b.x, y: a.y + b.y, z: a.z + b.z } } ``` The code used in C++ ```cpp struct Stats { float x; float y; float z; }; Stats sum(const Stats &a, const Stats &b) { return Stats { a.x + b.x, a.y + b.y, a.z + b.z }; } ``` Here is a link to a godbolt for side-by-side comparision of assembly output: https://godbolt.org/z/dqc4b74rv Rust seem to absolutely want the floats back into e* registers instead of keeping them in xmm registers, C++ leaves them into the xmm registers. In some cases it might more advantageous to leave the floats in xmm registers for future operations on them rather then passing them back into the e* registers.
1.0
Possible performance loss with f32 arithmetic - I've tried, out of curiosity, a floating point arithmetic test and found quite a big difference between C++ and Rust. The code used in rust ```rust pub struct Stats { x: f32, y: f32, z: f32 } pub fn sum(a: &Stats, b: &Stats) -> Stats { Stats { x: a.x + b.x, y: a.y + b.y, z: a.z + b.z } } ``` The code used in C++ ```cpp struct Stats { float x; float y; float z; }; Stats sum(const Stats &a, const Stats &b) { return Stats { a.x + b.x, a.y + b.y, a.z + b.z }; } ``` Here is a link to a godbolt for side-by-side comparision of assembly output: https://godbolt.org/z/dqc4b74rv Rust seem to absolutely want the floats back into e* registers instead of keeping them in xmm registers, C++ leaves them into the xmm registers. In some cases it might more advantageous to leave the floats in xmm registers for future operations on them rather then passing them back into the e* registers.
non_test
possible performance loss with arithmetic i ve tried out of curiosity a floating point arithmetic test and found quite a big difference between c and rust the code used in rust rust pub struct stats x y z pub fn sum a stats b stats stats stats x a x b x y a y b y z a z b z the code used in c cpp struct stats float x float y float z stats sum const stats a const stats b return stats a x b x a y b y a z b z here is a link to a godbolt for side by side comparision of assembly output rust seem to absolutely want the floats back into e registers instead of keeping them in xmm registers c leaves them into the xmm registers in some cases it might more advantageous to leave the floats in xmm registers for future operations on them rather then passing them back into the e registers
0
1,425
3,688,046,749
IssuesEvent
2016-02-25 11:04:36
mesosphere/marathon
https://api.github.com/repos/mesosphere/marathon
opened
Path in health checks validation failure results is broken
bug service
Steps to reproduce: 1) Submit an invalid app definition: ```json { "id": "/foo", "cmd": "sleep 500", "healthChecks": [ { "portIndex": -1 } ] } ``` 2) The result will be: ``` { "details": [ { "errors": [ "Health check port indices must address an element of the ports array or container port mappings." ], "path": "self" } ], "message": "Object is not valid" } ``` The `path` should be `/healthChecks(0)`, not `self`. cc/ @alexanderweber
1.0
Path in health checks validation failure results is broken - Steps to reproduce: 1) Submit an invalid app definition: ```json { "id": "/foo", "cmd": "sleep 500", "healthChecks": [ { "portIndex": -1 } ] } ``` 2) The result will be: ``` { "details": [ { "errors": [ "Health check port indices must address an element of the ports array or container port mappings." ], "path": "self" } ], "message": "Object is not valid" } ``` The `path` should be `/healthChecks(0)`, not `self`. cc/ @alexanderweber
non_test
path in health checks validation failure results is broken steps to reproduce submit an invalid app definition json id foo cmd sleep healthchecks portindex the result will be details errors health check port indices must address an element of the ports array or container port mappings path self message object is not valid the path should be healthchecks not self cc alexanderweber
0
224,080
17,659,332,225
IssuesEvent
2021-08-21 06:49:49
eblocker/eblocker
https://api.github.com/repos/eblocker/eblocker
closed
eBlocker Mobile: 2.6.2 is not compatible with Passepartout app
needs testing
In eOS 2.6.2 the compression was disabled in eBlocker Mobile. A [soft migration](https://community.openvpn.net/openvpn/wiki/VORACLE) was chosen to support client configurations created with eOS 2.5 or earlier, so the server pushes the following option to the client to disable the compression: push "compress stub-v2" The [Passepartout app](https://github.com/passepartoutvpn) does not seem to use `openvpn` but the `tunnelkit` library, which is confused by the option `compress stub-v2`. The error message is: Server has non-LZO compression enabled and this is currently unsupported (framing=compress)
1.0
eBlocker Mobile: 2.6.2 is not compatible with Passepartout app - In eOS 2.6.2 the compression was disabled in eBlocker Mobile. A [soft migration](https://community.openvpn.net/openvpn/wiki/VORACLE) was chosen to support client configurations created with eOS 2.5 or earlier, so the server pushes the following option to the client to disable the compression: push "compress stub-v2" The [Passepartout app](https://github.com/passepartoutvpn) does not seem to use `openvpn` but the `tunnelkit` library, which is confused by the option `compress stub-v2`. The error message is: Server has non-LZO compression enabled and this is currently unsupported (framing=compress)
test
eblocker mobile is not compatible with passepartout app in eos the compression was disabled in eblocker mobile a was chosen to support client configurations created with eos or earlier so the server pushes the following option to the client to disable the compression push compress stub the does not seem to use openvpn but the tunnelkit library which is confused by the option compress stub the error message is server has non lzo compression enabled and this is currently unsupported framing compress
1
172,226
13,282,706,402
IssuesEvent
2020-08-24 00:23:56
backend-br/vagas
https://api.github.com/repos/backend-br/vagas
closed
[Remoto] Back-end Developer na Objective Solutions
CLT Java Scrum Stale Testes automatizados
<!-- ================================================== POR FAVOR, SÓ POSTE SE A VAGA FOR PARA BACK-END! Não faça distinção de gênero no título da vaga. Use: "Back-End Developer" ao invés de "Desenvolvedor Back-End" \o/ Exemplo: `[São Paulo] Back-End Developer @ NOME DA EMPRESA` ================================================== --> ## Nossa empresa A Objective trabalha junto com os clientes para criar soluções digitais com práticas ágeis, garantindo que o mais importante esteja em produção. E através de um mindset de testes contínuos, proporcionamos a qualidade e confiança que nossos clientes precisam para conduzir projetos de inovação e transformação digital. **Algumas informações que você precisa saber:** - +300 colaboradores - Uma das melhores empresas para se trabalhar no Paraná em 2019, segundo o GPTW - Uma das empresas de TI que melhor remuneram no Brasil, segundo o Glassdoor - Escritórios em São Paulo, Maringá, Curitiba e Chicago - Referência em desenvolvimento ágil, com um case relatado no livro ‘The Scrumban [R]Evolution: Getting the Most Out of Agile, Scrum, and Lean Kanban’ do autor Ajay Reddy ## Descrição da vaga **Responsabilidades:** - Participação em projetos para o desenvolvimento e implantação de portais Liferay. - Necessário ter experiência em desenvolvimento back-end utilizando a linguagem JAVA e Orientação a Objetos. ## Local Possibilidade de trabalho em um dos nossos escritórios em Maringá, Curitiba ou São Paulo, ou ainda no conforto da sua casa. 
## Requisitos **Desejáveis:** - Padrões de projeto - Experiência com Testes Automatizados - Experiência em análise de requisitos - Sólidos conhecimentos de OO - Conhecimento em Portal Liferay **Diferenciais:** - Conhecimento em desenvolvimento Ágil - Clean Code ## Benefícios - Vale Refeição ou Vale Alimentação; - Plano de Saúde; - Plano Odontológico; - Estacionamento; - Vale Transporte; - Gympass; - PLR; - Seguro de vida; - Programa de indicação de funcionários; - Convênios de descontos com universidades, escolas de idiomas e restaurantes; - Frutas fresquinhas todos os dias; - Café da manhã e Lanche da tarde à vontade; - Massagem 1x por semana. ## Contratação CLT ## Como se candidatar Você pode se candidatar a vaga clicando [aqui](https://objective.gupy.io/jobs/231724), ou se preferir pode enviar um e-mail para gabriela.hoffmann@objective.com.br com o seu CV ou link do perfil no Linkedin.
1.0
[Remoto] Back-end Developer na Objective Solutions - <!-- ================================================== POR FAVOR, SÓ POSTE SE A VAGA FOR PARA BACK-END! Não faça distinção de gênero no título da vaga. Use: "Back-End Developer" ao invés de "Desenvolvedor Back-End" \o/ Exemplo: `[São Paulo] Back-End Developer @ NOME DA EMPRESA` ================================================== --> ## Nossa empresa A Objective trabalha junto com os clientes para criar soluções digitais com práticas ágeis, garantindo que o mais importante esteja em produção. E através de um mindset de testes contínuos, proporcionamos a qualidade e confiança que nossos clientes precisam para conduzir projetos de inovação e transformação digital. **Algumas informações que você precisa saber:** - +300 colaboradores - Uma das melhores empresas para se trabalhar no Paraná em 2019, segundo o GPTW - Uma das empresas de TI que melhor remuneram no Brasil, segundo o Glassdoor - Escritórios em São Paulo, Maringá, Curitiba e Chicago - Referência em desenvolvimento ágil, com um case relatado no livro ‘The Scrumban [R]Evolution: Getting the Most Out of Agile, Scrum, and Lean Kanban’ do autor Ajay Reddy ## Descrição da vaga **Responsabilidades:** - Participação em projetos para o desenvolvimento e implantação de portais Liferay. - Necessário ter experiência em desenvolvimento back-end utilizando a linguagem JAVA e Orientação a Objetos. ## Local Possibilidade de trabalho em um dos nossos escritórios em Maringá, Curitiba ou São Paulo, ou ainda no conforto da sua casa. 
## Requisitos **Desejáveis:** - Padrões de projeto - Experiência com Testes Automatizados - Experiência em análise de requisitos - Sólidos conhecimentos de OO - Conhecimento em Portal Liferay **Diferenciais:** - Conhecimento em desenvolvimento Ágil - Clean Code ## Benefícios - Vale Refeição ou Vale Alimentação; - Plano de Saúde; - Plano Odontológico; - Estacionamento; - Vale Transporte; - Gympass; - PLR; - Seguro de vida; - Programa de indicação de funcionários; - Convênios de descontos com universidades, escolas de idiomas e restaurantes; - Frutas fresquinhas todos os dias; - Café da manhã e Lanche da tarde à vontade; - Massagem 1x por semana. ## Contratação CLT ## Como se candidatar Você pode se candidatar a vaga clicando [aqui](https://objective.gupy.io/jobs/231724), ou se preferir pode enviar um e-mail para gabriela.hoffmann@objective.com.br com o seu CV ou link do perfil no Linkedin.
test
back end developer na objective solutions por favor só poste se a vaga for para back end não faça distinção de gênero no título da vaga use back end developer ao invés de desenvolvedor back end o exemplo back end developer nome da empresa nossa empresa a objective trabalha junto com os clientes para criar soluções digitais com práticas ágeis garantindo que o mais importante esteja em produção e através de um mindset de testes contínuos proporcionamos a qualidade e confiança que nossos clientes precisam para conduzir projetos de inovação e transformação digital algumas informações que você precisa saber colaboradores uma das melhores empresas para se trabalhar no paraná em segundo o gptw uma das empresas de ti que melhor remuneram no brasil segundo o glassdoor escritórios em são paulo maringá curitiba e chicago referência em desenvolvimento ágil com um case relatado no livro ‘the scrumban evolution getting the most out of agile scrum and lean kanban’ do autor ajay reddy descrição da vaga responsabilidades participação em projetos para o desenvolvimento e implantação de portais liferay necessário ter experiência em desenvolvimento back end utilizando a linguagem java e orientação a objetos local possibilidade de trabalho em um dos nossos escritórios em maringá curitiba ou são paulo ou ainda no conforto da sua casa requisitos desejáveis padrões de projeto experiência com testes automatizados experiência em análise de requisitos sólidos conhecimentos de oo conhecimento em portal liferay diferenciais conhecimento em desenvolvimento ágil clean code benefícios vale refeição ou vale alimentação plano de saúde plano odontológico estacionamento vale transporte gympass plr seguro de vida programa de indicação de funcionários convênios de descontos com universidades escolas de idiomas e restaurantes frutas fresquinhas todos os dias café da manhã e lanche da tarde à vontade massagem por semana contratação clt como se candidatar você pode se candidatar a vaga clicando ou se 
preferir pode enviar um e mail para gabriela hoffmann objective com br com o seu cv ou link do perfil no linkedin
1
639,606
20,759,636,347
IssuesEvent
2022-03-15 15:07:54
JeffreyCHChan/SOEN390
https://api.github.com/repos/JeffreyCHChan/SOEN390
closed
As a medical doctor, I want to be able to distinguish between reviewed and non reviewed patients
Priority 2 Front End USP 1
Font color change
1.0
As a medical doctor, I want to be able to distinguish between reviewed and non reviewed patients - Font color change
non_test
as a medical doctor i want to be able to distinguish between reviewed and non reviewed patients font color change
0
340,128
10,267,120,611
IssuesEvent
2019-08-23 00:02:58
morpheus65535/bazarr
https://api.github.com/repos/morpheus65535/bazarr
closed
[Feature Request] Manual import of subtitles
feature request priority:low
Hi, would it be possible to add the capability to upload subtitles manually for an episode, or a complete season ? Bazarr would then handle the renaming of the subtitle to match the episode name Maybe a bit like Lidarr does by matching tracks for season upload if file match does not succeed. Thanks
1.0
[Feature Request] Manual import of subtitles - Hi, would it be possible to add the capability to upload subtitles manually for an episode, or a complete season ? Bazarr would then handle the renaming of the subtitle to match the episode name Maybe a bit like Lidarr does by matching tracks for season upload if file match does not succeed. Thanks
non_test
manual import of subtitles hi would it be possible to add the capability to upload subtitles manually for an episode or a complete season bazarr would then handle the renaming of the subtitle to match the episode name maybe a bit like lidarr does by matching tracks for season upload if file match does not succeed thanks
0
35,231
4,969,216,019
IssuesEvent
2016-12-05 12:36:55
IDgis/geoportaal-test
https://api.github.com/repos/IDgis/geoportaal-test
closed
Generieke teksten in Geoportaal
gebruikerstest wens
Als meer organisaties van het Geoportaal gebruik gaan maken, is het niet logisch dat op allerlei plekken naar de provincie Overijssel wordt verwezen. Helemaal anoniem maken kan niet, maar er kan wel op een aantal plekken iets worden veranderd. Hieronder eerst wat moet blijven, om de herkenbaarheid van de provincie intact te houden. Daaronder de plaatsen waar aanpassingen mogelijk zijn. De tekst "provincie Overijsel" of verwijzing naar email blijft op 4 plaatsen voorkomen: =================================== ### Niet veranderen. Dus laten zoals het nu is =================================== pagina 1) https://www.geoportaaloverijssel.nl/about Het Geoportaal van de provincie Overijssel bevat datasets en beschrijvingen van datasets. Deze worden gebruikt door de provincie om het beleid van de provincie te ondersteunen. Zo worden ze gebruikt om kaarten van te maken of om analyses mee uit te voeren. De provincie Overijssel streeft ernaar om de data en ook de omschrijving bij de data juist en actueel te houden. Mocht je ondanks onze inspanningen toch onvolkomenheden tegenkomen, aarzel dan niet contact met ons op te nemen. pagina 2) https://www.geoportaaloverijssel.nl/contact De informatie in het Geoportaal wordt beschikbaar gesteld door het team Beleidsinformatie van de provincie Overijssel. Heb je vragen, suggesties of tips, aarzel dan niet contact op te nemen. Stuur je email naar: beleidsinformatie@overijssel.nl pagina 3) https://www.geoportaaloverijssel.nl/ Zoek en download actuele kaarten en de beschrijving van de kaarten van de provincie Overijssel. pagina 4) download scherm, b.v.: https://download.geoportaaloverijssel.nl/download/b96d4c68-74ca-47bc-a9af-6b9ac5720f7a Deze gegevens worden beschikbaar gesteld door het Geoportaal van Overijssel: http://www.geoportaaloverijssel.nl In het Geoportaal staan actuele kaarten en beschrijvingen van die kaarten. Ter referentie zijn vaak ook nog oudere kaarten beschikbaar gesteld. 
De provincie Overijssel stelt zoveel mogelijk kaarten als "open data" voor iedereen beschikbaar. Heeft u suggesties of vragen? Stuur dan een email naar beleidsinformatie@overijssel.nl Zie proclaimer: http://www.overijssel.nl/algemene-onderdelen/proclaimer =================================== ### Aanpassingen: =================================== https://admin.geoportaaloverijssel.nl/loginhelp Bestaande tekst: Gebruik bij het inloggen je emailadres, zoals gebruikt wordt in Overijssel. Je mag zowel hoofdletters als kleine letters gebruiken. p.pietersen@overijssel.nl is goed. En ook P.Pietersen@Overijssel.nl is goed. Nieuwe tekst: Gebruik bij het inloggen je werk emailadres. Je mag zowel hoofdletters als kleine letters gebruiken. p.pietersen@werk.nl is goed. En ook P.Pietersen@Werk.nl is goed. --- https://www.geoportaaloverijssel.nl/about Bestaande tekst: Van een aantal datasets is de provincie Overijssel geen eigenaar. Meestal zijn hierbij dan beperkingen voordat ze gedeeld mogen worden met anderen. Deze beperkingen staan ook in de beschrijving bij de datasets. Nieuwe tekst: Van een aantal datasets zijn wij geen eigenaar. Meestal zijn hierbij dan beperkingen voordat ze gedeeld mogen worden met anderen. Deze beperkingen staan ook in de beschrijving bij de datasets. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: Handleiding voor het gebruik van het Geoportaal van Overijssel. Nieuwe tekst: Handleiding voor het gebruik van het Geoportaal --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: Het Geoportaal is bedoeld voor het beheer en het beschikbaar stellen van kaarten die de provincie Overijssel gebruikt. Nieuwe tekst: Het Geoportaal is bedoeld voor het beheer en het beschikbaar stellen van kaarten. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: Heb je niet de beschikking over dergelijke GIS-software, maar wil je toch kaarten zien die gemaakt zijn door de provincie Overijssel? 
Dat kan, omdat er ook veel pdf-bestanden beschikbaar gesteld worden. Zie hiervoor verder bij de toelichting over de “statische kaarten”. Nieuwe tekst: Heb je niet de beschikking over dergelijke GIS-software, maar wil je toch kaarten zien? Dat kan, omdat er ook veel pdf-bestanden beschikbaar gesteld worden. Zie hiervoor verder bij de toelichting over de “statische kaarten”. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: In dit Geoportaal worden niet alleen de datasets maar ook de bijbehorende services getoond. Dat zijn links naar websites, die op een moderne manier dit soort kaarten beschikbaar stellen. In het Geoportaal van Overijssel worden de datasets en ook de bijbehorende services ontsloten. Ook voor het gebruik van services het je een GIS nodig. Nieuwe tekst: In dit Geoportaal worden niet alleen de datasets maar ook de bijbehorende services getoond. Dat zijn links naar websites, die op een moderne manier dit soort kaarten beschikbaar stellen. In het Geoportaal worden de datasets en ook de bijbehorende services ontsloten. Ook voor het gebruik van services heb je een GIS nodig. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: De provincie Overijssel streeft ernaar om de informatie zo goed mogelijk beschikbaar te stellen. We streven naar actualiteit en juistheid bij al onze beschikbare bestanden. En we horen ook graag hoe het nog beter zou kunnen. Nieuwe tekst: Wij streven ernaar om de informatie zo goed mogelijk beschikbaar te stellen. We streven naar actualiteit en juistheid bij al onze beschikbare bestanden. En we horen ook graag hoe het nog beter zou kunnen. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: Heb je vragen over het Geoportaal van Overijssel? Of heb je tips of suggesties? Of wil je op de hoogte blijven van de laatste ontwikkelingen, bijvoorbeeld via de nieuwsbrief? Nieuwe tekst: Heb je vragen over het Geoportaal? Of heb je tips of suggesties? Of wil je op de hoogte blijven van de laatste ontwikkelingen? ---
1.0
Generieke teksten in Geoportaal - Als meer organisaties van het Geoportaal gebruik gaan maken, is het niet logisch dat op allerlei plekken naar de provincie Overijssel wordt verwezen. Helemaal anoniem maken kan niet, maar er kan wel op een aantal plekken iets worden veranderd. Hieronder eerst wat moet blijven, om de herkenbaarheid van de provincie intact te houden. Daaronder de plaatsen waar aanpassingen mogelijk zijn. De tekst "provincie Overijsel" of verwijzing naar email blijft op 4 plaatsen voorkomen: =================================== ### Niet veranderen. Dus laten zoals het nu is =================================== pagina 1) https://www.geoportaaloverijssel.nl/about Het Geoportaal van de provincie Overijssel bevat datasets en beschrijvingen van datasets. Deze worden gebruikt door de provincie om het beleid van de provincie te ondersteunen. Zo worden ze gebruikt om kaarten van te maken of om analyses mee uit te voeren. De provincie Overijssel streeft ernaar om de data en ook de omschrijving bij de data juist en actueel te houden. Mocht je ondanks onze inspanningen toch onvolkomenheden tegenkomen, aarzel dan niet contact met ons op te nemen. pagina 2) https://www.geoportaaloverijssel.nl/contact De informatie in het Geoportaal wordt beschikbaar gesteld door het team Beleidsinformatie van de provincie Overijssel. Heb je vragen, suggesties of tips, aarzel dan niet contact op te nemen. Stuur je email naar: beleidsinformatie@overijssel.nl pagina 3) https://www.geoportaaloverijssel.nl/ Zoek en download actuele kaarten en de beschrijving van de kaarten van de provincie Overijssel. pagina 4) download scherm, b.v.: https://download.geoportaaloverijssel.nl/download/b96d4c68-74ca-47bc-a9af-6b9ac5720f7a Deze gegevens worden beschikbaar gesteld door het Geoportaal van Overijssel: http://www.geoportaaloverijssel.nl In het Geoportaal staan actuele kaarten en beschrijvingen van die kaarten. Ter referentie zijn vaak ook nog oudere kaarten beschikbaar gesteld. 
De provincie Overijssel stelt zoveel mogelijk kaarten als "open data" voor iedereen beschikbaar. Heeft u suggesties of vragen? Stuur dan een email naar beleidsinformatie@overijssel.nl Zie proclaimer: http://www.overijssel.nl/algemene-onderdelen/proclaimer =================================== ### Aanpassingen: =================================== https://admin.geoportaaloverijssel.nl/loginhelp Bestaande tekst: Gebruik bij het inloggen je emailadres, zoals gebruikt wordt in Overijssel. Je mag zowel hoofdletters als kleine letters gebruiken. p.pietersen@overijssel.nl is goed. En ook P.Pietersen@Overijssel.nl is goed. Nieuwe tekst: Gebruik bij het inloggen je werk emailadres. Je mag zowel hoofdletters als kleine letters gebruiken. p.pietersen@werk.nl is goed. En ook P.Pietersen@Werk.nl is goed. --- https://www.geoportaaloverijssel.nl/about Bestaande tekst: Van een aantal datasets is de provincie Overijssel geen eigenaar. Meestal zijn hierbij dan beperkingen voordat ze gedeeld mogen worden met anderen. Deze beperkingen staan ook in de beschrijving bij de datasets. Nieuwe tekst: Van een aantal datasets zijn wij geen eigenaar. Meestal zijn hierbij dan beperkingen voordat ze gedeeld mogen worden met anderen. Deze beperkingen staan ook in de beschrijving bij de datasets. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: Handleiding voor het gebruik van het Geoportaal van Overijssel. Nieuwe tekst: Handleiding voor het gebruik van het Geoportaal --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: Het Geoportaal is bedoeld voor het beheer en het beschikbaar stellen van kaarten die de provincie Overijssel gebruikt. Nieuwe tekst: Het Geoportaal is bedoeld voor het beheer en het beschikbaar stellen van kaarten. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: Heb je niet de beschikking over dergelijke GIS-software, maar wil je toch kaarten zien die gemaakt zijn door de provincie Overijssel? 
Dat kan, omdat er ook veel pdf-bestanden beschikbaar gesteld worden. Zie hiervoor verder bij de toelichting over de “statische kaarten”. Nieuwe tekst: Heb je niet de beschikking over dergelijke GIS-software, maar wil je toch kaarten zien? Dat kan, omdat er ook veel pdf-bestanden beschikbaar gesteld worden. Zie hiervoor verder bij de toelichting over de “statische kaarten”. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: In dit Geoportaal worden niet alleen de datasets maar ook de bijbehorende services getoond. Dat zijn links naar websites, die op een moderne manier dit soort kaarten beschikbaar stellen. In het Geoportaal van Overijssel worden de datasets en ook de bijbehorende services ontsloten. Ook voor het gebruik van services het je een GIS nodig. Nieuwe tekst: In dit Geoportaal worden niet alleen de datasets maar ook de bijbehorende services getoond. Dat zijn links naar websites, die op een moderne manier dit soort kaarten beschikbaar stellen. In het Geoportaal worden de datasets en ook de bijbehorende services ontsloten. Ook voor het gebruik van services heb je een GIS nodig. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: De provincie Overijssel streeft ernaar om de informatie zo goed mogelijk beschikbaar te stellen. We streven naar actualiteit en juistheid bij al onze beschikbare bestanden. En we horen ook graag hoe het nog beter zou kunnen. Nieuwe tekst: Wij streven ernaar om de informatie zo goed mogelijk beschikbaar te stellen. We streven naar actualiteit en juistheid bij al onze beschikbare bestanden. En we horen ook graag hoe het nog beter zou kunnen. --- https://www.geoportaaloverijssel.nl/help Bestaande tekst: Heb je vragen over het Geoportaal van Overijssel? Of heb je tips of suggesties? Of wil je op de hoogte blijven van de laatste ontwikkelingen, bijvoorbeeld via de nieuwsbrief? Nieuwe tekst: Heb je vragen over het Geoportaal? Of heb je tips of suggesties? Of wil je op de hoogte blijven van de laatste ontwikkelingen? ---
test
generieke teksten in geoportaal als meer organisaties van het geoportaal gebruik gaan maken is het niet logisch dat op allerlei plekken naar de provincie overijssel wordt verwezen helemaal anoniem maken kan niet maar er kan wel op een aantal plekken iets worden veranderd hieronder eerst wat moet blijven om de herkenbaarheid van de provincie intact te houden daaronder de plaatsen waar aanpassingen mogelijk zijn de tekst provincie overijsel of verwijzing naar email blijft op plaatsen voorkomen niet veranderen dus laten zoals het nu is pagina het geoportaal van de provincie overijssel bevat datasets en beschrijvingen van datasets deze worden gebruikt door de provincie om het beleid van de provincie te ondersteunen zo worden ze gebruikt om kaarten van te maken of om analyses mee uit te voeren de provincie overijssel streeft ernaar om de data en ook de omschrijving bij de data juist en actueel te houden mocht je ondanks onze inspanningen toch onvolkomenheden tegenkomen aarzel dan niet contact met ons op te nemen pagina de informatie in het geoportaal wordt beschikbaar gesteld door het team beleidsinformatie van de provincie overijssel heb je vragen suggesties of tips aarzel dan niet contact op te nemen stuur je email naar beleidsinformatie overijssel nl pagina zoek en download actuele kaarten en de beschrijving van de kaarten van de provincie overijssel pagina download scherm b v deze gegevens worden beschikbaar gesteld door het geoportaal van overijssel in het geoportaal staan actuele kaarten en beschrijvingen van die kaarten ter referentie zijn vaak ook nog oudere kaarten beschikbaar gesteld de provincie overijssel stelt zoveel mogelijk kaarten als open data voor iedereen beschikbaar heeft u suggesties of vragen stuur dan een email naar beleidsinformatie overijssel nl zie proclaimer aanpassingen bestaande tekst gebruik bij het inloggen je emailadres zoals gebruikt wordt in overijssel je mag zowel hoofdletters als kleine letters gebruiken p pietersen overijssel nl is 
goed en ook p pietersen overijssel nl is goed nieuwe tekst gebruik bij het inloggen je werk emailadres je mag zowel hoofdletters als kleine letters gebruiken p pietersen werk nl is goed en ook p pietersen werk nl is goed bestaande tekst van een aantal datasets is de provincie overijssel geen eigenaar meestal zijn hierbij dan beperkingen voordat ze gedeeld mogen worden met anderen deze beperkingen staan ook in de beschrijving bij de datasets nieuwe tekst van een aantal datasets zijn wij geen eigenaar meestal zijn hierbij dan beperkingen voordat ze gedeeld mogen worden met anderen deze beperkingen staan ook in de beschrijving bij de datasets bestaande tekst handleiding voor het gebruik van het geoportaal van overijssel nieuwe tekst handleiding voor het gebruik van het geoportaal bestaande tekst het geoportaal is bedoeld voor het beheer en het beschikbaar stellen van kaarten die de provincie overijssel gebruikt nieuwe tekst het geoportaal is bedoeld voor het beheer en het beschikbaar stellen van kaarten bestaande tekst heb je niet de beschikking over dergelijke gis software maar wil je toch kaarten zien die gemaakt zijn door de provincie overijssel dat kan omdat er ook veel pdf bestanden beschikbaar gesteld worden zie hiervoor verder bij de toelichting over de “statische kaarten” nieuwe tekst heb je niet de beschikking over dergelijke gis software maar wil je toch kaarten zien dat kan omdat er ook veel pdf bestanden beschikbaar gesteld worden zie hiervoor verder bij de toelichting over de “statische kaarten” bestaande tekst in dit geoportaal worden niet alleen de datasets maar ook de bijbehorende services getoond dat zijn links naar websites die op een moderne manier dit soort kaarten beschikbaar stellen in het geoportaal van overijssel worden de datasets en ook de bijbehorende services ontsloten ook voor het gebruik van services het je een gis nodig nieuwe tekst in dit geoportaal worden niet alleen de datasets maar ook de bijbehorende services getoond dat zijn links 
naar websites die op een moderne manier dit soort kaarten beschikbaar stellen in het geoportaal worden de datasets en ook de bijbehorende services ontsloten ook voor het gebruik van services heb je een gis nodig bestaande tekst de provincie overijssel streeft ernaar om de informatie zo goed mogelijk beschikbaar te stellen we streven naar actualiteit en juistheid bij al onze beschikbare bestanden en we horen ook graag hoe het nog beter zou kunnen nieuwe tekst wij streven ernaar om de informatie zo goed mogelijk beschikbaar te stellen we streven naar actualiteit en juistheid bij al onze beschikbare bestanden en we horen ook graag hoe het nog beter zou kunnen bestaande tekst heb je vragen over het geoportaal van overijssel of heb je tips of suggesties of wil je op de hoogte blijven van de laatste ontwikkelingen bijvoorbeeld via de nieuwsbrief nieuwe tekst heb je vragen over het geoportaal of heb je tips of suggesties of wil je op de hoogte blijven van de laatste ontwikkelingen
1
82,123
15,646,506,863
IssuesEvent
2021-03-23 01:05:02
jgeraigery/linux
https://api.github.com/repos/jgeraigery/linux
opened
CVE-2019-15221 (Medium) detected in linuxv5.2
security vulnerability
## CVE-2019-15221 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary> <p> <p>Linux kernel source tree</p> <p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p> </p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (0)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in the Linux kernel before 5.1.17. There is a NULL pointer dereference caused by a malicious USB device in the sound/usb/line6/pcm.c driver. <p>Publish Date: 2019-08-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-15221>CVE-2019-15221</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Physical - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-15221">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-15221</a></p> <p>Release Date: 2019-08-19</p> <p>Fix Resolution: v5.2</p> </p> </details> <p></p>
True
CVE-2019-15221 (Medium) detected in linuxv5.2 - ## CVE-2019-15221 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary> <p> <p>Linux kernel source tree</p> <p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p> </p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (0)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in the Linux kernel before 5.1.17. There is a NULL pointer dereference caused by a malicious USB device in the sound/usb/line6/pcm.c driver. <p>Publish Date: 2019-08-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-15221>CVE-2019-15221</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Physical - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-15221">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-15221</a></p> <p>Release Date: 2019-08-19</p> <p>Fix Resolution: v5.2</p> </p> </details> <p></p>
non_test
cve medium detected in cve medium severity vulnerability vulnerable library linux kernel source tree library home page a href vulnerable source files vulnerability details an issue was discovered in the linux kernel before there is a null pointer dereference caused by a malicious usb device in the sound usb pcm c driver publish date url a href cvss score details base score metrics exploitability metrics attack vector physical attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
0
332,766
29,493,025,056
IssuesEvent
2023-06-02 14:46:08
rust-lang/rust
https://api.github.com/repos/rust-lang/rust
opened
rustdoc `--runtool` uses ambiguous relative paths with `--test-run-directory`
T-rustdoc C-bug A-doctests
The current code for `--runtool` can interact in platform-specific ways with `--test-run-directory` because it uses [`current_dir`](https://github.com/rust-lang/rust/blob/0939ec13d88dfafcbb7f25314bd0d2f1519bf0d5/src/librustdoc/doctest.rs#L466-L475) in combination with potentially relative paths which has [platform-specific behavior](https://doc.rust-lang.org/std/process/struct.Command.html#method.current_dir). This also causes issues with Cargo, since in some cases it makes the path relative to the package in a workspace instead of the workspace itself. This means that in a workspace, it can be difficult or impossible to use a relative path for the `--runtool` since it would change for every package. Reproduction: 1. `mkdir foo` 2. `cd foo` 3. `mkdir tool` 4. Create a runtool: ``` cat <<'EOF' > tool/echo.rs fn main() { eprintln!("{:?}", std::env::args().collect::<Vec<_>>()); } EOF ``` 5. `rustc --out-dir tool tool/echo.rs` 6. `mkdir subdir` 7. Create a test file: `````` cat <<'EOF' > subdir/example.rs /// ``` /// example::foo(); /// ``` pub fn foo() {} EOF `````` 8. `rustc --crate-type=rlib subdir/example.rs` 9. Verify test works: `rustdoc --edition=2021 -L . --test subdir/example.rs` 10. Verify with just runtool: `rustdoc --edition=2021 -L . --test subdir/example.rs --runtool=tool/echo -Z unstable-options --nocapture` 11. Try with both runtool and test-run-directory: `rustdoc --edition=2021 -L . --test subdir/example.rs --runtool=tool/echo --test-run-directory=subdir -Z unstable-options --nocapture` I expected this to happen: The last step should always work. Instead, this happened: The last step works on Windows, but fails on Unix-like platforms. In general, one needs to be careful when using `Command::current_dir` with relative paths. The recommended approach (and the one Cargo uses) is to check if the executable has a `/` or `\`, and if it does, do `current_working_directory.join(exe_path)` as the argument to `Command::new`. 
### Meta `rustc --version --verbose`: ``` rustc 1.71.0-nightly (9d871b061 2023-05-21) binary: rustc commit-hash: 9d871b0617a4b3d6610b7cee0ab5310dcb542c62 commit-date: 2023-05-21 host: aarch64-apple-darwin release: 1.71.0-nightly LLVM version: 16.0.4 ```
1.0
rustdoc `--runtool` uses ambiguous relative paths with `--test-run-directory` - The current code for `--runtool` can interact in platform-specific ways with `--test-run-directory` because it uses [`current_dir`](https://github.com/rust-lang/rust/blob/0939ec13d88dfafcbb7f25314bd0d2f1519bf0d5/src/librustdoc/doctest.rs#L466-L475) in combination with potentially relative paths which has [platform-specific behavior](https://doc.rust-lang.org/std/process/struct.Command.html#method.current_dir). This also causes issues with Cargo, since in some cases it makes the path relative to the package in a workspace instead of the workspace itself. This means that in a workspace, it can be difficult or impossible to use a relative path for the `--runtool` since it would change for every package. Reproduction: 1. `mkdir foo` 2. `cd foo` 3. `mkdir tool` 4. Create a runtool: ``` cat <<'EOF' > tool/echo.rs fn main() { eprintln!("{:?}", std::env::args().collect::<Vec<_>>()); } EOF ``` 5. `rustc --out-dir tool tool/echo.rs` 6. `mkdir subdir` 7. Create a test file: `````` cat <<'EOF' > subdir/example.rs /// ``` /// example::foo(); /// ``` pub fn foo() {} EOF `````` 8. `rustc --crate-type=rlib subdir/example.rs` 9. Verify test works: `rustdoc --edition=2021 -L . --test subdir/example.rs` 10. Verify with just runtool: `rustdoc --edition=2021 -L . --test subdir/example.rs --runtool=tool/echo -Z unstable-options --nocapture` 11. Try with both runtool and test-run-directory: `rustdoc --edition=2021 -L . --test subdir/example.rs --runtool=tool/echo --test-run-directory=subdir -Z unstable-options --nocapture` I expected this to happen: The last step should always work. Instead, this happened: The last step works on Windows, but fails on Unix-like platforms. I general, one needs to be careful when using `Command::current_dir` with relative paths. 
The recommended approach (and the one Cargo uses) is to check if the executable has a `/` or `\`, and if it does, do `current_working_directory.join(exe_path)` as the argument to `Command::new`. ### Meta `rustc --version --verbose`: ``` rustc 1.71.0-nightly (9d871b061 2023-05-21) binary: rustc commit-hash: 9d871b0617a4b3d6610b7cee0ab5310dcb542c62 commit-date: 2023-05-21 host: aarch64-apple-darwin release: 1.71.0-nightly LLVM version: 16.0.4 ```
test
rustdoc runtool uses ambiguous relative paths with test run directory the current code for runtool can interact in platform specific ways with test run directory because it uses in combination with potentially relative paths which has this also causes issues with cargo since in some cases it makes the path relative to the package in a workspace instead of the workspace itself this means that in a workspace it can be difficult or impossible to use a relative path for the runtool since it would change for every package reproduction mkdir foo cd foo mkdir tool create a runtool cat tool echo rs fn main eprintln std env args collect eof rustc out dir tool tool echo rs mkdir subdir create a test file cat subdir example rs example foo pub fn foo eof rustc crate type rlib subdir example rs verify test works rustdoc edition l test subdir example rs verify with just runtool rustdoc edition l test subdir example rs runtool tool echo z unstable options nocapture try with both runtool and test run directory rustdoc edition l test subdir example rs runtool tool echo test run directory subdir z unstable options nocapture i expected this to happen the last step should always work instead this happened the last step works on windows but fails on unix like platforms i general one needs to be careful when using command current dir with relative paths the recommended approach and the one cargo uses is to check if the executable has a or and if it does do current working directory join exe path as the argument to command new meta rustc version verbose rustc nightly binary rustc commit hash commit date host apple darwin release nightly llvm version
1
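The rustdoc record above recommends resolving a runtool path against the invoking process's working directory whenever it contains a path separator, so that setting the child's working directory cannot reinterpret it. A minimal Python sketch of that rule (`resolve_runtool` is a hypothetical helper, not the actual Rust implementation):

```python
import os

def resolve_runtool(exe: str, cwd: str) -> str:
    """If the tool path contains a path separator, anchor it to the
    invoking process's working directory, so a later change of the
    child's working directory (e.g. --test-run-directory) cannot
    reinterpret it. Bare names are left alone for PATH lookup."""
    if "/" in exe or os.sep in exe:
        return os.path.normpath(os.path.join(cwd, exe))
    return exe
```

With this rule, `tool/echo` invoked from `/home/user/foo` always resolves to `/home/user/foo/tool/echo`, regardless of what working directory the child is later given, while a bare `echo` is still looked up on `PATH`.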
119,255
10,036,252,141
IssuesEvent
2019-07-18 10:11:45
dbrownukk/EFD_v2
https://api.github.com/repos/dbrownukk/EFD_v2
closed
Report summary should contain the definition of the report spec used
For Testing bug
The summary page includes the reports run and the HHs included. But it should also include: Household Inclusion rule Not Applicable or Household Inclusion rule <HH question prompt> <HH question answer> Number of Households included <count of HHs included> Specific Categories included <comma delineated concatenation of Category names> Specific Resource Types included <comma delineated concatenation of RT names> Specific Resources included <comma delineated concatenation of RST names> Quantiles <1st quantile name> <1st quantile %age> <2nd quantile name> <2nd quantile %age> <3rd quantile name> <3rd quantile %age> Reporting Currency <ISO code for reporting currency> This additional data would allow better title and annotation of the various report visualisations.
1.0
Report summary should contain the definition of the report spec used - The summary page includes the reports run and the HHs included. But it should also include: Household Inclusion rule Not Applicable or Household Inclusion rule <HH question prompt> <HH question answer> Number of Households included <count of HHs incuded> Specific Categories included <comma delineated concatenation of Category names> Specific Resource Types included <comma delineated concatenation of RT names> Specific Resources included <comma delineated concatenation of RST names> Quantiles <1st quantile name> <1st quantile %age> <2nd quantile name> <2nd quantile %age> <3rd quantile name> <3rd quantile %age> Reporting Currency <ISO code for reporting currency> This additional data would allow better title and annotation of the various report visualisations.
test
report summary should contain the definition of the report spec used the summary page includes the reports run and the hhs included but it should also include household inclusion rule not applicable or household inclusion rule number of households included specific categories included specific resource types included specific resources included quantiles reporting currency this additional data would allow better title and annotation of the various report visualisations
1
179,043
30,106,479,073
IssuesEvent
2023-06-30 02:04:05
broken-helix/keep-it-tidy-london
https://api.github.com/repos/broken-helix/keep-it-tidy-london
reopened
EPIC #5 : SITE DESIGN
must have EPIC #5: SITE DESIGN
This epic covers the design of the site. ### Tasks: - [x] #20 - [x] #27 - [x] #5
1.0
EPIC #5 : SITE DESIGN - This epic covers the design of the site. ### Tasks: - [x] #20 - [x] #27 - [x] #5
non_test
epic site design this epic covers the design of the site tasks
0
4,077
2,702,670,863
IssuesEvent
2015-04-06 10:58:39
pretix/pretix
https://api.github.com/repos/pretix/pretix
closed
Required boolean question
bug easy pretix.presale test case
A required question of type boolean should require the user to select either Yes or No, but currently requires the user to select Yes (because the checkbox is otherwise not checked). A SelectInput could be the solution to this.
1.0
Required boolean question - A required question of type boolean should require the user to select either Yes or No, but currently requires the user to select Yes (because the checkbox is otherwise not checked). A SelectInput could be the solution to this.
test
required boolean question a required question of type boolean should require the user to select either yes or no but currently requires the user to select yes because the checkbox is otherwise not checked a selectinput could be the solution to this
1
154,117
12,193,961,458
IssuesEvent
2020-04-29 15:08:04
Arquisoft/viade_en3b1
https://api.github.com/repos/Arquisoft/viade_en3b1
closed
React Testing
PRIO: High test
Apart from tests of #4, I will also have to test all the React rendering and basic application framework.
1.0
React Testing - Apart from tests of #4, I will also have to test all the React rendering and basic application framework.
test
react testing apart from tests of i will also have to test all the react rendering and basic application framework
1
17,782
3,640,788,133
IssuesEvent
2016-02-13 04:34:12
s42ky/ayso-schedules
https://api.github.com/repos/s42ky/ayso-schedules
closed
Unit test init of all view components
automation testing
Make sure all templates compile by unit testing views. Need to see if can find manner for this to work easily, as currently cumbersome with lots of injection
1.0
Unit test init of all view components - Make sure all templates compile by unit testing views. Need to see if can find manner for this to work easily, as currently cumbersome with lots of injection
test
unit test init of all view components make sure all templates compile by unit testing views need to see if can find manner for this to work easily as currently cumbersome with lots of injection
1
808,494
30,085,092,118
IssuesEvent
2023-06-29 08:02:20
JulesBelveze/bert-squeeze
https://api.github.com/repos/JulesBelveze/bert-squeeze
closed
Enable batching in `DeeBert` at inference time
priority:medium
Since we need to assess the amount of information carried out by one prediction it is not possible to have a batch size greater than 1 at inference time. I have implemented a workaround for FastBert which can be found [here](https://github.com/BitVoyage/FastBERT/commit/bb8cd0ec344786068a44b59469475c04f1f06185) and should be adjustable to DeeBert.
1.0
Enable batching in `DeeBert` at inference time - Since we need to asses the amount of information carried out by one prediction it is not possible to have a batch size greater than 1 at inference time. I have implemented a workaround for FastBert which can be found [here](https://github.com/BitVoyage/FastBERT/commit/bb8cd0ec344786068a44b59469475c04f1f06185) and should be adjustable to DeeBert.
non_test
enable batching in deebert at inference time since we need to asses the amount of information carried out by one prediction it is not possible to have a batch size greater than at inference time i have implemented a workaround for fastbert which can be found and should be adjustable to deebert
0
139,480
20,868,550,355
IssuesEvent
2022-03-22 09:45:38
iotaledger/explorer
https://api.github.com/repos/iotaledger/explorer
opened
[Task]: Design for NFT address metadata
priority:3 status:blocked type:feature type:ux:design network:shimmer
### Task description As part of the design changes we have to do for explorer, we need to display the information associated with an NFT address when a user searches for it. [Link](https://hackmd.io/diWWFAVjSOaqQYJXDQaAAg) to the user stories document. ### Requirements When a user searches for an NFT address, it should not only display the amount of IOTA coins held by the NFT (as described in https://github.com/iotaledger/explorer/issues/232), but should also have other NFT specific information. ### Acceptance criteria When an NFT address is searched for, it should display the `metadata` associated with NFT. TODO (@laumair): Explain what `metadata` is ### Creation checklist - [ ] I have assigned this task to the correct people - [X] I have added the most appropriate labels - [X] I have linked the correct milestone and/or project
1.0
[Task]: Design for NFT address metadata - ### Task description As part of the design changes we have to do for explorer, we need to display the information associated with an NFT address when a user searches for it. [Link](https://hackmd.io/diWWFAVjSOaqQYJXDQaAAg) to the user stories document. ### Requirements When a user searches for an NFT address, it should not only display the amount of IOTA coins held by the NFT (as described in https://github.com/iotaledger/explorer/issues/232), but should also have other NFT specific information. ### Acceptance criteria When an NFT address is searched for, it should display the `metadata` associated with NFT. TODO (@laumair): Explain what `metadata` is ### Creation checklist - [ ] I have assigned this task to the correct people - [X] I have added the most appropriate labels - [X] I have linked the correct milestone and/or project
non_test
design for nft address metadata task description as part of the design changes we have to do for explorer we need to display the information associated with an nft address when a user searches for it to the user stories document requirements when a user searches for an nft address it should not only display the amount of iota coins held by the nft as described in but should also have other nft specific information acceptance criteria when an nft address is searched for it should display the metadata associated with nft todo laumair explain what metadata is creation checklist i have assigned this task to the correct people i have added the most appropriate labels i have linked the correct milestone and or project
0
49,608
12,391,360,257
IssuesEvent
2020-05-20 12:21:58
Scirra/Construct-3-bugs
https://api.github.com/repos/Scirra/Construct-3-bugs
closed
error when compiling for android
Build Service can't reproduce
<!-- You must use this template or your issue will be closed without investigation. Please see the guidelines. --> ## Problem description <!-- Enter a concise description of your problem here --> Error generating apk ## Attach a .c3p <!-- A minimal Construct 3 project (.c3p) is required to be attached. Your issue will likely be closed without investigation if you don't provide one. Please see the guidelines --> [test.zip](https://github.com/Scirra/Construct-3-bugs/files/4644942/test.zip) ## Steps to reproduce <!-- These steps are essential for us to be able to help you. Usually it is impossible to investigate reports unless they include steps we can follow ourselves, so please do your best to provide specific steps. There is no need to explain how you made the attached project - just explain what to do to with the project to observe the issue. --> 1. export >> android ## Observed result <!-- What do you see happen? --> Error: Checking Java JDK and Android SDK versions ANDROID_SDK_ROOT=~~/androidSDK (recommended setting) ANDROID_HOME=~~/androidSDK (DEPRECATED) :wrapper BUILD SUCCESSFUL in 1s 1 actionable task: 1 executed Subproject Path: CordovaLib Subproject Path: app > Configure project :app Project evaluation failed including an error in afterEvaluate {}. Run with --stacktrace for details of the afterEvaluate {} error. FAILURE: Build failed with an exception. * Where: Script '~~/cordova.gradle' line: 132 * What went wrong: A problem occurred evaluating project ':app'. > No match found * Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights. * Get more help at https:/~~/help.gradle.org BUILD FAILED in 1s ~~/gradlew: Command failed with exit code 1 Error output: Project evaluation failed including an error in afterEvaluate {}. Run with --stacktrace for details of the afterEvaluate {} error. FAILURE: Build failed with an exception. 
* Where: Script '~~/cordova.gradle' line: 132 * What went wrong: A problem occurred evaluating project ':app'. > No match found * Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights. * Get more help at https:/~~/help.gradle.org BUILD FAILED in 1s ## Expected result <!-- What did you expect to happen instead? --> An android application ## More details <!-- Providing this information will make it more likely the issue you are reporting can be fixed quickly. --> <!-- It's helpful to test as many browsers, platforms or export options as possible. For example an issue occurs in an Android app, does it also occur in Chrome on Windows? How about Firefox? etc. --> yes, it occurs in chrome, fire fox **Affected browsers/platforms:** <!-- Chrome/Firefox/Safari, Windows/macOS/Android, etc --> <!-- Identifying the first version the issue started happening can help resolve the issue more quickly. -->version : r197.2 **First affected release:** <!-- e.g. worked in r122 but broke in r123 --> ## System details <!-- If you see a crash report dialog, please copy and paste it to where it says "PASTE HERE" below. --> ![erro2](https://user-images.githubusercontent.com/65552816/82230220-d1acc800-9901-11ea-8067-79b982791822.png) ![erro3](https://user-images.githubusercontent.com/65552816/82230222-d2455e80-9901-11ea-8820-2847417ff7c1.png) ![errro1](https://user-images.githubusercontent.com/65552816/82230224-d2455e80-9901-11ea-952f-2fc996a5f1b0.png) <!-- Otherwise please go to Menu > About > Platform information and paste that information there instead. --> <details><summary>View details</summary> PASTE HERE </details>
1.0
error when compiling for android - <!-- You must use this template or your issue will be closed without investigation. Please see the guidelines. --> ## Problem description <!-- Enter a concise description of your problem here --> Error generating apk ## Attach a .c3p <!-- A minimal Construct 3 project (.c3p) is required to be attached. Your issue will likely be closed without investigation if you don't provide one. Please see the guidelines --> [test.zip](https://github.com/Scirra/Construct-3-bugs/files/4644942/test.zip) ## Steps to reproduce <!-- These steps are essential for us to be able to help you. Usually it is impossible to investigate reports unless they include steps we can follow ourselves, so please do your best to provide specific steps. There is no need to explain how you made the attached project - just explain what to do to with the project to observe the issue. --> 1. export >> android ## Observed result <!-- What do you see happen? --> Error: Checking Java JDK and Android SDK versions ANDROID_SDK_ROOT=~~/androidSDK (recommended setting) ANDROID_HOME=~~/androidSDK (DEPRECATED) :wrapper BUILD SUCCESSFUL in 1s 1 actionable task: 1 executed Subproject Path: CordovaLib Subproject Path: app > Configure project :app Project evaluation failed including an error in afterEvaluate {}. Run with --stacktrace for details of the afterEvaluate {} error. FAILURE: Build failed with an exception. * Where: Script '~~/cordova.gradle' line: 132 * What went wrong: A problem occurred evaluating project ':app'. > No match found * Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights. * Get more help at https:/~~/help.gradle.org BUILD FAILED in 1s ~~/gradlew: Command failed with exit code 1 Error output: Project evaluation failed including an error in afterEvaluate {}. Run with --stacktrace for details of the afterEvaluate {} error. FAILURE: Build failed with an exception. 
* Where: Script '~~/cordova.gradle' line: 132 * What went wrong: A problem occurred evaluating project ':app'. > No match found * Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights. * Get more help at https:/~~/help.gradle.org BUILD FAILED in 1s ## Expected result <!-- What did you expect to happen instead? --> An android application ## More details <!-- Providing this information will make it more likely the issue you are reporting can be fixed quickly. --> <!-- It's helpful to test as many browsers, platforms or export options as possible. For example an issue occurs in an Android app, does it also occur in Chrome on Windows? How about Firefox? etc. --> yes, it occurs in chrome, fire fox **Affected browsers/platforms:** <!-- Chrome/Firefox/Safari, Windows/macOS/Android, etc --> <!-- Identifying the first version the issue started happening can help resolve the issue more quickly. -->version : r197.2 **First affected release:** <!-- e.g. worked in r122 but broke in r123 --> ## System details <!-- If you see a crash report dialog, please copy and paste it to where it says "PASTE HERE" below. --> ![erro2](https://user-images.githubusercontent.com/65552816/82230220-d1acc800-9901-11ea-8067-79b982791822.png) ![erro3](https://user-images.githubusercontent.com/65552816/82230222-d2455e80-9901-11ea-8820-2847417ff7c1.png) ![errro1](https://user-images.githubusercontent.com/65552816/82230224-d2455e80-9901-11ea-952f-2fc996a5f1b0.png) <!-- Otherwise please go to Menu > About > Platform information and paste that information there instead. --> <details><summary>View details</summary> PASTE HERE </details>
non_test
error when compiling for android problem description error generating apk attach a steps to reproduce export android observed result error checking java jdk and android sdk versions android sdk root androidsdk recommended setting android home androidsdk deprecated wrapper build successful in actionable task executed subproject path cordovalib subproject path app configure project app project evaluation failed including an error in afterevaluate run with stacktrace for details of the afterevaluate error failure build failed with an exception where script cordova gradle line what went wrong a problem occurred evaluating project app no match found try run with stacktrace option to get the stack trace run with info or debug option to get more log output run with scan to get full insights get more help at https help gradle org build failed in gradlew command failed with exit code error output project evaluation failed including an error in afterevaluate run with stacktrace for details of the afterevaluate error failure build failed with an exception where script cordova gradle line what went wrong a problem occurred evaluating project app no match found try run with stacktrace option to get the stack trace run with info or debug option to get more log output run with scan to get full insights get more help at https help gradle org build failed in expected result an android application more details yes it occurs in chrome fire fox affected browsers platforms version first affected release system details about platform information and paste that information there instead view details paste here
0
520,391
15,085,689,535
IssuesEvent
2021-02-05 19:04:40
kubernetes/minikube
https://api.github.com/repos/kubernetes/minikube
closed
cluster launched from WSL2 conflicts with PowerShell launched cluster
kind/bug lifecycle/rotten os/windows os/wsl-windows priority/important-longterm
Some quirky behavior I noticed: * In the default configuration for Docker, WSL2 and PowerShell share a container space * minikube sees the correctly named container, but does not have access to SSH certificates between environments.
1.0
cluster launched from WSL2 conflicts with PowerShell launched cluster - Some quirky behavior I noticed: * In the default configuration for Docker, WSL2 and PowerShell share a container space * minikube sees the correctly named container, but does not have access to SSH certificates between environments.
non_test
cluster launched from conflicts with powershell launched cluster some quirky behavior i noticed in the default configuration for docker and powershell share a container space minikube sees the correctly named container but does not have access to ssh certificates between environments
0
148,800
11,865,463,129
IssuesEvent
2020-03-26 00:28:38
dotnet/aspnetcore
https://api.github.com/repos/dotnet/aspnetcore
opened
Test failure: BlazorServerTemplateWorks_NoAuth
test-failure
https://dev.azure.com/dnceng/public/_build/results?buildId=573684&view=ms.vss-test-web.build-test-results-tab&runId=17989716&resultId=100011&paneView=debug <details> <summary>Error</summary> ``` OpenQA.Selenium.BrowserAssertFailedException : Xunit.Sdk.EqualException: Assert.Equal() Failure\r\nExpected: Current count: 1\r\nActual: Current count: 0\r\n at Xunit.Assert.Equal[T](T expected, T actual, IEqualityComparer1 comparer) in C:\\Dev\\xunit\\xunit\\src\\xunit.assert\\Asserts\\EqualityAsserts.cs:line 40\r\n at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass15_0.<WaitAssertCore>b__0() in /_/src/Shared/E2ETesting/WaitAssert.cs:line 74\r\n at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass16_01.<WaitAssertCore>b__0(IWebDriver ) in //src/Shared/E2ETesting/WaitAssert.cs:line 100\r\nScreen shot captured at 'F:\workspace\_work\1\s\artifacts\TestResults\Release\ProjectTemplates.Tests\8d6ea42696df493da5a51bef15956f3a.png'\r\nEncountered browser errors\r\n[2020-03-25T23:29:57Z] [Info] https://localhost:50863/_framework/blazor.server.js 0:5359 "[2020-03-25T23:29:57.084Z] Information: Normalizing '_blazor' to 'https://localhost:50863/_blazor'."\r\n[2020-03-25T23:30:02Z] [Info] https://localhost:50863/_framework/blazor.server.js 0:5359 "[2020-03-25T23:29:57.505Z] Information: WebSocket connected to wss://localhost:50863/_blazor?id=wXSTzHmh9YZV4R9olS75zQ."\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:03.278Z] Error: Connection disconnected with error 'Error: WebSocket closed with status code: 1006 ().'."\r\n[2020-03-25T23:30:06Z] [Info] https://localhost:50863/_framework/blazor.server.js 0:5359 "[2020-03-25T23:30:06.282Z] Information: Normalizing '_blazor' to 'https://localhost:50863/_blazor'."\r\n[2020-03-25T23:30:06Z] [Warning] https://localhost:50863/_framework/blazor.server.js 0:5259 "[2020-03-25T23:30:06.963Z] Warning: Error from HTTP request. 
TypeError: Failed to fetch."\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:06.964Z] Error: Failed to complete negotiation with the server: TypeError: Failed to fetch"\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:06.965Z] Error: Failed to start the connection: TypeError: Failed to fetch"\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 14:27306 "[2020-03-25T23:30:06.966Z] Error: TypeError: Failed to fetch"\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 14:27306 "[2020-03-25T23:30:06.968Z] Error: Error: Cannot send data if the connection is not in the 'Connected' State."\r\n[2020-03-25T23:30:07Z] [Info] https://localhost:50909/_framework/blazor.server.js 0:5359 "[2020-03-25T23:30:07.063Z] Information: Normalizing '_blazor' to 'https://localhost:50909/_blazor'."\r\n[2020-03-25T23:30:12Z] [Info] https://localhost:50909/_framework/blazor.server.js 0:5359 "[2020-03-25T23:30:07.420Z] Information: WebSocket connected to wss://localhost:50909/_blazor?id=QgbEYgBcbuSBSI-4cRrJEQ."\r\n[2020-03-25T23:34:46Z] [Severe] https://localhost:50909/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:13.205Z] Error: Connection disconnected with error 'Error: WebSocket closed with status code: 1006 ().'."\r\n[2020-03-25T23:34:46Z] [Info] https://localhost:50909/_framework/blazor.server.js 0:5359 "[2020-03-25T23:30:16.212Z] Information: Normalizing '_blazor' to 'https://localhost:50909/_blazor'."\r\n[2020-03-25T23:34:46Z] [Severe] https://localhost:50909/_blazor/negotiate?negotiateVersion=1 - Failed to load resource: net::ERR_CONNECTION_REFUSED\r\n[2020-03-25T23:34:46Z] [Warning] https://localhost:50909/_framework/blazor.server.js 0:5259 "[2020-03-25T23:30:18.244Z] Warning: Error from HTTP request. 
TypeError: Failed to fetch."\r\n[2020-03-25T23:34:46Z] [Severe] https://localhost:50909/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:18.245Z] Error: Failed to complete negotiation with the server: TypeError: Failed to fetch"\r\n[2020-03-25T23:34:46Z] [Sev ``` </details> <details> <summary>Stacktrace</summary> ``` at Microsoft.AspNetCore.E2ETesting.WaitAssert.WaitAssertCore[TResult](IWebDriver driver, Func`1 assertion, TimeSpan timeout) in /_/src/Shared/E2ETesting/WaitAssert.cs:line 120 at Microsoft.AspNetCore.E2ETesting.WaitAssert.WaitAssertCore(IWebDriver driver, Action assertion, TimeSpan timeout) in /_/src/Shared/E2ETesting/WaitAssert.cs:line 75 at Microsoft.AspNetCore.E2ETesting.WaitAssert.Equal[T](IWebDriver driver, T expected, Func`1 actual) in /_/src/Shared/E2ETesting/WaitAssert.cs:line 24 at Templates.Test.BlazorServerTemplateTest.TestBasicNavigation() in /_/src/ProjectTemplates/test/BlazorServerTemplateTest.cs:line 158 at Templates.Test.BlazorServerTemplateTest.BlazorServerTemplateWorks_NoAuth() in /_/src/ProjectTemplates/test/BlazorServerTemplateTest.cs:line 74 --- End of stack trace from previous location --- ----- Inner Stack Trace ----- at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass15_0.<WaitAssertCore>b__0() in /_/src/Shared/E2ETesting/WaitAssert.cs:line 74 at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass16_0`1.<WaitAssertCore>b__0(IWebDriver _) in /_/src/Shared/E2ETesting/WaitAssert.cs:line 100 ``` </details>
1.0
Test failure: BlazorServerTemplateWorks_NoAuth - https://dev.azure.com/dnceng/public/_build/results?buildId=573684&view=ms.vss-test-web.build-test-results-tab&runId=17989716&resultId=100011&paneView=debug <details> <summary>Error</summary> ``` OpenQA.Selenium.BrowserAssertFailedException : Xunit.Sdk.EqualException: Assert.Equal() Failure\r\nExpected: Current count: 1\r\nActual: Current count: 0\r\n at Xunit.Assert.Equal[T](T expected, T actual, IEqualityComparer1 comparer) in C:\\Dev\\xunit\\xunit\\src\\xunit.assert\\Asserts\\EqualityAsserts.cs:line 40\r\n at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass15_0.<WaitAssertCore>b__0() in /_/src/Shared/E2ETesting/WaitAssert.cs:line 74\r\n at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass16_01.<WaitAssertCore>b__0(IWebDriver ) in //src/Shared/E2ETesting/WaitAssert.cs:line 100\r\nScreen shot captured at 'F:\workspace\_work\1\s\artifacts\TestResults\Release\ProjectTemplates.Tests\8d6ea42696df493da5a51bef15956f3a.png'\r\nEncountered browser errors\r\n[2020-03-25T23:29:57Z] [Info] https://localhost:50863/_framework/blazor.server.js 0:5359 "[2020-03-25T23:29:57.084Z] Information: Normalizing '_blazor' to 'https://localhost:50863/_blazor'."\r\n[2020-03-25T23:30:02Z] [Info] https://localhost:50863/_framework/blazor.server.js 0:5359 "[2020-03-25T23:29:57.505Z] Information: WebSocket connected to wss://localhost:50863/_blazor?id=wXSTzHmh9YZV4R9olS75zQ."\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:03.278Z] Error: Connection disconnected with error 'Error: WebSocket closed with status code: 1006 ().'."\r\n[2020-03-25T23:30:06Z] [Info] https://localhost:50863/_framework/blazor.server.js 0:5359 "[2020-03-25T23:30:06.282Z] Information: Normalizing '_blazor' to 'https://localhost:50863/_blazor'."\r\n[2020-03-25T23:30:06Z] [Warning] https://localhost:50863/_framework/blazor.server.js 0:5259 "[2020-03-25T23:30:06.963Z] Warning: Error from 
HTTP request. TypeError: Failed to fetch."\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:06.964Z] Error: Failed to complete negotiation with the server: TypeError: Failed to fetch"\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:06.965Z] Error: Failed to start the connection: TypeError: Failed to fetch"\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 14:27306 "[2020-03-25T23:30:06.966Z] Error: TypeError: Failed to fetch"\r\n[2020-03-25T23:30:06Z] [Severe] https://localhost:50863/_framework/blazor.server.js 14:27306 "[2020-03-25T23:30:06.968Z] Error: Error: Cannot send data if the connection is not in the 'Connected' State."\r\n[2020-03-25T23:30:07Z] [Info] https://localhost:50909/_framework/blazor.server.js 0:5359 "[2020-03-25T23:30:07.063Z] Information: Normalizing '_blazor' to 'https://localhost:50909/_blazor'."\r\n[2020-03-25T23:30:12Z] [Info] https://localhost:50909/_framework/blazor.server.js 0:5359 "[2020-03-25T23:30:07.420Z] Information: WebSocket connected to wss://localhost:50909/_blazor?id=QgbEYgBcbuSBSI-4cRrJEQ."\r\n[2020-03-25T23:34:46Z] [Severe] https://localhost:50909/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:13.205Z] Error: Connection disconnected with error 'Error: WebSocket closed with status code: 1006 ().'."\r\n[2020-03-25T23:34:46Z] [Info] https://localhost:50909/_framework/blazor.server.js 0:5359 "[2020-03-25T23:30:16.212Z] Information: Normalizing '_blazor' to 'https://localhost:50909/_blazor'."\r\n[2020-03-25T23:34:46Z] [Severe] https://localhost:50909/_blazor/negotiate?negotiateVersion=1 - Failed to load resource: net::ERR_CONNECTION_REFUSED\r\n[2020-03-25T23:34:46Z] [Warning] https://localhost:50909/_framework/blazor.server.js 0:5259 "[2020-03-25T23:30:18.244Z] Warning: Error from HTTP request. 
TypeError: Failed to fetch."\r\n[2020-03-25T23:34:46Z] [Severe] https://localhost:50909/_framework/blazor.server.js 0:5162 "[2020-03-25T23:30:18.245Z] Error: Failed to complete negotiation with the server: TypeError: Failed to fetch"\r\n[2020-03-25T23:34:46Z] [Sev ``` </details> <details> <summary>Stacktrace</summary> ``` at Microsoft.AspNetCore.E2ETesting.WaitAssert.WaitAssertCore[TResult](IWebDriver driver, Func`1 assertion, TimeSpan timeout) in /_/src/Shared/E2ETesting/WaitAssert.cs:line 120 at Microsoft.AspNetCore.E2ETesting.WaitAssert.WaitAssertCore(IWebDriver driver, Action assertion, TimeSpan timeout) in /_/src/Shared/E2ETesting/WaitAssert.cs:line 75 at Microsoft.AspNetCore.E2ETesting.WaitAssert.Equal[T](IWebDriver driver, T expected, Func`1 actual) in /_/src/Shared/E2ETesting/WaitAssert.cs:line 24 at Templates.Test.BlazorServerTemplateTest.TestBasicNavigation() in /_/src/ProjectTemplates/test/BlazorServerTemplateTest.cs:line 158 at Templates.Test.BlazorServerTemplateTest.BlazorServerTemplateWorks_NoAuth() in /_/src/ProjectTemplates/test/BlazorServerTemplateTest.cs:line 74 --- End of stack trace from previous location --- ----- Inner Stack Trace ----- at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass15_0.<WaitAssertCore>b__0() in /_/src/Shared/E2ETesting/WaitAssert.cs:line 74 at Microsoft.AspNetCore.E2ETesting.WaitAssert.<>c__DisplayClass16_0`1.<WaitAssertCore>b__0(IWebDriver _) in /_/src/Shared/E2ETesting/WaitAssert.cs:line 100 ``` </details>
test
test failure blazorservertemplateworks noauth error openqa selenium browserassertfailedexception xunit sdk equalexception assert equal failure r nexpected current count r nactual current count r n at xunit assert equal t expected t actual comparer in c dev xunit xunit src xunit assert asserts equalityasserts cs line r n at microsoft aspnetcore waitassert c b in src shared waitassert cs line r n at microsoft aspnetcore waitassert c b iwebdriver in src shared waitassert cs line r nscreen shot captured at f workspace work s artifacts testresults release projecttemplates tests png r nencountered browser errors r n information normalizing blazor to information websocket connected to wss localhost blazor id r n error connection disconnected with error error websocket closed with status code r n information normalizing blazor to warning error from http request typeerror failed to fetch r n error failed to complete negotiation with the server typeerror failed to fetch r n error failed to start the connection typeerror failed to fetch r n error typeerror failed to fetch r n error error cannot send data if the connection is not in the connected state r n information normalizing blazor to information websocket connected to wss localhost blazor id qgbeygbcbusbsi r n error connection disconnected with error error websocket closed with status code r n information normalizing blazor to failed to load resource net err connection refused r n warning error from http request typeerror failed to fetch r n error failed to complete negotiation with the server typeerror failed to fetch r n sev stacktrace at microsoft aspnetcore waitassert waitassertcore iwebdriver driver func assertion timespan timeout in src shared waitassert cs line at microsoft aspnetcore waitassert waitassertcore iwebdriver driver action assertion timespan timeout in src shared waitassert cs line at microsoft aspnetcore waitassert equal iwebdriver driver t expected func actual in src shared waitassert cs line at 
templates test blazorservertemplatetest testbasicnavigation in src projecttemplates test blazorservertemplatetest cs line at templates test blazorservertemplatetest blazorservertemplateworks noauth in src projecttemplates test blazorservertemplatetest cs line end of stack trace from previous location inner stack trace at microsoft aspnetcore waitassert c b in src shared waitassert cs line at microsoft aspnetcore waitassert c b iwebdriver in src shared waitassert cs line
1
280,311
24,293,579,487
IssuesEvent
2022-09-29 08:18:19
valory-xyz/open-autonomy
https://api.github.com/repos/valory-xyz/open-autonomy
closed
Containers need more time to start on CI from `contracts-amm` image
test
### Subject of the issue Some tests are flaky because containers fail to start on time on CI. Specifically, this happened on [Merge pull request #1395 from valory-xyz/feat/autoupdate-packagelist main_workflow #5418](https://github.com/valory-xyz/open-autonomy/actions/runs/3133747181/jobs/5087596544) for some tests that use the `contracts-amm` image. We could try increasing the max attempts, e.g., from 20 to 25, here: https://github.com/valory-xyz/open-autonomy/blob/61c7f69b3e6290e94fdc6837cf3bbaaddccdcf22/plugins/aea-test-autonomy/aea_test_autonomy/fixture_helpers.py#L277 Example of the failure: ``` 2022-09-27T08:08:33.0046808Z INFO aea_test_autonomy.docker.base:base.py:140 Setting up image valory/contracts-amm:latest... 2022-09-27T08:08:33.0047362Z DEBUG urllib3.connectionpool:connectionpool.py:228 Starting new HTTP connection (1): 127.0.0.1:8545 2022-09-27T08:08:33.0048124Z ERROR root:amm_net.py:98 Exception: ConnectionError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer')) 2022-09-27T08:08:33.0048653Z INFO root:amm_net.py:99 Attempt 0 failed. Retrying in 2.0 seconds... . . . 2022-09-27T08:08:33.0083634Z DEBUG urllib3.connectionpool:connectionpool.py:228 Starting new HTTP connection (1): 127.0.0.1:8545 2022-09-27T08:08:33.0084423Z ERROR root:amm_net.py:98 Exception: ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')) 2022-09-27T08:08:33.0084953Z INFO root:amm_net.py:99 Attempt 19 failed. Retrying in 2.0 seconds... ```
1.0
Containers need more time to start on CI from `contracts-amm` image - ### Subject of the issue Some tests are flaky because containers fail to start on time on CI. Specifically, this happened on [Merge pull request #1395 from valory-xyz/feat/autoupdate-packagelist main_workflow #5418](https://github.com/valory-xyz/open-autonomy/actions/runs/3133747181/jobs/5087596544) for some tests that use the `contracts-amm` image. We could try increasing the max attempts, e.g., from 20 to 25, here: https://github.com/valory-xyz/open-autonomy/blob/61c7f69b3e6290e94fdc6837cf3bbaaddccdcf22/plugins/aea-test-autonomy/aea_test_autonomy/fixture_helpers.py#L277 Example of the failure: ``` 2022-09-27T08:08:33.0046808Z INFO aea_test_autonomy.docker.base:base.py:140 Setting up image valory/contracts-amm:latest... 2022-09-27T08:08:33.0047362Z DEBUG urllib3.connectionpool:connectionpool.py:228 Starting new HTTP connection (1): 127.0.0.1:8545 2022-09-27T08:08:33.0048124Z ERROR root:amm_net.py:98 Exception: ConnectionError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer')) 2022-09-27T08:08:33.0048653Z INFO root:amm_net.py:99 Attempt 0 failed. Retrying in 2.0 seconds... . . . 2022-09-27T08:08:33.0083634Z DEBUG urllib3.connectionpool:connectionpool.py:228 Starting new HTTP connection (1): 127.0.0.1:8545 2022-09-27T08:08:33.0084423Z ERROR root:amm_net.py:98 Exception: ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')) 2022-09-27T08:08:33.0084953Z INFO root:amm_net.py:99 Attempt 19 failed. Retrying in 2.0 seconds... ```
test
containers need more time to start on ci from contracts amm image subject of the issue some tests are flaky because containers fail to start on time on ci specifically this happened on for some tests that use the contracts amm image we could try increasing the max attempts e g from to here example of the failure info aea test autonomy docker base base py setting up image valory contracts amm latest debug connectionpool connectionpool py starting new http connection error root amm net py exception connectionerror connection aborted connectionreseterror connection reset by peer info root amm net py attempt failed retrying in seconds debug connectionpool connectionpool py starting new http connection error root amm net py exception connectionerror connection aborted remotedisconnected remote end closed connection without response info root amm net py attempt failed retrying in seconds
1
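The wait loop shown in the log above is a fixed-delay retry against the node's JSON-RPC port; raising the max attempts from 20 to 25 simply enlarges the attempt budget. A minimal Python sketch of that pattern (the helper and probe names here are hypothetical, not the actual `fixture_helpers.py` code):

```python
import time

def wait_for_endpoint(probe, max_attempts=25, delay=2.0, sleep=time.sleep):
    """Retry `probe` until it succeeds or the attempt budget runs out.

    `probe` is any zero-argument callable that raises on failure,
    e.g. an HTTP GET against the container's exposed port.
    """
    for attempt in range(max_attempts):
        try:
            return probe()
        except Exception as exc:  # ConnectionError, RemoteDisconnected, ...
            print(f"Attempt {attempt} failed: {exc!r}. Retrying in {delay} seconds...")
            sleep(delay)
    raise TimeoutError(f"endpoint not ready after {max_attempts} attempts")

# Simulate a container that only starts answering on the third poll.
calls = {"n": 0}

def flaky_probe():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("Connection reset by peer")
    return "ready"

result = wait_for_endpoint(flaky_probe, max_attempts=5, delay=0.0, sleep=lambda _: None)
```

Under this shape, bumping `max_attempts` trades a longer worst-case wait for fewer spurious CI failures when the container is merely slow to start.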
340,158
30,496,513,858
IssuesEvent
2023-07-18 11:13:30
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
closed
com.hazelcast.jet.impl.connector.PostgreReadJdbcPPropertiesTest and com.hazelcast.jet.impl.connector.MySQLReadJdbcPPropertiesTest
Type: Test-Failure Source: Internal Module: Jet Team: Integration
_master_ (commit e7514d2ef9f960f59694fb0534357f8b996a403b) Failed on Windows - OracleJDK11: https://jenkins.hazelcast.com/job/Hazelcast-master-Windows-OracleJDK11/454/testReport/com.hazelcast.jet.impl.connector/PostgreReadJdbcPPropertiesTest/___/ <details><summary>Stacktrace:</summary> ``` java.lang.IllegalStateException: Previous attempts to find a Docker environment failed. Will not retry. Please see logs and check configuration at org.testcontainers.dockerclient.DockerClientProviderStrategy.getFirstValidStrategy(DockerClientProviderStrategy.java:231) at org.testcontainers.DockerClientFactory.getOrInitializeStrategy(DockerClientFactory.java:150) at org.testcontainers.DockerClientFactory.client(DockerClientFactory.java:191) at org.testcontainers.DockerClientFactory$1.getDockerClient(DockerClientFactory.java:104) at com.github.dockerjava.api.DockerClientDelegate.authConfig(DockerClientDelegate.java:109) at org.testcontainers.containers.GenericContainer.start(GenericContainer.java:321) at com.hazelcast.test.jdbc.PostgresDatabaseProvider.createDatabase(PostgresDatabaseProvider.java:35) at com.hazelcast.jet.impl.connector.ReadJdbcPPropertiesTest.initializeBeforeClass(ReadJdbcPPropertiesTest.java:57) at com.hazelcast.jet.impl.connector.PostgreReadJdbcPPropertiesTest.beforeClass(PostgreReadJdbcPPropertiesTest.java:33) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) at 
org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33) at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24) at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27) at com.hazelcast.test.AfterClassesStatement.evaluate(AfterClassesStatement.java:41) at com.hazelcast.test.OverridePropertyRule$1.evaluate(OverridePropertyRule.java:66) at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299) at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.lang.Thread.run(Thread.java:834) ``` </details> <details><summary>Standard output:</summary> ``` 19:55:00,594 INFO || - [MetricsConfigHelper] Time-limited test - [LOCAL] [dev] [5.4.0-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 19:55:00,595 INFO || - [logo] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved. 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Hazelcast Platform 5.4.0-SNAPSHOT (20230703 - e7514d2) starting at [127.0.0.1]:5701 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Cluster name: dev 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Integrity Checker is disabled. Fail-fast on corrupted executables will not be performed. 
For more information, see the documentation for Integrity Checker. 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Jet is enabled 19:55:00,599 INFO || - [MetricsConfigHelper] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 19:55:00,599 INFO || - [TpcServerBootstrap] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] TPC: disabled 19:55:00,604 WARN || - [CPSubsystem] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 19:55:00,607 INFO || - [JetServiceBackend] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Setting number of cooperative threads and default parallelism to 2 19:55:00,608 INFO || - [Diagnostics] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 19:55:00,608 INFO || - [LifecycleService] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] [127.0.0.1]:5701 is STARTING 19:55:00,608 INFO || - [ClusterService] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Members {size:1, ver:1} [ Member [127.0.0.1]:5701 - 0e3a349c-0182-4678-8add-414e55b0e88b this ] 19:55:00,608 INFO || - [JobCoordinationService] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Jet started scanning for jobs 19:55:00,608 INFO || - [LifecycleService] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] [127.0.0.1]:5701 is STARTED Started Running Test: test_smallFiles 19:55:00,609 DEBUG || - [JobCoordinationService] hz.ReadFilesPTest_zen_vaughan.cached.thread-6 - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Not starting jobs because partitions are not yet initialized. 
``` </details> Standard output can be found here - https://s3.console.aws.amazon.com/s3/buckets/j-artifacts/Hazelcast-master-Windows-OracleJDK11/454/ The same problem is for MySQLReadJdbcPPropertiesTest test: https://jenkins.hazelcast.com/job/Hazelcast-master-Windows-OracleJDK11/454/testReport/com.hazelcast.jet.impl.connector/MySQLReadJdbcPPropertiesTest/___/ Both tests must have: `assumeDockerEnabled();`
1.0
com.hazelcast.jet.impl.connector.PostgreReadJdbcPPropertiesTest and com.hazelcast.jet.impl.connector.MySQLReadJdbcPPropertiesTest - _master_ (commit e7514d2ef9f960f59694fb0534357f8b996a403b) Failed on Windows - OracleJDK11: https://jenkins.hazelcast.com/job/Hazelcast-master-Windows-OracleJDK11/454/testReport/com.hazelcast.jet.impl.connector/PostgreReadJdbcPPropertiesTest/___/ <details><summary>Stacktrace:</summary> ``` java.lang.IllegalStateException: Previous attempts to find a Docker environment failed. Will not retry. Please see logs and check configuration at org.testcontainers.dockerclient.DockerClientProviderStrategy.getFirstValidStrategy(DockerClientProviderStrategy.java:231) at org.testcontainers.DockerClientFactory.getOrInitializeStrategy(DockerClientFactory.java:150) at org.testcontainers.DockerClientFactory.client(DockerClientFactory.java:191) at org.testcontainers.DockerClientFactory$1.getDockerClient(DockerClientFactory.java:104) at com.github.dockerjava.api.DockerClientDelegate.authConfig(DockerClientDelegate.java:109) at org.testcontainers.containers.GenericContainer.start(GenericContainer.java:321) at com.hazelcast.test.jdbc.PostgresDatabaseProvider.createDatabase(PostgresDatabaseProvider.java:35) at com.hazelcast.jet.impl.connector.ReadJdbcPPropertiesTest.initializeBeforeClass(ReadJdbcPPropertiesTest.java:57) at com.hazelcast.jet.impl.connector.PostgreReadJdbcPPropertiesTest.beforeClass(PostgreReadJdbcPPropertiesTest.java:33) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) at org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33) at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24) at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27) at com.hazelcast.test.AfterClassesStatement.evaluate(AfterClassesStatement.java:41) at com.hazelcast.test.OverridePropertyRule$1.evaluate(OverridePropertyRule.java:66) at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299) at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.lang.Thread.run(Thread.java:834) ``` </details> <details><summary>Standard output:</summary> ``` 19:55:00,594 INFO || - [MetricsConfigHelper] Time-limited test - [LOCAL] [dev] [5.4.0-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 19:55:00,595 INFO || - [logo] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved. 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Hazelcast Platform 5.4.0-SNAPSHOT (20230703 - e7514d2) starting at [127.0.0.1]:5701 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Cluster name: dev 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Integrity Checker is disabled. 
Fail-fast on corrupted executables will not be performed. For more information, see the documentation for Integrity Checker. 19:55:00,595 INFO || - [system] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Jet is enabled 19:55:00,599 INFO || - [MetricsConfigHelper] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 19:55:00,599 INFO || - [TpcServerBootstrap] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] TPC: disabled 19:55:00,604 WARN || - [CPSubsystem] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 19:55:00,607 INFO || - [JetServiceBackend] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Setting number of cooperative threads and default parallelism to 2 19:55:00,608 INFO || - [Diagnostics] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 19:55:00,608 INFO || - [LifecycleService] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] [127.0.0.1]:5701 is STARTING 19:55:00,608 INFO || - [ClusterService] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Members {size:1, ver:1} [ Member [127.0.0.1]:5701 - 0e3a349c-0182-4678-8add-414e55b0e88b this ] 19:55:00,608 INFO || - [JobCoordinationService] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Jet started scanning for jobs 19:55:00,608 INFO || - [LifecycleService] Time-limited test - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] [127.0.0.1]:5701 is STARTED Started Running Test: test_smallFiles 19:55:00,609 DEBUG || - [JobCoordinationService] hz.ReadFilesPTest_zen_vaughan.cached.thread-6 - [127.0.0.1]:5701 [dev] [5.4.0-SNAPSHOT] Not starting jobs because partitions are not yet initialized. 
``` </details> Standard output can be found here - https://s3.console.aws.amazon.com/s3/buckets/j-artifacts/Hazelcast-master-Windows-OracleJDK11/454/ The same problem is for MySQLReadJdbcPPropertiesTest test: https://jenkins.hazelcast.com/job/Hazelcast-master-Windows-OracleJDK11/454/testReport/com.hazelcast.jet.impl.connector/MySQLReadJdbcPPropertiesTest/___/ Both tests must have: `assumeDockerEnabled();`
test
com hazelcast jet impl connector postgrereadjdbcppropertiestest and com hazelcast jet impl connector mysqlreadjdbcppropertiestest master commit failed on windows stacktrace java lang illegalstateexception previous attempts to find a docker environment failed will not retry please see logs and check configuration at org testcontainers dockerclient dockerclientproviderstrategy getfirstvalidstrategy dockerclientproviderstrategy java at org testcontainers dockerclientfactory getorinitializestrategy dockerclientfactory java at org testcontainers dockerclientfactory client dockerclientfactory java at org testcontainers dockerclientfactory getdockerclient dockerclientfactory java at com github dockerjava api dockerclientdelegate authconfig dockerclientdelegate java at org testcontainers containers genericcontainer start genericcontainer java at com hazelcast test jdbc postgresdatabaseprovider createdatabase postgresdatabaseprovider java at com hazelcast jet impl connector readjdbcppropertiestest initializebeforeclass readjdbcppropertiestest java at com hazelcast jet impl connector postgrereadjdbcppropertiestest beforeclass postgrereadjdbcppropertiestest java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at org junit runners model frameworkmethod runreflectivecall frameworkmethod java at org junit internal runners model reflectivecallable run reflectivecallable java at org junit runners model frameworkmethod invokeexplosively frameworkmethod java at org junit internal runners statements runbefores invokemethod runbefores java at org junit internal runners statements runbefores evaluate runbefores java at org junit internal runners statements runafters evaluate runafters java at com hazelcast 
test afterclassesstatement evaluate afterclassesstatement java at com hazelcast test overridepropertyrule evaluate overridepropertyrule java at org junit internal runners statements failontimeout callablestatement call failontimeout java at org junit internal runners statements failontimeout callablestatement call failontimeout java at java base java util concurrent futuretask run futuretask java at java base java lang thread run thread java standard output info time limited test overridden metrics configuration with system property hazelcast metrics collection frequency metricsconfig collectionfrequencyseconds info time limited test o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o info time limited test copyright c hazelcast inc all rights reserved info time limited test hazelcast platform snapshot starting at info time limited test cluster name dev info time limited test integrity checker is disabled fail fast on corrupted executables will not be performed for more information see the documentation for integrity checker info time limited test jet is enabled info time limited test collecting debug metrics and sending to diagnostics is enabled info time limited test tpc disabled warn time limited test cp subsystem is not enabled cp data structures will operate in unsafe mode please note that unsafe mode will not provide strong consistency guarantees info time limited test setting number of cooperative threads and default parallelism to info time limited test diagnostics disabled to enable add dhazelcast diagnostics enabled true to the jvm arguments info time limited test is starting info time limited test members size ver member this info time limited test jet started scanning for jobs info time limited test is started started running test test smallfiles debug hz readfilesptest zen vaughan cached thread not starting jobs because partitions are not yet initialized standard output can be found here the same problem is for 
mysqlreadjdbcppropertiestest test both tests must have assumedockerenabled
1
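The suggested fix, `assumeDockerEnabled()`, is an assumption guard: when the Docker environment is absent the test should be reported as skipped instead of failing `@BeforeClass` with an `IllegalStateException`. A language-neutral sketch of the pattern in Python (names are hypothetical; the real guard uses JUnit's assumption mechanism):

```python
def run_guarded(test_body, docker_enabled):
    """Run test_body only when the Docker environment is present.

    When Docker is absent, the outcome is 'skipped' rather than an
    error raised during class setup.
    """
    if not docker_enabled:
        return ("skipped", None)
    return ("ran", test_body())

with_docker = run_guarded(lambda: 42, docker_enabled=True)
without_docker = run_guarded(lambda: 42, docker_enabled=False)
```

The point of the guard is that an environment gap becomes a visible skip in the report instead of a red build.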
80,570
7,751,347,656
IssuesEvent
2018-05-30 16:46:22
NetsBlox/NetsBlox
https://api.github.com/repos/NetsBlox/NetsBlox
closed
Update tests for RPCs detecting cancelation
enhancement minor testing
The tests (and manual testing endpoints) should be updated wrt the cancelation detection capabilities of #2070
1.0
Update tests for RPCs detecting cancelation - The tests (and manual testing endpoints) should be updated wrt the cancelation detection capabilities of #2070
test
update tests for rpcs detecting cancelation the tests and manual testing endpoints should be updated wrt the cancelation detection capabilities of
1
54,992
6,421,468,226
IssuesEvent
2017-08-09 04:59:56
telecmi/TeleCMI-Support
https://api.github.com/repos/telecmi/TeleCMI-Support
closed
App Details
bug fixed testing finished
I created an app in chub and registered my mobile number. Afterwards, I created another chub app and registered the same mobile number, but it was still created.
1.0
App Details - I created an app in chub and registered my mobile number. Afterwards, I created another chub app and registered the same mobile number, but it was still created.
test
app details i create app in chub with register my mobile number after i would create another chub app and register same mobile number but its created
1
258,953
22,360,821,074
IssuesEvent
2022-06-15 20:17:58
microsoft/FluidFramework
https://api.github.com/repos/microsoft/FluidFramework
closed
Write a few basic Differential Summary tests
area: runtime area: tests
It is also fairly easy to repro the bug on old code, so you could create a bad doc using clicker and then open it with your changes and verify things worked as expected on load. (That was how I initially tested the fix.) Otherwise you may want to do a more explicit remove-duplicates fix. Or, like Vlad said, find the slice point; but in reality, given how the bug works, the slice point is just to remove all the ops from `newOutstandingOps`. _Originally posted by @arinwt in https://github.com/microsoft/FluidFramework/pull/4016#discussion_r509657824_
1.0
Write a few basic Differential Summary tests - It is also fairly easy to repro the bug on old code, so you could create a bad doc using clicker and then open it with your changes and verify things worked as expected on load. (That was how I initially tested the fix.) Otherwise you may want to do a more explicit remove-duplicates fix. Or, like Vlad said, find the slice point; but in reality, given how the bug works, the slice point is just to remove all the ops from `newOutstandingOps`. _Originally posted by @arinwt in https://github.com/microsoft/FluidFramework/pull/4016#discussion_r509657824_
test
write a few basic differential summary tests it is also fairly easy to repro the bug on old code so you could create a bad doc using clicker and then open it with your changes and verify things worked as expected on load that was how i initially tested the fix otherwise you may want to do a more explicit remove duplicates fix or like vlad said find slice point but in reality given how the bug works the slice point is just remove all the ops from newoutstandingops originally posted by arinwt in
1
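The "remove duplicates" option mentioned in this record can be pictured as filtering the replayed ops by identifiers already present in the saved set. A hedged sketch (the field name `seq` and the function are illustrative assumptions, not the FluidFramework implementation):

```python
def merge_outstanding(saved_ops, new_outstanding_ops):
    """Keep the saved ops, then append only those outstanding ops whose
    sequence number has not already been applied."""
    seen = {op["seq"] for op in saved_ops}
    return saved_ops + [op for op in new_outstanding_ops if op["seq"] not in seen]

saved = [{"seq": 1}, {"seq": 2}]
outstanding = [{"seq": 2}, {"seq": 3}]
merged = merge_outstanding(saved, outstanding)
```

The comment in the issue notes that, given how the bug works, the "slice point" degenerates to dropping all of `newOutstandingOps`, which is the special case where every outstanding op is already in the saved set.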
26,341
4,216,875,656
IssuesEvent
2016-06-30 10:52:50
hioa-cs/IncludeOS
https://api.github.com/repos/hioa-cs/IncludeOS
closed
IncludeOS/test/IDE is missing files
Test
This test is missing the README and test.sh script needed to integrate it with the test system.
1.0
IncludeOS/test/IDE is missing files - This test is missing the README and test.sh script needed to integrate it with the test system.
test
includeos test ide is missing files this test is missing the readme and test sh script needed to integrate it with the test system
1
49,329
20,738,007,652
IssuesEvent
2022-03-14 15:15:10
hashicorp/terraform-provider-aws
https://api.github.com/repos/hashicorp/terraform-provider-aws
closed
ECS service - update load_balancer without destroy & recreate
enhancement service/ecs
<!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Description Currently, changing the value of aws_ecs_service.load_balancer triggers a destroy & recreate of the entire service. AWS added support for updating the service in-place in a recent API update. ### New or Affected Resource(s) <!--- Please list the new or affected resources and data sources. ---> * aws_ecs_service ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example: * https://aws.amazon.com/about-aws/whats-new/2018/04/introducing-amazon-ec2-fleet/ ---> * AWS Go SDK was updated https://github.com/aws/aws-sdk-go/pull/4305 * Blog post announcement https://aws.amazon.com/about-aws/whats-new/2022/03/amazon-ecs-service-api-updating-elastic-load-balancers-service-registries-tag-propagation-ecs-managed-tags/ * Similar ticket https://github.com/hashicorp/terraform-provider-aws/issues/23581
1.0
ECS service - update load_balancer without destroy & recreate - <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Description Currently, changing the value of aws_ecs_service.load_balancer triggers a destroy & recreate of the entire service. AWS added support for updating the service in-place in a recent API update. ### New or Affected Resource(s) <!--- Please list the new or affected resources and data sources. ---> * aws_ecs_service ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Vendor blog posts or documentation? For example: * https://aws.amazon.com/about-aws/whats-new/2018/04/introducing-amazon-ec2-fleet/ ---> * AWS Go SDK was updated https://github.com/aws/aws-sdk-go/pull/4305 * Blog post announcement https://aws.amazon.com/about-aws/whats-new/2022/03/amazon-ecs-service-api-updating-elastic-load-balancers-service-registries-tag-propagation-ecs-managed-tags/ * Similar ticket https://github.com/hashicorp/terraform-provider-aws/issues/23581
non_test
ecs service update load balancer without destroy recreate community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description currently changing the value of aws ecs service load balancer triggers a destroy recreate of the entire service aws added support for updating the service in place in a recent api update new or affected resource s aws ecs service references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here vendor blog posts or documentation for example aws go sdk was updated blog post announcement similar ticket
0
188,115
14,439,173,901
IssuesEvent
2020-12-07 14:03:27
kalexmills/github-vet-tests-dec2020
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
closed
MeshBoxFoundation/go-mbfs: gx/QmYMiyZRYDmhMr2phMc4FGrYbsyzvR751BgeobnWroiq2z/go-multicodec/msgpack/msgpack_test.go; 4 LoC
fresh test tiny
Found a possible issue in [MeshBoxFoundation/go-mbfs](https://www.github.com/MeshBoxFoundation/go-mbfs) at [gx/QmYMiyZRYDmhMr2phMc4FGrYbsyzvR751BgeobnWroiq2z/go-multicodec/msgpack/msgpack_test.go](https://github.com/MeshBoxFoundation/go-mbfs/blob/3789ae056f3748762de73da8522c49ec4b69147d/gx/QmYMiyZRYDmhMr2phMc4FGrYbsyzvR751BgeobnWroiq2z/go-multicodec/msgpack/msgpack_test.go#L45-L48) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to tca at line 47 may start a goroutine [Click here to see the code in its original context.](https://github.com/MeshBoxFoundation/go-mbfs/blob/3789ae056f3748762de73da8522c49ec4b69147d/gx/QmYMiyZRYDmhMr2phMc4FGrYbsyzvR751BgeobnWroiq2z/go-multicodec/msgpack/msgpack_test.go#L45-L48) <details> <summary>Click here to show the 4 line(s) of Go which triggered the analyzer.</summary> ```go for _, tca := range testCases { var tcb map[string]interface{} mctest.RoundTripTest(t, codec, &tca, &tcb) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 3789ae056f3748762de73da8522c49ec4b69147d
1.0
MeshBoxFoundation/go-mbfs: gx/QmYMiyZRYDmhMr2phMc4FGrYbsyzvR751BgeobnWroiq2z/go-multicodec/msgpack/msgpack_test.go; 4 LoC - Found a possible issue in [MeshBoxFoundation/go-mbfs](https://www.github.com/MeshBoxFoundation/go-mbfs) at [gx/QmYMiyZRYDmhMr2phMc4FGrYbsyzvR751BgeobnWroiq2z/go-multicodec/msgpack/msgpack_test.go](https://github.com/MeshBoxFoundation/go-mbfs/blob/3789ae056f3748762de73da8522c49ec4b69147d/gx/QmYMiyZRYDmhMr2phMc4FGrYbsyzvR751BgeobnWroiq2z/go-multicodec/msgpack/msgpack_test.go#L45-L48) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to tca at line 47 may start a goroutine [Click here to see the code in its original context.](https://github.com/MeshBoxFoundation/go-mbfs/blob/3789ae056f3748762de73da8522c49ec4b69147d/gx/QmYMiyZRYDmhMr2phMc4FGrYbsyzvR751BgeobnWroiq2z/go-multicodec/msgpack/msgpack_test.go#L45-L48) <details> <summary>Click here to show the 4 line(s) of Go which triggered the analyzer.</summary> ```go for _, tca := range testCases { var tcb map[string]interface{} mctest.RoundTripTest(t, codec, &tca, &tcb) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 3789ae056f3748762de73da8522c49ec4b69147d
test
meshboxfoundation go mbfs gx go multicodec msgpack msgpack test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to tca at line may start a goroutine click here to show the line s of go which triggered the analyzer go for tca range testcases var tcb map interface mctest roundtriptest t codec tca tcb leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
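The Go analyzer here flags a reference to the loop variable `tca` escaping the iteration into a possible goroutine. The same late-binding hazard exists for Python closures over a loop variable; a small illustrative sketch (Python, not the original Go):

```python
# Each lambda closes over the *variable* i, not its value at definition
# time, so every closure reads the final value of i after the loop ends.
funcs = [lambda: i for i in range(3)]
late = [f() for f in funcs]

# Fix: bind the current value at definition time via a default argument,
# analogous to copying the loop variable before taking its address in Go.
funcs_fixed = [lambda i=i: i for i in range(3)]
early = [f() for f in funcs_fixed]
```

In the flagged Go snippet the mitigation would be to copy `tca` into a loop-local variable before passing its address to anything that might outlive the iteration.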
32,428
4,771,007,490
IssuesEvent
2016-10-26 16:44:18
rLoopTeam/eng-software-pod
https://api.github.com/repos/rLoopTeam/eng-software-pod
opened
Ensure each devices temperature is correct
clarification needed test
Are we writing a unit test for this, or is this checked some other way?
1.0
Ensure each devices temperature is correct - Are we writing a unit test for this, or is this checked some other way?
test
ensure each devices temperature is correct are we writing a unit test for this or is this checked some other way
1
136,786
30,590,943,034
IssuesEvent
2023-07-21 16:59:36
sourcenetwork/defradb
https://api.github.com/repos/sourcenetwork/defradb
closed
Lens cfg capitalisation requirements super unclear
bug documentation area/schema code quality action/no-benchmark area/cli
- cfg must be provided with lowercase prop names (e.g. `lenses`) - `SourceSchemaVersionID` and `DestinationSchemaVersionID` must be uppercase! - get CLI outputs them all as uppercase - no error if casing is incorrect - no documentation for this requirement
1.0
Lens cfg capitalisation requirements super unclear - - cfg must be provided with lowercase prop names (e.g. `lenses`) - `SourceSchemaVersionID` and `DestinationSchemaVersionID` must be uppercase! - get CLI outputs them all as uppercase - no error if casing is incorrect - no documentation for this requirement
non_test
lens cfg capitalisation requirements super unclear cfg must be provided with lowercase prop names e g lenses sourceschemaversionid and destinationschemaversionid must be uppercase get cli outputs them all as uppercase no error if casing is incorrect no documentation for this requirement
0
125,286
10,339,671,220
IssuesEvent
2019-09-03 19:55:45
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
Failing test: Chrome UI Functional Tests.test/functional/apps/visualize/_tsvb_time_series·ts - visualize app visual builder Time Series should show the correct count in the legend with -2h offset
Feature:TSVB Team:KibanaApp failed-test
A test failed on a tracked branch ``` Error: expected '156' to equal '53' at Assertion.assert (packages/kbn-expect/expect.js:100:11) at Assertion.be.Assertion.equal (packages/kbn-expect/expect.js:221:8) at Assertion.(anonymous function) [as be] (packages/kbn-expect/expect.js:69:22) at Context.be (test/functional/apps/visualize/_tsvb_time_series.ts:63:32) at process._tickCallback (internal/process/next_tick.js:68:7) ``` First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/JOB=kibana-ciGroup12,node=immutable/857/) ![image](https://user-images.githubusercontent.com/1329312/61144771-ac059580-a48a-11e9-8c3b-b1180404344c.png) <!-- kibanaCiData = {"failed-test":{"test.class":"Chrome UI Functional Tests.test/functional/apps/visualize/_tsvb_time_series·ts","test.name":"visualize app visual builder Time Series should show the correct count in the legend with -2h offset","test.failCount":1}} -->
1.0
Failing test: Chrome UI Functional Tests.test/functional/apps/visualize/_tsvb_time_series·ts - visualize app visual builder Time Series should show the correct count in the legend with -2h offset - A test failed on a tracked branch ``` Error: expected '156' to equal '53' at Assertion.assert (packages/kbn-expect/expect.js:100:11) at Assertion.be.Assertion.equal (packages/kbn-expect/expect.js:221:8) at Assertion.(anonymous function) [as be] (packages/kbn-expect/expect.js:69:22) at Context.be (test/functional/apps/visualize/_tsvb_time_series.ts:63:32) at process._tickCallback (internal/process/next_tick.js:68:7) ``` First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/JOB=kibana-ciGroup12,node=immutable/857/) ![image](https://user-images.githubusercontent.com/1329312/61144771-ac059580-a48a-11e9-8c3b-b1180404344c.png) <!-- kibanaCiData = {"failed-test":{"test.class":"Chrome UI Functional Tests.test/functional/apps/visualize/_tsvb_time_series·ts","test.name":"visualize app visual builder Time Series should show the correct count in the legend with -2h offset","test.failCount":1}} -->
test
failing test chrome ui functional tests test functional apps visualize tsvb time series·ts visualize app visual builder time series should show the correct count in the legend with offset a test failed on a tracked branch error expected to equal at assertion assert packages kbn expect expect js at assertion be assertion equal packages kbn expect expect js at assertion anonymous function packages kbn expect expect js at context be test functional apps visualize tsvb time series ts at process tickcallback internal process next tick js first failure
1
108,073
9,258,291,860
IssuesEvent
2019-03-17 14:37:17
NMGRL/pychron
https://api.github.com/repos/NMGRL/pychron
closed
K/Ca log plot minimum
Bug Data Reduction Testing Required
in Spectra, if K/Ca is on a log scale and the errors extend <0, the minimum shown on the plot is 1, even if the scale extends to <1. Shouldn't the error bar extend to the minimum of the plot in this case?\ Not a burning issue at this point - it only affects bad data
1.0
K/Ca log plot minimum - in Spectra, if K/Ca is on a log scale and the errors extend <0, the minimum shown on the plot is 1, even if the scale extends to <1. Shouldn't the error bar extend to the minimum of the plot in this case?\ Not a burning issue at this point - it only affects bad data
test
k ca log plot minimum in spectra if k ca is on a log scale and the errors extend the minimum shown on the plot is even if the scale extends to shouldn t the error bar extend to the minimum of the plot in this case not a burning issue at this point it only affects bad data
1
47,704
19,700,521,569
IssuesEvent
2022-01-12 16:12:40
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
What options are available for cache in mountOptions
container-service/svc Pri2
The example mountOptions shows a cache=strict I've seen another page that showed cache=none is another option (https://docs.microsoft.com/en-us/azure/storage/files/storage-troubleshoot-linux-file-connection-problems). What values are possible for cache and what do they mean in terms of client implementation? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: a57e48b4-dc33-0b9d-b773-42ce7d2bffd0 * Version Independent ID: e912ac2d-cf6a-0b54-4697-1940b1270874 * Content: [Dynamically create Azure Files share - Azure Kubernetes Service](https://docs.microsoft.com/en-us/azure/aks/azure-files-dynamic-pv) * Content Source: [articles/aks/azure-files-dynamic-pv.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/aks/azure-files-dynamic-pv.md) * Service: **container-service** * GitHub Login: @zr-msft * Microsoft Alias: **zarhoads**
1.0
What options are available for cache in mountOptions - The example mountOptions shows a cache=strict I've seen another page that showed cache=none is another option (https://docs.microsoft.com/en-us/azure/storage/files/storage-troubleshoot-linux-file-connection-problems). What values are possible for cache and what do they mean in terms of client implementation? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: a57e48b4-dc33-0b9d-b773-42ce7d2bffd0 * Version Independent ID: e912ac2d-cf6a-0b54-4697-1940b1270874 * Content: [Dynamically create Azure Files share - Azure Kubernetes Service](https://docs.microsoft.com/en-us/azure/aks/azure-files-dynamic-pv) * Content Source: [articles/aks/azure-files-dynamic-pv.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/aks/azure-files-dynamic-pv.md) * Service: **container-service** * GitHub Login: @zr-msft * Microsoft Alias: **zarhoads**
non_test
what options are available for cache in mountoptions the example mountoptions shows a cache strict i ve seen another page that showed cache none is another option what values are possible for cache and what do they mean in terms of client implementation document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service container service github login zr msft microsoft alias zarhoads
0
58,745
6,619,251,837
IssuesEvent
2017-09-21 11:27:11
openSUSE/open-build-service
https://api.github.com/repos/openSUSE/open-build-service
closed
Implement tests if obsapidelayed is started correctly in openqa test cases
Frontend Test Suite
The problem in obsapidelayed startup (https://github.com/openSUSE/open-build-service/pull/3875) was not detected by openqa test. A test case which checks if obsapidelayed started correctly (e.g. systemctl status obsapidelayed) would have been useful.
1.0
Implement tests if obsapidelayed is started correctly in openqa test cases - The problem in obsapidelayed startup (https://github.com/openSUSE/open-build-service/pull/3875) was not detected by openqa test. A test case which checks if obsapidelayed started correctly (e.g. systemctl status obsapidelayed) would have been useful.
test
implement tests if obsapidelayed is started correctly in openqa test cases the problem in obsapidelayed startup was not detected by openqa test a test case which checks if obsapidelayed started correctly e g systemctl status obsapidelayed would have been useful
1