Dataset columns:
- Unnamed: 0 — int64, range 0 to 832k
- id — float64, range 2.49B to 32.1B
- type — string, 1 class ("IssuesEvent")
- created_at — string, length 19 (timestamp)
- repo — string, length 7 to 112
- repo_url — string, length 36 to 141
- action — string, 3 classes
- title — string, length 1 to 744
- labels — string, length 4 to 574
- body — string, length 9 to 211k
- index — string, 10 classes
- text_combine — string, length 96 to 211k (title and body concatenated)
- label — string, 2 classes ("process" / "non_process")
- text — string, length 96 to 188k (lowercased, normalized title and body)
- binary_label — int64, 0 or 1

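The dump below is a flattened pandas-style DataFrame. As a minimal sketch of how the label columns relate, here are two illustrative rows following the schema above (the values are invented for illustration; the real data would be loaded from the dataset file, which is not named here):

```python
import pandas as pd

# Two illustrative rows following the schema above (values invented).
df = pd.DataFrame(
    [
        {"type": "IssuesEvent", "label": "process", "binary_label": 1},
        {"type": "IssuesEvent", "label": "non_process", "binary_label": 0},
    ]
)

# In every row of the dump, binary_label mirrors label:
# 1 for "process", 0 for "non_process".
derived = (df["label"] == "process").astype(int)
print((df["binary_label"] == derived).all())  # → True
```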
Row 21,190 (id 28,209,306,322)
type: IssuesEvent
created_at: 2023-04-05 01:48:15
repo: bitfocus/companion-module-requests
repo_url: https://api.github.com/repos/bitfocus/companion-module-requests
action: opened
title: RTS Intercom
labels: NOT YET PROCESSED
body:
- [Yes ] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested**
The name of the device, hardware, or software you would like to control:
RTS Intercom/ [ODIN](https://products.rtsintercoms.com/na/en/odin/)
[OMNEO digital intercom](https://products.rtsintercoms.com/na/en/odin/)
What you would like to be able to make it do from Companion:
Have a Intercom with this protocols: ST 2110, AES67, AES70
Direct links or attachments to the ethernet control protocol or API:
https://ocaalliance.com/resources/standards-specifications/
index: 1.0
label: process
text: rts intercom i have researched the list of existing companion modules and requests and have determined this has not yet been requested the name of the device hardware or software you would like to control rts intercom what you would like to be able to make it do from companion have a intercom with this protocols st direct links or attachments to the ethernet control protocol or api
binary_label: 1

Row 118,637 (id 15,343,992,948)
type: IssuesEvent
created_at: 2021-02-27 22:51:44
repo: arwes/arwes
repo_url: https://api.github.com/repos/arwes/arwes
action: opened
title: Add application sounds starter package
labels: app: playground app: website complexity: medium package: core type: feature type: sound design type: ui/ux design
body:
The project should provide a free sounds starter package for an average Arwes application. This would be the same sounds used for the website application and playground sandboxes.
Since this requires the components to be built and the website user experience to be mostly completed to properly define them, the outcome of this task is to be updated as development progresses.
index: 2.0
label: non_process
text: add application sounds starter package the project should provide a free sounds starter package for an average arwes application this would be the same sounds used for the website application and playground sandboxes since this requires the components to be built and the website user experience to be mostly completed to properly define them the outcome of this task is to be updated as development progresses
binary_label: 0

Row 24,671 (id 12,367,987,056)
type: IssuesEvent
created_at: 2020-05-18 13:10:23
repo: unisonweb/unison
repo_url: https://api.github.com/repos/unisonweb/unison
action: closed
title: Initial pull of base uses a lot of memory
labels: memory-usage performance
body:
Here's a transcript:
```ucm
.> pull git@github.com:unisonweb/base:.trunk base
```
Just from watching activity monitor:
* Initial memory usage is about 14-15MB.
* Usage climbs to about 5.5GB during the "Importing downloaded files into local codebase" phase.
* After completion, memory usage stays at that level even after I do further commands to hopefully trigger some GC.
* After restarting on the same codebase, memory usage is much lower (300-500MB, see #1550).
* These results are with caching turned off - I turned off by hardcoding `nullCache` in `Main`, not relying on the config setting.
@aryairani I could have sworn this took very little memory at one point in the development of this new syncing algorithm
If I had to guess, it's some sort of space leak in `SlimCopyRegenerateIndex`. A couple things I tried (which are all in debug/1560 branch):
* Turned on `{-# Language Strict, StrictData #-}` in SlimCopyRegenerateIndex (total guess).
* Did a `ByteString.copy` everywhere in `V1.hs` where `getBytes` is called - by default, there was one in `getHash` and one in `getText`. Reasoning: `ByteString` values are just offsets into a single array of bytes. To allow that array of bytes to be GC'd you need to copy the bytestring to its own storage.
The fact that memory usage doesn't go back down again seems like a clue and is unexpected to me...
index: True
label: non_process
text: initial pull of base uses a lot of memory here s a transcript ucm pull git github com unisonweb base trunk base just from watching activity monitor initial memory usage is about usage climbs to about during the importing downloaded files into local codebase phase after completion memory usage stays at that level even after i do further commands to hopefully trigger some gc after restarting on the same codebase memory usage is much lower see these results are with caching turned off i turned off by hardcoding nullcache in main not relying on the config setting aryairani i could have sworn this took very little memory at one point in the development of this new syncing algorithm if i had to guess it s some sort of space leak in slimcopyregenerateindex a couple things i tried which are all in debug branch turned on language strict strictdata in slimcopyregenerateindex total guess did a bytestring copy everywhere in hs where getbytes is called by default there was one in gethash and one in gettext reasoning bytestring values are just offsets into a single array of bytes to allow that array of bytes to be gc d you need to copy the bytestring to its own storage the fact that memory usage doesn t go back down again seems like a clue and is unexpected to me
binary_label: 0

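The `ByteString.copy` reasoning quoted in this row has a close Python analogue (a sketch, not the issue's Haskell code): a small slice of a shared buffer can pin the whole buffer in memory until it is copied into its own storage.

```python
# Analogue of Haskell ByteString slicing: a memoryview slice shares the
# underlying buffer, so even a 16-byte view keeps the full 10 MB alive.
buf = bytearray(10_000_000)
view = memoryview(buf)[:16]

# bytes(view) copies the 16 bytes into independent storage, the equivalent
# of ByteString.copy: after releasing the view, the big buffer can be freed.
copy = bytes(view)
view.release()
del buf

print(len(copy))  # → 16
```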
Row 3,674 (id 4,641,905,219)
type: IssuesEvent
created_at: 2016-09-30 07:34:07
repo: signmeup/signmeup
repo_url: https://api.github.com/repos/signmeup/signmeup
action: opened
title: Implement Shib logout
labels: enhancement security
body:
Restricted mode is risky to start if the user forgets to log out of Banner to end their Shib session. Instead, we should redirect them to the logout page to trigger this.
index: True
label: non_process
text: implement shib logout restricted mode is risky to start if the user forgets to log out of banner to end their shib session instead we should redirect them to the logout page to trigger this
binary_label: 0

Row 123,363 (id 26,247,330,483)
type: IssuesEvent
created_at: 2023-01-05 16:14:24
repo: Azure/azure-dev
repo_url: https://api.github.com/repos/Azure/azure-dev
action: closed
title: VS Code <-> Azd integration requires additional `az login` for local app development use-cases
labels: blocker vscode design
body:
**Describe the issue:**
Task on the api requires authentication.
**Repro Steps:**
1. Open the project in VS Code.
2. Login with `azd login`.
3. Hit F1, Run Task, Start API and Web. During the api startup process, you will get the error as below:

Besides, if we execute `az login` according to the promption, this error can be fixed. But this operation is a bit strange because we have performed `azd login`.
**Environment:**
OS: Codespaces, WSL, Windows desktop, MacOS desktop, Linux desktop, Devcontainer in VS Code
Template: All templates.
Branch: [pr/1153](https://github.com/Azure/azure-dev/pull/1153)
Azd version: 0.4.0-beta.1-pr.1988386 (commit 7b6248aa6788d586bcf0b9a9491187866dad5cca)
**Expected behavior:**
After we `azd login`, we can start the api without other authentication.
@rajeshkamal5050 for notification.
index: 1.0
label: non_process
text: vs code azd integration requires additional az login for local app development use cases describe the issue task on the api requires authentication repro steps open the project in vs code login with azd login hit run task start api and web during the api startup process you will get the error as below besides if we execute az login according to the promption this error can be fixed but this operation is a bit strange because we have performed azd login environment os codespaces wsl windows desktop macos desktop linux desktop devcontainer in vs code template all templates branch azd version beta pr commit expected behavior after we azd login we can start the api without other authentication for notification
binary_label: 0

Row 11,106 (id 7,058,580,615)
type: IssuesEvent
created_at: 2018-01-04 20:59:20
repo: coreos/bugs
repo_url: https://api.github.com/repos/coreos/bugs
action: closed
title: Ignition S3 Region Detection
labels: area/usability component/ignition kind/bug platform/aws team/tools
body:
The region detection used when retrieving S3 assets doesn't work in all regions, most specifically `us-gov-west-1`.
It could be easily retrieved with
`curl -s http://169.254.169.254/latest/dynamic/instance-identity/document | jq -r .region`
and using that as the `regionHint` parameter in
https://github.com/coreos/ignition/blob/294826a9d880b5ddc1a900011100ce9ce2134c28/internal/resource/url.go#L334.
index: True
label: non_process
text: ignition region detection the region detection used when retrieving assets doesn t work in all regions most specifically us gov west it could be easily retrieved with curl s jq r region and using that as the regionhint parameter in
binary_label: 0

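For context, the `curl | jq` one-liner in this row just reads the `region` field of the EC2 instance-identity document. The same extraction in Python, using a canned document since the real endpoint is only reachable from inside an instance (the sample field values are illustrative):

```python
import json

# Trimmed shape of the instance-identity document served at
# http://169.254.169.254/latest/dynamic/instance-identity/document
doc = '{"region": "us-gov-west-1", "instanceId": "i-0123456789abcdef0"}'

region = json.loads(doc)["region"]
print(region)  # → us-gov-west-1
```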
Row 56,293 (id 23,743,593,164)
type: IssuesEvent
created_at: 2022-08-31 14:18:14
repo: miranda-ng/miranda-ng
repo_url: https://api.github.com/repos/miranda-ng/miranda-ng
action: closed
title: VoiceService: do not grab focus on incoming call
labels: bug regression VoiceService
body:
> [21:49] deadsend: and the dialog on an incoming call has started grabbing focus, which it didn't do before, and that was a feature
index: 1.0
label: non_process
text: voiceservice do not grab focus on incoming call deadsend and the dialog on an incoming call has started grabbing focus which it didn t do before and that was a feature
binary_label: 0

Row 44,536 (id 12,235,475,927)
type: IssuesEvent
created_at: 2020-05-04 14:56:37
repo: scipy/scipy
repo_url: https://api.github.com/repos/scipy/scipy
action: closed
title: oaconvolve(a,b,'same') differs in shape from convolve(a,b,'same') when len(a)==1 and len(b)>1
labels: defect scipy.signal
body:
<!--
Thank you for taking the time to file a bug report.
Please fill in the fields below, deleting the sections that
don't apply to your issue. You can view the final output
by clicking the preview button above.
Note: This is a comment, and won't appear in the output.
-->
I wrote a matricial version of `convolve`, and was testing it against both `scipy.signal.convolve` and `scipy.signal.oaconvolve`. I realized that for very specific inputs **the two results differ in shape**. I suppose that they should return the same result (to machine precision), so I thought it'd be good to tell.
This happens only when:
- 1st argument is a 1 element vector
- 2nd argument is a longer vector
- mode is 'same'
This doesn't seem to happen when both vectors have a meaningful length (although 1 is meaningful as a limit case). I.e., when len(a)>=2, results are consistent.
(I have not tested for inputs of higher dimension)
#### Reproducing code example:
<!--
If you place your code between the triple backticks below,
it will be rendered as a code block.
-->
```
import scipy.signal as scisig
oaconv = lambda a,b: scisig.oaconvolve(a,b,'same')
conv = lambda a,b: scisig.convolve(a,b,'same')
conv([1],[1,-1]), oaconv([1],[1,-1]) # should be equal, I get (array([1]), array([ 1, -1]))
conv([1,-1],[1]), oaconv([1,-1],[1]) # these instead are fine: (array([ 1, -1]), array([ 1, -1]))
```
#### Error message:
<!-- If any, paste the *full* error message inside a code block
as above (starting from line Traceback)
-->
```
(No error message)
```
#### Scipy/Numpy/Python version information:
<!-- You can simply run the following and paste the result in a code block
```
import sys, scipy, numpy; print(scipy.__version__, numpy.__version__, sys.version_info)
```
-->
```
1.4.1 1.18.1 sys.version_info(major=3, minor=7, micro=6, releaselevel='final', serial=0)
```
index: 1.0
label: non_process
text: oaconvolve a b same differs in shape from convolve a b same when len a and len b thank you for taking the time to file a bug report please fill in the fields below deleting the sections that don t apply to your issue you can view the final output by clicking the preview button above note this is a comment and won t appear in the output i wrote a matricial version of convolve and was testing it against both scipy signal convolve and scipy signal oaconvolve i realized that for very specific inputs the two results differ in shape i suppose that they should return the same result to machine precision so i thought it d be good to tell this happens only when argument is a element vector argument is a longer vector mode is same this doesn t seem to happen when both vectors have a meaningful length although is meaningful as a limit case i e when len a results are consistent i have not tested for inputs of higher dimension reproducing code example if you place your code between the triple backticks below it will be rendered as a code block import scipy signal as scisig oaconv lambda a b scipy oaconvolve a b same conv lambda a b scisig convolve a b same conv oaconv should be equal i get array array conv oaconv these instead are fine array array error message if any paste the full error message inside a code block as above starting from line traceback no error message scipy numpy python version information you can simply run the following and paste the result in a code block import sys scipy numpy print scipy version numpy version sys version info sys version info major minor micro releaselevel final serial
binary_label: 0

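For context on this row's bug report: `scipy.signal.convolve` documents 'same' mode as returning output the same length as the first input, centered with respect to the full convolution. A minimal numpy sketch of that contract (`convolve_same` is an illustrative helper, not scipy API):

```python
import numpy as np

def convolve_same(a, b):
    """Full convolution center-cropped to len(a), mirroring the documented 'same' mode."""
    full = np.convolve(a, b)              # length len(a) + len(b) - 1
    start = (len(b) - 1) // 2             # centering offset into the full output
    return full[start:start + len(a)]

# Matches the convolve() results quoted in the issue:
print(convolve_same([1], [1, -1]))    # → [1]
print(convolve_same([1, -1], [1]))    # → [ 1 -1]
```

Under this contract the reporter's first case should have shape (1,), which is exactly where `oaconvolve` disagreed.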
Row 5,685 (id 8,559,032,457)
type: IssuesEvent
created_at: 2018-11-08 20:01:01
repo: integr8ly/tutorial-web-app
repo_url: https://api.github.com/repos/integr8ly/tutorial-web-app
action: closed
title: Request: Lock Dependency Versions
labels: enhancement process
body:
As the web app progresses I would like to see the dependencies listed in the `package.json` file be locked down to specific versions, rather than using the `^`. The `yarn.lock` file should not have to change unless there is a specific need for an introduction of a new dependency, or a vital version update.
index: 1.0
label: process
text: request lock dependency versions as the web app progresses i would like to see the dependencies listed in the package json file be locked down to specific versions rather than using the the yarn lock file should not have to change unless there is a specific need for an introduction of a new dependency or a vital version update
binary_label: 1

Row 85,061 (id 15,731,184,236)
type: IssuesEvent
created_at: 2021-03-29 16:46:08
repo: wrbejar/bag-of-holding
repo_url: https://api.github.com/repos/wrbejar/bag-of-holding
action: opened
title: WS-2018-0590 (High) detected in diff-1.4.0.tgz
labels: security vulnerability
body:
## WS-2018-0590 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>diff-1.4.0.tgz</b></p></summary>
<p>A javascript text diff implementation.</p>
<p>Library home page: <a href="https://registry.npmjs.org/diff/-/diff-1.4.0.tgz">https://registry.npmjs.org/diff/-/diff-1.4.0.tgz</a></p>
<p>Path to dependency file: bag-of-holding/package.json</p>
<p>Path to vulnerable library: bag-of-holding/node_modules/diff/package.json</p>
<p>
Dependency Hierarchy:
- gulp-sass-1.3.3.tgz (Root Library)
- node-sass-2.1.1.tgz
- mocha-2.5.3.tgz
- :x: **diff-1.4.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/wrbejar/bag-of-holding/commit/6087cf643d57f8f112ae650913c59bfc0a1033d6">6087cf643d57f8f112ae650913c59bfc0a1033d6</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability was found in diff before v3.5.0, the affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks.
<p>Publish Date: 2018-03-05
<p>URL: <a href=https://bugzilla.redhat.com/show_bug.cgi?id=1552148>WS-2018-0590</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/kpdecker/jsdiff/commit/2aec4298639bf30fb88a00b356bf404d3551b8c0">https://github.com/kpdecker/jsdiff/commit/2aec4298639bf30fb88a00b356bf404d3551b8c0</a></p>
<p>Release Date: 2019-06-11</p>
<p>Fix Resolution: 3.5.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"diff","packageVersion":"1.4.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-sass:1.3.3;node-sass:2.1.1;mocha:2.5.3;diff:1.4.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.5.0"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2018-0590","vulnerabilityDetails":"A vulnerability was found in diff before v3.5.0, the affected versions of this package are vulnerable to Regular Expression Denial of Service (ReDoS) attacks.","vulnerabilityUrl":"https://bugzilla.redhat.com/show_bug.cgi?id\u003d1552148","cvss2Severity":"high","cvss2Score":"7.0","extraData":{}}</REMEDIATE> -->
index: True
label: non_process
text: ws high detected in diff tgz ws high severity vulnerability vulnerable library diff tgz a javascript text diff implementation library home page a href path to dependency file bag of holding package json path to vulnerable library bag of holding node modules diff package json dependency hierarchy gulp sass tgz root library node sass tgz mocha tgz x diff tgz vulnerable library found in head commit a href found in base branch master vulnerability details a vulnerability was found in diff before the affected versions of this package are vulnerable to regular expression denial of service redos attacks publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree gulp sass node sass mocha diff isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier ws vulnerabilitydetails a vulnerability was found in diff before the affected versions of this package are vulnerable to regular expression denial of service redos attacks vulnerabilityurl
binary_label: 0

Row 196,085 (id 14,798,606,933)
type: IssuesEvent
created_at: 2021-01-13 00:11:13
repo: openshift/odo
repo_url: https://api.github.com/repos/openshift/odo
action: closed
title: [Flake] create, push and delete reports invalid configuration
labels: area/component area/testing kind/flake lifecycle/rotten priority/Medium
body:
[kind/bug]
<!--
Welcome! - We kindly ask you to:
1. Fill out the issue template below
2. Use the Google group if you have a question rather than a bug or feature request.
The group is at: https://groups.google.com/forum/#!forum/odo-users
Thanks for understanding, and for contributing to the project!
-->
## What versions of software are you using?
- Operating System:
- Output of `odo version`: master
## How did you run odo exactly?
On OpenShift CI
## Actual behavior
Error out
## Expected behavior
Should pass
## Any logs, error output, etc?
```
odo sub component command tests when component is in the current directory and --project flag is used
creates and pushes local nodejs component and then deletes --all
/go/src/github.com/openshift/odo/tests/integration/component.go:329
Created dir: /tmp/724815433
Creating a new project: texjxuxufc
Running odo with args [odo project create texjxuxufc -w -v4]
[odo] I0904 13:18:24.552396 19818 preference.go:118] The path for preference file is /tmp/724815433/config.yaml
[odo] I0904 13:18:24.552472 19818 occlient.go:479] Trying to connect to server api.ci-op-d1whffy3-f09f4.origin-ci-int-aws.dev.rhcloud.com:6443
[odo] I0904 13:18:24.567450 19818 occlient.go:486] Server https://api.ci-op-d1whffy3-f09f4.origin-ci-int-aws.dev.rhcloud.com:6443 is up
[odo] I0904 13:18:24.631580 19818 occlient.go:409] isLoggedIn err: <nil>
[odo] output: "developer"
[odo] • Waiting for project to come up ...
[odo] ✓ Waiting for project to come up [353ms]
[odo] ✓ Project 'texjxuxufc' is ready for use
[odo] ✓ New project created and now using project : texjxuxufc
[odo] I0904 13:18:25.006691 19818 odo.go:70] Could not get the latest release information in time. Never mind, exiting gracefully :)
Current working dir: /go/src/github.com/openshift/odo/tests/integration
Setting current dir to: /tmp/724815433
Running odo with args [odo component create nodejs my-component --app app --project texjxuxufc --env key=value,key1=value1]
[odo] • Validating component ...
[odo] ✓ Validating component [37ms]
[odo] Please use `odo push` command to create the component with source deployed
[odo]
Running odo with args [odo component push --context /tmp/724815433]
[odo] ✗ invalid configuration: [context was not found for specified context: jgjryhzmbp/api-ci-op-d1whffy3-f09f4-origin-ci-int-aws-dev-rhcloud-com:6443/developer, cluster has no server defined]
[odo] Please login to your server:
[odo]
[odo] odo login https://mycluster.mydomain.com
[odo]
Setting current dir to: /go/src/github.com/openshift/odo/tests/integration
Deleting project: texjxuxufc
Running odo with args [odo project delete texjxuxufc -f]
[odo] • Deleting project texjxuxufc ...
[odo] ✓ Deleting project texjxuxufc [5s]
[odo] ✓ Deleted project : texjxuxufc
Deleting dir: /tmp/724815433
• Failure [6.659 seconds]
odo sub component command tests
/go/src/github.com/openshift/odo/tests/integration/cmd_cmp_sub_test.go:13
when component is in the current directory and --project flag is used
/go/src/github.com/openshift/odo/tests/integration/component.go:306
creates and pushes local nodejs component and then deletes --all [It]
/go/src/github.com/openshift/odo/tests/integration/component.go:329
No future change is possible. Bailing out early after 0.102s.
Running odo with args [odo component push --context /tmp/724815433]
Expected
<int>: 1
to match exit code:
<int>: 0
/go/src/github.com/openshift/odo/tests/helper/helper_run.go:32
```
index: 1.0
[odo]
Setting current dir to: /go/src/github.com/openshift/odo/tests/integration
Deleting project: texjxuxufc
Running odo with args [odo project delete texjxuxufc -f]
[odo] • Deleting project texjxuxufc ...
[odo] ✓ Deleting project texjxuxufc [5s]
[odo] ✓ Deleted project : texjxuxufc
Deleting dir: /tmp/724815433
• Failure [6.659 seconds]
odo sub component command tests
/go/src/github.com/openshift/odo/tests/integration/cmd_cmp_sub_test.go:13
when component is in the current directory and --project flag is used
/go/src/github.com/openshift/odo/tests/integration/component.go:306
creates and pushes local nodejs component and then deletes --all [It]
/go/src/github.com/openshift/odo/tests/integration/component.go:329
No future change is possible. Bailing out early after 0.102s.
Running odo with args [odo component push --context /tmp/724815433]
Expected
<int>: 1
to match exit code:
<int>: 0
/go/src/github.com/openshift/odo/tests/helper/helper_run.go:32
```
|
non_process
|
create push and delete reports invalid configuration welcome we kindly ask you to fill out the issue template below use the google group if you have a question rather than a bug or feature request the group is at thanks for understanding and for contributing to the project what versions of software are you using operating system output of odo version master how did you run odo exactly on openshift ci actual behavior error out expected behavior should pass any logs error output etc odo sub component command tests when component is in the current directory and project flag is used creates and pushes local nodejs component and then deletes all go src github com openshift odo tests integration component go created dir tmp creating a new project texjxuxufc running odo with args preference go the path for preference file is tmp config yaml occlient go trying to connect to server api ci op origin ci int aws dev rhcloud com occlient go server is up occlient go isloggedin err output developer • waiting for project to come up ✓ waiting for project to come up ✓ project texjxuxufc is ready for use ✓ new project created and now using project texjxuxufc odo go could not get the latest release information in time never mind exiting gracefully current working dir go src github com openshift odo tests integration setting current dir to tmp running odo with args • validating component ✓ validating component please use odo push command to create the component with source deployed running odo with args ✗ invalid configuration please login to your server odo login setting current dir to go src github com openshift odo tests integration deleting project texjxuxufc running odo with args • deleting project texjxuxufc ✓ deleting project texjxuxufc ✓ deleted project texjxuxufc deleting dir tmp • failure odo sub component command tests go src github com openshift odo tests integration cmd cmp sub test go when component is in the current directory and project flag is used go src github com openshift odo tests integration component go creates and pushes local nodejs component and then deletes all go src github com openshift odo tests integration component go no future change is possible bailing out early after running odo with args expected to match exit code go src github com openshift odo tests helper helper run go
| 0
|
4,117
| 7,059,054,645
|
IssuesEvent
|
2018-01-04 23:09:07
|
chenhowa/chess-app
|
https://api.github.com/repos/chenhowa/chess-app
|
opened
|
Unit test tracking
|
process problem
|
Need a better way to track whether unit testing is currently adequate for a particular class or function. Currently testing is pretty good, but new additions to the code base are not always followed up with a corresponding unit test, and I might forget about it later.
|
1.0
|
Unit test tracking - Need a better way to track whether unit testing is currently adequate for a particular class or function. Currently testing is pretty good, but new additions to the code base are not always followed up with a corresponding unit test, and I might forget about it later.
|
process
|
unit test tracking need a better way to track whether unit testing is currently adequate for a particular class or function currently testing is pretty good but new additions to the code base are not always followed up with a corresponding unit test and i might forget about it later
| 1
|
603,390
| 18,545,180,302
|
IssuesEvent
|
2021-10-21 21:03:57
|
RTXteam/RTX-KG2
|
https://api.github.com/repos/RTXteam/RTX-KG2
|
opened
|
Potential candidate for a new node property "is toxic"
|
enhancement low priority
|
Biolink model 2.2.5 includes a node property, "is toxic", that could be useful in KG2
https://github.com/biolink/biolink-model/blob/d77172050122bf4d5b48cd1d487fb58a8b163620/biolink-model.yaml#L1231
```
is toxic:
description: >-
is_a: node property
multivalued: false
range: boolean
```
|
1.0
|
Potential candidate for a new node property "is toxic" - Biolink model 2.2.5 includes a node property, "is toxic", that could be useful in KG2
https://github.com/biolink/biolink-model/blob/d77172050122bf4d5b48cd1d487fb58a8b163620/biolink-model.yaml#L1231
```
is toxic:
description: >-
is_a: node property
multivalued: false
range: boolean
```
|
non_process
|
potential candidate for a new node property is toxic biolink model includes a node property is toxic that could be useful in is toxic description is a node property multivalued false range boolean
| 0
|
9,071
| 12,140,171,060
|
IssuesEvent
|
2020-04-23 20:04:59
|
pelias/whosonfirst
|
https://api.github.com/repos/pelias/whosonfirst
|
closed
|
Refactor bundleList and tests to not contact live website
|
good first issue high priority processed
|
The bundleList code [contacts WOF](https://github.com/pelias/whosonfirst/blob/master/src/bundleList.js#L64) for the bundle list. This should be refactored to inject the bundle list URL so that the test can operate offline.
|
1.0
|
Refactor bundleList and tests to not contact live website - The bundleList code [contacts WOF](https://github.com/pelias/whosonfirst/blob/master/src/bundleList.js#L64) for the bundle list. This should be refactored to inject the bundle list URL so that the test can operate offline.
|
process
|
refactor bundlelist and tests to not contact live website the bundlelist code for the bundle list this should be refactored to inject the bundle list url so that the test can operate offline
| 1
|
112,558
| 11,771,213,729
|
IssuesEvent
|
2020-03-15 22:51:44
|
matteobruni/tsparticles
|
https://api.github.com/repos/matteobruni/tsparticles
|
opened
|
tsParticles Default values
|
bug documentation enhancement good first issue help wanted up-for-grabs
|
The default values are ugly as hell. Some nicer are needed to init library with a good default effect.
Less options enabled is the best solution, so they don't create conflicts.
|
1.0
|
tsParticles Default values - The default values are ugly as hell. Some nicer are needed to init library with a good default effect.
Less options enabled is the best solution, so they don't create conflicts.
|
non_process
|
tsparticles default values the default values are ugly as hell some nicer are needed to init library with a good default effect less options enabled is the best solution so they don t create conflicts
| 0
|
633
| 3,092,121,639
|
IssuesEvent
|
2015-08-26 16:14:58
|
e-government-ua/iBP
|
https://api.github.com/repos/e-government-ua/iBP
|
opened
|
Надвірнянська РДА - Надання довідки про наявність у Державному земельному кадастрі відомостей про одержання у власність земельної ділянки у межах норм безоплатної приватизації за певним видом її цільового призначення (використання)
|
in process of creating
|
существующий процесс - https://drive.google.com/a/privatbank.ua/file/d/0B4vk1jpTDb_5aFA1X0xBdnRDM3M/view
предлагаемый процесс - https://drive.google.com/a/privatbank.ua/file/d/0B4vk1jpTDb_5SS1uc2F3YTk1UEE/view
|
1.0
|
Надвірнянська РДА - Надання довідки про наявність у Державному земельному кадастрі відомостей про одержання у власність земельної ділянки у межах норм безоплатної приватизації за певним видом її цільового призначення (використання) -
существующий процесс - https://drive.google.com/a/privatbank.ua/file/d/0B4vk1jpTDb_5aFA1X0xBdnRDM3M/view
предлагаемый процесс - https://drive.google.com/a/privatbank.ua/file/d/0B4vk1jpTDb_5SS1uc2F3YTk1UEE/view
|
process
|
надвірнянська рда надання довідки про наявність у державному земельному кадастрі відомостей про одержання у власність земельної ділянки у межах норм безоплатної приватизації за певним видом її цільового призначення використання существующий процесс предлагаемый процесс
| 1
|
14,830
| 18,168,278,105
|
IssuesEvent
|
2021-09-27 16:51:37
|
ORNL-AMO/AMO-Tools-Desktop
|
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
|
closed
|
PHAST - operating costs small calculator
|
enhancement Process Heating Intern To Do
|
make calculator to help user calculate weighted average of fuel costs if using more than one fuel
modal -
fuel A Fraction _____ Cost ____
fuel B Fraction _____ Cost_____
Result = sum (fraction * cost)
Kristina - Do I need to make a mock up ?
|
1.0
|
PHAST - operating costs small calculator - make calculator to help user calculate weighted average of fuel costs if using more than one fuel
modal -
fuel A Fraction _____ Cost ____
fuel B Fraction _____ Cost_____
Result = sum (fraction * cost)
Kristina - Do I need to make a mock up ?
|
process
|
phast operating costs small calculator make calculator to help user calculate weighted average of fuel costs if using more than one fuel modal fuel a fraction cost fuel b fraction cost result sum fraction cost kristina do i need to make a mock up
| 1
|
320,232
| 27,428,498,531
|
IssuesEvent
|
2023-03-01 22:27:53
|
tgstation/tgstation
|
https://api.github.com/repos/tgstation/tgstation
|
closed
|
Passive slime extracts do not work in modsuit storage
|
Bug Tested/Reproduced
|
<!-- Write **BELOW** The Headers and **ABOVE** The comments else it may not be viewable -->
## Round ID:
<!--- **INCLUDE THE ROUND ID**
If you discovered this issue from playing tgstation hosted servers:
[Round ID]: # (It can be found in the Status panel or retrieved from https://sb.atlantaned.space/rounds ! The round id let's us look up valuable information and logs for the round the bug happened.)-->
## Testmerges:
<!-- If you're certain the issue is to be caused by a test merge [OOC tab -> Show Server Revision], report it in the pull request's comment section rather than on the tracker(If you're unsure you can refer to the issue number by prefixing said number with #. The issue number can be found beside the title after submitting it to the tracker).If no testmerges are active, feel free to remove this section. -->
## Reproduction:
Tested with light pink stabilised. Put it in a medical modsuit. Did not run fast. Held the extract in hand or in pocket. Ran fast.
|
1.0
|
Passive slime extracts do not work in modsuit storage - <!-- Write **BELOW** The Headers and **ABOVE** The comments else it may not be viewable -->
## Round ID:
<!--- **INCLUDE THE ROUND ID**
If you discovered this issue from playing tgstation hosted servers:
[Round ID]: # (It can be found in the Status panel or retrieved from https://sb.atlantaned.space/rounds ! The round id let's us look up valuable information and logs for the round the bug happened.)-->
## Testmerges:
<!-- If you're certain the issue is to be caused by a test merge [OOC tab -> Show Server Revision], report it in the pull request's comment section rather than on the tracker(If you're unsure you can refer to the issue number by prefixing said number with #. The issue number can be found beside the title after submitting it to the tracker).If no testmerges are active, feel free to remove this section. -->
## Reproduction:
Tested with light pink stabilised. Put it in a medical modsuit. Did not run fast. Held the extract in hand or in pocket. Ran fast.
|
non_process
|
passive slime extracts do not work in modsuit storage round id include the round id if you discovered this issue from playing tgstation hosted servers it can be found in the status panel or retrieved from the round id let s us look up valuable information and logs for the round the bug happened testmerges reproduction tested with light pink stabilised put it in a medical modsuit did not run fast held the extract in hand or in pocket ran fast
| 0
|
10,699
| 13,493,659,816
|
IssuesEvent
|
2020-09-11 20:02:49
|
kubernetes/minikube
|
https://api.github.com/repos/kubernetes/minikube
|
closed
|
add make targets with gopogh
|
good first issue kind/process priority/important-soon
|
we should have a make target that runs integration tests and then converst the test restult to gopogh and puts it in /out/testout.html
and the raw test output should be in ./out/testout.txt
simmilar to how we do it in jenkins
https://github.com/medyagh/minikube/blob/0d33242daef60bd07e5b39fd6491e52d04eac207/hack/jenkins/common.sh#L352-L353
that way users can run integraiton tests and then have the test results in human readable way
|
1.0
|
add make targets with gopogh - we should have a make target that runs integration tests and then converst the test restult to gopogh and puts it in /out/testout.html
and the raw test output should be in ./out/testout.txt
simmilar to how we do it in jenkins
https://github.com/medyagh/minikube/blob/0d33242daef60bd07e5b39fd6491e52d04eac207/hack/jenkins/common.sh#L352-L353
that way users can run integraiton tests and then have the test results in human readable way
|
process
|
add make targets with gopogh we should have a make target that runs integration tests and then converst the test restult to gopogh and puts it in out testout html and the raw test output should be in out testout txt simmilar to how we do it in jenkins that way users can run integraiton tests and then have the test results in human readable way
| 1
|
831,479
| 32,050,430,138
|
IssuesEvent
|
2023-09-23 13:39:24
|
RbAvci/My-Coursework-Planner
|
https://api.github.com/repos/RbAvci/My-Coursework-Planner
|
opened
|
[PD] Budget for shift work (only for people without fixed income)
|
🐇 Size Small 📅 HTML-CSS 🏝️ Priority Stretch 📅 Week 2
|
### Coursework content
In a Google sheet, make a budget of how much money you make on average on your shift work, including the hours you work and all the expenses related to it (transportation, fuel, repair costs).
_Is your shift work worthwhile doing compared to other types of work?_
Check out [this link](https://www.jazzhr.com/blog/freelancer-vs-contractor-vs-permanent-employee-what-you-should-know-for-2021/#:~:text=Freelancers%20and%20contractors%20are%20self,work%20for%20a%20single%20company) to understand the differences.
Reflect on what changes you might need to bring to your life.
- Summary of my current situation
- My current plan
- What distractions do I have / My energy levels during the study
- Original plans I had after I finished the training
- Define them in short/medium/long-term goals
### Estimated time in hours
0.5
### What is the purpose of this assignment?
This exercise is for you to get a job in tech, whilst focussing on the right things and still having enough money to pay your bills.
### How to submit
**Optional**: you can discuss it with a peer or volunteer to get their feedback and insights.
### Anything else?
n/a
|
1.0
|
[PD] Budget for shift work (only for people without fixed income) - ### Coursework content
In a Google sheet, make a budget of how much money you make on average on your shift work, including the hours you work and all the expenses related to it (transportation, fuel, repair costs).
_Is your shift work worthwhile doing compared to other types of work?_
Check out [this link](https://www.jazzhr.com/blog/freelancer-vs-contractor-vs-permanent-employee-what-you-should-know-for-2021/#:~:text=Freelancers%20and%20contractors%20are%20self,work%20for%20a%20single%20company) to understand the differences.
Reflect on what changes you might need to bring to your life.
- Summary of my current situation
- My current plan
- What distractions do I have / My energy levels during the study
- Original plans I had after I finished the training
- Define them in short/medium/long-term goals
### Estimated time in hours
0.5
### What is the purpose of this assignment?
This exercise is for you to get a job in tech, whilst focussing on the right things and still having enough money to pay your bills.
### How to submit
**Optional**: you can discuss it with a peer or volunteer to get their feedback and insights.
### Anything else?
n/a
|
non_process
|
budget for shift work only for people without fixed income coursework content in a google sheet make a budget of how much money you make on average on your shift work including the hours you work and all the expenses related to it transportation fuel repair costs is your shift work worthwhile doing compared to other types of work check out to understand the differences reflect on what changes you might need to bring to your life summary of my current situation my current plan what distractions do i have my energy levels during the study original plans i had after i finished the training define them in short medium long term goals estimated time in hours what is the purpose of this assignment this exercise is for you to get a job in tech whilst focussing on the right things and still having enough money to pay your bills how to submit optional you can discuss it with a peer or volunteer to get their feedback and insights anything else n a
| 0
|
9,764
| 12,749,176,623
|
IssuesEvent
|
2020-06-26 21:59:15
|
googleapis/synthtool
|
https://api.github.com/repos/googleapis/synthtool
|
closed
|
Remove usage of `repos.json`
|
priority: p2 type: process
|
For node.js, python, and java [we currently use](https://github.com/googleapis/synthtool/blob/969a2340e74c73227e7c1638ed7650abcac22ee4/autosynth/providers/python.py#L46) [repos.json](https://github.com/googleapis/sloth/blob/master/repos.json) to identify all relevant repositories where autosynth should run.
We would like to [delete this file](https://github.com/googleapis/sloth/issues/730). Thinking out loud - I'm not sure why we would need repos.json. I was thinking that each repository that requires synth should own it's own processes for updating, and use a kokoro job on a cron to run `autosynth` instead of relying on a top level configuration. There could be other answers :)
|
1.0
|
Remove usage of `repos.json` - For node.js, python, and java [we currently use](https://github.com/googleapis/synthtool/blob/969a2340e74c73227e7c1638ed7650abcac22ee4/autosynth/providers/python.py#L46) [repos.json](https://github.com/googleapis/sloth/blob/master/repos.json) to identify all relevant repositories where autosynth should run.
We would like to [delete this file](https://github.com/googleapis/sloth/issues/730). Thinking out loud - I'm not sure why we would need repos.json. I was thinking that each repository that requires synth should own it's own processes for updating, and use a kokoro job on a cron to run `autosynth` instead of relying on a top level configuration. There could be other answers :)
|
process
|
remove usage of repos json for node js python and java to identify all relevant repositories where autosynth should run we would like to thinking out loud i m not sure why we would need repos json i was thinking that each repository that requires synth should own it s own processes for updating and use a kokoro job on a cron to run autosynth instead of relying on a top level configuration there could be other answers
| 1
|
4,066
| 2,610,086,835
|
IssuesEvent
|
2015-02-26 18:26:23
|
chrsmith/dsdsdaadf
|
https://api.github.com/repos/chrsmith/dsdsdaadf
|
opened
|
深圳除去痘痘
|
auto-migrated Priority-Medium Type-Defect
|
```
深圳除去痘痘【深圳韩方科颜全国热线400-869-1818,24小时QQ4008
691818】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘方—��
�韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方科颜�
��业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健康祛
痘技术并结合先进“先进豪华彩光”仪,开创国内专业治疗��
�刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘痘。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 7:15
|
1.0
|
深圳除去痘痘 - ```
深圳除去痘痘【深圳韩方科颜全国热线400-869-1818,24小时QQ4008
691818】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘方—��
�韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方科颜�
��业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健康祛
痘技术并结合先进“先进豪华彩光”仪,开创国内专业治疗��
�刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘痘。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 7:15
|
non_process
|
深圳除去痘痘 深圳除去痘痘【 , 】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘方—�� �韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方科颜� ��业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健康祛 痘技术并结合先进“先进豪华彩光”仪,开创国内专业治疗�� �刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘痘。 original issue reported on code google com by szft com on may at
| 0
|
17,144
| 22,691,369,668
|
IssuesEvent
|
2022-07-04 20:55:39
|
procrastinate-org/procrastinate
|
https://api.github.com/repos/procrastinate-org/procrastinate
|
closed
|
Configure Renovate to automerge simple PRs (for devDependencies & incremental deps)
|
Issue type: Bug 🐞 Issue type: Process ⚙️ Issue status: Blocked ⛔️
|
See [renovate doc](https://docs.renovatebot.com/configuration-options/) for details. It's about "autoMerge" and "lockfileMaintenance"
|
1.0
|
Configure Renovate to automerge simple PRs (for devDependencies & incremental deps) - See [renovate doc](https://docs.renovatebot.com/configuration-options/) for details. It's about "autoMerge" and "lockfileMaintenance"
|
process
|
configure renovate to automerge simple prs for devdependencies incremental deps see for details it s about automerge and lockfilemaintenance
| 1
|
16,509
| 21,518,607,401
|
IssuesEvent
|
2022-04-28 12:21:11
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
opened
|
Resolving an incident should make job activatable by writing an event
|
kind/toil scope/broker team/process-automation area/maintainability
|
**Description**
Currently, when an incident on a service task is resolved, the job is made activatable again only in the state. This is done by the IncidentResolvedEventApplier. However, such a state change should be done by a Job event applier.
Originally this was part of #9219, but extracted to this issue to reduce the scope and focus on fixing the bug in that PR.
We'll probably need a new JobIntent (e.g. `ACTIVATABLE` which also already exists as state in JobState) and an associated event applier. This would bring the event appliers logic closer to idempotently placing the event's data in the state. It would also allow exporter consumers to know when a job has become activatable again.
|
1.0
|
Resolving an incident should make job activatable by writing an event - **Description**
Currently, when an incident on a service task is resolved, the job is made activatable again only in the state. This is done by the IncidentResolvedEventApplier. However, such a state change should be done by a Job event applier.
Originally this was part of #9219, but extracted to this issue to reduce the scope and focus on fixing the bug in that PR.
We'll probably need a new JobIntent (e.g. `ACTIVATABLE` which also already exists as state in JobState) and an associated event applier. This would bring the event appliers logic closer to idempotently placing the event's data in the state. It would also allow exporter consumers to know when a job has become activatable again.
|
process
|
resolving an incident should make job activatable by writing an event description currently when an incident on a service task is resolved the job is made activatable again only in the state this is done by the incidentresolvedeventapplier however such a state change should be done by a job event applier originally this was part of but extracted to this issue to reduce the scope and focus on fixing the bug in that pr we ll probably need a new jobintent e g activatable which also already exists as state in jobstate and an associated event applier this would bring the event appliers logic closer to idempotently placing the event s data in the state it would also allow exporter consumers to know when a job has become activatable again
| 1
|
417,025
| 12,154,900,597
|
IssuesEvent
|
2020-04-25 10:35:37
|
mm-masahiro/profile-portfolio
|
https://api.github.com/repos/mm-masahiro/profile-portfolio
|
reopened
|
README.mdの改善
|
high priority improvement
|
## 何をやるのか
- デザインのリンクを貼るんじゃなくて、スクショを用意して表示する用意する(ポートフォリオを見る側が見辛い)
- 使用技術を書く
- その他書いた方が良さそうなことを自分で考えて書く
## なぜそれをやるのか
- ポートフォリオは最終的に人に見てもらうものだからしっかりと体裁を整えて見やすい形にしておく必要がある
## 参考リンク(あれば)
- 全然聞いてないけどこんなんあった
https://www.youtube.com/watch?v=vpldv8TZQtY
|
1.0
|
README.mdの改善 - ## 何をやるのか
- デザインのリンクを貼るんじゃなくて、スクショを用意して表示する用意する(ポートフォリオを見る側が見辛い)
- 使用技術を書く
- その他書いた方が良さそうなことを自分で考えて書く
## なぜそれをやるのか
- ポートフォリオは最終的に人に見てもらうものだからしっかりと体裁を整えて見やすい形にしておく必要がある
## 参考リンク(あれば)
- 全然聞いてないけどこんなんあった
https://www.youtube.com/watch?v=vpldv8TZQtY
|
non_process
|
readme mdの改善 何をやるのか デザインのリンクを貼るんじゃなくて、スクショを用意して表示する用意する(ポートフォリオを見る側が見辛い) 使用技術を書く その他書いた方が良さそうなことを自分で考えて書く なぜそれをやるのか ポートフォリオは最終的に人に見てもらうものだからしっかりと体裁を整えて見やすい形にしておく必要がある 参考リンク(あれば) 全然聞いてないけどこんなんあった
| 0
|
13,025
| 15,380,251,560
|
IssuesEvent
|
2021-03-02 20:49:36
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Pivot queries aren't recorded to query execution log
|
Priority:P2 Querying/Processor Type:Bug
|
Because we're using `qp/process-query` instead of `process-query-and-save-execution!`
|
1.0
|
Pivot queries aren't recorded to query execution log - Because we're using `qp/process-query` instead of `process-query-and-save-execution!`
|
process
|
pivot queries aren t recorded to query execution log because we re using qp process query instead of process query and save execution
| 1
|
52,713
| 22,355,213,841
|
IssuesEvent
|
2022-06-15 15:06:34
|
cityofaustin/atd-data-tech
|
https://api.github.com/repos/cityofaustin/atd-data-tech
|
closed
|
Update Technician - Direct cost in Signs and Markings
|
Workgroup: SMB Type: IT Support Service: Apps Product: Signs & Markings
|
<!-- Email -->
<!-- christina.tremel@austintexas.gov -->
> What application are you using?
Signs & Markings Operations
> Describe the problem.
Need to update Technician - Direct cost to potentially include more decimal places. Price needs to be 41.4684007692308 to match up with the FY22 cost calculator that was created by Finance. We submitted a very similar request ( https://github.com/cityofaustin/atd-data-tech/issues/8777) to update the Technician - Indirect cost.
> How soon do you need this?
Soon — This week
> Requested By
Christina T.
Request ID: DTS22-104220
|
1.0
|
Update Technician - Direct cost in Signs and Markings - <!-- Email -->
<!-- christina.tremel@austintexas.gov -->
> What application are you using?
Signs & Markings Operations
> Describe the problem.
Need to update Technician - Direct cost to potentially include more decimal places. Price needs to be 41.4684007692308 to match up with the FY22 cost calculator that was created by Finance. We submitted a very similar request ( https://github.com/cityofaustin/atd-data-tech/issues/8777) to update the Technician - Indirect cost.
> How soon do you need this?
Soon — This week
> Requested By
Christina T.
Request ID: DTS22-104220
|
non_process
|
update technician direct cost in signs and markings what application are you using signs markings operations describe the problem need to update technician direct cost to potentially include more decimal places price needs to be to match up with the cost calculator that was created by finance we submitted a very similar request to update the technician indirect cost how soon do you need this soon — this week requested by christina t request id
| 0
|
5,526
| 8,381,095,024
|
IssuesEvent
|
2018-10-07 21:16:25
|
saguaroib/saguaro
|
https://api.github.com/repos/saguaroib/saguaro
|
closed
|
Deletion class doesn't check for ghost bumping.
|
Administrative Post/text processing Revisit
|
TO-DO:
Prevent users from ghost bumping. Deletion timers maybe?
|
1.0
|
Deletion class doesn't check for ghost bumping. - TO-DO:
Prevent users from ghost bumping. Deletion timers maybe?
|
process
|
deletion class doesn t check for ghost bumping to do prevent users from ghost bumping deletion timers maybe
| 1
|
11,635
| 14,493,566,176
|
IssuesEvent
|
2020-12-11 08:42:05
|
GoogleCloudPlatform/dotnet-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/dotnet-docs-samples
|
closed
|
[Storage] Deactivate GoogleCloudSamples.StorageTest.TestDownloadObjectRequesterPays because of flakiness.
|
api: storage priority: p1 samples type: process
|
It's not clear why it's failing.
Have deactivate in PR #1061.
|
1.0
|
[Storage] Deactivate GoogleCloudSamples.StorageTest.TestDownloadObjectRequesterPays because of flakiness. - It's not clear why it's failing.
Have deactivate in PR #1061.
|
process
|
deactivate googlecloudsamples storagetest testdownloadobjectrequesterpays because of flakiness it s not clear why it s failing have deactivate in pr
| 1
|
305,170
| 26,367,685,488
|
IssuesEvent
|
2023-01-11 17:52:04
|
influxdata/influxdb_iox
|
https://api.github.com/repos/influxdata/influxdb_iox
|
closed
|
Flaky test: `end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet`
|
flaky test
|
Example in https://app.circleci.com/pipelines/github/influxdata/influxdb_iox/27431/workflows/95af4f59-5560-4d30-bded-fd2d987a186f/jobs/245010 :
```text
thread 'end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet' panicked at 'did not get additional Parquet files in the catalog: Elapsed(())', /home/rust/project/test_helpers_end_to_end/src/client.rs:148:6
stack backtrace:
0: rust_begin_unwind
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/std/src/panicking.rs:575:5
1: core::panicking::panic_fmt
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/panicking.rs:65:14
2: core::result::unwrap_failed
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/result.rs:1791:5
3: core::result::Result<T,E>::expect
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/result.rs:1070:23
4: test_helpers_end_to_end::client::wait_for_new_parquet_file::{{closure}}
at /home/rust/project/test_helpers_end_to_end/src/client.rs:124:5
5: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/future/mod.rs:91:19
6: test_helpers_end_to_end::steps::StepTest::run::{{closure}}
at /home/rust/project/test_helpers_end_to_end/src/steps.rs:205:21
7: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/future/mod.rs:91:19
8: end_to_end::end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet::{{closure}}
at ./tests/end_to_end_cases/querier.rs:922:9
9: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/future/mod.rs:91:19
10: <core::pin::Pin<P> as core::future::future::Future>::poll
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/future/future.rs:124:9
11: tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}::{{closure}}::{{closure}}
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:531:57
12: tokio::runtime::coop::with_budget
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/coop.rs:102:5
13: tokio::runtime::coop::budget
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/coop.rs:68:5
14: tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}::{{closure}}
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:531:25
15: tokio::runtime::scheduler::current_thread::Context::enter
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:340:19
16: tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:530:36
17: tokio::runtime::scheduler::current_thread::CoreGuard::enter::{{closure}}
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:601:57
18: tokio::macros::scoped_tls::ScopedKey<T>::set
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/macros/scoped_tls.rs:61:9
19: tokio::runtime::scheduler::current_thread::CoreGuard::enter
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:601:27
20: tokio::runtime::scheduler::current_thread::CoreGuard::block_on
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:520:19
21: tokio::runtime::scheduler::current_thread::CurrentThread::block_on
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:154:24
22: tokio::runtime::runtime::Runtime::block_on
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/runtime.rs:279:47
23: end_to_end::end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet
at ./tests/end_to_end_cases/querier.rs:901:9
24: end_to_end::end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet::{{closure}}
at ./tests/end_to_end_cases/querier.rs:892:11
25: core::ops::function::FnOnce::call_once
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/ops/function.rs:251:5
26: core::ops::function::FnOnce::call_once
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/ops/function.rs:251:5
```
|
1.0
|
Flaky test: `end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet` - Example in https://app.circleci.com/pipelines/github/influxdata/influxdb_iox/27431/workflows/95af4f59-5560-4d30-bded-fd2d987a186f/jobs/245010 :
```text
thread 'end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet' panicked at 'did not get additional Parquet files in the catalog: Elapsed(())', /home/rust/project/test_helpers_end_to_end/src/client.rs:148:6
stack backtrace:
0: rust_begin_unwind
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/std/src/panicking.rs:575:5
1: core::panicking::panic_fmt
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/panicking.rs:65:14
2: core::result::unwrap_failed
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/result.rs:1791:5
3: core::result::Result<T,E>::expect
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/result.rs:1070:23
4: test_helpers_end_to_end::client::wait_for_new_parquet_file::{{closure}}
at /home/rust/project/test_helpers_end_to_end/src/client.rs:124:5
5: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/future/mod.rs:91:19
6: test_helpers_end_to_end::steps::StepTest::run::{{closure}}
at /home/rust/project/test_helpers_end_to_end/src/steps.rs:205:21
7: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/future/mod.rs:91:19
8: end_to_end::end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet::{{closure}}
at ./tests/end_to_end_cases/querier.rs:922:9
9: <core::future::from_generator::GenFuture<T> as core::future::future::Future>::poll
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/future/mod.rs:91:19
10: <core::pin::Pin<P> as core::future::future::Future>::poll
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/future/future.rs:124:9
11: tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}::{{closure}}::{{closure}}
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:531:57
12: tokio::runtime::coop::with_budget
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/coop.rs:102:5
13: tokio::runtime::coop::budget
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/coop.rs:68:5
14: tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}::{{closure}}
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:531:25
15: tokio::runtime::scheduler::current_thread::Context::enter
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:340:19
16: tokio::runtime::scheduler::current_thread::CoreGuard::block_on::{{closure}}
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:530:36
17: tokio::runtime::scheduler::current_thread::CoreGuard::enter::{{closure}}
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:601:57
18: tokio::macros::scoped_tls::ScopedKey<T>::set
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/macros/scoped_tls.rs:61:9
19: tokio::runtime::scheduler::current_thread::CoreGuard::enter
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:601:27
20: tokio::runtime::scheduler::current_thread::CoreGuard::block_on
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:520:19
21: tokio::runtime::scheduler::current_thread::CurrentThread::block_on
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/scheduler/current_thread.rs:154:24
22: tokio::runtime::runtime::Runtime::block_on
at /usr/local/cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.22.0/src/runtime/runtime.rs:279:47
23: end_to_end::end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet
at ./tests/end_to_end_cases/querier.rs:901:9
24: end_to_end::end_to_end_cases::querier::kafkaless_rpc_write::basic_on_parquet::{{closure}}
at ./tests/end_to_end_cases/querier.rs:892:11
25: core::ops::function::FnOnce::call_once
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/ops/function.rs:251:5
26: core::ops::function::FnOnce::call_once
at /rustc/69f9c33d71c871fc16ac445211281c6e7a340943/library/core/src/ops/function.rs:251:5
```
|
non_process
|
flaky test end to end cases querier kafkaless rpc write basic on parquet example in text thread end to end cases querier kafkaless rpc write basic on parquet panicked at did not get additional parquet files in the catalog elapsed home rust project test helpers end to end src client rs stack backtrace rust begin unwind at rustc library std src panicking rs core panicking panic fmt at rustc library core src panicking rs core result unwrap failed at rustc library core src result rs core result result expect at rustc library core src result rs test helpers end to end client wait for new parquet file closure at home rust project test helpers end to end src client rs as core future future future poll at rustc library core src future mod rs test helpers end to end steps steptest run closure at home rust project test helpers end to end src steps rs as core future future future poll at rustc library core src future mod rs end to end end to end cases querier kafkaless rpc write basic on parquet closure at tests end to end cases querier rs as core future future future poll at rustc library core src future mod rs as core future future future poll at rustc library core src future future rs tokio runtime scheduler current thread coreguard block on closure closure closure at usr local cargo registry src github com tokio src runtime scheduler current thread rs tokio runtime coop with budget at usr local cargo registry src github com tokio src runtime coop rs tokio runtime coop budget at usr local cargo registry src github com tokio src runtime coop rs tokio runtime scheduler current thread coreguard block on closure closure at usr local cargo registry src github com tokio src runtime scheduler current thread rs tokio runtime scheduler current thread context enter at usr local cargo registry src github com tokio src runtime scheduler current thread rs tokio runtime scheduler current thread coreguard block on closure at usr local cargo registry src github com tokio src runtime scheduler current thread rs tokio runtime scheduler current thread coreguard enter closure at usr local cargo registry src github com tokio src runtime scheduler current thread rs tokio macros scoped tls scopedkey set at usr local cargo registry src github com tokio src macros scoped tls rs tokio runtime scheduler current thread coreguard enter at usr local cargo registry src github com tokio src runtime scheduler current thread rs tokio runtime scheduler current thread coreguard block on at usr local cargo registry src github com tokio src runtime scheduler current thread rs tokio runtime scheduler current thread currentthread block on at usr local cargo registry src github com tokio src runtime scheduler current thread rs tokio runtime runtime runtime block on at usr local cargo registry src github com tokio src runtime runtime rs end to end end to end cases querier kafkaless rpc write basic on parquet at tests end to end cases querier rs end to end end to end cases querier kafkaless rpc write basic on parquet closure at tests end to end cases querier rs core ops function fnonce call once at rustc library core src ops function rs core ops function fnonce call once at rustc library core src ops function rs
| 0
|
14,446
| 17,500,591,094
|
IssuesEvent
|
2021-08-10 08:57:29
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[iOS] Blank screen is displayed on failing eligibility test in a scenario
|
Bug P1 iOS Process: Fixed Process: Tested QA Process: Tested dev
|
**Steps:**
1. Freshly install the app
2. Click on 'Get Started'
3. Click on any study with eligibility test configured
4. Click on Participate. Navigated to signin screen
5. Signin successfully
6. Fail the eligibility test
7. Observe blank white screen is displayed and user is unable to navigate further
**Actual:** Blank screen is displayed on failing eligibility test in a scenario
**Expected:** Study overview screen should be displayed post failing eligibility test
Issue not observed when user signin into app > navigate to studies list > participate into study > fail the eligibility test
https://user-images.githubusercontent.com/60386291/128353031-18edc1e6-846e-4ea4-9cb5-160bcf952d35.MOV
|
3.0
|
[iOS] Blank screen is displayed on failing eligibility test in a scenario - **Steps:**
1. Freshly install the app
2. Click on 'Get Started'
3. Click on any study with eligibility test configured
4. Click on Participate. Navigated to signin screen
5. Signin successfully
6. Fail the eligibility test
7. Observe blank white screen is displayed and user is unable to navigate further
**Actual:** Blank screen is displayed on failing eligibility test in a scenario
**Expected:** Study overview screen should be displayed post failing eligibility test
Issue not observed when user signin into app > navigate to studies list > participate into study > fail the eligibility test
https://user-images.githubusercontent.com/60386291/128353031-18edc1e6-846e-4ea4-9cb5-160bcf952d35.MOV
|
process
|
blank screen is displayed on failing eligibility test in a scenario steps freshly install the app click on get started click on any study with eligibility test configured click on participate navigated to signin screen signin successfully fail the eligibility test observe blank white screen is displayed and user is unable to navigate further actual blank screen is displayed on failing eligibility test in a scenario expected study overview screen should be displayed post failing eligibility test issue not observed when user signin into app navigate to studies list participate into study fail the eligibility test
| 1
|
14,694
| 17,858,599,251
|
IssuesEvent
|
2021-09-05 14:22:15
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
I want to know how arm64 should be built?
|
P3 type: support / not a bug (process) team-Rules-CPP
|
I want to know how arm64 should be built?
```shell
➜ stage1 git:(master) ✗ gcc main/hello-world.cc -lstdc++
➜ stage1 git:(master) ✗ file a.out
a.out: Mach-O 64-bit executable x86_64
➜ stage1 git:(master) ✗ arch --arm64e gcc main/hello-world.cc -lstdc++
➜ stage1 git:(master) ✗ file a.out
a.out: Mach-O 64-bit executable arm64
➜ stage1 git:(master) ✗
```
### What operating system are you running Bazel on?
> MacBook Air (M1, 2020)
> MacOs big sur
> Build label: 4.0.0-homebrew
> Build target: bazel-out/darwin-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
> Build time: Sat Jan 23 02:10:29 2021 (1611367829)
> Build timestamp: 1611367829
> Build timestamp as int: 1611367829
|
1.0
|
I want to know how arm64 should be built? - I want to know how arm64 should be built?
```shell
➜ stage1 git:(master) ✗ gcc main/hello-world.cc -lstdc++
➜ stage1 git:(master) ✗ file a.out
a.out: Mach-O 64-bit executable x86_64
➜ stage1 git:(master) ✗ arch --arm64e gcc main/hello-world.cc -lstdc++
➜ stage1 git:(master) ✗ file a.out
a.out: Mach-O 64-bit executable arm64
➜ stage1 git:(master) ✗
```
### What operating system are you running Bazel on?
> MacBook Air (M1, 2020)
> MacOs big sur
> Build label: 4.0.0-homebrew
> Build target: bazel-out/darwin-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
> Build time: Sat Jan 23 02:10:29 2021 (1611367829)
> Build timestamp: 1611367829
> Build timestamp as int: 1611367829
|
process
|
i want to know how should be built i want to know how should be built shell ➜ git master ✗ gcc main hello world cc lstdc ➜ git master ✗ file a out a out mach o bit executable ➜ git master ✗ arch gcc main hello world cc lstdc ➜ git master ✗ file a out a out mach o bit executable ➜ git master ✗ what operating system are you running bazel on macbook air macos big sur build label homebrew build target bazel out darwin opt bin src main java com google devtools build lib bazel bazelserver deploy jar build time sat jan build timestamp build timestamp as int
| 1
|
133,080
| 18,279,021,733
|
IssuesEvent
|
2021-10-04 23:07:33
|
occmundial/occ-atomic
|
https://api.github.com/repos/occmundial/occ-atomic
|
closed
|
CVE-2021-37712 (High) detected in tar-4.4.8.tgz - autoclosed
|
security vulnerability
|
## CVE-2021-37712 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.8.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.8.tgz">https://registry.npmjs.org/tar/-/tar-4.4.8.tgz</a></p>
<p>Path to dependency file: occ-atomic/package.json</p>
<p>Path to vulnerable library: occ-atomic/node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- webpack-dev-server-3.7.2.tgz (Root Library)
- chokidar-2.1.6.tgz
- fsevents-1.2.9.tgz
- node-pre-gyp-0.12.0.tgz
- :x: **tar-4.4.8.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/occmundial/occ-atomic/commit/31fdd81a799b8f96cdae09899abd668af1b3ef09">31fdd81a799b8f96cdae09899abd668af1b3ef09</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution: tar - 4.4.18, 5.0.10, 6.1.9</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-37712 (High) detected in tar-4.4.8.tgz - autoclosed - ## CVE-2021-37712 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.8.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.8.tgz">https://registry.npmjs.org/tar/-/tar-4.4.8.tgz</a></p>
<p>Path to dependency file: occ-atomic/package.json</p>
<p>Path to vulnerable library: occ-atomic/node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- webpack-dev-server-3.7.2.tgz (Root Library)
- chokidar-2.1.6.tgz
- fsevents-1.2.9.tgz
- node-pre-gyp-0.12.0.tgz
- :x: **tar-4.4.8.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/occmundial/occ-atomic/commit/31fdd81a799b8f96cdae09899abd668af1b3ef09">31fdd81a799b8f96cdae09899abd668af1b3ef09</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution: tar - 4.4.18, 5.0.10, 6.1.9</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in tar tgz autoclosed cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href path to dependency file occ atomic package json path to vulnerable library occ atomic node modules tar package json dependency hierarchy webpack dev server tgz root library chokidar tgz fsevents tgz node pre gyp tgz x tar tgz vulnerable library found in head commit a href found in base branch main vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value additionally on windows systems long path portions would resolve to the same file system entities as their short path counterparts a specially crafted tar archive could thus include a directory with one form of the path followed by a symbolic link with a different string that resolves to the same file system entity followed by a file using the first form by first creating a directory and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar step up your open source security game with whitesource
| 0
|
8,129
| 11,308,419,832
|
IssuesEvent
|
2020-01-19 05:31:05
|
kubeflow/pipelines
|
https://api.github.com/repos/kubeflow/pipelines
|
closed
|
Proxy agent failed to claim URL
|
kind/process priority/p1
|
The error message
2020/01/15 18:55:19 Failed to read pending requests: "A proxy request failed: \"Get nullagent/pending: unsupported protocol scheme \\\"\\\"\""
This is due to a malformed backend Id given by the agent. this is might be introduced by
https://github.com/kubeflow/pipelines/commit/b5c54e1ba7aaf052a9a2bd21826429454b5f300c#diff-986b1153a97bee60b4f31250ea1ed45c
|
1.0
|
Proxy agent failed to claim URL - The error message
2020/01/15 18:55:19 Failed to read pending requests: "A proxy request failed: \"Get nullagent/pending: unsupported protocol scheme \\\"\\\"\""
This is due to a malformed backend Id given by the agent. this is might be introduced by
https://github.com/kubeflow/pipelines/commit/b5c54e1ba7aaf052a9a2bd21826429454b5f300c#diff-986b1153a97bee60b4f31250ea1ed45c
|
process
|
proxy agent failed to claim url the error message failed to read pending requests a proxy request failed get nullagent pending unsupported protocol scheme this is due to a malformed backend id given by the agent this is might be introduced by
| 1
|
19,749
| 26,112,520,752
|
IssuesEvent
|
2022-12-27 22:37:32
|
firebase/firebase-cpp-sdk
|
https://api.github.com/repos/firebase/firebase-cpp-sdk
|
closed
|
[C++] Nightly Integration Testing Report
|
type: process nightly-testing
|
Note: This report excludes firestore. Please also check **[the report for firestore](https://github.com/firebase/firebase-cpp-sdk/issues/1166)**
***
<hidden value="integration-test-status-comment"></hidden>
### ❌ [build against repo] Integration test FAILED
Requested by @DellaBitta on commit 3095517de64f316fee6ad3e978163f584f91bb67
Last updated: Tue Dec 27 03:05 PST 2022
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3786329233)**
| Failures | Configs |
|----------|---------|
| gma | [TEST] [FAILURE] [iOS] [macos] [2/6 ios_device: ios_min ios_latest]<details><summary>(2 failed tests)</summary> FirebaseGmaTest.TestRewardedAdLoad<br/> FirebaseGmaTest.TestRewardedAdLoadEmptyRequest</details>[TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> FirebaseGmaTest.TestRewardedAdStress</details>[TEST] [FLAKINESS] [iOS] [macos] [1/6 ios_device: ios_target]<details><summary>(2 failed tests)</summary> FirebaseGmaTest.TestRewardedAdLoad<br/> FirebaseGmaTest.TestRewardedAdLoadEmptyRequest</details> |
| messaging | [TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
| storage | [TEST] [FLAKINESS] [Android] [2/3 os: windows ubuntu] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
<hidden value="integration-test-status-comment"></hidden>
***
### [build against SDK] Integration test with FLAKINESS (succeeded after retry)
Requested by @firebase-workflow-trigger[bot] on commit 3095517de64f316fee6ad3e978163f584f91bb67
Last updated: Tue Dec 27 05:48 PST 2022
**[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/3787453255)**
| Failures | Configs |
|----------|---------|
| analytics | [TEST] [FLAKINESS] [iOS] [macos] [1/6 ios_device: ios_latest]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
| storage | [TEST] [FLAKINESS] [Android] [1/3 os: macos] [1/4 android_device: android_target]<details><summary>(1 failed tests)</summary> CRASH/TIMEOUT</details> |
Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)**
|
1.0
|
|
process
|
| 1
|
21,428
| 29,359,594,325
|
IssuesEvent
|
2023-05-28 00:37:22
|
devssa/onde-codar-em-salvador
|
https://api.github.com/repos/devssa/onde-codar-em-salvador
|
closed
|
[Hibrido / Aclimação, São Paulo, Brazil] Backend Developer C# na Coodesh
|
SALVADOR BACK-END REQUISITOS VB.NET PROCESSOS BACKEND GITHUB WEBSERVICES E-COMMERCE UMA C R APIs ERP MANUTENÇÃO NEGÓCIOS MS SQL WOOCOMMERCE ALOCADO Stale
|
## Job description:
This is a position from a partner of the Coodesh platform; by applying you will have access to complete information about the company and its benefits.
Watch for the redirect that will take you to a url [https://coodesh.com](https://coodesh.com/jobs/desenvolvedor-ecommerce-c-204542696?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) with the personalized application pop-up. 👋
<p><strong>Capta Tecnologia</strong> is looking for a <strong><ins>Backend Developer C#</ins></strong> to join its team!</p>
<p>We are looking for a developer with experience in ERP integration projects with e-Commerce stores (VTEX, MAGENTO, WOOCOMMERCE, TRAY, or others). You will be responsible for, and help with, maintaining our ERP integrator and developing a new version using Visual Studio 2022 and .NET CORE 6.</p>
<p></p>
<p>Work location: Aclimação district, São Paulo.</p>
<p></p>
## Capta Tecnologia:
<p>Developed over the years around the ERP concept, the CAPTA System comprises several modules aimed at different business structures and different company sectors, while integrating all departments in the same way, without duplicated operations or information, and with the same operational interface. The System is comprehensive and configurable, achieving strong adherence to companies' processes, which, although similar, present unique situations in their operations that demand flexibility and fast response.</p>
## Skills:
- .NET
- C# .NET Core
- API
## Location:
Aclimação, São Paulo, Brazil
## Requirements:
- Knowledge of C#;
- Creating and maintaining APIs and web services;
- Having developed or maintained interfaces with eCommerce platforms (VTEX, MAGENTO, WOOCOMMERCE, TRAY, or others) through their integration APIs;
- MS SQL Server.
## Nice to have:
- Experience with .Net Core;
- Knowledge of LINQ/ADO;
- Previous experience with VB.NET and WPF.
## Benefits:
- Cajú benefits card, R$32 per day, for CLT employees;
- Sul América health and dental plans, 100% subsidized for the employee;
- Profit sharing (PLR);
- Life insurance;
- Transportation allowance.
## How to apply:
Apply exclusively through the Coodesh platform at the following link: [Backend Developer C# na Capta Tecnologia](https://coodesh.com/jobs/desenvolvedor-ecommerce-c-204542696?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
After applying via the Coodesh platform and validating your login, you can follow the process and receive all of its updates there. Use the **Pedir Feedback** (Request Feedback) option between one stage and the next of the position you applied to. This will notify the **Recruiter** responsible for the process at the company.
## Labels
#### Allocation
On-site (Alocado)
#### Category
Back-End
|
1.0
|
|
process
|
| 1
|
207,579
| 23,459,552,443
|
IssuesEvent
|
2022-08-16 11:59:28
|
Gal-Doron/Baragon-test-10
|
https://api.github.com/repos/Gal-Doron/Baragon-test-10
|
opened
|
jetty-server-9.4.18.v20190429.jar: 4 vulnerabilities (highest severity is: 5.3)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p></summary>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /itory/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-10/commit/ad783b5f93a5095e0f7b4e1975fad0a54bdc7287">ad783b5f93a5095e0f7b4e1975fad0a54bdc7287</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-28169](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28169) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.3 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.41.v20210516 | ✅ |
| [CVE-2020-27218](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-27218) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.8 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.35.v20201120 | ✅ |
| [CVE-2021-34428](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-34428) | <img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low | 3.5 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.41.v20210516 | ✅ |
| [CVE-2022-2047](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-2047) | <img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low | 2.7 | jetty-server-9.4.18.v20190429.jar | Direct | 9.4.47.v20220610 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-28169</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /itory/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-10/commit/ad783b5f93a5095e0f7b4e1975fad0a54bdc7287">ad783b5f93a5095e0f7b4e1975fad0a54bdc7287</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
For Eclipse Jetty versions <= 9.4.40, <= 10.0.2, <= 11.0.2, it is possible for requests to the ConcatServlet with a doubly encoded path to access protected resources within the WEB-INF directory. For example a request to `/concat?/%2557EB-INF/web.xml` can retrieve the web.xml file. This can reveal sensitive information regarding the implementation of a web application.
<p>Publish Date: 2021-06-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28169>CVE-2021-28169</a></p>
</p>
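The `%2557` trick described above relies on double URL decoding: a filter that decodes the request path only once still sees `%57EB-INF` rather than `WEB-INF`, so a naive check for `/WEB-INF` is bypassed. A minimal Python sketch (illustrative only, not part of the advisory) of how the doubly encoded path resolves:

```python
from urllib.parse import unquote

# The doubly encoded path from CVE-2021-28169.
raw = "/concat?/%2557EB-INF/web.xml"

# First decode: "%25" becomes "%", leaving the still-encoded "%57EB-INF".
once = unquote(raw)
# Second decode: "%57" becomes "W", exposing the protected WEB-INF path.
twice = unquote(once)

print(once)   # /concat?/%57EB-INF/web.xml
print(twice)  # /concat?/WEB-INF/web.xml
```

This is why the fix must normalize the path fully before any access-control comparison, rather than comparing against a singly decoded string.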
<p></p>
### CVSS 3 Score Details (<b>5.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-gwcr-j4wh-j3cq">https://github.com/eclipse/jetty.project/security/advisories/GHSA-gwcr-j4wh-j3cq</a></p>
<p>Release Date: 2021-06-09</p>
<p>Fix Resolution: 9.4.41.v20210516</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-27218</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /itory/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-10/commit/ad783b5f93a5095e0f7b4e1975fad0a54bdc7287">ad783b5f93a5095e0f7b4e1975fad0a54bdc7287</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty version 9.4.0.RC0 to 9.4.34.v20201102, 10.0.0.alpha0 to 10.0.0.beta2, and 11.0.0.alpha0 to 11.0.0.beta2, if GZIP request body inflation is enabled and requests from different clients are multiplexed onto a single connection, and if an attacker can send a request with a body that is received entirely but not consumed by the application, then a subsequent request on the same connection will see that body prepended to its body. The attacker will not see any data but may inject data into the body of the subsequent request.
<p>Publish Date: 2020-11-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-27218>CVE-2020-27218</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-86wm-rrjm-8wh8">https://github.com/eclipse/jetty.project/security/advisories/GHSA-86wm-rrjm-8wh8</a></p>
<p>Release Date: 2020-11-28</p>
<p>Fix Resolution: 9.4.35.v20201120</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2021-34428</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /itory/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-10/commit/ad783b5f93a5095e0f7b4e1975fad0a54bdc7287">ad783b5f93a5095e0f7b4e1975fad0a54bdc7287</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
For Eclipse Jetty versions <= 9.4.40, <= 10.0.2, <= 11.0.2, if an exception is thrown from the SessionListener#sessionDestroyed() method, then the session ID is not invalidated in the session ID manager. On deployments with clustered sessions and multiple contexts this can result in a session not being invalidated. This can result in an application used on a shared computer being left logged in.
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-34428>CVE-2021-34428</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>3.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Physical
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-m6cp-vxjx-65j6">https://github.com/eclipse/jetty.project/security/advisories/GHSA-m6cp-vxjx-65j6</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: 9.4.41.v20210516</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2022-2047</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /itory/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-10/commit/ad783b5f93a5095e0f7b4e1975fad0a54bdc7287">ad783b5f93a5095e0f7b4e1975fad0a54bdc7287</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty versions 9.4.0 thru 9.4.46, and 10.0.0 thru 10.0.9, and 11.0.0 thru 11.0.9 versions, the parsing of the authority segment of an http scheme URI, the Jetty HttpURI class improperly detects an invalid input as a hostname. This can lead to failures in a Proxy scenario.
<p>Publish Date: 2022-07-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-2047>CVE-2022-2047</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>2.7</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-cj7v-27pg-wf7q">https://github.com/eclipse/jetty.project/security/advisories/GHSA-cj7v-27pg-wf7q</a></p>
<p>Release Date: 2022-07-07</p>
<p>Fix Resolution: 9.4.47.v20220610</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2022-2047</summary>
### Vulnerable Library - <b>jetty-server-9.4.18.v20190429.jar</b></p>
<p>The core jetty server artifact.</p>
<p>Library home page: <a href="http://www.eclipse.org/jetty">http://www.eclipse.org/jetty</a></p>
<p>Path to dependency file: /BaragonAgentService/pom.xml</p>
<p>Path to vulnerable library: /itory/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar,/home/wss-scanner/.m2/repository/org/eclipse/jetty/jetty-server/9.4.18.v20190429/jetty-server-9.4.18.v20190429.jar</p>
<p>
Dependency Hierarchy:
- :x: **jetty-server-9.4.18.v20190429.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/Baragon-test-10/commit/ad783b5f93a5095e0f7b4e1975fad0a54bdc7287">ad783b5f93a5095e0f7b4e1975fad0a54bdc7287</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Eclipse Jetty versions 9.4.0 thru 9.4.46, and 10.0.0 thru 10.0.9, and 11.0.0 thru 11.0.9 versions, the parsing of the authority segment of an http scheme URI, the Jetty HttpURI class improperly detects an invalid input as a hostname. This can lead to failures in a Proxy scenario.
<p>Publish Date: 2022-07-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-2047>CVE-2022-2047</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>2.7</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-cj7v-27pg-wf7q">https://github.com/eclipse/jetty.project/security/advisories/GHSA-cj7v-27pg-wf7q</a></p>
<p>Release Date: 2022-07-07</p>
<p>Fix Resolution: 9.4.47.v20220610</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_process
|
jetty server jar vulnerabilities highest severity is vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file baragonagentservice pom xml path to vulnerable library itory org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available medium jetty server jar direct medium jetty server jar direct low jetty server jar direct low jetty server jar direct details cve vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file baragonagentservice pom xml path to vulnerable library itory org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar dependency hierarchy x jetty server jar vulnerable library found in head commit a href found in base branch master vulnerability details for eclipse jetty versions it is possible for requests to the concatservlet with a doubly encoded path to access protected resources within the web inf directory for example a request to concat inf web xml can retrieve the web xml file this can reveal sensitive information regarding the implementation of a web application publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable 
library jetty server jar the core jetty server artifact library home page a href path to dependency file baragonagentservice pom xml path to vulnerable library itory org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar dependency hierarchy x jetty server jar vulnerable library found in head commit a href found in base branch master vulnerability details in eclipse jetty version to to and to if gzip request body inflation is enabled and requests from different clients are multiplexed onto a single connection and if an attacker can send a request with a body that is received entirely but not consumed by the application then a subsequent request on the same connection will see that body prepended to its body the attacker will not see any data but may inject data into the body of the subsequent request publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file baragonagentservice pom xml path to vulnerable library itory org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar dependency hierarchy x jetty server jar vulnerable library found in head commit a href found in base branch master vulnerability details for eclipse jetty versions if an exception is 
thrown from the sessionlistener sessiondestroyed method then the session id is not invalidated in the session id manager on deployments with clustered sessions and multiple contexts this can result in a session not being invalidated this can result in an application used on a shared computer being left logged in publish date url a href cvss score details base score metrics exploitability metrics attack vector physical attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue cve vulnerable library jetty server jar the core jetty server artifact library home page a href path to dependency file baragonagentservice pom xml path to vulnerable library itory org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar home wss scanner repository org eclipse jetty jetty server jetty server jar dependency hierarchy x jetty server jar vulnerable library found in head commit a href found in base branch master vulnerability details in eclipse jetty versions thru and thru and thru versions the parsing of the authority segment of an http scheme uri the jetty httpuri class improperly detects an invalid input as a hostname this can lead to failures in a proxy scenario publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available 
for this issue rescue worker helmet automatic remediation is available for this issue
| 0
|
9,230
| 12,260,864,099
|
IssuesEvent
|
2020-05-06 19:02:38
|
ClickHouse/ClickHouse
|
https://api.github.com/repos/ClickHouse/ClickHouse
|
closed
|
Version 20.3 has problem with simple select * queries
|
comp-processors performance prio-crit st-need-info v20.3
|
**What exactly works slower than expected?**
We have ReplicatedMergeTree tables with 100 million to 1 billion records, and `select * from [table] limit 1` takes 10-20 seconds to respond.
I have tested the same table in 19, 20.1, and 20.3, and it seems 20.3 has a problem with `select *` queries.
**Which ClickHouse server version to use**
20.3.8.53
|
1.0
|
Version 20.3 has problem with simple select * queries - **What exactly works slower than expected?**
We have ReplicatedMergeTree tables with 100 million to 1 billion records, and `select * from [table] limit 1` takes 10-20 seconds to respond.
I have tested the same table in 19, 20.1, and 20.3, and it seems 20.3 has a problem with `select *` queries.
**Which ClickHouse server version to use**
20.3.8.53
|
process
|
version has problem with simple select queries what exactly works slower than expected we have replicatedmergetree tables with million billion records and select from limit takes seconds to respond i have tested the same table in and and it seems has problem with select queries which clickhouse server version to use
| 1
|
14,272
| 17,226,007,987
|
IssuesEvent
|
2021-07-20 01:50:47
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
QGIS 3.18 has an issue with generating XYZ Tiles (MBtiles) at a zoom level of 19+
|
Bug Feedback Processing stale
|
I'd like to report an issue with QGIS 3.18 concerning the generation of high-density XYZ tiles (MBTiles).
The issue I am about to report occurs while using QGIS's QuickMapsService plugin (a.k.a. QMS) to generate maps from the ESRI World Imagery maps. During the process, if I set the maximum zoom level to 19+ (these specific maps have a maximum zoom level of 22), QGIS fails to deliver what I've obtained. The program will complete the process, but when I try to view the newly generated map, nothing shows up in the view port navigator despite the process claiming to finish successfully. QGIS becomes very unresponsive and will just show a blank white screen in the view port. After waiting a long time, I check the task manager, in which Windows reports that 99% of the memory is in use. Soon, Windows starts throwing low-memory warnings, and my PC eventually becomes 99% unresponsive unless I end the QGIS process via the task manager. Is this some kind of memory leak? I have seen other users raise various issues that are related to mine, though not identical, also questioning whether QGIS suffers from a memory leak under certain circumstances. I am running these instances of QGIS 3.18 on Windows 7 Ultimate 64-bit. It's important to note that I have run into these issues on more than one PC: one uses an Intel i9 and the other an AMD Ryzen; one PC has 8GB of DDR3 while the other has 16GB of DDR4. I will note that on the PC with DDR4, the program will eventually load the map, but navigation is unrealistically slow to the point that it's useless to try to view it anyway.
In addition to this issue, sometimes while performing the "generate XYZ tiles (mbtiles)" operation for a large portion of a map (like a piece of a country), QGIS will throw an error titled "QGIS has unexpectedly crashed". This causes the project to end, and the already lengthy process is lost.
What is the course of action for this issue, and can/will it be fixed? Can QGIS actually handle these operations? What's the limit?
|
1.0
|
QGIS 3.18 has an issue with generating XYZ Tiles (MBtiles) at a zoom level of 19+ - I'd like to report an issue with QGIS 3.18 concerning the generation of high-density XYZ tiles (MBTiles).
The issue I am about to report occurs while using QGIS's QuickMapsService plugin (a.k.a. QMS) to generate maps from the ESRI World Imagery maps. During the process, if I set the maximum zoom level to 19+ (these specific maps have a maximum zoom level of 22), QGIS fails to deliver what I've obtained. The program will complete the process, but when I try to view the newly generated map, nothing shows up in the view port navigator despite the process claiming to finish successfully. QGIS becomes very unresponsive and will just show a blank white screen in the view port. After waiting a long time, I check the task manager, in which Windows reports that 99% of the memory is in use. Soon, Windows starts throwing low-memory warnings, and my PC eventually becomes 99% unresponsive unless I end the QGIS process via the task manager. Is this some kind of memory leak? I have seen other users raise various issues that are related to mine, though not identical, also questioning whether QGIS suffers from a memory leak under certain circumstances. I am running these instances of QGIS 3.18 on Windows 7 Ultimate 64-bit. It's important to note that I have run into these issues on more than one PC: one uses an Intel i9 and the other an AMD Ryzen; one PC has 8GB of DDR3 while the other has 16GB of DDR4. I will note that on the PC with DDR4, the program will eventually load the map, but navigation is unrealistically slow to the point that it's useless to try to view it anyway.
In addition to this issue, sometimes while performing the "generate XYZ tiles (mbtiles)" operation for a large portion of a map (like a piece of a country), QGIS will throw an error titled "QGIS has unexpectedly crashed". This causes the project to end, and the already lengthy process is lost.
What is the course of action for this issue, and can/will it be fixed? Can QGIS actually handle these operations? What's the limit?
|
process
|
qgis has an issue with generating xyz tiles mbtiles at a zoom level of i d like to report an issue with qgis concerning the generation of high density xyz tiles mbtiles the issues i am about to report occurs while using the qgis s quickmapsservice plugin a k a qms to generate maps from the esri world imagery maps during the process if i set the maximum zoom level to these specific maps have a maximum zoom level of qgis fails to deliver what ive obtained the program will complete the process but when i try to view the newly generated map nothing shows up on the view port navigator despite the process claiming to finish successfully qgis becomes very unresponsive and will just show a blank white screen in the view port after waiting a long time i check the task manager in which windows will report that of the memory is in use soon windows will start throwing low memory warnings and my pc eventually becoming unresponsive unless i end the qgis process via the task manager is this some kind of memory leak i have seen other users of various different issues that are related to mine but not exact also raise the question of qgis seemingly suffering from a memory leak under certain circumstances i am running these instances of qgis on windows ultimate its important to note that i have ran into these issues on more than pc one is using an intel and the other using an amd ryzen pc has of while the other has of i will note that on the pc with the program will eventually load the map but navigation is unrealistically slow to the point that its useless to try and view it anyway in addition to this issue sometimes while performing the generate xyz tiles mbtiles tiles for a large portion of a map like a piece of a country qgis will throw an error titled qgis has unexpectedly crashed this causes the project to end and the already lengthy process is lost what is the course of action for this issue and can will it be fixed can qgis actually handle these operations whats the limit
| 1
|
19,055
| 25,069,356,351
|
IssuesEvent
|
2022-11-07 10:52:11
|
open-telemetry/opentelemetry-collector-contrib
|
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
|
closed
|
[spanmetricsprocessor] Default latency buckets to include timeout periods
|
Stale processor/spanmetrics
|
**Is your feature request related to a problem? Please describe.**
Based on [IETF's document on HTTP timeouts](https://tools.ietf.org/id/draft-thomson-hybi-http-timeout-00.html), timeouts are typically implemented with a range of 30-120s.
As such, to avoid situations where latencies fall into the "everything else" bucket of a couple of hundred years, I think it would make sense to support timeout cases, and leave the "everything else" bucket for exceptional cases.
**Describe the solution you'd like**
Add 30s and 120s buckets to the default latency histogram buckets.
|
1.0
|
[spanmetricsprocessor] Default latency buckets to include timeout periods - **Is your feature request related to a problem? Please describe.**
Based on [IETF's document on HTTP timeouts](https://tools.ietf.org/id/draft-thomson-hybi-http-timeout-00.html), timeouts are typically implemented with a range of 30-120s.
As such, to avoid situations where latencies fall into the "everything else" bucket of a couple of hundred years, I think it would make sense to support timeout cases, and leave the "everything else" bucket for exceptional cases.
**Describe the solution you'd like**
Add 30s and 120s buckets to the default latency histogram buckets.
|
process
|
default latency buckets to include timeout periods is your feature request related to a problem please describe based on timeouts are typically implemented with a range of as such to avoid situations where latencies fall into the everything else bucket of a couple of hundred years i think it would make sense to support timeout cases and leave the everything else bucket for exceptional cases describe the solution you d like add and buckets to the default latency histogram buckets
| 1
|
5,602
| 8,467,441,908
|
IssuesEvent
|
2018-10-23 16:58:34
|
googleapis/google-api-java-client-services
|
https://api.github.com/repos/googleapis/google-api-java-client-services
|
reopened
|
Make release process output cleaner
|
type: process
|
Create a junit output xml file that breaks the logs apart by artifact.
|
1.0
|
Make release process output cleaner - Create a junit output xml file that breaks the logs apart by artifact.
|
process
|
make release process output cleaner create a junit output xml file that breaks the logs apart by artifact
| 1
|
139,089
| 31,235,134,132
|
IssuesEvent
|
2023-08-20 07:10:52
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
damegender 0.5.4rc2 has 2 GuardDog issues
|
guarddog code-execution
|
https://pypi.org/project/damegender
https://inspector.pypi.io/project/damegender
```json
{
"dependency": "damegender",
"version": "0.5.4rc2",
"result": {
"issues": 2,
"errors": {},
"results": {
"code-execution": [
{
"location": "damegender-0.5.4rc2/setup.py:37",
"code": " f = os.popen('find '+ directory )",
"message": "This package is executing OS commands in the setup.py file"
},
{
"location": "damegender-0.5.4rc2/setup.py:45",
"code": " f = os.popen('find '+ directory)",
"message": "This package is executing OS commands in the setup.py file"
}
]
},
"path": "/tmp/tmphotdcafl/damegender"
}
}
```
|
1.0
|
damegender 0.5.4rc2 has 2 GuardDog issues - https://pypi.org/project/damegender
https://inspector.pypi.io/project/damegender
```json
{
"dependency": "damegender",
"version": "0.5.4rc2",
"result": {
"issues": 2,
"errors": {},
"results": {
"code-execution": [
{
"location": "damegender-0.5.4rc2/setup.py:37",
"code": " f = os.popen('find '+ directory )",
"message": "This package is executing OS commands in the setup.py file"
},
{
"location": "damegender-0.5.4rc2/setup.py:45",
"code": " f = os.popen('find '+ directory)",
"message": "This package is executing OS commands in the setup.py file"
}
]
},
"path": "/tmp/tmphotdcafl/damegender"
}
}
```
|
non_process
|
damegender has guarddog issues dependency damegender version result issues errors results code execution location damegender setup py code f os popen find directory message this package is executing os commands in the setup py file location damegender setup py code f os popen find directory message this package is executing os commands in the setup py file path tmp tmphotdcafl damegender
| 0
|
14,485
| 17,602,228,561
|
IssuesEvent
|
2021-08-17 13:13:32
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
reopened
|
Changes to GO:0039580 suppression by virus of host PKR activity
|
multi-species process
|
Hi @pmasson55
GO:0039580 suppression by virus of host PKR activity is mapped to https://www.uniprot.org/keywords/KW-1223 and https://viralzone.expasy.org/554
Looks like proteins annotated to that term inhibit PKR, which is a kinase that regulates eukaryotic translation initiation factor 2.
So - a mapping to 'GO:0039611 suppression by virus of host translation initiation factor activity' + GO:0030291 protein serine/threonine kinase inhibitor activity seems more appropriate, what do you think?
Also - I think this is a type of suppression of innate immune response, also because it is mediated by interferon?
Thanks, Pascale
|
1.0
|
Changes to GO:0039580 suppression by virus of host PKR activity - Hi @pmasson55
GO:0039580 suppression by virus of host PKR activity is mapped to https://www.uniprot.org/keywords/KW-1223 and https://viralzone.expasy.org/554
Looks like proteins annotated to that term inhibit PKR, which is a kinase that regulates eukaryotic translation initiation factor 2.
So - a mapping to 'GO:0039611 suppression by virus of host translation initiation factor activity' + GO:0030291 protein serine/threonine kinase inhibitor activity seems more appropriate, what do you think?
Also - I think this is a type of suppression of innate immune response, also because it is mediated by interferon?
Thanks, Pascale
|
process
|
changes to go suppression by virus of host pkr activity hi go suppression by virus of host pkr activity is mapped to and looks like proteins annotated to that term inhibit pkr which is a kinase that regulates eukaryotic translation initiation factor so a mapping to go suppression by virus of host translation initiation factor activity go protein serine threonine kinase inhibitor activity seems more appropriate what do you think also i think this is a type of suppression of innate immune response also because it is mediated by interferon thanks pascale
| 1
|
620,027
| 19,543,586,605
|
IssuesEvent
|
2022-01-01 12:05:15
|
M3THDOG/KnowOne
|
https://api.github.com/repos/M3THDOG/KnowOne
|
opened
|
Update readme.md
|
Low priority
|
README.md was imported from an old version of this project and needs updating to reflect current information.
|
1.0
|
Update readme.md - README.md is imported from an old version of this project. Needs updating to current information.
|
non_process
|
update readme md readme md is imported from an old version of this project needs updating to current information
| 0
|
60,824
| 6,716,085,443
|
IssuesEvent
|
2017-10-14 02:27:16
|
brave/browser-laptop
|
https://api.github.com/repos/brave/browser-laptop
|
opened
|
Download tests are failing on master
|
automated-tests bug feature/download
|
<!--
Have you searched for similar issues? We have received a lot of feedback and bug reports that we have closed as duplicates.
Before submitting this issue, please visit our wiki for common ones: https://github.com/brave/browser-laptop/wiki
By using search engines along with GitHub search function, you would be able to find duplicates more efficiently.
For more, check out our community site: https://community.brave.com/
-->
### Description
<!--
[Description of the issue]
-->
Download tests are failing on master
https://travis-ci.org/brave/browser-laptop/jobs/287802546#L3920
### Steps to Reproduce
<!--
Please add a series of steps to reproduce the problem. See https://stackoverflow.com/help/mcve for in depth information on how to create a minimal, complete, and verifiable example.
-->
1. `npm run test -- --grep='Downloads'`
2.
3.
**Actual result:**
<!--
Add screenshots if needed
-->
````
61 passing (9m)
10 pending
15 failing
1) Downloads Location and file naming tests check if first download completes:
Promise was rejected with the following reason: timeout
Error
2) Downloads Location and file naming tests check if second download completes and is renamed:
Promise was rejected with the following reason: timeout
Error
3) Downloads Item and bar tests check if download bar is shown:
Promise was rejected with the following reason: timeout
Error
4) Downloads Item and bar tests check if you can pause download:
Error: An element could not be located on the page using the given search parameters.
at element(".downloadItem") - moveToObject.js:47:17
at waitForElementCount("[data-test-id="resumeButton"]", 1) - downloadItemTest.js:101:10
5) Downloads Item and bar tests check if you can resume download:
Error: Promise was rejected with the following reason: timeout
at elements("[data-test-id="resumeButton"]") - brave.js:429:21
6) Downloads Item and bar tests check if you can cancel download:
Error: Promise was rejected with the following reason: timeout
at elements("[data-test-id="pauseButton"]") - brave.js:429:21
7) Downloads Item and bar tests check if you can re-download:
Error: Promise was rejected with the following reason: timeout
at elements("[data-test-id="redownloadButton"]") - brave.js:429:21
8) Downloads Item and bar tests check if you can remove item from the list:
Error: An element could not be located on the page using the given search parameters.
at element(".downloadItem") - moveToObject.js:47:17
at waitForElementCount(".downloadsBar", 0) - downloadItemTest.js:132:10
9) Downloads Item and bar tests check if you can delete downloaded item:
Error: Promise was rejected with the following reason: timeout
at elements(".downloadsBar") - brave.js:429:21
````
**Expected result:**
**Reproduces how often:** [What percentage of the time does it reproduce?]
### Brave Version
**about:brave info:**
<!--
Please open about:brave, copy the version information, and paste it.
-->
**Reproducible on current live release:**
<!--
Is this a problem with the live build? It matters for triage reasons.
-->
### Additional Information
<!--
Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue.
-->
0.19.x - passes: https://travis-ci.org/brave/browser-laptop/jobs/287397620#L3365
0.20.x - fails: https://travis-ci.org/brave/browser-laptop/jobs/287397555#L3896
0.21.x - fails: https://travis-ci.org/brave/browser-laptop/jobs/287397454#L3886
|
1.0
|
Download tests are failing on master - <!--
Have you searched for similar issues? We have received a lot of feedback and bug reports that we have closed as duplicates.
Before submitting this issue, please visit our wiki for common ones: https://github.com/brave/browser-laptop/wiki
By using search engines along with GitHub search function, you would be able to find duplicates more efficiently.
For more, check out our community site: https://community.brave.com/
-->
### Description
<!--
[Description of the issue]
-->
Download tests are failing on master
https://travis-ci.org/brave/browser-laptop/jobs/287802546#L3920
### Steps to Reproduce
<!--
Please add a series of steps to reproduce the problem. See https://stackoverflow.com/help/mcve for in depth information on how to create a minimal, complete, and verifiable example.
-->
1. `npm run test -- --grep='Downloads'`
2.
3.
**Actual result:**
<!--
Add screenshots if needed
-->
````
61 passing (9m)
10 pending
15 failing
1) Downloads Location and file naming tests check if first download completes:
Promise was rejected with the following reason: timeout
Error
2) Downloads Location and file naming tests check if second download completes and is renamed:
Promise was rejected with the following reason: timeout
Error
3) Downloads Item and bar tests check if download bar is shown:
Promise was rejected with the following reason: timeout
Error
4) Downloads Item and bar tests check if you can pause download:
Error: An element could not be located on the page using the given search parameters.
at element(".downloadItem") - moveToObject.js:47:17
at waitForElementCount("[data-test-id="resumeButton"]", 1) - downloadItemTest.js:101:10
5) Downloads Item and bar tests check if you can resume download:
Error: Promise was rejected with the following reason: timeout
at elements("[data-test-id="resumeButton"]") - brave.js:429:21
6) Downloads Item and bar tests check if you can cancel download:
Error: Promise was rejected with the following reason: timeout
at elements("[data-test-id="pauseButton"]") - brave.js:429:21
7) Downloads Item and bar tests check if you can re-download:
Error: Promise was rejected with the following reason: timeout
at elements("[data-test-id="redownloadButton"]") - brave.js:429:21
8) Downloads Item and bar tests check if you can remove item from the list:
Error: An element could not be located on the page using the given search parameters.
at element(".downloadItem") - moveToObject.js:47:17
at waitForElementCount(".downloadsBar", 0) - downloadItemTest.js:132:10
9) Downloads Item and bar tests check if you can delete downloaded item:
Error: Promise was rejected with the following reason: timeout
at elements(".downloadsBar") - brave.js:429:21
````
**Expected result:**
**Reproduces how often:** [What percentage of the time does it reproduce?]
### Brave Version
**about:brave info:**
<!--
Please open about:brave, copy the version information, and paste it.
-->
**Reproducible on current live release:**
<!--
Is this a problem with the live build? It matters for triage reasons.
-->
### Additional Information
<!--
Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue.
-->
0.19.x - passes: https://travis-ci.org/brave/browser-laptop/jobs/287397620#L3365
0.20.x - fails: https://travis-ci.org/brave/browser-laptop/jobs/287397555#L3896
0.21.x - fails: https://travis-ci.org/brave/browser-laptop/jobs/287397454#L3886
|
non_process
|
| 0
|
7,333
| 10,469,052,825
|
IssuesEvent
|
2019-09-22 17:59:29
|
produvia/ai-platform
|
https://api.github.com/repos/produvia/ai-platform
|
closed
|
Question Answering
|
natural-language-processing task wontfix
|
# Goal(s)
- Answer a user's question by finding short text segments on the web or some other collection of documents
# Input(s)
- Sentence
# Output(s)
- Sentence
# Objective Function(s)
- TBD
|
1.0
|
|
process
|
| 1
|
19,880
| 26,322,676,132
|
IssuesEvent
|
2023-01-10 02:00:06
|
lizhihao6/get-daily-arxiv-noti
|
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
|
opened
|
New submissions for Tue, 10 Jan 23
|
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
|
## Keyword: events
### In Defense of Structural Symbolic Representation for Video Event-Relation Prediction
- **Authors:** Andrew Lu, Xudong Lin, Yulei Niu, Shih-Fu Chang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2301.03410
- **Pdf link:** https://arxiv.org/pdf/2301.03410
- **Abstract**
Understanding event relationships in videos requires a model to understand the underlying structures of events (i.e., the event type, the associated argument roles, and corresponding entities), along with factual knowledge needed for reasoning. Structural symbolic representation (SSR) based methods directly take event types and associated argument roles/entities as inputs to perform reasoning. However, the state-of-the-art video event-relation prediction system shows the necessity of using continuous feature vectors from input videos; existing methods based solely on SSR inputs fail completely, even when given oracle event types and argument roles. In this paper, we conduct an extensive empirical analysis to answer the following questions: 1) why SSR-based methods failed; 2) how to understand the evaluation setting of video event relation prediction properly; 3) how to uncover the potential of SSR-based methods. We first identify the failure of previous SSR-based video event prediction models to be caused by sub-optimal training settings. Surprisingly, we find that a simple SSR-based model with tuned hyperparameters can actually yield a 20\% absolute improvement in macro-accuracy over the state-of-the-art model. Then through qualitative and quantitative analysis, we show how evaluation that takes only video as inputs is currently unfeasible, and the reliance on oracle event information to obtain an accurate evaluation. Based on these findings, we propose to further contextualize the SSR-based model to an Event-Sequence Model and equip it with more factual knowledge through a simple yet effective way of reformulating external visual commonsense knowledge bases into an event-relation prediction pretraining dataset. The resultant new state-of-the-art model eventually establishes a 25\% Macro-accuracy performance boost.
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
### HyRSM++: Hybrid Relation Guided Temporal Set Matching for Few-shot Action Recognition
- **Authors:** Xiang Wang, Shiwei Zhang, Zhiwu Qing, Zhengrong Zuo, Changxin Gao, Rong Jin, Nong Sang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2301.03330
- **Pdf link:** https://arxiv.org/pdf/2301.03330
- **Abstract**
Recent attempts mainly focus on learning deep representations for each video individually under the episodic meta-learning regime and then performing temporal alignment to match query and support videos. However, they still suffer from two drawbacks: (i) learning individual features without considering the entire task may result in limited representation capability, and (ii) existing alignment strategies are sensitive to noises and misaligned instances. To handle the two limitations, we propose a novel Hybrid Relation guided temporal Set Matching (HyRSM++) approach for few-shot action recognition. The core idea of HyRSM++ is to integrate all videos within the task to learn discriminative representations and involve a robust matching technique. To be specific, HyRSM++ consists of two key components, a hybrid relation module and a temporal set matching metric. Given the basic representations from the feature extractor, the hybrid relation module is introduced to fully exploit associated relations within and cross videos in an episodic task and thus can learn task-specific embeddings. Subsequently, in the temporal set matching metric, we carry out the distance measure between query and support videos from a set matching perspective and design a Bi-MHM to improve the resilience to misaligned instances. In addition, we explicitly exploit the temporal coherence in videos to regularize the matching process. Furthermore, we extend the proposed HyRSM++ to deal with the more challenging semi-supervised few-shot action recognition and unsupervised few-shot action recognition tasks. Experimental results on multiple benchmarks demonstrate that our method achieves state-of-the-art performance under various few-shot settings. The source code is available at https://github.com/alibaba-mmai-research/HyRSMPlusPlus.
## Keyword: ISP
There is no result
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### A Specific Task-oriented Semantic Image Communication System for substation patrol inspection
- **Authors:** Senran Fan, Haotai Liang, Chen Dong, Xiaodong Xu, Geng Liu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2301.03331
- **Pdf link:** https://arxiv.org/pdf/2301.03331
- **Abstract**
Intelligent inspection robots are widely used in substation patrol inspection, which can help check potential safety hazards by patrolling the substation and sending back scene images. However, when patrolling some marginal areas with weak signal, the scene images cannot be successfully transmitted to be used for hidden danger elimination, which greatly reduces the quality of robots' daily work. To solve this problem, a Specific Task-oriented Semantic Communication System for Image, STSCI, is designed, which involves semantic feature extraction, transmission, restoration and enhancement to get clearer images sent by intelligent robots under weak signals. Inspired by the fact that only some specific details of the image are needed in such a substation patrol inspection task, we proposed a new paradigm of semantic enhancement in such specific tasks to ensure the clarity of key semantic information when facing a lower bit rate or a low signal-to-noise ratio situation. Across the reality-based simulation, experiments show our STSCI can generally surpass traditional image-compression-based and channel-coding-based or other semantic communication systems in the substation patrol inspection task with a lower bit rate even under a low signal-to-noise ratio situation.
## Keyword: RAW
### HyRSM++: Hybrid Relation Guided Temporal Set Matching for Few-shot Action Recognition
- **Authors:** Xiang Wang, Shiwei Zhang, Zhiwu Qing, Zhengrong Zuo, Changxin Gao, Rong Jin, Nong Sang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2301.03330
- **Pdf link:** https://arxiv.org/pdf/2301.03330
- **Abstract**
Recent attempts mainly focus on learning deep representations for each video individually under the episodic meta-learning regime and then performing temporal alignment to match query and support videos. However, they still suffer from two drawbacks: (i) learning individual features without considering the entire task may result in limited representation capability, and (ii) existing alignment strategies are sensitive to noises and misaligned instances. To handle the two limitations, we propose a novel Hybrid Relation guided temporal Set Matching (HyRSM++) approach for few-shot action recognition. The core idea of HyRSM++ is to integrate all videos within the task to learn discriminative representations and involve a robust matching technique. To be specific, HyRSM++ consists of two key components, a hybrid relation module and a temporal set matching metric. Given the basic representations from the feature extractor, the hybrid relation module is introduced to fully exploit associated relations within and cross videos in an episodic task and thus can learn task-specific embeddings. Subsequently, in the temporal set matching metric, we carry out the distance measure between query and support videos from a set matching perspective and design a Bi-MHM to improve the resilience to misaligned instances. In addition, we explicitly exploit the temporal coherence in videos to regularize the matching process. Furthermore, we extend the proposed HyRSM++ to deal with the more challenging semi-supervised few-shot action recognition and unsupervised few-shot action recognition tasks. Experimental results on multiple benchmarks demonstrate that our method achieves state-of-the-art performance under various few-shot settings. The source code is available at https://github.com/alibaba-mmai-research/HyRSMPlusPlus.
## Keyword: raw image
There is no result
|
2.0
|
|
process
|
| 1
|
9,128
| 12,199,041,975
|
IssuesEvent
|
2020-04-30 00:27:33
|
kubeflow/testing
|
https://api.github.com/repos/kubeflow/testing
|
closed
|
Cleanup the old release infrastructure
|
area/engprod kind/process lifecycle/stale priority/p1
|
Follow on to #450
We should clean up the old release infrastructure.
e.g. there is an old cron job for updating the Jupyter web app that we can delete
|
1.0
|
|
process
|
| 1
|
22,567
| 31,789,803,662
|
IssuesEvent
|
2023-09-13 01:44:41
|
ReMobidyc/ReMobidyc
|
https://api.github.com/repos/ReMobidyc/ReMobidyc
|
opened
|
check blank lines in CSV files
|
enhancement processor
|
Don't fix it automatically; just warn and stop processing.
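A minimal sketch of that "warn and stop" behaviour (hypothetical — `check_blank_lines` and its error format are illustrative, not the project's actual API): scan the CSV text and, on the first blank line, return an error naming it instead of silently skipping or repairing it.

```rust
// Reject a CSV that contains blank lines instead of fixing it:
// report the first offending line number and stop processing.
fn check_blank_lines(csv: &str) -> Result<(), String> {
    for (i, line) in csv.lines().enumerate() {
        if line.trim().is_empty() {
            // 1-based line number is friendlier in a warning message.
            return Err(format!("blank line at line {}", i + 1));
        }
    }
    Ok(())
}

fn main() {
    assert!(check_blank_lines("a,b\n1,2\n").is_ok());
    let err = check_blank_lines("a,b\n\n1,2\n").unwrap_err();
    assert_eq!(err, "blank line at line 2");
    println!("ok");
}
```

The caller can surface the `Err` as a warning to the user and abort the import, which matches the suggested behaviour.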
|
1.0
|
|
process
|
| 1
|
18,266
| 24,346,999,955
|
IssuesEvent
|
2022-10-02 12:58:31
|
rathena/FluxCP
|
https://api.github.com/repos/rathena/FluxCP
|
closed
|
Just a suggestion
|
Enhancement Request Component: Payment Processor
|
Add an option to toggle whether PayPal fees are included or not.
|
1.0
|
|
process
|
| 1
|
22,461
| 31,237,823,718
|
IssuesEvent
|
2023-08-20 13:45:26
|
rust-lang/rust
|
https://api.github.com/repos/rust-lang/rust
|
reopened
|
Possible data race & use-after-free in std::env::var for unix implementation
|
I-unsound C-bug T-libs O-unix A-process
|
<!--
Thank you for filing a bug report! 🐛 Please provide a short summary of the bug,
along with any information you feel relevant to replicating the bug.
-->
Investigating library sources for [std::os::env::var/set_var](https://github.com/rust-lang/rust/blob/master/library/std/src/sys/unix/os.rs#L594), I have found that the copying of the env variable value is performed not under the lock:
```rust
pub fn getenv(k: &OsStr) -> Option<OsString> {
// environment variables with a nul byte can't be set, so their value is
// always None as well
let s = run_with_cstr(k.as_bytes(), |k| {
let _guard = env_read_lock();
Ok(unsafe { libc::getenv(k.as_ptr()) } as *const libc::c_char)
// lock is dropped here
})
.ok()?;
if s.is_null() {
None
} else {
// but access is here
Some(OsStringExt::from_vec(unsafe { CStr::from_ptr(s) }.to_bytes().to_vec()))
}
}
```
So there is a potential data race with setenv.
I tried this code with miri
```rust
fn main() {
let t1 = std::thread::spawn(|| {
let mut cnt = 0;
for _ in 0..100000 {
let var = std::env::var_os("HELLO");
cnt += var.map(|v| v.len()).unwrap_or_default();
}
cnt
});
let t2 = std::thread::spawn(|| {
for i in 0..100000 {
let value = format!("helloooooooooooooo{i}");
std::env::set_var("HELLO", &value);
}
});
let cnt = t1.join().expect("ok");
println!("{cnt}");
t2.join();
}
```
And got
```
error: Undefined Behavior: Data race detected between (1) Read on thread `<unnamed>` and (2) Deallocate on thread `<unnamed>` at alloc3346+0x6. (2) just happened here
--> /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/unix/os.rs:613:26
|
613 | cvt(unsafe { libc::setenv(k.as_ptr(), v.as_ptr(), 1) }).map(drop)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Data race detected between (1) Read on thread `<unnamed>` and (2) Deallocate on thread `<unnamed>` at alloc3346+0x6. (2) just happened here
|
help: and (1) occurred earlier here
--> src/main.rs:30:23
|
30 | let var = std::env::var_os("HELLO");
| ^^^^^^^^^^^^^^^^^^^^^^^^^
= help: this indicates a bug in the program: it performed an invalid operation, and caused Undefined Behavior
= help: see https://doc.rust-lang.org/nightly/reference/behavior-considered-undefined.html for further information
= note: BACKTRACE (of the first span):
= note: inside closure at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/unix/os.rs:613:26: 613:65
= note: inside `std::sys::common::small_c_string::run_with_cstr::<(), [closure@std::sys::unix::os::setenv::{closure#0}::{closure#0}]>` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/common/small_c_string.rs:43:18: 43:22
= note: inside closure at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/unix/os.rs:611:9: 614:11
= note: inside `std::sys::common::small_c_string::run_with_cstr::<(), [closure@std::sys::unix::os::setenv::{closure#0}]>` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/common/small_c_string.rs:43:18: 43:22
= note: inside `std::sys::unix::os::setenv` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/unix/os.rs:610:5: 615:7
= note: inside `std::env::_set_var` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/env.rs:347:5: 347:31
= note: inside `std::env::set_var::<&str, &std::string::String>` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/env.rs:343:5: 343:43
note: inside closure
--> src/main.rs:38:13
```
Have tried to move `Some(OsStringExt::from_vec(unsafe { CStr::from_ptr(s) }.to_bytes().to_vec()))` under the lock, but still got error with miri.
I was not able to trigger the real crash with it yet.
### Meta
<!--
If you're using the stable version of the compiler, you should also check if the
bug also exists in the beta or nightly versions.
-->
`rustc --version --verbose`:
```
rustc 1.71.1 (eb26296b5 2023-08-03)
binary: rustc
commit-hash: eb26296b556cef10fb713a38f3d16b9886080f26
commit-date: 2023-08-03
host: x86_64-unknown-linux-gnu
release: 1.71.1
LLVM version: 16.0.5
```
Bug was introduced by this commit: https://github.com/rust-lang/rust/commit/86974b83af48ee9e196da42730ec96ad646009c4 (PR: https://github.com/rust-lang/rust/pull/93668)
Before that changes, read was under the lock.
<!-- TRIAGEBOT_START -->
<!-- TRIAGEBOT_ASSIGN_START -->
<!-- TRIAGEBOT_ASSIGN_DATA_START$${"user":"ShE3py"}$$TRIAGEBOT_ASSIGN_DATA_END -->
<!-- TRIAGEBOT_ASSIGN_END -->
<!-- TRIAGEBOT_END -->
|
1.0
|
Possible data race & use-after-free in std::env::var for unix implementation - <!--
Thank you for filing a bug report! 🐛 Please provide a short summary of the bug,
along with any information you feel relevant to replicating the bug.
-->
Investigating library sources for [std::os::env::var/set_var](https://github.com/rust-lang/rust/blob/master/library/std/src/sys/unix/os.rs#L594), I have found that the copying of the env variable value is performed not under the lock:
```rust
pub fn getenv(k: &OsStr) -> Option<OsString> {
// environment variables with a nul byte can't be set, so their value is
// always None as well
let s = run_with_cstr(k.as_bytes(), |k| {
let _guard = env_read_lock();
Ok(unsafe { libc::getenv(k.as_ptr()) } as *const libc::c_char)
// lock is dropped here
})
.ok()?;
if s.is_null() {
None
} else {
// but access is here
Some(OsStringExt::from_vec(unsafe { CStr::from_ptr(s) }.to_bytes().to_vec()))
}
}
```
So there is a potential data race with setenv.
I tried this code with miri
```rust
fn main() {
let t1 = std::thread::spawn(|| {
let mut cnt = 0;
for _ in 0..100000 {
let var = std::env::var_os("HELLO");
cnt += var.map(|v| v.len()).unwrap_or_default();
}
cnt
});
let t2 = std::thread::spawn(|| {
for i in 0..100000 {
let value = format!("helloooooooooooooo{i}");
std::env::set_var("HELLO", &value);
}
});
let cnt = t1.join().expect("ok");
println!("{cnt}");
t2.join();
}
```
And got
```
error: Undefined Behavior: Data race detected between (1) Read on thread `<unnamed>` and (2) Deallocate on thread `<unnamed>` at alloc3346+0x6. (2) just happened here
--> /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/unix/os.rs:613:26
|
613 | cvt(unsafe { libc::setenv(k.as_ptr(), v.as_ptr(), 1) }).map(drop)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Data race detected between (1) Read on thread `<unnamed>` and (2) Deallocate on thread `<unnamed>` at alloc3346+0x6. (2) just happened here
|
help: and (1) occurred earlier here
--> src/main.rs:30:23
|
30 | let var = std::env::var_os("HELLO");
| ^^^^^^^^^^^^^^^^^^^^^^^^^
= help: this indicates a bug in the program: it performed an invalid operation, and caused Undefined Behavior
= help: see https://doc.rust-lang.org/nightly/reference/behavior-considered-undefined.html for further information
= note: BACKTRACE (of the first span):
= note: inside closure at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/unix/os.rs:613:26: 613:65
= note: inside `std::sys::common::small_c_string::run_with_cstr::<(), [closure@std::sys::unix::os::setenv::{closure#0}::{closure#0}]>` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/common/small_c_string.rs:43:18: 43:22
= note: inside closure at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/unix/os.rs:611:9: 614:11
= note: inside `std::sys::common::small_c_string::run_with_cstr::<(), [closure@std::sys::unix::os::setenv::{closure#0}]>` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/common/small_c_string.rs:43:18: 43:22
= note: inside `std::sys::unix::os::setenv` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/sys/unix/os.rs:610:5: 615:7
= note: inside `std::env::_set_var` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/env.rs:347:5: 347:31
= note: inside `std::env::set_var::<&str, &std::string::String>` at /home/dmis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/env.rs:343:5: 343:43
note: inside closure
--> src/main.rs:38:13
```
Have tried to move `Some(OsStringExt::from_vec(unsafe { CStr::from_ptr(s) }.to_bytes().to_vec()))` under the lock, but still got error with miri.
I was not able to trigger the real crash with it yet.
### Meta
<!--
If you're using the stable version of the compiler, you should also check if the
bug also exists in the beta or nightly versions.
-->
`rustc --version --verbose`:
```
rustc 1.71.1 (eb26296b5 2023-08-03)
binary: rustc
commit-hash: eb26296b556cef10fb713a38f3d16b9886080f26
commit-date: 2023-08-03
host: x86_64-unknown-linux-gnu
release: 1.71.1
LLVM version: 16.0.5
```
Bug was introduced by this commit: https://github.com/rust-lang/rust/commit/86974b83af48ee9e196da42730ec96ad646009c4 (PR: https://github.com/rust-lang/rust/pull/93668)
Before that changes, read was under the lock.
<!-- TRIAGEBOT_START -->
<!-- TRIAGEBOT_ASSIGN_START -->
<!-- TRIAGEBOT_ASSIGN_DATA_START$${"user":"ShE3py"}$$TRIAGEBOT_ASSIGN_DATA_END -->
<!-- TRIAGEBOT_ASSIGN_END -->
<!-- TRIAGEBOT_END -->
|
process
|
possible data race use after free in std env var for unix implementation thank you for filing a bug report 🐛 please provide a short summary of the bug along with any information you feel relevant to replicating the bug investigating library sources for i have found that the copying of the env variable value is performed not under the lock rust pub fn getenv k osstr option environment variables with a nul byte can t be set so their value is always none as well let s run with cstr k as bytes k let guard env read lock ok unsafe libc getenv k as ptr as const libc c char lock is dropped here ok if s is null none else but access is here some osstringext from vec unsafe cstr from ptr s to bytes to vec so there is a potential data race with setenv i tried this code with miri rust fn main let std thread spawn let mut cnt for in let var std env var os hello cnt var map v v len unwrap or default cnt let std thread spawn for i in let value format helloooooooooooooo i std env set var hello value let cnt join expect ok println cnt join and got error undefined behavior data race detected between read on thread and deallocate on thread at just happened here home dmis rustup toolchains nightly unknown linux gnu lib rustlib src rust library std src sys unix os rs cvt unsafe libc setenv k as ptr v as ptr map drop data race detected between read on thread and deallocate on thread at just happened here help and occurred earlier here src main rs let var std env var os hello help this indicates a bug in the program it performed an invalid operation and caused undefined behavior help see for further information note backtrace of the first span note inside closure at home dmis rustup toolchains nightly unknown linux gnu lib rustlib src rust library std src sys unix os rs note inside std sys common small c string run with cstr at home dmis rustup toolchains nightly unknown linux gnu lib rustlib src rust library std src sys common small c string rs note inside closure at home dmis rustup 
toolchains nightly unknown linux gnu lib rustlib src rust library std src sys unix os rs note inside std sys common small c string run with cstr at home dmis rustup toolchains nightly unknown linux gnu lib rustlib src rust library std src sys common small c string rs note inside std sys unix os setenv at home dmis rustup toolchains nightly unknown linux gnu lib rustlib src rust library std src sys unix os rs note inside std env set var at home dmis rustup toolchains nightly unknown linux gnu lib rustlib src rust library std src env rs note inside std env set var at home dmis rustup toolchains nightly unknown linux gnu lib rustlib src rust library std src env rs note inside closure src main rs have tried to move some osstringext from vec unsafe cstr from ptr s to bytes to vec under the lock but still got error with miri i was not able to trigger the real crash with it yet meta if you re using the stable version of the compiler you should also check if the bug also exists in the beta or nightly versions rustc version verbose rustc binary rustc commit hash commit date host unknown linux gnu release llvm version bug was introduced by this commit pr before that changes read was under the lock
| 1
|
15,994
| 20,188,204,609
|
IssuesEvent
|
2022-02-11 01:17:49
|
savitamittalmsft/WAS-SEC-TEST
|
https://api.github.com/repos/savitamittalmsft/WAS-SEC-TEST
|
opened
|
Use Azure Firewall or a 3rd party next-generation firewall to protect against data exfiltration
|
WARP-Import WAF FEB 2021 Security Performance and Scalability Capacity Management Processes Networking & Connectivity Connectivity
|
<a href="https://docs.microsoft.com/azure/architecture/framework/security/design-network-flow#data-exfiltration">Use Azure Firewall or a 3rd party next-generation firewall to protect against data exfiltration</a>
<p><b>Why Consider This?</b></p>
NVA solutions and Azure Firewall (for supported protocols) can be leveraged as a reverse proxy to restrict access to only authorized PaaS services for services where Private Link is not yet supported (Azure Firewall).
<p><b>Context</b></p>
<p><b>Suggested Actions</b></p>
<p><span>Configure Azure Firewall or a 3rd party next generation firewall to protect against data exfiltration concerns.</span></p>
<p><b>Learn More</b></p>
<p><a href="https://docs.microsoft.com/en-us/azure/firewall/" target="_blank"><span>https://docs.microsoft.com/en-us/azure/firewall/</span></a><span /></p><p><a href="https://azuremarketplace.microsoft.com/en-us/marketplace/apps/category/networking" target="_blank"><span>https://azuremarketplace.microsoft.com/en-us/marketplace/apps/category/networking</span></a><span /></p>
|
1.0
|
Use Azure Firewall or a 3rd party next-generation firewall to protect against data exfiltration - <a href="https://docs.microsoft.com/azure/architecture/framework/security/design-network-flow#data-exfiltration">Use Azure Firewall or a 3rd party next-generation firewall to protect against data exfiltration</a>
<p><b>Why Consider This?</b></p>
NVA solutions and Azure Firewall (for supported protocols) can be leveraged as a reverse proxy to restrict access to only authorized PaaS services for services where Private Link is not yet supported (Azure Firewall).
<p><b>Context</b></p>
<p><b>Suggested Actions</b></p>
<p><span>Configure Azure Firewall or a 3rd party next generation firewall to protect against data exfiltration concerns.</span></p>
<p><b>Learn More</b></p>
<p><a href="https://docs.microsoft.com/en-us/azure/firewall/" target="_blank"><span>https://docs.microsoft.com/en-us/azure/firewall/</span></a><span /></p><p><a href="https://azuremarketplace.microsoft.com/en-us/marketplace/apps/category/networking" target="_blank"><span>https://azuremarketplace.microsoft.com/en-us/marketplace/apps/category/networking</span></a><span /></p>
|
process
|
use azure firewall or a party next generation firewall to protect against data exfiltration why consider this nva solutions and azure firewall for supported protocols can be leveraged as a reverse proxy to restrict access to only authorized paas services for services where private link is not yet supported azure firewall context suggested actions configure azure firewall or a party next generation firewall to protect against data exfiltration concerns learn more
| 1
|
496
| 2,941,229,315
|
IssuesEvent
|
2015-07-02 06:02:18
|
e-government-ua/i
|
https://api.github.com/repos/e-government-ua/i
|
closed
|
Акутализация названий прикрепленных файлов в дашборде
|
In process of testing test version
|
Описание к файлу "вынести за пределы ссылки". Сейчас что при клике на название файла, что при ссылке на описание аттачмента - начинается довнлоад атачмента-файла. А должно только по клику на название файла.
ДЕТАЛЬНЕЙ:
Комментарий к прикрепляемому файлу (указанный после разделителя " ; ") добавляется на форме заявки в дашборде к названию файла, что далет в итоге затруднительным анализ приложений.
Например :
скан паспорта в заявк отображается как аттач с комментарием – Копія довідки з міста праці, додається тільки у випадку пільгової квартирної черги.
Необходимо комментарий отображать как комментарий к описанию файла (как это реализовано на форме подачи заявки на портале igov)
|
1.0
|
Акутализация названий прикрепленных файлов в дашборде - Описание к файлу "вынести за пределы ссылки". Сейчас что при клике на название файла, что при ссылке на описание аттачмента - начинается довнлоад атачмента-файла. А должно только по клику на название файла.
ДЕТАЛЬНЕЙ:
Комментарий к прикрепляемому файлу (указанный после разделителя " ; ") добавляется на форме заявки в дашборде к названию файла, что далет в итоге затруднительным анализ приложений.
Например :
скан паспорта в заявк отображается как аттач с комментарием – Копія довідки з міста праці, додається тільки у випадку пільгової квартирної черги.
Необходимо комментарий отображать как комментарий к описанию файла (как это реализовано на форме подачи заявки на портале igov)
|
process
|
акутализация названий прикрепленных файлов в дашборде описание к файлу вынести за пределы ссылки сейчас что при клике на название файла что при ссылке на описание аттачмента начинается довнлоад атачмента файла а должно только по клику на название файла детальней комментарий к прикрепляемому файлу указанный после разделителя добавляется на форме заявки в дашборде к названию файла что далет в итоге затруднительным анализ приложений например скан паспорта в заявк отображается как аттач с комментарием – копія довідки з міста праці додається тільки у випадку пільгової квартирної черги необходимо комментарий отображать как комментарий к описанию файла как это реализовано на форме подачи заявки на портале igov
| 1
|
214,728
| 7,276,278,984
|
IssuesEvent
|
2018-02-21 15:57:24
|
memcachier/docs
|
https://api.github.com/repos/memcachier/docs
|
closed
|
PHP-7
|
bug high-priority
|
We need to update our docs to either mirror what Heroku recommends (different configurations of the php session for v5 and v7 or just refer to their docs. Our example app currently doesn't work (because it basically results in php7 in practice). I need to figure out how to actually get php's compose working locally to update it properly.
|
1.0
|
PHP-7 - We need to update our docs to either mirror what Heroku recommends (different configurations of the php session for v5 and v7 or just refer to their docs. Our example app currently doesn't work (because it basically results in php7 in practice). I need to figure out how to actually get php's compose working locally to update it properly.
|
non_process
|
php we need to update our docs to either mirror what heroku recommends different configurations of the php session for and or just refer to their docs our example app currently doesn t work because it basically results in in practice i need to figure out how to actually get php s compose working locally to update it properly
| 0
|
679,808
| 23,246,015,261
|
IssuesEvent
|
2022-08-03 20:14:56
|
theiagen/public_health_bacterial_genomics
|
https://api.github.com/repos/theiagen/public_health_bacterial_genomics
|
closed
|
add BUSCO task?
|
priority:3 (medium)
|
https://gitlab.com/ezlab/busco
https://busco.ezlab.org/busco_userguide.html
already have a docker image, but we can add a StaPH-B one
Requested by one CA county lab and I think this would be a good addition
|
1.0
|
add BUSCO task? - https://gitlab.com/ezlab/busco
https://busco.ezlab.org/busco_userguide.html
already have a docker image, but we can add a StaPH-B one
Requested by one CA county lab and I think this would be a good addition
|
non_process
|
add busco task already have a docker image but we can add a staph b one requested by one ca county lab and i think this would be a good addition
| 0
|
91,099
| 8,289,418,427
|
IssuesEvent
|
2018-09-19 14:38:41
|
ampproject/amphtml
|
https://api.github.com/repos/ampproject/amphtml
|
closed
|
Fix flaky visual diff test for `amp-iframe - Amp By Example`
|
P2: Soon Related to: Flaky Tests Type: Bug
|
The ["amp-iframe - Amp By Example"](https://github.com/ampproject/amphtml/blob/master/examples/visual-tests/amp-by-example/components/amp-iframe/index.html) visual diff test is currently marked as flaky: https://github.com/ampproject/amphtml/blob/b19a67c7215fb601237a20a04ec41c090a5eb559/test/visual-diff/visual-tests#L720-L726
https://percy.io/ampproject/amphtml/builds/797668/view/44814236/411?mode=diff&browser=firefox&snapshot=44814236
Please reassign this if you're not the right person to fix this
|
1.0
|
Fix flaky visual diff test for `amp-iframe - Amp By Example` - The ["amp-iframe - Amp By Example"](https://github.com/ampproject/amphtml/blob/master/examples/visual-tests/amp-by-example/components/amp-iframe/index.html) visual diff test is currently marked as flaky: https://github.com/ampproject/amphtml/blob/b19a67c7215fb601237a20a04ec41c090a5eb559/test/visual-diff/visual-tests#L720-L726
https://percy.io/ampproject/amphtml/builds/797668/view/44814236/411?mode=diff&browser=firefox&snapshot=44814236
Please reassign this if you're not the right person to fix this
|
non_process
|
fix flaky visual diff test for amp iframe amp by example the visual diff test is currently marked as flaky please reassign this if you re not the right person to fix this
| 0
|
5,947
| 8,773,291,032
|
IssuesEvent
|
2018-12-18 16:30:06
|
googleapis/google-cloud-python
|
https://api.github.com/repos/googleapis/google-cloud-python
|
closed
|
[Firestore] implement `__eq__` for public api types.
|
api: firestore triaged for GA type: process
|
None of the public API types implement isEquals (__eq__)
|
1.0
|
[Firestore] implement `__eq__` for public api types. - None of the public API types implement isEquals (__eq__)
|
process
|
implement eq for public api types none of the public api types implement isequals eq
| 1
|
21,327
| 28,967,608,470
|
IssuesEvent
|
2023-05-10 08:59:10
|
NHMDenmark/Mass-Digitizer
|
https://api.github.com/repos/NHMDenmark/Mass-Digitizer
|
closed
|
Reformat date fields to use dd/mm/yyyy format
|
Specify post-processing 1
|
The current yyyy-mm-dd format cannot be imported with Workbench
|
1.0
|
Reformat date fields to use dd/mm/yyyy format - The current yyyy-mm-dd format cannot be imported with Workbench
|
process
|
reformat date fields to use dd mm yyyy format the current yyyy mm dd format cannot be imported with workbench
| 1
|
281,169
| 21,315,381,788
|
IssuesEvent
|
2022-04-16 07:15:00
|
Rdac0/pe
|
https://api.github.com/repos/Rdac0/pe
|
opened
|
Inconsistent spelling of JOB_TITLE in UG
|
type.DocumentationBug severity.VeryLow
|
one is JOB_TITLE, whilst the other is JOBTITLE

<!--session: 1650087955423-9a7d8e56-8cc2-4efe-bce2-716f9a6f7bb5-->
<!--Version: Web v3.4.2-->
|
1.0
|
Inconsistent spelling of JOB_TITLE in UG - one is JOB_TITLE, whilst the other is JOBTITLE

<!--session: 1650087955423-9a7d8e56-8cc2-4efe-bce2-716f9a6f7bb5-->
<!--Version: Web v3.4.2-->
|
non_process
|
inconsistent spelling of job title in ug one is job title whilst the other is jobtitle
| 0
|
16,786
| 6,284,507,390
|
IssuesEvent
|
2017-07-19 07:58:15
|
linkedin/gobblin
|
https://api.github.com/repos/linkedin/gobblin
|
closed
|
jar lib not included while running map-reduce
|
Bug:Generic Framework:Build Moved to Apache Jira
|
Hi,
Help message from ./gobblin-dist/bin/gobblin-mapreduce.sh says
> --jars <comma-separated list of job jars> Job jar(s): **if not set, "/home/michalw/gobblin-dist/lib" is examined**
So I didn't set it and get in mappers:
```
2016-12-16 20:36:07,106 INFO [ForkExecutor-0] gobblin.runtime.Fork-0: Wrapping writer gobblin.writer.PartitionedDataWriter@12910662
2016-12-16 20:36:07,108 ERROR [ForkExecutor-0] gobblin.runtime.Fork-0: Fork 0 of task task_GobblinKafkaQuickStart_1481949335882_0 failed to process data records
**java.lang.NoClassDefFoundError: com/github/rholder/retry/RetryListener**
at gobblin.writer.DataWriterWrapperBuilder.build(DataWriterWrapperBuilder.java:49)
at gobblin.runtime.Fork.buildWriter(Fork.java:377)
at gobblin.runtime.Fork.buildWriterIfNotPresent(Fork.java:382)
at gobblin.runtime.Fork.processRecords(Fork.java:399)
at gobblin.runtime.Fork.run(Fork.java:170)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: **java.lang.ClassNotFoundException: com.github.rholder.retry.RetryListener**
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
```
However if I do specify
--jars /home/michalw/gobblin-dist/lib/guava-retrying-2.0.0.jar
then it works. So to be clear: guava-retrying-2.0.0.jar is in lib, but is not included by default which is misleading regarding help text. If that is an intentional behaviour, I think it's worth to change help msg and add some notes in tutorials in docs. It would speed up a bit setting up.
Probably connected: #1321
The same issues described on groups.google forum couple of times.
I use 0.8.0, kafka->hdfs ingestion, MR
|
1.0
|
jar lib not included while running map-reduce - Hi,
Help message from ./gobblin-dist/bin/gobblin-mapreduce.sh says
> --jars <comma-separated list of job jars> Job jar(s): **if not set, "/home/michalw/gobblin-dist/lib" is examined**
So I didn't set it and get in mappers:
```
2016-12-16 20:36:07,106 INFO [ForkExecutor-0] gobblin.runtime.Fork-0: Wrapping writer gobblin.writer.PartitionedDataWriter@12910662
2016-12-16 20:36:07,108 ERROR [ForkExecutor-0] gobblin.runtime.Fork-0: Fork 0 of task task_GobblinKafkaQuickStart_1481949335882_0 failed to process data records
**java.lang.NoClassDefFoundError: com/github/rholder/retry/RetryListener**
at gobblin.writer.DataWriterWrapperBuilder.build(DataWriterWrapperBuilder.java:49)
at gobblin.runtime.Fork.buildWriter(Fork.java:377)
at gobblin.runtime.Fork.buildWriterIfNotPresent(Fork.java:382)
at gobblin.runtime.Fork.processRecords(Fork.java:399)
at gobblin.runtime.Fork.run(Fork.java:170)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: **java.lang.ClassNotFoundException: com.github.rholder.retry.RetryListener**
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
```
However if I do specify
--jars /home/michalw/gobblin-dist/lib/guava-retrying-2.0.0.jar
then it works. So to be clear: guava-retrying-2.0.0.jar is in lib, but is not included by default which is misleading regarding help text. If that is an intentional behaviour, I think it's worth to change help msg and add some notes in tutorials in docs. It would speed up a bit setting up.
Probably connected: #1321
The same issues described on groups.google forum couple of times.
I use 0.8.0, kafka->hdfs ingestion, MR
|
non_process
|
jar lib not included while running map reduce hi help message from gobblin dist bin gobblin mapreduce sh says jars job jar s if not set home michalw gobblin dist lib is examined so i didn t set it and get in mappers info gobblin runtime fork wrapping writer gobblin writer partitioneddatawriter error gobblin runtime fork fork of task task gobblinkafkaquickstart failed to process data records java lang noclassdeffounderror com github rholder retry retrylistener at gobblin writer datawriterwrapperbuilder build datawriterwrapperbuilder java at gobblin runtime fork buildwriter fork java at gobblin runtime fork buildwriterifnotpresent fork java at gobblin runtime fork processrecords fork java at gobblin runtime fork run fork java at java util concurrent executors runnableadapter call executors java at java util concurrent futuretask run futuretask java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java caused by java lang classnotfoundexception com github rholder retry retrylistener at java net urlclassloader findclass urlclassloader java at java lang classloader loadclass classloader java at sun misc launcher appclassloader loadclass launcher java at java lang classloader loadclass classloader java more however if i do specify jars home michalw gobblin dist lib guava retrying jar then it works so to be clear guava retrying jar is in lib but is not included by default which is misleading regarding help text if that is an intentional behaviour i think it s worth to change help msg and add some notes in tutorials in docs it would speed up a bit setting up probably connected the same issues described on groups google forum couple of times i use kafka hdfs ingestion mr
| 0
|
1,038
| 3,508,746,751
|
IssuesEvent
|
2016-01-08 19:19:41
|
triplea-game/triplea
|
https://api.github.com/repos/triplea-game/triplea
|
closed
|
How can we move faster?
|
Process Improvement
|
As a retrospective, and now that we've merged a good number of PRs, what are some of the lessons learned and things on our wish list so we can move faster? What are everyone's thoughts on how we can more effectively get code merged with less delay, and with the same or better quality?
|
1.0
|
How can we move faster? - As a retrospective, and now that we've merged a good number of PRs, what are some of the lessons learned and things on our wish list so we can move faster? What are everyone's thoughts on how we can more effectively get code merged with less delay, and with the same or better quality?
|
process
|
how can we move faster as a retrospective and now that we ve merged a good number of prs what are some of the lessons learned and things on our wish list so we can move faster what are everyone s thoughts on how we can more effectively get code merged with less delay and with the same or better quality
| 1
|
89,191
| 8,196,663,475
|
IssuesEvent
|
2018-08-31 10:36:34
|
humera987/HumTestData
|
https://api.github.com/repos/humera987/HumTestData
|
closed
|
humerafxtesting : api_v1_orgs_by-user_get_query_param_sql_injection_postgres_pageSize
|
humerafxtesting
|
Project : humerafxtesting
Job : UAT
Env : UAT
Region : FXLabs/US_WEST_1
Result : fail
Status Code : 200
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 30 Aug 2018 10:36:36 GMT]}
Endpoint : http://13.56.210.25/api/v1/orgs/by-user?pageSize=
Request :
Response :
{
"requestId" : "None",
"requestTime" : "2018-08-30T10:36:37.262+0000",
"errors" : false,
"messages" : [ ],
"data" : [ {
"id" : "8a8080886583d7bd0165840144b00b0e",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-29T04:46:40.304+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-29T04:46:40.304+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080886583d7bd0165840144850b0c",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-29T04:46:40.261+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-29T04:46:40.261+0000",
"version" : null,
"inactive" : false,
"name" : "FXL"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a808057657fff9c01658011a8950d9b",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T10:26:05.589+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T10:26:05.589+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a808057657fff9c01658011a8940d9a",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T10:26:05.588+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T10:26:05.588+0000",
"version" : null,
"inactive" : false,
"name" : "Facebook1"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a808057657fff9c0165800e5eea0c15",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T10:22:30.122+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T10:22:30.122+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a808057657fff9c0165800e5ee90c14",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T10:22:30.121+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T10:22:30.121+0000",
"version" : null,
"inactive" : false,
"name" : "Org 12"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ee657f422301657f6dd1fd000b",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T07:27:08.285+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T07:27:08.285+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ee657f422301657f6dd1fc000a",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T07:27:08.283+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T07:27:08.283+0000",
"version" : null,
"inactive" : false,
"name" : "Ola Uber"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ee657f422301657f6510590001",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T07:17:34.425+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T07:17:34.425+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ee657f422301657f65103f0000",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T07:17:34.396+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T07:17:34.396+0000",
"version" : null,
"inactive" : false,
"name" : "Cloud Space"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080e7657b6af401657b983d02002f",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T13:34:59.330+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T13:34:59.330+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080e7657b6af401657b983d00002e",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T13:34:59.328+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T13:34:59.328+0000",
"version" : null,
"inactive" : false,
"name" : "Watsapp"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080e7657b6af401657b77a37d000b",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:59:22.877+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:59:22.877+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080e7657b6af401657b77a37b000a",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:59:22.875+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:59:22.875+0000",
"version" : null,
"inactive" : false,
"name" : "Uber"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080e7657b6af401657b750cb10002",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:56:33.200+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:56:33.200+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080e7657b6af401657b750ca80001",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:56:33.192+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:56:33.192+0000",
"version" : null,
"inactive" : false,
"name" : "Ola"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ec657b1b2b01657b5d53b90af9",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:30:38.521+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:30:38.521+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ec657b1b2b01657b5d53b80af8",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:30:38.520+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:30:38.520+0000",
"version" : null,
"inactive" : false,
"name" : "Facebook"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ec657b1b2b01657b518e835900",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:17:47.139+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:17:47.139+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ec657b1b2b01657b518e8258ff",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:17:47.138+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:17:47.138+0000",
"version" : null,
"inactive" : false,
"name" : "Microsoft"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ec657b1b2b01657b49da773826",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:09:22.295+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:09:22.295+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ec657b1b2b01657b49da763825",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:09:22.294+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:09:22.294+0000",
"version" : null,
"inactive" : false,
"name" : "Google"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ec657b1b2b01657b1fab6f0675",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T11:23:17.743+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T11:23:17.743+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ec657b1b2b01657b1fab6f0674",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T11:23:17.743+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T11:23:17.743+0000",
"version" : null,
"inactive" : false,
"name" : "FXLabs_UI"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a80808a657aacf801657af6998d0020",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T10:38:26.189+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T10:38:26.189+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a80808a657aacf801657af6998c001f",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T10:38:26.188+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T11:27:59.360+0000",
"version" : null,
"inactive" : false,
"name" : "FXLabs_QA"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a80808a657aacf801657ab17dcc0002",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:57.099+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:57.108+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a80808a657aacf801657ab17dd30003",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:57.107+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:57.107+0000",
"version" : null,
"inactive" : false,
"name" : "FXLabs"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ENTERPRISE_ADMIN",
"status" : "ACTIVE"
} ],
"totalPages" : 1,
"totalElements" : 14
}
Logs :
Assertion [@StatusCode != 200] failed, not expecting [200] but found [200]
--- FX Bot ---
|
1.0
|
humerafxtesting : api_v1_orgs_by-user_get_query_param_sql_injection_postgres_pageSize - Project : humerafxtesting
Job : UAT
Env : UAT
Region : FXLabs/US_WEST_1
Result : fail
Status Code : 200
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 30 Aug 2018 10:36:36 GMT]}
Endpoint : http://13.56.210.25/api/v1/orgs/by-user?pageSize=
Request :
Response :
{
"requestId" : "None",
"requestTime" : "2018-08-30T10:36:37.262+0000",
"errors" : false,
"messages" : [ ],
"data" : [ {
"id" : "8a8080886583d7bd0165840144b00b0e",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-29T04:46:40.304+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-29T04:46:40.304+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080886583d7bd0165840144850b0c",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-29T04:46:40.261+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-29T04:46:40.261+0000",
"version" : null,
"inactive" : false,
"name" : "FXL"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a808057657fff9c01658011a8950d9b",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T10:26:05.589+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T10:26:05.589+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a808057657fff9c01658011a8940d9a",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T10:26:05.588+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T10:26:05.588+0000",
"version" : null,
"inactive" : false,
"name" : "Facebook1"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a808057657fff9c0165800e5eea0c15",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T10:22:30.122+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T10:22:30.122+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a808057657fff9c0165800e5ee90c14",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T10:22:30.121+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T10:22:30.121+0000",
"version" : null,
"inactive" : false,
"name" : "Org 12"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ee657f422301657f6dd1fd000b",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T07:27:08.285+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T07:27:08.285+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ee657f422301657f6dd1fc000a",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T07:27:08.283+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T07:27:08.283+0000",
"version" : null,
"inactive" : false,
"name" : "Ola Uber"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ee657f422301657f6510590001",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T07:17:34.425+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T07:17:34.425+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ee657f422301657f65103f0000",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-28T07:17:34.396+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-28T07:17:34.396+0000",
"version" : null,
"inactive" : false,
"name" : "Cloud Space"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080e7657b6af401657b983d02002f",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T13:34:59.330+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T13:34:59.330+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080e7657b6af401657b983d00002e",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T13:34:59.328+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T13:34:59.328+0000",
"version" : null,
"inactive" : false,
"name" : "Watsapp"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080e7657b6af401657b77a37d000b",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:59:22.877+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:59:22.877+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080e7657b6af401657b77a37b000a",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:59:22.875+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:59:22.875+0000",
"version" : null,
"inactive" : false,
"name" : "Uber"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080e7657b6af401657b750cb10002",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:56:33.200+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:56:33.200+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080e7657b6af401657b750ca80001",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:56:33.192+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:56:33.192+0000",
"version" : null,
"inactive" : false,
"name" : "Ola"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ec657b1b2b01657b5d53b90af9",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:30:38.521+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:30:38.521+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ec657b1b2b01657b5d53b80af8",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:30:38.520+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:30:38.520+0000",
"version" : null,
"inactive" : false,
"name" : "Facebook"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ec657b1b2b01657b518e835900",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:17:47.139+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:17:47.139+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ec657b1b2b01657b518e8258ff",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:17:47.138+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:17:47.138+0000",
"version" : null,
"inactive" : false,
"name" : "Microsoft"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ec657b1b2b01657b49da773826",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:09:22.295+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:09:22.295+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ec657b1b2b01657b49da763825",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T12:09:22.294+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T12:09:22.294+0000",
"version" : null,
"inactive" : false,
"name" : "Google"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a8080ec657b1b2b01657b1fab6f0675",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T11:23:17.743+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T11:23:17.743+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a8080ec657b1b2b01657b1fab6f0674",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T11:23:17.743+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T11:23:17.743+0000",
"version" : null,
"inactive" : false,
"name" : "FXLabs_UI"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a80808a657aacf801657af6998d0020",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T10:38:26.189+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T10:38:26.189+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a80808a657aacf801657af6998c001f",
"createdBy" : "8a80808a657aacf801657ab17ca30000",
"createdDate" : "2018-08-27T10:38:26.188+0000",
"modifiedBy" : "8a80808a657aacf801657ab17ca30000",
"modifiedDate" : "2018-08-27T11:27:59.360+0000",
"version" : null,
"inactive" : false,
"name" : "FXLabs_QA"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ADMIN",
"status" : "ACTIVE"
}, {
"id" : "8a80808a657aacf801657ab17dcc0002",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:57.099+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:57.108+0000",
"version" : null,
"inactive" : false,
"org" : {
"id" : "8a80808a657aacf801657ab17dd30003",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:57.107+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:57.107+0000",
"version" : null,
"inactive" : false,
"name" : "FXLabs"
},
"users" : {
"id" : "8a80808a657aacf801657ab17ca30000",
"createdBy" : "anonymousUser",
"createdDate" : "2018-08-27T09:22:56.800+0000",
"modifiedBy" : "anonymousUser",
"modifiedDate" : "2018-08-27T09:22:56.800+0000",
"version" : null,
"inactive" : false,
"name" : null,
"email" : "admin@fxlabs.io",
"username" : "admin",
"company" : null,
"location" : null,
"jobTitle" : null
},
"orgRole" : "ENTERPRISE_ADMIN",
"status" : "ACTIVE"
} ],
"totalPages" : 1,
"totalElements" : 14
}
Logs :
Assertion [@StatusCode != 200] failed, not expecting [200] but found [200]
--- FX Bot ---
|
non_process
|
humerafxtesting api orgs by user get query param sql injection postgres pagesize project humerafxtesting job uat env uat region fxlabs us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options content type transfer encoding date endpoint request response requestid none requesttime errors false messages data id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name fxl users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name org users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name ola uber users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby 
modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name cloud space users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name watsapp users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name uber users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name ola users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name facebook users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby 
modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name microsoft users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name google users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name fxlabs ui users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby createddate modifiedby modifieddate version null inactive false org id createdby createddate modifiedby modifieddate version null inactive false name fxlabs qa users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole admin status active id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false org id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name fxlabs users id createdby anonymoususer createddate modifiedby anonymoususer modifieddate version null inactive false name null email admin fxlabs io username admin company null location null jobtitle null orgrole 
enterprise admin status active totalpages totalelements logs assertion failed not expecting but found fx bot
| 0
|
122,280
| 16,099,947,329
|
IssuesEvent
|
2021-04-27 08:02:31
|
alphagov/govuk-frontend
|
https://api.github.com/repos/alphagov/govuk-frontend
|
opened
|
Consider deprecating govuk-link--muted
|
awaiting triage design typography
|
## What
We have a `govuk-link--muted` style in Frontend that we could consider deprecating:

## Why
* It isn't used by any of our components
* As far as we're aware it's not used anywhere else (it used to be used by GOV.UK in feedback banners at the bottom of pages but was replaced for something more noticeable)
* We have a black `govuk-link--text-colour` style which arguably serves the same purpose (i.e. to style navigational links that fade into the background a bit more) – do we need a third level of hierarchy for links?
We should ask the community and maybe search Github to see if it's being used and what for.
## Who needs to know about this
Designers
## Done when
- [ ] Worked out if there's a need for the style
- [ ] Decide on whether to keep it or not
- [ ] Maybe remove it!
|
1.0
|
Consider deprecating govuk-link--muted - ## What
We have a `govuk-link--muted` style in Frontend that we could consider deprecating:

## Why
* It isn't used by any of our components
* As far as we're aware it's not used anywhere else (it used to be used by GOV.UK in feedback banners at the bottom of pages but was replaced for something more noticeable)
* We have a black `govuk-link--text-colour` style which arguably serves the same purpose (i.e. to style navigational links that fade into the background a bit more) – do we need a third level of hierarchy for links?
We should ask the community and maybe search Github to see if it's being used and what for.
## Who needs to know about this
Designers
## Done when
- [ ] Worked out if there's a need for the style
- [ ] Decide on whether to keep it or not
- [ ] Maybe remove it!
|
non_process
|
consider deprecating govuk link muted what we have a govuk link muted style in frontend that we could consider deprecating why it isn t used by any of our components as far as we re aware it s not used anywhere else it used to be used by gov uk in feedback banners at the bottom of pages but was replaced for something more noticeable we have a black govuk link text colour style which arguably serves the same purpose i e to style navigational links that fade into the background a bit more – do we need a third level of hierarchy for links we should ask the community and maybe search github to see if it s being used and what for who needs to know about this designers done when worked out if there s a need for the style decide on whether to keep it or not maybe remove it
| 0
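In each row above, the `text` column appears to be a normalized copy of the `title` + `body` field (`text_combine`): lowercased, with links, punctuation, and digits stripped and whitespace collapsed (e.g. `govuk-link--muted` becomes `govuk link muted`). The exact pipeline used to build the dataset is not shown here; the following is a rough reconstruction inferred from comparing the two columns in the rows above:

```python
import re

def normalize(text: str) -> str:
    """Approximate the dataset's text_combine -> text normalization."""
    text = text.lower()
    # Drop URLs before stripping punctuation, so link targets vanish entirely
    text = re.sub(r"https?://\S+", " ", text)
    # Keep letters only; punctuation, digits, and markdown syntax become spaces
    text = re.sub(r"[^a-z\s]", " ", text)
    # Collapse runs of whitespace into single spaces
    return " ".join(text.split())
```

Applied to the title line of the row above, this reproduces the start of its `text` field: `normalize("Consider deprecating govuk-link--muted - ## What")` yields `"consider deprecating govuk link muted what"`.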
|
8,807
| 11,908,289,238
|
IssuesEvent
|
2020-03-31 00:31:17
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Resizing tools in processing modeler
|
Feature Request Processing
|
Author Name: **Magnus Nilsson** (Magnus Nilsson)
Original Redmine Issue: [16279](https://issues.qgis.org/issues/16279)
Redmine category:processing/modeller
Assignee: Victor Olaya
---
I would like to be able to resize the tools (at least up to a certain limit) in the modeler, since they sometimes are too small for the description that I have entered.
|
1.0
|
Resizing tools in processing modeler - Author Name: **Magnus Nilsson** (Magnus Nilsson)
Original Redmine Issue: [16279](https://issues.qgis.org/issues/16279)
Redmine category:processing/modeller
Assignee: Victor Olaya
---
I would like to be able to resize the tools (at least up to a certain limit) in the modeler, since they sometimes are too small for the description that I have entered.
|
process
|
resizing tools in processing modeler author name magnus nilsson magnus nilsson original redmine issue redmine category processing modeller assignee victor olaya i would like to be able to resize the tools at least up to a certain limit in the modeler since they sometimes are too small for the description that i have entered
| 1
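The trailing `binary_label` in each row tracks the `label` string: rows labeled `process` (like the QGIS row above) carry `1`, and `non_process` rows carry `0`. A minimal sketch of that mapping, assuming those are the only two label values in the dataset:

```python
def to_binary_label(label: str) -> int:
    # Observed in the rows above: "process" -> 1, "non_process" -> 0
    return 1 if label == "process" else 0
```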
|
280,805
| 24,334,580,100
|
IssuesEvent
|
2022-10-01 00:19:28
|
mozilla-mobile/android-components
|
https://api.github.com/repos/mozilla-mobile/android-components
|
closed
|
Intermittent UI test failure - org.mozilla.samples.glean.pings.BaselinePingTest - validateBaselinePing - java.lang.AssertionError testSetLocalEndpoint
|
🎲 intermittent-test
|
### Firebase Test Run:
https://console.firebase.google.com/project/moz-android-components-230120/testlab/histories/bh.acacddb44ee28c64/matrices/6074761911282324979/executions/bs.22752a5c356d2d34/testcases/1
### Stacktrace:
```
java.lang.AssertionError: Assertion failed
at mozilla.telemetry.glean.GleanInternalAPI.testSetLocalEndpoint$glean_release(Glean.kt:469)
at mozilla.telemetry.glean.testing.GleanTestLocalServer.starting(GleanTestLocalServer.kt:46)
at org.junit.rules.TestWatcher.startingQuietly(TestWatcher.java:108)
at org.junit.rules.TestWatcher.access$000(TestWatcher.java:46)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:53)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at androidx.test.ext.junit.runners.AndroidJUnit4.run(AndroidJUnit4.java:162)
at org.junit.runners.Suite.runChild(Suite.java:128)
at org.junit.runners.Suite.runChild(Suite.java:27)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at org.junit.runner.JUnitCore.run(JUnitCore.java:115)
at androidx.test.internal.runner.TestExecutor.execute(TestExecutor.java:56)
at androidx.test.runner.AndroidJUnitRunner.onStart(AndroidJUnitRunner.java:444)
at android.app.Instrumentation$InstrumentationThread.run(Instrumentation.java:1837)
```
### Build:
https://github.com/mozilla-mobile/android-components/commit/99c8dabbe94e088a9e82125180707dffe9b8d163
┆Issue is synchronized with this [Jira Task](https://mozilla-hub.atlassian.net/browse/FNXV2-20758)
|
1.0
|
Intermittent UI test failure - org.mozilla.samples.glean.pings.BaselinePingTest - validateBaselinePing - java.lang.AssertionError testSetLocalEndpoint - ### Firebase Test Run:
https://console.firebase.google.com/project/moz-android-components-230120/testlab/histories/bh.acacddb44ee28c64/matrices/6074761911282324979/executions/bs.22752a5c356d2d34/testcases/1
### Stacktrace:
```
java.lang.AssertionError: Assertion failed
at mozilla.telemetry.glean.GleanInternalAPI.testSetLocalEndpoint$glean_release(Glean.kt:469)
at mozilla.telemetry.glean.testing.GleanTestLocalServer.starting(GleanTestLocalServer.kt:46)
at org.junit.rules.TestWatcher.startingQuietly(TestWatcher.java:108)
at org.junit.rules.TestWatcher.access$000(TestWatcher.java:46)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:53)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at androidx.test.ext.junit.runners.AndroidJUnit4.run(AndroidJUnit4.java:162)
at org.junit.runners.Suite.runChild(Suite.java:128)
at org.junit.runners.Suite.runChild(Suite.java:27)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at org.junit.runner.JUnitCore.run(JUnitCore.java:115)
at androidx.test.internal.runner.TestExecutor.execute(TestExecutor.java:56)
at androidx.test.runner.AndroidJUnitRunner.onStart(AndroidJUnitRunner.java:444)
at android.app.Instrumentation$InstrumentationThread.run(Instrumentation.java:1837)
```
### Build:
https://github.com/mozilla-mobile/android-components/commit/99c8dabbe94e088a9e82125180707dffe9b8d163
┆Issue is synchronized with this [Jira Task](https://mozilla-hub.atlassian.net/browse/FNXV2-20758)
|
non_process
|
intermittent ui test failure org mozilla samples glean pings baselinepingtest validatebaselineping java lang assertionerror testsetlocalendpoint firebase test run stacktrace java lang assertionerror assertion failed at mozilla telemetry glean gleaninternalapi testsetlocalendpoint glean release glean kt at mozilla telemetry glean testing gleantestlocalserver starting gleantestlocalserver kt at org junit rules testwatcher startingquietly testwatcher java at org junit rules testwatcher access testwatcher java at org junit rules testwatcher evaluate testwatcher java at org junit rules runrules evaluate runrules java at org junit runners parentrunner runleaf parentrunner java at org junit runners runchild java at org junit runners runchild java at org junit runners parentrunner run parentrunner java at org junit runners parentrunner schedule parentrunner java at org junit runners parentrunner runchildren parentrunner java at org junit runners parentrunner access parentrunner java at org junit runners parentrunner evaluate parentrunner java at org junit runners parentrunner run parentrunner java at androidx test ext junit runners run java at org junit runners suite runchild suite java at org junit runners suite runchild suite java at org junit runners parentrunner run parentrunner java at org junit runners parentrunner schedule parentrunner java at org junit runners parentrunner runchildren parentrunner java at org junit runners parentrunner access parentrunner java at org junit runners parentrunner evaluate parentrunner java at org junit runners parentrunner run parentrunner java at org junit runner junitcore run junitcore java at org junit runner junitcore run junitcore java at androidx test internal runner testexecutor execute testexecutor java at androidx test runner androidjunitrunner onstart androidjunitrunner java at android app instrumentation instrumentationthread run instrumentation java build ┆issue is synchronized with this
| 0
|
47,214
| 13,210,445,009
|
IssuesEvent
|
2020-08-15 16:58:50
|
onokatio/blog.katio.net
|
https://api.github.com/repos/onokatio/blog.katio.net
|
closed
|
CVE-2018-11695 (High) detected in opennms-opennms-source-25.0.0-1
|
security vulnerability
|
## CVE-2018-11695 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-25.0.0-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
<p>Found in HEAD commit: <a href="https://github.com/onokatio/blog.katio.net/commit/fd58ea8e91b776da188df47930fc9a240c664d3d">fd58ea8e91b776da188df47930fc9a240c664d3d</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (136)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/mac_tool.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/expand.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/util.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/dump_dependency_json.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/emitter.hpp
- /blog.katio.net/node_modules/nan/nan_converters_pre_43_inl.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/make.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/analyzer.py
- /blog.katio.net/node_modules/nan/nan_persistent_12_inl.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/operation.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/tools/graphviz.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSProject.py
- /blog.katio.net/node_modules/nan/nan_implementation_pre_12_inl.h
- /blog.katio.net/node_modules/node-sass/src/custom_importer_bridge.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/functions.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/common.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/eval.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/flock_tool.py
- /blog.katio.net/node_modules/nan/nan_converters.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/input_test.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/xcode_ninja.py
- /blog.katio.net/node_modules/nan/nan.h
- /blog.katio.net/node_modules/node-sass/src/sass_context_wrapper.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/parser.cpp
- /blog.katio.net/node_modules/nan/nan_string_bytes.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/emitter.cpp
- /blog.katio.net/node_modules/nan/nan_new.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/ast.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/ninja_syntax.py
- /blog.katio.net/node_modules/nan/nan_maybe_43_inl.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/output.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/check_nesting.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/gypsh.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSToolFile.py
- /blog.katio.net/node_modules/node-gyp/gyp/PRESUBMIT.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/functions.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/cssize.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/gypd.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/easy_xml_test.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/inspect.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/win_tool.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/__init__.py
- /blog.katio.net/node_modules/node-sass/src/sass_types/color.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/values.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/list.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/check_nesting.hpp
- /blog.katio.net/node_modules/nan/nan_define_own_property_helper.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/context.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSUtil.py
- /blog.katio.net/node_modules/node-sass/src/sass_types/string.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/tools/pretty_sln.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/xcodeproj_file.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/context.hpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/boolean.h
- /blog.katio.net/node_modules/node-gyp/lib/Find-VS2017.cs
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/cmake.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/input.py
- /blog.katio.net/node_modules/nan/nan_private.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/eval.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSUserFile.py
- /blog.katio.net/node_modules/nan/nan_callbacks_pre_12_inl.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/expand.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/factory.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/operators.cpp
- /blog.katio.net/node_modules/nan/nan_implementation_12_inl.h
- /blog.katio.net/node_modules/node-sass/src/sass_types/boolean.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/value.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/common_test.py
- /blog.katio.net/node_modules/node-sass/src/callback_bridge.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/file.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/sass.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/xml_fix.py
- /blog.katio.net/node_modules/nan/nan_persistent_pre_12_inl.h
- /blog.katio.net/node_modules/node-gyp/src/win_delay_load_hook.cc
- /blog.katio.net/node_modules/node-sass/src/libsass/src/operators.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/constants.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/xcode_emulation.py
- /blog.katio.net/node_modules/nan/nan_weak.h
- /blog.katio.net/node_modules/node-gyp/tools/gyp/pylib/gyp/generator/compile_commands_json.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/parser.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs_test.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/constants.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/list.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/eclipse.py
- /blog.katio.net/node_modules/node-gyp/gyp/tools/pretty_vcproj.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/cssize.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/util.cpp
- /blog.katio.net/node_modules/node-sass/src/custom_function_bridge.cpp
- /blog.katio.net/node_modules/nan/nan_typedarray_contents.h
- /blog.katio.net/node_modules/node-sass/src/custom_importer_bridge.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/bind.cpp
- /blog.katio.net/node_modules/nan/nan_json.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/inspect.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode_test.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/backtrace.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/extend.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/debugger.hpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/number.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja_test.py
- /blog.katio.net/node_modules/node-sass/src/sass_types/color.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSNew.py
- /blog.katio.net/node_modules/nan/nan_maybe_pre_43_inl.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /blog.katio.net/node_modules/nan/nan_callbacks_12_inl.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/msvs_emulation.py
- /blog.katio.net/node_modules/nan/nan_object_wrap.h
- /blog.katio.net/node_modules/node-gyp/gyp/data/win/large-pdb-shim.cc
- /blog.katio.net/node_modules/node-sass/src/sass_types/null.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings_test.py
- /blog.katio.net/node_modules/node-gyp/gyp/tools/pretty_gyp.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/ast.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/android.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/to_c.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/to_value.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSVersion.py
- /blog.katio.net/node_modules/nan/nan_callbacks.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/ordered_dict.py
- /blog.katio.net/node_modules/node-sass/src/sass_context_wrapper.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/easy_xml.py
- /blog.katio.net/node_modules/node-sass/src/sass_types/map.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/to_value.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/simple_copy.py
- /blog.katio.net/node_modules/node-sass/src/binding.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /blog.katio.net/node_modules/nan/nan_converters_43_inl.h
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.2. A NULL pointer dereference was found in the function Sass::Expand::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695>CVE-2018-11695</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695</a></p>
<p>Release Date: 2018-06-04</p>
<p>Fix Resolution: 3.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2018-11695 (High) detected in opennms-opennms-source-25.0.0-1 - ## CVE-2018-11695 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-25.0.0-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
<p>Found in HEAD commit: <a href="https://github.com/onokatio/blog.katio.net/commit/fd58ea8e91b776da188df47930fc9a240c664d3d">fd58ea8e91b776da188df47930fc9a240c664d3d</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (136)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/mac_tool.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/expand.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/util.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/dump_dependency_json.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/emitter.hpp
- /blog.katio.net/node_modules/nan/nan_converters_pre_43_inl.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/make.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/analyzer.py
- /blog.katio.net/node_modules/nan/nan_persistent_12_inl.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/operation.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/tools/graphviz.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSProject.py
- /blog.katio.net/node_modules/nan/nan_implementation_pre_12_inl.h
- /blog.katio.net/node_modules/node-sass/src/custom_importer_bridge.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/functions.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/common.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/eval.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/flock_tool.py
- /blog.katio.net/node_modules/nan/nan_converters.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/input_test.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/xcode_ninja.py
- /blog.katio.net/node_modules/nan/nan.h
- /blog.katio.net/node_modules/node-sass/src/sass_context_wrapper.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/parser.cpp
- /blog.katio.net/node_modules/nan/nan_string_bytes.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/emitter.cpp
- /blog.katio.net/node_modules/nan/nan_new.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/ast.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/ninja_syntax.py
- /blog.katio.net/node_modules/nan/nan_maybe_43_inl.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/output.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/check_nesting.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/gypsh.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSToolFile.py
- /blog.katio.net/node_modules/node-gyp/gyp/PRESUBMIT.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/functions.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/cssize.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/gypd.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/easy_xml_test.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/inspect.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/win_tool.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/__init__.py
- /blog.katio.net/node_modules/node-sass/src/sass_types/color.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/values.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/list.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/check_nesting.hpp
- /blog.katio.net/node_modules/nan/nan_define_own_property_helper.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/context.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSUtil.py
- /blog.katio.net/node_modules/node-sass/src/sass_types/string.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/tools/pretty_sln.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/xcodeproj_file.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/context.hpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/boolean.h
- /blog.katio.net/node_modules/node-gyp/lib/Find-VS2017.cs
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/cmake.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/input.py
- /blog.katio.net/node_modules/nan/nan_private.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/eval.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSUserFile.py
- /blog.katio.net/node_modules/nan/nan_callbacks_pre_12_inl.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/expand.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/factory.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/operators.cpp
- /blog.katio.net/node_modules/nan/nan_implementation_12_inl.h
- /blog.katio.net/node_modules/node-sass/src/sass_types/boolean.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/value.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/common_test.py
- /blog.katio.net/node_modules/node-sass/src/callback_bridge.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/file.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/sass.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/xml_fix.py
- /blog.katio.net/node_modules/nan/nan_persistent_pre_12_inl.h
- /blog.katio.net/node_modules/node-gyp/src/win_delay_load_hook.cc
- /blog.katio.net/node_modules/node-sass/src/libsass/src/operators.hpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/constants.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/xcode_emulation.py
- /blog.katio.net/node_modules/nan/nan_weak.h
- /blog.katio.net/node_modules/node-gyp/tools/gyp/pylib/gyp/generator/compile_commands_json.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/parser.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs_test.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/constants.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/list.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/eclipse.py
- /blog.katio.net/node_modules/node-gyp/gyp/tools/pretty_vcproj.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/cssize.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/util.cpp
- /blog.katio.net/node_modules/node-sass/src/custom_function_bridge.cpp
- /blog.katio.net/node_modules/nan/nan_typedarray_contents.h
- /blog.katio.net/node_modules/node-sass/src/custom_importer_bridge.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/bind.cpp
- /blog.katio.net/node_modules/nan/nan_json.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/inspect.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode_test.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/backtrace.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/extend.cpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/debugger.hpp
- /blog.katio.net/node_modules/node-sass/src/sass_types/number.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja_test.py
- /blog.katio.net/node_modules/node-sass/src/sass_types/color.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings.py
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSNew.py
- /blog.katio.net/node_modules/nan/nan_maybe_pre_43_inl.h
- /blog.katio.net/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /blog.katio.net/node_modules/nan/nan_callbacks_12_inl.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/msvs_emulation.py
- /blog.katio.net/node_modules/nan/nan_object_wrap.h
- /blog.katio.net/node_modules/node-gyp/gyp/data/win/large-pdb-shim.cc
- /blog.katio.net/node_modules/node-sass/src/sass_types/null.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings_test.py
- /blog.katio.net/node_modules/node-gyp/gyp/tools/pretty_gyp.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/ast.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/generator/android.py
- /blog.katio.net/node_modules/node-sass/src/libsass/src/to_c.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/to_value.hpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/MSVSVersion.py
- /blog.katio.net/node_modules/nan/nan_callbacks.h
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/ordered_dict.py
- /blog.katio.net/node_modules/node-sass/src/sass_context_wrapper.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/easy_xml.py
- /blog.katio.net/node_modules/node-sass/src/sass_types/map.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/to_value.cpp
- /blog.katio.net/node_modules/node-gyp/gyp/pylib/gyp/simple_copy.py
- /blog.katio.net/node_modules/node-sass/src/binding.cpp
- /blog.katio.net/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /blog.katio.net/node_modules/nan/nan_converters_43_inl.h
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.2. A NULL pointer dereference was found in the function Sass::Expand::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695>CVE-2018-11695</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695</a></p>
<p>Release Date: 2018-06-04</p>
<p>Fix Resolution: 3.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in opennms opennms source cve high severity vulnerability vulnerable library opennmsopennms source a java based fault and performance management system library home page a href found in head commit a href library source files the source files were matched to this source library based on a best effort match source libraries are selected from a list of probable public libraries blog katio net node modules node gyp gyp pylib gyp mac tool py blog katio net node modules node sass src libsass src expand hpp blog katio net node modules node sass src libsass src util hpp blog katio net node modules node gyp gyp pylib gyp generator ninja py blog katio net node modules node gyp gyp pylib gyp generator dump dependency json py blog katio net node modules node sass src libsass src emitter hpp blog katio net node modules nan nan converters pre inl h blog katio net node modules node gyp gyp pylib gyp generator make py blog katio net node modules node gyp gyp pylib gyp generator analyzer py blog katio net node modules nan nan persistent inl h blog katio net node modules node sass src libsass src operation hpp blog katio net node modules node sass src libsass src error handling hpp blog katio net node modules node gyp gyp tools graphviz py blog katio net node modules node gyp gyp pylib gyp msvsproject py blog katio net node modules nan nan implementation pre inl h blog katio net node modules node sass src custom importer bridge cpp blog katio net node modules node sass src libsass src functions hpp blog katio net node modules node gyp gyp pylib gyp common py blog katio net node modules node gyp gyp pylib gyp generator xcode py blog katio net node modules node sass src libsass src eval hpp blog katio net node modules node gyp gyp pylib gyp flock tool py blog katio net node modules nan nan converters h blog katio net node modules node gyp gyp pylib gyp input test py blog katio net node modules node gyp gyp pylib gyp xcode ninja py blog katio net node modules nan nan 
h blog katio net node modules node sass src sass context wrapper h blog katio net node modules node sass src libsass src error handling cpp blog katio net node modules node sass src libsass src parser cpp blog katio net node modules nan nan string bytes h blog katio net node modules node sass src libsass src emitter cpp blog katio net node modules nan nan new h blog katio net node modules node gyp gyp pylib gyp generator msvs py blog katio net node modules node sass src libsass src ast hpp blog katio net node modules node gyp gyp pylib gyp ninja syntax py blog katio net node modules nan nan maybe inl h blog katio net node modules node sass src libsass src output cpp blog katio net node modules node sass src libsass src check nesting cpp blog katio net node modules node gyp gyp pylib gyp generator gypsh py blog katio net node modules node gyp gyp pylib gyp msvstoolfile py blog katio net node modules node gyp gyp presubmit py blog katio net node modules node sass src libsass src ast def macros hpp blog katio net node modules node sass src libsass src functions cpp blog katio net node modules node sass src libsass src cssize hpp blog katio net node modules node gyp gyp pylib gyp generator gypd py blog katio net node modules node sass src libsass src prelexer cpp blog katio net node modules node sass src libsass src ast fwd decl hpp blog katio net node modules node gyp gyp pylib gyp easy xml test py blog katio net node modules node sass src libsass src inspect hpp blog katio net node modules node gyp gyp pylib gyp win tool py blog katio net node modules node gyp gyp pylib gyp init py blog katio net node modules node sass src sass types color cpp blog katio net node modules node sass src libsass src values cpp blog katio net node modules node sass src sass types list h blog katio net node modules node sass src libsass src check nesting hpp blog katio net node modules nan nan define own property helper h blog katio net node modules node sass src libsass src context cpp 
blog katio net node modules node gyp gyp pylib gyp msvsutil py blog katio net node modules node sass src sass types string cpp blog katio net node modules node gyp gyp tools pretty sln py blog katio net node modules node gyp gyp pylib gyp xcodeproj file py blog katio net node modules node sass src libsass src prelexer hpp blog katio net node modules node sass src libsass src context hpp blog katio net node modules node sass src sass types boolean h blog katio net node modules node gyp lib find cs blog katio net node modules node gyp gyp pylib gyp generator cmake py blog katio net node modules node gyp gyp pylib gyp input py blog katio net node modules nan nan private h blog katio net node modules node sass src libsass src eval cpp blog katio net node modules node gyp gyp pylib gyp msvsuserfile py blog katio net node modules nan nan callbacks pre inl h blog katio net node modules node sass src libsass src expand cpp blog katio net node modules node sass src sass types factory cpp blog katio net node modules node sass src libsass src operators cpp blog katio net node modules nan nan implementation inl h blog katio net node modules node sass src sass types boolean cpp blog katio net node modules node sass src sass types value h blog katio net node modules node gyp gyp pylib gyp common test py blog katio net node modules node sass src callback bridge h blog katio net node modules node sass src libsass src file cpp blog katio net node modules node sass src libsass src sass cpp blog katio net node modules node gyp gyp pylib gyp xml fix py blog katio net node modules nan nan persistent pre inl h blog katio net node modules node gyp src win delay load hook cc blog katio net node modules node sass src libsass src operators hpp blog katio net node modules node sass src libsass src constants hpp blog katio net node modules node gyp gyp pylib gyp xcode emulation py blog katio net node modules nan nan weak h blog katio net node modules node gyp tools gyp pylib gyp generator 
compile commands json py blog katio net node modules node sass src libsass src parser hpp blog katio net node modules node gyp gyp pylib gyp generator msvs test py blog katio net node modules node sass src libsass src constants cpp blog katio net node modules node sass src sass types list cpp blog katio net node modules node gyp gyp pylib gyp generator eclipse py blog katio net node modules node gyp gyp tools pretty vcproj py blog katio net node modules node sass src libsass src cssize cpp blog katio net node modules node sass src libsass src util cpp blog katio net node modules node sass src custom function bridge cpp blog katio net node modules nan nan typedarray contents h blog katio net node modules node sass src custom importer bridge h blog katio net node modules node sass src libsass src bind cpp blog katio net node modules nan nan json h blog katio net node modules node sass src libsass src inspect cpp blog katio net node modules node gyp gyp pylib gyp generator xcode test py blog katio net node modules node sass src libsass src backtrace cpp blog katio net node modules node sass src libsass src extend cpp blog katio net node modules node sass src sass types sass value wrapper h blog katio net node modules node sass src libsass src debugger hpp blog katio net node modules node sass src sass types number cpp blog katio net node modules node gyp gyp pylib gyp generator ninja test py blog katio net node modules node sass src sass types color h blog katio net node modules node gyp gyp pylib gyp msvssettings py blog katio net node modules node gyp gyp pylib gyp msvsnew py blog katio net node modules nan nan maybe pre inl h blog katio net node modules node sass src libsass src sass values cpp blog katio net node modules nan nan callbacks inl h blog katio net node modules node gyp gyp pylib gyp msvs emulation py blog katio net node modules nan nan object wrap h blog katio net node modules node gyp gyp data win large pdb shim cc blog katio net node modules node 
sass src sass types null cpp blog katio net node modules node gyp gyp pylib gyp msvssettings test py blog katio net node modules node gyp gyp tools pretty gyp py blog katio net node modules node sass src libsass src ast cpp blog katio net node modules node gyp gyp pylib gyp generator android py blog katio net node modules node sass src libsass src to c cpp blog katio net node modules node sass src libsass src to value hpp blog katio net node modules node gyp gyp pylib gyp msvsversion py blog katio net node modules nan nan callbacks h blog katio net node modules node gyp gyp pylib gyp ordered dict py blog katio net node modules node sass src sass context wrapper cpp blog katio net node modules node gyp gyp pylib gyp easy xml py blog katio net node modules node sass src sass types map cpp blog katio net node modules node sass src libsass src to value cpp blog katio net node modules node gyp gyp pylib gyp simple copy py blog katio net node modules node sass src binding cpp blog katio net node modules node sass src libsass src sass context cpp blog katio net node modules nan nan converters inl h vulnerability details an issue was discovered in libsass through a null pointer dereference was found in the function sass expand operator which could be leveraged by an attacker to cause a denial of service application crash or possibly have unspecified other impact publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
12,620
| 15,015,294,703
|
IssuesEvent
|
2021-02-01 08:05:48
|
gy0512/comments.github.io
|
https://api.github.com/repos/gy0512/comments.github.io
|
opened
|
How would you go about testing a new requirement and what's the process? - TestBirds
|
/post/the-process-of-testing-a-new-requirement/ Gitalk
|
https://gy0512.github.io/post/the-process-of-testing-a-new-requirement/
Basic Test Knowledge.
|
1.0
|
How would you go about testing a new requirement and what's the process? - TestBirds - https://gy0512.github.io/post/the-process-of-testing-a-new-requirement/
Basic Test Knowledge.
|
process
|
how would you go about testing a new requirement and what s the process testbirds basic test knowledge
| 1
|
386,660
| 11,448,465,733
|
IssuesEvent
|
2020-02-06 03:31:18
|
incognitochain/incognito-chain
|
https://api.github.com/repos/incognitochain/incognito-chain
|
closed
|
[qc][training] deploy incognito chain on localhost
|
Priority: Medium
|
- [x] deploy chain
- [x] generate new address on each shard
- [ ] send PRV to each addresses
- [ ] send PRV privacy and non-privacy
- [ ] inspect PRV transaction `gettransactionbyhash`
- [x] init privacy Token
- [ ] send pToken
- [ ] inspect pToken transaction
- [ ] stake new node
- [ ] monitoring new staking node
- [ ] withdraw reward
|
1.0
|
[qc][training] deploy incognito chain on localhost - - [x] deploy chain
- [x] generate new address on each shard
- [ ] send PRV to each addresses
- [ ] send PRV privacy and non-privacy
- [ ] inspect PRV transaction `gettransactionbyhash`
- [x] init privacy Token
- [ ] send pToken
- [ ] inspect pToken transaction
- [ ] stake new node
- [ ] monitoring new staking node
- [ ] withdraw reward
|
non_process
|
deploy incognito chain on localhost deploy chain generate new address on each shard send prv to each addresses send prv privacy and non privacy inspect prv transaction gettransactionbyhash init privacy token send ptoken inspect ptoken transaction stake new node monitoring new staking node withdraw reward
| 0
|
238,457
| 26,114,051,584
|
IssuesEvent
|
2022-12-28 02:06:28
|
OpenHistoricalMap/issues
|
https://api.github.com/repos/OpenHistoricalMap/issues
|
closed
|
Upgrade iD (& anything else?) to OAuth2?
|
ID app security
|
In #420, @batpad said:
> I am still slightly concerned that our iD still uses OAuth 1, whereas for the "Website Application", it uses OAuth 2 - so far I have not seen this cause issues on staging, but probably something to keep an eye out for.
So, assuming better security is a good thing & key for integrations moving forward, it seems we should get rid of OAuth 1 wherever possible.
I'm not sure of all of these places, could we use this ticket to document:
- [ ] Places where OAuth1 is still used in our stack - e.g. iD, Tasking Manager (?)
- [ ] Whether we should go ahead and upgrade each of those places [could be just iD!]
|
True
|
Upgrade iD (& anything else?) to OAuth2? - In #420, @batpad said:
> I am still slightly concerned that our iD still uses OAuth 1, whereas for the "Website Application", it uses OAuth 2 - so far I have not seen this cause issues on staging, but probably something to keep an eye out for.
So, assuming better security is a good thing & key for integrations moving forward, it seems we should get rid of OAuth 1 wherever possible.
I'm not sure of all of these places, could we use this ticket to document:
- [ ] Places where OAuth1 is still used in our stack - e.g. iD, Tasking Manager (?)
- [ ] Whether we should go ahead and upgrade each of those places [could be just iD!]
|
non_process
|
upgrade id anything else to in batpad said i am still slightly concerned that our id still uses oauth whereas for the website application it uses oauth so far i have not seen this cause issues on staging but probably something to keep an eye out for so assuming better security is a good thing key for integrations moving forward it seems we should get rid of oauth wherever possible i m not sure of all of these places could we use this ticket to document places where is still used in our stack e g id tasking manager whether we should go ahead and upgrade each of those places
| 0
|
232,325
| 17,778,939,865
|
IssuesEvent
|
2021-08-30 23:59:37
|
bmigunov/i3blocks-contrib
|
https://api.github.com/repos/bmigunov/i3blocks-contrib
|
closed
|
Merge i3blocks-scripts into this repo
|
documentation enhancement
|
Merge personal i3blocks-scripts into this repo to keep features I made myself.
|
1.0
|
Merge i3blocks-scripts into this repo - Merge personal i3blocks-scripts into this repo to keep features I made myself.
|
non_process
|
merge scripts into this repo merge personal scripts into this repo to keep features i made myself
| 0
|
123,305
| 16,477,692,981
|
IssuesEvent
|
2021-05-24 07:54:12
|
ChangeWindows/Horizon
|
https://api.github.com/repos/ChangeWindows/Horizon
|
closed
|
Align design with our grid
|
design
|
With ChangeWindows 7, we finally follow at least some rules to put everything on a proper grid. However, we kinda ignored that grid in getting CW7 ready for launch.
Look how unsatisfying our current grid is:
<img width="644" alt="Screenshot 2021-05-21 001925" src="https://user-images.githubusercontent.com/1693592/119056154-7d33a100-b9ca-11eb-8149-016d8ab8cbf2.png">
There is different spacing between the timeline and the channels (0 vs 8), the top of the first timeline item isn't aligned with the top of the first channel, titles use different heights, etc. There is much to be desired.
<img width="643" alt="Screenshot 2021-05-21 002033" src="https://user-images.githubusercontent.com/1693592/119056211-9b999c80-b9ca-11eb-8562-d6de75300e50.png">
This is what we have now in development, everything matches. It's beautiful. Now we should take this, and apply it to everything. There is a bunch of things that this issue should solve.
- [ ] Properly align timelines with their sidebars where applicable.
- [ ] Properly space out any titles.
- [ ] Properly space out the pagination component.
|
1.0
|
Align design with our grid - With ChangeWindows 7, we finally follow at least some rules to put everything on a proper grid. However, we kinda ignored that grid in getting CW7 ready for launch.
Look how unsatisfying our current grid is:
<img width="644" alt="Screenshot 2021-05-21 001925" src="https://user-images.githubusercontent.com/1693592/119056154-7d33a100-b9ca-11eb-8149-016d8ab8cbf2.png">
There is different spacing between the timeline and the channels (0 vs 8), the top of the first timeline item isn't aligned with the top of the first channel, titles use different heights, etc. There is much to be desired.
<img width="643" alt="Screenshot 2021-05-21 002033" src="https://user-images.githubusercontent.com/1693592/119056211-9b999c80-b9ca-11eb-8562-d6de75300e50.png">
This is what we have now in development, everything matches. It's beautiful. Now we should take this, and apply it to everything. There is a bunch of things that this issue should solve.
- [ ] Properly align timelines with their sidebars where applicable.
- [ ] Properly space out any titles.
- [ ] Properly space out the pagination component.
|
non_process
|
align design with our grid with changewindows we finally follow at least some rules to put everything on a proper grid however we kinda ignored that grid in getting ready for launch look how unsatisfying our current grid is img width alt screenshot src there is different spacing between the timeline and the channels vs the top of the first timeline item isn t aligned with the top of the first channel titles use different heights etc there is much to be desired img width alt screenshot src this is what we have now in development everything matches it s beautiful now we should take this and apply it to everything there is a bunch of things that this issue should solve properly align timelines with their sidebars where applicable properly space out any titles properly space out the pagination component
| 0
|
61,832
| 3,154,892,078
|
IssuesEvent
|
2015-09-17 03:56:14
|
grafeo/grafeo
|
https://api.github.com/repos/grafeo/grafeo
|
closed
|
Lists
|
data structure feature request library priority: high
|
Implement some doubled linked lists:
- list_new(): Just create a List instance
- list_free(): Destroy the list (not the data)
## Insertions
- list_prepend(): Inserts a new item to the beginning of the list (it becomes a new list head)
- list_append(): Inserts a new item to the end of the list (it becomes a new list tail)
- list_prepend_at(): Inserts a new item before a specified list item
- list_append_at(): Inserts a new item after a specified list item
- list_prepend_at_index(): Inserts a new item before an item specified by index
- list_append_at_index(): Inserts a new item after an item specified by index
## Deletions
- list_remove(): Removes a specified item
- list_remove_from_value(): Removes an item specified by data value
- list_remove_at_index(): Removes an item specified by index
- list_remove_begin():
- list_remove_end():
## Accessors
- list_find():
- list_index_of(): Gets an index of a item specified by its value
- list_begin(): Returns the first item
- list_end(): Returns the last item
- list_length(): Returns the number of elements of the list
- list_is_empty(): `True` if list_length() equals 0
- list_from_array(Array* array): Converts an Array to a List
- list_at(): Gets an List item at index
- list_value_at(): Gets the List item value at index
- list_next(): Gets the next list item
- list_prev(): Gets the previous list item
## Operations
- list_swap(): Swap items
- list_swap_at(): Swap items at specified indices
- list_swap_values(): Swap values (keep the items)
- list_swap_values_at(): Swap values (keep the items) at specified indices
- list_copy(): copy two values
- list_replace(): Replace items (remove one item and inserts another at same place)
- list_replace_at(): Replace a value at (remove
## Comparisons
- list_is_different(): True if two lists are different
- list_is_equal(): True if two lists are equal
|
1.0
|
Lists - Implement some doubled linked lists:
- list_new(): Just create a List instance
- list_free(): Destroy the list (not the data)
## Insertions
- list_prepend(): Inserts a new item to the beginning of the list (it becomes a new list head)
- list_append(): Inserts a new item to the end of the list (it becomes a new list tail)
- list_prepend_at(): Inserts a new item before a specified list item
- list_append_at(): Inserts a new item after a specified list item
- list_prepend_at_index(): Inserts a new item before an item specified by index
- list_append_at_index(): Inserts a new item after an item specified by index
## Deletions
- list_remove(): Removes a specified item
- list_remove_from_value(): Removes an item specified by data value
- list_remove_at_index(): Removes an item specified by index
- list_remove_begin():
- list_remove_end():
## Accessors
- list_find():
- list_index_of(): Gets an index of a item specified by its value
- list_begin(): Returns the first item
- list_end(): Returns the last item
- list_length(): Returns the number of elements of the list
- list_is_empty(): `True` if list_length() equals 0
- list_from_array(Array* array): Converts an Array to a List
- list_at(): Gets an List item at index
- list_value_at(): Gets the List item value at index
- list_next(): Gets the next list item
- list_prev(): Gets the previous list item
## Operations
- list_swap(): Swap items
- list_swap_at(): Swap items at specified indices
- list_swap_values(): Swap values (keep the items)
- list_swap_values_at(): Swap values (keep the items) at specified indices
- list_copy(): copy two values
- list_replace(): Replace items (remove one item and inserts another at same place)
- list_replace_at(): Replace a value at (remove
## Comparisons
- list_is_different(): True if two lists are different
- list_is_equal(): True if two lists are equal
|
non_process
|
lists implement some doubled linked lists list new just create a list instance list free destroy the list not the data insertions list prepend inserts a new item to the beginning of the list it becomes a new list head list append inserts a new item to the end of the list it becomes a new list tail list prepend at inserts a new item before an specified list item list append at inserts a new item after an specified list item list prepend at index inserts a new item before an item specified by index list append at index inserts a new item after an item specified by index deletions list remove removes an specified item list remove from value removes an item specified by data value list remove at index removes an item specified by index list remove begin list remove end accessors list find list index of gets an index of a item specified by its value list begin returns the first item list end returns the last item list length returns the number of elements of the list list is empty true if list length equals to list from array array array converts an array to a list list at gets an list item at index list value at gets the list item value at index list next gets the next list item list prev gets the previous list item operations list swap swap items list swap at swap items at specified indices list swap values swap values keep the items list swap values at swap values keep the items at specified indices list copy copy two values list replace replace items remove one item and inserts another at same place list replace at replace a value at remove comparisons list is different true if two lists are different list is equal true if two lists are equal
| 0
|
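The `list_*` API spec in the record above is complete enough to sketch directly. Below is a minimal Python rendering of a few of those operations — illustrative only: grafeo is a C library, and these function names merely mirror its spec rather than reproduce its implementation.

```python
class _Item:
    """One node of a doubly linked list."""
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None


class List:
    """Minimal doubly linked list mirroring the list_* API sketch."""
    def __init__(self):
        self.head = None
        self.tail = None
        self._length = 0


def list_new():
    """Just create a List instance."""
    return List()


def list_append(lst, value):
    """Insert a new item at the end; it becomes the new list tail."""
    item = _Item(value)
    if lst.tail is None:
        lst.head = lst.tail = item
    else:
        item.prev = lst.tail
        lst.tail.next = item
        lst.tail = item
    lst._length += 1
    return item


def list_prepend(lst, value):
    """Insert a new item at the beginning; it becomes the new list head."""
    item = _Item(value)
    if lst.head is None:
        lst.head = lst.tail = item
    else:
        item.next = lst.head
        lst.head.prev = item
        lst.head = item
    lst._length += 1
    return item


def list_length(lst):
    """Return the number of elements of the list."""
    return lst._length


def list_is_empty(lst):
    """True if list_length() equals 0."""
    return list_length(lst) == 0
```

Keeping an explicit length counter makes `list_length` O(1) instead of a full traversal, which is the usual trade-off for this style of API.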
270,835
| 20,611,642,331
|
IssuesEvent
|
2022-03-07 09:14:22
|
project-oak/oak
|
https://api.github.com/repos/project-oak/oak
|
opened
|
Update `development.md` with information on how to add a new example
|
documentation
|
The current documentation explains how to run an existing example, but does not explain how the various components of the example are used (e.g. the `module` directory, `example.toml` and `config.toml`) or how to add a new example.
|
1.0
|
Update `development.md` with information on how to add a new example - The current documentation explains how to run an existing example, but does not explain how the various components of the example are used (e.g. the `module` directory, `example.toml` and `config.toml`) or how to add a new example.
|
non_process
|
update development md with information on how to add a new example the current documentation explains how to run an existing example but does not explain how the various components of the example is used e g the module directory example toml and config toml or how to add a new example
| 0
|
9,860
| 12,866,201,137
|
IssuesEvent
|
2020-07-10 02:56:25
|
PHPSocialNetwork/phpfastcache
|
https://api.github.com/repos/PHPSocialNetwork/phpfastcache
|
closed
|
Redis With ElastiCache
|
8.0 [-_-] In Process
|
**Configuration (optional)**
- **PhpFastCache version:** 3.0.0
- **PhpFastCache API version:** 8.0.1
- **PHP version:** 7.1.9
- **Operating system:** Mac with MAMP / Also executed this same on Ubuntu
**My question**
I was trying to integrate Redis with ElastiCache, and I am getting this error.
```php
Fatal error: Uncaught Phpfastcache\Exceptions\PhpfastcacheDriverConnectException: Redis failed to connect with the following error message: "Operation timed out" line 99 in
```
** More info **
If I try to connect phpfastcache to a local Redis server, it works without any error; however, if I try the same thing against an AWS ElastiCache Redis server, I get this kind of error.
I am trying to connect to the remote Redis server using the config below; maybe I am connecting the wrong way? I have replaced some characters of my host name to redact it, but the host name format is exactly as shown below.
```php
$InstanceCache = CacheManager::getInstance('redis', new Config([
'host' => 'redis-random.asdad23.sa.001.aps1.cache.amazonaws.com',
'port' => 6379
]));
```
Please let me know what further information you need to debug this.
|
1.0
|
Redis With ElastiCache - **Configuration (optional)**
- **PhpFastCache version:** 3.0.0
- **PhpFastCache API version:** 8.0.1
- **PHP version:** 7.1.9
- **Operating system:** Mac with MAMP / Also executed this same on Ubuntu
**My question**
I was trying to integrate Redis with ElastiCache, and I am getting this error.
```php
Fatal error: Uncaught Phpfastcache\Exceptions\PhpfastcacheDriverConnectException: Redis failed to connect with the following error message: "Operation timed out" line 99 in
```
** More info **
If I try to connect phpfastcache to a local Redis server, it works without any error; however, if I try the same thing against an AWS ElastiCache Redis server, I get this kind of error.
I am trying to connect to the remote Redis server using the config below; maybe I am connecting the wrong way? I have replaced some characters of my host name to redact it, but the host name format is exactly as shown below.
```php
$InstanceCache = CacheManager::getInstance('redis', new Config([
'host' => 'redis-random.asdad23.sa.001.aps1.cache.amazonaws.com',
'port' => 6379
]));
```
Please let me know, What kind of more informations you need to debug this?
|
process
|
redis with elasticache configuration optional phpfastcache version phpfastcache api version php version operating system mac with mamp also executed this same on ubuntu my question i was trying to integrate redis with elasticache i am getting this error php fatal error uncaught phpfastcache exceptions phpfastcachedriverconnectexception redis failed to connect with the following error message operation timed out line in more info if i try to connect it phpfastcache local redis server it starts working without any error however if i try this same using an aws elasticache redis server i get this kind of error i am trying to connect with remote redis server using below config i don t know maybe be i am connecting in wrong way i just replaced some chars of my host to redact it host name format is absolutely same like below config php instancecache cachemanager getinstance redis new config host redis random sa cache amazonaws com port please let me know what kind of more informations you need to debug this
| 1
|
13,166
| 15,590,844,071
|
IssuesEvent
|
2021-03-18 09:50:45
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
GO:0071174 missing links to subtypes?
|
PomBase cell cycle and DNA processes missing parentage
|
GO:0071174 ! mitotic spindle checkpoint signaling has the `gocheck_do_not_manually_annotate` flag, but it only has one subclass (and a regulation link).
Is it correctly flagged? If so, are some links to other terms missing? The comment suggests the latter: "... it should always be possible to specify the type of spindle checkpoint (assembly, orientation or Dma1-dependent)." (But also note that "Dma1-dependent checkpoint" is a narrow synonym for GO:0007094, not a separate term
|
1.0
|
GO:0071174 missing links to subtypes? - GO:0071174 ! mitotic spindle checkpoint signaling has the `gocheck_do_not_manually_annotate` flag, but it only has one subclass (and a regulation link).
Is it correctly flagged? If so, are some links to other terms missing? The comment suggests the latter: "... it should always be possible to specify the type of spindle checkpoint (assembly, orientation or Dma1-dependent)." (But also note that "Dma1-dependent checkpoint" is a narrow synonym for GO:0007094, not a separate term
|
process
|
go missing links to subtypes go mitotic spindle checkpoint signaling has the gocheck do not manually annotate flag but it only has one subclass and a regulation link is it correctly flagged if so are some links to other terms missing the comment suggests the latter it should always be possible to specify the type of spindle checkpoint assembly orientation or dependent but also note that dependent checkpoint is a narrow synonym for go not a separate term
| 1
|
49,851
| 12,420,498,380
|
IssuesEvent
|
2020-05-23 12:20:33
|
angband/angband
|
https://api.github.com/repos/angband/angband
|
closed
|
Angband breaks when built with gcc-10.1 with chests / chest_traps involved.
|
C: Build T: bug
|
This is a followup of bug #4416 ;
This time, it is related to chests and chest_traps. See log:
```
/usr/bin/ld: cmd-core.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: cmd-core.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: effects.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: effects.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: init.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: init.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: obj-chest.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: obj-chest.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: obj-desc.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: obj-desc.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: obj-make.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: obj-make.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: player-util.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: player-util.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: project-obj.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: project-obj.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: ui-context.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: ui-context.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
collect2: error: ld returned 1 exit status
```
|
1.0
|
Angband breaks when built with gcc-10.1 with chests / chest_traps involved. - This is a followup of bug #4416 ;
This time, it is related to chests and chest_traps. See log:
```
/usr/bin/ld: cmd-core.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: cmd-core.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: effects.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: effects.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: init.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: init.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: obj-chest.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: obj-chest.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: obj-desc.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: obj-desc.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: obj-make.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: obj-make.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: player-util.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: player-util.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: project-obj.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: project-obj.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
/usr/bin/ld: ui-context.o:(.bss+0x0): multiple definition of `chest_traps'; cmd-cave.o:(.bss+0x0): first defined here
/usr/bin/ld: ui-context.o:(.bss+0x8): multiple definition of `chests'; cmd-cave.o:(.bss+0x8): first defined here
collect2: error: ld returned 1 exit status
```
|
non_process
|
angband breaks when built with gcc with chests chest traps involved this is a followup of bug this time it is related to chests and chest traps see log usr bin ld cmd core o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld cmd core o bss multiple definition of chests cmd cave o bss first defined here usr bin ld effects o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld effects o bss multiple definition of chests cmd cave o bss first defined here usr bin ld init o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld init o bss multiple definition of chests cmd cave o bss first defined here usr bin ld obj chest o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld obj chest o bss multiple definition of chests cmd cave o bss first defined here usr bin ld obj desc o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld obj desc o bss multiple definition of chests cmd cave o bss first defined here usr bin ld obj make o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld obj make o bss multiple definition of chests cmd cave o bss first defined here usr bin ld player util o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld player util o bss multiple definition of chests cmd cave o bss first defined here usr bin ld project obj o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld project obj o bss multiple definition of chests cmd cave o bss first defined here usr bin ld ui context o bss multiple definition of chest traps cmd cave o bss first defined here usr bin ld ui context o bss multiple definition of chests cmd cave o bss first defined here error ld returned exit status
| 0
|
726,295
| 24,994,048,102
|
IssuesEvent
|
2022-11-02 21:39:22
|
TampaDevs/tampadevs
|
https://api.github.com/repos/TampaDevs/tampadevs
|
closed
|
FEATURE - /about page
|
high priority
|
this one is TBA, not sure how this would look
The context is sometimes you want to show a gallery of images quickly to people interested in TampaDevs. the video does work but its a 35 second video.
Some people need a static page with all the information on there
It's also useful if you want to share this to journalist or major news publications in Tampa when we are approaching and having a large event (Tech Event, Hackathon) to get more publicity in the area.
Basically, we want to get "hype interest", e.g. a potential press candidate wants to see how legitimate a group is based on past events, history, etc. **It should tell a story**
1. What is Tampa Devs
2. How was it founded
3. The people behind it
4. Which big companies and names are sponsoring it (carousel)
5. Galleria of images (carousel)
6. CTA (call to action)-> Here is a list of downloadable media and assets (marketing portfolio) for press coverage, etc. Contact information etc
this `/about` page might be used in the following:
1. Sent out to prospective people who want to help organize TampaDevs
2. Sent out alongside `/sponsor` for potential sponsors
3. Sent out alongside `/speakers` for potential speakers
**It's more of a generic "here's why you should get interested page if you want to show outside support" that isn't specific to who it's targeting**
----
For context on the other feature pages, this is how those links are going to be used For the other feature pages
- `/sponsor CTA`-> this is a hidden link. **It will have sponsors that we currently use**
- `/speaker CTA` -> this is a hidden link. **It will have speakers that have spoken before**
- ~`/groups` -> this is a public link, it might merge with `/about`.~
- ~It might even be hidden on the page and just have a direct link from `/about`~ merge with `/sponsors`
- `/about` -> this is a public link - aka this ticket. Might be renamed to `/press`
We have a sort of semi working /about page but it needs to be fleshed out
--
Some additional ideas with this page, it could be dynamic in nature and pull data/scraped assets from meetup.com
**The `about` page will be links to all the hidden links on the site as well**
It will be the first point of contact for a generic CTA into it's respective CTA depending on who and how different groups want to get involved, etc
|
1.0
|
FEATURE - /about page - this one is TBA, not sure how this would look
The context is sometimes you want to show a gallery of images quickly to people interested in TampaDevs. the video does work but its a 35 second video.
Some people need a static page with all the information on there
It's also useful if you want to share this to journalist or major news publications in Tampa when we are approaching and having a large event (Tech Event, Hackathon) to get more publicity in the area.
Basically, we want to get "hype interest", e.g. a potential press candidate wants to see how legitimate a group is based on past events, history, etc. **It should tell a story**
1. What is Tampa Devs
2. How was it founded
3. The people behind it
4. Which big companies and names are sponsoring it (carousel)
5. Galleria of images (carousel)
6. CTA (call to action)-> Here is a list of downloadable media and assets (marketing portfolio) for press coverage, etc. Contact information etc
this `/about` page might be used in the following:
1. Sent out to prospective people who want to help organize TampaDevs
2. Sent out alongside `/sponsor` for potential sponsors
3. Sent out alongside `/speakers` for potential speakers
**It's more of a generic "here's why you should get interested page if you want to show outside support" that isn't specific to who it's targeting**
----
For context on the other feature pages, this is how those links are going to be used For the other feature pages
- `/sponsor CTA`-> this is a hidden link. **It will have sponsors that we currently use**
- `/speaker CTA` -> this is a hidden link. **It will have speakers that have spoken before**
- ~`/groups` -> this is a public link, it might merge with `/about`.~
- ~It might even be hidden on the page and just have a direct link from `/about`~ merge with `/sponsors`
- `/about` -> this is a public link - aka this ticket. Might be renamed to `/press`
We have a sort of semi working /about page but it needs to be fleshed out
--
Some additional ideas with this page, it could be dynamic in nature and pull data/scraped assets from meetup.com
**The `about` page will be links to all the hidden links on the site as well**
It will be the first point of contact for a generic CTA into it's respective CTA depending on who and how different groups want to get involved, etc
|
non_process
|
feature about page this one is tba not sure how this would look the context is sometimes you want to show a gallery of images quickly to people interested in tampadevs the video does work but its a second video some people need a static page with all the information on there it s also useful if you want to share this to journalist or major news publications in tampa when we are approaching and having a large event tech event hackathon to get more publicity in the area basically we want to get hype interest e g a potential press candidate wants to see how legitimate a group is based on past events history etc it should tell a story what is tampa devs how was it founded the people behind it which big companies and names are sponsoring it carousel galleria of images carousel cta call to action here is a list of downloadable media and assets marketing portfolio for press coverage etc contact information etc this about page might be used in the following sent out to prospective people who want to help organize tampadevs sent out alongside sponsor for potential sponsors sent out alongside speakers for potential speakers it s more of a generic here s why you should get interested page if you want to show outside support that isn t specific to who it s targeting for context on the other feature pages this is how those links are going to be used for the other feature pages sponsor cta this is a hidden link it will have sponsors that we currently use speaker cta this is a hidden link it will have speakers that have spoken before groups this is a public link it might merge with about it might even be hidden on the page and just have a direct link from about merge with sponsors about this is a public link aka this ticket might be renamed to press we have a sort of semi working about page but it needs to be fleshed out some additional ideas with this page it could be dynamic in nature and pull data scraped assets from meetup com the about page will be links to all the hidden links on the site as well it will be the first point of contact for a generic cta into it s respective cta depending on who and how different groups want to get involved etc
| 0
|
366
| 2,801,268,143
|
IssuesEvent
|
2015-05-13 14:54:20
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
closed
|
Add interface for TaRB algorithm
|
preprocessor
|
In ```estimation``` we would need the following options:
1. ```use_TaRB``` which translates to ```options_.TaRB.use_TaRB=1```;
2. ```TaRB_mode_compute=integer``` resembling ```mode_compute``` and translating into ```options_.TaRB.mode_compute=4```
3. ```TaRB_new_block_probability=double``` translating into ```options_.TaRB.new_block_probability``` and checking whether it is between 0 and 1.
4. ```TaRB_optim``` resembling ```optim``` and translating into ```options_.TaRB.optim_opt```
|
1.0
|
Add interface for TaRB algorithm - In ```estimation``` we would need the following options:
1. ```use_TaRB``` which translates to ```options_.TaRB.use_TaRB=1```;
2. ```TaRB_mode_compute=integer``` resembling ```mode_compute``` and translating into ```options_.TaRB.mode_compute=4```
3. ```TaRB_new_block_probability=double``` translating into ```options_.TaRB.new_block_probability``` and checking whether it is between 0 and 1.
4. ```TaRB_optim``` resembling ```optim``` and translating into ```options_.TaRB.optim_opt```
|
process
|
add interface for tarb algorithm in estimation we would need the following options use tarb which translates to options tarb use tarb tarb mode compute integer resembling mode compute and translating into options tarb mode compute tarb new block probability double translating into options tarb new block probability and checking whether it is between and tarb optim resembling optim and translating into options tarb optim opt
| 1
|
22,251
| 30,801,984,179
|
IssuesEvent
|
2023-08-01 02:40:26
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
roblox-pyc 2.26.113 has 3 GuardDog issues
|
guarddog silent-process-execution
|
https://pypi.org/project/roblox-pyc
https://inspector.pypi.io/project/roblox-pyc
```{
"dependency": "roblox-pyc",
"version": "2.26.113",
"result": {
"issues": 3,
"errors": {},
"results": {
"silent-process-execution": [
{
"location": "roblox-pyc-2.26.113/robloxpyc/installationmanager.py:19",
"code": " subprocess.call([\"luarocks\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "roblox-pyc-2.26.113/robloxpyc/installationmanager.py:26",
"code": " subprocess.call([\"moonc\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "roblox-pyc-2.26.113/robloxpyc/installationmanager.py:79",
"code": " subprocess.call([\"npm\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp_haflz__/roblox-pyc"
}
}```
|
1.0
|
roblox-pyc 2.26.113 has 3 GuardDog issues - https://pypi.org/project/roblox-pyc
https://inspector.pypi.io/project/roblox-pyc
```{
"dependency": "roblox-pyc",
"version": "2.26.113",
"result": {
"issues": 3,
"errors": {},
"results": {
"silent-process-execution": [
{
"location": "roblox-pyc-2.26.113/robloxpyc/installationmanager.py:19",
"code": " subprocess.call([\"luarocks\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "roblox-pyc-2.26.113/robloxpyc/installationmanager.py:26",
"code": " subprocess.call([\"moonc\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
},
{
"location": "roblox-pyc-2.26.113/robloxpyc/installationmanager.py:79",
"code": " subprocess.call([\"npm\", \"--version\"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, stdin=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmp_haflz__/roblox-pyc"
}
}```
|
process
|
roblox pyc has guarddog issues dependency roblox pyc version result issues errors results silent process execution location roblox pyc robloxpyc installationmanager py code subprocess call stdout subprocess devnull stderr subprocess devnull stdin subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null location roblox pyc robloxpyc installationmanager py code subprocess call stdout subprocess devnull stderr subprocess devnull stdin subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null location roblox pyc robloxpyc installationmanager py code subprocess call stdout subprocess devnull stderr subprocess devnull stdin subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmp haflz roblox pyc
| 1
|
78,338
| 3,509,578,885
|
IssuesEvent
|
2016-01-08 23:35:09
|
OregonCore/OregonCore
|
https://api.github.com/repos/OregonCore/OregonCore
|
closed
|
Wrong spawn (Eversong Woods) (BB #1113)
|
Category: Database migrated Priority: Low Type: Bug
|
This issue was migrated from bitbucket.
**Original Reporter:** oregon
**Original Date:** 18.08.2015 14:36:43 GMT+0000
**Original Priority:** minor
**Original Type:** bug
**Original State:** closed
**Direct Link:** https://bitbucket.org/oregon/oregoncore/issues/1113
<hr>
There is a wrong spawn in Eversong Woods (MOB_GHARZUL)
|
1.0
|
Wrong spawn (Eversong Woods) (BB #1113) - This issue was migrated from bitbucket.
**Original Reporter:** oregon
**Original Date:** 18.08.2015 14:36:43 GMT+0000
**Original Priority:** minor
**Original Type:** bug
**Original State:** closed
**Direct Link:** https://bitbucket.org/oregon/oregoncore/issues/1113
<hr>
There is a wrong spawn in Eversong Woods (MOB_GHARZUL)
|
non_process
|
wrong spawn eversong woods bb this issue was migrated from bitbucket original reporter oregon original date gmt original priority minor original type bug original state closed direct link there is a wrong spawn in eversong woods mob gharzul
| 0
|
56,116
| 3,078,237,180
|
IssuesEvent
|
2015-08-21 08:52:47
|
pavel-pimenov/flylinkdc-r5xx
|
https://api.github.com/repos/pavel-pimenov/flylinkdc-r5xx
|
closed
|
The "delete all" button works incorrectly
|
bug Component-Logic Component-UI imported Priority-Medium Usability
|
_From [Tirael...@gmail.com](https://code.google.com/u/108935377450235604965/) on March 23, 2011 19:36:23_
When clicking the "delete all" item in the list of downloading files, the deletion confirmation dialog sometimes pops up several times (2-4 times); as far as I understand, this bug reproduces on files with a large number of sources, because sometimes everything is deleted on the first try.
**Attachment:** [Снимок.png](http://code.google.com/p/flylinkdc/issues/detail?id=413)
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=413_
|
1.0
|
The "delete all" button works incorrectly - _From [Tirael...@gmail.com](https://code.google.com/u/108935377450235604965/) on March 23, 2011 19:36:23_
When clicking the "delete all" item in the list of downloading files, the deletion confirmation dialog sometimes pops up several times (2-4 times); as far as I understand, this bug reproduces on files with a large number of sources, because sometimes everything is deleted on the first try.
**Attachment:** [Снимок.png](http://code.google.com/p/flylinkdc/issues/detail?id=413)
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=413_
|
non_process
|
the delete all button works incorrectly from on march when clicking the delete all item in the list of downloading files the deletion confirmation dialog sometimes pops up several times times as far as i understand this bug reproduces on files with a large number of sources because sometimes everything is deleted on the first try attachment original issue
| 0
|
18,935
| 24,891,158,655
|
IssuesEvent
|
2022-10-28 12:09:42
|
zotero/zotero
|
https://api.github.com/repos/zotero/zotero
|
closed
|
Truncate long citation text in Quick Format dialog
|
Papercuts Word Processor Integration
|
For items where the title is shown rather than the author, and I guess if there's a long prefix/suffix.
Generally annoying, can push the cursor off the screen, and maybe causing this:
https://forums.zotero.org/discussion/94963/bug-citation-edit-window-outside-screen
We should probably just have a max length per citation after which they get truncated with an ellipsis. We'd still want to show the page number after the ellipsis, so can't just put a limit on the bubble itself.
|
1.0
|
Truncate long citation text in Quick Format dialog - For items where the title is shown rather than the author, and I guess if there's a long prefix/suffix.
Generally annoying, can push the cursor off the screen, and maybe causing this:
https://forums.zotero.org/discussion/94963/bug-citation-edit-window-outside-screen
We should probably just have a max length per citation after which they get truncated with an ellipsis. We'd still want to show the page number after the ellipsis, so can't just put a limit on the bubble itself.
|
process
|
truncate long citation text in quick format dialog for items where the title is shown rather than the author and i guess if there s a long prefix suffix generally annoying can push the cursor off the screen and maybe causing this we should probably just have a max length per citation after which they get truncated with an ellipsis we d still want to show the page number after the ellipsis so can t just put a limit on the bubble itself
| 1
|
118,282
| 17,577,240,385
|
IssuesEvent
|
2021-08-15 21:06:18
|
ghc-dev/Joseph-Haney
|
https://api.github.com/repos/ghc-dev/Joseph-Haney
|
closed
|
CVE-2020-1753 (Medium) detected in ansible-2.9.9.tar.gz - autoclosed
|
security vulnerability
|
## CVE-2020-1753 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ansible-2.9.9.tar.gz</b></p></summary>
<p>Radically simple IT automation</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz">https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz</a></p>
<p>Path to dependency file: Joseph-Haney/requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **ansible-2.9.9.tar.gz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Joseph-Haney/commit/b9ebf8b939d43edbfd02d831f1177e076b88d0d8">b9ebf8b939d43edbfd02d831f1177e076b88d0d8</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A security flaw was found in Ansible Engine, all Ansible 2.7.x versions prior to 2.7.17, all Ansible 2.8.x versions prior to 2.8.11 and all Ansible 2.9.x versions prior to 2.9.7, when managing kubernetes using the k8s module. Sensitive parameters such as passwords and tokens are passed to kubectl from the command line, not using an environment variable or an input configuration file. This will disclose passwords and tokens from process list and no_log directive from debug module would not have any effect making these secrets being disclosed on stdout and log files.
<p>Publish Date: 2020-03-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1753>CVE-2020-1753</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://security.gentoo.org/glsa/202006-11">https://security.gentoo.org/glsa/202006-11</a></p>
<p>Fix Resolution: All Ansible users should upgrade to the latest version # emerge --sync
# emerge --ask --oneshot --verbose >=app-admin/ansible-2.9.7 >= </p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"ansible","packageVersion":"2.9.9","packageFilePaths":["/requirements.txt"],"isTransitiveDependency":false,"dependencyTree":"ansible:2.9.9","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-1753","vulnerabilityDetails":"A security flaw was found in Ansible Engine, all Ansible 2.7.x versions prior to 2.7.17, all Ansible 2.8.x versions prior to 2.8.11 and all Ansible 2.9.x versions prior to 2.9.7, when managing kubernetes using the k8s module. Sensitive parameters such as passwords and tokens are passed to kubectl from the command line, not using an environment variable or an input configuration file. This will disclose passwords and tokens from process list and no_log directive from debug module would not have any effect making these secrets being disclosed on stdout and log files.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1753","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-1753 (Medium) detected in ansible-2.9.9.tar.gz - autoclosed - ## CVE-2020-1753 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ansible-2.9.9.tar.gz</b></p></summary>
<p>Radically simple IT automation</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz">https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz</a></p>
<p>Path to dependency file: Joseph-Haney/requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **ansible-2.9.9.tar.gz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Joseph-Haney/commit/b9ebf8b939d43edbfd02d831f1177e076b88d0d8">b9ebf8b939d43edbfd02d831f1177e076b88d0d8</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A security flaw was found in Ansible Engine, all Ansible 2.7.x versions prior to 2.7.17, all Ansible 2.8.x versions prior to 2.8.11 and all Ansible 2.9.x versions prior to 2.9.7, when managing kubernetes using the k8s module. Sensitive parameters such as passwords and tokens are passed to kubectl from the command line, not using an environment variable or an input configuration file. This will disclose passwords and tokens from process list and no_log directive from debug module would not have any effect making these secrets being disclosed on stdout and log files.
<p>Publish Date: 2020-03-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1753>CVE-2020-1753</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://security.gentoo.org/glsa/202006-11">https://security.gentoo.org/glsa/202006-11</a></p>
<p>Fix Resolution: All Ansible users should upgrade to the latest version # emerge --sync
# emerge --ask --oneshot --verbose >=app-admin/ansible-2.9.7 >= </p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"ansible","packageVersion":"2.9.9","packageFilePaths":["/requirements.txt"],"isTransitiveDependency":false,"dependencyTree":"ansible:2.9.9","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-1753","vulnerabilityDetails":"A security flaw was found in Ansible Engine, all Ansible 2.7.x versions prior to 2.7.17, all Ansible 2.8.x versions prior to 2.8.11 and all Ansible 2.9.x versions prior to 2.9.7, when managing kubernetes using the k8s module. Sensitive parameters such as passwords and tokens are passed to kubectl from the command line, not using an environment variable or an input configuration file. This will disclose passwords and tokens from process list and no_log directive from debug module would not have any effect making these secrets being disclosed on stdout and log files.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-1753","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in ansible tar gz autoclosed cve medium severity vulnerability vulnerable library ansible tar gz radically simple it automation library home page a href path to dependency file joseph haney requirements txt path to vulnerable library requirements txt dependency hierarchy x ansible tar gz vulnerable library found in head commit a href found in base branch master vulnerability details a security flaw was found in ansible engine all ansible x versions prior to all ansible x versions prior to and all ansible x versions prior to when managing kubernetes using the module sensitive parameters such as passwords and tokens are passed to kubectl from the command line not using an environment variable or an input configuration file this will disclose passwords and tokens from process list and no log directive from debug module would not have any effect making these secrets being disclosed on stdout and log files publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href fix resolution all ansible users should upgrade to the latest version emerge sync emerge ask oneshot verbose app admin ansible isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree ansible isminimumfixversionavailable false basebranches vulnerabilityidentifier cve vulnerabilitydetails a security flaw was found in ansible engine all ansible x versions prior to all ansible x versions prior to and all ansible x versions prior to when managing kubernetes using the module sensitive parameters such as passwords and tokens are passed to kubectl from the command line not using an environment variable or an input 
configuration file this will disclose passwords and tokens from process list and no log directive from debug module would not have any effect making these secrets being disclosed on stdout and log files vulnerabilityurl
| 0
|
9,426
| 12,417,970,674
|
IssuesEvent
|
2020-05-22 22:15:34
|
jyn514/rcc
|
https://api.github.com/repos/jyn514/rcc
|
closed
|
Separate macro replacement and macro parsing
|
enhancement preprocessor
|
Currently both replacement and parsing for the preprocessor are done at the same time, so you can't provide the preprocessor with a list of tokens and have those be replaced; instead you have to provide the raw string of the source code.
|
1.0
|
Separate macro replacement and macro parsing - Currently both replacement and parsing for the preprocessor are done at the same time, so you can't provide the preprocessor with a list of tokens and have those be replaced; instead you have to provide the raw string of the source code.
|
process
|
separate macro replacement and macro parsing currently both replacement and parsing for the preprocessor are done at the same time so you can t provide the preprocessor with a list of tokens and have those be replaced instead you have to provide the raw string of the source code
| 1
|
414,224
| 27,981,725,284
|
IssuesEvent
|
2023-03-26 08:18:12
|
emilengler/funion
|
https://api.github.com/repos/emilengler/funion
|
closed
|
Implement a consistent semantic for parsing, decoding, and fetching
|
documentation enhancement
|
Currently, there is no good terminology regarding the terms parsing, fetching, and decoding.
My suggestion would be, to introduce the following semantic:
- `decode`: Parse a binary in its entirety, that is, without returning any sort of remaining data
- `fetch`: Parse the first element/set of elements found within a binary, returning the remaining data
- `parse`: Abolish the term
|
1.0
|
Implement a consistent semantic for parsing, decoding, and fetching - Currently, there is no good terminology regarding the terms parsing, fetching, and decoding.
My suggestion would be, to introduce the following semantic:
- `decode`: Parse a binary in its entirety, that is, without returning any sort of remaining data
- `fetch`: Parse the first element/set of elements found within a binary, returning the remaining data
- `parse`: Abolish the term
|
non_process
|
implement a consistent semantic for parsing decoding and fetching currently there is no good terminology regarding the terms parsing fetching and decoding my suggestion would be to introduce the following semantic decode parse a binary in its entirety that is without returning any sort of remaining data fetch parse the first element set of elements found within a binary returning the remaining data parse abolish the term
| 0
|
4,900
| 7,780,640,073
|
IssuesEvent
|
2018-06-05 20:42:43
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
Improve definition 'behavior'
|
organism-level process
|
Following up on #12357
The definition of behavior should be improved.
Current = The internally coordinated responses (actions or inactions) of whole living organisms (individuals or groups) to internal or external stimuli.
Suggested = ??
|
1.0
|
Improve definition 'behavior' - Following up on #12357
The definition of behavior should be improved.
Current = The internally coordinated responses (actions or inactions) of whole living organisms (individuals or groups) to internal or external stimuli.
Suggested = ??
|
process
|
improve definition behavior following up on the definition of behavior should be improved current the internally coordinated responses actions or inactions of whole living organisms individuals or groups to internal or external stimuli suggested
| 1
|
138,229
| 12,810,437,069
|
IssuesEvent
|
2020-07-03 18:39:12
|
dsccommunity/SqlServerDsc
|
https://api.github.com/repos/dsccommunity/SqlServerDsc
|
closed
|
SqlDatabaseObjectPermission: Remove duplicate documentation for embedded instance
|
documentation in progress
|
Since the task that publish Wiki content was updated to correctly handle embedded instances the duplicate documentation should be removed from the resource README.md.
|
1.0
|
SqlDatabaseObjectPermission: Remove duplicate documentation for embedded instance - Since the task that publish Wiki content was updated to correctly handle embedded instances the duplicate documentation should be removed from the resource README.md.
|
non_process
|
sqldatabaseobjectpermission remove duplicate documentation for embedded instance since the task that publish wiki content was updated to correctly handle embedded instances the duplicate documentation should be removed from the resource readme md
| 0
|
9,153
| 12,214,642,214
|
IssuesEvent
|
2020-05-01 10:34:04
|
hashicorp/packer
|
https://api.github.com/repos/hashicorp/packer
|
reopened
|
docker-push support for multiple tags
|
enhancement post-processor/docker
|
I've reviewed the documentation and poured through the issues here, and I just can't seem to make `docker-push` behave in a way that makes sense to me. This might legitimately be a bug, but figured since the `docker-push` post-processor hasn't been touched in six years, it's considered "good enough" as-is and any additional functionality would be considered an enhancement.
#### Feature Description
I want to be able to have a `docker-tag` definition with multiple tags defined, and have all the tags pushed by the `docker-push` post-processor. Here's a naive example:
```yaml
"post-processors":
[
[
{
"only": ["bionic"],
"type": "docker-tag",
"repository": "terradatum/systemd",
"tag": "bionic,ubuntu-bionic,ubuntu-18.04"
},
{
"type": "docker-push",
"login": true,
"login_username": "{{user `docker_username`}}",
"login_password": "{{user `docker_password`}}"
}
]
]
```
You can see that with the above, all three tags are applied, but only the last is pushed:
```shell script
2020-04-28T18:31:33-07:00: ==> bionic: Committing the container
2020-04-28T18:31:35-07:00: bionic: Image ID: sha256:e64e33a222142e6a51dc62788232e4d8e81c70c3347373be36bbfa638e2c1183
2020-04-28T18:31:35-07:00: ==> bionic: Killing the container: f143f41a61c2dabd87fa825d3c009fc8a1d121951a720a68cea8029bf2ad2bc6
2020-04-28T18:31:40-07:00: ==> bionic: Running post-processor: docker-tag
2020-04-28T18:31:40-07:00: bionic (docker-tag): Tagging image: sha256:e64e33a222142e6a51dc62788232e4d8e81c70c3347373be36bbfa638e2c1183
2020-04-28T18:31:40-07:00: bionic (docker-tag): Repository: terradatum/systemd:bionic
2020-04-28T18:31:40-07:00: ==> bionic: Running post-processor: docker-tag
2020-04-28T18:31:40-07:00: bionic (docker-tag): Tagging image: terradatum/systemd:bionic
2020-04-28T18:31:40-07:00: bionic (docker-tag): Repository: terradatum/systemd:ubuntu-bionic
2020-04-28T18:31:40-07:00: ==> bionic: Running post-processor: docker-tag
2020-04-28T18:31:40-07:00: bionic (docker-tag): Tagging image: terradatum/systemd:ubuntu-bionic
2020-04-28T18:31:40-07:00: bionic (docker-tag): Repository: terradatum/systemd:ubuntu-18.04
2020-04-28T18:31:40-07:00: ==> bionic: Running post-processor: docker-push
2020-04-28T18:31:40-07:00: bionic (docker-push): Logging in...
2020-04-28T18:31:41-07:00: bionic (docker-push): WARNING! Your password will be stored unencrypted in /home/rbellamy/.docker/config.json.
2020-04-28T18:31:41-07:00: bionic (docker-push): Login Succeeded
2020-04-28T18:31:41-07:00: bionic (docker-push): Configure a credential helper to remove this warning. See
2020-04-28T18:31:41-07:00: bionic (docker-push): https://docs.docker.com/engine/reference/commandline/login/#credentials-store
2020-04-28T18:31:41-07:00: bionic (docker-push): Pushing: terradatum/systemd:ubuntu-18.04
2020-04-28T18:31:41-07:00: bionic (docker-push): The push refers to repository [docker.io/terradatum/systemd]
2020-04-28T18:31:42-07:00: bionic (docker-push): d3536ca2000e: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): 28ba7458d04b: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): 838a37a24627: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): a6ebef4a95c3: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): b7f7d2967507: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): 838a37a24627: Layer already exists
2020-04-28T18:31:42-07:00: bionic (docker-push): 28ba7458d04b: Layer already exists
2020-04-28T18:31:42-07:00: bionic (docker-push): b7f7d2967507: Layer already exists
2020-04-28T18:31:43-07:00: bionic (docker-push): a6ebef4a95c3: Layer already exists
2020-04-28T18:32:07-07:00: bionic (docker-push): d3536ca2000e: Pushed
2020-04-28T18:32:12-07:00: bionic (docker-push): ubuntu-18.04: digest: sha256:5935f2fc152e81d849ce02b4e32feda2677e7a6aa58589dfcaa6d687dbe89fc8 size: 1364
2020-04-28T18:32:12-07:00: bionic (docker-push): Logging out...
2020-04-28T18:32:12-07:00: bionic (docker-push): Removing login credentials for https://index.docker.io/v1/
2020-04-28T18:32:12-07:00: Build 'bionic' finished.
```
Instead, I have to do something like this to push multiple tags:
```yaml
"post-processors": [
[
{
"type": "docker-tag",
"repository": "terradatum/systemd",
"tag": "bionic",
"only": [
"bionic"
]
},
{
"type": "docker-push",
"login": true,
"login_username": "{{user `docker_username`}}",
"login_password": "{{user `docker_password`}}",
"only": [
"bionic"
]
}
],
[
{
"type": "docker-tag",
"repository": "terradatum/systemd",
"tag": "ubuntu-bionic",
"only": [
"bionic"
]
},
{
"type": "docker-push",
"login": true,
"login_username": "{{user `docker_username`}}",
"login_password": "{{user `docker_password`}}",
"only": [
"bionic"
]
}
],
[
{
"type": "docker-tag",
"repository": "terradatum/systemd",
"tag": "ubuntu-18.04",
"only": [
"bionic"
]
},
{
"type": "docker-push",
"login": true,
"login_username": "{{user `docker_username`}}",
"login_password": "{{user `docker_password`}}",
"only": [
"bionic"
]
}
]
]
```
#### Use Case(s)
Docker supports multiple tags, so does the `docker-tag` post-processor. It seems reasonable to me that my first example would "just work":tm:.
|
1.0
|
docker-push support for multiple tags - I've reviewed the documentation and poured through the issues here, and I just can't seem to make `docker-push` behave in a way that makes sense to me. This might legitimately be a bug, but figured since the `docker-push` post-processor hasn't been touched in six years, it's considered "good enough" as-is and any additional functionality would be considered an enhancement.
#### Feature Description
I want to be able to have a `docker-tag` definition with multiple tags defined, and have all the tags pushed by the `docker-push` post-processor. Here's a naive example:
```yaml
"post-processors":
[
[
{
"only": ["bionic"],
"type": "docker-tag",
"repository": "terradatum/systemd",
"tag": "bionic,ubuntu-bionic,ubuntu-18.04"
},
{
"type": "docker-push",
"login": true,
"login_username": "{{user `docker_username`}}",
"login_password": "{{user `docker_password`}}"
}
]
]
```
You can see that with the above, all three tags are applied, but only the last is pushed:
```shell script
2020-04-28T18:31:33-07:00: ==> bionic: Committing the container
2020-04-28T18:31:35-07:00: bionic: Image ID: sha256:e64e33a222142e6a51dc62788232e4d8e81c70c3347373be36bbfa638e2c1183
2020-04-28T18:31:35-07:00: ==> bionic: Killing the container: f143f41a61c2dabd87fa825d3c009fc8a1d121951a720a68cea8029bf2ad2bc6
2020-04-28T18:31:40-07:00: ==> bionic: Running post-processor: docker-tag
2020-04-28T18:31:40-07:00: bionic (docker-tag): Tagging image: sha256:e64e33a222142e6a51dc62788232e4d8e81c70c3347373be36bbfa638e2c1183
2020-04-28T18:31:40-07:00: bionic (docker-tag): Repository: terradatum/systemd:bionic
2020-04-28T18:31:40-07:00: ==> bionic: Running post-processor: docker-tag
2020-04-28T18:31:40-07:00: bionic (docker-tag): Tagging image: terradatum/systemd:bionic
2020-04-28T18:31:40-07:00: bionic (docker-tag): Repository: terradatum/systemd:ubuntu-bionic
2020-04-28T18:31:40-07:00: ==> bionic: Running post-processor: docker-tag
2020-04-28T18:31:40-07:00: bionic (docker-tag): Tagging image: terradatum/systemd:ubuntu-bionic
2020-04-28T18:31:40-07:00: bionic (docker-tag): Repository: terradatum/systemd:ubuntu-18.04
2020-04-28T18:31:40-07:00: ==> bionic: Running post-processor: docker-push
2020-04-28T18:31:40-07:00: bionic (docker-push): Logging in...
2020-04-28T18:31:41-07:00: bionic (docker-push): WARNING! Your password will be stored unencrypted in /home/rbellamy/.docker/config.json.
2020-04-28T18:31:41-07:00: bionic (docker-push): Login Succeeded
2020-04-28T18:31:41-07:00: bionic (docker-push): Configure a credential helper to remove this warning. See
2020-04-28T18:31:41-07:00: bionic (docker-push): https://docs.docker.com/engine/reference/commandline/login/#credentials-store
2020-04-28T18:31:41-07:00: bionic (docker-push): Pushing: terradatum/systemd:ubuntu-18.04
2020-04-28T18:31:41-07:00: bionic (docker-push): The push refers to repository [docker.io/terradatum/systemd]
2020-04-28T18:31:42-07:00: bionic (docker-push): d3536ca2000e: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): 28ba7458d04b: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): 838a37a24627: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): a6ebef4a95c3: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): b7f7d2967507: Preparing
2020-04-28T18:31:42-07:00: bionic (docker-push): 838a37a24627: Layer already exists
2020-04-28T18:31:42-07:00: bionic (docker-push): 28ba7458d04b: Layer already exists
2020-04-28T18:31:42-07:00: bionic (docker-push): b7f7d2967507: Layer already exists
2020-04-28T18:31:43-07:00: bionic (docker-push): a6ebef4a95c3: Layer already exists
2020-04-28T18:32:07-07:00: bionic (docker-push): d3536ca2000e: Pushed
2020-04-28T18:32:12-07:00: bionic (docker-push): ubuntu-18.04: digest: sha256:5935f2fc152e81d849ce02b4e32feda2677e7a6aa58589dfcaa6d687dbe89fc8 size: 1364
2020-04-28T18:32:12-07:00: bionic (docker-push): Logging out...
2020-04-28T18:32:12-07:00: bionic (docker-push): Removing login credentials for https://index.docker.io/v1/
2020-04-28T18:32:12-07:00: Build 'bionic' finished.
```
Instead, I have to do something like this to push multiple tags:
```yaml
"post-processors": [
[
{
"type": "docker-tag",
"repository": "terradatum/systemd",
"tag": "bionic",
"only": [
"bionic"
]
},
{
"type": "docker-push",
"login": true,
"login_username": "{{user `docker_username`}}",
"login_password": "{{user `docker_password`}}",
"only": [
"bionic"
]
}
],
[
{
"type": "docker-tag",
"repository": "terradatum/systemd",
"tag": "ubuntu-bionic",
"only": [
"bionic"
]
},
{
"type": "docker-push",
"login": true,
"login_username": "{{user `docker_username`}}",
"login_password": "{{user `docker_password`}}",
"only": [
"bionic"
]
}
],
[
{
"type": "docker-tag",
"repository": "terradatum/systemd",
"tag": "ubuntu-18.04",
"only": [
"bionic"
]
},
{
"type": "docker-push",
"login": true,
"login_username": "{{user `docker_username`}}",
"login_password": "{{user `docker_password`}}",
"only": [
"bionic"
]
}
]
]
```
#### Use Case(s)
Docker supports multiple tags, so does the `docker-tag` post-processor. It seems reasonable to me that my first example would "just work":tm:.
|
process
|
docker push support for multiple tags i ve reviewed the documentation and poured through the issues here and i just can t seem to make docker push behave in a way that makes sense to me this might legitimately be a bug but figured since the docker push post processor hasn t been touched in six years it s considered good enough as is and any additional functionality would be considered an enhancement feature description i want to be able to have a docker tag definition with multiple tags defined and have all the tags pushed by the docker push post processor here s a naive example yaml post processors only type docker tag repository terradatum systemd tag bionic ubuntu bionic ubuntu type docker push login true login username user docker username login password user docker password you can see that with the above all three tags are applied but only the last is pushed shell script bionic committing the container bionic image id bionic killing the container bionic running post processor docker tag bionic docker tag tagging image bionic docker tag repository terradatum systemd bionic bionic running post processor docker tag bionic docker tag tagging image terradatum systemd bionic bionic docker tag repository terradatum systemd ubuntu bionic bionic running post processor docker tag bionic docker tag tagging image terradatum systemd ubuntu bionic bionic docker tag repository terradatum systemd ubuntu bionic running post processor docker push bionic docker push logging in bionic docker push warning your password will be stored unencrypted in home rbellamy docker config json bionic docker push login succeeded bionic docker push configure a credential helper to remove this warning see bionic docker push bionic docker push pushing terradatum systemd ubuntu bionic docker push the push refers to repository bionic docker push preparing bionic docker push preparing bionic docker push preparing bionic docker push preparing bionic docker push preparing bionic docker push layer 
already exists bionic docker push layer already exists bionic docker push layer already exists bionic docker push layer already exists bionic docker push pushed bionic docker push ubuntu digest size bionic docker push logging out bionic docker push removing login credentials for build bionic finished instead i have to do something like this to push multiple tags yaml post processors type docker tag repository terradatum systemd tag bionic only bionic type docker push login true login username user docker username login password user docker password only bionic type docker tag repository terradatum systemd tag ubuntu bionic only bionic type docker push login true login username user docker username login password user docker password only bionic type docker tag repository terradatum systemd tag ubuntu only bionic type docker push login true login username user docker username login password user docker password only bionic use case s docker supports multiple tags so does the docker tag post processor it seems reasonable to me that my first example would just work tm
| 1
|
90,404
| 18,151,590,060
|
IssuesEvent
|
2021-09-26 11:05:19
|
Automattic/wp-calypso
|
https://api.github.com/repos/Automattic/wp-calypso
|
opened
|
Blog Posts & Post Carousel Blocks: Add setting to list all child and sibling pages
|
Pages [Type] Feature Request Shortcodes Block: Blog Posts Block: Post Carousel
|
### What
Adding a querying setting to the Blog Posts & Post Carousel Blocks to automatically list child and/or sibling pages of the current page
### Why
Given that it is not possible to register new Custom Post Types on simple sites, I often tend to use pages and page hierarchies to organize content on certain sites.
This method is simple and intuitive enough even for new users, has a meaningful representation in our admin pages and provides beautiful and semantic URLs.
The current Blog Posts & Post Carousel Blocks allow you to list all pages, and also to pick specific pages, but having an option to automatically list all sibling or child pages for the current page would allow for easier and more rational use of pages.
Currently, my workaround is to use one of these two shortcode approaches:
1. `[child-pages]` and `[sibling-pages]` [Source](https://wordpress.com/support/list-pages-shortcode/)
This first option is very straightforward but the options around what content to show (featured image, excerpts) and the styling are very limited.
2. `[display-posts post_type="page" post_parent="xxx"]` [Source](https://wordpress.com/support/display-posts-shortcode/)
This option offers more content & query customization options (order by, etc) but it requires adding the parent page ID manually, which can be challenging for the average user.
I will add that, aside from the two options above having specific limitations, we have been trying to stop depending on shortcodes if at all possible, and this feature request is a move in that direction.
### How
Once pages are selected,

we could add some more querying options for pages.

The specific UI would need to be thought through, as we would need to make those three options mutually exclusive (child pages of the current page, siblings, and child pages of a different page).
|
1.0
|
Blog Posts & Post Carousel Blocks: Add setting to list all child and sibling pages - ### What
Adding a querying setting to the Blog Posts & Post Carousel Blocks to automatically list child and/or sibling pages of the current page
### Why
Given that it is not possible to register new Custom Post Types on simple sites, I often tend to use pages and page hierarchies to organize content on certain sites.
This method is simple and intuitive enough even for new users, has a meaningful representation in our admin pages and provides beautiful and semantic URLs.
The current Blog Posts & Post Carousel Blocks allow you to list all pages, and also to pick specific pages, but having an option to automatically list all sibling or child pages for the current page would allow for easier and more rational use of pages.
Currently, my workaround is to use one of these two shortcode approaches:
1. `[child-pages]` and `[sibling-pages]` [Source](https://wordpress.com/support/list-pages-shortcode/)
This first option is very straightforward but the options around what content to show (featured image, excerpts) and the styling are very limited.
2. `[display-posts post_type="page" post_parent="xxx"]` [Source](https://wordpress.com/support/display-posts-shortcode/)
This option offers more content & query customization options (order by, etc) but it requires adding the parent page ID manually, which can be challenging for the average user.
I will add that, aside from the two options above having specific limitations, we have been trying to stop depending on shortcodes if at all possible, and this feature request is a move in that direction.
### How
Once pages are selected,

we could add some more querying options for pages.

The specific UI would need to be thought through, as we would need to make those three options mutually exclusive (child pages of the current page, siblings, and child pages of a different page).
|
non_process
|
blog posts post carousel blocks add setting to list all child and sibling pages what adding a querying setting to the blog posts post carousel blocks to automatically list child and or sibling pages of the current page why given that it is not possible to register new custom post types on simple sites i often tend to use pages and page hierarchies to organize content on certain sites this method is simple and intuitive enough even for new users has a meaningful representation in our admin pages and provides beautiful and semantic urls the current blog posts post carousel blocks allow you to list all pages and also to pick specific pages but having an option to automatically list all sibling or child pages for the current page would allow for easier and more rational use of pages currently my workaround is to use one of these two shortcode approaches and this first option is very straightforward but the options around what content to show featured image excerpts and the styling are very limited this option offers more content query customization options order by etc but it requires adding the parent page id manually which can be challenging for the average user i will add that aside from the two options above having specific limitations we have been trying to stop depending on shortcodes if at all possible and this feature request is a move in that direction how once pages are selected we could add some more querying options for pages the specific ui would need to be thought through as we would need to make those three options mutually exclusive child pages of the current page siblings and child pages of a different page
| 0
|
220,395
| 17,192,472,027
|
IssuesEvent
|
2021-07-16 13:01:04
|
4ian/GDevelop
|
https://api.github.com/repos/4ian/GDevelop
|
closed
|
Tiled Sprite object does not repeat some sprites, instead it stretches them.
|
👋 Needs confirmation/testing
|
## The bug
In short, I discovered a bug by accident by messing around with Gdevelop's editor. This bug makes it so any tiled sprite (Or as far as I can tell) become stretched-out instead of repeating the intended pattern on the sprite.
Based on extensive testing on my end, it affects the editor, the preview and the exported game, so pretending that the bug isn't there is not an option. It affects a specific type of sprite of a certain height and length.
The bug might affect only some computers for which are caused by the CPU or GPU.
## System details
* Debian 8 running on a Packard Bell laptop with an AMD processor
* 5.0.0-beta110 AppImage
## To Reproduce
I have a project file with the bug already there, so you may take a look at it instead.
Steps to reproduce the behavior:
1. Create any tiled sprite with the following resolutions: 16x16, 32x32, 64x64 or any other one with a 1:1 ratio. With that said I have only tested the aforementioned resolutions.
2. Import the file into the project.
3. Create a tiled sprite with the file.
4. Finally, increase its size via the editor and the sprite will appear stretched.
* Please include a link to a game if possible!
I have zipped a project made to showcase the issue.
[ISSUE.zip](https://github.com/4ian/GDevelop/files/6717759/ISSUE.zip)
* If applicable, add screenshots to help explain your problem.
Sure.

I would have used a video to better illustrate the problem, but it would take me days to install any software that's not from the app manager on my system due to how cumbersome Linux can get, and I have to make sure it actually works too. If that is demanded or asked I can always do that though.
## Workaround
If this bug does affect one of your projects, make sure to give the sprite any ratio that is not 1:1, it can be 1:2, 1:3, 2:3, 3:4, etc and it will work as intended again. That is how I figured this problem.
## Final comment
The stretching effect could be added into Gdevelop as an intended feature. I could see it being useful for GUIs, however, as of now, it is rather annoying.
|
1.0
|
Tiled Sprite object does not repeat some sprites, instead it stretches them. - ## The bug
In short, I discovered a bug by accident while messing around with GDevelop's editor. This bug makes any tiled sprite (as far as I can tell) become stretched out instead of repeating the intended pattern on the sprite.
Based on extensive testing on my end, it affects the editor, the preview and the exported game, so pretending that the bug isn't there is not an option. It affects a specific type of sprite of a certain height and length.
The bug might affect only some computers, perhaps depending on the CPU or GPU.
## System details
* Debian 8 running on a Packard Bell laptop with an AMD processor
* 5.0.0-beta110 AppImage
## To Reproduce
I have a project file with the bug already there, so you may take a look at it instead.
Steps to reproduce the behavior:
1. Create any tiled sprite with the following resolutions: 16x16, 32x32, 64x64 or any other one with a 1:1 ratio. With that said I have only tested the aforementioned resolutions.
2. Import the file into the project.
3. Create a tiled sprite with the file.
4. Finally, increase its size via the editor and the sprite will appear stretched.
* Please include a link to a game if possible!
I have zipped a project made to showcase the issue.
[ISSUE.zip](https://github.com/4ian/GDevelop/files/6717759/ISSUE.zip)
* If applicable, add screenshots to help explain your problem.
Sure.

I would have used a video to better illustrate the problem, but it would take me days to install any software that's not from the app manager on my system due to how cumbersome Linux can get, and I have to make sure it actually works too. If that is demanded or asked I can always do that though.
## Workaround
If this bug does affect one of your projects, make sure to give the sprite any ratio that is not 1:1; it can be 1:2, 1:3, 2:3, 3:4, etc., and it will work as intended again. That is how I figured out this problem.
## Final comment
The stretching effect could be added to GDevelop as an intended feature; I could see it being useful for GUIs. As of now, however, it is rather annoying.
|
non_process
|
tiled sprite object does not repeat some sprites instead it stretches them the bug in short i discovered a bug by accident by messing around with gdevelop s editor this bug makes it so any tiled sprite or as far as i can tell become stretched out instead of repeating the intended pattern on the sprite based on extensive testing on my end it affects the editor the preview and the exported game so pretending that the bug isn t there is not an option it affects a specific type of sprite of a certain height and length the bug might affect only some computers for which are caused by the cpu or gpu system details debian running on a packard bell laptop with an amd processor appimage to reproduce i have a project file with the bug already there so you may take a look at it instead steps to reproduce the behavior create any tiled sprite with the following resolutions or any other one with a ratio with that said i have only tested the aforementioned resolutions import the file into the project create a tiled sprite with the file finally increase its size via the editor and the sprite will appear stretched please include a link to a game if possible i have zipped a project made to showcase the issue if applicable add screenshots to help explain your problem sure i would have used a video to better illustrate the problem but it would take me days to install any software that s not from the app manager on my system due to how cumbersome linux can get and i have to make sure it actually works too if that is demanded or asked i can always do that though workaround if this bug does affect one of your projects make sure to give the sprite any ratio that is not it can be etc and it will work as intended again that is how i figured this problem final comment the stretching effect could be added into gdevelop as an intended feature i could see it being useful for guis however as of now it is rather annoying
| 0
|
8,893
| 11,986,986,797
|
IssuesEvent
|
2020-04-07 20:20:39
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Prepared statement needs to be re-prepared MySQL 5.6 in v0.29.1
|
Database/MySQL Priority:P2 Querying/Processor Type:Bug
|
------
Error after migration from 0.28.3 to 0.29.1
java.sql.SQLException: Prepared statement needs to be re-prepared in a MySQL database.
-----
After upgrading a Metabase instance to version 0.29.1, I ran into this error in some questions written in Metabase 0.28.3
```java
java.sql.SQLException: Prepared statement needs to be re-prepared in a MySQL database.7
```
This instance is installed on an Ubuntu 16.04 server and uses MySQL 5.6 for the internal DB.
- My Browser version is Chrome 65
- My OS is Ubuntu 16.04
- My database is MySQL 5.6
- Metabase version 0.29.1
- Docker Custom Images using Ubuntu server
- MySQL 5.6 as internal db
## Log output
```
WARN metabase.query-processor :: Query failure: java.sql.SQLException: Prepared statement needs to be re-prepared
["query_processor$assert_query_status_successful.invokeStatic(query_processor.clj:211)"
"query_processor$assert_query_status_successful.invoke(query_processor.clj:204)"
"query_processor$run_and_save_query_BANG_.invokeStatic(query_processor.clj:244)"
"query_processor$run_and_save_query_BANG_.invoke(query_processor.clj:237)"
"query_processor$fn__31502$process_query_and_save_execution_BANG___31507$fn__31508.invoke(query_processor.clj:283)"
"query_processor$fn__31502$process_query_and_save_execution_BANG___31507.invoke(query_processor.clj:269)"
"api.card$run_query_for_card.invokeStatic(card.clj:608)"
"api.card$run_query_for_card.doInvoke(card.clj:594)"
"api.card$fn__41057$fn__41060$fn__41061.invoke(card.clj:615)"
"api.card$fn__41057$fn__41060.invoke(card.clj:614)"
"api.common.internal$do_with_caught_api_exceptions.invokeStatic(internal.clj:254)"
"api.common.internal$do_with_caught_api_exceptions.invoke(internal.clj:249)"
"api.card$fn__41057.invokeStatic(card.clj:610)"
"api.card$fn__41057.invoke(card.clj:610)"
"middleware$enforce_authentication$fn__38506.invoke(middleware.clj:118)"
"api.routes$fn__49374.invokeStatic(routes.clj:64)"
"api.routes$fn__49374.invoke(routes.clj:64)"
"routes$fn__49469$fn__49470.doInvoke(routes.clj:108)"
"routes$fn__49469.invokeStatic(routes.clj:103)"
"routes$fn__49469.invoke(routes.clj:103)"
"middleware$log_api_call$fn__38605$fn__38607.invoke(middleware.clj:349)"
"middleware$log_api_call$fn__38605.invoke(middleware.clj:348)"
"middleware$add_security_headers$fn__38555.invoke(middleware.clj:251)"
"core$wrap_streamed_json_response$fn__54821.invoke(core.clj:67)"
"middleware$bind_current_user$fn__38510.invoke(middleware.clj:139)"
"middleware$maybe_set_site_url$fn__38559.invoke(middleware.clj:275)"]
```
|
1.0
|
Prepared statement needs to be re-prepared MySQL 5.6 in v0.29.1 - ------
Error after migration from 0.28.3 to 0.29.1
java.sql.SQLException: Prepared statement needs to be re-prepared in a MySQL database.
-----
After upgrading a Metabase instance to version 0.29.1, I ran into this error in some questions written in Metabase 0.28.3
```java
java.sql.SQLException: Prepared statement needs to be re-prepared in a MySQL database.7
```
This instance is installed on an Ubuntu 16.04 server and uses MySQL 5.6 for the internal DB.
- My Browser version is Chrome 65
- My OS is Ubuntu 16.04
- My database is MySQL 5.6
- Metabase version 0.29.1
- Docker Custom Images using Ubuntu server
- MySQL 5.6 as internal db
## Log output
```
WARN metabase.query-processor :: Query failure: java.sql.SQLException: Prepared statement needs to be re-prepared
["query_processor$assert_query_status_successful.invokeStatic(query_processor.clj:211)"
"query_processor$assert_query_status_successful.invoke(query_processor.clj:204)"
"query_processor$run_and_save_query_BANG_.invokeStatic(query_processor.clj:244)"
"query_processor$run_and_save_query_BANG_.invoke(query_processor.clj:237)"
"query_processor$fn__31502$process_query_and_save_execution_BANG___31507$fn__31508.invoke(query_processor.clj:283)"
"query_processor$fn__31502$process_query_and_save_execution_BANG___31507.invoke(query_processor.clj:269)"
"api.card$run_query_for_card.invokeStatic(card.clj:608)"
"api.card$run_query_for_card.doInvoke(card.clj:594)"
"api.card$fn__41057$fn__41060$fn__41061.invoke(card.clj:615)"
"api.card$fn__41057$fn__41060.invoke(card.clj:614)"
"api.common.internal$do_with_caught_api_exceptions.invokeStatic(internal.clj:254)"
"api.common.internal$do_with_caught_api_exceptions.invoke(internal.clj:249)"
"api.card$fn__41057.invokeStatic(card.clj:610)"
"api.card$fn__41057.invoke(card.clj:610)"
"middleware$enforce_authentication$fn__38506.invoke(middleware.clj:118)"
"api.routes$fn__49374.invokeStatic(routes.clj:64)"
"api.routes$fn__49374.invoke(routes.clj:64)"
"routes$fn__49469$fn__49470.doInvoke(routes.clj:108)"
"routes$fn__49469.invokeStatic(routes.clj:103)"
"routes$fn__49469.invoke(routes.clj:103)"
"middleware$log_api_call$fn__38605$fn__38607.invoke(middleware.clj:349)"
"middleware$log_api_call$fn__38605.invoke(middleware.clj:348)"
"middleware$add_security_headers$fn__38555.invoke(middleware.clj:251)"
"core$wrap_streamed_json_response$fn__54821.invoke(core.clj:67)"
"middleware$bind_current_user$fn__38510.invoke(middleware.clj:139)"
"middleware$maybe_set_site_url$fn__38559.invoke(middleware.clj:275)"]
```
|
process
|
prepared statement needs to be re prepared mysql in error after migration of to java sql sqlexception prepared statement needs to be re prepared in a mysql database after upgrade a metabase instance to the version i find with this error in some questions written in metabase java java sql sqlexception prepared statement needs to be re prepared in a mysql database this instance are installed in a ubuntu server and use mysql to internal db my browser version is chrome my os is ubuntu my database is mysql metabase version docker custom images using ubuntu server mysql as internal db log output warn metabase query processor query failure java sql sqlexception prepared statement needs to be re prepared query processor assert query status successful invokestatic query processor clj query processor assert query status successful invoke query processor clj query processor run and save query bang invokestatic query processor clj query processor run and save query bang invoke query processor clj query processor fn process query and save execution bang fn invoke query processor clj query processor fn process query and save execution bang invoke query processor clj api card run query for card invokestatic card clj api card run query for card doinvoke card clj api card fn fn fn invoke card clj api card fn fn invoke card clj api common internal do with caught api exceptions invokestatic internal clj api common internal do with caught api exceptions invoke internal clj api card fn invokestatic card clj api card fn invoke card clj middleware enforce authentication fn invoke middleware clj api routes fn invokestatic routes clj api routes fn invoke routes clj routes fn fn doinvoke routes clj routes fn invokestatic routes clj routes fn invoke routes clj middleware log api call fn fn invoke middleware clj middleware log api call fn invoke middleware clj middleware add security headers fn invoke middleware clj core wrap streamed json response fn invoke core clj middleware bind current user fn invoke middleware clj middleware maybe set site url fn invoke middleware clj
| 1
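The "Prepared statement needs to be re-prepared" message in the record above is MySQL's `ER_NEED_REPREPARE` error, typically raised when a server-side prepared statement has been invalidated (for example by DDL on a referenced table, or a too-small `table_definition_cache`). A commonly suggested mitigation for JDBC clients such as Metabase is to disable server-side prepared statements by adding `useServerPrepStmts=false` to the connection URI. The helper below is only a sketch with our own naming (not Metabase code), shown against a simplified `mysql://` URI rather than a full `jdbc:` prefix:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def add_jdbc_param(uri, key, value):
    """Append (or overwrite) one query parameter on a connection URI."""
    scheme, netloc, path, query, frag = urlsplit(uri)
    params = dict(parse_qsl(query))
    params[key] = value  # e.g. useServerPrepStmts=false
    return urlunsplit((scheme, netloc, path, urlencode(params), frag))

uri = add_jdbc_param("mysql://db.example:3306/metabase",
                     "useServerPrepStmts", "false")
# uri -> "mysql://db.example:3306/metabase?useServerPrepStmts=false"
```

Whether this parameter is appropriate depends on the driver in use; treat it as a starting point for investigation rather than a confirmed fix.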
|
121,347
| 10,166,280,364
|
IssuesEvent
|
2019-08-07 15:29:56
|
sylabs/singularity
|
https://api.github.com/repos/sylabs/singularity
|
closed
|
Singularity v3.3.0-rc.3 Checklist
|
Release 3.3.0 Testing
|
This is the QA Checklist for Singularity v3.3.0-rc.3
Download: https://github.com/sylabs/singularity/releases/tag/v3.3.0-rc.3
Each comment should include:
A description of the test done
Some example output
A link to an issue, if one was found
Each issue should include:
The Release 3.3.0 label
## Changes since v3.3.0-rc.1
- [x] #3791
- [x] #3749
- [x] #3796
- [ ] #3759
- [x] #3696
- [ ] #3780
- [x] #3825
- [ ] #3758
- [x] #3782
- [x] #3803
- [x] #3889
- [x] #3897
- [x] #3926
- [x] #3943
## Changed behaviors / defaults
- [x] `singularity remote login` will now use the default remote if one is not supplied
- [x] `singularity remote status` will now use the default remote if one is not supplied
- [x] `singularity pull` with the `shub` transport now caches the image
- [x] `singularity cache clean` will now delete only sub-directories of `SINGULARITY_CACHEDIR` instead of the directory itself
## New feature: `oras`
- [x] `singularity pull` understands a new transport, `oras`, which works with the Azure Container Registry and `docker/distribution`
- [x] `singularity push` understands a new transport, `oras`, which works with the Azure Container Registry and `docker/distribution`
## New feature: `fakeroot`
- [x] `singularity build` understands a new option, `--fakeroot`
- [x] `singularity exec` understands a new option, `--fakeroot`
- [x] `singularity exec --fakeroot` requires `--net` to access the network
- [x] `singularity instance start` understands a new option, `--fakeroot`
- [x] `singularity instance start --fakeroot` requires `--net` to access the network
- [x] `singularity run` understands a new option, `--fakeroot`
- [x] `singularity run --fakeroot` requires `--net` to access the network
- [x] `singularity shell` understands a new option, `--fakeroot`
- [x] `singularity shell --fakeroot` requires `--net` to access the network
- [x] `singularity test` understands a new option, `--fakeroot`
- [x] `singularity test --fakeroot` requires `--net` to access the network
## New command: `sif`
- [x] `singularity sif` is a new command that allows you to work with SIF files.
- [x] `singularity sif add` Add a data object to a SIF file
- [x] `singularity sif del` Delete a specified object descriptor and data from SIF file
- [x] `singularity sif dump` Extract and output data objects from SIF files
- [x] `singularity sif header` Display SIF global headers
- [x] `singularity sif info` Display detailed information of object descriptors
- [x] `singularity sif list` List object descriptors from SIF files
- [x] `singularity sif new` Create a new empty SIF image file
- [x] `singularity sif setprim` Set primary system partition
|
1.0
|
Singularity v3.3.0-rc.3 Checklist - This is the QA Checklist for Singularity v3.3.0-rc.3
Download: https://github.com/sylabs/singularity/releases/tag/v3.3.0-rc.3
Each comment should include:
A description of the test done
Some example output
A link to an issue, if one was found
Each issue should include:
The Release 3.3.0 label
## Changes since v3.3.0-rc.1
- [x] #3791
- [x] #3749
- [x] #3796
- [ ] #3759
- [x] #3696
- [ ] #3780
- [x] #3825
- [ ] #3758
- [x] #3782
- [x] #3803
- [x] #3889
- [x] #3897
- [x] #3926
- [x] #3943
## Changed behaviors / defaults
- [x] `singularity remote login` will now use the default remote if one is not supplied
- [x] `singularity remote status` will now use the default remote if one is not supplied
- [x] `singularity pull` with the `shub` transport now caches the image
- [x] `singularity cache clean` will now delete only sub-directories of `SINGULARITY_CACHEDIR` instead of the directory itself
## New feature: `oras`
- [x] `singularity pull` understands a new transport, `oras`, which works with the Azure Container Registry and `docker/distribution`
- [x] `singularity push` understands a new transport, `oras`, which works with the Azure Container Registry and `docker/distribution`
## New feature: `fakeroot`
- [x] `singularity build` understands a new option, `--fakeroot`
- [x] `singularity exec` understands a new option, `--fakeroot`
- [x] `singularity exec --fakeroot` requires `--net` to access the network
- [x] `singularity instance start` understands a new option, `--fakeroot`
- [x] `singularity instance start --fakeroot` requires `--net` to access the network
- [x] `singularity run` understands a new option, `--fakeroot`
- [x] `singularity run --fakeroot` requires `--net` to access the network
- [x] `singularity shell` understands a new option, `--fakeroot`
- [x] `singularity shell --fakeroot` requires `--net` to access the network
- [x] `singularity test` understands a new option, `--fakeroot`
- [x] `singularity test --fakeroot` requires `--net` to access the network
## New command: `sif`
- [x] `singularity sif` is a new command that allows you to work with SIF files.
- [x] `singularity sif add` Add a data object to a SIF file
- [x] `singularity sif del` Delete a specified object descriptor and data from SIF file
- [x] `singularity sif dump` Extract and output data objects from SIF files
- [x] `singularity sif header` Display SIF global headers
- [x] `singularity sif info` Display detailed information of object descriptors
- [x] `singularity sif list` List object descriptors from SIF files
- [x] `singularity sif new` Create a new empty SIF image file
- [x] `singularity sif setprim` Set primary system partition
|
non_process
|
singularity rc checklist this is the qa checklist for singularity rc download each comment should include a description of the test done some example output a link to an issue if one was found each issue should include the release label changes since rc changed behaviors defaults singularity remote login will now use the default remote if one is not supplied singularity remote status will now use the default remote if one is not supplied singularity pull with the shub transport now caches the image singularity cache clean will now delete only sub directories of singularity cachedir instead of the directory itself new feature oras singularity pull understands a new transport oras which works with the azure container registry and docker distribution singularity push understands a new transport oras which works with the azure container registry and docker distribution new feature fakeroot singularity build understands a new option fakeroot singularity exec understands a new option fakeroot singularity exec fakeroot requires net to access the network singularity instance start understands a new option fakeroot singularity instance start fakeroot requires net to access the network singularity run understands a new option fakeroot singularity run fakeroot requires net to access the network singularity shell understands a new option fakeroot singularity shell fakeroot requires net to access the network singularity test understands a new option fakeroot singularity test fakeroot requires net to access the network new command sif singularity sif is a new command that allows you to work with sif files singularity sif add add a data object to a sif file singularity sif del delete a specified object descriptor and data from sif file singularity sif dump extract and output data objects from sif files singularity sif header display sif global headers singularity sif info display detailed information of object descriptors singularity sif list list object descriptors from sif files singularity sif new create a new empty sif image file singularity sif setprim set primary system partition
| 0
|
16,504
| 21,485,395,208
|
IssuesEvent
|
2022-04-26 22:32:11
|
googleapis/python-bare-metal-solution
|
https://api.github.com/repos/googleapis/python-bare-metal-solution
|
closed
|
Release as stable
|
type: process api: baremetalsolution
|
[GA release template](https://github.com/googleapis/google-cloud-common/issues/287)
## Required
- [x] 28 days elapsed since last beta release with new API surface
- [x] Server API is GA
- [x] Package API is stable, and we can commit to backward compatibility
- [x] All dependencies are GA
|
1.0
|
Release as stable - [GA release template](https://github.com/googleapis/google-cloud-common/issues/287)
## Required
- [x] 28 days elapsed since last beta release with new API surface
- [x] Server API is GA
- [x] Package API is stable, and we can commit to backward compatibility
- [x] All dependencies are GA
|
process
|
release as stable required days elapsed since last beta release with new api surface server api is ga package api is stable and we can commit to backward compatibility all dependencies are ga
| 1
|
16,725
| 21,886,178,748
|
IssuesEvent
|
2022-05-19 18:53:30
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Iterating over features in processing algorithm not creating correct destination folder
|
Processing Bug
|
### What is the bug or the crash?
If there is an algorithm that takes input `QgsProcessingParameterFeatureSource` and output is `QgsProcessingParameterFolderDestination`. The `QgsProcessingParameterFolderDestination` will not be created correctly if the iterate over features button is selected.
This bug is only occurring when you provide a directory path. Working as it should with temp directories.
See the attached screenshot when accessing the directory name using `parameterAsString`
The first run is temp directory and 2nd is actual path:

### Steps to reproduce the issue
Run the following processing script and choose the option of iterating over features and provide an actual directory path to the `Output Directory`.
```python
"""
Model exported as python.
Name : model
Group :
With QGIS : 32400
"""
from qgis.core import QgsProcessing
from qgis.core import QgsProcessingAlgorithm
from qgis.core import QgsProcessingMultiStepFeedback
from qgis.core import QgsProcessingParameterFeatureSource, QgsProcessingParameterFolderDestination
import processing
class Model(QgsProcessingAlgorithm):
def initAlgorithm(self, config=None):
self.addParameter(QgsProcessingParameterFeatureSource('iterativefeatures', 'Iterative Features', types=[QgsProcessing.TypeVectorPolygon], defaultValue=None))
self.addParameter(
QgsProcessingParameterFolderDestination(
"OutputDirectory",
"Output Directory",
createByDefault=True,
defaultValue=None,
)
)
def processAlgorithm(self, parameters, context, model_feedback):
# Use a multi-step feedback, so that individual child algorithm progress reports are adjusted for the
# overall progress through the model
feedback = QgsProcessingMultiStepFeedback(0, model_feedback)
results = {}
outputs = {}
output_dir = self.parameterAsString(parameters, "OutputDirectory", context)
feedback.pushWarning(output_dir)
return results
def name(self):
return 'model'
def displayName(self):
return 'model'
def group(self):
return ''
def groupId(self):
return ''
def createInstance(self):
return Model()
```
### Versions
3.22.4
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
I tried `parameterAsFile` and many others ways to access the `Output Directory`, and all have the same issue.
|
1.0
|
Iterating over features in processing algorithm not creating correct destination folder - ### What is the bug or the crash?
If there is an algorithm that takes input `QgsProcessingParameterFeatureSource` and output is `QgsProcessingParameterFolderDestination`. The `QgsProcessingParameterFolderDestination` will not be created correctly if the iterate over features button is selected.
This bug is only occurring when you provide a directory path. Working as it should with temp directories.
See the attached screenshot when accessing the directory name using `parameterAsString`
The first run is temp directory and 2nd is actual path:

### Steps to reproduce the issue
Run the following processing script and choose the option of iterating over features and provide an actual directory path to the `Output Directory`.
```python
"""
Model exported as python.
Name : model
Group :
With QGIS : 32400
"""
from qgis.core import QgsProcessing
from qgis.core import QgsProcessingAlgorithm
from qgis.core import QgsProcessingMultiStepFeedback
from qgis.core import QgsProcessingParameterFeatureSource, QgsProcessingParameterFolderDestination
import processing
class Model(QgsProcessingAlgorithm):
def initAlgorithm(self, config=None):
self.addParameter(QgsProcessingParameterFeatureSource('iterativefeatures', 'Iterative Features', types=[QgsProcessing.TypeVectorPolygon], defaultValue=None))
self.addParameter(
QgsProcessingParameterFolderDestination(
"OutputDirectory",
"Output Directory",
createByDefault=True,
defaultValue=None,
)
)
def processAlgorithm(self, parameters, context, model_feedback):
# Use a multi-step feedback, so that individual child algorithm progress reports are adjusted for the
# overall progress through the model
feedback = QgsProcessingMultiStepFeedback(0, model_feedback)
results = {}
outputs = {}
output_dir = self.parameterAsString(parameters, "OutputDirectory", context)
feedback.pushWarning(output_dir)
return results
def name(self):
return 'model'
def displayName(self):
return 'model'
def group(self):
return ''
def groupId(self):
return ''
def createInstance(self):
return Model()
```
### Versions
3.22.4
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
I tried `parameterAsFile` and many others ways to access the `Output Directory`, and all have the same issue.
|
process
|
iterating over features in processing algorithm not creating correct destination folder what is the bug or the crash if there is an algorithm that takes input qgsprocessingparameterfeaturesource and output is qgsprocessingparameterfolderdestination the qgsprocessingparameterfolderdestination will not be created correctly if the iterate over features button is selected this bug is only occurring when you provide a directory path working as it should with temp directories see the attached screenshot when accessing the directory name using parameterasstring the first run is temp directory and is actual path steps to reproduce the issue run the following processing script and choose the option of iterating over features and provide an actual directory path to the output directory python model exported as python name model group with qgis from qgis core import qgsprocessing from qgis core import qgsprocessingalgorithm from qgis core import qgsprocessingmultistepfeedback from qgis core import qgsprocessingparameterfeaturesource qgsprocessingparameterfolderdestination import processing class model qgsprocessingalgorithm def initalgorithm self config none self addparameter qgsprocessingparameterfeaturesource iterativefeatures iterative features types defaultvalue none self addparameter qgsprocessingparameterfolderdestination outputdirectory output directory createbydefault true defaultvalue none def processalgorithm self parameters context model feedback use a multi step feedback so that individual child algorithm progress reports are adjusted for the overall progress through the model feedback qgsprocessingmultistepfeedback model feedback results outputs output dir self parameterasstring parameters outputdirectory context feedback pushwarning output dir return results def name self return model def displayname self return model def group self return def groupid self return def createinstance self return model versions supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context i tried parameterasfile and many others ways to access the output directory and all have the same issue
| 1
|
22,220
| 30,771,260,952
|
IssuesEvent
|
2023-07-30 23:26:22
|
danrleypereira/verzel-pleno-prova
|
https://api.github.com/repos/danrleypereira/verzel-pleno-prova
|
opened
|
Aprimoramentos na aplicação de gerenciamento de veículos
|
enhancement feature Processo Seletivo
|
Durante o desenvolvimento e revisão de código do aplicativo de gerenciamento de veículos, identificamos várias áreas onde melhorias podem ser feitas para melhorar a eficiência, segurança e usabilidade do aplicativo.
Componentização da Lógica de Paginação
A lógica de paginação está atualmente contida no componente Cars. Para melhorar a reutilização de código e a manutenibilidade, essa lógica deve ser movida para seu próprio componente.
Criação de um Componente de Edição/Registro de Veículos
Atualmente, a funcionalidade de edição e registro de veículos é realizada no componente HoveredVehicle. Isso deve ser movido para um novo componente para separar as responsabilidades dos componentes.
Proteção das Rotas com Base na Expiração do Token
Para melhorar a segurança do aplicativo, as rotas devem ser protegidas com base na expiração do token do usuário. Isso significa que se o token do usuário expirou, ele deve ser redirecionado para a tela de login.
Tratamento de Rota Inexistente
Quando o usuário tenta navegar para uma rota que não existe, ele deve ser redirecionado para uma rota específica, como a página inicial ou uma página de erro personalizada.
Melhorias no Componente VehicleForm
O componente VehicleForm requer algumas melhorias para evitar a mudança entre inputs controlados e não controlados. O estado inicial dos veículos deve ser definido para evitar inputs não controlados.
Tarefas:
- [ ] Mover a lógica de paginação para seu próprio componente.
- [ ] Criar um componente de edição/registro de veículos.
- [ ] Implementar proteção de rotas com base na expiração do token.
- [ ] Implementar redirecionamento para rota específica quando o usuário tenta acessar uma rota que não existe.
- [ ] Fazer melhorias no componente VehicleForm para evitar a mudança entre inputs controlados e não controlados.
|
1.0
|
Aprimoramentos na aplicação de gerenciamento de veículos - Durante o desenvolvimento e revisão de código do aplicativo de gerenciamento de veículos, identificamos várias áreas onde melhorias podem ser feitas para melhorar a eficiência, segurança e usabilidade do aplicativo.
Componentização da Lógica de Paginação
A lógica de paginação está atualmente contida no componente Cars. Para melhorar a reutilização de código e a manutenibilidade, essa lógica deve ser movida para seu próprio componente.
Criação de um Componente de Edição/Registro de Veículos
Atualmente, a funcionalidade de edição e registro de veículos é realizada no componente HoveredVehicle. Isso deve ser movido para um novo componente para separar as responsabilidades dos componentes.
Proteção das Rotas com Base na Expiração do Token
Para melhorar a segurança do aplicativo, as rotas devem ser protegidas com base na expiração do token do usuário. Isso significa que se o token do usuário expirou, ele deve ser redirecionado para a tela de login.
Tratamento de Rota Inexistente
Quando o usuário tenta navegar para uma rota que não existe, ele deve ser redirecionado para uma rota específica, como a página inicial ou uma página de erro personalizada.
Melhorias no Componente VehicleForm
O componente VehicleForm requer algumas melhorias para evitar a mudança entre inputs controlados e não controlados. O estado inicial dos veículos deve ser definido para evitar inputs não controlados.
Tarefas:
- [ ] Mover a lógica de paginação para seu próprio componente.
- [ ] Criar um componente de edição/registro de veículos.
- [ ] Implementar proteção de rotas com base na expiração do token.
- [ ] Implementar redirecionamento para rota específica quando o usuário tenta acessar uma rota que não existe.
- [ ] Fazer melhorias no componente VehicleForm para evitar a mudança entre inputs controlados e não controlados.
|
process
|
aprimoramentos na aplicação de gerenciamento de veículos durante o desenvolvimento e revisão de código do aplicativo de gerenciamento de veículos identificamos várias áreas onde melhorias podem ser feitas para melhorar a eficiência segurança e usabilidade do aplicativo componentização da lógica de paginação a lógica de paginação está atualmente contida no componente cars para melhorar a reutilização de código e a manutenibilidade essa lógica deve ser movida para seu próprio componente criação de um componente de edição registro de veículos atualmente a funcionalidade de edição e registro de veículos é realizada no componente hoveredvehicle isso deve ser movido para um novo componente para separar as responsabilidades dos componentes proteção das rotas com base na expiração do token para melhorar a segurança do aplicativo as rotas devem ser protegidas com base na expiração do token do usuário isso significa que se o token do usuário expirou ele deve ser redirecionado para a tela de login tratamento de rota inexistente quando o usuário tenta navegar para uma rota que não existe ele deve ser redirecionado para uma rota específica como a página inicial ou uma página de erro personalizada melhorias no componente vehicleform o componente vehicleform requer algumas melhorias para evitar a mudança entre inputs controlados e não controlados o estado inicial dos veículos deve ser definido para evitar inputs não controlados tarefas mover a lógica de paginação para seu próprio componente criar um componente de edição registro de veículos implementar proteção de rotas com base na expiração do token implementar redirecionamento para rota específica quando o usuário tenta acessar uma rota que não existe fazer melhorias no componente vehicleform para evitar a mudança entre inputs controlados e não controlados
| 1
|
17,548
| 23,358,152,161
|
IssuesEvent
|
2022-08-10 09:16:20
|
ArneBinder/pie-utils
|
https://api.github.com/repos/ArneBinder/pie-utils
|
opened
|
document processor to create a partition via regex
|
document processor
|
This may take advantage of [previous implementation](https://github.com/ArneBinder/pytorch-ie-sam-template/blob/main/src/document_processors/partition.py). This should also collect the distribution of the lengths of the parts (parts entries) and the full texts (to compare against), and also the number of parts per document.
|
1.0
|
document processor to create a partition via regex - This may take advantage of [previous implementation](https://github.com/ArneBinder/pytorch-ie-sam-template/blob/main/src/document_processors/partition.py). This should also collect the distribution of the lengths of the parts (parts entries) and the full texts (to compare against), and also the number of parts per document.
|
process
|
document processor to create a partition via regex this may take advantage of this should also collect the distribution of the lengths of the parts parts entries and the full texts to compare against and also the number of parts per document
| 1
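The regex-based partition processor described in the row above could be sketched as follows. This is a minimal illustration, not the linked `pytorch-ie-sam-template` implementation; the function names and the blank-line default pattern are hypothetical:

```python
import re


def partition_text(text, pattern=r"\n\s*\n"):
    """Split a document into parts at every regex match (hypothetical helper)."""
    return [part for part in re.split(pattern, text) if part.strip()]


def collect_partition_stats(documents, pattern=r"\n\s*\n"):
    """Collect the distributions mentioned in the issue: part lengths,
    full-text lengths (to compare against), and parts per document."""
    part_lengths, text_lengths, parts_per_doc = [], [], []
    for text in documents:
        parts = partition_text(text, pattern)
        part_lengths.extend(len(part) for part in parts)
        text_lengths.append(len(text))
        parts_per_doc.append(len(parts))
    return part_lengths, text_lengths, parts_per_doc
```

Here the default pattern splits on blank lines purely for illustration; a real document processor would presumably take the partition regex as a configurable parameter.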
|
776,016
| 27,243,612,237
|
IssuesEvent
|
2023-02-21 23:00:04
|
DSpace/dspace-angular
|
https://api.github.com/repos/DSpace/dspace-angular
|
closed
|
PDF file download problem
|
bug help wanted usability authorization medium priority Estimate TBD
|
**Describe the bug**
I'll report this as an Angular UI issue to start the discussion, but aspects of the REST API are also involved.
Currently, when the user chooses to download a PDF file (and probably other file formats that can be rendered inline in the browser) the bitstream download page is opened followed by a hard redirect to the REST API content endpoint. In the case of a PDF, the REST API response Content-Disposition header usually indicates an `inline` resource and the PDF file is rendered in the browser window.
There are 2 problems with this.
1. Two clicks in the browser navbar are required to return to the DSpace Item page; the first back click results in the PDF file being rendered again a second time.
2. If the bitstream has access restrictions, the first click instead results in a WhiteLabel page indicating a 401 error.
One possible solution for `#1` is to open the bitstream download page in a new browser tab. This can be done easily in a themed version of `file-section-component.html` by adding `isBlank=true` to the attributes of the `ds-download-link` element. We might want to consider making this change in the default component.
Unfortunately, if bitstream access is restricted, opening the download page in a new tab also results in `#2` (the 401 error ) so more work would be required to make this solution work.
There may be other, better ways to solve the navigation problem.
**To Reproduce**
Steps to reproduce the behavior:
1. Click to download a PDF file in the simple Item view
2. Once open, use the browser back button to navigate back to the item
3. Try the same thing again on a bitstream with restricted access and you should see a 401 error.
**Expected behavior**
The download page serves a useful purpose so _ideally_ everything would remain as is with the exception of fixing the 2-click return problem for inline files. Alternatively, opening the download page in a new tab is an acceptable compromise once the authorization error is fixed. There may be other solutions.
**Related work**
Link to any related tickets or PRs here.
|
1.0
|
PDF file download problem - **Describe the bug**
I'll report this as an Angular UI issue to start the discussion, but aspects of the REST API are also involved.
Currently, when the user chooses to download a PDF file (and probably other file formats that can be rendered inline in the browser) the bitstream download page is opened followed by a hard redirect to the REST API content endpoint. In the case of a PDF, the REST API response Content-Disposition header usually indicates an `inline` resource and the PDF file is rendered in the browser window.
There are 2 problems with this.
1. Two clicks in the browser navbar are required to return to the DSpace Item page; the first back click results in the PDF file being rendered again a second time.
2. If the bitstream has access restrictions, the first click instead results in a WhiteLabel page indicating a 401 error.
One possible solution for `#1` is to open the bitstream download page in a new browser tab. This can be done easily in a themed version of `file-section-component.html` by adding `isBlank=true` to the attributes of the `ds-download-link` element. We might want to consider making this change in the default component.
Unfortunately, if bitstream access is restricted, opening the download page in a new tab also results in `#2` (the 401 error ) so more work would be required to make this solution work.
There may be other, better ways to solve the navigation problem.
**To Reproduce**
Steps to reproduce the behavior:
1. Click to download a PDF file in the simple Item view
2. Once open, use the browser back button to navigate back to the item
3. Try the same thing again on a bitstream with restricted access and you should see a 401 error.
**Expected behavior**
The download page serves a useful purpose so _ideally_ everything would remain as is with the exception of fixing the 2-click return problem for inline files. Alternatively, opening the download page in a new tab is an acceptable compromise once the authorization error is fixed. There may be other solutions.
**Related work**
Link to any related tickets or PRs here.
|
non_process
|
pdf file download problem describe the bug i ll report this as an angular ui issue to start the discussion but aspects of the rest api are also involved currently when the user chooses to download a pdf file and probably other file formats that can be rendered inline in the browser the bitstream download page is opened followed by a hard redirect to the rest api content endpoint in the case of a pdf the rest api response content disposition header usually indicates an inline resource and the pdf file is rendered in the browser window there are problems with this two clicks in the browser navbar are required to return to the dspace item page the first back click results in the pdf file being rendered again a second time if the bitstream has access restrictions the first click instead results in a whitelabel page indicating a error one possible solution for is to open the bitstream download page in a new browser tab this can be done easily in a themed version of file section component html by adding isblank true to the attributes of the ds download link element we might want to consider making this change in the default component unfortunately if bitstream access is restricted opening the download page in a new tab also results in the error so more work would be required to make this solution work there may be other better ways to solve the navigation problem to reproduce steps to reproduce the behavior click to download a pdf file in the simple item view once open use the browser back button to navigate back to the item try the same thing again on a bitstream with restricted access and you should see a error expected behavior the download page serves a useful purpose so ideally everything would remain as is with the exception of fixing the click return problem for inline files alternatively opening the download page in a new tab is an acceptable compromise once the authorization error is fixed there may be other solutions related work link to any related tickets or prs here
| 0
|
8,926
| 12,032,987,176
|
IssuesEvent
|
2020-04-13 13:23:03
|
shivammalviya712/Real-Time-Trigger-Word-Detection
|
https://api.github.com/repos/shivammalviya712/Real-Time-Trigger-Word-Detection
|
reopened
|
Portaudio and Pyaudio errors
|
Preprocessing
|
- Could not import the pyaudio c module '_portaudio'
- ImportError: DLL load failed: The specified module could not be found.
- Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work
|
1.0
|
Portaudio and Pyaudio errors - - Could not import the pyaudio c module '_portaudio'
- ImportError: DLL load failed: The specified module could not be found.
- Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work
|
process
|
portaudio and pyaudio errors could not import the pyaudio c module portaudio importerror dll load failed the specified module could not be found couldn t find ffmpeg or avconv defaulting to ffmpeg but may not work
| 1
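The two errors in the row above usually mean the native PortAudio library backing pyaudio's `_portaudio` module is missing (or of the wrong bitness), while the ffmpeg/avconv message is pydub warning that no encoder was found on the PATH. A small diagnostic sketch (a hypothetical helper, not part of that project) can make such failures explicit instead of crashing at import time:

```python
import shutil


def check_audio_dependencies():
    """Report which optional audio dependencies are unavailable (illustrative only)."""
    problems = []
    try:
        # Raises ImportError ("DLL load failed" on Windows) when the
        # native PortAudio library behind _portaudio is absent.
        import pyaudio  # noqa: F401
    except ImportError as exc:
        problems.append(f"pyaudio: {exc}")
    # pydub defaults to ffmpeg when avconv is missing; flag it if neither exists.
    if shutil.which("ffmpeg") is None and shutil.which("avconv") is None:
        problems.append("neither ffmpeg nor avconv found on PATH")
    return problems
```

Running this at startup lets the application print actionable messages (install PortAudio, add ffmpeg to PATH) rather than surfacing raw import tracebacks.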
|
15,912
| 20,118,957,325
|
IssuesEvent
|
2022-02-07 22:59:48
|
googleapis/cloud-bigtable-cbt-cli
|
https://api.github.com/repos/googleapis/cloud-bigtable-cbt-cli
|
closed
|
Security Policy violation Outside Collaborators
|
allstar type: process
|
Allstar has detected that this repository’s Outside Collaborators security policy is out of compliance. Status:
Did not find any owners of this repository
This policy requires all repositories to have an organization member or team assigned as an administrator. Either there are no administrators, or all administrators are outside collaborators. A responsible party is required by organization policy to respond to security events and organization requests.
To add an administrator From the main page of the repository, go to Settings -> Manage Access.
(For more information, see https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories)
Alternately, if this repository does not have any maintainers, archive or delete it.
This issue will auto resolve when the policy is in compliance.
Issue created by Allstar. See https://github.com/ossf/allstar/ for more information. For questions specific to the repository, please contact the owner or maintainer.
|
1.0
|
Security Policy violation Outside Collaborators - Allstar has detected that this repository’s Outside Collaborators security policy is out of compliance. Status:
Did not find any owners of this repository
This policy requires all repositories to have an organization member or team assigned as an administrator. Either there are no administrators, or all administrators are outside collaborators. A responsible party is required by organization policy to respond to security events and organization requests.
To add an administrator From the main page of the repository, go to Settings -> Manage Access.
(For more information, see https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories)
Alternately, if this repository does not have any maintainers, archive or delete it.
This issue will auto resolve when the policy is in compliance.
Issue created by Allstar. See https://github.com/ossf/allstar/ for more information. For questions specific to the repository, please contact the owner or maintainer.
|
process
|
security policy violation outside collaborators allstar has detected that this repository’s outside collaborators security policy is out of compliance status did not find any owners of this repository this policy requires all repositories to have an organization member or team assigned as an administrator either there are no administrators or all administrators are outside collaborators a responsible party is required by organization policy to respond to security events and organization requests to add an administrator from the main page of the repository go to settings manage access for more information see alternately if this repository does not have any maintainers archive or delete it this issue will auto resolve when the policy is in compliance issue created by allstar see for more information for questions specific to the repository please contact the owner or maintainer
| 1
|
13,469
| 15,953,312,059
|
IssuesEvent
|
2021-04-15 12:17:42
|
prisma/e2e-tests
|
https://api.github.com/repos/prisma/e2e-tests
|
closed
|
Add test for Prisma Studio
|
process/candidate team/developer-productivity
|
Context
We had to release a patch because Studio was broken after being published, tests were green in prisma/prisma and e2e
https://github.com/prisma/prisma/releases/tag/2.21.1
https://github.com/prisma/studio/issues/659
So it seems that adding a test here to check if Studio work would bring more confidence in our automated testing. Maybe @madebysid has an idea about what we can test here exactly.
|
1.0
|
Add test for Prisma Studio - Context
We had to release a patch because Studio was broken after being published, tests were green in prisma/prisma and e2e
https://github.com/prisma/prisma/releases/tag/2.21.1
https://github.com/prisma/studio/issues/659
So it seems that adding a test here to check if Studio work would bring more confidence in our automated testing. Maybe @madebysid has an idea about what we can test here exactly.
|
process
|
add test for prisma studio context we had to release a patch because studio was broken after being published tests were green in prisma prisma and so it seems that adding a test here to check if studio work would bring more confidence in our automated testing maybe madebysid has an idea about what we can test here exactly
| 1
|