### Schema

| column | dtype | stats |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 7 to 112 |
| repo_url | string | length 36 to 141 |
| action | string | 3 classes |
| title | string | length 1 to 744 |
| labels | string | length 4 to 574 |
| body | string | length 9 to 211k |
| index | string | 10 classes |
| text_combine | string | length 96 to 211k |
| label | string | 2 classes |
| text | string | length 96 to 188k |
| binary_label | int64 | 0 to 1 |
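As an illustration of the schema above, a single row can be rebuilt and sanity-checked with pandas. The dump's source file is not named, so this stands in for an actual load, using values copied from the first record below:

```python
import pandas as pd

# One row rebuilt verbatim from the first record of the dump.
row = {
    "Unnamed: 0": 128599,
    "id": 12378231677.0,
    "type": "IssuesEvent",
    "created_at": "2020-05-19 10:19:16",
    "repo": "dvbui/VocabularyBot",
    "repo_url": "https://api.github.com/repos/dvbui/VocabularyBot",
    "action": "opened",
    "title": "Update README",
    "labels": "documentation",
    "body": "Other people should know how to install this bot and use it in their Discord channel.",
    "index": "1.0",
    "text_combine": "Update README - Other people should know how to install this bot and use it in their Discord channel.",
    "label": "non_process",
    "text": "update readme other people should know how to install this bot and use it in their discord channel",
    "binary_label": 0,
}
df = pd.DataFrame([row])

# created_at is a fixed 19-character timestamp (YYYY-MM-DD HH:MM:SS),
# matching the "length 19" entry in the schema table.
assert df["created_at"].str.len().eq(19).all()
assert df["binary_label"].dtype == "int64"
print(df.shape)  # (1, 15)
```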
### Row 128,599
- id: 12,378,231,677
- type: IssuesEvent
- created_at: 2020-05-19 10:19:16
- repo: dvbui/VocabularyBot
- repo_url: https://api.github.com/repos/dvbui/VocabularyBot
- action: opened
- title: Update README
- labels: documentation

**body:**
Other people should know how to install this bot and use it in their Discord channel.

**index:** 1.0

**text_combine:**
Update README - Other people should know how to install this bot and use it in their Discord channel.

**label:** non_process

**text:**
update readme other people should know how to install this bot and use it in their discord channel

**binary_label:** 0

---
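In every record of this excerpt, `text_combine` appears to be nothing more than `title` and `body` joined with a literal `" - "`. For the row above:

```python
title = "Update README"
body = ("Other people should know how to install this bot "
        "and use it in their Discord channel.")

# text_combine looks like title + " - " + body in every row shown here.
text_combine = f"{title} - {body}"
print(text_combine)
# -> Update README - Other people should know how to install this bot and use it in their Discord channel.
```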
### Row 66,161
- id: 8,883,374,734
- type: IssuesEvent
- created_at: 2019-01-14 15:33:02
- repo: LycheeOrg/Lychee
- repo_url: https://api.github.com/repos/LycheeOrg/Lychee
- action: opened
- title: Doc: add Couldron link to wiki
- labels: Documentation Low Priority

**body:**
### Detailed description of the problem
from #164 :
> We are trying to provide an update for the Lychee Cloudron app. The current version is 3.1.6 and the new version will be 3.2.8-fixed. Since this is for Cloudron, the built-in updater is disabled, since on Cloudron apps run on a read-only filesystem. For reference, our app packaging code can be found at https://git.cloudron.io/cloudron/lychee-app

Maybe adding a pointer to this in the Wiki would be a nice idea. :)

**index:** 1.0

**text_combine:**
Doc: add Couldron link to wiki - ### Detailed description of the problem
from #164 :
> We are trying to provide an update for the Lychee Cloudron app. The current version is 3.1.6 and the new version will be 3.2.8-fixed. Since this is for Cloudron, the built-in updater is disabled, since on Cloudron apps run on a read-only filesystem. For reference, our app packaging code can be found at https://git.cloudron.io/cloudron/lychee-app

Maybe adding a pointer to this in the Wiki would be a nice idea. :)

**label:** non_process

**text:**
doc add couldron link to wiki detailed description of the problem from we are trying to provide an update for the lychee cloudron app the current version is and the new version will be fixed since this is for cloudron the built in updater is disabled since on cloudron apps run on a read only filesystem for reference our app packaging code can be found at maybe adding a pointer to this in the wiki would be a nice idea

**binary_label:** 0

---
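Comparing `text_combine` with `text` across the rows: `text` looks like a normalized copy, lowercased, with URLs, digits, and punctuation stripped and whitespace collapsed (the issue number `#164`, the version `3.1.6`, and the cloudron.io URL above all vanish). A rough reconstruction of that cleaning step, hypothetical since the dataset's real preprocessing code is not shown, might be:

```python
import re

def normalize(text_combine: str) -> str:
    """Rough reconstruction of the text_combine -> text cleaning step.

    Hypothetical: lowercase, drop URLs and digits, turn punctuation into
    spaces, then collapse whitespace. (Some rows keep emoji, which this
    version would strip.)
    """
    t = text_combine.lower()
    t = re.sub(r"https?://\S+", " ", t)    # URLs vanish entirely in `text`
    t = re.sub(r"[0-9]+", " ", t)          # version numbers like 3.1.6 drop out
    t = re.sub(r"[^a-z\s]", " ", t)        # punctuation and symbols -> space
    return re.sub(r"\s+", " ", t).strip()  # collapse runs of whitespace

print(normalize("Update README - Other people should know how to "
                "install this bot and use it in their Discord channel."))
# -> update readme other people should know how to install this bot and use it in their discord channel
```

Run against row 128,599 (and row 66,161) it reproduces the stored `text` column exactly.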
### Row 163,491
- id: 20,363,824,163
- type: IssuesEvent
- created_at: 2022-02-21 01:32:40
- repo: dmartinez777/color-picker-test
- repo_url: https://api.github.com/repos/dmartinez777/color-picker-test
- action: opened
- title: CVE-2020-28469 (High) detected in glob-parent-3.1.0.tgz
- labels: security vulnerability

**body:**
## CVE-2020-28469 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>glob-parent-3.1.0.tgz</b></p></summary>
<p>Strips glob magic from a string to provide the parent directory path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator.
<p>Publish Date: 2021-06-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p>
<p>Release Date: 2021-06-03</p>
<p>Fix Resolution: 5.1.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
**index:** True

**text_combine:**
CVE-2020-28469 (High) detected in glob-parent-3.1.0.tgz - ## CVE-2020-28469 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>glob-parent-3.1.0.tgz</b></p></summary>
<p>Strips glob magic from a string to provide the parent directory path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator.
<p>Publish Date: 2021-06-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p>
<p>Release Date: 2021-06-03</p>
<p>Fix Resolution: 5.1.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
**label:** non_process

**text:**
cve high detected in glob parent tgz cve high severity vulnerability vulnerable library glob parent tgz strips glob magic from a string to provide the parent directory path library home page a href path to dependency file package json path to vulnerable library node modules glob parent package json dependency hierarchy found in base branch master vulnerability details this affects the package glob parent before the enclosure regex used to check for strings ending in enclosure containing path separator publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
**binary_label:** 0

---
### Row 195,632
- id: 14,741,986,619
- type: IssuesEvent
- created_at: 2021-01-07 11:30:10
- repo: tyrepharm/upptime-test
- repo_url: https://api.github.com/repos/tyrepharm/upptime-test
- action: opened
- title: 🛑 (test) Tyre Pharmacy is down
- labels: status test-tyre-pharmacy

**body:**
In [`94706f6`](https://github.com/tyrepharm/upptime-test/commit/94706f65fabcb27d1022c3636e756216320a5214
), (test) Tyre Pharmacy ($TESTING_TYREPHARM) was **down**:
- HTTP code: 0
- Response time: 0 ms

**index:** 1.0

**text_combine:**
🛑 (test) Tyre Pharmacy is down - In [`94706f6`](https://github.com/tyrepharm/upptime-test/commit/94706f65fabcb27d1022c3636e756216320a5214
), (test) Tyre Pharmacy ($TESTING_TYREPHARM) was **down**:
- HTTP code: 0
- Response time: 0 ms

**label:** non_process

**text:**
🛑 test tyre pharmacy is down in test tyre pharmacy testing tyrepharm was down http code response time ms

**binary_label:** 0

---
### Row 10,685
- id: 13,465,008,442
- type: IssuesEvent
- created_at: 2020-09-09 20:08:15
- repo: hashicorp/packer
- repo_url: https://api.github.com/repos/hashicorp/packer
- action: closed
- title: <no value> as a .Provider variable value
- labels: post-processor/vagrant

**body:**
Packer is not setting the .Provider on the vagrant post-processor
-----------------------------------------------------------------
__Packer version__: 1.2.4
__Host Platform__: macOS 10.13.4
__Provider__: VirtualBox 5.2.12
__Gist__: https://gist.github.com/brodriguesneto/73600c762fecb1a230a7f92747a59730
I run a packer build to upload a custom Vagrant box. A have this code as a post-processor:
```JSON
"post-processors": [
[
{
"type": "vagrant",
"keep_input_artifact": false,
"only": [
"virtualbox-iso"
],
"output": "{{ user `artifact_root` }}/{{ user `vm_name`}}/{{ .Provider }}/{{ user `vm_name` }}.box"
},
{
"type": "vagrant-cloud",
"box_tag": "{{user `vagrant_cloud_account`}}/{{ user `vm_name` }}",
"version": "{{ timestamp }}",
"version_description": "{{user `vagrant_version_description`}}"
}
]
]
```
In the context of vagrant post-processors, the .Provider has no value:
```sh
==> virtualbox-iso (vagrant): Creating Vagrant box for 'virtualbox' provider
virtualbox-iso (vagrant): Copying from artifact: artifact/ubuntu-1604/<no value>/ubuntu-1604-disk001.vmdk
virtualbox-iso (vagrant): Copying from artifact: artifact/ubuntu-1604/<no value>/ubuntu-1604.ovf
virtualbox-iso (vagrant): Renaming the OVF to box.ovf...
virtualbox-iso (vagrant): Compressing: Vagrantfile
virtualbox-iso (vagrant): Compressing: box.ovf
virtualbox-iso (vagrant): Compressing: metadata.json
virtualbox-iso (vagrant): Compressing: ubuntu-1604-disk001.vmdk
```
But in the context of vagrant-cloud it sets the .Provider as expected:
```sh
==> virtualbox-iso: Running post-processor: vagrant-cloud
==> virtualbox-iso (vagrant-cloud): Verifying box is accessible: everproven/ubuntu-1604
virtualbox-iso (vagrant-cloud): Box accessible and matches tag
==> virtualbox-iso (vagrant-cloud): Creating version: 1527966362
==> virtualbox-iso (vagrant-cloud): Creating provider: virtualbox
==> virtualbox-iso (vagrant-cloud): Preparing upload of box: artifact/ubuntu-1604/virtualbox/ubuntu-1604.box
==> virtualbox-iso (vagrant-cloud): Uploading box: artifact/ubuntu-1604/virtualbox/ubuntu-1604.box
virtualbox-iso (vagrant-cloud): Depending on your internet connection and the size of the box,
virtualbox-iso (vagrant-cloud): this may take some time
virtualbox-iso (vagrant-cloud): Uploading box, attempt 1
virtualbox-iso (vagrant-cloud): Box successfully uploaded
==> virtualbox-iso (vagrant-cloud): Releasing version: 1527966362
virtualbox-iso (vagrant-cloud): Version successfully released and available
```
**index:** 1.0

**text_combine:**
<no value> as a .Provider variable value - Packer is not setting the .Provider on the vagrant post-processor
-----------------------------------------------------------------
__Packer version__: 1.2.4
__Host Platform__: macOS 10.13.4
__Provider__: VirtualBox 5.2.12
__Gist__: https://gist.github.com/brodriguesneto/73600c762fecb1a230a7f92747a59730
I run a packer build to upload a custom Vagrant box. A have this code as a post-processor:
```JSON
"post-processors": [
[
{
"type": "vagrant",
"keep_input_artifact": false,
"only": [
"virtualbox-iso"
],
"output": "{{ user `artifact_root` }}/{{ user `vm_name`}}/{{ .Provider }}/{{ user `vm_name` }}.box"
},
{
"type": "vagrant-cloud",
"box_tag": "{{user `vagrant_cloud_account`}}/{{ user `vm_name` }}",
"version": "{{ timestamp }}",
"version_description": "{{user `vagrant_version_description`}}"
}
]
]
```
In the context of vagrant post-processors, the .Provider has no value:
```sh
==> virtualbox-iso (vagrant): Creating Vagrant box for 'virtualbox' provider
virtualbox-iso (vagrant): Copying from artifact: artifact/ubuntu-1604/<no value>/ubuntu-1604-disk001.vmdk
virtualbox-iso (vagrant): Copying from artifact: artifact/ubuntu-1604/<no value>/ubuntu-1604.ovf
virtualbox-iso (vagrant): Renaming the OVF to box.ovf...
virtualbox-iso (vagrant): Compressing: Vagrantfile
virtualbox-iso (vagrant): Compressing: box.ovf
virtualbox-iso (vagrant): Compressing: metadata.json
virtualbox-iso (vagrant): Compressing: ubuntu-1604-disk001.vmdk
```
But in the context of vagrant-cloud it sets the .Provider as expected:
```sh
==> virtualbox-iso: Running post-processor: vagrant-cloud
==> virtualbox-iso (vagrant-cloud): Verifying box is accessible: everproven/ubuntu-1604
virtualbox-iso (vagrant-cloud): Box accessible and matches tag
==> virtualbox-iso (vagrant-cloud): Creating version: 1527966362
==> virtualbox-iso (vagrant-cloud): Creating provider: virtualbox
==> virtualbox-iso (vagrant-cloud): Preparing upload of box: artifact/ubuntu-1604/virtualbox/ubuntu-1604.box
==> virtualbox-iso (vagrant-cloud): Uploading box: artifact/ubuntu-1604/virtualbox/ubuntu-1604.box
virtualbox-iso (vagrant-cloud): Depending on your internet connection and the size of the box,
virtualbox-iso (vagrant-cloud): this may take some time
virtualbox-iso (vagrant-cloud): Uploading box, attempt 1
virtualbox-iso (vagrant-cloud): Box successfully uploaded
==> virtualbox-iso (vagrant-cloud): Releasing version: 1527966362
virtualbox-iso (vagrant-cloud): Version successfully released and available
```
**label:** process

**text:**
as a provider variable value packer is not setting the provider on the vagrant post processor packer version host platform macos provider virtualbox gist i run a packer build to upload a custom vagrant box a have this code as a post processor json post processors type vagrant keep input artifact false only virtualbox iso output user artifact root user vm name provider user vm name box type vagrant cloud box tag user vagrant cloud account user vm name version timestamp version description user vagrant version description in the context of vagrant post processors the provider has no value sh virtualbox iso vagrant creating vagrant box for virtualbox provider virtualbox iso vagrant copying from artifact artifact ubuntu ubuntu vmdk virtualbox iso vagrant copying from artifact artifact ubuntu ubuntu ovf virtualbox iso vagrant renaming the ovf to box ovf virtualbox iso vagrant compressing vagrantfile virtualbox iso vagrant compressing box ovf virtualbox iso vagrant compressing metadata json virtualbox iso vagrant compressing ubuntu vmdk but in the context of vagrant cloud it sets the provider as expected sh virtualbox iso running post processor vagrant cloud virtualbox iso vagrant cloud verifying box is accessible everproven ubuntu virtualbox iso vagrant cloud box accessible and matches tag virtualbox iso vagrant cloud creating version virtualbox iso vagrant cloud creating provider virtualbox virtualbox iso vagrant cloud preparing upload of box artifact ubuntu virtualbox ubuntu box virtualbox iso vagrant cloud uploading box artifact ubuntu virtualbox ubuntu box virtualbox iso vagrant cloud depending on your internet connection and the size of the box virtualbox iso vagrant cloud this may take some time virtualbox iso vagrant cloud uploading box attempt virtualbox iso vagrant cloud box successfully uploaded virtualbox iso vagrant cloud releasing version virtualbox iso vagrant cloud version successfully released and available
**binary_label:** 1

---
### Row 9,661
- id: 12,643,056,210
- type: IssuesEvent
- created_at: 2020-06-16 09:11:12
- repo: prisma/prisma-engines
- repo_url: https://api.github.com/repos/prisma/prisma-engines
- action: opened
- title: Create a migration engine <-> introspection engine test setup
- labels: component: introspection engine component: migration engine process/candidate

**body:**
Migrating after introspecting, or re-introspecting a migrated database are both workflows that we currently do not test, and that we would like to make smooth and painless for our users. We know some users are also having issues with this. A first step towards tackling this would be a test setup with both migration and introspection engines so we can easily test various scenarios.
This can very easily live within the existing engines test suite, maybe as a separate test crate.

**index:** 1.0

**text_combine:**
Create a migration engine <-> introspection engine test setup - Migrating after introspecting, or re-introspecting a migrated database are both workflows that we currently do not test, and that we would like to make smooth and painless for our users. We know some users are also having issues with this. A first step towards tackling this would be a test setup with both migration and introspection engines so we can easily test various scenarios.
This can very easily live within the existing engines test suite, maybe as a separate test crate.

**label:** process

**text:**
create a migration engine introspection engine test setup migrating after introspecting or re introspecting a migrated database are both workflows that we currently do not test and that we would like to make smooth and painless for our users we know some users are also having issues with this a first step towards tackling this would be a test setup with both migration and introspection engines so we can easily test various scenarios this can very easily live within the existing engines test suite maybe as a separate test crate

**binary_label:** 1

---
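In every row of this excerpt, `binary_label` is 1 exactly when `label` is `process` and 0 when it is `non_process`. If that holds across the full dataset (an assumption; only these rows are visible), the column could be derived directly:

```python
import pandas as pd

# Tiny frame mirroring the label values seen in the rows of this excerpt.
df = pd.DataFrame({"label": ["non_process", "process", "process", "non_process"]})

# Derive binary_label from label (assumption: process -> 1, non_process -> 0,
# consistent with every row shown here).
df["binary_label"] = (df["label"] == "process").astype(int)
print(df["binary_label"].tolist())  # [0, 1, 1, 0]
```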
### Row 274,616
- id: 23,853,155,606
- type: IssuesEvent
- created_at: 2022-09-06 20:01:07
- repo: Unidata/netcdf-c
- repo_url: https://api.github.com/repos/Unidata/netcdf-c
- action: opened
- title: Github Actions: LGTM.com is being EOL'd, we need to migrate to CodeQL.
- labels: area/tests

**body:**
See [https://github.blog/2022-08-15-the-next-step-for-lgtm-com-github-code-scanning/](https://github.blog/2022-08-15-the-next-step-for-lgtm-com-github-code-scanning/) for more information.

**index:** 1.0

**text_combine:**
Github Actions: LGTM.com is being EOL'd, we need to migrate to CodeQL. - See [https://github.blog/2022-08-15-the-next-step-for-lgtm-com-github-code-scanning/](https://github.blog/2022-08-15-the-next-step-for-lgtm-com-github-code-scanning/) for more information.

**label:** non_process

**text:**
github actions lgtm com is being eol d we need to migrate to codeql see for more information

**binary_label:** 0

---
### Row 85,439
- id: 7,969,785,258
- type: IssuesEvent
- created_at: 2018-07-16 10:17:47
- repo: fabric8io/fabric8-test
- repo_url: https://api.github.com/repos/fabric8io/fabric8-test
- action: opened
- title: Colorize log output
- labels: E2E-test

**body:**
package for writing colorful text to console: https://www.npmjs.com/package/chalk
ideas:
- different color for info and debug logging

**index:** 1.0

**text_combine:**
Colorize log output - package for writing colorful text to console: https://www.npmjs.com/package/chalk
ideas:
- different color for info and debug logging

**label:** non_process

**text:**
colorize log output package for writing colorful text to console ideas different color for info and debug logging

**binary_label:** 0

---
### Row 3,626
- id: 6,662,294,246
- type: IssuesEvent
- created_at: 2017-10-02 12:35:47
- repo: rogerthat-platform/rogerthat-ios-client
- repo_url: https://api.github.com/repos/rogerthat-platform/rogerthat-ios-client
- action: reopened
- title: It's hard to open the loyalty functionality in the toolbar
- labels: priority_major process_duplicate type_bug

**body:**
You have to press the exact correct spot to be able to open the loyalty card

**index:** 1.0

**text_combine:**
It's hard to open the loyalty functionality in the toolbar - You have to press the exact correct spot to be able to open the loyalty card

**label:** process

**text:**
it s hard to open the loyalty functionality in the toolbar you have to press the exact correct spot to be able to open the loyalty card

**binary_label:** 1

---
### Row 415,178
- id: 12,125,815,807
- type: IssuesEvent
- created_at: 2020-04-22 16:04:31
- repo: raceintospace/raceintospace
- repo_url: https://api.github.com/repos/raceintospace/raceintospace
- action: opened
- title: Add PgUp and PgDn and/or Home and End functionality to Prestige Summary
- labels: Low Priority enhancement

**body:**
Prestige Summary is one of the few places you can get something of a bird's-eye view of where you are in your space program: what you've accomplished so far and what you haven't. However, it's awkward to use: partly because it's a prestige summary, not an event summary, but also because the only way to navigate in it is one line at a time.


**index:** 1.0

**text_combine:**
Add PgUp and PgDn and/or Home and End functionality to Prestige Summary - Prestige Summary is one of the few places you can get something of a bird's-eye view of where you are in your space program: what you've accomplished so far and what you haven't. However, it's awkward to use: partly because it's a prestige summary, not an event summary, but also because the only way to navigate in it is one line at a time.


**label:** non_process

**text:**
add pgup and pgdn and or home and end functionality to prestige summary prestige summary is one of the few places you can get something of a bird s eye view of where you are in your space program what you ve accomplished so far and what you haven t however it s awkward to use partly because it s a prestige summary not an event summary but also because the only way to navigate in it is one line at a time

**binary_label:** 0

---
### Row 36,905
- id: 2,813,536,667
- type: IssuesEvent
- created_at: 2015-05-18 15:11:24
- repo: openshift/origin
- repo_url: https://api.github.com/repos/openshift/origin
- action: closed
- title: Switching projects prompts to log in immediately after logging in
- labels: component/cli kind/bug priority/P1

**body:**
Inconsistent, but I did the following, and was prompted to log in a second time:
```
osc login
... log in ...
osc project (some project I have access to)
```
My theory is that the context for the project I was switching to referenced a second user stanza for the same user with a stale token

**index:** 1.0

**text_combine:**
Switching projects prompts to log in immediately after logging in - Inconsistent, but I did the following, and was prompted to log in a second time:
```
osc login
... log in ...
osc project (some project I have access to)
```
My theory is that the context for the project I was switching to referenced a second user stanza for the same user with a stale token

**label:** non_process

**text:**
switching projects prompts to log in immediately after logging in inconsistent but i did the following and was prompted to log in a second time osc login log in osc project some project i have access to my theory is that the context for the project i was switching to referenced a second user stanza for the same user with a stale token

**binary_label:** 0

---
### Row 11,296
- id: 14,102,783,432
- type: IssuesEvent
- created_at: 2020-11-06 09:17:33
- repo: MEDEAEditions/DEPCHA
- repo_url: https://api.github.com/repos/MEDEAEditions/DEPCHA
- action: closed
- title: PreProcessing: csvToRDF - JSON Parsing Error
- labels: preprocessing

**body:**
All task, bugs and to-do relating to the TORDF.xsl (Ingest XML/TEI into GAMS)
- [ ]

**index:** 1.0

**text_combine:**
PreProcessing: csvToRDF - JSON Parsing Error - All task, bugs and to-do relating to the TORDF.xsl (Ingest XML/TEI into GAMS)
- [ ]

**label:** process

**text:**
preprocessing csvtordf json parsing error all task bugs and to do relating to the tordf xsl ingest xml tei into gams

**binary_label:** 1

---
### Row 14,133
- id: 8,850,233,782
- type: IssuesEvent
- created_at: 2019-01-08 12:40:44
- repo: gbif/registry-console
- repo_url: https://api.github.com/repos/gbif/registry-console
- action: opened
- title: inconsistens margins on presentations
- labels: usability

**body:**
E.g. margin-bottom in datasets much larger than on contacts



**index:** True

**text_combine:**
inconsistens margins on presentations - E.g. margin-bottom in datasets much larger than on contacts



**label:** non_process

**text:**
inconsistens margins on presentations e g margin bottom in datasets much larger than on contacts

**binary_label:** 0

---
### Row 8,009
- id: 11,202,127,960
- type: IssuesEvent
- created_at: 2020-01-04 09:53:06
- repo: nodejs/node
- repo_url: https://api.github.com/repos/nodejs/node
- action: closed
- title: execSync filters out certain environment variables on macOS but not Linux
- labels: child_process macos

**body:**
<!--
Thank you for reporting a possible bug in Node.js.
Please fill in as much of the template below as you can.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify the affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you can.
-->
* **Version**: v13.5.0
* **Platform**: macOS (sorry, can add uname output when I can check, if required).
* **Subsystem**: child_process module
<!-- Please provide more details below this comment. -->
I noticed this bug while trying to use visual studio code (the debug build of the program I was running requires a certain environment to start correctly), and have found it reproducible in node.js itself.
Essentially I've noticed that the environment variables `DYLD_LIBRARY_PATH` and `LD_LIBRARY_PATH` are both stripped out of the environment when child_process.execSync is used to execute a subprocess - but only on macOS. My initial testing occurred on Linux and there the variables seem to be visible to the child process correctly.
You can reproduce this yourself with the following two scripts (you may need to edit the path to your `node` executable in run.js):
## run.js
```
const child_process = require('child_process');
env = Object();
for(var key in process.env)
{
env[key] = process.env[key];
}
env['DYLD_LIBRARY_PATH'] = "ailjubghaehrguah";
env['LD_LIBRARY_PATH'] = "ailjubghaehrguah";
env['QQQNOPE'] = "akhsdfh";
child_process.execSync('"/usr/bin/node" spawn.js', {env: env, cwd: ".", stdio: 'inherit'});
```
## spawn.js
```
var keys_to_check = ['LD_LIBRARY_PATH', 'DYLD_LIBRARY_PATH', 'QQQNOPE'];
for (var i = 0; i < keys_to_check.length; i++)
{
var key = keys_to_check[i];
console.log(key, process.env[key]);
}
```
## Output
If you save these two files in the same folder and execute like this:
`node run.js`
You will see:
### Ubuntu 18.04 Linux (expected output, node version v13.5.0 from node.js download site as linux binary, same output from cannonical node build of v8.10.0):
```
LD_LIBRARY_PATH ailjubghaehrguah
DYLD_LIBRARY_PATH ailjubghaehrguah
QQQNOPE akhsdfh
```
### macOS (actual output, node version v13.5.0, from homebrew):
```
LD_LIBRARY_PATH undefined
DYLD_LIBRARY_PATH undefined
QQQNOPE akhsdfh
```
As I mentioned above, this is true whether the target process is node.js itself or another thing (mine was a native program compiled from C++ source.)
## Fix?
I have done some grepping around in the node source, but I can't see any mention of these environment variables except in unit tests, which reference them as part of the node.js build settings (certain builds can't run the tests if they're defined or not defined, something like that). No other code seems to deliberately filter them. Thus, I am completely baffled as to why node.js behaves this way.
**index:** 1.0

**text_combine:**
execSync filters out certain environment variables on macOS but not Linux - <!--
Thank you for reporting a possible bug in Node.js.
Please fill in as much of the template below as you can.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify the affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you can.
-->
* **Version**: v13.5.0
* **Platform**: macOS (sorry, can add uname output when I can check, if required).
* **Subsystem**: child_process module
<!-- Please provide more details below this comment. -->
I noticed this bug while trying to use visual studio code (the debug build of the program I was running requires a certain environment to start correctly), and have found it reproducible in node.js itself.
Essentially I've noticed that the environment variables `DYLD_LIBRARY_PATH` and `LD_LIBRARY_PATH` are both stripped out of the environment when child_process.execSync is used to execute a subprocess - but only on macOS. My initial testing occurred on Linux and there the variables seem to be visible to the child process correctly.
You can reproduce this yourself with the following two scripts (you may need to edit the path to your `node` executable in run.js):
## run.js
```
const child_process = require('child_process');
env = Object();
for(var key in process.env)
{
env[key] = process.env[key];
}
env['DYLD_LIBRARY_PATH'] = "ailjubghaehrguah";
env['LD_LIBRARY_PATH'] = "ailjubghaehrguah";
env['QQQNOPE'] = "akhsdfh";
child_process.execSync('"/usr/bin/node" spawn.js', {env: env, cwd: ".", stdio: 'inherit'});
```
## spawn.js
```
var keys_to_check = ['LD_LIBRARY_PATH', 'DYLD_LIBRARY_PATH', 'QQQNOPE'];
for (var i = 0; i < keys_to_check.length; i++)
{
var key = keys_to_check[i];
console.log(key, process.env[key]);
}
```
## Output
If you save these two files in the same folder and execute like this:
`node run.js`
You will see:
### Ubuntu 18.04 Linux (expected output, node version v13.5.0 from node.js download site as linux binary, same output from cannonical node build of v8.10.0):
```
LD_LIBRARY_PATH ailjubghaehrguah
DYLD_LIBRARY_PATH ailjubghaehrguah
QQQNOPE akhsdfh
```
### macOS (actual output, node version v13.5.0, from homebrew):
```
LD_LIBRARY_PATH undefined
DYLD_LIBRARY_PATH undefined
QQQNOPE akhsdfh
```
As I mentioned above, this is true whether the target process is node.js itself or another thing (mine was a native program compiled from C++ source.)
## Fix?
I have done some grepping around in the node source, but I can't see any mention of these environment variables except in unit tests, which reference them as part of the node.js build settings (certain builds can't run the tests if they're defined or not defined, something like that). No other code seems to deliberately filter them. Thus, I am completely baffled as to why node.js behaves this way.
**label:** process

**text:**
execsync filters out certain environment variables on macos but not linux thank you for reporting a possible bug in node js please fill in as much of the template below as you can version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify the affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you can version platform macos sorry can add uname output when i can check if required subsystem child process module i noticed this bug while trying to use visual studio code the debug build of the program i was running requires a certain environment to start correctly and have found it reproducible in node js itself essentially i ve noticed that the environment variables dyld library path and ld library path are both stripped out of the environment when child process execsync is used to execute a subprocess but only on macos my initial testing occurred on linux and there the variables seem to be visible to the child process correctly you can reproduce this yourself with the following two scripts you may need to edit the path to your node executable in run js run js const child process require child process env object for var key in process env env process env env ailjubghaehrguah env ailjubghaehrguah env akhsdfh child process execsync usr bin node spawn js env env cwd stdio inherit spawn js var keys to check for var i i keys to check length i var key keys to check console log key process env output if you save these two files in the same folder and execute like this node run js you will see ubuntu linux expected output node version from node js download site as linux binary same output from cannonical node build of ld library path ailjubghaehrguah dyld library path ailjubghaehrguah qqqnope akhsdfh macos actual output node version from homebrew ld library path undefined dyld library path undefined qqqnope akhsdfh as i 
mentioned above this is true whether the target process is node js itself or another thing mine was a native program compiled from c source fix i have done some grepping around in the node source but i can t see any mention of these environment variables except in unit tests which reference them as part of the node js build settings certain builds can t run the tests if they re defined or not defined something like that no other code seems to deliberately filter them thus i am completely baffled as to why node js behaves this way
| 1
|
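The execSync record above describes an environment variable (`DYLD_LIBRARY_PATH`) vanishing between parent and child on macOS but not Linux. The sketch below is a Python analogue of the reporter's `run.js`/`spawn.js` pair, not their actual Node.js code: it passes a marker variable to a child through an explicit environment and checks whether the child sees it. The marker names are arbitrary, and the usual explanation — macOS System Integrity Protection stripping `DYLD_*`/`LD_*` variables when the spawn chain passes through a protected binary such as `/bin/sh`, which `execSync` uses — is an assumption, not something the report itself confirms.

```python
import os
import subprocess
import sys

def env_survives(var_name: str) -> bool:
    """Spawn a Python child directly and check whether var_name,
    passed via an explicit env dict, is visible to the child."""
    env = dict(os.environ, **{var_name: "marker"})
    child = subprocess.run(
        [sys.executable, "-c",
         f"import os; print(os.environ.get({var_name!r}, ''))"],
        env=env, capture_output=True, text=True,
    )
    return child.stdout.strip() == "marker"

# An ordinary variable passed via env= is always inherited by the child.
print(env_survives("AILJUBGHAEHRGUAH"))  # True
# Whether this one survives depends on platform and spawn chain: the report
# above observes it being stripped on macOS when going through execSync
# (which spawns via /bin/sh), but not on Linux.
print(env_survives("DYLD_LIBRARY_PATH"))
```

Running the check once through a direct spawn and once through a shell would localize whether the stripping happens at the shell hop, which is where the macOS/Linux behaviour in the report diverges.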
125,820
| 17,860,455,510
|
IssuesEvent
|
2021-09-05 21:39:46
|
turkdevops/play-with-docker
|
https://api.github.com/repos/turkdevops/play-with-docker
|
opened
|
CVE-2020-11022 (Medium) detected in jquery-3.2.1.min.js
|
security vulnerability
|
## CVE-2020-11022 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.2.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js</a></p>
<p>Path to dependency file: play-with-docker/handlers/www/editor.html</p>
<p>Path to vulnerable library: /handlers/www/editor.html,/handlers/www/default/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-3.2.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/play-with-docker/commit/b855cec5559a8daf1ed5e7e1c654fd5c4d5a3fa7">b855cec5559a8daf1ed5e7e1c654fd5c4d5a3fa7</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jQuery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-11022 (Medium) detected in jquery-3.2.1.min.js - ## CVE-2020-11022 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.2.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js</a></p>
<p>Path to dependency file: play-with-docker/handlers/www/editor.html</p>
<p>Path to vulnerable library: /handlers/www/editor.html,/handlers/www/default/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-3.2.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/play-with-docker/commit/b855cec5559a8daf1ed5e7e1c654fd5c4d5a3fa7">b855cec5559a8daf1ed5e7e1c654fd5c4d5a3fa7</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jQuery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file play with docker handlers www editor html path to vulnerable library handlers www editor html handlers www default index html dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details in jquery versions greater than or equal to and before passing html from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource
| 0
|
13,093
| 15,441,190,358
|
IssuesEvent
|
2021-03-08 05:22:38
|
Vanuatu-National-Statistics-Office/vnso-RAP-tradeStats-materials
|
https://api.github.com/repos/Vanuatu-National-Statistics-Office/vnso-RAP-tradeStats-materials
|
opened
|
Monthly Report Issues
|
coding data processing help wanted monthly report
|
NSDP Table
- DataFrames for the NSDP indicators are largely complete. The next step is to calculate whether these indicators are achieving their Targets; to do this we need Baseline data taken from the 2016 monthly report. This is because the NSDP was launched at the start of 2017. Working with Anna to get the raw data to calculate Baseline indicators; we can then calculate the percentage increase or decrease in Trade for the month we are reporting on.
- Need to take this data and merge it into the formatted tables, not sure how to do this.
Trade Balance by Major Partner Countries
- If using a map, should have Key too
- Again need to take this data and visualise it onto the map, not sure how to do this
- Should also have a table or chart that highlights the countries and gives their Balance of Trade figures
Trade Balance of Pacific Islands
- The DataFrame produced has positive Trade Balance, Zero Trade and Negative Trade Balance. Need to connect the countries that have positive and negative into a Bar Chart; for those that have Zero Trade maybe add them as a note- again not sure how to visualise this information.
Trade of Balance of New Emerging Markets
- Have to create a subset of Trade Balances for all countries that fall into the OTHER category. Then sort this subset in descending order (highest to lowest). Take the Top 5 (countries we export a lot to) and Last 5 (countries we import a lot from) and visualise this on a bar chart. Again not sure how to do this.
Trade by Trade Agreement
- Is possible to go through code, and how this extracts information from cleaned dataset. This Table is also incomplete in the Latest Monthly Table script (2).
Principle Exports
- Currently working with Anna to get data for major exports by month from 2010 to create a DataFrame to visualise. We will then be able to track trends on monthly basis.
Top 5 New Major Exports
- This will be similar code to "Trade of Balance of New Emerging Markets" that not sure how to write
Principle Imports
- Currently working with Anna to get data for major imports by month from 2010 to create a DataFrame to visualise. We will then be able to track trends on monthly basis.
Top 5 New Major Imports
- This will be similar code to "Trade of Balance of New Emerging Markets" that not sure how to write
Methodology and Meta-Data
- Haven't got to this stage yet, have to think through definitions, concepts, rationales and methodologies that want to include
|
1.0
|
Monthly Report Issues - NSDP Table
- DataFrames for the NSDP indicators are largely complete. The next step is to calculate whether these indicators are achieving their Targets; to do this we need Baseline data taken from the 2016 monthly report. This is because the NSDP was launched at the start of 2017. Working with Anna to get the raw data to calculate Baseline indicators; we can then calculate the percentage increase or decrease in Trade for the month we are reporting on.
- Need to take this data and merge it into the formatted tables, not sure how to do this.
Trade Balance by Major Partner Countries
- If using a map, should have Key too
- Again need to take this data and visualise it onto the map, not sure how to do this
- Should also have a table or chart that highlights the countries and gives their Balance of Trade figures
Trade Balance of Pacific Islands
- The DataFrame produced has positive Trade Balance, Zero Trade and Negative Trade Balance. Need to connect the countries that have positive and negative into a Bar Chart; for those that have Zero Trade maybe add them as a note- again not sure how to visualise this information.
Trade of Balance of New Emerging Markets
- Have to create a subset of Trade Balances for all countries that fall into the OTHER category. Then sort this subset in descending order (highest to lowest). Take the Top 5 (countries we export a lot to) and Last 5 (countries we import a lot from) and visualise this on a bar chart. Again not sure how to do this.
Trade by Trade Agreement
- Is possible to go through code, and how this extracts information from cleaned dataset. This Table is also incomplete in the Latest Monthly Table script (2).
Principle Exports
- Currently working with Anna to get data for major exports by month from 2010 to create a DataFrame to visualise. We will then be able to track trends on monthly basis.
Top 5 New Major Exports
- This will be similar code to "Trade of Balance of New Emerging Markets" that not sure how to write
Principle Imports
- Currently working with Anna to get data for major imports by month from 2010 to create a DataFrame to visualise. We will then be able to track trends on monthly basis.
Top 5 New Major Imports
- This will be similar code to "Trade of Balance of New Emerging Markets" that not sure how to write
Methodology and Meta-Data
- Haven't got to this stage yet, have to think through definitions, concepts, rationales and methodologies that want to include
|
process
|
monthly report issues nsdp table dataframes for the nsdp indicators largely complete next step is to calculate whether these indicators are achieving their targets to do this we need baseline data taken from the monthly report this is because the nsdp was launched at the start of working with anna to get raw data to calculate baseline indicators can then calculate percentage increase or decrease in trade for month we are reporting on need to take this data and merge it into the formatted tables not sure how to do this trade balance by major partner countries if using a map should have key too again need to take this data and visualise it onto the map not sure how to do this should also have a table or chart that highlights the countries and gives their balance of trade figures trade balance of pacific islands the dataframe produced has positive trade balance zero trade and negative trade balance need to connect the countries that have positive and negative into a bar chart for those that have zero trade maybe add them as a note again not sure how to visualise this information trade of balance of new emerging markets have to create a subset of trade balances for all countries that fall into other category then within this subset sort in ascending order highest to lowest take the top countries export a lot too and lost countries import a lot from and visualise this on a bar chart again not sure how to do this trade by trade agreement is possible to go through code and how this extracts information from cleaned dataset this table is also incomplete in the latest monthly table script principle exports currently working with anna to get data for major exports by month from to create a dataframe to visualise we will then be able to track trends on monthly basis top new major exports this will be similar code to trade of balance of new emerging markets that not sure how to write principle imports currently working with anna to get data for major imports by month from to 
create a dataframe to visualise we will then be able to track trends on monthly basis top new major imports this will be similar code to trade of balance of new emerging markets that not sure how to write methodology and meta data haven t got to this stage yet have to think through definitions concepts rationales and methodologies that want to include
| 1
|
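The "Top 5 / Last 5" step the report above describes (sort the OTHER-category trade balances, then take the five largest surpluses and five largest deficits) comes down to a sort and two slices. The country names and figures below are invented for illustration, not VNSO data; in pandas the same idea is `sort_values(...)` followed by `head(5)` and `tail(5)` on the resulting DataFrame.

```python
# Hypothetical (country, trade balance) figures for the OTHER category;
# the names and numbers are invented for illustration, not VNSO data.
balances = {
    "A": 120, "B": -45, "C": 300, "D": -10, "E": 75, "F": -200, "G": 5,
}

# Sort countries by balance, highest to lowest, then slice the two ends.
ranked = sorted(balances, key=balances.get, reverse=True)
top5 = ranked[:5]       # largest surpluses: countries we export a lot to
bottom5 = ranked[-5:]   # largest deficits: countries we import a lot from

print(top5)     # ['C', 'A', 'E', 'G', 'D']
print(bottom5)  # ['E', 'G', 'D', 'B', 'F']
```

Either slice can then be fed straight into a bar chart, which covers the visualisation step the report was unsure about.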
99,629
| 4,058,391,522
|
IssuesEvent
|
2016-05-25 03:57:06
|
leo-project/leofs
|
https://api.github.com/repos/leo-project/leofs
|
closed
|
interfacing with boto fails with SignatureDoesNotMatch
|
Bug Improve Priority-MIDDLE _leo_gateway _leo_s3_libs
|
Set up a standalone LeoFS server. Tried the boto example code against it using boto=2.38.0. Using the same credentials, s3cmd works fine. But boto always dies with (debug enabled):
```
2015-07-28 18:25:08,390 boto [DEBUG]:Using access key provided by client.
2015-07-28 18:25:08,390 boto [DEBUG]:Using secret key provided by client.
2015-07-28 18:25:08,391 boto [DEBUG]:path=/storage/
2015-07-28 18:25:08,391 boto [DEBUG]:auth_path=/storage/
2015-07-28 18:25:08,391 boto [DEBUG]:Method: PUT
2015-07-28 18:25:08,391 boto [DEBUG]:Path: /storage/
2015-07-28 18:25:08,391 boto [DEBUG]:Data:
2015-07-28 18:25:08,391 boto [DEBUG]:Headers: {}
2015-07-28 18:25:08,392 boto [DEBUG]:Host: 192.168.67.103:8080
2015-07-28 18:25:08,392 boto [DEBUG]:Port: 8080
2015-07-28 18:25:08,392 boto [DEBUG]:Params: {}
2015-07-28 18:25:08,392 boto [DEBUG]:establishing HTTP connection: kwargs={'port': 8080, 'timeout': 70}
2015-07-28 18:25:08,392 boto [DEBUG]:Token: None
2015-07-28 18:25:08,393 boto [DEBUG]:StringToSign:
PUT
Tue, 28 Jul 2015 12:55:08 GMT
/storage/
2015-07-28 18:25:08,393 boto [DEBUG]:Signature:
AWS ebbaefcccdc7dbfc43f1:KzME+LWC650YPS5dUDQBFdDyouo=
2015-07-28 18:25:08,393 boto [DEBUG]:Final headers: {'Date': 'Tue, 28 Jul 2015 12:55:08 GMT', 'Content-Length': '0', 'Authorization': u'AWS ebbaefcccdc7dbfc43f1:KzME+LWC650YPS5dUDQBFdDyouo=', 'User-Agent': 'Boto/2.38.0 Python/2.7.9 Darwin/14.4.0'}
2015-07-28 18:25:08,429 boto [DEBUG]:Response headers: [('date', 'Tue, 28 Jul 2015 12:55:07 GMT'), ('connection', 'keep-alive'), ('content-length', '304'), ('server', 'LeoFS')]
Traceback (most recent call last):
File "test_leo.py", line 22, in <module>
bucket = conn.create_bucket("storage")
File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/boto/s3/connection.py", line 621, in create_bucket
response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?><Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your AWS secret access key and signing method.</Message><Resource>192.168.67.103/storage/</Resource><RequestId></RequestId></Error>
```
|
1.0
|
interfacing with boto fails with SignatureDoesNotMatch - Set up a standalone LeoFS server. Tried the boto example code against it using boto=2.38.0. Using the same credentials, s3cmd works fine. But boto always dies with (debug enabled):
```
2015-07-28 18:25:08,390 boto [DEBUG]:Using access key provided by client.
2015-07-28 18:25:08,390 boto [DEBUG]:Using secret key provided by client.
2015-07-28 18:25:08,391 boto [DEBUG]:path=/storage/
2015-07-28 18:25:08,391 boto [DEBUG]:auth_path=/storage/
2015-07-28 18:25:08,391 boto [DEBUG]:Method: PUT
2015-07-28 18:25:08,391 boto [DEBUG]:Path: /storage/
2015-07-28 18:25:08,391 boto [DEBUG]:Data:
2015-07-28 18:25:08,391 boto [DEBUG]:Headers: {}
2015-07-28 18:25:08,392 boto [DEBUG]:Host: 192.168.67.103:8080
2015-07-28 18:25:08,392 boto [DEBUG]:Port: 8080
2015-07-28 18:25:08,392 boto [DEBUG]:Params: {}
2015-07-28 18:25:08,392 boto [DEBUG]:establishing HTTP connection: kwargs={'port': 8080, 'timeout': 70}
2015-07-28 18:25:08,392 boto [DEBUG]:Token: None
2015-07-28 18:25:08,393 boto [DEBUG]:StringToSign:
PUT
Tue, 28 Jul 2015 12:55:08 GMT
/storage/
2015-07-28 18:25:08,393 boto [DEBUG]:Signature:
AWS ebbaefcccdc7dbfc43f1:KzME+LWC650YPS5dUDQBFdDyouo=
2015-07-28 18:25:08,393 boto [DEBUG]:Final headers: {'Date': 'Tue, 28 Jul 2015 12:55:08 GMT', 'Content-Length': '0', 'Authorization': u'AWS ebbaefcccdc7dbfc43f1:KzME+LWC650YPS5dUDQBFdDyouo=', 'User-Agent': 'Boto/2.38.0 Python/2.7.9 Darwin/14.4.0'}
2015-07-28 18:25:08,429 boto [DEBUG]:Response headers: [('date', 'Tue, 28 Jul 2015 12:55:07 GMT'), ('connection', 'keep-alive'), ('content-length', '304'), ('server', 'LeoFS')]
Traceback (most recent call last):
File "test_leo.py", line 22, in <module>
bucket = conn.create_bucket("storage")
File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/boto/s3/connection.py", line 621, in create_bucket
response.status, response.reason, body)
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden
<?xml version="1.0" encoding="UTF-8"?><Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your AWS secret access key and signing method.</Message><Resource>192.168.67.103/storage/</Resource><RequestId></RequestId></Error>
```
|
non_process
|
interfacing with boto fails with signaturedoesnotmatch setup a standalone leofs server tried the boto example code against it using boto usng same credentials on works fine but boto always dies with debug enabled boto using access key provided by client boto using secret key provided by client boto path storage boto auth path storage boto method put boto path storage boto data boto headers boto host boto port boto params boto establishing http connection kwargs port timeout boto token none boto stringtosign put tue jul gmt storage boto signature aws kzme boto final headers date tue jul gmt content length authorization u aws kzme user agent boto python darwin boto response headers traceback most recent call last file test leo py line in bucket conn create bucket storage file opt local library frameworks python framework versions lib site packages boto connection py line in create bucket response status response reason body boto exception forbidden signaturedoesnotmatch the request signature we calculated does not match the signature you provided check your aws secret access key and signing method storage
| 0
|
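The boto debug log above shows AWS Signature Version 2 at work: the client base64-encodes an HMAC-SHA1 of a canonical "StringToSign" built from the verb, Content-MD5, Content-Type, Date, and the canonicalized resource. A SignatureDoesNotMatch therefore usually means the gateway reconstructed a different StringToSign than the client did — a common culprit being path-style versus virtual-host-style bucket addressing yielding different canonical resources. A minimal sketch of the client-side computation, reconstructed from the log and using a placeholder secret rather than any real credential:

```python
import base64
import hashlib
import hmac

def sign_v2(secret_key: str, string_to_sign: str) -> str:
    """AWS signature v2: base64(HMAC-SHA1(secret, StringToSign))."""
    digest = hmac.new(secret_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode("ascii")

# Reconstructed from the boto debug output above: verb, then empty
# Content-MD5 and Content-Type lines, the Date header, and finally the
# canonicalized resource for a path-style bucket request.
string_to_sign = "PUT\n\n\nTue, 28 Jul 2015 12:55:08 GMT\n/storage/"

signature = sign_v2("not-a-real-secret", string_to_sign)
# A SHA-1 HMAC is 20 bytes, so the base64 signature is always 28 chars.
print(len(signature))  # 28
```

Comparing the StringToSign each side logs (boto prints it with debug on, as above) is usually the fastest way to find which component the gateway canonicalizes differently.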
10,399
| 13,201,346,214
|
IssuesEvent
|
2020-08-14 09:58:03
|
bisq-network/proposals
|
https://api.github.com/repos/bisq-network/proposals
|
closed
|
Add Detailed BSQ Issuance Information to Cycle Report
|
a:proposal re:processes was:approved
|
> _This is a Bisq Network proposal. Please familiarize yourself with the [submission and review process](https://docs.bisq.network/proposals.html)._
<!-- Please do not remove the text above. -->
### TL;DR
The [end-of-cycle report](https://bisq.network/blog/cycle-7-results/) currently lists the amount of BSQ that was issued and an approximate amount that was burned. I propose that we expand the BSQ Issuance information to include a breakdown of where the issued BSQ was "spent".
### Motivation
> I use the term "spent" in this proposal to reference issued BSQ that causes inflation.
There are two major questions that I had when evaluating the end-of-cycle report:
1. Where is the project spending money right now?
2. Does that spending match the priorities of the project?
Currently, it is hard to find where the money is being spent so evaluating whether or not the values are inline with the project priorities is hard. Currently, an interested party has to go through every compensation report, decode the initiative that each person participated in, and group them together. Then, they can analyze the data.
But, I believe that having 1 person do this (and potentially updating the compensation request template to require contributors to fill in this information) will give much more transparency to the project in terms of how much Bisq is spending on each initiative.
With that information more accessible, it should make it easier to identify areas where more investment is needed or incentives should be changed to ensure the priorities of the project are also the priorities of the contributors. Or, where funding should be reduced because the ROI isn't there.
### Implementation
I propose a simple broad set of categories that can always be expanded or reduced as appropriate. Each of these would be a sub-bullet under the "X BSQ Issued" line that already exists in the end-of-cycle report. The goal would be to start collecting the information so future proposals or discussions can happen backed with real data.
I've gone through the last few cycles and I think this list covers the major pieces. Feel free to comment if you think something should be added or removed. Or, if you have a particular use case that isn't covered in one of these sections.
1. Software Development (devs, maintainers, etc)
2. Testing
3. Marketing
4. Translations
5. Support
6. Infrastructure (Node operators, hosting, donation address owner salary, etc)
|
1.0
|
Add Detailed BSQ Issuance Information to Cycle Report - > _This is a Bisq Network proposal. Please familiarize yourself with the [submission and review process](https://docs.bisq.network/proposals.html)._
<!-- Please do not remove the text above. -->
### TL;DR
The [end-of-cycle report](https://bisq.network/blog/cycle-7-results/) currently lists the amount of BSQ that was issued and an approximate amount that was burned. I propose that we expand the BSQ Issuance information to include a breakdown of where the issued BSQ was "spent".
### Motivation
> I use the term "spent" in this proposal to reference issued BSQ that causes inflation.
There are two major questions that I had when evaluating the end-of-cycle report:
1. Where is the project spending money right now?
2. Does that spending match the priorities of the project?
Currently, it is hard to find where the money is being spent so evaluating whether or not the values are inline with the project priorities is hard. Currently, an interested party has to go through every compensation report, decode the initiative that each person participated in, and group them together. Then, they can analyze the data.
But, I believe that having 1 person do this (and potentially updating the compensation request template to require contributors to fill in this information) will give much more transparency to the project in terms of how much Bisq is spending on each initiative.
With that information more accessible, it should make it easier to identify areas where more investment is needed or incentives should be changed to ensure the priorities of the project are also the priorities of the contributors. Or, where funding should be reduced because the ROI isn't there.
### Implementation
I propose a simple broad set of categories that can always be expanded or reduced as appropriate. Each of these would be a sub-bullet under the "X BSQ Issued" line that already exists in the end-of-cycle report. The goal would be to start collecting the information so future proposals or discussions can happen backed with real data.
I've gone through the last few cycles and I think this list covers the major pieces. Feel free to comment if you think something should be added or removed. Or, if you have a particular use case that isn't covered in one of these sections.
1. Software Development (devs, maintainers, etc)
2. Testing
3. Marketing
4. Translations
5. Support
6. Infrastructure (Node operators, hosting, donation address owner salary, etc)
|
process
|
add detailed bsq issuance information to cycle report this is a bisq network proposal please familiarize yourself with the tl dr currently the currently lists the amount of bsq that was issued and an approximate amount that was burned i propose that we expand the bsq issuance information to include the breakdown of where the issued bsq was spent motivation i use the term spent in this proposal to reference issued bsq that causes inflation there are two major questions that i had when evaluating the end of cycle report where is the project spending money right now does that spending match the priorities of the project currently it is hard to find where the money is being spent so evaluating whether or not the values are inline with the project priorities is hard currently an interested party has to go through every compensation report decode the initiative that each person participated in and group them together then they can analyze the data but i believe that having person do this and potentially updating the compensation request template to require contributors to fill in this information will give much more transparency to the project in terms of how much bisq is spending on each initiative with that information more accessible it should make it easier to identify areas where more investment is needed or incentives should be changed to ensure the priorities of the project are also the priorities of the contributors or where funding should be reduced because the roi isn t there implementation i propose a simple broad set of categories that can always be expanded or reduced as appropriate each of these would be a sub bullet under the x bsq issued line that already exists in the end of cycle report the goal would be to start collecting the information so future proposals or discussions can happen backed with real data i ve gone through the last few cycles and i think this list covers the major pieces feel free to comment if you think something should be added or 
removed or if you have a particular use case that isn t covered in one of these sections software development devs maintainers etc testing marketing translations support infrastructure node operators hosting donation address owner salary etc
| 1
|
10,788
| 13,608,996,313
|
IssuesEvent
|
2020-09-23 03:58:20
|
googleapis/java-phishingprotection
|
https://api.github.com/repos/googleapis/java-phishingprotection
|
closed
|
Dependency Dashboard
|
api: phishingprotection type: process
|
This issue contains a list of Renovate updates and their statuses.
## Open
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- [ ] <!-- rebase-branch=renovate/org.apache.maven.plugins-maven-project-info-reports-plugin-3.x -->build(deps): update dependency org.apache.maven.plugins:maven-project-info-reports-plugin to v3.1.1
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-phishingprotection-0.x -->chore(deps): update dependency com.google.cloud:google-cloud-phishingprotection to v0.29.1
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
1.0
|
Dependency Dashboard - This issue contains a list of Renovate updates and their statuses.
## Open
These updates have all been created already. Click a checkbox below to force a retry/rebase of any.
- [ ] <!-- rebase-branch=renovate/org.apache.maven.plugins-maven-project-info-reports-plugin-3.x -->build(deps): update dependency org.apache.maven.plugins:maven-project-info-reports-plugin to v3.1.1
- [ ] <!-- rebase-branch=renovate/com.google.cloud-google-cloud-phishingprotection-0.x -->chore(deps): update dependency com.google.cloud:google-cloud-phishingprotection to v0.29.1
---
- [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
|
process
|
dependency dashboard this issue contains a list of renovate updates and their statuses open these updates have all been created already click a checkbox below to force a retry rebase of any build deps update dependency org apache maven plugins maven project info reports plugin to chore deps update dependency com google cloud google cloud phishingprotection to check this box to trigger a request for renovate to run again on this repository
| 1
|
96,003
| 27,717,638,925
|
IssuesEvent
|
2023-03-14 18:01:36
|
helidon-io/helidon
|
https://api.github.com/repos/helidon-io/helidon
|
opened
|
Builder enhanced to support integration with OpenAPI / SmallRye
|
open-api-tools builder
|
See the [OCI's Example](https://docs.oracle.com/en-us/iaas/Content/API/SDKDocs/javasdkconcepts.htm#sync) as background. What is happening behind the scenes is that there is an API spec YAML, from which the `GetBucketResponse` and `GetBucketRequest` schemas generate POJOs. The builder pattern is then used (`builder().namespaceName("myNamespace").bucketName("myBucket").build()`) to construct the request.
This functionality would be a nice addition for enhancing our Builders, where we would delegate directly to _BuilderCreator_ to generate the backing pojo types for this use case.
|
1.0
|
Builder enhanced to support integration with OpenAPI / SmallRye - See the [OCI's Example](https://docs.oracle.com/en-us/iaas/Content/API/SDKDocs/javasdkconcepts.htm#sync) as background. What is happening behind the scenes is that there is an API spec YAML, from which the `GetBucketResponse` and `GetBucketRequest` schemas generate POJOs. The builder pattern is then used (`builder().namespaceName("myNamespace").bucketName("myBucket").build()`) to construct the request.
This functionality would be a nice addition for enhancing our Builders, where we would delegate directly to _BuilderCreator_ to generate the backing pojo types for this use case.
|
non_process
|
builder enhanced to support integration with openapi smallrye see the as background what is happening here behind the scene is that there is an api spec yaml where getbucketresponse and getbucketrequest schema generated pojo and then the builder pattern is used builder namespacename mynamespace bucketname mybucket build to construct the request this functionality would be a nice addition for enhancing our builders where we would delegate directly to buildercreator to generate the backing pojo types for this use case
| 0
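The fluent construction described in this record — `builder().namespaceName("myNamespace").bucketName("myBucket").build()` against a schema-generated pojo — can be sketched as follows. This is an illustrative Python rendering only: the real artifacts would be Java types emitted by _BuilderCreator_, and `GetBucketRequest` with its field names is taken from the OCI example, not from any existing Helidon API.

```python
class GetBucketRequest:
    """Hypothetical schema-generated request pojo (names follow the OCI example)."""

    def __init__(self, namespace_name, bucket_name):
        self.namespace_name = namespace_name
        self.bucket_name = bucket_name

    class Builder:
        def __init__(self):
            self._namespace_name = None
            self._bucket_name = None

        def namespace_name(self, value):
            self._namespace_name = value
            return self  # returning self is what enables fluent chaining

        def bucket_name(self, value):
            self._bucket_name = value
            return self

        def build(self):
            return GetBucketRequest(self._namespace_name, self._bucket_name)

    @staticmethod
    def builder():
        return GetBucketRequest.Builder()


# Mirrors the chained usage from the OCI example:
req = (GetBucketRequest.builder()
       .namespace_name("myNamespace")
       .bucket_name("myBucket")
       .build())
```

The enhancement proposed above would have the builder generator emit both the backing pojo and this builder entry point in one pass.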
|
2,130
| 4,970,715,870
|
IssuesEvent
|
2016-12-05 16:43:18
|
opendataby/city-dashboard
|
https://api.github.com/repos/opendataby/city-dashboard
|
closed
|
Describe main parameters for displaying on the dashboard (main pollutants) and their normal levels
|
need to process
|
Describe main parameters for displaying on the dashboard (main pollutants) and their normal levels
* main pollutants
* normal level for each one
|
1.0
|
Describe main parameters for displaying on the dashboard (main pollutants) and their normal levels - Describe main parameters for displaying on the dashboard (main pollutants) and their normal levels
* main pollutants
* normal level for each one
|
process
|
describe main parameters for displaying on the dashboard main pollutants and their normal levels describe main parameters for displaying on the dashboard main pollutants and their normal levels main pollutants normal level for each one
| 1
|
9,976
| 13,020,785,996
|
IssuesEvent
|
2020-07-27 04:19:51
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Cannot import saved batch process of Model
|
Bug Modeller Processing Regression
|
**Describe the bug**
I have created a model using Graphical Modeler in QGIS 3.14. The model works fine as a single process.
I then created a Batch Process. The batch process also works fine.
Then I saved the Batch Process.
Now when I open the saved Batch Process JSON file, it does not load and gives the following error
'An error has occurred while executing Python code:
NameError: name 'QDateTime' is not defined
Traceback (most recent call last):
File "C:/PROGRA~1/QGIS3~1.14/apps/qgis/./python/plugins\processing\gui\BatchPanel.py", line 543, in load
value = eval(params[param.name()])
File "", line 1, in
NameError: name 'QDateTime' is not defined
Python version: 3.7.0 (v3.7.0:1bf9cc5093, Jun 27 2018, 04:59:51) [MSC v.1914 64 bit (AMD64)]
QGIS version: 3.14.0-Pi Pi, 9f7028fd23 '
Second issue is that one of the input fields does not show up in the columns when we are creating the Batch process.
The saved batch file opens in QGIS 3.10, but the 'Datetime' input of the Graphical Modeler shows an error and does not work. I guess there is no 'Datetime' input in 3.10
**How to Reproduce**
1. Go to Graphical Modeler
2. Create a model
3. Run the model
4. Select 'Run as Batch Process'
5. Create the batch Process and save it.
6. Close the Batch Process and open it again
7. Load the same saved batch Process
8. See the error --> NameError: name 'QDateTime' is not defined
**QGIS and OS versions**
<!-- In the QGIS Help menu -> About, click in the table, Ctrl+A and then Ctrl+C. Finally paste here -->
QGIS version
3.14.0-Pi
QGIS code revision
9f7028fd23
Compiled against Qt
5.11.2
Running against Qt
5.11.2
Compiled against GDAL/OGR
3.0.4
Running against GDAL/OGR
3.0.4
Compiled against GEOS
3.8.1-CAPI-1.13.3
Running against GEOS
3.8.1-CAPI-1.13.3
Compiled against SQLite
3.29.0
Running against SQLite
3.29.0
PostgreSQL Client Version
11.5
SpatiaLite Version
4.3.0
QWT Version
6.1.3
QScintilla2 Version
2.10.8
Compiled against PROJ
6.3.2
Running against PROJ
Rel. 6.3.2, May 1st, 2020
OS Version
Windows 10 (10.0)
Active python plugins
db_manager;
MetaSearch;
processing

**Additional context**
<!-- Add any other context about the problem here. -->
|
1.0
|
Cannot import saved batch process of Model - **Describe the bug**
I have created a model using Graphical Modeler in QGIS 3.14. The model works fine as a single process.
I then created a Batch Process. The batch process also works fine.
Then I saved the Batch Process.
Now when I open the saved Batch Process JSON file, it does not load and gives the following error
'An error has occurred while executing Python code:
NameError: name 'QDateTime' is not defined
Traceback (most recent call last):
File "C:/PROGRA~1/QGIS3~1.14/apps/qgis/./python/plugins\processing\gui\BatchPanel.py", line 543, in load
value = eval(params[param.name()])
File "", line 1, in
NameError: name 'QDateTime' is not defined
Python version: 3.7.0 (v3.7.0:1bf9cc5093, Jun 27 2018, 04:59:51) [MSC v.1914 64 bit (AMD64)]
QGIS version: 3.14.0-Pi Pi, 9f7028fd23 '
Second issue is that one of the input fields does not show up in the columns when we are creating the Batch process.
The saved batch file opens in QGIS 3.10, but the 'Datetime' input of the Graphical Modeler shows an error and does not work. I guess there is no 'Datetime' input in 3.10
**How to Reproduce**
1. Go to Graphical Modeler
2. Create a model
3. Run the model
4. Select 'Run as Batch Process'
5. Create the batch Process and save it.
6. Close the Batch Process and open it again
7. Load the same saved batch Process
8. See the error --> NameError: name 'QDateTime' is not defined
**QGIS and OS versions**
<!-- In the QGIS Help menu -> About, click in the table, Ctrl+A and then Ctrl+C. Finally paste here -->
QGIS version
3.14.0-Pi
QGIS code revision
9f7028fd23
Compiled against Qt
5.11.2
Running against Qt
5.11.2
Compiled against GDAL/OGR
3.0.4
Running against GDAL/OGR
3.0.4
Compiled against GEOS
3.8.1-CAPI-1.13.3
Running against GEOS
3.8.1-CAPI-1.13.3
Compiled against SQLite
3.29.0
Running against SQLite
3.29.0
PostgreSQL Client Version
11.5
SpatiaLite Version
4.3.0
QWT Version
6.1.3
QScintilla2 Version
2.10.8
Compiled against PROJ
6.3.2
Running against PROJ
Rel. 6.3.2, May 1st, 2020
OS Version
Windows 10 (10.0)
Active python plugins
db_manager;
MetaSearch;
processing

**Additional context**
<!-- Add any other context about the problem here. -->
|
process
|
cannot import saved batch process of model describe the bug i have created a model using graphical modeler in qgis the works fine as a single process i then created a batch process the batch process also works fine then i saved the batch process now when open the saved batch process json file it does not load and gives the following error an error has occurred while executing python code nameerror name qdatetime is not defined traceback most recent call last file c progra apps qgis python plugins processing gui batchpanel py line in load value eval params file line in nameerror name qdatetime is not defined python version jun qgis version pi pi second issue is that one of the input fields does not show up in the columns when we are creating the batch process the saved batch file opens in qgis but the datetime input of the graphical modeler shows an error and does not work i guess there is not datetime input in how to reproduce go to graphical modeler create a model run the model select run as batch process create the batch process and save it close the batch process and open it again load the same saved batch process see the error nameerror name qdatetime is not defined qgis and os versions about click in the table ctrl a and then ctrl c finally paste here qgis version pi qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel may os version windows active python plugins db manager metasearch processing additional context
| 1
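The traceback in this record points at `value = eval(params[param.name()])` in `BatchPanel.py` resolving names in a namespace that lacks `QDateTime`. A minimal sketch of the failure and one possible remedy (supplying the missing name through `eval()`'s globals); the `QDateTime` class here is a hypothetical stand-in so the example runs without Qt installed, not the real `PyQt5.QtCore.QDateTime`:

```python
from datetime import datetime


class QDateTime:
    """Stand-in for PyQt5.QtCore.QDateTime so this sketch runs without Qt."""

    def __init__(self, iso):
        self.when = datetime.fromisoformat(iso)


# A saved batch file stores each parameter value as a Python source string:
params = {"start": "QDateTime('2020-07-27T00:00:00')"}

# Evaluating against an empty globals dict reproduces the reported
# "NameError: name 'QDateTime' is not defined":
try:
    eval(params["start"], {})
    failed = False
except NameError:
    failed = True

# Supplying the missing class in eval()'s globals resolves the lookup:
value = eval(params["start"], {"QDateTime": QDateTime})
```

This suggests the loader would work if the batch panel evaluated saved parameters in a namespace that pre-imports the Qt value types it serializes.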
|
22,582
| 31,810,170,516
|
IssuesEvent
|
2023-09-13 16:16:22
|
AssetRipper/AssetRipper
|
https://api.github.com/repos/AssetRipper/AssetRipper
|
closed
|
[Bug]: Incorrect UVs in Separated Static Meshes
|
bug mesh processing
|
### Are you on the latest version of AssetRipper?
Yes, I'm on the latest release of AssetRipper.
### Which release are you using?
Windows x64
### Which game did this occur on?
Tested on 3 games
using version 0.3.0.4 released
and latest commit as of 9/01/2023
### Which Unity version did this occur on?
2019.3.11
2020.3.29f1
2020.3.35f1
### Is the game Mono or IL2Cpp?
Mono
### Describe the issue.
Hi,
When using Static Mesh Separation, I noticed that the separated meshes have incorrect UVs.
I noticed this because these meshes have a textured material that does not show.
When I export without the Static Mesh Separation, the UVs are correct (in the corresponding Combined Mesh (root_ scene) xxx.asset
I can provide example exported mesh assets with incorrect UVs if you want to check this.
Also, just a question, What does 'enable Prefab outlining' do ?
Anyway,
Keep up the great work ;)
Regards,
Zbuffer
### Relevant log output
n/a no relevant info in the log.
I can provide one if needed.
|
1.0
|
[Bug]: Incorrect UVs in Separated Static Meshes - ### Are you on the latest version of AssetRipper?
Yes, I'm on the latest release of AssetRipper.
### Which release are you using?
Windows x64
### Which game did this occur on?
Tested on 3 games
using version 0.3.0.4 released
and latest commit as of 9/01/2023
### Which Unity version did this occur on?
2019.3.11
2020.3.29f1
2020.3.35f1
### Is the game Mono or IL2Cpp?
Mono
### Describe the issue.
Hi,
When using Static Mesh Separation, I noticed that the separated meshes have incorrect UVs.
I noticed this because these meshes have a textured material that does not show.
When I export without the Static Mesh Separation, the UVs are correct (in the corresponding Combined Mesh (root_ scene) xxx.asset
I can provide example exported mesh assets with incorrect UVs if you want to check this.
Also, just a question, What does 'enable Prefab outlining' do ?
Anyway,
Keep up the great work ;)
Regards,
Zbuffer
### Relevant log output
n/a no relevant info in the log.
I can provide one if needed.
|
process
|
incorrect uvs in separated static meshes are you on the latest version of assetripper yes i m on the latest release of assetripper which release are you using windows which game did this occur on tested on games using version released and latest commit as of which unity version did this occur on is the game mono or mono describe the issue hi when using static mesh seperation i noticed that the seperated meshes have incorect uvs i noticed this because these meshes have a textured material that does not show when i export without the static mesh seperation the uvs are correct in the corresponding combined mesh root scene xxx asset i can provide example exported mesh assets with incorrect uvs if you want to check this also just a question what does enable prefab outlining do anyway keep up the great work regards zbuffer relevant log output n a no relevant info in the log i can provide one if needed
| 1
|
1,569
| 4,165,435,381
|
IssuesEvent
|
2016-06-19 13:57:42
|
sysown/proxysql
|
https://api.github.com/repos/sysown/proxysql
|
opened
|
Create `before` and `after` triggers for queries
|
ADMIN CONNECTION POOL MYSQL PROTOCOL QUERY PROCESSOR
|
`mysql_query_rules` can be extended with 2 more fields (names to be defined) that will execute commands before and after the execution of the real queries.
This could be useful, for example, to change the default of a session variable before and after executing a query.
|
1.0
|
Create `before` and `after` triggers for queries - `mysql_query_rules` can be extended with 2 more fields (names to be defined) that will execute commands before and after the execution of the real queries.
This could be useful, for example, to change the default of a session variable before and after executing a query.
|
process
|
create before and after triggers for queries mysql query rules can be extended with more fields names to be defined that will execute commands before and after the executing of the real queries this could be useful for example to change the default of a session variable before and after executing a query
| 1
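A minimal sketch of the semantics this record proposes — an optional command executed before and after the matched query, e.g. to flip a session variable around it. `execute_with_hooks` and the `before`/`after` parameter names are hypothetical, since the issue explicitly leaves the new `mysql_query_rules` field names undefined:

```python
def execute_with_hooks(conn, query, before=None, after=None):
    """Run an optional setup command before, and a teardown command after,
    the real query — the per-rule trigger behaviour proposed in the issue."""
    if before:
        conn(before)
    result = conn(query)
    if after:
        conn(after)
    return result


executed = []


def fake_conn(statement):
    # Stand-in connection that records what would be sent to the backend.
    executed.append(statement)
    return statement


execute_with_hooks(fake_conn,
                   "SELECT * FROM t",
                   before="SET SESSION sql_mode='ANSI'",
                   after="SET SESSION sql_mode=DEFAULT")
```

With this shape, a rule that matches `SELECT * FROM t` would transparently wrap it in the two `SET SESSION` statements.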
|
438,444
| 30,642,288,007
|
IssuesEvent
|
2023-07-24 23:33:14
|
FRRouting/frr
|
https://api.github.com/repos/FRRouting/frr
|
closed
|
RPM's for newer version
|
question documentation packaging
|
I'm curious if FRR will be providing binary RPM's (CentOS 7) for 7.2.1 or 7.3.
Currently we're using the provided RPM's: frr-7.2-01.el7.centos.x86_64
Thanks,
Nick
|
1.0
|
RPM's for newer version - I'm curious if FRR will be providing binary RPM's (CentOS 7) for 7.2.1 or 7.3.
Currently we're using the provided RPM's: frr-7.2-01.el7.centos.x86_64
Thanks,
Nick
|
non_process
|
rpm s for newer version i m curious if frr will be providing binary rpm s centos for or currently we re using the provided rpm s frr centos thanks nick
| 0
|
44,854
| 11,520,803,834
|
IssuesEvent
|
2020-02-14 15:28:37
|
wexond/desktop
|
https://api.github.com/repos/wexond/desktop
|
closed
|
Build error with npm
|
build platform/windows
|
**Bug description**
Hi, I've successfully installed the dependencies with "npm install" without errors but I can't build or run the application. I've tried both "npm run dev" and "npm start" with same results.
This is the complete output I get:
> wexond@4.0.0-beta.2 dev C:\desktop-master
> cross-env START=1 npm run watch
> wexond@4.0.0-beta.2 watch C:\desktop-master
> npm run extensions && concurrently "cross-env ENV='dev' webpack-dev-server --config webpack.config.renderer.js" " cross-env ENV='dev' webpack" "cross-env ENV='dev' webpack-dev-server --config webpack.config.web.js"
> wexond@4.0.0-beta.2 extensions C:\desktop-master
> node scripts/extensions.js
[1]
[1] webpack is watching the files...
[1]
[2] i 「wds」: Project is running at http://localhost:4445/
[2] i 「wds」: webpack output is served from /
[2] i 「wds」: Content not from webpack is served from C:\desktop-master\build
[0] i 「wds」: Project is running at http://localhost:4444/
[0] i 「wds」: webpack output is served from /
[0] i 「wds」: Content not from webpack is served from C:\desktop-master\build
[2] [hardsource:51cbc9e7] Using 102 MB of disk space.
[2] [hardsource:51cbc9e7] Writing new cache 51cbc9e7...
[2] [hardsource:51cbc9e7] Tracking node dependencies with: package-lock.json, yarn.lock.
[2] Starting type checking service...
[2] Using 1 worker with 2048MB memory limit
[0] [hardsource:56a7bcb3] Using 102 MB of disk space.
[0] [hardsource:56a7bcb3] Writing new cache 56a7bcb3...
[0] [hardsource:56a7bcb3] Tracking node dependencies with: package-lock.json, yarn.lock.
[0] Starting type checking service...
[0] Using 1 worker with 2048MB memory limit
[2] Type checking in progress...
[2] i 「wdm」: Hash: 994488287bc0e506180f
[2] Version: webpack 4.41.5
[2] Time: 17193ms
[2] Built at: 02/12/2020 4:59:35 PM
[2] Asset Size Chunks Chunk Names
[2] 0029d6a83da87ebe19dbd1d3f8dc4fa8.svg 233 bytes [emitted]
[2] 09e70abbf7f3c8e0a712186b7840f08d.svg 265 bytes [emitted]
[2] 0d79c982c22fb9d4568cbdf9d42bd45c.png 10.6 KiB [emitted]
[2] 0e5a7b7a5f91d27ccc518401734adc77.svg 443 bytes [emitted]
[2] 11ea5efbd1c9e8f273869a6d8487a079.svg 194 bytes [emitted]
[2] 120534507fea30fb285bf30b09ff4736.svg 330 bytes [emitted]
[2] 125b8fe2cc7b8bfefed62142eff893c8.svg 349 bytes [emitted]
[2] 1731195fa1171e46db6c945db25dd722.svg 280 bytes [emitted]
[2] 1cf6f0d3481497ab66b3287324f3ec0d.png 6.59 KiB [emitted]
[2] 2246be89f74fcc7abdfa9b3d73cb90af.png 5.43 KiB [emitted]
[2] 2269792780a1012a77e2939ce390aa53.svg 590 bytes [emitted]
[2] 24a93bb6a956c41e0f5fde63df92e183.svg 184 bytes [emitted]
[2] 269c63a5f4d4151e28ce757080a00532.svg 347 bytes [emitted]
[2] 2751ee43015f9884c3642f103b7f70c9.woff2 48.1 KiB [emitted]
[2] 34bd6069c9f08bb444c86b8d099a000e.svg 315 bytes [emitted]
[2] 3dc9c6f233c8796c5a5854bcbb8a7a35.svg 283 bytes [emitted]
[2] 4253cfb9638d23b00d60c54f650d403f.png 7.38 KiB [emitted]
[2] 43d95dda9128450b9d5f1b743ae0cfaa.svg 279 bytes [emitted]
[2] 45b30f6af163a69443f70b9e363a9a6e.svg 287 bytes [emitted]
[2] 4d7dbcf6a1a48daa3613496488d59f53.svg 1.41 KiB [emitted]
[2] 5274472f3d866509ed3244c83991584b.svg 265 bytes [emitted]
[2] 574fd0b50367f886d359e8264938fc37.woff2 49 KiB [emitted]
[2] 5a3e9cea76f24ce5298f51238bdeb156.svg 229 bytes [emitted]
[2] 5f2c6e1979fcb3bac8179f4fe55c4cf4.svg 555 bytes [emitted]
[2] 64de38475d6ea6a35c277c6873f7fa08.svg 362 bytes [emitted]
[2] 654ec65131c8d4315ee1b585c478f8ba.svg 608 bytes [emitted]
[2] 69f8a0617ac472f78e45841323a3df9e.woff2 48.2 KiB [emitted]
[2] 6a545d58c2fcfde4ef7edb644866ed03.png 7.43 KiB [emitted]
[2] 721ee70cae50d5ee89bf69baa0029c8b.svg 2 KiB [emitted]
[2] 7241febe64c5c37603f1a6fe0e999a43.svg 265 bytes [emitted]
[2] 74fc797d2d34f79f05a5255671eaeb25.svg 199 bytes [emitted]
[2] 81bf2a2b7b05ac48c1d6e0b730b166d1.svg 311 bytes [emitted]
[2] 88daae015cde151a45c6e1a9cea04327.png 9.62 KiB [emitted]
[2] 8a0c8c4e90d8ac7681629b333225fc86.svg 398 bytes [emitted]
[2] 8dbfeb98213195d6f9d5e021b053c064.svg 266 bytes [emitted]
[2] 91da60531102e1686f6016e61a973a03.svg 360 bytes [emitted]
[2] 9e1a5f1d0a2778445c645683f7fb5fd9.svg 212 bytes [emitted]
[2] a301453b875c4730cf5d924b71171607.png 5.76 KiB [emitted]
[2] a5e3e9a9f3269c148e40b0b421001779.svg 283 bytes [emitted]
[2] a606474bb0d05e002edd595ce9c4d7a6.svg 784 bytes [emitted]
[2] ac9440512ba56b8f58e72384a30b0cb6.svg 198 bytes [emitted]
[2] afbaceae723f5019f524054125bc44c8.svg 808 bytes [emitted]
[2] b414c240e67daa96b06f9f45d01e1da9.svg 404 bytes [emitted]
[2] b7084da2b883ad0a2a68f0e602d14e8c.png 9.42 KiB [emitted]
[2] b8b53672e112c11e6002ad6028811cd1.png 9.29 KiB [emitted]
[2] bb54155821853dba1bda8b0a99ed1b5e.png 7.04 KiB [emitted]
[2] bce27a350d2d50e3b545aaacb555a617.svg 195 bytes [emitted]
[2] be448bbcb5ce45a136734b4155654240.svg 199 bytes [emitted]
[2] bookmarks.bundle.js 142 KiB bookmarks [emitted] bookmarks
[2] bookmarks.html 298 bytes [emitted]
[2] c1c4f176c0bdbb4285d8c1a16d60efa9.svg 682 bytes [emitted]
[2] c275e5189ecaabdbfdc5407dc031c28b.svg 183 bytes [emitted]
[2] c3d75740c5c58e55c4e6e9bac7e742ac.svg 195 bytes [emitted]
[2] d036747814b227e57059b656104e11af.svg 263 bytes [emitted]
[2] d36bd4a7a15f50d1a42ad58049254a48.svg 215 bytes [emitted]
[2] d4aaf69abf5165baddfe5f3ab28efc9c.svg 294 bytes [emitted]
[2] d60ca0119007632070ea071d45b2b68f.svg 593 bytes [emitted]
[2] e27192cbdd7dbc4012199afa6ba27175.svg 292 bytes [emitted]
[2] e6961caf78e208185a88d01ea2bc217d.svg 260 bytes [emitted]
[2] ea50966763a4e1164188394d49c2634b.svg 233 bytes [emitted]
[2] f02a9cc48ed93dd5af0de8b96bbae622.svg 330 bytes [emitted]
[2] f2060e6fa6ced402f155e6df5b14055c.svg 366 bytes [emitted]
[2] f5aa4fa4e4e9cef40beeca635415ce98.svg 274 bytes [emitted]
[2] f910cf3223a14fefec6cb63f892332d9.svg 266 bytes [emitted]
[2] f91a90c85b604a0df77a47fb5809bda1.png 3.56 KiB [emitted]
[2] fde3bc5099f6a1a78e705bc1402a0b28.png 8.09 KiB [emitted]
[2] history.bundle.js 98.7 KiB history [emitted] history
[2] history.html 296 bytes [emitted]
[2] newtab.bundle.js 594 KiB newtab [emitted] newtab
[2] newtab.html 295 bytes [emitted]
[2] settings.bundle.js 206 KiB settings [emitted] settings
[2] settings.html 297 bytes [emitted]
[2] vendor.web.bundle.js 4.94 MiB vendor.web [emitted] vendor.web
[2] Entrypoint settings = vendor.web.bundle.js settings.bundle.js
[2] Entrypoint history = vendor.web.bundle.js history.bundle.js
[2] Entrypoint newtab = vendor.web.bundle.js newtab.bundle.js
[2] Entrypoint bookmarks = vendor.web.bundle.js bookmarks.bundle.js
[2] [0] multi (webpack)-dev-server/client?http://localhost:4445 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/settings 64 bytes {settings} [built]
[2] [5] multi (webpack)-dev-server/client?http://localhost:4445 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/history 64 bytes {history} [built]
[2] [6] multi (webpack)-dev-server/client?http://localhost:4445 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/newtab 64 bytes {newtab} [built]
[2] [9] multi (webpack)-dev-server/client?http://localhost:4445 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/bookmarks 64 bytes {bookmarks} [built]
[2] [./node_modules/react-dom/index.js] 1.33 KiB {vendor.web} [built]
[2] [./node_modules/react-hot-loader/dist/react-hot-loader.development.js] 97.1 KiB {vendor.web} [built]
[2] [./node_modules/react-hot-loader/patch.js] 229 bytes {vendor.web} [built]
[2] [./node_modules/react/index.js] 190 bytes {vendor.web} [built]
[2] [./node_modules/webpack-dev-server/client/index.js?http://localhost:4445] (webpack)-dev-server/client?http://localhost:4445 4.29 KiB {vendor.web} [built]
[2] [./node_modules/webpack-dev-server/client/overlay.js] (webpack)-dev-server/client/overlay.js 3.51 KiB {vendor.web} [built]
[2] [./node_modules/webpack/hot/dev-server.js] (webpack)/hot/dev-server.js 1.59 KiB {vendor.web} [built]
[2] [./src/renderer/views/bookmarks/index.tsx] 902 bytes {bookmarks} [built]
[2] [./src/renderer/views/history/index.tsx] 902 bytes {history} [built]
[2] [./src/renderer/views/newtab/index.tsx] 902 bytes {newtab} [built]
[2] [./src/renderer/views/settings/index.tsx] 902 bytes {settings} [built]
[2] + 260 hidden modules
[2] Child html-webpack-plugin for "bookmarks.html":
[2] 1 asset
[2] Entrypoint undefined = bookmarks.html
[2] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[2] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[2] [./node_modules/webpack/buildin/global.js] (webpack)/buildin/global.js 472 bytes {0} [built]
[2] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[2] Child html-webpack-plugin for "history.html":
[2] 1 asset
[2] Entrypoint undefined = history.html
[2] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[2] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[2] [./node_modules/webpack/buildin/global.js] (webpack)/buildin/global.js 472 bytes {0} [built]
[2] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[2] Child html-webpack-plugin for "newtab.html":
[2] 1 asset
[2] Entrypoint undefined = newtab.html
[2] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[2] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[2] [./node_modules/webpack/buildin/global.js] (webpack)/buildin/global.js 472 bytes {0} [built]
[2] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[2] Child html-webpack-plugin for "settings.html":
[2] 1 asset
[2] Entrypoint undefined = settings.html
[2] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[2] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[2] [./node_modules/webpack/buildin/global.js] (webpack)/buildin/global.js 472 bytes {0} [built]
[2] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[2] i 「wdm」: Compiled successfully.
[1] Hash: 10aee0863dd7294e10bec86d0639cf21b5e3f666
[1] Version: webpack 4.41.5
[1] Child
[1] Hash: 10aee0863dd7294e10be
[1] Time: 21874ms
[1] Built at: 02/12/2020 4:59:39 PM
[1] Asset Size Chunks Chunk Names
[1] bb872f07c62a77ce2e49c83d5ac786c4.node 601 KiB [emitted]
[1] extensions-preload.js 17.8 KiB [emitted]
[1] main.bundle.js 10.5 MiB main [emitted] main
[1] preload.js 7.36 KiB [emitted]
[1] Entrypoint main = main.bundle.js
[1] [./src/main/index.ts] 2.02 KiB {main} [built]
[1] [./src/main/menus/main.ts] 8.02 KiB {main} [built]
[1] [./src/main/models/protocol.ts] 1.49 KiB {main} [built]
[1] [./src/main/models/settings.ts] 6.88 KiB {main} [built]
[1] [./src/main/services/index.ts] 252 bytes {main} [built]
[1] [./src/main/services/storage.ts] 16.7 KiB {main} [built]
[1] [./src/main/sessions-manager.ts] 12.3 KiB {main} [built]
[1] [./src/main/windows-manager.ts] 4.64 KiB {main} [built]
[1] [./src/main/windows/index.ts] 208 bytes {main} [built]
[1] [./src/utils/files.ts] 825 bytes {main} [built]
[1] [./src/utils/index.ts] 336 bytes {main} [built]
[1] [electron] external "electron" 42 bytes {main} [built]
[1] [fs] external "fs" 42 bytes {main} [built]
[1] [os] external "os" 42 bytes {main} [built]
[1] [path] external "path" 42 bytes {main} [built]
[1] + 783 hidden modules
[1] Child
[1] Hash: c86d0639cf21b5e3f666
[1] Time: 4640ms
[1] Built at: 02/12/2020 4:59:22 PM
[1] Asset Size Chunks Chunk Names
[1] popup-preload.bundle.js 6.79 KiB popup-preload [emitted] popup-preload
[1] view-preload.bundle.js 118 KiB view-preload [emitted] view-preload
[1] Entrypoint view-preload = view-preload.bundle.js
[1] Entrypoint popup-preload = popup-preload.bundle.js
[1] [./package.json] 3.55 KiB {view-preload} [built]
[1] [./src/constants/files.ts] 554 bytes {view-preload} [built]
[1] [./src/constants/settings.ts] 8.06 KiB {view-preload} [built]
[1] [./src/preloads/chrome-webstore.ts] 3.02 KiB {view-preload} [built]
[1] [./src/preloads/constants/index.ts] 214 bytes {view-preload} [built]
[1] [./src/preloads/models/auto-complete.ts] 1.27 KiB {view-preload} [built]
[1] [./src/preloads/models/form.ts] 4.89 KiB {view-preload} [built]
[1] [./src/preloads/popup-preload.ts] 586 bytes {popup-preload} [built]
[1] [./src/preloads/utils/dom.ts] 248 bytes {view-preload} [built]
[1] [./src/preloads/utils/index.ts] 208 bytes {view-preload} [built]
[1] [./src/preloads/view-preload.ts] 7.34 KiB {view-preload} [built]
[1] [./src/renderer/constants/colors.ts] 678 bytes {view-preload} [built]
[1] [./src/renderer/constants/themes.ts] 3.39 KiB {view-preload} [built]
[1] [./src/utils/themes.ts] 351 bytes {view-preload} [built]
[1] [electron] external "electron" 42 bytes {view-preload} {popup-preload} [built]
[1] + 2 hidden modules
[1]
[1] > wexond@4.0.0-beta.2 start C:\desktop-master
[1] > cross-env NODE_ENV='dev' electron .
[1]
[1]
[1] App threw an error during load
[1] Error: Cannot open C:\desktop-master\build\bb872f07c62a77ce2e49c83d5ac786c4.node: Error: Module did not self-register.
[1] at Object.eval (webpack-internal:///./node_modules/keytar/build/Release/keytar.node:1:253)
[1] at eval (webpack-internal:///./node_modules/keytar/build/Release/keytar.node:2:30)
[1] at Object../node_modules/keytar/build/Release/keytar.node (C:\desktop-master\build\main.bundle.js:4101:1)
[1] at __webpack_require__ (C:\desktop-master\build\main.bundle.js:20:30)
[1] at eval (webpack-internal:///./node_modules/keytar/lib/keytar.js:1:14)
[1] at Object../node_modules/keytar/lib/keytar.js (C:\desktop-master\build\main.bundle.js:4112:1)
[1] at __webpack_require__ (C:\desktop-master\build\main.bundle.js:20:30)
[1] at eval (webpack-internal:///./src/main/services/messaging.ts:14:18)
[1] at Object../src/main/services/messaging.ts (C:\desktop-master\build\main.bundle.js:8808:1)
[1] at __webpack_require__ (C:\desktop-master\build\main.bundle.js:20:30)
[0] Type checking in progress...
[0] i 「wdm」: Hash: e5805cc70dbc87e5587986297f3783fd975b6122
[0] Version: webpack 4.41.5
[0] Child
[0] Hash: e5805cc70dbc87e55879
[0] Time: 37424ms
[0] Built at: 02/12/2020 4:59:55 PM
[0] Asset Size Chunks Chunk Names
[0] 0029d6a83da87ebe19dbd1d3f8dc4fa8.svg 233 bytes [emitted]
[0] 09e70abbf7f3c8e0a712186b7840f08d.svg 265 bytes [emitted]
[0] 0d79c982c22fb9d4568cbdf9d42bd45c.png 10.6 KiB [emitted]
[0] 0e5a7b7a5f91d27ccc518401734adc77.svg 443 bytes [emitted]
[0] 11ea5efbd1c9e8f273869a6d8487a079.svg 194 bytes [emitted]
[0] 120534507fea30fb285bf30b09ff4736.svg 330 bytes [emitted]
[0] 125b8fe2cc7b8bfefed62142eff893c8.svg 349 bytes [emitted]
[0] 1731195fa1171e46db6c945db25dd722.svg 280 bytes [emitted]
[0] 1cf6f0d3481497ab66b3287324f3ec0d.png 6.59 KiB [emitted]
[0] 2246be89f74fcc7abdfa9b3d73cb90af.png 5.43 KiB [emitted]
[0] 2269792780a1012a77e2939ce390aa53.svg 590 bytes [emitted]
[0] 24a93bb6a956c41e0f5fde63df92e183.svg 184 bytes [emitted]
[0] 269c63a5f4d4151e28ce757080a00532.svg 347 bytes [emitted]
[0] 2751ee43015f9884c3642f103b7f70c9.woff2 48.1 KiB [emitted]
[0] 34bd6069c9f08bb444c86b8d099a000e.svg 315 bytes [emitted]
[0] 3dc9c6f233c8796c5a5854bcbb8a7a35.svg 283 bytes [emitted]
[0] 4253cfb9638d23b00d60c54f650d403f.png 7.38 KiB [emitted]
[0] 43d95dda9128450b9d5f1b743ae0cfaa.svg 279 bytes [emitted]
[0] 45b30f6af163a69443f70b9e363a9a6e.svg 287 bytes [emitted]
[0] 4d7dbcf6a1a48daa3613496488d59f53.svg 1.41 KiB [emitted]
[0] 5274472f3d866509ed3244c83991584b.svg 265 bytes [emitted]
[0] 574fd0b50367f886d359e8264938fc37.woff2 49 KiB [emitted]
[0] 5a3e9cea76f24ce5298f51238bdeb156.svg 229 bytes [emitted]
[0] 5f2c6e1979fcb3bac8179f4fe55c4cf4.svg 555 bytes [emitted]
[0] 64de38475d6ea6a35c277c6873f7fa08.svg 362 bytes [emitted]
[0] 654ec65131c8d4315ee1b585c478f8ba.svg 608 bytes [emitted]
[0] 69f8a0617ac472f78e45841323a3df9e.woff2 48.2 KiB [emitted]
[0] 6a545d58c2fcfde4ef7edb644866ed03.png 7.43 KiB [emitted]
[0] 721ee70cae50d5ee89bf69baa0029c8b.svg 2 KiB [emitted]
[0] 7241febe64c5c37603f1a6fe0e999a43.svg 265 bytes [emitted]
[0] 74fc797d2d34f79f05a5255671eaeb25.svg 199 bytes [emitted]
[0] 81bf2a2b7b05ac48c1d6e0b730b166d1.svg 311 bytes [emitted]
[0] 88daae015cde151a45c6e1a9cea04327.png 9.62 KiB [emitted]
[0] 8a0c8c4e90d8ac7681629b333225fc86.svg 398 bytes [emitted]
[0] 8dbfeb98213195d6f9d5e021b053c064.svg 266 bytes [emitted]
[0] 91da60531102e1686f6016e61a973a03.svg 360 bytes [emitted]
[0] 9e1a5f1d0a2778445c645683f7fb5fd9.svg 212 bytes [emitted]
[0] a301453b875c4730cf5d924b71171607.png 5.76 KiB [emitted]
[0] a5e3e9a9f3269c148e40b0b421001779.svg 283 bytes [emitted]
[0] a606474bb0d05e002edd595ce9c4d7a6.svg 784 bytes [emitted]
[0] ac9440512ba56b8f58e72384a30b0cb6.svg 198 bytes [emitted]
[0] add-bookmark.bundle.js 109 KiB add-bookmark [emitted] add-bookmark
[0] add-bookmark.html 301 bytes [emitted]
[0] afbaceae723f5019f524054125bc44c8.svg 808 bytes [emitted]
[0] app.bundle.js 890 KiB app [emitted] app
[0] app.html 292 bytes [emitted]
[0] auth.bundle.js 55.4 KiB auth [emitted] auth
[0] auth.html 293 bytes [emitted]
[0] b414c240e67daa96b06f9f45d01e1da9.svg 404 bytes [emitted]
[0] b7084da2b883ad0a2a68f0e602d14e8c.png 9.42 KiB [emitted]
[0] b8b53672e112c11e6002ad6028811cd1.png 9.29 KiB [emitted]
[0] bb54155821853dba1bda8b0a99ed1b5e.png 7.04 KiB [emitted]
[0] bce27a350d2d50e3b545aaacb555a617.svg 195 bytes [emitted]
[0] be448bbcb5ce45a136734b4155654240.svg 199 bytes [emitted]
[0] c1c4f176c0bdbb4285d8c1a16d60efa9.svg 682 bytes [emitted]
[0] c275e5189ecaabdbfdc5407dc031c28b.svg 183 bytes [emitted]
[0] c3d75740c5c58e55c4e6e9bac7e742ac.svg 195 bytes [emitted]
[0] credentials.bundle.js 67.7 KiB credentials [emitted] credentials
[0] credentials.html 300 bytes [emitted]
[0] d036747814b227e57059b656104e11af.svg 263 bytes [emitted]
[0] d36bd4a7a15f50d1a42ad58049254a48.svg 215 bytes [emitted]
[0] d4aaf69abf5165baddfe5f3ab28efc9c.svg 294 bytes [emitted]
[0] d60ca0119007632070ea071d45b2b68f.svg 593 bytes [emitted]
[0] downloads.bundle.js 75.8 KiB downloads [emitted] downloads
[0] downloads.html 298 bytes [emitted]
[0] e27192cbdd7dbc4012199afa6ba27175.svg 292 bytes [emitted]
[0] e6961caf78e208185a88d01ea2bc217d.svg 260 bytes [emitted]
[0] ea50966763a4e1164188394d49c2634b.svg 233 bytes [emitted]
[0] f02a9cc48ed93dd5af0de8b96bbae622.svg 330 bytes [emitted]
[0] f2060e6fa6ced402f155e6df5b14055c.svg 366 bytes [emitted]
[0] f5aa4fa4e4e9cef40beeca635415ce98.svg 274 bytes [emitted]
[0] f910cf3223a14fefec6cb63f892332d9.svg 266 bytes [emitted]
[0] f91a90c85b604a0df77a47fb5809bda1.png 3.56 KiB [emitted]
[0] fde3bc5099f6a1a78e705bc1402a0b28.png 8.09 KiB [emitted]
[0] find.bundle.js 64.8 KiB find [emitted] find
[0] find.html 293 bytes [emitted]
[0] form-fill.bundle.js 57.7 KiB form-fill [emitted] form-fill
[0] form-fill.html 298 bytes [emitted]
[0] menu.bundle.js 83.8 KiB menu [emitted] menu
[0] menu.html 293 bytes [emitted]
[0] permissions.bundle.js 59.4 KiB permissions [emitted] permissions
[0] permissions.html 300 bytes [emitted]
[0] preview.bundle.js 56.6 KiB preview [emitted] preview
[0] preview.html 296 bytes [emitted]
[0] search.bundle.js 134 KiB search [emitted] search
[0] search.html 295 bytes [emitted]
[0] tabgroup.bundle.js 57 KiB tabgroup [emitted] tabgroup
[0] tabgroup.html 297 bytes [emitted]
[0] vendor.app.bundle.js 4.77 MiB vendor.app [emitted] vendor.app
[0] Entrypoint app = vendor.app.bundle.js app.bundle.js
[0] Entrypoint permissions = vendor.app.bundle.js permissions.bundle.js
[0] Entrypoint auth = vendor.app.bundle.js auth.bundle.js
[0] Entrypoint form-fill = vendor.app.bundle.js form-fill.bundle.js
[0] Entrypoint credentials = vendor.app.bundle.js credentials.bundle.js
[0] Entrypoint find = vendor.app.bundle.js find.bundle.js
[0] Entrypoint menu = vendor.app.bundle.js menu.bundle.js
[0] Entrypoint search = vendor.app.bundle.js search.bundle.js
[0] Entrypoint preview = vendor.app.bundle.js preview.bundle.js
[0] Entrypoint tabgroup = vendor.app.bundle.js tabgroup.bundle.js
[0] Entrypoint downloads = vendor.app.bundle.js downloads.bundle.js
[0] Entrypoint add-bookmark = vendor.app.bundle.js add-bookmark.bundle.js
[0] [0] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/app 64 bytes {app} [built]
[0] [13] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/permissions 64 bytes {permissions} [built]
[0] [14] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/auth 64 bytes {auth} [built]
[0] [15] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/form-fill 64 bytes {form-fill} [built]
[0] [16] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/credentials 64 bytes {credentials} [built]
[0] [17] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/find 64 bytes {find} [built]
[0] [18] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/menu 64 bytes {menu} [built]
[0] [19] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/search 64 bytes {search} [built]
[0] [20] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/preview 64 bytes {preview} [built]
[0] [21] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/tabgroup 64 bytes {tabgroup} [built]
[0] [22] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/downloads 64 bytes {downloads} [built]
[0] [23] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js react-hot-loader/patch ./src/renderer/views/add-bookmark 64 bytes {add-bookmark} [built]
[0] [./node_modules/react-hot-loader/patch.js] 229 bytes {vendor.app} [built]
[0] [./node_modules/webpack-dev-server/client/index.js?http://localhost:4444] (webpack)-dev-server/client?http://localhost:4444 4.29 KiB {vendor.app} [built]
[0] [./node_modules/webpack/hot/dev-server.js] (webpack)/hot/dev-server.js 1.59 KiB {vendor.app} [built]
[0] + 298 hidden modules
[0] Child html-webpack-plugin for "add-bookmark.html":
[0] 1 asset
[0] Entrypoint undefined = add-bookmark.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "app.html":
[0] 1 asset
[0] Entrypoint undefined = app.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "auth.html":
[0] 1 asset
[0] Entrypoint undefined = auth.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "credentials.html":
[0] 1 asset
[0] Entrypoint undefined = credentials.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "downloads.html":
[0] 1 asset
[0] Entrypoint undefined = downloads.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "find.html":
[0] 1 asset
[0] Entrypoint undefined = find.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "form-fill.html":
[0] 1 asset
[0] Entrypoint undefined = form-fill.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "menu.html":
[0] 1 asset
[0] Entrypoint undefined = menu.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "permissions.html":
[0] 1 asset
[0] Entrypoint undefined = permissions.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "preview.html":
[0] 1 asset
[0] Entrypoint undefined = preview.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "search.html":
[0] 1 asset
[0] Entrypoint undefined = search.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "tabgroup.html":
[0] 1 asset
[0] Entrypoint undefined = tabgroup.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child
[0] Hash: 86297f3783fd975b6122
[0] Time: 2710ms
[0] Built at: 02/12/2020 4:59:21 PM
[0] Asset Size Chunks Chunk Names
[0] extension-popup.bundle.js 783 KiB extension-popup [emitted] extension-popup
[0] extension-popup.html 841 bytes [emitted]
[0] Entrypoint extension-popup = extension-popup.bundle.js
[0] [1] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js ./src/renderer/views/extension-popup 52 bytes {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/index.js?http://localhost:4444] (webpack)-dev-server/client?http://localhost:4444 4.29 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/overlay.js] (webpack)-dev-server/client/overlay.js 3.51 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/socket.js] (webpack)-dev-server/client/socket.js 1.53 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/utils/createSocketUrl.js] (webpack)-dev-server/client/utils/createSocketUrl.js 2.91 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/utils/log.js] (webpack)-dev-server/client/utils/log.js 964 bytes {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/utils/reloadApp.js] (webpack)-dev-server/client/utils/reloadApp.js 1.59 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/utils/sendMessage.js] (webpack)-dev-server/client/utils/sendMessage.js 402 bytes {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/node_modules/strip-ansi/index.js] (webpack)-dev-server/node_modules/strip-ansi/index.js 161 bytes {extension-popup} [built]
[0] [./node_modules/webpack/hot sync ^\.\/log$] (webpack)/hot sync nonrecursive ^\.\/log$ 170 bytes {extension-popup} [built]
[0] [./node_modules/webpack/hot/dev-server.js] (webpack)/hot/dev-server.js 1.59 KiB {extension-popup} [built]
[0] [./node_modules/webpack/hot/emitter.js] (webpack)/hot/emitter.js 75 bytes {extension-popup} [built]
[0] [./node_modules/webpack/hot/log-apply-result.js] (webpack)/hot/log-apply-result.js 1.27 KiB {extension-popup} [built]
[0] [./node_modules/webpack/hot/log.js] (webpack)/hot/log.js 1.34 KiB {extension-popup} [built]
[0] [./src/renderer/views/extension-popup/index.tsx] 2.77 KiB {extension-popup} [built]
[0] + 16 hidden modules
[0] Child html-webpack-plugin for "extension-popup.html":
[0] 1 asset
[0] Entrypoint undefined = extension-popup.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/extension-popup.html] 1010 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] i 「wdm」: Compiled successfully.
[2] ERROR in C:/desktop-master/src/main/services/storage.ts(402,24):
[2] TS2349: This expression is not callable.
[2] Type 'typeof import("C:/desktop-master/node_modules/file-type/index")' has no call signatures.
[2] Version: typescript 3.7.5
[2] Time: 34792ms
[2] ERROR in C:/desktop-master/src/main/services/storage.ts(408,31):
[2] TS2349: This expression is not callable.
[2] Type 'typeof import("C:/desktop-master/node_modules/file-type/index")' has no call signatures.
[0] ERROR in C:/desktop-master/src/main/services/storage.ts(402,24):
[0] TS2349: This expression is not callable.
[0] Type 'typeof import("C:/desktop-master/node_modules/file-type/index")' has no call signatures.
[0] Version: typescript 3.7.5
[0] Time: 42281ms
[0] ERROR in C:/desktop-master/src/main/services/storage.ts(408,31):
[0] TS2349: This expression is not callable.
[0] Type 'typeof import("C:/desktop-master/node_modules/file-type/index")' has no call signatures.
[1] ^C^C^CTerminate batch job (Y/N)? ^CTerminate batch job (Y/N)? ^CTerminate batch job (Y/N)? ^Ccross-env ENV='dev' webpack-dev-server --config webpack.config.web.js exited with code 3221225786
[0] ^CTerminate batch job (Y/N)? cross-env ENV='dev' webpack-dev-server --config webpack.config.renderer.js exited with code 3221225786
[1] cross-env ENV='dev' webpack exited with code 3221225786
It also displays an error message box (see screenshot below), and when I close the message box the terminal doesn't continue: the cursor blinks on a new line, but I can't type any command and can only quit.
**To Reproduce**
Run "npm run dev" or "npm start".
**Expected behavior**
The build completes without errors.
**Screenshots**

**Details:**
- Operating System: Windows 7 64-bit SP1
- Wexond version: 4.0.0-beta2
- Last known working Wexond version: I've never used Wexond before.
Thank you in advance for your help!
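For context, the TS2349 errors above ("This expression is not callable") match the breaking API change in the `file-type` npm package at v13, which replaced the callable default export with named async helpers such as `fromBuffer`. The sketch below mocks that shape change locally (the `FileTypeV13` object and its PNG-only check are hypothetical stand-ins, not the real package) to show the kind of adjustment `storage.ts` would need:

```typescript
// Hypothetical stand-in mirroring the v12 -> v13 shape change of the
// `file-type` npm package; the real package inspects many magic numbers.
const FileTypeV13 = {
  async fromBuffer(
    buf: Uint8Array,
  ): Promise<{ ext: string; mime: string } | undefined> {
    const pngMagic = [0x89, 0x50, 0x4e, 0x47];
    const isPng = pngMagic.every((byte, i) => buf[i] === byte);
    return isPng ? { ext: 'png', mime: 'image/png' } : undefined;
  },
};

// v12-era code called the module directly: `const type = fileType(buffer)`.
// Under v13 that import is a namespace of functions, hence TS2349; the
// fix is to await one of the named helpers instead.
async function detectMime(buf: Uint8Array): Promise<string | undefined> {
  const type = await FileTypeV13.fromBuffer(buf);
  return type ? type.mime : undefined;
}
```

Alternatively, pinning `file-type` back to the v12 line in package.json would keep the callable export; which route the maintainers prefer is their call.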
[0] [./node_modules/react-hot-loader/patch.js] 229 bytes {vendor.app} [built]
[0] [./node_modules/webpack-dev-server/client/index.js?http://localhost:4444] (webpack)-dev-server/client?http://localhost:4444 4.29 KiB {vendor.app} [built]
[0] [./node_modules/webpack/hot/dev-server.js] (webpack)/hot/dev-server.js 1.59 KiB {vendor.app} [built]
[0] + 298 hidden modules
[0] Child html-webpack-plugin for "add-bookmark.html":
[0] 1 asset
[0] Entrypoint undefined = add-bookmark.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "app.html":
[0] 1 asset
[0] Entrypoint undefined = app.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "auth.html":
[0] 1 asset
[0] Entrypoint undefined = auth.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "credentials.html":
[0] 1 asset
[0] Entrypoint undefined = credentials.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "downloads.html":
[0] 1 asset
[0] Entrypoint undefined = downloads.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "find.html":
[0] 1 asset
[0] Entrypoint undefined = find.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "form-fill.html":
[0] 1 asset
[0] Entrypoint undefined = form-fill.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "menu.html":
[0] 1 asset
[0] Entrypoint undefined = menu.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "permissions.html":
[0] 1 asset
[0] Entrypoint undefined = permissions.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "preview.html":
[0] 1 asset
[0] Entrypoint undefined = preview.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "search.html":
[0] 1 asset
[0] Entrypoint undefined = search.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child html-webpack-plugin for "tabgroup.html":
[0] 1 asset
[0] Entrypoint undefined = tabgroup.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/app.html] 375 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] Child
[0] Hash: 86297f3783fd975b6122
[0] Time: 2710ms
[0] Built at: 02/12/2020 4:59:21 PM
[0] Asset Size Chunks Chunk Names
[0] extension-popup.bundle.js 783 KiB extension-popup [emitted] extension-popup
[0] extension-popup.html 841 bytes [emitted]
[0] Entrypoint extension-popup = extension-popup.bundle.js
[0] [1] multi (webpack)-dev-server/client?http://localhost:4444 (webpack)/hot/dev-server.js ./src/renderer/views/extension-popup 52 bytes {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/index.js?http://localhost:4444] (webpack)-dev-server/client?http://localhost:4444 4.29 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/overlay.js] (webpack)-dev-server/client/overlay.js 3.51 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/socket.js] (webpack)-dev-server/client/socket.js 1.53 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/utils/createSocketUrl.js] (webpack)-dev-server/client/utils/createSocketUrl.js 2.91 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/utils/log.js] (webpack)-dev-server/client/utils/log.js 964 bytes {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/utils/reloadApp.js] (webpack)-dev-server/client/utils/reloadApp.js 1.59 KiB {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/client/utils/sendMessage.js] (webpack)-dev-server/client/utils/sendMessage.js 402 bytes {extension-popup} [built]
[0] [./node_modules/webpack-dev-server/node_modules/strip-ansi/index.js] (webpack)-dev-server/node_modules/strip-ansi/index.js 161 bytes {extension-popup} [built]
[0] [./node_modules/webpack/hot sync ^\.\/log$] (webpack)/hot sync nonrecursive ^\.\/log$ 170 bytes {extension-popup} [built]
[0] [./node_modules/webpack/hot/dev-server.js] (webpack)/hot/dev-server.js 1.59 KiB {extension-popup} [built]
[0] [./node_modules/webpack/hot/emitter.js] (webpack)/hot/emitter.js 75 bytes {extension-popup} [built]
[0] [./node_modules/webpack/hot/log-apply-result.js] (webpack)/hot/log-apply-result.js 1.27 KiB {extension-popup} [built]
[0] [./node_modules/webpack/hot/log.js] (webpack)/hot/log.js 1.34 KiB {extension-popup} [built]
[0] [./src/renderer/views/extension-popup/index.tsx] 2.77 KiB {extension-popup} [built]
[0] + 16 hidden modules
[0] Child html-webpack-plugin for "extension-popup.html":
[0] 1 asset
[0] Entrypoint undefined = extension-popup.html
[0] [./node_modules/html-webpack-plugin/lib/loader.js!./static/pages/extension-popup.html] 1010 bytes {0} [built]
[0] [./node_modules/lodash/lodash.js] 528 KiB {0} [built]
[0] [./node_modules/webpack/buildin/module.js] (webpack)/buildin/module.js 497 bytes {0} [built]
[0] i 「wdm」: Compiled successfully.
[2] ERROR in C:/desktop-master/src/main/services/storage.ts(402,24):
[2] TS2349: This expression is not callable.
[2] Type 'typeof import("C:/desktop-master/node_modules/file-type/index")' has no call signatures.
[2] Version: typescript 3.7.5
[2] Time: 34792ms
[2] ERROR in C:/desktop-master/src/main/services/storage.ts(408,31):
[2] TS2349: This expression is not callable.
[2] Type 'typeof import("C:/desktop-master/node_modules/file-type/index")' has no call signatures.
[0] ERROR in C:/desktop-master/src/main/services/storage.ts(402,24):
[0] TS2349: This expression is not callable.
[0] Type 'typeof import("C:/desktop-master/node_modules/file-type/index")' has no call signatures.
[0] Version: typescript 3.7.5
[0] Time: 42281ms
[0] ERROR in C:/desktop-master/src/main/services/storage.ts(408,31):
[0] TS2349: This expression is not callable.
[0] Type 'typeof import("C:/desktop-master/node_modules/file-type/index")' has no call signatures.
[1] ^C^C^CTerminare il processo batch (S/N)? ^CTerminare il processo batch (S/N)? ^CTerminare il processo batch (S/N)? ^Ccross-env ENV='dev' webpack-dev-server --config webpack.config.web.js exited with code 3221225786
[0] ^CTerminare il processo batch (S/N)? cross-env ENV='dev' webpack-dev-server --config webpack.config.renderer.js exited with code 3221225786
[1] cross-env ENV='dev' webpack exited with code 3221225786
It also displays an error message box (see the screenshot below), and when I close the message box the terminal doesn't continue: the cursor blinks on a new line, but I can't type any command and can only quit.
**To Reproduce**
Run "npm run dev" or "npm start".
**Expected behavior**
Building without errors.
**Screenshots**

**Details:**
- Operating System: Windows 7 64-bit SP1
- Wexond version: 4.0.0-beta2
- Last known working Wexond version: I've never used Wexond before.
Thank you in advance for your help!
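For reference (an inference, not stated in the report): the TS2349 errors above match the breaking change in `file-type` v13, which dropped the callable default export in favour of named async helpers such as `fromBuffer` and `fromFile`, so the two flagged call sites in `storage.ts` would need updating. A dependency-free stand-in sketching the `fromBuffer` return shape:

```javascript
// Hedged, dependency-free sketch: file-type >= 13 is no longer callable;
// callers switch to named helpers such as FileType.fromBuffer(buf), which
// resolve to { ext, mime } or undefined. This stand-in mimics that result
// shape for PNG only — the real module detects many more formats.
function fromBufferSketch(buf) {
  const pngMagic = Buffer.from([0x89, 0x50, 0x4e, 0x47]);
  if (buf.length >= 4 && buf.slice(0, 4).equals(pngMagic)) {
    return { ext: 'png', mime: 'image/png' };
  }
  return undefined; // unknown type, as the real module also returns
}

console.log(fromBufferSketch(Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a])));
```

If the inference holds, the fix at the two flagged lines is `FileType.fromBuffer(buffer)` / `FileType.fromFile(path)` (both async), or pinning `file-type` below v13.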
|
non_process
|
build error with npm bug description hi i ve successfully installed the dependecies with npm install without errors but i can t build or run the application i ve tried both npm run dev and npm start with same results this is the complete output i get wexond beta dev c desktop master cross env start npm run watch wexond beta watch c desktop master npm run extensions concurrently cross env env dev webpack dev server config webpack config renderer js cross env env dev webpack cross env env dev webpack dev server config webpack config web js wexond beta extensions c desktop master node scripts extensions js webpack is watching the files 「wds」 project is running at 「wds」 webpack output is served from 「wds」 content not from webpack is served from c desktop master build 「wds」 project is running at 「wds」 webpack output is served from 「wds」 content not from webpack is served from c desktop master build using mb of disk space writing new cache tracking node dependencies with package lock json yarn lock starting type checking service using worker with memory limit using mb of disk space writing new cache tracking node dependencies with package lock json yarn lock starting type checking service using worker with memory limit type checking in progress 「wdm」 hash version webpack time built at pm asset size chunks chunk names svg bytes svg bytes png kib svg bytes svg bytes svg bytes svg bytes svg bytes png kib png kib svg bytes svg bytes svg bytes kib svg bytes svg bytes png kib svg bytes svg bytes svg kib svg bytes kib svg bytes svg bytes svg bytes svg bytes kib png kib svg kib svg bytes svg bytes svg bytes png kib svg bytes svg bytes svg bytes svg bytes png kib svg bytes svg bytes svg bytes svg bytes svg bytes png kib png kib png kib svg bytes svg bytes bookmarks bundle js kib bookmarks bookmarks bookmarks html bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes png kib png kib history 
bundle js kib history history history html bytes newtab bundle js kib newtab newtab newtab html bytes settings bundle js kib settings settings settings html bytes vendor web bundle js mib vendor web vendor web entrypoint settings vendor web bundle js settings bundle js entrypoint history vendor web bundle js history bundle js entrypoint newtab vendor web bundle js newtab bundle js entrypoint bookmarks vendor web bundle js bookmarks bundle js multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views settings bytes settings multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views history bytes history multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views newtab bytes newtab multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views bookmarks bytes bookmarks kib vendor web kib vendor web bytes vendor web bytes vendor web webpack dev server client kib vendor web webpack dev server client overlay js kib vendor web webpack hot dev server js kib vendor web bytes bookmarks bytes history bytes newtab bytes settings hidden modules child html webpack plugin for bookmarks html asset entrypoint undefined bookmarks html bytes kib webpack buildin global js bytes webpack buildin module js bytes child html webpack plugin for history html asset entrypoint undefined history html bytes kib webpack buildin global js bytes webpack buildin module js bytes child html webpack plugin for newtab html asset entrypoint undefined newtab html bytes kib webpack buildin global js bytes webpack buildin module js bytes child html webpack plugin for settings html asset entrypoint undefined settings html bytes kib webpack buildin global js bytes webpack buildin module js bytes 「wdm」 compiled successfully hash version webpack child hash time built at pm asset size chunks chunk names node kib extensions preload js kib main bundle js 
mib main main preload js kib entrypoint main main bundle js kib main kib main kib main kib main bytes main kib main kib main kib main bytes main bytes main bytes main external electron bytes main external fs bytes main external os bytes main external path bytes main hidden modules child hash time built at pm asset size chunks chunk names popup preload bundle js kib popup preload popup preload view preload bundle js kib view preload view preload entrypoint view preload view preload bundle js entrypoint popup preload popup preload bundle js kib view preload bytes view preload kib view preload kib view preload bytes view preload kib view preload kib view preload bytes popup preload bytes view preload bytes view preload kib view preload bytes view preload kib view preload bytes view preload external electron bytes view preload popup preload hidden modules wexond beta start c desktop master cross env node env dev electron app threw an error during load error cannot open c desktop master build node error module did not self register at object eval webpack internal node modules keytar build release keytar node at eval webpack internal node modules keytar build release keytar node at object node modules keytar build release keytar node c desktop master build main bundle js at webpack require c desktop master build main bundle js at eval webpack internal node modules keytar lib keytar js at object node modules keytar lib keytar js c desktop master build main bundle js at webpack require c desktop master build main bundle js at eval webpack internal src main services messaging ts at object src main services messaging ts c desktop master build main bundle js at webpack require c desktop master build main bundle js type checking in progress 「wdm」 hash version webpack child hash time built at pm asset size chunks chunk names svg bytes svg bytes png kib svg bytes svg bytes svg bytes svg bytes svg bytes png kib png kib svg bytes svg bytes svg bytes kib svg bytes svg bytes png kib 
svg bytes svg bytes svg kib svg bytes kib svg bytes svg bytes svg bytes svg bytes kib png kib svg kib svg bytes svg bytes svg bytes png kib svg bytes svg bytes svg bytes svg bytes png kib svg bytes svg bytes svg bytes add bookmark bundle js kib add bookmark add bookmark add bookmark html bytes svg bytes app bundle js kib app app app html bytes auth bundle js kib auth auth auth html bytes svg bytes png kib png kib png kib svg bytes svg bytes svg bytes svg bytes svg bytes credentials bundle js kib credentials credentials credentials html bytes svg bytes svg bytes svg bytes svg bytes downloads bundle js kib downloads downloads downloads html bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes svg bytes png kib png kib find bundle js kib find find find html bytes form fill bundle js kib form fill form fill form fill html bytes menu bundle js kib menu menu menu html bytes permissions bundle js kib permissions permissions permissions html bytes preview bundle js kib preview preview preview html bytes search bundle js kib search search search html bytes tabgroup bundle js kib tabgroup tabgroup tabgroup html bytes vendor app bundle js mib vendor app vendor app entrypoint app vendor app bundle js app bundle js entrypoint permissions vendor app bundle js permissions bundle js entrypoint auth vendor app bundle js auth bundle js entrypoint form fill vendor app bundle js form fill bundle js entrypoint credentials vendor app bundle js credentials bundle js entrypoint find vendor app bundle js find bundle js entrypoint menu vendor app bundle js menu bundle js entrypoint search vendor app bundle js search bundle js entrypoint preview vendor app bundle js preview bundle js entrypoint tabgroup vendor app bundle js tabgroup bundle js entrypoint downloads vendor app bundle js downloads bundle js entrypoint add bookmark vendor app bundle js add bookmark bundle js multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views app bytes 
app multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views permissions bytes permissions multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views auth bytes auth multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views form fill bytes form fill multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views credentials bytes credentials multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views find bytes find multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views menu bytes menu multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views search bytes search multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views preview bytes preview multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views tabgroup bytes tabgroup multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views downloads bytes downloads multi webpack dev server client webpack hot dev server js react hot loader patch src renderer views add bookmark bytes add bookmark bytes vendor app webpack dev server client kib vendor app webpack hot dev server js kib vendor app hidden modules child html webpack plugin for add bookmark html asset entrypoint undefined add bookmark html bytes kib webpack buildin module js bytes child html webpack plugin for app html asset entrypoint undefined app html bytes kib webpack buildin module js bytes child html webpack plugin for auth html asset entrypoint undefined auth html bytes kib webpack buildin module js bytes child html webpack plugin for credentials html asset entrypoint undefined credentials html bytes kib webpack buildin module js bytes child html webpack 
plugin for downloads html asset entrypoint undefined downloads html bytes kib webpack buildin module js bytes child html webpack plugin for find html asset entrypoint undefined find html bytes kib webpack buildin module js bytes child html webpack plugin for form fill html asset entrypoint undefined form fill html bytes kib webpack buildin module js bytes child html webpack plugin for menu html asset entrypoint undefined menu html bytes kib webpack buildin module js bytes child html webpack plugin for permissions html asset entrypoint undefined permissions html bytes kib webpack buildin module js bytes child html webpack plugin for preview html asset entrypoint undefined preview html bytes kib webpack buildin module js bytes child html webpack plugin for search html asset entrypoint undefined search html bytes kib webpack buildin module js bytes child html webpack plugin for tabgroup html asset entrypoint undefined tabgroup html bytes kib webpack buildin module js bytes child hash time built at pm asset size chunks chunk names extension popup bundle js kib extension popup extension popup extension popup html bytes entrypoint extension popup extension popup bundle js multi webpack dev server client webpack hot dev server js src renderer views extension popup bytes extension popup webpack dev server client kib extension popup webpack dev server client overlay js kib extension popup webpack dev server client socket js kib extension popup webpack dev server client utils createsocketurl js kib extension popup webpack dev server client utils log js bytes extension popup webpack dev server client utils reloadapp js kib extension popup webpack dev server client utils sendmessage js bytes extension popup webpack dev server node modules strip ansi index js bytes extension popup webpack hot sync nonrecursive log bytes extension popup webpack hot dev server js kib extension popup webpack hot emitter js bytes extension popup webpack hot log apply result js kib extension popup 
webpack hot log js kib extension popup kib extension popup hidden modules child html webpack plugin for extension popup html asset entrypoint undefined extension popup html bytes kib webpack buildin module js bytes 「wdm」 compiled successfully error in c desktop master src main services storage ts this expression is not callable type typeof import c desktop master node modules file type index has no call signatures version typescript time error in c desktop master src main services storage ts this expression is not callable type typeof import c desktop master node modules file type index has no call signatures error in c desktop master src main services storage ts this expression is not callable type typeof import c desktop master node modules file type index has no call signatures version typescript time error in c desktop master src main services storage ts this expression is not callable type typeof import c desktop master node modules file type index has no call signatures c c cterminare il processo batch s n cterminare il processo batch s n cterminare il processo batch s n ccross env env dev webpack dev server config webpack config web js exited with code cterminare il processo batch s n cross env env dev webpack dev server config webpack config renderer js exited with code cross env env dev webpack exited with code it also displays an error message box see screenshot below and when i close the messagebox the terminal doesn t continue the cursor blinks in a new line but i can t write any command and i m only able to quit to reproduce run npm run dev or npm start expected behavior building without errors screenshots details operating system windows bit wexond version last known working wexond version i ve never used wexond before thank you in advance for your help
| 0
|
1,871
| 4,697,838,743
|
IssuesEvent
|
2016-10-12 10:43:47
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
execFile hangs when child process exits, but has disowned children
|
child_process
|
* **Version**: 4.3.2
* **Platform**: OS X 10.11.3
* **Subsystem**: `child_process`
When spawning a child via `child_process.execFile`, if that child spawns but then disowns another process, `execFile` never calls its callback, even though the returned `ChildProcess` _does_ emit an `exit` event.
For example:
`disowner.sh`:
```
yes >/dev/null & # Don't forget to kill me after you're done testing!
disown
```
`hang.js`:
```
const execFile = require('child_process').execFile;
const proc = execFile('bash', ['disowner.sh'], () => {
console.log('callback');
});
proc.on('exit', () => { console.log('child exited'); });
```
---
Expected:
```
$ node hang.js
child exited
callback
$
```
Actual:
```
$ node hang.js
child exited
```
|
1.0
|
execFile hangs when child process exits, but has disowned children - * **Version**: 4.3.2
* **Platform**: OS X 10.11.3
* **Subsystem**: `child_process`
When spawning a child via `child_process.execFile`, if that child spawns but then disowns another process, `execFile` never calls its callback, even though the returned `ChildProcess` _does_ emit an `exit` event.
For example:
`disowner.sh`:
```
yes >/dev/null & # Don't forget to kill me after you're done testing!
disown
```
`hang.js`:
```
const execFile = require('child_process').execFile;
const proc = execFile('bash', ['disowner.sh'], () => {
console.log('callback');
});
proc.on('exit', () => { console.log('child exited'); });
```
---
Expected:
```
$ node hang.js
child exited
callback
$
```
Actual:
```
$ node hang.js
child exited
```
|
process
|
execfile hangs when child process exits but has disowned children version platform os x subsystem child process when spawning a child via child process execfile and if that child spawns but disowns another process execfile never calls its callback even though the returned childprocess does emit an exit event for example disowner sh yes dev null don t forget to kill me after you re done testing disown hang js const execfile require child process execfile const proc execfile bash console log callback proc on exit console log child exited expected node hang js child exited callback actual node hang js child exited
| 1
|
16,859
| 5,293,789,216
|
IssuesEvent
|
2017-02-09 08:51:08
|
Alzi/TinyCMS
|
https://api.github.com/repos/Alzi/TinyCMS
|
closed
|
Implement `change_image` within gui.Image and/or gui.SelectImage
|
cleaner code
|
We should prevent single items inside the ContentCell's `components[]` array from having to be removed.
Either add a clean new ContentType, or implement the possibility to change them inside the `gui.py` module.
|
1.0
|
Implement `change_image` within gui.Image and/or gui.SelectImage - We should prevent single items inside the ContentCell's `components[]` array from having to be removed.
Either add a clean new ContentType, or implement the possibility to change them inside the `gui.py` module.
|
non_process
|
implement change image within gui image and or gui selectimage we should prevent that single items inside the contentcell s components array need to be removed either add a clean new contenttype or implement the possibility to change them inside the gui py module
| 0
|
15,119
| 18,851,982,191
|
IssuesEvent
|
2021-11-11 22:16:05
|
googleapis/python-bigquery-pandas
|
https://api.github.com/repos/googleapis/python-bigquery-pandas
|
opened
|
CircleCI tests with prerelease (nightlies) are failing due to conda timeout
|
type: process
|
Stuck on "solving environment"
```
+++ export CONDA_PREFIX=/opt/conda/envs/test-environment
+++ CONDA_PREFIX=/opt/conda/envs/test-environment
+++ export CONDA_SHLVL=1
+++ CONDA_SHLVL=1
+++ export CONDA_DEFAULT_ENV=test-environment
+++ CONDA_DEFAULT_ENV=test-environment
+++ export CONDA_PROMPT_MODIFIER=
+++ CONDA_PROMPT_MODIFIER=
+++ export CONDA_EXE=/opt/conda/bin/conda
+++ CONDA_EXE=/opt/conda/bin/conda
+++ export _CE_M=
+++ _CE_M=
+++ export _CE_CONDA=
+++ _CE_CONDA=
+++ export CONDA_PYTHON_EXE=/opt/conda/bin/python
+++ CONDA_PYTHON_EXE=/opt/conda/bin/python
++ __conda_hashr
++ '[' -n '' ']'
++ '[' -n '' ']'
++ hash -r
+ REQ=ci/requirements-3.9-NIGHTLY
+ conda install -q --file ci/requirements-3.9-NIGHTLY.conda
+ local cmd=install
+ case "$cmd" in
+ __conda_exe install -q --file ci/requirements-3.9-NIGHTLY.conda
+ __add_sys_prefix_to_path
+ '[' -n '' ']'
++ dirname /opt/conda/bin/conda
+ SYSP=/opt/conda/bin
++ dirname /opt/conda/bin
+ SYSP=/opt/conda
+ '[' -n '' ']'
+ PATH=/opt/conda/bin:/opt/conda/envs/test-environment/bin:/opt/conda/condabin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ export PATH
+ /opt/conda/bin/conda install -q --file ci/requirements-3.9-NIGHTLY.conda
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... /opt/conda/etc/profile.d/conda.sh: line 35: 799 Killed "$CONDA_EXE" $_CE_M $_CE_CONDA "$@"
+ return
Exited with code exit status 137
CircleCI received exit code 137
```
https://app.circleci.com/pipelines/github/googleapis/python-bigquery-pandas/150/workflows/f1bfd01f-dd32-42bc-90d4-4a11c26046ec/jobs/1108
Maybe switch to `mamba`?
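(Exit 137 is the OOM killer rather than a timeout, which fits conda's classic solver ballooning on nightly channels.) A hedged sketch of the suggested `mamba` switch for the install step — command names only, assuming Miniconda is already on `PATH`; the surrounding CI config is unchanged:

```shell
# CI install-step sketch: mamba is a drop-in replacement for `conda install`
# with a much lighter C++ dependency solver, avoiding the memory blowup
# that got this job killed with exit code 137.
conda install -n base -c conda-forge -q mamba
mamba install -q --file ci/requirements-3.9-NIGHTLY.conda
```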
|
1.0
|
CircleCI tests with prerelease (nightlies) are failing due to conda timeout - Stuck on "solving environment"
```
+++ export CONDA_PREFIX=/opt/conda/envs/test-environment
+++ CONDA_PREFIX=/opt/conda/envs/test-environment
+++ export CONDA_SHLVL=1
+++ CONDA_SHLVL=1
+++ export CONDA_DEFAULT_ENV=test-environment
+++ CONDA_DEFAULT_ENV=test-environment
+++ export CONDA_PROMPT_MODIFIER=
+++ CONDA_PROMPT_MODIFIER=
+++ export CONDA_EXE=/opt/conda/bin/conda
+++ CONDA_EXE=/opt/conda/bin/conda
+++ export _CE_M=
+++ _CE_M=
+++ export _CE_CONDA=
+++ _CE_CONDA=
+++ export CONDA_PYTHON_EXE=/opt/conda/bin/python
+++ CONDA_PYTHON_EXE=/opt/conda/bin/python
++ __conda_hashr
++ '[' -n '' ']'
++ '[' -n '' ']'
++ hash -r
+ REQ=ci/requirements-3.9-NIGHTLY
+ conda install -q --file ci/requirements-3.9-NIGHTLY.conda
+ local cmd=install
+ case "$cmd" in
+ __conda_exe install -q --file ci/requirements-3.9-NIGHTLY.conda
+ __add_sys_prefix_to_path
+ '[' -n '' ']'
++ dirname /opt/conda/bin/conda
+ SYSP=/opt/conda/bin
++ dirname /opt/conda/bin
+ SYSP=/opt/conda
+ '[' -n '' ']'
+ PATH=/opt/conda/bin:/opt/conda/envs/test-environment/bin:/opt/conda/condabin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
+ export PATH
+ /opt/conda/bin/conda install -q --file ci/requirements-3.9-NIGHTLY.conda
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... /opt/conda/etc/profile.d/conda.sh: line 35: 799 Killed "$CONDA_EXE" $_CE_M $_CE_CONDA "$@"
+ return
Exited with code exit status 137
CircleCI received exit code 137
```
https://app.circleci.com/pipelines/github/googleapis/python-bigquery-pandas/150/workflows/f1bfd01f-dd32-42bc-90d4-4a11c26046ec/jobs/1108
Maybe switch to `mamba`?
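If the project does switch solvers, the change could look something like this hypothetical CircleCI step (the step name and the conda-forge channel are my assumptions; the requirements file name comes from the log above):

```yaml
# Hypothetical CircleCI step: install mamba once, then let it resolve the
# nightly requirements instead of conda's slower (and here OOM-killed) solver.
- run:
    name: Install nightly dependencies with mamba
    command: |
      conda install -n base -c conda-forge -y mamba
      mamba install -q -y --file ci/requirements-3.9-NIGHTLY.conda
```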
|
process
| 1
|
8,050
| 11,220,788,584
|
IssuesEvent
|
2020-01-07 16:28:23
|
code4romania/expert-consultation-api
|
https://api.github.com/repos/code4romania/expert-consultation-api
|
closed
|
[Documents] Load new document to platform
|
document processing documents java spring
|
As a user of the Legal Consultation platform I want to be able to load a document containing the text of a law proposition. The document can be in doc or docx format.
The loading of the document will
- save the document metadata
- trigger the document breakdown logic, which will split the document into articles and chapters
- save the document breakdown structure
- return the new UUID of the document metadata so it can be queried

|
1.0
|
process
| 1
|
68,384
| 7,094,209,646
|
IssuesEvent
|
2018-01-13 00:38:12
|
The-Charge/2015_TrashBot
|
https://api.github.com/repos/The-Charge/2015_TrashBot
|
closed
|
Strafe and Drive Ticks
|
test on trashbot
|
This needs to be tested on trashbot. Make sure the tick values for driving and strafing are right.
**STRAFING SHOULD NOT BE RIGHT. THE VALUE WILL CHANGE FROM TRASHBOT TO PLYBOT.**
|
1.0
|
non_process
| 0
|
352,386
| 25,064,935,065
|
IssuesEvent
|
2022-11-07 07:21:14
|
crispindeity/issue-tracker
|
https://api.github.com/repos/crispindeity/issue-tracker
|
opened
|
Add Spring Rest Doc
|
📃 Documentation 📬 API BE
|
# Description
- Add Spring Rest Doc to document the API, and carry out the documentation.
# Progress
- [ ] Add the Spring Rest Doc dependency
- [ ] Document the controllers
|
1.0
|
non_process
| 0
|
17,071
| 22,535,890,476
|
IssuesEvent
|
2022-06-25 07:52:14
|
PyCQA/pylint
|
https://api.github.com/repos/PyCQA/pylint
|
closed
|
pylint 2.7 hangs on many-core Windows machines
|
Bug :beetle: Crash 💥 topic-multiprocessing Windows 🪟
|
### Bug description
We use PyLint in Chromium and we have been switching to Python 3 and PyLint 2.7 recently. While testing on a many-core (96 logical processor) machine I found that PyLint reliably hangs. This is due to a known Python 3 bug with multi-processing on machines with more than 56 (possibly more than 63 or 64) logical processors. We first hit this bug last year (discussed in crbug.com/1190269) but only recently discovered that PyLint is vulnerable.
If we were running PyLint from the command line then we could just pass a -j value that is capped at 56 (or 63, or 64) but I have not been able to figure out how to cap the jobs value when invoking PyLint through lint.Run().
Internally PyLint should cap its parallelism until the Python 3 bug (not yet reported I believe, sorry) is fixed.
### Configuration
```ini
96 logical processor Windows machine
The version of Windows does not seem to matter. This only happens on Python 3 and it is believed to affect all versions of Python 3.
```
### Command used
```shell
I invoke pylint quite directly using "git cl presubmit --all --force" from Chromium. This ends up invoking many steps including pylint_main.py, like this:
from pylint import lint # pylint: disable=bad-option-value,import-outside-toplevel
lint.Run(argv)
https://source.chromium.org/chromium/chromium/tools/depot_tools/+/main:pylint_main.py;l=17?q=pylint_main.py&sq=
```
### Pylint output
```shell
No output - it hangs.
```
### Expected behavior
I expect the PyLint run to complete.
### Pylint version
```shell
pylint 2.7
Python 3.8.10
```
### OS / Environment
Windows 10
### Additional dependencies
_No response_
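A workaround sketch for callers (the function and cap value are my assumptions, not Chromium's or pylint's actual code): rewrite the argv passed to `lint.Run()` so the `-j`/`--jobs` value never exceeds a safe cap on many-core machines:

```python
# Cap pylint's parallelism before calling lint.Run(argv), working around
# the CPython multiprocessing hang on Windows hosts with ~60+ logical CPUs.
def capped_jobs_argv(argv, cpu_count, cap=56):
    """Strip any user-supplied -j/--jobs flag and append a capped one."""
    out, skip_next = [], False
    for arg in argv:
        if skip_next:
            skip_next = False           # drop the value token after -j/--jobs
        elif arg in ("-j", "--jobs"):
            skip_next = True
        elif arg.startswith("--jobs=") or (arg.startswith("-j") and len(arg) > 2):
            continue                    # drop fused forms like -j96 / --jobs=96
        else:
            out.append(arg)
    return out + ["-j", str(min(cpu_count, cap))]
```

On a 96-logical-processor machine this yields `-j 56`, below the Windows handle limit.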
|
1.0
|
process
| 1
|
7,609
| 10,723,438,744
|
IssuesEvent
|
2019-10-27 18:48:22
|
akey7/organicml
|
https://api.github.com/repos/akey7/organicml
|
closed
|
Collect 400 training images
|
Collection/Preprocessing
|
Take images from my organic chemistry textbooks. 200 with benzene rings. 200 without benzene rings.
|
1.0
|
process
| 1
|
9,528
| 12,500,607,207
|
IssuesEvent
|
2020-06-01 22:43:58
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Field Mapper parameters cannot have parent layer set in models
|
Bug Processing
|
Author Name: **gcarrillo -** (gcarrillo -)
Original Redmine Issue: [18605](https://issues.qgis.org/issues/18605)
Affected QGIS version: 3.7(master)
Redmine category:processing/modeller
Assignee: Victor Olaya
---
A model using Refactor Fields algorithm is expected to have two inputs:
1) Input Layer
2) Fields Mapper
When running such a model and setting an input layer, the Source Expression in the Fields Mapper widget never gets a layer set. Therefore, users cannot choose among the input layer's fields, nor use the "Fields and Values" section in the Expression Builder dialog.
Explanatory GIF: http://downloads.tuxfamily.org/tuxgis/tmp/refactor_fields_expression_layer.gif
---
- [model_etl_ladm_col.model3](https://issues.qgis.org/attachments/download/12548/model_etl_ladm_col.model3) (gcarrillo -)
- [model_etl_ladm_col_.model3](https://issues.qgis.org/attachments/download/12549/model_etl_ladm_col_.model3) (gcarrillo -)
|
1.0
|
process
| 1
|
15,471
| 19,682,720,572
|
IssuesEvent
|
2022-01-11 18:24:57
|
CliMA/TurbulenceConvection.jl
|
https://api.github.com/repos/CliMA/TurbulenceConvection.jl
|
opened
|
Contour plots should use interpolated data for comparison
|
enhancement Post processing :gear:
|
Currently, our contour plots plot the raw data from simulations, it may be a smoother experience for users if we instead plotted interpolated data when, for example, we change the top of the domain.
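A minimal sketch of the idea (names and data shapes are my assumptions; the actual post-processing is Julia): linearly interpolate both profiles onto a shared grid over their overlapping heights before drawing a comparison, so raw data on different domains (e.g. a changed domain top) stays aligned.

```python
def interp(z, zs, fs):
    """Piecewise-linear interpolation of samples (zs, fs) at height z."""
    if z <= zs[0]:
        return fs[0]
    for (z0, f0), (z1, f1) in zip(zip(zs, fs), zip(zs[1:], fs[1:])):
        if z <= z1:
            t = (z - z0) / (z1 - z0)
            return f0 + t * (f1 - f0)
    return fs[-1]

def on_common_grid(zs_a, fs_a, zs_b, fs_b, n=65):
    """Resample two profiles onto n shared points over their overlap."""
    z_lo = max(zs_a[0], zs_b[0])
    z_hi = min(zs_a[-1], zs_b[-1])      # overlap of the two domains
    grid = [z_lo + (z_hi - z_lo) * i / (n - 1) for i in range(n)]
    return (grid,
            [interp(z, zs_a, fs_a) for z in grid],
            [interp(z, zs_b, fs_b) for z in grid])
```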
|
1.0
|
process
| 1
|
14,267
| 17,209,687,222
|
IssuesEvent
|
2021-07-19 01:02:18
|
bisq-network/proposals
|
https://api.github.com/repos/bisq-network/proposals
|
closed
|
Bisq 2.0 - Social Trading Expansion Proposal
|
a:proposal re:features re:processes
|
> _This is a Bisq Network proposal. Please familiarize yourself with the [submission and review process](https://bisq.wiki/Proposals)._
<!-- Please do not remove the text above. -->
**Background:** In response to #330 raising social trading, and having wanted to do social trading for a while myself, I decided to work on this proposal. I think Bisq can utilize some variation of social trading, and I have a general idea of the features I would hope for.
#### Table of contents
1. [Overview](https://github.com/Mentors4EDU/Bisq-Proposal#overview)
2. [Base Modules](https://github.com/Mentors4EDU/Bisq-Proposal#base-modules)
- [Profile](https://github.com/Mentors4EDU/Bisq-Proposal#profile)
- [Trade History](https://github.com/Mentors4EDU/Bisq-Proposal#trade-history)
- [Simulation Modes](https://github.com/Mentors4EDU/Bisq-Proposal#simulation-modes)
- [Support for Scripts](https://github.com/Mentors4EDU/Bisq-Proposal#support-for-scripts)
- [Ranking Mechanism](https://github.com/Mentors4EDU/Bisq-Proposal#ranking-mechanism)
- [Test and Realtime Support](https://github.com/Mentors4EDU/Bisq-Proposal#test-and-realtime-support)
- [Disclosures](https://github.com/Mentors4EDU/Bisq-Proposal#disclosures)
- [Regarding Privacy](https://github.com/Mentors4EDU/Bisq-Proposal#regarding-privacy)
3. [Roadmap](https://github.com/Mentors4EDU/Bisq-Proposal#roadmap)
4. [Project Goals](https://github.com/Mentors4EDU/Bisq-Proposal#project-goals)
- [Basic Community Acceptance](https://github.com/Mentors4EDU/Bisq-Proposal#basic-community-acceptance)
- [Get Funding](https://github.com/Mentors4EDU/Bisq-Proposal#get-funding)
- [Design and Mockups](https://github.com/Mentors4EDU/Bisq-Proposal#design-and-mockups)
- [Hiring Team](https://github.com/Mentors4EDU/Bisq-Proposal#hiring-team)
- [Marketing](https://github.com/Mentors4EDU/Bisq-Proposal#marketing)
- [MVP Built](https://github.com/Mentors4EDU/Bisq-Proposal#mvp-built)
- [Project Launch](https://github.com/Mentors4EDU/Bisq-Proposal#project-launch)
# Overview
Bisq as a decentralized exchange has the potential for many social aspects and technological implementations. One of those aspects could be an integration of some form or variation of social trading. Social trading as a concept is open and transparent given the utilization of actual statistics and insights based off of skills or trading abilities. This proposal is on an open-source implementation of social trading on Bisq's network.
# Base Modules
There will need to be a series of modules and/or components integrated into this project. Having something modularly built will allow for easy upgrading, and I take some inspiration from the Bisq-2.0 proposals. If something isn't built modularly, upgrading becomes a hassle, and for certain issues, debugging and maintenance over time can also be hectic.
## Profile
The profile needs to have just the basics. Optionally, it doesn't have to have a profile picture; it can denote some sort of icon instead, given that a profile picture is just meaningless data storage *(at least in this instance)*. It depends on the community's preference. The profile will include a description, a link to a website, and a button to view trade history. It will also include metrics for YoY return percentages. There will be a follow-trades button where the user can input a certain percentage, such as the default 5% commission, or set a custom fee. The ability to choose between a fee and a percentage offers flexibility for many users.
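The fee choice above can be illustrated with a toy calculation (the defaults and the no-commission-on-losses rule are my assumptions, not part of the proposal):

```python
# Toy follow-fee math: a follower pays either a flat custom fee or a
# commission (default 5%) on positive profit from copied trades.
def follow_fee(profit, commission=0.05, flat_fee=None):
    if flat_fee is not None:
        return flat_fee
    return max(profit, 0.0) * commission
```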
## Trade History
Users can have the option of trading in real time or on a test network. Obviously, you only earn for real trades. There will be a tab for real-time trades and one for test network trades. The trade history tabs will only show entry and exit prices and percentages; they will not show how much money was put in, in order to encourage fair markets and a layer of anonymity.
## Simulation Modes
You will be able to see simulation charts for the different trading histories and have the option to follow user's trades. You can follow trades that users do with real money including test network strategies. You can also run simulation tests. When following test network strategies with real money, that means every time the user inputs a simulated trade, you are actually doing that trade with real money. This is similar to other social trading websites such as Collective2's TOS badge or eToro's hypothetical performance charts.
For charting and market data, we would use the default Bisq charts or build on top of them.
## Support for Scripts
I recommend support for the [Julia programming language](https://julialang.org/), given that it is suited to numerical analysis and perfect for quant traders. I think you can utilize some sort of [embedded API](https://docs.julialang.org/en/v1/manual/embedding/) for the implementation. There is much that support for Julia could do. People can not only make manually traded strategies, but also automated trading strategies or custom indicators that people can trade with for a fee percentage or a custom licensing fee. Support for both manual and robotraders is something many social trading networks have, and support for Julia can attract a lot of quants and mathematicians. Optionally, we can add secondary support for Python as well, but Python has already been heavily utilized.
In regards to Julia, packages such as the [decentralized-internet](https://github.com/Lonero-Team/Decentralized-Internet) SDK, [Gen](https://www.gen.dev/), [Quandl.jl](https://juliapackages.com/p/quandl), [TradingLogic.jl](https://juliapackages.com/p/tradinglogic), [FinMarkets.jl](https://juliapackages.com/p/finmarkets), [Ito.jl](https://juliapackages.com/p/ito), and [TradeModels.jl](https://juliapackages.com/p/trademodels) can come pre-installed to support custom developer environments.
## Ranking Mechanism
In regards to the ranking mechanism, I believe the UML below best illustrates how that will play out: \
 \
The ranking mechanism should support fairness and decentralization. That is why profiles are going to be ranked on performance-based metrics rather than on how many people are following their trades. The metrics are going to be categories for estimated YoY ROI %, ROI % per trade, and a Risk Tolerance subcategory. Risk tolerance, with regard to uncertainty, is measured by drawdown percentage in relation to returns. The better one performs, the higher one is ranked in a given category. Performance-based ranking is the best type of ranking mechanism in the world of trading.
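The performance-based ranking can be sketched as follows (the scoring formula, data shapes, and names are illustrative assumptions, not a spec):

```python
def max_drawdown(equity):
    """Largest peak-to-trough decline of an equity curve, as a fraction."""
    peak, worst = equity[0], 0.0
    for v in equity:
        peak = max(peak, v)
        worst = max(worst, (peak - v) / peak)
    return worst

def rank_traders(curves):
    """curves: {name: equity_curve}. Higher risk-adjusted return ranks first."""
    def score(curve):
        total_return = curve[-1] / curve[0] - 1.0
        # Penalize returns earned through deep drawdowns (risk tolerance).
        return total_return / (1.0 + max_drawdown(curve))
    return sorted(curves, key=lambda name: score(curves[name]), reverse=True)
```

A steady 21% curve outranks a choppier 20% curve here, matching the intent that drawdown-adjusted performance, not follower count, drives rank.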
## Test and Realtime Support
As noted, both trades on a test network and real-time strategies can be followed. The same applies whether trading is manual or automated. This is actually a good thing, because along with the ranking mechanism, traders can be rewarded for skill rather than for putting large amounts of money into their strategies upfront. Also, lower-income traders might be more highly skilled than some higher-income traders. Leveling the playing field is key.
## Disclosures
Likely some sort of risk disclosure will need to be implemented that is site-wide and usage specific. This is due to the nature of exchanges and social trading in general, and the disclosure text can be open-sourced or within the public domain.
## Regarding Privacy
I think users can have the option of being public to an extent *(description and site profile fields)* or of having some level of anonymity or pseudonymity. There have been comments in the past regarding levels of pseudonymity, and I believe the network can still have privacy-centric aspects to it. This will be one of the key focuses during development.
# Roadmap
There are lots of things to take into consideration when building this platform. From acceptance and funding to MVP, the timeline is most likely 4 to 5 months with around 4 to 7 part-time and voluntary software developers. More details are in the funding section. This is almost the equivalent of building a full-scale exchange, given the advancements that need to be in place, but it is a very ambitious and promising project.
# Project Goals
In regards to project goals, besides a basic *ACK*, we are trying to get funding, design, build and market the platform. My goal is to eventually maintain this and personally work on marketing this platform and bringing more awareness towards Bisq. There are already lots of mediums I have to market this as a career-long startup advisor, and I have a general idea of how I will be managing the technological implementations of this project.
## Basic Community Acceptance
We are hoping to receive an *ACK* for this proposal. Please keep in mind that proposals can be commented on, discussed and updated over time. Also, remember criticism is best handled with a degree of respect and politeness within the context of the idea being critiqued.
## Get Funding
The funding I am looking for is around 35k Bisq through the Bisq DAO. Although this is negotiable, I calculated how much the potential hiring of engineers plus my time allocation would cost. The actual [COCOMO model](https://www.javatpoint.com/cocomo-model) would probably suggest a lot more. Also, I would want a 0.25% trading fee for project maintenance over time *(from the social trading only)*. Keep in mind this will still be open-source. Again, the funding will likely be crucial, given that much of the time spent will be voluntary and that Bisq's trading volume is low.
## Design and Mockups
Part of this project will also include a design phase. We want to make sure that the design fits well with Bisq's current UI and UX. This includes everything from trade history and order books to its social aspects.
## Hiring Team
The team will be paid from the Bisq funding pool. I will mostly take the role of PM and CTO during the development process. I won't be taking from the funding pool; instead I will take trading fees as compensation for managing the platform over time and serving as its chief technologist and marketing strategist. Many marketing costs that occur afterwards will be mostly on my own dime, which is why the minor fee is helpful.
## Marketing
I think marketing is something Bisq lacks. You can still be decentralized and theoretically teamless while focusing on getting others to use your software. I believe that social trading provides the opportunity to tap into markets that Bisq has yet to enter. Also, with the rise of many DEXes and supposed "DeFi" apps, I believe Bisq can offer a popular alternative to the mainstream.
## MVP Built
Once the MVP is built, we will start garnering community feedback and preparing for the launch.
## Project Launch
Once all the milestones are met, the MVP has been built, and things are ready for full deployment, it will officially be launch time. My focus will then be marketing, network growth, and scalability.
*For sake of transparency, in regards to issue edits based on consensus, you can see the draft history via the repo [here](https://github.com/Mentors4EDU/Bisq-Proposal). This is as of the writing of this proposal.*
|
1.0
|
Bisq 2.0 - Social Trading Expansion Proposal - > _This is a Bisq Network proposal. Please familiarize yourself with the [submission and review process](https://bisq.wiki/Proposals)._
<!-- Please do not remove the text above. -->
**Background:** In response to #330 having social trading, and me wanting to also do social trading for a while as well, I decided to work on this proposal. I think Bisq can utilize some variation of social trading and I have a general idea of features I would hope for.
#### Table of contents
1. [Overview](https://github.com/Mentors4EDU/Bisq-Proposal#overview)
2. [Base Modules](https://github.com/Mentors4EDU/Bisq-Proposal#base-modules)
- [Profile](https://github.com/Mentors4EDU/Bisq-Proposal#profile)
- [Trade History](https://github.com/Mentors4EDU/Bisq-Proposal#trade-history)
- [Simulation Modes](https://github.com/Mentors4EDU/Bisq-Proposal#simulation-modes)
- [Support for Scripts](https://github.com/Mentors4EDU/Bisq-Proposal#support-for-scripts)
- [Ranking Mechanism](https://github.com/Mentors4EDU/Bisq-Proposal#ranking-mechanism)
- [Test and Realtime Support](https://github.com/Mentors4EDU/Bisq-Proposal#test-and-realtime-support)
- [Disclosures](https://github.com/Mentors4EDU/Bisq-Proposal#disclosures)
- [Regarding Privacy](https://github.com/Mentors4EDU/Bisq-Proposal#regarding-privacy)
3. [Roadmap](https://github.com/Mentors4EDU/Bisq-Proposal#roadmap)
4. [Project Goals](https://github.com/Mentors4EDU/Bisq-Proposal#project-goals)
- [Basic Community Acceptance](https://github.com/Mentors4EDU/Bisq-Proposal#basic-community-acceptance)
- [Get Funding](https://github.com/Mentors4EDU/Bisq-Proposal#get-funding)
- [Design and Mockups](https://github.com/Mentors4EDU/Bisq-Proposal#design-and-mockups)
- [Hiring Team](https://github.com/Mentors4EDU/Bisq-Proposal#hiring-team)
- [Marketing](https://github.com/Mentors4EDU/Bisq-Proposal#marketing)
- [MVP Built](https://github.com/Mentors4EDU/Bisq-Proposal#mvp-built)
- [Project Launch](https://github.com/Mentors4EDU/Bisq-Proposal#project-launch)
# Overview
Bisq as a decentralized exchange has the potential for many social aspects and technological implementations. One of those aspects could be an integration of some form or variation of social trading. Social trading as a concept is open and transparent given the utilization of actual statistics and insights based off of skills or trading abilities. This proposal is on an open-source implementation of social trading on Bisq's network.
# Base Modules
There will need to be a series of modules and/or components integrated into this project. Having something modularly built will allow for easy upgrading, and I do take somewhat of an inspiration from the Bisq-2.0 proposals. If something isn't modularly built, not only is upgrading a hassle. For certain issues, debugging and maintaining over time can also be hectic.
## Profile
The profile needs to have just the basics. Optionally, it doesn't really have to have a profile picture but can denote some sort of icon given that a profile picture is just meaningless data storage. *(At least in this instance)*. It depends on the community's preference. What it will have will include a description, a link to website, and a button to view trade history. It will also include metrics in regards to percentages for YoY returns. There will be a follow trades button which the user can input a certain percentage such as the default commission being 5% or set a custom fee. The ability to chose between both fee and percentage can offer flexibility for many users.
## Trade History
Users can have the option between trading in real time or with a test network. Obviously, you only earn for real trades. There will be a tab for real time trades and test network trades. The tabs for trade history will only show entry and exit prices and percentages, it will not show how much money was put in order to encourage fair markets and a layer of anonymity.
## Simulation Modes
You will be able to see simulation charts for the different trading histories and have the option to follow user's trades. You can follow trades that users do with real money including test network strategies. You can also run simulation tests. When following test network strategies with real money, that means every time the user inputs a simulated trade, you are actually doing that trade with real money. This is similar to other social trading websites such as Collective2's TOS badge or eToro's hypothetical performance charts.
For charting and market data, we would use the default Bisq charts or build untop of them.
## Support for Scripts
I recommend support for the [Julia programming language](https://julialang.org/), given it is suited for numerical analysis and perfect for quant traders. I think you can utilize some sort of [embedded API](https://docs.julialang.org/en/v1/manual/embedding/) for implementation. There are many things in regards to what support for Julia could do. People can not just make manually traded strategies, but also make trading strategies that are automated or custom indicators that people can trade with for a fee percentage or custom licensing fee. The support for both manual and robotraders is something many social trading networks have, and support for Julia can attract alot of quants and mathematicians. Optionally, we can add secondary support for Python as well, but the use of Python has already been overly utilized.
In regards to Julia, packages such as the [decentralized-internet](https://github.com/Lonero-Team/Decentralized-Internet) SDK, [Gen](https://www.gen.dev/), [Quandl.jl](https://juliapackages.com/p/quandl), [TradingLogic.jl](https://juliapackages.com/p/tradinglogic), [FinMarkets.jl](https://juliapackages.com/p/finmarkets), [Ito.jl](https://juliapackages.com/p/ito), and [TradeModels.jl](https://juliapackages.com/p/trademodels) can come pre-installed to support custom developer environments.
## Ranking Mechanism
In regards to the ranking mechanism, I believe the UML below best illustrates how that will play out: \
 \
The ranking mechanism should support fairness and decentralization. That is why profiles are going to be ranked on performance-based metrics rather than on how many people are following their trades. The metrics are going to be: categories for YoY estimated ROI %, ROI % per trade, and a risk tolerance subcategory. Risk tolerance with respect to uncertainty is measured by drawdown percentage in relation to returns. The better one performs, the higher they are ranked in a given category. Performance-based ranking is the best type of ranking mechanism in the world of trading.
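The drawdown metric above can be computed directly from an equity curve. A minimal sketch; the return-over-drawdown score at the end is an assumption for illustration, not a decided ranking formula:

```python
def max_drawdown_pct(equity: list[float]) -> float:
    """Maximum peak-to-trough decline of an equity curve, as % of the peak."""
    peak = equity[0]
    worst = 0.0
    for value in equity:
        peak = max(peak, value)                 # running high-water mark
        worst = max(worst, (peak - value) / peak)
    return worst * 100

def risk_adjusted_score(total_return_pct: float, drawdown_pct: float) -> float:
    """Illustrative score: return earned per unit of drawdown (higher is better)."""
    return total_return_pct / max(drawdown_pct, 1e-9)
```

For the curve 100 → 120 → 90 → 130, the worst decline is from 120 down to 90, i.e. a 25% drawdown.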
## Test and Realtime Support
As noted, both trades on a test network and real-time strategies can be followed. The same applies whether a strategy is manual or automated. This is actually a good thing, because along with the ranking mechanism, traders can be rewarded on skill rather than on putting large amounts of money into their strategies upfront. Also, lower-income traders might be higher skilled than some higher-income traders. Leveling the playing field is key.
## Disclosures
Some sort of risk disclosure will likely need to be implemented, both site-wide and usage-specific. This is due to the nature of exchanges and social trading in general, and the disclosure text can be open-sourced or placed in the public domain.
## Regarding Privacy
I think users can have the option of being public to an extent *(description and site profile fields)*, or having some level of anonymity or pseudonymity. There have been comments in the past regarding levels of pseudonymity, and I believe the network can still have privacy-centric aspects to it. This will be one of the key focuses in regards to development.
# Roadmap
There are lots of things to take into consideration in regards to building this platform. From acceptance and funding to MVP, the timeline is most likely 4 to 5 months with around 4 to 7 part-time and volunteer software developers. More details are in the funding section. This is almost the equivalent of building a full-scale exchange given the advancements that need to be in place, but it is a very ambitious and promising project.
# Project Goals
In regards to project goals, besides a basic *ACK*, we are trying to get funding, design, build and market the platform. My goal is to eventually maintain this and personally work on marketing this platform and bringing more awareness towards Bisq. There are already lots of mediums I have to market this as a career-long startup advisor, and I have a general idea of how I will be managing the technological implementations of this project.
## Basic Community Acceptance
We are hoping to receive an *ACK* for this proposal. Please keep in mind that proposals can be commented on, discussed and updated over time. Also, remember criticism is best handled with a degree of respect and politeness within the context of the idea being critiqued.
## Get Funding
The funding I am looking for is around 35k Bisq done through the Bisq DAO. Although this is negotiable, I calculated how much it would cost for the potential hiring of engineers plus my time allocation. The actual [COCOMO model](https://www.javatpoint.com/cocomo-model) would probably suggest a lot more. Also, I would want a 0.25% trading fee in regards to project maintenance over time *(from the social trading only)*. Keep in mind this will still be open-source. Again, the funding will likely be crucial given that much of the time will be voluntary and that Bisq's trading volume is still low.
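For reference, the basic COCOMO model estimates effort as E = a · KLOC^b person-months; with the standard organic-mode constants (a = 2.4, b = 1.05) a budget is easy to sanity-check. The KLOC figure for this project is, of course, an assumption:

```python
def cocomo_basic_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Basic COCOMO effort in person-months (organic-mode constants)."""
    return a * kloc ** b

def cocomo_dev_time(effort_pm: float, c: float = 2.5, d: float = 0.38) -> float:
    """Nominal development time in months, derived from the effort estimate."""
    return c * effort_pm ** d
```

Even a modest 10 KLOC codebase comes out to roughly 27 person-months of effort under this model, which is why a purely volunteer timeline is optimistic.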
## Design and Mockups
Part of this project will also include a design phase. We want to make sure that the design fits well with Bisq's current UI and UX. This includes everything from trade history and order books to its social aspects.
## Hiring Team
The team will be paid from the Bisq funding pool. I will be taking the role of mostly a PM and CTO during the development process. I won't be taking from the funding pool, but will instead take trading fees as a way of maintaining the platform over time, serving as its chief technologist and marketing strategist. Many marketing costs that occur afterwards will be mostly on my own dime; hence, the minor fee is helpful.
## Marketing
I think marketing is something Bisq lacks. You can still be decentralized and theoretically teamless while focusing on getting others to use your software. I believe that social trading provides the opportunity to tap into markets that Bisq has yet to enter. Also, with the rise of many DEX and supposed "DeFi" apps, I believe Bisq can offer a popular alternative to what is mainstream.
## MVP Built
Once the MVP is built, we will start gathering community feedback and preparing for the launch.
## Project Launch
Once all the milestones are met, the MVP has been built, and things are ready for full deployment, it will be officially launch time. My focus at that point will then be marketing, network growth, and scalability.
*For sake of transparency, in regards to issue edits based on consensus, you can see the draft history via the repo [here](https://github.com/Mentors4EDU/Bisq-Proposal). This is as of the writing of this proposal.*
|
process
|
bisq social trading expansion proposal this is a bisq network proposal please familiarize yourself with the background in response to having social trading and me wanting to also do social trading for a while as well i decided to work on this proposal i think bisq can utilize some variation of social trading and i have a general idea of features i would hope for table of contents overview bisq as a decentralized exchange has the potential for many social aspects and technological implementations one of those aspects could be an integration of some form or variation of social trading social trading as a concept is open and transparent given the utilization of actual statistics and insights based off of skills or trading abilities this proposal is on an open source implementation of social trading on bisq s network base modules there will need to be a series of modules and or components integrated into this project having something modularly built will allow for easy upgrading and i do take somewhat of an inspiration from the bisq proposals if something isn t modularly built not only is upgrading a hassle for certain issues debugging and maintaining over time can also be hectic profile the profile needs to have just the basics optionally it doesn t really have to have a profile picture but can denote some sort of icon given that a profile picture is just meaningless data storage at least in this instance it depends on the community s preference what it will have will include a description a link to website and a button to view trade history it will also include metrics in regards to percentages for yoy returns there will be a follow trades button which the user can input a certain percentage such as the default commission being or set a custom fee the ability to chose between both fee and percentage can offer flexibility for many users trade history users can have the option between trading in real time or with a test network obviously you only earn for real trades 
there will be a tab for real time trades and test network trades the tabs for trade history will only show entry and exit prices and percentages it will not show how much money was put in order to encourage fair markets and a layer of anonymity simulation modes you will be able to see simulation charts for the different trading histories and have the option to follow user s trades you can follow trades that users do with real money including test network strategies you can also run simulation tests when following test network strategies with real money that means every time the user inputs a simulated trade you are actually doing that trade with real money this is similar to other social trading websites such as s tos badge or etoro s hypothetical performance charts for charting and market data we would use the default bisq charts or build untop of them support for scripts i recommend support for the given it is suited for numerical analysis and perfect for quant traders i think you can utilize some sort of for implementation there are many things in regards to what support for julia could do people can not just make manually traded strategies but also make trading strategies that are automated or custom indicators that people can trade with for a fee percentage or custom licensing fee the support for both manual and robotraders is something many social trading networks have and support for julia can attract alot of quants and mathematicians optionally we can add secondary support for python as well but the use of python has already been overly utilized in regards to julia packages such as the sdk and can come pre installed to support custom developer environments ranking mechanism in regards to the ranking mechanism i believe the uml below best illustrates how that will play out the ranking mechanism should support fairness and decentralization that is why profiles are going to be ranked on performance based metrics rather than how many people are following their 
trades the metrics are going to be categories for yoy estimated roi roi per trade and a risk tolerance subcatagory risk tolerance in regards to uncertainty is measured by drawdown percentage in relation to returns the better one performs the higher they are ranked in a certain category performance based is the best type of ranking mechanism in the world of trading test and realtime support as noted both trades on a test network and realtime strategies can be followed the same applies whether it is manual or automated this is actually a good thing because along with the ranking mechanism traders can be rewarded on skills rather then them putting large amounts of money in their strategies upfront also lower income traders might be higher skilled then some higher income traders leveling up the playing field is key disclosures likely some sort of risk disclosure will need to be implemented that is site wide and usage specific this is due to the nature of exchanges and social trading in general and the disclosure text can be open sourced or within the public domain regarding privacy i think users can have the option of being public to an extent description and site profile fields or having some level of anonymity or pseudonymity there has been comments in the past regarding levels of pseudonymity and i believe the network can still have privacy centric aspects to it this will be one of the key focuses or aspects to look at in regards to development roadmap there are lots of things to take into consideration in regards to building this platform most likely from acceptance and funding to mvp the timeline is to month with around to part time and voluntary software developers more details will be in the funding section this is almost the equivalent of building a full scale exchange given the advancements needing to be in place but it is a very ambitious and promising project project goals in regards to project goals besides a basic ack we are trying to get funding design 
build and market the platform my goal is to eventually maintain this and personally work on marketing this platform and bringing more awareness towards bisq there are already lots of mediums i have to market this as a career long startup advisor and i have a general idea of how i will be managing the technological implementations of this project basic community acceptance we are hoping to receive an ack for this proposal please keep in mind that proposals can be commented on discussed and updated over time also remember criticism is best handled with a degree of respect and politeness within the context of the idea being critiqued get funding the funding i am looking for is around bisq done through the bisq dao although this is negotiable i calculated how much it would cost for the potential hiring of engineers my time allocation the actual would suggest probably alot more also i would want a trading fee in regards to project maintenance over time from the social trading only keep in mind this will still be open source again the funding will likely be crucial given lots of time will be voluntary and the low volume of bisq as well design and mockups part of this project will also include a design phase we want to make sure that the design fits well with bisq s current ui and ux this includes everything from trade history and order books to its social aspects hiring team the team will be paid from the bisq funding pool i will be taking the role of mostly a pm and cto during the development process i won t be taking from the funding pool but rather instead be taking trading fees as a form of managing the platform over time and sort of being its chief technologist and marketing strategist many marketing costs that may occur afterwards will be mostly on my own dime henceforth this is why the minor fee is helpful marketing i think marketing is a thing that bisq lacks you can still be decentralized and theoretically teamless while focusing on others to use your software i 
believe that social trading provides the opportunity to tap into markets that bisq has yet to enter also with the rise of many dex and supposed defi apps i believe bisq can offer a popular alternative to what is mainstream mvp built once the mvp is built we will start garnishing some community feedback and preparing for the launch project launch once all the milestones are met the mvp has been built and things are ready for full deployment it will be officially launch time my focus at that time will than be marketing network growth and scalability for sake of transparency in regards to issue edits based on consensus you can see the draft history via the repo this is as of the writing of this proposal
| 1
|
12,786
| 15,167,077,981
|
IssuesEvent
|
2021-02-12 17:15:23
|
icra/ecam
|
https://api.github.com/repos/icra/ecam
|
closed
|
Results | Summary Page
|
in process
|
Hey @holalluis
is it possible to integrate this from the VITI report (see screenshot):

Currently, there is no indication of what we are looking at:

|
1.0
|
Results | Summary Page - Hey @holalluis
is it possible to integrate this from the VITI report (see screenshot):

Currently, there is no indication of what we are looking at:

|
process
|
results summary page hey holalluis is it possible to integrate this from the viti report see screenshot currently there is no indication of what we are looking at
| 1
|
114,356
| 4,629,453,376
|
IssuesEvent
|
2016-09-28 09:18:36
|
hfst/hfst
|
https://api.github.com/repos/hfst/hfst
|
closed
|
get backlinks
|
auto-migrated future low priority sourceforge spam
|
naturally like your web-site but you need to test the spelling on quite a few of your posts. A number of them are rife with spelling problems and I to find it very troublesome to tell the reality on the other hand Iˇll definitely come again again.
Reported by: *anonymous
|
1.0
|
get backlinks - naturally like your web-site but you need to test the spelling on quite a few of your posts. A number of them are rife with spelling problems and I to find it very troublesome to tell the reality on the other hand Iˇll definitely come again again.
Reported by: *anonymous
|
non_process
|
get backlinks naturally like your web site but you need to test the spelling on quite a few of your posts a number of them are rife with spelling problems and i to find it very troublesome to tell the reality on the other hand iˇll definitely come again again reported by anonymous
| 0
|
52,851
| 13,065,295,585
|
IssuesEvent
|
2020-07-30 19:33:52
|
lbl-srg/modelica-buildings
|
https://api.github.com/repos/lbl-srg/modelica-buildings
|
closed
|
Implement dual maximum constant volume heating control logic for VAV box with reheat
|
OpenBuildingControl
|
Refactor `Buildings.Examples.VAVReheat.Controls.RoomVAV` to allow the following control logic.
<img width="640" alt="2C8734DE-B171-45D1-A139-5AD3A6A4BB8D" src="https://user-images.githubusercontent.com/10965994/87016571-2caabf00-c1cf-11ea-92d1-0142589387fe.png">
(Note that the current single maximum logic can still be represented by setting the minimum air flow equal to the heating air flow.)
This is needed because the comparison between ASHRAE2006 and Guideline36 shows (see figure below based on the master branch commit 0ef82efb2a6f737c3f2c04ba66095941c532927f)
- a high outdoor air flow rate ratio for ASHRAE2006: this is addressed by #2019,
- a high supply air flow rate ratio for ASHRAE2006: this is due to the minimum discharge flow rate of the terminal units.

Note that the outdoor air flow rate ratio is frequently below the minimum (0.3 for ASHRAE2006). This is due to:
- the minimum TMix control in ASHRAE2006,
- the high integral time values used in both models for the minimum outdoor air control (600s for ASHRAE2006 and 1200s for Guideline36). See below for the impact on a week of June for ASHRAE2006 when the control switches from economizer cooling to minimum outdoor air: it takes 2-3h for the economizer to reach the outdoor air set point.

The issue related to the high integral is addressed in #2030 for both models.
|
1.0
|
Implement dual maximum constant volume heating control logic for VAV box with reheat - Refactor `Buildings.Examples.VAVReheat.Controls.RoomVAV` to allow the following control logic.
<img width="640" alt="2C8734DE-B171-45D1-A139-5AD3A6A4BB8D" src="https://user-images.githubusercontent.com/10965994/87016571-2caabf00-c1cf-11ea-92d1-0142589387fe.png">
(Note that the current single maximum logic can still be represented by setting the minimum air flow equal to the heating air flow.)
This is needed because the comparison between ASHRAE2006 and Guideline36 shows (see figure below based on the master branch commit 0ef82efb2a6f737c3f2c04ba66095941c532927f)
- a high outdoor air flow rate ratio for ASHRAE2006: this is addressed by #2019,
- a high supply air flow rate ratio for ASHRAE2006: this is due to the minimum discharge flow rate of the terminal units.

Note that the outdoor air flow rate ratio is frequently below the minimum (0.3 for ASHRAE2006). This is due to:
- the minimum TMix control in ASHRAE2006,
- the high integral time values used in both models for the minimum outdoor air control (600s for ASHRAE2006 and 1200s for Guideline36). See below for the impact on a week of June for ASHRAE2006 when the control switches from economizer cooling to minimum outdoor air: it takes 2-3h for the economizer to reach the outdoor air set point.

The issue related to the high integral is addressed in #2030 for both models.
|
non_process
|
implement dual maximum constant volume heating control logic for vav box with reheat refactor buildings examples vavreheat controls roomvav to allow the following control logic img width alt src note that the current single maximum logic can still be represented by setting the minimum air flow equal to the heating air flow this is needed because the comparison between and shows see figure below based on the master branch commit a high outdoor air flow rate ratio for this is addressed by a high supply air flow rate ratio for this is due to the minimum discharge flow rate of the terminal units note that the outdoor air flow rate ratio is frequently below the minimum for this is due to the minimum tmix control in the high integral time values used in both models for the minimum outdoor air control for and for see below for the impact on a week of june for when the control switches from economizer cooling to minimum outdoor air it takes for the economizer to reach the outdoor air set point the issue related to the high integral is addressed in for both models
| 0
|
392,198
| 11,584,748,097
|
IssuesEvent
|
2020-02-22 19:09:45
|
joe27g/EnhancedDiscord
|
https://api.github.com/repos/joe27g/EnhancedDiscord
|
closed
|
installer problems
|
category: installer priority: high status: planned type: bug
|
ok, so several people have had problems with being unable to launch discord due to the injection path being wrong/ the file not existing. to ensure that these errors are non-destructive, some of the following should be done:
- download and extract the ED files BEFORE injecting
- check that the path exists before setting it as the injection file
- maybe add a note saying to uninject before deleting injection file
also, a note that's somewhat related: checking registry entry `Computer\HKEY_CLASSES_ROOT\Discord\shell\open\command` could make the installer better.
|
1.0
|
installer problems - ok, so several people have had problems with being unable to launch discord due to the injection path being wrong/ the file not existing. to ensure that these errors are non-destructive, some of the following should be done:
- download and extract the ED files BEFORE injecting
- check that the path exists before setting it as the injection file
- maybe add a note saying to uninject before deleting injection file
also, a note that's somewhat related: checking registry entry `Computer\HKEY_CLASSES_ROOT\Discord\shell\open\command` could make the installer better.
|
non_process
|
installer problems ok so several people have had problems with being unable to launch discord due to the injection path being wrong the file not existing to ensure that these errors are non destructive some of the following should be done download and extract the ed files before injecting check that the path exists before setting it as the injection file maybe add a note saying to uninject before deleting injection file also a note that s somewhat related checking registry entry computer hkey classes root discord shell open command could make the installer better
| 0
|
737,174
| 25,504,509,723
|
IssuesEvent
|
2022-11-28 08:19:43
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
usa.experian.com - design is broken
|
priority-normal browser-focus-geckoview engine-gecko android13
|
<!-- @browser: Firefox Mobile 107.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 13; Mobile; rv:107.0) Gecko/107.0 Firefox/107.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/114756 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://usa.experian.com/
**Browser / Version**: Firefox Mobile 107.0
**Operating System**: Android 13
**Tested Another Browser**: Yes Chrome
**Problem type**: Design is broken
**Description**: Images not loaded
**Steps to Reproduce**:
Opening this site in Firefox Focus there are missing images and the layout is broken. This also occurs if I request the desktop site.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20221110173214</li><li>channel: release</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/11/9a96f74f-cabf-4306-811b-1bd834d12064)
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
usa.experian.com - design is broken - <!-- @browser: Firefox Mobile 107.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 13; Mobile; rv:107.0) Gecko/107.0 Firefox/107.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/114756 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://usa.experian.com/
**Browser / Version**: Firefox Mobile 107.0
**Operating System**: Android 13
**Tested Another Browser**: Yes Chrome
**Problem type**: Design is broken
**Description**: Images not loaded
**Steps to Reproduce**:
Opening this site in Firefox Focus there are missing images and the layout is broken. This also occurs if I request the desktop site.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20221110173214</li><li>channel: release</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/11/9a96f74f-cabf-4306-811b-1bd834d12064)
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
usa experian com design is broken url browser version firefox mobile operating system android tested another browser yes chrome problem type design is broken description images not loaded steps to reproduce opening this site in firefox focus there are missing images and the layout is broken this also occurs if i request the desktop site browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel release hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️
| 0
|
14,853
| 18,248,646,923
|
IssuesEvent
|
2021-10-01 22:46:24
|
opensearch-project/data-prepper
|
https://api.github.com/repos/opensearch-project/data-prepper
|
closed
|
Grok Prepper Basic Matching
|
plugin - processor
|
This is a subtask of the issue for a grok processor: https://github.com/opensearch-project/data-prepper/issues/256.
Basic pattern matching and capture functionality will be added to Grok Prepper using the existing java grok library: [https://github.com/thekrakken/java-grok](url)
|
1.0
|
Grok Prepper Basic Matching - This is a subtask of the issue for a grok processor: https://github.com/opensearch-project/data-prepper/issues/256.
Basic pattern matching and capture functionality will be added to Grok Prepper using the existing java grok library: [https://github.com/thekrakken/java-grok](url)
|
process
|
grok prepper basic matching this is a subtask of the issue for a grok processor basic pattern matching and capture functionality will be added to grok prepper using the existing java grok library url
| 1
|
21,735
| 2,642,447,910
|
IssuesEvent
|
2015-03-12 00:03:33
|
chrsmith/html5rocks
|
https://api.github.com/repos/chrsmith/html5rocks
|
opened
|
PlayN Brick Out Tutorial
|
gaming New Priority-P2 Tutorial Type-Bug
|
Original [issue 755](https://code.google.com/p/html5rocks/issues/detail?id=755) created by chrsmith on 2011-12-26T15:49:11.000Z:
Create a tutorial to demonstrate creating a Brick Out game using PlayN.
https://developers.google.com/playn/
Largely an adaptation of
https://sites.google.com/site/bigdicegamestutorials/newtutorials/game-100---break-out
|
1.0
|
PlayN Brick Out Tutorial - Original [issue 755](https://code.google.com/p/html5rocks/issues/detail?id=755) created by chrsmith on 2011-12-26T15:49:11.000Z:
Create a tutorial to demonstrate creating a Brick Out game using PlayN.
https://developers.google.com/playn/
Largely an adaptation of
https://sites.google.com/site/bigdicegamestutorials/newtutorials/game-100---break-out
|
non_process
|
playn brick out tutorial original created by chrsmith on create a tutorial to demonstrate creating a brick out game using playn largely an adaptation of
| 0
|
299,746
| 22,620,844,323
|
IssuesEvent
|
2022-06-30 06:07:31
|
giotto-ai/giotto-deep
|
https://api.github.com/repos/giotto-ai/giotto-deep
|
closed
|
Create a notebook reproducing topological study of entangled tori
|
documentation mid priority
|
**Is your feature request related to a problem? Please describe.**
Create notebook recreating the experiments of [this paper](https://www.stat.uchicago.edu/~lekheng/work/topdeep.pdf)
<!-- Thanks for contributing! -->
|
1.0
|
Create a notebook reproducing topological study of entangled tori - **Is your feature request related to a problem? Please describe.**
Create notebook recreating the experiments of [this paper](https://www.stat.uchicago.edu/~lekheng/work/topdeep.pdf)
<!-- Thanks for contributing! -->
|
non_process
|
create a notebook reproducing topological study of entangled tori is your feature request related to a problem please describe create notebook recreating the experiments of
| 0
|
16,622
| 21,678,126,892
|
IssuesEvent
|
2022-05-09 01:23:14
|
lynnandtonic/nestflix.fun
|
https://api.github.com/repos/lynnandtonic/nestflix.fun
|
closed
|
Add Citizen Starlight
|
suggested title in process
|
Please add as much of the following info as you can:
Title: Citizen Starlight
Type (film/tv show): TV show - reality
Film or show in which it appears: The Boys
Is the parent film/show streaming anywhere? Yes - Amazon Prime
About when in the parent film/show does it appear? Episode 1x06 "The Innocents": 48:50 - 49:29 (with behind the scenes footage).
Actual footage of the film/show can be seen (yes/no)? Yes
Production Company: Vought Studios
|
1.0
|
Add Citizen Starlight - Please add as much of the following info as you can:
Title: Citizen Starlight
Type (film/tv show): TV show - reality
Film or show in which it appears: The Boys
Is the parent film/show streaming anywhere? Yes - Amazon Prime
About when in the parent film/show does it appear? Episode 1x06 "The Innocents": 48:50 - 49:29 (with behind the scenes footage).
Actual footage of the film/show can be seen (yes/no)? Yes
Production Company: Vought Studios
|
process
|
add citizen starlight please add as much of the following info as you can title citizen starlight type film tv show tv show reality film or show in which it appears the boys is the parent film show streaming anywhere yes amazon prime about when in the parent film show does it appear episode the innocents with behind the scenes footage actual footage of the film show can be seen yes no yes production company vought studios
| 1
|
325,965
| 24,067,635,571
|
IssuesEvent
|
2022-09-17 18:21:47
|
zulip/zulip-terminal
|
https://api.github.com/repos/zulip/zulip-terminal
|
closed
|
Where to download zuliprc?
|
area: documentation
|
The readme says:
> Your personal zuliprc file can be obtained from Zulip servers in your account settings in the web application, which gives you all the permissions you have there. Bot zuliprc files can be downloaded from a similar area for each bot, and will have more limited permissions.
But I can't find it on a self-hosted instance Version 5.2.
I could only find the documentation for bots: https://zulip.com/api/running-bots
|
1.0
|
Where to download zuliprc? - The readme says:
> Your personal zuliprc file can be obtained from Zulip servers in your account settings in the web application, which gives you all the permissions you have there. Bot zuliprc files can be downloaded from a similar area for each bot, and will have more limited permissions.
But I can't find it on a self-hosted instance Version 5.2.
I could only find the documentation for bots: https://zulip.com/api/running-bots
|
non_process
|
where to download zuliprc the readme says your personal zuliprc file can be obtained from zulip servers in your account settings in the web application which gives you all the permissions you have there bot zuliprc files can be downloaded from a similar area for each bot and will have more limited permissions but i can t find it on a self hosted instance version i could only find the documentation for bots
| 0
|
16,263
| 11,888,735,084
|
IssuesEvent
|
2020-03-28 10:13:02
|
davids91/rafko
|
https://api.github.com/repos/davids91/rafko
|
opened
|
Define and implement Java GRPC Client
|
infrastructure java
|
Create the Client untility for Java to communicate with the Sparse net solution server based on the services defined in #14
|
1.0
|
Define and implement Java GRPC Client - Create the Client untility for Java to communicate with the Sparse net solution server based on the services defined in #14
|
non_process
|
define and implement java grpc client create the client utility for java to communicate with the sparse net solution server based on the services defined in
| 0
|
12,182
| 14,742,055,383
|
IssuesEvent
|
2021-01-07 11:37:17
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Site 068: Portland - SABilling CSV download of Usage History
|
anc-process anp-1 ant-bug ant-support
|
In GitLab by @kdjstudios on Mar 12, 2019, 08:20
**Submitted by:** "Jeffrey Casey" <jeffrey.casey@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/7415145
**Server:** Internal
**Client/Site:** Portland
**Account:** All
**Issue:**
I used to be able to go into any of my accounts and download the CSV file containing the usage over multiple billing cycles. Downloading the CSV file provided more detailed information than what you get on screen or the PDF download. When attempting to download the CSV and error message on a white screen reads “we’re sorry, but something went wrong”. This download is invaluable when helping clients to understand overages they have within a billing cycle.
Could we please restore this previous functionality?
|
1.0
|
Site 068: Portland - SABilling CSV download of Usage History - In GitLab by @kdjstudios on Mar 12, 2019, 08:20
**Submitted by:** "Jeffrey Casey" <jeffrey.casey@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/7415145
**Server:** Internal
**Client/Site:** Portland
**Account:** All
**Issue:**
I used to be able to go into any of my accounts and download the CSV file containing the usage over multiple billing cycles. Downloading the CSV file provided more detailed information than what you get on screen or the PDF download. When attempting to download the CSV and error message on a white screen reads “we’re sorry, but something went wrong”. This download is invaluable when helping clients to understand overages they have within a billing cycle.
Could we please restore this previous functionality?
|
process
|
site portland sabilling csv download of usage history in gitlab by kdjstudios on mar submitted by jeffrey casey helpdesk server internal client site portland account all issue i used to be able to go into any of my accounts and download the csv file containing the usage over multiple billing cycles downloading the csv file provided more detailed information than what you get on screen or the pdf download when attempting to download the csv and error message on a white screen reads “we’re sorry but something went wrong” this download is invaluable when helping clients to understand overages they have within a billing cycle could we please restore this previous functionality
| 1
|
1,063
| 3,535,993,816
|
IssuesEvent
|
2016-01-16 23:05:34
|
t3kt/vjzual2
|
https://api.github.com/repos/t3kt/vjzual2
|
closed
|
parameter smoothing in the linked transform module is buggy
|
bug control routing video processing
|
see #260.
instead of grabbing master parameters and then re-smoothing the modified values, grab the smoothed values from the master and modify them.
|
1.0
|
parameter smoothing in the linked transform module is buggy - see #260.
instead of grabbing master parameters and then re-smoothing the modified values, grab the smoothed values from the master and modify them.
|
process
|
parameter smoothing in the linked transform module is buggy see instead of grabbing master parameters and then re smoothing the modified values grab the smoothed values from the master and modify them
| 1
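The fix described in the vjzual2 record above (grab the already-smoothed values from the master and then modify them, instead of modifying raw values and re-smoothing) can be sketched in Python. All function and parameter names here are hypothetical, not taken from the actual codebase:

```python
def smooth(prev, target, alpha=0.1):
    # One exponential-smoothing step toward the target value.
    return prev + alpha * (target - prev)

def linked_value_buggy(master_raw, local_offset, prev):
    # Buggy order: modify the raw master value first, then smooth
    # again -- the value ends up double-smoothed.
    return smooth(prev, master_raw + local_offset)

def linked_value_fixed(master_smoothed, local_offset):
    # Fixed order: the master already smoothed its value, so the
    # linked module only applies its local modification.
    return master_smoothed + local_offset

print(linked_value_fixed(0.5, 0.25))  # -> 0.75
```

The point of the fix is simply that smoothing happens exactly once, in the master module.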
|
377,054
| 26,230,395,169
|
IssuesEvent
|
2023-01-04 23:18:03
|
PantomInach/DataBot
|
https://api.github.com/repos/PantomInach/DataBot
|
closed
|
[SET-UP] Readme must be updated
|
documentation good first issue important
|
Bot also requires PRESENCE INTENT, SERVER MEMBERS INTENT and **MESSAGE CONTENT INTENT** in Privileged Gateway Intents when setting it up on Discord Developer Portal.
Readme only reads PRESENCE INTENT and SERVER MEMBERS INTENT
**What version of the bot**
`2.1.0`
|
1.0
|
[SET-UP] Readme must be updated - Bot also requires PRESENCE INTENT, SERVER MEMBERS INTENT and **MESSAGE CONTENT INTENT** in Privileged Gateway Intents when setting it up on Discord Developer Portal.
Readme only reads PRESENCE INTENT and SERVER MEMBERS INTENT
**What version of the bot**
`2.1.0`
|
non_process
|
readme must be updated bot also requires presence intent server members intent and message content intent in privileged gateway intents when setting it up on discord developer portal readme only reads presence intent and server members intent what version of the bot
| 0
|
71,030
| 13,579,022,849
|
IssuesEvent
|
2020-09-20 10:54:10
|
joomla/joomla-cms
|
https://api.github.com/repos/joomla/joomla-cms
|
closed
|
[4.0] W3C Web Authentication (WebAuthn) Login layout
|
No Code Attached Yet
|
### Steps to reproduce the issue
Visit Joomla 4 user edit page on a SSL url
Visit the `W3C Web Authentication (WebAuthn) Login` tab
### Expected result
The layout not to be bunched up into a small column
### Actual result
This screenshot is half of my 27 inch iMac screen so approx 1600x1600px
So much wasted whitespace :)
The fieldset title `W3C Web Authentication (WebAuthn) Login` is immediately duplicated underneath and split into two lines
The `Authenticator added on 2020-06-18 20:24:16` is wrapped
Remove icon appears larger than the edit button
The edit icon is bunched up to the word Edit
The information is wrapped below.
<img width="1595" alt="Screenshot 2020-06-18 at 21 27 54" src="https://user-images.githubusercontent.com/400092/85068669-b7893280-b1aa-11ea-83b9-86f245055204.png">
### System information (as much as possible)
### Additional comments
|
1.0
|
[4.0] W3C Web Authentication (WebAuthn) Login layout - ### Steps to reproduce the issue
Visit Joomla 4 user edit page on a SSL url
Visit the `W3C Web Authentication (WebAuthn) Login` tab
### Expected result
The layout not to be bunched up into a small column
### Actual result
This screenshot is half of my 27 inch iMac screen so approx 1600x1600px
So much wasted whitespace :)
The fieldset title `W3C Web Authentication (WebAuthn) Login` is immediately duplicated underneath and split into two lines
The `Authenticator added on 2020-06-18 20:24:16` is wrapped
Remove icon appears larger than the edit button
The edit icon is bunched up to the word Edit
The information is wrapped below.
<img width="1595" alt="Screenshot 2020-06-18 at 21 27 54" src="https://user-images.githubusercontent.com/400092/85068669-b7893280-b1aa-11ea-83b9-86f245055204.png">
### System information (as much as possible)
### Additional comments
|
non_process
|
web authentication webauthn login layout steps to reproduce the issue visit joomla user edit page on a ssl url visit the web authentication webauthn login tab expected result the layout not to be bunched up into a small column actual result this screenshot is half of my inch imac screen so approx so much wasted whitespace the fieldset title web authentication webauthn login is immediately duplicated underneath and split into two lines the authenticator added on is wrapped remove icon appears larger than the edit button the edit icon is bunched up to the word edit the information is wrapped below img width alt screenshot at src system information as much as possible additional comments
| 0
|
5,242
| 8,038,514,241
|
IssuesEvent
|
2018-07-30 15:34:57
|
GoogleCloudPlatform/google-cloud-python
|
https://api.github.com/repos/GoogleCloudPlatform/google-cloud-python
|
closed
|
BigQuery: systest 'test_lists_datasets_by_label' failure
|
api: bigquery flaky testing type: process
|
See: https://circleci.com/gh/GoogleCloudPlatform/google-cloud-python/7193
```python
_________________________ test_list_datasets_by_label __________________________
client = <google.cloud.bigquery.client.Client object at 0x7f6cd03273d0>
to_delete = [Dataset(DatasetReference(u'precise-truck-742', u'list_datasets_by_label_1531507522596'))]
def test_list_datasets_by_label(client, to_delete):
dataset_id = 'list_datasets_by_label_{}'.format(_millis())
dataset = bigquery.Dataset(client.dataset(dataset_id))
dataset.labels = {'color': 'green'}
dataset = client.create_dataset(dataset) # API request
to_delete.append(dataset)
# [START bigquery_list_datasets_by_label]
# from google.cloud import bigquery
# client = bigquery.Client()
# The following label filter example will find datasets with an
# arbitrary 'color' label set to 'green'
label_filter = 'labels.color:green'
datasets = list(client.list_datasets(filter=label_filter))
if datasets:
print('Datasets filtered by {}:'.format(label_filter))
for dataset in datasets: # API request(s)
print('\t{}'.format(dataset.dataset_id))
else:
print('No datasets found with this filter.')
# [END bigquery_list_datasets_by_label]
> assert len(datasets) == 1
E assert 2 == 1
E + where 2 = len([<google.cloud.bigquery.dataset.DatasetListItem object at 0x7f6cd03a5e50>, <google.cloud.bigquery.dataset.DatasetListItem object at 0x7f6cd040fd90>])
../docs/bigquery/snippets.py:173: AssertionError
```
Likely overlapping test runs. Suggested fix is to use a label value with `_unique_id`, or else weaken the assertion (assert that the current dataset ID is in the set returned by the `list_datasets` call).
|
1.0
|
BigQuery: systest 'test_lists_datasets_by_label' failure - See: https://circleci.com/gh/GoogleCloudPlatform/google-cloud-python/7193
```python
_________________________ test_list_datasets_by_label __________________________
client = <google.cloud.bigquery.client.Client object at 0x7f6cd03273d0>
to_delete = [Dataset(DatasetReference(u'precise-truck-742', u'list_datasets_by_label_1531507522596'))]
def test_list_datasets_by_label(client, to_delete):
dataset_id = 'list_datasets_by_label_{}'.format(_millis())
dataset = bigquery.Dataset(client.dataset(dataset_id))
dataset.labels = {'color': 'green'}
dataset = client.create_dataset(dataset) # API request
to_delete.append(dataset)
# [START bigquery_list_datasets_by_label]
# from google.cloud import bigquery
# client = bigquery.Client()
# The following label filter example will find datasets with an
# arbitrary 'color' label set to 'green'
label_filter = 'labels.color:green'
datasets = list(client.list_datasets(filter=label_filter))
if datasets:
print('Datasets filtered by {}:'.format(label_filter))
for dataset in datasets: # API request(s)
print('\t{}'.format(dataset.dataset_id))
else:
print('No datasets found with this filter.')
# [END bigquery_list_datasets_by_label]
> assert len(datasets) == 1
E assert 2 == 1
E + where 2 = len([<google.cloud.bigquery.dataset.DatasetListItem object at 0x7f6cd03a5e50>, <google.cloud.bigquery.dataset.DatasetListItem object at 0x7f6cd040fd90>])
../docs/bigquery/snippets.py:173: AssertionError
```
Likely overlapping test runs. Suggested fix is to use a label value with `_unique_id`, or else weaken the assertion (assert that the current dataset ID is in the set returned by the `list_datasets` call).
|
process
|
bigquery systest test lists datasets by label failure see python test list datasets by label client to delete def test list datasets by label client to delete dataset id list datasets by label format millis dataset bigquery dataset client dataset dataset id dataset labels color green dataset client create dataset dataset api request to delete append dataset from google cloud import bigquery client bigquery client the following label filter example will find datasets with an arbitrary color label set to green label filter labels color green datasets list client list datasets filter label filter if datasets print datasets filtered by format label filter for dataset in datasets api request s print t format dataset dataset id else print no datasets found with this filter assert len datasets e assert e where len docs bigquery snippets py assertionerror likely overlapping test runs suggested fix is to use a label value with unique id or else weaken the assertion assert that the current dataset id is in the set returned by the list datasets call
| 1
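The suggested fix in the BigQuery systest record above, giving each test run a unique label value so overlapping runs cannot both match the `list_datasets` filter, could look roughly like this. This is a sketch, not the actual patch; `_millis` mirrors the helper already used in the snippet, and the alternative (weakening the assertion) is only described in the record, not shown here:

```python
import time

def _millis():
    # Millisecond timestamp, as used elsewhere in the snippet.
    return int(time.time() * 1000)

# Give each test run its own label value so list_datasets(filter=...)
# cannot match datasets created by a concurrently running test.
label_value = 'green_{}'.format(_millis())
label_filter = 'labels.color:{}'.format(label_value)

assert label_filter.startswith('labels.color:green_')
```

With a per-run value, `len(datasets) == 1` holds even when two CI jobs execute the test at the same time.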
|
14,816
| 18,150,961,771
|
IssuesEvent
|
2021-09-26 08:55:59
|
solid/specification
|
https://api.github.com/repos/solid/specification
|
closed
|
Determine the URIs for the specifications
|
status: Needs Process Help doc: Ecosystem
|
We need to have some commitment for the spec URIs. At the moment, we are using GitHub pages as an interim solution, however long term commitment and URI persistence is important for the project. We could transition different specs at different times as the project and documents evolve.
Basic things to consider:
* Under whose authority?
* What URI space? (under w3.org; community, wiki, member submission, or solidproject.org, github.io, w3id.org, ..)
This issue should be addressed by the Solid process and the W3C Solid Community Group.
|
1.0
|
Determine the URIs for the specifications - We need to have some commitment for the spec URIs. At the moment, we are using GitHub pages as an interim solution, however long term commitment and URI persistence is important for the project. We could transition different specs at different times as the project and documents evolve.
Basic things to consider:
* Under whose authority?
* What URI space? (under w3.org; community, wiki, member submission, or solidproject.org, github.io, w3id.org, ..)
This issue should be addressed by the Solid process and the W3C Solid Community Group.
|
process
|
determine the uris for the specifications we need to have some commitment for the spec uris at the moment we are using github pages as an interim solution however long term commitment and uri persistence is important for the project we could transition different specs at different times as the project and documents evolve basic things to consider under whose authority what uri space under org community wiki member submission or solidproject org github io org this issue should be addressed by the solid process and the solid community group
| 1
|
242,667
| 26,277,775,301
|
IssuesEvent
|
2023-01-07 01:09:17
|
vipinsun/eosio-web-ide
|
https://api.github.com/repos/vipinsun/eosio-web-ide
|
opened
|
CVE-2022-0536 (Medium) detected in follow-redirects-1.8.1.tgz
|
security vulnerability
|
## CVE-2022-0536 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>follow-redirects-1.8.1.tgz</b></p></summary>
<p>HTTP and HTTPS modules that follow redirects.</p>
<p>Library home page: <a href="https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.8.1.tgz">https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.8.1.tgz</a></p>
<p>Path to dependency file: /webapp/package.json</p>
<p>Path to vulnerable library: /webapp/node_modules/follow-redirects</p>
<p>
Dependency Hierarchy:
- http-server-0.11.1.tgz (Root Library)
- http-proxy-1.17.0.tgz
- :x: **follow-redirects-1.8.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Exposure of Sensitive Information to an Unauthorized Actor in NPM follow-redirects prior to 1.14.8.
<p>Publish Date: 2022-02-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-0536>CVE-2022-0536</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0536">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0536</a></p>
<p>Release Date: 2022-02-09</p>
<p>Fix Resolution (follow-redirects): 1.14.8</p>
<p>Direct dependency fix Resolution (http-server): 0.11.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-0536 (Medium) detected in follow-redirects-1.8.1.tgz - ## CVE-2022-0536 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>follow-redirects-1.8.1.tgz</b></p></summary>
<p>HTTP and HTTPS modules that follow redirects.</p>
<p>Library home page: <a href="https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.8.1.tgz">https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.8.1.tgz</a></p>
<p>Path to dependency file: /webapp/package.json</p>
<p>Path to vulnerable library: /webapp/node_modules/follow-redirects</p>
<p>
Dependency Hierarchy:
- http-server-0.11.1.tgz (Root Library)
- http-proxy-1.17.0.tgz
- :x: **follow-redirects-1.8.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Exposure of Sensitive Information to an Unauthorized Actor in NPM follow-redirects prior to 1.14.8.
<p>Publish Date: 2022-02-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-0536>CVE-2022-0536</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0536">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0536</a></p>
<p>Release Date: 2022-02-09</p>
<p>Fix Resolution (follow-redirects): 1.14.8</p>
<p>Direct dependency fix Resolution (http-server): 0.11.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in follow redirects tgz cve medium severity vulnerability vulnerable library follow redirects tgz http and https modules that follow redirects library home page a href path to dependency file webapp package json path to vulnerable library webapp node modules follow redirects dependency hierarchy http server tgz root library http proxy tgz x follow redirects tgz vulnerable library found in base branch master vulnerability details exposure of sensitive information to an unauthorized actor in npm follow redirects prior to publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution follow redirects direct dependency fix resolution http server step up your open source security game with mend
| 0
|
34,066
| 12,237,989,302
|
IssuesEvent
|
2020-05-04 19:00:05
|
uniquelyparticular/sync-moltin-to-shipengine
|
https://api.github.com/repos/uniquelyparticular/sync-moltin-to-shipengine
|
opened
|
WS-2020-0068 (Medium) detected in multiple libraries
|
security vulnerability
|
## WS-2020-0068 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>yargs-parser-10.1.0.tgz</b>, <b>yargs-parser-9.0.2.tgz</b>, <b>yargs-parser-13.1.1.tgz</b>, <b>yargs-parser-16.1.0.tgz</b></p></summary>
<p>
<details><summary><b>yargs-parser-10.1.0.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-10.1.0.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-10.1.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/sync-moltin-to-shipengine/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/sync-moltin-to-shipengine/node_modules/ts-jest/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- semantic-release-15.14.0.tgz (Root Library)
- commit-analyzer-6.3.3.tgz
- conventional-commits-parser-3.0.8.tgz
- meow-5.0.0.tgz
- :x: **yargs-parser-10.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>yargs-parser-9.0.2.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-9.0.2.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-9.0.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/sync-moltin-to-shipengine/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/sync-moltin-to-shipengine/node_modules/jsome/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- micro-dev-3.0.0.tgz (Root Library)
- jsome-2.5.0.tgz
- yargs-11.1.0.tgz
- :x: **yargs-parser-9.0.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>yargs-parser-13.1.1.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-13.1.1.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-13.1.1.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/sync-moltin-to-shipengine/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/sync-moltin-to-shipengine/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- jest-24.9.0.tgz (Root Library)
- jest-cli-24.9.0.tgz
- yargs-13.3.0.tgz
- :x: **yargs-parser-13.1.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>yargs-parser-16.1.0.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-16.1.0.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-16.1.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/sync-moltin-to-shipengine/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/sync-moltin-to-shipengine/node_modules/semantic-release/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- semantic-release-15.14.0.tgz (Root Library)
- yargs-15.0.2.tgz
- :x: **yargs-parser-16.1.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/uniquelyparticular/sync-moltin-to-shipengine/commit/d66c99a0c5b66580d24572646c2d7f6068ed184a">d66c99a0c5b66580d24572646c2d7f6068ed184a</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Affected versions of yargs-parser are vulnerable to prototype pollution. Arguments are not properly sanitized, allowing an attacker to modify the prototype of Object, causing the addition or modification of an existing property that will exist on all objects. Parsing the argument --foo.__proto__.bar baz' adds a bar property with value baz to all objects. This is only exploitable if attackers have control over the arguments being passed to yargs-parser.
<p>Publish Date: 2020-05-01
<p>URL: <a href=https://www.npmjs.com/advisories/1500>WS-2020-0068</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/package/yargs-parser">https://www.npmjs.com/package/yargs-parser</a></p>
<p>Release Date: 2020-05-04</p>
<p>Fix Resolution: https://www.npmjs.com/package/yargs-parser/v/18.1.2,https://www.npmjs.com/package/yargs-parser/v/15.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2020-0068 (Medium) detected in multiple libraries - ## WS-2020-0068 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>yargs-parser-10.1.0.tgz</b>, <b>yargs-parser-9.0.2.tgz</b>, <b>yargs-parser-13.1.1.tgz</b>, <b>yargs-parser-16.1.0.tgz</b></p></summary>
<p>
<details><summary><b>yargs-parser-10.1.0.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-10.1.0.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-10.1.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/sync-moltin-to-shipengine/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/sync-moltin-to-shipengine/node_modules/ts-jest/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- semantic-release-15.14.0.tgz (Root Library)
- commit-analyzer-6.3.3.tgz
- conventional-commits-parser-3.0.8.tgz
- meow-5.0.0.tgz
- :x: **yargs-parser-10.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>yargs-parser-9.0.2.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-9.0.2.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-9.0.2.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/sync-moltin-to-shipengine/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/sync-moltin-to-shipengine/node_modules/jsome/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- micro-dev-3.0.0.tgz (Root Library)
- jsome-2.5.0.tgz
- yargs-11.1.0.tgz
- :x: **yargs-parser-9.0.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>yargs-parser-13.1.1.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-13.1.1.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-13.1.1.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/sync-moltin-to-shipengine/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/sync-moltin-to-shipengine/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- jest-24.9.0.tgz (Root Library)
- jest-cli-24.9.0.tgz
- yargs-13.3.0.tgz
- :x: **yargs-parser-13.1.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>yargs-parser-16.1.0.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-16.1.0.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-16.1.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/sync-moltin-to-shipengine/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/sync-moltin-to-shipengine/node_modules/semantic-release/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- semantic-release-15.14.0.tgz (Root Library)
- yargs-15.0.2.tgz
- :x: **yargs-parser-16.1.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/uniquelyparticular/sync-moltin-to-shipengine/commit/d66c99a0c5b66580d24572646c2d7f6068ed184a">d66c99a0c5b66580d24572646c2d7f6068ed184a</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Affected versions of yargs-parser are vulnerable to prototype pollution. Arguments are not properly sanitized, allowing an attacker to modify the prototype of Object, causing the addition or modification of an existing property that will exist on all objects. Parsing the argument --foo.__proto__.bar baz' adds a bar property with value baz to all objects. This is only exploitable if attackers have control over the arguments being passed to yargs-parser.
<p>Publish Date: 2020-05-01
<p>URL: <a href=https://www.npmjs.com/advisories/1500>WS-2020-0068</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/package/yargs-parser">https://www.npmjs.com/package/yargs-parser</a></p>
<p>Release Date: 2020-05-04</p>
<p>Fix Resolution: https://www.npmjs.com/package/yargs-parser/v/18.1.2,https://www.npmjs.com/package/yargs-parser/v/15.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in multiple libraries ws medium severity vulnerability vulnerable libraries yargs parser tgz yargs parser tgz yargs parser tgz yargs parser tgz yargs parser tgz the mighty option parser used by yargs library home page a href path to dependency file tmp ws scm sync moltin to shipengine package json path to vulnerable library tmp ws scm sync moltin to shipengine node modules ts jest node modules yargs parser package json dependency hierarchy semantic release tgz root library commit analyzer tgz conventional commits parser tgz meow tgz x yargs parser tgz vulnerable library yargs parser tgz the mighty option parser used by yargs library home page a href path to dependency file tmp ws scm sync moltin to shipengine package json path to vulnerable library tmp ws scm sync moltin to shipengine node modules jsome node modules yargs parser package json dependency hierarchy micro dev tgz root library jsome tgz yargs tgz x yargs parser tgz vulnerable library yargs parser tgz the mighty option parser used by yargs library home page a href path to dependency file tmp ws scm sync moltin to shipengine package json path to vulnerable library tmp ws scm sync moltin to shipengine node modules yargs parser package json dependency hierarchy jest tgz root library jest cli tgz yargs tgz x yargs parser tgz vulnerable library yargs parser tgz the mighty option parser used by yargs library home page a href path to dependency file tmp ws scm sync moltin to shipengine package json path to vulnerable library tmp ws scm sync moltin to shipengine node modules semantic release node modules yargs parser package json dependency hierarchy semantic release tgz root library yargs tgz x yargs parser tgz vulnerable library found in head commit a href vulnerability details affected versions of yargs parser are vulnerable to prototype pollution arguments are not properly sanitized allowing an attacker to modify the prototype of object causing the addition or modification of an existing 
property that will exist on all objects parsing the argument foo proto bar baz adds a bar property with value baz to all objects this is only exploitable if attackers have control over the arguments being passed to yargs parser publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
22,292
| 2,648,594,575
|
IssuesEvent
|
2015-03-14 02:14:19
|
dhamp/eiskaltdcpp
|
https://api.github.com/repos/dhamp/eiskaltdcpp
|
opened
|
eiskaltdcpp-qt crashes
|
bug imported Priority-Medium
|
_From [unikum...@gmail.com](https://code.google.com/u/114963251942515168095/) on May 02, 2011 17:43:00_
What steps will reproduce the problem? 1. Launch eiskaltdcpp-qt What is the expected output? What do you see instead? $ eiskaltdcpp-qt
Signal handlers installed.
Loading: Hash database
Loading: Shared files
Loading: Queue
Loading: Users
Loaded icons for the user list
Loaded program icons
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted What version of the product are you using? On what operating system? eiskaltdcpp-qt-2.2.2-1 Archlinux i686. Please provide any additional information below.
_Original issue: http://code.google.com/p/eiskaltdc/issues/detail?id=1063_
|
1.0
|
eiskaltdcpp-qt crashes - _From [unikum...@gmail.com](https://code.google.com/u/114963251942515168095/) on May 02, 2011 17:43:00_
What steps will reproduce the problem? 1. Launch eiskaltdcpp-qt What is the expected output? What do you see instead? $ eiskaltdcpp-qt
Signal handlers installed.
Loading: Hash database
Loading: Shared files
Loading: Queue
Loading: Users
Loaded icons for the user list
Loaded program icons
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted What version of the product are you using? On what operating system? eiskaltdcpp-qt-2.2.2-1 Archlinux i686. Please provide any additional information below.
_Original issue: http://code.google.com/p/eiskaltdc/issues/detail?id=1063_
|
non_process
|
eiskaltdcpp qt crashes from on may what steps will reproduce the problem launch eiskaltdcpp qt what is the expected output what do you see instead eiskaltdcpp qt signal handlers installed loading hash database loading shared files loading queue loading users loaded icons for the user list loaded program icons terminate called after throwing an instance of std bad alloc what std bad alloc aborted what version of the product are you using on what operating system eiskaltdcpp qt archlinux please provide any additional information below original issue
| 0
|
9,983
| 13,025,612,156
|
IssuesEvent
|
2020-07-27 13:48:17
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
This is what I absolutely HATE about Azure. Article is just over 3 months old and there is NO Connect-AzAccount that shows up in the library.
|
Pri2 automation/svc cxp doc-enhancement process-automation/subsvc triaged
|
[Enter feedback here]
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 4799ebe4-d654-b9d2-e64f-923d025ad7cf
* Version Independent ID: f6519afb-bf55-c886-e3c3-d11174da3aea
* Content: [Create a graphical runbook in Azure Automation](https://docs.microsoft.com/en-us/azure/automation/learn/automation-tutorial-runbook-graphical)
* Content Source: [articles/automation/learn/automation-tutorial-runbook-graphical.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/automation/learn/automation-tutorial-runbook-graphical.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @MGoedtel
* Microsoft Alias: **magoedte**
|
1.0
|
This is what I absolutely HATE about Azure. Article is just over 3 months old and there is NO Connect-AzAccount that shows up in the library. -
[Enter feedback here]
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 4799ebe4-d654-b9d2-e64f-923d025ad7cf
* Version Independent ID: f6519afb-bf55-c886-e3c3-d11174da3aea
* Content: [Create a graphical runbook in Azure Automation](https://docs.microsoft.com/en-us/azure/automation/learn/automation-tutorial-runbook-graphical)
* Content Source: [articles/automation/learn/automation-tutorial-runbook-graphical.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/automation/learn/automation-tutorial-runbook-graphical.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @MGoedtel
* Microsoft Alias: **magoedte**
|
process
|
this is what i absolutely hate about azure article is just over months old and there is no connect azaccount that shows up in the library document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login mgoedtel microsoft alias magoedte
| 1
|
12,120
| 8,607,851,784
|
IssuesEvent
|
2018-11-18 04:10:53
|
AOSC-Dev/aosc-os-abbs
|
https://api.github.com/repos/AOSC-Dev/aosc-os-abbs
|
opened
|
patch: CVE-2018-6952
|
security
|
<!-- Please remove items do not apply. -->
**CVE IDs:** CVE-2018-6952
**Other security advisory IDs:** MGASA-2018-0448
**Descriptions:**
https://bugs.mageia.org/show_bug.cgi?id=23704
**Patches:** http://git.savannah.gnu.org/cgit/patch.git/commit/?id=9c986353e420ead6e706262bf204d6e03322c300
**PoC(s):** https://savannah.gnu.org/bugs/index.php?53133
**Architectural progress:**
<!-- Please remove any architecture to which the security vulnerabilities do not apply. -->
- [ ] AMD64 `amd64`
- [ ] 32-bit Optional Environment `optenv32`
- [ ] AArch64 `arm64`
- [ ] ARMv7 `armel`
- [ ] PowerPC 64-bit BE `ppc64`
- [ ] PowerPC 32-bit BE `powerpc`
- [ ] RISC-V 64-bit `riscv64`
|
True
|
patch: CVE-2018-6952 - <!-- Please remove items do not apply. -->
**CVE IDs:** CVE-2018-6952
**Other security advisory IDs:** MGASA-2018-0448
**Descriptions:**
https://bugs.mageia.org/show_bug.cgi?id=23704
**Patches:** http://git.savannah.gnu.org/cgit/patch.git/commit/?id=9c986353e420ead6e706262bf204d6e03322c300
**PoC(s):** https://savannah.gnu.org/bugs/index.php?53133
**Architectural progress:**
<!-- Please remove any architecture to which the security vulnerabilities do not apply. -->
- [ ] AMD64 `amd64`
- [ ] 32-bit Optional Environment `optenv32`
- [ ] AArch64 `arm64`
- [ ] ARMv7 `armel`
- [ ] PowerPC 64-bit BE `ppc64`
- [ ] PowerPC 32-bit BE `powerpc`
- [ ] RISC-V 64-bit `riscv64`
|
non_process
|
patch cve cve ids cve other security advisory ids mgasa descriptions patches poc s architectural progress bit optional environment armel powerpc bit be powerpc bit be powerpc risc v bit
| 0
|
100,138
| 30,626,556,254
|
IssuesEvent
|
2023-07-24 11:56:35
|
xamarin/xamarin-android
|
https://api.github.com/repos/xamarin/xamarin-android
|
opened
|
The APK with .NET 7 Android is twice the size compared to compiling with Xamarin.Android.
|
Area: App+Library Build needs-triage
|
### Android application type
.NET Android (net7.0-android, etc.)
### Affected platform version
.net 7 android
### Description
I am upgrading my app from Xamarin.Android to .NET 7, but the APK/AAB size is twice as large as the one built with Xamarin.Android.
Building and archiving the APK with .NET 7 (dotnet publish myapp.csproj -f net7.0-android -c Release) or from VSMac 17 results in a size that is twice as large as the APK/AAB built and archived with Xamarin.Android 13.
how can i reduce apk/aab, in xamarin.android is 30mb in .net7 android is 65mb ?
Xamarin Android project configuration
```
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
<ProductVersion>8.0.30703</ProductVersion>
<SchemaVersion>2.0</SchemaVersion>
<ProjectGuid>{782BE5B1-8B1B-4D8F-B154-2C003ABA8504}</ProjectGuid>
<ProjectTypeGuids>{EFBA0AD7-5A72-4C68-AF49-83D382785DCF};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
<TemplateGuid>{84dd83c5-0fe3-4294-9419-09e7c8ba324f}</TemplateGuid>
<OutputType>Library</OutputType>
<AppDesignerFolder>Properties</AppDesignerFolder>
<RootNamespace>Myapp.Droid</RootNamespace>
<AssemblyName>Myapp.Droid</AssemblyName>
<FileAlignment>512</FileAlignment>
<Deterministic>True</Deterministic>
<AndroidApplication>True</AndroidApplication>
<AndroidResgenFile>Resources\Resource.designer.cs</AndroidResgenFile>
<AndroidResgenClass>Resource</AndroidResgenClass>
<GenerateSerializationAssemblies>Off</GenerateSerializationAssemblies>
<TargetFrameworkVersion>v13.0</TargetFrameworkVersion>
<AndroidManifest>Properties\AndroidManifest.xml</AndroidManifest>
<MonoAndroidResourcePrefix>Resources</MonoAndroidResourcePrefix>
<MonoAndroidAssetsPrefix>Assets</MonoAndroidAssetsPrefix>
<AndroidEnableSGenConcurrent>true</AndroidEnableSGenConcurrent>
<AndroidUseAapt2>true</AndroidUseAapt2>
<AndroidHttpClientHandlerType>Xamarin.Android.Net.AndroidClientHandler</AndroidHttpClientHandlerType>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
<DebugSymbols>True</DebugSymbols>
<DebugType>portable</DebugType>
<Optimize>True</Optimize>
<OutputPath>bin\Release\</OutputPath>
<DefineConstants>TRACE</DefineConstants>
<ErrorReport>prompt</ErrorReport>
<WarningLevel>4</WarningLevel>
<AndroidManagedSymbols>true</AndroidManagedSymbols>
<AndroidUseSharedRuntime>False</AndroidUseSharedRuntime>
<AndroidLinkMode>SdkOnly</AndroidLinkMode>
<EmbedAssembliesIntoApk>True</EmbedAssembliesIntoApk>
</PropertyGroup>
```
vs .net7-android configuration
```
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net7.0-android</TargetFramework>
<SupportedOSPlatformVersion>27</SupportedOSPlatformVersion>
<OutputType>Exe</OutputType>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<ApplicationId>myapp.Android</ApplicationId>
<ApplicationVersion>1</ApplicationVersion>
<ApplicationDisplayVersion>1.0</ApplicationDisplayVersion>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
<AndroidManagedSymbols>true</AndroidManagedSymbols>
<AndroidUseSharedRuntime>False</AndroidUseSharedRuntime>
<AndroidLinkMode>SdkOnly</AndroidLinkMode>
<EmbedAssembliesIntoApk>True</EmbedAssembliesIntoApk>
<Optimize>True</Optimize>
</PropertyGroup>
```
nuget packages used in main project
```
<ItemGroup>
<PackageReference Include="Xamarin.AndroidX.AppCompat" Version="1.6.1.3" />
<PackageReference Include="Xamarin.AndroidX.Lifecycle.Common" Version="2.6.1.3" />
<PackageReference Include="Xamarin.Firebase.Messaging" Version="123.1.2.2" />
<PackageReference Include="Xamarin.AndroidX.Work.Runtime" Version="2.8.1.3" />
<PackageReference Include="Xamarin.AndroidX.ConstraintLayout" Version="2.1.4.6" />
<PackageReference Include="Xamarin.AndroidX.Lifecycle.Process" Version="2.6.1.3" />
<PackageReference Include="Xamarin.Google.Android.Material" Version="1.9.0.2" />
<PackageReference Include="Xamarin.AndroidX.Fragment" Version="1.6.0.1" />
<PackageReference Include="Xamarin.Controls.SignaturePad" Version="3.0.0" />
<PackageReference Include="Xamarin.AndroidX.Navigation.Fragment" Version="2.6.0.1" />
</ItemGroup>
```
nuget packages used in submodules projects
```
<ItemGroup>
<PackageReference Include="Microsoft.Toolkit.Mvvm" Version="7.1.2" />
<PackageReference Include="Xamarin.Essentials" Version="1.7.7" />
</ItemGroup>
```
```
<ItemGroup>
<PackageReference Include="Xamarin.AndroidX.AppCompat" Version="1.6.1.3" />
<PackageReference Include="Xamarin.AndroidX.RecyclerView" Version="1.3.0.3" />
<PackageReference Include="Xamarin.Google.Android.Material" Version="1.9.0.2" />
<PackageReference Include="Com.Airbnb.Android.Lottie" Version="4.2.2" />
<PackageReference Include="System.Reactive" Version="6.0.0" />
<PackageReference Include="Refractored.GifImageView" Version="2.0.0" />
<PackageReference Include="Mindscape.Raygun4Net" Version="5.6.0" />
</ItemGroup>
```
```
<ItemGroup>
<PackageReference Include="Xamarin.AndroidX.Camera.Camera2" Version="1.2.3.1" />
<PackageReference Include="Xamarin.AndroidX.ConstraintLayout" Version="2.1.4.6" />
<PackageReference Include="Xamarin.AndroidX.AppCompat" Version="1.6.1.3" />
<PackageReference Include="Xamarin.Google.Android.Material" Version="1.9.0.2" />
<PackageReference Include="Xamarin.AndroidX.Camera.View" Version="1.2.3.1" />
<PackageReference Include="Xamarin.AndroidX.Camera.Lifecycle" Version="1.2.3.1" />
</ItemGroup>
```
```
<ItemGroup>
<PackageReference Include="System.Drawing.Common" Version="7.0.0" />
</ItemGroup>
```
```
<ItemGroup>
<PackageReference Include="SQLiteNetExtensions" Version="2.1.0" />
<PackageReference Include="NLog" Version="5.2.2" />
<PackageReference Include="libphonenumber-csharp" Version="8.13.16" />
<PackageReference Include="Microsoft.AppCenter.Crashes" Version="5.0.2" />
<PackageReference Include="Microsoft.AppCenter.Analytics" Version="5.0.2" />
<PackageReference Include="Xamarin.Essentials" Version="1.7.7" />
<PackageReference Include="SkiaSharp" Version="2.88.3" />
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
</ItemGroup>
```
runtimeconfiguration.json
```
{
"runtimeOptions": {
"tfm": "net7.0",
"frameworks": [
{
"name": "Microsoft.NETCore.App",
"version": "7.0.0"
},
{
"name": "Microsoft.Android",
"version": ""
}
],
"configProperties": {
"Microsoft.Extensions.DependencyInjection.VerifyOpenGenericServiceTrimmability": false,
"System.AggressiveAttributeTrimming": true,
"System.ComponentModel.TypeConverter.EnableUnsafeBinaryFormatterInDesigntimeLicenseContextSerialization": false,
"System.Diagnostics.Debugger.IsSupported": false,
"System.Diagnostics.Tracing.EventSource.IsSupported": false,
"System.Globalization.Invariant": false,
"System.Net.Http.EnableActivityPropagation": false,
"System.Net.Http.UseNativeHttpHandler": true,
"System.Reflection.Metadata.MetadataUpdater.IsSupported": false,
"System.Reflection.NullabilityInfoContext.IsSupported": false,
"System.Resources.ResourceManager.AllowCustomResourceTypes": false,
"System.Resources.UseSystemResourceKeys": true,
"System.Runtime.InteropServices.BuiltInComInterop.IsSupported": false,
"System.Runtime.InteropServices.EnableConsumingManagedCodeFromNativeHosting": false,
"System.Runtime.InteropServices.EnableCppCLIHostActivation": false,
"System.Runtime.Serialization.EnableUnsafeBinaryFormatterSerialization": false,
"System.StartupHookProvider.IsSupported": false,
"System.Threading.Thread.EnableAutoreleasePool": false,
"System.Text.Encoding.EnableUnsafeUTF7Encoding": false,
"Xamarin.Android.Net.UseNegotiateAuthentication": false
}
}
}
```
### Steps to Reproduce
dotnet publish myapp.csproj -f net7.0-android -c Release
### Did you find any workaround?
no
### Relevant log output
_No response_
|
1.0
|
The APK with .NET 7 Android is twice the size compared to compiling with Xamarin.Android. - ### Android application type
.NET Android (net7.0-android, etc.)
### Affected platform version
.net 7 android
### Description
I am upgrading my app from Xamarin.Android to .NET 7, but the APK/AAB size is twice as large as the one built with Xamarin.Android.
Building and archiving the APK with .NET 7 (dotnet publish myapp.csproj -f net7.0-android -c Release) or from VSMac 17 results in a size that is twice as large as the APK/AAB built and archived with Xamarin.Android 13.
how can i reduce apk/aab, in xamarin.android is 30mb in .net7 android is 65mb ?
Xamarin Android project configuration
```
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
<ProductVersion>8.0.30703</ProductVersion>
<SchemaVersion>2.0</SchemaVersion>
<ProjectGuid>{782BE5B1-8B1B-4D8F-B154-2C003ABA8504}</ProjectGuid>
<ProjectTypeGuids>{EFBA0AD7-5A72-4C68-AF49-83D382785DCF};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
<TemplateGuid>{84dd83c5-0fe3-4294-9419-09e7c8ba324f}</TemplateGuid>
<OutputType>Library</OutputType>
<AppDesignerFolder>Properties</AppDesignerFolder>
<RootNamespace>Myapp.Droid</RootNamespace>
<AssemblyName>Myapp.Droid</AssemblyName>
<FileAlignment>512</FileAlignment>
<Deterministic>True</Deterministic>
<AndroidApplication>True</AndroidApplication>
<AndroidResgenFile>Resources\Resource.designer.cs</AndroidResgenFile>
<AndroidResgenClass>Resource</AndroidResgenClass>
<GenerateSerializationAssemblies>Off</GenerateSerializationAssemblies>
<TargetFrameworkVersion>v13.0</TargetFrameworkVersion>
<AndroidManifest>Properties\AndroidManifest.xml</AndroidManifest>
<MonoAndroidResourcePrefix>Resources</MonoAndroidResourcePrefix>
<MonoAndroidAssetsPrefix>Assets</MonoAndroidAssetsPrefix>
<AndroidEnableSGenConcurrent>true</AndroidEnableSGenConcurrent>
<AndroidUseAapt2>true</AndroidUseAapt2>
<AndroidHttpClientHandlerType>Xamarin.Android.Net.AndroidClientHandler</AndroidHttpClientHandlerType>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
<DebugSymbols>True</DebugSymbols>
<DebugType>portable</DebugType>
<Optimize>True</Optimize>
<OutputPath>bin\Release\</OutputPath>
<DefineConstants>TRACE</DefineConstants>
<ErrorReport>prompt</ErrorReport>
<WarningLevel>4</WarningLevel>
<AndroidManagedSymbols>true</AndroidManagedSymbols>
<AndroidUseSharedRuntime>False</AndroidUseSharedRuntime>
<AndroidLinkMode>SdkOnly</AndroidLinkMode>
<EmbedAssembliesIntoApk>True</EmbedAssembliesIntoApk>
</PropertyGroup>
```
vs .net7-android configuration
```
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net7.0-android</TargetFramework>
<SupportedOSPlatformVersion>27</SupportedOSPlatformVersion>
<OutputType>Exe</OutputType>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<ApplicationId>myapp.Android</ApplicationId>
<ApplicationVersion>1</ApplicationVersion>
<ApplicationDisplayVersion>1.0</ApplicationDisplayVersion>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
<AndroidManagedSymbols>true</AndroidManagedSymbols>
<AndroidUseSharedRuntime>False</AndroidUseSharedRuntime>
<AndroidLinkMode>SdkOnly</AndroidLinkMode>
<EmbedAssembliesIntoApk>True</EmbedAssembliesIntoApk>
<Optimize>True</Optimize>
</PropertyGroup>
```
nuget packages used in main project
```
<ItemGroup>
<PackageReference Include="Xamarin.AndroidX.AppCompat" Version="1.6.1.3" />
<PackageReference Include="Xamarin.AndroidX.Lifecycle.Common" Version="2.6.1.3" />
<PackageReference Include="Xamarin.Firebase.Messaging" Version="123.1.2.2" />
<PackageReference Include="Xamarin.AndroidX.Work.Runtime" Version="2.8.1.3" />
<PackageReference Include="Xamarin.AndroidX.ConstraintLayout" Version="2.1.4.6" />
<PackageReference Include="Xamarin.AndroidX.Lifecycle.Process" Version="2.6.1.3" />
<PackageReference Include="Xamarin.Google.Android.Material" Version="1.9.0.2" />
<PackageReference Include="Xamarin.AndroidX.Fragment" Version="1.6.0.1" />
<PackageReference Include="Xamarin.Controls.SignaturePad" Version="3.0.0" />
<PackageReference Include="Xamarin.AndroidX.Navigation.Fragment" Version="2.6.0.1" />
</ItemGroup>
```
nuget packages used in submodules projects
```
<ItemGroup>
<PackageReference Include="Microsoft.Toolkit.Mvvm" Version="7.1.2" />
<PackageReference Include="Xamarin.Essentials" Version="1.7.7" />
</ItemGroup>
```
```
<ItemGroup>
<PackageReference Include="Xamarin.AndroidX.AppCompat" Version="1.6.1.3" />
<PackageReference Include="Xamarin.AndroidX.RecyclerView" Version="1.3.0.3" />
<PackageReference Include="Xamarin.Google.Android.Material" Version="1.9.0.2" />
<PackageReference Include="Com.Airbnb.Android.Lottie" Version="4.2.2" />
<PackageReference Include="System.Reactive" Version="6.0.0" />
<PackageReference Include="Refractored.GifImageView" Version="2.0.0" />
<PackageReference Include="Mindscape.Raygun4Net" Version="5.6.0" />
</ItemGroup>
```
```
<ItemGroup>
<PackageReference Include="Xamarin.AndroidX.Camera.Camera2" Version="1.2.3.1" />
<PackageReference Include="Xamarin.AndroidX.ConstraintLayout" Version="2.1.4.6" />
<PackageReference Include="Xamarin.AndroidX.AppCompat" Version="1.6.1.3" />
<PackageReference Include="Xamarin.Google.Android.Material" Version="1.9.0.2" />
<PackageReference Include="Xamarin.AndroidX.Camera.View" Version="1.2.3.1" />
<PackageReference Include="Xamarin.AndroidX.Camera.Lifecycle" Version="1.2.3.1" />
</ItemGroup>
```
```
<ItemGroup>
<PackageReference Include="System.Drawing.Common" Version="7.0.0" />
</ItemGroup>
```
```
<ItemGroup>
<PackageReference Include="SQLiteNetExtensions" Version="2.1.0" />
<PackageReference Include="NLog" Version="5.2.2" />
<PackageReference Include="libphonenumber-csharp" Version="8.13.16" />
<PackageReference Include="Microsoft.AppCenter.Crashes" Version="5.0.2" />
<PackageReference Include="Microsoft.AppCenter.Analytics" Version="5.0.2" />
<PackageReference Include="Xamarin.Essentials" Version="1.7.7" />
<PackageReference Include="SkiaSharp" Version="2.88.3" />
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
</ItemGroup>
```
runtimeconfiguration.json
```
{
"runtimeOptions": {
"tfm": "net7.0",
"frameworks": [
{
"name": "Microsoft.NETCore.App",
"version": "7.0.0"
},
{
"name": "Microsoft.Android",
"version": ""
}
],
"configProperties": {
"Microsoft.Extensions.DependencyInjection.VerifyOpenGenericServiceTrimmability": false,
"System.AggressiveAttributeTrimming": true,
"System.ComponentModel.TypeConverter.EnableUnsafeBinaryFormatterInDesigntimeLicenseContextSerialization": false,
"System.Diagnostics.Debugger.IsSupported": false,
"System.Diagnostics.Tracing.EventSource.IsSupported": false,
"System.Globalization.Invariant": false,
"System.Net.Http.EnableActivityPropagation": false,
"System.Net.Http.UseNativeHttpHandler": true,
"System.Reflection.Metadata.MetadataUpdater.IsSupported": false,
"System.Reflection.NullabilityInfoContext.IsSupported": false,
"System.Resources.ResourceManager.AllowCustomResourceTypes": false,
"System.Resources.UseSystemResourceKeys": true,
"System.Runtime.InteropServices.BuiltInComInterop.IsSupported": false,
"System.Runtime.InteropServices.EnableConsumingManagedCodeFromNativeHosting": false,
"System.Runtime.InteropServices.EnableCppCLIHostActivation": false,
"System.Runtime.Serialization.EnableUnsafeBinaryFormatterSerialization": false,
"System.StartupHookProvider.IsSupported": false,
"System.Threading.Thread.EnableAutoreleasePool": false,
"System.Text.Encoding.EnableUnsafeUTF7Encoding": false,
"Xamarin.Android.Net.UseNegotiateAuthentication": false
}
}
}
```
### Steps to Reproduce
dotnet publish myapp.csproj -f net7.0-android -c Release
### Did you find any workaround?
no
### Relevant log output
_No response_
|
non_process
|
the apk with net android is twice the size compared to compiling with xamarin android android application type net android android etc affected platform version net android description i am upgrading my app from xamarin android to net but the apk aab size is twice as large as the one built with xamarin android building and archiving the apk with net dotnet publish myapp csproj f android c release or from vsmac results in a size that is twice as large as the apk aab built and archived with xamarin android how can i reduce apk aab in xamarin android is in android is xamarin android project configuration project toolsversion defaulttargets build xmlns debug anycpu library properties myapp droid myapp droid true true resources resource designer cs resource off properties androidmanifest xml resources assets true true xamarin android net androidclienthandler true portable true bin release trace prompt true false sdkonly true vs android configuration android exe enable enable myapp android true false sdkonly true true nuget packages used in main project nuget packages used in submodules projects runtimeconfiguration json runtimeoptions tfm frameworks name microsoft netcore app version name microsoft android version configproperties microsoft extensions dependencyinjection verifyopengenericservicetrimmability false system aggressiveattributetrimming true system componentmodel typeconverter enableunsafebinaryformatterindesigntimelicensecontextserialization false system diagnostics debugger issupported false system diagnostics tracing eventsource issupported false system globalization invariant false system net http enableactivitypropagation false system net http usenativehttphandler true system reflection metadata metadataupdater issupported false system reflection nullabilityinfocontext issupported false system resources resourcemanager allowcustomresourcetypes false system resources usesystemresourcekeys true system runtime interopservices builtincominterop issupported 
false system runtime interopservices enableconsumingmanagedcodefromnativehosting false system runtime interopservices enablecppclihostactivation false system runtime serialization enableunsafebinaryformatterserialization false system startuphookprovider issupported false system threading thread enableautoreleasepool false system text encoding false xamarin android net usenegotiateauthentication false steps to reproduce dotnet publish myapp csproj f android c release did you find any workaround no relevant log output no response
| 0
|
203
| 2,612,505,466
|
IssuesEvent
|
2015-02-27 15:00:56
|
Graylog2/graylog2-server
|
https://api.github.com/repos/Graylog2/graylog2-server
|
closed
|
Support extractors for inputs running on radio nodes
|
inputs processing
|
At the moment, it seems to be possible to only have an extractor set per radio input, but not for inputs actually running on the radio.
|
1.0
|
Support extractors for inputs running on radio nodes - At the moment, it seems to be possible to only have an extractor set per radio input, but not for inputs actually running on the radio.
|
process
|
support extractors for inputs running on radio nodes at the moment it seems to be possible to only have an extractor set per radio input but not for inputs actually running on the radio
| 1
|
6,755
| 9,881,612,513
|
IssuesEvent
|
2019-06-24 15:01:49
|
googleapis/api-common-java
|
https://api.github.com/repos/googleapis/api-common-java
|
closed
|
Bad organization in pom.xml
|
type: process
|
This looks suspiciously like the output of a default toString method:
```xml
<developer>
<id>GoogleAPIs</id>
<name>GoogleAPIs</name>
<email>googleapis@googlegroups.com</email>
<url>https://github.com/googleapis</url>
<organization>org.apache.maven.model.Organization@10e5337</organization>
<organizationUrl>https://www.google.com</organizationUrl>
</developer>
```
|
1.0
|
Bad organization in pom.xml - This looks suspiciously like the output of a default toString method:
```xml
<developer>
<id>GoogleAPIs</id>
<name>GoogleAPIs</name>
<email>googleapis@googlegroups.com</email>
<url>https://github.com/googleapis</url>
<organization>org.apache.maven.model.Organization@10e5337</organization>
<organizationUrl>https://www.google.com</organizationUrl>
</developer>
```
|
process
|
bad organization in pom xml this looks suspiciously like the output of a default tostring method xml googleapis googleapis googleapis googlegroups com org apache maven model organization
| 1
|
43,486
| 2,889,807,536
|
IssuesEvent
|
2015-06-13 19:37:30
|
damonkohler/sl4a
|
https://api.github.com/repos/damonkohler/sl4a
|
opened
|
Ability to List Installed Packages
|
auto-migrated Priority-Medium Type-Enhancement
|
_From @GoogleCodeExporter on May 31, 2015 11:27_
```
The AndroidFacade includes getPackageVersion(String packageName) and
getPackageVersionCode(String packageName). This seems woefully incomplete. At a
minimum, we should have a way to return a list of installed packages, and also
ways to retrieve the package Market URLs.
```
Original issue reported on code.google.com by `inebriec...@gmail.com` on 15 Aug 2010 at 4:38
_Copied from original issue: damonkohler/android-scripting#397_
|
1.0
|
Ability to List Installed Packages - _From @GoogleCodeExporter on May 31, 2015 11:27_
```
The AndroidFacade includes getPackageVersion(String packageName) and
getPackageVersionCode(String packageName). This seems woefully incomplete. At a
minimum, we should have a way to return a list of installed packages, and also
ways to retrieve the package Market URLs.
```
Original issue reported on code.google.com by `inebriec...@gmail.com` on 15 Aug 2010 at 4:38
_Copied from original issue: damonkohler/android-scripting#397_
|
non_process
|
ability to list installed packages from googlecodeexporter on may the androidfacade includes getpackageversion string packagename and getpackageversioncode string packagename this seems woefully incomplete at a minimum we should have a way to return a list of installed packages and also ways to retrieve the package market urls original issue reported on code google com by inebriec gmail com on aug at copied from original issue damonkohler android scripting
| 0
|
11,706
| 14,545,539,804
|
IssuesEvent
|
2020-12-15 19:48:29
|
pacificclimate/quail
|
https://api.github.com/repos/pacificclimate/quail
|
closed
|
Cold Spell Duration Index
|
process
|
## Description
This function takes a climdexInput object as input and computes the climdex index CSDI (Cold Spell Duration Index).
## Function to wrap
[`climdex.csdi`](https://github.com/pacificclimate/climdex.pcic/blob/master/R/climdex.r#L1053)
|
1.0
|
Cold Spell Duration Index - ## Description
This function takes a climdexInput object as input and computes the climdex index CSDI (Cold Spell Duration Index).
## Function to wrap
[`climdex.csdi`](https://github.com/pacificclimate/climdex.pcic/blob/master/R/climdex.r#L1053)
|
process
|
cold spell duration index description this function takes a climdexinput object as input and computes the climdex index csdi cold spell duration index function to wrap
| 1
|
199,649
| 22,705,809,687
|
IssuesEvent
|
2022-07-05 14:32:56
|
nexmo-community/vonage-campus-client-sdk-workshop
|
https://api.github.com/repos/nexmo-community/vonage-campus-client-sdk-workshop
|
opened
|
tfjs-node-1.2.9.tgz: 6 vulnerabilities (highest severity is: 8.6)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tfjs-node-1.2.9.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-37713](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37713) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.6 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2021-37712](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.6 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2021-37701](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.6 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2021-32804](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.1 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2021-32803](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.1 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2020-7788](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | ini-1.3.5.tgz | Transitive | 1.2.10 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-37713</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted. This is, in part, accomplished by sanitizing absolute paths of entries within the archive, skipping archive entries that contain `..` path portions, and resolving the sanitized paths against the extraction target directory. This logic was insufficient on Windows systems when extracting tar files that contained a path that was not an absolute path, but specified a drive letter different from the extraction target, such as `C:some\path`. If the drive letter does not match the extraction target, for example `D:\extraction\dir`, then the result of `path.resolve(extractionDirectory, entryPath)` would resolve against the current working directory on the `C:` drive, rather than the extraction target directory. Additionally, a `..` portion of the path could occur immediately after the drive letter, such as `C:../foo`, and was not properly sanitized by the logic that checked for `..` within the normalized and split portions of the path. This only affects users of `node-tar` on Windows systems. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. There is no reasonable way to work around this issue without performing the same path normalization procedures that node-tar now does. Users are encouraged to upgrade to the latest patched versions of node-tar, rather than attempt to sanitize paths themselves.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37713>CVE-2021-37713</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-5955-9wpr-37jh">https://github.com/npm/node-tar/security/advisories/GHSA-5955-9wpr-37jh</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.18</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-37712</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p>
</p>
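
The name collision at the heart of this advisory is ordinary Unicode normalization. A short sketch (the file name is arbitrary, chosen only for illustration):

```javascript
// Two different strings that case-preserving, normalizing filesystems
// (e.g. APFS or HFS+) treat as the same file name:
const composed = 'caf\u00e9';     // 'café' with precomposed U+00E9
const decomposed = 'cafe\u0301';  // 'cafe' + combining acute accent

// As raw cache keys the two strings are distinct...
const distinctKeys = composed !== decomposed;
// ...but after normalization they are one and the same name:
const sameName = composed.normalize('NFC') === decomposed.normalize('NFC');
```

A directory cached under one form and a symlink created under the other lets the symlink slip past a cache keyed on the raw strings, which is the bypass described above.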
<p></p>
### CVSS 3 Score Details (<b>8.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.18</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-37701</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\` and `/` characters as path separators, however `\` is a valid filename character on posix systems. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. 
If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701>CVE-2021-37701</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc">https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.16</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-32804</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 6.1.1, 5.0.6, 4.4.14, and 3.3.2 has an arbitrary File Creation/Overwrite vulnerability due to insufficient absolute path sanitization. node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the `preservePaths` flag is not set to `true`. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example `/home/user/.bashrc` would turn into `home/user/.bashrc`. This logic was insufficient when file paths contained repeated path roots such as `////home/user/.bashrc`. `node-tar` would only strip a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. `///home/user/.bashrc`) would still resolve to an absolute path, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.2, 4.4.14, 5.0.6 and 6.1.1. Users may work around this vulnerability without upgrading by creating a custom `onentry` method which sanitizes the `entry.path` or a `filter` method which removes entries with absolute paths. See referenced GitHub Advisory for details. Be aware of CVE-2021-32803 which fixes a similar bug in later versions of tar.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804>CVE-2021-32804</a></p>
</p>
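
The root-stripping bug takes one line to reproduce (a sketch of the flawed logic, not node-tar's actual source):

```javascript
const entry = '////home/user/.bashrc';

// Stripping a single path root, as the vulnerable versions did:
const strippedOnce = entry.replace(/^\//, '');  // '///home/user/.bashrc' — still absolute
// Stripping the root repeatedly, as the fix effectively does:
const strippedAll = entry.replace(/^\/+/, '');  // 'home/user/.bashrc' — now relative
```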
<p></p>
### CVSS 3 Score Details (<b>8.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9">https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution (tar): 4.4.14</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-32803</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution (tar): 4.4.15</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7788</summary>
### Vulnerable Library - <b>ini-1.3.5.tgz</b>
<p>An ini encoder/decoder for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/ini/-/ini-1.3.5.tgz">https://registry.npmjs.org/ini/-/ini-1.3.5.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/ini/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- node-pre-gyp-0.13.0.tgz
- rc-1.2.8.tgz
- :x: **ini-1.3.5.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package ini before 1.3.6. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context.
<p>Publish Date: 2020-12-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788>CVE-2020-7788</a></p>
</p>
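
The pollution mechanism can be shown without the `ini` package itself; the helper below is a deliberately naive stand-in that makes the same mistake of using an attacker-controlled section name as a plain object key:

```javascript
// Naive stand-in for a section-creating parser (NOT ini's real code):
function getSection(store, name) {
  if (!store[name]) store[name] = {};
  return store[name]; // name '__proto__' returns Object.prototype itself
}

const store = {};
const section = getSection(store, '__proto__'); // as if parsing '[__proto__]'
section.polluted = 'yes';                       // as if parsing 'polluted = yes'

const leaked = ({}).polluted;     // 'yes' — every object now inherits the key
delete Object.prototype.polluted; // undo the pollution
```

In `ini` before 1.3.6 a `[__proto__]` section header reached the same code path; the patched release filters that key out.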
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788</a></p>
<p>Release Date: 2020-12-11</p>
<p>Fix Resolution (ini): 1.3.6</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
tfjs-node-1.2.9.tgz: 6 vulnerabilities (highest severity is: 8.6) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tfjs-node-1.2.9.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2021-37713](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37713) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.6 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2021-37712](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.6 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2021-37701](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.6 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2021-32804](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.1 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2021-32803](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 8.1 | tar-4.4.13.tgz | Transitive | 1.2.10 | ✅ |
| [CVE-2020-7788](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.3 | ini-1.3.5.tgz | Transitive | 1.2.10 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-37713</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b></p>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted. This is, in part, accomplished by sanitizing absolute paths of entries within the archive, skipping archive entries that contain `..` path portions, and resolving the sanitized paths against the extraction target directory. This logic was insufficient on Windows systems when extracting tar files that contained a path that was not an absolute path, but specified a drive letter different from the extraction target, such as `C:some\path`. If the drive letter does not match the extraction target, for example `D:\extraction\dir`, then the result of `path.resolve(extractionDirectory, entryPath)` would resolve against the current working directory on the `C:` drive, rather than the extraction target directory. Additionally, a `..` portion of the path could occur immediately after the drive letter, such as `C:../foo`, and was not properly sanitized by the logic that checked for `..` within the normalized and split portions of the path. This only affects users of `node-tar` on Windows systems. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. There is no reasonable way to work around this issue without performing the same path normalization procedures that node-tar now does. Users are encouraged to upgrade to the latest patched versions of node-tar, rather than attempt to sanitize paths themselves.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37713>CVE-2021-37713</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-5955-9wpr-37jh">https://github.com/npm/node-tar/security/advisories/GHSA-5955-9wpr-37jh</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.18</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-37712</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b></p>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.18</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-37701</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b></p>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\` and `/` characters as path separators, however `\` is a valid filename character on posix systems. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. 
If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc.
<p>Publish Date: 2021-08-31
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701>CVE-2021-37701</a></p>
</p>
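The separator confusion described above can be sketched in a few lines. This is illustrative only, not node-tar's actual code: on POSIX systems `\` is a legal filename character, so a cache-normalization step that splits on both `/` and `\` treats a single filename as two path components.

```javascript
// Illustrative sketch (not node-tar's implementation) of the flawed
// cache path-splitting described in CVE-2021-37701.
const posixSplit = (p) => p.split("/");     // "/" is the only POSIX separator
const naiveSplit = (p) => p.split(/[/\\]/); // flawed: also splits on "\"

// "dir\entry" (written "dir\\entry" in JS source) is ONE POSIX filename:
console.log(posixSplit("dir\\entry").length); // 1 component
console.log(naiveSplit("dir\\entry").length); // 2 components -> cache confusion
```

With the naive splitter, a crafted archive entry named `dir\entry` collides in the directory cache with a real `dir/entry` path, which is the opening the advisory describes.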
<p></p>
### CVSS 3 Score Details (<b>8.6</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc">https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc</a></p>
<p>Release Date: 2021-08-31</p>
<p>Fix Resolution (tar): 4.4.16</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-32804</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b></p>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 6.1.1, 5.0.6, 4.4.14, and 3.3.2 has an arbitrary File Creation/Overwrite vulnerability due to insufficient absolute path sanitization. node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the `preservePaths` flag is not set to `true`. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example `/home/user/.bashrc` would turn into `home/user/.bashrc`. This logic was insufficient when file paths contained repeated path roots such as `////home/user/.bashrc`. `node-tar` would only strip a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. `///home/user/.bashrc`) would still resolve to an absolute path, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.2, 4.4.14, 5.0.6 and 6.1.1. Users may work around this vulnerability without upgrading by creating a custom `onentry` method which sanitizes the `entry.path` or a `filter` method which removes entries with absolute paths. See referenced GitHub Advisory for details. Be aware of CVE-2021-32803 which fixes a similar bug in later versions of tar.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804>CVE-2021-32804</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>8.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9">https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution (tar): 4.4.14</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-32803</summary>
### Vulnerable Library - <b>tar-4.4.13.tgz</b></p>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.13.tgz">https://registry.npmjs.org/tar/-/tar-4.4.13.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- :x: **tar-4.4.13.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p>
</p>
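The directory-cache hazard described above can be modeled with a toy function. This is an illustrative sketch, not node-tar's code: a cache hit skips both mkdir and the symlink check, which is exactly the window the advisory describes.

```javascript
// Toy model of the CVE-2021-32803 cache bypass (illustrative only).
const dirCache = new Set();
const made = [];

function ensureDir(p, isSymlinkNow) {
  if (dirCache.has(p)) return "cache-hit (checks skipped)"; // the bug
  if (isSymlinkNow) throw new Error("refusing: path is a symlink");
  made.push(p); // stand-in for mkdir
  dirCache.add(p);
  return "created";
}

console.log(ensureDir("pkg", false)); // "created"
// Attacker swaps "pkg" for a symlink between archive entries; the
// cached path bypasses the symlink check instead of throwing:
console.log(ensureDir("pkg", true)); // "cache-hit (checks skipped)"
```

The fix in the patched releases is, in effect, to make the symlink check independent of the cache hit.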
<p></p>
### CVSS 3 Score Details (<b>8.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution (tar): 4.4.15</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-7788</summary>
### Vulnerable Library - <b>ini-1.3.5.tgz</b></p>
<p>An ini encoder/decoder for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/ini/-/ini-1.3.5.tgz">https://registry.npmjs.org/ini/-/ini-1.3.5.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/ini/package.json</p>
<p>
Dependency Hierarchy:
- tfjs-node-1.2.9.tgz (Root Library)
- node-pre-gyp-0.13.0.tgz
- rc-1.2.8.tgz
- :x: **ini-1.3.5.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/vonage-campus-client-sdk-workshop/commit/4e6c8e6136fef43439d41bbb878358bbb16f2003">4e6c8e6136fef43439d41bbb878358bbb16f2003</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
This affects the package ini before 1.3.6. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context.
<p>Publish Date: 2020-12-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788>CVE-2020-7788</a></p>
</p>
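The pollution path described above follows the usual `out[section][key] = value` pattern. This sketch is illustrative, not ini's actual parser: a section named `__proto__` reaches `Object.prototype`, and a guarded setter blocks it.

```javascript
// CVE-2020-7788 sketch: naive vs. guarded section/key assignment.
function naiveSet(out, section, key, value) {
  out[section] = out[section] || {};
  out[section][key] = value; // section "__proto__" hits Object.prototype
}

function safeSet(out, section, key, value) {
  if (section === "__proto__" || key === "__proto__") return; // drop it
  if (!Object.prototype.hasOwnProperty.call(out, section)) {
    out[section] = Object.create(null); // no prototype to pollute
  }
  out[section][key] = value;
}

const a = {};
naiveSet(a, "__proto__", "polluted", "yes");
console.log({}.polluted); // "yes" -> every object inherited the key
delete Object.prototype.polluted; // undo the demo's pollution
```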
<p></p>
### CVSS 3 Score Details (<b>7.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788</a></p>
<p>Release Date: 2020-12-11</p>
<p>Fix Resolution (ini): 1.3.6</p>
<p>Direct dependency fix Resolution (@tensorflow/tfjs-node): 1.2.10</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_process
|
tfjs node tgz vulnerabilities highest severity is vulnerable library tfjs node tgz path to dependency file package json path to vulnerable library node modules tar package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high tar tgz transitive high tar tgz transitive high tar tgz transitive high tar tgz transitive high tar tgz transitive high ini tgz transitive details cve vulnerable library tar tgz tar for node library home page a href path to dependency file package json path to vulnerable library node modules tar package json dependency hierarchy tfjs node tgz root library x tar tgz vulnerable library found in head commit a href found in base branch main vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be outside of the extraction target directory is not extracted this is in part accomplished by sanitizing absolute paths of entries within the archive skipping archive entries that contain path portions and resolving the sanitized paths against the extraction target directory this logic was insufficient on windows systems when extracting tar files that contained a path that was not an absolute path but specified a drive letter different from the extraction target such as c some path if the drive letter does not match the extraction target for example d extraction dir then the result of path resolve extractiondirectory entrypath would resolve against the current working directory on the c drive rather than the extraction target directory additionally a portion of the path could occur immediately after the drive letter such as c foo and was not properly sanitized by the logic that checked for within the normalized and split portions of the path this only affects users of node tar on windows systems these issues were addressed in 
releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar there is no reasonable way to work around this issue without performing the same path normalization procedures that node tar now does users are encouraged to upgrade to the latest patched versions of node tar rather than attempt to sanitize paths themselves publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution tensorflow tfjs node rescue worker helmet automatic remediation is available for this issue cve vulnerable library tar tgz tar for node library home page a href path to dependency file package json path to vulnerable library node modules tar package json dependency hierarchy tfjs node tgz root library x tar tgz vulnerable library found in head commit a href found in base branch main vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value additionally on windows systems long path 
portions would resolve to the same file system entities as their short path counterparts a specially crafted tar archive could thus include a directory with one form of the path followed by a symbolic link with a different string that resolves to the same file system entity followed by a file using the first form by first creating a directory and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution tensorflow tfjs node rescue worker helmet automatic remediation is available for this issue cve vulnerable library tar tgz tar for node library home page a href path to dependency file package json path to vulnerable library node modules tar package json dependency hierarchy tfjs node tgz root library x tar tgz vulnerable library found in head commit a href found in base branch main vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary 
code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems the cache checking logic used both and characters as path separators however is a valid filename character on posix systems by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite additionally a similar confusion could arise on case insensitive filesystems if a tar archive contained a directory at foo followed by a symbolic link named foo then on case insensitive file systems the creation of the symbolic link would remove the directory from the filesystem but not from the internal directory cache as it would not be treated as a cache hit a subsequent file entry within the foo directory would then be placed in the target of the symbolic link thinking that the directory had already been created these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa publish date url a href cvss score details base score metrics exploitability metrics 
attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution tensorflow tfjs node rescue worker helmet automatic remediation is available for this issue cve vulnerable library tar tgz tar for node library home page a href path to dependency file package json path to vulnerable library node modules tar package json dependency hierarchy tfjs node tgz root library x tar tgz vulnerable library found in head commit a href found in base branch main vulnerability details the npm package tar aka node tar before versions and has a arbitrary file creation overwrite vulnerability due to insufficient absolute path sanitization node tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the preservepaths flag is not set to true this is achieved by stripping the absolute path root from any absolute file paths contained in a tar file for example home user bashrc would turn into home user bashrc this logic was insufficient when file paths contained repeated path roots such as home user bashrc node tar would only strip a single path root from such paths when given an absolute file path with repeating path roots the resulting path e g home user bashrc would still resolve to an absolute path thus allowing arbitrary file creation and overwrite this issue was addressed in releases and users may work around this vulnerability without upgrading by creating a custom onentry method which sanitizes the entry path or a filter method which removes entries with absolute paths see referenced github advisory for details be aware of cve which fixes a similar bug in later versions of tar publish date url a href cvss score details base score metrics 
exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution tensorflow tfjs node rescue worker helmet automatic remediation is available for this issue cve vulnerable library tar tgz tar for node library home page a href path to dependency file package json path to vulnerable library node modules tar package json dependency hierarchy tfjs node tgz root library x tar tgz vulnerable library found in head commit a href found in base branch main vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite vulnerability via insufficient symlink protection node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory this order of operations resulted in the directory being created and added to the node tar directory cache when a directory is present in the directory cache subsequent calls to mkdir for that directory are skipped however this is also where node tar checks for symlinks occur by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing 
arbitrary file creation and overwrite this issue was addressed in releases and publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution tensorflow tfjs node rescue worker helmet automatic remediation is available for this issue cve vulnerable library ini tgz an ini encoder decoder for node library home page a href path to dependency file package json path to vulnerable library node modules ini package json dependency hierarchy tfjs node tgz root library node pre gyp tgz rc tgz x ini tgz vulnerable library found in head commit a href found in base branch main vulnerability details this affects the package ini before if an attacker submits a malicious ini file to an application that parses it with ini parse they will pollute the prototype on the application this can be exploited further depending on the context publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ini direct dependency fix resolution tensorflow tfjs node rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
| 0
|
2,702
| 5,557,721,838
|
IssuesEvent
|
2017-03-24 12:56:50
|
DynareTeam/dynare
|
https://api.github.com/repos/DynareTeam/dynare
|
closed
|
Add interface for #1372
|
enhancement preprocessor
|
@houtanb To identify the needed modifications look at #1372 and the new entries in the reference manual.
|
1.0
|
Add interface for #1372 - @houtanb To identify the needed modifications look at #1372 and the new entries in the reference manual.
|
process
|
add interface for houtanb to identify the needed modifications look at and the new entries in the reference manual
| 1
|
177,936
| 21,509,217,813
|
IssuesEvent
|
2022-04-28 01:17:15
|
jainisking/MyTest
|
https://api.github.com/repos/jainisking/MyTest
|
closed
|
CVE-2020-7226 (High) detected in cryptacular-1.1.0.jar - autoclosed
|
security vulnerability
|
## CVE-2020-7226 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptacular-1.1.0.jar</b></p></summary>
<p>The spectacular complement to the Bouncy Castle crypto API for Java.</p>
<p>Library home page: <a href="http://www.cryptacular.org">http://www.cryptacular.org</a></p>
<p>Path to dependency file: MyTest/pom.xml</p>
<p>Path to vulnerable library: 20201014070535_IWZCUR/downloadResource_YKIDYI/20201014070948/cryptacular-1.1.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **cryptacular-1.1.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jainisking/MyTest/commit/e5de90517a1184858b277fcf3eb6f530183ac30b">e5de90517a1184858b277fcf3eb6f530183ac30b</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
CiphertextHeader.java in Cryptacular 1.2.3, as used in Apereo CAS and other products, allows attackers to trigger excessive memory allocation during a decode operation, because the nonce array length associated with "new byte" may depend on untrusted input within the header of encoded data.
<p>Publish Date: 2020-01-24
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7226>CVE-2020-7226</a></p>
</p>
</details>
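The general shape of the fix for this class of bug can be sketched in JavaScript rather than Cryptacular's Java (illustrative only; `MAX_NONCE_LEN` is an assumed ceiling, not a value from the library): bound an attacker-controlled length field before letting it drive an allocation.

```javascript
// Bound an untrusted length from a ciphertext header before allocating.
const MAX_NONCE_LEN = 255; // assumed ceiling for illustration

function readNonceLength(header) {
  const n = header.readUInt32BE(0); // length taken from untrusted input
  if (n > MAX_NONCE_LEN) {
    throw new RangeError(`nonce length ${n} exceeds ${MAX_NONCE_LEN}`);
  }
  return n; // now safe to pass to an allocation such as Buffer.alloc(n)
}

const ok = Buffer.alloc(4);
ok.writeUInt32BE(16, 0);
console.log(readNonceLength(ok)); // 16

const evil = Buffer.alloc(4);
evil.writeUInt32BE(0x7fffffff, 0);
// readNonceLength(evil) throws instead of attempting a ~2 GiB allocation
```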
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7226">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7226</a></p>
<p>Release Date: 2020-01-24</p>
<p>Fix Resolution: org.cryptacular:cryptacular:1.1.4,1.2.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-7226 (High) detected in cryptacular-1.1.0.jar - autoclosed - ## CVE-2020-7226 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptacular-1.1.0.jar</b></p></summary>
<p>The spectacular complement to the Bouncy Castle crypto API for Java.</p>
<p>Library home page: <a href="http://www.cryptacular.org">http://www.cryptacular.org</a></p>
<p>Path to dependency file: MyTest/pom.xml</p>
<p>Path to vulnerable library: 20201014070535_IWZCUR/downloadResource_YKIDYI/20201014070948/cryptacular-1.1.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **cryptacular-1.1.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jainisking/MyTest/commit/e5de90517a1184858b277fcf3eb6f530183ac30b">e5de90517a1184858b277fcf3eb6f530183ac30b</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
CiphertextHeader.java in Cryptacular 1.2.3, as used in Apereo CAS and other products, allows attackers to trigger excessive memory allocation during a decode operation, because the nonce array length associated with "new byte" may depend on untrusted input within the header of encoded data.
<p>Publish Date: 2020-01-24
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7226>CVE-2020-7226</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7226">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7226</a></p>
<p>Release Date: 2020-01-24</p>
<p>Fix Resolution: org.cryptacular:cryptacular:1.1.4,1.2.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in cryptacular jar autoclosed cve high severity vulnerability vulnerable library cryptacular jar the spectacular complement to the bouncy castle crypto api for java library home page a href path to dependency file mytest pom xml path to vulnerable library iwzcur downloadresource ykidyi cryptacular jar dependency hierarchy x cryptacular jar vulnerable library found in head commit a href vulnerability details ciphertextheader java in cryptacular as used in apereo cas and other products allows attackers to trigger excessive memory allocation during a decode operation because the nonce array length associated with new byte may depend on untrusted input within the header of encoded data publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org cryptacular cryptacular step up your open source security game with whitesource
| 0
|
429,527
| 30,080,950,268
|
IssuesEvent
|
2023-06-29 03:14:11
|
chipsenkbeil/distant
|
https://api.github.com/repos/chipsenkbeil/distant
|
opened
|
Write documentation on https://distant.dev/
|
documentation
|
This issue represents the main tracker of documentation associated with the CLI & JSON API of distant:
- [ ] Installing distant
- [ ] Once we have https://sh.distant.dev available, provide those examples
- [ ] Link where to download binaries with a section per platform?
- [ ] Guide to build from source
- Check out https://wezfurlong.org/wezterm/installation.html as an example (also uses Mkdocs Material)
- [ ] Configuration section covering all settings available in `config.toml`
- [ ] Cli reference section
- [ ] distant api
- [ ] distant connect
- [ ] distant fs
- [ ] copy
- [ ] exists
- [ ] make-dir
- [ ] metadata
- [ ] read
- [ ] remove
- [ ] rename
- [ ] search
- [ ] set-permissions
- [ ] watch
- [ ] write
- [ ] distant launch
- [ ] distant shell
- [ ] distant spawn (including environment variables & LSP examples)
- [ ] distant system-info
- [ ] distant version
- [ ] distant manager
- [ ] select
- [ ] service (install/uninstall/start/stop and explain user)
- [ ] listen
- [ ] version
- [ ] info
- [ ] list
- [ ] kill
- [ ] distant server
- [ ] listen
- [ ] distant generate
- [ ] config
- [ ] completion
- [ ] API Reference
- [ ] Opening guide explaining the JSON format
- [ ] Section describing authentication
- [ ] Section outlining different messages highlighting request and response with examples
|
1.0
|
Write documentation on https://distant.dev/ - This issue represents the main tracker of documentation associated with the CLI & JSON API of distant:
- [ ] Installing distant
- [ ] Once we have https://sh.distant.dev available, provide those examples
- [ ] Link where to download binaries with a section per platform?
- [ ] Guide to build from source
- Check out https://wezfurlong.org/wezterm/installation.html as an example (also uses Mkdocs Material)
- [ ] Configuration section covering all settings available in `config.toml`
- [ ] Cli reference section
- [ ] distant api
- [ ] distant connect
- [ ] distant fs
- [ ] copy
- [ ] exists
- [ ] make-dir
- [ ] metadata
- [ ] read
- [ ] remove
- [ ] rename
- [ ] search
- [ ] set-permissions
- [ ] watch
- [ ] write
- [ ] distant launch
- [ ] distant shell
- [ ] distant spawn (including environment variables & LSP examples)
- [ ] distant system-info
- [ ] distant version
- [ ] distant manager
- [ ] select
- [ ] service (install/uninstall/start/stop and explain user)
- [ ] listen
- [ ] version
- [ ] info
- [ ] list
- [ ] kill
- [ ] distant server
- [ ] listen
- [ ] distant generate
- [ ] config
- [ ] completion
- [ ] API Reference
- [ ] Opening guide explaining the JSON format
- [ ] Section describing authentication
- [ ] Section outlining different messages highlighting request and response with examples
|
non_process
|
write documentation on this issue represents the main tracker of documentation associated with the cli json api of distant installing distant once we have available provide those examples link where to download binaries with a section per platform guide to build from source check out as an example also uses mkdocs material configuration section covering all settings available in config toml cli reference section distant api distant connect distant fs copy exists make dir metadata read remove rename search set permissions watch write distant launch distant shell distant spawn including environment variables lsp examples distant system info distant version distant manager select service install uninstall start stop and explain user listen version info list kill distant server listen distant generate config completion api reference opening guide explaining the json format section describing authentication section outlining different messages highlighting request and response with examples
| 0
|
296,689
| 25,569,908,598
|
IssuesEvent
|
2022-11-30 16:52:08
|
zackfall/project-manager
|
https://api.github.com/repos/zackfall/project-manager
|
closed
|
Create functions to create issues in the database
|
foundations tests
|
Add tests for the functions that will interact with database, I don't care if they fail just now, we are doing TDD
|
1.0
|
Create functions to create issues in the database - Add tests for the functions that will interact with database, I don't care if they fail just now, we are doing TDD
|
non_process
|
create functions to create issues in the database add tests for the functions that will interact with database i don t care if they fail just now we are doing tdd
| 0
|
22,447
| 31,166,395,313
|
IssuesEvent
|
2023-08-16 20:04:58
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
[MLv2] Selected LHS/RHS column is missing bucket info in `join-condition-columns` methods
|
.Backend .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
|
`join-condition-lhs-columns` and `join-condition-rhs-columns` are missing temporal bucket info for selected datetime columns.
We have a similar use-case with e.g. `breakoutable-columns`. If there's a breakout on a datetime column, the column would have a temporal unit info when returned in `breakoutable-columns`. Example:
```js
// Let's say we have a query with a breakout on the "Created At: Quarter" column
const columns = Lib.breakoutableColumns(query, 0)
const createdAt = findColumn(columns, ...)
Lib.temporalBucket(createdAt) // returns a Quarter bucket object
```
OTOH, even though `join-condition-lhs-columns` and `join-condition-rhs-columns` highlight selected datetime columns, their bucket info is missing:
```js
// Let's say we have a query joining Orders with Products on
// Orders.CREATED_AT (Quarter) = Products.CREATED_AT (Quarter)
const columns = Lib.joinConditionLHSColumns(query, 0, join, lhsColumn, rhsColumn)
const createdAt = findColumn(columns, ...)
Lib.temporalBucket(createdAt) // returns null
```
The FE expects it to behave the same way as `breakoutable-columns` and attach bucket info to columns
|
1.0
|
[MLv2] Selected LHS/RHS column is missing bucket info in `join-condition-columns` methods - `join-condition-lhs-columns` and `join-condition-rhs-columns` are missing temporal bucket info for selected datetime columns.
We have a similar use-case with e.g. `breakoutable-columns`. If there's a breakout on a datetime column, the column would have a temporal unit info when returned in `breakoutable-columns`. Example:
```js
// Let's say we have a query with a breakout on the "Created At: Quarter" column
const columns = Lib.breakoutableColumns(query, 0)
const createdAt = findColumn(columns, ...)
Lib.temporalBucket(createdAt) // returns a Quarter bucket object
```
OTOH, even though `join-condition-lhs-columns` and `join-condition-rhs-columns` highlight selected datetime columns, their bucket info is missing:
```js
// Let's say we have a query joining Orders with Products on
// Orders.CREATED_AT (Quarter) = Products.CREATED_AT (Quarter)
const columns = Lib.joinConditionLHSColumns(query, 0, join, lhsColumn, rhsColumn)
const createdAt = findColumn(columns, ...)
Lib.temporalBucket(createdAt) // returns null
```
The FE expects it to behave the same way as `breakoutable-columns` and attach bucket info to columns
|
process
|
selected lhs rhs column is missing bucket info in join condition columns methods join condition lhs columns and join condition rhs columns are missing temporal bucket info for selected datetime columns we have a similar use case with e g breakoutable columns if there s a breakout on a datetime column the column would have a temporal unit info when returned in breakoutable columns example js let s say we have a query with a breakout on the created at quarter column const columns lib breakoutablecolumns query const createdat findcolumn columns lib temporalbucket createdat returns a quarter bucket object otoh even though join condition lhs columns and join condition rhs columns highlight selected datetime columns their bucket info is missing js let s say we have a query joining orders with products on orders created at quarter products created at quarter const columns lib joinconditionlhscolumns query join lhscolumn rhscolumn const createdat findcolumn columns lib temporalbucket createdat returns null the fe expects it to behave the same way as breakoutable columns and attach bucket info to columns
| 1
|
145,310
| 13,148,042,393
|
IssuesEvent
|
2020-08-08 19:03:14
|
FritzAndFriends/BlazorWebFormsComponents
|
https://api.github.com/repos/FritzAndFriends/BlazorWebFormsComponents
|
closed
|
Need a logo
|
documentation / samples enhancement
|
We could really use a logo for this project to put on the NuGet registry and to use here on GitHub
|
1.0
|
Need a logo - We could really use a logo for this project to put on the NuGet registry and to use here on GitHub
|
non_process
|
need a logo we could really use a logo for this project to put on the nuget registry and to use here on github
| 0
|
63,181
| 8,664,447,220
|
IssuesEvent
|
2018-11-28 20:11:52
|
influxdata/telegraf
|
https://api.github.com/repos/influxdata/telegraf
|
closed
|
[doc] Plugin influxdb_v2 listed as input plugin
|
documentation
|
There is a small issue on the [README.md](https://github.com/influxdata/telegraf/blob/master/README.md#input-plugins), as the plugin influxdb_v2 is listed as input plugin instead of output plugin.
I only found that because I clicked on it by mistake and I got a 404. The commit that added this link is few months old, and no one noticed (or reported).
I think adding a link checker, like https://github.com/dkhamsing/awesome_bot to the build process could help finding these issues and eventually finding also new broken links, like links to 3rd party documentation on the plugins docs.
|
1.0
|
[doc] Plugin influxdb_v2 listed as input plugin - There is a small issue on the [README.md](https://github.com/influxdata/telegraf/blob/master/README.md#input-plugins), as the plugin influxdb_v2 is listed as input plugin instead of output plugin.
I only found that because I clicked on it by mistake and I got a 404. The commit that added this link is few months old, and no one noticed (or reported).
I think adding a link checker, like https://github.com/dkhamsing/awesome_bot to the build process could help finding these issues and eventually finding also new broken links, like links to 3rd party documentation on the plugins docs.
|
non_process
|
plugin influxdb listed as input plugin there is a small issue on the as the plugin influxdb is listed as input plugin instead of output plugin i only found that because i clicked on it by mistake and i got a the commit that added this link is few months old and no one noticed or reported i think adding a link checker like to the build process could help finding these issues and eventually finding also new broken links like links to party documentation on the plugins docs
| 0
|
229,291
| 17,538,333,065
|
IssuesEvent
|
2021-08-12 09:04:16
|
rcgsheffield/sheffield_hpc
|
https://api.github.com/repos/rcgsheffield/sheffield_hpc
|
closed
|
Update the Getting an Account page to reflect the new HPC driving license.
|
Documentation Error Enhancement
|
Content can be lifted from https://www.sheffield.ac.uk/it-services/research/hpc-facilities and placed onto https://docs.iceberg.shef.ac.uk/en/latest/hpc/accounts.html
|
1.0
|
Update the Getting an Account page to reflect the new HPC driving license. - Content can be lifted from https://www.sheffield.ac.uk/it-services/research/hpc-facilities and placed onto https://docs.iceberg.shef.ac.uk/en/latest/hpc/accounts.html
|
non_process
|
update the getting an account page to reflect the new hpc driving license content can be lifted from and placed onto
| 0
|
4,422
| 7,301,995,556
|
IssuesEvent
|
2018-02-27 08:07:30
|
DevExpress/testcafe-hammerhead
|
https://api.github.com/repos/DevExpress/testcafe-hammerhead
|
closed
|
Add 'async' expression generator to the esotope
|
AREA: client AREA: server SYSTEM: resource processing TYPE: bug
|
Original script:
```js
const foo = new class {
async bar() {
return async () => {
await d(obj[prop]);
};
}
}();
```
Processed script:
```js
const foo = new class {
bar () {
return () => {
await d(__get$(obj,prop));
};
}
}();
```
|
1.0
|
Add 'async' expression generator to the esotope - Original script:
```js
const foo = new class {
async bar() {
return async () => {
await d(obj[prop]);
};
}
}();
```
Processed script:
```js
const foo = new class {
bar () {
return () => {
await d(__get$(obj,prop));
};
}
}();
```
|
process
|
add async expression generator to the esotope original script js const foo new class async bar return async await d obj processed script js const foo new class bar return await d get obj prop
| 1
|
19,601
| 25,958,758,003
|
IssuesEvent
|
2022-12-18 15:49:13
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
process.versions does not list all bundled dependencies
|
good first issue process
|
The `process. Versions` API lists the versions of vendored dependencies that are bundled into Node.js. Unfortunately, it only shows a subset. Notably missing from the list are:
* [ ] `deps/acorn`
* [ ] `deps/base64`
* [ ] `deps/cjs-module-lexer`
* [ ] `deps/histogram`
* [ ] `deps/undici`
* [ ] `deps/uvwasi`
|
1.0
|
process.versions does not list all bundled dependencies - The `process. Versions` API lists the versions of vendored dependencies that are bundled into Node.js. Unfortunately, it only shows a subset. Notably missing from the list are:
* [ ] `deps/acorn`
* [ ] `deps/base64`
* [ ] `deps/cjs-module-lexer`
* [ ] `deps/histogram`
* [ ] `deps/undici`
* [ ] `deps/uvwasi`
|
process
|
process versions does not list all bundled dependencies the process versions api lists the versions of vendored dependencies that are bundled into node js unfortunately it only shows a subset notably missing from the list are deps acorn deps deps cjs module lexer deps histogram deps undici deps uvwasi
| 1
|
13,067
| 15,397,080,467
|
IssuesEvent
|
2021-03-03 21:35:36
|
retaildevcrews/ngsa
|
https://api.github.com/repos/retaildevcrews/ngsa
|
opened
|
Sprint 3 Point dev checklist
|
Process
|
Sprint checklist for point dev.
- [ ] Helium <https://aka.ms/heliumdashboard>
- look for anything abnormal. normal looks like
- green "Availability tests summary"
- relatively straight lines for requests and response times
- 0 failed requests
- [ ] Helium Smokers [Portal](https://ms.portal.azure.com/#@microsoft.onmicrosoft.com/resource/subscriptions/fc127bd9-b9bd-4a86-a502-6e1d554bed0a/resourceGroups/helium-e2e-smoker-rg/overview)
- Check that ACI containers are running
- Check container events
- Check logs
- look into making progress on a smokers dashboard for helium
- [ ] Check GitHub actions
- check scheduled actions are running, and look for deprecation warnings
- <https://github.com/retaildevcrews/ngsa/actions>
- <https://github.com/retaildevcrews/helium/actions>
- <https://github.com/retaildevcrews/helium-csharp/actions>
- <https://github.com/retaildevcrews/helium-java/actions>
- <https://github.com/retaildevcrews/helium-typescript/actions>
- <https://github.com/retaildevcrews/helium-terraform/actions>
- [ ] Run credscan on repos that don't have it automated with GitHub actions
- aka.ms/credscan
- [ ] Check repos for issues that are not in a project board.
- [ ] Check Helium triage.
- call out if triage list is getting long
- call out items you feel are important
- [ ] [Dependency Warnings](https://github.com/orgs/retaildevcrews/insights/dependencies?query=is:vulnerable+sort:vulnerabilities-desc)
- check for warnings in Helium and NGSA repos
- [ ] Delete old ghcr.io tags
|
1.0
|
Sprint 3 Point dev checklist - Sprint checklist for point dev.
- [ ] Helium <https://aka.ms/heliumdashboard>
- look for anything abnormal. normal looks like
- green "Availability tests summary"
- relatively straight lines for requests and response times
- 0 failed requests
- [ ] Helium Smokers [Portal](https://ms.portal.azure.com/#@microsoft.onmicrosoft.com/resource/subscriptions/fc127bd9-b9bd-4a86-a502-6e1d554bed0a/resourceGroups/helium-e2e-smoker-rg/overview)
- Check that ACI containers are running
- Check container events
- Check logs
- look into making progress on a smokers dashboard for helium
- [ ] Check GitHub actions
- check scheduled actions are running, and look for deprecation warnings
- <https://github.com/retaildevcrews/ngsa/actions>
- <https://github.com/retaildevcrews/helium/actions>
- <https://github.com/retaildevcrews/helium-csharp/actions>
- <https://github.com/retaildevcrews/helium-java/actions>
- <https://github.com/retaildevcrews/helium-typescript/actions>
- <https://github.com/retaildevcrews/helium-terraform/actions>
- [ ] Run credscan on repos that don't have it automated with GitHub actions
- aka.ms/credscan
- [ ] Check repos for issues that are not in a project board.
- [ ] Check Helium triage.
- call out if triage list is getting long
- call out items you feel are important
- [ ] [Dependency Warnings](https://github.com/orgs/retaildevcrews/insights/dependencies?query=is:vulnerable+sort:vulnerabilities-desc)
- check for warnings in Helium and NGSA repos
- [ ] Delete old ghcr.io tags
|
process
|
sprint point dev checklist sprint checklist for point dev helium look for anything abnormal normal looks like green availability tests summary relatively straight lines for requests and response times failed requests helium smokers check that aci containers are running check container events check logs look into making progress on a smokers dashboard for helium check github actions check scheduled actions are running and look for deprecation warnings run credscan on repos that don t have it automated with github actions aka ms credscan check repos for issues that are not in a project board check helium triage call out if triage list is getting long call out items you feel are important check for warnings in helium and ngsa repos delete old ghcr io tags
| 1
|
15,320
| 19,430,814,417
|
IssuesEvent
|
2021-12-21 11:43:52
|
apache/arrow-rs
|
https://api.github.com/repos/apache/arrow-rs
|
closed
|
Remove the `integration-testing/unskip.patch` file
|
arrow enhancement development-process
|
Context from @b41sh -- https://github.com/apache/arrow-rs/pull/779#issuecomment-989605395
This file is a local change that is applied to the overall arrow integration tests. Once the upstream arrow pr https://github.com/apache/arrow/pull/11238 has been merged, we will not have to apply this patch anymore in arrow.
This ticket tracks removing it
|
1.0
|
Remove the `integration-testing/unskip.patch` file - Context from @b41sh -- https://github.com/apache/arrow-rs/pull/779#issuecomment-989605395
This file is a local change that is applied to the overall arrow integration tests. Once the upstream arrow pr https://github.com/apache/arrow/pull/11238 has been merged, we will not have to apply this patch anymore in arrow.
This ticket tracks removing it
|
process
|
remove the integration testing unskip patch file context from this file is a local change that is applied to the overall arrow integration tests once the upstream arrow pr has been merged we will not have to apply this patch anymore in arrow this ticket tracks removing it
| 1
|
2,438
| 5,216,771,496
|
IssuesEvent
|
2017-01-26 11:30:39
|
raphym/Simulation_of_message_routing_by_intelligent_agents
|
https://api.github.com/repos/raphym/Simulation_of_message_routing_by_intelligent_agents
|
opened
|
Change name Quorum to Backbone
|
being processed
|
The importants nodes are called Backbone , not quorum , quorum is a list of node of a specific backbone
|
1.0
|
Change name Quorum to Backbone - The importants nodes are called Backbone , not quorum , quorum is a list of node of a specific backbone
|
process
|
change name quorum to backbone the importants nodes are called backbone not quorum quorum is a list of node of a specific backbone
| 1
|
3,834
| 6,802,432,542
|
IssuesEvent
|
2017-11-02 20:11:19
|
gratipay/inside.gratipay.com
|
https://api.github.com/repos/gratipay/inside.gratipay.com
|
closed
|
incorporate in Estonia
|
Governance & Process
|
Reticketed from https://github.com/gratipay/inside.gratipay.com/issues/199#issuecomment-100655302 and https://github.com/gratipay/inside.gratipay.com/issues/242#issuecomment-135587886.
> The Republic of Estonia is the first country to offer e-Residency — a transnational digital identity available to anyone in the world interested in administering a location-independent business online.
The purpose of moving Gratipay to Estonia would be to make it easier to convert to a worker-owned cooperative (#72), because our workers are citizens of many different countries.
|
1.0
|
incorporate in Estonia - Reticketed from https://github.com/gratipay/inside.gratipay.com/issues/199#issuecomment-100655302 and https://github.com/gratipay/inside.gratipay.com/issues/242#issuecomment-135587886.
> The Republic of Estonia is the first country to offer e-Residency — a transnational digital identity available to anyone in the world interested in administering a location-independent business online.
The purpose of moving Gratipay to Estonia would be to make it easier to convert to a worker-owned cooperative (#72), because our workers are citizens of many different countries.
|
process
|
incorporate in estonia reticketed from and the republic of estonia is the first country to offer e residency — a transnational digital identity available to anyone in the world interested in administering a location independent business online the purpose of moving gratipay to estonia would be to make it easier to convert to a worker owned cooperative because our workers are citizens of many different countries
| 1
|
4,215
| 4,892,483,971
|
IssuesEvent
|
2016-11-18 19:51:55
|
couchbaselabs/mobile-testkit
|
https://api.github.com/repos/couchbaselabs/mobile-testkit
|
closed
|
Automated end-to-end performance test
|
infrastructure performance sync gateway
|
- [x] Setup Jenkins job for end-to-end test. Set of parameters would be superset of what's needed for locust, provisioning, etc. Use a fixed hardware topology to run the tests on, but provisioning installs SG, CBS, etc. Expects InfluxDB to already exist.
- [x] Datasource creation in Grafana
- [x] Name hosts to support filtering by host type in Grafana
- [x] Provision telegraf agent on each host with appropriate configurations
- [x] Link in Jenkins output directly to Grafana dashboard for appropriate datasource
- [x] Telegraf agent lifecycle (when to start, ensuring shutdown)
|
1.0
|
Automated end-to-end performance test - - [x] Setup Jenkins job for end-to-end test. Set of parameters would be superset of what's needed for locust, provisioning, etc. Use a fixed hardware topology to run the tests on, but provisioning installs SG, CBS, etc. Expects InfluxDB to already exist.
- [x] Datasource creation in Grafana
- [x] Name hosts to support filtering by host type in Grafana
- [x] Provision telegraf agent on each host with appropriate configurations
- [x] Link in Jenkins output directly to Grafana dashboard for appropriate datasource
- [x] Telegraf agent lifecycle (when to start, ensuring shutdown)
|
non_process
|
automated end to end performance test setup jenkins job for end to end test set of parameters would be superset of what s needed for locust provisioning etc use a fixed hardware topology to run the tests on but provisioning installs sg cbs etc expects influxdb to already exist datasource creation in grafana name hosts to support filtering by host type in grafana provision telegraf agent on each host with appropriate configurations link in jenkins output directly to grafana dashboard for appropriate datasource telegraf agent lifecycle when to start ensuring shutdown
| 0
|
50,242
| 12,486,787,039
|
IssuesEvent
|
2020-05-31 04:47:49
|
tensorflow/tensorflow
|
https://api.github.com/repos/tensorflow/tensorflow
|
closed
|
Installation broken - Tensorflow 2.1 Cuda 10.1 Ubuntu 18.04
|
TF 2.1 subtype: ubuntu/linux type:build/install
|
Having followed the installation guide for GPU support multiple times, each time starting from a blank Ubuntu 18.04 LTS instance it breaks when installing cuda 10-1 after installing the driver and rebooting.
See guide for Ubuntu 18.04 + Cuda 10.1: https://www.tensorflow.org/install/gpu
Cuda 10.1 is the version with which Tensorflow 2.1 is compiled and therefore Cuda 10.1 needs to be installed.
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04 LTS
- TensorFlow installed from (source or binary): Tensorflow 2.1 binary
- TensorFlow version: 2.1
- Python version: 3.6
- Installed using virtualenv? pip? conda?: virtualenv
- CUDA/cuDNN version: 10.1
- GPU model and memory: Tesla T4
Having restarted the machine and confirmed that the driver recognises the GPU:
The installation guide for GPU support breaks at this section:
```
sudo apt-get install --no-install-recommends \
cuda-10-1 \
libcudnn7=7.6.4.38-1+cuda10.1 \
libcudnn7-dev=7.6.4.38-1+cuda10.1
```
Which results in:
```
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:
The following packages have unmet dependencies:
cuda-10-1 : Depends: cuda-runtime-10-1 (>= 10.1.243) but it is not going to be installed
Depends: cuda-demo-suite-10-1 (>= 10.1.243) but it is not going to be installed
E: Unable to correct problems, you have held broken packages.
```
Having searched stackoverflow, github and the Tensorflow website, it seems that the dependencies list can not be installed.
Again, I have rerun this installation multiple times on a *blank* machine, without running *anything else* before trying to run this installation.
|
1.0
|
Installation broken - Tensorflow 2.1 Cuda 10.1 Ubuntu 18.04 - Having followed the installation guide for GPU support multiple times, each time starting from a blank Ubuntu 18.04 LTS instance it breaks when installing cuda 10-1 after installing the driver and rebooting.
See guide for Ubuntu 18.04 + Cuda 10.1: https://www.tensorflow.org/install/gpu
Cuda 10.1 is the version with which Tensorflow 2.1 is compiled and therefore Cuda 10.1 needs to be installed.
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04 LTS
- TensorFlow installed from (source or binary): Tensorflow 2.1 binary
- TensorFlow version: 2.1
- Python version: 3.6
- Installed using virtualenv? pip? conda?: virtualenv
- CUDA/cuDNN version: 10.1
- GPU model and memory: Tesla T4
Having restarted the machine and confirmed that the driver recognises the GPU:
The installation guide for GPU support breaks at this section:
```
sudo apt-get install --no-install-recommends \
cuda-10-1 \
libcudnn7=7.6.4.38-1+cuda10.1 \
libcudnn7-dev=7.6.4.38-1+cuda10.1
```
Which results in:
```
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:
The following packages have unmet dependencies:
cuda-10-1 : Depends: cuda-runtime-10-1 (>= 10.1.243) but it is not going to be installed
Depends: cuda-demo-suite-10-1 (>= 10.1.243) but it is not going to be installed
E: Unable to correct problems, you have held broken packages.
```
Having searched stackoverflow, github and the Tensorflow website, it seems that the dependencies list can not be installed.
Again, I have rerun this installation multiple times on a *blank* machine, without running *anything else* before trying to run this installation.
|
non_process
|
installation broken tensorflow cuda ubuntu having followed the installation guide for gpu support multiple times each time starting from a blank ubuntu lts instance it breaks when installing cuda after installing the driver and rebooting see guide for ubuntu cuda cuda is the version with which tensorflow is compiled and therefore cuda needs to be installed system information os platform and distribution e g linux ubuntu ubuntu lts tensorflow installed from source or binary tensorflow binary tensorflow version python version installed using virtualenv pip conda virtualenv cuda cudnn version gpu model and memory tesla having restarted the machine and confirmed that the driver recognises the gpu the installation guide for gpu support breaks at this section sudo apt get install no install recommends cuda dev which results in some packages could not be installed this may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of incoming the following information may help to resolve the situation the following packages have unmet dependencies cuda depends cuda runtime but it is not going to be installed depends cuda demo suite but it is not going to be installed e unable to correct problems you have held broken packages having searched stackoverflow github and the tensorflow website it seems that the dependencies list can not be installed again i have rerun this installation multiple times on a blank machine without running anything else before trying to run this installation
| 0
|
10,204
| 13,066,637,377
|
IssuesEvent
|
2020-07-30 22:09:21
|
qri-io/qri
|
https://api.github.com/repos/qri-io/qri
|
opened
|
Some releases are not tagged as Verified
|
release process
|
### What happened?
Some [releases](https://github.com/qri-io/qri/tags) are not _Verified_.
v0.9.10, v0.9.9 and v0.9.7 are not _Verified_.
### What did you expect to happen?
All releases should be _Verified_.
|
1.0
|
Some releases are not tagged as Verified - ### What happened?
Some [releases](https://github.com/qri-io/qri/tags) are not _Verified_.
v0.9.10, v0.9.9 and v0.9.7 are not _Verified_.
### What did you expect to happen?
All releases should be _Verified_.
|
process
|
some releases are not tagged as verified what happened some are not verified and are not verified what did you expect to happen all releases should be verified
| 1
|
350,193
| 10,479,944,419
|
IssuesEvent
|
2019-09-24 06:12:15
|
wso2/product-apim
|
https://api.github.com/repos/wso2/product-apim
|
closed
|
Swagger resources are not shown for public APIs in Store
|
3.0.0 Priority/Highest Type/Bug
|
Swagger resource is not visible for Public APIs in the store. If the user logs in, he can view them. Seems like the anonymous role is not set for the swagger resource.
|
1.0
|
Swagger resources are not shown for public APIs in Store - Swagger resource is not visible for Public APIs in the store. If the user logs in, he can view them. Seems like the anonymous role is not set for the swagger resource.
|
non_process
|
swagger resources are not shown for public apis in store swagger resource is not visible for public apis in the store if the user logs in he can view them seems like the anonymous role is not set for the swagger resource
| 0
|
13,818
| 16,581,086,495
|
IssuesEvent
|
2021-05-31 11:57:07
|
bisq-network/proposals
|
https://api.github.com/repos/bisq-network/proposals
|
closed
|
Implementing Bisq DAO with RSK
|
an:idea re:processes
|
I don't have technical skills, but I think it's good to share this idea on my mind for a long time to be shared.
### About RSK
[RSK](https://www.rsk.co/) (or rootstock) is a Bitcoin sidechain that allows smart contracts through Solidity, being ETH compatible.
### Rationale
Bitcoin first layer is becoming more expensive as it succeeds, making small transactions irrational due to its cost. Using a Bitcoin second layer like Liquid or Lightning Network to reduce costs and access other features, like quicker trades or improved privacy is something that Bisq will have to face sooner or later. Smart contracts also offer different features like automating DAO tasks and hopefully remove CPOF.
### Features
- Automate burning man task: this [role](https://bisq.wiki/Donation_Address_Owner) buys BSQ with BTC that has collected in the donation address wallet and burns purchased BSQ. If BSQ becomes a RSK token, fees and trade RBTC funds into dispute that are sent to DAO donation contract could automatically swapped for BSQ at [RSKswap](https://app.rskswap.com/swap) and burned.
- Stablecoins integration: DAI or DOC could be used to trade at Bisq, substantially reducing volatility risk for trades and allowing exchange of fixed price payment methods (Amazon gift cards).
- Fast confirmation times: Various RSK blocks are published every minute.
- Small fees: That could be a good reason enough to move for short term, but I'm not going to pretend something that is not; reduced fees depend on reduced network use. For long term, in case blocks get full and fees goes up, Lumino might get used, which is like Lightning Network but for RSK.
### Risks
- RBTC is not BTC. Bisq would be built over BTC and RSK; each new layer multiplies risks.
- Smart contracts are complex.
- Export Bisq DAO from BTC to RSK. A huge challenge.
|
1.0
|
Implementing Bisq DAO with RSK - I don't have technical skills, but I want to share this idea, which has been on my mind for a long time.
### About RSK
[RSK](https://www.rsk.co/) (or rootstock) is a Bitcoin sidechain that allows smart contracts through Solidity, being ETH compatible.
### Rationale
Bitcoin first layer is becoming more expensive as it succeeds, making small transactions irrational due to its cost. Using a Bitcoin second layer like Liquid or Lightning Network to reduce costs and access other features, like quicker trades or improved privacy is something that Bisq will have to face sooner or later. Smart contracts also offer different features like automating DAO tasks and hopefully remove CPOF.
### Features
- Automate burning man task: this [role](https://bisq.wiki/Donation_Address_Owner) buys BSQ with BTC that has been collected in the donation address wallet and burns the purchased BSQ. If BSQ becomes an RSK token, fees and trade RBTC funds in dispute that are sent to the DAO donation contract could automatically be swapped for BSQ at [RSKswap](https://app.rskswap.com/swap) and burned.
- Stablecoins integration: DAI or DOC could be used to trade at Bisq, substantially reducing volatility risk for trades and allowing exchange of fixed price payment methods (Amazon gift cards).
- Fast confirmation times: Various RSK blocks are published every minute.
- Small fees: That could be a good reason enough to move for short term, but I'm not going to pretend something that is not; reduced fees depend on reduced network use. For long term, in case blocks get full and fees goes up, Lumino might get used, which is like Lightning Network but for RSK.
### Risks
- RBTC is not BTC. Bisq would be built over BTC and RSK; each new layer multiplies risks.
- Smart contracts are complex.
- Export Bisq DAO from BTC to RSK. A huge challenge.
|
process
|
implementing bisq dao with rsk i don t have technical skills but i think it s good to share this idea on my mind for a long time to be shared about rsk or rootstock is a bitcoin sidechain that allows smart contracts through solidity being eth compatible rationale bitcoin first layer is becoming more expensive as it succeeds making small transactions irrational due to its cost using a bitcoin second layer like liquid or lightning network to reduce costs and access other features like quicker trades or improved privacy is something that bisq will have to face sooner or later smart contracts also offer different features like automating dao tasks and hopefully remove cpof features automate burning man task this buys bsq with btc that has collected in the donation address wallet and burns purchased bsq if bsq becomes a rsk token fees and trade rbtc funds into dispute that are sent to dao donation contract could automatically swapped for bsq at and burned stablecoins integration dai or doc could be used to trade at bisq substantially reducing volatility risk for trades and allowing exchange of fixed price payment methods amazon gift cards fast confirmation times various rsk blocks are published every minute small fees that could be a good reason enough to move for short term but i m not going to pretend something that is not reduced fees depend on reduced network use for long term in case blocks get full and fees goes up lumino might get used which is like lightning network but for rsk risks rbtc is not btc bisq would be built over btc and rsk each new layer multiply risks smart contracts are complex export bisq dao from btc to rsk a huge challenge
| 1
|
4,164
| 7,107,918,918
|
IssuesEvent
|
2018-01-16 21:45:53
|
18F/product-guide
|
https://api.github.com/repos/18F/product-guide
|
closed
|
SECTION UPDATE (Project Start) - Team kickoff
|
process change
|
At the start of every project, it should be standard practice for each team to kick off together and talk about team process, scrum vs kanban, shared understandings of definition of done, things like that. The team can also spend this time brainstorming around personas and other things related to the project.
It's important to note that we may not always want to wait to start if we are expecting someone to join the team in a month, so we should be careful to synthesize info from the kick-off into some kind of notes or artifact, since we may often have unexpected staffing changes at some point in the project.
[slack convo for background](https://18f.slack.com/archives/product/p1453855194000034)
|
1.0
|
SECTION UPDATE (Project Start) - Team kickoff - At the start of every project, it should be standard practice for each team to kick off together and talk about team process, scrum vs kanban, shared understandings of definition of done, things like that. The team can also spend this time brainstorming around personas and other things related to the project.
It's important to note that we may not always want to wait to start if we are expecting someone to join the team in a month, so we should be careful to synthesize info from the kick-off into some kind of notes or artifact, since we may often have unexpected staffing changes at some point in the project.
[slack convo for background](https://18f.slack.com/archives/product/p1453855194000034)
|
process
|
section update project start team kickoff at the start of every project it should be standard practice for each team to kick off together and talk about team process scrum vs kanban shared understandings of definition of done things like that the team can also spend this time brainstorming around personas and other things related to the project important to make note that we may not always want to wait to start if we are expecting someone to join the team in a month so we should be careful to synthesize info from the kick off in some kind of notes or artifact since we may often have unexpected staffing changes at some point in the project
| 1
|
118,106
| 17,576,353,243
|
IssuesEvent
|
2021-08-15 17:33:00
|
turkdevops/brackets
|
https://api.github.com/repos/turkdevops/brackets
|
closed
|
CVE-2020-8203 (High) detected in lodash-3.10.1.tgz, lodash-2.4.2.tgz - autoclosed
|
security vulnerability
|
## CVE-2020-8203 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-3.10.1.tgz</b>, <b>lodash-2.4.2.tgz</b></p></summary>
<p>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: brackets/package.json</p>
<p>Path to vulnerable library: brackets/node_modules/grunt-contrib-watch/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-watch-1.0.0.tgz (Root Library)
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-2.4.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, & extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz">https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz</a></p>
<p>Path to dependency file: brackets/package.json</p>
<p>Path to vulnerable library: brackets/node_modules/grunt-cli/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-cli-0.1.9.tgz (Root Library)
- findup-sync-0.1.3.tgz
- :x: **lodash-2.4.2.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/brackets/commit/486c979c9181a42d1ab9d6be10b160fb1973f21b">486c979c9181a42d1ab9d6be10b160fb1973f21b</a></p>
<p>Found in base branch: <b>checkTravis</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution attack when using _.zipObjectDeep in lodash before 4.17.20.
<p>Publish Date: 2020-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8203>CVE-2020-8203</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1523">https://www.npmjs.com/advisories/1523</a></p>
<p>Release Date: 2020-10-21</p>
<p>Fix Resolution: lodash - 4.17.19</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-8203 (High) detected in lodash-3.10.1.tgz, lodash-2.4.2.tgz - autoclosed - ## CVE-2020-8203 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-3.10.1.tgz</b>, <b>lodash-2.4.2.tgz</b></p></summary>
<p>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: brackets/package.json</p>
<p>Path to vulnerable library: brackets/node_modules/grunt-contrib-watch/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-watch-1.0.0.tgz (Root Library)
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-2.4.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, & extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz">https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz</a></p>
<p>Path to dependency file: brackets/package.json</p>
<p>Path to vulnerable library: brackets/node_modules/grunt-cli/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-cli-0.1.9.tgz (Root Library)
- findup-sync-0.1.3.tgz
- :x: **lodash-2.4.2.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/brackets/commit/486c979c9181a42d1ab9d6be10b160fb1973f21b">486c979c9181a42d1ab9d6be10b160fb1973f21b</a></p>
<p>Found in base branch: <b>checkTravis</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution attack when using _.zipObjectDeep in lodash before 4.17.20.
<p>Publish Date: 2020-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8203>CVE-2020-8203</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/1523">https://www.npmjs.com/advisories/1523</a></p>
<p>Release Date: 2020-10-21</p>
<p>Fix Resolution: lodash - 4.17.19</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in lodash tgz lodash tgz autoclosed cve high severity vulnerability vulnerable libraries lodash tgz lodash tgz lodash tgz the modern build of lodash modular utilities library home page a href path to dependency file brackets package json path to vulnerable library brackets node modules grunt contrib watch node modules lodash package json dependency hierarchy grunt contrib watch tgz root library x lodash tgz vulnerable library lodash tgz a utility library delivering consistency customization performance extras library home page a href path to dependency file brackets package json path to vulnerable library brackets node modules grunt cli node modules lodash package json dependency hierarchy grunt cli tgz root library findup sync tgz x lodash tgz vulnerable library found in head commit a href found in base branch checktravis vulnerability details prototype pollution attack when using zipobjectdeep in lodash before publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash step up your open source security game with whitesource
| 0
|
14,535
| 17,631,825,778
|
IssuesEvent
|
2021-08-19 08:56:46
|
ethereumclassic/ECIPs
|
https://api.github.com/repos/ethereumclassic/ECIPs
|
closed
|
Enable Github Discussion Feature
|
meta:3 process
|
I'm proposing to enable [Github Discussions](https://github.blog/2021-08-17-github-discussions-out-of-beta/) for this repository to host ECIP discussions. Discussions often happen via Github Issues, but now we have this neat discussion feature for more organized discourse, including the other community-engagement features and metrics that Github Discussions offer.
I've assigned this issue to the ECIP editors. If you think this is a good idea, then I can move forward with setting this feature up. Let me know what you think.
|
1.0
|
Enable Github Discussion Feature - I'm proposing to enable [Github Discussions](https://github.blog/2021-08-17-github-discussions-out-of-beta/) for this repository to host ECIP discussions. Discussions often happen via Github Issues, but now we have this neat discussion feature for more organized discourse, including the other community-engagement features and metrics that Github Discussions offer.
I've assigned this issue to the ECIP editors. If you think this is a good idea, then I can move forward with setting this feature up. Let me know what you think.
|
process
|
enable github discussion feature i m proposing to enable for this repository to host ecip discussions discussions often happen via github issues but now we have this neat discussion feature for more organized discourse including the other community engagements and metrics github discussions offer i ve assigned this issue to the ecip editors if you think this is a good idea then i can move forward with setting this feature up let me know what you think
| 1
|
10,472
| 13,246,447,326
|
IssuesEvent
|
2020-08-19 15:43:10
|
prisma/e2e-tests
|
https://api.github.com/repos/prisma/e2e-tests
|
closed
|
Lower the number of pushes from Renovate and Dependabot
|
process/candidate
|
Automated PRs trigger Actions runs. If too many changes to branches and PRs are done at the same time, that suffocates our normal GH Actions.
We should see if there is a way to not let them do that many changes at the same time or if we can rate limit their activity somehow to lower the load on Actions.
|
1.0
|
Lower the number of pushes from Renovate and Dependabot - Automated PRs trigger Actions runs. If too many changes to branches and PRs are done at the same time, that suffocates our normal GH Actions.
We should see if there is a way to not let them do that many changes at the same time or if we can rate limit their activity somehow to lower the load on Actions.
|
process
|
lower the number of pushes from renovate and dependabot automated prs trigger actions runs if too many changes to branches and prs are done at the same time that suffocates our normal gh actions we should see if there is a way to not let them do that many changes at the same time or if we can rate limit their activity somehow to lower the load on actions
| 1
|
182,134
| 14,900,857,555
|
IssuesEvent
|
2021-01-21 15:50:45
|
apache/apisix-ingress-controller
|
https://api.github.com/repos/apache/apisix-ingress-controller
|
closed
|
request help: Need to add Apache APISIX link in README
|
documentation good first issue
|
Apache APISIX ingress is based on Apache APISIX. To help users quickly find information about Apache APISIX, a link to APISIX needs to be added to the README.
|
1.0
|
request help: Need to add Apache APISIX link in README - Apache APISIX ingress is based on Apache APISIX. To help users quickly find information about Apache APISIX, a link to APISIX needs to be added to the README.
|
non_process
|
request help need to add apache apisix link in readme apache apisix ingress is base on apache apisix in order to facilitate users to quickly find information about apache apisix a link to apisix needs to be added to the readme
| 0
|
830,465
| 32,009,350,300
|
IssuesEvent
|
2023-09-21 16:52:13
|
GoogleCloudPlatform/professional-services-data-validator
|
https://api.github.com/repos/GoogleCloudPlatform/professional-services-data-validator
|
closed
|
Dataflow Template: Support hash column list
|
priority: p1 dataflow template
|
DVT supports the following option:
```
--hash COLUMNS Comma separated list of columns to hash or * for all columns
```
The Dataflow Template currently only supports `*`. We should support a hash column list as well, particularly to ignore complex data types.
|
1.0
|
Dataflow Template: Support hash column list - DVT supports the following option:
```
--hash COLUMNS Comma separated list of columns to hash or * for all columns
```
The Dataflow Template currently only supports `*`. We should support a hash column list as well, particularly to ignore complex data types.
|
non_process
|
dataflow template support hash column list dvt supports the following option hash columns comma separated list of columns to hash or for all columns the dataflow template currently only supports we should support a hash column list as well particularly to ignore complex data types
| 0
|
10,838
| 13,618,217,898
|
IssuesEvent
|
2020-09-23 18:12:30
|
kubernetes/minikube
|
https://api.github.com/repos/kubernetes/minikube
|
closed
|
Makefile targets for pushing storage-provisioner and kicbase
|
kind/process priority/important-soon
|
Make sure to not push if an image already exists at the given target.
|
1.0
|
Makefile targets for pushing storage-provisioner and kicbase - Make sure to not push if an image already exists at the given target.
|
process
|
makefile targets for pushing storage provisioner and kicbase make sure to not push if an images already exists at the given target
| 1
|
38,373
| 8,468,131,963
|
IssuesEvent
|
2018-10-23 18:51:41
|
mozilla-mobile/android-components
|
https://api.github.com/repos/mozilla-mobile/android-components
|
opened
|
No executed tests in BrowserToolbarTest (Test without @Test)
|
⌨️ code
|
Is there any reason why the tests below have not been executed?
https://github.com/mozilla-mobile/android-components/blob/977b2e29892f9927200ef34e3236f89fb68e63fa/components/browser/toolbar/src/test/java/mozilla/components/browser/toolbar/BrowserToolbarTest.kt#L288
https://github.com/mozilla-mobile/android-components/blob/977b2e29892f9927200ef34e3236f89fb68e63fa/components/browser/toolbar/src/test/java/mozilla/components/browser/toolbar/BrowserToolbarTest.kt#L205
|
1.0
|
No executed tests in BrowserToolbarTest (Test without @Test) - Is there any reason why the tests below have not been executed?
https://github.com/mozilla-mobile/android-components/blob/977b2e29892f9927200ef34e3236f89fb68e63fa/components/browser/toolbar/src/test/java/mozilla/components/browser/toolbar/BrowserToolbarTest.kt#L288
https://github.com/mozilla-mobile/android-components/blob/977b2e29892f9927200ef34e3236f89fb68e63fa/components/browser/toolbar/src/test/java/mozilla/components/browser/toolbar/BrowserToolbarTest.kt#L205
|
non_process
|
no executed tests in browsertoolbartest test without test is there any reason why the below tests are not been executed
| 0
|
58,880
| 14,499,086,360
|
IssuesEvent
|
2020-12-11 16:17:35
|
eclipse/openj9
|
https://api.github.com/repos/eclipse/openj9
|
closed
|
jdknext failing to build, Java Preprocessor No configuration or non-existant configuration specified
|
blocker comp:build jdk17
|
I suspect this is occurring because JDK16 was split and jdknext is building jdk17 now. We need to add the jdk17 preprocessor configuration, and I think a few other changes to support jdk17 build.
```
23:21:32 Building OpenJ9 Java Preprocessor
23:21:32 Building /home/jenkins/workspace/Build_JDKnext_ppc64le_linux_xl_OpenJDK/build/linux-ppc64le-server-release/support/j9tools/jpp.jar
23:21:40 Generating J9JCL sources
23:21:40 Reading preprocess instructions from xml...
23:21:40 No configuration or non-existant configuration specified (Configurations are case sensitive)
23:21:40 PREPROCESS WAS NOT SUCCESSFUL
```
|
1.0
|
jdknext failing to build, Java Preprocessor No configuration or non-existant configuration specified - I suspect this is occurring because JDK16 was split and jdknext is building jdk17 now. We need to add the jdk17 preprocessor configuration, and I think a few other changes to support jdk17 build.
```
23:21:32 Building OpenJ9 Java Preprocessor
23:21:32 Building /home/jenkins/workspace/Build_JDKnext_ppc64le_linux_xl_OpenJDK/build/linux-ppc64le-server-release/support/j9tools/jpp.jar
23:21:40 Generating J9JCL sources
23:21:40 Reading preprocess instructions from xml...
23:21:40 No configuration or non-existant configuration specified (Configurations are case sensitive)
23:21:40 PREPROCESS WAS NOT SUCCESSFUL
```
|
non_process
|
jdknext failing to build java preprocessor no configuration or non existant configuration specified i suspect this is occurring because was split and jdknext is building now we need to add the preprocessor configuration and i think a few other changes to support build building java preprocessor building home jenkins workspace build jdknext linux xl openjdk build linux server release support jpp jar generating sources reading preprocess instructions from xml no configuration or non existant configuration specified configurations are case sensitive preprocess was not successful
| 0
|
20,730
| 27,430,241,856
|
IssuesEvent
|
2023-03-02 00:21:30
|
Azure/bicep
|
https://api.github.com/repos/Azure/bicep
|
closed
|
Release checklist
|
process
|
**EDIT 3/1/23: This has been moved to https://github.com/Azure/bicep/blob/main/docs/release-checklist.md**
<s>
The following steps must be performed for an official release:
- [x] Build on the main branch is all green
- [x] Merge PRs (if any) to temporarily disable publicly-visible experimental features.
- [x] Verify documentation for new linter rules (check history for bicep\src\vscode-bicep\schemas\bicepconfig.schema.json) exists and is pointed to by the correct link in aka.ms [https://aka.ms/bicep/linter/linter-code])
- [x] If updated types are required:
- [ ] Run [type generation](https://github.com/Azure/bicep-types-az) to update types. (Usually can just use the Github Action)
- [ ] Run the official build for types.
- [ ] Publish Bicep.Types NuGet packages to nuget.org. Follow the readme [here](https://dev.azure.com/msazure/One/_git/BicepMirror-Types-Az).
- Find your build [here](https://dev.azure.com/msazure/One/_build?definitionId=179851&_a=summary) and wait for it to finish successfully. Then click on it and for the `drop_build_main`, download the artifacts.
- Follow instructions to download the nuget.exe from [here](https://learn.microsoft.com/en-us/nuget/install-nuget-client-tools)
- Run `./scripts/UploadPackages.ps1 -PackageDirectory <downloads>/drop_build_main -NuGetPath <nuget_tool_directory>`
- You need to be part of the armdeployments org on nuget.org. (Ask one of the admins to be added) You must generate an API key and then use that as the password for when the popup window appears after running the above command. (Username can be anything)
- [ ] Bump the Bicep.Types NuGet package version in this project in this [file](https://github.com/Azure/bicep/blob/main/src/Bicep.Core/Bicep.Core.csproj) by creating and merging a PR
- Might need to run a `dotnet refresh` to update the packages.lock.json files
- Might also need to update baseline tests (run `bicep/scripts/SetBaseline.ps1`)
- [x] Bump version if necessary (it's necessary for each milestone).
- [x] Run official build by following readme [here](https://msazure.visualstudio.com/One/_git/BicepMirror)
- [x] Get version number from official build and push a new tag to the Bicep repo.
- [x] Create draft release for the new tag and set release title to the tag name.
- [x] Run `bicep/scripts/CreateReleaseNotes -FromTag <previous tag> -ToTag <new tag>` and set the output as the release description.
- [x] Run `BicepMirror/scripts/UploadSignedReleaseArtifacts.ps1` to add official artifacts to the release.
- `-WorkingDir` can be any empty temporary directory
- `-BuildId` is only needed if the latest official build is NOT the official build you are trying to release
- [x] Clean up release notes
- [x] Validate VSIX and Bicep CLI manually on most common platforms
- [x] Publish release
- [x] Upload copyleft dependency source to 3rd party disclosure site. See [instructions](https://msazure.visualstudio.com/One/_wiki/wikis/Azure%20Deployments%20Team%20Wiki/369910/Bicep-release-step-Upload-copyleft-source-to-3rd-party-disclosure-site).
- [x] Upload vscode-bicep.VSIX to VS marketplace
- [x] Upload vs-bicep.VSIX to VS marketplace
- [x] Upload NuGet packages to nuget.org via `./scripts/PublishPackages.ps1`. (Make sure to include CLI packages. Easiest is to use the `__assets` directory created by the `UploadSignedReleaseArtifacts.ps1` script.)
- [x] Update homebrew by going here [here](https://github.com/Azure/homebrew-bicep/actions/workflows/update-homebrew.yml) and clicking on `Run workflow`
- A PR will be auto created by this action. Make sure to find it in the PR list then get it approved and merged.
- [ ] Update issue threads that are affected by this release
- [ ] Errors that will now be warnings as a result of merging #5509 (#5631, #5661)
- [ ] Merge any post-release doc PRs
- [ ] Revert any disabled publicly-visible experimental features.
</s>
|
1.0
|
Release checklist - **EDIT 3/1/23: This has been moved to https://github.com/Azure/bicep/blob/main/docs/release-checklist.md**
<s>
The following steps must be performed for an official release:
- [x] Build on the main branch is all green
- [x] Merge PRs (if any) to temporarily disable publicly-visible experimental features.
- [x] Verify documentation for new linter rules (check history for bicep\src\vscode-bicep\schemas\bicepconfig.schema.json) exists and is pointed to by the correct link in aka.ms [https://aka.ms/bicep/linter/linter-code])
- [x] If updated types are required:
- [ ] Run [type generation](https://github.com/Azure/bicep-types-az) to update types. (Usually can just use the Github Action)
- [ ] Run the official build for types.
- [ ] Publish Bicep.Types NuGet packages to nuget.org. Follow the readme [here](https://dev.azure.com/msazure/One/_git/BicepMirror-Types-Az).
- Find your build [here](https://dev.azure.com/msazure/One/_build?definitionId=179851&_a=summary) and wait for it to finish successfully. Then click on it and for the `drop_build_main`, download the artifacts.
- Follow instructions to download the nuget.exe from [here](https://learn.microsoft.com/en-us/nuget/install-nuget-client-tools)
- Run `./scripts/UploadPackages.ps1 -PackageDirectory <downloads>/drop_build_main -NuGetPath <nuget_tool_directory>`
- You need to be part of the armdeployments org on nuget.org. (Ask one of the admins to be added) You must generate an API key and then use that as the password for when the popup window appears after running the above command. (Username can be anything)
- [ ] Bump the Bicep.Types NuGet package version in this project in this [file](https://github.com/Azure/bicep/blob/main/src/Bicep.Core/Bicep.Core.csproj) by creating and merging a PR
- Might need to run a `dotnet refresh` to update the packages.lock.json files
- Might also need to update baseline tests (run `bicep/scripts/SetBaseline.ps1`)
- [x] Bump version if necessary (it's necessary for each milestone).
- [x] Run official build by following readme [here](https://msazure.visualstudio.com/One/_git/BicepMirror)
- [x] Get version number from official build and push a new tag to the Bicep repo.
- [x] Create draft release for the new tag and set release title to the tag name.
- [x] Run `bicep/scripts/CreateReleaseNotes -FromTag <previous tag> -ToTag <new tag>` and set the output as the release description.
- [x] Run `BicepMirror/scripts/UploadSignedReleaseArtifacts.ps1` to add official artifacts to the release.
- `-WorkingDir` can be any empty temporary directory
- `-BuildId` is only needed if the latest official build is NOT the official build you are trying to release
- [x] Clean up release notes
- [x] Validate VSIX and Bicep CLI manually on most common platforms
- [x] Publish release
- [x] Upload copyleft dependency source to 3rd party disclosure site. See [instructions](https://msazure.visualstudio.com/One/_wiki/wikis/Azure%20Deployments%20Team%20Wiki/369910/Bicep-release-step-Upload-copyleft-source-to-3rd-party-disclosure-site).
- [x] Upload vscode-bicep.VSIX to VS marketplace
- [x] Upload vs-bicep.VSIX to VS marketplace
- [x] Upload NuGet packages to nuget.org via `./scripts/PublishPackages.ps1`. (Make sure to include CLI packages. Easiest is to use the `__assets` directory created by the `UploadSignedReleaseArtifacts.ps1` script.)
- [x] Update homebrew by going here [here](https://github.com/Azure/homebrew-bicep/actions/workflows/update-homebrew.yml) and clicking on `Run workflow`
- A PR will be auto created by this action. Make sure to find it in the PR list then get it approved and merged.
- [ ] Update issue threads that are affected by this release
- [ ] Errors that will now be warnings as a result of merging #5509 (#5631, #5661)
- [ ] Merge any post-release doc PRs
- [ ] Revert any disabled publicly-visible experimental features.
</s>
|
process
|
release checklist edit this has been moved to the following steps must be performed for an official release build on the main branch is all green merge prs if any to temporarily disable publicly visible experimental features verify documentation for new linter rules check history for bicep src vscode bicep schemas bicepconfig schema json exists and is pointed to by the correct link in aka ms if updated types are required run to update types usually can just use the github action run the official build for types publish bicep types nuget packages to nuget org follow the readme find your build and wait for it to finish successfully then click on it and for the drop build main download the artifacts follow instructions to download the nuget exe from run scripts uploadpackages packagedirectory drop build main nugetpath you need to be part of the armdeployments org on nuget org ask one of the admins to be added you must generate an api key and then use that as the password for when the popup window appears after running the above command username can be anything bump the bicep types nuget package version in this project in this by creating and merging a pr might need to run a dotnet refresh to update the packages lock json files might also need to update baseline tests run bicep scripts setbaseline bump version if necessary it s necessary for each milestone run official build by following readme get version number from official build and push a new tag to the bicep repo create draft release for the new tag and set release title to the tag name run bicep scripts createreleasenotes fromtag totag and set the output as the release description run bicepmirror scripts uploadsignedreleaseartifacts to add official artifacts to the release workingdir can be any empty temporary directory buildid is only needed if the latest official build is not the official build you are trying to release clean up release notes validate vsix and bicep cli manually on most common platforms publish release upload copyleft dependency source to party disclosure site see upload vscode bicep vsix to vs marketplace upload vs bicep vsix to vs marketplace upload nuget packages to nuget org via scripts publishpackages make sure to include cli packages easiest is to use the assets directory created by the uploadsignedreleaseartifacts script update homebrew by going here and clicking on run workflow a pr will be auto created by this action make sure to find it in the pr list then get it approved and merged update issue threads that are affected by this release errors that will now be warnings as a result of merging merge any post release doc prs revert any disabled publicly visible experimental features
| 1
|
21,822
| 30,316,715,634
|
IssuesEvent
|
2023-07-10 16:03:22
|
tdwg/dwc
|
https://api.github.com/repos/tdwg/dwc
|
closed
|
New Term - parentMeasurementID
|
Term - add Class - MeasurementOrFact Extensions normative Process - complete
|
## New term : parentMeasurementID
* Submitter: Guillaume Body, Anne-Sophie Archambeau, Sophie Pamerlon
* Efficacy Justification (why is this term necessary?): Estimated records are a wide group of data that share similar information, mostly on statistical precision: confidence interval, standard deviation, distribution. These measurements describe the precision of other measurements (the main estimated value). To correctly describe this relation, the DwC standard needs to nest measurements within other measurements, just as events can be nested within each other.
* Demand Justification (name at least two organizations that independently need this term): European Food Safety Authority (enetwild project), French Office of Biodiversity, potentially GEO BON (all essential biodiversity variables are statistically estimated)
Proposed attributes of the new term:
* Term name (in lowerCamelCase for properties, UpperCamelCase for classes): parentMeasurementID
* Organized in Class (e.g., Occurrence, Event, Location, Taxon): MeasurementOrFact
* Definition of the term (normative): An identifier for the broader Measurement that groups this and potentially other Measurements or facts.
* Usage comments (recommendations regarding content, etc., not normative): Use a globally unique identifier for a dwc:MeasurementOrFact or an identifier for a dwc:MeasurementOrFact that is specific to the data set.
* Examples (not normative): 9c752d22-b09a-11e8-96f8-529269fb1459 ; E1_E1_O1_M1
* Note: for correct identification of the record, the basisOfRecord should include a new value: "statistical estimation"
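To illustrate how the proposed term nests records, here is a small hypothetical sketch in Python using the identifier style from the examples above. The measurement types and values are invented for illustration only; `children_of` is not part of any Darwin Core tooling.

```python
# Illustrative only: a flat MeasurementOrFact table where precision records
# point at the main estimate through the proposed parentMeasurementID.
measurements = [
    {"measurementID": "E1_E1_O1_M1", "parentMeasurementID": None,
     "measurementType": "population size", "measurementValue": "1500"},
    {"measurementID": "E1_E1_O1_M2", "parentMeasurementID": "E1_E1_O1_M1",
     "measurementType": "standard deviation", "measurementValue": "120"},
    {"measurementID": "E1_E1_O1_M3", "parentMeasurementID": "E1_E1_O1_M1",
     "measurementType": "95% confidence interval", "measurementValue": "1260-1740"},
]

def children_of(parent_id, rows):
    """Return the measurements nested under a given parentMeasurementID."""
    return [m for m in rows if m["parentMeasurementID"] == parent_id]
```

A consumer can then walk from the main estimated value to its precision records without any change to the flat extension file layout.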
|
1.0
|
New Term - parentMeasurementID - ## New term : parentMeasurementID
* Submitter: Guillaume Body, Anne-Sophie Archambeau, Sophie Pamerlon
* Efficacy Justification (why is this term necessary?): Estimated records are a wide group of data that share similar information, mostly on statistical precision: confidence interval, standard deviation, distribution. These measurements describe the precision of other measurements (the main estimated value). To correctly describe this relation, the DwC standard needs to nest measurements within other measurements, just as events can be nested within each other.
* Demand Justification (name at least two organizations that independently need this term): European Food Safety Authority (enetwild project), French Office of Biodiversity, potentially GEO BON (all essential biodiversity variables are statistically estimated)
Proposed attributes of the new term:
* Term name (in lowerCamelCase for properties, UpperCamelCase for classes): parentMeasurementID
* Organized in Class (e.g., Occurrence, Event, Location, Taxon): MeasurementOrFact
* Definition of the term (normative): An identifier for the broader Measurement that groups this and potentially other Measurements or facts.
* Usage comments (recommendations regarding content, etc., not normative): Use a globally unique identifier for a dwc:MeasurementOrFact or an identifier for a dwc:MeasurementOrFact that is specific to the data set.
* Examples (not normative): 9c752d22-b09a-11e8-96f8-529269fb1459 ; E1_E1_O1_M1
* Note: for correct identification of the record, the basisOfRecord should include a new value: "statistical estimation"
|
process
|
new term parentmeasurementid new term parentmeasurementid submitter guillaume body anne sophie archambeau sophie pamerlon efficacy justification why is this term necessary estimated records are a wide group of data that share similar information mostly on statistical precision confidence interval standard deviation distribution these measurements are precision on other measurements the main estimated value to correctly describe this relation the dwc standard needs to nest measurement within other measurements such as events which can be nested in each other demand justification name at least two organizations that independently need this term european food safety authority enetwild project french office of biodiversity potentially geo bon all essential biodiversity variables are statistically estimated proposed attributes of the new term term name in lowercamelcase for properties uppercamelcase for classes parentmeasurementid organized in class e g occurrence event location taxon measurementorfact definition of the term normative an identifier for the broader measurement that groups this and potentially other measurements or fact usage comments recommendations regarding content etc not normative use a globally unique identifier for a dwc measurementorfact or an identifier for a dwc measurementorfact that is specific to the data set examples not normative note for correct identification of the record the basisofrecord should include a new value statistical estimation
| 1
|
196,519
| 14,876,885,624
|
IssuesEvent
|
2021-01-20 01:52:33
|
IBM/kui
|
https://api.github.com/repos/IBM/kui
|
closed
|
kubectl Terminal tab test is flakey
|
plugin-kubectl tests
|
The failing part waits for a fixed 5 seconds, sends "exit 1", and then waits (until mocha timeout) for the sidecar toolbar text to indicate exit with error.
My guess is that sometimes the remote terminal takes more than 5 seconds to start up --- a bit more likely in travis-ci.com, which has relatively slow VMs.
We are seeing this failure in perhaps 5% of runs. 5% of the time the terminal starts up slowly? Seems plausible?
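The fixed 5-second wait is the kind of race a poll-until-ready loop avoids. Below is a minimal sketch of that pattern in Python; the Kui suite itself is TypeScript/mocha, so `wait_until` and its parameters are hypothetical illustrations of the idea, not part of the Kui test API.

```python
import time

def wait_until(condition, timeout=30.0, interval=0.25):
    """Poll `condition` until it returns truthy or `timeout` seconds elapse.

    Replaces a fixed sleep: a slow terminal start-up just means a few more
    polls, while a fast one proceeds immediately instead of waiting out the
    full delay.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError(f"condition not met within {timeout}s")
```

In the failing test this would mean polling for a terminal prompt before sending "exit 1", rather than assuming start-up finishes within 5 seconds.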
|
1.0
|
kubectl Terminal tab test is flakey - The failing part waits for a fixed 5 seconds, sends "exit 1", and then waits (until mocha timeout) for the sidecar toolbar text to indicate exit with error.
My guess is that sometimes the remote terminal takes more than 5 seconds to start up --- a bit more likely in travis-ci.com, which has relatively slow VMs.
We are seeing this failure in perhaps 5% of runs. 5% of the time the terminal starts up slowly? Seems plausible?
|
non_process
|
kubectl terminal tab test is flakey the failing part waits for a fixed seconds sends exit and then waits until mocha timeout for the sidecar toolbar text to indicate exit with error my guess is that sometimes the remote terminal takes more than seconds to start up a bit more likely in travis ci com which has relatively slow vms we are seeing this failure in perhaps of runs of the time the terminal starts up slowly seems plausible
| 0
|
7,719
| 10,824,323,580
|
IssuesEvent
|
2019-11-09 08:20:30
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
GDAL raster calculator fails when using compression
|
Bug Processing
|
QGIS 3.8.3 on Windows 10, GDAL v2.4.1
Using any compression options in the additional creation options leads to a syntax error in the GDAL raster calculator:
`gdal_calc.py: error: no such option: -c`
The problem is in the syntax created by the tool. Instead of --co COMPRESS=..., it will use -co COMPRESS=...
The issue was reproduced by running the GDAL command created by the tool in the command line. After changing -co to --co it runs successfully.
I remember being able to change the GDAL/OGR console call (at least on some tools) in prior QGIS versions. This does not seem to be possible anymore.
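The distinction matters because gdal_calc.py parses long-style options only: creation options must be passed as `--co`, unlike utilities such as gdal_translate that accept `-co`. A minimal Python sketch of assembling the argument list correctly (QGIS Processing is Python, but `build_gdal_calc_args` here is a hypothetical helper, not the actual QGIS code):

```python
def build_gdal_calc_args(calc, output, creation_options):
    """Build a gdal_calc.py argument list.

    Note the double dash: gdal_calc.py only understands --co, so passing the
    gdal_translate-style -co is rejected with "no such option: -c", exactly
    as in the report above.
    """
    args = ["gdal_calc.py", "--calc", calc, "--outfile", output]
    for opt in creation_options:
        args += ["--co", opt]
    return args
```

For example, `build_gdal_calc_args("A+1", "out.tif", ["COMPRESS=DEFLATE"])` yields a command line that gdal_calc.py accepts.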
|
1.0
|
GDAL raster calculator fails when using compression - QGIS 3.8.3 on Windows 10, GDAL v2.4.1
Using any compression options in the additional creation options leads to a syntax error in the GDAL raster calculator:
`gdal_calc.py: error: no such option: -c`
The problem is in the syntax created by the tool. Instead of --co COMPRESS=..., it will use -co COMPRESS=...
The issue was reproduced running the GDAL command created by the tool in the command line. After changing -co to --co it runs successfully.
I remember being able to change the GDAL/OGR console call (at least on some tools) in prior QGIS versions. This does not seem to be possible anymore.
|
process
|
gdal raster calculator fails when using compression qgis on windows gdal using any compression options in the additional creation options leads to a syntax error in the gdal raster calculator gdal calc py error no such option c the problem is in the syntax created by the tool instead of co compress it will use co compress the issue was reproduced running the gdal command created by the tool in the command line after changing co to co it runs successfully i remember being able to change the gdal ogr console call at least on some tools in prior qgis versions this does not seem to be possible anymore
| 1
|
9,453
| 12,429,431,528
|
IssuesEvent
|
2020-05-25 08:27:31
|
Jeffail/benthos
|
https://api.github.com/repos/Jeffail/benthos
|
closed
|
Simple operations ought to have a quicker solution
|
annoying enhancement processors
|
Benthos has a huge amount of flexibility with processors such as `process_field` and `process_map`. However, most of the time users just want to perform a relatively simple and common operation on a field, e.g. increment a value:
```yaml
- process_field:
codec: metadata
path: foo
result_type: int
processors:
- number:
operator: add
value: 1
```
For that purpose we _could_ use `awk`:
```yaml
pipeline:
processors:
- awk:
program: |
{ metadata_set("foo", metadata_get("foo") + 1) }
```
But there's a performance hit, and it seems a little heavy-handed for something so simple.
Perhaps we could add a more general solution for these common operators that doesn't require a full scripting engine.
|
1.0
|
Simple operations ought to have a quicker solution - Benthos has a huge amount of flexibility with processors such as `process_field` and `process_map`. However, most of the time users just want to perform a relatively simple and common operation on a field, e.g. increment a value:
```yaml
- process_field:
codec: metadata
path: foo
result_type: int
processors:
- number:
operator: add
value: 1
```
For that purpose we _could_ use `awk`:
```yaml
pipeline:
processors:
- awk:
program: |
{ metadata_set("foo", metadata_get("foo") + 1) }
```
But there's a performance hit, and it seems a little heavy-handed for something so simple.
Perhaps we could add a more general solution for these common operators that doesn't require a full scripting engine.
|
process
|
simple operations ought to have a quicker solution benthos has a huge amount of flexibility with processors such as process field and process map however most of the time users just want to perform a relatively simple and common operation on a field e g increment a value yaml process field codec metadata path foo result type int processors number operator add value for that purpose we could use awk yaml pipeline processors awk program metadata set foo metadata get foo but there s a performance hit and it seems a little heavy handed for something so simple perhaps we could add a more general solution for these common operators that doesn t require a full scripting engine
| 1
|
4
| 2,490,863,650
|
IssuesEvent
|
2015-01-02 20:55:04
|
tinkerpop/tinkerpop3
|
https://api.github.com/repos/tinkerpop/tinkerpop3
|
closed
|
Fix SampleStep multi-item probability distribution selection algorithm.
|
enhancement process
|
1) sort by weight decreasing and sum up weights as total_weight
2) for i=0...X sample number s in [0,1) then iterate over list until aggregate_weight/total_weight > s
pick that element and subtract its weight from total_weight
on next iteration, skip previously selected elements (i.e. put them into set)
if X is small, I would skip the sorting
worst case complexity is O(Xn)
@mbroecheler
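A minimal Python sketch of the algorithm described above (weighted sampling without replacement). Per the note in the issue, the decreasing-weight sort from step 1 is an optimization and is skipped here, as suggested for small X; the function name is illustrative, not TinkerPop API.

```python
import random

def weighted_sample(items, weights, x):
    """Sample x distinct items with probability proportional to weight.

    Follows the sketch above: keep a running total_weight, draw s in [0, 1),
    walk the list until the cumulative weight fraction exceeds s, then pick
    that element, subtract its weight from total_weight, and skip it on the
    next iteration.
    """
    chosen = set()          # indices already selected
    total_weight = float(sum(weights))
    picks = []
    for _ in range(min(x, len(items))):
        s = random.random()
        aggregate = 0.0
        for i, w in enumerate(weights):
            if i in chosen:
                continue    # previously selected elements are skipped
            aggregate += w
            if aggregate / total_weight > s:
                chosen.add(i)
                total_weight -= w
                picks.append(items[i])
                break
    return picks
```

Worst case this walks the list once per pick, matching the stated O(Xn) complexity.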
|
1.0
|
Fix SampleStep multi-item probability distribution selection algorithm. - 1) sort by weight decreasing and sum up weights as total_weight
2) for i=0...X sample number s in [0,1) then iterate over list until aggregate_weight/total_weight > s
pick that element and subtract its weight from total_weight
on next iteration, skip previously selected elements (i.e. put them into set)
if X is small, I would skip the sorting
worst case complexity is O(Xn)
@mbroecheler
|
process
|
fix samplestep multi item probability distribution selection algorithm sort by weight decreasing and sum up weights as total weight for i x sample number s in then iterate over list until aggregate weight total weight s pick that element and subtract its weight from total weight on next iteration skip previously selected elements i e put them into set if x is small i would skip the sorting worst case complexity is o xn mbroecheler
| 1
|
739,132
| 25,582,301,567
|
IssuesEvent
|
2022-12-01 06:03:07
|
ZPTXDev/Quaver
|
https://api.github.com/repos/ZPTXDev/Quaver
|
closed
|
Use namespaces for types
|
type:enhancement priority:p3 status:confirmed
|
**Describe the feature**
What feature are you proposing?
Use namespaces for types to make things tidy
**Priority**
- [ ] High
- [ ] Medium
- [x] Low
**List the benefits of adding such a feature**
What will you gain from this feature?
Stop my eyes from bleeding
**Is this feature request related to a problem?**
Is there a problem with the current implementation (if any), or is there an issue that would be resolved with this feature?
N/A
|
1.0
|
Use namespaces for types - **Describe the feature**
What feature are you proposing?
Use namespaces for types to make things tidy
**Priority**
- [ ] High
- [ ] Medium
- [x] Low
**List the benefits of adding such a feature**
What will you gain from this feature?
Stop my eyes from bleeding
**Is this feature request related to a problem?**
Is there a problem with the current implementation (if any), or is there an issue that would be resolved with this feature?
N/A
|
non_process
|
use namespaces for types describe the feature what feature are you proposing use namespaces for types to make things tidy priority high medium low list the benefits of adding such a feature what will you gain from this feature stop my eyes from bleeding is this feature request related to a problem is there a problem with the current implementation if any or is there an issue that would be resolved with this feature n a
| 0
|